Not covered: the niceties of asynchrony; when threads run truly concurrently (it’s complicated, but you often need them even for non-concurrent stuff, so deal with it); when evented poll systems are truly asynchronous (never, but it doesn’t matter).
🏗 cover uvloop.
libraries for async-style coroutine concurrency
Modern python async-style stuff.
tl;dr Use the event loop from tornado or pyzmq. These non-thread IO things are comparatively easy and well-documented. And they work with the new python 3 async style. You can use them to farm off heavy computation to other threaded nightmare hell farms or whatever, and they already work right now.
UPDATE: actually raw asyncio is getting civilized these days, might be worth using. But there is a complicated relationship between the various bits. And I no longer need to do this, so you are on your own. G’luck.
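A minimal sketch of the now-civilized raw asyncio style (Python 3.7+; function names are mine, for illustration): run coroutines concurrently on one thread, and farm blocking work out to a thread pool, as mentioned above.

```python
import asyncio
import time


def blocking_work(n):
    """Blocking/CPU-ish work we don't want clogging the event loop."""
    time.sleep(0.01)
    return n * n


async def fetchish(n):
    # Pretend-IO: yields control to the event loop while "waiting".
    await asyncio.sleep(0.01)
    return n + 1


async def main():
    # Run coroutines concurrently on the single event-loop thread...
    ios = await asyncio.gather(*(fetchish(i) for i in range(3)))
    # ...and farm blocking work off to the default thread pool.
    loop = asyncio.get_running_loop()
    heavy = await asyncio.gather(
        *(loop.run_in_executor(None, blocking_work, i) for i in range(3))
    )
    return ios, heavy


ios, heavy = asyncio.run(main())
print(ios, heavy)  # [1, 2, 3] [0, 1, 4]
```

`gather` preserves submission order, so results line up with inputs regardless of completion order.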
Here are some ingredients that might make these work better:
BBC’s tutorial Python Asyncio Part 1 – Basic Concepts and Patterns
HTTPX “is a fully featured HTTP client for Python 3, which provides sync and async APIs, and support for both HTTP/1.1 and HTTP/2.”
Seems to aim to be the future version of the popular python requests library.
aiohttp seems to be the ascendant asynchronous server/client swiss army knife for HTTP stuff
backoff is a handy python library for a menial and common task, retrying with a slightly longer delay. This is the only thing from this list I am currently using.
sanic is a hip, python3.5+-only, Flask-like web server. Supports websocket and graphql extensions. See also aiohttp, which covers similar ground (a separate implementation, not sanic’s underlying engine).
rx exists for python as rxpy and is tornado compatible.
terminado provides a terminal for tornado, for quick and dirty interaction.
This feels over-engineered to me, but looks easy for some common cases.
0mq itself is attractive because it already uses tornado loops, and can pass numpy arrays without copying.
aiomonitor injects a REPL into async python processes
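Of the ingredients above, the menial task that backoff handles is simple enough to sketch in the stdlib. This is a hedged illustration of the idea (retry with an exponentially growing delay), not backoff’s actual implementation; the decorator name is mine.

```python
import time


def retry_expo(max_tries=5, base=0.01, exc=Exception):
    """Retry a flaky callable with exponentially growing delays --
    roughly the job that backoff's decorators automate for you."""
    def deco(fn):
        def wrapper(*args, **kwargs):
            delay = base
            for attempt in range(1, max_tries + 1):
                try:
                    return fn(*args, **kwargs)
                except exc:
                    if attempt == max_tries:
                        raise  # out of tries: propagate the failure
                    time.sleep(delay)
                    delay *= 2
        return wrapper
    return deco


calls = []


@retry_expo(max_tries=4, base=0.001, exc=ValueError)
def flaky():
    # Fails twice, then succeeds -- a stand-in for an unreliable request.
    calls.append(1)
    if len(calls) < 3:
        raise ValueError("transient")
    return "ok"


result = flaky()
print(result, len(calls))  # ok 3
```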
Alternative async ecosystems
Yes, as always there is something newer and hipper and more artisanal.
Curio is a library for performing concurrent I/O and common system programming tasks such as launching subprocesses and farming work out to thread and process pools. It uses Python coroutines and the explicit async/await syntax introduced in Python 3.5. Its programming model is based on cooperative multitasking and existing programming abstractions such as threads, sockets, files, subprocesses, locks, and queues. You’ll find it to be small and fast.
The essay that explains why there is a different asynchronous ecosystem: Nathaniel J. Smith, Some thoughts on asynchronous API design in a post-async/await world
Curio doesn’t have much in the way of tooling yet. E.g. for HTTP requests you might use curequests or asks; for a server, you might import a raw HTTP/2 library and go bareback.
I think this has morphed into the trio system which now looks active. TBD.
Check datasette for an example of integrating threading with event loops.
Sometimes you need it. But I don’t have much to say, and am not an expert.
For threaded and multi-proc concurrency you sometimes need simple shared variables. Here is, e.g., a counters HOWTO.
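The shared-counter idea in a minimal stdlib sketch (class and names are mine): increments are guarded by a lock so concurrent threads don’t lose updates.

```python
import threading


class Counter:
    """Thread-safe counter: each increment holds a lock, so
    no two threads can interleave the read-modify-write."""

    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:
            self.value += 1


counter = Counter()
threads = [
    threading.Thread(
        target=lambda: [counter.increment() for _ in range(1000)]
    )
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 8000 -- without the lock, usually less
```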
If we are doing parallel stuff, we need locking to avoid two processes doing something at the same time that should not happen at the same time. portalocker is a handy tool to lock files, and optionally other stuff.
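The kind of platform primitive that portalocker wraps portably looks roughly like this on Unix (stdlib fcntl; this is a sketch of the underlying mechanism, not portalocker’s own API):

```python
import fcntl
import os
import tempfile

# Advisory exclusive lock on a file (Unix only); other cooperating
# processes calling flock() on the same file will block until release.
path = os.path.join(tempfile.gettempdir(), "demo.lock")
with open(path, "w") as fh:
    fcntl.flock(fh, fcntl.LOCK_EX)  # blocks until we hold the lock
    fh.write("pid %d has the lock\n" % os.getpid())
    fcntl.flock(fh, fcntl.LOCK_UN)  # release explicitly (close also releases)
print("released")
```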