joshlk/many_requests
Dead easy interface for executing many HTTP requests asynchronously. Also provides helper functions for executing embarrassingly parallel async coroutines.
repo name | joshlk/many_requests |
repo link | https://github.com/joshlk/many_requests |
homepage | |
language | Python |
size (curr.) | 5921 kB |
stars (curr.) | 303 |
created | 2020-11-11 |
license | MIT License |
many_requests
Dead easy interface for executing many HTTP requests asynchronously. It has been tested in the wild with over 10 million requests, and it automatically handles errors and retries failed requests.
Built on top of Trio and asks. The interface is heavily inspired by Requests and joblib.
Also provides helper functions for executing embarrassingly parallel async coroutines.
To install:
pip install many-requests
Example Usage
Execute 10 GET requests for example.org:
from many_requests import ManyRequests
responses = ManyRequests(n_workers=5, n_connections=5)(
    method='GET', url=['https://example.org' for i in range(10)])
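ManyRequests returns one result per input URL. A minimal sketch of inspecting the results, assuming each entry behaves like an asks/Requests-style response object with status_code and content attributes:
for response in responses:
    # Print status code and body size of each request
    # (assumption: attribute names follow the asks/Requests response API)
    print(response.status_code, len(response.content))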
Query the Hacker News API for 10 items and parse the JSON output:
responses = ManyRequests(n_workers=5, n_connections=5, json=True)(
    method='GET',
    url=[f'https://hacker-news.firebaseio.com/v0/item/{i}.json?print=pretty' for i in range(10)])
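With json=True each result is the parsed JSON payload rather than a raw response, so the items can be consumed directly as Python objects. A minimal sketch, assuming the entries follow the Hacker News item schema (item 0 does not exist and parses to None):
for item in responses:
    if item is not None:  # skip missing items
        print(item['id'], item.get('type'))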
To execute embarrassingly parallel async coroutines:
from many_requests import EasyAsync, delayed
import trio
EasyAsync(n_workers=4)(delayed(trio.sleep)(i) for i in range(10))
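delayed works like its joblib namesake: it captures a coroutine function and its arguments without awaiting them, and EasyAsync runs the resulting jobs across n_workers workers. As a hedged sketch, the same pattern can drive a coroutine of your own (fetch_length below is a hypothetical helper, not part of the library):
from many_requests import EasyAsync, delayed
import asks

async def fetch_length(url):
    # Hypothetical helper: fetch a page with asks and return the body size
    response = await asks.get(url)
    return len(response.content)

urls = ['https://example.org', 'https://example.com']
lengths = EasyAsync(n_workers=2)(delayed(fetch_length)(url) for url in urls)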