The asyncio module in Python provides powerful tools for writing asynchronous, concurrent code. One particularly useful function is asyncio.gather(), which makes it simple to run multiple coroutines concurrently and collect their results.
The key thing to understand is that asyncio.gather() takes a set of awaitables, schedules them to run concurrently on the event loop, and returns their results as a list in the same order as the inputs.
Here's a quick example:
import asyncio

async def fetch_data(url):
    # Pretend we fetch data from the URL
    return f"Downloaded from {url}"

urls = ["https://www.python1.org", "https://www.python2.org"]

async def main():
    results = await asyncio.gather(*[fetch_data(url) for url in urls])
    print(results)

asyncio.run(main())
This fetches data from both URLs concurrently and prints the list of results once both coroutines have completed.
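With the placeholder fetch_data() above, the printed list should look like ['Downloaded from https://www.python1.org', 'Downloaded from https://www.python2.org'], with the results in the same order as the urls list.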
Some key points on using asyncio.gather():
- Results come back in the same order as the awaitables you pass in, no matter which one finishes first.
- Awaitables are passed as positional arguments, which is why the examples unpack the list with *.
- By default, the first exception raised in any coroutine propagates out of the await on gather(), while the other awaitables keep running; pass return_exceptions=True to collect exceptions in the results list instead.
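Here is a minimal sketch of the return_exceptions behavior; the might_fail() coroutine and its values are made up purely for illustration:

import asyncio

async def might_fail(i):
    # Simulate work that fails for one particular input
    if i == 2:
        raise ValueError(f"bad input: {i}")
    return i * 10

async def main():
    # With return_exceptions=True, exceptions are placed in the results
    # list instead of being raised out of gather()
    results = await asyncio.gather(*(might_fail(i) for i in range(4)),
                                   return_exceptions=True)
    print(results)  # [0, 10, ValueError('bad input: 2'), 30]

asyncio.run(main())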
One common use case is kicking off I/O-bound work like multiple network requests concurrently:
user_coros = [fetch_user_data(uid) for uid in user_ids]
user_data = await asyncio.gather(*user_coros)
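fetch_user_data() and user_ids are not defined above, so here is a self-contained sketch of that pattern; the function body uses asyncio.sleep() to stand in for a real network call, and the user data it returns is invented for the example:

import asyncio

# Hypothetical stand-in for a real network request; asyncio.sleep()
# simulates the I/O wait so the example runs without extra libraries.
async def fetch_user_data(uid):
    await asyncio.sleep(0.1)
    return {"id": uid, "name": f"user-{uid}"}

async def main():
    user_ids = [1, 2, 3]
    user_coros = [fetch_user_data(uid) for uid in user_ids]
    # All three "requests" wait concurrently, so this takes roughly 0.1s, not 0.3s
    user_data = await asyncio.gather(*user_coros)
    print(user_data)

asyncio.run(main())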
In summary, asyncio.gather() gives you a concise way to launch many coroutines at once and await all of their results together, which makes it a natural fit for I/O-bound asyncio code.