When building applications with aiohttp in Python, it's common to need to make multiple requests concurrently rather than sequentially. There are a few ways to achieve this while avoiding some common pitfalls.
Use asyncio.gather
The easiest way is with asyncio.gather, which runs a set of coroutines concurrently and waits for all of them to finish:
import asyncio
import aiohttp

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

urls = ['https://example.com/1', 'https://example.com/2']

async def main():
    results = await asyncio.gather(*[fetch(url) for url in urls])
    print(results)

asyncio.run(main())
This fires off all the requests at once and returns the results in the same order as the coroutines you passed in.
Reuse session
When making multiple requests, it's better to reuse a single ClientSession rather than creating a new one per request, so connections can be pooled and reused:
async with aiohttp.ClientSession() as session:
    response1 = await session.get(url1)
    response2 = await session.get(url2)
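Combining session reuse with asyncio.gather from above, a minimal sketch might look like the following (the fetch_all helper name and the example URLs are just illustrative):

import asyncio
import aiohttp

async def fetch(session, url):
    # the session is passed in so every request shares one connection pool
    async with session.get(url) as response:
        return await response.text()

async def fetch_all(urls):
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*[fetch(session, url) for url in urls])

results = asyncio.run(fetch_all(['https://example.com/1', 'https://example.com/2']))
print(results)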
Limit concurrency to avoid throttling
If you have a large number of URLs, don't try to kick them all off at once! Servers often enforce per-client connection or rate limits, so you could hit throttling or errors. Use an asyncio.Semaphore to cap how many requests run at the same time:
semaphore = asyncio.Semaphore(10)

async def fetch(url):
    async with semaphore:
        # rest of request code
This allows at most 10 requests in flight at any time; tune the limit based on what the target sites can handle.
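Here is one way the pieces could fit together, as a sketch only: the fetch_limited name, the limit of 10, and the generated URLs are illustrative assumptions, not part of any fixed API.

import asyncio
import aiohttp

async def fetch_limited(session, semaphore, url):
    # the semaphore caps how many requests run at the same time
    async with semaphore:
        async with session.get(url) as response:
            return await response.text()

async def main(urls):
    semaphore = asyncio.Semaphore(10)   # at most 10 requests in flight
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_limited(session, semaphore, url) for url in urls]
        return await asyncio.gather(*tasks)

urls = ['https://example.com/%d' % i for i in range(100)]
results = asyncio.run(main(urls))

Creating the semaphore inside main keeps it tied to the running event loop, and passing it in explicitly makes the concurrency limit easy to adjust per call.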
In summary, aiohttp and asyncio provide great tools for concurrent requests, but take care to reuse sessions and limit concurrency. Happy fetching!