Sending files over the network is a common task in many Python applications. The aiohttp library provides a nice API for handling file transfers asynchronously and efficiently in your Python code.
In this guide, we'll walk through a practical example of sending a file from a Python script to a client using aiohttp. We'll cover some key concepts and best practices along the way.
Why aiohttp for Sending Files?
The aiohttp library is great for building asynchronous Python applications. It shines for tasks like serving HTTP requests, making client requests, and handling WebSockets.
Compared to alternatives like raw sockets, aiohttp handles a lot of complexity for you. And because it's built on asyncio, waiting on I/O no longer blocks the rest of your program.
This makes it a great fit for transferring files. The asynchronous design means your code stays responsive while a file sends in the background.
Sending a File Example
Let's look at a full example of sending a file with aiohttp.
First we'll start up an aiohttp web server with a handler that reads the file and returns it:
```python
import aiofiles
from aiohttp import web

async def handle_file_request(request):
    file_path = "example_file.pdf"

    async with aiofiles.open(file_path, mode='rb') as f:
        file_data = await f.read()

    return web.Response(
        body=file_data,
        headers={
            "Content-Disposition": f'attachment; filename={file_path}'
        }
    )

app = web.Application()
app.add_routes([web.get('/', handle_file_request)])

if __name__ == '__main__':
    web.run_app(app)
```
The key parts:
- `aiofiles.open` reads the file without blocking the event loop.
- The file's bytes become the response `body`.
- The `Content-Disposition: attachment` header tells the client to download the file rather than display it.
Now when clients connect and request the root URL path, they'll receive the PDF file.
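On the client side, fetching the file can be sketched with aiohttp's own `ClientSession`. This is a minimal sketch; the URL and output filename are assumptions for illustration:

```python
import asyncio

import aiohttp

async def download(url, dest):
    # Fetch the file and save the response body to disk
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            data = await resp.read()
    with open(dest, 'wb') as f:
        f.write(data)

# Usage: asyncio.run(download("http://localhost:8080/", "downloaded.pdf"))
```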
The code is clean and simple. But there's still room for improvement...
Streaming Large Files
For large files, reading the entire contents into memory can be inefficient.
We can optimize by streaming chunks of the file instead. This prevents loading gigabytes of file data at once:
```python
import aiofiles
from aiohttp import web

async def stream_file(request):
    file_path = "massive_video.mp4"
    chunk_size = 1024 * 1024  # 1MB chunks

    async with aiofiles.open(file_path, mode='rb') as f:
        resp = web.StreamResponse(
            headers={
                "Content-Disposition": f'attachment; filename={file_path}'
            }
        )
        await resp.prepare(request)

        while True:
            chunk = await f.read(chunk_size)
            if not chunk:
                break
            await resp.write(chunk)

    return resp
```
By sending 1MB chunks instead of the full file, memory usage stays flat no matter how large the file is. The client still receives the same complete file, streamed across multiple response writes.
And aiohttp handles the asynchronous chunk streaming without blocking the event loop.
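The same idea applies on the receiving end: aiohttp's client can stream a response body to disk chunk by chunk via `resp.content.iter_chunked`, so neither side holds the whole file in memory. A minimal sketch (URL and filename are assumptions):

```python
import asyncio

import aiohttp

async def download_streamed(url, dest, chunk_size=1024 * 1024):
    # Write the response body to disk one chunk at a time
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            with open(dest, 'wb') as f:
                async for chunk in resp.content.iter_chunked(chunk_size):
                    f.write(chunk)

# Usage: asyncio.run(download_streamed("http://localhost:8080/", "video.mp4"))
```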
Handling Many Files Efficiently
What if we need to send multiple files from the same server? Reading each one to completion before starting the next limits performance:

```python
# DON'T DO THIS - each file waits for the previous one to finish
for file_path in file_paths:
    async with aiofiles.open(file_path, 'rb') as f:
        await process(f)  # sequential reads
```
The trick is to launch all the file reads concurrently:
```python
import asyncio

import aiofiles

async def stream_files(request):
    file_paths = ["file1.pdf", "file2.mov", "file3.doc"]

    async def read_file(fp):
        chunks = []
        async with aiofiles.open(fp, 'rb') as f:
            while True:
                chunk = await f.read(1024 * 1024)
                if not chunk:
                    break
                chunks.append(chunk)
        return chunks

    # gather() runs all the reads concurrently and returns the results
    # in input order, so chunks from different files never interleave
    file_chunks = await asyncio.gather(*[read_file(fp) for fp in file_paths])

    # Combine file chunks and stream response...
```
By gathering the read coroutines with asyncio.gather, every file is read concurrently instead of one after another.
With a large batch of files, this finishes far faster than sequential reads!
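The speedup comes purely from overlapping the waits. Here's a self-contained sketch with `asyncio.sleep` standing in for disk I/O, so you can see the effect without real files:

```python
import asyncio
import time

async def fake_read(fp):
    await asyncio.sleep(0.1)  # stand-in for awaiting disk I/O
    return fp

async def main():
    paths = [f"file{i}.bin" for i in range(10)]
    start = time.perf_counter()
    # All ten "reads" run concurrently; results come back in input order
    results = await asyncio.gather(*[fake_read(p) for p in paths])
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(f"{len(results)} files in {elapsed:.2f}s")  # ten 0.1s waits overlap into ~0.1s
```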
Other Features
We've covered the basics of sending files with aiohttp. But there are many other neat features, like:
- `web.FileResponse`, which streams a file and sets Content-Type and range-request headers for you
- middleware for authentication and logging
- WebSocket support for two-way streaming
The aiohttp docs show how to enable these.
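For instance, `web.FileResponse` can replace the manual read-and-respond pattern entirely. A brief sketch (the path and route are assumptions):

```python
from aiohttp import web

async def serve_file(request):
    # FileResponse streams from disk and sets Content-Type,
    # Content-Length, and range-request headers automatically
    return web.FileResponse("example_file.pdf")

app = web.Application()
app.add_routes([web.get('/file', serve_file)])
# Run with: web.run_app(app)
```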
Key Takeaways
- Use aiofiles so file reads never block the event loop.
- Read small files in full, but stream large files in chunks with StreamResponse.
- Use asyncio.gather to read many files concurrently.
Hopefully this gives you a solid starting point for sending files asynchronously in Python.