Any suggestions?

Use a streaming upload, as the docs put it:

Requests supports streaming uploads, which allow you to send large streams or files without reading them into memory. To stream and upload, simply provide a file-like object for your body:

with open('massive-body', 'rb') as f:
    requests.post('http://some.url/streamed', data=f)
Answer from Daweo on Stack Overflow
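As a related sketch (not part of the quoted answer): requests also accepts a generator for data, in which case the body is sent with chunked transfer encoding. The chunk size and file name below are illustrative:

```python
def read_in_chunks(path, chunk_size=8192):
    """Yield a file's contents in fixed-size blocks, so the whole
    file is never held in memory at once."""
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk

# Passing a generator (instead of a file object) makes requests send the
# body with chunked transfer encoding:
# requests.post('http://some.url/streamed', data=read_in_chunks('massive-body'))
```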
2 of 2
1

When you pass the files arg, the requests lib makes a multipart form upload, i.e. it is like submitting a form where the file is passed as a named field (file in your example).

I suspect the problem you saw is that when you pass a file object as the data arg, as suggested in the docs here https://requests.readthedocs.io/en/latest/user/advanced/#streaming-uploads, requests does a streaming upload but uses the file content as the whole HTTP POST body.

So I think the server at the other end is expecting a form with a file field, but we're just sending the binary content of the file by itself.

What we need is some way to wrap the content of the file with the right "envelope" as we send it to the server, so that it can recognise the data we are sending.

See this issue where others have noted the same problem: https://github.com/psf/requests/issues/1584

I think the best suggestion from there is to use an additional library, requests-toolbelt, which provides a streaming multipart form encoder: https://github.com/requests/toolbelt#multipartform-data-encoder

For example:

from requests_toolbelt import MultipartEncoder
import requests

with open(path, 'rb') as f:  # keep the file open while the upload streams
    encoder = MultipartEncoder(
        fields={'file': ('myfilename.xyz', f, 'text/plain')}
    )
    response = requests.post(
        url, data=encoder, headers={'Content-Type': encoder.content_type}
    )
How to upload file with python requests? - Stack Overflow
Top answer
1 of 9
427

If upload_file is meant to be the file, use:

files = {'upload_file': open('file.txt','rb')}
values = {'DB': 'photcat', 'OUT': 'csv', 'SHORT': 'short'}

r = requests.post(url, files=files, data=values)

and requests will send a multi-part form POST body with the upload_file field set to the contents of the file.txt file.

The filename will be included in the mime header for the specific field:

>>> import requests
>>> open('file.txt', 'wb')  # create an empty demo file
<_io.BufferedWriter name='file.txt'>
>>> files = {'upload_file': open('file.txt', 'rb')}
>>> print(requests.Request('POST', 'http://example.com', files=files).prepare().body.decode('ascii'))
--c226ce13d09842658ffbd31e0563c6bd
Content-Disposition: form-data; name="upload_file"; filename="file.txt"


--c226ce13d09842658ffbd31e0563c6bd--

Note the filename="file.txt" parameter.

You can use a tuple for the files mapping value, with between 2 and 4 elements, if you need more control. The first element is the filename, followed by the contents, and an optional content-type header value and an optional mapping of additional headers:

files = {'upload_file': ('foobar.txt', open('file.txt','rb'), 'text/x-spam')}

This sets an alternative filename and content type, leaving out the optional headers.

If you mean for the whole POST body to be taken from a file (with no other fields specified), then don't use the files parameter; just post the file directly as data. You may then want to set a Content-Type header too, as none will be set otherwise. See Python requests - POST data from a file.
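A minimal sketch of that last case (URL, file name, and content type are illustrative); requests.Request(...).prepare() is used here only to build the request without sending it:

```python
import requests

# Create a small demo file (stand-in for a large one).
with open('file.txt', 'wb') as f:
    f.write(b'demo body')

# The whole POST body comes from the file; set Content-Type explicitly,
# since requests will not pick one for you.
with open('file.txt', 'rb') as f:
    prepared = requests.Request(
        'POST', 'http://example.com/upload',   # illustrative URL
        data=f,
        headers={'Content-Type': 'text/plain'},
    ).prepare()
    # prepared.body is the file object itself: it is streamed on send,
    # and requests sets Content-Length from the file size.
```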

2 of 9
64

The requests library has simplified this process: we can use the files argument to signal that we want to upload a multipart-encoded file:

url = 'http://httpbin.org/post'
files = {'file': open('report.xls', 'rb')}

r = requests.post(url, files=files)
r.text
Top answer
1 of 6
31

Reading through the mailing list thread linked to by systempuntoout, I found a clue towards the solution.

The mmap module lets you open a file so that it acts like a string: parts of the file are loaded into memory on demand.

Here's the code I'm using now:

import urllib2
import mmap

# Open the file as a memory mapped string. Looks like a string, but 
# actually accesses the file behind the scenes. 
f = open('somelargefile.zip','rb')
mmapped_file_as_string = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)

# Do the request
request = urllib2.Request(url, mmapped_file_as_string)
request.add_header("Content-Type", "application/zip")
response = urllib2.urlopen(request)

#close everything
mmapped_file_as_string.close()
f.close()
2 of 6
5

The documentation doesn't say you can do this, but the code in urllib2 (and httplib) accepts any object with a read() method as data. So using an open file seems to do the trick.

You'll need to set the Content-Length header yourself. If it's not set, urllib2 will call len() on the data, which file objects don't support.

import os.path
import urllib2

# Open in binary mode and set Content-Length ourselves.
data = open(filename, 'rb')
headers = {'Content-Length': os.path.getsize(filename)}
request = urllib2.Request(url, data, headers)
response = urllib2.urlopen(request)

This is the relevant code that handles the data you supply. It's from the HTTPConnection class in httplib.py in Python 2.7:

def send(self, data):
    """Send `data' to the server."""
    if self.sock is None:
        if self.auto_open:
            self.connect()
        else:
            raise NotConnected()

    if self.debuglevel > 0:
        print "send:", repr(data)
    blocksize = 8192
    if hasattr(data,'read') and not isinstance(data, array):
        if self.debuglevel > 0: print "sendIng a read()able"
        datablock = data.read(blocksize)
        while datablock:
            self.sock.sendall(datablock)
            datablock = data.read(blocksize)
    else:
        self.sock.sendall(data)
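The same read/send loop, sketched in Python 3 against any object with a sendall() method (the send_readable name and sock argument are stand-ins, not part of the stdlib):

```python
def send_readable(sock, data, blocksize=8192):
    """Python 3 sketch of httplib's send(): stream any object with a
    read() method to `sock` in fixed-size blocks; send bytes as-is."""
    if hasattr(data, 'read'):
        while True:
            block = data.read(blocksize)
            if not block:
                break
            sock.sendall(block)
    else:
        sock.sendall(data)
```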