FastAPI and Uvicorn are running synchronously and very slowly

See original GitHub issue

I’m new to FastAPI and I’m testing file uploads and asynchronous requests. However, when I perform several requests with clients in parallel and in serial, FastAPI processes each upload in a queue (synchronously) and very slowly. I’m running the API with uvicorn and with gunicorn, each with 1 worker. With both, the time spent was the same.

My client sends 4 files of approximately 20 MB each, in parallel (or in serial), to the FastAPI endpoint; however, it stores the files one at a time and very slowly.

I made the same upload against an aiohttp endpoint and the files were stored in approximately 0.6 seconds with the client making requests in parallel (multiprocessing) and 0.8 seconds with the client making requests in serial (on average). When I made these uploads against FastAPI, the files were stored in approximately 13 seconds with parallel requests and 15 seconds with serial requests (on average).

I would like to know if I’m doing anything wrong.

Server Code


# app.py

import os
import random

import aiofiles
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

STORAGE_PATH = 'storage'
os.makedirs(STORAGE_PATH, exist_ok=True)  # make sure the target directory exists

@app.post("/")
async def read_root(file: UploadFile = File('teste')):
    # pick a (mostly) unique name for the stored copy
    fpath = os.path.join(
        STORAGE_PATH, f'{random.randint(0, 5000)}_{file.filename}'
    )
    # read the whole upload into memory, then write it out asynchronously
    async with aiofiles.open(fpath, 'wb') as f:
        content = await file.read()
        await f.write(content)

    return {"Test": "Test"}


# uvicorn app:app --host 0.0.0.0 --port 8000 --workers 1
# gunicorn -w=1 -k=uvicorn.workers.UvicornWorker --bind=0.0.0.0:8000 app:app

Client Code

import requests
from datetime import datetime
from multiprocessing import Pool

FILES = ['f1.txt', 'f2.txt', 'f3.txt', 'f4.txt']

def request(fname):
    # upload one file as multipart/form-data
    with open(fname, 'rb') as fh:
        files = {'file': fh}
        requests.post("http://localhost:8000/", files=files)


def req_mp():
    # four uploads in parallel, one process per file
    start = datetime.now()
    pool = Pool(4)
    pool.map(request, FILES)
    print(datetime.now() - start)


def req_serial():
    # the same four uploads, one after another
    start = datetime.now()
    for fn in FILES:
        request(fn)
    print(datetime.now() - start)
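
The aiohttp endpoint used for the comparison is not shown in the issue; the following is only a minimal sketch of what such an endpoint might look like (the handler name and the file-naming scheme are assumptions, mirroring the FastAPI endpoint above):

# app_aiohttp.py  (hypothetical reconstruction of the comparison endpoint)

import os
import random

import aiofiles
from aiohttp import web

STORAGE_PATH = 'storage'

async def upload(request):
    reader = await request.multipart()   # stream the multipart body
    part = await reader.next()           # the single 'file' field
    fpath = os.path.join(
        STORAGE_PATH, f'{random.randint(0, 5000)}_{part.filename}'
    )
    async with aiofiles.open(fpath, 'wb') as f:
        while True:
            chunk = await part.read_chunk()  # 8 KiB per chunk by default
            if not chunk:
                break
            await f.write(chunk)
    return web.json_response({"Test": "Test"})

app = web.Application()
app.add_routes([web.post('/', upload)])

# python app_aiohttp.py
if __name__ == '__main__':
    web.run_app(app, host='0.0.0.0', port=8000)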

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 20 (7 by maintainers)

Top GitHub Comments

3 reactions
obataku commented, May 7, 2020

@igor-rodrigues1 sanic always reads files entirely into memory before passing control to your handler, whereas FastAPI (via Starlette) uses a tempfile.SpooledTemporaryFile and thus rolls the data over onto disk once it grows beyond 1 MB. If you override the default spooling threshold for UploadFile like

from starlette.datastructures import UploadFile as StarletteUploadFile

# keep the SpooledTemporaryFile in-memory
StarletteUploadFile.spool_max_size = 0

… I think you may see performance somewhat similar to that of sanic & uvicorn, as it should eliminate the extra file I/O round trip.
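
Independently of the spooling threshold, the handler itself can also avoid holding the whole upload in memory by copying it in fixed-size chunks instead of a single file.read(); a minimal sketch (the route path and the 1 MiB chunk size are arbitrary choices, not from the thread):

# chunked copy of an UploadFile; keeps the handler's memory use bounded
import os

import aiofiles
from fastapi import FastAPI, File, UploadFile

app = FastAPI()
STORAGE_PATH = 'storage'

@app.post("/stream")
async def store_streamed(file: UploadFile = File(...)):
    fpath = os.path.join(STORAGE_PATH, file.filename)
    async with aiofiles.open(fpath, 'wb') as f:
        while True:
            chunk = await file.read(1024 * 1024)  # read 1 MiB at a time
            if not chunk:
                break
            await f.write(chunk)
    return {"stored": file.filename}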

2 reactions
PhilippHomann commented, Jun 29, 2020

Found streaming_form_data, which does multipart/form-data parsing much faster because it uses Cython. In his blog the author writes that byte-wise parsing in pure Python is slow, which is why multipart parsing is slow in the current implementation of Starlette.

Created a gist with an example.
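
The gist itself is not reproduced here; the following is only a rough sketch along the same lines, wiring streaming_form_data into a FastAPI endpoint via the raw request stream (the route path and target filename are assumptions):

# parse the multipart body with streaming_form_data instead of Starlette's parser
from fastapi import FastAPI, Request
from streaming_form_data import StreamingFormDataParser
from streaming_form_data.targets import FileTarget

app = FastAPI()

@app.post("/upload")
async def upload(request: Request):
    # the parser needs the request headers to find the multipart boundary
    parser = StreamingFormDataParser(headers=request.headers)
    # stream the part named 'file' directly to disk as it is parsed
    parser.register('file', FileTarget('storage/uploaded_file'))
    async for chunk in request.stream():
        parser.data_received(chunk)
    return {"Test": "Test"}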

Read more comments on GitHub >
