Forums

Large File Uploading

Hi, so inside my web app I have an audio recorder which records user audio input with JS and sends it as a .wav blob to my Django view, and inside my Django view this file is uploaded to the PA media directory.

Being .wav files, these easily exceed the PA 100 MiB limit on file uploads. I have read about options to overcome this on this page: https://help.pythonanywhere.com/pages/UploadingAndDownloadingFiles/

From the options on that page, I don't think the splitting and re-joining will work for me, as that would have to be done on my PC, whereas in my web app the file creation and upload all happen on the website.

I'm still not quite sure how the SFTP option works, but would it be able to facilitate this file upload somehow? Like, when a user records audio and saves, can the upload from the Django view to the PA media directory be handled with SFTP?

Hmm, that's a tricky one. I don't think you could use SFTP, because that requires your PythonAnywhere username and password, which you'd obviously not want to put on your site in JavaScript.

Could you perhaps split the WAVs using JS embedded in your site? A quick search for "javascript split audio file" came up with this project, and several others, so perhaps you could use them?

Hi, sorry for the late response. I just realised that my files don't actually need to be more than 5 minutes long, which means they definitely shouldn't exceed the 100 MiB limit — in fact, the ones that are failing to upload are much, much smaller than that.

For example, when my application records and uploads 1-second files that are under 100 KB, these upload successfully, both to PA media storage and to an AWS S3 bucket which I've linked. But as soon as I go above even 100 KB, the files stop uploading and they don't appear.

Do you know what the problem is here? Shouldn't these files, which are way below the limit, upload?

EDIT:

So upon looking in my server log I found these:

  • SIGPIPE: writing to a closed pipe/socket/fd (probably the client disconnected) on request
  • uwsgi_response_write_headers_do(): Broken pipe [core/writer.c line 248] during POST

And in my error log I found this: OSError: write error

Now the odd thing is that these occur on the SUCCESSFUL uploads (the <100 KB files I mentioned above). For the unsuccessful uploads of files >100 KB, there aren't any messages in the logs at all.

What code are you using to upload the files? Those errors would normally indicate something disconnecting the client side from the server before the server was able to send all of the data back. Perhaps some kind of client-side timeout could be causing it?

Yeah, I'm not quite sure where the timeout is coming from — it could be a uWSGI setting of some sort.

I've managed to fix my problem by uploading directly from my JavaScript to AWS, rather than transferring to the Python side and uploading from there.

I switched to Heroku and followed this tutorial: https://devcenter.heroku.com/articles/s3-upload-python

But the tutorial should work with PythonAnywhere as well if anyone gets stuck on a similar problem.

Thanks for the help!