The DB has about 3.5M items (= rows), with indexes and foreign key relations, and its total size is about 1GB. I'm going to migrate to MySQL in order to improve my website's loading times.
After a bit of problem-solving, I seem to have two options:
1) Run manage.py loaddata datadump.json in the PAW bash console, with the datadump stored on the server (see the first sketch below). In this case the dump needs to be in JSON Lines format because of memory limitations. However, this will most likely take a lot of time and CPU, and my website will be down while the migration takes place.
2) Use SSH tunneling and run the loaddata command locally against the remote database (see the second sketch below). This would take about 60 hours, but the website would remain available during that time.
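
For option 1, here's a minimal sketch of what I have in mind, assuming Django 3.2+ (which added the jsonl serializer); the file and exclude names are just examples:

    # Dump everything as JSON Lines so loaddata can stream it record by
    # record instead of parsing one huge JSON document in memory.
    python manage.py dumpdata \
        --format jsonl \
        --natural-foreign --natural-primary \
        --exclude contenttypes --exclude auth.permission \
        --output datadump.jsonl

    # After uploading datadump.jsonl to the server and pointing
    # settings.py at the MySQL database:
    python manage.py migrate              # create the empty schema first
    python manage.py loaddata datadump.jsonl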
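For option 2, a rough sketch of the tunnel setup; the hostnames are placeholders (use the ones from the hosting provider's docs), and the Django commands are the same as above, just run from my local machine:

    # Terminal 1: forward local port 3306 to the remote MySQL server and
    # keep the connection alive for a long-running load.
    ssh -o ServerAliveInterval=60 \
        -L 3306:remote-mysql-host:3306 youruser@your-ssh-host

    # Terminal 2: with the local settings.py DATABASES entry pointing at
    # HOST=127.0.0.1, PORT=3306, load the data through the tunnel.
    python manage.py migrate
    python manage.py loaddata datadump.jsonl

The trade-off as I understand it: option 1 does all the work on the server (fast disk access, but CPU use and downtime there), while option 2 pushes every row over the tunnel (slow, hence the ~60 hours, but the live site keeps serving from the old database until I switch over).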
I'm just interested in what more experienced users recommend, and whether there's something I haven't thought of.