Treading lightly for MySQL initial data-load.

Hello.

I have a JSON file that I've built for loading via django-import-export.

It works fine, but I want to tread much more carefully, as I can only push about 500 rows at a time--I think the Django instance is yielding (which is fine; I'm a free user just testing this out for now).

Would a mysqlimport of an uploaded CSV file be lighter? It seems so, as it'll just involve the MySQL process, but I'd really like to tread lightly and optimize the data load if I can. Five columns of data are in use, and the data is sparse.

I'd appreciate any feedback or comments anyone would have!

Not going through Django certainly may be faster. However, you may be prematurely optimizing here; how many times are you going to have to load initial data into MySQL anyway?

Well, I'm not so much optimizing as trying to keep my process from yielding (i.e. being killed). Fair point, though.

This is a one-shot data-load.

I'm new to Django but not to MySQL and friends. 35K rows shouldn't be excessive, but I can break this up into batches if it'd be better for PA. As I indicated, I'm primarily trying to tread lightly here. (On my dev box I just vomited this thing in its entirety; no big deal.)
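If batching does turn out to be the safer route, chunking the 35K rows into the 500-row batches that were already working is straightforward. A sketch; the model name and bulk_create usage are illustrative, not taken from the original setup:

```python
def batched(rows, size=500):
    """Yield successive fixed-size batches from a list of rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

# In a Django shell this could drive bulk_create, e.g. (hypothetical model):
# for batch in batched(rows, 500):
#     MyModel.objects.bulk_create([MyModel(**row) for row in batch])
```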

Oh, I see. I thought you meant the Python keyword yield. Hmm, not sure about it in this case. What was the error you were seeing when Django got killed? Could it have been a RAM constraint, i.e. getting killed because you are trying to put too much into memory?