
Schedule task that saves data into DB

Hi - I'm new to scheduled tasks. I need to run a task that makes an API request and then saves the data to a database. I understand how to create the task in PA that runs the script and retrieves the data, but I am missing the "copy the retrieved data into the DB" part. Any guidance on how to do this would be appreciated.

That all depends on which database you're using -- is it MySQL, SQLite or Postgres? Are you using a web framework like Django, or anything else that provides an ORM (like SQLAlchemy)?

Right now I only have the script that calls the API and returns the data. What I'm going to do is set up a Django project and create a view that accepts the data via a POST request and then saves it to the DB. Not sure if this is the most optimal way of doing it, but that's what I can think of for now.

It would be something like:

1) A script in the scheduled task calls the external API that returns the data. This is done periodically.
2) In the same script I call my own Django view with a POST request and save the data from point 1 in my own DB (rough sketch below).
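
A rough sketch of what I mean - the URLs here are just placeholders, not anything from a real project:

# Rough sketch of the two steps above. Both URLs are placeholders.
import requests

EXTERNAL_API_URL = "https://api.example.com/data"                    # the external API (placeholder)
MY_DJANGO_VIEW = "https://myusername.pythonanywhere.com/save-data/"  # my own Django view (placeholder)

def main():
    # 1) Call the external API that returns the data.
    response = requests.get(EXTERNAL_API_URL, timeout=30)
    response.raise_for_status()
    data = response.json()

    # 2) POST the data to my own Django view, which saves it to the DB.
    result = requests.post(MY_DJANGO_VIEW, json=data, timeout=30)
    result.raise_for_status()

if __name__ == "__main__":
    main()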

That would work, but I think it would be easier to write a custom Django management command. If you do that, then you'll be able to run something like

cd <your project directory>; workon <your virtualenv>; python manage.py <the name of your management command>

...replacing the stuff in <>s, of course. Inside the management command you can put any Python code, including code that uses Django's ORM, so you can just create model objects and save them to the database.
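
For example, a minimal command could look roughly like this - the app name, the model and the URL are just placeholders for whatever your project actually uses:

# yourapp/management/commands/fetch_api_data.py  (all names are placeholders)
import requests
from django.core.management.base import BaseCommand

from yourapp.models import ApiRecord  # a hypothetical model


class Command(BaseCommand):
    help = "Fetch data from the external API and save it to the database."

    def handle(self, *args, **options):
        # Call the external API (placeholder URL).
        response = requests.get("https://api.example.com/data", timeout=30)
        response.raise_for_status()

        # Save each item with the normal Django ORM -- no POST to your
        # own site needed.
        for item in response.json():
            ApiRecord.objects.create(value=item)

        self.stdout.write(self.style.SUCCESS("Done."))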

Here are the docs.

Hi - I understand. However, how would I go about running that management command on a daily basis? I'm not sure how to do this (I don't think it's via the Tasks section).

OK, I think I got it. For anyone who may end up here, this is the command line to use when creating the task:

/home/yourpythonanywhereusername/.virtualenvs/yourvenv/bin/python /home/yourpythonanywhereusername/yourprojectname/manage.py <your management command name>

Thank you, giles. This is certainly better than making calls everywhere. Also, my previous solution wasn't working because the request was timing out after five minutes, and the script required much more time than that.

That's correct. Glad to hear that you sorted that!