Task failed to start. Too many processes/threads or RuntimeError: can't start new thread

I'm getting a couple of unfortunate errors

Traceback (most recent call last):
  File "/home/tjw0000/.local/lib/python3.6/site-packages/apscheduler/schedulers/base.py", line 958, in _process_jobs
    executor.submit_job(job, run_times)
  File "/home/tjw0000/.local/lib/python3.6/site-packages/apscheduler/executors/base.py", line 71, in submit_job
    self._do_submit_job(job, run_times)
  File "/home/tjw0000/.local/lib/python3.6/site-packages/apscheduler/executors/pool.py", line 22, in _do_submit_job
    f = self._pool.submit(run_job, job, job._jobstore_alias, run_times, self._logger.name)
  File "/usr/lib/python3.6/concurrent/futures/thread.py", line 115, in submit
    self._adjust_thread_count()
  File "/usr/lib/python3.6/concurrent/futures/thread.py", line 134, in _adjust_thread_count
    t.start()
  File "/usr/lib/python3.6/threading.py", line 846, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't start new thread


2017-12-27 18:23:22 -- Task failed to start. Too many processes/threads


These are in my Task log.

I'm running 8 instances of the same script, each executing slightly different code. The scripts run hourly, stay running for 56 minutes, and then shut down.

About half of the scripts give me one of the above errors. I can't tell why those ones throw the errors while the others don't (nor, for that matter, why I get two different thread errors).

Is there a setting I can increase, or something else I can do, to avoid these thread errors? The script is really lightweight; it just happens to be always running and potentially hogging threads (I don't know much about that part).

We limit users to 128 concurrent processes, so that would mean each of your 8 scripts is starting 16 subprocesses/threads? That seems a little excessive! Can you find out what's going on there? It will be even worse if all 8 scripts don't finish before the next 8 start up.

I wonder if switching to our "always-on tasks" beta might help? https://help.pythonanywhere.com/pages/DebuggingStaticFiles/ - let me know if you want to give that a try...

We limit users to 128 concurrent processes, so that would mean that each of your 8 scripts is starting 16 subprocesses/threads? Can you find out what's going on there?

Could you elaborate on how to check this?

I wonder if switching to our "always-on tasks" beta might help? https://help.pythonanywhere.com/pages/DebuggingStaticFiles/ - let me know if you want to give that a try...

I emailed the link on there, I would like to give that a try.

There's not much we can tell you about your code. I would suggest looking at your code to work out what's going on.

Well, my code doesn't run into this problem when running on a MacBook Pro; it's something specific to PythonAnywhere, which is why I was hoping for further advice from you, or tips on getting visibility into my thread usage.
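For anyone looking for that kind of visibility: the stdlib `threading` module can report on the threads alive inside a running process. This is a minimal sketch of a helper you could call periodically from the script itself (the function name `report_threads` is just illustrative):

```python
import threading


def report_threads():
    """Print how many threads this process currently has, and what they are."""
    threads = threading.enumerate()
    print(f"{threading.active_count()} live threads:")
    for t in threads:
        print(f"  - {t.name} (daemon={t.daemon})")


report_threads()
```

If the count climbs over time rather than staying flat, something (e.g. a scheduler's worker pool) is creating threads faster than they finish.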

I see you're using a package called "apscheduler" -- it looks like that's what's starting the threads. Maybe you can look into its configuration settings; perhaps there's a way of limiting the total number of threads it uses?
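For context, apscheduler's default executor is a thread pool built on the stdlib `concurrent.futures` module -- the same `thread.py` that appears in the traceback above -- and the pool's `max_workers` argument is the knob that caps how many threads get created. A minimal stdlib sketch of the idea (the apscheduler-specific configuration in the trailing comment is an untested assumption about its 3.x API):

```python
from concurrent.futures import ThreadPoolExecutor

# Cap the pool at 4 worker threads: no matter how many jobs are
# submitted, the pool never starts more than max_workers threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(lambda n=n: n * n) for n in range(20)]
    results = [f.result() for f in futures]

print(results[:5])

# The equivalent knob in apscheduler 3.x would look something like
# (sketch, not verified against your version):
#
#   from apscheduler.schedulers.blocking import BlockingScheduler
#   from apscheduler.executors.pool import ThreadPoolExecutor as APSThreadPool
#   scheduler = BlockingScheduler(executors={'default': APSThreadPool(max_workers=4)})
```

With a bounded pool, excess jobs queue up instead of triggering fresh `t.start()` calls, which is what hits the process/thread limit in the traceback.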