Forums

scheduling a task for a spider under a virtualenv having issues

I am working on a project that was developed with the Scrapy framework on behalf of my client. My client has a server on PythonAnywhere. I am having the following issues, which I cannot seem to resolve.

Any insight into the issue is much appreciated in advance.

Case 1: virtualenvwrapper was used to create the virtualenv, and the instructions on the PythonAnywhere blog were followed when installing Scrapy into it.

Case 2: the script runs fine from the console.

Case 3: the problem occurs when the following script is run as a scheduled task via the PythonAnywhere web interface.

Case 4: below is the script he wants to schedule:

run_lycabundles_spider.sh

#!/bin/bash  
source $WORKON_HOME/scraping_test/bin/activate
cd ~/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack  
scrapy crawl lycamobile
deactivate

Under "Add a new scheduled task:", the new task is added as:

/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh

What I get is the following:

/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh: line 3: /scraping_test/bin/activate: No such file or directory

2016-01-19 13:19:07+0000 [scrapy] INFO: Scrapy 0.16.5 started (bot: lycamobile_spider)
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 131, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 76, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 138, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 43, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/command.py", line 33, in crawler
    self._crawler.configure()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 38, in configure
    self.extensions = ExtensionManager.from_crawler(self)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 50, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 31, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/contrib/feedexport.py", line 163, in from_crawler
    o = cls(crawler.settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/contrib/feedexport.py", line 143, in __init__
    self.exporters = self._load_components('FEED_EXPORTERS')
  File "/usr/local/lib/python2.7/dist-packages/scrapy/contrib/feedexport.py", line 201, in _load_components
    d[k] = load_object(v)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 39, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'lycamobile_stack.exporters.LycamobileItemExporter': No module named exporters

/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh: line 10: deactivate: command not found

2016-01-19 13:19:07 -- Completed task, took 1.00 seconds, return code was 127.

$WORKON_HOME is not defined when your script runs, so you're trying to activate a virtualenv that doesn't exist:

/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh: line 3: /scraping_test/bin/activate: No such file or directory
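You can reproduce that empty expansion directly in a shell (a minimal sketch, unrelated to any particular server):

```shell
#!/bin/bash
# With WORKON_HOME unset, "$WORKON_HOME" expands to an empty string,
# so the path in the failing source line collapses to /scraping_test/bin/activate.
unset WORKON_HOME
echo "$WORKON_HOME/scraping_test/bin/activate"
# prints: /scraping_test/bin/activate
```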

Define WORKON_HOME or just use the full path to the activate script for your virtualenv.
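A sketch of the full-path approach, assuming the virtualenv lives at /home/ratepath/.virtualenvs/scraping_test as the paths in the log suggest (this is environment-specific and untested here):

```shell
#!/bin/bash
# Use the full path to the virtualenv's activate script instead of
# relying on $WORKON_HOME, which is not set for scheduled tasks.
source /home/ratepath/.virtualenvs/scraping_test/bin/activate
cd /home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack
scrapy crawl lycamobile
deactivate
```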

Thank you so much for taking the time to respond to my queries. This was a big mistake on my part. I will update once I test the script. Thanks a lot again.

Hi Glenn, just to update you: the fix you provided worked out of the box for scheduling the task on the PythonAnywhere server. I'm very happy to see you respond so promptly; you truly saved my day. Thank you again for the fix.

:)