Do you want to run your Scrapy spiders without paying for a server? Do you want a clear picture of the generated logs? Do you want to run multiple spiders at once, or set up cron jobs? I have used Scrapinghub to do all of this for free.
Steps:
1. Create a free account here: scrapinghub.com
2. Create a project for your spiders
3. Go to the deploy and run spiders section
4. Note down the project ID and API key shown there
5. Install the command-line tool shub using pip:
pip install shub
6. Go to the root directory of your Scrapy project
7. Log in to Scrapinghub:
shub login
8. Enter your API key and the project ID (enter Y to remember the project, or N to enter the ID on each deployment)
9. Deploy the latest version of the project:
shub deploy
From then on, after each change to the project, you will need to run shub deploy again.
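The steps above can be condensed into a short shell session. This is only a sketch: the project directory name is hypothetical, and the API key and project ID are the ones you noted from your own Scrapinghub dashboard.

```shell
# Install the Scrapinghub command-line client
pip install shub

# Run the remaining commands from the root of your Scrapy project,
# i.e. the directory that contains scrapy.cfg
cd my_scrapy_project   # hypothetical directory name

# Log in once; paste your API key from the dashboard when prompted
shub login

# First deployment: enter your project ID when prompted
shub deploy

# Re-deploy after every change to the project
shub deploy
```

If you answer Y at the prompt, shub saves the project ID to a scrapinghub.yml file in the project root, so subsequent deployments no longer ask for it.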