Unknown Command: Crawl Error while using Scrapy

Posted by Agung Pambudi
Something is missing from your setup: the scrapy.cfg file.

You should run the scrapy crawl spider_name command from inside a Scrapy project folder, i.e. the directory where scrapy.cfg is located.



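For reference, this is roughly what the failure looks like when scrapy crawl is invoked outside a project directory (a sketch only; the exact wording may vary between Scrapy versions, and the spider name is an illustrative assumption):

$ cd /tmp
$ scrapy crawl craigslist
Unknown command: crawl

Use "scrapy" to see available commands
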
You have to execute it in the folder created by startproject. Scrapy exposes additional commands once it finds your scrapy.cfg file. You can see the difference here:

$ scrapy startproject craigslist_sample
$ cd craigslist_sample/
$ ls
craigslist_sample  scrapy.cfg
$ scrapy
Scrapy 0.12.0.2536 - project: craigslist_sample

Usage:
  scrapy <command> [options] [args]

Available commands:
  crawl         Start crawling from a spider or URL
  deploy        Deploy project in Scrapyd target
  fetch         Fetch a URL using the Scrapy downloader
  genspider     Generate new spider using pre-defined templates
  list          List available spiders
  parse         Parse URL (using its spider) and print the results
  queue         Deprecated command. See Scrapyd documentation.
  runserver     Deprecated command. Use 'server' command instead
  runspider     Run a self-contained spider (without creating a project)
  server        Start Scrapyd server for this project
  settings      Get settings values
  shell         Interactive scraping console
  startproject  Create new project
  version       Print Scrapy version
  view          Open URL in browser, as seen by Scrapy

Use "scrapy <command> -h" to see more info about a command


$ cd ..
$ scrapy
Scrapy 0.12.0.2536 - no active project

Usage:
  scrapy <command> [options] [args]

Available commands:
  fetch         Fetch a URL using the Scrapy downloader
  runspider     Run a self-contained spider (without creating a project)
  settings      Get settings values
  shell         Interactive scraping console
  startproject  Create new project
  version       Print Scrapy version
  view          Open URL in browser, as seen by Scrapy

Use "scrapy <command> -h" to see more info about a command
