
I am new to Scrapy and am following the tutorial, but I get an error when running the crawl command. I am new to Python as well, running 64-bit Python 2.7.2 on 64-bit Windows 7. I followed the tutorial, installed Scrapy on my machine, and then created a project, demoz. But when I type scrapy crawl demoz, an error is shown:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz 
2013-08-29 16:10:45+0800 [scrapy] INFO: Scrapy 0.18.1 started (bot: tutorial) 
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Optional features available: ssl, http11
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 168, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\commands\crawl.py", line 46, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\command.py", line 34, in crawler
    self._crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\crawler.py", line 44, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\core\engine.py", line 61, in __init__
    self.scheduler_cls = load_object(self.settings['SCHEDULER'])
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\utils\misc.py", line 40, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'scrapy.core.scheduler.Scheduler': No module named queuelib

I guess something went wrong during the installation. Can anyone help, please? Thanks in advance.

Answer


Could you please check which name the spider in your project was created with: is it "demoz" or "dmoz"?

In your command you are running:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz 
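
The name that scrapy crawl looks up is the name attribute defined inside the spider class, not the project name. As a rough sketch of what the tutorial's spider looks like in Scrapy 0.18 (file, class, and URLs here follow the official tutorial; your own file may differ):

# tutorial/spiders/dmoz_spider.py (path as in the official tutorial)
from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    # 'scrapy crawl dmoz' matches this value, not the project name 'demoz'
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
    ]

    def parse(self, response):
        # save each downloaded page body to a local file, as the tutorial does
        filename = response.url.split("/")[-2]
        open(filename, 'wb').write(response.body)

With a spider defined like this, scrapy crawl dmoz is the matching command; any other argument would give a "spider not found" error.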

Thank you. Specifying "dmoz" as the spider name worked; the spider name was wrong. Then I downloaded the queuelib module and installed it, which solved the remaining problem. – jone
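
For reference, the queuelib package that the traceback complains about is a separate dependency of Scrapy 0.18 and is available from PyPI (for example via pip install queuelib or easy_install queuelib, whichever your setup provides). A minimal check that it is visible to the interpreter Scrapy runs under (this snippet is only an illustration, not part of the tutorial):

# If this import fails, Scrapy's scheduler cannot be loaded and you get the
# "No module named queuelib" ImportError shown in the traceback above.
import queuelib
print(queuelib.__file__)  # shows where the installed package was found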