Unhandled error in Deferred:
2020-04-13 16:21:35 [twisted] CRITICAL: Unhandled error in Deferred:
2020-04-13 16:21:35 [twisted] CRITICAL:
Traceback (most recent call last):
  File "H:\anaconda3.4\anaconda_\lib\site-packages\twisted\internet\task.py", line 517, in _oneWorkUnit
    result = next(self._iterator)
  File "H:\anaconda3.4\anaconda\lib\site-packages\scrapy\utils\defer.py", line 63, in <genexpr>
    work = (callable(elem, *args, **named) for elem in iterable)
  File "H:\anaconda3.4\anaconda\lib\site-packages\scrapy\core\scraper.py", line 183, in _process_spidermw_output
    self.crawler.engine.crawl(request=output, spider=spider)
  File "H:\anaconda3.4\anaconda\lib\site-packages\scrapy\core\engine.py", line 210, in crawl
    self.schedule(request, spider)
  File "H:\anaconda3.4\anaconda_\lib\site-packages\scrapy\core\engine.py", line 216, in schedule
    if not self.slot.scheduler.enqueue_request(request):
  File "H:\anaconda3.4\anaconda_\lib\site-packages\scrapy\core\scheduler.py", line 57, in enqueue_request
    dqok = self._dqpush(request)
  File "H:\anaconda3.4\anaconda\lib\site-packages\scrapy\core\scheduler.py", line 86, in _dqpush
    self.dqs.push(reqd, -request.priority)
  File "H:\anaconda3.4\anaconda\lib\site-packages\queuelib\pqueue.py", line 35, in push
    q.push(obj) # this may fail (eg. serialization error)
  File "H:\anaconda3.4\anaconda_\lib\site-packages\scrapy\squeues.py", line 15, in push
    s = serialize(obj)
  File "H:\anaconda3.4\anaconda_\lib\site-packages\scrapy\squeues.py", line 27, in _pickle_serialize
    return pickle.dumps(obj, protocol=2)
  File "H:\anaconda3.4\anaconda\lib\site-packages\parsel\selector.py", line 204, in __getstate__
    raise TypeError("can't pickle Selector objects")
TypeError: can't pickle Selector objects
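
The frames above show the error is raised while the scheduler serializes a request to a disk queue (_dqpush -> _pickle_serialize -> pickle.dumps), which only happens when the crawl uses disk queues (typically because JOBDIR is set). Pickling fails because the Request carries a parsel Selector, most often through request.meta. A minimal sketch of a pattern that can trigger this, with a hypothetical spider and made-up selectors, plus the usual fix of passing plain extracted values instead:

    import scrapy

    class ExampleSpider(scrapy.Spider):
        # Hypothetical spider for illustration; names and CSS selectors are made up.
        name = "example"
        start_urls = ["https://example.com/list"]

        def parse(self, response):
            for row in response.css("div.item"):
                link = response.urljoin(row.css("a::attr(href)").get())
                # BAD: "row" is a Selector; when the scheduler pushes this request
                # to the disk queue it calls pickle.dumps on the whole request and
                # raises "can't pickle Selector objects".
                yield scrapy.Request(link, callback=self.parse_detail,
                                     meta={"row": row})

                # FIX: extract plain, picklable values before scheduling, e.g.:
                # yield scrapy.Request(link, callback=self.parse_detail,
                #                      meta={"title": row.css("h2::text").get()})

        def parse_detail(self, response):
            yield {"title": response.meta["row"].css("h2::text").get()}

Running without disk queues (removing JOBDIR) would also make the traceback disappear, but keeping only picklable data in meta/cb_kwargs is the more robust fix if the assumption above matches your spider.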