My Chrome version is 92. When I step through to this point in the debugger, the message below appears, and the issue I reported earlier is still unresolved. Teacher, could you add me on QQ to help track down the problem? My QQ: 123399918
2022-01-08 10:07:09 [undetected_chromedriver.patcher] DEBUG: unzipping C:\Users\Lenovo\AppData\Local\Temp\tmpn6nkfp85
2022-01-08 10:07:09 [undetected_chromedriver.patcher] INFO: patching driver executable C:\Users\Lenovo\appdata\roaming\undetected_chromedriver\chromedriver.exe
2022-01-08 10:07:09 [uc] DEBUG: created a temporary folder in which the user-data (profile) will be stored during this
session, and added it to chrome startup arguments: --user-data-dir=C:\Users\Lenovo\AppData\Local\Temp\tmpwsi09lud
2022-01-08 10:07:09 [uc] DEBUG: did not find a bad exit_type flag
2022-01-08 10:07:09 [scrapy.core.engine] ERROR: Error while obtaining start requests
Traceback (most recent call last):
  File "E:\scrapy\envs\python38\lib\site-packages\scrapy\core\engine.py", line 129, in next_request
    request = next(slot.start_requests)
  File "E:\scrapy\code\ArticleSpider\ArticleSpider\spiders\cnblogs.py", line 15, in start_requests
    browser = uc.Chrome()
  File "E:\scrapy\envs\python38\lib\site-packages\undetected_chromedriver\__init__.py", line 356, in __init__
    self.browser_pid = start_detached(
  File "E:\scrapy\envs\python38\lib\site-packages\undetected_chromedriver\dprocess.py", line 30, in start_detached
    multiprocessing.Process(
  File "d:\anaconda\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "d:\anaconda\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "d:\anaconda\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "d:\anaconda\lib\multiprocessing\popen_spawn_win32.py", line 45, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "d:\anaconda\lib\multiprocessing\spawn.py", line 154, in get_preparation_data
    _check_not_importing_main()
  File "d:\anaconda\lib\multiprocessing\spawn.py", line 134, in _check_not_importing_main
    raise RuntimeError('''
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.
2022-01-08 10:07:09 [scrapy.core.engine] INFO: Closing spider (finished)
2022-01-08 10:07:09 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'elapsed_time_seconds': 16.25926,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2022, 1, 8, 2, 7, 9, 285794),
 'log_count/DEBUG': 5,
 'log_count/ERROR': 1,
 'log_count/INFO': 11,
 'start_time': datetime.datetime(2022, 1, 8, 2, 6, 53, 26534)}
2022-01-08 10:07:09 [scrapy.core.engine] INFO: Spider closed (finished)
2022-01-08 10:07:09 [uc] DEBUG: closing webdriver
2022-01-08 10:07:09 [uc] DEBUG: killing browser
2022-01-08 10:07:09 [uc] DEBUG: successfully removed C:\Users\Lenovo\AppData\Local\Temp\tmpwsi09lud
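For context on what the log shows: this RuntimeError is the standard Windows multiprocessing guard error. uc.Chrome() launches the browser through multiprocessing.Process (via start_detached, as the traceback shows), and on Windows child processes are created with the "spawn" start method, which re-imports the main module. Any process-starting code reachable at the module's top level therefore has to sit behind an `if __name__ == '__main__':` guard in the script that launches the crawl. A minimal sketch of the idiom, assuming the crawl is started from a run script (the script layout and the `crawl()` helper here are hypothetical, not from the course code):

```python
# Minimal sketch of the required Windows entry-point idiom.
# On Windows, multiprocessing uses the "spawn" start method: each
# child process re-imports the main module, so code that starts a
# process (here uc.Chrome(), which calls start_detached ->
# multiprocessing.Process) must only run under the __main__ guard.

def crawl():
    # Hypothetical stand-in for launching the spider; in the real
    # project this would be something like
    #   from scrapy.cmdline import execute
    #   execute(["scrapy", "crawl", "cnblogs"])
    # and uc.Chrome() would then only be created inside
    # start_requests, never at module import time.
    return "crawl started from the main process"

if __name__ == "__main__":
    # Without this guard, the spawned child re-executes the module's
    # top level and raises the RuntimeError shown in the log above.
    print(crawl())
```

If the run script already has this guard, check that uc.Chrome() is not also being created at import time (e.g. as a module-level variable in the spider file), since that code runs again in every spawned child.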