吾爱破解 - 52pojie.cn

Views: 1260 | Replies: 4

[Help] Python Scrapy execution error

l363412580 posted on 2020-1-17 23:44
Hi all, I'm new to Python web scraping. I followed a tutorial and installed all the libraries, but I hit an error when testing. Could someone take a look at what the problem is? Thanks.
I'm not using a virtual environment, just the default interpreter from the official Python installer, located at: c:\program files (x86)\python38-32\
Python version: 3.8.0

Command-line error output:
C:\>scrapy runspider baidu.py
2020-01-17 23:34:36 [scrapy.utils.log] INFO: Scrapy 1.8.0 started (bot: scrapybot)
2020-01-17 23:34:36 [scrapy.utils.log] INFO: Versions: lxml 4.4.1.0, libxml2 2.9.5, cssselect 1.1.0, parsel 1.5.2, w3lib 1.21.0, Twisted 19.10.0, Python 3.8.0 (tags/v3.8.0:fa919fd, Oct 14 2019, 19:21:23) [MSC v.1916 32 bit (Intel)], pyOpenSSL 19.1.0 (OpenSSL 1.1.1d  10 Sep 2019), cryptography 2.8, Platform Windows-10-10.0.18362-SP0
2020-01-17 23:34:36 [scrapy.crawler] INFO: Overridden settings: {'SPIDER_LOADER_WARN_ONLY': True}
2020-01-17 23:34:36 [scrapy.extensions.telnet] INFO: Telnet Password: f9481d79ce1771c0
2020-01-17 23:34:36 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.logstats.LogStats']
Unhandled error in Deferred:
2020-01-17 23:34:37 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\crawler.py", line 184, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\crawler.py", line 188, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "c:\program files (x86)\python38-32\lib\site-packages\twisted\internet\defer.py", line 1613, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "c:\program files (x86)\python38-32\lib\site-packages\twisted\internet\defer.py", line 1529, in _cancellableInlineCallbacks
    _inlineCallbacks(None, g, status)
--- <exception caught here> ---
  File "c:\program files (x86)\python38-32\lib\site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\crawler.py", line 86, in crawl
    self.engine = self._create_engine()
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\crawler.py", line 111, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\core\engine.py", line 69, in __init__
    self.downloader = downloader_cls(crawler)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\core\downloader\__init__.py", line 86, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\utils\misc.py", line 46, in load_object
    mod = import_module(module)
  File "c:\program files (x86)\python38-32\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\downloadermiddlewares\httpproxy.py", line 5, in <module>
    from urllib2 import _parse_proxy
builtins.SyntaxError: invalid syntax (urllib2.py, line 220)

2020-01-17 23:34:37 [twisted] CRITICAL:
Traceback (most recent call last):
  File "c:\program files (x86)\python38-32\lib\site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\crawler.py", line 86, in crawl
    self.engine = self._create_engine()
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\crawler.py", line 111, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\core\engine.py", line 69, in __init__
    self.downloader = downloader_cls(crawler)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\core\downloader\__init__.py", line 86, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\utils\misc.py", line 46, in load_object
    mod = import_module(module)
  File "c:\program files (x86)\python38-32\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "c:\program files (x86)\python38-32\lib\site-packages\scrapy\downloadermiddlewares\httpproxy.py", line 5, in <module>
    from urllib2 import _parse_proxy
  File "c:\program files (x86)\python38-32\lib\site-packages\urllib2.py", line 220
    raise AttributeError, attr
                        ^
SyntaxError: invalid syntax
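For context on the traceback above: Scrapy 1.8's httpproxy.py guards the Python 2 import with a bare "except ImportError" fallback to urllib.request. When a Python-2-only urllib2.py sits in a Python 3 site-packages, the module is *found*, but compiling it raises SyntaxError, which the ImportError handler does not catch. A minimal sketch of that failure mode (the fake module body is an assumption modelled on line 220 shown in the traceback):

```python
import importlib
import os
import sys
import tempfile

# Simulate a stray Python-2 urllib2.py on a Python 3 sys.path.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "urllib2.py"), "w") as f:
        f.write("raise AttributeError, attr\n")  # Python 2 syntax
    sys.path.insert(0, d)
    importlib.invalidate_caches()
    try:
        # This is the same guard pattern Scrapy 1.8 uses.
        try:
            from urllib2 import _parse_proxy          # finds the stray file
        except ImportError:
            from urllib.request import _parse_proxy   # fallback never runs
    except SyntaxError as e:
        # Compilation fails before import succeeds or fails cleanly,
        # so SyntaxError escapes the ImportError handler.
        print("SyntaxError escaped the ImportError handler:", e.msg)
    finally:
        sys.path.remove(d)
```

On a clean Python 3 install the inner import raises ImportError and the fallback runs; only a leftover Python 2 file turns it into the fatal SyntaxError seen above.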


1170 posted on 2020-1-18 00:09

Run the command: scrapy crawl baidu (your spider/project name)
 OP | l363412580 posted on 2020-1-18 00:30
1170 posted on 2020-1-18 00:09
Run the command: scrapy crawl baidu (your spider/project name)

I ran that too and got the same syntax error. It's a fresh empty project created with scrapy.
1170 posted on 2020-1-18 00:59
l363412580 posted on 2020-1-18 00:30
I ran that too and got the same syntax error. It's a fresh empty project created with scrapy.

Post your code here.
 OP | l363412580 posted on 2020-1-18 22:37
1170 posted on 2020-1-18 00:59
Post your code here.

Solved, thanks. My system had two Python versions installed, 2.7 and 3.8; after uninstalling 2.7 it works.
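For anyone hitting the same traceback: uninstalling Python 2.7 presumably worked because it removed the stray urllib2.py that had ended up on Python 3's path. You can check for such a leftover without uninstalling anything; importlib.util.find_spec locates a module without executing it, so this is safe to run:

```python
import importlib.util

# "urllib2" only exists in Python 2; under a clean Python 3 install
# find_spec returns None. If it returns a spec, some package has
# dropped a Python-2 urllib2.py onto sys.path, and spec.origin
# shows exactly which file to remove.
spec = importlib.util.find_spec("urllib2")
if spec is None:
    print("clean: no urllib2 on sys.path")
else:
    print("stray Python 2 module found at:", spec.origin)
```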