==> Cloning scrapy ...
Cloning into 'scrapy'...
==> Collecting packager identity from makepkg.conf
  -> name    : Reproducible Arch Linux tests
  -> email   : reproducible@archlinux.org
  -> gpg-key : undefined
  -> protocol: https
==> Configuring scrapy
:: Synchronizing package databases...
:: Starting full system upgrade...
 there is nothing to do
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
TMPDIR=/tmp/archlinux-ci/
SOURCE_DATE_EPOCH=1701950410
SHELL=/bin/bash
SCHROOT_CHROOT_NAME=jenkins-reproducible-archlinux
no_proxy=localhost,127.0.0.1
SCHROOT_COMMAND=bash -l -c SOURCE_DATE_EPOCH='1701950410' PATH='/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin' TMPDIR='/tmp/archlinux-ci/' printenv 2>&1
SCHROOT_SESSION_ID=archlinux-scrapy-archlinuxrb-build-J6zkxUDh
PWD=/tmp/archlinux-ci/scrapy-archlinuxrb-build-J6zkxUDh/scrapy
LOGNAME=jenkins
SCHROOT_ALIAS_NAME=jenkins-reproducible-archlinux
SCHROOT_GROUP=jenkins
SCHROOT_USER=jenkins
ftp_proxy=http://127.0.0.1:3128
HOME=/var/lib/jenkins
LANG=C
https_proxy=http://127.0.0.1:3128
USER=jenkins
FTP_PROXY=http://127.0.0.1:3128
SHLVL=1
HTTPS_PROXY=http://127.0.0.1:3128
HTTP_PROXY=http://127.0.0.1:3128
SCHROOT_GID=116
http_proxy=http://127.0.0.1:3128
DEBUGINFOD_URLS=https://debuginfod.archlinux.org
SCHROOT_UID=108
HG=/usr/bin/hg
_=/usr/sbin/printenv
==> Making package: scrapy 2.9.0-1 (Thu Dec 7 12:00:42 2023)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...
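The printenv dump above shows SOURCE_DATE_EPOCH=1701950410 exported into the build chroot. As a rough, hedged illustration of the convention that variable follows (this sketch is not part of the log and is not how makepkg itself is implemented; the helper name is made up), a Python build step aiming for reproducible output would read it and fall back to the wall clock only when it is unset:

```python
import os
import time

def build_timestamp() -> int:
    # Prefer the pinned SOURCE_DATE_EPOCH so timestamps embedded in build
    # artifacts do not change between rebuilds; fall back to "now" otherwise.
    return int(os.environ.get("SOURCE_DATE_EPOCH", int(time.time())))
```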
warning: dependency cycle detected:
warning: python-incremental will be installed before its python-twisted dependency

Packages (50) python-annotated-types-0.6.0-1  python-attrs-23.1.0-1  python-autocommand-2.2.2-4
              python-automat-22.10.0-3  python-cffi-1.16.0-1  python-charset-normalizer-3.3.2-1
              python-click-8.1.7-1  python-constantly-15.1.0-13  python-fastjsonschema-2.19.0-1
              python-filelock-3.12.4-1  python-hyperlink-21.0.0-5  python-idna-3.4-3
              python-incremental-22.10.0-3  python-inflect-7.0.0-2  python-jaraco.context-4.3.0-3
              python-jaraco.functools-3.9.0-1  python-jaraco.text-3.11.1-3  python-jmespath-1.0.1-2
              python-more-itertools-10.1.0-1  python-ordered-set-4.1.0-4  python-platformdirs-4.1.0-1
              python-ply-3.11-12  python-pyasn1-0.4.8-8  python-pyasn1-modules-0.2.8-7
              python-pycparser-2.21-5  python-pydantic-2.5.2-1  python-pydantic-core-1:2.14.5-1
              python-requests-2.31.0-1  python-requests-file-1.5.1-6  python-tomli-2.0.1-3
              python-trove-classifiers-2023.11.29-1  python-typing_extensions-4.8.0-1
              python-validate-pyproject-0.13-1  python-cryptography-41.0.7-1  python-cssselect-1.2.0-3
              python-itemadapter-0.8.0-2  python-itemloaders-1.0.6-2  python-lxml-4.9.2-3
              python-packaging-23.2-1  python-parsel-1.7.0-2  python-protego-0.2.1-2
              python-pydispatcher-2.0.7-1  python-pyopenssl-23.3.0-1  python-queuelib-1.6.2-3
              python-service-identity-23.1.0-1  python-setuptools-1:68.2.2-1  python-tldextract-5.1.1-1
              python-twisted-22.10.0-3  python-w3lib-2.1.2-1  python-zope-interface-6.1-1

Total Download Size:   14.21 MiB
Total Installed Size:  94.97 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
checking available disk space...
:: Processing package changes...
installing python-ply...
installing python-pycparser...
installing python-cffi...
Optional dependencies for python-cffi
    python-setuptools: "limited api" version checking in cffi.setuptools_ext [pending]
installing python-cryptography...
installing python-cssselect...
installing python-itemadapter...
installing python-w3lib...
installing python-lxml...
Optional dependencies for python-lxml
    python-beautifulsoup4: support for beautifulsoup parser to parse not well formed HTML
    python-cssselect: support for cssselect [installed]
    python-html5lib: support for html5lib parser
    python-lxml-docs: offline docs
installing python-more-itertools...
installing python-jaraco.functools...
installing python-jaraco.context...
installing python-autocommand...
installing python-annotated-types...
installing python-typing_extensions...
installing python-pydantic-core...
installing python-pydantic...
Optional dependencies for python-pydantic
    mypy: for type validation with mypy
    python-dotenv: for .env file support
    python-email-validator: for email validation
    python-hypothesis: for hypothesis plugin when using legacy v1
installing python-inflect...
installing python-jaraco.text...
installing python-ordered-set...
installing python-packaging...
installing python-platformdirs...
installing python-tomli...
installing python-fastjsonschema...
installing python-trove-classifiers...
installing python-validate-pyproject...
installing python-setuptools...
installing python-parsel...
installing python-jmespath...
installing python-itemloaders...
installing python-protego...
installing python-pydispatcher...
installing python-pyopenssl...
installing python-queuelib...
installing python-attrs...
installing python-pyasn1...
installing python-pyasn1-modules...
installing python-service-identity...
Optional dependencies for python-service-identity
    python-idna: for Internationalized Domain Names support [pending]
installing python-idna...
installing python-charset-normalizer...
installing python-requests...
Optional dependencies for python-requests
    python-chardet: alternative character encoding library
    python-pysocks: SOCKS proxy support
installing python-requests-file...
installing python-filelock...
installing python-tldextract...
installing python-zope-interface...
installing python-constantly...
installing python-click...
installing python-incremental...
installing python-automat...
Optional dependencies for python-automat
    python-graphviz: for automat-visualize
    python-twisted: for automat-visualize [pending]
installing python-hyperlink...
installing python-twisted...
Optional dependencies for python-twisted
    python-pyopenssl: for TLS client hostname verification [installed]
    python-service-identity: for TLS client hostname verification [installed]
    python-idna: for TLS client hostname verification [installed]
    python-cryptography: for using conch [installed]
    python-pyasn1: for using conch [installed]
    python-appdirs: for using conch
    python-bcrypt: for using conch
    python-h2: for http2 support
    python-priority: for http2 support
    python-pyserial: for serial support
    tk: for using tkconch
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Packages (4) python-pyproject-hooks-1.0.0-5  python-build-1.0.3-1  python-installer-0.7.0-3
             python-wheel-0.40.0-3

Total Download Size:   0.32 MiB
Total Installed Size:  1.87 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
checking available disk space...
:: Processing package changes...
installing python-pyproject-hooks...
installing python-build...
Optional dependencies for python-build
    python-virtualenv: Use virtualenv for build isolation
installing python-installer...
installing python-wheel...
Optional dependencies for python-wheel
    python-keyring: for wheel.signatures
    python-xdg: for wheel.signatures
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Retrieving sources...
  -> Downloading scrapy-2.9.0.tar.gz...
==> Validating source files with sha512sums...
    scrapy-2.9.0.tar.gz ...
Passed
==> Validating source files with b2sums...
    scrapy-2.9.0.tar.gz ... Passed
==> Extracting sources...
  -> Extracting scrapy-2.9.0.tar.gz with bsdtar
==> Starting build()...
* Getting build dependencies for wheel...
:3: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
running egg_info
creating Scrapy.egg-info
writing Scrapy.egg-info/PKG-INFO
writing dependency_links to Scrapy.egg-info/dependency_links.txt
writing entry points to Scrapy.egg-info/entry_points.txt
writing requirements to Scrapy.egg-info/requires.txt
writing top-level names to Scrapy.egg-info/top_level.txt
writing manifest file 'Scrapy.egg-info/SOURCES.txt'
reading manifest file 'Scrapy.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'INSTALL'
warning: no files found matching 'requirements-*.txt'
warning: no files found matching 'license.txt' under directory 'scrapy'
no previously-included directories found matching 'docs/build'
warning: no files found matching '*' under directory 'bin'
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS'
writing manifest file 'Scrapy.egg-info/SOURCES.txt'
* Building wheel...
:3: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
running bdist_wheel
running build
running build_py
creating build creating build/lib creating build/lib/scrapy copying scrapy/statscollectors.py -> build/lib/scrapy copying scrapy/squeues.py -> build/lib/scrapy copying scrapy/spiderloader.py -> build/lib/scrapy copying scrapy/signals.py -> build/lib/scrapy copying scrapy/signalmanager.py -> build/lib/scrapy copying scrapy/shell.py -> build/lib/scrapy copying scrapy/robotstxt.py -> build/lib/scrapy copying scrapy/responsetypes.py -> build/lib/scrapy copying scrapy/resolver.py -> build/lib/scrapy copying scrapy/pqueues.py -> build/lib/scrapy copying scrapy/middleware.py -> build/lib/scrapy copying scrapy/mail.py -> build/lib/scrapy copying scrapy/logformatter.py -> build/lib/scrapy copying scrapy/link.py -> build/lib/scrapy copying scrapy/item.py -> build/lib/scrapy copying scrapy/interfaces.py -> build/lib/scrapy copying scrapy/extension.py -> build/lib/scrapy copying scrapy/exporters.py -> build/lib/scrapy copying scrapy/exceptions.py -> build/lib/scrapy copying scrapy/dupefilters.py -> build/lib/scrapy copying scrapy/crawler.py -> build/lib/scrapy copying scrapy/cmdline.py -> build/lib/scrapy copying scrapy/__main__.py -> build/lib/scrapy copying scrapy/__init__.py -> build/lib/scrapy creating build/lib/scrapy/utils copying scrapy/utils/versions.py -> build/lib/scrapy/utils copying scrapy/utils/url.py -> build/lib/scrapy/utils copying scrapy/utils/trackref.py -> build/lib/scrapy/utils copying scrapy/utils/testsite.py -> build/lib/scrapy/utils copying scrapy/utils/testproc.py -> build/lib/scrapy/utils copying scrapy/utils/test.py -> build/lib/scrapy/utils copying scrapy/utils/template.py -> build/lib/scrapy/utils copying scrapy/utils/ssl.py -> build/lib/scrapy/utils copying scrapy/utils/spider.py -> build/lib/scrapy/utils copying scrapy/utils/sitemap.py -> build/lib/scrapy/utils copying scrapy/utils/signal.py -> build/lib/scrapy/utils copying scrapy/utils/serialize.py -> build/lib/scrapy/utils copying scrapy/utils/response.py ->
build/lib/scrapy/utils copying scrapy/utils/request.py -> build/lib/scrapy/utils copying scrapy/utils/reqser.py -> build/lib/scrapy/utils copying scrapy/utils/reactor.py -> build/lib/scrapy/utils copying scrapy/utils/python.py -> build/lib/scrapy/utils copying scrapy/utils/project.py -> build/lib/scrapy/utils copying scrapy/utils/ossignal.py -> build/lib/scrapy/utils copying scrapy/utils/misc.py -> build/lib/scrapy/utils copying scrapy/utils/log.py -> build/lib/scrapy/utils copying scrapy/utils/job.py -> build/lib/scrapy/utils copying scrapy/utils/iterators.py -> build/lib/scrapy/utils copying scrapy/utils/httpobj.py -> build/lib/scrapy/utils copying scrapy/utils/gz.py -> build/lib/scrapy/utils copying scrapy/utils/ftp.py -> build/lib/scrapy/utils copying scrapy/utils/engine.py -> build/lib/scrapy/utils copying scrapy/utils/display.py -> build/lib/scrapy/utils copying scrapy/utils/deprecate.py -> build/lib/scrapy/utils copying scrapy/utils/defer.py -> build/lib/scrapy/utils copying scrapy/utils/decorators.py -> build/lib/scrapy/utils copying scrapy/utils/datatypes.py -> build/lib/scrapy/utils copying scrapy/utils/curl.py -> build/lib/scrapy/utils copying scrapy/utils/console.py -> build/lib/scrapy/utils copying scrapy/utils/conf.py -> build/lib/scrapy/utils copying scrapy/utils/boto.py -> build/lib/scrapy/utils copying scrapy/utils/benchserver.py -> build/lib/scrapy/utils copying scrapy/utils/asyncgen.py -> build/lib/scrapy/utils copying scrapy/utils/__init__.py -> build/lib/scrapy/utils creating build/lib/scrapy/spiders copying scrapy/spiders/sitemap.py -> build/lib/scrapy/spiders copying scrapy/spiders/init.py -> build/lib/scrapy/spiders copying scrapy/spiders/feed.py -> build/lib/scrapy/spiders copying scrapy/spiders/crawl.py -> build/lib/scrapy/spiders copying scrapy/spiders/__init__.py -> build/lib/scrapy/spiders creating build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/urllength.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/referer.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/offsite.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/httperror.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/depth.py -> build/lib/scrapy/spidermiddlewares copying scrapy/spidermiddlewares/__init__.py -> build/lib/scrapy/spidermiddlewares creating build/lib/scrapy/settings copying scrapy/settings/default_settings.py -> build/lib/scrapy/settings copying scrapy/settings/__init__.py -> build/lib/scrapy/settings creating build/lib/scrapy/selector copying scrapy/selector/unified.py -> build/lib/scrapy/selector copying scrapy/selector/__init__.py -> build/lib/scrapy/selector creating build/lib/scrapy/pipelines copying scrapy/pipelines/media.py -> build/lib/scrapy/pipelines copying scrapy/pipelines/images.py -> build/lib/scrapy/pipelines copying scrapy/pipelines/files.py -> build/lib/scrapy/pipelines copying scrapy/pipelines/__init__.py -> build/lib/scrapy/pipelines creating build/lib/scrapy/loader copying scrapy/loader/processors.py -> build/lib/scrapy/loader copying scrapy/loader/common.py -> build/lib/scrapy/loader copying scrapy/loader/__init__.py -> build/lib/scrapy/loader creating build/lib/scrapy/linkextractors copying scrapy/linkextractors/lxmlhtml.py -> build/lib/scrapy/linkextractors copying scrapy/linkextractors/__init__.py -> build/lib/scrapy/linkextractors creating build/lib/scrapy/http copying scrapy/http/headers.py -> build/lib/scrapy/http copying scrapy/http/cookies.py -> 
build/lib/scrapy/http copying scrapy/http/common.py -> build/lib/scrapy/http copying scrapy/http/__init__.py -> build/lib/scrapy/http creating build/lib/scrapy/extensions copying scrapy/extensions/throttle.py -> build/lib/scrapy/extensions copying scrapy/extensions/telnet.py -> build/lib/scrapy/extensions copying scrapy/extensions/statsmailer.py -> build/lib/scrapy/extensions copying scrapy/extensions/spiderstate.py -> build/lib/scrapy/extensions copying scrapy/extensions/postprocessing.py -> build/lib/scrapy/extensions copying scrapy/extensions/memusage.py -> build/lib/scrapy/extensions copying scrapy/extensions/memdebug.py -> build/lib/scrapy/extensions copying scrapy/extensions/logstats.py -> build/lib/scrapy/extensions copying scrapy/extensions/httpcache.py -> build/lib/scrapy/extensions copying scrapy/extensions/feedexport.py -> build/lib/scrapy/extensions copying scrapy/extensions/debug.py -> build/lib/scrapy/extensions copying scrapy/extensions/corestats.py -> build/lib/scrapy/extensions copying scrapy/extensions/closespider.py -> build/lib/scrapy/extensions copying scrapy/extensions/__init__.py -> build/lib/scrapy/extensions creating build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/useragent.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/stats.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/robotstxt.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/retry.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/redirect.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpproxy.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpcompression.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpcache.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/httpauth.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/downloadtimeout.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/defaultheaders.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/decompression.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/cookies.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/ajaxcrawl.py -> build/lib/scrapy/downloadermiddlewares copying scrapy/downloadermiddlewares/__init__.py -> build/lib/scrapy/downloadermiddlewares creating build/lib/scrapy/core copying scrapy/core/spidermw.py -> build/lib/scrapy/core copying scrapy/core/scraper.py -> build/lib/scrapy/core copying scrapy/core/scheduler.py -> build/lib/scrapy/core copying scrapy/core/engine.py -> build/lib/scrapy/core copying scrapy/core/__init__.py -> build/lib/scrapy/core creating build/lib/scrapy/contracts copying scrapy/contracts/default.py -> build/lib/scrapy/contracts copying scrapy/contracts/__init__.py -> build/lib/scrapy/contracts creating build/lib/scrapy/commands copying scrapy/commands/view.py -> build/lib/scrapy/commands copying scrapy/commands/version.py -> build/lib/scrapy/commands copying scrapy/commands/startproject.py -> build/lib/scrapy/commands copying scrapy/commands/shell.py -> build/lib/scrapy/commands copying scrapy/commands/settings.py -> build/lib/scrapy/commands copying scrapy/commands/runspider.py -> build/lib/scrapy/commands copying scrapy/commands/parse.py -> build/lib/scrapy/commands 
copying scrapy/commands/list.py -> build/lib/scrapy/commands copying scrapy/commands/genspider.py -> build/lib/scrapy/commands copying scrapy/commands/fetch.py -> build/lib/scrapy/commands copying scrapy/commands/edit.py -> build/lib/scrapy/commands copying scrapy/commands/crawl.py -> build/lib/scrapy/commands copying scrapy/commands/check.py -> build/lib/scrapy/commands copying scrapy/commands/bench.py -> build/lib/scrapy/commands copying scrapy/commands/__init__.py -> build/lib/scrapy/commands creating build/lib/scrapy/http/response copying scrapy/http/response/xml.py -> build/lib/scrapy/http/response copying scrapy/http/response/text.py -> build/lib/scrapy/http/response copying scrapy/http/response/html.py -> build/lib/scrapy/http/response copying scrapy/http/response/__init__.py -> build/lib/scrapy/http/response creating build/lib/scrapy/http/request copying scrapy/http/request/rpc.py -> build/lib/scrapy/http/request copying scrapy/http/request/json_request.py -> build/lib/scrapy/http/request copying scrapy/http/request/form.py -> build/lib/scrapy/http/request copying scrapy/http/request/__init__.py -> build/lib/scrapy/http/request creating build/lib/scrapy/core/http2 copying scrapy/core/http2/stream.py -> build/lib/scrapy/core/http2 copying scrapy/core/http2/protocol.py -> build/lib/scrapy/core/http2 copying scrapy/core/http2/agent.py -> build/lib/scrapy/core/http2 copying scrapy/core/http2/__init__.py -> build/lib/scrapy/core/http2 creating build/lib/scrapy/core/downloader copying scrapy/core/downloader/webclient.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/tls.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/middleware.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/contextfactory.py -> build/lib/scrapy/core/downloader copying scrapy/core/downloader/__init__.py -> build/lib/scrapy/core/downloader creating build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/s3.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http2.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http11.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http10.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/http.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/ftp.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/file.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/datauri.py -> build/lib/scrapy/core/downloader/handlers copying scrapy/core/downloader/handlers/__init__.py -> build/lib/scrapy/core/downloader/handlers running egg_info writing Scrapy.egg-info/PKG-INFO writing dependency_links to Scrapy.egg-info/dependency_links.txt writing entry points to Scrapy.egg-info/entry_points.txt writing requirements to Scrapy.egg-info/requires.txt writing top-level names to Scrapy.egg-info/top_level.txt reading manifest file 'Scrapy.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no files found matching 'INSTALL' warning: no files found matching 'requirements-*.txt' warning: no files found matching 'license.txt' under directory 'scrapy' no previously-included directories found matching 'docs/build' warning: no files found matching '*' under directory 'bin' warning: no previously-included files matching '__pycache__' found anywhere in 
distribution warning: no previously-included files matching '*.py[cod]' found anywhere in distribution adding license file 'LICENSE' adding license file 'AUTHORS' writing manifest file 'Scrapy.egg-info/SOURCES.txt' /usr/lib/python3.11/site-packages/setuptools/command/build_py.py:204: _Warning: Package 'scrapy.templates.project' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.project' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.project' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.project' to be distributed and are already explicitly excluding 'scrapy.templates.project' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) /usr/lib/python3.11/site-packages/setuptools/command/build_py.py:204: _Warning: Package 'scrapy.templates.project.module' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.project.module' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.project.module' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.project.module' to be distributed and are already explicitly excluding 'scrapy.templates.project.module' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. 
You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) /usr/lib/python3.11/site-packages/setuptools/command/build_py.py:204: _Warning: Package 'scrapy.templates.project.module.spiders' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.project.module.spiders' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.project.module.spiders' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.project.module.spiders' to be distributed and are already explicitly excluding 'scrapy.templates.project.module.spiders' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) /usr/lib/python3.11/site-packages/setuptools/command/build_py.py:204: _Warning: Package 'scrapy.templates.spiders' is absent from the `packages` configuration. !! ******************************************************************************** ############################ # Package would be ignored # ############################ Python recognizes 'scrapy.templates.spiders' as an importable package[^1], but it is absent from setuptools' `packages` configuration. This leads to an ambiguous overall configuration. If you want to distribute this package, please make sure that 'scrapy.templates.spiders' is explicitly added to the `packages` configuration field. Alternatively, you can also rely on setuptools' discovery methods (for example by using `find_namespace_packages(...)`/`find_namespace:` instead of `find_packages(...)`/`find:`). 
You can read more about "package discovery" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html If you don't want 'scrapy.templates.spiders' to be distributed and are already explicitly excluding 'scrapy.templates.spiders' via `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`, you can try to use `exclude_package_data`, or `include-package-data=False` in combination with a more fine grained `package-data` configuration. You can read more about "package data files" on setuptools documentation page: - https://setuptools.pypa.io/en/latest/userguide/datafiles.html [^1]: For Python, any directory (with suitable naming) can be imported, even if it does not contain any `.py` files. On the other hand, currently there is no concept of package data directory, all directories are treated like packages. ******************************************************************************** !! check.warn(importable) copying scrapy/VERSION -> build/lib/scrapy copying scrapy/mime.types -> build/lib/scrapy creating build/lib/scrapy/templates creating build/lib/scrapy/templates/project copying scrapy/templates/project/scrapy.cfg -> build/lib/scrapy/templates/project creating build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/__init__.py -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/items.py.tmpl -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/middlewares.py.tmpl -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/pipelines.py.tmpl -> build/lib/scrapy/templates/project/module copying scrapy/templates/project/module/settings.py.tmpl -> build/lib/scrapy/templates/project/module creating build/lib/scrapy/templates/project/module/spiders copying scrapy/templates/project/module/spiders/__init__.py -> build/lib/scrapy/templates/project/module/spiders creating build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/basic.tmpl -> build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/crawl.tmpl -> build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/csvfeed.tmpl -> build/lib/scrapy/templates/spiders copying scrapy/templates/spiders/xmlfeed.tmpl -> build/lib/scrapy/templates/spiders installing to build/bdist.linux-x86_64/wheel running install running install_lib creating build/bdist.linux-x86_64 creating build/bdist.linux-x86_64/wheel creating build/bdist.linux-x86_64/wheel/scrapy creating build/bdist.linux-x86_64/wheel/scrapy/templates creating build/bdist.linux-x86_64/wheel/scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/xmlfeed.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/csvfeed.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/crawl.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/spiders copying build/lib/scrapy/templates/spiders/basic.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/spiders creating build/bdist.linux-x86_64/wheel/scrapy/templates/project creating build/bdist.linux-x86_64/wheel/scrapy/templates/project/module creating build/bdist.linux-x86_64/wheel/scrapy/templates/project/module/spiders copying build/lib/scrapy/templates/project/module/spiders/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/templates/project/module/spiders copying build/lib/scrapy/templates/project/module/settings.py.tmpl -> 
build/bdist.linux-x86_64/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/pipelines.py.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/middlewares.py.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/items.py.tmpl -> build/bdist.linux-x86_64/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/module/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/templates/project/module copying build/lib/scrapy/templates/project/scrapy.cfg -> build/bdist.linux-x86_64/wheel/scrapy/templates/project copying build/lib/scrapy/mime.types -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/VERSION -> build/bdist.linux-x86_64/wheel/scrapy creating build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/bench.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/check.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/crawl.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/edit.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/fetch.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/genspider.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/list.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/parse.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/runspider.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/settings.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/shell.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/startproject.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/version.py -> build/bdist.linux-x86_64/wheel/scrapy/commands copying build/lib/scrapy/commands/view.py -> build/bdist.linux-x86_64/wheel/scrapy/commands creating build/bdist.linux-x86_64/wheel/scrapy/contracts copying build/lib/scrapy/contracts/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/contracts copying build/lib/scrapy/contracts/default.py -> build/bdist.linux-x86_64/wheel/scrapy/contracts creating build/bdist.linux-x86_64/wheel/scrapy/core creating build/bdist.linux-x86_64/wheel/scrapy/core/downloader creating build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/datauri.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/file.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/ftp.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http10.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying 
build/lib/scrapy/core/downloader/handlers/http11.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/http2.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/handlers/s3.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader/handlers copying build/lib/scrapy/core/downloader/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader copying build/lib/scrapy/core/downloader/contextfactory.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader copying build/lib/scrapy/core/downloader/middleware.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader copying build/lib/scrapy/core/downloader/tls.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader copying build/lib/scrapy/core/downloader/webclient.py -> build/bdist.linux-x86_64/wheel/scrapy/core/downloader creating build/bdist.linux-x86_64/wheel/scrapy/core/http2 copying build/lib/scrapy/core/http2/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/core/http2 copying build/lib/scrapy/core/http2/agent.py -> build/bdist.linux-x86_64/wheel/scrapy/core/http2 copying build/lib/scrapy/core/http2/protocol.py -> build/bdist.linux-x86_64/wheel/scrapy/core/http2 copying build/lib/scrapy/core/http2/stream.py -> build/bdist.linux-x86_64/wheel/scrapy/core/http2 copying build/lib/scrapy/core/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/core copying build/lib/scrapy/core/engine.py -> build/bdist.linux-x86_64/wheel/scrapy/core copying build/lib/scrapy/core/scheduler.py -> build/bdist.linux-x86_64/wheel/scrapy/core copying build/lib/scrapy/core/scraper.py -> build/bdist.linux-x86_64/wheel/scrapy/core copying build/lib/scrapy/core/spidermw.py -> build/bdist.linux-x86_64/wheel/scrapy/core creating build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/ajaxcrawl.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/cookies.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/decompression.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/defaultheaders.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/downloadtimeout.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpauth.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpcache.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpcompression.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/httpproxy.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/redirect.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/retry.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/robotstxt.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/stats.py -> 
build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares copying build/lib/scrapy/downloadermiddlewares/useragent.py -> build/bdist.linux-x86_64/wheel/scrapy/downloadermiddlewares creating build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/closespider.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/corestats.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/debug.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/feedexport.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/httpcache.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/logstats.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/memdebug.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/memusage.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/postprocessing.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/spiderstate.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/statsmailer.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/telnet.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions copying build/lib/scrapy/extensions/throttle.py -> build/bdist.linux-x86_64/wheel/scrapy/extensions creating build/bdist.linux-x86_64/wheel/scrapy/http creating build/bdist.linux-x86_64/wheel/scrapy/http/request copying build/lib/scrapy/http/request/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/http/request copying build/lib/scrapy/http/request/form.py -> build/bdist.linux-x86_64/wheel/scrapy/http/request copying build/lib/scrapy/http/request/json_request.py -> build/bdist.linux-x86_64/wheel/scrapy/http/request copying build/lib/scrapy/http/request/rpc.py -> build/bdist.linux-x86_64/wheel/scrapy/http/request creating build/bdist.linux-x86_64/wheel/scrapy/http/response copying build/lib/scrapy/http/response/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/http/response copying build/lib/scrapy/http/response/html.py -> build/bdist.linux-x86_64/wheel/scrapy/http/response copying build/lib/scrapy/http/response/text.py -> build/bdist.linux-x86_64/wheel/scrapy/http/response copying build/lib/scrapy/http/response/xml.py -> build/bdist.linux-x86_64/wheel/scrapy/http/response copying build/lib/scrapy/http/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/http copying build/lib/scrapy/http/common.py -> build/bdist.linux-x86_64/wheel/scrapy/http copying build/lib/scrapy/http/cookies.py -> build/bdist.linux-x86_64/wheel/scrapy/http copying build/lib/scrapy/http/headers.py -> build/bdist.linux-x86_64/wheel/scrapy/http creating build/bdist.linux-x86_64/wheel/scrapy/linkextractors copying build/lib/scrapy/linkextractors/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/linkextractors copying build/lib/scrapy/linkextractors/lxmlhtml.py -> build/bdist.linux-x86_64/wheel/scrapy/linkextractors creating build/bdist.linux-x86_64/wheel/scrapy/loader copying build/lib/scrapy/loader/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/loader copying build/lib/scrapy/loader/common.py -> build/bdist.linux-x86_64/wheel/scrapy/loader copying 
build/lib/scrapy/loader/processors.py -> build/bdist.linux-x86_64/wheel/scrapy/loader creating build/bdist.linux-x86_64/wheel/scrapy/pipelines copying build/lib/scrapy/pipelines/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/pipelines copying build/lib/scrapy/pipelines/files.py -> build/bdist.linux-x86_64/wheel/scrapy/pipelines copying build/lib/scrapy/pipelines/images.py -> build/bdist.linux-x86_64/wheel/scrapy/pipelines copying build/lib/scrapy/pipelines/media.py -> build/bdist.linux-x86_64/wheel/scrapy/pipelines creating build/bdist.linux-x86_64/wheel/scrapy/selector copying build/lib/scrapy/selector/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/selector copying build/lib/scrapy/selector/unified.py -> build/bdist.linux-x86_64/wheel/scrapy/selector creating build/bdist.linux-x86_64/wheel/scrapy/settings copying build/lib/scrapy/settings/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/settings copying build/lib/scrapy/settings/default_settings.py -> build/bdist.linux-x86_64/wheel/scrapy/settings creating build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/depth.py -> build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/httperror.py -> build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/offsite.py -> build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/referer.py -> build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares copying build/lib/scrapy/spidermiddlewares/urllength.py -> build/bdist.linux-x86_64/wheel/scrapy/spidermiddlewares creating build/bdist.linux-x86_64/wheel/scrapy/spiders copying build/lib/scrapy/spiders/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/spiders copying build/lib/scrapy/spiders/crawl.py -> build/bdist.linux-x86_64/wheel/scrapy/spiders copying build/lib/scrapy/spiders/feed.py -> build/bdist.linux-x86_64/wheel/scrapy/spiders copying build/lib/scrapy/spiders/init.py -> build/bdist.linux-x86_64/wheel/scrapy/spiders copying build/lib/scrapy/spiders/sitemap.py -> build/bdist.linux-x86_64/wheel/scrapy/spiders creating build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/asyncgen.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/benchserver.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/boto.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/conf.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/console.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/curl.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/datatypes.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/decorators.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/defer.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/deprecate.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/display.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/engine.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/ftp.py -> 
build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/gz.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/httpobj.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/iterators.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/job.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/log.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/misc.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/ossignal.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/project.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/python.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/reactor.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/reqser.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/request.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/response.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/serialize.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/signal.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/sitemap.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/spider.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/ssl.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/template.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/test.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/testproc.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/testsite.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/trackref.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/url.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/utils/versions.py -> build/bdist.linux-x86_64/wheel/scrapy/utils copying build/lib/scrapy/__init__.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/__main__.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/cmdline.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/crawler.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/dupefilters.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/exceptions.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/exporters.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/extension.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/interfaces.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/item.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/link.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/logformatter.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/mail.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/middleware.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/pqueues.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/resolver.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/responsetypes.py -> 
build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/robotstxt.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/shell.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/signalmanager.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/signals.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/spiderloader.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/squeues.py -> build/bdist.linux-x86_64/wheel/scrapy copying build/lib/scrapy/statscollectors.py -> build/bdist.linux-x86_64/wheel/scrapy running install_egg_info Copying Scrapy.egg-info to build/bdist.linux-x86_64/wheel/Scrapy-2.9.0-py3.11.egg-info running install_scripts creating build/bdist.linux-x86_64/wheel/Scrapy-2.9.0.dist-info/WHEEL creating '/tmp/archlinux-ci/scrapy-archlinuxrb-build-J6zkxUDh/scrapy/src/scrapy-2.9.0/dist/.tmp-1jb1xv0m/Scrapy-2.9.0-py2.py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it adding 'scrapy/VERSION' adding 'scrapy/__init__.py' adding 'scrapy/__main__.py' adding 'scrapy/cmdline.py' adding 'scrapy/crawler.py' adding 'scrapy/dupefilters.py' adding 'scrapy/exceptions.py' adding 'scrapy/exporters.py' adding 'scrapy/extension.py' adding 'scrapy/interfaces.py' adding 'scrapy/item.py' adding 'scrapy/link.py' adding 'scrapy/logformatter.py' adding 'scrapy/mail.py' adding 'scrapy/middleware.py' adding 'scrapy/mime.types' adding 'scrapy/pqueues.py' adding 'scrapy/resolver.py' adding 'scrapy/responsetypes.py' adding 'scrapy/robotstxt.py' adding 'scrapy/shell.py' adding 'scrapy/signalmanager.py' adding 'scrapy/signals.py' adding 'scrapy/spiderloader.py' adding 'scrapy/squeues.py' adding 'scrapy/statscollectors.py' adding 'scrapy/commands/__init__.py' adding 'scrapy/commands/bench.py' adding 'scrapy/commands/check.py' adding 'scrapy/commands/crawl.py' adding 'scrapy/commands/edit.py' adding 'scrapy/commands/fetch.py' adding 'scrapy/commands/genspider.py' adding 'scrapy/commands/list.py' adding 'scrapy/commands/parse.py' adding 'scrapy/commands/runspider.py' adding 'scrapy/commands/settings.py' adding 'scrapy/commands/shell.py' adding 'scrapy/commands/startproject.py' adding 'scrapy/commands/version.py' adding 'scrapy/commands/view.py' adding 'scrapy/contracts/__init__.py' adding 'scrapy/contracts/default.py' adding 'scrapy/core/__init__.py' adding 'scrapy/core/engine.py' adding 'scrapy/core/scheduler.py' adding 'scrapy/core/scraper.py' adding 'scrapy/core/spidermw.py' adding 'scrapy/core/downloader/__init__.py' adding 'scrapy/core/downloader/contextfactory.py' adding 'scrapy/core/downloader/middleware.py' adding 'scrapy/core/downloader/tls.py' adding 'scrapy/core/downloader/webclient.py' adding 'scrapy/core/downloader/handlers/__init__.py' adding 'scrapy/core/downloader/handlers/datauri.py' adding 'scrapy/core/downloader/handlers/file.py' adding 'scrapy/core/downloader/handlers/ftp.py' adding 'scrapy/core/downloader/handlers/http.py' adding 'scrapy/core/downloader/handlers/http10.py' adding 'scrapy/core/downloader/handlers/http11.py' adding 'scrapy/core/downloader/handlers/http2.py' adding 'scrapy/core/downloader/handlers/s3.py' adding 'scrapy/core/http2/__init__.py' adding 'scrapy/core/http2/agent.py' adding 'scrapy/core/http2/protocol.py' adding 'scrapy/core/http2/stream.py' adding 'scrapy/downloadermiddlewares/__init__.py' adding 'scrapy/downloadermiddlewares/ajaxcrawl.py' adding 'scrapy/downloadermiddlewares/cookies.py' adding 'scrapy/downloadermiddlewares/decompression.py' adding 
'scrapy/downloadermiddlewares/defaultheaders.py' adding 'scrapy/downloadermiddlewares/downloadtimeout.py' adding 'scrapy/downloadermiddlewares/httpauth.py' adding 'scrapy/downloadermiddlewares/httpcache.py' adding 'scrapy/downloadermiddlewares/httpcompression.py' adding 'scrapy/downloadermiddlewares/httpproxy.py' adding 'scrapy/downloadermiddlewares/redirect.py' adding 'scrapy/downloadermiddlewares/retry.py' adding 'scrapy/downloadermiddlewares/robotstxt.py' adding 'scrapy/downloadermiddlewares/stats.py' adding 'scrapy/downloadermiddlewares/useragent.py' adding 'scrapy/extensions/__init__.py' adding 'scrapy/extensions/closespider.py' adding 'scrapy/extensions/corestats.py' adding 'scrapy/extensions/debug.py' adding 'scrapy/extensions/feedexport.py' adding 'scrapy/extensions/httpcache.py' adding 'scrapy/extensions/logstats.py' adding 'scrapy/extensions/memdebug.py' adding 'scrapy/extensions/memusage.py' adding 'scrapy/extensions/postprocessing.py' adding 'scrapy/extensions/spiderstate.py' adding 'scrapy/extensions/statsmailer.py' adding 'scrapy/extensions/telnet.py' adding 'scrapy/extensions/throttle.py' adding 'scrapy/http/__init__.py' adding 'scrapy/http/common.py' adding 'scrapy/http/cookies.py' adding 'scrapy/http/headers.py' adding 'scrapy/http/request/__init__.py' adding 'scrapy/http/request/form.py' adding 'scrapy/http/request/json_request.py' adding 'scrapy/http/request/rpc.py' adding 'scrapy/http/response/__init__.py' adding 'scrapy/http/response/html.py' adding 'scrapy/http/response/text.py' adding 'scrapy/http/response/xml.py' adding 'scrapy/linkextractors/__init__.py' adding 'scrapy/linkextractors/lxmlhtml.py' adding 'scrapy/loader/__init__.py' adding 'scrapy/loader/common.py' adding 'scrapy/loader/processors.py' adding 'scrapy/pipelines/__init__.py' adding 'scrapy/pipelines/files.py' adding 'scrapy/pipelines/images.py' adding 'scrapy/pipelines/media.py' adding 'scrapy/selector/__init__.py' adding 'scrapy/selector/unified.py' adding 'scrapy/settings/__init__.py' adding 'scrapy/settings/default_settings.py' adding 'scrapy/spidermiddlewares/__init__.py' adding 'scrapy/spidermiddlewares/depth.py' adding 'scrapy/spidermiddlewares/httperror.py' adding 'scrapy/spidermiddlewares/offsite.py' adding 'scrapy/spidermiddlewares/referer.py' adding 'scrapy/spidermiddlewares/urllength.py' adding 'scrapy/spiders/__init__.py' adding 'scrapy/spiders/crawl.py' adding 'scrapy/spiders/feed.py' adding 'scrapy/spiders/init.py' adding 'scrapy/spiders/sitemap.py' adding 'scrapy/templates/project/scrapy.cfg' adding 'scrapy/templates/project/module/__init__.py' adding 'scrapy/templates/project/module/items.py.tmpl' adding 'scrapy/templates/project/module/middlewares.py.tmpl' adding 'scrapy/templates/project/module/pipelines.py.tmpl' adding 'scrapy/templates/project/module/settings.py.tmpl' adding 'scrapy/templates/project/module/spiders/__init__.py' adding 'scrapy/templates/spiders/basic.tmpl' adding 'scrapy/templates/spiders/crawl.tmpl' adding 'scrapy/templates/spiders/csvfeed.tmpl' adding 'scrapy/templates/spiders/xmlfeed.tmpl' adding 'scrapy/utils/__init__.py' adding 'scrapy/utils/asyncgen.py' adding 'scrapy/utils/benchserver.py' adding 'scrapy/utils/boto.py' adding 'scrapy/utils/conf.py' adding 'scrapy/utils/console.py' adding 'scrapy/utils/curl.py' adding 'scrapy/utils/datatypes.py' adding 'scrapy/utils/decorators.py' adding 'scrapy/utils/defer.py' adding 'scrapy/utils/deprecate.py' adding 'scrapy/utils/display.py' adding 'scrapy/utils/engine.py' adding 'scrapy/utils/ftp.py' adding 
'scrapy/utils/gz.py' adding 'scrapy/utils/httpobj.py' adding 'scrapy/utils/iterators.py' adding 'scrapy/utils/job.py' adding 'scrapy/utils/log.py' adding 'scrapy/utils/misc.py' adding 'scrapy/utils/ossignal.py' adding 'scrapy/utils/project.py' adding 'scrapy/utils/python.py' adding 'scrapy/utils/reactor.py' adding 'scrapy/utils/reqser.py' adding 'scrapy/utils/request.py' adding 'scrapy/utils/response.py' adding 'scrapy/utils/serialize.py' adding 'scrapy/utils/signal.py' adding 'scrapy/utils/sitemap.py' adding 'scrapy/utils/spider.py' adding 'scrapy/utils/ssl.py' adding 'scrapy/utils/template.py' adding 'scrapy/utils/test.py' adding 'scrapy/utils/testproc.py' adding 'scrapy/utils/testsite.py' adding 'scrapy/utils/trackref.py' adding 'scrapy/utils/url.py' adding 'scrapy/utils/versions.py' adding 'Scrapy-2.9.0.dist-info/AUTHORS' adding 'Scrapy-2.9.0.dist-info/LICENSE' adding 'Scrapy-2.9.0.dist-info/METADATA' adding 'Scrapy-2.9.0.dist-info/WHEEL' adding 'Scrapy-2.9.0.dist-info/entry_points.txt' adding 'Scrapy-2.9.0.dist-info/top_level.txt' adding 'Scrapy-2.9.0.dist-info/RECORD' removing build/bdist.linux-x86_64/wheel Successfully built Scrapy-2.9.0-py2.py3-none-any.whl ==> Entering fakeroot environment... ==> Starting package()... ==> Tidying install... -> Removing libtool files... -> Purging unwanted files... -> Removing static library files... -> Stripping unneeded symbols from binaries and libraries... -> Compressing man and info pages... ==> Checking for packaging issues... ==> Creating package "scrapy"... -> Generating .PKGINFO file... -> Generating .BUILDINFO file... -> Generating .MTREE file... -> Compressing package... ==> Leaving fakeroot environment. ==> Finished making: scrapy 2.9.0-1 (Thu Dec 7 12:01:11 2023)
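The log ends with the wheel being assembled from build/lib, the PKGBUILD's package() function running under fakeroot, and the pacman metadata (.PKGINFO, .BUILDINFO, .MTREE) being written before the archive is compressed. Below is a rough, hedged sketch of how such a build could be repeated and checked for reproducibility from the same PKGBUILD directory; the package file names and directory layout are illustrative assumptions, not values taken from this log.

# Reuse the build date recorded in the original package's .BUILDINFO so that
# embedded timestamps match (makepkg honours SOURCE_DATE_EPOCH when set).
# The path to the original archive is an assumed example.
export SOURCE_DATE_EPOCH="$(bsdtar -xOf original/scrapy-2.9.0-1-any.pkg.tar.zst .BUILDINFO | sed -n 's/^builddate = //p')"

# Rebuild the package from the PKGBUILD in the current directory.
makepkg --force --clean

# Print the recorded build environment of the rebuilt archive
# (the .BUILDINFO member generated in the "Generating .BUILDINFO file..." step above).
bsdtar -xOf scrapy-2.9.0-1-any.pkg.tar.zst .BUILDINFO

# Compare the two builds, bit for bit and then structurally.
sha256sum original/scrapy-2.9.0-1-any.pkg.tar.zst scrapy-2.9.0-1-any.pkg.tar.zst
diffoscope original/scrapy-2.9.0-1-any.pkg.tar.zst scrapy-2.9.0-1-any.pkg.tar.zst

For the checksums to agree, both builds also need the same toolchain and dependency versions, which is why the CI job above performs the build inside a dedicated, freshly synchronized chroot.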