Started by timer
Running as SYSTEM
Building remotely on Ubuntu_18.04_bioeng49 (buildslave Testing) in workspace /home/cmiss/Jenkins/workspace/SPARC-API
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Deferred wipeout is used...
[WS-CLEANUP] Done
The recommended git tool is: NONE
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/nih-sparc/sparc-api.git
> git init /home/cmiss/Jenkins/workspace/SPARC-API # timeout=10
Fetching upstream changes from https://github.com/nih-sparc/sparc-api.git
> git --version # timeout=10
> git --version # 'git version 2.25.1'
> git fetch --tags --force --progress -- https://github.com/nih-sparc/sparc-api.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/nih-sparc/sparc-api.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
> git rev-parse refs/remotes/origin/master^{commit} # timeout=10
Checking out Revision 2048c5e04b09e49677f77192b1bdcfcee6a917dc (refs/remotes/origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 2048c5e04b09e49677f77192b1bdcfcee6a917dc # timeout=10
Commit message: "Merge pull request #237 from alan-wu/monthly-stats-sorting"
> git rev-list --no-walk 2048c5e04b09e49677f77192b1bdcfcee6a917dc # timeout=10
[SPARC-API] $ /bin/sh -xe /tmp/shiningpanda17175098426917673974.sh
+ pwd
+ export PYTHONPATH=/home/cmiss/Jenkins/workspace/SPARC-API
+ export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
+ pip install -r requirements.txt
Requirement already satisfied: api==0.0.7 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 1)) (0.0.7)
Requirement already satisfied: pennsieve==6.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 2)) (6.1.1)
Requirement already satisfied: boto3==1.17.67 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 3)) (1.17.67)
Requirement already satisfied: botocore==1.20.67 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 4)) (1.20.67)
Requirement already satisfied: certifi==2019.11.28 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 5)) (2019.11.28)
Requirement already satisfied: chardet==3.0.4 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 6)) (3.0.4)
Collecting Click==7.1.2 (from -r requirements.txt (line 7))
Using cached click-7.1.2-py2.py3-none-any.whl.metadata (2.9 kB)
Requirement already satisfied: docutils==0.15.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 8)) (0.15.2)
Requirement already satisfied: Flask==1.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 9)) (1.1.1)
Requirement already satisfied: flask-marshmallow==0.10.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 10)) (0.10.1)
Requirement already satisfied: flask-cors==3.0.8 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 11)) (3.0.8)
Requirement already satisfied: gunicorn==20.0.4 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 12)) (20.0.4)
Requirement already satisfied: idna==2.8 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 13)) (2.8)
Requirement already satisfied: itsdangerous==1.1.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 14)) (1.1.0)
Requirement already satisfied: Jinja2==2.11.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 15)) (2.11.3)
Requirement already satisfied: jmespath==0.9.4 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 16)) (0.9.4)
Requirement already satisfied: MarkupSafe==1.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 17)) (1.1.1)
Requirement already satisfied: marshmallow==3.2.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 18)) (3.2.2)
Requirement already satisfied: nose==1.3.7 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 19)) (1.3.7)
Requirement already satisfied: osparc==0.4.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 20)) (0.4.3)
Requirement already satisfied: pillow in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 21)) (10.4.0)
Requirement already satisfied: public==2019.4.13 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 22)) (2019.4.13)
Requirement already satisfied: pytest in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 23)) (5.4.3)
Requirement already satisfied: pymongo==3.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 24)) (3.8.0)
Requirement already satisfied: python-dateutil==2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 25)) (2.8.0)
Requirement already satisfied: python-dotenv==0.10.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 26)) (0.10.3)
Requirement already satisfied: query-string==2019.4.13 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 27)) (2019.4.13)
Requirement already satisfied: requests==2.25.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 28)) (2.25.1)
Requirement already satisfied: s3transfer==0.4.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 29)) (0.4.2)
Requirement already satisfied: sendgrid==6.9.7 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 30)) (6.9.7)
Requirement already satisfied: six==1.13.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 31)) (1.13.0)
Requirement already satisfied: SQLAlchemy==1.3.20 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 32)) (1.3.20)
Requirement already satisfied: urllib3==1.26.4 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 33)) (1.26.4)
Requirement already satisfied: Werkzeug==0.16.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 34)) (0.16.0)
Requirement already satisfied: psycopg2-binary==2.9.9 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 35)) (2.9.9)
Requirement already satisfied: APScheduler==3.7.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 36)) (3.7.0)
Requirement already satisfied: google-api-python-client==2.52.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 37)) (2.52.0)
Requirement already satisfied: oauth2client==4.1.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 38)) (4.1.3)
Requirement already satisfied: algoliasearch==2.6.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 39)) (2.6.2)
Requirement already satisfied: contentful==1.13.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 40)) (1.13.1)
Requirement already satisfied: contentful_management==2.11.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 41)) (2.11.0)
Requirement already satisfied: Flask-Caching==2.3.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 42)) (2.3.0)
Requirement already satisfied: configparser>=3.5 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (7.1.0)
Requirement already satisfied: deprecated>=1.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (1.2.14)
Requirement already satisfied: future>=0.15.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (1.0.0)
Requirement already satisfied: futures in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (3.0.5)
Requirement already satisfied: protobuf>=3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (5.28.0)
Requirement already satisfied: python-jose==3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (3.2.0)
Requirement already satisfied: pytz>=2016 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (2024.1)
Requirement already satisfied: rsa==4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (4.0)
Requirement already satisfied: semver>=2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (3.0.2)
Requirement already satisfied: websocket-client>=0.57.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (1.8.0)
Requirement already satisfied: docopt>=0.6 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (0.6.2)
Requirement already satisfied: psutil>=5.4 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (6.0.0)
Requirement already satisfied: setuptools>=3.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from gunicorn==20.0.4->-r requirements.txt (line 12)) (74.1.2)
Requirement already satisfied: python-http-client>=3.2.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from sendgrid==6.9.7->-r requirements.txt (line 30)) (3.3.7)
Requirement already satisfied: starkbank-ecdsa>=2.0.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from sendgrid==6.9.7->-r requirements.txt (line 30)) (2.2.0)
Requirement already satisfied: tzlocal~=2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from APScheduler==3.7.0->-r requirements.txt (line 36)) (2.1)
Requirement already satisfied: httplib2<1dev,>=0.15.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (0.22.0)
Requirement already satisfied: google-auth<3.0.0dev,>=1.19.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (2.34.0)
Requirement already satisfied: google-auth-httplib2>=0.1.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (0.2.0)
Requirement already satisfied: google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (2.19.2)
Requirement already satisfied: uritemplate<5,>=3.0.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (4.1.1)
Requirement already satisfied: pyasn1>=0.1.7 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from oauth2client==4.1.3->-r requirements.txt (line 38)) (0.6.0)
Requirement already satisfied: pyasn1-modules>=0.0.5 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from oauth2client==4.1.3->-r requirements.txt (line 38)) (0.4.0)
Requirement already satisfied: cachelib<0.10.0,>=0.9.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from Flask-Caching==2.3.0->-r requirements.txt (line 42)) (0.9.0)
Requirement already satisfied: ecdsa<0.15 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements.txt (line 2)) (0.14.1)
Requirement already satisfied: py>=1.5.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (1.11.0)
Requirement already satisfied: packaging in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (24.1)
Requirement already satisfied: attrs>=17.4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (24.2.0)
Requirement already satisfied: more-itertools>=4.0.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (10.5.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (0.13.1)
Requirement already satisfied: wcwidth in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (0.2.13)
Requirement already satisfied: wrapt<2,>=1.10 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from deprecated>=1.2.0->pennsieve==6.1.1->-r requirements.txt (line 2)) (1.16.0)
Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (1.65.0)
Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (1.24.0)
Requirement already satisfied: cachetools<6.0,>=2.0.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-auth<3.0.0dev,>=1.19.0->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (5.5.0)
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from httplib2<1dev,>=0.15.0->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (3.1.4)
Using cached click-7.1.2-py2.py3-none-any.whl (82 kB)
Installing collected packages: Click
Attempting uninstall: Click
Found existing installation: click 8.1.7
Uninstalling click-8.1.7:
Successfully uninstalled click-8.1.7
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
black 24.8.0 requires click>=8.0.0, but you have click 7.1.2 which is incompatible.
Successfully installed Click-7.1.2
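Note: the ERROR above is a real pin conflict: requirements.txt asks for Click==7.1.2 while black 24.8.0 (already present in this virtualenv from requirements-dev.txt) needs click>=8.0.0, so every build downgrades click here and upgrades it again in the dev install below. A minimal diagnostic sketch, assuming it is run inside the build virtualenv (pip check reports the same information):

    # Report installed distributions whose requirements are not satisfied,
    # similar in spirit to `pip check`; it would flag the click conflict above.
    # Name matching is simplified (no full PEP 503 normalization).
    from importlib.metadata import distributions
    from packaging.requirements import Requirement
    from packaging.version import Version

    installed = {d.metadata["Name"].lower(): Version(d.version) for d in distributions()}

    for dist in distributions():
        for req_str in dist.requires or []:
            req = Requirement(req_str)
            if req.marker and not req.marker.evaluate({"extra": ""}):
                continue  # skip extras and platform-specific requirements
            name = req.name.lower()
            if name in installed and not req.specifier.contains(installed[name], prereleases=True):
                print(f'{dist.metadata["Name"]} needs "{req}", but {name} {installed[name]} is installed')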
+ pip install -r requirements-dev.txt
Requirement already satisfied: pytest==5.4.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 1)) (5.4.3)
Requirement already satisfied: pennsieve==6.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 2)) (6.1.1)
Requirement already satisfied: black in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 3)) (24.8.0)
Requirement already satisfied: isort in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 4)) (5.13.2)
Requirement already satisfied: nose in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 5)) (1.3.7)
Requirement already satisfied: packaging in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 6)) (24.1)
Requirement already satisfied: py>=1.5.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (1.11.0)
Requirement already satisfied: attrs>=17.4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (24.2.0)
Requirement already satisfied: more-itertools>=4.0.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (10.5.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (0.13.1)
Requirement already satisfied: wcwidth in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (0.2.13)
Requirement already satisfied: boto3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.17.67)
Requirement already satisfied: configparser>=3.5 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (7.1.0)
Requirement already satisfied: deprecated>=1.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.2.14)
Requirement already satisfied: future>=0.15.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.0.0)
Requirement already satisfied: futures in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.0.5)
Requirement already satisfied: protobuf>=3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (5.28.0)
Requirement already satisfied: python-jose==3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.2.0)
Requirement already satisfied: pytz>=2016 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2024.1)
Requirement already satisfied: requests>=2.18 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2.25.1)
Requirement already satisfied: rsa==4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (4.0)
Requirement already satisfied: semver>=2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.0.2)
Requirement already satisfied: websocket-client>=0.57.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.8.0)
Requirement already satisfied: docopt>=0.6 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.6.2)
Requirement already satisfied: psutil>=5.4 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (6.0.0)
Requirement already satisfied: python-dateutil>=2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2.8.0)
Requirement already satisfied: six<2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.13.0)
Requirement already satisfied: ecdsa<0.15 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.14.1)
Requirement already satisfied: pyasn1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.6.0)
Collecting click>=8.0.0 (from black->-r requirements-dev.txt (line 3))
Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Requirement already satisfied: mypy-extensions>=0.4.3 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (1.0.0)
Requirement already satisfied: pathspec>=0.9.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (0.12.1)
Requirement already satisfied: platformdirs>=2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (4.3.2)
Requirement already satisfied: tomli>=1.1.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (2.0.1)
Requirement already satisfied: typing-extensions>=4.0.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (4.12.2)
Requirement already satisfied: wrapt<2,>=1.10 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from deprecated>=1.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.16.0)
Requirement already satisfied: chardet<5,>=3.0.2 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.0.4)
Requirement already satisfied: idna<3,>=2.5 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2.8)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.26.4)
Requirement already satisfied: certifi>=2017.4.17 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2019.11.28)
Requirement already satisfied: botocore<1.21.0,>=1.20.67 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from boto3->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.20.67)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from boto3->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.9.4)
Requirement already satisfied: s3transfer<0.5.0,>=0.4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from boto3->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.4.2)
Using cached click-8.1.7-py3-none-any.whl (97 kB)
Installing collected packages: click
Attempting uninstall: click
Found existing installation: click 7.1.2
Uninstalling click-7.1.2:
Successfully uninstalled click-7.1.2
Successfully installed click-8.1.7
+ pytest
============================= test session starts ==============================
platform linux -- Python 3.9.20, pytest-5.4.3, py-1.11.0, pluggy-0.13.1
rootdir: /home/cmiss/Jenkins/workspace/SPARC-API
collected 109 items
tests/test_api.py ..........F........ [ 17%]
tests/test_biolucida.py ............. [ 29%]
tests/test_dataset_info.py .s..ss..... [ 39%]
tests/test_health.py . [ 40%]
tests/test_monthly_stats.py ......... [ 48%]
tests/test_osparc.py .............F [ 61%]
tests/test_pmr.py ..F...FF. [ 69%]
tests/test_scicrunch.py FFFFFFFFFFFFFFF..s......... [ 94%]
tests/test_segmentation_info.py .. [ 96%]
tests/test_thumbnails.py .FF [ 99%]
tests/test_update_contentful_entries.py . [100%]
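Note: the F markers above place the failures in tests/test_api.py, tests/test_osparc.py, tests/test_pmr.py, tests/test_scicrunch.py and tests/test_thumbnails.py. A quick way to iterate on just those locally, rather than the full 109-test session (a sketch; the node id is taken from the failure sections below):

    # Re-run only the failing buckets.
    import pytest

    pytest.main([
        "tests/test_api.py::test_create_wrike_task",
        "tests/test_osparc.py",
        "tests/test_pmr.py",
        "tests/test_scicrunch.py",
        "tests/test_thumbnails.py",
        "-ra",  # summarize skip/fail reasons
    ])

Once a failed run has been recorded locally, pytest's built-in --last-failed flag achieves the same selection.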
=================================== FAILURES ===================================
____________________________ test_create_wrike_task ____________________________
client = <FlaskClient <Flask 'app.main'>>
def test_create_wrike_task(client):
r = client.post(f"/tasks", data={"title": "test-integration-task-sparc-api"})
> assert r.status_code == 400
E assert 409 == 400
E + where 409 = <Response streamed [409 CONFLICT]>.status_code
tests/test_api.py:137: AssertionError
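Note: the endpoint answered 409 CONFLICT rather than the 400 the test expects, which usually means the upstream Wrike call treated the task as a duplicate rather than a bad request (an assumption; the response body would confirm it). A quick way to inspect that body outside pytest, assuming the Flask instance is exposed as app in app.main:

    # Reproduce the failing call and print the body; `from app.main import app`
    # is an assumption about how the Flask object is exposed.
    from app.main import app

    with app.test_client() as client:
        r = client.post("/tasks", data={"title": "test-integration-task-sparc-api"})
        print(r.status_code)              # this build saw 409, not 400
        print(r.get_data(as_text=True))   # upstream error detail, if any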
________________________ test_osparc_failing_simulation ________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('api.osparc.io', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'api.osparc.io', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -3] Temporary failure in name resolution
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0626feb0>
method = 'POST'
url = '/v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect'
body = None
headers = {'Accept': 'application/json', 'Authorization': 'Basic MDY3M2UzYjEtZjdjMC01YTdjLWI5ZjAtMDI1MTRiMWE0NzFmOmJjOTExYWZjLTViNmEtNTJhOC05NzFjLWM4NTJiYjBiZDg5Mw==', 'Content-Type': 'application/json', 'User-Agent': 'osparc-api/0.3.0/python'}
retries = Retry(total=0, connect=None, read=None, redirect=None, status=None)
redirect = False, assert_same_host = False, timeout = None, pool_timeout = None
release_conn = True, chunked = False, body_pos = None
response_kw = {'preload_content': True, 'request_url': 'https://api.osparc.io/v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect'}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0626feb0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>
method = 'POST'
url = '/v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': None, 'headers': {'Accept': 'application/json', 'Authorization': 'Basic MDY3M2UzYjEtZjdjMC01YTdjLWI5ZjAtMDI1M...TViNmEtNTJhOC05NzFjLWM4NTJiYjBiZDg5Mw==', 'Content-Type': 'application/json', 'User-Agent': 'osparc-api/0.3.0/python'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0626feb0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_osparc_failing_simulation(client):
data = {
"opencor": {
"model_url": "https://models.physiomeproject.org/e/611/HumanSAN_Fabbri_Fantini_Wilders_Severi_2017.cellml",
"json_config": {
"simulation": {
"Ending point": 3.0,
"Point interval": 1.0,
},
"output": ["Membrane/V"]
}
},
"solver": {
"name": "simcore/services/comp/opencor",
"version": "1.0.3"
}
}
res = {
"status": "nok",
"description": "the simulation failed"
}
r = client.post("/start_simulation", json=data)
assert r.status_code == 200
json_data = json.loads(r.data)
assert (status := json_data.get("status")) is not None
assert status == "ok"
assert (check_simulation_data := json_data.get("data")) is not None
while True:
> r = client.post("/check_simulation", json=check_simulation_data)
tests/test_osparc.py:178:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1039: in post
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:1479: in check_simulation
return json.dumps(do_check_simulation(data))
app/osparc/osparc.py:153: in check_simulation
status = solvers_api.inspect_job(solver_name, solver_version, job_id)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/api/solvers_api.py:677: in inspect_job
return self.inspect_job_with_http_info(solver_key, version, job_id, **kwargs) # noqa: E501
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/api/solvers_api.py:763: in inspect_job_with_http_info
return self.api_client.call_api(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/api_client.py:351: in call_api
return self.__call_api(resource_path, method,
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/api_client.py:180: in __call_api
response_data = self.request(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/api_client.py:394: in request
return self.rest_client.POST(url,
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/rest.py:271: in POST
return self.request("POST", url,
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/osparc/rest.py:163: in request
r = self.pool_manager.request(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/request.py:78: in request
return self.request_encode_body(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/request.py:170: in request_encode_body
return self.urlopen(method, url, **extra_kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/poolmanager.py:375: in urlopen
response = conn.urlopen(method, u.request_uri, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:783: in urlopen
return self.urlopen(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:783: in urlopen
return self.urlopen(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:783: in urlopen
return self.urlopen(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755: in urlopen
retries = retries.increment(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=None, redirect=None, status=None)
method = 'POST'
url = '/v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0626feb0>
_stacktrace = <traceback object at 0x7f1d05831240>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='api.osparc.io', port=443): Max retries exceeded with url: /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060efd90>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
----------------------------- Captured stdout call -----------------------------
Error while setting featured dataset id: Method not allowed with this API key
Error while setting featured dataset id: Method not allowed with this API key
Error while setting featured dataset id: Method not allowed with this API key
Error while setting featured dataset id: Method not allowed with this API key
Error while setting featured dataset id: Method not allowed with this API key
----------------------------- Captured stderr call -----------------------------
WARNING:urllib3.connectionpool:Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060efa00>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect
WARNING:urllib3.connectionpool:Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060ef4c0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect
WARNING:urllib3.connectionpool:Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060eff70>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect
------------------------------ Captured log call -------------------------------
WARNING urllib3.connectionpool:connectionpool.py:780 Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060efa00>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect
WARNING urllib3.connectionpool:connectionpool.py:780 Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060ef4c0>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect
WARNING urllib3.connectionpool:connectionpool.py:780 Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d060eff70>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /v0/solvers/simcore%2Fservices%2Fcomp%2Fopencor/releases/1.0.3/jobs/f4852ca0-30fd-43d9-8567-9ce51c351d3f:inspect
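Note: the root cause here is not the solver itself but the repeated "[Errno -3] Temporary failure in name resolution": the build agent cannot resolve api.osparc.io, so the outbound osparc call fails before a request is ever sent (the scicrunch failures below share the same cause). One way to keep this test meaningful on an agent without outbound DNS is to stub the solver call the traceback shows failing (app/osparc/osparc.py:153). A rough sketch only, assuming solvers_api is a module-level object in app.osparc.osparc and that the payload shape below is close enough; both would need checking against the real code:

    from unittest.mock import MagicMock, patch

    def test_check_simulation_offline(client):
        # Payload shape is an assumption; the real value comes from /start_simulation.
        payload = {
            "job_id": "f4852ca0-30fd-43d9-8567-9ce51c351d3f",
            "solver": {"name": "simcore/services/comp/opencor", "version": "1.0.3"},
        }
        fake_status = MagicMock(state="FAILED", progress=100, stopped_at="2024-01-01")
        with patch("app.osparc.osparc.solvers_api") as solvers_api:
            solvers_api.inspect_job.return_value = fake_status
            r = client.post("/check_simulation", json=payload)
            assert r.status_code == 200  # illustrative; the exact contract lives in do_check_simulation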
___________________________ test_pmr_file_valid_path ___________________________
client = <FlaskClient <Flask 'app.main'>>
def test_pmr_file_valid_path(client):
r = client.post("/pmr_file", json={"path": "workspace/486/rawfile/55879cbc485e2d4c41f3dc6d60424b849f94c4ee/HumanSAN_Fabbri_Fantini_Wilders_Severi_2017.cellml"})
> assert r.status_code == 200
E assert 400 == 200
E + where 400 = <Response streamed [400 BAD REQUEST]>.status_code
tests/test_pmr.py:23: AssertionError
___________ test_pmr_latest_exposure_workspace_with_latest_exposure ____________
client = <FlaskClient <Flask 'app.main'>>
def test_pmr_latest_exposure_workspace_with_latest_exposure(client):
r = client.post("/pmr_latest_exposure", json={"workspace_url": "https://models.physiomeproject.org/workspace/486"})
> assert r.status_code == 200
E assert 400 == 200
E + where 400 = <Response streamed [400 BAD REQUEST]>.status_code
tests/test_pmr.py:43: AssertionError
__________ test_pmr_latest_exposure_workspace_without_latest_exposure __________
client = <FlaskClient <Flask 'app.main'>>
def test_pmr_latest_exposure_workspace_without_latest_exposure(client):
r = client.post("/pmr_latest_exposure", json={"workspace_url": "https://models.physiomeproject.org/workspace/698"})
> assert r.status_code == 200
E assert 400 == 200
E + where 400 = <Response streamed [400 BAD REQUEST]>.status_code
tests/test_pmr.py:50: AssertionError
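Note: all three /pmr_file and /pmr_latest_exposure tests return 400 instead of 200, which is consistent with the name-resolution failures above: the endpoints cannot reach models.physiomeproject.org from this agent. A hedged sketch of decoupling these tests from the network with unittest.mock follows; it assumes the handlers in app.main forward the request with requests.post, which is not visible in this log, and the faked payload shape is purely illustrative.

    from unittest import mock

    def test_pmr_latest_exposure_offline(client):
        # Fake upstream PMR response; shape is an assumption for illustration only.
        fake = mock.Mock(status_code=200)
        fake.json.return_value = {"collection": {"items": []}}
        with mock.patch("requests.post", return_value=fake):
            r = client.post("/pmr_latest_exposure",
                            json={"workspace_url": "https://models.physiomeproject.org/workspace/486"})
        assert r.status_code == 200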
_____________________________ test_scicrunch_keys ______________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041e8970>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041e8970>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041e8970>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d041e8730>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041e8970>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041e8970>
_stacktrace = <traceback object at 0x7f1d04197440>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
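Note: the MaxRetryError above is raised immediately because the pool was handed Retry(total=0, ...), the default for requests' HTTPAdapter, so connection errors are never retried. If retrying the SciCrunch calls is wanted, a session can be mounted with an explicit Retry policy; this is the generic urllib3/requests pattern, not something taken from app.main.

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    retry = Retry(total=3, connect=3, read=2, backoff_factor=0.5,
                  status_forcelist=(502, 503, 504))
    session.mount("https://", HTTPAdapter(max_retries=retry))
    # session.get(...) now retries transient connection failures with exponential backoff.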
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_keys(client):
> r = client.get('/search/')
tests/test_scicrunch.py:26:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:713: in kb_search
response = requests.get(f'{Config.SCI_CRUNCH_HOST}/_search?q={query}&size={limit}&from={start}&api_key={Config.KNOWLEDGEBASE_KEY}')
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:76: in get
return request('get', url, params=params, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d041e8730>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?q=&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d041e8f40>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
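Note: test_scicrunch_keys ultimately fails because the ConnectionError from requests propagates straight out of kb_search (app/main.py:713) through the Flask test client. A hedged sketch of containing that failure is shown below; only the requests.get call is taken from the traceback, while the timeout value, helper name and 502 payload are illustrative choices.

    import requests
    from flask import jsonify

    def kb_search_guarded(url):
        # Wrap the upstream SciCrunch query so resolver/connection failures become a clean 502.
        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
            return jsonify(response.json())
        except requests.exceptions.RequestException as exc:
            return jsonify({"error": f"SciCrunch is unreachable: {exc}"}), 502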
____________________ test_scicrunch_versions_are_supported _____________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fb7490>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fb7490>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fb7490>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
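Note: test_scicrunch_versions_are_supported hits the same resolver failure as the previous test. Until the outbound calls are mocked, one pragmatic option is to mark network-dependent tests so they skip cleanly on agents without DNS; the conftest.py-style sketch below uses the host seen in the traceback, but the marker name is an invention for illustration.

    import socket
    import pytest

    def _can_resolve(host="scicrunch.org", port=443):
        try:
            socket.getaddrinfo(host, port)
            return True
        except socket.gaierror:
            return False

    # Hypothetical marker; apply as @requires_network on tests that call external services.
    requires_network = pytest.mark.skipif(
        not _can_resolve(), reason="no DNS/outbound network on this build agent"
    )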
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d09fb7700>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fb7490>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fb7490>
_stacktrace = <traceback object at 0x7f1d09f99a00>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22 (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
def test_scicrunch_versions_are_supported():
# Lines below are to allow the test to be run from the root dir or sparc-api/tests
current_directory = os.path.dirname(os.path.abspath(__file__))
app_directory = os.path.join(current_directory, '..', 'app')
# List the contents of the 'app' directory, to find which versions we have files for
available_versions = os.listdir(app_directory)
> r = requests.get(f'{Config.SCI_CRUNCH_HOST}/_search?api_key={Config.KNOWLEDGEBASE_KEY}&q=""')
tests/test_scicrunch.py:38:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:76: in get
return request('get', url, params=params, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d09fb7700>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf&q=%22%22 (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fb7c10>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
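The traceback above bottoms out in socket.gaierror "[Errno -2] Name or service not known" while resolving scicrunch.org, so test_scicrunch_versions_are_supported never reaches SciCrunch: the build agent simply has no working DNS route to that host. Below is a minimal sketch of a reachability guard that could skip these network-dependent tests instead of failing them; the import path app.config for Config and the use of pytest.mark.skipif are assumptions for illustration, not the project's existing test code.

import socket
from urllib.parse import urlparse

import pytest

from app.config import Config  # assumed location of the Config used in tests/test_scicrunch.py


def scicrunch_reachable(timeout=5):
    # True when the SciCrunch host both resolves and accepts a TCP connection on 443.
    host = urlparse(Config.SCI_CRUNCH_HOST).hostname or "scicrunch.org"
    try:
        socket.create_connection((host, 443), timeout=timeout).close()
        return True
    except OSError:  # socket.gaierror ([Errno -2]) is a subclass of OSError
        return False


# Decorate the SciCrunch tests with this marker to skip them on agents without external DNS.
requires_scicrunch = pytest.mark.skipif(
    not scicrunch_reachable(),
    reason="scicrunch.org is not resolvable/reachable from this build agent",
)

With such a marker applied to the SciCrunch tests, agents without outbound DNS would report skips rather than ConnectionError failures like the one above.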
__________________________ test_scicrunch_dataset_doi __________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04146100>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041530d0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"query": {"term": {"pennsieve.identifier.aggregate": "55"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '61', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041530d0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04146100>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"query": {"term": {"pennsieve.identifier.aggregate": "55"}}}', 'headers': {'User-Agent': 'python-requests/...ip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '61', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041530d0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04146100>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04146100>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04146100>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d04146100>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d04153640>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041530d0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"query": {"term": {"pennsieve.identifier.aggregate": "55"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '61', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04146100>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041530d0>
_stacktrace = <traceback object at 0x7f1d0416a180>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04146100>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_dataset_doi(client):
# Testing with dataset 55
identifier = "55"
> run_doi_test = check_doi_status(client, identifier, '10.26275/pzek-91wx')
tests/test_scicrunch.py:68:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_scicrunch.py:53: in check_doi_status
r = client.get('/dataset_info/using_pennsieve_identifier', query_string={'identifier': dataset_id})
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:624: in get_dataset_info_pennsieve_identifier
return reform_dataset_results(dataset_search(query))
app/main.py:690: in dataset_search
response = requests.post(f'{Config.SCI_CRUNCH_HOST}/_search',
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d04153640>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04146100>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
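test_scicrunch_dataset_doi fails for the same underlying reason: the Flask view behind /dataset_info/using_pennsieve_identifier (app/main.py, dataset_search) posts to Config.SCI_CRUNCH_HOST and the lookup of scicrunch.org fails on this agent. Below is a minimal sketch, assuming the existing client fixture and a module-level "import requests" in app/main.py, of exercising that route without the network by patching the outbound POST; the fake Elasticsearch-style payload is a hypothetical placeholder, not SciCrunch's real response shape.

from unittest import mock


def test_dataset_info_without_network(client):
    # Hypothetical Elasticsearch-style payload; the real SciCrunch response shape may differ.
    fake_hit = {"hits": {"hits": [{"_source": {"item": {"curie": "DOI:10.26275/pzek-91wx"}}}]}}
    fake_response = mock.Mock(status_code=200)
    fake_response.json.return_value = fake_hit

    # app/main.py calls requests.post(f'{Config.SCI_CRUNCH_HOST}/_search', ...) (see the
    # traceback above), so the patch target app.main.requests.post assumes requests is
    # imported at module level there.
    with mock.patch("app.main.requests.post", return_value=fake_response) as mocked_post:
        client.get(
            "/dataset_info/using_pennsieve_identifier",
            query_string={"identifier": "55"},
        )

    # Only verifies that the route reached out to SciCrunch through the patched call;
    # asserting on the response body would require the placeholder payload to match
    # whatever reform_dataset_results expects.
    mocked_post.assert_called_once()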
_____________________ test_scicrunch_multiple_dataset_doi ______________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf7949f10>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"query": {"term": {"pennsieve.identifier.aggregate": "55"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '61', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf7949f10>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"query": {"term": {"pennsieve.identifier.aggregate": "55"}}}', 'headers': {'User-Agent': 'python-requests/...ip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '61', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf7949f10>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
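urllib3 wraps the gaierror in NewConnectionError, so every test that reaches out to scicrunch.org fails the same way on this agent. One hedged option (a sketch, not something present in tests/test_scicrunch.py) is to probe DNS once and skip the integration tests instead of erroring them:

import socket

import pytest

def _can_resolve(host: str) -> bool:
    # True only if this machine can resolve the external host.
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False

# Hypothetical marker: apply @requires_scicrunch to the SciCrunch tests so a
# DNS-less build agent reports them as skipped rather than errored.
requires_scicrunch = pytest.mark.skipif(
    not _can_resolve("scicrunch.org"),
    reason="scicrunch.org cannot be resolved from this build agent",
)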
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1cf79490d0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf7949f10>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"query": {"term": {"pennsieve.identifier.aggregate": "55"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '61', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf7949f10>
_stacktrace = <traceback object at 0x7f1d0563a2c0>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
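The Retry(total=0, ...) shown in these frames is requests' default adapter configuration, so a single failed connection attempt exhausts the budget and surfaces as MaxRetryError. A sketch (assumption: the client code owns its own Session; this is not what app/main.py currently does) of mounting an adapter with a small retry budget and backoff:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Allow a few connection retries with exponential backoff for transient DNS
# or network blips; 502/503/504 responses are retried as well.
session = requests.Session()
retry = Retry(total=3, connect=3, backoff_factor=0.5,
              status_forcelist=(502, 503, 504))
session.mount("https://", HTTPAdapter(max_retries=retry))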
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_multiple_dataset_doi(client):
# Testing with dataset 55 and 68
> run_doi_test_1 = check_doi_status(client, "55", '10.26275/pzek-91wx')
tests/test_scicrunch.py:84:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_scicrunch.py:53: in check_doi_status
r = client.get('/dataset_info/using_pennsieve_identifier', query_string={'identifier': dataset_id})
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:624: in get_dataset_info_pennsieve_identifier
return reform_dataset_results(dataset_search(query))
app/main.py:690: in dataset_search
response = requests.post(f'{Config.SCI_CRUNCH_HOST}/_search',
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1cf79490d0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1cf7958d30>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
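requests re-raises the MaxRetryError as requests.exceptions.ConnectionError, which propagates out of dataset_search and turns the Flask test request into a 500. A sketch only (hypothetical helper, not the implementation in app/main.py) of catching the upstream failure in the view path and answering with a 502:

import requests
from flask import jsonify

def dataset_search_guarded(url, query):
    # Must run inside a Flask request context; `url` and `query` stand in for
    # Config.SCI_CRUNCH_HOST + '/_search' and the Elasticsearch query body.
    try:
        resp = requests.post(url, json=query, timeout=10)
        resp.raise_for_status()
        return jsonify(resp.json())
    except requests.exceptions.RequestException as exc:
        return jsonify({"error": f"SciCrunch request failed: {exc}"}), 502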
_____________________ test_scicrunch_multiple_dataset_ids ______________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
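The second failure, test_scicrunch_multiple_dataset_ids, dies on the same unresolvable host. Since app/main.py calls requests.post directly (app/main.py:690 above), patching app.main.requests.post lets the Flask test client exercise the endpoint with no DNS lookup at all. A sketch of a hermetic variant (hypothetical test name; the empty-hits payload shape is an assumption about what reform_dataset_results accepts):

from unittest import mock

def test_scicrunch_multiple_dataset_ids_offline(client):
    fake = mock.Mock()
    fake.status_code = 200
    fake.json.return_value = {"hits": {"hits": []}}
    with mock.patch("app.main.requests.post", return_value=fake):
        r = client.get('/dataset_info/using_multiple_discoverIds/?discoverIds=55&discoverIds=68')
        assert r.status_code == 200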
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0a13cf70>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 999, "query": {"terms": {"pennsieve.identifier": ["55", "68"]}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '73', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0a13cf70>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 999, "query": {"terms": {"pennsieve.identifier": ["55", "68"]}}}', 'headers': {'User-Agent': 'pyth...ip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '73', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0a13cf70>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
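The variable dumps in these frames also show Timeout(connect=None, read=None, total=None): the POST in app/main.py is sent without an explicit timeout, so a hung (rather than unresolvable) upstream would block the worker indefinitely. A sketch of passing a (connect, read) timeout tuple, which requests converts into the urllib3 Timeout object seen above (API key omitted; the query body is copied from this log):

import requests

try:
    requests.post(
        "https://scicrunch.org/api/1/elastic/SPARC_PortalDatasets_pr/_search",
        json={"size": 999, "query": {"terms": {"pennsieve.identifier": ["55", "68"]}}},
        timeout=(3.05, 27),  # seconds allowed to connect, then to read
    )
except requests.exceptions.RequestException as exc:
    print(f"SciCrunch request failed: {exc}")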
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a13c0d0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0a13cf70>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 999, "query": {"terms": {"pennsieve.identifier": ["55", "68"]}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '73', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0a13cf70>
_stacktrace = <traceback object at 0x7f1d0a12ddc0>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_multiple_dataset_ids(client):
# Testing with dataset 55 and 68
> r = client.get('/dataset_info/using_multiple_discoverIds/?discoverIds=55&discoverIds=68')
tests/test_scicrunch.py:102:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:589: in get_dataset_info_discoverIds
return process_results(dataset_search(query))
app/main.py:690: in dataset_search
response = requests.post(f'{Config.SCI_CRUNCH_HOST}/_search',
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a13c0d0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d0a136d90>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
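
Note: every failure in this run shares one root cause — the build agent cannot resolve scicrunch.org (socket.gaierror [Errno -2] Name or service not known), so the Retry(total=0) budget is exhausted immediately and requests surfaces a ConnectionError. One way to keep tests like test_scicrunch_multiple_dataset_ids independent of external DNS is to stub the outbound call. The sketch below is only an illustration, assuming app.main imports requests at module level; the test name and the canned payload shape are not project code.

from unittest import mock

def test_scicrunch_multiple_dataset_ids_offline(client):
    # Canned Elasticsearch-style payload; the exact shape process_results()
    # expects is an assumption here.
    fake_payload = {"hits": {"total": 2, "hits": []}}
    fake_response = mock.Mock(status_code=200)
    fake_response.json.return_value = fake_payload

    # app/main.py:690 issues requests.post(f'{Config.SCI_CRUNCH_HOST}/_search', ...),
    # so the patch targets the name as it is looked up inside app.main.
    with mock.patch('app.main.requests.post', return_value=fake_response):
        r = client.get('/dataset_info/using_multiple_discoverIds/?discoverIds=55&discoverIds=68')
    assert r.status_code == 200

Patching the call at its point of use keeps the Flask route under test while removing the dependency on scicrunch.org being reachable from the build slave.
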
____________________________ test_scicrunch_search _____________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d041df520>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041df4c0>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041df4c0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d041df520>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041df4c0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d041df520>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d041df520>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d041df520>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d041df520>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1cf7a37100>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041df4c0>
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d041df520>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d041df4c0>
_stacktrace = <traceback object at 0x7f1d09f22480>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d041df520>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_search(client):
> r = client.get('/search/heart')
tests/test_scicrunch.py:113:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:713: in kb_search
response = requests.get(f'{Config.SCI_CRUNCH_HOST}/_search?q={query}&size={limit}&from={start}&api_key={Config.KNOWLEDGEBASE_KEY}')
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:76: in get
return request('get', url, params=params, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1cf7a37100>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?q=heart&size=10&from=0&api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d041df520>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
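
test_scicrunch_search fails the same way via the GET issued at app/main.py:713. If these tests are intended to exercise the live SciCrunch service rather than a mock, a hedged alternative is to skip them when the agent cannot resolve the host. The requires_scicrunch marker below is a sketch only, not an existing marker in the repository.

import socket

import pytest


def _scicrunch_resolvable(host="scicrunch.org", port=443):
    # Mirrors the lookup that fails above with [Errno -2] Name or service not known.
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        return False


requires_scicrunch = pytest.mark.skipif(
    not _scicrunch_resolvable(),
    reason="scicrunch.org is not resolvable from this build agent",
)


@requires_scicrunch
def test_scicrunch_search(client):
    r = client.get('/search/heart')
    assert r.status_code == 200
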
___________________________ test_scicrunch_all_data ____________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c65790>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '23', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c65790>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0}', 'headers': {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '23', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
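The timeout machinery above accepts either a bare number or a urllib3.util.Timeout carrying separate connect/read budgets. A minimal sketch of the latter, with a placeholder URL and illustrative values rather than anything from this build:

import urllib3
from urllib3.util import Timeout

# Fail fast if the TCP connect stalls, but allow a slower read of the body.
http = urllib3.PoolManager()
resp = http.request("GET", "https://example.org/", timeout=Timeout(connect=3.0, read=10.0))
print(resp.status)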
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c65790>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
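The socket_options handled in this frame are what give urllib3 connections TCP_NODELAY by default ((6, 1, 1) is (IPPROTO_TCP, TCP_NODELAY, 1) on Linux). Extra options can be supplied when the pool is built; a sketch, not taken from this project's code:

import socket
import urllib3
from urllib3.connection import HTTPConnection

# Start from urllib3's defaults and append SO_KEEPALIVE as an example extra option.
opts = HTTPConnection.default_socket_options + [(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)]
http = urllib3.PoolManager(socket_options=opts)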
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d09f6a970>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
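At the caller level, the timeout and verify arguments that send() documents above are passed straight to requests; a minimal sketch with a placeholder URL and illustrative values:

import requests

# (connect timeout, read timeout) in seconds; verify=True checks the TLS certificate,
# or a filesystem path to a CA bundle may be given instead.
r = requests.post(
    "https://example.org/search",
    json={"size": 10, "from": 0},
    timeout=(3.05, 27),
    verify=True,
)
print(r.status_code)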
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c65790>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '23', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
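Because urlopen() is the lowest-level call, the request the adapter hands over above can also be issued directly against a pool. A sketch assuming urllib3's 1.26-era API, reusing the host and body from the failing call but omitting the query-string API key:

import urllib3
from urllib3.util.retry import Retry

pool = urllib3.HTTPSConnectionPool("scicrunch.org", port=443)
resp = pool.urlopen(
    method="POST",
    url="/api/1/elastic/SPARC_PortalDatasets_pr/_search",
    body=b'{"size": 10, "from": 0}',
    headers={"Content-Type": "application/json"},
    retries=Retry(total=0),   # mirror requests' adapter: do not retry
    preload_content=False,    # caller reads the body or releases the connection itself
)
print(resp.status)
resp.release_conn()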
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c65790>
_stacktrace = <traceback object at 0x7f1cf7ac2600>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
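increment() is what turns the adapter's default Retry(total=0) into the MaxRetryError above. A retry budget with backoff is normally configured on the adapter instead, roughly as sketched below with illustrative values; note that retries cannot help in this build, because the hostname never resolves:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retry))
# session.post(...) now retries transient connection and 5xx failures up to three times.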
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_all_data(client):
> r = client.get('/filter-search/')
tests/test_scicrunch.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:734: in filter_search
response = requests.post(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d09f6a970>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04c65430>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
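Both failures in this run have the same root cause: the build agent cannot resolve scicrunch.org ([Errno -2] Name or service not known), so every outbound call made by the /filter-search/ view fails before a socket exists. One way to keep these tests runnable without outbound DNS is to stub the HTTP call. A minimal sketch that reuses the existing client fixture and assumes the view reaches the network via requests.post in app/main.py, as the trace shows; the test name, fake payload and assertion are placeholders, not the real SciCrunch response shape:

from unittest.mock import MagicMock, patch

def test_filter_search_offline(client):
    # Replace the outbound POST with a canned response so no DNS lookup is attempted.
    fake = MagicMock()
    fake.status_code = 200
    fake.json.return_value = {"hits": {"total": 0, "hits": []}}   # placeholder payload
    with patch("app.main.requests.post", return_value=fake):
        r = client.get('/filter-search/')
    assert r.status_code == 200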
____________________________ test_scicrunch_filter _____________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
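create_connection() is where the resolve-and-connect step happens, so the same step can be exercised on the build agent to check whether outbound connectivity exists at all. A sketch using the host from the failing request:

import socket

try:
    sock = socket.create_connection(("scicrunch.org", 443), timeout=5)
    print("connected to", sock.getpeername())
    sock.close()
except socket.gaierror as exc:
    print("name resolution failed:", exc)   # the failure mode seen in this build
except OSError as exc:
    print("connect failed:", exc)           # resolvable but unreachable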
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
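The gaierror above comes straight from the resolver, so the lookup can be run in isolation to distinguish a DNS problem from a blocked connection:

import socket

try:
    infos = socket.getaddrinfo("scicrunch.org", 443, socket.AF_INET, socket.SOCK_STREAM)
    print(infos[0][4])               # first resolved (address, port) pair
except socket.gaierror as exc:
    print("resolution failed:", exc) # [Errno -2] Name or service not known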
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0563e220>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(anatomy.organ.name.aggregate:((heart)))"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '105', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0563e220>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(anatomy.organ.name.aggregate:((heart)))"}}}',...p, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '105', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0563e220>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a103f10>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0563e220>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(anatomy.organ.name.aggregate:((heart)))"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '105', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d0563e220>
_stacktrace = <traceback object at 0x7f1cf7a3ed00>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_filter(client):
> r = client.get('/filter-search/', query_string={'term': 'organ', 'facet': 'heart'})
tests/test_scicrunch.py:124:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:734: in filter_search
response = requests.post(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a103f10>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d0563e9a0>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
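The failure above is environmental rather than a code defect: the build agent cannot resolve scicrunch.org ([Errno -2] Name or service not known), so the requests.post call at app/main.py:734 (filter_search) never reaches the SciCrunch Elasticsearch endpoint and requests surfaces the urllib3 MaxRetryError as a ConnectionError. A minimal sketch of how such a test could be decoupled from outbound DNS follows; it assumes the existing pytest client fixture, assumes requests is imported at module level in app.main, and the stubbed payload shape and test name are purely illustrative.

# Hypothetical sketch, not part of this build: stub the SciCrunch call so the
# test does not depend on network access from the CI agent.
from unittest.mock import patch

def test_scicrunch_filter_offline(client):
    # Invented minimal payload; the real Elasticsearch response body differs.
    fake_payload = {"hits": {"total": 1, "hits": []}}

    # filter_search calls requests.post(...), so patching the requests module
    # as seen by app.main intercepts that outbound call.
    with patch("app.main.requests.post") as mock_post:
        mock_post.return_value.status_code = 200
        mock_post.return_value.json.return_value = fake_payload
        r = client.get('/filter-search/', query_string={'term': 'organ', 'facet': 'heart'})

    # Exact assertions depend on how filter_search reshapes the payload; the
    # key point is that no real connection to scicrunch.org is attempted.
    mock_post.assert_called_once()
    assert r.status_code == 200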
_______________________ test_scicrunch_filter_scaffolds ________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04dae100>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "objects.additional_mimetype.name:(application%2fx.vnd.abi.scaffold.meta%2Bjson)"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '144', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04dae100>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0, "query": {"query_string": {"query": "objects.additional_mimetype.name:(application%...p, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '144', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04dae100>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d04dae400>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04dae100>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "objects.additional_mimetype.name:(application%2fx.vnd.abi.scaffold.meta%2Bjson)"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '144', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04dae100>
_stacktrace = <traceback object at 0x7f1d04d912c0>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_filter_scaffolds(client):
> r = client.get('/filter-search/?facet=scaffolds&term=datasets')
tests/test_scicrunch.py:129:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:734: in filter_search
response = requests.post(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d04dae400>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04d8df40>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
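The repeated adapter frames above also show the outbound call running with requests' defaults: timeout = Timeout(connect=None, read=None, total=None) (no timeout at all) and retries = Retry(total=0, ...) (the HTTPAdapter default of zero retries). One common way to bound and retry the SciCrunch request is sketched below, assuming a plain requests Session; the retry budget, backoff, timeout values and the <SCICRUNCH_API_KEY> placeholder are illustrative choices, not values taken from this build.

# Hypothetical sketch, not part of this build: explicit timeout and a small
# retry budget for the SciCrunch Elasticsearch request.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(
    total=3,                         # up to 3 retries after the first attempt
    connect=3,
    read=2,
    backoff_factor=0.5,              # exponential backoff between retries
    status_forcelist=(502, 503, 504),
)
session.mount("https://", HTTPAdapter(max_retries=retry))

response = session.post(
    "https://scicrunch.org/api/1/elastic/SPARC_PortalDatasets_pr/_search",
    params={"api_key": "<SCICRUNCH_API_KEY>"},   # placeholder, not a real key
    json={"size": 10, "from": 0,
          "query": {"query_string": {"query": "(Heart)"}}},
    timeout=(5, 30),                 # (connect, read) seconds
)
response.raise_for_status()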
_________________________ test_scicrunch_basic_search __________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf794eee0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(Heart)"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '72', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf794eee0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(Heart)"}}}', 'headers': {'User-Agent': 'pytho...ip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '72', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf794eee0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a0bd2e0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf794eee0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(Heart)"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '72', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1cf794eee0>
_stacktrace = <traceback object at 0x7f1d0a0c8980>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
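The retry.py frame above is where requests' default adapter policy gives up: the adapter passes Retry(total=0, read=False) into urlopen, so the very first connection error exhausts the Retry object and is re-raised as MaxRetryError. A minimal sketch of that mechanism, assuming only urllib3 is installed and using None in place of the real connection/pool objects purely for illustration:

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, read=False)  # same policy as shown in the frames above
try:
    # One connection-class error with total already at zero exhausts the policy.
    retry.increment(
        method="POST",
        url="/_search",
        error=NewConnectionError(None, "Failed to establish a new connection"),
    )
except MaxRetryError as exc:
    print(exc.reason)  # the original NewConnectionError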
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_basic_search(client):
> r = client.get('/filter-search/Heart/?facet=All+Species&term=species')
tests/test_scicrunch.py:134:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:734: in filter_search
response = requests.post(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a0bd2e0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1cf794eb20>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
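Each failure in this run follows the same chain shown above: socket.getaddrinfo cannot resolve scicrunch.org on the build agent (gaierror -2), urllib3 wraps that in NewConnectionError, the exhausted Retry converts it into MaxRetryError, and requests finally surfaces it as requests.exceptions.ConnectionError. A self-contained sketch that reproduces the chain, assuming requests is installed and using the reserved .invalid TLD (the URL and body are illustrative, not the SciCrunch endpoint):

import requests

def probe():
    # Names under the reserved ".invalid" TLD never resolve, so this triggers the
    # gaierror -> NewConnectionError -> MaxRetryError -> ConnectionError chain.
    try:
        requests.post(
            "https://does-not-resolve.invalid/_search",
            json={"query": {"query_string": {"query": "(Heart)"}}},
            timeout=5,
        )
    except requests.exceptions.ConnectionError as exc:
        print(type(exc).__name__, exc)

probe()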
_________________________ test_scicrunch_image_search __________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d040c61c0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"query": {"query_string": {"fields": ["*mimetype.name"], "query": "\\"[\'\\"*jp2* OR *vnd.ome.xml* OR *jpx*\\"\']\\""}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '115', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d040c61c0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"query": {"query_string": {"fields": ["*mimetype.name"], "query": "\\"[\'\\"*jp2* OR *vnd.ome.xml* OR *jpx...p, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '115', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d040c61c0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a00e5e0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d040c61c0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"query": {"query_string": {"fields": ["*mimetype.name"], "query": "\\"[\'\\"*jp2* OR *vnd.ome.xml* OR *jpx*\\"\']\\""}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '115', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d040c61c0>
_stacktrace = <traceback object at 0x7f1d0a0ea1c0>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_image_search(client):
> r = client.get('/multiple_dataset_info/using_multiple_mimetype/?q="*jp2*+OR+*vnd.ome.xml*+OR+*jpx*"')
tests/test_scicrunch.py:138:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:580: in get_file_info_from_mimetype
return process_results(dataset_search(query))
app/main.py:690: in dataset_search
response = requests.post(f'{Config.SCI_CRUNCH_HOST}/_search',
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d0a00e5e0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d040c6a90>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
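These scicrunch test failures share one root cause: the build agent cannot resolve scicrunch.org, so the requests.post call made by filter_search in app/main.py fails with [Errno -2] Name or service not known before any HTTP request is sent. One way to keep the /filter-search/ tests runnable without outbound DNS is to stub that HTTP call. The sketch below is illustrative only; the patch target "app.main.requests.post", the Flask test "client" fixture, and the fake Elasticsearch payload are assumptions, not existing project code.

# Hedged sketch: exercise the /filter-search/ route with the SciCrunch call stubbed out.
# The patch target, fixture name, and payload shape are assumptions for illustration.
from unittest import mock

def test_filter_search_offline(client):
    fake_es_payload = {"hits": {"total": 1, "hits": [{"_id": "dataset-1"}]}}
    fake_response = mock.Mock(status_code=200)
    fake_response.json.return_value = fake_es_payload

    # Replace the outbound POST to scicrunch.org with the canned response above.
    with mock.patch("app.main.requests.post", return_value=fake_response) as mocked_post:
        r = client.get("/filter-search/?facet=male&term=gender")

    assert r.status_code == 200
    mocked_post.assert_called_once()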
_________________________ test_scicrunch_boolean_logic _________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c794f0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(attributes.subject.sex.value:((male) OR (female)))"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '116', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c794f0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(attributes.subject.sex.value:((male) OR (fema...p, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '116', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c794f0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d041def40>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c794f0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(attributes.subject.sex.value:((male) OR (female)))"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '116', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d04c794f0>
_stacktrace = <traceback object at 0x7f1d09f31380>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_boolean_logic(client):
> r = client.get('/filter-search/?facet=All+Species&term=species&facet=male&term=gender&facet=female&term=gender')
tests/test_scicrunch.py:142:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:734: in filter_search
response = requests.post(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d041def40>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d04c79610>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
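If these tests are meant to keep exercising the live SciCrunch endpoint, an alternative is to skip them whenever scicrunch.org cannot be resolved from the agent, so a DNS outage surfaces as skipped tests rather than repeated ConnectionError tracebacks. The module-level skip marker below is a sketch under that assumption; the helper name is hypothetical and not part of tests/test_scicrunch.py.

# Hedged sketch: skip live-API tests when scicrunch.org is unresolvable from this agent.
import socket
import pytest

def _scicrunch_resolvable(host="scicrunch.org"):
    # Mirrors the lookup that fails in the tracebacks above.
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False

pytestmark = pytest.mark.skipif(
    not _scicrunch_resolvable(),
    reason="scicrunch.org cannot be resolved from this build agent",
)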
______________________ test_scicrunch_combined_facet_text ______________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fcf8b0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(heart) AND (attributes.subject.sex.value:((male) OR (female)))"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '128', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fcf8b0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(heart) AND (attributes.subject.sex.value:((ma...p, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '128', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fcf8b0>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d04c73ee0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fcf8b0>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"query": "(heart) AND (attributes.subject.sex.value:((male) OR (female)))"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '128', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d09fcf8b0>
_stacktrace = <traceback object at 0x7f1d057ec600>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_scicrunch_combined_facet_text(client):
> r = client.get('/filter-search/heart/?facet=All+Species&term=species&facet=male&term=gender&facet=female&term=gender')
tests/test_scicrunch.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:734: in filter_search
response = requests.post(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d04c73ee0>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fcf3d0>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
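Editor's note: the failure above is not a SciCrunch outage but a DNS problem on the build host ([Errno -2] Name or service not known for scicrunch.org), so the POST issued from filter_search in app/main.py never leaves the machine. A minimal sketch of isolating this test from outbound DNS, assuming app/main.py imports requests and calls requests.post as the traceback shows; the test name and the fake payload shape are illustrative, not the project's:

from unittest import mock

def test_filter_search_without_dns(client):
    # Stand-in response; the real SciCrunch/Elasticsearch schema is not shown in this log.
    fake = mock.Mock(status_code=200)
    fake.json.return_value = {"hits": {"hits": [], "total": 0}}  # hypothetical shape
    # Patch the outbound call in the handler's namespace so no network/DNS is needed.
    with mock.patch("app.main.requests.post", return_value=fake) as mocked:
        client.get('/filter-search/heart/?facet=All+Species&term=species'
                   '&facet=male&term=gender&facet=female&term=gender')
    # Verify the endpoint attempted the SciCrunch query without relying on its output format.
    mocked.assert_called_once()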
_____________________________ test_getting_facets ______________________________
client = <FlaskClient <Flask 'app.main'>>
def test_getting_facets(client):
r = client.get('/get-facets/organ')
facet_results = json.loads(r.data)
facets = [facet_result['key'] for facet_result in facet_results]
> assert 'heart' in facets
E AssertionError: assert 'heart' in []
tests/test_scicrunch.py:155: AssertionError
----------------------------- Captured stderr call -----------------------------
--- Logging error ---
Traceback (most recent call last):
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py", line 169, in _new_conn
conn = connection.create_connection(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py", line 73, in create_connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
File "/usr/lib/python3.9/socket.py", line 966, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
httplib_response = self._make_request(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 382, in _make_request
self._validate_conn(conn)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 1010, in _validate_conn
conn.connect()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py", line 353, in connect
conn = self._new_conn()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py", line 181, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
retries = retries.increment(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py", line 574, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/cmiss/Jenkins/workspace/SPARC-API/app/main.py", line 758, in get_facets
response = requests.post(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py", line 119, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.9/logging/__init__.py", line 1083, in emit
msg = self.format(record)
File "/usr/lib/python3.9/logging/__init__.py", line 927, in format
return fmt.format(record)
File "/usr/lib/python3.9/logging/__init__.py", line 663, in format
record.message = record.getMessage()
File "/usr/lib/python3.9/logging/__init__.py", line 367, in getMessage
msg = msg % self.args
TypeError: not all arguments converted during string formatting
Call stack:
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/bin/pytest", line 8, in <module>
sys.exit(main())
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/config/__init__.py", line 124, in main
ret = config.hook.pytest_cmdline_main(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 240, in pytest_cmdline_main
return wrap_session(config, _main)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 191, in wrap_session
session.exitstatus = doit(config, session) or 0
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 247, in _main
config.hook.pytest_runtestloop(session=session)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 272, in pytest_runtestloop
item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 85, in pytest_runtest_protocol
runtestprotocol(item, nextitem=nextitem)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 100, in runtestprotocol
reports.append(call_and_report(item, "call", log))
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 186, in call_and_report
call = call_runtest_hook(item, when, **kwds)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 216, in call_runtest_hook
return CallInfo.from_call(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 244, in from_call
result = func()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 217, in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 135, in pytest_runtest_call
item.runtest()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/python.py", line 1477, in runtest
self.ihook.pytest_pyfunc_call(pyfuncitem=self)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/python.py", line 182, in pytest_pyfunc_call
result = testfunction(**testargs)
File "/home/cmiss/Jenkins/workspace/SPARC-API/tests/test_scicrunch.py", line 152, in test_getting_facets
r = client.get('/get-facets/organ')
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 1029, in get
return self.open(*args, **kw)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py", line 222, in open
return Client.open(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 993, in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 884, in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 1119, in run_wsgi_app
app_rv = app(environ, start_response)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 2463, in __call__
return self.wsgi_app(environ, start_response)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 2446, in wsgi_app
response = self.full_dispatch_request()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 1949, in full_dispatch_request
rv = self.dispatch_request()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 1935, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/cmiss/Jenkins/workspace/SPARC-API/app/main.py", line 767, in get_facets
logging.error(f"Could not search SciCrunch for path {path}", ex)
Message: 'Could not search SciCrunch for path anatomy.organ.name.aggregate'
Arguments: (ConnectionError(MaxRetryError("HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known'))")),)
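Editor's note: the "--- Logging error ---" block above is a second, unrelated bug. app/main.py line 767 calls logging.error(f"Could not search SciCrunch for path {path}", ex), so the already-formatted f-string receives ex as a stray %-format argument and the logging machinery raises "not all arguments converted during string formatting". A minimal sketch of the conventional fix, reusing the names path and ex from the log (the surrounding try/except is a stand-in for the handler's real one):

import logging

path = "anatomy.organ.name.aggregate"  # value taken from the log message above
try:
    raise ConnectionError("stand-in for the requests.exceptions.ConnectionError seen above")
except Exception as ex:
    # Let logging interpolate lazily and attach the exception via exc_info ...
    logging.error("Could not search SciCrunch for path %s", path, exc_info=ex)
    # ... or, from inside the except block, equivalently:
    logging.exception("Could not search SciCrunch for path %s", path)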
--- Logging error ---
Traceback (most recent call last):
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py", line 169, in _new_conn
conn = connection.create_connection(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py", line 73, in create_connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
File "/usr/lib/python3.9/socket.py", line 966, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
httplib_response = self._make_request(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 382, in _make_request
self._validate_conn(conn)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 1010, in _validate_conn
conn.connect()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py", line 353, in connect
conn = self._new_conn()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py", line 181, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
retries = retries.increment(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py", line 574, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/cmiss/Jenkins/workspace/SPARC-API/app/main.py", line 758, in get_facets
response = requests.post(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py", line 119, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.9/logging/__init__.py", line 1083, in emit
msg = self.format(record)
File "/usr/lib/python3.9/logging/__init__.py", line 927, in format
return fmt.format(record)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/logging.py", line 74, in format
return super().format(record)
File "/usr/lib/python3.9/logging/__init__.py", line 663, in format
record.message = record.getMessage()
File "/usr/lib/python3.9/logging/__init__.py", line 367, in getMessage
msg = msg % self.args
TypeError: not all arguments converted during string formatting
Call stack:
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/bin/pytest", line 8, in <module>
sys.exit(main())
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/config/__init__.py", line 124, in main
ret = config.hook.pytest_cmdline_main(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 240, in pytest_cmdline_main
return wrap_session(config, _main)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 191, in wrap_session
session.exitstatus = doit(config, session) or 0
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 247, in _main
config.hook.pytest_runtestloop(session=session)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/main.py", line 272, in pytest_runtestloop
item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 85, in pytest_runtest_protocol
runtestprotocol(item, nextitem=nextitem)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 100, in runtestprotocol
reports.append(call_and_report(item, "call", log))
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 186, in call_and_report
call = call_runtest_hook(item, when, **kwds)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 216, in call_runtest_hook
return CallInfo.from_call(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 244, in from_call
result = func()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 217, in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/runner.py", line 135, in pytest_runtest_call
item.runtest()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/python.py", line 1477, in runtest
self.ihook.pytest_pyfunc_call(pyfuncitem=self)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
res = hook_impl.function(*args)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/python.py", line 182, in pytest_pyfunc_call
result = testfunction(**testargs)
File "/home/cmiss/Jenkins/workspace/SPARC-API/tests/test_scicrunch.py", line 152, in test_getting_facets
r = client.get('/get-facets/organ')
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 1029, in get
return self.open(*args, **kw)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py", line 222, in open
return Client.open(
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 993, in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 884, in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py", line 1119, in run_wsgi_app
app_rv = app(environ, start_response)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 2463, in __call__
return self.wsgi_app(environ, start_response)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 2446, in wsgi_app
response = self.full_dispatch_request()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 1949, in full_dispatch_request
rv = self.dispatch_request()
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py", line 1935, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/cmiss/Jenkins/workspace/SPARC-API/app/main.py", line 767, in get_facets
logging.error(f"Could not search SciCrunch for path {path}", ex)
File "/usr/lib/python3.9/logging/__init__.py", line 2064, in error
root.error(msg, *args, **kwargs)
File "/usr/lib/python3.9/logging/__init__.py", line 1475, in error
self._log(ERROR, msg, args, **kwargs)
File "/usr/lib/python3.9/logging/__init__.py", line 1589, in _log
self.handle(record)
File "/usr/lib/python3.9/logging/__init__.py", line 1599, in handle
self.callHandlers(record)
File "/usr/lib/python3.9/logging/__init__.py", line 1661, in callHandlers
hdlr.handle(record)
File "/usr/lib/python3.9/logging/__init__.py", line 952, in handle
self.emit(record)
File "/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/_pytest/logging.py", line 310, in emit
logging.StreamHandler.emit(self, record)
Message: 'Could not search SciCrunch for path anatomy.organ.name.aggregate'
Arguments: (ConnectionError(MaxRetryError("HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d09fdba30>: Failed to establish a new connection: [Errno -2] Name or service not known'))")),)
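Editor's note: test_getting_facets fails for the same underlying reason; with scicrunch.org unresolvable, get_facets logs the ConnectionError and returns an empty facet list, so 'heart' is never found. If mocking is not wanted for this suite, a minimal sketch of skipping network-dependent tests when the host cannot even resolve SciCrunch (the marker name and resolution probe are assumptions, not project code):

import json
import socket
import pytest

def _scicrunch_resolvable(host="scicrunch.org", port=443):
    # Probe only DNS resolution, which is the failure mode recorded in this log.
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        return False

requires_scicrunch = pytest.mark.skipif(
    not _scicrunch_resolvable(),
    reason="scicrunch.org cannot be resolved from this build host",
)

@requires_scicrunch
def test_getting_facets(client):
    r = client.get('/get-facets/organ')
    facet_results = json.loads(r.data)
    facets = [facet_result['key'] for facet_result in facet_results]
    assert 'heart' in facets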
_________________________ test_create_identifier_query _________________________
self = <urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
> conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('scicrunch.org', 443), timeout = None, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(
address,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None,
socket_options=None,
):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`socket.getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
A host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith("["):
host = host.strip("[]")
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
try:
host.encode("idna")
except UnicodeError:
return six.raise_from(
LocationParseError(u"'%s', label empty or too long" % host), None
)
> for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = 'scicrunch.org', port = 443, family = <AddressFamily.AF_INET: 2>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0
def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
"""Resolve host and port into list of address info entries.
Translate the host/port argument into a sequence of 5-tuples that contain
all the necessary arguments for creating a socket connected to that service.
host is a domain name, a string representation of an IPv4/v6 address or
None. port is a string service name such as 'http', a numeric port number or
None. By passing None as the value of host and port, you can pass NULL to
the underlying C API.
The family, type and proto arguments can be optionally specified in order to
narrow the list of addresses returned. Passing zero as a value for each of
these arguments selects the full range of results.
"""
# We override this function since we want to translate the numeric family
# and socket type values to enum constants.
addrlist = []
> for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E socket.gaierror: [Errno -2] Name or service not known
/usr/lib/python3.9/socket.py:966: gaierror
During handling of the above exception, another exception occurred:
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d062e5a60>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"fields": ["*identifier"], "query": "*e6435710-dd9c-46b7-9dfd-932103469733"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '129', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d062e5a60>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': b'{"size": 10, "from": 0, "query": {"query_string": {"fields": ["*identifier"], "query": "*e6435710-dd9c-46b7...p, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '129', 'Content-Type': 'application/json'}}
timeout_obj = Timeout(connect=None, read=None, total=None)
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
> self._validate_conn(conn)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d062e5a60>
conn = <urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>
def _validate_conn(self, conn):
"""
Called right before a request is made, after the socket is created.
"""
super(HTTPSConnectionPool, self)._validate_conn(conn)
# Force connect early to allow us to validate the connection.
if not getattr(conn, "sock", None): # AppEngine might not have `.sock`
> conn.connect()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>
def connect(self):
# Add certificate verification
> conn = self._new_conn()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>
def _new_conn(self):
"""Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw["source_address"] = self.source_address
if self.socket_options:
extra_kw["socket_options"] = self.socket_options
try:
conn = connection.create_connection(
(self._dns_host, self.port), self.timeout, **extra_kw
)
except SocketTimeout:
raise ConnectTimeoutError(
self,
"Connection to %s timed out. (connect timeout=%s)"
% (self.host, self.timeout),
)
except SocketError as e:
> raise NewConnectionError(
self, "Failed to establish a new connection: %s" % e
)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>: Failed to establish a new connection: [Errno -2] Name or service not known
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x7f1d062e5580>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d062e5a60>
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
body = b'{"size": 10, "from": 0, "query": {"query_string": {"fields": ["*identifier"], "query": "*e6435710-dd9c-46b7-9dfd-932103469733"}}}'
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '129', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/1/elastic/SPARC_PortalDatasets_pr/_search', query='api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'POST'
url = '/api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>: Failed to establish a new connection: [Errno -2] Name or service not known')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7f1d062e5a60>
_stacktrace = <traceback object at 0x7f1cf79356c0>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
raise six.reraise(type(error), error, _stacktrace)
elif read is not None:
read -= 1
elif error:
# Other retry?
if other is not None:
other -= 1
elif response and response.get_redirect_location():
# Redirect retry?
if redirect is not None:
redirect -= 1
cause = "too many redirects"
redirect_location = response.get_redirect_location()
status = response.status
else:
# Incrementing because of a server error like a 500 in
# status_forcelist and the given method is in the allowed_methods
cause = ResponseError.GENERIC_ERROR
if response and response.status:
if status_count is not None:
status_count -= 1
cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
status = response.status
history = self.history + (
RequestHistory(method, url, error, status, redirect_location),
)
new_retry = self.new(
total=total,
connect=connect,
read=read,
redirect=redirect,
status=status_count,
other=other,
history=history,
)
if new_retry.is_exhausted():
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
During handling of the above exception, another exception occurred:
client = <FlaskClient <Flask 'app.main'>>
def test_create_identifier_query(client):
> r = client.get('/dataset_info/using_object_identifier?identifier=package:e6435710-dd9c-46b7-9dfd-932103469733')
tests/test_scicrunch.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
return self.open(*args, **kw)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
return Client.open(
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
app_rv = app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
response = self.handle_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
response = self.full_dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
rv = self.handle_user_exception(e)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
raise value
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
rv = self.dispatch_request()
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:605: in get_dataset_info_object_identifier
return reform_dataset_results(dataset_search(query))
app/main.py:690: in dataset_search
response = requests.post(f'{Config.SCI_CRUNCH_HOST}/_search',
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:119: in post
return request('post', url, data=data, json=json, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x7f1d062e5580>
request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scicrunch.org', port=443): Max retries exceeded with url: /api/1/elastic/SPARC_PortalDatasets_pr/_search?api_key=xBOrIfnZTvJQtobGo8XHRvThdMYGTxtf (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f1d062e5c70>: Failed to establish a new connection: [Errno -2] Name or service not known'))
../../shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
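Note on the failure above: "[Errno -2] Name or service not known" in the NewConnectionError means DNS resolution for scicrunch.org failed on this build agent, so every tests/test_scicrunch.py case that reaches the live SciCrunch endpoint fails before any request is sent. A minimal sketch of guarding such tests with a resolvability check follows; the decorator and the socket call are standard pytest/stdlib APIs, and the test body only mirrors the request already shown in the traceback (the real assertions stay in tests/test_scicrunch.py).

    import socket

    import pytest

    def _resolvable(host):
        # True only when the build agent can resolve the hostname via DNS.
        try:
            socket.gethostbyname(host)
            return True
        except socket.gaierror:
            return False

    requires_scicrunch = pytest.mark.skipif(
        not _resolvable("scicrunch.org"),
        reason="scicrunch.org is not resolvable from this agent",
    )

    @requires_scicrunch
    def test_create_identifier_query(client):
        r = client.get('/dataset_info/using_object_identifier?identifier=package:e6435710-dd9c-46b7-9dfd-932103469733')
        # ...existing assertions from tests/test_scicrunch.py continue here.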
__________________________ test_neurolucida_thumbnail __________________________
client = <FlaskClient <Flask 'app.main'>>
def test_neurolucida_thumbnail(client):
query_string = {'datasetId': 37, 'version': 3, 'path': 'files/derivative/sub-54-5/TJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'}
r = client.get('/thumbnail/neurolucida', query_string=query_string)
> assert r.data.decode('utf-8').startswith('iVBORw0KGgoAAAANSUhEUgAAAtAAAAIcCAIAAABQHw4EAAAgAElEQVR4Xuy9P4hjWZbu+zWv4erChVwNAyPjQq')
E assert False
E + where False = <built-in method startswith of str object at 0x7f1d05660f10>('iVBORw0KGgoAAAANSUhEUgAAAtAAAAIcCAIAAABQHw4EAAAgAElEQVR4Xuy9P4hjWZbu+zWv4erChVwNAyPjQq')
E + where <built-in method startswith of str object at 0x7f1d05660f10> = '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n'.startswith
E + where '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n' = <built-in method decode of bytes object at 0x7f1cf7a455e0>('utf-8')
E + where <built-in method decode of bytes object at 0x7f1cf7a455e0> = b'<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n'.decode
E + where b'<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n' = <Response 164 bytes [400 BAD REQUEST]>.data
tests/test_thumbnails.py:22: AssertionError
____________________ test_neurolucida_thumbnail_dataset_221 ____________________
client = <FlaskClient <Flask 'app.main'>>
def test_neurolucida_thumbnail_dataset_221(client):
query_string = {'datasetId': 221, 'version': 3, 'path': 'files/derivative/sub-M168/digital-traces/pCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'}
r = client.get('/thumbnail/neurolucida', query_string=query_string)
> assert r.data.decode('utf-8').startswith('iVBORw0KGgoAAAANSUhEUgAAAtAAAAIcCAIAAABQHw4EAAAgAElEQVR4Xuzdd3xV9f348fe569ydm')
E assert False
E + where False = <built-in method startswith of str object at 0x7f1d056602d0>('iVBORw0KGgoAAAANSUhEUgAAAtAAAAIcCAIAAABQHw4EAAAgAElEQVR4Xuzdd3xV9f348fe569ydm')
E + where <built-in method startswith of str object at 0x7f1d056602d0> = '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n'.startswith
E + where '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n' = <built-in method decode of bytes object at 0x7f1cf7a45b90>('utf-8')
E + where <built-in method decode of bytes object at 0x7f1cf7a45b90> = b'<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n'.decode
E + where b'<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">\n<title>400 Bad Request</title>\n<h1>Bad Request</h1>\n<p>Unable to make a connection to NEUROLUCIDA_HOST.</p>\n' = <Response 164 bytes [400 BAD REQUEST]>.data
tests/test_thumbnails.py:29: AssertionError
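Note on the two thumbnail failures above: both fail the same way, because the endpoint answers 400 with "Unable to make a connection to NEUROLUCIDA_HOST", so the assertion on the base64 PNG prefix can never hold while that host is unreachable from the agent. Below is a hedged sketch of a CI-tolerant variant that asserts the error contract seen in this log instead of the image bytes; the happy-path prefix check belongs in an environment where NEUROLUCIDA_HOST is reachable.

    def test_neurolucida_thumbnail_offline(client):
        query_string = {'datasetId': 37, 'version': 3,
                        'path': 'files/derivative/sub-54-5/TJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'}
        r = client.get('/thumbnail/neurolucida', query_string=query_string)
        # 200 with PNG data when NEUROLUCIDA_HOST is reachable, 400 otherwise (as seen above).
        assert r.status_code in (200, 400)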
=============================== warnings summary ===============================
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/apscheduler/__init__.py:1
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/apscheduler/__init__.py:1: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import get_distribution, DistributionNotFound
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/marshmallow/__init__.py:17
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/marshmallow/__init__.py:17: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
__version_info__ = tuple(LooseVersion(__version__).version)
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_marshmallow/__init__.py:34
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_marshmallow/__init__.py:34: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
__version_info__ = tuple(LooseVersion(__version__).version)
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:19
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:19: DeprecationWarning: Call to deprecated create function FileDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
DESCRIPTOR = _descriptor.FileDescriptor(
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:36
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:36: DeprecationWarning: Call to deprecated create function FieldDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
_descriptor.FieldDescriptor(
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:53
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:53: DeprecationWarning: Call to deprecated create function FieldDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
_descriptor.FieldDescriptor(
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:70
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:70: DeprecationWarning: Call to deprecated create function FieldDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
_descriptor.FieldDescriptor(
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:29
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:29: DeprecationWarning: Call to deprecated create function Descriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
_CACHESEGMENT = _descriptor.Descriptor(
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/nose/importer.py:12
/home/cmiss/Jenkins/shiningpanda/jobs/c6a9534b/virtualenvs/d41d8cd9/lib/python3.9/site-packages/nose/importer.py:12: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
from imp import find_module, load_module, acquire_lock, release_lock
tests/test_health.py::test_request_response
/home/cmiss/Jenkins/workspace/SPARC-API/tests/test_health.py:13: DeprecationWarning: Please use assertEqual instead.
assert_equals("healthy", json_response.get("status"))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
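Note on the warnings summary: most entries come from third-party packages, but the last one is raised by the project's own tests/test_health.py through nose's deprecated assert_equals helper. A sketch of the pytest-native equivalent is below; the '/health' route and the use of get_json() are assumptions inferred from the test name, not taken from the file itself.

    def test_request_response(client):
        # Assumed route; the actual path is whatever tests/test_health.py already requests.
        r = client.get('/health')
        json_response = r.get_json()
        # A plain assert replaces the deprecated nose assert_equals call.
        assert json_response.get("status") == "healthy"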
=========================== short test summary info ============================
FAILED tests/test_api.py::test_create_wrike_task - assert 409 == 400
FAILED tests/test_osparc.py::test_osparc_failing_simulation - urllib3.excepti...
FAILED tests/test_pmr.py::test_pmr_file_valid_path - assert 400 == 200
FAILED tests/test_pmr.py::test_pmr_latest_exposure_workspace_with_latest_exposure
FAILED tests/test_pmr.py::test_pmr_latest_exposure_workspace_without_latest_exposure
FAILED tests/test_scicrunch.py::test_scicrunch_keys - requests.exceptions.Con...
FAILED tests/test_scicrunch.py::test_scicrunch_versions_are_supported - reque...
FAILED tests/test_scicrunch.py::test_scicrunch_dataset_doi - requests.excepti...
FAILED tests/test_scicrunch.py::test_scicrunch_multiple_dataset_doi - request...
FAILED tests/test_scicrunch.py::test_scicrunch_multiple_dataset_ids - request...
FAILED tests/test_scicrunch.py::test_scicrunch_search - requests.exceptions.C...
FAILED tests/test_scicrunch.py::test_scicrunch_all_data - requests.exceptions...
FAILED tests/test_scicrunch.py::test_scicrunch_filter - requests.exceptions.C...
FAILED tests/test_scicrunch.py::test_scicrunch_filter_scaffolds - requests.ex...
FAILED tests/test_scicrunch.py::test_scicrunch_basic_search - requests.except...
FAILED tests/test_scicrunch.py::test_scicrunch_image_search - requests.except...
FAILED tests/test_scicrunch.py::test_scicrunch_boolean_logic - requests.excep...
FAILED tests/test_scicrunch.py::test_scicrunch_combined_facet_text - requests...
FAILED tests/test_scicrunch.py::test_getting_facets - AssertionError: assert ...
FAILED tests/test_scicrunch.py::test_create_identifier_query - requests.excep...
FAILED tests/test_thumbnails.py::test_neurolucida_thumbnail - assert False
FAILED tests/test_thumbnails.py::test_neurolucida_thumbnail_dataset_221 - ass...
===== 22 failed, 83 passed, 4 skipped, 10 warnings in 20356.58s (5:39:16) ======
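Note on the summary: the 22 failures cluster around outbound connectivity (scicrunch.org cannot be resolved, NEUROLUCIDA_HOST is unreachable, and the osparc/pmr/wrike endpoints error out) rather than a single code regression. One hedged way to keep such runs green on locked-down agents is a custom pytest marker that CI can deselect; the marker name and placement below are illustrative, not part of the repository.

    # conftest.py (sketch)
    import pytest

    def pytest_configure(config):
        # Register the marker so pytest --strict-markers does not reject it.
        config.addinivalue_line("markers", "network: test needs outbound internet access")

    # Example usage in a test module, e.g. tests/test_scicrunch.py:
    #
    #     @pytest.mark.network
    #     def test_scicrunch_search(client):
    #         ...
    #
    # A restricted agent can then run: pytest -m "not network"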
Build step 'Virtualenv Builder' marked build as failure
[Slack Notifications] found #1258 as previous completed, non-aborted build
[Slack Notifications] will send OnEveryFailureNotification because build matches and user preferences allow it
Finished: FAILURE