Failed

Console Output

Started by timer
Running as SYSTEM
Building remotely on Ubuntu_18.04_bioeng49 (buildslave Testing) in workspace /home/cmiss/Jenkins/workspace/SPARC-API-DEV
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Deferred wipeout is used...
[WS-CLEANUP] Done
The recommended git tool is: NONE
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/nih-sparc/sparc-api.git
 > git init /home/cmiss/Jenkins/workspace/SPARC-API-DEV # timeout=10
Fetching upstream changes from https://github.com/nih-sparc/sparc-api.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/nih-sparc/sparc-api.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/nih-sparc/sparc-api.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
 > git rev-parse refs/remotes/origin/master^{commit} # timeout=10
Checking out Revision 0fd3c9499be845c4494ef7554d10ac2361d5f4f8 (refs/remotes/origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fd3c9499be845c4494ef7554d10ac2361d5f4f8 # timeout=10
Commit message: "Merge pull request #214 from Tehsurfer/syntax-hotfix"
 > git rev-list --no-walk 0fd3c9499be845c4494ef7554d10ac2361d5f4f8 # timeout=10
[SPARC-API-DEV] $ /bin/sh -xe /tmp/shiningpanda8024573346729982557.sh
+ pwd
+ export PYTHONPATH=/home/cmiss/Jenkins/workspace/SPARC-API-DEV
+ export SCICRUNCH_HOST=https://scicrunch.org/api/1/elastic/SPARC_PortalDatasets_dev
+ export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
+ pip install -r requirements.txt
Requirement already satisfied: api==0.0.7 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 1)) (0.0.7)
Requirement already satisfied: pennsieve==6.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 2)) (6.1.1)
Requirement already satisfied: boto3==1.17.67 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 3)) (1.17.67)
Requirement already satisfied: botocore==1.20.67 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 4)) (1.20.67)
Requirement already satisfied: certifi==2019.11.28 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 5)) (2019.11.28)
Requirement already satisfied: chardet==3.0.4 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 6)) (3.0.4)
Collecting Click==7.1.2 (from -r requirements.txt (line 7))
  Using cached click-7.1.2-py2.py3-none-any.whl.metadata (2.9 kB)
Requirement already satisfied: docutils==0.15.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 8)) (0.15.2)
Requirement already satisfied: Flask==1.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 9)) (1.1.1)
Requirement already satisfied: flask-marshmallow==0.10.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 10)) (0.10.1)
Requirement already satisfied: flask-cors==3.0.8 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 11)) (3.0.8)
Requirement already satisfied: gunicorn==20.0.4 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 12)) (20.0.4)
Requirement already satisfied: idna==2.8 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 13)) (2.8)
Requirement already satisfied: itsdangerous==1.1.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 14)) (1.1.0)
Requirement already satisfied: Jinja2==2.11.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 15)) (2.11.3)
Requirement already satisfied: jmespath==0.9.4 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 16)) (0.9.4)
Requirement already satisfied: MarkupSafe==1.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 17)) (1.1.1)
Requirement already satisfied: marshmallow==3.2.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 18)) (3.2.2)
Requirement already satisfied: nose==1.3.7 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 19)) (1.3.7)
Requirement already satisfied: osparc==0.4.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 20)) (0.4.3)
Requirement already satisfied: pillow in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 21)) (10.3.0)
Requirement already satisfied: public==2019.4.13 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 22)) (2019.4.13)
Requirement already satisfied: pytest in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 23)) (5.4.3)
Requirement already satisfied: pymongo==3.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 24)) (3.8.0)
Requirement already satisfied: python-dateutil==2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 25)) (2.8.0)
Requirement already satisfied: python-dotenv==0.10.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 26)) (0.10.3)
Requirement already satisfied: query-string==2019.4.13 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 27)) (2019.4.13)
Requirement already satisfied: requests==2.25.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 28)) (2.25.1)
Requirement already satisfied: s3transfer==0.4.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 29)) (0.4.2)
Requirement already satisfied: sendgrid==6.9.7 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 30)) (6.9.7)
Requirement already satisfied: six==1.13.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 31)) (1.13.0)
Requirement already satisfied: SQLAlchemy==1.3.20 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 32)) (1.3.20)
Requirement already satisfied: urllib3==1.26.4 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 33)) (1.26.4)
Requirement already satisfied: Werkzeug==0.16.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 34)) (0.16.0)
Requirement already satisfied: psycopg2-binary==2.9.5 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 35)) (2.9.5)
Requirement already satisfied: APScheduler==3.7.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 36)) (3.7.0)
Requirement already satisfied: google-api-python-client==2.52.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 37)) (2.52.0)
Requirement already satisfied: oauth2client==4.1.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 38)) (4.1.3)
Requirement already satisfied: algoliasearch==2.6.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 39)) (2.6.2)
Requirement already satisfied: contentful==1.13.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 40)) (1.13.1)
Requirement already satisfied: contentful_management==2.11.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements.txt (line 41)) (2.11.0)
Requirement already satisfied: configparser>=3.5 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (6.0.1)
Requirement already satisfied: deprecated>=1.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (1.2.14)
Requirement already satisfied: future>=0.15.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (1.0.0)
Requirement already satisfied: futures in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (3.0.5)
Requirement already satisfied: protobuf>=3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (4.25.3)
Requirement already satisfied: python-jose==3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (3.2.0)
Requirement already satisfied: pytz>=2016 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (2024.1)
Requirement already satisfied: rsa==4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (4.0)
Requirement already satisfied: semver>=2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (3.0.2)
Requirement already satisfied: websocket-client>=0.57.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (1.7.0)
Requirement already satisfied: docopt>=0.6 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (0.6.2)
Requirement already satisfied: psutil>=5.4 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements.txt (line 2)) (5.9.8)
Requirement already satisfied: setuptools>=3.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from gunicorn==20.0.4->-r requirements.txt (line 12)) (69.2.0)
Requirement already satisfied: python-http-client>=3.2.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from sendgrid==6.9.7->-r requirements.txt (line 30)) (3.3.7)
Requirement already satisfied: starkbank-ecdsa>=2.0.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from sendgrid==6.9.7->-r requirements.txt (line 30)) (2.2.0)
Requirement already satisfied: tzlocal~=2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from APScheduler==3.7.0->-r requirements.txt (line 36)) (2.1)
Requirement already satisfied: httplib2<1dev,>=0.15.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (0.22.0)
Requirement already satisfied: google-auth<3.0.0dev,>=1.19.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (2.29.0)
Requirement already satisfied: google-auth-httplib2>=0.1.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (0.2.0)
Requirement already satisfied: google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (2.18.0)
Requirement already satisfied: uritemplate<5,>=3.0.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-python-client==2.52.0->-r requirements.txt (line 37)) (4.1.1)
Requirement already satisfied: pyasn1>=0.1.7 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from oauth2client==4.1.3->-r requirements.txt (line 38)) (0.6.0)
Requirement already satisfied: pyasn1-modules>=0.0.5 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from oauth2client==4.1.3->-r requirements.txt (line 38)) (0.4.0)
Requirement already satisfied: ecdsa<0.15 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements.txt (line 2)) (0.14.1)
Requirement already satisfied: py>=1.5.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (1.11.0)
Requirement already satisfied: packaging in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (24.0)
Requirement already satisfied: attrs>=17.4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (23.2.0)
Requirement already satisfied: more-itertools>=4.0.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (10.2.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (0.13.1)
Requirement already satisfied: wcwidth in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest->-r requirements.txt (line 23)) (0.2.13)
Requirement already satisfied: wrapt<2,>=1.10 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from deprecated>=1.2.0->pennsieve==6.1.1->-r requirements.txt (line 2)) (1.16.0)
Requirement already satisfied: googleapis-common-protos<2.0.dev0,>=1.56.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (1.63.0)
Requirement already satisfied: proto-plus<2.0.0dev,>=1.22.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-api-core!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (1.23.0)
Requirement already satisfied: cachetools<6.0,>=2.0.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from google-auth<3.0.0dev,>=1.19.0->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (5.3.3)
Requirement already satisfied: pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from httplib2<1dev,>=0.15.0->google-api-python-client==2.52.0->-r requirements.txt (line 37)) (3.1.2)
Using cached click-7.1.2-py2.py3-none-any.whl (82 kB)
Installing collected packages: Click
  Attempting uninstall: Click
    Found existing installation: click 8.1.7
    Uninstalling click-8.1.7:
      Successfully uninstalled click-8.1.7
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
black 24.3.0 requires click>=8.0.0, but you have click 7.1.2 which is incompatible.
Successfully installed Click-7.1.2
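
The conflict pip reports above comes from requirements.txt pinning Click==7.1.2 while the already-installed black 24.3.0 declares click>=8.0.0. A minimal pre-test sanity check is sketched below; it is not part of the sparc-api build and only assumes the importlib.metadata stdlib module plus the packaging wheel already present in this virtualenv (24.0 above):

    # Minimal sketch: surface the click/black clash before pytest runs.
    # Assumes 'click' and 'packaging' are importable in the build virtualenv;
    # the >=8.0.0 bound is the one pip reported for black 24.3.0 above.
    from importlib.metadata import version
    from packaging.specifiers import SpecifierSet

    installed = version("click")        # e.g. "7.1.2" after this install pass
    required = SpecifierSet(">=8.0.0")  # black 24.3.0's declared lower bound

    if installed not in required:
        print(f"click {installed} does not satisfy black's requirement {required}")
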
+ pip install -r requirements-dev.txt
Requirement already satisfied: pytest==5.4.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 1)) (5.4.3)
Requirement already satisfied: pennsieve==6.1.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 2)) (6.1.1)
Requirement already satisfied: black in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 3)) (24.3.0)
Requirement already satisfied: isort in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 4)) (5.13.2)
Requirement already satisfied: nose in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 5)) (1.3.7)
Requirement already satisfied: packaging in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from -r requirements-dev.txt (line 6)) (24.0)
Requirement already satisfied: py>=1.5.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (1.11.0)
Requirement already satisfied: attrs>=17.4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (23.2.0)
Requirement already satisfied: more-itertools>=4.0.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (10.2.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (0.13.1)
Requirement already satisfied: wcwidth in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pytest==5.4.3->-r requirements-dev.txt (line 1)) (0.2.13)
Requirement already satisfied: boto3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.17.67)
Requirement already satisfied: configparser>=3.5 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (6.0.1)
Requirement already satisfied: deprecated>=1.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.2.14)
Requirement already satisfied: future>=0.15.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.0.0)
Requirement already satisfied: futures in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.0.5)
Requirement already satisfied: protobuf>=3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (4.25.3)
Requirement already satisfied: python-jose==3.2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.2.0)
Requirement already satisfied: pytz>=2016 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2024.1)
Requirement already satisfied: requests>=2.18 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2.25.1)
Requirement already satisfied: rsa==4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (4.0)
Requirement already satisfied: semver>=2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.0.2)
Requirement already satisfied: websocket-client>=0.57.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.7.0)
Requirement already satisfied: docopt>=0.6 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.6.2)
Requirement already satisfied: psutil>=5.4 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (5.9.8)
Requirement already satisfied: python-dateutil>=2.8.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2.8.0)
Requirement already satisfied: six<2.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.13.0)
Requirement already satisfied: ecdsa<0.15 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.14.1)
Requirement already satisfied: pyasn1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from python-jose==3.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.6.0)
Collecting click>=8.0.0 (from black->-r requirements-dev.txt (line 3))
  Using cached click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Requirement already satisfied: mypy-extensions>=0.4.3 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (1.0.0)
Requirement already satisfied: pathspec>=0.9.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (0.12.1)
Requirement already satisfied: platformdirs>=2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (4.2.0)
Requirement already satisfied: tomli>=1.1.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (2.0.1)
Requirement already satisfied: typing-extensions>=4.0.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from black->-r requirements-dev.txt (line 3)) (4.11.0)
Requirement already satisfied: wrapt<2,>=1.10 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from deprecated>=1.2.0->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.16.0)
Requirement already satisfied: chardet<5,>=3.0.2 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (3.0.4)
Requirement already satisfied: idna<3,>=2.5 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2.8)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.26.4)
Requirement already satisfied: certifi>=2017.4.17 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from requests>=2.18->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (2019.11.28)
Requirement already satisfied: botocore<1.21.0,>=1.20.67 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from boto3->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (1.20.67)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from boto3->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.9.4)
Requirement already satisfied: s3transfer<0.5.0,>=0.4.0 in /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages (from boto3->pennsieve==6.1.1->-r requirements-dev.txt (line 2)) (0.4.2)
Using cached click-8.1.7-py3-none-any.whl (97 kB)
Installing collected packages: click
  Attempting uninstall: click
    Found existing installation: click 7.1.2
    Uninstalling click-7.1.2:
      Successfully uninstalled click-7.1.2
Successfully installed click-8.1.7
+ pytest
============================= test session starts ==============================
platform linux -- Python 3.9.19, pytest-5.4.3, py-1.11.0, pluggy-0.13.1
rootdir: /home/cmiss/Jenkins/workspace/SPARC-API-DEV
collected 101 items

tests/test_api.py ..............F....                                    [ 18%]
tests/test_biolucida.py .............                                    [ 31%]
tests/test_dataset_info.py .s..ss.....                                   [ 42%]
tests/test_health.py .                                                   [ 43%]
tests/test_monthly_stats.py ......                                       [ 49%]
tests/test_osparc.py ..............                                      [ 63%]
tests/test_pmr.py .....                                                  [ 68%]
tests/test_scicrunch.py ................s.........                       [ 94%]
tests/test_segmentation_info.py ..                                       [ 96%]
tests/test_thumbnails.py .FF                                             [ 99%]
tests/test_update_contentful_entries.py .                                [100%]
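
For reference, the failures reported above (one in tests/test_api.py, two in tests/test_thumbnails.py) can be rerun in isolation from a local checkout of sparc-api. A minimal sketch using pytest's programmatic entry point follows; the node id is taken from the failure header below, and running only tests/test_thumbnails.py covers both thumbnail failures without guessing their names:

    # Hedged sketch, not part of the Jenkins job: rerun only the failing tests.
    # pytest.main() is pytest's documented programmatic entry point.
    import pytest

    exit_code = pytest.main([
        "tests/test_api.py::test_onto_term_lookup",
        "tests/test_thumbnails.py",  # both thumbnail failures live in this file
    ])
    raise SystemExit(exit_code)
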

=================================== FAILURES ===================================
____________________________ test_onto_term_lookup _____________________________

client = <FlaskClient <Flask 'app.main'>>

    def test_onto_term_lookup(client):
        r = client.get('/onto_term_lookup', query_string={'term': 'http://purl.obolibrary.org/obo/NCBITaxon_9606'})
        assert r.status_code == 200
        json_data = r.get_json()
>       assert json_data['label'] == 'Human'
E       AssertionError: assert 'not found' == 'Human'
E         - Human
E         + not found

tests/test_api.py:196: AssertionError
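
The 'not found' label means the SciCrunch ontology lookup did not resolve NCBITaxon_9606 during this run, so the assertion compares the service's fallback value against 'Human'. One hedged option, sketched below and not present in the sparc-api suite, is to mark tests that depend on the live SciCrunch endpoint so an upstream outage is recorded without failing the build:

    # Hedged sketch only: the test body is copied from the failure above; the
    # xfail(strict=False) marker is an assumption, not how sparc-api's tests
    # are written (mocking the lookup would be an alternative).
    import pytest

    @pytest.mark.xfail(reason="depends on the live SciCrunch term lookup", strict=False)
    def test_onto_term_lookup(client):
        r = client.get('/onto_term_lookup',
                       query_string={'term': 'http://purl.obolibrary.org/obo/NCBITaxon_9606'})
        assert r.status_code == 200
        assert r.get_json()['label'] == 'Human'
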
__________________________ test_neurolucida_thumbnail __________________________

self = <urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.
    
        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address
    
        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options
    
        try:
>           conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('sparc.biolucida.net', 8081), timeout = None, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`socket.getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        An host of '' or port 0 tells the OS to use the default.
        """
    
        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None
    
        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()
    
        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )
    
        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
    
                # If provided, set socket level options before connecting.
                _set_socket_options(sock, socket_options)
    
                if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
                sock.connect(sa)
                return sock
    
            except socket.error as e:
                err = e
                if sock is not None:
                    sock.close()
                    sock = None
    
        if err is not None:
>           raise err

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:96: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('sparc.biolucida.net', 8081), timeout = None, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`socket.getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        An host of '' or port 0 tells the OS to use the default.
        """
    
        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None
    
        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()
    
        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )
    
        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
    
                # If provided, set socket level options before connecting.
                _set_socket_options(sock, socket_options)
    
                if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               TimeoutError: [Errno 110] Connection timed out

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:86: TimeoutError

During handling of the above exception, another exception occurred:

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede6f45430>
method = 'GET'
url = '/thumbnail?datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/thumbnail', query='datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param url:
            The URL to perform the request on.
    
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When ``False``, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
    
        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme
    
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/urllib3/urllib3/issues/651>
        release_this_conn = release_conn
    
        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )
    
        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede6f45430>
conn = <urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>
method = 'GET'
url = '/thumbnail?datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = Timeout(connect=None, read=None, total=None)

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
>           self._validate_conn(conn)

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede6f45430>
conn = <urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>

    def _validate_conn(self, conn):
        """
        Called right before a request is made, after the socket is created.
        """
        super(HTTPSConnectionPool, self)._validate_conn(conn)
    
        # Force connect early to allow us to validate the connection.
        if not getattr(conn, "sock", None):  # AppEngine might not have  `.sock`
>           conn.connect()

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>

    def connect(self):
        # Add certificate verification
>       conn = self._new_conn()

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.
    
        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address
    
        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options
    
        try:
            conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )
    
        except SocketTimeout:
            raise ConnectTimeoutError(
                self,
                "Connection to %s timed out. (connect timeout=%s)"
                % (self.host, self.timeout),
            )
    
        except SocketError as e:
>           raise NewConnectionError(
                self, "Failed to establish a new connection: %s" % e
            )
E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>: Failed to establish a new connection: [Errno 110] Connection timed out

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError

During handling of the above exception, another exception occurred:

self = <requests.adapters.HTTPAdapter object at 0x7fede45a4580>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
>               resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede6f45430>
method = 'GET'
url = '/thumbnail?datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/thumbnail', query='datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param url:
            The URL to perform the request on.
    
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When ``False``, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
    
        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme
    
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/urllib3/urllib3/issues/651>
        release_this_conn = release_conn
    
        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )
    
        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw["request_method"] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(
                httplib_response,
                pool=self,
                connection=response_conn,
                retries=retries,
                **response_kw
            )
    
            # Everything went great!
            clean_exit = True
    
        except EmptyPoolError:
            # Didn't get a connection from the pool, no need to clean up
            clean_exit = True
            release_this_conn = False
            raise
    
        except (
            TimeoutError,
            HTTPException,
            SocketError,
            ProtocolError,
            BaseSSLError,
            SSLError,
            CertificateError,
        ) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False
            if isinstance(e, (BaseSSLError, CertificateError)):
                e = SSLError(e)
            elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError("Cannot connect to proxy.", e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError("Connection aborted.", e)
    
>           retries = retries.increment(
                method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
            )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
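
The urlopen docstring in the frame above describes the forms the ``retries`` argument can take (``None``, an int, ``False``, or a Retry object). As a minimal sketch, the same policy can be spelled out explicitly with urllib3's public Retry helper; the host in the commented request is a placeholder, not the service exercised by these tests.

# Sketch: an explicit Retry policy of the kind urlopen's ``retries``
# parameter accepts, instead of a bare integer or ``False``.
import urllib3
from urllib3.util.retry import Retry

retry = Retry(total=3, connect=2, read=2, redirect=0, backoff_factor=0.5)
http = urllib3.PoolManager(retries=retry)
# http.request("GET", "https://example.org/thumbnail", fields={"datasetId": 37})
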

self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET'
url = '/thumbnail?datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>: Failed to establish a new connection: [Errno 110] Connection timed out')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede6f45430>
_stacktrace = <traceback object at 0x7fedec3e94c0>

    def increment(
        self,
        method=None,
        url=None,
        response=None,
        error=None,
        _pool=None,
        _stacktrace=None,
    ):
        """Return a new Retry object with incremented retry counters.
    
        :param response: A response object, or None, if the server did not
            return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or
            None if the response was received successfully.
    
        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)
    
        total = self.total
        if total is not None:
            total -= 1
    
        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        other = self.other
        cause = "unknown"
        status = None
        redirect_location = None
    
        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1
    
        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1
    
        elif error:
            # Other retry?
            if other is not None:
                other -= 1
    
        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = "too many redirects"
            redirect_location = response.get_redirect_location()
            status = response.status
    
        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and the given method is in the allowed_methods
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                if status_count is not None:
                    status_count -= 1
                cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                status = response.status
    
        history = self.history + (
            RequestHistory(method, url, error, status, redirect_location),
        )
    
        new_retry = self.new(
            total=total,
            connect=connect,
            read=read,
            redirect=redirect,
            status=status_count,
            other=other,
            history=history,
        )
    
        if new_retry.is_exhausted():
>           raise MaxRetryError(_pool, url, error or ResponseError(cause))
E           urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='sparc.biolucida.net', port=8081): Max retries exceeded with url: /thumbnail?datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>: Failed to establish a new connection: [Errno 110] Connection timed out'))

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError
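
Retry.increment above decrements the relevant counters and raises MaxRetryError once the new Retry object is exhausted. A small sketch of that exhaustion path, using the same Retry(total=0, read=False) configuration shown in the locals; the error instance is constructed by hand purely for illustration.

# Sketch: a Retry with total=0 exhausts on its first connection error,
# which is the MaxRetryError path taken in the frame above.
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    retry.increment(method="GET", url="/thumbnail",
                    error=NewConnectionError(None, "connection timed out"))
except MaxRetryError as exc:
    print("exhausted after one attempt:", exc.reason)
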

During handling of the above exception, another exception occurred:

client = <FlaskClient <Flask 'app.main'>>

    def test_neurolucida_thumbnail(client):
        query_string = {'datasetId': 37, 'version': 3, 'path': 'files/derivative/sub-54-5/TJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml'}
>       r = client.get('/thumbnail/neurolucida', query_string=query_string)

tests/test_thumbnails.py:20: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
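
The test frame above drives /thumbnail/neurolucida through the Flask test client, and the view in app/main.py then calls requests.get against the external host, which is what times out in this build. As a hedged sketch only (not the project's actual test strategy), such a test could be isolated from the network by patching that call; the fake payload, test name, and path value are illustrative.

# Illustrative sketch: patch the requests.get used by app/main.py so the
# endpoint can be exercised without reaching sparc.biolucida.net.
from unittest import mock

def test_neurolucida_thumbnail_isolated(client):
    fake = mock.Mock(status_code=200, content=b"fake-image-bytes")
    with mock.patch("app.main.requests.get", return_value=fake) as mocked:
        client.get("/thumbnail/neurolucida",
                   query_string={"datasetId": 37, "version": 3,
                                 "path": "files/derivative/example.xml"})
    assert mocked.called
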
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
    return self.open(*args, **kw)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
    return Client.open(
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
    response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
    rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
    app_rv = app(environ, start_response)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
    return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
    response = self.handle_exception(e)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
    raise value
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
    response = self.full_dispatch_request()
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
    rv = self.handle_user_exception(e)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
    raise value
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
    rv = self.dispatch_request()
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:309: in thumbnail_from_neurolucida_file
    response = requests.get(url, params=query_args)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:76: in get
    return request('get', url, params=params, **kwargs)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
    return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
    resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x7fede45a4580>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7, use buffering of HTTP responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 3.3+
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
            raise ConnectionError(err, request=request)
    
        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)
    
            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)
    
            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)
    
            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
                raise SSLError(e, request=request)
    
>           raise ConnectionError(e, request=request)
E           requests.exceptions.ConnectionError: HTTPSConnectionPool(host='sparc.biolucida.net', port=8081): Max retries exceeded with url: /thumbnail?datasetId=37&version=3&path=files%2Fderivative%2Fsub-54-5%2FTJU_3Scan_ratheart54-5_updated_06_11_19_Fiducials.xml (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fede6f457f0>: Failed to establish a new connection: [Errno 110] Connection timed out'))

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
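
The chunked branch of HTTPAdapter.send above frames each body chunk with its hex length and CRLF, terminated by a zero-length chunk. A standalone sketch of that framing (the helper name is illustrative):

# Sketch of the chunked transfer-encoding framing performed in the loop
# above: <hex length>\r\n<chunk>\r\n per chunk, terminated by 0\r\n\r\n.
def frame_chunks(chunks):
    framed = b""
    for chunk in chunks:
        framed += hex(len(chunk))[2:].encode("utf-8") + b"\r\n" + chunk + b"\r\n"
    return framed + b"0\r\n\r\n"

print(frame_chunks([b"hello", b"world"]))  # b'5\r\nhello\r\n5\r\nworld\r\n0\r\n\r\n'
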
____________________ test_neurolucida_thumbnail_dataset_221 ____________________

self = <urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.
    
        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address
    
        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options
    
        try:
>           conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:169: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
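
The locals above show socket_options = [(6, 1, 1)], which _new_conn applies to every fresh socket. Decoded with the standard socket constants (Linux values), that triple is TCP_NODELAY turned on:

# (6, 1, 1) corresponds to (IPPROTO_TCP, TCP_NODELAY, 1): Nagle's algorithm
# is disabled on each new connection urllib3 opens.
import socket

option = (socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
print(option)  # (6, 1, 1) on Linux
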

address = ('sparc.biolucida.net', 8081), timeout = None, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`socket.getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        """
    
        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None
    
        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()
    
        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )
    
        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
    
                # If provided, set socket level options before connecting.
                _set_socket_options(sock, socket_options)
    
                if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
                sock.connect(sa)
                return sock
    
            except socket.error as e:
                err = e
                if sock is not None:
                    sock.close()
                    sock = None
    
        if err is not None:
>           raise err

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:96: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
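
create_connection above only applies a timeout to the socket when one is supplied; with timeout=None, as in this build, connect() blocks until the kernel gives up, which is the [Errno 110] seen here. A minimal sketch with an explicit timeout; host and port are placeholders.

# Sketch: the same connect step with an explicit timeout, so an unreachable
# host fails after a few seconds instead of the OS default.
import socket

try:
    sock = socket.create_connection(("example.org", 443), timeout=5)
    sock.close()
except OSError as exc:
    print("connect failed:", exc)
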

address = ('sparc.biolucida.net', 8081), timeout = None, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`socket.getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        """
    
        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None
    
        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()
    
        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )
    
        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
    
                # If provided, set socket level options before connecting.
                _set_socket_options(sock, socket_options)
    
                if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               TimeoutError: [Errno 110] Connection timed out

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/connection.py:86: TimeoutError

During handling of the above exception, another exception occurred:

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede48c5220>
method = 'GET'
url = '/thumbnail?datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/thumbnail', query='datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param url:
            The URL to perform the request on.
    
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, ensures that the host of the pool's requests stays
            consistent and raises HostChangedError otherwise. When ``False``,
            you can use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
    
        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme
    
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/urllib3/urllib3/issues/651>
        release_this_conn = release_conn
    
        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )
    
        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:699: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede48c5220>
conn = <urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>
method = 'GET'
url = '/thumbnail?datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'
timeout = Timeout(connect=None, read=None, total=None), chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = Timeout(connect=None, read=None, total=None)

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
>           self._validate_conn(conn)

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:382: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
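
_make_request above accepts either a plain number or an urllib3.util.Timeout instance for its timeout. A short sketch of the fine-grained form, with placeholder values:

# Sketch: bounding the connect and read phases separately with the Timeout
# object that _make_request's ``timeout`` parameter accepts.
import urllib3
from urllib3.util import Timeout

timeout = Timeout(connect=2.0, read=10.0)
http = urllib3.PoolManager(timeout=timeout)
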

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede48c5220>
conn = <urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>

    def _validate_conn(self, conn):
        """
        Called right before a request is made, after the socket is created.
        """
        super(HTTPSConnectionPool, self)._validate_conn(conn)
    
        # Force connect early to allow us to validate the connection.
        if not getattr(conn, "sock", None):  # AppEngine might not have  `.sock`
>           conn.connect()

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:1010: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>

    def connect(self):
        # Add certificate verification
>       conn = self._new_conn()

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:353: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.
    
        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address
    
        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options
    
        try:
            conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )
    
        except SocketTimeout:
            raise ConnectTimeoutError(
                self,
                "Connection to %s timed out. (connect timeout=%s)"
                % (self.host, self.timeout),
            )
    
        except SocketError as e:
>           raise NewConnectionError(
                self, "Failed to establish a new connection: %s" % e
            )
E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>: Failed to establish a new connection: [Errno 110] Connection timed out

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connection.py:181: NewConnectionError

During handling of the above exception, another exception occurred:

self = <requests.adapters.HTTPAdapter object at 0x7fede48c5580>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
>               resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:439: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
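
HTTPAdapter.send above unpacks a (connect, read) tuple into its internal timeout object; the locals in this build show Timeout(connect=None, read=None, total=None), i.e. the application passed no timeout at all. A hedged sketch of the same call with an explicit timeout; the URL and values are placeholders.

# Sketch: passing the (connect, read) timeout form that adapter.send handles
# above, so a dead host raises quickly instead of hanging on connect.
import requests

try:
    r = requests.get("https://example.org/thumbnail",
                     params={"datasetId": 37, "version": 3},
                     timeout=(3.05, 27))
except requests.exceptions.ConnectionError as exc:
    print("could not reach the thumbnail service:", exc)
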

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede48c5220>
method = 'GET'
url = '/thumbnail?datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'
body = None
headers = {'User-Agent': 'python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/thumbnail', query='datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml', fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param url:
            The URL to perform the request on.
    
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, ensures that the host of the pool's requests stays
            consistent and raises HostChangedError otherwise. When ``False``,
            you can use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
    
        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme
    
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/urllib3/urllib3/issues/651>
        release_this_conn = release_conn
    
        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )
    
        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw["request_method"] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(
                httplib_response,
                pool=self,
                connection=response_conn,
                retries=retries,
                **response_kw
            )
    
            # Everything went great!
            clean_exit = True
    
        except EmptyPoolError:
            # Didn't get a connection from the pool, no need to clean up
            clean_exit = True
            release_this_conn = False
            raise
    
        except (
            TimeoutError,
            HTTPException,
            SocketError,
            ProtocolError,
            BaseSSLError,
            SSLError,
            CertificateError,
        ) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False
            if isinstance(e, (BaseSSLError, CertificateError)):
                e = SSLError(e)
            elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError("Cannot connect to proxy.", e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError("Connection aborted.", e)
    
>           retries = retries.increment(
                method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
            )

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/connectionpool.py:755: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET'
url = '/thumbnail?datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'
response = None
error = NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>: Failed to establish a new connection: [Errno 110] Connection timed out')
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fede48c5220>
_stacktrace = <traceback object at 0x7fede6e96d00>

    def increment(
        self,
        method=None,
        url=None,
        response=None,
        error=None,
        _pool=None,
        _stacktrace=None,
    ):
        """Return a new Retry object with incremented retry counters.
    
        :param response: A response object, or None, if the server did not
            return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or
            None if the response was received successfully.
    
        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)
    
        total = self.total
        if total is not None:
            total -= 1
    
        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        other = self.other
        cause = "unknown"
        status = None
        redirect_location = None
    
        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1
    
        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1
    
        elif error:
            # Other retry?
            if other is not None:
                other -= 1
    
        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = "too many redirects"
            redirect_location = response.get_redirect_location()
            status = response.status
    
        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and the given method is in the allowed_methods
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                if status_count is not None:
                    status_count -= 1
                cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                status = response.status
    
        history = self.history + (
            RequestHistory(method, url, error, status, redirect_location),
        )
    
        new_retry = self.new(
            total=total,
            connect=connect,
            read=read,
            redirect=redirect,
            status=status_count,
            other=other,
            history=history,
        )
    
        if new_retry.is_exhausted():
>           raise MaxRetryError(_pool, url, error or ResponseError(cause))
E           urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='sparc.biolucida.net', port=8081): Max retries exceeded with url: /thumbnail?datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>: Failed to establish a new connection: [Errno 110] Connection timed out'))

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/urllib3/util/retry.py:574: MaxRetryError

During handling of the above exception, another exception occurred:

client = <FlaskClient <Flask 'app.main'>>

    def test_neurolucida_thumbnail_dataset_221(client):
        query_string = {'datasetId': 221, 'version': 3, 'path': 'files/derivative/sub-M168/digital-traces/pCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'}
>       r = client.get('/thumbnail/neurolucida', query_string=query_string)

tests/test_thumbnails.py:27: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1029: in get
    return self.open(*args, **kw)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/testing.py:222: in open
    return Client.open(
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:993: in open
    response = self.run_wsgi_app(environ.copy(), buffered=buffered)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:884: in run_wsgi_app
    rv = run_wsgi_app(self.application, environ, buffered=buffered)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/werkzeug/test.py:1119: in run_wsgi_app
    app_rv = app(environ, start_response)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2463: in __call__
    return self.wsgi_app(environ, start_response)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2449: in wsgi_app
    response = self.handle_exception(e)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1866: in handle_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
    raise value
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:2446: in wsgi_app
    response = self.full_dispatch_request()
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1951: in full_dispatch_request
    rv = self.handle_user_exception(e)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_cors/extension.py:161: in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1820: in handle_user_exception
    reraise(exc_type, exc_value, tb)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/_compat.py:39: in reraise
    raise value
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1949: in full_dispatch_request
    rv = self.dispatch_request()
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask/app.py:1935: in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
app/main.py:309: in thumbnail_from_neurolucida_file
    response = requests.get(url, params=query_args)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:76: in get
    return request('get', url, params=params, **kwargs)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/api.py:61: in request
    return session.request(method=method, url=url, **kwargs)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:542: in request
    resp = self.send(prep, **send_kwargs)
../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/sessions.py:655: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x7fede48c5580>
request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True
cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7, use buffering of HTTP responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 3.3+
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
            raise ConnectionError(err, request=request)
    
        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)
    
            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)
    
            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)
    
            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
                raise SSLError(e, request=request)
    
>           raise ConnectionError(e, request=request)
E           requests.exceptions.ConnectionError: HTTPSConnectionPool(host='sparc.biolucida.net', port=8081): Max retries exceeded with url: /thumbnail?datasetId=221&version=3&path=files%2Fderivative%2Fsub-M168%2Fdigital-traces%2FpCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fedd7619d30>: Failed to establish a new connection: [Errno 110] Connection timed out'))

../../shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/requests/adapters.py:516: ConnectionError
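
Note: the failure above is a plain connection timeout to sparc.biolucida.net:8081. The adapter was invoked with Timeout(connect=None, read=None, total=None), so the outbound call only gave up when the kernel returned errno 110. A minimal sketch of one way to bound this, assuming the API built its own requests session for Biolucida calls (the helper name below is illustrative, not taken from app/main.py):

    # Sketch only: mount an adapter with a bounded urllib3 Retry and pass an
    # explicit (connect, read) timeout so an unreachable host fails within
    # seconds per attempt instead of waiting for the OS-level connect timeout.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    def make_biolucida_session():
        session = requests.Session()
        retries = Retry(total=2, connect=2, read=2, backoff_factor=0.5)
        session.mount("https://", HTTPAdapter(max_retries=retries))
        return session

    # Usage (url/query_args as in the traceback above):
    # response = make_biolucida_session().get(url, params=query_args, timeout=(5, 30))

With that configuration a dead host surfaces as a ConnectTimeout/ConnectionError after a few bounded attempts rather than stalling the whole test run.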
=============================== warnings summary ===============================
/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/apscheduler/__init__.py:1
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/apscheduler/__init__.py:1: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import get_distribution, DistributionNotFound

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/marshmallow/__init__.py:17
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/marshmallow/__init__.py:17: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    __version_info__ = tuple(LooseVersion(__version__).version)

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_marshmallow/__init__.py:34
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/flask_marshmallow/__init__.py:34: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    __version_info__ = tuple(LooseVersion(__version__).version)

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:19
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:19: DeprecationWarning: Call to deprecated create function FileDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
    DESCRIPTOR = _descriptor.FileDescriptor(

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:36
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:36: DeprecationWarning: Call to deprecated create function FieldDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
    _descriptor.FieldDescriptor(

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:53
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:53: DeprecationWarning: Call to deprecated create function FieldDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
    _descriptor.FieldDescriptor(

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:70
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:70: DeprecationWarning: Call to deprecated create function FieldDescriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
    _descriptor.FieldDescriptor(

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:29
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/pennsieve/cache/cache_segment_pb2.py:29: DeprecationWarning: Call to deprecated create function Descriptor(). Note: Create unlinked descriptors is going to go away. Please use get/find descriptors from generated code or query the descriptor_pool.
    _CACHESEGMENT = _descriptor.Descriptor(

app/manifest_name_to_discover_name.py:5075
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5075: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-4\sam-3\sub-4_sam-3_ChAT_P4-3p1_20x.jp2': 'files/derivative/sub-4/sam-3/sub-4_sam-3_ChAT_P4-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5076
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5076: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-4\sam-3\sub-4_sam-3_TH_P4-3p1_20x.jp2': 'files/derivative/sub-4/sam-3/sub-4_sam-3_TH_P4-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5077
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5077: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-5\sam-1\sub-5_sam-1_ChAT_P5-1p1_20x.jp2': 'files/derivative/sub-5/sam-1/sub-5_sam-1_ChAT_P5-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5078
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5078: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-5\sam-1\sub-5_sam-1_TH_P5-1p4_20x.jp2': 'files/derivative/sub-5/sam-1/sub-5_sam-1_TH_P5-1p4_20x.jp2',

app/manifest_name_to_discover_name.py:5079
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5079: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-5\sam-3\sub-5_sam-3_ChAT_P5-3p1_20x.jp2': 'files/derivative/sub-5/sam-3/sub-5_sam-3_ChAT_P5-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5080
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5080: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-5\sam-3\sub-5_sam-3_TH_P5-3p1_20x.jp2': 'files/derivative/sub-5/sam-3/sub-5_sam-3_TH_P5-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5081
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5081: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-6\sam-1\sub-6_sam-1_ChAT_NPcontrol_P6-1p1_20x.jp2': 'files/derivative/sub-6/sam-1/sub-6_sam-1_ChAT_NPcontrol_P6-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5082
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5082: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-6\sam-1\sub-6_sam-1_TH_NPcontrol_P6-1p1_20x.jp2': 'files/derivative/sub-6/sam-1/sub-6_sam-1_TH_NPcontrol_P6-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5083
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5083: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-6\sam-7\sub-6_sam-7_ChAT_P6-7p1_20x.jp2': 'files/derivative/sub-6/sam-7/sub-6_sam-7_ChAT_P6-7p1_20x.jp2',

app/manifest_name_to_discover_name.py:5084
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5084: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-6\sam-7\sub-6_sam-7_TH_P6-7p1_20x.jp2': 'files/derivative/sub-6/sam-7/sub-6_sam-7_TH_P6-7p1_20x.jp2',

app/manifest_name_to_discover_name.py:5085
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5085: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-7\sam-7\sub-7_sam-7_ChAT_NPcontrol_P7-7p4_20x.jp2': 'files/derivative/sub-7/sam-7/sub-7_sam-7_ChAT_NPcontrol_P7-7p4_20x.jp2',

app/manifest_name_to_discover_name.py:5086
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5086: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-7\sam-7\sub-7_sam-7_TH_NPcontrol_P7-7p1_20x.jp2': 'files/derivative/sub-7/sam-7/sub-7_sam-7_TH_NPcontrol_P7-7p1_20x.jp2',

app/manifest_name_to_discover_name.py:5087
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5087: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-8\sam-7\sub-8\sam-1\sub-8_sam-1_ChAT_P8-1p1_20x.jp2': 'files/derivative/sub-8/sam-1/sub-8_sam-1_ChAT_P8-1p1_20x (1).jp2',

app/manifest_name_to_discover_name.py:5088
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5088: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-8\sam-7\sub-8\sam-1\sub-8_sam-1_TH_P8-1p1_20x.jp2': 'files/derivative/sub-8/sam-1/sub-8_sam-1_TH_P8-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5089
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5089: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-8\sam-7\sub-8\sam-7\sub-8_sam-7_ChAT_P8-7p1_20x.jp2': 'files/derivative/sub-8/sam-7/sub-8_sam-7_ChAT_P8-7p1_20x.jp2',

app/manifest_name_to_discover_name.py:5090
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5090: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-8\sam-7\sub-8_sam-7_TH_P8-7p1_20x.jp2': 'files/derivative/sub-8/sam-7/sub-8_sam-7_TH_P8-7p1_20x.jp2',

app/manifest_name_to_discover_name.py:5091
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5091: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-9\sam-3\sub-9_sam-3_ChAT_P9-3p1_20x.jp2': 'files/derivative/sub-9/sam-3/sub-9_sam-3_ChAT_P9-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5092
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5092: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-9\sam-3\sub-9_sam-3_TH_P9-3p3_20x.jp2': 'files/derivative/sub-9/sam-3/sub-9_sam-3_TH_P9-3p3_20x.jp2',

app/manifest_name_to_discover_name.py:5093
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5093: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-10\sam-1\sub-10_sam-1_ChAT_P10-1p1_20x.jp2': 'files/derivative/sub-10/sam-1/sub-10_sam-1_ChAT_P10-1p1_20x (1).jp2',

app/manifest_name_to_discover_name.py:5094
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5094: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-10\sam-1\sub-10_sam-1_TH_P10-1p1_20x.jp2': 'files/derivative/sub-10/sam-1/sub-10_sam-1_TH_P10-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5095
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5095: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-11\sam-1\sub-11_sam-1_ChAT_P11-1p2_20x.jp2': 'files/derivative/sub-11/sam-1/sub-11_sam-1_ChAT_P11-1p2_20x.jp2',

app/manifest_name_to_discover_name.py:5096
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5096: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-11\sam-1\sub-11_sam-1_TH_P11-1p1_20x.jp2': 'files/derivative/sub-11/sam-1/sub-11_sam-1_TH_P11-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5097
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5097: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-11\sam-3\sub-11_sam-3_ChAT_P11-3p3_20x.jp2': 'files/derivative/sub-11/sam-3/sub-11_sam-3_ChAT_P11-3p3_20x.jp2',

app/manifest_name_to_discover_name.py:5098
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5098: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-11\sam-3\sub-11_sam-3_TH_P11-3p1_20x.jp2': 'files/derivative/sub-11/sam-3/sub-11_sam-3_TH_P11-3p1_20x (1).jp2',

app/manifest_name_to_discover_name.py:5099
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5099: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-12\sam-1\sub-12_sam-1_ChAT_P12-1p1_20x.jp2': 'files/derivative/sub-12/sam-1/sub-12_sam-1_ChAT_P12-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5100
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5100: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-12\sam-1\sub-12_sam-1_TH_P12-1p1_20x.jp2': 'files/derivative/sub-12/sam-1/sub-12_sam-1_TH_P12-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5101
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5101: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-12\sam-3\sub-12_sam-3_ChAT_P12-3p3_20x.jp2': 'files/derivative/sub-12/sam-3/sub-12_sam-3_ChAT_P12-3p3_20x.jp2',

app/manifest_name_to_discover_name.py:5102
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5102: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-12\sam-3\sub-12_sam-3_TH_P12-3p1_20x.jp2': 'files/derivative/sub-12/sam-3/sub-12_sam-3_TH_P12-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5103
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5103: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-13\sam-1\sub-13_sam-1_ChAT_P13-1p1_20x.jp2': 'files/derivative/sub-13/sam-1/sub-13_sam-1_ChAT_P13-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5104
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5104: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-13\sam-1\sub-13_sam-1_TH_P13-1p1_20x.jp2': 'files/derivative/sub-13/sam-1/sub-13_sam-1_TH_P13-1p1_20x.jp2',

app/manifest_name_to_discover_name.py:5105
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5105: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-13\sam-3\sub-13_sam-3_ChAT_P13-3p3_20x.jp2': 'files/derivative/sub-13/sam-3/sub-13_sam-3_ChAT_P13-3p3_20x.jp2',

app/manifest_name_to_discover_name.py:5106
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5106: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-13\sam-3\sub-13_sam-3_TH_P13-3p1_20x.jp2': 'files/derivative/sub-13/sam-3/sub-13_sam-3_TH_P13-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5107
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5107: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-14\sam-2\sub-14_sam-2_ChAT_P14-2p1_20x.jp2': 'files/derivative/sub-14/sam-2/sub-14_sam-2_ChAT_P14-2p1_20x.jp2',

app/manifest_name_to_discover_name.py:5108
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5108: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-14\sam-2\sub-14_sam-2_TH_P14-2p1_20x.jp2': 'files/derivative/sub-14/sam-2/sub-14_sam-2_TH_P14-2p1_20x.jp2',

app/manifest_name_to_discover_name.py:5109
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5109: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-14\sam-3\sub-14_sam-3_ChAT_P14-3p3_20x.jp2': 'files/derivative/sub-14/sam-3/sub-14_sam-3_ChAT_P14-3p3_20x (1).jp2',

app/manifest_name_to_discover_name.py:5110
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5110: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-14\sam-3\sub-14_sam-3_TH_P14-3p1_20x.jp2': 'files/derivative/sub-14/sam-3/sub-14_sam-3_TH_P14-3p1_20x.jp2',

app/manifest_name_to_discover_name.py:5111
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5111: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-15\sam-2\sub-15_sam-2_ChAT_P15-2p2_20x.jp2': 'files/derivative/sub-15/sam-2/sub-15_sam-2_ChAT_P15-2p2_20x.jp2',

app/manifest_name_to_discover_name.py:5112
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5112: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-15\sam-2\sub-15_sam-2_TH_P15-2p1_20x.jp2': 'files/derivative/sub-15/sam-2/sub-15_sam-2_TH_P15-2p1_20x.jp2',

app/manifest_name_to_discover_name.py:5113
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5113: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-15\sam-3\sub-15_sam-3_ChAT_P15-3p3_20x.jp2': 'files/derivative/sub-15/sam-3/sub-15_sam-3_ChAT_P15-3p3_20x.jp2',

app/manifest_name_to_discover_name.py:5114
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/app/manifest_name_to_discover_name.py:5114: DeprecationWarning: invalid escape sequence \s
    'files/derivative/sub-15\sam-3\sub-15_sam-3_TH_P15-3p1_20x.jp2': 'files/derivative/sub-15/sam-3/sub-15_sam-3_TH_P15-3p1_20x.jp2',

/home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/nose/importer.py:12
  /home/cmiss/Jenkins/shiningpanda/jobs/7c726052/virtualenvs/d41d8cd9/lib/python3.9/site-packages/nose/importer.py:12: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import find_module, load_module, acquire_lock, release_lock

tests/test_health.py::test_request_response
  /home/cmiss/Jenkins/workspace/SPARC-API-DEV/tests/test_health.py:13: DeprecationWarning: Please use assertEqual instead.
    assert_equals("healthy", json_response.get("status"))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
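
Note: two groups of warnings above originate in this repository rather than in dependencies: the "invalid escape sequence \s" warnings in app/manifest_name_to_discover_name.py (backslashes inside ordinary string literals, which newer Python versions escalate toward errors) and the deprecated nose assert_equals in tests/test_health.py:13. A minimal sketch of the usual fixes; the dictionary name and the stand-in payload below are illustrative:

    # Raw string literals keep the intended backslashes and silence the
    # "invalid escape sequence \s" DeprecationWarning (escaping as "\\s"
    # would work equally well).
    NAME_MAP = {
        r'files/derivative/sub-4\sam-3\sub-4_sam-3_ChAT_P4-3p1_20x.jp2':
            'files/derivative/sub-4/sam-3/sub-4_sam-3_ChAT_P4-3p1_20x.jp2',
    }

    # nose's assert_equals is deprecated; a plain pytest-style assertion
    # carries the same check.
    json_response = {"status": "healthy"}  # stand-in for the /health payload
    assert json_response.get("status") == "healthy"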
=========================== short test summary info ============================
FAILED tests/test_api.py::test_onto_term_lookup - AssertionError: assert 'not...
FAILED tests/test_thumbnails.py::test_neurolucida_thumbnail - requests.except...
FAILED tests/test_thumbnails.py::test_neurolucida_thumbnail_dataset_221 - req...
======= 3 failed, 94 passed, 4 skipped, 50 warnings in 571.54s (0:09:31) =======
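
Note: both thumbnail failures are environmental: app/main.py:309 issues requests.get() against sparc.biolucida.net, which this build agent cannot reach. If the tests are meant to exercise the endpoint logic rather than Biolucida availability, the outbound call can be stubbed. A sketch under the assumption that the module-level "requests" used in app/main.py is the right patch target; the fake payload is invented for illustration and may need more attributes depending on what thumbnail_from_neurolucida_file reads from the response:

    from unittest import mock

    def test_neurolucida_thumbnail_dataset_221_stubbed(client):
        # Fake the Biolucida response so the test does not depend on
        # sparc.biolucida.net being reachable from the CI agent.
        fake = mock.Mock(status_code=200, content=b"fake-thumbnail-bytes")
        query_string = {'datasetId': 221, 'version': 3,
                        'path': 'files/derivative/sub-M168/digital-traces/'
                                'pCm168_AAV_Z_20x_191211_S3B_lx_IGS.xml'}
        # app/main.py:309 calls requests.get(url, params=query_args); patching
        # that call keeps the Flask route under test without network access.
        with mock.patch('app.main.requests.get', return_value=fake) as fake_get:
            r = client.get('/thumbnail/neurolucida', query_string=query_string)
        assert fake_get.called
        # Further assertions on r.status_code / r.data depend on how the view
        # transforms the Biolucida payload.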
Build step 'Virtualenv Builder' marked build as failure
[Slack Notifications] found #894 as previous completed, non-aborted build
[Slack Notifications] will send OnEveryFailureNotification because build matches and user preferences allow it
Finished: FAILURE