Restoring authorship annotation for <achigin@yandex-team.ru>. Commit 2 of 2.

achigin · 3 years ago · commit d21d0e1590

contrib/python/urllib3/.dist-info/METADATA · +943 −943

@@ -1,24 +1,24 @@
 Metadata-Version: 2.1
-Name: urllib3 
+Name: urllib3
 Version: 1.26.8
-Summary: HTTP library with thread-safe connection pooling, file post, and more. 
-Home-page: https://urllib3.readthedocs.io/ 
-Author: Andrey Petrov 
-Author-email: andrey.petrov@shazow.net 
-License: MIT 
+Summary: HTTP library with thread-safe connection pooling, file post, and more.
+Home-page: https://urllib3.readthedocs.io/
+Author: Andrey Petrov
+Author-email: andrey.petrov@shazow.net
+License: MIT
 Project-URL: Documentation, https://urllib3.readthedocs.io/
 Project-URL: Code, https://github.com/urllib3/urllib3
 Project-URL: Issue tracker, https://github.com/urllib3/urllib3/issues
-Keywords: urllib httplib threadsafe filepost http https ssl pooling 
-Platform: UNKNOWN 
-Classifier: Environment :: Web Environment 
-Classifier: Intended Audience :: Developers 
-Classifier: License :: OSI Approved :: MIT License 
-Classifier: Operating System :: OS Independent 
-Classifier: Programming Language :: Python 
-Classifier: Programming Language :: Python :: 2 
+Keywords: urllib httplib threadsafe filepost http https ssl pooling
+Platform: UNKNOWN
+Classifier: Environment :: Web Environment
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 2
 Classifier: Programming Language :: Python :: 2.7
-Classifier: Programming Language :: Python :: 3 
+Classifier: Programming Language :: Python :: 3
 Classifier: Programming Language :: Python :: 3.5
 Classifier: Programming Language :: Python :: 3.6
 Classifier: Programming Language :: Python :: 3.7
@@ -28,8 +28,8 @@ Classifier: Programming Language :: Python :: 3.10
 Classifier: Programming Language :: Python :: 3.11
 Classifier: Programming Language :: Python :: Implementation :: CPython
 Classifier: Programming Language :: Python :: Implementation :: PyPy
-Classifier: Topic :: Internet :: WWW/HTTP 
-Classifier: Topic :: Software Development :: Libraries 
+Classifier: Topic :: Internet :: WWW/HTTP
+Classifier: Topic :: Software Development :: Libraries
 Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4
 Description-Content-Type: text/x-rst
 License-File: LICENSE.txt
@@ -43,61 +43,61 @@ Requires-Dist: certifi ; extra == 'secure'
 Requires-Dist: ipaddress ; (python_version == "2.7") and extra == 'secure'
 Provides-Extra: socks
 Requires-Dist: PySocks (!=1.5.7,<2.0,>=1.5.6) ; extra == 'socks'
- 
- 
+
+
 urllib3 is a powerful, *user-friendly* HTTP client for Python. Much of the
-Python ecosystem already uses urllib3 and you should too. 
-urllib3 brings many critical features that are missing from the Python 
-standard libraries: 
- 
-- Thread safety. 
-- Connection pooling. 
-- Client-side SSL/TLS verification. 
-- File uploads with multipart encoding. 
-- Helpers for retrying requests and dealing with HTTP redirects. 
+Python ecosystem already uses urllib3 and you should too.
+urllib3 brings many critical features that are missing from the Python
+standard libraries:
+
+- Thread safety.
+- Connection pooling.
+- Client-side SSL/TLS verification.
+- File uploads with multipart encoding.
+- Helpers for retrying requests and dealing with HTTP redirects.
 - Support for gzip, deflate, and brotli encoding.
-- Proxy support for HTTP and SOCKS. 
-- 100% test coverage. 
- 
+- Proxy support for HTTP and SOCKS.
+- 100% test coverage.
+
 urllib3 is powerful and easy to use:
- 
+
 .. code-block:: python
 
-    >>> import urllib3 
-    >>> http = urllib3.PoolManager() 
-    >>> r = http.request('GET', 'http://httpbin.org/robots.txt') 
-    >>> r.status 
-    200 
-    >>> r.data 
-    'User-agent: *\nDisallow: /deny\n' 
- 
-
-Installing 
----------- 
- 
-urllib3 can be installed with `pip <https://pip.pypa.io>`_:: 
- 
+    >>> import urllib3
+    >>> http = urllib3.PoolManager()
+    >>> r = http.request('GET', 'http://httpbin.org/robots.txt')
+    >>> r.status
+    200
+    >>> r.data
+    'User-agent: *\nDisallow: /deny\n'
+
+
+Installing
+----------
+
+urllib3 can be installed with `pip <https://pip.pypa.io>`_::
+
     $ python -m pip install urllib3
- 
+
 Alternatively, you can grab the latest source code from `GitHub <https://github.com/urllib3/urllib3>`_::
- 
+
     $ git clone git://github.com/urllib3/urllib3.git
-    $ python setup.py install 
- 
- 
-Documentation 
-------------- 
- 
-urllib3 has usage and reference documentation at `urllib3.readthedocs.io <https://urllib3.readthedocs.io>`_. 
- 
- 
-Contributing 
------------- 
- 
-urllib3 happily accepts contributions. Please see our 
-`contributing documentation <https://urllib3.readthedocs.io/en/latest/contributing.html>`_ 
-for some tips on getting started. 
- 
+    $ python setup.py install
+
+
+Documentation
+-------------
+
+urllib3 has usage and reference documentation at `urllib3.readthedocs.io <https://urllib3.readthedocs.io>`_.
+
+
+Contributing
+------------
+
+urllib3 happily accepts contributions. Please see our
+`contributing documentation <https://urllib3.readthedocs.io/en/latest/contributing.html>`_
+for some tips on getting started.
+
 
 Security Disclosures
 --------------------
@@ -107,9 +107,9 @@ To report a security vulnerability, please use the
 Tidelift will coordinate the fix and disclosure with maintainers.
 
 
-Maintainers 
------------ 
- 
+Maintainers
+-----------
+
 - `@sethmlarson <https://github.com/sethmlarson>`__ (Seth M. Larson)
 - `@pquentin <https://github.com/pquentin>`__ (Quentin Pradet)
 - `@theacodes <https://github.com/theacodes>`__ (Thea Flowers)
@@ -117,13 +117,13 @@ Maintainers
 - `@lukasa <https://github.com/lukasa>`__ (Cory Benfield)
 - `@sigmavirus24 <https://github.com/sigmavirus24>`__ (Ian Stapleton Cordasco)
 - `@shazow <https://github.com/shazow>`__ (Andrey Petrov)
- 
-👋 
- 
 
-Sponsorship 
------------ 
- 
+👋
+
+
+Sponsorship
+-----------
+
 If your company benefits from this library, please consider `sponsoring its
 development <https://urllib3.readthedocs.io/en/latest/sponsors.html>`_.
 
@@ -147,10 +147,10 @@ For Enterprise
 
 .. _Tidelift Subscription: https://tidelift.com/subscription/pkg/pypi-urllib3?utm_source=pypi-urllib3&utm_medium=referral&utm_campaign=readme
 
- 
+
 Changes
 =======
- 
+
 1.26.8 (2022-01-07)
 -------------------
 
@@ -185,7 +185,7 @@ Changes
   ``Transfer-Encoding`` headers in the case that one is already specified.
 * Fixed typo in deprecation message to recommend ``Retry.DEFAULT_ALLOWED_METHODS``.
 
- 
+
 1.26.5 (2021-05-26)
 -------------------
 
@@ -529,885 +529,885 @@ Changes
 * pyopenssl: Use vendored version of ``six``. (Issue #1231)
 
 
-1.21.1 (2017-05-02) 
-------------------- 
- 
-* Fixed SecureTransport issue that would cause long delays in response body 
-  delivery. (Pull #1154) 
- 
-* Fixed regression in 1.21 that threw exceptions when users passed the 
-  ``socket_options`` flag to the ``PoolManager``.  (Issue #1165) 
- 
-* Fixed regression in 1.21 that threw exceptions when users passed the 
-  ``assert_hostname`` or ``assert_fingerprint`` flag to the ``PoolManager``. 
-  (Pull #1157) 
- 
- 
-1.21 (2017-04-25) 
------------------ 
- 
-* Improved performance of certain selector system calls on Python 3.5 and 
-  later. (Pull #1095) 
- 
-* Resolved issue where the PyOpenSSL backend would not wrap SysCallError 
-  exceptions appropriately when sending data. (Pull #1125) 
- 
-* Selectors now detects a monkey-patched select module after import for modules 
-  that patch the select module like eventlet, greenlet. (Pull #1128) 
- 
-* Reduced memory consumption when streaming zlib-compressed responses 
-  (as opposed to raw deflate streams). (Pull #1129) 
- 
-* Connection pools now use the entire request context when constructing the 
-  pool key. (Pull #1016) 
- 
-* ``PoolManager.connection_from_*`` methods now accept a new keyword argument, 
-  ``pool_kwargs``, which are merged with the existing ``connection_pool_kw``. 
-  (Pull #1016) 
- 
-* Add retry counter for ``status_forcelist``. (Issue #1147) 
- 
-* Added ``contrib`` module for using SecureTransport on macOS: 
-  ``urllib3.contrib.securetransport``.  (Pull #1122) 
- 
-* urllib3 now only normalizes the case of ``http://`` and ``https://`` schemes: 
-  for schemes it does not recognise, it assumes they are case-sensitive and 
-  leaves them unchanged. 
-  (Issue #1080) 
- 
- 
-1.20 (2017-01-19) 
------------------ 
- 
-* Added support for waiting for I/O using selectors other than select, 
-  improving urllib3's behaviour with large numbers of concurrent connections. 
-  (Pull #1001) 
- 
-* Updated the date for the system clock check. (Issue #1005) 
- 
-* ConnectionPools now correctly consider hostnames to be case-insensitive. 
-  (Issue #1032) 
- 
-* Outdated versions of PyOpenSSL now cause the PyOpenSSL contrib module 
-  to fail when it is injected, rather than at first use. (Pull #1063) 
- 
-* Outdated versions of cryptography now cause the PyOpenSSL contrib module 
-  to fail when it is injected, rather than at first use. (Issue #1044) 
- 
-* Automatically attempt to rewind a file-like body object when a request is 
-  retried or redirected. (Pull #1039) 
- 
-* Fix some bugs that occur when modules incautiously patch the queue module. 
-  (Pull #1061) 
- 
+1.21.1 (2017-05-02)
+-------------------
+
+* Fixed SecureTransport issue that would cause long delays in response body
+  delivery. (Pull #1154)
+
+* Fixed regression in 1.21 that threw exceptions when users passed the
+  ``socket_options`` flag to the ``PoolManager``.  (Issue #1165)
+
+* Fixed regression in 1.21 that threw exceptions when users passed the
+  ``assert_hostname`` or ``assert_fingerprint`` flag to the ``PoolManager``.
+  (Pull #1157)
+
+
+1.21 (2017-04-25)
+-----------------
+
+* Improved performance of certain selector system calls on Python 3.5 and
+  later. (Pull #1095)
+
+* Resolved issue where the PyOpenSSL backend would not wrap SysCallError
+  exceptions appropriately when sending data. (Pull #1125)
+
+* Selectors now detects a monkey-patched select module after import for modules
+  that patch the select module like eventlet, greenlet. (Pull #1128)
+
+* Reduced memory consumption when streaming zlib-compressed responses
+  (as opposed to raw deflate streams). (Pull #1129)
+
+* Connection pools now use the entire request context when constructing the
+  pool key. (Pull #1016)
+
+* ``PoolManager.connection_from_*`` methods now accept a new keyword argument,
+  ``pool_kwargs``, which are merged with the existing ``connection_pool_kw``.
+  (Pull #1016)
+
+* Add retry counter for ``status_forcelist``. (Issue #1147)
+
+* Added ``contrib`` module for using SecureTransport on macOS:
+  ``urllib3.contrib.securetransport``.  (Pull #1122)
+
+* urllib3 now only normalizes the case of ``http://`` and ``https://`` schemes:
+  for schemes it does not recognise, it assumes they are case-sensitive and
+  leaves them unchanged.
+  (Issue #1080)
+
+
+1.20 (2017-01-19)
+-----------------
+
+* Added support for waiting for I/O using selectors other than select,
+  improving urllib3's behaviour with large numbers of concurrent connections.
+  (Pull #1001)
+
+* Updated the date for the system clock check. (Issue #1005)
+
+* ConnectionPools now correctly consider hostnames to be case-insensitive.
+  (Issue #1032)
+
+* Outdated versions of PyOpenSSL now cause the PyOpenSSL contrib module
+  to fail when it is injected, rather than at first use. (Pull #1063)
+
+* Outdated versions of cryptography now cause the PyOpenSSL contrib module
+  to fail when it is injected, rather than at first use. (Issue #1044)
+
+* Automatically attempt to rewind a file-like body object when a request is
+  retried or redirected. (Pull #1039)
+
+* Fix some bugs that occur when modules incautiously patch the queue module.
+  (Pull #1061)
+
 * Prevent retries from occurring on read timeouts for which the request method
-  was not in the method whitelist. (Issue #1059) 
- 
-* Changed the PyOpenSSL contrib module to lazily load idna to avoid 
-  unnecessarily bloating the memory of programs that don't need it. (Pull 
-  #1076) 
- 
-* Add support for IPv6 literals with zone identifiers. (Pull #1013) 
- 
-* Added support for socks5h:// and socks4a:// schemes when working with SOCKS 
-  proxies, and controlled remote DNS appropriately. (Issue #1035) 
- 
- 
-1.19.1 (2016-11-16) 
-------------------- 
- 
-* Fixed AppEngine import that didn't function on Python 3.5. (Pull #1025) 
- 
- 
-1.19 (2016-11-03) 
------------------ 
- 
-* urllib3 now respects Retry-After headers on 413, 429, and 503 responses when 
-  using the default retry logic. (Pull #955) 
- 
-* Remove markers from setup.py to assist ancient setuptools versions. (Issue 
-  #986) 
- 
-* Disallow superscripts and other integerish things in URL ports. (Issue #989) 
- 
-* Allow urllib3's HTTPResponse.stream() method to continue to work with 
-  non-httplib underlying FPs. (Pull #990) 
- 
-* Empty filenames in multipart headers are now emitted as such, rather than 
+  was not in the method whitelist. (Issue #1059)
+
+* Changed the PyOpenSSL contrib module to lazily load idna to avoid
+  unnecessarily bloating the memory of programs that don't need it. (Pull
+  #1076)
+
+* Add support for IPv6 literals with zone identifiers. (Pull #1013)
+
+* Added support for socks5h:// and socks4a:// schemes when working with SOCKS
+  proxies, and controlled remote DNS appropriately. (Issue #1035)
+
+
+1.19.1 (2016-11-16)
+-------------------
+
+* Fixed AppEngine import that didn't function on Python 3.5. (Pull #1025)
+
+
+1.19 (2016-11-03)
+-----------------
+
+* urllib3 now respects Retry-After headers on 413, 429, and 503 responses when
+  using the default retry logic. (Pull #955)
+
+* Remove markers from setup.py to assist ancient setuptools versions. (Issue
+  #986)
+
+* Disallow superscripts and other integerish things in URL ports. (Issue #989)
+
+* Allow urllib3's HTTPResponse.stream() method to continue to work with
+  non-httplib underlying FPs. (Pull #990)
+
+* Empty filenames in multipart headers are now emitted as such, rather than
   being suppressed. (Issue #1015)
- 
-* Prefer user-supplied Host headers on chunked uploads. (Issue #1009) 
- 
- 
-1.18.1 (2016-10-27) 
-------------------- 
- 
-* CVE-2016-9015. Users who are using urllib3 version 1.17 or 1.18 along with 
-  PyOpenSSL injection and OpenSSL 1.1.0 *must* upgrade to this version. This 
-  release fixes a vulnerability whereby urllib3 in the above configuration 
-  would silently fail to validate TLS certificates due to erroneously setting 
-  invalid flags in OpenSSL's ``SSL_CTX_set_verify`` function. These erroneous 
-  flags do not cause a problem in OpenSSL versions before 1.1.0, which 
-  interprets the presence of any flag as requesting certificate validation. 
- 
-  There is no PR for this patch, as it was prepared for simultaneous disclosure 
+
+* Prefer user-supplied Host headers on chunked uploads. (Issue #1009)
+
+
+1.18.1 (2016-10-27)
+-------------------
+
+* CVE-2016-9015. Users who are using urllib3 version 1.17 or 1.18 along with
+  PyOpenSSL injection and OpenSSL 1.1.0 *must* upgrade to this version. This
+  release fixes a vulnerability whereby urllib3 in the above configuration
+  would silently fail to validate TLS certificates due to erroneously setting
+  invalid flags in OpenSSL's ``SSL_CTX_set_verify`` function. These erroneous
+  flags do not cause a problem in OpenSSL versions before 1.1.0, which
+  interprets the presence of any flag as requesting certificate validation.
+
+  There is no PR for this patch, as it was prepared for simultaneous disclosure
   and release. The master branch received the same fix in Pull #1010.
- 
- 
-1.18 (2016-09-26) 
------------------ 
- 
+
+
+1.18 (2016-09-26)
+-----------------
+
 * Fixed incorrect message for IncompleteRead exception. (Pull #973)
- 
-* Accept ``iPAddress`` subject alternative name fields in TLS certificates. 
-  (Issue #258) 
- 
-* Fixed consistency of ``HTTPResponse.closed`` between Python 2 and 3. 
-  (Issue #977) 
- 
-* Fixed handling of wildcard certificates when using PyOpenSSL. (Issue #979) 
- 
- 
-1.17 (2016-09-06) 
------------------ 
- 
-* Accept ``SSLContext`` objects for use in SSL/TLS negotiation. (Issue #835) 
- 
-* ConnectionPool debug log now includes scheme, host, and port. (Issue #897) 
- 
-* Substantially refactored documentation. (Issue #887) 
- 
-* Used URLFetch default timeout on AppEngine, rather than hardcoding our own. 
-  (Issue #858) 
- 
-* Normalize the scheme and host in the URL parser (Issue #833) 
- 
-* ``HTTPResponse`` contains the last ``Retry`` object, which now also 
-  contains retries history. (Issue #848) 
- 
-* Timeout can no longer be set as boolean, and must be greater than zero. 
+
+* Accept ``iPAddress`` subject alternative name fields in TLS certificates.
+  (Issue #258)
+
+* Fixed consistency of ``HTTPResponse.closed`` between Python 2 and 3.
+  (Issue #977)
+
+* Fixed handling of wildcard certificates when using PyOpenSSL. (Issue #979)
+
+
+1.17 (2016-09-06)
+-----------------
+
+* Accept ``SSLContext`` objects for use in SSL/TLS negotiation. (Issue #835)
+
+* ConnectionPool debug log now includes scheme, host, and port. (Issue #897)
+
+* Substantially refactored documentation. (Issue #887)
+
+* Used URLFetch default timeout on AppEngine, rather than hardcoding our own.
+  (Issue #858)
+
+* Normalize the scheme and host in the URL parser (Issue #833)
+
+* ``HTTPResponse`` contains the last ``Retry`` object, which now also
+  contains retries history. (Issue #848)
+
+* Timeout can no longer be set as boolean, and must be greater than zero.
   (Pull #924)
- 
-* Removed pyasn1 and ndg-httpsclient from dependencies used for PyOpenSSL. We 
-  now use cryptography and idna, both of which are already dependencies of 
+
+* Removed pyasn1 and ndg-httpsclient from dependencies used for PyOpenSSL. We
+  now use cryptography and idna, both of which are already dependencies of
   PyOpenSSL. (Pull #930)
- 
-* Fixed infinite loop in ``stream`` when amt=None. (Issue #928) 
- 
-* Try to use the operating system's certificates when we are using an 
+
+* Fixed infinite loop in ``stream`` when amt=None. (Issue #928)
+
+* Try to use the operating system's certificates when we are using an
   ``SSLContext``. (Pull #941)
- 
-* Updated cipher suite list to allow ChaCha20+Poly1305. AES-GCM is preferred to 
+
+* Updated cipher suite list to allow ChaCha20+Poly1305. AES-GCM is preferred to
   ChaCha20, but ChaCha20 is then preferred to everything else. (Pull #947)
- 
+
 * Updated cipher suite list to remove 3DES-based cipher suites. (Pull #958)
- 
+
 * Removed the cipher suite fallback to allow HIGH ciphers. (Pull #958)
- 
-* Implemented ``length_remaining`` to determine remaining content 
+
+* Implemented ``length_remaining`` to determine remaining content
   to be read. (Pull #949)
- 
-* Implemented ``enforce_content_length`` to enable exceptions when 
+
+* Implemented ``enforce_content_length`` to enable exceptions when
   incomplete data chunks are received. (Pull #949)
- 
-* Dropped connection start, dropped connection reset, redirect, forced retry, 
+
+* Dropped connection start, dropped connection reset, redirect, forced retry,
   and new HTTPS connection log levels to DEBUG, from INFO. (Pull #967)
- 
- 
-1.16 (2016-06-11) 
------------------ 
- 
-* Disable IPv6 DNS when IPv6 connections are not possible. (Issue #840) 
- 
-* Provide ``key_fn_by_scheme`` pool keying mechanism that can be 
-  overridden. (Issue #830) 
- 
-* Normalize scheme and host to lowercase for pool keys, and include 
-  ``source_address``. (Issue #830) 
- 
-* Cleaner exception chain in Python 3 for ``_make_request``. 
-  (Issue #861) 
- 
-* Fixed installing ``urllib3[socks]`` extra. (Issue #864) 
- 
-* Fixed signature of ``ConnectionPool.close`` so it can actually safely be 
-  called by subclasses. (Issue #873) 
- 
-* Retain ``release_conn`` state across retries. (Issues #651, #866) 
- 
-* Add customizable ``HTTPConnectionPool.ResponseCls``, which defaults to 
-  ``HTTPResponse`` but can be replaced with a subclass. (Issue #879) 
- 
- 
-1.15.1 (2016-04-11) 
-------------------- 
- 
-* Fix packaging to include backports module. (Issue #841) 
- 
- 
-1.15 (2016-04-06) 
------------------ 
- 
-* Added Retry(raise_on_status=False). (Issue #720) 
- 
-* Always use setuptools, no more distutils fallback. (Issue #785) 
- 
-* Dropped support for Python 3.2. (Issue #786) 
- 
-* Chunked transfer encoding when requesting with ``chunked=True``. 
-  (Issue #790) 
- 
-* Fixed regression with IPv6 port parsing. (Issue #801) 
- 
-* Append SNIMissingWarning messages to allow users to specify it in 
-  the PYTHONWARNINGS environment variable. (Issue #816) 
- 
-* Handle unicode headers in Py2. (Issue #818) 
- 
-* Log certificate when there is a hostname mismatch. (Issue #820) 
- 
-* Preserve order of request/response headers. (Issue #821) 
- 
- 
-1.14 (2015-12-29) 
------------------ 
- 
-* contrib: SOCKS proxy support! (Issue #762) 
- 
-* Fixed AppEngine handling of transfer-encoding header and bug 
-  in Timeout defaults checking. (Issue #763) 
- 
- 
-1.13.1 (2015-12-18) 
-------------------- 
- 
-* Fixed regression in IPv6 + SSL for match_hostname. (Issue #761) 
- 
- 
-1.13 (2015-12-14) 
------------------ 
- 
-* Fixed ``pip install urllib3[secure]`` on modern pip. (Issue #706) 
- 
-* pyopenssl: Fixed SSL3_WRITE_PENDING error. (Issue #717) 
- 
-* pyopenssl: Support for TLSv1.1 and TLSv1.2. (Issue #696) 
- 
-* Close connections more defensively on exception. (Issue #734) 
- 
-* Adjusted ``read_chunked`` to handle gzipped, chunk-encoded bodies without 
-  repeatedly flushing the decoder, to function better on Jython. (Issue #743) 
- 
-* Accept ``ca_cert_dir`` for SSL-related PoolManager configuration. (Issue #758) 
- 
- 
-1.12 (2015-09-03) 
------------------ 
- 
-* Rely on ``six`` for importing ``httplib`` to work around 
-  conflicts with other Python 3 shims. (Issue #688) 
- 
-* Add support for directories of certificate authorities, as supported by 
-  OpenSSL. (Issue #701) 
- 
-* New exception: ``NewConnectionError``, raised when we fail to establish 
-  a new connection, usually ``ECONNREFUSED`` socket error. 
- 
- 
-1.11 (2015-07-21) 
------------------ 
- 
-* When ``ca_certs`` is given, ``cert_reqs`` defaults to 
-  ``'CERT_REQUIRED'``. (Issue #650) 
- 
-* ``pip install urllib3[secure]`` will install Certifi and 
-  PyOpenSSL as dependencies. (Issue #678) 
- 
-* Made ``HTTPHeaderDict`` usable as a ``headers`` input value 
-  (Issues #632, #679) 
- 
-* Added `urllib3.contrib.appengine <https://urllib3.readthedocs.io/en/latest/contrib.html#google-app-engine>`_ 
-  which has an ``AppEngineManager`` for using ``URLFetch`` in a 
-  Google AppEngine environment. (Issue #664) 
- 
-* Dev: Added test suite for AppEngine. (Issue #631) 
- 
-* Fix performance regression when using PyOpenSSL. (Issue #626) 
- 
-* Passing incorrect scheme (e.g. ``foo://``) will raise 
-  ``ValueError`` instead of ``AssertionError`` (backwards 
-  compatible for now, but please migrate). (Issue #640) 
- 
-* Fix pools not getting replenished when an error occurs during a 
-  request using ``release_conn=False``. (Issue #644) 
- 
-* Fix pool-default headers not applying for url-encoded requests 
-  like GET. (Issue #657) 
- 
-* log.warning in Python 3 when headers are skipped due to parsing 
-  errors. (Issue #642) 
- 
-* Close and discard connections if an error occurs during read. 
-  (Issue #660) 
- 
-* Fix host parsing for IPv6 proxies. (Issue #668) 
- 
-* Separate warning type SubjectAltNameWarning, now issued once 
-  per host. (Issue #671) 
- 
-* Fix ``httplib.IncompleteRead`` not getting converted to 
-  ``ProtocolError`` when using ``HTTPResponse.stream()`` 
-  (Issue #674) 
- 
-1.10.4 (2015-05-03) 
-------------------- 
- 
-* Migrate tests to Tornado 4. (Issue #594) 
- 
-* Append default warning configuration rather than overwrite. 
-  (Issue #603) 
- 
-* Fix streaming decoding regression. (Issue #595) 
- 
-* Fix chunked requests losing state across keep-alive connections. 
-  (Issue #599) 
- 
-* Fix hanging when chunked HEAD response has no body. (Issue #605) 
- 
- 
-1.10.3 (2015-04-21) 
-------------------- 
- 
-* Emit ``InsecurePlatformWarning`` when SSLContext object is missing. 
-  (Issue #558) 
- 
-* Fix regression of duplicate header keys being discarded. 
-  (Issue #563) 
- 
-* ``Response.stream()`` returns a generator for chunked responses. 
-  (Issue #560) 
- 
-* Set upper-bound timeout when waiting for a socket in PyOpenSSL. 
-  (Issue #585) 
- 
-* Work on platforms without `ssl` module for plain HTTP requests. 
-  (Issue #587) 
- 
-* Stop relying on the stdlib's default cipher list. (Issue #588) 
- 
- 
-1.10.2 (2015-02-25) 
-------------------- 
- 
-* Fix file descriptor leakage on retries. (Issue #548) 
- 
-* Removed RC4 from default cipher list. (Issue #551) 
- 
-* Header performance improvements. (Issue #544) 
- 
-* Fix PoolManager not obeying redirect retry settings. (Issue #553) 
- 
- 
-1.10.1 (2015-02-10) 
-------------------- 
- 
-* Pools can be used as context managers. (Issue #545) 
- 
-* Don't re-use connections which experienced an SSLError. (Issue #529) 
- 
-* Don't fail when gzip decoding an empty stream. (Issue #535) 
- 
-* Add sha256 support for fingerprint verification. (Issue #540) 
- 
-* Fixed handling of header values containing commas. (Issue #533) 
- 
- 
-1.10 (2014-12-14) 
------------------ 
- 
-* Disabled SSLv3. (Issue #473) 
- 
-* Add ``Url.url`` property to return the composed url string. (Issue #394) 
- 
-* Fixed PyOpenSSL + gevent ``WantWriteError``. (Issue #412) 
- 
-* ``MaxRetryError.reason`` will always be an exception, not string. 
-  (Issue #481) 
- 
-* Fixed SSL-related timeouts not being detected as timeouts. (Issue #492) 
- 
-* Py3: Use ``ssl.create_default_context()`` when available. (Issue #473) 
- 
-* Emit ``InsecureRequestWarning`` for *every* insecure HTTPS request. 
-  (Issue #496) 
- 
-* Emit ``SecurityWarning`` when certificate has no ``subjectAltName``. 
-  (Issue #499) 
- 
-* Close and discard sockets which experienced SSL-related errors. 
-  (Issue #501) 
- 
-* Handle ``body`` param in ``.request(...)``. (Issue #513) 
- 
-* Respect timeout with HTTPS proxy. (Issue #505) 
- 
-* PyOpenSSL: Handle ZeroReturnError exception. (Issue #520) 
- 
- 
-1.9.1 (2014-09-13) 
------------------- 
- 
-* Apply socket arguments before binding. (Issue #427) 
- 
-* More careful checks if fp-like object is closed. (Issue #435) 
- 
-* Fixed packaging issues of some development-related files not 
-  getting included. (Issue #440) 
- 
-* Allow performing *only* fingerprint verification. (Issue #444) 
- 
-* Emit ``SecurityWarning`` if system clock is waaay off. (Issue #445) 
- 
-* Fixed PyOpenSSL compatibility with PyPy. (Issue #450) 
- 
-* Fixed ``BrokenPipeError`` and ``ConnectionError`` handling in Py3. 
-  (Issue #443) 
- 
- 
- 
-1.9 (2014-07-04) 
----------------- 
- 
-* Shuffled around development-related files. If you're maintaining a distro 
-  package of urllib3, you may need to tweak things. (Issue #415) 
- 
-* Unverified HTTPS requests will trigger a warning on the first request. See 
-  our new `security documentation 
-  <https://urllib3.readthedocs.io/en/latest/security.html>`_ for details. 
-  (Issue #426) 
- 
-* New retry logic and ``urllib3.util.retry.Retry`` configuration object. 
-  (Issue #326) 
- 
-* All raised exceptions should now wrapped in a 
-  ``urllib3.exceptions.HTTPException``-extending exception. (Issue #326) 
- 
-* All errors during a retry-enabled request should be wrapped in 
-  ``urllib3.exceptions.MaxRetryError``, including timeout-related exceptions 
-  which were previously exempt. Underlying error is accessible from the 
+
+
+1.16 (2016-06-11)
+-----------------
+
+* Disable IPv6 DNS when IPv6 connections are not possible. (Issue #840)
+
+* Provide ``key_fn_by_scheme`` pool keying mechanism that can be
+  overridden. (Issue #830)
+
+* Normalize scheme and host to lowercase for pool keys, and include
+  ``source_address``. (Issue #830)
+
+* Cleaner exception chain in Python 3 for ``_make_request``.
+  (Issue #861)
+
+* Fixed installing ``urllib3[socks]`` extra. (Issue #864)
+
+* Fixed signature of ``ConnectionPool.close`` so it can actually safely be
+  called by subclasses. (Issue #873)
+
+* Retain ``release_conn`` state across retries. (Issues #651, #866)
+
+* Add customizable ``HTTPConnectionPool.ResponseCls``, which defaults to
+  ``HTTPResponse`` but can be replaced with a subclass. (Issue #879)
+
+
+1.15.1 (2016-04-11)
+-------------------
+
+* Fix packaging to include backports module. (Issue #841)
+
+
+1.15 (2016-04-06)
+-----------------
+
+* Added Retry(raise_on_status=False). (Issue #720)
+
+* Always use setuptools, no more distutils fallback. (Issue #785)
+
+* Dropped support for Python 3.2. (Issue #786)
+
+* Chunked transfer encoding when requesting with ``chunked=True``.
+  (Issue #790)
+
+* Fixed regression with IPv6 port parsing. (Issue #801)
+
+* Append SNIMissingWarning messages to allow users to specify it in
+  the PYTHONWARNINGS environment variable. (Issue #816)
+
+* Handle unicode headers in Py2. (Issue #818)
+
+* Log certificate when there is a hostname mismatch. (Issue #820)
+
+* Preserve order of request/response headers. (Issue #821)
+
+
+1.14 (2015-12-29)
+-----------------
+
+* contrib: SOCKS proxy support! (Issue #762)
+
+* Fixed AppEngine handling of transfer-encoding header and bug
+  in Timeout defaults checking. (Issue #763)
+
+
+1.13.1 (2015-12-18)
+-------------------
+
+* Fixed regression in IPv6 + SSL for match_hostname. (Issue #761)
+
+
+1.13 (2015-12-14)
+-----------------
+
+* Fixed ``pip install urllib3[secure]`` on modern pip. (Issue #706)
+
+* pyopenssl: Fixed SSL3_WRITE_PENDING error. (Issue #717)
+
+* pyopenssl: Support for TLSv1.1 and TLSv1.2. (Issue #696)
+
+* Close connections more defensively on exception. (Issue #734)
+
+* Adjusted ``read_chunked`` to handle gzipped, chunk-encoded bodies without
+  repeatedly flushing the decoder, to function better on Jython. (Issue #743)
+
+* Accept ``ca_cert_dir`` for SSL-related PoolManager configuration. (Issue #758)
+
+
+1.12 (2015-09-03)
+-----------------
+
+* Rely on ``six`` for importing ``httplib`` to work around
+  conflicts with other Python 3 shims. (Issue #688)
+
+* Add support for directories of certificate authorities, as supported by
+  OpenSSL. (Issue #701)
+
+* New exception: ``NewConnectionError``, raised when we fail to establish
+  a new connection, usually ``ECONNREFUSED`` socket error.
+
+
+1.11 (2015-07-21)
+-----------------
+
+* When ``ca_certs`` is given, ``cert_reqs`` defaults to
+  ``'CERT_REQUIRED'``. (Issue #650)
+
+* ``pip install urllib3[secure]`` will install Certifi and
+  PyOpenSSL as dependencies. (Issue #678)
+
+* Made ``HTTPHeaderDict`` usable as a ``headers`` input value
+  (Issues #632, #679)
+
+* Added `urllib3.contrib.appengine <https://urllib3.readthedocs.io/en/latest/contrib.html#google-app-engine>`_
+  which has an ``AppEngineManager`` for using ``URLFetch`` in a
+  Google AppEngine environment. (Issue #664)
+
+* Dev: Added test suite for AppEngine. (Issue #631)
+
+* Fix performance regression when using PyOpenSSL. (Issue #626)
+
+* Passing incorrect scheme (e.g. ``foo://``) will raise
+  ``ValueError`` instead of ``AssertionError`` (backwards
+  compatible for now, but please migrate). (Issue #640)
+
+* Fix pools not getting replenished when an error occurs during a
+  request using ``release_conn=False``. (Issue #644)
+
+* Fix pool-default headers not applying for url-encoded requests
+  like GET. (Issue #657)
+
+* Log a warning in Python 3 when headers are skipped due to parsing
+  errors. (Issue #642)
+
+* Close and discard connections if an error occurs during read.
+  (Issue #660)
+
+* Fix host parsing for IPv6 proxies. (Issue #668)
+
+* Separate warning type SubjectAltNameWarning, now issued once
+  per host. (Issue #671)
+
+* Fix ``httplib.IncompleteRead`` not getting converted to
+  ``ProtocolError`` when using ``HTTPResponse.stream()``
+  (Issue #674)
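
The ``HTTPHeaderDict`` mentioned above can be built up and passed anywhere a ``headers`` mapping is accepted. A small sketch; note the import path is a private module in the 1.x series, so the fallback below is an assumption to cover both old and new releases:

```python
# Newer urllib3 releases export the class at the top level; 1.x keeps it
# in a private module.
try:
    from urllib3 import HTTPHeaderDict
except ImportError:
    from urllib3._collections import HTTPHeaderDict

headers = HTTPHeaderDict()
headers.add("Accept", "application/json")
headers.add("Accept", "text/plain")  # repeated keys are preserved

# Lookup is case-insensitive; repeated values are joined with ", ".
print(headers["accept"])
```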
+
+1.10.4 (2015-05-03)
+-------------------
+
+* Migrate tests to Tornado 4. (Issue #594)
+
+* Append default warning configuration rather than overwrite.
+  (Issue #603)
+
+* Fix streaming decoding regression. (Issue #595)
+
+* Fix chunked requests losing state across keep-alive connections.
+  (Issue #599)
+
+* Fix hanging when chunked HEAD response has no body. (Issue #605)
+
+
+1.10.3 (2015-04-21)
+-------------------
+
+* Emit ``InsecurePlatformWarning`` when SSLContext object is missing.
+  (Issue #558)
+
+* Fix regression of duplicate header keys being discarded.
+  (Issue #563)
+
+* ``Response.stream()`` returns a generator for chunked responses.
+  (Issue #560)
+
+* Set upper-bound timeout when waiting for a socket in PyOpenSSL.
+  (Issue #585)
+
+* Work on platforms without `ssl` module for plain HTTP requests.
+  (Issue #587)
+
+* Stop relying on the stdlib's default cipher list. (Issue #588)
+
+
+1.10.2 (2015-02-25)
+-------------------
+
+* Fix file descriptor leakage on retries. (Issue #548)
+
+* Removed RC4 from default cipher list. (Issue #551)
+
+* Header performance improvements. (Issue #544)
+
+* Fix PoolManager not obeying redirect retry settings. (Issue #553)
+
+
+1.10.1 (2015-02-10)
+-------------------
+
+* Pools can be used as context managers. (Issue #545)
+
+* Don't re-use connections which experienced an SSLError. (Issue #529)
+
+* Don't fail when gzip decoding an empty stream. (Issue #535)
+
+* Add sha256 support for fingerprint verification. (Issue #540)
+
+* Fixed handling of header values containing commas. (Issue #533)
+
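The context-manager support above looks like the sketch below; the hostname is a placeholder, and since constructing a pool does not open a socket, this runs without network access:

```python
from urllib3 import HTTPConnectionPool

# No connection is made until a request is issued, so this is safe offline.
with HTTPConnectionPool("example.com", maxsize=2) as pool:
    print(pool.host)
# On exit the pool is closed and its idle connections are discarded.
```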
+
+1.10 (2014-12-14)
+-----------------
+
+* Disabled SSLv3. (Issue #473)
+
+* Add ``Url.url`` property to return the composed url string. (Issue #394)
+
+* Fixed PyOpenSSL + gevent ``WantWriteError``. (Issue #412)
+
+* ``MaxRetryError.reason`` will always be an exception, not a string.
+  (Issue #481)
+
+* Fixed SSL-related timeouts not being detected as timeouts. (Issue #492)
+
+* Py3: Use ``ssl.create_default_context()`` when available. (Issue #473)
+
+* Emit ``InsecureRequestWarning`` for *every* insecure HTTPS request.
+  (Issue #496)
+
+* Emit ``SecurityWarning`` when certificate has no ``subjectAltName``.
+  (Issue #499)
+
+* Close and discard sockets which experienced SSL-related errors.
+  (Issue #501)
+
+* Handle ``body`` param in ``.request(...)``. (Issue #513)
+
+* Respect timeout with HTTPS proxy. (Issue #505)
+
+* PyOpenSSL: Handle ZeroReturnError exception. (Issue #520)
+
+
+1.9.1 (2014-09-13)
+------------------
+
+* Apply socket arguments before binding. (Issue #427)
+
+* More careful checks if fp-like object is closed. (Issue #435)
+
+* Fixed packaging issues of some development-related files not
+  getting included. (Issue #440)
+
+* Allow performing *only* fingerprint verification. (Issue #444)
+
+* Emit ``SecurityWarning`` if system clock is waaay off. (Issue #445)
+
+* Fixed PyOpenSSL compatibility with PyPy. (Issue #450)
+
+* Fixed ``BrokenPipeError`` and ``ConnectionError`` handling in Py3.
+  (Issue #443)
+
+
+
+1.9 (2014-07-04)
+----------------
+
+* Shuffled around development-related files. If you're maintaining a distro
+  package of urllib3, you may need to tweak things. (Issue #415)
+
+* Unverified HTTPS requests will trigger a warning on the first request. See
+  our new `security documentation
+  <https://urllib3.readthedocs.io/en/latest/security.html>`_ for details.
+  (Issue #426)
+
+* New retry logic and ``urllib3.util.retry.Retry`` configuration object.
+  (Issue #326)
+
+* All raised exceptions should now be wrapped in a
+  ``urllib3.exceptions.HTTPException``-extending exception. (Issue #326)
+
+* All errors during a retry-enabled request should be wrapped in
+  ``urllib3.exceptions.MaxRetryError``, including timeout-related exceptions
+  which were previously exempt. Underlying error is accessible from the
   ``.reason`` property. (Issue #326)
- 
-* ``urllib3.exceptions.ConnectionError`` renamed to 
-  ``urllib3.exceptions.ProtocolError``. (Issue #326) 
- 
-* Errors during response read (such as IncompleteRead) are now wrapped in 
-  ``urllib3.exceptions.ProtocolError``. (Issue #418) 
- 
-* Requesting an empty host will raise ``urllib3.exceptions.LocationValueError``. 
-  (Issue #417) 
- 
-* Catch read timeouts over SSL connections as 
-  ``urllib3.exceptions.ReadTimeoutError``. (Issue #419) 
- 
-* Apply socket arguments before connecting. (Issue #427) 
- 
- 
-1.8.3 (2014-06-23) 
------------------- 
- 
-* Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385) 
- 
-* Add ``disable_cache`` option to ``urllib3.util.make_headers``. (Issue #393) 
- 
-* Wrap ``socket.timeout`` exception with 
-  ``urllib3.exceptions.ReadTimeoutError``. (Issue #399) 
- 
-* Fixed proxy-related bug where connections were being reused incorrectly. 
-  (Issues #366, #369) 
- 
-* Added ``socket_options`` keyword parameter which allows to define 
-  ``setsockopt`` configuration of new sockets. (Issue #397) 
- 
-* Removed ``HTTPConnection.tcp_nodelay`` in favor of 
-  ``HTTPConnection.default_socket_options``. (Issue #397) 
- 
-* Fixed ``TypeError`` bug in Python 2.6.4. (Issue #411) 
- 
- 
-1.8.2 (2014-04-17) 
------------------- 
- 
-* Fix ``urllib3.util`` not being included in the package. 
- 
- 
-1.8.1 (2014-04-17) 
------------------- 
- 
-* Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356) 
- 
-* Don't install ``dummyserver`` into ``site-packages`` as it's only needed 
-  for the test suite. (Issue #362) 
- 
-* Added support for specifying ``source_address``. (Issue #352) 
- 
- 
-1.8 (2014-03-04) 
----------------- 
- 
-* Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in 
-  username, and blank ports like 'hostname:'). 
- 
-* New ``urllib3.connection`` module which contains all the HTTPConnection 
-  objects. 
- 
-* Several ``urllib3.util.Timeout``-related fixes. Also changed constructor 
-  signature to a more sensible order. [Backwards incompatible] 
-  (Issues #252, #262, #263) 
- 
-* Use ``backports.ssl_match_hostname`` if it's installed. (Issue #274) 
- 
-* Added ``.tell()`` method to ``urllib3.response.HTTPResponse`` which 
-  returns the number of bytes read so far. (Issue #277) 
- 
-* Support for platforms without threading. (Issue #289) 
- 
-* Expand default-port comparison in ``HTTPConnectionPool.is_same_host`` 
-  to allow a pool with no specified port to be considered equal to to an 
-  HTTP/HTTPS url with port 80/443 explicitly provided. (Issue #305) 
- 
-* Improved default SSL/TLS settings to avoid vulnerabilities. 
-  (Issue #309) 
- 
-* Fixed ``urllib3.poolmanager.ProxyManager`` not retrying on connect errors. 
-  (Issue #310) 
- 
-* Disable Nagle's Algorithm on the socket for non-proxies. A subset of requests 
-  will send the entire HTTP request ~200 milliseconds faster; however, some of 
-  the resulting TCP packets will be smaller. (Issue #254) 
- 
-* Increased maximum number of SubjectAltNames in ``urllib3.contrib.pyopenssl`` 
-  from the default 64 to 1024 in a single certificate. (Issue #318) 
- 
-* Headers are now passed and stored as a custom 
-  ``urllib3.collections_.HTTPHeaderDict`` object rather than a plain ``dict``. 
-  (Issue #329, #333) 
- 
-* Headers no longer lose their case on Python 3. (Issue #236) 
- 
-* ``urllib3.contrib.pyopenssl`` now uses the operating system's default CA 
-  certificates on inject. (Issue #332) 
- 
-* Requests with ``retries=False`` will immediately raise any exceptions without 
-  wrapping them in ``MaxRetryError``. (Issue #348) 
- 
-* Fixed open socket leak with SSL-related failures. (Issue #344, #348) 
- 
- 
-1.7.1 (2013-09-25) 
------------------- 
- 
-* Added granular timeout support with new ``urllib3.util.Timeout`` class. 
-  (Issue #231) 
- 
-* Fixed Python 3.4 support. (Issue #238) 
- 
- 
-1.7 (2013-08-14) 
----------------- 
- 
-* More exceptions are now pickle-able, with tests. (Issue #174) 
- 
-* Fixed redirecting with relative URLs in Location header. (Issue #178) 
- 
-* Support for relative urls in ``Location: ...`` header. (Issue #179) 
- 
-* ``urllib3.response.HTTPResponse`` now inherits from ``io.IOBase`` for bonus 
-  file-like functionality. (Issue #187) 
- 
-* Passing ``assert_hostname=False`` when creating a HTTPSConnectionPool will 
-  skip hostname verification for SSL connections. (Issue #194) 
- 
-* New method ``urllib3.response.HTTPResponse.stream(...)`` which acts as a 
-  generator wrapped around ``.read(...)``. (Issue #198) 
- 
-* IPv6 url parsing enforces brackets around the hostname. (Issue #199) 
- 
-* Fixed thread race condition in 
-  ``urllib3.poolmanager.PoolManager.connection_from_host(...)`` (Issue #204) 
- 
-* ``ProxyManager`` requests now include non-default port in ``Host: ...`` 
-  header. (Issue #217) 
- 
-* Added HTTPS proxy support in ``ProxyManager``. (Issue #170 #139) 
- 
-* New ``RequestField`` object can be passed to the ``fields=...`` param which 
-  can specify headers. (Issue #220) 
- 
-* Raise ``urllib3.exceptions.ProxyError`` when connecting to proxy fails. 
-  (Issue #221) 
- 
-* Use international headers when posting file names. (Issue #119) 
- 
-* Improved IPv6 support. (Issue #203) 
- 
- 
-1.6 (2013-04-25) 
----------------- 
- 
-* Contrib: Optional SNI support for Py2 using PyOpenSSL. (Issue #156) 
- 
-* ``ProxyManager`` automatically adds ``Host: ...`` header if not given. 
- 
-* Improved SSL-related code. ``cert_req`` now optionally takes a string like 
-  "REQUIRED" or "NONE". Same with ``ssl_version`` takes strings like "SSLv23" 
-  The string values reflect the suffix of the respective constant variable. 
-  (Issue #130) 
- 
-* Vendored ``socksipy`` now based on Anorov's fork which handles unexpectedly 
-  closed proxy connections and larger read buffers. (Issue #135) 
- 
-* Ensure the connection is closed if no data is received, fixes connection leak 
-  on some platforms. (Issue #133) 
- 
-* Added SNI support for SSL/TLS connections on Py32+. (Issue #89) 
- 
-* Tests fixed to be compatible with Py26 again. (Issue #125) 
- 
-* Added ability to choose SSL version by passing an ``ssl.PROTOCOL_*`` constant 
-  to the ``ssl_version`` parameter of ``HTTPSConnectionPool``. (Issue #109) 
- 
-* Allow an explicit content type to be specified when encoding file fields. 
-  (Issue #126) 
- 
-* Exceptions are now pickleable, with tests. (Issue #101) 
- 
-* Fixed default headers not getting passed in some cases. (Issue #99) 
- 
-* Treat "content-encoding" header value as case-insensitive, per RFC 2616 
-  Section 3.5. (Issue #110) 
- 
-* "Connection Refused" SocketErrors will get retried rather than raised. 
-  (Issue #92) 
- 
-* Updated vendored ``six``, no longer overrides the global ``six`` module 
-  namespace. (Issue #113) 
- 
-* ``urllib3.exceptions.MaxRetryError`` contains a ``reason`` property holding 
-  the exception that prompted the final retry. If ``reason is None`` then it 
-  was due to a redirect. (Issue #92, #114) 
- 
-* Fixed ``PoolManager.urlopen()`` from not redirecting more than once. 
-  (Issue #149) 
- 
-* Don't assume ``Content-Type: text/plain`` for multi-part encoding parameters 
-  that are not files. (Issue #111) 
- 
-* Pass `strict` param down to ``httplib.HTTPConnection``. (Issue #122) 
- 
-* Added mechanism to verify SSL certificates by fingerprint (md5, sha1) or 
-  against an arbitrary hostname (when connecting by IP or for misconfigured 
-  servers). (Issue #140) 
- 
-* Streaming decompression support. (Issue #159) 
- 
- 
-1.5 (2012-08-02) 
----------------- 
- 
-* Added ``urllib3.add_stderr_logger()`` for quickly enabling STDERR debug 
-  logging in urllib3. 
- 
-* Native full URL parsing (including auth, path, query, fragment) available in 
-  ``urllib3.util.parse_url(url)``. 
- 
-* Built-in redirect will switch method to 'GET' if status code is 303. 
-  (Issue #11) 
- 
-* ``urllib3.PoolManager`` strips the scheme and host before sending the request 
-  uri. (Issue #8) 
- 
-* New ``urllib3.exceptions.DecodeError`` exception for when automatic decoding, 
-  based on the Content-Type header, fails. 
- 
-* Fixed bug with pool depletion and leaking connections (Issue #76). Added 
-  explicit connection closing on pool eviction. Added 
-  ``urllib3.PoolManager.clear()``. 
- 
-* 99% -> 100% unit test coverage. 
- 
- 
-1.4 (2012-06-16) 
----------------- 
- 
-* Minor AppEngine-related fixes. 
- 
-* Switched from ``mimetools.choose_boundary`` to ``uuid.uuid4()``. 
- 
-* Improved url parsing. (Issue #73) 
- 
-* IPv6 url support. (Issue #72) 
- 
- 
-1.3 (2012-03-25) 
----------------- 
- 
-* Removed pre-1.0 deprecated API. 
- 
-* Refactored helpers into a ``urllib3.util`` submodule. 
- 
-* Fixed multipart encoding to support list-of-tuples for keys with multiple 
-  values. (Issue #48) 
- 
-* Fixed multiple Set-Cookie headers in response not getting merged properly in 
-  Python 3. (Issue #53) 
- 
-* AppEngine support with Py27. (Issue #61) 
- 
-* Minor ``encode_multipart_formdata`` fixes related to Python 3 strings vs 
-  bytes. 
- 
- 
-1.2.2 (2012-02-06) 
------------------- 
- 
-* Fixed packaging bug of not shipping ``test-requirements.txt``. (Issue #47) 
- 
- 
-1.2.1 (2012-02-05) 
------------------- 
- 
-* Fixed another bug related to when ``ssl`` module is not available. (Issue #41) 
- 
-* Location parsing errors now raise ``urllib3.exceptions.LocationParseError`` 
-  which inherits from ``ValueError``. 
- 
- 
-1.2 (2012-01-29) 
----------------- 
- 
-* Added Python 3 support (tested on 3.2.2) 
- 
-* Dropped Python 2.5 support (tested on 2.6.7, 2.7.2) 
- 
-* Use ``select.poll`` instead of ``select.select`` for platforms that support 
-  it. 
- 
-* Use ``Queue.LifoQueue`` instead of ``Queue.Queue`` for more aggressive 
-  connection reusing. Configurable by overriding ``ConnectionPool.QueueCls``. 
- 
-* Fixed ``ImportError`` during install when ``ssl`` module is not available. 
-  (Issue #41) 
- 
-* Fixed ``PoolManager`` redirects between schemes (such as HTTP -> HTTPS) not 
-  completing properly. (Issue #28, uncovered by Issue #10 in v1.1) 
- 
-* Ported ``dummyserver`` to use ``tornado`` instead of ``webob`` + 
-  ``eventlet``. Removed extraneous unsupported dummyserver testing backends. 
-  Added socket-level tests. 
- 
-* More tests. Achievement Unlocked: 99% Coverage. 
- 
- 
-1.1 (2012-01-07) 
----------------- 
- 
-* Refactored ``dummyserver`` to its own root namespace module (used for 
-  testing). 
- 
-* Added hostname verification for ``VerifiedHTTPSConnection`` by vendoring in 
-  Py32's ``ssl_match_hostname``. (Issue #25) 
- 
-* Fixed cross-host HTTP redirects when using ``PoolManager``. (Issue #10) 
- 
-* Fixed ``decode_content`` being ignored when set through ``urlopen``. (Issue 
-  #27) 
- 
-* Fixed timeout-related bugs. (Issues #17, #23) 
- 
- 
-1.0.2 (2011-11-04) 
------------------- 
- 
-* Fixed typo in ``VerifiedHTTPSConnection`` which would only present as a bug if 
-  you're using the object manually. (Thanks pyos) 
- 
-* Made RecentlyUsedContainer (and consequently PoolManager) more thread-safe by 
-  wrapping the access log in a mutex. (Thanks @christer) 
- 
-* Made RecentlyUsedContainer more dict-like (corrected ``__delitem__`` and 
-  ``__getitem__`` behaviour), with tests. Shouldn't affect core urllib3 code. 
- 
- 
-1.0.1 (2011-10-10) 
------------------- 
- 
-* Fixed a bug where the same connection would get returned into the pool twice, 
-  causing extraneous "HttpConnectionPool is full" log warnings. 
- 
- 
-1.0 (2011-10-08) 
----------------- 
- 
-* Added ``PoolManager`` with LRU expiration of connections (tested and 
-  documented). 
-* Added ``ProxyManager`` (needs tests, docs, and confirmation that it works 
-  with HTTPS proxies). 
-* Added optional partial-read support for responses when 
-  ``preload_content=False``. You can now make requests and just read the headers 
-  without loading the content. 
-* Made response decoding optional (default on, same as before). 
-* Added optional explicit boundary string for ``encode_multipart_formdata``. 
-* Convenience request methods are now inherited from ``RequestMethods``. Old 
-  helpers like ``get_url`` and ``post_url`` should be abandoned in favour of 
-  the new ``request(method, url, ...)``. 
-* Refactored code to be even more decoupled, reusable, and extendable. 
-* License header added to ``.py`` files. 
-* Embiggened the documentation: Lots of Sphinx-friendly docstrings in the code 
+
+* ``urllib3.exceptions.ConnectionError`` renamed to
+  ``urllib3.exceptions.ProtocolError``. (Issue #326)
+
+* Errors during response read (such as IncompleteRead) are now wrapped in
+  ``urllib3.exceptions.ProtocolError``. (Issue #418)
+
+* Requesting an empty host will raise ``urllib3.exceptions.LocationValueError``.
+  (Issue #417)
+
+* Catch read timeouts over SSL connections as
+  ``urllib3.exceptions.ReadTimeoutError``. (Issue #419)
+
+* Apply socket arguments before connecting. (Issue #427)
+
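The ``Retry`` configuration object introduced in 1.9 is constructed up front and handed to a pool or request. A minimal sketch (the status codes and counts are illustrative, and nothing here touches the network):

```python
from urllib3.util import Retry

# Up to 3 attempts, exponential backoff, retrying on these status codes.
retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503])

print(retry.total, retry.backoff_factor)
```

In use, such an object would typically be passed as ``retries=retry`` to ``PoolManager.request(...)``, replacing the bare integer retry count.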
+
+1.8.3 (2014-06-23)
+------------------
+
+* Fix TLS verification when using a proxy in Python 3.4.1. (Issue #385)
+
+* Add ``disable_cache`` option to ``urllib3.util.make_headers``. (Issue #393)
+
+* Wrap ``socket.timeout`` exception with
+  ``urllib3.exceptions.ReadTimeoutError``. (Issue #399)
+
+* Fixed proxy-related bug where connections were being reused incorrectly.
+  (Issues #366, #369)
+
+* Added ``socket_options`` keyword parameter which allows defining
+  ``setsockopt`` configuration for new sockets. (Issue #397)
+
+* Removed ``HTTPConnection.tcp_nodelay`` in favor of
+  ``HTTPConnection.default_socket_options``. (Issue #397)
+
+* Fixed ``TypeError`` bug in Python 2.6.4. (Issue #411)
+
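The ``socket_options`` parameter from 1.8.3 takes a list of ``setsockopt`` tuples. A sketch that extends the connection class's defaults with TCP keepalive; the hostname is a placeholder and no socket is opened:

```python
import socket
from urllib3 import HTTPConnectionPool

# Start from the defaults (TCP_NODELAY) and add SO_KEEPALIVE.
opts = HTTPConnectionPool.ConnectionCls.default_socket_options + [
    (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
]
pool = HTTPConnectionPool("example.com", socket_options=opts)

# The options are stored and applied to every new connection.
print(pool.conn_kw["socket_options"])
```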
+
+1.8.2 (2014-04-17)
+------------------
+
+* Fix ``urllib3.util`` not being included in the package.
+
+
+1.8.1 (2014-04-17)
+------------------
+
+* Fix AppEngine bug of HTTPS requests going out as HTTP. (Issue #356)
+
+* Don't install ``dummyserver`` into ``site-packages`` as it's only needed
+  for the test suite. (Issue #362)
+
+* Added support for specifying ``source_address``. (Issue #352)
+
+
+1.8 (2014-03-04)
+----------------
+
+* Improved url parsing in ``urllib3.util.parse_url`` (properly parse '@' in
+  username, and blank ports like 'hostname:').
+
+* New ``urllib3.connection`` module which contains all the HTTPConnection
+  objects.
+
+* Several ``urllib3.util.Timeout``-related fixes. Also changed constructor
+  signature to a more sensible order. [Backwards incompatible]
+  (Issues #252, #262, #263)
+
+* Use ``backports.ssl_match_hostname`` if it's installed. (Issue #274)
+
+* Added ``.tell()`` method to ``urllib3.response.HTTPResponse`` which
+  returns the number of bytes read so far. (Issue #277)
+
+* Support for platforms without threading. (Issue #289)
+
+* Expand default-port comparison in ``HTTPConnectionPool.is_same_host``
+  to allow a pool with no specified port to be considered equal to an
+  HTTP/HTTPS url with port 80/443 explicitly provided. (Issue #305)
+
+* Improved default SSL/TLS settings to avoid vulnerabilities.
+  (Issue #309)
+
+* Fixed ``urllib3.poolmanager.ProxyManager`` not retrying on connect errors.
+  (Issue #310)
+
+* Disable Nagle's Algorithm on the socket for non-proxies. A subset of requests
+  will send the entire HTTP request ~200 milliseconds faster; however, some of
+  the resulting TCP packets will be smaller. (Issue #254)
+
+* Increased maximum number of SubjectAltNames in ``urllib3.contrib.pyopenssl``
+  from the default 64 to 1024 in a single certificate. (Issue #318)
+
+* Headers are now passed and stored as a custom
+  ``urllib3.collections_.HTTPHeaderDict`` object rather than a plain ``dict``.
+  (Issue #329, #333)
+
+* Headers no longer lose their case on Python 3. (Issue #236)
+
+* ``urllib3.contrib.pyopenssl`` now uses the operating system's default CA
+  certificates on inject. (Issue #332)
+
+* Requests with ``retries=False`` will immediately raise any exceptions without
+  wrapping them in ``MaxRetryError``. (Issue #348)
+
+* Fixed open socket leak with SSL-related failures. (Issue #344, #348)
+
+
+1.7.1 (2013-09-25)
+------------------
+
+* Added granular timeout support with new ``urllib3.util.Timeout`` class.
+  (Issue #231)
+
+* Fixed Python 3.4 support. (Issue #238)
+
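The ``urllib3.util.Timeout`` class introduced in 1.7.1 separates the connect and read budgets. A minimal sketch; the values are illustrative seconds, and nothing is connected:

```python
from urllib3.util import Timeout

# Allow 2 seconds to establish a connection, 7 seconds per read.
timeout = Timeout(connect=2.0, read=7.0)

print(timeout.connect_timeout, timeout.read_timeout)
```

Such an object would typically be passed as ``timeout=timeout`` when constructing a pool or issuing a request, instead of a single float.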
+
+1.7 (2013-08-14)
+----------------
+
+* More exceptions are now pickle-able, with tests. (Issue #174)
+
+* Fixed redirecting with relative URLs in Location header. (Issue #178)
+
+* Support for relative urls in ``Location: ...`` header. (Issue #179)
+
+* ``urllib3.response.HTTPResponse`` now inherits from ``io.IOBase`` for bonus
+  file-like functionality. (Issue #187)
+
+* Passing ``assert_hostname=False`` when creating an ``HTTPSConnectionPool``
+  will skip hostname verification for SSL connections. (Issue #194)
+
+* New method ``urllib3.response.HTTPResponse.stream(...)`` which acts as a
+  generator wrapped around ``.read(...)``. (Issue #198)
+
+* IPv6 url parsing enforces brackets around the hostname. (Issue #199)
+
+* Fixed thread race condition in
+  ``urllib3.poolmanager.PoolManager.connection_from_host(...)`` (Issue #204)
+
+* ``ProxyManager`` requests now include non-default port in ``Host: ...``
+  header. (Issue #217)
+
+* Added HTTPS proxy support in ``ProxyManager``. (Issues #170, #139)
+
+* New ``RequestField`` object can be passed to the ``fields=...`` param which
+  can specify headers. (Issue #220)
+
+* Raise ``urllib3.exceptions.ProxyError`` when connecting to proxy fails.
+  (Issue #221)
+
+* Use international headers when posting file names. (Issue #119)
+
+* Improved IPv6 support. (Issue #203)
+
+
+1.6 (2013-04-25)
+----------------
+
+* Contrib: Optional SNI support for Py2 using PyOpenSSL. (Issue #156)
+
+* ``ProxyManager`` automatically adds ``Host: ...`` header if not given.
+
+* Improved SSL-related code. ``cert_req`` now optionally takes a string like
+  "REQUIRED" or "NONE". Similarly, ``ssl_version`` takes strings like
+  "SSLv23". The string values reflect the suffix of the respective constant.
+  (Issue #130)
+
+* Vendored ``socksipy`` now based on Anorov's fork which handles unexpectedly
+  closed proxy connections and larger read buffers. (Issue #135)
+
+* Ensure the connection is closed if no data is received, fixes connection leak
+  on some platforms. (Issue #133)
+
+* Added SNI support for SSL/TLS connections on Py32+. (Issue #89)
+
+* Tests fixed to be compatible with Py26 again. (Issue #125)
+
+* Added ability to choose SSL version by passing an ``ssl.PROTOCOL_*`` constant
+  to the ``ssl_version`` parameter of ``HTTPSConnectionPool``. (Issue #109)
+
+* Allow an explicit content type to be specified when encoding file fields.
+  (Issue #126)
+
+* Exceptions are now pickleable, with tests. (Issue #101)
+
+* Fixed default headers not getting passed in some cases. (Issue #99)
+
+* Treat "content-encoding" header value as case-insensitive, per RFC 2616
+  Section 3.5. (Issue #110)
+
+* "Connection Refused" SocketErrors will get retried rather than raised.
+  (Issue #92)
+
+* Updated vendored ``six``; it no longer overrides the global ``six`` module
+  namespace. (Issue #113)
+
+* ``urllib3.exceptions.MaxRetryError`` contains a ``reason`` property holding
+  the exception that prompted the final retry. If ``reason is None`` then it
+  was due to a redirect. (Issue #92, #114)
+
+* Fixed ``PoolManager.urlopen()`` not redirecting more than once.
+  (Issue #149)
+
+* Don't assume ``Content-Type: text/plain`` for multi-part encoding parameters
+  that are not files. (Issue #111)
+
+* Pass `strict` param down to ``httplib.HTTPConnection``. (Issue #122)
+
+* Added mechanism to verify SSL certificates by fingerprint (md5, sha1) or
+  against an arbitrary hostname (when connecting by IP or for misconfigured
+  servers). (Issue #140)
+
+* Streaming decompression support. (Issue #159)
+
+
+1.5 (2012-08-02)
+----------------
+
+* Added ``urllib3.add_stderr_logger()`` for quickly enabling STDERR debug
+  logging in urllib3.
+
+* Native full URL parsing (including auth, path, query, fragment) available in
+  ``urllib3.util.parse_url(url)``.
+
+* Built-in redirect will switch method to 'GET' if status code is 303.
+  (Issue #11)
+
+* ``urllib3.PoolManager`` strips the scheme and host before sending the request
+  uri. (Issue #8)
+
+* New ``urllib3.exceptions.DecodeError`` exception for when automatic decoding,
+  based on the Content-Type header, fails.
+
+* Fixed bug with pool depletion and leaking connections (Issue #76). Added
+  explicit connection closing on pool eviction. Added
+  ``urllib3.PoolManager.clear()``.
+
+* 99% -> 100% unit test coverage.
+
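The full URL parsing added in 1.5 is a pure helper with no network involved. A small sketch of ``urllib3.util.parse_url`` and the fields it exposes:

```python
from urllib3.util import parse_url

# Returns a Url namedtuple with scheme, auth, host, port, path,
# query, and fragment components.
url = parse_url("https://user@example.com:8443/path?q=1#frag")
print(url.scheme, url.host, url.port, url.path)
```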
+
+1.4 (2012-06-16)
+----------------
+
+* Minor AppEngine-related fixes.
+
+* Switched from ``mimetools.choose_boundary`` to ``uuid.uuid4()``.
+
+* Improved url parsing. (Issue #73)
+
+* IPv6 url support. (Issue #72)
+
+
+1.3 (2012-03-25)
+----------------
+
+* Removed pre-1.0 deprecated API.
+
+* Refactored helpers into a ``urllib3.util`` submodule.
+
+* Fixed multipart encoding to support list-of-tuples for keys with multiple
+  values. (Issue #48)
+
+* Fixed multiple Set-Cookie headers in response not getting merged properly in
+  Python 3. (Issue #53)
+
+* AppEngine support with Py27. (Issue #61)
+
+* Minor ``encode_multipart_formdata`` fixes related to Python 3 strings vs
+  bytes.
+
+
+1.2.2 (2012-02-06)
+------------------
+
+* Fixed packaging bug of not shipping ``test-requirements.txt``. (Issue #47)
+
+
+1.2.1 (2012-02-05)
+------------------
+
+* Fixed another bug related to when ``ssl`` module is not available. (Issue #41)
+
+* Location parsing errors now raise ``urllib3.exceptions.LocationParseError``
+  which inherits from ``ValueError``.
+
+
+1.2 (2012-01-29)
+----------------
+
+* Added Python 3 support (tested on 3.2.2)
+
+* Dropped Python 2.5 support (tested on 2.6.7, 2.7.2)
+
+* Use ``select.poll`` instead of ``select.select`` for platforms that support
+  it.
+
+* Use ``Queue.LifoQueue`` instead of ``Queue.Queue`` for more aggressive
+  connection reusing. Configurable by overriding ``ConnectionPool.QueueCls``.
+
+* Fixed ``ImportError`` during install when ``ssl`` module is not available.
+  (Issue #41)
+
+* Fixed ``PoolManager`` redirects between schemes (such as HTTP -> HTTPS) not
+  completing properly. (Issue #28, uncovered by Issue #10 in v1.1)
+
+* Ported ``dummyserver`` to use ``tornado`` instead of ``webob`` +
+  ``eventlet``. Removed extraneous unsupported dummyserver testing backends.
+  Added socket-level tests.
+
+* More tests. Achievement Unlocked: 99% Coverage.
+
+
+1.1 (2012-01-07)
+----------------
+
+* Refactored ``dummyserver`` to its own root namespace module (used for
+  testing).
+
+* Added hostname verification for ``VerifiedHTTPSConnection`` by vendoring in
+  Py32's ``ssl_match_hostname``. (Issue #25)
+
+* Fixed cross-host HTTP redirects when using ``PoolManager``. (Issue #10)
+
+* Fixed ``decode_content`` being ignored when set through ``urlopen``. (Issue
+  #27)
+
+* Fixed timeout-related bugs. (Issues #17, #23)
+
+
+1.0.2 (2011-11-04)
+------------------
+
+* Fixed typo in ``VerifiedHTTPSConnection`` which would only present as a bug if
+  you're using the object manually. (Thanks pyos)
+
+* Made RecentlyUsedContainer (and consequently PoolManager) more thread-safe by
+  wrapping the access log in a mutex. (Thanks @christer)
+
+* Made RecentlyUsedContainer more dict-like (corrected ``__delitem__`` and
+  ``__getitem__`` behaviour), with tests. Shouldn't affect core urllib3 code.
+
+
+1.0.1 (2011-10-10)
+------------------
+
+* Fixed a bug where the same connection would get returned into the pool twice,
+  causing extraneous "HttpConnectionPool is full" log warnings.
+
+
+1.0 (2011-10-08)
+----------------
+
+* Added ``PoolManager`` with LRU expiration of connections (tested and
+  documented).
+* Added ``ProxyManager`` (needs tests, docs, and confirmation that it works
+  with HTTPS proxies).
+* Added optional partial-read support for responses when
+  ``preload_content=False``. You can now make requests and just read the headers
+  without loading the content.
+* Made response decoding optional (default on, same as before).
+* Added optional explicit boundary string for ``encode_multipart_formdata``.
+* Convenience request methods are now inherited from ``RequestMethods``. Old
+  helpers like ``get_url`` and ``post_url`` should be abandoned in favour of
+  the new ``request(method, url, ...)``.
+* Refactored code to be even more decoupled, reusable, and extendable.
+* License header added to ``.py`` files.
+* Embiggened the documentation: Lots of Sphinx-friendly docstrings in the code
   and docs in ``docs/`` and on https://urllib3.readthedocs.io/.
-* Embettered all the things! 
-* Started writing this file. 
- 
- 
-0.4.1 (2011-07-17) 
------------------- 
- 
-* Minor bug fixes, code cleanup. 
- 
- 
-0.4 (2011-03-01) 
----------------- 
- 
-* Better unicode support. 
-* Added ``VerifiedHTTPSConnection``. 
-* Added ``NTLMConnectionPool`` in contrib. 
-* Minor improvements. 
- 
- 
-0.3.1 (2010-07-13) 
------------------- 
- 
-* Added ``assert_host_name`` optional parameter. Now compatible with proxies. 
- 
- 
-0.3 (2009-12-10) 
----------------- 
- 
-* Added HTTPS support. 
-* Minor bug fixes. 
-* Refactored, broken backwards compatibility with 0.2. 
-* API to be treated as stable from this version forward. 
- 
- 
-0.2 (2008-11-17) 
----------------- 
- 
-* Added unit tests. 
-* Bug fixes. 
- 
- 
-0.1 (2008-11-16) 
----------------- 
- 
-* First release. 
- 
- 
+* Embettered all the things!
+* Started writing this file.
+
+
+0.4.1 (2011-07-17)
+------------------
+
+* Minor bug fixes, code cleanup.
+
+
+0.4 (2011-03-01)
+----------------
+
+* Better unicode support.
+* Added ``VerifiedHTTPSConnection``.
+* Added ``NTLMConnectionPool`` in contrib.
+* Minor improvements.
+
+
+0.3.1 (2010-07-13)
+------------------
+
+* Added ``assert_host_name`` optional parameter. Now compatible with proxies.
+
+
+0.3 (2009-12-10)
+----------------
+
+* Added HTTPS support.
+* Minor bug fixes.
+* Refactored, breaking backwards compatibility with 0.2.
+* API to be treated as stable from this version forward.
+
+
+0.2 (2008-11-17)
+----------------
+
+* Added unit tests.
+* Bug fixes.
+
+
+0.1 (2008-11-16)
+----------------
+
+* First release.
+
+

--- a/contrib/python/urllib3/.dist-info/top_level.txt
+++ b/contrib/python/urllib3/.dist-info/top_level.txt
@@ -1 +1 @@
-urllib3 
+urllib3

--- a/contrib/python/urllib3/urllib3/__init__.py
+++ b/contrib/python/urllib3/urllib3/__init__.py
@@ -1,29 +1,29 @@
-""" 
+"""
 Python HTTP library with thread-safe connection pooling, file post support, user friendly, and more
-""" 
-from __future__ import absolute_import 
+"""
+from __future__ import absolute_import
 
 # Set default logging handler to avoid "No handler found" warnings.
 import logging
-import warnings 
+import warnings
 from logging import NullHandler
- 
+
 from . import exceptions
 from ._version import __version__
 from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool, connection_from_url
-from .filepost import encode_multipart_formdata 
-from .poolmanager import PoolManager, ProxyManager, proxy_from_url 
-from .response import HTTPResponse 
-from .util.request import make_headers 
+from .filepost import encode_multipart_formdata
+from .poolmanager import PoolManager, ProxyManager, proxy_from_url
+from .response import HTTPResponse
+from .util.request import make_headers
 from .util.retry import Retry
 from .util.timeout import Timeout
 from .util.url import get_host
- 
+
 __author__ = "Andrey Petrov (andrey.petrov@shazow.net)"
 __license__ = "MIT"
 __version__ = __version__
- 
-__all__ = ( 
+
+__all__ = (
     "HTTPConnectionPool",
     "HTTPSConnectionPool",
     "PoolManager",
@@ -38,48 +38,48 @@ __all__ = (
     "get_host",
     "make_headers",
     "proxy_from_url",
-) 
- 
-logging.getLogger(__name__).addHandler(NullHandler()) 
- 
- 
-def add_stderr_logger(level=logging.DEBUG): 
-    """ 
-    Helper for quickly adding a StreamHandler to the logger. Useful for 
-    debugging. 
- 
-    Returns the handler after adding it. 
-    """ 
-    # This method needs to be in this __init__.py to get the __name__ correct 
-    # even if urllib3 is vendored within another package. 
-    logger = logging.getLogger(__name__) 
-    handler = logging.StreamHandler() 
+)
+
+logging.getLogger(__name__).addHandler(NullHandler())
+
+
+def add_stderr_logger(level=logging.DEBUG):
+    """
+    Helper for quickly adding a StreamHandler to the logger. Useful for
+    debugging.
+
+    Returns the handler after adding it.
+    """
+    # This method needs to be in this __init__.py to get the __name__ correct
+    # even if urllib3 is vendored within another package.
+    logger = logging.getLogger(__name__)
+    handler = logging.StreamHandler()
     handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
-    logger.addHandler(handler) 
-    logger.setLevel(level) 
+    logger.addHandler(handler)
+    logger.setLevel(level)
     logger.debug("Added a stderr logging handler to logger: %s", __name__)
-    return handler 
- 
- 
-# ... Clean up. 
-del NullHandler 
- 
- 
-# All warning filters *must* be appended unless you're really certain that they 
-# shouldn't be: otherwise, it's very hard for users to use most Python 
-# mechanisms to silence them. 
-# SecurityWarning's always go off by default. 
+    return handler
+
+
+# ... Clean up.
+del NullHandler
+
+
+# All warning filters *must* be appended unless you're really certain that they
+# shouldn't be: otherwise, it's very hard for users to use most Python
+# mechanisms to silence them.
+# SecurityWarning's always go off by default.
 warnings.simplefilter("always", exceptions.SecurityWarning, append=True)
-# SubjectAltNameWarning's should go off once per host 
+# SubjectAltNameWarning's should go off once per host
 warnings.simplefilter("default", exceptions.SubjectAltNameWarning, append=True)
-# InsecurePlatformWarning's don't vary between requests, so we keep it default. 
+# InsecurePlatformWarning's don't vary between requests, so we keep it default.
 warnings.simplefilter("default", exceptions.InsecurePlatformWarning, append=True)
-# SNIMissingWarnings should go off only once. 
+# SNIMissingWarnings should go off only once.
 warnings.simplefilter("default", exceptions.SNIMissingWarning, append=True)
- 
- 
-def disable_warnings(category=exceptions.HTTPWarning): 
-    """ 
-    Helper for quickly disabling all urllib3 warnings. 
-    """ 
+
+
+def disable_warnings(category=exceptions.HTTPWarning):
+    """
+    Helper for quickly disabling all urllib3 warnings.
+    """
     warnings.simplefilter("ignore", category)
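The `__init__.py` hunk above defines `add_stderr_logger` for quick debugging. A minimal stdlib sketch of the same pattern follows; note the logger name is passed explicitly here for illustration, whereas the real helper derives it from `__name__` so it keeps working when urllib3 is vendored inside another package:

```python
import logging

def add_stderr_logger(name="urllib3", level=logging.DEBUG):
    """Attach a StreamHandler to the named logger and return it.

    Mirrors the helper shown in the diff above; returning the handler
    lets the caller detach it again later.
    """
    logger = logging.getLogger(name)
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(level)
    return handler

# Enable debug output, then clean up by removing the returned handler.
handler = add_stderr_logger()
assert handler in logging.getLogger("urllib3").handlers
logging.getLogger("urllib3").removeHandler(handler)
```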

+ 288 - 288
contrib/python/urllib3/urllib3/_collections.py

@@ -1,323 +1,323 @@
-from __future__ import absolute_import 
+from __future__ import absolute_import
 
-try: 
+try:
     from collections.abc import Mapping, MutableMapping
 except ImportError:
     from collections import Mapping, MutableMapping
 try:
-    from threading import RLock 
-except ImportError:  # Platform-specific: No threads available 
-
-    class RLock: 
-        def __enter__(self): 
-            pass 
- 
-        def __exit__(self, exc_type, exc_value, traceback): 
-            pass 
- 
- 
+    from threading import RLock
+except ImportError:  # Platform-specific: No threads available
+
+    class RLock:
+        def __enter__(self):
+            pass
+
+        def __exit__(self, exc_type, exc_value, traceback):
+            pass
+
+
 from collections import OrderedDict
 
 from .exceptions import InvalidHeader
 from .packages import six
 from .packages.six import iterkeys, itervalues
- 
+
 __all__ = ["RecentlyUsedContainer", "HTTPHeaderDict"]
- 
- 
-_Null = object() 
- 
- 
-class RecentlyUsedContainer(MutableMapping): 
-    """ 
-    Provides a thread-safe dict-like container which maintains up to 
-    ``maxsize`` keys while throwing away the least-recently-used keys beyond 
-    ``maxsize``. 
- 
-    :param maxsize: 
-        Maximum number of recent elements to retain. 
- 
-    :param dispose_func: 
-        Every time an item is evicted from the container, 
-        ``dispose_func(value)`` is called with the evicted value. 
-    """ 
- 
-    ContainerCls = OrderedDict 
- 
-    def __init__(self, maxsize=10, dispose_func=None): 
-        self._maxsize = maxsize 
-        self.dispose_func = dispose_func 
- 
-        self._container = self.ContainerCls() 
-        self.lock = RLock() 
- 
-    def __getitem__(self, key): 
-        # Re-insert the item, moving it to the end of the eviction line. 
-        with self.lock: 
-            item = self._container.pop(key) 
-            self._container[key] = item 
-            return item 
- 
-    def __setitem__(self, key, value): 
-        evicted_value = _Null 
-        with self.lock: 
-            # Possibly evict the existing value of 'key' 
-            evicted_value = self._container.get(key, _Null) 
-            self._container[key] = value 
- 
-            # If we didn't evict an existing value, we might have to evict the 
-            # least recently used item from the beginning of the container. 
-            if len(self._container) > self._maxsize: 
-                _key, evicted_value = self._container.popitem(last=False) 
- 
-        if self.dispose_func and evicted_value is not _Null: 
-            self.dispose_func(evicted_value) 
- 
-    def __delitem__(self, key): 
-        with self.lock: 
-            value = self._container.pop(key) 
- 
-        if self.dispose_func: 
-            self.dispose_func(value) 
- 
-    def __len__(self): 
-        with self.lock: 
-            return len(self._container) 
- 
-    def __iter__(self): 
+
+
+_Null = object()
+
+
+class RecentlyUsedContainer(MutableMapping):
+    """
+    Provides a thread-safe dict-like container which maintains up to
+    ``maxsize`` keys while throwing away the least-recently-used keys beyond
+    ``maxsize``.
+
+    :param maxsize:
+        Maximum number of recent elements to retain.
+
+    :param dispose_func:
+        Every time an item is evicted from the container,
+        ``dispose_func(value)`` is called with the evicted value.
+    """
+
+    ContainerCls = OrderedDict
+
+    def __init__(self, maxsize=10, dispose_func=None):
+        self._maxsize = maxsize
+        self.dispose_func = dispose_func
+
+        self._container = self.ContainerCls()
+        self.lock = RLock()
+
+    def __getitem__(self, key):
+        # Re-insert the item, moving it to the end of the eviction line.
+        with self.lock:
+            item = self._container.pop(key)
+            self._container[key] = item
+            return item
+
+    def __setitem__(self, key, value):
+        evicted_value = _Null
+        with self.lock:
+            # Possibly evict the existing value of 'key'
+            evicted_value = self._container.get(key, _Null)
+            self._container[key] = value
+
+            # If we didn't evict an existing value, we might have to evict the
+            # least recently used item from the beginning of the container.
+            if len(self._container) > self._maxsize:
+                _key, evicted_value = self._container.popitem(last=False)
+
+        if self.dispose_func and evicted_value is not _Null:
+            self.dispose_func(evicted_value)
+
+    def __delitem__(self, key):
+        with self.lock:
+            value = self._container.pop(key)
+
+        if self.dispose_func:
+            self.dispose_func(value)
+
+    def __len__(self):
+        with self.lock:
+            return len(self._container)
+
+    def __iter__(self):
         raise NotImplementedError(
             "Iteration over this class is unlikely to be threadsafe."
         )
- 
-    def clear(self): 
-        with self.lock: 
-            # Copy pointers to all values, then wipe the mapping 
-            values = list(itervalues(self._container)) 
-            self._container.clear() 
- 
-        if self.dispose_func: 
-            for value in values: 
-                self.dispose_func(value) 
- 
-    def keys(self): 
-        with self.lock: 
-            return list(iterkeys(self._container)) 
- 
- 
-class HTTPHeaderDict(MutableMapping): 
-    """ 
-    :param headers: 
-        An iterable of field-value pairs. Must not contain multiple field names 
-        when compared case-insensitively. 
- 
-    :param kwargs: 
-        Additional field-value pairs to pass in to ``dict.update``. 
- 
-    A ``dict`` like container for storing HTTP Headers. 
- 
-    Field names are stored and compared case-insensitively in compliance with 
-    RFC 7230. Iteration provides the first case-sensitive key seen for each 
-    case-insensitive pair. 
- 
-    Using ``__setitem__`` syntax overwrites fields that compare equal 
-    case-insensitively in order to maintain ``dict``'s api. For fields that 
-    compare equal, instead create a new ``HTTPHeaderDict`` and use ``.add`` 
-    in a loop. 
- 
-    If multiple fields that are equal case-insensitively are passed to the 
-    constructor or ``.update``, the behavior is undefined and some will be 
-    lost. 
- 
-    >>> headers = HTTPHeaderDict() 
-    >>> headers.add('Set-Cookie', 'foo=bar') 
-    >>> headers.add('set-cookie', 'baz=quxx') 
-    >>> headers['content-length'] = '7' 
-    >>> headers['SET-cookie'] 
-    'foo=bar, baz=quxx' 
-    >>> headers['Content-Length'] 
-    '7' 
-    """ 
- 
-    def __init__(self, headers=None, **kwargs): 
-        super(HTTPHeaderDict, self).__init__() 
-        self._container = OrderedDict() 
-        if headers is not None: 
-            if isinstance(headers, HTTPHeaderDict): 
-                self._copy_from(headers) 
-            else: 
-                self.extend(headers) 
-        if kwargs: 
-            self.extend(kwargs) 
- 
-    def __setitem__(self, key, val): 
-        self._container[key.lower()] = [key, val] 
-        return self._container[key.lower()] 
- 
-    def __getitem__(self, key): 
-        val = self._container[key.lower()] 
+
+    def clear(self):
+        with self.lock:
+            # Copy pointers to all values, then wipe the mapping
+            values = list(itervalues(self._container))
+            self._container.clear()
+
+        if self.dispose_func:
+            for value in values:
+                self.dispose_func(value)
+
+    def keys(self):
+        with self.lock:
+            return list(iterkeys(self._container))
+
+
+class HTTPHeaderDict(MutableMapping):
+    """
+    :param headers:
+        An iterable of field-value pairs. Must not contain multiple field names
+        when compared case-insensitively.
+
+    :param kwargs:
+        Additional field-value pairs to pass in to ``dict.update``.
+
+    A ``dict`` like container for storing HTTP Headers.
+
+    Field names are stored and compared case-insensitively in compliance with
+    RFC 7230. Iteration provides the first case-sensitive key seen for each
+    case-insensitive pair.
+
+    Using ``__setitem__`` syntax overwrites fields that compare equal
+    case-insensitively in order to maintain ``dict``'s api. For fields that
+    compare equal, instead create a new ``HTTPHeaderDict`` and use ``.add``
+    in a loop.
+
+    If multiple fields that are equal case-insensitively are passed to the
+    constructor or ``.update``, the behavior is undefined and some will be
+    lost.
+
+    >>> headers = HTTPHeaderDict()
+    >>> headers.add('Set-Cookie', 'foo=bar')
+    >>> headers.add('set-cookie', 'baz=quxx')
+    >>> headers['content-length'] = '7'
+    >>> headers['SET-cookie']
+    'foo=bar, baz=quxx'
+    >>> headers['Content-Length']
+    '7'
+    """
+
+    def __init__(self, headers=None, **kwargs):
+        super(HTTPHeaderDict, self).__init__()
+        self._container = OrderedDict()
+        if headers is not None:
+            if isinstance(headers, HTTPHeaderDict):
+                self._copy_from(headers)
+            else:
+                self.extend(headers)
+        if kwargs:
+            self.extend(kwargs)
+
+    def __setitem__(self, key, val):
+        self._container[key.lower()] = [key, val]
+        return self._container[key.lower()]
+
+    def __getitem__(self, key):
+        val = self._container[key.lower()]
         return ", ".join(val[1:])
- 
-    def __delitem__(self, key): 
-        del self._container[key.lower()] 
- 
-    def __contains__(self, key): 
-        return key.lower() in self._container 
- 
-    def __eq__(self, other): 
+
+    def __delitem__(self, key):
+        del self._container[key.lower()]
+
+    def __contains__(self, key):
+        return key.lower() in self._container
+
+    def __eq__(self, other):
         if not isinstance(other, Mapping) and not hasattr(other, "keys"):
-            return False 
-        if not isinstance(other, type(self)): 
-            other = type(self)(other) 
+            return False
+        if not isinstance(other, type(self)):
+            other = type(self)(other)
         return dict((k.lower(), v) for k, v in self.itermerged()) == dict(
             (k.lower(), v) for k, v in other.itermerged()
         )
- 
-    def __ne__(self, other): 
-        return not self.__eq__(other) 
- 
+
+    def __ne__(self, other):
+        return not self.__eq__(other)
+
     if six.PY2:  # Python 2
-        iterkeys = MutableMapping.iterkeys 
-        itervalues = MutableMapping.itervalues 
- 
-    __marker = object() 
- 
-    def __len__(self): 
-        return len(self._container) 
- 
-    def __iter__(self): 
-        # Only provide the originally cased names 
-        for vals in self._container.values(): 
-            yield vals[0] 
- 
-    def pop(self, key, default=__marker): 
+        iterkeys = MutableMapping.iterkeys
+        itervalues = MutableMapping.itervalues
+
+    __marker = object()
+
+    def __len__(self):
+        return len(self._container)
+
+    def __iter__(self):
+        # Only provide the originally cased names
+        for vals in self._container.values():
+            yield vals[0]
+
+    def pop(self, key, default=__marker):
         """D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
         If key is not found, d is returned if given, otherwise KeyError is raised.
         """
-        # Using the MutableMapping function directly fails due to the private marker. 
-        # Using ordinary dict.pop would expose the internal structures. 
-        # So let's reinvent the wheel. 
-        try: 
-            value = self[key] 
-        except KeyError: 
-            if default is self.__marker: 
-                raise 
-            return default 
-        else: 
-            del self[key] 
-            return value 
- 
-    def discard(self, key): 
-        try: 
-            del self[key] 
-        except KeyError: 
-            pass 
- 
-    def add(self, key, val): 
-        """Adds a (name, value) pair, doesn't overwrite the value if it already 
-        exists. 
- 
-        >>> headers = HTTPHeaderDict(foo='bar') 
-        >>> headers.add('Foo', 'baz') 
-        >>> headers['foo'] 
-        'bar, baz' 
-        """ 
-        key_lower = key.lower() 
-        new_vals = [key, val] 
-        # Keep the common case aka no item present as fast as possible 
-        vals = self._container.setdefault(key_lower, new_vals) 
-        if new_vals is not vals: 
-            vals.append(val) 
- 
-    def extend(self, *args, **kwargs): 
-        """Generic import function for any type of header-like object. 
-        Adapted version of MutableMapping.update in order to insert items 
-        with self.add instead of self.__setitem__ 
-        """ 
-        if len(args) > 1: 
+        # Using the MutableMapping function directly fails due to the private marker.
+        # Using ordinary dict.pop would expose the internal structures.
+        # So let's reinvent the wheel.
+        try:
+            value = self[key]
+        except KeyError:
+            if default is self.__marker:
+                raise
+            return default
+        else:
+            del self[key]
+            return value
+
+    def discard(self, key):
+        try:
+            del self[key]
+        except KeyError:
+            pass
+
+    def add(self, key, val):
+        """Adds a (name, value) pair, doesn't overwrite the value if it already
+        exists.
+
+        >>> headers = HTTPHeaderDict(foo='bar')
+        >>> headers.add('Foo', 'baz')
+        >>> headers['foo']
+        'bar, baz'
+        """
+        key_lower = key.lower()
+        new_vals = [key, val]
+        # Keep the common case aka no item present as fast as possible
+        vals = self._container.setdefault(key_lower, new_vals)
+        if new_vals is not vals:
+            vals.append(val)
+
+    def extend(self, *args, **kwargs):
+        """Generic import function for any type of header-like object.
+        Adapted version of MutableMapping.update in order to insert items
+        with self.add instead of self.__setitem__
+        """
+        if len(args) > 1:
             raise TypeError(
                 "extend() takes at most 1 positional "
                 "arguments ({0} given)".format(len(args))
             )
-        other = args[0] if len(args) >= 1 else () 
- 
-        if isinstance(other, HTTPHeaderDict): 
-            for key, val in other.iteritems(): 
-                self.add(key, val) 
-        elif isinstance(other, Mapping): 
-            for key in other: 
-                self.add(key, other[key]) 
-        elif hasattr(other, "keys"): 
-            for key in other.keys(): 
-                self.add(key, other[key]) 
-        else: 
-            for key, value in other: 
-                self.add(key, value) 
- 
-        for key, value in kwargs.items(): 
-            self.add(key, value) 
- 
+        other = args[0] if len(args) >= 1 else ()
+
+        if isinstance(other, HTTPHeaderDict):
+            for key, val in other.iteritems():
+                self.add(key, val)
+        elif isinstance(other, Mapping):
+            for key in other:
+                self.add(key, other[key])
+        elif hasattr(other, "keys"):
+            for key in other.keys():
+                self.add(key, other[key])
+        else:
+            for key, value in other:
+                self.add(key, value)
+
+        for key, value in kwargs.items():
+            self.add(key, value)
+
     def getlist(self, key, default=__marker):
-        """Returns a list of all the values for the named field. Returns an 
-        empty list if the key doesn't exist.""" 
-        try: 
-            vals = self._container[key.lower()] 
-        except KeyError: 
+        """Returns a list of all the values for the named field. Returns an
+        empty list if the key doesn't exist."""
+        try:
+            vals = self._container[key.lower()]
+        except KeyError:
             if default is self.__marker:
                 return []
             return default
-        else: 
-            return vals[1:] 
- 
-    # Backwards compatibility for httplib 
-    getheaders = getlist 
-    getallmatchingheaders = getlist 
-    iget = getlist 
- 
+        else:
+            return vals[1:]
+
+    # Backwards compatibility for httplib
+    getheaders = getlist
+    getallmatchingheaders = getlist
+    iget = getlist
+
     # Backwards compatibility for http.cookiejar
     get_all = getlist
 
-    def __repr__(self): 
-        return "%s(%s)" % (type(self).__name__, dict(self.itermerged())) 
- 
-    def _copy_from(self, other): 
-        for key in other: 
-            val = other.getlist(key) 
-            if isinstance(val, list): 
-                # Don't need to convert tuples 
-                val = list(val) 
-            self._container[key.lower()] = [key] + val 
- 
-    def copy(self): 
-        clone = type(self)() 
-        clone._copy_from(self) 
-        return clone 
- 
-    def iteritems(self): 
-        """Iterate over all header lines, including duplicate ones.""" 
-        for key in self: 
-            vals = self._container[key.lower()] 
-            for val in vals[1:]: 
-                yield vals[0], val 
- 
-    def itermerged(self): 
-        """Iterate over all headers, merging duplicate ones together.""" 
-        for key in self: 
-            val = self._container[key.lower()] 
+    def __repr__(self):
+        return "%s(%s)" % (type(self).__name__, dict(self.itermerged()))
+
+    def _copy_from(self, other):
+        for key in other:
+            val = other.getlist(key)
+            if isinstance(val, list):
+                # Don't need to convert tuples
+                val = list(val)
+            self._container[key.lower()] = [key] + val
+
+    def copy(self):
+        clone = type(self)()
+        clone._copy_from(self)
+        return clone
+
+    def iteritems(self):
+        """Iterate over all header lines, including duplicate ones."""
+        for key in self:
+            vals = self._container[key.lower()]
+            for val in vals[1:]:
+                yield vals[0], val
+
+    def itermerged(self):
+        """Iterate over all headers, merging duplicate ones together."""
+        for key in self:
+            val = self._container[key.lower()]
             yield val[0], ", ".join(val[1:])
- 
-    def items(self): 
-        return list(self.iteritems()) 
- 
-    @classmethod 
-    def from_httplib(cls, message):  # Python 2 
-        """Read headers from a Python 2 httplib message object.""" 
-        # python2.7 does not expose a proper API for exporting multiheaders 
-        # efficiently. This function re-reads raw lines from the message 
-        # object and extracts the multiheaders properly. 
+
+    def items(self):
+        return list(self.iteritems())
+
+    @classmethod
+    def from_httplib(cls, message):  # Python 2
+        """Read headers from a Python 2 httplib message object."""
+        # python2.7 does not expose a proper API for exporting multiheaders
+        # efficiently. This function re-reads raw lines from the message
+        # object and extracts the multiheaders properly.
         obs_fold_continued_leaders = (" ", "\t")
-        headers = [] 
- 
-        for line in message.headers: 
+        headers = []
+
+        for line in message.headers:
             if line.startswith(obs_fold_continued_leaders):
                 if not headers:
                     # We received a header line that starts with OWS as described
@@ -330,8 +330,8 @@ class HTTPHeaderDict(MutableMapping):
                     key, value = headers[-1]
                     headers[-1] = (key, value + " " + line.strip())
                     continue
- 
+
             key, value = line.split(":", 1)
-            headers.append((key, value.strip())) 
- 
-        return cls(headers) 
+            headers.append((key, value.strip()))
+
+        return cls(headers)
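The `_collections.py` hunk above implements `RecentlyUsedContainer` on top of `OrderedDict`: reads re-insert the key at the end of the eviction line, and writes pop from the front once `maxsize` is exceeded. The core logic can be sketched without the lock as follows (`TinyLRU` is an illustrative name, not part of urllib3):

```python
from collections import OrderedDict

class TinyLRU:
    """Non-thread-safe sketch of RecentlyUsedContainer's eviction logic."""

    def __init__(self, maxsize=10, dispose_func=None):
        self._maxsize = maxsize
        self._dispose = dispose_func
        self._data = OrderedDict()

    def __getitem__(self, key):
        # Re-insert the item, moving it to the end of the eviction line.
        value = self._data.pop(key)
        self._data[key] = value
        return value

    def __setitem__(self, key, value):
        self._data[key] = value
        if len(self._data) > self._maxsize:
            # Evict the least recently used entry from the front.
            _, evicted = self._data.popitem(last=False)
            if self._dispose:
                self._dispose(evicted)

evicted = []
lru = TinyLRU(maxsize=2, dispose_func=evicted.append)
lru["a"] = 1
lru["b"] = 2
lru["a"]        # touching "a" makes "b" the least recently used key
lru["c"] = 3    # exceeds maxsize, so "b" is evicted and disposed
assert evicted == [2]
```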

+ 203 - 203
contrib/python/urllib3/urllib3/connection.py

@@ -1,39 +1,39 @@
-from __future__ import absolute_import 
+from __future__ import absolute_import
 
-import datetime 
-import logging 
-import os 
+import datetime
+import logging
+import os
 import re
-import socket 
-import warnings 
+import socket
+import warnings
 from socket import error as SocketError
 from socket import timeout as SocketTimeout
 
-from .packages import six 
-from .packages.six.moves.http_client import HTTPConnection as _HTTPConnection 
-from .packages.six.moves.http_client import HTTPException  # noqa: F401 
+from .packages import six
+from .packages.six.moves.http_client import HTTPConnection as _HTTPConnection
+from .packages.six.moves.http_client import HTTPException  # noqa: F401
 from .util.proxy import create_proxy_ssl_context
- 
-try:  # Compiled with SSL? 
-    import ssl 
-
-    BaseSSLError = ssl.SSLError 
-except (ImportError, AttributeError):  # Platform-specific: No SSL. 
-    ssl = None 
- 
-    class BaseSSLError(BaseException): 
-        pass 
- 
- 
+
+try:  # Compiled with SSL?
+    import ssl
+
+    BaseSSLError = ssl.SSLError
+except (ImportError, AttributeError):  # Platform-specific: No SSL.
+    ssl = None
+
+    class BaseSSLError(BaseException):
+        pass
+
+
 try:
     # Python 3: not a no-op, we're adding this to the namespace so it can be imported.
-    ConnectionError = ConnectionError 
+    ConnectionError = ConnectionError
 except NameError:
     # Python 2
-    class ConnectionError(Exception): 
-        pass 
- 
- 
+    class ConnectionError(Exception):
+        pass
+
+
 try:  # Python 3:
     # Not a no-op, we're adding this to the namespace so it can be imported.
     BrokenPipeError = BrokenPipeError
@@ -45,90 +45,90 @@ except NameError:  # Python 2:
 
 from ._collections import HTTPHeaderDict  # noqa (historical, removed in v2)
 from ._version import __version__
-from .exceptions import ( 
+from .exceptions import (
     ConnectTimeoutError,
     NewConnectionError,
-    SubjectAltNameWarning, 
-    SystemTimeWarning, 
-) 
+    SubjectAltNameWarning,
+    SystemTimeWarning,
+)
 from .util import SKIP_HEADER, SKIPPABLE_HEADERS, connection
-from .util.ssl_ import ( 
+from .util.ssl_ import (
     assert_fingerprint,
     create_urllib3_context,
     is_ipaddress,
     resolve_cert_reqs,
     resolve_ssl_version,
     ssl_wrap_socket,
-) 
+)
 from .util.ssl_match_hostname import CertificateError, match_hostname
- 
-log = logging.getLogger(__name__) 
- 
+
+log = logging.getLogger(__name__)
+
 port_by_scheme = {"http": 80, "https": 443}
- 
+
 # When it comes time to update this value as a part of regular maintenance
 # (ie test_recent_date is failing) update it to ~6 months before the current date.
 RECENT_DATE = datetime.date(2020, 7, 1)
- 
+
 _CONTAINS_CONTROL_CHAR_RE = re.compile(r"[^-!#$%&'*+.^_`|~0-9a-zA-Z]")
- 
 
-class HTTPConnection(_HTTPConnection, object): 
-    """ 
+
+class HTTPConnection(_HTTPConnection, object):
+    """
     Based on :class:`http.client.HTTPConnection` but provides an extra constructor
-    backwards-compatibility layer between older and newer Pythons. 
- 
-    Additional keyword parameters are used to configure attributes of the connection. 
-    Accepted parameters include: 
- 
+    backwards-compatibility layer between older and newer Pythons.
+
+    Additional keyword parameters are used to configure attributes of the connection.
+    Accepted parameters include:
+
     - ``strict``: See the documentation on :class:`urllib3.connectionpool.HTTPConnectionPool`
     - ``source_address``: Set the source address for the current connection.
     - ``socket_options``: Set specific options on the underlying socket. If not specified, then
       defaults are loaded from ``HTTPConnection.default_socket_options`` which includes disabling
       Nagle's algorithm (sets TCP_NODELAY to 1) unless the connection is behind a proxy.
- 
+
       For example, if you wish to enable TCP Keep Alive in addition to the defaults,
       you might pass:
- 
+
       .. code-block:: python
- 
+
          HTTPConnection.default_socket_options + [
              (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),
          ]
 
       Or you may want to disable the defaults by passing an empty list (e.g., ``[]``).
-    """ 
- 
+    """
+
     default_port = port_by_scheme["http"]
- 
-    #: Disable Nagle's algorithm by default. 
-    #: ``[(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]`` 
-    default_socket_options = [(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)] 
- 
-    #: Whether this connection verifies the host's certificate. 
-    is_verified = False 
- 
+
+    #: Disable Nagle's algorithm by default.
+    #: ``[(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]``
+    default_socket_options = [(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]
+
+    #: Whether this connection verifies the host's certificate.
+    is_verified = False
+
     #: Whether this proxy connection (if used) verifies the proxy host's
     #: certificate.
     proxy_is_verified = None
 
-    def __init__(self, *args, **kw): 
+    def __init__(self, *args, **kw):
         if not six.PY2:
             kw.pop("strict", None)
- 
+
         # Pre-set source_address.
         self.source_address = kw.get("source_address")
- 
-        #: The socket options provided by the user. If no options are 
-        #: provided, we use the default options. 
+
+        #: The socket options provided by the user. If no options are
+        #: provided, we use the default options.
         self.socket_options = kw.pop("socket_options", self.default_socket_options)
- 
+
         # Proxy options provided by the user.
         self.proxy = kw.pop("proxy", None)
         self.proxy_config = kw.pop("proxy_config", None)
 
-        _HTTPConnection.__init__(self, *args, **kw) 
- 
+        _HTTPConnection.__init__(self, *args, **kw)
+
     @property
     def host(self):
         """
@@ -158,53 +158,53 @@ class HTTPConnection(_HTTPConnection, object):
         """
         self._dns_host = value
 
-    def _new_conn(self): 
+    def _new_conn(self):
         """Establish a socket connection and set nodelay settings on it.
- 
-        :return: New socket connection. 
-        """ 
-        extra_kw = {} 
-        if self.source_address: 
+
+        :return: New socket connection.
+        """
+        extra_kw = {}
+        if self.source_address:
             extra_kw["source_address"] = self.source_address
- 
-        if self.socket_options: 
+
+        if self.socket_options:
             extra_kw["socket_options"] = self.socket_options
- 
-        try: 
-            conn = connection.create_connection( 
+
+        try:
+            conn = connection.create_connection(
                 (self._dns_host, self.port), self.timeout, **extra_kw
             )
- 
+
         except SocketTimeout:
-            raise ConnectTimeoutError( 
+            raise ConnectTimeoutError(
                 self,
                 "Connection to %s timed out. (connect timeout=%s)"
                 % (self.host, self.timeout),
             )
- 
-        except SocketError as e: 
-            raise NewConnectionError( 
+
+        except SocketError as e:
+            raise NewConnectionError(
                 self, "Failed to establish a new connection: %s" % e
             )
- 
-        return conn 
- 
+
+        return conn
+
     def _is_using_tunnel(self):
         # Google App Engine's httplib does not define _tunnel_host
         return getattr(self, "_tunnel_host", None)
 
-    def _prepare_conn(self, conn): 
-        self.sock = conn 
+    def _prepare_conn(self, conn):
+        self.sock = conn
         if self._is_using_tunnel():
-            # TODO: Fix tunnel so it doesn't depend on self.sock state. 
-            self._tunnel() 
-            # Mark this connection as not reusable 
-            self.auto_open = 0 
- 
-    def connect(self): 
-        conn = self._new_conn() 
-        self._prepare_conn(conn) 
- 
+            # TODO: Fix tunnel so it doesn't depend on self.sock state.
+            self._tunnel()
+            # Mark this connection as not reusable
+            self.auto_open = 0
+
+    def connect(self):
+        conn = self._new_conn()
+        self._prepare_conn(conn)
+
     def putrequest(self, method, url, *args, **kwargs):
         """ """
         # Empty docstring because the indentation of CPython's implementation
@@ -238,62 +238,62 @@ class HTTPConnection(_HTTPConnection, object):
             headers["User-Agent"] = _get_default_user_agent()
         super(HTTPConnection, self).request(method, url, body=body, headers=headers)
 
-    def request_chunked(self, method, url, body=None, headers=None): 
-        """ 
-        Alternative to the common request method, which sends the 
-        body with chunked encoding and not as one block 
-        """ 
+    def request_chunked(self, method, url, body=None, headers=None):
+        """
+        Alternative to the common request method, which sends the
+        body with chunked encoding and not as one block
+        """
         headers = headers or {}
         header_keys = set([six.ensure_str(k.lower()) for k in headers])
         skip_accept_encoding = "accept-encoding" in header_keys
         skip_host = "host" in header_keys
-        self.putrequest( 
+        self.putrequest(
             method, url, skip_accept_encoding=skip_accept_encoding, skip_host=skip_host
-        ) 
+        )
         if "user-agent" not in header_keys:
             self.putheader("User-Agent", _get_default_user_agent())
-        for header, value in headers.items(): 
-            self.putheader(header, value) 
+        for header, value in headers.items():
+            self.putheader(header, value)
         if "transfer-encoding" not in header_keys:
             self.putheader("Transfer-Encoding", "chunked")
-        self.endheaders() 
- 
-        if body is not None: 
+        self.endheaders()
+
+        if body is not None:
             stringish_types = six.string_types + (bytes,)
-            if isinstance(body, stringish_types): 
-                body = (body,) 
-            for chunk in body: 
-                if not chunk: 
-                    continue 
+            if isinstance(body, stringish_types):
+                body = (body,)
+            for chunk in body:
+                if not chunk:
+                    continue
                 if not isinstance(chunk, bytes):
                     chunk = chunk.encode("utf8")
-                len_str = hex(len(chunk))[2:] 
+                len_str = hex(len(chunk))[2:]
                 to_send = bytearray(len_str.encode())
                 to_send += b"\r\n"
                 to_send += chunk
                 to_send += b"\r\n"
                 self.send(to_send)
- 
-        # After the if clause, to always have a closed body 
+
+        # After the if clause, to always have a closed body
         self.send(b"0\r\n\r\n")
- 
- 
-class HTTPSConnection(HTTPConnection): 
+
+
+class HTTPSConnection(HTTPConnection):
     """
     Many of the parameters to this constructor are passed to the underlying SSL
     socket by means of :py:func:`urllib3.util.ssl_wrap_socket`.
     """
 
     default_port = port_by_scheme["https"]
- 
+
     cert_reqs = None
     ca_certs = None
     ca_cert_dir = None
     ca_cert_data = None
-    ssl_version = None 
+    ssl_version = None
     assert_fingerprint = None
     tls_in_tls_required = False
- 
+
     def __init__(
         self,
         host,
@@ -307,19 +307,19 @@ class HTTPSConnection(HTTPConnection):
         server_hostname=None,
         **kw
     ):
- 
+
         HTTPConnection.__init__(self, host, port, strict=strict, timeout=timeout, **kw)
- 
-        self.key_file = key_file 
-        self.cert_file = cert_file 
+
+        self.key_file = key_file
+        self.cert_file = cert_file
         self.key_password = key_password
-        self.ssl_context = ssl_context 
+        self.ssl_context = ssl_context
         self.server_hostname = server_hostname
- 
-        # Required property for Google AppEngine 1.9.0 which otherwise causes 
-        # HTTPS requests to go out as HTTP. (See Issue #356) 
+
+        # Required property for Google AppEngine 1.9.0 which otherwise causes
+        # HTTPS requests to go out as HTTP. (See Issue #356)
         self._protocol = "https"
- 
+
     def set_cert(
         self,
         key_file=None,
@@ -332,76 +332,76 @@ class HTTPSConnection(HTTPConnection):
         ca_cert_dir=None,
         ca_cert_data=None,
     ):
-        """ 
-        This method should only be called once, before the connection is used. 
-        """ 
+        """
+        This method should only be called once, before the connection is used.
+        """
         # If cert_reqs is not provided we'll assume CERT_REQUIRED unless we also
         # have an SSLContext object in which case we'll use its verify_mode.
-        if cert_reqs is None: 
+        if cert_reqs is None:
             if self.ssl_context is not None:
-                cert_reqs = self.ssl_context.verify_mode 
+                cert_reqs = self.ssl_context.verify_mode
             else:
                 cert_reqs = resolve_cert_reqs(None)
- 
-        self.key_file = key_file 
-        self.cert_file = cert_file 
-        self.cert_reqs = cert_reqs 
+
+        self.key_file = key_file
+        self.cert_file = cert_file
+        self.cert_reqs = cert_reqs
         self.key_password = key_password
-        self.assert_hostname = assert_hostname 
-        self.assert_fingerprint = assert_fingerprint 
+        self.assert_hostname = assert_hostname
+        self.assert_fingerprint = assert_fingerprint
         self.ca_certs = os.path.expanduser(ca_certs) \
             if isinstance(ca_certs, six.string_types) else ca_certs
-        self.ca_cert_dir = ca_cert_dir and os.path.expanduser(ca_cert_dir) 
+        self.ca_cert_dir = ca_cert_dir and os.path.expanduser(ca_cert_dir)
         self.ca_cert_data = ca_cert_data
- 
-    def connect(self): 
-        # Add certificate verification 
-        conn = self._new_conn() 
-        hostname = self.host 
+
+    def connect(self):
+        # Add certificate verification
+        conn = self._new_conn()
+        hostname = self.host
         tls_in_tls = False
- 
+
         if self._is_using_tunnel():
             if self.tls_in_tls_required:
                 conn = self._connect_tls_proxy(hostname, conn)
                 tls_in_tls = True
 
-            self.sock = conn 
-
-            # Calls self._set_hostport(), so self.host is 
-            # self._tunnel_host below. 
-            self._tunnel() 
-            # Mark this connection as not reusable 
-            self.auto_open = 0 
- 
-            # Override the host with the one we're requesting data from. 
-            hostname = self._tunnel_host 
- 
+            self.sock = conn
+
+            # Calls self._set_hostport(), so self.host is
+            # self._tunnel_host below.
+            self._tunnel()
+            # Mark this connection as not reusable
+            self.auto_open = 0
+
+            # Override the host with the one we're requesting data from.
+            hostname = self._tunnel_host
+
         server_hostname = hostname
         if self.server_hostname is not None:
             server_hostname = self.server_hostname
 
-        is_time_off = datetime.date.today() < RECENT_DATE 
-        if is_time_off: 
+        is_time_off = datetime.date.today() < RECENT_DATE
+        if is_time_off:
             warnings.warn(
                 (
                     "System time is way off (before {0}). This will probably "
                     "lead to SSL verification errors"
                 ).format(RECENT_DATE),
                 SystemTimeWarning,
-            ) 
- 
-        # Wrap socket using verification with the root certs in 
-        # trusted_root_certs 
+            )
+
+        # Wrap socket using verification with the root certs in
+        # trusted_root_certs
         default_ssl_context = False
-        if self.ssl_context is None: 
+        if self.ssl_context is None:
             default_ssl_context = True
-            self.ssl_context = create_urllib3_context( 
-                ssl_version=resolve_ssl_version(self.ssl_version), 
-                cert_reqs=resolve_cert_reqs(self.cert_reqs), 
-            ) 
- 
-        context = self.ssl_context 
-        context.verify_mode = resolve_cert_reqs(self.cert_reqs) 
+            self.ssl_context = create_urllib3_context(
+                ssl_version=resolve_ssl_version(self.ssl_version),
+                cert_reqs=resolve_cert_reqs(self.cert_reqs),
+            )
+
+        context = self.ssl_context
+        context.verify_mode = resolve_cert_reqs(self.cert_reqs)
 
         # Try to load OS default certs if none are given.
         # Works well on Windows (requires Python3.4+)
@@ -414,19 +414,19 @@ class HTTPSConnection(HTTPConnection):
         ):
             context.load_default_certs()
 
-        self.sock = ssl_wrap_socket( 
-            sock=conn, 
-            keyfile=self.key_file, 
-            certfile=self.cert_file, 
+        self.sock = ssl_wrap_socket(
+            sock=conn,
+            keyfile=self.key_file,
+            certfile=self.cert_file,
             key_password=self.key_password,
-            ca_certs=self.ca_certs, 
-            ca_cert_dir=self.ca_cert_dir, 
+            ca_certs=self.ca_certs,
+            ca_cert_dir=self.ca_cert_dir,
             ca_cert_data=self.ca_cert_data,
             server_hostname=server_hostname,
             ssl_context=context,
             tls_in_tls=tls_in_tls,
         )
- 
+
         # If we're using all defaults and the connection
         # is TLSv1 or TLSv1.1 we throw a DeprecationWarning
         # for the host.
@@ -444,7 +444,7 @@ class HTTPSConnection(HTTPConnection):
                 DeprecationWarning,
             )
 
-        if self.assert_fingerprint: 
+        if self.assert_fingerprint:
             assert_fingerprint(
                 self.sock.getpeercert(binary_form=True), self.assert_fingerprint
             )
@@ -453,10 +453,10 @@ class HTTPSConnection(HTTPConnection):
             and not getattr(context, "check_hostname", False)
             and self.assert_hostname is not False
         ):
-            # While urllib3 attempts to always turn off hostname matching from 
-            # the TLS library, this cannot always be done. So we check whether 
-            # the TLS Library still thinks it's matching hostnames. 
-            cert = self.sock.getpeercert() 
+            # While urllib3 attempts to always turn off hostname matching from
+            # the TLS library, this cannot always be done. So we check whether
+            # the TLS Library still thinks it's matching hostnames.
+            cert = self.sock.getpeercert()
             if not cert.get("subjectAltName", ()):
                 warnings.warn(
                     (
@@ -466,14 +466,14 @@ class HTTPSConnection(HTTPConnection):
                         "for details.)".format(hostname)
                     ),
                     SubjectAltNameWarning,
-                ) 
+                )
             _match_hostname(cert, self.assert_hostname or server_hostname)
- 
-        self.is_verified = ( 
+
+        self.is_verified = (
             context.verify_mode == ssl.CERT_REQUIRED
             or self.assert_fingerprint is not None
-        ) 
- 
+        )
+
     def _connect_tls_proxy(self, hostname, conn):
         """
         Establish a TLS connection to the proxy using the provided SSL context.
@@ -488,7 +488,7 @@ class HTTPSConnection(HTTPConnection):
                 server_hostname=hostname,
                 ssl_context=ssl_context,
             )
- 
+
         ssl_context = create_proxy_ssl_context(
             self.ssl_version,
             self.cert_reqs,
@@ -531,7 +531,7 @@ class HTTPSConnection(HTTPConnection):
         return socket
 
 
-def _match_hostname(cert, asserted_hostname): 
+def _match_hostname(cert, asserted_hostname):
     # Our upstream implementation of ssl.match_hostname()
     # only applies this normalization to IP addresses so it doesn't
     # match DNS SANs so we do the same thing!
@@ -539,20 +539,20 @@ def _match_hostname(cert, asserted_hostname):
     if is_ipaddress(stripped_hostname):
         asserted_hostname = stripped_hostname
 
-    try: 
-        match_hostname(cert, asserted_hostname) 
-    except CertificateError as e: 
+    try:
+        match_hostname(cert, asserted_hostname)
+    except CertificateError as e:
         log.warning(
             "Certificate did not match expected hostname: %s. Certificate: %s",
             asserted_hostname,
             cert,
-        ) 
-        # Add cert to exception and reraise so client code can inspect 
-        # the cert when catching the exception, if they want to 
-        e._peer_cert = cert 
-        raise 
- 
- 
+        )
+        # Add cert to exception and reraise so client code can inspect
+        # the cert when catching the exception, if they want to
+        e._peer_cert = cert
+        raise
+
+
 def _get_default_user_agent():
     return "python-urllib3/%s" % __version__
 

+ 473 - 473
contrib/python/urllib3/urllib3/connectionpool.py
File diff suppressed because it is too large


+ 395 - 395
contrib/python/urllib3/urllib3/contrib/_securetransport/bindings.py

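The bindings in this file all follow the same ctypes recipe: load the dynamic library, then declare `argtypes` and `restype` on each function so arguments and return values marshal correctly. A minimal sketch of that pattern against libc (chosen only for illustration, since SecureTransport itself loads solely on macOS):

```python
import ctypes

# On POSIX, CDLL(None) exposes the symbols already linked into the process,
# including the C runtime, without needing a library path.
libc = ctypes.CDLL(None)

# Declare the signature, mirroring the Security.* declarations in this module:
# without argtypes/restype, ctypes falls back to int-sized guesses.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int
```

The module's `POINTER(...)`, `c_size_t`, and `c_void_p` declarations exist for exactly this reason: SecureTransport's functions take out-parameters and opaque refs that the default marshaling would corrupt.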
@@ -1,38 +1,38 @@
-""" 
-This module uses ctypes to bind a whole bunch of functions and constants from 
-SecureTransport. The goal here is to provide the low-level API to 
-SecureTransport. These are essentially the C-level functions and constants, and 
-they're pretty gross to work with. 
- 
-This code is a bastardised version of the code found in Will Bond's oscrypto 
-library. An enormous debt is owed to him for blazing this trail for us. For 
-that reason, this code should be considered to be covered both by urllib3's 
-license and by oscrypto's: 
- 
-    Copyright (c) 2015-2016 Will Bond <will@wbond.net> 
- 
-    Permission is hereby granted, free of charge, to any person obtaining a 
-    copy of this software and associated documentation files (the "Software"), 
-    to deal in the Software without restriction, including without limitation 
-    the rights to use, copy, modify, merge, publish, distribute, sublicense, 
-    and/or sell copies of the Software, and to permit persons to whom the 
-    Software is furnished to do so, subject to the following conditions: 
- 
-    The above copyright notice and this permission notice shall be included in 
-    all copies or substantial portions of the Software. 
- 
-    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 
-    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 
-    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 
-    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 
-    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 
-    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 
-    DEALINGS IN THE SOFTWARE. 
-""" 
-from __future__ import absolute_import 
- 
-import platform 
-from ctypes import ( 
+"""
+This module uses ctypes to bind a whole bunch of functions and constants from
+SecureTransport. The goal here is to provide the low-level API to
+SecureTransport. These are essentially the C-level functions and constants, and
+they're pretty gross to work with.
+
+This code is a bastardised version of the code found in Will Bond's oscrypto
+library. An enormous debt is owed to him for blazing this trail for us. For
+that reason, this code should be considered to be covered both by urllib3's
+license and by oscrypto's:
+
+    Copyright (c) 2015-2016 Will Bond <will@wbond.net>
+
+    Permission is hereby granted, free of charge, to any person obtaining a
+    copy of this software and associated documentation files (the "Software"),
+    to deal in the Software without restriction, including without limitation
+    the rights to use, copy, modify, merge, publish, distribute, sublicense,
+    and/or sell copies of the Software, and to permit persons to whom the
+    Software is furnished to do so, subject to the following conditions:
+
+    The above copyright notice and this permission notice shall be included in
+    all copies or substantial portions of the Software.
+
+    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+    DEALINGS IN THE SOFTWARE.
+"""
+from __future__ import absolute_import
+
+import platform
+from ctypes import (
     CDLL,
     CFUNCTYPE,
     POINTER,
@@ -45,23 +45,23 @@ from ctypes import (
     c_uint32,
     c_ulong,
     c_void_p,
-) 
+)
 from ctypes.util import find_library
 
 from ...packages.six import raise_from
- 
+
 if platform.system() != "Darwin":
     raise ImportError("Only macOS is supported")
- 
-version = platform.mac_ver()[0] 
+
+version = platform.mac_ver()[0]
 version_info = tuple(map(int, version.split(".")))
-if version_info < (10, 8): 
-    raise OSError( 
+if version_info < (10, 8):
+    raise OSError(
         "Only OS X 10.8 and newer are supported, not %s.%s"
         % (version_info[0], version_info[1])
-    ) 
- 
- 
+    )
+
+
 def load_cdll(name, macos10_16_path):
     """Loads a CDLL by name, falling back to known path on 10.16+"""
     try:
@@ -87,214 +87,214 @@ CoreFoundation = load_cdll(
 )
 
 
-Boolean = c_bool 
-CFIndex = c_long 
-CFStringEncoding = c_uint32 
-CFData = c_void_p 
-CFString = c_void_p 
-CFArray = c_void_p 
-CFMutableArray = c_void_p 
-CFDictionary = c_void_p 
-CFError = c_void_p 
-CFType = c_void_p 
-CFTypeID = c_ulong 
- 
-CFTypeRef = POINTER(CFType) 
-CFAllocatorRef = c_void_p 
- 
-OSStatus = c_int32 
- 
-CFDataRef = POINTER(CFData) 
-CFStringRef = POINTER(CFString) 
-CFArrayRef = POINTER(CFArray) 
-CFMutableArrayRef = POINTER(CFMutableArray) 
-CFDictionaryRef = POINTER(CFDictionary) 
-CFArrayCallBacks = c_void_p 
-CFDictionaryKeyCallBacks = c_void_p 
-CFDictionaryValueCallBacks = c_void_p 
- 
-SecCertificateRef = POINTER(c_void_p) 
-SecExternalFormat = c_uint32 
-SecExternalItemType = c_uint32 
-SecIdentityRef = POINTER(c_void_p) 
-SecItemImportExportFlags = c_uint32 
-SecItemImportExportKeyParameters = c_void_p 
-SecKeychainRef = POINTER(c_void_p) 
-SSLProtocol = c_uint32 
-SSLCipherSuite = c_uint32 
-SSLContextRef = POINTER(c_void_p) 
-SecTrustRef = POINTER(c_void_p) 
-SSLConnectionRef = c_uint32 
-SecTrustResultType = c_uint32 
-SecTrustOptionFlags = c_uint32 
-SSLProtocolSide = c_uint32 
-SSLConnectionType = c_uint32 
-SSLSessionOption = c_uint32 
- 
- 
-try: 
-    Security.SecItemImport.argtypes = [ 
-        CFDataRef, 
-        CFStringRef, 
-        POINTER(SecExternalFormat), 
-        POINTER(SecExternalItemType), 
-        SecItemImportExportFlags, 
-        POINTER(SecItemImportExportKeyParameters), 
-        SecKeychainRef, 
-        POINTER(CFArrayRef), 
-    ] 
-    Security.SecItemImport.restype = OSStatus 
- 
-    Security.SecCertificateGetTypeID.argtypes = [] 
-    Security.SecCertificateGetTypeID.restype = CFTypeID 
- 
-    Security.SecIdentityGetTypeID.argtypes = [] 
-    Security.SecIdentityGetTypeID.restype = CFTypeID 
- 
-    Security.SecKeyGetTypeID.argtypes = [] 
-    Security.SecKeyGetTypeID.restype = CFTypeID 
- 
+Boolean = c_bool
+CFIndex = c_long
+CFStringEncoding = c_uint32
+CFData = c_void_p
+CFString = c_void_p
+CFArray = c_void_p
+CFMutableArray = c_void_p
+CFDictionary = c_void_p
+CFError = c_void_p
+CFType = c_void_p
+CFTypeID = c_ulong
+
+CFTypeRef = POINTER(CFType)
+CFAllocatorRef = c_void_p
+
+OSStatus = c_int32
+
+CFDataRef = POINTER(CFData)
+CFStringRef = POINTER(CFString)
+CFArrayRef = POINTER(CFArray)
+CFMutableArrayRef = POINTER(CFMutableArray)
+CFDictionaryRef = POINTER(CFDictionary)
+CFArrayCallBacks = c_void_p
+CFDictionaryKeyCallBacks = c_void_p
+CFDictionaryValueCallBacks = c_void_p
+
+SecCertificateRef = POINTER(c_void_p)
+SecExternalFormat = c_uint32
+SecExternalItemType = c_uint32
+SecIdentityRef = POINTER(c_void_p)
+SecItemImportExportFlags = c_uint32
+SecItemImportExportKeyParameters = c_void_p
+SecKeychainRef = POINTER(c_void_p)
+SSLProtocol = c_uint32
+SSLCipherSuite = c_uint32
+SSLContextRef = POINTER(c_void_p)
+SecTrustRef = POINTER(c_void_p)
+SSLConnectionRef = c_uint32
+SecTrustResultType = c_uint32
+SecTrustOptionFlags = c_uint32
+SSLProtocolSide = c_uint32
+SSLConnectionType = c_uint32
+SSLSessionOption = c_uint32
+
+
+try:
+    Security.SecItemImport.argtypes = [
+        CFDataRef,
+        CFStringRef,
+        POINTER(SecExternalFormat),
+        POINTER(SecExternalItemType),
+        SecItemImportExportFlags,
+        POINTER(SecItemImportExportKeyParameters),
+        SecKeychainRef,
+        POINTER(CFArrayRef),
+    ]
+    Security.SecItemImport.restype = OSStatus
+
+    Security.SecCertificateGetTypeID.argtypes = []
+    Security.SecCertificateGetTypeID.restype = CFTypeID
+
+    Security.SecIdentityGetTypeID.argtypes = []
+    Security.SecIdentityGetTypeID.restype = CFTypeID
+
+    Security.SecKeyGetTypeID.argtypes = []
+    Security.SecKeyGetTypeID.restype = CFTypeID
+
     Security.SecCertificateCreateWithData.argtypes = [CFAllocatorRef, CFDataRef]
-    Security.SecCertificateCreateWithData.restype = SecCertificateRef 
- 
+    Security.SecCertificateCreateWithData.restype = SecCertificateRef
+
     Security.SecCertificateCopyData.argtypes = [SecCertificateRef]
-    Security.SecCertificateCopyData.restype = CFDataRef 
- 
+    Security.SecCertificateCopyData.restype = CFDataRef
+
     Security.SecCopyErrorMessageString.argtypes = [OSStatus, c_void_p]
-    Security.SecCopyErrorMessageString.restype = CFStringRef 
- 
-    Security.SecIdentityCreateWithCertificate.argtypes = [ 
-        CFTypeRef, 
-        SecCertificateRef, 
+    Security.SecCopyErrorMessageString.restype = CFStringRef
+
+    Security.SecIdentityCreateWithCertificate.argtypes = [
+        CFTypeRef,
+        SecCertificateRef,
         POINTER(SecIdentityRef),
-    ] 
-    Security.SecIdentityCreateWithCertificate.restype = OSStatus 
- 
-    Security.SecKeychainCreate.argtypes = [ 
-        c_char_p, 
-        c_uint32, 
-        c_void_p, 
-        Boolean, 
-        c_void_p, 
+    ]
+    Security.SecIdentityCreateWithCertificate.restype = OSStatus
+
+    Security.SecKeychainCreate.argtypes = [
+        c_char_p,
+        c_uint32,
+        c_void_p,
+        Boolean,
+        c_void_p,
         POINTER(SecKeychainRef),
-    ] 
-    Security.SecKeychainCreate.restype = OSStatus 
- 
+    ]
+    Security.SecKeychainCreate.restype = OSStatus
+
     Security.SecKeychainDelete.argtypes = [SecKeychainRef]
-    Security.SecKeychainDelete.restype = OSStatus 
- 
-    Security.SecPKCS12Import.argtypes = [ 
-        CFDataRef, 
-        CFDictionaryRef, 
+    Security.SecKeychainDelete.restype = OSStatus
+
+    Security.SecPKCS12Import.argtypes = [
+        CFDataRef,
+        CFDictionaryRef,
         POINTER(CFArrayRef),
-    ] 
-    Security.SecPKCS12Import.restype = OSStatus 
- 
-    SSLReadFunc = CFUNCTYPE(OSStatus, SSLConnectionRef, c_void_p, POINTER(c_size_t)) 
+    ]
+    Security.SecPKCS12Import.restype = OSStatus
+
+    SSLReadFunc = CFUNCTYPE(OSStatus, SSLConnectionRef, c_void_p, POINTER(c_size_t))
     SSLWriteFunc = CFUNCTYPE(
         OSStatus, SSLConnectionRef, POINTER(c_byte), POINTER(c_size_t)
     )
- 
+
     Security.SSLSetIOFuncs.argtypes = [SSLContextRef, SSLReadFunc, SSLWriteFunc]
-    Security.SSLSetIOFuncs.restype = OSStatus 
- 
+    Security.SSLSetIOFuncs.restype = OSStatus
+
     Security.SSLSetPeerID.argtypes = [SSLContextRef, c_char_p, c_size_t]
-    Security.SSLSetPeerID.restype = OSStatus 
- 
+    Security.SSLSetPeerID.restype = OSStatus
+
     Security.SSLSetCertificate.argtypes = [SSLContextRef, CFArrayRef]
-    Security.SSLSetCertificate.restype = OSStatus 
- 
+    Security.SSLSetCertificate.restype = OSStatus
+
     Security.SSLSetCertificateAuthorities.argtypes = [SSLContextRef, CFTypeRef, Boolean]
-    Security.SSLSetCertificateAuthorities.restype = OSStatus 
- 
+    Security.SSLSetCertificateAuthorities.restype = OSStatus
+
     Security.SSLSetConnection.argtypes = [SSLContextRef, SSLConnectionRef]
-    Security.SSLSetConnection.restype = OSStatus 
- 
+    Security.SSLSetConnection.restype = OSStatus
+
     Security.SSLSetPeerDomainName.argtypes = [SSLContextRef, c_char_p, c_size_t]
-    Security.SSLSetPeerDomainName.restype = OSStatus 
- 
+    Security.SSLSetPeerDomainName.restype = OSStatus
+
     Security.SSLHandshake.argtypes = [SSLContextRef]
-    Security.SSLHandshake.restype = OSStatus 
- 
+    Security.SSLHandshake.restype = OSStatus
+
     Security.SSLRead.argtypes = [SSLContextRef, c_char_p, c_size_t, POINTER(c_size_t)]
-    Security.SSLRead.restype = OSStatus 
- 
+    Security.SSLRead.restype = OSStatus
+
     Security.SSLWrite.argtypes = [SSLContextRef, c_char_p, c_size_t, POINTER(c_size_t)]
-    Security.SSLWrite.restype = OSStatus 
- 
+    Security.SSLWrite.restype = OSStatus
+
     Security.SSLClose.argtypes = [SSLContextRef]
-    Security.SSLClose.restype = OSStatus 
- 
+    Security.SSLClose.restype = OSStatus
+
     Security.SSLGetNumberSupportedCiphers.argtypes = [SSLContextRef, POINTER(c_size_t)]
-    Security.SSLGetNumberSupportedCiphers.restype = OSStatus 
- 
-    Security.SSLGetSupportedCiphers.argtypes = [ 
-        SSLContextRef, 
-        POINTER(SSLCipherSuite), 
+    Security.SSLGetNumberSupportedCiphers.restype = OSStatus
+
+    Security.SSLGetSupportedCiphers.argtypes = [
+        SSLContextRef,
+        POINTER(SSLCipherSuite),
         POINTER(c_size_t),
-    ] 
-    Security.SSLGetSupportedCiphers.restype = OSStatus 
- 
-    Security.SSLSetEnabledCiphers.argtypes = [ 
-        SSLContextRef, 
-        POINTER(SSLCipherSuite), 
+    ]
+    Security.SSLGetSupportedCiphers.restype = OSStatus
+
+    Security.SSLSetEnabledCiphers.argtypes = [
+        SSLContextRef,
+        POINTER(SSLCipherSuite),
         c_size_t,
-    ] 
-    Security.SSLSetEnabledCiphers.restype = OSStatus 
- 
+    ]
+    Security.SSLSetEnabledCiphers.restype = OSStatus
+
     Security.SSLGetNumberEnabledCiphers.argtype = [SSLContextRef, POINTER(c_size_t)]
-    Security.SSLGetNumberEnabledCiphers.restype = OSStatus 
- 
-    Security.SSLGetEnabledCiphers.argtypes = [ 
-        SSLContextRef, 
-        POINTER(SSLCipherSuite), 
+    Security.SSLGetNumberEnabledCiphers.restype = OSStatus
+
+    Security.SSLGetEnabledCiphers.argtypes = [
+        SSLContextRef,
+        POINTER(SSLCipherSuite),
         POINTER(c_size_t),
-    ] 
-    Security.SSLGetEnabledCiphers.restype = OSStatus 
- 
+    ]
+    Security.SSLGetEnabledCiphers.restype = OSStatus
+
     Security.SSLGetNegotiatedCipher.argtypes = [SSLContextRef, POINTER(SSLCipherSuite)]
-    Security.SSLGetNegotiatedCipher.restype = OSStatus 
- 
-    Security.SSLGetNegotiatedProtocolVersion.argtypes = [ 
-        SSLContextRef, 
+    Security.SSLGetNegotiatedCipher.restype = OSStatus
+
+    Security.SSLGetNegotiatedProtocolVersion.argtypes = [
+        SSLContextRef,
         POINTER(SSLProtocol),
-    ] 
-    Security.SSLGetNegotiatedProtocolVersion.restype = OSStatus 
- 
+    ]
+    Security.SSLGetNegotiatedProtocolVersion.restype = OSStatus
+
     Security.SSLCopyPeerTrust.argtypes = [SSLContextRef, POINTER(SecTrustRef)]
-    Security.SSLCopyPeerTrust.restype = OSStatus 
- 
+    Security.SSLCopyPeerTrust.restype = OSStatus
+
     Security.SecTrustSetAnchorCertificates.argtypes = [SecTrustRef, CFArrayRef]
-    Security.SecTrustSetAnchorCertificates.restype = OSStatus 
- 
+    Security.SecTrustSetAnchorCertificates.restype = OSStatus
+
     Security.SecTrustSetAnchorCertificatesOnly.argstypes = [SecTrustRef, Boolean]
-    Security.SecTrustSetAnchorCertificatesOnly.restype = OSStatus 
- 
+    Security.SecTrustSetAnchorCertificatesOnly.restype = OSStatus
+
     Security.SecTrustEvaluate.argtypes = [SecTrustRef, POINTER(SecTrustResultType)]
-    Security.SecTrustEvaluate.restype = OSStatus 
- 
+    Security.SecTrustEvaluate.restype = OSStatus
+
     Security.SecTrustGetCertificateCount.argtypes = [SecTrustRef]
-    Security.SecTrustGetCertificateCount.restype = CFIndex 
- 
+    Security.SecTrustGetCertificateCount.restype = CFIndex
+
     Security.SecTrustGetCertificateAtIndex.argtypes = [SecTrustRef, CFIndex]
-    Security.SecTrustGetCertificateAtIndex.restype = SecCertificateRef 
- 
-    Security.SSLCreateContext.argtypes = [ 
-        CFAllocatorRef, 
-        SSLProtocolSide, 
+    Security.SecTrustGetCertificateAtIndex.restype = SecCertificateRef
+
+    Security.SSLCreateContext.argtypes = [
+        CFAllocatorRef,
+        SSLProtocolSide,
         SSLConnectionType,
-    ] 
-    Security.SSLCreateContext.restype = SSLContextRef 
- 
+    ]
+    Security.SSLCreateContext.restype = SSLContextRef
+
     Security.SSLSetSessionOption.argtypes = [SSLContextRef, SSLSessionOption, Boolean]
-    Security.SSLSetSessionOption.restype = OSStatus 
- 
+    Security.SSLSetSessionOption.restype = OSStatus
+
     Security.SSLSetProtocolVersionMin.argtypes = [SSLContextRef, SSLProtocol]
-    Security.SSLSetProtocolVersionMin.restype = OSStatus 
- 
+    Security.SSLSetProtocolVersionMin.restype = OSStatus
+
     Security.SSLSetProtocolVersionMax.argtypes = [SSLContextRef, SSLProtocol]
-    Security.SSLSetProtocolVersionMax.restype = OSStatus 
- 
+    Security.SSLSetProtocolVersionMax.restype = OSStatus
+
     try:
         Security.SSLSetALPNProtocols.argtypes = [SSLContextRef, CFArrayRef]
         Security.SSLSetALPNProtocols.restype = OSStatus
@@ -303,216 +303,216 @@ try:
         pass
 
     Security.SecCopyErrorMessageString.argtypes = [OSStatus, c_void_p]
-    Security.SecCopyErrorMessageString.restype = CFStringRef 
- 
-    Security.SSLReadFunc = SSLReadFunc 
-    Security.SSLWriteFunc = SSLWriteFunc 
-    Security.SSLContextRef = SSLContextRef 
-    Security.SSLProtocol = SSLProtocol 
-    Security.SSLCipherSuite = SSLCipherSuite 
-    Security.SecIdentityRef = SecIdentityRef 
-    Security.SecKeychainRef = SecKeychainRef 
-    Security.SecTrustRef = SecTrustRef 
-    Security.SecTrustResultType = SecTrustResultType 
-    Security.SecExternalFormat = SecExternalFormat 
-    Security.OSStatus = OSStatus 
- 
-    Security.kSecImportExportPassphrase = CFStringRef.in_dll( 
+    Security.SecCopyErrorMessageString.restype = CFStringRef
+
+    Security.SSLReadFunc = SSLReadFunc
+    Security.SSLWriteFunc = SSLWriteFunc
+    Security.SSLContextRef = SSLContextRef
+    Security.SSLProtocol = SSLProtocol
+    Security.SSLCipherSuite = SSLCipherSuite
+    Security.SecIdentityRef = SecIdentityRef
+    Security.SecKeychainRef = SecKeychainRef
+    Security.SecTrustRef = SecTrustRef
+    Security.SecTrustResultType = SecTrustResultType
+    Security.SecExternalFormat = SecExternalFormat
+    Security.OSStatus = OSStatus
+
+    Security.kSecImportExportPassphrase = CFStringRef.in_dll(
         Security, "kSecImportExportPassphrase"
-    ) 
-    Security.kSecImportItemIdentity = CFStringRef.in_dll( 
+    )
+    Security.kSecImportItemIdentity = CFStringRef.in_dll(
         Security, "kSecImportItemIdentity"
-    ) 
- 
-    # CoreFoundation time! 
+    )
+
+    # CoreFoundation time!
     CoreFoundation.CFRetain.argtypes = [CFTypeRef]
-    CoreFoundation.CFRetain.restype = CFTypeRef 
- 
+    CoreFoundation.CFRetain.restype = CFTypeRef
+
     CoreFoundation.CFRelease.argtypes = [CFTypeRef]
-    CoreFoundation.CFRelease.restype = None 
- 
+    CoreFoundation.CFRelease.restype = None
+
     CoreFoundation.CFGetTypeID.argtypes = [CFTypeRef]
-    CoreFoundation.CFGetTypeID.restype = CFTypeID 
- 
-    CoreFoundation.CFStringCreateWithCString.argtypes = [ 
-        CFAllocatorRef, 
-        c_char_p, 
+    CoreFoundation.CFGetTypeID.restype = CFTypeID
+
+    CoreFoundation.CFStringCreateWithCString.argtypes = [
+        CFAllocatorRef,
+        c_char_p,
         CFStringEncoding,
-    ] 
-    CoreFoundation.CFStringCreateWithCString.restype = CFStringRef 
- 
+    ]
+    CoreFoundation.CFStringCreateWithCString.restype = CFStringRef
+
     CoreFoundation.CFStringGetCStringPtr.argtypes = [CFStringRef, CFStringEncoding]
-    CoreFoundation.CFStringGetCStringPtr.restype = c_char_p 
- 
-    CoreFoundation.CFStringGetCString.argtypes = [ 
-        CFStringRef, 
-        c_char_p, 
-        CFIndex, 
+    CoreFoundation.CFStringGetCStringPtr.restype = c_char_p
+
+    CoreFoundation.CFStringGetCString.argtypes = [
+        CFStringRef,
+        c_char_p,
+        CFIndex,
         CFStringEncoding,
-    ] 
-    CoreFoundation.CFStringGetCString.restype = c_bool 
- 
+    ]
+    CoreFoundation.CFStringGetCString.restype = c_bool
+
     CoreFoundation.CFDataCreate.argtypes = [CFAllocatorRef, c_char_p, CFIndex]
-    CoreFoundation.CFDataCreate.restype = CFDataRef 
- 
+    CoreFoundation.CFDataCreate.restype = CFDataRef
+
     CoreFoundation.CFDataGetLength.argtypes = [CFDataRef]
-    CoreFoundation.CFDataGetLength.restype = CFIndex 
- 
+    CoreFoundation.CFDataGetLength.restype = CFIndex
+
     CoreFoundation.CFDataGetBytePtr.argtypes = [CFDataRef]
-    CoreFoundation.CFDataGetBytePtr.restype = c_void_p 
- 
-    CoreFoundation.CFDictionaryCreate.argtypes = [ 
-        CFAllocatorRef, 
-        POINTER(CFTypeRef), 
-        POINTER(CFTypeRef), 
-        CFIndex, 
-        CFDictionaryKeyCallBacks, 
+    CoreFoundation.CFDataGetBytePtr.restype = c_void_p
+
+    CoreFoundation.CFDictionaryCreate.argtypes = [
+        CFAllocatorRef,
+        POINTER(CFTypeRef),
+        POINTER(CFTypeRef),
+        CFIndex,
+        CFDictionaryKeyCallBacks,
         CFDictionaryValueCallBacks,
-    ] 
-    CoreFoundation.CFDictionaryCreate.restype = CFDictionaryRef 
- 
+    ]
+    CoreFoundation.CFDictionaryCreate.restype = CFDictionaryRef
+
     CoreFoundation.CFDictionaryGetValue.argtypes = [CFDictionaryRef, CFTypeRef]
-    CoreFoundation.CFDictionaryGetValue.restype = CFTypeRef 
- 
-    CoreFoundation.CFArrayCreate.argtypes = [ 
-        CFAllocatorRef, 
-        POINTER(CFTypeRef), 
-        CFIndex, 
-        CFArrayCallBacks, 
-    ] 
-    CoreFoundation.CFArrayCreate.restype = CFArrayRef 
- 
-    CoreFoundation.CFArrayCreateMutable.argtypes = [ 
-        CFAllocatorRef, 
-        CFIndex, 
+    CoreFoundation.CFDictionaryGetValue.restype = CFTypeRef
+
+    CoreFoundation.CFArrayCreate.argtypes = [
+        CFAllocatorRef,
+        POINTER(CFTypeRef),
+        CFIndex,
         CFArrayCallBacks,
-    ] 
-    CoreFoundation.CFArrayCreateMutable.restype = CFMutableArrayRef 
- 
+    ]
+    CoreFoundation.CFArrayCreate.restype = CFArrayRef
+
+    CoreFoundation.CFArrayCreateMutable.argtypes = [
+        CFAllocatorRef,
+        CFIndex,
+        CFArrayCallBacks,
+    ]
+    CoreFoundation.CFArrayCreateMutable.restype = CFMutableArrayRef
+
     CoreFoundation.CFArrayAppendValue.argtypes = [CFMutableArrayRef, c_void_p]
-    CoreFoundation.CFArrayAppendValue.restype = None 
- 
+    CoreFoundation.CFArrayAppendValue.restype = None
+
     CoreFoundation.CFArrayGetCount.argtypes = [CFArrayRef]
-    CoreFoundation.CFArrayGetCount.restype = CFIndex 
- 
+    CoreFoundation.CFArrayGetCount.restype = CFIndex
+
     CoreFoundation.CFArrayGetValueAtIndex.argtypes = [CFArrayRef, CFIndex]
-    CoreFoundation.CFArrayGetValueAtIndex.restype = c_void_p 
- 
-    CoreFoundation.kCFAllocatorDefault = CFAllocatorRef.in_dll( 
+    CoreFoundation.CFArrayGetValueAtIndex.restype = c_void_p
+
+    CoreFoundation.kCFAllocatorDefault = CFAllocatorRef.in_dll(
         CoreFoundation, "kCFAllocatorDefault"
-    ) 
+    )
     CoreFoundation.kCFTypeArrayCallBacks = c_void_p.in_dll(
         CoreFoundation, "kCFTypeArrayCallBacks"
     )
-    CoreFoundation.kCFTypeDictionaryKeyCallBacks = c_void_p.in_dll( 
+    CoreFoundation.kCFTypeDictionaryKeyCallBacks = c_void_p.in_dll(
         CoreFoundation, "kCFTypeDictionaryKeyCallBacks"
-    ) 
-    CoreFoundation.kCFTypeDictionaryValueCallBacks = c_void_p.in_dll( 
+    )
+    CoreFoundation.kCFTypeDictionaryValueCallBacks = c_void_p.in_dll(
         CoreFoundation, "kCFTypeDictionaryValueCallBacks"
-    ) 
- 
-    CoreFoundation.CFTypeRef = CFTypeRef 
-    CoreFoundation.CFArrayRef = CFArrayRef 
-    CoreFoundation.CFStringRef = CFStringRef 
-    CoreFoundation.CFDictionaryRef = CFDictionaryRef 
- 
-except (AttributeError): 
+    )
+
+    CoreFoundation.CFTypeRef = CFTypeRef
+    CoreFoundation.CFArrayRef = CFArrayRef
+    CoreFoundation.CFStringRef = CFStringRef
+    CoreFoundation.CFDictionaryRef = CFDictionaryRef
+
+except (AttributeError):
     raise ImportError("Error initializing ctypes")
- 
- 
-class CFConst(object): 
-    """ 
-    A class object that acts as essentially a namespace for CoreFoundation 
-    constants. 
-    """ 
-
-    kCFStringEncodingUTF8 = CFStringEncoding(0x08000100) 
- 
- 
-class SecurityConst(object): 
-    """ 
-    A class object that acts as essentially a namespace for Security constants. 
-    """ 
-
-    kSSLSessionOptionBreakOnServerAuth = 0 
- 
-    kSSLProtocol2 = 1 
-    kSSLProtocol3 = 2 
-    kTLSProtocol1 = 4 
-    kTLSProtocol11 = 7 
-    kTLSProtocol12 = 8 
+
+
+class CFConst(object):
+    """
+    A class object that acts as essentially a namespace for CoreFoundation
+    constants.
+    """
+
+    kCFStringEncodingUTF8 = CFStringEncoding(0x08000100)
+
+
+class SecurityConst(object):
+    """
+    A class object that acts as essentially a namespace for Security constants.
+    """
+
+    kSSLSessionOptionBreakOnServerAuth = 0
+
+    kSSLProtocol2 = 1
+    kSSLProtocol3 = 2
+    kTLSProtocol1 = 4
+    kTLSProtocol11 = 7
+    kTLSProtocol12 = 8
     # SecureTransport does not support TLS 1.3 even if there's a constant for it
     kTLSProtocol13 = 10
     kTLSProtocolMaxSupported = 999
- 
-    kSSLClientSide = 1 
-    kSSLStreamType = 0 
- 
-    kSecFormatPEMSequence = 10 
- 
-    kSecTrustResultInvalid = 0 
-    kSecTrustResultProceed = 1 
-    # This gap is present on purpose: this was kSecTrustResultConfirm, which 
-    # is deprecated. 
-    kSecTrustResultDeny = 3 
-    kSecTrustResultUnspecified = 4 
-    kSecTrustResultRecoverableTrustFailure = 5 
-    kSecTrustResultFatalTrustFailure = 6 
-    kSecTrustResultOtherError = 7 
- 
-    errSSLProtocol = -9800 
-    errSSLWouldBlock = -9803 
-    errSSLClosedGraceful = -9805 
-    errSSLClosedNoNotify = -9816 
-    errSSLClosedAbort = -9806 
- 
-    errSSLXCertChainInvalid = -9807 
-    errSSLCrypto = -9809 
-    errSSLInternal = -9810 
-    errSSLCertExpired = -9814 
-    errSSLCertNotYetValid = -9815 
-    errSSLUnknownRootCert = -9812 
-    errSSLNoRootCert = -9813 
-    errSSLHostNameMismatch = -9843 
-    errSSLPeerHandshakeFail = -9824 
-    errSSLPeerUserCancelled = -9839 
-    errSSLWeakPeerEphemeralDHKey = -9850 
-    errSSLServerAuthCompleted = -9841 
-    errSSLRecordOverflow = -9847 
- 
-    errSecVerifyFailed = -67808 
-    errSecNoTrustSettings = -25263 
-    errSecItemNotFound = -25300 
-    errSecInvalidTrustSettings = -25262 
- 
-    # Cipher suites. We only pick the ones our default cipher string allows. 
+
+    kSSLClientSide = 1
+    kSSLStreamType = 0
+
+    kSecFormatPEMSequence = 10
+
+    kSecTrustResultInvalid = 0
+    kSecTrustResultProceed = 1
+    # This gap is present on purpose: this was kSecTrustResultConfirm, which
+    # is deprecated.
+    kSecTrustResultDeny = 3
+    kSecTrustResultUnspecified = 4
+    kSecTrustResultRecoverableTrustFailure = 5
+    kSecTrustResultFatalTrustFailure = 6
+    kSecTrustResultOtherError = 7
+
+    errSSLProtocol = -9800
+    errSSLWouldBlock = -9803
+    errSSLClosedGraceful = -9805
+    errSSLClosedNoNotify = -9816
+    errSSLClosedAbort = -9806
+
+    errSSLXCertChainInvalid = -9807
+    errSSLCrypto = -9809
+    errSSLInternal = -9810
+    errSSLCertExpired = -9814
+    errSSLCertNotYetValid = -9815
+    errSSLUnknownRootCert = -9812
+    errSSLNoRootCert = -9813
+    errSSLHostNameMismatch = -9843
+    errSSLPeerHandshakeFail = -9824
+    errSSLPeerUserCancelled = -9839
+    errSSLWeakPeerEphemeralDHKey = -9850
+    errSSLServerAuthCompleted = -9841
+    errSSLRecordOverflow = -9847
+
+    errSecVerifyFailed = -67808
+    errSecNoTrustSettings = -25263
+    errSecItemNotFound = -25300
+    errSecInvalidTrustSettings = -25262
+
+    # Cipher suites. We only pick the ones our default cipher string allows.
     # Source: https://developer.apple.com/documentation/security/1550981-ssl_cipher_suite_values
-    TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 = 0xC02C 
-    TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 = 0xC030 
-    TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 = 0xC02B 
-    TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 = 0xC02F 
+    TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 = 0xC02C
+    TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 = 0xC030
+    TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 = 0xC02B
+    TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 = 0xC02F
     TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 = 0xCCA9
     TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 = 0xCCA8
-    TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 = 0x009F 
-    TLS_DHE_RSA_WITH_AES_128_GCM_SHA256 = 0x009E 
-    TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 = 0xC024 
-    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 = 0xC028 
-    TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA = 0xC00A 
-    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA = 0xC014 
-    TLS_DHE_RSA_WITH_AES_256_CBC_SHA256 = 0x006B 
-    TLS_DHE_RSA_WITH_AES_256_CBC_SHA = 0x0039 
-    TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 = 0xC023 
-    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 = 0xC027 
-    TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA = 0xC009 
-    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA = 0xC013 
-    TLS_DHE_RSA_WITH_AES_128_CBC_SHA256 = 0x0067 
-    TLS_DHE_RSA_WITH_AES_128_CBC_SHA = 0x0033 
-    TLS_RSA_WITH_AES_256_GCM_SHA384 = 0x009D 
-    TLS_RSA_WITH_AES_128_GCM_SHA256 = 0x009C 
-    TLS_RSA_WITH_AES_256_CBC_SHA256 = 0x003D 
-    TLS_RSA_WITH_AES_128_CBC_SHA256 = 0x003C 
-    TLS_RSA_WITH_AES_256_CBC_SHA = 0x0035 
-    TLS_RSA_WITH_AES_128_CBC_SHA = 0x002F 
+    TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 = 0x009F
+    TLS_DHE_RSA_WITH_AES_128_GCM_SHA256 = 0x009E
+    TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 = 0xC024
+    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 = 0xC028
+    TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA = 0xC00A
+    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA = 0xC014
+    TLS_DHE_RSA_WITH_AES_256_CBC_SHA256 = 0x006B
+    TLS_DHE_RSA_WITH_AES_256_CBC_SHA = 0x0039
+    TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 = 0xC023
+    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 = 0xC027
+    TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA = 0xC009
+    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA = 0xC013
+    TLS_DHE_RSA_WITH_AES_128_CBC_SHA256 = 0x0067
+    TLS_DHE_RSA_WITH_AES_128_CBC_SHA = 0x0033
+    TLS_RSA_WITH_AES_256_GCM_SHA384 = 0x009D
+    TLS_RSA_WITH_AES_128_GCM_SHA256 = 0x009C
+    TLS_RSA_WITH_AES_256_CBC_SHA256 = 0x003D
+    TLS_RSA_WITH_AES_128_CBC_SHA256 = 0x003C
+    TLS_RSA_WITH_AES_256_CBC_SHA = 0x0035
+    TLS_RSA_WITH_AES_128_CBC_SHA = 0x002F
     TLS_AES_128_GCM_SHA256 = 0x1301
     TLS_AES_256_GCM_SHA384 = 0x1302
     TLS_AES_128_CCM_8_SHA256 = 0x1305
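The bindings above consist almost entirely of `argtypes`/`restype` annotations on functions loaded from Security.framework and CoreFoundation. The same ctypes pattern can be sketched against plain libc, without macOS (a minimal sketch; `CDLL(None)` is a POSIX-only way to reach the already-loaded C library, and `strlen` stands in for the framework calls):

```python
import ctypes

# On POSIX, CDLL(None) opens the running process, which exposes libc
# symbols such as strlen (the same mechanism the bindings use to reach
# Security.framework functions on macOS).
libc = ctypes.CDLL(None)

# Without annotations, ctypes assumes every argument and return value is
# a C int, which can silently truncate pointers and 64-bit results.
# Declaring argtypes/restype up front -- exactly what the diff above does
# for SecCopyErrorMessageString, CFDataCreate, etc. -- makes each call
# type-checked and correctly converted.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # 5
```

The `in_dll` calls in the diff (e.g. `kCFAllocatorDefault`) are the companion mechanism: they bind exported *data* symbols rather than functions.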

+ 303 - 303
contrib/python/urllib3/urllib3/contrib/_securetransport/low_level.py

@@ -1,61 +1,61 @@
-""" 
-Low-level helpers for the SecureTransport bindings. 
- 
-These are Python functions that are not directly related to the high-level APIs 
-but are necessary to get them to work. They include a whole bunch of low-level 
-CoreFoundation messing about and memory management. The concerns in this module 
-are almost entirely about trying to avoid memory leaks and providing 
-appropriate and useful assistance to the higher-level code. 
-""" 
-import base64 
-import ctypes 
-import itertools 
+"""
+Low-level helpers for the SecureTransport bindings.
+
+These are Python functions that are not directly related to the high-level APIs
+but are necessary to get them to work. They include a whole bunch of low-level
+CoreFoundation messing about and memory management. The concerns in this module
+are almost entirely about trying to avoid memory leaks and providing
+appropriate and useful assistance to the higher-level code.
+"""
+import base64
+import ctypes
+import itertools
 import os
 import re
-import ssl 
+import ssl
 import struct
-import tempfile 
- 
+import tempfile
+
 from .bindings import CFConst, CoreFoundation, Security
- 
-# This regular expression is used to grab PEM data out of a PEM bundle. 
-_PEM_CERTS_RE = re.compile( 
-    b"-----BEGIN CERTIFICATE-----\n(.*?)\n-----END CERTIFICATE-----", re.DOTALL 
-) 
- 
- 
-def _cf_data_from_bytes(bytestring): 
-    """ 
-    Given a bytestring, create a CFData object from it. This CFData object must 
-    be CFReleased by the caller. 
-    """ 
-    return CoreFoundation.CFDataCreate( 
-        CoreFoundation.kCFAllocatorDefault, bytestring, len(bytestring) 
-    ) 
- 
- 
-def _cf_dictionary_from_tuples(tuples): 
-    """ 
-    Given a list of Python tuples, create an associated CFDictionary. 
-    """ 
-    dictionary_size = len(tuples) 
- 
-    # We need to get the dictionary keys and values out in the same order. 
-    keys = (t[0] for t in tuples) 
-    values = (t[1] for t in tuples) 
-    cf_keys = (CoreFoundation.CFTypeRef * dictionary_size)(*keys) 
-    cf_values = (CoreFoundation.CFTypeRef * dictionary_size)(*values) 
- 
-    return CoreFoundation.CFDictionaryCreate( 
-        CoreFoundation.kCFAllocatorDefault, 
-        cf_keys, 
-        cf_values, 
-        dictionary_size, 
-        CoreFoundation.kCFTypeDictionaryKeyCallBacks, 
-        CoreFoundation.kCFTypeDictionaryValueCallBacks, 
-    ) 
- 
- 
+
+# This regular expression is used to grab PEM data out of a PEM bundle.
+_PEM_CERTS_RE = re.compile(
+    b"-----BEGIN CERTIFICATE-----\n(.*?)\n-----END CERTIFICATE-----", re.DOTALL
+)
+
+
+def _cf_data_from_bytes(bytestring):
+    """
+    Given a bytestring, create a CFData object from it. This CFData object must
+    be CFReleased by the caller.
+    """
+    return CoreFoundation.CFDataCreate(
+        CoreFoundation.kCFAllocatorDefault, bytestring, len(bytestring)
+    )
+
+
+def _cf_dictionary_from_tuples(tuples):
+    """
+    Given a list of Python tuples, create an associated CFDictionary.
+    """
+    dictionary_size = len(tuples)
+
+    # We need to get the dictionary keys and values out in the same order.
+    keys = (t[0] for t in tuples)
+    values = (t[1] for t in tuples)
+    cf_keys = (CoreFoundation.CFTypeRef * dictionary_size)(*keys)
+    cf_values = (CoreFoundation.CFTypeRef * dictionary_size)(*values)
+
+    return CoreFoundation.CFDictionaryCreate(
+        CoreFoundation.kCFAllocatorDefault,
+        cf_keys,
+        cf_values,
+        dictionary_size,
+        CoreFoundation.kCFTypeDictionaryKeyCallBacks,
+        CoreFoundation.kCFTypeDictionaryValueCallBacks,
+    )
+
+
 def _cfstr(py_bstr):
     """
     Given a Python binary data, create a CFString.
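The `_PEM_CERTS_RE` regex introduced above (and used later by `_cert_array_from_pem`) relies on `re.DOTALL` so that `(.*?)` can span the line-wrapped base64 body between the PEM markers. A small self-contained exercise of that pattern (the certificate body here is dummy base64, not a real DER certificate):

```python
import base64
import re

# Same pattern as _PEM_CERTS_RE in the diff above: DOTALL lets ".*?"
# match across the newlines inside a wrapped base64 certificate body.
pem_certs_re = re.compile(
    b"-----BEGIN CERTIFICATE-----\n(.*?)\n-----END CERTIFICATE-----", re.DOTALL
)

bundle = (
    b"-----BEGIN CERTIFICATE-----\n"
    b"aGVsbG8=\n"
    b"-----END CERTIFICATE-----\n"
)

# Decode each captured base64 body to DER bytes, as _cert_array_from_pem does.
der_certs = [base64.b64decode(m.group(1)) for m in pem_certs_re.finditer(bundle)]
print(der_certs)  # [b'hello']
```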
@@ -101,277 +101,277 @@ def _create_cfstring_array(lst):
     return cf_arr
 
 
-def _cf_string_to_unicode(value): 
-    """ 
-    Creates a Unicode string from a CFString object. Used entirely for error 
-    reporting. 
- 
-    Yes, it annoys me quite a lot that this function is this complex. 
-    """ 
-    value_as_void_p = ctypes.cast(value, ctypes.POINTER(ctypes.c_void_p)) 
- 
-    string = CoreFoundation.CFStringGetCStringPtr( 
+def _cf_string_to_unicode(value):
+    """
+    Creates a Unicode string from a CFString object. Used entirely for error
+    reporting.
+
+    Yes, it annoys me quite a lot that this function is this complex.
+    """
+    value_as_void_p = ctypes.cast(value, ctypes.POINTER(ctypes.c_void_p))
+
+    string = CoreFoundation.CFStringGetCStringPtr(
         value_as_void_p, CFConst.kCFStringEncodingUTF8
-    ) 
-    if string is None: 
-        buffer = ctypes.create_string_buffer(1024) 
-        result = CoreFoundation.CFStringGetCString( 
+    )
+    if string is None:
+        buffer = ctypes.create_string_buffer(1024)
+        result = CoreFoundation.CFStringGetCString(
             value_as_void_p, buffer, 1024, CFConst.kCFStringEncodingUTF8
-        ) 
-        if not result: 
+        )
+        if not result:
             raise OSError("Error copying C string from CFStringRef")
-        string = buffer.value 
-    if string is not None: 
+        string = buffer.value
+    if string is not None:
         string = string.decode("utf-8")
-    return string 
- 
- 
-def _assert_no_error(error, exception_class=None): 
-    """ 
-    Checks the return code and throws an exception if there is an error to 
-    report 
-    """ 
-    if error == 0: 
-        return 
- 
-    cf_error_string = Security.SecCopyErrorMessageString(error, None) 
-    output = _cf_string_to_unicode(cf_error_string) 
-    CoreFoundation.CFRelease(cf_error_string) 
- 
+    return string
+
+
+def _assert_no_error(error, exception_class=None):
+    """
+    Checks the return code and throws an exception if there is an error to
+    report
+    """
+    if error == 0:
+        return
+
+    cf_error_string = Security.SecCopyErrorMessageString(error, None)
+    output = _cf_string_to_unicode(cf_error_string)
+    CoreFoundation.CFRelease(cf_error_string)
+
     if output is None or output == u"":
         output = u"OSStatus %s" % error
- 
-    if exception_class is None: 
-        exception_class = ssl.SSLError 
- 
-    raise exception_class(output) 
- 
- 
-def _cert_array_from_pem(pem_bundle): 
-    """ 
-    Given a bundle of certs in PEM format, turns them into a CFArray of certs 
-    that can be used to validate a cert chain. 
-    """ 
+
+    if exception_class is None:
+        exception_class = ssl.SSLError
+
+    raise exception_class(output)
+
+
+def _cert_array_from_pem(pem_bundle):
+    """
+    Given a bundle of certs in PEM format, turns them into a CFArray of certs
+    that can be used to validate a cert chain.
+    """
     # Normalize the PEM bundle's line endings.
     pem_bundle = pem_bundle.replace(b"\r\n", b"\n")
 
-    der_certs = [ 
+    der_certs = [
         base64.b64decode(match.group(1)) for match in _PEM_CERTS_RE.finditer(pem_bundle)
-    ] 
-    if not der_certs: 
-        raise ssl.SSLError("No root certificates specified") 
- 
-    cert_array = CoreFoundation.CFArrayCreateMutable( 
-        CoreFoundation.kCFAllocatorDefault, 
-        0, 
+    ]
+    if not der_certs:
+        raise ssl.SSLError("No root certificates specified")
+
+    cert_array = CoreFoundation.CFArrayCreateMutable(
+        CoreFoundation.kCFAllocatorDefault,
+        0,
         ctypes.byref(CoreFoundation.kCFTypeArrayCallBacks),
-    ) 
-    if not cert_array: 
-        raise ssl.SSLError("Unable to allocate memory!") 
- 
-    try: 
-        for der_bytes in der_certs: 
-            certdata = _cf_data_from_bytes(der_bytes) 
-            if not certdata: 
-                raise ssl.SSLError("Unable to allocate memory!") 
-            cert = Security.SecCertificateCreateWithData( 
-                CoreFoundation.kCFAllocatorDefault, certdata 
-            ) 
-            CoreFoundation.CFRelease(certdata) 
-            if not cert: 
-                raise ssl.SSLError("Unable to build cert object!") 
- 
-            CoreFoundation.CFArrayAppendValue(cert_array, cert) 
-            CoreFoundation.CFRelease(cert) 
-    except Exception: 
-        # We need to free the array before the exception bubbles further. 
-        # We only want to do that if an error occurs: otherwise, the caller 
-        # should free. 
-        CoreFoundation.CFRelease(cert_array) 
+    )
+    if not cert_array:
+        raise ssl.SSLError("Unable to allocate memory!")
+
+    try:
+        for der_bytes in der_certs:
+            certdata = _cf_data_from_bytes(der_bytes)
+            if not certdata:
+                raise ssl.SSLError("Unable to allocate memory!")
+            cert = Security.SecCertificateCreateWithData(
+                CoreFoundation.kCFAllocatorDefault, certdata
+            )
+            CoreFoundation.CFRelease(certdata)
+            if not cert:
+                raise ssl.SSLError("Unable to build cert object!")
+
+            CoreFoundation.CFArrayAppendValue(cert_array, cert)
+            CoreFoundation.CFRelease(cert)
+    except Exception:
+        # We need to free the array before the exception bubbles further.
+        # We only want to do that if an error occurs: otherwise, the caller
+        # should free.
+        CoreFoundation.CFRelease(cert_array)
         raise
- 
-    return cert_array 
- 
- 
-def _is_cert(item): 
-    """ 
-    Returns True if a given CFTypeRef is a certificate. 
-    """ 
-    expected = Security.SecCertificateGetTypeID() 
-    return CoreFoundation.CFGetTypeID(item) == expected 
- 
- 
-def _is_identity(item): 
-    """ 
-    Returns True if a given CFTypeRef is an identity. 
-    """ 
-    expected = Security.SecIdentityGetTypeID() 
-    return CoreFoundation.CFGetTypeID(item) == expected 
- 
- 
-def _temporary_keychain(): 
-    """ 
-    This function creates a temporary Mac keychain that we can use to work with 
-    credentials. This keychain uses a one-time password and a temporary file to 
-    store the data. We expect to have one keychain per socket. The returned 
-    SecKeychainRef must be freed by the caller, including calling 
-    SecKeychainDelete. 
- 
-    Returns a tuple of the SecKeychainRef and the path to the temporary 
-    directory that contains it. 
-    """ 
-    # Unfortunately, SecKeychainCreate requires a path to a keychain. This 
-    # means we cannot use mkstemp to use a generic temporary file. Instead, 
-    # we're going to create a temporary directory and a filename to use there. 
-    # This filename will be 8 random bytes expanded into base64. We also need 
-    # some random bytes to password-protect the keychain we're creating, so we 
-    # ask for 40 random bytes. 
-    random_bytes = os.urandom(40) 
+
+    return cert_array
+
+
+def _is_cert(item):
+    """
+    Returns True if a given CFTypeRef is a certificate.
+    """
+    expected = Security.SecCertificateGetTypeID()
+    return CoreFoundation.CFGetTypeID(item) == expected
+
+
+def _is_identity(item):
+    """
+    Returns True if a given CFTypeRef is an identity.
+    """
+    expected = Security.SecIdentityGetTypeID()
+    return CoreFoundation.CFGetTypeID(item) == expected
+
+
+def _temporary_keychain():
+    """
+    This function creates a temporary Mac keychain that we can use to work with
+    credentials. This keychain uses a one-time password and a temporary file to
+    store the data. We expect to have one keychain per socket. The returned
+    SecKeychainRef must be freed by the caller, including calling
+    SecKeychainDelete.
+
+    Returns a tuple of the SecKeychainRef and the path to the temporary
+    directory that contains it.
+    """
+    # Unfortunately, SecKeychainCreate requires a path to a keychain. This
+    # means we cannot use mkstemp to use a generic temporary file. Instead,
+    # we're going to create a temporary directory and a filename to use there.
+    # This filename will be 8 random bytes expanded into base64. We also need
+    # some random bytes to password-protect the keychain we're creating, so we
+    # ask for 40 random bytes.
+    random_bytes = os.urandom(40)
     filename = base64.b16encode(random_bytes[:8]).decode("utf-8")
     password = base64.b16encode(random_bytes[8:])  # Must be valid UTF-8
-    tempdirectory = tempfile.mkdtemp() 
- 
+    tempdirectory = tempfile.mkdtemp()
+
     keychain_path = os.path.join(tempdirectory, filename).encode("utf-8")
- 
-    # We now want to create the keychain itself. 
-    keychain = Security.SecKeychainRef() 
-    status = Security.SecKeychainCreate( 
+
+    # We now want to create the keychain itself.
+    keychain = Security.SecKeychainRef()
+    status = Security.SecKeychainCreate(
         keychain_path, len(password), password, False, None, ctypes.byref(keychain)
-    ) 
-    _assert_no_error(status) 
- 
-    # Having created the keychain, we want to pass it off to the caller. 
-    return keychain, tempdirectory 
- 
- 
-def _load_items_from_file(keychain, path): 
-    """ 
-    Given a single file, loads all the trust objects from it into arrays and 
-    the keychain. 
-    Returns a tuple of lists: the first list is a list of identities, the 
-    second a list of certs. 
-    """ 
-    certificates = [] 
-    identities = [] 
-    result_array = None 
- 
+    )
+    _assert_no_error(status)
+
+    # Having created the keychain, we want to pass it off to the caller.
+    return keychain, tempdirectory
+
+
+def _load_items_from_file(keychain, path):
+    """
+    Given a single file, loads all the trust objects from it into arrays and
+    the keychain.
+    Returns a tuple of lists: the first list is a list of identities, the
+    second a list of certs.
+    """
+    certificates = []
+    identities = []
+    result_array = None
+
     with open(path, "rb") as f:
-        raw_filedata = f.read() 
- 
-    try: 
-        filedata = CoreFoundation.CFDataCreate( 
+        raw_filedata = f.read()
+
+    try:
+        filedata = CoreFoundation.CFDataCreate(
             CoreFoundation.kCFAllocatorDefault, raw_filedata, len(raw_filedata)
-        ) 
-        result_array = CoreFoundation.CFArrayRef() 
-        result = Security.SecItemImport( 
-            filedata,  # cert data 
-            None,  # Filename, leaving it out for now 
-            None,  # What the type of the file is, we don't care 
-            None,  # what's in the file, we don't care 
-            0,  # import flags 
-            None,  # key params, can include passphrase in the future 
-            keychain,  # The keychain to insert into 
+        )
+        result_array = CoreFoundation.CFArrayRef()
+        result = Security.SecItemImport(
+            filedata,  # cert data
+            None,  # Filename, leaving it out for now
+            None,  # What the type of the file is, we don't care
+            None,  # what's in the file, we don't care
+            0,  # import flags
+            None,  # key params, can include passphrase in the future
+            keychain,  # The keychain to insert into
             ctypes.byref(result_array),  # Results
-        ) 
-        _assert_no_error(result) 
- 
-        # A CFArray is not very useful to us as an intermediary 
-        # representation, so we are going to extract the objects we want 
-        # and then free the array. We don't need to keep hold of keys: the 
-        # keychain already has them! 
-        result_count = CoreFoundation.CFArrayGetCount(result_array) 
-        for index in range(result_count): 
+        )
+        _assert_no_error(result)
+
+        # A CFArray is not very useful to us as an intermediary
+        # representation, so we are going to extract the objects we want
+        # and then free the array. We don't need to keep hold of keys: the
+        # keychain already has them!
+        result_count = CoreFoundation.CFArrayGetCount(result_array)
+        for index in range(result_count):
             item = CoreFoundation.CFArrayGetValueAtIndex(result_array, index)
-            item = ctypes.cast(item, CoreFoundation.CFTypeRef) 
- 
-            if _is_cert(item): 
-                CoreFoundation.CFRetain(item) 
-                certificates.append(item) 
-            elif _is_identity(item): 
-                CoreFoundation.CFRetain(item) 
-                identities.append(item) 
-    finally: 
-        if result_array: 
-            CoreFoundation.CFRelease(result_array) 
- 
-        CoreFoundation.CFRelease(filedata) 
- 
-    return (identities, certificates) 
- 
- 
-def _load_client_cert_chain(keychain, *paths): 
-    """ 
-    Load certificates and maybe keys from a number of files. Has the end goal 
-    of returning a CFArray containing one SecIdentityRef, and then zero or more 
-    SecCertificateRef objects, suitable for use as a client certificate trust 
-    chain. 
-    """ 
-    # Ok, the strategy. 
-    # 
-    # This relies on knowing that macOS will not give you a SecIdentityRef 
-    # unless you have imported a key into a keychain. This is a somewhat 
-    # artificial limitation of macOS (for example, it doesn't necessarily 
-    # affect iOS), but there is nothing inside Security.framework that lets you 
-    # get a SecIdentityRef without having a key in a keychain. 
-    # 
-    # So the policy here is we take all the files and iterate them in order. 
-    # Each one will use SecItemImport to have one or more objects loaded from 
-    # it. We will also point at a keychain that macOS can use to work with the 
-    # private key. 
-    # 
-    # Once we have all the objects, we'll check what we actually have. If we 
-    # already have a SecIdentityRef in hand, fab: we'll use that. Otherwise, 
-    # we'll take the first certificate (which we assume to be our leaf) and 
-    # ask the keychain to give us a SecIdentityRef with that cert's associated 
-    # key. 
-    # 
-    # We'll then return a CFArray containing the trust chain: one 
-    # SecIdentityRef and then zero-or-more SecCertificateRef objects. The 
-    # responsibility for freeing this CFArray will be with the caller. This 
-    # CFArray must remain alive for the entire connection, so in practice it 
-    # will be stored with a single SSLSocket, along with the reference to the 
-    # keychain. 
-    certificates = [] 
-    identities = [] 
- 
-    # Filter out bad paths. 
-    paths = (path for path in paths if path) 
- 
-    try: 
-        for file_path in paths: 
+            item = ctypes.cast(item, CoreFoundation.CFTypeRef)
+
+            if _is_cert(item):
+                CoreFoundation.CFRetain(item)
+                certificates.append(item)
+            elif _is_identity(item):
+                CoreFoundation.CFRetain(item)
+                identities.append(item)
+    finally:
+        if result_array:
+            CoreFoundation.CFRelease(result_array)
+
+        CoreFoundation.CFRelease(filedata)
+
+    return (identities, certificates)
+
+
+def _load_client_cert_chain(keychain, *paths):
+    """
+    Load certificates and maybe keys from a number of files. Has the end goal
+    of returning a CFArray containing one SecIdentityRef, and then zero or more
+    SecCertificateRef objects, suitable for use as a client certificate trust
+    chain.
+    """
+    # Ok, the strategy.
+    #
+    # This relies on knowing that macOS will not give you a SecIdentityRef
+    # unless you have imported a key into a keychain. This is a somewhat
+    # artificial limitation of macOS (for example, it doesn't necessarily
+    # affect iOS), but there is nothing inside Security.framework that lets you
+    # get a SecIdentityRef without having a key in a keychain.
+    #
+    # So the policy here is we take all the files and iterate them in order.
+    # Each one will use SecItemImport to have one or more objects loaded from
+    # it. We will also point at a keychain that macOS can use to work with the
+    # private key.
+    #
+    # Once we have all the objects, we'll check what we actually have. If we
+    # already have a SecIdentityRef in hand, fab: we'll use that. Otherwise,
+    # we'll take the first certificate (which we assume to be our leaf) and
+    # ask the keychain to give us a SecIdentityRef with that cert's associated
+    # key.
+    #
+    # We'll then return a CFArray containing the trust chain: one
+    # SecIdentityRef and then zero-or-more SecCertificateRef objects. The
+    # responsibility for freeing this CFArray will be with the caller. This
+    # CFArray must remain alive for the entire connection, so in practice it
+    # will be stored with a single SSLSocket, along with the reference to the
+    # keychain.
+    certificates = []
+    identities = []
+
+    # Filter out bad paths.
+    paths = (path for path in paths if path)
+
+    try:
+        for file_path in paths:
             new_identities, new_certs = _load_items_from_file(keychain, file_path)
-            identities.extend(new_identities) 
-            certificates.extend(new_certs) 
- 
-        # Ok, we have everything. The question is: do we have an identity? If 
-        # not, we want to grab one from the first cert we have. 
-        if not identities: 
-            new_identity = Security.SecIdentityRef() 
-            status = Security.SecIdentityCreateWithCertificate( 
+            identities.extend(new_identities)
+            certificates.extend(new_certs)
+
+        # Ok, we have everything. The question is: do we have an identity? If
+        # not, we want to grab one from the first cert we have.
+        if not identities:
+            new_identity = Security.SecIdentityRef()
+            status = Security.SecIdentityCreateWithCertificate(
                 keychain, certificates[0], ctypes.byref(new_identity)
-            ) 
-            _assert_no_error(status) 
-            identities.append(new_identity) 
- 
-            # We now want to release the original certificate, as we no longer 
-            # need it. 
-            CoreFoundation.CFRelease(certificates.pop(0)) 
- 
-        # We now need to build a new CFArray that holds the trust chain. 
-        trust_chain = CoreFoundation.CFArrayCreateMutable( 
-            CoreFoundation.kCFAllocatorDefault, 
-            0, 
-            ctypes.byref(CoreFoundation.kCFTypeArrayCallBacks), 
-        ) 
-        for item in itertools.chain(identities, certificates): 
-            # ArrayAppendValue does a CFRetain on the item. That's fine, 
-            # because the finally block will release our other refs to them. 
-            CoreFoundation.CFArrayAppendValue(trust_chain, item) 
- 
-        return trust_chain 
-    finally: 
-        for obj in itertools.chain(identities, certificates): 
-            CoreFoundation.CFRelease(obj) 
+            )
+            _assert_no_error(status)
+            identities.append(new_identity)
+
+            # We now want to release the original certificate, as we no longer
+            # need it.
+            CoreFoundation.CFRelease(certificates.pop(0))
+
+        # We now need to build a new CFArray that holds the trust chain.
+        trust_chain = CoreFoundation.CFArrayCreateMutable(
+            CoreFoundation.kCFAllocatorDefault,
+            0,
+            ctypes.byref(CoreFoundation.kCFTypeArrayCallBacks),
+        )
+        for item in itertools.chain(identities, certificates):
+            # ArrayAppendValue does a CFRetain on the item. That's fine,
+            # because the finally block will release our other refs to them.
+            CoreFoundation.CFArrayAppendValue(trust_chain, item)
+
+        return trust_chain
+    finally:
+        for obj in itertools.chain(identities, certificates):
+            CoreFoundation.CFRelease(obj)
 
 
 TLS_PROTOCOL_VERSIONS = {

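The long comment block in the hunk above describes the strategy: load items from each file, synthesize a SecIdentityRef from the leaf certificate when none was imported, build a chain of identity-then-certificates, and release the original references in a `finally` block. As a side note for review, here is a standalone, pure-Python sketch of that ownership pattern (not part of the diff): `Ref`, `retain`, `release`, and `build_trust_chain` are hypothetical stand-ins for the CoreFoundation/Security calls, using a toy refcount instead of CFRetain/CFRelease.

```python
import itertools

class Ref:
    """Toy stand-in for a CoreFoundation object with manual refcounting."""
    def __init__(self, name):
        self.name = name
        self.refcount = 1

def retain(obj):
    obj.refcount += 1

def release(obj):
    obj.refcount -= 1

def build_trust_chain(loaded):
    """Mirror the strategy from the comment above, with hypothetical types.

    `loaded` is a list of (identities, certificates) pairs, one per input
    file, in the shape _load_items_from_file would return them.
    """
    identities, certificates = [], []
    try:
        for new_identities, new_certs in loaded:
            identities.extend(new_identities)
            certificates.extend(new_certs)

        # No identity yet: derive one from the first (leaf) certificate,
        # then drop our own reference to that certificate.
        if not identities:
            leaf = certificates.pop(0)
            identities.append(Ref("identity(%s)" % leaf.name))
            release(leaf)

        # Trust chain = identity first, then the remaining certificates.
        # Appending retains each item, which is why the blanket releases
        # in the finally block are safe.
        chain = []
        for item in itertools.chain(identities, certificates):
            retain(item)
            chain.append(item)
        return chain
    finally:
        for obj in itertools.chain(identities, certificates):
            release(obj)
```

The point of the pattern is that the returned chain holds exactly one reference per object, regardless of which path produced the identity, so the caller alone is responsible for freeing it.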
+ 215 - 215
contrib/python/urllib3/urllib3/contrib/appengine.py

@@ -1,101 +1,101 @@
-""" 
-This module provides a pool manager that uses Google App Engine's 
-`URLFetch Service <https://cloud.google.com/appengine/docs/python/urlfetch>`_. 
- 
-Example usage:: 
- 
-    from urllib3 import PoolManager 
-    from urllib3.contrib.appengine import AppEngineManager, is_appengine_sandbox 
- 
-    if is_appengine_sandbox(): 
-        # AppEngineManager uses AppEngine's URLFetch API behind the scenes 
-        http = AppEngineManager() 
-    else: 
-        # PoolManager uses a socket-level API behind the scenes 
-        http = PoolManager() 
- 
-    r = http.request('GET', 'https://google.com/') 
- 
-There are `limitations <https://cloud.google.com/appengine/docs/python/\ 
-urlfetch/#Python_Quotas_and_limits>`_ to the URLFetch service and it may not be 
-the best choice for your application. There are three options for using 
-urllib3 on Google App Engine: 
- 
-1. You can use :class:`AppEngineManager` with URLFetch. URLFetch is 
-   cost-effective in many circumstances as long as your usage is within the 
-   limitations. 
-2. You can use a normal :class:`~urllib3.PoolManager` by enabling sockets. 
-   Sockets also have `limitations and restrictions 
-   <https://cloud.google.com/appengine/docs/python/sockets/\ 
-   #limitations-and-restrictions>`_ and have a lower free quota than URLFetch. 
-   To use sockets, be sure to specify the following in your ``app.yaml``:: 
- 
-        env_variables: 
-            GAE_USE_SOCKETS_HTTPLIB : 'true' 
- 
-3. If you are using `App Engine Flexible 
-<https://cloud.google.com/appengine/docs/flexible/>`_, you can use the standard 
-:class:`PoolManager` without any configuration or special environment variables. 
-""" 
- 
-from __future__ import absolute_import 
+"""
+This module provides a pool manager that uses Google App Engine's
+`URLFetch Service <https://cloud.google.com/appengine/docs/python/urlfetch>`_.
+
+Example usage::
+
+    from urllib3 import PoolManager
+    from urllib3.contrib.appengine import AppEngineManager, is_appengine_sandbox
+
+    if is_appengine_sandbox():
+        # AppEngineManager uses AppEngine's URLFetch API behind the scenes
+        http = AppEngineManager()
+    else:
+        # PoolManager uses a socket-level API behind the scenes
+        http = PoolManager()
+
+    r = http.request('GET', 'https://google.com/')
+
+There are `limitations <https://cloud.google.com/appengine/docs/python/\
+urlfetch/#Python_Quotas_and_limits>`_ to the URLFetch service and it may not be
+the best choice for your application. There are three options for using
+urllib3 on Google App Engine:
+
+1. You can use :class:`AppEngineManager` with URLFetch. URLFetch is
+   cost-effective in many circumstances as long as your usage is within the
+   limitations.
+2. You can use a normal :class:`~urllib3.PoolManager` by enabling sockets.
+   Sockets also have `limitations and restrictions
+   <https://cloud.google.com/appengine/docs/python/sockets/\
+   #limitations-and-restrictions>`_ and have a lower free quota than URLFetch.
+   To use sockets, be sure to specify the following in your ``app.yaml``::
+
+        env_variables:
+            GAE_USE_SOCKETS_HTTPLIB : 'true'
+
+3. If you are using `App Engine Flexible
+<https://cloud.google.com/appengine/docs/flexible/>`_, you can use the standard
+:class:`PoolManager` without any configuration or special environment variables.
+"""
+
+from __future__ import absolute_import
 
 import io
-import logging 
-import warnings 
- 
-from ..exceptions import ( 
-    HTTPError, 
-    HTTPWarning, 
-    MaxRetryError, 
-    ProtocolError, 
+import logging
+import warnings
+
+from ..exceptions import (
+    HTTPError,
+    HTTPWarning,
+    MaxRetryError,
+    ProtocolError,
     SSLError,
     TimeoutError,
-) 
+)
 from ..packages.six.moves.urllib.parse import urljoin
-from ..request import RequestMethods 
-from ..response import HTTPResponse 
+from ..request import RequestMethods
+from ..response import HTTPResponse
 from ..util.retry import Retry
 from ..util.timeout import Timeout
 from . import _appengine_environ
- 
-try: 
-    from google.appengine.api import urlfetch 
-except ImportError: 
-    urlfetch = None 
- 
- 
-log = logging.getLogger(__name__) 
- 
- 
-class AppEnginePlatformWarning(HTTPWarning): 
-    pass 
- 
- 
-class AppEnginePlatformError(HTTPError): 
-    pass 
- 
- 
-class AppEngineManager(RequestMethods): 
-    """ 
-    Connection manager for Google App Engine sandbox applications. 
- 
-    This manager uses the URLFetch service directly instead of using the 
-    emulated httplib, and is subject to URLFetch limitations as described in 
-    the App Engine documentation `here 
-    <https://cloud.google.com/appengine/docs/python/urlfetch>`_. 
- 
-    Notably it will raise an :class:`AppEnginePlatformError` if: 
-        * URLFetch is not available. 
-        * If you attempt to use this on App Engine Flexible, as full socket 
-          support is available. 
-        * If a request size is more than 10 megabytes. 
+
+try:
+    from google.appengine.api import urlfetch
+except ImportError:
+    urlfetch = None
+
+
+log = logging.getLogger(__name__)
+
+
+class AppEnginePlatformWarning(HTTPWarning):
+    pass
+
+
+class AppEnginePlatformError(HTTPError):
+    pass
+
+
+class AppEngineManager(RequestMethods):
+    """
+    Connection manager for Google App Engine sandbox applications.
+
+    This manager uses the URLFetch service directly instead of using the
+    emulated httplib, and is subject to URLFetch limitations as described in
+    the App Engine documentation `here
+    <https://cloud.google.com/appengine/docs/python/urlfetch>`_.
+
+    Notably it will raise an :class:`AppEnginePlatformError` if:
+        * URLFetch is not available.
+        * If you attempt to use this on App Engine Flexible, as full socket
+          support is available.
+        * If a request size is more than 10 megabytes.
         * If a response size is more than 32 megabytes.
-        * If you use an unsupported request method such as OPTIONS. 
- 
-    Beyond those cases, it will raise normal urllib3 errors. 
-    """ 
- 
+        * If you use an unsupported request method such as OPTIONS.
+
+    Beyond those cases, it will raise normal urllib3 errors.
+    """
+
     def __init__(
         self,
         headers=None,
@@ -103,31 +103,31 @@ class AppEngineManager(RequestMethods):
         validate_certificate=True,
         urlfetch_retries=True,
     ):
-        if not urlfetch: 
-            raise AppEnginePlatformError( 
+        if not urlfetch:
+            raise AppEnginePlatformError(
                 "URLFetch is not available in this environment."
             )
- 
-        warnings.warn( 
-            "urllib3 is using URLFetch on Google App Engine sandbox instead " 
-            "of sockets. To use sockets directly instead of URLFetch see " 
+
+        warnings.warn(
+            "urllib3 is using URLFetch on Google App Engine sandbox instead "
+            "of sockets. To use sockets directly instead of URLFetch see "
             "https://urllib3.readthedocs.io/en/1.26.x/reference/urllib3.contrib.html.",
             AppEnginePlatformWarning,
         )
- 
-        RequestMethods.__init__(self, headers) 
-        self.validate_certificate = validate_certificate 
-        self.urlfetch_retries = urlfetch_retries 
- 
-        self.retries = retries or Retry.DEFAULT 
- 
-    def __enter__(self): 
-        return self 
- 
-    def __exit__(self, exc_type, exc_val, exc_tb): 
-        # Return False to re-raise any potential exceptions 
-        return False 
- 
+
+        RequestMethods.__init__(self, headers)
+        self.validate_certificate = validate_certificate
+        self.urlfetch_retries = urlfetch_retries
+
+        self.retries = retries or Retry.DEFAULT
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        # Return False to re-raise any potential exceptions
+        return False
+
     def urlopen(
         self,
         method,
@@ -139,80 +139,80 @@ class AppEngineManager(RequestMethods):
         timeout=Timeout.DEFAULT_TIMEOUT,
         **response_kw
     ):
- 
-        retries = self._get_retries(retries, redirect) 
- 
-        try: 
+
+        retries = self._get_retries(retries, redirect)
+
+        try:
             follow_redirects = redirect and retries.redirect != 0 and retries.total
-            response = urlfetch.fetch( 
-                url, 
-                payload=body, 
-                method=method, 
-                headers=headers or {}, 
-                allow_truncated=False, 
-                follow_redirects=self.urlfetch_retries and follow_redirects, 
-                deadline=self._get_absolute_timeout(timeout), 
-                validate_certificate=self.validate_certificate, 
-            ) 
-        except urlfetch.DeadlineExceededError as e: 
-            raise TimeoutError(self, e) 
- 
-        except urlfetch.InvalidURLError as e: 
+            response = urlfetch.fetch(
+                url,
+                payload=body,
+                method=method,
+                headers=headers or {},
+                allow_truncated=False,
+                follow_redirects=self.urlfetch_retries and follow_redirects,
+                deadline=self._get_absolute_timeout(timeout),
+                validate_certificate=self.validate_certificate,
+            )
+        except urlfetch.DeadlineExceededError as e:
+            raise TimeoutError(self, e)
+
+        except urlfetch.InvalidURLError as e:
             if "too large" in str(e):
-                raise AppEnginePlatformError( 
-                    "URLFetch request too large, URLFetch only " 
+                raise AppEnginePlatformError(
+                    "URLFetch request too large, URLFetch only "
                     "supports requests up to 10mb in size.",
                     e,
                 )
-            raise ProtocolError(e) 
- 
-        except urlfetch.DownloadError as e: 
+            raise ProtocolError(e)
+
+        except urlfetch.DownloadError as e:
             if "Too many redirects" in str(e):
-                raise MaxRetryError(self, url, reason=e) 
-            raise ProtocolError(e) 
- 
-        except urlfetch.ResponseTooLargeError as e: 
-            raise AppEnginePlatformError( 
-                "URLFetch response too large, URLFetch only supports" 
+                raise MaxRetryError(self, url, reason=e)
+            raise ProtocolError(e)
+
+        except urlfetch.ResponseTooLargeError as e:
+            raise AppEnginePlatformError(
+                "URLFetch response too large, URLFetch only supports"
                 "responses up to 32mb in size.",
                 e,
             )
- 
-        except urlfetch.SSLCertificateError as e: 
-            raise SSLError(e) 
- 
-        except urlfetch.InvalidMethodError as e: 
-            raise AppEnginePlatformError( 
+
+        except urlfetch.SSLCertificateError as e:
+            raise SSLError(e)
+
+        except urlfetch.InvalidMethodError as e:
+            raise AppEnginePlatformError(
                 "URLFetch does not support method: %s" % method, e
             )
- 
-        http_response = self._urlfetch_response_to_http_response( 
+
+        http_response = self._urlfetch_response_to_http_response(
             response, retries=retries, **response_kw
         )
- 
-        # Handle redirect? 
-        redirect_location = redirect and http_response.get_redirect_location() 
-        if redirect_location: 
-            # Check for redirect response 
+
+        # Handle redirect?
+        redirect_location = redirect and http_response.get_redirect_location()
+        if redirect_location:
+            # Check for redirect response
             if self.urlfetch_retries and retries.raise_on_redirect:
-                raise MaxRetryError(self, url, "too many redirects") 
-            else: 
-                if http_response.status == 303: 
+                raise MaxRetryError(self, url, "too many redirects")
+            else:
+                if http_response.status == 303:
                     method = "GET"
- 
-                try: 
+
+                try:
                     retries = retries.increment(
                         method, url, response=http_response, _pool=self
                     )
-                except MaxRetryError: 
-                    if retries.raise_on_redirect: 
-                        raise MaxRetryError(self, url, "too many redirects") 
-                    return http_response 
- 
-                retries.sleep_for_retry(http_response) 
-                log.debug("Redirecting %s -> %s", url, redirect_location) 
-                redirect_url = urljoin(url, redirect_location) 
-                return self.urlopen( 
+                except MaxRetryError:
+                    if retries.raise_on_redirect:
+                        raise MaxRetryError(self, url, "too many redirects")
+                    return http_response
+
+                retries.sleep_for_retry(http_response)
+                log.debug("Redirecting %s -> %s", url, redirect_location)
+                redirect_url = urljoin(url, redirect_location)
+                return self.urlopen(
                     method,
                     redirect_url,
                     body,
@@ -222,14 +222,14 @@ class AppEngineManager(RequestMethods):
                     timeout=timeout,
                     **response_kw
                 )
- 
-        # Check if we should retry the HTTP response. 
+
+        # Check if we should retry the HTTP response.
         has_retry_after = bool(http_response.getheader("Retry-After"))
-        if retries.is_retry(method, http_response.status, has_retry_after): 
+        if retries.is_retry(method, http_response.status, has_retry_after):
             retries = retries.increment(method, url, response=http_response, _pool=self)
-            log.debug("Retry: %s", url) 
-            retries.sleep(http_response) 
-            return self.urlopen( 
+            log.debug("Retry: %s", url)
+            retries.sleep(http_response)
+            return self.urlopen(
                 method,
                 url,
                 body=body,
@@ -239,37 +239,37 @@ class AppEngineManager(RequestMethods):
                 timeout=timeout,
                 **response_kw
             )
- 
-        return http_response 
- 
-    def _urlfetch_response_to_http_response(self, urlfetch_resp, **response_kw): 
- 
-        if is_prod_appengine(): 
-            # Production GAE handles deflate encoding automatically, but does 
-            # not remove the encoding header. 
+
+        return http_response
+
+    def _urlfetch_response_to_http_response(self, urlfetch_resp, **response_kw):
+
+        if is_prod_appengine():
+            # Production GAE handles deflate encoding automatically, but does
+            # not remove the encoding header.
             content_encoding = urlfetch_resp.headers.get("content-encoding")
- 
+
             if content_encoding == "deflate":
                 del urlfetch_resp.headers["content-encoding"]
- 
+
         transfer_encoding = urlfetch_resp.headers.get("transfer-encoding")
-        # We have a full response's content, 
-        # so let's make sure we don't report ourselves as chunked data. 
+        # We have a full response's content,
+        # so let's make sure we don't report ourselves as chunked data.
         if transfer_encoding == "chunked":
-            encodings = transfer_encoding.split(",") 
+            encodings = transfer_encoding.split(",")
             encodings.remove("chunked")
             urlfetch_resp.headers["transfer-encoding"] = ",".join(encodings)
- 
+
         original_response = HTTPResponse(
-            # In order for decoding to work, we must present the content as 
-            # a file-like object. 
+            # In order for decoding to work, we must present the content as
+            # a file-like object.
             body=io.BytesIO(urlfetch_resp.content),
             msg=urlfetch_resp.header_msg,
-            headers=urlfetch_resp.headers, 
-            status=urlfetch_resp.status_code, 
-            **response_kw 
-        ) 
- 
+            headers=urlfetch_resp.headers,
+            status=urlfetch_resp.status_code,
+            **response_kw
+        )
+
         return HTTPResponse(
             body=io.BytesIO(urlfetch_resp.content),
             headers=urlfetch_resp.headers,
@@ -278,35 +278,35 @@ class AppEngineManager(RequestMethods):
             **response_kw
         )
 
-    def _get_absolute_timeout(self, timeout): 
-        if timeout is Timeout.DEFAULT_TIMEOUT: 
-            return None  # Defer to URLFetch's default. 
-        if isinstance(timeout, Timeout): 
-            if timeout._read is not None or timeout._connect is not None: 
-                warnings.warn( 
-                    "URLFetch does not support granular timeout settings, " 
-                    "reverting to total or default URLFetch timeout.", 
+    def _get_absolute_timeout(self, timeout):
+        if timeout is Timeout.DEFAULT_TIMEOUT:
+            return None  # Defer to URLFetch's default.
+        if isinstance(timeout, Timeout):
+            if timeout._read is not None or timeout._connect is not None:
+                warnings.warn(
+                    "URLFetch does not support granular timeout settings, "
+                    "reverting to total or default URLFetch timeout.",
                     AppEnginePlatformWarning,
                 )
-            return timeout.total 
-        return timeout 
- 
-    def _get_retries(self, retries, redirect): 
-        if not isinstance(retries, Retry): 
+            return timeout.total
+        return timeout
+
+    def _get_retries(self, retries, redirect):
+        if not isinstance(retries, Retry):
             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
- 
-        if retries.connect or retries.read or retries.redirect: 
-            warnings.warn( 
-                "URLFetch only supports total retries and does not " 
-                "recognize connect, read, or redirect retry parameters.", 
+
+        if retries.connect or retries.read or retries.redirect:
+            warnings.warn(
+                "URLFetch only supports total retries and does not "
+                "recognize connect, read, or redirect retry parameters.",
                 AppEnginePlatformWarning,
             )
- 
-        return retries 
- 
- 
+
+        return retries
+
+
 # Alias methods from _appengine_environ to maintain public API interface.
- 
+
 is_appengine = _appengine_environ.is_appengine
 is_appengine_sandbox = _appengine_environ.is_appengine_sandbox
 is_local_appengine = _appengine_environ.is_local_appengine
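The `_get_absolute_timeout` hunk above flattens urllib3's granular timeout object into the single deadline URLFetch accepts. A minimal standalone sketch of that logic, using a simplified `Timeout` stand-in (the real class lives in `urllib3.util.timeout`):

```python
import warnings

class Timeout:
    """Minimal stand-in for urllib3.util.timeout.Timeout."""
    DEFAULT_TIMEOUT = object()  # sentinel meaning "use the default"

    def __init__(self, total=None, connect=None, read=None):
        self.total = total
        self._connect = connect
        self._read = read

def get_absolute_timeout(timeout):
    """URLFetch takes one absolute deadline, so granular connect/read
    settings collapse to `total` (with a warning), and the sentinel
    defers to URLFetch's own default by returning None."""
    if timeout is Timeout.DEFAULT_TIMEOUT:
        return None
    if isinstance(timeout, Timeout):
        if timeout._read is not None or timeout._connect is not None:
            warnings.warn(
                "URLFetch does not support granular timeout settings, "
                "reverting to total or default URLFetch timeout."
            )
        return timeout.total
    return timeout  # a plain number passes through unchanged
```

This matches the behavior the diff preserves: only `Timeout.total` survives the translation to URLFetch's `deadline` parameter.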

+ 67 - 67
contrib/python/urllib3/urllib3/contrib/ntlmpool.py

@@ -1,18 +1,18 @@
-""" 
-NTLM authenticating pool, contributed by erikcederstran 
- 
-Issue #10, see: http://code.google.com/p/urllib3/issues/detail?id=10 
-""" 
-from __future__ import absolute_import 
- 
+"""
+NTLM authenticating pool, contributed by erikcederstran
+
+Issue #10, see: http://code.google.com/p/urllib3/issues/detail?id=10
+"""
+from __future__ import absolute_import
+
 import warnings
-from logging import getLogger 
+from logging import getLogger
+
+from ntlm import ntlm
+
+from .. import HTTPSConnectionPool
+from ..packages.six.moves.http_client import HTTPSConnection
 
-from ntlm import ntlm 
- 
-from .. import HTTPSConnectionPool 
-from ..packages.six.moves.http_client import HTTPSConnection 
- 
 warnings.warn(
     "The 'urllib3.contrib.ntlmpool' module is deprecated and will be removed "
     "in urllib3 v2.0 release, urllib3 is not able to support it properly due "
@@ -21,75 +21,75 @@ warnings.warn(
     DeprecationWarning,
 )
 
-log = getLogger(__name__) 
- 
- 
-class NTLMConnectionPool(HTTPSConnectionPool): 
-    """ 
-    Implements an NTLM authentication version of an urllib3 connection pool 
-    """ 
- 
+log = getLogger(__name__)
+
+
+class NTLMConnectionPool(HTTPSConnectionPool):
+    """
+    Implements an NTLM authentication version of an urllib3 connection pool
+    """
+
     scheme = "https"
- 
-    def __init__(self, user, pw, authurl, *args, **kwargs): 
-        """ 
-        authurl is a random URL on the server that is protected by NTLM. 
-        user is the Windows user, probably in the DOMAIN\\username format. 
-        pw is the password for the user. 
-        """ 
-        super(NTLMConnectionPool, self).__init__(*args, **kwargs) 
-        self.authurl = authurl 
-        self.rawuser = user 
+
+    def __init__(self, user, pw, authurl, *args, **kwargs):
+        """
+        authurl is a random URL on the server that is protected by NTLM.
+        user is the Windows user, probably in the DOMAIN\\username format.
+        pw is the password for the user.
+        """
+        super(NTLMConnectionPool, self).__init__(*args, **kwargs)
+        self.authurl = authurl
+        self.rawuser = user
         user_parts = user.split("\\", 1)
-        self.domain = user_parts[0].upper() 
-        self.user = user_parts[1] 
-        self.pw = pw 
- 
-    def _new_conn(self): 
-        # Performs the NTLM handshake that secures the connection. The socket 
-        # must be kept open while requests are performed. 
-        self.num_connections += 1 
+        self.domain = user_parts[0].upper()
+        self.user = user_parts[1]
+        self.pw = pw
+
+    def _new_conn(self):
+        # Performs the NTLM handshake that secures the connection. The socket
+        # must be kept open while requests are performed.
+        self.num_connections += 1
         log.debug(
             "Starting NTLM HTTPS connection no. %d: https://%s%s",
             self.num_connections,
             self.host,
             self.authurl,
         )
- 
+
         headers = {"Connection": "Keep-Alive"}
         req_header = "Authorization"
         resp_header = "www-authenticate"
- 
-        conn = HTTPSConnection(host=self.host, port=self.port) 
- 
-        # Send negotiation message 
+
+        conn = HTTPSConnection(host=self.host, port=self.port)
+
+        # Send negotiation message
         headers[req_header] = "NTLM %s" % ntlm.create_NTLM_NEGOTIATE_MESSAGE(
             self.rawuser
         )
         log.debug("Request headers: %s", headers)
         conn.request("GET", self.authurl, None, headers)
-        res = conn.getresponse() 
-        reshdr = dict(res.getheaders()) 
+        res = conn.getresponse()
+        reshdr = dict(res.getheaders())
         log.debug("Response status: %s %s", res.status, res.reason)
         log.debug("Response headers: %s", reshdr)
         log.debug("Response data: %s [...]", res.read(100))
- 
-        # Remove the reference to the socket, so that it can not be closed by 
-        # the response object (we want to keep the socket open) 
-        res.fp = None 
- 
-        # Server should respond with a challenge message 
+
+        # Remove the reference to the socket, so that it can not be closed by
+        # the response object (we want to keep the socket open)
+        res.fp = None
+
+        # Server should respond with a challenge message
         auth_header_values = reshdr[resp_header].split(", ")
-        auth_header_value = None 
-        for s in auth_header_values: 
+        auth_header_value = None
+        for s in auth_header_values:
             if s[:5] == "NTLM ":
-                auth_header_value = s[5:] 
-        if auth_header_value is None: 
+                auth_header_value = s[5:]
+        if auth_header_value is None:
             raise Exception(
                 "Unexpected %s response header: %s" % (resp_header, reshdr[resp_header])
             )
- 
-        # Send authentication message 
+
+        # Send authentication message
         ServerChallenge, NegotiateFlags = ntlm.parse_NTLM_CHALLENGE_MESSAGE(
             auth_header_value
         )
@@ -99,19 +99,19 @@ class NTLMConnectionPool(HTTPSConnectionPool):
         headers[req_header] = "NTLM %s" % auth_msg
         log.debug("Request headers: %s", headers)
         conn.request("GET", self.authurl, None, headers)
-        res = conn.getresponse() 
+        res = conn.getresponse()
         log.debug("Response status: %s %s", res.status, res.reason)
         log.debug("Response headers: %s", dict(res.getheaders()))
         log.debug("Response data: %s [...]", res.read()[:100])
-        if res.status != 200: 
-            if res.status == 401: 
+        if res.status != 200:
+            if res.status == 401:
                 raise Exception("Server rejected request: wrong username or password")
             raise Exception("Wrong server response: %s %s" % (res.status, res.reason))
- 
-        res.fp = None 
+
+        res.fp = None
         log.debug("Connection established")
-        return conn 
- 
+        return conn
+
     def urlopen(
         self,
         method,
@@ -122,8 +122,8 @@ class NTLMConnectionPool(HTTPSConnectionPool):
         redirect=True,
         assert_same_host=True,
     ):
-        if headers is None: 
-            headers = {} 
+        if headers is None:
+            headers = {}
         headers["Connection"] = "Keep-Alive"
         return super(NTLMConnectionPool, self).urlopen(
             method, url, body, headers, retries, redirect, assert_same_host
