mirror of https://github.com/kevin1024/vcrpy.git synced 2025-12-08 16:53:23 +00:00

316 Commits

Author SHA1 Message Date
Matthias (~talfus-laddus)
d5ba702a1b only log message if response is appended
Closes: https://github.com/kevin1024/vcrpy/issues/685
2025-11-19 13:53:06 -03:00
Seow Alex
952994b365 Patch httpcore instead of httpx 2025-11-19 13:44:40 -03:00
Jair Henrique
e2f3240835 Fixes some tornado tests 2025-11-19 12:18:05 -03:00
Jair Henrique
bb690833bc Enables SIM ruff lint 2025-11-19 12:18:05 -03:00
Jair Henrique
73eed94c47 Drops Python 3.9 support 2025-11-19 12:18:05 -03:00
dependabot[bot]
a23fe0333a build(deps): bump actions/setup-python from 5 to 6
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5 to 6.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v5...v6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-19 11:14:40 -03:00
dependabot[bot]
bb743861b6 build(deps): bump astral-sh/setup-uv from 6 to 7
Bumps [astral-sh/setup-uv](https://github.com/astral-sh/setup-uv) from 6 to 7.
- [Release notes](https://github.com/astral-sh/setup-uv/releases)
- [Commits](https://github.com/astral-sh/setup-uv/compare/v6...v7)

---
updated-dependencies:
- dependency-name: astral-sh/setup-uv
  dependency-version: '7'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-19 06:57:48 -03:00
dependabot[bot]
ac70eaa17f build(deps): bump astral-sh/setup-uv from 5 to 6
Bumps [astral-sh/setup-uv](https://github.com/astral-sh/setup-uv) from 5 to 6.
- [Release notes](https://github.com/astral-sh/setup-uv/releases)
- [Commits](https://github.com/astral-sh/setup-uv/compare/v5...v6)

---
updated-dependencies:
- dependency-name: astral-sh/setup-uv
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-11 16:57:56 -03:00
dependabot[bot]
d50f3385a6 build(deps): bump actions/checkout from 4 to 5
Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-26 09:37:05 -03:00
Jair Henrique
14db4de224 Use uv on CI 2025-08-26 09:35:26 -03:00
Sebastian Pipping
2c4df79498 Merge pull request #917 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2025-08-01 19:17:52 +02:00
pre-commit
1456673cb4 pre-commit: Autoupdate 2025-08-01 16:06:54 +00:00
pre-commit
19bd4e012c pre-commit: Autoupdate 2025-03-23 16:52:07 -03:00
Karolina Surma
558c7fc625 Import iscoroutinefunction() from inspect rather than asyncio
The asyncio function is deprecated starting from Python 3.14 and
will be removed from Python 3.16.
2025-03-23 16:49:33 -03:00
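The import swap described in the commit above can be sketched as follows; this is a minimal illustration of the deprecation-safe import, not vcrpy's actual diff (function names here are made up):

```python
# Minimal illustration of the import change described above:
# asyncio.iscoroutinefunction is deprecated starting from Python 3.14,
# while inspect.iscoroutinefunction is the supported replacement.
from inspect import iscoroutinefunction

async def play_cassette():  # hypothetical coroutine
    pass

def sync_helper():  # hypothetical plain function
    pass

print(iscoroutinefunction(play_cassette))  # True
print(iscoroutinefunction(sync_helper))    # False
```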
Sebastian Pipping
8217a4c21b Merge pull request #903 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2025-01-17 21:03:56 +01:00
pre-commit
bd0aa59cd2 pre-commit: Autoupdate 2025-01-17 20:50:18 +01:00
Sebastian Pipping
9a37817a3a Merge pull request #904 from kevin1024/fix-ci
Fix CI for error `unshare: write failed /proc/self/uid_map: Operation not permitted` with Ubuntu >=24.04
2025-01-17 20:49:41 +01:00
Sebastian Pipping
b4c65bd677 main.yml: Allow creation of user namespaces to unshare in Ubuntu >=24.04 2025-01-17 20:36:18 +01:00
Sebastian Pipping
93bc59508c Pin GitHub Actions and Read the Docs to explicit Ubuntu 24.04 2025-01-17 20:16:30 +01:00
pre-commit
e313a9cd52 pre-commit: Autoupdate 2025-01-12 09:28:43 -03:00
Sebastian Pipping
5f1b20c4ca Merge pull request #763 from danielnsilva/drop-unused-requests
Add an option to remove unused requests from cassette
2025-01-11 20:51:28 +01:00
Daniel Silva
cd31d71901 refactor: move logic for building used interactions dict before saving 2025-01-11 16:56:59 +00:00
Daniel Silva
4607ca1102 fix: add drop_unused_requests check in cassette saving logic 2025-01-11 16:55:04 +00:00
Daniel Silva
e3ced4385e docs: update example in advanced.rst
Co-authored-by: Sebastian Pipping <sebastian@pipping.org>
2025-01-11 11:21:55 -05:00
Jair Henrique
80099ac6d7 Clean pytest configurations 2025-01-08 16:16:56 -03:00
Sebastian Pipping
440bc20faf Merge pull request #809 from alga/alga-https-proxy
Fix HTTPS proxy handling
2025-01-08 00:20:54 +01:00
Albertas Agejevas
3ddff27cda Remove redundant assertions.
They are covered by the next line.
2025-01-07 21:48:24 +02:00
Albertas Agejevas
30b423e8c0 Use mode="none" in proxy tests as suggested by @hartwork. 2025-01-07 21:48:24 +02:00
Albertas Agejevas
752ba0b749 Fix HTTPS proxy handling. 2025-01-07 21:48:24 +02:00
Albertas Agejevas
c16e526d6a Integration test for HTTPS proxy handling. 2025-01-07 21:48:24 +02:00
Daniel Silva
d64cdd337b style: fix formatting issues to comply with pre-commit hooks 2025-01-04 23:45:43 +00:00
Martin Brunthaler
ac230b76af Call urllib.parse less frequently 2025-01-04 15:42:49 -03:00
pre-commit
965f3658d5 pre-commit: Autoupdate 2025-01-04 15:21:25 -03:00
Jair Henrique
6465a5995b Fix docs conf 2025-01-04 15:19:50 -03:00
dependabot[bot]
69ca261a88 build(deps): bump sphinx-rtd-theme from 2.0.0 to 3.0.2
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 2.0.0 to 3.0.2.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/2.0.0...3.0.2)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-04 15:19:50 -03:00
kevin1024
3278619dcc Release v7.0.0 2024-12-30 18:59:50 -05:00
Aleksei Kozharin
3fb62e0f9b fix: correctly handle asyncio.run when loop exists 2024-12-30 13:34:03 -03:00
dependabot[bot]
81978659f1 build(deps): update sphinx requirement from <8 to <9
Updates the requirements on [sphinx](https://github.com/sphinx-doc/sphinx) to permit the latest version.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/v8.0.2/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v0.1.61611...v8.0.2)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-12-28 14:03:41 -03:00
pre-commit
be651bd27c pre-commit: Autoupdate 2024-12-28 13:48:49 -03:00
Jair Henrique
a6698ed060 Fix aiohttp tests 2024-12-28 13:42:54 -03:00
Igor Gumenyuk
48d0a2e453 Fixed missing version_string attribute when used with urllib3>=2.3.0
urllib3 v2.3.0 introduced the attribute `version_string` (https://github.com/urllib3/urllib3/pull/3316/files). This attribute is missing from `VCRHTTPResponse`, which causes errors like `AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'`.

This fixes https://github.com/kevin1024/vcrpy/issues/888
2024-12-28 13:39:53 -03:00
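A minimal sketch of the failure mode and fix described above (the class here is a stand-in for `VCRHTTPResponse`, and deriving the string from the numeric HTTP version is an assumption for illustration, not vcrpy's exact code):

```python
# Sketch: urllib3 >=2.3.0 reads response.version_string, so a recorded
# response object must expose that attribute too. Here it is derived
# from the numeric HTTP version (10 -> HTTP/1.0, 11 -> HTTP/1.1).
class RecordedResponse:  # stand-in for VCRHTTPResponse
    def __init__(self, version=11):
        self.version = version
        self.version_string = {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(version, "HTTP/?")

print(RecordedResponse().version_string)  # HTTP/1.1
```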
Jair Henrique
5b858b132d Fix lint 2024-12-28 13:25:04 -03:00
Jair Henrique
c8d99a99ec Fix ruff configuration 2024-12-28 13:21:15 -03:00
Sebastian Pipping
ce27c63685 Merge pull request #736 from kevin1024/drop-python38
[14 Oct 2024] Drop python 3.8 support
2024-10-13 04:05:13 +02:00
Jair Henrique
ab8944d3ca Drop python 3.8 support 2024-10-12 22:44:42 -03:00
Sebastian Pipping
c6a7f4ae15 Merge pull request #872 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-10-11 01:22:08 +02:00
Kevin McCarthy
1d100dda25 release v6.0.2 2024-10-07 08:55:44 -04:00
pre-commit
7275e5d65d pre-commit: Autoupdate 2024-10-04 16:05:32 +00:00
Sebastian Pipping
c6be705fb4 Merge pull request #871 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-09-20 22:37:01 +02:00
pre-commit
10b7f4efb3 pre-commit: Autoupdate 2024-09-20 16:05:30 +00:00
Sebastian Pipping
7a6ef00f4d Merge pull request #870 from kevin1024/dependabot/github_actions/peter-evans/create-pull-request-7
build(deps): bump peter-evans/create-pull-request from 6 to 7
2024-09-19 00:33:49 +02:00
Sebastian Pipping
3bf6ac7184 Merge pull request #867 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-09-19 00:32:45 +02:00
pre-commit
983b2202ed pre-commit: Autoupdate 2024-09-19 00:28:52 +02:00
dependabot[bot]
15a6b71997 build(deps): bump peter-evans/create-pull-request from 6 to 7
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 6 to 7.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](https://github.com/peter-evans/create-pull-request/compare/v6...v7)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-18 22:27:51 +00:00
Sebastian Pipping
1ca708dcff Merge pull request #862 from kevin1024/test-on-313
Start testing with CPython 3.13
2024-09-19 00:00:50 +02:00
Thomas Grainger
f5597fa6c1 use pytest_httbin.certs.where() for cafile 2024-09-18 22:36:17 +01:00
Thomas Grainger
2b3247b3df remove redundant load_cert_chain 2024-09-18 22:35:15 +01:00
Thomas Grainger
d123a5e8d0 replace fixture with constant 2024-09-18 22:34:48 +01:00
Thomas Grainger
e2815fbc88 move httbin_ssl_context fixture into the one place it's used 2024-09-18 22:31:36 +01:00
Thomas Grainger
f9d4500c6e test on 3.13 2024-09-18 19:24:37 +01:00
Sebastian Pipping
71eb624708 Merge pull request #851 from sathieu/consume-body-once2
Ensure body is consumed only once (alternative to #847)
2024-08-02 19:16:38 +02:00
Sebastian Pipping
dc449715c1 Merge pull request #864 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-08-02 19:15:52 +02:00
pre-commit
275b9085f3 pre-commit: Autoupdate 2024-08-02 16:05:06 +00:00
Thomas Grainger
35650b141b Merge pull request #856 from connortann/patch-1
Fix package install: remove use of deprecated setuptools.command.test
2024-07-29 17:35:23 +01:00
Thomas Grainger
9c8b679136 remove unused sys import from setup.py 2024-07-29 17:27:19 +01:00
connortann
fab082eff5 Update setup.py: remove use of deprecated setuptools.command.test 2024-07-29 11:10:18 +01:00
Sebastian Pipping
ffc04f9128 Merge pull request #853 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-07-27 01:25:13 +02:00
pre-commit
4d84da1809 pre-commit: Autoupdate 2024-07-26 16:05:08 +00:00
Mathieu Parent
241b0bbd91 Ensure body is consumed only once
Fixes: #846
Signed-off-by: Mathieu Parent <math.parent@gmail.com>
2024-07-21 22:53:45 +02:00
Sebastian Pipping
042e16c3e4 Merge pull request #850 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-07-13 21:52:05 +02:00
pre-commit
acef3f49bf pre-commit: Autoupdate 2024-07-05 16:05:04 +00:00
Sebastian Pipping
9cfa6c5173 Merge pull request #845 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-06-30 15:24:31 +02:00
Sebastian Pipping
39a86ba3cf pre-commit: Fix Ruff invocation for >=0.5.0
Related changes in Ruff:
https://github.com/astral-sh/ruff/pull/9687
2024-06-30 15:18:35 +02:00
pre-commit
543c72ba51 pre-commit: Autoupdate 2024-06-28 16:04:49 +00:00
Sebastian Pipping
86b114f2f5 Merge pull request #831 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-06-02 15:33:33 +02:00
Sebastian Pipping
4b06f3dba1 Merge pull request #836 from chuckwondo/fix-testing-instructions
Fix typos in testing instructions
2024-06-02 15:29:57 +02:00
pre-commit
1c6503526b pre-commit: Autoupdate 2024-06-02 15:27:57 +02:00
Sebastian Pipping
c9c05682cb Merge pull request #843 from kevin1024/fix-ci-through-recent-setuptools
Fix CI through recent Setuptools + start building once a week to notice CI issues earlier
2024-06-02 15:27:23 +02:00
Sebastian Pipping
39c8648aa7 main.yml: Make sure that build issues do not go unnoticed for >7 days 2024-06-02 15:23:51 +02:00
Sebastian Pipping
dfff84d5bb main.yml: Fix "pypy-3.9,urllib3<2" CI through recent Setuptools
Command "pip install codecov '.[tests]' 'urllib3<2'" was failing
with error:
> hpy 0.9.0 has requirement setuptools>=64.0, but you have
> setuptools 58.1.0.
2024-06-02 15:23:08 +02:00
Chuck Daniels
40ac0de652 Fix typos in test commands 2024-04-21 19:58:00 -04:00
Sebastian Pipping
f3147f574b Merge pull request #830 from pjonsson/permit-urllib3
Permit urllib3 >=2 for non-PyPy Python >=3.10 in order to help users of Poetry
2024-03-11 19:35:43 +01:00
Sebastian Pipping
298a6933ff Merge pull request #829 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-03-10 23:17:01 +01:00
Peter A. Jonsson
52da776b59 Permit urllib3 2.x for non-PyPy Python >=3.10
Poetry makes platform-independent lock files, so the PyPy marker is there even when using CPython >= 3.10.

Add a third constraint that permits any urllib3 version when using Python >=3.10 and an implementation other than PyPy.
2024-03-10 22:37:04 +01:00
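A sketch of what such a three-way set of PEP 508 environment markers could look like (illustrative only; the exact constraint strings live in vcrpy's setup.py and may differ):

```python
# Illustrative PEP 508 markers for the three cases described above
# (not the exact strings used by vcrpy):
install_requires = [
    "urllib3 <2; platform_python_implementation == 'PyPy'",
    "urllib3 <2; python_version < '3.10' and platform_python_implementation != 'PyPy'",
    "urllib3; python_version >= '3.10' and platform_python_implementation != 'PyPy'",
]
print(len(install_requires))  # 3
```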
pre-commit
8842fb1c3a pre-commit: Autoupdate 2024-03-08 16:04:43 +00:00
Sebastian Pipping
6c4ba172d8 Merge pull request #827 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-03-06 14:40:32 +01:00
pre-commit
c88f2c0dab pre-commit: Mass-apply ruff formatter 2024-03-06 14:35:01 +01:00
pre-commit
3fd6b1c0b4 pre-commit: Autoupdate 2024-03-01 16:04:42 +00:00
Sebastian Pipping
c6d87309f4 Merge pull request #823 from mgorny/httpbin-compat
Improve test compatibility with legacy httpbin index
2024-02-28 20:38:23 +01:00
Sebastian Pipping
1fb9179cf9 Merge pull request #813 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-02-28 20:37:08 +01:00
pre-commit
a58e0d8830 pre-commit: Autoupdate 2024-02-23 16:04:44 +00:00
dependabot[bot]
acc101412d build(deps): bump pre-commit/action from 3.0.0 to 3.0.1
Bumps [pre-commit/action](https://github.com/pre-commit/action) from 3.0.0 to 3.0.1.
- [Release notes](https://github.com/pre-commit/action/releases)
- [Commits](https://github.com/pre-commit/action/compare/v3.0.0...v3.0.1)

---
updated-dependencies:
- dependency-name: pre-commit/action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-21 10:39:22 -03:00
Michał Górny
e60dafb8dc Improve test compatibility with legacy httpbin index
Make the tests slightly more flexible to match both the flasgger-based
and legacy httpbin index.  This is needed for compatibility with
https://github.com/psf/httpbin/pull/44 when flasgger is not installed
(e.g. on architectures that are not supported by Rust).
2024-02-16 19:33:41 +01:00
dependabot[bot]
3ce5979acb build(deps): bump peter-evans/create-pull-request from 5 to 6
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 5 to 6.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](https://github.com/peter-evans/create-pull-request/compare/v5...v6)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-06 10:13:52 -03:00
Kevin McCarthy
68038d0559 release v6.0.1 2024-01-25 11:13:56 -05:00
Thomas Grainger
f76289aa78 Merge pull request #811 from kevin1024/graingert-patch-1
return values from generator decorator
2024-01-24 13:22:32 +00:00
Thomas Grainger
6252b92f50 add test for generator decorator return 2024-01-23 17:29:35 +00:00
Kevin McCarthy
1e3a5ac753 Release v6.0.0 2024-01-23 10:54:46 -05:00
Thomas Grainger
b1c45cd249 return values from generator decorator 2024-01-23 15:07:23 +00:00
Thomas Grainger
3a5ff1c1ce Merge pull request #810 from graingert/fix-tornado-related-deprecation-warnings
fix tornado related warnings by replacing pytest-tornado with a simple coroutine runner
2024-01-23 14:02:31 +00:00
Thomas Grainger
bf80673454 fix tornado related warnings 2024-01-23 13:02:29 +00:00
Thomas Grainger
8028420cbb Merge pull request #801 from graingert/fix-resource-warning-2 2024-01-23 12:39:35 +00:00
Thomas Grainger
784b2dcb29 tornado test_redirects is no longer an online test 2024-01-23 12:33:45 +00:00
Thomas Grainger
42b4a5d2fa move off of mockbin on tornado tests also 2024-01-23 12:33:15 +00:00
Thomas Grainger
b7f6c2fce2 mark tornado tests as online 2024-01-23 12:24:48 +00:00
Thomas Grainger
6d7a842a33 fix test_tornado_exception_can_be_caught RuntimeError: generator raised StopIteration 2024-01-23 12:24:48 +00:00
Thomas Grainger
db1f5b0dee tornado 6 changes raise_error behaviour 2024-01-23 12:12:52 +00:00
Thomas Grainger
c6667ac56c restore scheme fixture for tests 2024-01-23 12:12:05 +00:00
Thomas Grainger
a093fb177d add new deprecation warnings for tornado tests 2024-01-23 12:09:39 +00:00
Thomas Grainger
666686b542 restore pytest-tornado 2024-01-23 11:12:02 +00:00
Thomas Grainger
5104b1f462 Merge branch 'master' of github.com:kevin1024/vcrpy into fix-resource-warning-2 2024-01-23 11:03:49 +00:00
Jair Henrique
62fe272a8e Remove tox reference from contributing docs 2024-01-22 23:17:25 -03:00
Jair Henrique
f9b69d8da7 Remove tox reference from runtests.sh file 2024-01-22 23:17:25 -03:00
Jair Henrique
cb77cb8f69 Remove tox.ini file 2024-01-22 23:17:25 -03:00
Jair Henrique
e37fc9ab6e Refactor ci main workflow to use matrix to run tests without tox 2024-01-22 23:17:25 -03:00
Jair Henrique
abbb50135f Organize dependencies for tests and extras requires 2024-01-22 23:17:25 -03:00
Jair Henrique
0594de9b3e Remove tox.ini from MANIFEST.in file 2024-01-22 23:17:25 -03:00
Jair Henrique
53f686aa5b Refactor test to not use tox.ini file 2024-01-22 23:17:25 -03:00
pre-commit
1677154f04 pre-commit: Autoupdate 2024-01-22 23:14:05 -03:00
Allan Crooks
54bc6467eb Run linters. 2024-01-22 23:13:10 -03:00
Allan Crooks
c5487384ee Fix handling of encoded content in HTTPX stub.
Also copied over and adjusted some of the tests from
test_requests.py relating to gzipped handling to show
that the HTTPX stub is behaving in a consistent way to
how the requests stub is.
2024-01-22 23:13:10 -03:00
Allan Crooks
5cf23298ac HTTPX stub now generates cassettes in the same format as other stubs.
As part of this, I've removed the tests which inspect the
data type of the response content in the cassette. That
behaviour should be controlled via the inbuilt serializers.
2024-01-22 23:13:10 -03:00
Allan Crooks
5fa7010712 Allow HTTPX stub to read cassettes generated by other stubs.
This was due to a custom format being defined in the HTTPX stub.
2024-01-22 23:13:10 -03:00
pre-commit
f1e0241673 pre-commit: Autoupdate 2024-01-05 16:36:48 -03:00
pre-commit
a3a255d606 pre-commit: Autoupdate 2024-01-02 10:58:59 -03:00
dependabot[bot]
0782382982 build(deps): bump actions/checkout from 3 to 4
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-18 12:16:06 -03:00
dependabot[bot]
395d2be295 build(deps): bump actions/setup-python from 4 to 5
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4 to 5.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-18 11:08:49 -03:00
Thomas Grainger
ee6e7905e9 Merge branch 'master' into fix-resource-warning-2 2023-12-15 19:13:39 +00:00
Thomas Grainger
cc4d03c62e close unremoved connections (pool already removed the connection) 2023-12-15 19:11:25 +00:00
Thomas Grainger
8e13af2ee9 use context manager for requests.Session 2023-12-15 18:38:56 +00:00
pre-commit
b522d3f0a3 pre-commit: Autoupdate 2023-12-15 14:46:28 -03:00
Thomas Grainger
d39c26b358 remember to close removed connections 2023-12-15 14:26:44 +00:00
Thomas Grainger
d76c243513 Revert "Fix ResourceWarning unclosed socket"
This reverts commit f4144359f6.
2023-12-15 14:01:14 +00:00
Thomas Grainger
5cff354ec8 Revert "fix a KeyError"
This reverts commit fa789e975b.
2023-12-15 14:01:10 +00:00
Thomas Grainger
80614dbd00 fix resource warning due to pytest-asyncio 2023-12-15 13:51:52 +00:00
Thomas Grainger
356ff4122c fix sync do_request().stream 2023-12-15 13:40:06 +00:00
Thomas Grainger
cf765928ac remove redundant contextlib import 2023-12-15 13:35:31 +00:00
Thomas Grainger
73d11e80eb fix httpx resource warnings 2023-12-15 13:34:52 +00:00
Thomas Grainger
97de8a0fce ignore warning from dateutil 2023-12-15 13:09:13 +00:00
Thomas Grainger
895ae205ca use asyncio.run to run coroutines 2023-12-15 11:39:51 +00:00
Thomas Grainger
f075c8b0b4 close aiohttp session on errors 2023-12-15 11:39:16 +00:00
Thomas Grainger
3919cb2573 remember to close the VCRHTTPSConnection 2023-12-15 11:08:49 +00:00
Thomas Grainger
bddec2e62a use socketserver.ThreadingTCPServer as a contextmanager 2023-12-15 11:08:31 +00:00
Thomas Grainger
fa789e975b fix a KeyError 2023-12-15 11:07:56 +00:00
Thomas Grainger
556fd0166c enable filterwarnings=error 2023-12-15 11:07:40 +00:00
Thomas Grainger
17c78bff9e Merge branch 'master' of github.com:kevin1024/vcrpy into fix-resource-warning-2 2023-12-15 10:48:27 +00:00
Sebastian Pipping
713cb36d35 Merge pull request #800 from kevin1024/pre-commit-ci
Start using pre-commit in CI
2023-12-12 20:28:19 +01:00
Sebastian Pipping
b0cb8765d5 pre-commit: Add --show-source to ruff
Suggested by Jair Henrique.
2023-12-12 20:01:09 +01:00
Sebastian Pipping
97ad51fe6c pre-commit: Enable trailing-whitespace 2023-12-12 20:01:09 +01:00
Sebastian Pipping
1dd9cbde8b pre-commit: Mass-apply trailing-whitespace 2023-12-12 20:01:09 +01:00
Sebastian Pipping
962284072b pre-commit: Enable end-of-file-fixer 2023-12-12 19:01:50 +01:00
Sebastian Pipping
e9102b2bb4 pre-commit: Mass-apply end-of-file-fixer 2023-12-12 19:01:50 +01:00
Sebastian Pipping
957c8bd7a3 pre-commit: Protect against accidental merge conflict markers 2023-12-12 19:01:50 +01:00
Sebastian Pipping
2d5f8a499e lint.yml: Drop as superseded by pre-commit.yml 2023-12-12 19:01:50 +01:00
Sebastian Pipping
e5555a5d5b pre-commit: Make CI keep the config up to date via pull requests 2023-12-12 19:01:50 +01:00
Sebastian Pipping
a542567e4a pre-commit: Integrate with GitHub Actions CI 2023-12-12 19:01:50 +01:00
Sebastian Pipping
3168e7813e pre-commit: Enable Ruff and Ruff Black-style formatting 2023-12-12 19:00:57 +01:00
Jair Henrique
88cf01aa14 Fix format code 2023-12-12 14:24:22 -03:00
Parker Hancock
85ae012d9c fix linting 2023-12-12 14:24:22 -03:00
Parker Hancock
db1e9e7180 make cassettes human readable 2023-12-12 14:24:22 -03:00
Jair Henrique
dbf7a3337b Show ruff diff errors 2023-12-12 13:58:25 -03:00
dependabot[bot]
dd97b02b72 build(deps): bump sphinx-rtd-theme from 1.3.0 to 2.0.0
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.3.0 to 2.0.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.3.0...2.0.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-11 11:27:04 -03:00
dependabot[bot]
e8346ad30e build(deps): bump actions/setup-python from 4 to 5
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4 to 5.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-11 10:42:13 -03:00
Sebastian Pipping
6a31904333 Merge pull request #789 from kevin1024/fix-imports-to-assertions-py
tests: Fix imports to `tests/assertions.py` (fixes #773)
2023-12-11 00:58:01 +01:00
Jair Henrique
796dc8de7e Move lint from tox to gh action 2023-12-10 20:48:01 -03:00
Sebastian Pipping
ecb5d84f0f tests: Fix imports to tests/assertions.py 2023-12-11 00:36:46 +01:00
Jair Henrique
cebdd45849 Use ruff to check code format instead of black 2023-12-10 20:12:09 -03:00
Sebastian Pipping
8a8d46f130 Merge pull request #775 from kevin1024/python-3-12
Finish up on Python 3.12 support
2023-12-10 23:49:46 +01:00
Sebastian Pipping
954a100dfd Finish up on Python 3.12 support 2023-12-10 23:33:30 +01:00
Sebastian Pipping
604c0be571 Merge pull request #787 from kevin1024/fix-pypy-3-10
Fix CI / Block urllib3 >=2 for PyPy (alternative to #786)
2023-12-10 23:32:23 +01:00
Sebastian Pipping
0e57182207 setup.py: Block urllib3 >=2 for PyPy (including 3.10)
It kept failing CI
2023-12-08 23:05:09 +01:00
Rob Brackett
c062c9f54c Remove spaces at end-of-line in changelog
This matches the project's `.editorconfig` rules.
2023-12-08 16:59:39 -03:00
Rob Brackett
2abf1188a9 Fix list formatting in v5.1.0 changelog
The list of changes was not indented enough, and so didn't actually get formatted as a list when rendering HTML, which left it pretty unreadable. This also adds a blank line between the last 4.x version and 5.0.0 to match the extra blank lines between other major versions.
2023-12-08 16:59:39 -03:00
dependabot[bot]
2b2935a1e7 build(deps): bump sphinx-rtd-theme from 1.2.2 to 1.3.0
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.2.2 to 1.3.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.2.2...1.3.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-08 16:39:02 -03:00
dependabot[bot]
a8545c89a5 build(deps): bump actions/checkout from 3 to 4
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-08 16:38:47 -03:00
Parker Hancock
5532c0b4cf more attempts to make the linters happy 2023-12-08 16:38:33 -03:00
Parker Hancock
f4467a8d6c make linters happy 2023-12-08 16:38:33 -03:00
Parker Hancock
f5fc7aac22 fix tests 2023-12-08 16:38:33 -03:00
Parker Hancock
e8e9a4af9f remove unnecessary comment 2023-12-08 16:38:33 -03:00
Parker Hancock
7bf8f65815 fixes for httpx 2023-12-08 16:38:33 -03:00
Michał Górny
defad28771 Disable C extension in aiohttp to fix Python 3.12 install
Disable the C extension in aiohttp that is incompatible with Python 3.12 as of aiohttp 3.8.5, in order to make it possible to install aiohttp (in its pure-Python form) for testing.
2023-08-09 10:09:12 -03:00
Michał Górny
69621c67fb Copy debuglevel and _http_vsn attrs into response classes
Copy the `debuglevel` and `_http_vsn` attributes from base connection
class into response classes, in order to fix compatibility with
Python 3.12.  For reasons I don't comprehend, these end up being called
on the class rather than instance, so regular proxying logic does not
work.

Fixes #707
2023-08-09 10:09:12 -03:00
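A sketch of the Python 3.12 compatibility fix described above (class names are stand-ins, not vcrpy's actual classes): because `http.client` reads these attributes on the class rather than the instance, instance-level `__getattr__` proxying never fires, so the attributes are copied onto the response class itself.

```python
# Sketch: debuglevel and _http_vsn are accessed at class level, so they
# are copied from the base connection class onto the response class.
class FakeConnection:  # stand-in for the base connection class
    debuglevel = 0
    _http_vsn = 11  # HTTP/1.1

class FakeResponse:  # stand-in for the VCR response class
    debuglevel = FakeConnection.debuglevel
    _http_vsn = FakeConnection._http_vsn

# Class-level access, as http.client performs it:
print(FakeResponse._http_vsn)  # 11
```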
Michał Górny
469a10b980 Enable testing on pypy-3.9 & 3.10 2023-08-09 10:09:12 -03:00
Michał Górny
d90cea0260 Enable testing on Python 3.12 2023-08-09 10:09:12 -03:00
Harmon
c9da7a102f Configure Read the Docs to install the library 2023-08-07 08:34:45 -03:00
Mohammad Razavi
f4144359f6 Fix ResourceWarning unclosed socket
This PR fixes issue #710 by properly closing the underlying socket. It
first uses `pool._put_conn` to keep the connection in the pool, and
later removes and closes it when the context manager exits.

I was unsure about the exact purpose of the `ConnectionRemover` class, so I made minimal changes to minimize the risk of breaking the code; there may be better solutions for fixing this issue.

For example, the `urllib3.connectionpool.HTTPConnectionPool` will utilize a weakref to terminate pool connections. By appending our connection to it, it will also take care of closing our connection. So another solution could be to modify the `__exit__` method of
`patch.ConnectionRemover` and add our connection back to the pool:

```py
class ConnectionRemover:
    ...

    def __exit__(self, *args):
        for pool, connections in self._connection_pool_to_connections.items():
            for connection in connections:
                if isinstance(connection, self._connection_class):
                    pool._put_conn(connection)
```
2023-08-07 08:42:58 +02:00
Jair Henrique
69de388649 Drop simplejson support 2023-08-01 08:53:31 -03:00
Jair Henrique
6446d00e27 Drop boto 2 support 2023-07-31 08:49:23 -03:00
Kevin McCarthy
d6bded1820 bump version to v5.1.0 2023-07-30 17:11:15 -10:00
Sebastian Pipping
e7c00a4bf9 Merge pull request #739 from kevin1024/issue-734-fix-body-matcher-for-chunked-requests
Fix body matcher for chunked requests (fixes #734)
2023-07-23 23:22:34 +02:00
dependabot[bot]
92dd4d00f7 build(deps): update sphinx requirement from <7 to <8
Updates the requirements on [sphinx](https://github.com/sphinx-doc/sphinx) to permit the latest version.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v0.1.61611...v7.0.1)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-07-23 18:19:26 -03:00
Jair Henrique
cf3ffcad61 Create action to validate docs 2023-07-23 18:09:49 -03:00
Jair Henrique
3ad462e766 Drop dependabot duplicity check 2023-07-23 18:09:49 -03:00
Jair Henrique
cdab3fcb30 Drop iscoroutinefunction fallback function for unsupported python 2023-07-23 12:30:58 -03:00
Jair Henrique
d3a5f4dd6c Add editorconfig file 2023-07-23 12:03:42 -03:00
Jair Henrique
75c8607fd2 Fix read the docs build 2023-07-20 13:17:15 -03:00
Jair Henrique
8c075c7fb3 Configure read the docs V2 2023-07-20 13:10:07 -03:00
Sebastian Pipping
a045a46bb4 Merge pull request #740 from kevin1024/issue-512-fix-query-param-filter-for-aiohttp
Fix query param filter for aiohttp (fixes #517)
2023-07-19 15:37:54 +02:00
Sebastian Pipping
1d979b078d Merge pull request #743 from charettes/remove-six
Remove unnecessary dependency on six.
2023-07-18 23:11:31 +02:00
Simon Charette
f7d76bd40a Remove unnecessary dependency on six.
Remove the last remaining usage of it in VCR.testcase.
2023-07-16 22:22:34 -04:00
Sebastian Pipping
7e11cfc9e4 Merge pull request #738 from kevin1024/json-loads-py36-plus
Make `json.loads` of Python >=3.6 decode bytes by itself
2023-07-11 17:30:39 +02:00
Sebastian Pipping
c95b7264a2 Merge pull request #735 from quasimik/patch-1
Fix typo in docs
2023-07-10 18:52:30 +02:00
Sebastian Pipping
8ab8e63e04 test_filter.py: Make test_filter_querystring meaner 2023-07-10 16:34:09 +02:00
Sebastian Pipping
d2c1da9ab7 test_aiohttp.py: Cover filter_query_parameters with aiohttp 2023-07-10 16:34:08 +02:00
Sebastian Pipping
8336d66976 aiohttp_stubs.py: Stop leaking unfiltered URL into cassette responses 2023-07-10 16:11:26 +02:00
Sebastian Pipping
e69b10c2e0 test_matchers.py: Briefly cover chunked transfer encoding 2023-07-08 02:25:51 +02:00
Sebastian Pipping
a6b9a070a5 matchers.py: Decode chunked request bodies 2023-07-08 02:25:51 +02:00
Sebastian Pipping
e35205c5c8 matchers.py: Support transforming the request body multiple times 2023-07-08 01:29:19 +02:00
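Together, the two matcher commits above amount to normalizing a chunked request body to plain bytes before matching. A minimal Python sketch of that idea (the helper name and exact dispatch are assumptions, not vcrpy's actual `matchers.py` code):

```python
def read_body(body):
    # A chunked request body may arrive as an iterable of byte chunks
    # (or a file-like object) rather than a single bytes value; joining
    # the chunks first lets the body matcher compare requests uniformly,
    # and the transform can safely run more than once on plain bytes.
    if hasattr(body, "read"):
        return body.read()
    if isinstance(body, (list, tuple)):
        return b"".join(body)
    return body

print(read_body([b'{"a": ', b"1}"]))  # b'{"a": 1}'
```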
Sebastian Pipping
05f61ea56c Make json.loads of Python >=3.6 decode bytes by itself
Quoting https://docs.python.org/3/library/json.html#json.loads :
> Changed in version 3.6: s can now be of type bytes or bytearray.
> The input encoding should be UTF-8, UTF-16 or UTF-32.
2023-07-07 20:00:57 +02:00
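A one-liner shows the behavior the commit relies on; no explicit decode is needed before parsing a bytes body:

```python
import json

# Since Python 3.6, json.loads accepts bytes/bytearray encoded as
# UTF-8, UTF-16 or UTF-32, so no manual .decode("utf-8") is needed.
body = b'{"status": "ok", "count": 3}'
data = json.loads(body)
print(data["status"])  # ok
```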
Michael Liu
943cabb14f docs/advanced.rst: fix typo 2023-07-04 16:45:49 +08:00
Jair Henrique
4f70152e7c Enable rule B (flake8-bugbear) on ruff 2023-06-27 17:36:26 -03:00
Jair Henrique
016a394f2c Enable E, W and F linters for ruff 2023-06-26 20:46:09 -03:00
Jair Henrique
6b2fc182c3 Improve string format 2023-06-26 20:46:09 -03:00
Jair Henrique
a77173c002 Use ruff as linter 2023-06-26 20:46:09 -03:00
Kevin McCarthy
34d5384318 bump version to v5.0.0 2023-06-26 12:54:39 -05:00
Sebastian Pipping
ad1010d0f8 Merge pull request #695 from kevin1024/drop37
Drop support for Python 3.7 (after 2023-06-27)
2023-06-26 18:32:42 +02:00
Amos Ng
d99593bcd3 Split persister errors into CassetteNotFoundError and CassetteDecodeError (#681) 2023-06-26 18:27:35 +02:00
Sebastian Pipping
8c03c37df4 Merge pull request #725 from kevin1024/make-assert-is-json-less-misleading
assertions.py: Fix misleading `assert_is_json`
2023-06-26 17:43:28 +02:00
Jair Henrique
b827cbe2da Drop support for Python 3.7 2023-06-26 11:46:20 -03:00
Kevin McCarthy
92ca5a102c fix misspelled word 2023-06-26 09:22:16 -05:00
Kevin McCarthy
d2281ab646 version bump to v4.4.0 2023-06-26 09:17:41 -05:00
Sebastian Pipping
f21c8f0224 assertions.py: Fix misleading assert_is_json
The parameter name "a_string" was mistaken, and the function
name "assert_is_json" was less clear than ideal,
given that it explicitly needs bytes, unlike json.loads.
2023-06-24 15:59:23 +02:00
Sebastian Pipping
8b97fd6551 Merge pull request #644 from neliseiska/replace_assert_with_raise
Replace `assert` with `raise AssertionError`
2023-06-22 22:29:13 +02:00
Sebastian Pipping
29e42211d7 Merge pull request #722 from kevin1024/run-online-tests-only-once
main.yml: Run online tests only once (to save runtime)
2023-06-22 15:46:04 +02:00
Sebastian Pipping
6e511b67fd Merge pull request #723 from kevin1024/issue-719-compression-urllib3-v2
Make decompression robust towards already decompressed input (arguably fixes #719)
2023-06-22 15:45:10 +02:00
Sebastian Pipping
9b6cb1ce23 Merge pull request #721 from kevin1024/issue-714-response-raw-stream-urllib3-v2
Make response.raw.stream() work for urllib3 v2 (fixes #714)
2023-06-22 15:44:29 +02:00
Sebastian Pipping
6a12bd1511 test_requests.py: Cover response.raw.stream() 2023-06-21 14:52:13 +02:00
Sebastian Pipping
3411bedc06 Make response.raw.stream() work for urllib3 v2 2023-06-21 14:52:13 +02:00
Sebastian Pipping
438a65426b filters.py: Make decompression robust towards decompressed input 2023-06-21 02:28:36 +02:00
Sebastian Pipping
8c6b1fdf38 test_requests.py: Extend coverage of gzip response
.. with regard to:
- not crashing with decode_compressed_response==True
- expected cassette content for body string
- expected response content, i.e. proper decompression
2023-06-21 02:28:36 +02:00
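The "robust towards already decompressed input" change can be sketched as a decompression helper that falls back to the raw bytes when the payload is not valid gzip data (a sketch of the idea only, not the actual `filters.py` implementation):

```python
import gzip

def decompress_body(body: bytes) -> bytes:
    # If the payload is not actually gzip data (for example because it
    # was already decompressed earlier in the pipeline), return it
    # unchanged instead of raising BadGzipFile.
    try:
        return gzip.decompress(body)
    except (OSError, EOFError):
        return body

assert decompress_body(gzip.compress(b"hello")) == b"hello"  # normal case
assert decompress_body(b"hello") == b"hello"                 # already decompressed
```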
Sebastian Pipping
15e9f1868c main.yml: Run online tests only once
Online tests are tests that need access to the Internet
to pass (and hence carry the @pytest.mark.online decoration).
2023-06-21 00:38:58 +02:00
Sebastian Pipping
7eb235cd9c Merge pull request #720 from kevin1024/use-python3-command
Use python3 (and pip3) command
2023-06-19 15:07:23 +02:00
Sebastian Pipping
d2f2731481 Replace command "pip" with "pip3" 2023-06-18 23:08:17 +02:00
Sebastian Pipping
b2a895cb89 Replace command "python" by "python3" 2023-06-18 23:07:04 +02:00
Sebastian Pipping
ffb2f44236 Merge pull request #718 from kevin1024/enforce-online-marker-completeness
Make CI enforce that all online tests are marked with @pytest.mark.online
2023-06-18 23:03:48 +02:00
Sebastian Pipping
d66392a3fb main.yml: Enforce that use of @pytest.mark.online remains complete 2023-06-18 21:33:30 +02:00
Sebastian Pipping
b9cab239a7 runtests.sh: Fix variable quoting + add exec 2023-06-18 17:40:49 +02:00
Sebastian Pipping
276a41d9b6 Merge pull request #674 from jspricke/pytest.mark.online
Mark tests with @pytest.mark.online that need access to the Internet
2023-06-18 17:34:16 +02:00
Jochen Sprickerhof
7007e944ae pytest.mark.online tests that need internet 2023-06-18 16:52:51 +02:00
Sebastian Pipping
bd112a2385 docs/usage.rst: Fix assertions
Symptom was:
> Traceback (most recent call last):
>   File "/tmp/tmp.kJAKlLngAX/foo.py", line 6, in <module>
>     assert 'Example domains' in response
> TypeError: a bytes-like object is required, not 'str'
2023-06-18 11:14:44 -03:00
Sebastian Pipping
42848285a0 docs/usage.rst: Fix urllib import
Symptom was:
> Traceback (most recent call last):
>   File "/tmp/tmp.kJAKlLngAX/foo.py", line 5, in <module>
>     response = urllib.request.urlopen('http://www.iana.org/domains/reserved').read()
> AttributeError: module 'urllib' has no attribute 'request'
2023-06-18 11:14:44 -03:00
Sebastian Pipping
e3aae34ef7 Merge pull request #713 from mghantous/mg/read1
VCRHTTPResponse Not Working with Biopython 1.81
2023-06-12 13:08:00 +02:00
Sebastian Pipping
f4316d2dae Merge pull request #712 from kevin1024/integrate-vcrpy-unittest
Integrate vcrpy-unittest (alternative to #709)
2023-06-08 18:09:22 +02:00
Sebastian Pipping
d613a814d3 vcr/unittest: Simplify file layout
.. and make "from vcr.unittest import VCRTestCase" work again.
2023-06-08 16:28:34 +02:00
Sebastian Pipping
ce234e503f docs/usage.rst: Drop needless self-reference 2023-06-08 16:28:34 +02:00
Sebastian Pipping
3527d25ce8 vcr/unittest: Simplify super(C, self) in class C to super()
.. for Python 3
2023-06-08 16:28:34 +02:00
Sebastian Pipping
dedb7ec403 Resolve needless inheritance from object (Python 3) 2023-06-08 16:28:34 +02:00
Sebastian Pipping
59263d6025 vcr/unittest: Resolve needless inheritance from object 2023-06-08 16:28:34 +02:00
Sebastian Pipping
2842cabec6 vcr/unittest: Remove unused logger 2023-06-08 16:28:34 +02:00
Sebastian Pipping
ad650a7ee1 vcr/unittest: Apply black formatting 2023-06-08 16:28:34 +02:00
Sebastian Pipping
9232915885 docs/usage.rst: Break up a long line 2023-06-08 16:28:34 +02:00
Sebastian Pipping
cbb540029f docs/usage.rst: Adapt documentation to new code location 2023-06-08 16:28:34 +02:00
Sebastian Pipping
bf30d9a5e5 vcr/unittest: Fix test test_get_vcr_with_matcher
Matcher needs attribute __name__ for function vcr.matchers.get_matchers_results .
2023-06-08 16:28:34 +02:00
Sebastian Pipping
f06f71ece4 vcr/unittest: Stop disguising MagicMock as Mock 2023-06-08 16:28:34 +02:00
Sebastian Pipping
1070d417b3 vcr/unittest: Apply 2to3 2023-06-08 16:28:34 +02:00
Sebastian Pipping
46726a9a61 vcr/unittest: Fix import of VCRTestCase in tests 2023-06-08 16:28:34 +02:00
Sebastian Pipping
87db8e69ff vcr/unittest: Use unitest.mock rather than mock of PyPI 2023-06-08 16:28:34 +02:00
Sebastian Pipping
52701ebca4 vcr/unittest: Make import to vcrpy relative 2023-06-08 16:28:34 +02:00
Sebastian Pipping
69679dc3fc vcr/unittest: Drop forward imports
.. to resolve import ambiguity.
2023-06-08 16:28:34 +02:00
Sebastian Pipping
c13f33b1e0 Add unmodified vcrpy-unittest code
Source commit is a2fd7625fde1ea15c8982759b07007aef40424b3.
License is MIT just like vcrpy.
2023-06-08 16:28:34 +02:00
Matt Ghantous
5476dd010c Casting to BufferedReader no longer needed in test 2023-06-05 23:47:47 -04:00
Matt Ghantous
0add77d5ae Add read1 method to VCRHTTPResponse 2023-06-05 23:20:52 -04:00
Yaroslav Halchenko
96a6e91def Codespell: action + config (#704) 2023-06-05 16:56:55 +02:00
Abram Clark
3b41f0ede3 Fix for #174 to prevent filters from corrupting request 2023-05-27 09:40:53 -03:00
Kevin McCarthy
0e06836908 bump version to v4.3.1 2023-05-26 11:02:14 -05:00
Sebastian Pipping
69db5c936f Limit support for urllib3 >=2 to Python >=3.10 for now
.. because it turned out broken for Python <3.10.
2023-05-26 10:55:44 -05:00
Sebastian Pipping
7c402ae4b0 test_vcr.py: Clarify that test_vcr_before_record_request_params is an offline test 2023-05-26 10:55:44 -05:00
Sebastian Pipping
b5c0938d2e tox.ini: Cover both urllib3 v1 and v2 2023-05-26 10:55:44 -05:00
Sebastian Pipping
3ad93fff42 tox.ini: Drop needless "boto3: urllib3"
boto3 depends on botocore which in turn depends on urllib3.
2023-05-26 10:55:44 -05:00
Sebastian Pipping
89f2005250 Fix VCRHTTPResponse for requests.cookies.extract_cookies_to_jar 2023-05-26 10:55:44 -05:00
Sebastian Pipping
88c0039089 Make test "test_cookies" meaner and more helpful 2023-05-26 10:55:44 -05:00
Sebastian Pipping
1b3a1235f2 Make VCRHTTPResponse interface satisfy urllib3.response.HTTPResponse 2023-05-26 10:55:44 -05:00
Sebastian Pipping
fd1aaab3bf Respect urllib3.response.HTTPResponse.data 2023-05-26 10:55:44 -05:00
Sebastian Pipping
00da5ac5af Make test_headers robust with regard to order of headers 2023-05-26 10:55:44 -05:00
Sebastian Pipping
ac20cd1dd3 Tolerate urllib3.response.HTTPResponse.msg being None 2023-05-26 10:55:44 -05:00
Sonny V
64d6811eda build(tox.ini): revert pinning urllib3 to <2
In #690 a quick fix was introduced to get a green CI; this change should no longer be required.
2023-05-26 10:55:44 -05:00
Sonny V
51c99bb9df fix: use urllib3.connection where needed.
Since urllib3 v2 the re-export of connection.HTTPConnection in
urllib3.connectionpool was removed.

In this commit we use urllib3.connection where needed. Some references
to connectionpool.HTTPConnection are still there for backward
compatibility.

Closes #688
2023-05-26 10:55:44 -05:00
Sebastian Pipping
43484e7cff test_aiohttp.py: Make cookie tests use pytest-httpbin (#706)
.. to make them faster and more robust.
2023-05-26 01:16:20 +02:00
Sebastian Pipping
199f9f07f8 Merge pull request #705 from kevin1024/fix-test-dependencies
Fix test dependencies
2023-05-25 20:29:37 +02:00
Sebastian Pipping
13af8cae43 setup.py: Add missing test dependencies 2023-05-25 16:40:42 +02:00
Sebastian Pipping
436b62f587 setup.py: Drop unused test dependency "mock"
All imports use unittest.mock rather than mock of PyPI.
2023-05-25 16:33:14 +02:00
Sebastian Pipping
5b40a67b3b setup.py: Extract variable tests_require
.. and apply sorting, but nothing more
2023-05-25 16:32:08 +02:00
Sebastian Pipping
c41bd2bb40 Stop installing libgnutls28-dev 2023-05-24 16:41:46 -03:00
Kevin McCarthy
62cb151918 Release v4.3.0 2023-05-24 13:48:31 -05:00
Sebastian Pipping
1a3bc67c7c Merge pull request #701 from kevin1024/run-actions-on-push-to-topic-branches
Allow triggering CI manually
2023-05-17 18:01:28 +02:00
Sebastian Pipping
aeff51bd79 main.yml: Allow triggering CI manually 2023-05-17 16:45:13 +02:00
Sebastian Pipping
e9f0ede9dd main.yml: Drop superfluous specification of branches 2023-05-17 16:44:54 +02:00
Sebastian Pipping
0235eab766 Merge pull request #698 from kevin1024/reduce-legacy
Drop support for botocore <1.11.0 and requests <2.16.2 (fixes #693)
2023-05-15 15:55:15 +02:00
Sebastian Pipping
31c8dc0a1e Drop support for requests <2.16.2 2023-05-15 14:06:26 +02:00
Sebastian Pipping
24af48d468 Drop support for botocore <1.11.0 2023-05-15 14:06:26 +02:00
Kian-Meng Ang
44359bfe43 Fix typo, succeedes -> succeeds (#672)
Found via `codespell`.
2023-05-13 17:03:27 +02:00
Jair Henrique
14cef83c15 Move some tests to use mockbin instead of httpbin 2023-05-11 17:03:54 -03:00
Jair Henrique
77da67ef0a Remove duplicated fixture 2023-05-11 10:05:50 -03:00
dependabot[bot]
58329f812b build(deps): bump actions/checkout from 3.1.0 to 3.5.2
Bumps [actions/checkout](https://github.com/actions/checkout) from 3.1.0 to 3.5.2.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3.1.0...v3.5.2)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-05-08 09:15:50 -03:00
Sebastian Pipping
06913ce21a tox.ini: Exclude ./venv/ from flake8 2023-05-05 10:14:46 -03:00
Sebastian Pipping
4994c53590 Fix formatting with regard to black 23.3.0 2023-05-05 10:14:46 -03:00
Sebastian Pipping
1d90853f3b tox.ini: Quick-fix the CI for recent tox and recent urllib3 2023-05-05 10:14:46 -03:00
Daniel Silva
36c7465cf7 docs: add drop_unused_requests option 2023-01-04 21:59:58 +00:00
Daniel Silva
010fa268d1 test: add tests to drop_unused_requests option 2023-01-04 20:09:31 +00:00
Daniel Silva
99c0384770 feat: add an option to exclude unused interactions
Introduce the `drop_unused_requests` option (False by default). If True, the `Cassette` saving operation keeps only the old interactions that were played, plus any new ones. As a result, unused old requests are dropped.

Add `_old_interactions`, `_played_interactions` and `_new_interactions()`. The `_old_interactions` are previously recorded interactions loaded from cassette files. The `_played_interactions` is the set of old interactions that were marked as played. A new interaction is a tuple (request, response) in `self.data` that is not in the `_old_interactions` list.
2023-01-02 03:52:52 +00:00
Terseus
42d79b1102 Restore the pytest-httpbin package in tox.ini
The branch with the fix for HTTPS redirects is included in v1.0.1

See https://github.com/kevin1024/pytest-httpbin/releases/tag/v1.0.1
2022-11-01 09:08:59 -03:00
Jair Henrique
cef85a4986 remove pytest deprecation warning for yield_fixture 2022-10-31 22:44:26 -03:00
Terseus
964615af25 Include how to use record_on_exception in the docs 2022-10-31 22:43:03 -03:00
Terseus
3b6d79fc0b Prevent Sphinx warning about language = None
Since version 5.x Sphinx triggers a warning when `language = None`, this
caused `tox -e docs` to fail.
Set it to "en".

See https://github.com/sphinx-doc/sphinx/pull/10481
2022-10-31 22:43:03 -03:00
Terseus
f48922ce09 Fix not calling all the exit stack when record_on_exception is False
The initial technique to implement `record_on_exception=False` was
not to empty the generator returned by
`CassetteContextDecorator._patch_generator` when an exception happens,
thereby skipping the `cassette._save` call. However, this had the side
effect of not emptying the `ExitStack` created inside the generator,
which contains the `_patch.__exit__` calls that remove the patches.

This was innocuous in CPython, which uses a reference counting garbage
collector so the `ExitStack` was immediately collected after losing
scope and therefore its `__exit__` method executed.
PyPy, on the other hand, uses a generational garbage collector, so its
objects may survive longer, long enough that the `ExitStack` is not
exited until much later, which may cause the patches to live longer
than expected when `record_on_exception=False`.

This was found because the test
`test_nesting_context_managers_by_checking_references_of_http_connection`
was failing because it was executed after
`test_dont_record_on_exception`.

Now the cassette instance is saved inside the `CassetteContextDecorator`
instance to have better control over where to save the cassette, and the
`cassette._save` call was moved from the `_patch_generator` method to the
`__exit__` method so that the generator can always be emptied and the
patches removed.
2022-10-31 22:43:03 -03:00
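The generator/`ExitStack` interaction described above is easy to reproduce in isolation; this sketch (not vcrpy's actual code) shows why an explicit close is the only reliable way to run the stacked `__exit__` calls:

```python
import contextlib

events = []

@contextlib.contextmanager
def patch():
    # Stands in for one of the monkey patches managed by the ExitStack.
    events.append("applied")
    try:
        yield
    finally:
        events.append("removed")

def patch_generator():
    # Mirrors the pattern described above: an ExitStack created inside a
    # generator only unwinds when the generator finishes or is explicitly
    # closed -- merely losing scope is not enough on a non-refcounting GC
    # such as PyPy's, so the patches could outlive the cassette context.
    with contextlib.ExitStack() as stack:
        stack.enter_context(patch())
        yield

gen = patch_generator()
next(gen)                    # patches applied
assert events == ["applied"]
gen.close()                  # explicit close guarantees __exit__ runs now
assert events == ["applied", "removed"]
```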
Terseus
2980bfccde Fix lint errors 2022-10-31 22:43:03 -03:00
Dan Passaro
7599f4d50a Fix Py3 tests using b'' literal 2022-10-31 22:43:03 -03:00
Dan Passaro
995020bf06 Add record_on_exception flag.
Defaults to True, which maintains historical behavior.

Fixes #205.
2022-10-31 22:43:03 -03:00
Jair Henrique
423ccaa40b Set fail-fast to false on CI 2022-10-31 22:18:30 -03:00
Josef
526fdbb194 Add Path handling to use_cassette and to filesystem.py persister
* now it is possible to use a path from pathlib
2022-10-31 22:15:14 -03:00
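Accepting `pathlib.Path` alongside plain strings usually comes down to normalizing the argument; a sketch using a hypothetical helper name (not vcrpy's actual persister code):

```python
import os
from pathlib import Path

def normalize_cassette_path(path) -> str:
    # os.fspath accepts both str and pathlib.Path (anything implementing
    # __fspath__) and returns a plain string path for the persister.
    return os.fspath(path)

assert normalize_cassette_path("fixtures/synopsis.yaml") == "fixtures/synopsis.yaml"
assert normalize_cassette_path(Path("fixtures") / "synopsis.yaml") == os.path.join("fixtures", "synopsis.yaml")
```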
Evgeni Golov
511d0ab855 add python 3.11 support 2022-10-31 09:00:35 -03:00
Jair Henrique
60ac99c907 Run lint on CI 2022-10-14 10:53:54 -03:00
dependabot[bot]
57dee93e11 build(deps): bump actions/checkout from 3.0.2 to 3.1.0
Bumps [actions/checkout](https://github.com/actions/checkout) from 3.0.2 to 3.1.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3.0.2...v3.1.0)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-10-11 10:34:41 -03:00
Jair Henrique
0eece7f96e Add isort to code lint 2022-10-09 11:35:37 -03:00
Chris Wesseling
eb59d871b4 Handles empty responses with gzip/deflate encoding.
Closes #661
2022-10-09 11:04:44 -03:00
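The fix's idea can be sketched as a guard before decompression, since a zero-length payload is not valid gzip or deflate data (helper name assumed, not the actual vcrpy code):

```python
import gzip
import zlib

def decode_compressed(body: bytes, encoding: str) -> bytes:
    # A zero-length payload is not valid gzip/deflate data and would
    # raise on decompression, so empty bodies pass through untouched.
    if not body:
        return body
    if encoding == "gzip":
        return gzip.decompress(body)
    if encoding == "deflate":
        return zlib.decompress(body)
    return body

assert decode_compressed(b"", "gzip") == b""
assert decode_compressed(zlib.compress(b"data"), "deflate") == b"data"
```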
Haapalainen, Jonne
a79356cf5f Replace assert with raise AssertionError
Bare `assert` statements are stripped when Python runs in optimized
mode (`-O`), so they should not be used for handling runtime errors.

see for example:
https://medium.com/@jadhavmanoj/python-what-is-raise-and-assert-statement-c3908697bc62
https://github.com/emre/notes/blob/master/python/when-to-use-assert.md
2022-05-05 11:21:27 +03:00
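The underlying reason is that bare `assert` statements are compiled away when Python runs with `-O`; raising `AssertionError` explicitly survives optimized mode. A small illustration (the function name is hypothetical):

```python
def check_playable(cassette_played: bool) -> None:
    # A bare `assert cassette_played` would disappear under `python -O`
    # (optimized mode), so runtime validation raises explicitly instead.
    if not cassette_played:
        raise AssertionError("cassette was not fully played")

check_playable(True)  # fine
try:
    check_playable(False)
except AssertionError as exc:
    print(exc)  # cassette was not fully played
```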
97 changed files with 2799 additions and 1676 deletions

.editorconfig Normal file

@@ -0,0 +1,14 @@
root = true

[*]
indent_style = space
indent_size = 4
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

[Makefile]
indent_style = tab

[*.{yml,yaml}]
indent_size = 2

.github/workflows/codespell.yml vendored Normal file

@@ -0,0 +1,22 @@
---
name: Codespell

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]

permissions:
  contents: read

jobs:
  codespell:
    name: Check for spelling errors
    runs-on: ubuntu-24.04
    steps:
      - name: Checkout
        uses: actions/checkout@v5
      - name: Codespell
        uses: codespell-project/actions-codespell@v2

.github/workflows/docs.yml vendored Normal file

@@ -0,0 +1,23 @@
name: Validate docs

on:
  push:
    paths:
      - 'docs/**'

jobs:
  validate:
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v5
      - uses: actions/setup-python@v6
        with:
          python-version: "3.12"
      - name: Install build dependencies
        run: pip install -r docs/requirements.txt
      - name: Rendering HTML documentation
        run: sphinx-build -b html docs/ html
      - name: Inspect html rendered
        run: cat html/index.html


@@ -5,36 +5,57 @@ on:
     branches:
       - master
   pull_request:
-    branches:
-      - "*"
+  schedule:
+    - cron: "0 16 * * 5"  # Every Friday 4pm
+  workflow_dispatch:
 jobs:
   build:
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-24.04
     strategy:
+      fail-fast: false
       matrix:
-        python-version: ["3.7", "3.8", "3.9", "3.10", "pypy-3.8"]
+        python-version:
+          - "3.10"
+          - "3.11"
+          - "3.12"
+          - "3.13"
+          - "pypy-3.11"
     steps:
-      - name: Install libgnutls28-dev
-        run: |
-          sudo apt update -q
-          sudo apt install -q -y libgnutls28-dev libcurl4-gnutls-dev
-      - uses: actions/checkout@v3.0.2
+      - uses: actions/checkout@v5
+      - name: Install uv
+        uses: astral-sh/setup-uv@v7
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v6
         with:
           python-version: ${{ matrix.python-version }}
+          allow-prereleases: true
       - name: Install project dependencies
         run: |
-          pip install --upgrade pip
-          pip install codecov tox tox-gh-actions
+          uv pip install --system --upgrade pip setuptools
+          uv pip install --system codecov '.[tests]'
+          uv pip check
-      - name: Run tests with tox
-        run: tox
+      - name: Allow creation of user namespaces (e.g. to the unshare command)
+        run: |
+          # .. so that we don't get error:
+          #     unshare: write failed /proc/self/uid_map: Operation not permitted
+          # Idea from https://github.com/YoYoGames/GameMaker-Bugs/issues/6015#issuecomment-2135552784 .
+          sudo sysctl kernel.apparmor_restrict_unprivileged_userns=0
+      - name: Run online tests
+        run: ./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append -m online
+      - name: Run offline tests with no access to the Internet
+        run: |
+          # We're using unshare to take Internet access
+          # away so that we'll notice whenever some new test
+          # is missing @pytest.mark.online decoration in the future
+          unshare --map-root-user --net -- \
+            sh -c 'ip link set lo up; ./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append -m "not online"'
       - name: Run coverage
         run: codecov


@@ -0,0 +1,62 @@
# Copyright (c) 2023 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT license
name: Detect outdated pre-commit hooks

on:
  schedule:
    - cron: '0 16 * * 5'  # Every Friday 4pm

# NOTE: This will drop all permissions from GITHUB_TOKEN except metadata read,
# and then (re)add the ones listed below:
permissions:
  contents: write
  pull-requests: write

jobs:
  pre_commit_detect_outdated:
    name: Detect outdated pre-commit hooks
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v5
      - name: Set up Python 3.12
        uses: actions/setup-python@v6
        with:
          python-version: 3.12
      - name: Install pre-commit
        run: |-
          pip install \
            --disable-pip-version-check \
            --no-warn-script-location \
            --user \
            pre-commit
          echo "PATH=${HOME}/.local/bin:${PATH}" >> "${GITHUB_ENV}"
      - name: Check for outdated hooks
        run: |-
          pre-commit autoupdate
          git diff -- .pre-commit-config.yaml
      - name: Create pull request from changes (if any)
        id: create-pull-request
        uses: peter-evans/create-pull-request@v7
        with:
          author: 'pre-commit <pre-commit@tools.invalid>'
          base: master
          body: |-
            For your consideration.

            :warning: Please **CLOSE AND RE-OPEN** this pull request so that [further workflow runs get triggered](https://github.com/peter-evans/create-pull-request/blob/main/docs/concepts-guidelines.md#triggering-further-workflow-runs) for this pull request.
          branch: precommit-autoupdate
          commit-message: "pre-commit: Autoupdate"
          delete-branch: true
          draft: true
          labels: enhancement
          title: "pre-commit: Autoupdate"
      - name: Log pull request URL
        if: "${{ steps.create-pull-request.outputs.pull-request-url }}"
        run: |
          echo "Pull request URL is: ${{ steps.create-pull-request.outputs.pull-request-url }}"

.github/workflows/pre-commit.yml vendored Normal file

@@ -0,0 +1,20 @@
# Copyright (c) 2023 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT license
name: Run pre-commit

on:
  - pull_request
  - push
  - workflow_dispatch

jobs:
  pre-commit:
    name: Run pre-commit
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v5
      - uses: actions/setup-python@v6
        with:
          python-version: 3.12
      - uses: pre-commit/action@v3.0.1

.pre-commit-config.yaml Normal file

@@ -0,0 +1,17 @@
# Copyright (c) 2023 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT license
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.14.5
    hooks:
      - id: ruff
        args: ["--output-format=full"]
      - id: ruff-format
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: check-merge-conflict
      - id: end-of-file-fixer
      - id: trailing-whitespace

.readthedocs.yaml Normal file

@@ -0,0 +1,24 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-24.04
  tools:
    python: "3.12"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/conf.py

# We recommend specifying your dependencies to enable reproducible builds:
# https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: docs/requirements.txt
    - method: pip
      path: .


@@ -1,6 +1,5 @@
 include README.rst
 include LICENSE.txt
-include tox.ini
 recursive-include tests *
 recursive-exclude * __pycache__
 recursive-exclude * *.py[co]


@@ -4,7 +4,7 @@ VCR.py 📼
 ###########
-|PyPI| |Python versions| |Build Status| |CodeCov| |Gitter| |CodeStyleBlack|
+|PyPI| |Python versions| |Build Status| |CodeCov| |Gitter|
 ----
@@ -70,6 +70,3 @@ more details
 .. |CodeCov| image:: https://codecov.io/gh/kevin1024/vcrpy/branch/master/graph/badge.svg
    :target: https://codecov.io/gh/kevin1024/vcrpy
    :alt: Code Coverage Status
-.. |CodeStyleBlack| image:: https://img.shields.io/badge/code%20style-black-000000.svg
-   :target: https://github.com/psf/black
-   :alt: Code Style: black


@@ -24,4 +24,4 @@
 <stop offset="1" stop-color="#27DDA6"/>
 </linearGradient>
 </defs>
-</svg>
+</svg>



@@ -16,7 +16,7 @@ a nice addition. Here's an example:
     with vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml') as cass:
         response = urllib2.urlopen('http://www.zombo.com/').read()
         # cass should have 1 request inside it
         assert len(cass) == 1
         # the request uri should have been http://www.zombo.com/
         assert cass.requests[0].uri == 'http://www.zombo.com/'
@@ -71,7 +71,7 @@ Finally, register your class with VCR to use your new serializer.
     import vcr

-    class BogoSerializer(object):
+    class BogoSerializer:
         """
         Must implement serialize() and deserialize() methods
         """
@@ -136,7 +136,8 @@ Create your own persistence class, see the example below:
 Your custom persister must implement both ``load_cassette`` and ``save_cassette``
 methods. The ``load_cassette`` method must return a deserialized cassette or raise
-``ValueError`` if no cassette is found.
+either ``CassetteNotFoundError`` if no cassette is found, or ``CassetteDecodeError``
+if the cassette cannot be successfully deserialized.

 Once the persister class is defined, register with VCR like so...
@@ -188,7 +189,7 @@ of post data parameters to filter.
 .. code:: python

-    with my_vcr.use_cassette('test.yml', filter_post_data_parameters=['client_secret']):
+    with my_vcr.use_cassette('test.yml', filter_post_data_parameters=['api_key']):
         requests.post('http://api.com/postdata', data={'api_key': 'secretstring'})

 Advanced use of filter_headers, filter_query_parameters and filter_post_data_parameters
@@ -207,7 +208,7 @@ So these two calls are the same:
     # original (still works)
     vcr = VCR(filter_headers=['authorization'])

     # new
     vcr = VCR(filter_headers=[('authorization', None)])
@@ -217,7 +218,7 @@ Here are two examples of the new functionality:
     # replace with a static value (most common)
     vcr = VCR(filter_headers=[('authorization', 'XXXXXX')])

     # replace with a callable, for example when testing
     # lots of different kinds of authorization.
     def replace_auth(key, value, request):
@@ -285,7 +286,7 @@ sensitive data from the response body:
         before_record_response=scrub_string(settings.USERNAME, 'username'),
     )
     with my_vcr.use_cassette('test.yml'):
         # your http code here

 Decode compressed response
@@ -404,3 +405,38 @@ the Cassette ``allow_playback_repeats`` option.
     for x in range(10):
         response = urllib2.urlopen('http://www.zombo.com/').read()
     assert cass.all_played
+
+Discards Cassette on Errors
+---------------------------
+
+By default VCR will save the cassette file even when an error occurs inside
+the enclosing context/test.
+If you want to save the cassette only when the test succeeds, set the Cassette
+``record_on_exception`` option to ``False``.
+
+.. code:: python
+
+    try:
+        my_vcr = VCR(record_on_exception=False)
+        with my_vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml') as cass:
+            response = urllib2.urlopen('http://www.zombo.com/').read()
+            raise RuntimeError("Oops, something happened")
+    except RuntimeError:
+        pass
+
+    # Since there was an exception, the cassette file hasn't been created.
+    assert not os.path.exists('fixtures/vcr_cassettes/synopsis.yaml')
+
+Drop unused requests
+--------------------
+
+Even if an HTTP request is changed or removed from tests, previously recorded
+interactions remain in the cassette file. If the ``drop_unused_requests``
+option is set to ``True``, VCR will not keep old HTTP interactions that are no
+longer used.
+
+.. code:: python
+
+    my_vcr = VCR(drop_unused_requests=True)
+    with my_vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml'):
+        ... # your HTTP interactions here


@@ -7,6 +7,70 @@ For a full list of triaged issues, bugs and PRs and what release they are target
All help in providing PRs to close out bug issues is appreciated. Even if that is providing a repo that fully replicates issues. We have very generous contributors that have added these to bug issues which meant another contributor picked up the bug and closed it out.
- Unreleased
- Drop support for Python 3.9
- Drop support for urllib3 < 2
- 7.0.0
- Drop support for python 3.8 (major version bump) - thanks @jairhenrique
- Various linting and test fixes - thanks @jairhenrique
- Bugfix for urllib2>=2.3.0 - missing version_string (#888)
- Bugfix for asyncio.run - thanks @alekeik1
- 6.0.2
- Ensure body is consumed only once (#846) - thanks @sathieu
- Permit urllib3 2.x for non-PyPy Python >=3.10
- Fix typos in test commands - thanks @chuckwondo
- Several test and workflow improvements - thanks @hartwork and @graingert
- 6.0.1
- Bugfix with to Tornado cassette generator (thanks @graingert)
- 6.0.0
- BREAKING: Fix issue with httpx support (thanks @parkerhancock) in #784. NOTE: You may have to recreate some of your cassettes produced in previous releases due to the binary format being saved incorrectly in previous releases
- BREAKING: Drop support for `boto` (vcrpy still supports boto3, but is dropping the deprecated `boto` support in this release) (thanks @jairhenrique)
- Fix compatibility issue with Python 3.12 (thanks @hartwork)
- Drop simplejson (fixes some compatibility issues) (thanks @jairhenrique)
- Run CI on Python 3.12 and PyPy 3.9-3.10 (thanks @mgorny)
- Various linting and docs improvements (thanks @jairhenrique)
- Tornado fixes (thanks @graingert)
- 5.1.0
- Use ruff for linting (instead of current flake8/isort/pyflakes) - thanks @jairhenrique
- Enable rule B (flake8-bugbear) on ruff - thanks @jairhenrique
- Configure read the docs V2 - thanks @jairhenrique
- Fix typo in docs - thanks @quasimik
- Make json.loads of Python >=3.6 decode bytes by itself - thanks @hartwork
- Fix body matcher for chunked requests (fixes #734) - thanks @hartwork
- Fix query param filter for aiohttp (fixes #517) - thanks @hartwork and @salomvary
- Remove unnecessary dependency on six. - thanks @charettes
- build(deps): update sphinx requirement from <7 to <8 - thanks @jairhenrique
- Add action to validate docs - thanks @jairhenrique
- Add editorconfig file - thanks @jairhenrique
- Drop iscoroutinefunction fallback function for unsupported python thanks @jairhenrique
- 5.0.0
- BREAKING CHANGE: Drop support for Python 3.7. 3.7 is EOL as of 6/27/23. Thanks @jairhenrique
- BREAKING CHANGE: Custom Cassette persisters no longer catch ValueError. If you have implemented a custom persister (has anyone implemented a custom persister? Let us know!) then you will need to throw a CassetteNotFoundError when unable to find a cassette. See #681 for discussion and reason for this change. Thanks @amosjyng for the PR and the review from @hartwork
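The custom-persister contract described in the 5.0.0 entry can be sketched in isolation. Below is a minimal in-memory persister, assuming the ``load_cassette``/``save_cassette`` interface from VCR.py's docs; the exception class and serializer here are local stand-ins for illustration, not VCR.py imports:

```python
import json


class CassetteNotFoundError(FileNotFoundError):
    """Local stand-in for the error a custom persister must raise
    (since 5.0.0) when a cassette is missing, instead of ValueError."""


class JsonSerializer:
    """Toy serializer; real serializers are passed in by the library."""

    @staticmethod
    def serialize(cassette_dict):
        return json.dumps(cassette_dict)

    @staticmethod
    def deserialize(data):
        return json.loads(data)


class InMemoryPersister:
    """Toy persister keeping cassettes in a dict; the two method names
    mirror the persister interface described in VCR.py's docs."""

    _storage = {}

    @classmethod
    def load_cassette(cls, cassette_path, serializer):
        try:
            data = cls._storage[cassette_path]
        except KeyError:
            # Signal "no cassette" the way 5.0.0+ expects:
            raise CassetteNotFoundError(cassette_path) from None
        return serializer.deserialize(data)

    @classmethod
    def save_cassette(cls, cassette_path, cassette_dict, serializer):
        cls._storage[cassette_path] = serializer.serialize(cassette_dict)


InMemoryPersister.save_cassette("demo.json", {"interactions": []}, JsonSerializer)
assert InMemoryPersister.load_cassette("demo.json", JsonSerializer) == {"interactions": []}
```

A persister like this would be registered with ``my_vcr.register_persister(...)``; see #681 for the discussion behind the change.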
- 4.4.0
- HUGE thanks to @hartwork for all the work done on this release!
- Bring vcr/unittest in to vcrpy as a full feature of vcr instead of a separate library. Big thanks to @hartwork for doing this and to @agriffis for originally creating the library
- Make decompression robust towards already decompressed input (thanks @hartwork)
- Bugfix: Add read1 method (fixes compatibility with biopython), thanks @mghantous
- Bugfix: Prevent filters from corrupting request (thanks @abramclark)
- Bugfix: Add support for `response.raw.stream()` to fix urllib v2 compat
- Bugfix: Replace `assert` with `raise AssertionError`: fixes support for `PYTHONOPTIMIZE=1`
- Add pytest.mark.online to run test suite offline, thanks @jspricke
- use python3 and pip3 binaries to ease debian packaging (thanks @hartwork)
- Add codespell (thanks @mghantous)
- 4.3.1
- Support urllib3 v1 and v2. NOTE: there is an issue running urllib3 v2 on
Python older than 3.10, so this is currently blocked in the requirements.
Hopefully we can resolve this situation in the future. Thanks to @shifqu,
hartwork, jairhenrique, pquentin, and vEpiphyte for your work on this.
- 4.3.0
- Add support for Python 3.11 (Thanks @evgeni)
- Drop support for botocore <1.11.0 and requests <2.16.2 (thanks @hartwork)
- Bugfix: decode_compressed_response raises exception on empty responses. Thanks @CharString
- Don't save requests from decorated tests if decorated test fails (thanks @dan-passaro)
- Fix not calling all the exit stack when record_on_exception is False (thanks @Terseus)
- Various CI, documentation, testing, and formatting improvements (Thanks @jairhenrique, @dan-passaro, @hartwork, and Terseus)
- 4.2.1
- Fix a bug where the first request in a redirect chain was not being recorded with aiohttp
- Various typos and small fixes, thanks @jairhenrique, @timgates42
@@ -31,8 +95,8 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Bugfix: Fix test suite by switching to mockbin (thanks @jairhenrique)
- 4.0.2
- Fix mock imports as reported in #504 by @llybin. Thank you.
- 4.0.1
- Fix logo alignment for PyPI
- 4.0.0
- Remove Python2 support (@hugovk)
- Add Python 3.8 TravisCI support (@neozenith)
@@ -44,7 +108,7 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Add support for `request_info` on mocked responses in aiohttp stub #495 (@nickdirienzo)
- doc: fixed variable name (a -> cass) in an example for rewind #492 (@yarikoptic)
- 2.1.1
- Format code with black (@neozenith)
- Use latest pypy3 in Travis (@hugovk)
- Improve documentation about custom matchers (@gward)
@@ -52,7 +116,7 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Add `pytest-recording` to the documentation as an alternative Pytest plugin (@Stranger6667)
- Fix yarl and python3.5 version issue (@neozenith)
- Fix header matcher for boto3 - fixes #474 (@simahawk)
- 2.1.0
- Add a `rewind` method to reset a cassette (thanks @khamidou)
- New error message with more details on why the cassette failed to play a request (thanks @arthurHamon2, @neozenith)
- Handle connect tunnel URI (thanks @jeking3)
@@ -64,9 +128,9 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Fix bugs on aiohttp integration (thanks @graingert, @steinnes, @stj, @lamenezes, @lmazuel)
- Fix Biopython incompatibility (thanks @rishab121)
- Fix Boto3 integration (thanks @1oglop1, @arthurHamon2)
- 2.0.1
- Fix bug when using vcrpy with python 3.4
- 2.0.0
- Support python 3.7 (fix httplib2 and urllib2, thanks @felixonmars)
- [#356] Fixes `before_record_response` so the original response isn't changed (thanks @kgraves)
- Fix requests stub when using proxy (thanks @samuelfekete @daneoshiga)
@@ -76,56 +140,56 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Improve docs (thanks @adamchainz)
- 1.13.0
- Fix support to latest aiohttp version (3.3.2). Fix content-type bug in aiohttp stub. Save URL with query params properly when using aiohttp.
- 1.12.0
- Fix support to latest aiohttp version (3.2.1), Adapted setup to PEP508, Support binary responses on aiohttp, Dropped support for EOL python versions (2.6 and 3.3)
- 1.11.1
- Fix compatibility with newest requests and urllib3 releases
- 1.11.0
- Allow injection of persistence methods + bugfixes (thanks @j-funk and @IvanMalison),
- Support python 3.6 + CI tests (thanks @derekbekoe and @graingert),
- Support pytest-asyncio coroutines (thanks @graingert)
- 1.10.5
- Added a fix to httplib2 (thanks @carlosds730), Fix an issue with
- aiohttp (thanks @madninja), Add missing requirement yarl (thanks @lamenezes),
- Remove duplicate mock triple (thanks @FooBarQuaxx)
- 1.10.4
- Fix an issue with asyncio aiohttp (thanks @madninja)
- 1.10.3
- Fix some issues with asyncio and params (thanks @anovikov1984 and @lamenezes)
- Fix some issues with cassette serialize / deserialize and empty response bodies (thanks @gRoussac and @dz0ny)
- 1.10.2
- Fix 1.10.1 release - add aiohttp support back in
- 1.10.1
- [bad release] Fix build for Fedora package + python2 (thanks @puiterwijk and @lamenezes)
- 1.10.0
- Add support for aiohttp (thanks @lamenezes)
- 1.9.0
- Add support for boto3 (thanks @desdm, @foorbarna).
- Fix deepcopy issue for response headers when `decode_compressed_response` is enabled (thanks @nickdirienzo)
- 1.8.0
- Fix for Serialization errors with JSON adapter (thanks @aliaksandrb).
- Avoid concatenating bytes with strings (thanks @jaysonsantos).
- Exclude __pycache__ dirs & compiled files in sdist (thanks @koobs).
- Fix Tornado support behavior for Tornado 3 (thanks @abhinav).
- decode_compressed_response option and filter (thanks @jayvdb).
- 1.7.4 [#217]
- Make use_cassette decorated functions actually return a value (thanks @bcen).
- [#199] Fix path transformation defaults.
- Better headers dictionary management.
- 1.7.3 [#188]
- ``additional_matchers`` kwarg on ``use_cassette``.
- [#191] Actually support passing multiple before_record_request functions (thanks @agriffis).
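A custom matcher like those passed via ``additional_matchers`` is just a callable taking two requests. Below is a hedged sketch; the ``path`` attribute and the stand-in request objects are assumptions for illustration, not VCR.py's request class:

```python
from types import SimpleNamespace


def path_matcher(r1, r2):
    """A custom matcher: two requests match if their paths are equal.

    VCR.py matchers receive two request objects; the exact attributes
    available depend on the request API, so treat this as a sketch
    rather than a drop-in.
    """
    if r1.path != r2.path:
        raise AssertionError(f"{r1.path} != {r2.path}")


# Stand-in request objects for illustration:
a = SimpleNamespace(path="/v1/users")
b = SimpleNamespace(path="/v1/users")
path_matcher(a, b)  # matches: no exception raised
```

In real code it would be registered with ``my_vcr.register_matcher`` or passed as ``use_cassette(..., additional_matchers=(path_matcher,))``.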
- 1.7.2
- [#186] Get effective_url in tornado (thanks @mvschaik)
- [#187] Set request_time on Response object in tornado (thanks @abhinav).
- 1.7.1
- [#183] Patch ``fetch_impl`` instead of the entire HTTPClient class for Tornado (thanks @abhinav).
- 1.7.0
- [#177] Properly support coroutine/generator decoration.
- [#178] Support distribute (thanks @graingert). [#163] Make compatibility between python2 and python3 recorded cassettes more robust (thanks @gward).
- 1.6.1
- [#169] Support conditional requirements in old versions of pip
- Fix RST parse errors generated by pandoc
- [Tornado] Fix unsupported features exception not being raised
@@ -139,17 +203,17 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Fix crash when cassette path contains cassette library directory (thanks @gazpachoking).
- 1.5.0
- Automatic cassette naming and 'application/json' post data filtering (thanks @marco-santamaria).
- 1.4.2
- Fix a bug caused by requests 2.7 and chunked transfer encoding
- 1.4.1
- Include README, tests, LICENSE in package. Thanks @ralphbean.
- 1.4.0
- Filter post data parameters (thanks @eadmundo)
- Support for posting files through requests, inject\_cassette kwarg to access cassette from ``use_cassette`` decorated function, ``with_current_defaults`` actually works (thanks @samstav).
- 1.3.0
- Fix/add support for urllib3 (thanks @aisch)
- Fix default port for https (thanks @abhinav).
- 1.2.0
- Add custom\_patches argument to VCR/Cassette objects to allow users to stub custom classes when cassettes become active.
- 1.1.4
- Add force reset around calls to actual connection from stubs, to ensure compatibility with the version of httplib/urlib2 in python 2.7.9.
@@ -160,22 +224,22 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- fix Windows connectionpool stub bug (thanks @gazpachoking)
- add support for requests 2.5
- 1.1.2
- Add urllib==1.7.1 support.
- Make json serialize error handling correct
- Improve logging of match failures.
- 1.1.1
- Use function signature preserving ``wrapt.decorator`` to write the decorator version of use\_cassette in order to ensure compatibility with py.test fixtures and python 2.
- Move all request filtering into the ``before_record_callable``.
- 1.1.0
- Add ``before_record_response``. Fix several bugs related to the context management of cassettes.
- 1.0.3
- Fix an issue with requests 2.4 and make sure case sensitivity is consistent across python versions
- 1.0.2
- Fix an issue with requests 2.3
- 1.0.1
- Fix a bug with the new ignore requests feature and the once record mode
- 1.0.0
- *BACKWARDS INCOMPATIBLE*: Please see the 'upgrade' section in the README. Take a look at the matcher section as well, you might want to update your ``match_on`` settings.
- Add support for filtering sensitive data from requests, matching query strings after the order changes and improving the built-in matchers, (thanks to @mshytikov)
- Support for ignoring requests to certain hosts, bump supported Python3 version to 3.4, fix some bugs with Boto support (thanks @marusich)
- Fix error with URL field capitalization in README (thanks @simon-weber)
@@ -183,27 +247,27 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Added ``all_played`` property on cassette (thanks @mshytikov)
- 0.7.0
- VCR.py now supports Python 3! (thanks @asundg)
- Also I refactored the stub connections quite a bit to add support for the putrequest and putheader calls.
- This version also adds support for httplib2 (thanks @nilp0inter).
- I have added a couple tests for boto since it is an http client in its own right.
- Finally, this version includes a fix for a bug where requests wasn't being patched properly (thanks @msabramo).
- 0.6.0
- Store response headers as a list since a HTTP response can have the same header twice (happens with set-cookie sometimes).
- This has the added benefit of preserving the order of headers.
- Thanks @smallcode for the bug report leading to this change.
- I have made an effort to ensure backwards compatibility with the old cassettes' header storage mechanism, but if you want to upgrade to the new header storage, you should delete your cassettes and re-record them.
- Also this release adds better error messages (thanks @msabramo)
- and adds support for using VCR as a decorator (thanks @smallcode for the motivation)
- 0.5.0
- Change the ``response_of`` method to ``responses_of`` since cassettes can now contain more than one response for a request.
- Since this changes the API, I'm bumping the version.
- Also includes 2 bugfixes:
- a better error message when attempting to overwrite a cassette file,
- and a fix for a bug with requests sessions (thanks @msabramo)
- 0.4.0
- Change default request recording behavior for multiple requests.
- If you make the same request multiple times to the same URL, the response might be different each time (maybe the response has a timestamp in it or something), so this will make the same request multiple times and save them all.
- Then, when you are replaying the cassette, the responses will be played back in the same order in which they were received.
- If you were making multiple requests to the same URL in a cassette before version 0.4.0, you might need to regenerate your cassette files.
- Also, removes support for the cassette.play\_count counter API, since individual requests aren't unique anymore.
@@ -223,7 +287,7 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- 0.3.0
- *Backwards incompatible release*
- Added support for record modes, and changed the default recording behavior to the "once" record mode. Please see the documentation on record modes for more.
- Added support for custom request matching, and changed the default request matching behavior to match only on the URL and method.
- Also, improved the httplib mocking to add support for the ``HTTPConnection.send()`` method.
- This means that requests won't actually be sent until the response is read, since I need to record the entire request in order to match up the appropriate response.
- I don't think this should cause any issues unless you are sending requests without ever loading the response (which none of the standard httplib wrappers do, as far as I know).
@@ -231,13 +295,13 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- 0.2.1
- Fixed missing modules in setup.py
- 0.2.0
- Added configuration API, which lets you configure some settings on VCR (see the README).
- Also, VCR no longer saves cassettes if they haven't changed at all and supports JSON as well as YAML (thanks @sirpengi).
- Added amazing new skeumorphic logo, thanks @hairarrow.
- 0.1.0
- *backwards incompatible release - delete your old cassette files*
- This release adds the ability to access the cassette to make assertions on it
- as well as a major code refactor thanks to @dlecocq.
- It also fixes a couple longstanding bugs with redirects and HTTPS. [#3 and #4]
- 0.0.4
- If you have libyaml installed, vcrpy will use the c bindings instead. Speed up your tests! Thanks @dlecocq
@@ -247,4 +311,3 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Add support for requests / urllib3
- 0.0.1
- Initial Release


@@ -1,4 +1,3 @@
# -*- coding: utf-8 -*-
#
# vcrpy documentation build configuration file, created by
# sphinx-quickstart on Sun Sep 13 11:18:00 2015.
@@ -94,7 +93,7 @@ version = release = find_version("..", "vcr", "__init__.py")
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = "en"

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
@@ -317,5 +316,5 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {"python": ("https://docs.python.org/3", None)}

html_theme = "alabaster"


@@ -24,7 +24,7 @@ So whilst reporting issues are valuable, please consider:
- contributing an issue with a toy repo that replicates the issue.
- contributing PRs is a more valuable donation of your time and effort.

Thanks again for your interest and support in VCRpy.
We really appreciate it.
@@ -57,7 +57,7 @@ Simply adding these three labels for incoming issues means a lot for maintaining
- Which library does it affect? ``core``, ``aiohttp``, ``requests``, ``urllib3``, ``tornado4``, ``httplib2``
- If it is a bug, is it ``Verified Can Replicate`` or ``Requires Help Replicating``
- Thanking people for raising issues. Feedback is always appreciated.
- Politely asking if they are able to link to an example repo that replicates the issue if they haven't already. Being able to *clone and go* helps the next person and we like that. 😃

**Maintainer:**
@@ -68,13 +68,13 @@ This involves creating PRs to address bugs and enhancement requests. It also mea
The PR reviewer is a second set of eyes to see if:

- Are there tests covering the code paths added/modified?
- Do the tests and modifications make sense and seem appropriate?
- Add specific feedback, even on approvals, why it is accepted. eg "I like how you use a context manager there. 😄 "
- Also make sure they add a line to `docs/changelog.rst` to claim credit for their contribution.

**Release Manager:**

- Ensure CI is passing.
- Create a release on github and tag it with the changelog release notes.
- ``python3 setup.py build sdist bdist_wheel``
- ``twine upload dist/*``
- Go to ReadTheDocs build page and trigger a build https://readthedocs.org/projects/vcrpy/builds/
@@ -83,39 +83,21 @@ The PR reviewer is a second set of eyes to see if:
Running VCR's test suite
------------------------

The tests are all run automatically on `Github Actions CI <https://github.com/kevin1024/vcrpy/actions>`__,
but you can also run them yourself using `pytest <http://pytest.org/>`__.

In order for the boto3 tests to run, you will need an AWS key.
Refer to the `boto3
documentation <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/index.html>`__
for how to set this up. I have marked the boto3 tests as optional in
Travis so you don't have to worry about them failing if you submit a
pull request.

Using Pyenv with VCR's test suite
---------------------------------

Pyenv is a tool for managing multiple installations of python on your system.
See the full documentation at their `github <https://github.com/pyenv/pyenv>`_
in this example::

    git clone https://github.com/pyenv/pyenv ~/.pyenv
@@ -126,27 +108,21 @@ in this example::
# Setup shim paths # Setup shim paths
eval "$(pyenv init -)" eval "$(pyenv init -)"
# Setup your local system tox tooling
pip install tox tox-pyenv
# Install supported versions (at time of writing), this does not activate them # Install supported versions (at time of writing), this does not activate them
pyenv install 3.7.5 3.8.0 pypy3.8 pyenv install 3.12.0 pypy3.10
# This activates them # This activates them
pyenv local 3.7.5 3.8.0 pypy3.8 pyenv local 3.12.0 pypy3.10
# Run the whole test suite # Run the whole test suite
tox pip install .[tests]
./runtests.sh
# Run the whole test suite or just part of it
tox -e lint
tox -e py37-requests
Troubleshooting on MacOSX Troubleshooting on MacOSX
------------------------- -------------------------
If you have this kind of error when running tox : If you have this kind of error when running tests :
.. code:: python .. code:: python
@@ -4,25 +4,25 @@ Installation
 VCR.py is a package on `PyPI <https://pypi.python.org>`__, so you can install
 with pip::

-    pip install vcrpy
+    pip3 install vcrpy

 Compatibility
 -------------

-VCR.py supports Python 3.7+, and `pypy <http://pypy.org>`__.
+VCR.py supports Python 3.9+, and `pypy <http://pypy.org>`__.

 The following HTTP libraries are supported:

 - ``aiohttp``
-- ``boto``
 - ``boto3``
 - ``http.client``
 - ``httplib2``
-- ``requests`` (both 1.x and 2.x versions)
+- ``requests`` (>=2.16.2 versions)
 - ``tornado.httpclient``
 - ``urllib2``
 - ``urllib3``
 - ``httpx``
+- ``httpcore``

 Speed
 -----
@@ -35,7 +35,7 @@ rebuilding pyyaml.
 1. Test if pyyaml is built with libyaml. This should work::

-    python -c 'from yaml import CLoader'
+    python3 -c 'from yaml import CLoader'

 2. Install libyaml according to your Linux distribution, or using `Homebrew
 <http://mxcl.github.com/homebrew/>`__ on Mac::
@@ -46,8 +46,8 @@ rebuilding pyyaml.
 3. Rebuild pyyaml with libyaml::

-    pip uninstall pyyaml
-    pip --no-cache-dir install pyyaml
+    pip3 uninstall pyyaml
+    pip3 --no-cache-dir install pyyaml

 Upgrade
 -------
@@ -61,7 +61,7 @@ is to simply delete your cassettes and re-record all of them. VCR.py
 also provides a migration script that attempts to upgrade your 0.x
 cassettes to the new 1.x format. To use it, run the following command::

-    python -m vcr.migration PATH
+    python3 -m vcr.migration PATH

 The PATH can be either a path to the directory with cassettes or the
 path to a single cassette.
docs/requirements.txt Normal file
@@ -0,0 +1,2 @@
sphinx<9
sphinx_rtd_theme==3.0.2
@@ -4,11 +4,11 @@ Usage
 .. code:: python

     import vcr
-    import urllib
+    import urllib.request

     with vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml'):
         response = urllib.request.urlopen('http://www.iana.org/domains/reserved').read()
-        assert 'Example domains' in response
+        assert b'Example domains' in response

 Run this test once, and VCR.py will record the HTTP request to
 ``fixtures/vcr_cassettes/synopsis.yaml``. Run it again, and VCR.py will
@@ -26,7 +26,7 @@ look like this:
     @vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml')
     def test_iana():
         response = urllib.request.urlopen('http://www.iana.org/domains/reserved').read()
-        assert 'Example domains' in response
+        assert b'Example domains' in response

 When using the decorator version of ``use_cassette``, it is possible to
 omit the path to the cassette file.
@@ -36,7 +36,7 @@ omit the path to the cassette file.
     @vcr.use_cassette()
     def test_iana():
         response = urllib.request.urlopen('http://www.iana.org/domains/reserved').read()
-        assert 'Example domains' in response
+        assert b'Example domains' in response

 In this case, the cassette file will be given the same name as the test
 function, and it will be placed in the same directory as the file in
@@ -92,9 +92,73 @@ all
 Unittest Integration
 --------------------

-While it's possible to use the context manager or decorator forms with unittest,
-there's also a ``VCRTestCase`` provided separately by `vcrpy-unittest
-<https://github.com/agriffis/vcrpy-unittest>`__.
+Inherit from ``VCRTestCase`` for automatic recording and playback of HTTP
+interactions.
+
+.. code:: python
+
+    from vcr.unittest import VCRTestCase
+    import requests
+
+    class MyTestCase(VCRTestCase):
+        def test_something(self):
+            response = requests.get('http://example.com')
+
+Similar to how VCR.py returns the cassette from the context manager,
+``VCRTestCase`` makes the cassette available as ``self.cassette``:
+
+.. code:: python
+
+    self.assertEqual(len(self.cassette), 1)
+    self.assertEqual(self.cassette.requests[0].uri, 'http://example.com')
+
+By default cassettes will be placed in the ``cassettes`` subdirectory next to the
+test, named according to the test class and method. For example, the above test
+would read from and write to ``cassettes/MyTestCase.test_something.yaml``
+
+The configuration can be modified by overriding methods on your subclass:
+``_get_vcr_kwargs``, ``_get_cassette_library_dir`` and ``_get_cassette_name``.
+To modify the ``VCR`` object after instantiation, for example to add a matcher,
+you can hook on ``_get_vcr``, for example:
+
+.. code:: python
+
+    class MyTestCase(VCRTestCase):
+        def _get_vcr(self, **kwargs):
+            myvcr = super(MyTestCase, self)._get_vcr(**kwargs)
+            myvcr.register_matcher('mymatcher', mymatcher)
+            myvcr.match_on = ['mymatcher']
+            return myvcr
+
+See
+`the source
+<https://github.com/kevin1024/vcrpy/blob/master/vcr/unittest.py>`__
+for the default implementations of these methods.
+
+If you implement a ``setUp`` method on your test class then make sure to call
+the parent version ``super().setUp()`` in your own in order to continue getting
+the cassettes produced.
+
+VCRMixin
+~~~~~~~~
+
+In case inheriting from ``VCRTestCase`` is difficult because of an existing
+class hierarchy containing tests in the base classes, inherit from ``VCRMixin``
+instead.
+
+.. code:: python
+
+    from vcr.unittest import VCRMixin
+    import requests
+    import unittest
+
+    class MyTestMixin(VCRMixin):
+        def test_something(self):
+            response = requests.get(self.url)
+
+    class MyTestCase(MyTestMixin, unittest.TestCase):
+        url = 'http://example.com'

 Pytest Integration
 ------------------
@@ -1,2 +1,32 @@
-[tool.black]
-line-length=110
+[tool.codespell]
+skip = '.git,*.pdf,*.svg,.tox'
ignore-regex = "\\\\[fnrstv]"
[tool.pytest]
addopts = ["--strict-config", "--strict-markers"]
asyncio_default_fixture_loop_scope = "session"
asyncio_default_test_loop_scope = "session"
markers = ["online"]
[tool.ruff]
line-length = 110
target-version = "py310"
[tool.ruff.lint]
select = [
"B", # flake8-bugbear
"C4", # flake8-comprehensions
"COM", # flake8-commas
"E", # pycodestyle error
"F", # pyflakes
"I", # isort
"ISC", # flake8-implicit-str-concat
"PIE", # flake8-pie
"RUF", # Ruff-specific rules
"UP", # pyupgrade
"W", # pycodestyle warning
"SIM",
]
[tool.ruff.lint.isort]
known-first-party = ["vcr"]
@@ -1,7 +1,5 @@
 #!/bin/bash
-# https://blog.ionelmc.ro/2015/04/14/tox-tricks-and-patterns/#when-it-inevitably-leads-to-shell-scripts
-# If you are getting an INVOCATION ERROR for this script then there is
-# a good chance you are running on Windows.
-# You can and should use WSL for running tox on Windows when it calls bash scripts.
-REQUESTS_CA_BUNDLE=`python -m pytest_httpbin.certs` pytest $*
+# If you are getting an INVOCATION ERROR for this script then there is a good chance you are running on Windows.
+# You can and should use WSL for running tests on Windows when it calls bash scripts.
+REQUESTS_CA_BUNDLE=`python3 -m pytest_httpbin.certs` exec pytest "$@"
@@ -3,12 +3,11 @@
 import codecs
 import os
 import re
-import sys
+from pathlib import Path

-from setuptools import setup, find_packages
-from setuptools.command.test import test as TestCommand
+from setuptools import find_packages, setup

-long_description = open("README.rst", "r").read()
+long_description = Path("README.rst").read_text()

 here = os.path.abspath(os.path.dirname(__file__))
@@ -28,27 +27,33 @@ def find_version(*file_paths):
     raise RuntimeError("Unable to find version string.")


-class PyTest(TestCommand):
-    def finalize_options(self):
-        TestCommand.finalize_options(self)
-        self.test_args = []
-        self.test_suite = True
-
-    def run_tests(self):
-        # import here, cause outside the eggs aren't loaded
-        import pytest
-
-        errno = pytest.main(self.test_args)
-        sys.exit(errno)


 install_requires = [
     "PyYAML",
     "wrapt",
-    "six>=1.5",
     "yarl",
 ]

+extras_require = {
+    "tests": [
+        "aiohttp",
+        "boto3",
+        "cryptography",
+        "httpbin",
+        "httpcore",
+        "httplib2",
+        "httpx",
+        "pycurl; platform_python_implementation !='PyPy'",
+        "pytest",
+        "pytest-aiohttp",
+        "pytest-asyncio",
+        "pytest-cov",
+        "pytest-httpbin",
+        "requests>=2.22.0",
+        "tornado",
+        "urllib3",
+        "werkzeug==2.0.3",
+    ],
+}

 setup(
     name="vcrpy",
     version=find_version("vcr", "__init__.py"),
@@ -59,20 +64,21 @@ setup(
     author_email="me@kevinmccarthy.org",
     url="https://github.com/kevin1024/vcrpy",
     packages=find_packages(exclude=["tests*"]),
-    python_requires=">=3.7",
+    python_requires=">=3.10",
     install_requires=install_requires,
     license="MIT",
-    tests_require=["pytest", "mock", "pytest-httpbin"],
+    extras_require=extras_require,
+    tests_require=extras_require["tests"],
     classifiers=[
         "Development Status :: 5 - Production/Stable",
         "Environment :: Console",
         "Intended Audience :: Developers",
         "Programming Language :: Python",
         "Programming Language :: Python :: 3",
-        "Programming Language :: Python :: 3.7",
-        "Programming Language :: Python :: 3.8",
-        "Programming Language :: Python :: 3.9",
         "Programming Language :: Python :: 3.10",
+        "Programming Language :: Python :: 3.11",
+        "Programming Language :: Python :: 3.12",
+        "Programming Language :: Python :: 3.13",
         "Programming Language :: Python :: 3 :: Only",
         "Programming Language :: Python :: Implementation :: CPython",
         "Programming Language :: Python :: Implementation :: PyPy",
tests/__init__.py Normal file
@@ -11,9 +11,12 @@ def assert_cassette_has_one_response(cass):
     assert cass.play_count == 1


-def assert_is_json(a_string):
+def assert_is_json_bytes(b: bytes):
+    assert isinstance(b, bytes)
     try:
-        json.loads(a_string.decode("utf-8"))
-    except Exception:
-        assert False
+        json.loads(b)
+    except Exception as error:
+        raise AssertionError() from error
     assert True
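Stand-alone, the fixed helper behaves like this (same logic as the new side of the diff above, runnable outside the test suite):

```python
import json


def assert_is_json_bytes(b: bytes):
    # Fail loudly if the payload is not bytes or not valid JSON;
    # chaining via `from error` keeps the original decode error visible.
    assert isinstance(b, bytes)
    try:
        json.loads(b)
    except Exception as error:
        raise AssertionError() from error


assert_is_json_bytes(b'{"origin": "217.122.164.194"}')  # passes silently
```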
@@ -15,9 +15,9 @@
        },
        "response": {
            "status": {
                "message": "OK",
                "code": 200
            },
            "headers": {
                "access-control-allow-origin": ["*"],
                "content-type": ["application/json"],
@@ -25,7 +25,7 @@
                "server": ["gunicorn/0.17.4"],
                "content-length": ["32"],
                "connection": ["keep-alive"]
            },
            "body": {
                "string": "{\n \"origin\": \"217.122.164.194\"\n}"
            }
@@ -2,7 +2,7 @@ version: 1
interactions:
- request:
    body: null
    headers:
      accept: ['*/*']
      accept-encoding: ['gzip, deflate, compress']
      user-agent: ['python-requests/2.2.1 CPython/2.6.1 Darwin/10.8.0']
@@ -1,31 +1,31 @@
[
    {
        "request": {
            "body": null,
            "protocol": "http",
            "method": "GET",
            "headers": {
                "accept-encoding": "gzip, deflate, compress",
                "accept": "*/*",
                "user-agent": "python-requests/2.2.1 CPython/2.6.1 Darwin/10.8.0"
            },
            "host": "httpbin.org",
            "path": "/ip",
            "port": 80
        },
        "response": {
            "status": {
                "message": "OK",
                "code": 200
            },
            "headers": [
                "access-control-allow-origin: *\r\n",
                "content-type: application/json\r\n",
                "date: Mon, 21 Apr 2014 23:13:40 GMT\r\n",
                "server: gunicorn/0.17.4\r\n",
                "content-length: 32\r\n",
                "connection: keep-alive\r\n"
            ],
            "body": {
                "string": "{\n \"origin\": \"217.122.164.194\"\n}"
            }
@@ -10,7 +10,7 @@ interactions:
    uri: http://seomoz.org/
  response:
    body: {string: ''}
    headers:
      Location: ['http://moz.com/']
      Server: ['BigIP']
      Connection: ['Keep-Alive']
@@ -2,28 +2,27 @@
 import asyncio

 import aiohttp
-from aiohttp.test_utils import TestClient


 async def aiohttp_request(loop, method, url, output="text", encoding="utf-8", content_type=None, **kwargs):
-    session = aiohttp.ClientSession(loop=loop)
+    async with aiohttp.ClientSession(loop=loop) as session:
         response_ctx = session.request(method, url, **kwargs)
         response = await response_ctx.__aenter__()
         if output == "text":
             content = await response.text()
         elif output == "json":
             content_type = content_type or "application/json"
             content = await response.json(encoding=encoding, content_type=content_type)
         elif output == "raw":
             content = await response.read()
         elif output == "stream":
             content = await response.content.read()
         response_ctx._resp.close()
         await session.close()
         return response, content


 def aiohttp_app():
@@ -0,0 +1,41 @@
interactions:
- request:
body: ''
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate, br
connection:
- keep-alive
host:
- httpbin.org
user-agent:
- python-httpx/0.23.0
method: GET
uri: https://httpbin.org/gzip
response:
content: "{\n \"gzipped\": true, \n \"headers\": {\n \"Accept\": \"*/*\",
\n \"Accept-Encoding\": \"gzip, deflate, br\", \n \"Host\": \"httpbin.org\",
\n \"User-Agent\": \"python-httpx/0.23.0\", \n \"X-Amzn-Trace-Id\": \"Root=1-62a62a8d-5f39b5c50c744da821d6ea99\"\n
\ }, \n \"method\": \"GET\", \n \"origin\": \"146.200.25.115\"\n}\n"
headers:
Access-Control-Allow-Credentials:
- 'true'
Access-Control-Allow-Origin:
- '*'
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Length:
- '230'
Content-Type:
- application/json
Date:
- Sun, 12 Jun 2022 18:03:57 GMT
Server:
- gunicorn/19.9.0
http_version: HTTP/1.1
status_code: 200
version: 1
@@ -0,0 +1,42 @@
interactions:
- request:
body: null
headers:
Accept:
- '*/*'
Accept-Encoding:
- gzip, deflate, br
Connection:
- keep-alive
User-Agent:
- python-requests/2.28.0
method: GET
uri: https://httpbin.org/gzip
response:
body:
string: !!binary |
H4sIAKwrpmIA/z2OSwrCMBCG956izLIkfQSxkl2RogfQA9R2bIM1iUkqaOndnYDIrGa+/zELDB9l
LfYgg5uRwYhtj86DXKDuOrQBJKR5Cuy38kZ3pld6oHu0sqTH29QGZMnVkepgtMYuKKNJcEe0vJ3U
C4mcjI9hpaiygqaUW7ETFYGLR8frAXXE9h1Go7nD54w++FxkYp8VsDJ4IBH6E47NmVzGqUHFkn8g
rJsvp2omYs8AAAA=
headers:
Access-Control-Allow-Credentials:
- 'true'
Access-Control-Allow-Origin:
- '*'
Connection:
- Close
Content-Encoding:
- gzip
Content-Length:
- '182'
Content-Type:
- application/json
Date:
- Sun, 12 Jun 2022 18:08:44 GMT
Server:
- Pytest-HTTPBIN/0.1.0
status:
code: 200
message: great
version: 1
@@ -13,7 +13,7 @@ interactions:
      user-agent:
      - python-httpx/0.12.1
     method: GET
-    uri: https://httpbin.org/headers
+    uri: https://mockbin.org/headers
   response:
     content: "{\n \"headers\": {\n \"Accept\": \"*/*\", \n \"Accept-Encoding\"\
      : \"gzip, deflate, br\", \n \"Host\": \"httpbin.org\", \n \"User-Agent\"\
@@ -1,21 +1,27 @@
-import contextlib
 import logging
+import ssl
 import urllib.parse

 import pytest
+import pytest_httpbin.certs
+import yarl
+
+import vcr

 asyncio = pytest.importorskip("asyncio")
 aiohttp = pytest.importorskip("aiohttp")

-import vcr  # noqa: E402
-
 from .aiohttp_utils import aiohttp_app, aiohttp_request  # noqa: E402

+HTTPBIN_SSL_CONTEXT = ssl.create_default_context(cafile=pytest_httpbin.certs.where())


 def run_in_loop(fn):
-    with contextlib.closing(asyncio.new_event_loop()) as loop:
-        asyncio.set_event_loop(loop)
-        task = loop.create_task(fn(loop))
-        return loop.run_until_complete(task)
+    async def wrapper():
+        return await fn(asyncio.get_running_loop())
+
+    return asyncio.run(wrapper())
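The rewritten ``run_in_loop`` can be exercised with any coroutine function that accepts a loop; a minimal stdlib-only sketch (the ``sample`` coroutine is a hypothetical stand-in for the test callbacks):

```python
import asyncio


def run_in_loop(fn):
    # Same shape as the rewritten helper above: asyncio.run() creates and
    # closes the event loop, replacing the manual new_event_loop() dance.
    async def wrapper():
        return await fn(asyncio.get_running_loop())

    return asyncio.run(wrapper())


async def sample(loop):
    # The callback still receives the running loop, as the tests expect.
    await asyncio.sleep(0)
    return loop is asyncio.get_running_loop()


print(run_in_loop(sample))  # → True
```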
def request(method, url, output="text", **kwargs): def request(method, url, output="text", **kwargs):
@@ -33,14 +39,10 @@ def post(url, output="text", **kwargs):
return request("POST", url, output="text", **kwargs) return request("POST", url, output="text", **kwargs)
-@pytest.fixture(params=["https", "http"])
-def scheme(request):
-    """Fixture that returns both http and https."""
-    return request.param
-
-
-def test_status(tmpdir, scheme):
-    url = scheme + "://httpbin.org"
+@pytest.mark.online
+def test_status(tmpdir, httpbin):
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("status.yaml"))):
         response, _ = get(url)
@@ -50,9 +52,10 @@ def test_status(tmpdir, scheme):
         assert cassette.play_count == 1


+@pytest.mark.online
 @pytest.mark.parametrize("auth", [None, aiohttp.BasicAuth("vcrpy", "test")])
-def test_headers(tmpdir, scheme, auth):
-    url = scheme + "://httpbin.org"
+def test_headers(tmpdir, auth, httpbin):
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
         response, _ = get(url, auth=auth)
@@ -61,14 +64,16 @@ def test_headers(tmpdir, scheme, auth):
         request = cassette.requests[0]
         assert "AUTHORIZATION" in request.headers
         cassette_response, _ = get(url, auth=auth)
-        assert dict(cassette_response.headers) == dict(response.headers)
+        assert cassette_response.headers.items() == response.headers.items()
         assert cassette.play_count == 1
     assert "istr" not in cassette.data[0]
     assert "yarl.URL" not in cassette.data[0]


-def test_case_insensitive_headers(tmpdir, scheme):
-    url = scheme + "://httpbin.org"
+@pytest.mark.online
+def test_case_insensitive_headers(tmpdir, httpbin):
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("whatever.yaml"))):
         _, _ = get(url)
@@ -79,8 +84,10 @@ def test_case_insensitive_headers(tmpdir, scheme):
     assert cassette.play_count == 1


-def test_text(tmpdir, scheme):
-    url = scheme + "://httpbin.org"
+@pytest.mark.online
+def test_text(tmpdir, httpbin):
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("text.yaml"))):
         _, response_text = get(url)
@@ -90,8 +97,9 @@ def test_text(tmpdir, scheme):
     assert cassette.play_count == 1


-def test_json(tmpdir, scheme):
-    url = scheme + "://httpbin.org/get"
+@pytest.mark.online
+def test_json(tmpdir, httpbin):
+    url = httpbin.url + "/json"
     headers = {"Content-Type": "application/json"}

     with vcr.use_cassette(str(tmpdir.join("json.yaml"))):
@@ -103,8 +111,9 @@ def test_json(tmpdir, scheme):
     assert cassette.play_count == 1


-def test_binary(tmpdir, scheme):
-    url = scheme + "://httpbin.org/image/png"
+@pytest.mark.online
+def test_binary(tmpdir, httpbin):
+    url = httpbin.url + "/image/png"
     with vcr.use_cassette(str(tmpdir.join("binary.yaml"))):
         _, response_binary = get(url, output="raw")
@@ -114,23 +123,25 @@ def test_binary(tmpdir, scheme):
     assert cassette.play_count == 1


-def test_stream(tmpdir, scheme):
-    url = scheme + "://httpbin.org/get"
+@pytest.mark.online
+def test_stream(tmpdir, httpbin):
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("stream.yaml"))):
-        resp, body = get(url, output="raw")  # Do not use stream here, as the stream is exhausted by vcr
+        _, body = get(url, output="raw")  # Do not use stream here, as the stream is exhausted by vcr

     with vcr.use_cassette(str(tmpdir.join("stream.yaml"))) as cassette:
-        cassette_resp, cassette_body = get(url, output="stream")
+        _, cassette_body = get(url, output="stream")
         assert cassette_body == body
         assert cassette.play_count == 1


+@pytest.mark.online
 @pytest.mark.parametrize("body", ["data", "json"])
-def test_post(tmpdir, scheme, body, caplog):
+def test_post(tmpdir, body, caplog, httpbin):
     caplog.set_level(logging.INFO)
     data = {"key1": "value1", "key2": "value2"}
-    url = scheme + "://httpbin.org/post"
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("post.yaml"))):
         _, response_json = post(url, **{body: data})
@@ -145,14 +156,15 @@ def test_post(tmpdir, scheme, body, caplog):
         (
             log
             for log in caplog.records
-            if log.getMessage() == "<Request (POST) {}> not in cassette, sending to real server".format(url)
+            if log.getMessage() == f"<Request (POST) {url}> not in cassette, sending to real server"
         ),
         None,
     ), "Log message not found."


-def test_params(tmpdir, scheme):
-    url = scheme + "://httpbin.org/get?d=d"
+@pytest.mark.online
+def test_params(tmpdir, httpbin):
+    url = httpbin.url + "/get?d=d"
     headers = {"Content-Type": "application/json"}
     params = {"a": 1, "b": 2, "c": "c"}
@@ -166,8 +178,9 @@ def test_params(tmpdir, scheme):
     assert cassette.play_count == 1


-def test_params_same_url_distinct_params(tmpdir, scheme):
-    url = scheme + "://httpbin.org/get"
+@pytest.mark.online
+def test_params_same_url_distinct_params(tmpdir, httpbin):
+    url = httpbin.url + "/json"
     headers = {"Content-Type": "application/json"}
     params = {"a": 1, "b": 2, "c": "c"}
@@ -180,13 +193,16 @@ def test_params_same_url_distinct_params(tmpdir, scheme):
     assert cassette.play_count == 1

     other_params = {"other": "params"}
-    with vcr.use_cassette(str(tmpdir.join("get.yaml"))) as cassette:
-        with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException):
-            get(url, output="text", params=other_params)
+    with (
+        vcr.use_cassette(str(tmpdir.join("get.yaml"))) as cassette,
+        pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException),
+    ):
+        get(url, output="text", params=other_params)


-def test_params_on_url(tmpdir, scheme):
-    url = scheme + "://httpbin.org/get?a=1&b=foo"
+@pytest.mark.online
+def test_params_on_url(tmpdir, httpbin):
+    url = httpbin.url + "/get?a=1&b=foo"
     headers = {"Content-Type": "application/json"}

     with vcr.use_cassette(str(tmpdir.join("get.yaml"))) as cassette:
@@ -250,8 +266,9 @@ def test_aiohttp_test_client_json(aiohttp_client, tmpdir):
     assert cassette.play_count == 1


-def test_redirect(aiohttp_client, tmpdir):
-    url = "https://mockbin.org/redirect/302/2"
+@pytest.mark.online
+def test_redirect(tmpdir, httpbin):
+    url = httpbin.url + "/redirect/2"
     with vcr.use_cassette(str(tmpdir.join("redirect.yaml"))):
         response, _ = get(url)
@@ -268,15 +285,14 @@ def test_redirect(aiohttp_client, tmpdir):
     # looking request_info.
     assert cassette_response.request_info.url == response.request_info.url
     assert cassette_response.request_info.method == response.request_info.method
-    assert {k: v for k, v in cassette_response.request_info.headers.items()} == {
-        k: v for k, v in response.request_info.headers.items()
-    }
+    assert cassette_response.request_info.headers.items() == response.request_info.headers.items()
     assert cassette_response.request_info.real_url == response.request_info.real_url


-def test_not_modified(aiohttp_client, tmpdir):
+@pytest.mark.online
+def test_not_modified(tmpdir, httpbin):
     """It doesn't try to redirect on 304"""
-    url = "https://httpbin.org/status/304"
+    url = httpbin.url + "/status/304"
     with vcr.use_cassette(str(tmpdir.join("not_modified.yaml"))):
         response, _ = get(url)
@@ -291,13 +307,14 @@ def test_not_modified(aiohttp_client, tmpdir):
     assert cassette.play_count == 1


-def test_double_requests(tmpdir):
+@pytest.mark.online
+def test_double_requests(tmpdir, httpbin):
     """We should capture, record, and replay all requests and response chains,
     even if there are duplicate ones.

     We should replay in the order we saw them.
     """
-    url = "https://httpbin.org/get"
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("text.yaml"))):
         _, response_text1 = get(url, output="text")
@@ -322,31 +339,41 @@ def test_double_requests(tmpdir):
assert cassette.play_count == 2 assert cassette.play_count == 2
def test_cookies(scheme, tmpdir): def test_cookies(httpbin_both, tmpdir):
async def run(loop): async def run(loop):
cookies_url = scheme + ( cookies_url = httpbin_both.url + (
"://httpbin.org/response-headers?" "/response-headers?"
"set-cookie=" + urllib.parse.quote("cookie_1=val_1; Path=/") + "&" "set-cookie=" + urllib.parse.quote("cookie_1=val_1; Path=/") + "&"
"Set-Cookie=" + urllib.parse.quote("Cookie_2=Val_2; Path=/") "Set-Cookie=" + urllib.parse.quote("Cookie_2=Val_2; Path=/")
) )
home_url = scheme + "://httpbin.org/" home_url = httpbin_both.url + "/"
tmp = str(tmpdir.join("cookies.yaml")) tmp = str(tmpdir.join("cookies.yaml"))
req_cookies = {"Cookie_3": "Val_3"} req_cookies = {"Cookie_3": "Val_3"}
req_headers = {"Cookie": "Cookie_4=Val_4"} req_headers = {"Cookie": "Cookie_4=Val_4"}
# ------------------------- Record -------------------------- # # ------------------------- Record -------------------------- #
with vcr.use_cassette(tmp) as cassette: with vcr.use_cassette(tmp) as cassette:
async with aiohttp.ClientSession(loop=loop) as session: async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url) cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
home_resp = await session.get(home_url, cookies=req_cookies, headers=req_headers) home_resp = await session.get(
home_url,
cookies=req_cookies,
headers=req_headers,
ssl=HTTPBIN_SSL_CONTEXT,
)
assert cassette.play_count == 0 assert cassette.play_count == 0
assert_responses(cookies_resp, home_resp) assert_responses(cookies_resp, home_resp)
# -------------------------- Play --------------------------- # # -------------------------- Play --------------------------- #
with vcr.use_cassette(tmp, record_mode=vcr.mode.NONE) as cassette: with vcr.use_cassette(tmp, record_mode=vcr.mode.NONE) as cassette:
async with aiohttp.ClientSession(loop=loop) as session: async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url) cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
home_resp = await session.get(home_url, cookies=req_cookies, headers=req_headers) home_resp = await session.get(
home_url,
cookies=req_cookies,
headers=req_headers,
ssl=HTTPBIN_SSL_CONTEXT,
)
assert cassette.play_count == 2 assert cassette.play_count == 2
assert_responses(cookies_resp, home_resp) assert_responses(cookies_resp, home_resp)
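The `response-headers` URL built above percent-encodes each Set-Cookie value with `urllib.parse.quote`; a quick stdlib sketch of that encoding, independent of the test suite:

```python
from urllib.parse import quote, unquote

# Percent-encode a Set-Cookie value the way the test builds its query string.
# quote() keeps "/" by default but escapes "=", ";" and spaces.
encoded = quote("cookie_1=val_1; Path=/")
print(encoded)  # cookie_1%3Dval_1%3B%20Path%3D/

# The round trip restores the original header value.
assert unquote(encoded) == "cookie_1=val_1; Path=/"
```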
@@ -362,57 +389,76 @@ def test_cookies(scheme, tmpdir):
     run_in_loop(run)


-def test_cookies_redirect(scheme, tmpdir):
+def test_cookies_redirect(httpbin_both, tmpdir):
     async def run(loop):
         # Sets cookie as provided by the query string and redirects
-        cookies_url = scheme + "://httpbin.org/cookies/set?Cookie_1=Val_1"
+        cookies_url = httpbin_both.url + "/cookies/set?Cookie_1=Val_1"
         tmp = str(tmpdir.join("cookies.yaml"))

         # ------------------------- Record -------------------------- #
         with vcr.use_cassette(tmp) as cassette:
-            async with aiohttp.ClientSession(loop=loop) as session:
-                cookies_resp = await session.get(cookies_url)
+            async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
+                cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
             assert not cookies_resp.cookies
-            cookies = session.cookie_jar.filter_cookies(cookies_url)
+            cookies = session.cookie_jar.filter_cookies(yarl.URL(cookies_url))
             assert cookies["Cookie_1"].value == "Val_1"

             assert cassette.play_count == 0
-            cassette.requests[1].headers["Cookie"] == "Cookie_1=Val_1"
+            assert cassette.requests[1].headers["Cookie"] == "Cookie_1=Val_1"

         # -------------------------- Play --------------------------- #
         with vcr.use_cassette(tmp, record_mode=vcr.mode.NONE) as cassette:
-            async with aiohttp.ClientSession(loop=loop) as session:
-                cookies_resp = await session.get(cookies_url)
+            async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
+                cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
             assert not cookies_resp.cookies
-            cookies = session.cookie_jar.filter_cookies(cookies_url)
+            cookies = session.cookie_jar.filter_cookies(yarl.URL(cookies_url))
             assert cookies["Cookie_1"].value == "Val_1"

             assert cassette.play_count == 2
-            cassette.requests[1].headers["Cookie"] == "Cookie_1=Val_1"
+            assert cassette.requests[1].headers["Cookie"] == "Cookie_1=Val_1"

         # Assert that it's ignoring expiration date
         with vcr.use_cassette(tmp, record_mode=vcr.mode.NONE) as cassette:
             cassette.responses[0]["headers"]["set-cookie"] = [
-                "Cookie_1=Val_1; Expires=Wed, 21 Oct 2015 07:28:00 GMT"
+                "Cookie_1=Val_1; Expires=Wed, 21 Oct 2015 07:28:00 GMT",
             ]
-            async with aiohttp.ClientSession(loop=loop) as session:
-                cookies_resp = await session.get(cookies_url)
+            async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
+                cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
             assert not cookies_resp.cookies
-            cookies = session.cookie_jar.filter_cookies(cookies_url)
+            cookies = session.cookie_jar.filter_cookies(yarl.URL(cookies_url))
             assert cookies["Cookie_1"].value == "Val_1"

     run_in_loop(run)
-def test_not_allow_redirects(tmpdir):
-    url = "https://mockbin.org/redirect/308/5"
+@pytest.mark.online
+def test_not_allow_redirects(tmpdir, httpbin):
+    url = httpbin + "/redirect-to?url=.%2F&status_code=308"
     path = str(tmpdir.join("redirects.yaml"))

     with vcr.use_cassette(path):
         response, _ = get(url, allow_redirects=False)
-        assert response.url.path == "/redirect/308/5"
+        assert response.url.path == "/redirect-to"
         assert response.status == 308

     with vcr.use_cassette(path) as cassette:
         response, _ = get(url, allow_redirects=False)
-        assert response.url.path == "/redirect/308/5"
+        assert response.url.path == "/redirect-to"
         assert response.status == 308
         assert cassette.play_count == 1
+
+
+def test_filter_query_parameters(tmpdir, httpbin):
+    url = httpbin + "?password=secret"
+    path = str(tmpdir.join("query_param_filter.yaml"))
+    with vcr.use_cassette(path, filter_query_parameters=["password"]) as cassette:
+        get(url)
+    assert "password" not in cassette.requests[0].url
+    assert "secret" not in cassette.requests[0].url
+    with open(path) as f:
+        cassette_content = f.read()
+    assert "password" not in cassette_content
+    assert "secret" not in cassette_content


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 """Basic tests for cassettes"""

 # External imports
@@ -40,7 +39,7 @@ def test_basic_json_use(tmpdir, httpbin):
     test_fixture = str(tmpdir.join("synopsis.json"))
     with vcr.use_cassette(test_fixture, serializer="json"):
         response = urlopen(httpbin.url).read()
-        assert b"difficult sometimes" in response
+        assert b"HTTP Request & Response Service" in response


 def test_patched_content(tmpdir, httpbin):


@@ -1,79 +0,0 @@
-import pytest
-
-boto = pytest.importorskip("boto")
-
-import boto  # NOQA
-import boto.iam  # NOQA
-from boto.s3.connection import S3Connection  # NOQA
-from boto.s3.key import Key  # NOQA
-from configparser import DuplicateSectionError  # NOQA
-
-import vcr  # NOQA
-
-
-def test_boto_stubs(tmpdir):
-    with vcr.use_cassette(str(tmpdir.join("boto-stubs.yml"))):
-        # Perform the imports within the patched context so that
-        # CertValidatingHTTPSConnection refers to the patched version.
-        from boto.https_connection import CertValidatingHTTPSConnection
-        from vcr.stubs.boto_stubs import VCRCertValidatingHTTPSConnection
-
-        # Prove that the class was patched by the stub and that we can instantiate it.
-        assert issubclass(CertValidatingHTTPSConnection, VCRCertValidatingHTTPSConnection)
-        CertValidatingHTTPSConnection("hostname.does.not.matter")
-
-
-def test_boto_without_vcr():
-    s3_conn = S3Connection()
-    s3_bucket = s3_conn.get_bucket("boto-demo-1394171994")  # a bucket you can access
-    k = Key(s3_bucket)
-    k.key = "test.txt"
-    k.set_contents_from_string("hello world i am a string")
-
-
-def test_boto_medium_difficulty(tmpdir):
-    s3_conn = S3Connection()
-    s3_bucket = s3_conn.get_bucket("boto-demo-1394171994")  # a bucket you can access
-    with vcr.use_cassette(str(tmpdir.join("boto-medium.yml"))):
-        k = Key(s3_bucket)
-        k.key = "test.txt"
-        k.set_contents_from_string("hello world i am a string")
-
-    with vcr.use_cassette(str(tmpdir.join("boto-medium.yml"))):
-        k = Key(s3_bucket)
-        k.key = "test.txt"
-        k.set_contents_from_string("hello world i am a string")
-
-
-def test_boto_hardcore_mode(tmpdir):
-    with vcr.use_cassette(str(tmpdir.join("boto-hardcore.yml"))):
-        s3_conn = S3Connection()
-        s3_bucket = s3_conn.get_bucket("boto-demo-1394171994")  # a bucket you can access
-        k = Key(s3_bucket)
-        k.key = "test.txt"
-        k.set_contents_from_string("hello world i am a string")
-
-    with vcr.use_cassette(str(tmpdir.join("boto-hardcore.yml"))):
-        s3_conn = S3Connection()
-        s3_bucket = s3_conn.get_bucket("boto-demo-1394171994")  # a bucket you can access
-        k = Key(s3_bucket)
-        k.key = "test.txt"
-        k.set_contents_from_string("hello world i am a string")
-
-
-def test_boto_iam(tmpdir):
-    try:
-        boto.config.add_section("Boto")
-    except DuplicateSectionError:
-        pass
-
-    # Ensure that boto uses HTTPS
-    boto.config.set("Boto", "is_secure", "true")
-    # Ensure that boto uses CertValidatingHTTPSConnection
-    boto.config.set("Boto", "https_validate_certificates", "true")
-
-    with vcr.use_cassette(str(tmpdir.join("boto-iam.yml"))):
-        iam_conn = boto.iam.connect_to_region("universal")
-        iam_conn.get_all_users()
-
-    with vcr.use_cassette(str(tmpdir.join("boto-iam.yml"))):
-        iam_conn = boto.iam.connect_to_region("universal")
-        iam_conn.get_all_users()


@@ -1,14 +1,15 @@
-import pytest
 import os

+import pytest
+
+import vcr
+
 boto3 = pytest.importorskip("boto3")

-import boto3  # NOQA
-import botocore  # NOQA
-
-import vcr  # NOQA
+import botocore  # noqa

 try:
-    from botocore import awsrequest  # NOQA
+    from botocore import awsrequest  # noqa

     botocore_awsrequest = True
 except ImportError:
@@ -18,12 +19,12 @@ except ImportError:
 # https://github.com/boto/botocore/pull/1495
 boto3_skip_vendored_requests = pytest.mark.skipif(
     botocore_awsrequest,
-    reason="botocore version {ver} does not use vendored requests anymore.".format(ver=botocore.__version__),
+    reason=f"botocore version {botocore.__version__} does not use vendored requests anymore.",
 )

 boto3_skip_awsrequest = pytest.mark.skipif(
     not botocore_awsrequest,
-    reason="botocore version {ver} still uses vendored requests.".format(ver=botocore.__version__),
+    reason=f"botocore version {botocore.__version__} still uses vendored requests.",
 )

 IAM_USER_NAME = "vcrpy"
@@ -55,24 +56,6 @@ def get_user(iam_client):
     return _get_user


-@boto3_skip_vendored_requests
-def test_boto_vendored_stubs(tmpdir):
-    with vcr.use_cassette(str(tmpdir.join("boto3-stubs.yml"))):
-        # Perform the imports within the patched context so that
-        # HTTPConnection, VerifiedHTTPSConnection refers to the patched version.
-        from botocore.vendored.requests.packages.urllib3.connectionpool import (
-            HTTPConnection,
-            VerifiedHTTPSConnection,
-        )
-        from vcr.stubs.boto3_stubs import VCRRequestsHTTPConnection, VCRRequestsHTTPSConnection
-
-        # Prove that the class was patched by the stub and that we can instantiate it.
-        assert issubclass(HTTPConnection, VCRRequestsHTTPConnection)
-        assert issubclass(VerifiedHTTPSConnection, VCRRequestsHTTPSConnection)
-        HTTPConnection("hostname.does.not.matter")
-        VerifiedHTTPSConnection("hostname.does.not.matter")
-
-
 @pytest.mark.skipif(
     os.environ.get("TRAVIS_PULL_REQUEST") != "false",
     reason="Encrypted Environment Variables from Travis Repository Settings"
@@ -80,7 +63,6 @@ def test_boto_vendored_stubs(tmpdir):
     "https://docs.travis-ci.com/user/pull-requests/#pull-requests-and-security-restrictions",
 )
 def test_boto_medium_difficulty(tmpdir, get_user):
     with vcr.use_cassette(str(tmpdir.join("boto3-medium.yml"))):
         response = get_user()
         assert response["User"]["UserName"] == IAM_USER_NAME


@@ -1,16 +1,20 @@
-import os
 import json
-import pytest
-import vcr
+import os
 from urllib.request import urlopen

+import pytest
+
+import vcr
+from vcr.cassette import Cassette
+

+@pytest.mark.online
 def test_set_serializer_default_config(tmpdir, httpbin):
     my_vcr = vcr.VCR(serializer="json")

     with my_vcr.use_cassette(str(tmpdir.join("test.json"))):
         assert my_vcr.serializer == "json"
-        urlopen(httpbin.url + "/get")
+        urlopen(httpbin.url)

     with open(str(tmpdir.join("test.json"))) as f:
         file_content = f.read()
@@ -18,27 +22,30 @@ def test_set_serializer_default_config(tmpdir, httpbin):
     assert json.loads(file_content)


+@pytest.mark.online
 def test_default_set_cassette_library_dir(tmpdir, httpbin):
     my_vcr = vcr.VCR(cassette_library_dir=str(tmpdir.join("subdir")))

     with my_vcr.use_cassette("test.json"):
-        urlopen(httpbin.url + "/get")
+        urlopen(httpbin.url)

     assert os.path.exists(str(tmpdir.join("subdir").join("test.json")))


+@pytest.mark.online
 def test_override_set_cassette_library_dir(tmpdir, httpbin):
     my_vcr = vcr.VCR(cassette_library_dir=str(tmpdir.join("subdir")))

     cld = str(tmpdir.join("subdir2"))
     with my_vcr.use_cassette("test.json", cassette_library_dir=cld):
-        urlopen(httpbin.url + "/get")
+        urlopen(httpbin.url)

     assert os.path.exists(str(tmpdir.join("subdir2").join("test.json")))
     assert not os.path.exists(str(tmpdir.join("subdir").join("test.json")))


+@pytest.mark.online
 def test_override_match_on(tmpdir, httpbin):
     my_vcr = vcr.VCR(match_on=["method"])
@@ -46,7 +53,7 @@ def test_override_match_on(tmpdir, httpbin):
         urlopen(httpbin.url)

     with my_vcr.use_cassette(str(tmpdir.join("test.json"))) as cass:
-        urlopen(httpbin.url + "/get")
+        urlopen(httpbin.url)

     assert len(cass) == 1
     assert cass.play_count == 1
@@ -55,6 +62,43 @@
 def test_missing_matcher():
     my_vcr = vcr.VCR()
     my_vcr.register_matcher("awesome", object)
-    with pytest.raises(KeyError):
-        with my_vcr.use_cassette("test.yaml", match_on=["notawesome"]):
-            pass
+    with pytest.raises(KeyError), my_vcr.use_cassette("test.yaml", match_on=["notawesome"]):
+        pass
+
+
+@pytest.mark.online
+def test_dont_record_on_exception(tmpdir, httpbin):
+    my_vcr = vcr.VCR(record_on_exception=False)
+
+    @my_vcr.use_cassette(str(tmpdir.join("dontsave.yml")))
+    def some_test():
+        assert b"Not in content" in urlopen(httpbin.url)
+
+    with pytest.raises(AssertionError):
+        some_test()
+
+    assert not os.path.exists(str(tmpdir.join("dontsave.yml")))
+
+    # Make sure context decorator has the same behavior
+    with pytest.raises(AssertionError), my_vcr.use_cassette(str(tmpdir.join("dontsave2.yml"))):
+        assert b"Not in content" in urlopen(httpbin.url).read()
+
+    assert not os.path.exists(str(tmpdir.join("dontsave2.yml")))
+
+
+def test_set_drop_unused_requests(tmpdir, httpbin):
+    my_vcr = vcr.VCR(drop_unused_requests=True)
+    file = str(tmpdir.join("test.yaml"))
+
+    with my_vcr.use_cassette(file):
+        urlopen(httpbin.url)
+        urlopen(httpbin.url + "/get")
+
+    cassette = Cassette.load(path=file)
+    assert len(cassette) == 2
+
+    with my_vcr.use_cassette(file):
+        urlopen(httpbin.url)
+
+    cassette = Cassette.load(path=file)
+    assert len(cassette) == 1
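The `record_on_exception=False` behavior tested above boils down to a context manager that persists its recording only when the body exits cleanly. A generic sketch of that pattern (the `recorder`/`save` names are hypothetical, not vcrpy internals):

```python
from contextlib import contextmanager


@contextmanager
def recorder(save, record_on_exception=True):
    """Collect events and persist them via save(); optionally skip persisting on error."""
    events = []
    try:
        yield events
    except Exception:
        if record_on_exception:
            save(events)
        raise
    else:
        save(events)


saved = []
try:
    with recorder(saved.extend, record_on_exception=False) as events:
        events.append("request")
        raise AssertionError("test failed")
except AssertionError:
    pass

print(saved)  # [] -- nothing persisted because the body raised
```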


@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 """Basic tests about save behavior"""

 # External imports
@@ -6,10 +5,13 @@ import os
 import time
 from urllib.request import urlopen

+import pytest
+
 # Internal imports
 import vcr


+@pytest.mark.online
 def test_disk_saver_nowrite(tmpdir, httpbin):
     """
     Ensure that when you close a cassette without changing it it doesn't
@@ -30,6 +32,7 @@ def test_disk_saver_nowrite(tmpdir, httpbin):
     assert last_mod == last_mod2


+@pytest.mark.online
 def test_disk_saver_write(tmpdir, httpbin):
     """
     Ensure that when you close a cassette after changing it it does


@@ -1,11 +1,14 @@
 import base64
-import pytest
-from urllib.request import urlopen, Request
-from urllib.parse import urlencode
-from urllib.error import HTTPError
-import vcr
 import json
-from assertions import assert_cassette_has_one_response, assert_is_json
+from urllib.error import HTTPError
+from urllib.parse import urlencode
+from urllib.request import Request, urlopen
+
+import pytest
+
+import vcr
+
+from ..assertions import assert_cassette_has_one_response, assert_is_json_bytes


 def _request_with_auth(url, username, password):
@@ -43,13 +46,18 @@ def test_filter_basic_auth(tmpdir, httpbin):


 def test_filter_querystring(tmpdir, httpbin):
-    url = httpbin.url + "/?foo=bar"
+    url = httpbin.url + "/?password=secret"
     cass_file = str(tmpdir.join("filter_qs.yaml"))
-    with vcr.use_cassette(cass_file, filter_query_parameters=["foo"]):
+    with vcr.use_cassette(cass_file, filter_query_parameters=["password"]):
         urlopen(url)
-    with vcr.use_cassette(cass_file, filter_query_parameters=["foo"]) as cass:
+    with vcr.use_cassette(cass_file, filter_query_parameters=["password"]) as cass:
         urlopen(url)
-    assert "foo" not in cass.requests[0].url
+    assert "password" not in cass.requests[0].url
+    assert "secret" not in cass.requests[0].url
+    with open(cass_file) as f:
+        cassette_content = f.read()
+    assert "password" not in cassette_content
+    assert "secret" not in cassette_content


 def test_filter_post_data(tmpdir, httpbin):
@@ -103,7 +111,19 @@ def test_decompress_gzip(tmpdir, httpbin):
     with vcr.use_cassette(cass_file) as cass:
         decoded_response = urlopen(url).read()
         assert_cassette_has_one_response(cass)
-    assert_is_json(decoded_response)
+    assert_is_json_bytes(decoded_response)
+
+
+def test_decomptess_empty_body(tmpdir, httpbin):
+    url = httpbin.url + "/gzip"
+    request = Request(url, headers={"Accept-Encoding": ["gzip, deflate"]}, method="HEAD")
+    cass_file = str(tmpdir.join("gzip_empty_response.yaml"))
+    with vcr.use_cassette(cass_file, decode_compressed_response=True):
+        response = urlopen(request).read()
+    with vcr.use_cassette(cass_file) as cass:
+        decoded_response = urlopen(request).read()
+        assert_cassette_has_one_response(cass)
+    assert decoded_response == response


 def test_decompress_deflate(tmpdir, httpbin):
@@ -115,7 +135,7 @@ def test_decompress_deflate(tmpdir, httpbin):
     with vcr.use_cassette(cass_file) as cass:
         decoded_response = urlopen(url).read()
         assert_cassette_has_one_response(cass)
-    assert_is_json(decoded_response)
+    assert_is_json_bytes(decoded_response)


 def test_decompress_regular(tmpdir, httpbin):
@@ -127,4 +147,25 @@ def test_decompress_regular(tmpdir, httpbin):
     with vcr.use_cassette(cass_file) as cass:
         resp = urlopen(url).read()
         assert_cassette_has_one_response(cass)
-    assert_is_json(resp)
+    assert_is_json_bytes(resp)
+
+
+def test_before_record_request_corruption(tmpdir, httpbin):
+    """Modifying request in before_record_request should not affect outgoing request"""
+
+    def before_record(request):
+        request.headers.clear()
+        request.body = b""
+        return request
+
+    req = Request(
+        httpbin.url + "/post",
+        data=urlencode({"test": "exists"}).encode(),
+        headers={"X-Test": "exists"},
+    )
+    cass_file = str(tmpdir.join("modified_response.yaml"))
+    with vcr.use_cassette(cass_file, before_record_request=before_record):
+        resp = json.loads(urlopen(req).read())
+        assert resp["headers"]["X-Test"] == "exists"
+        assert resp["form"]["test"] == "exists"


@@ -1,15 +1,13 @@
-# -*- coding: utf-8 -*-
 """Integration tests with httplib2"""

-import sys
 from urllib.parse import urlencode

 import pytest
 import pytest_httpbin.certs

 import vcr

-from assertions import assert_cassette_has_one_response
+from ..assertions import assert_cassette_has_one_response

 httplib2 = pytest.importorskip("httplib2")
@@ -20,8 +18,6 @@ def http():
     with the certificate replaced by the httpbin one.
     """
     kwargs = {"ca_certs": pytest_httpbin.certs.where()}
-    if sys.version_info[:2] in [(2, 7), (3, 7)]:
-        kwargs["disable_ssl_certificate_validation"] = True
     return httplib2.Http(**kwargs)
@@ -61,14 +57,15 @@ def test_response_headers(tmpdir, httpbin_both):
     assert set(headers) == set(resp.items())


-def test_effective_url(tmpdir):
+@pytest.mark.online
+def test_effective_url(tmpdir, httpbin):
     """Ensure that the effective_url is captured"""
-    url = "http://mockbin.org/redirect/301"
+    url = httpbin.url + "/redirect-to?url=.%2F&status_code=301"
     with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
         resp, _ = http().request(url)
         effective_url = resp["content-location"]
-        assert effective_url == "http://mockbin.org/redirect/301/0"
+        assert effective_url == httpbin.url + "/"

     with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
         resp, _ = http().request(url)
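The new target works because `/redirect-to?url=.%2F` asks the local httpbin to redirect to the relative path `./`, which resolves against the request URL to the site root; `urllib.parse` shows the resolution (the host below is a made-up stand-in for the fixture URL):

```python
from urllib.parse import unquote, urljoin

target = unquote(".%2F")                    # the url= query value decodes to "./"
base = "http://127.0.0.1:5000/redirect-to"  # hypothetical httpbin fixture URL
resolved = urljoin(base, target)
print(resolved)  # http://127.0.0.1:5000/
```

This is why the test can assert `effective_url == httpbin.url + "/"` regardless of which port the fixture server picked.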


@@ -1,11 +1,19 @@
import pytest
import os import os
import pytest
import vcr
from ..assertions import assert_is_json_bytes
asyncio = pytest.importorskip("asyncio") asyncio = pytest.importorskip("asyncio")
httpx = pytest.importorskip("httpx") httpx = pytest.importorskip("httpx")
import vcr # noqa: E402
from vcr.stubs.httpx_stubs import HTTPX_REDIRECT_PARAM # noqa: E402 @pytest.fixture(params=["https", "http"])
def scheme(request):
"""Fixture that returns both http and https."""
return request.param
class BaseDoRequest: class BaseDoRequest:
@@ -14,6 +22,7 @@ class BaseDoRequest:
def __init__(self, *args, **kwargs): def __init__(self, *args, **kwargs):
self._client_args = args self._client_args = args
self._client_kwargs = kwargs self._client_kwargs = kwargs
self._client_kwargs["follow_redirects"] = self._client_kwargs.get("follow_redirects", True)
def _make_client(self): def _make_client(self):
return self._client_class(*self._client_args, **self._client_kwargs) return self._client_class(*self._client_args, **self._client_kwargs)
@@ -23,21 +32,36 @@ class DoSyncRequest(BaseDoRequest):
_client_class = httpx.Client _client_class = httpx.Client
def __enter__(self): def __enter__(self):
self._client = self._make_client()
return self return self
def __exit__(self, *args): def __exit__(self, *args):
pass self._client.close()
del self._client
@property @property
def client(self): def client(self):
try: try:
return self._client return self._client
except AttributeError: except AttributeError as e:
self._client = self._make_client() raise ValueError('To access sync client, use "with do_request() as client"') from e
return self._client
def __call__(self, *args, **kwargs): def __call__(self, *args, **kwargs):
return self.client.request(*args, timeout=60, **kwargs) if hasattr(self, "_client"):
return self.client.request(*args, timeout=60, **kwargs)
# Use one-time context and dispose of the client afterwards
with self:
return self.client.request(*args, timeout=60, **kwargs)
def stream(self, *args, **kwargs):
if hasattr(self, "_client"):
with self.client.stream(*args, **kwargs) as response:
return b"".join(response.iter_bytes())
# Use one-time context and dispose of the client afterwards
with self, self.client.stream(*args, **kwargs) as response:
return b"".join(response.iter_bytes())
class DoAsyncRequest(BaseDoRequest): class DoAsyncRequest(BaseDoRequest):
@@ -64,8 +88,8 @@ class DoAsyncRequest(BaseDoRequest):
def client(self): def client(self):
try: try:
return self._client return self._client
except AttributeError: except AttributeError as e:
raise ValueError('To access async client, use "with do_request() as client"') raise ValueError('To access async client, use "with do_request() as client"') from e
def __call__(self, *args, **kwargs): def __call__(self, *args, **kwargs):
if hasattr(self, "_loop"): if hasattr(self, "_loop"):
@@ -73,14 +97,27 @@ class DoAsyncRequest(BaseDoRequest):
# Use one-time context and dispose of the loop/client afterwards # Use one-time context and dispose of the loop/client afterwards
with self: with self:
return self(*args, **kwargs) return self._loop.run_until_complete(self.client.request(*args, **kwargs))
async def _get_stream(self, *args, **kwargs):
async with self.client.stream(*args, **kwargs) as response:
content = b""
async for c in response.aiter_bytes():
content += c
return content
def stream(self, *args, **kwargs):
if hasattr(self, "_loop"):
return self._loop.run_until_complete(self._get_stream(*args, **kwargs))
# Use one-time context and dispose of the loop/client afterwards
with self:
return self._loop.run_until_complete(self._get_stream(*args, **kwargs))
def pytest_generate_tests(metafunc): def pytest_generate_tests(metafunc):
if "do_request" in metafunc.fixturenames: if "do_request" in metafunc.fixturenames:
metafunc.parametrize("do_request", [DoAsyncRequest, DoSyncRequest]) metafunc.parametrize("do_request", [DoAsyncRequest, DoSyncRequest])
if "scheme" in metafunc.fixturenames:
metafunc.parametrize("scheme", ["http", "https"])
@pytest.fixture @pytest.fixture
@@ -88,8 +125,10 @@ def yml(tmpdir, request):
return str(tmpdir.join(request.function.__name__ + ".yaml")) return str(tmpdir.join(request.function.__name__ + ".yaml"))
def test_status(tmpdir, scheme, do_request): @pytest.mark.online
url = scheme + "://mockbin.org/request" def test_status(tmpdir, httpbin, do_request):
url = httpbin.url
with vcr.use_cassette(str(tmpdir.join("status.yaml"))): with vcr.use_cassette(str(tmpdir.join("status.yaml"))):
response = do_request()("GET", url) response = do_request()("GET", url)
@@ -99,8 +138,10 @@ def test_status(tmpdir, scheme, do_request):
assert cassette.play_count == 1 assert cassette.play_count == 1
def test_case_insensitive_headers(tmpdir, scheme, do_request): @pytest.mark.online
url = scheme + "://mockbin.org/request" def test_case_insensitive_headers(tmpdir, httpbin, do_request):
url = httpbin.url
with vcr.use_cassette(str(tmpdir.join("whatever.yaml"))): with vcr.use_cassette(str(tmpdir.join("whatever.yaml"))):
do_request()("GET", url) do_request()("GET", url)
@@ -111,8 +152,10 @@ def test_case_insensitive_headers(tmpdir, scheme, do_request):
assert cassette.play_count == 1 assert cassette.play_count == 1
def test_content(tmpdir, scheme, do_request): @pytest.mark.online
url = scheme + "://httpbin.org" def test_content(tmpdir, httpbin, do_request):
url = httpbin.url
with vcr.use_cassette(str(tmpdir.join("cointent.yaml"))): with vcr.use_cassette(str(tmpdir.join("cointent.yaml"))):
response = do_request()("GET", url) response = do_request()("GET", url)
@@ -122,21 +165,22 @@ def test_content(tmpdir, scheme, do_request):
assert cassette.play_count == 1 assert cassette.play_count == 1
def test_json(tmpdir, scheme, do_request): @pytest.mark.online
url = scheme + "://httpbin.org/get" def test_json(tmpdir, httpbin, do_request):
headers = {"Content-Type": "application/json"} url = httpbin.url + "/json"
with vcr.use_cassette(str(tmpdir.join("json.yaml"))): with vcr.use_cassette(str(tmpdir.join("json.yaml"))):
response = do_request(headers=headers)("GET", url) response = do_request()("GET", url)
with vcr.use_cassette(str(tmpdir.join("json.yaml"))) as cassette: with vcr.use_cassette(str(tmpdir.join("json.yaml"))) as cassette:
cassette_response = do_request(headers=headers)("GET", url) cassette_response = do_request()("GET", url)
assert cassette_response.json() == response.json() assert cassette_response.json() == response.json()
assert cassette.play_count == 1 assert cassette.play_count == 1
def test_params_same_url_distinct_params(tmpdir, scheme, do_request): @pytest.mark.online
url = scheme + "://httpbin.org/get" def test_params_same_url_distinct_params(tmpdir, httpbin, do_request):
url = httpbin.url + "/get"
headers = {"Content-Type": "application/json"} headers = {"Content-Type": "application/json"}
params = {"a": 1, "b": False, "c": "c"} params = {"a": 1, "b": False, "c": "c"}
@@ -150,51 +194,39 @@ def test_params_same_url_distinct_params(tmpdir, scheme, do_request):
assert cassette.play_count == 1 assert cassette.play_count == 1
params = {"other": "params"} params = {"other": "params"}
with vcr.use_cassette(str(tmpdir.join("get.yaml"))) as cassette: with (
with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException): vcr.use_cassette(str(tmpdir.join("get.yaml"))) as cassette,
do_request()("GET", url, params=params, headers=headers) pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException),
):
do_request()("GET", url, params=params, headers=headers)
-def test_redirect(tmpdir, do_request, yml):
-    url = "https://mockbin.org/redirect/303/2"
-
-    redirect_kwargs = {HTTPX_REDIRECT_PARAM.name: True}
-    response = do_request()("GET", url, **redirect_kwargs)
+@pytest.mark.online
+def test_redirect(httpbin, yml, do_request):
+    url = httpbin.url + "/redirect-to"
+
+    response = do_request()("GET", url)
     with vcr.use_cassette(yml):
-        response = do_request()("GET", url, **redirect_kwargs)
+        response = do_request()("GET", url, params={"url": "./get", "status_code": 302})

     with vcr.use_cassette(yml) as cassette:
-        cassette_response = do_request()("GET", url, **redirect_kwargs)
+        cassette_response = do_request()("GET", url, params={"url": "./get", "status_code": 302})
         assert cassette_response.status_code == response.status_code
         assert len(cassette_response.history) == len(response.history)
-        assert len(cassette) == 3
-        assert cassette.play_count == 3
+        assert len(cassette) == 2
+        assert cassette.play_count == 2

         # Assert that the real response and the cassette response have a similar
         # looking request_info.
         assert cassette_response.request.url == response.request.url
         assert cassette_response.request.method == response.request.method
-        assert {k: v for k, v in cassette_response.request.headers.items()} == {
-            k: v for k, v in response.request.headers.items()
-        }
+        assert cassette_response.request.headers.items() == response.request.headers.items()
-def test_work_with_gzipped_data(tmpdir, do_request, yml):
-    with vcr.use_cassette(yml):
-        do_request()("GET", "https://httpbin.org/gzip")
-
-    with vcr.use_cassette(yml) as cassette:
-        cassette_response = do_request()("GET", "https://httpbin.org/gzip")
-        assert "gzip" in cassette_response.json()["headers"]["Accept-Encoding"]
-        assert cassette_response.read()
-        assert cassette.play_count == 1
-
-
+@pytest.mark.online
 @pytest.mark.parametrize("url", ["https://github.com/kevin1024/vcrpy/issues/" + str(i) for i in range(3, 6)])
-def test_simple_fetching(tmpdir, do_request, yml, url):
+def test_simple_fetching(do_request, yml, url):
     with vcr.use_cassette(yml):
         do_request()("GET", url)
@@ -204,96 +236,127 @@ def test_simple_fetching(tmpdir, do_request, yml, url):
     assert cassette.play_count == 1


-def test_behind_proxy(do_request):
-    # This is recorded because otherwise we should have a live proxy somewhere.
-    yml = (
-        os.path.dirname(os.path.realpath(__file__)) + "/cassettes/" + "test_httpx_test_test_behind_proxy.yml"
-    )
-    url = "https://httpbin.org/headers"
-    proxy = "http://localhost:8080"
-    proxies = {"http://": proxy, "https://": proxy}
-
-    with vcr.use_cassette(yml):
-        response = do_request(proxies=proxies, verify=False)("GET", url)
-
-    with vcr.use_cassette(yml) as cassette:
-        cassette_response = do_request(proxies=proxies, verify=False)("GET", url)
-        assert str(cassette_response.request.url) == url
-        assert cassette.play_count == 1
-
-        assert cassette_response.headers["Via"] == "my_own_proxy", str(cassette_response.headers)
-        assert cassette_response.request.url == response.request.url
-
-
-def test_cookies(tmpdir, scheme, do_request):
+@pytest.mark.online
+def test_cookies(tmpdir, httpbin, do_request):
     def client_cookies(client):
-        return [c for c in client.client.cookies]
+        return list(client.client.cookies)

     def response_cookies(response):
-        return [c for c in response.cookies]
+        return list(response.cookies)

-    with do_request() as client:
+    url = httpbin.url + "/cookies/set"
+    params = {"k1": "v1", "k2": "v2"}
+    with do_request(params=params, follow_redirects=False) as client:
         assert client_cookies(client) == []

-        redirect_kwargs = {HTTPX_REDIRECT_PARAM.name: True}
-
-        url = scheme + "://httpbin.org"
         testfile = str(tmpdir.join("cookies.yml"))
         with vcr.use_cassette(testfile):
-            r1 = client("GET", url + "/cookies/set?k1=v1&k2=v2", **redirect_kwargs)
-            assert response_cookies(r1.history[0]) == ["k1", "k2"]
-            assert response_cookies(r1) == []
+            r1 = client("GET", url)
+            assert response_cookies(r1) == ["k1", "k2"]

-            r2 = client("GET", url + "/cookies", **redirect_kwargs)
-            assert len(r2.json()["cookies"]) == 2
+            r2 = client("GET", url)
+            assert response_cookies(r2) == ["k1", "k2"]

             assert client_cookies(client) == ["k1", "k2"]

-    with do_request() as new_client:
+    with do_request(params=params, follow_redirects=False) as new_client:
         assert client_cookies(new_client) == []

         with vcr.use_cassette(testfile) as cassette:
-            cassette_response = new_client("GET", url + "/cookies/set?k1=v1&k2=v2")
-            assert response_cookies(cassette_response.history[0]) == ["k1", "k2"]
-            assert response_cookies(cassette_response) == []
-
-            assert cassette.play_count == 2
+            cassette_response = new_client("GET", url)
+            assert cassette.play_count == 1
+            assert response_cookies(cassette_response) == ["k1", "k2"]

             assert client_cookies(new_client) == ["k1", "k2"]
-def test_relative_redirects(tmpdir, scheme, do_request):
-    redirect_kwargs = {HTTPX_REDIRECT_PARAM.name: True}
-
-    url = scheme + "://mockbin.com/redirect/301?to=/redirect/301?to=/request"
-    testfile = str(tmpdir.join("relative_redirects.yml"))
+@pytest.mark.online
+def test_stream(tmpdir, httpbin, do_request):
+    url = httpbin.url + "/stream-bytes/512"
+    testfile = str(tmpdir.join("stream.yml"))
     with vcr.use_cassette(testfile):
-        response = do_request()("GET", url, **redirect_kwargs)
-        assert len(response.history) == 2, response
-        assert response.json()["url"].endswith("request")
+        response_content = do_request().stream("GET", url)
+        assert len(response_content) == 512

     with vcr.use_cassette(testfile) as cassette:
-        response = do_request()("GET", url, **redirect_kwargs)
-        assert len(response.history) == 2
-        assert response.json()["url"].endswith("request")
-
-        assert cassette.play_count == 3
-
-
-def test_redirect_wo_allow_redirects(do_request, yml):
-    url = "https://mockbin.org/redirect/308/5"
-
-    redirect_kwargs = {HTTPX_REDIRECT_PARAM.name: False}
-
-    with vcr.use_cassette(yml):
-        response = do_request()("GET", url, **redirect_kwargs)
-
-        assert str(response.url).endswith("308/5")
-        assert response.status_code == 308
-
+        cassette_content = do_request().stream("GET", url)
+        assert cassette_content == response_content
+        assert len(cassette_content) == 512
+        assert cassette.play_count == 1
+
+
+# Regular cassette formats support the status reason,
+# but the old HTTPX cassette format does not.
+@pytest.mark.parametrize(
+    "cassette_name,reason",
+    [
+        ("requests", "great"),
+        ("httpx_old_format", "OK"),
+    ],
+)
+def test_load_cassette_format(do_request, cassette_name, reason):
+    mydir = os.path.dirname(os.path.realpath(__file__))
+    yml = f"{mydir}/cassettes/gzip_{cassette_name}.yaml"
+    url = "https://httpbin.org/gzip"
     with vcr.use_cassette(yml) as cassette:
-        response = do_request()("GET", url, **redirect_kwargs)
-
-        assert str(response.url).endswith("308/5")
-        assert response.status_code == 308
+        cassette_response = do_request()("GET", url)
+        assert str(cassette_response.request.url) == url
         assert cassette.play_count == 1
+        # Should be able to load up the JSON inside,
+        # regardless of whether the content is gzipped
+        # in the cassette or not.
+        json = cassette_response.json()
+        assert json["method"] == "GET", json
+        assert cassette_response.status_code == 200
+        assert cassette_response.reason_phrase == reason
+def test_gzip__decode_compressed_response_false(tmpdir, httpbin, do_request):
+    """
+    Ensure that httpx is able to automatically decompress the response body.
+    """
+    for _ in range(2):  # one for recording, one for re-playing
+        with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))) as cassette:
+            response = do_request()("GET", httpbin + "/gzip")
+            assert response.headers["content-encoding"] == "gzip"  # i.e. not removed
+            # The content stored in the cassette should be gzipped.
+            assert cassette.responses[0]["body"]["string"][:2] == b"\x1f\x8b"
+            assert_is_json_bytes(response.content)  # i.e. uncompressed bytes
+
+
+def test_gzip__decode_compressed_response_true(do_request, tmpdir, httpbin):
+    url = httpbin + "/gzip"
+    expected_response = do_request()("GET", url)
+    expected_content = expected_response.content
+    assert expected_response.headers["content-encoding"] == "gzip"  # self-test
+
+    with vcr.use_cassette(
+        str(tmpdir.join("decode_compressed.yaml")),
+        decode_compressed_response=True,
+    ) as cassette:
+        r = do_request()("GET", url)
+        assert r.headers["content-encoding"] == "gzip"  # i.e. not removed
+        content_length = r.headers["content-length"]
+        assert r.content == expected_content
+        # Has the cassette body been decompressed?
+        cassette_response_body = cassette.responses[0]["body"]["string"]
+        assert isinstance(cassette_response_body, str)
+        # Content should be JSON.
+        assert cassette_response_body[0:1] == "{"
+
+    with vcr.use_cassette(str(tmpdir.join("decode_compressed.yaml")), decode_compressed_response=True):
+        r = httpx.get(url)
+        assert "content-encoding" not in r.headers  # i.e. removed
+        assert r.content == expected_content
+        # As the content is uncompressed, it should have a bigger
+        # length than the compressed version.
+        assert r.headers["content-length"] > content_length
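The gzip tests above lean on two facts: every gzip stream starts with the magic bytes `\x1f\x8b` (which is what the cassette-body assertion checks), and decompressing restores the original JSON. A standalone sketch of both checks, stdlib only, no vcrpy required:

```python
import gzip
import json

# Build a small JSON payload and gzip it, mimicking what httpbin's
# /gzip endpoint returns on the wire.
payload = json.dumps({"method": "GET"}).encode()
compressed = gzip.compress(payload)

# Every gzip stream begins with the two-byte magic number 0x1f 0x8b --
# the same prefix the cassette assertion looks for.
assert compressed[:2] == b"\x1f\x8b"

# Round-tripping restores the original JSON body, so a cassette that
# stores the compressed bytes can still be decoded on replay.
assert json.loads(gzip.decompress(compressed)) == {"method": "GET"}
print("ok")
```

With `decode_compressed_response=True`, it is this decompressed form that gets written into the cassette instead of the raw bytes.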

View File

@@ -1,6 +1,7 @@
-from urllib.request import urlopen
 import socket
 from contextlib import contextmanager
+from urllib.request import urlopen

 import vcr
@@ -27,9 +28,9 @@ def test_ignore_localhost(tmpdir, httpbin):
     with overridden_dns({"httpbin.org": "127.0.0.1"}):
         cass_file = str(tmpdir.join("filter_qs.yaml"))
         with vcr.use_cassette(cass_file, ignore_localhost=True) as cass:
-            urlopen("http://localhost:{}/".format(httpbin.port))
+            urlopen(f"http://localhost:{httpbin.port}/")
             assert len(cass) == 0
-            urlopen("http://httpbin.org:{}/".format(httpbin.port))
+            urlopen(f"http://httpbin.org:{httpbin.port}/")
             assert len(cass) == 1
@@ -37,9 +38,9 @@ def test_ignore_httpbin(tmpdir, httpbin):
     with overridden_dns({"httpbin.org": "127.0.0.1"}):
         cass_file = str(tmpdir.join("filter_qs.yaml"))
         with vcr.use_cassette(cass_file, ignore_hosts=["httpbin.org"]) as cass:
-            urlopen("http://httpbin.org:{}/".format(httpbin.port))
+            urlopen(f"http://httpbin.org:{httpbin.port}/")
             assert len(cass) == 0
-            urlopen("http://localhost:{}/".format(httpbin.port))
+            urlopen(f"http://localhost:{httpbin.port}/")
             assert len(cass) == 1
@@ -47,8 +48,8 @@ def test_ignore_localhost_and_httpbin(tmpdir, httpbin):
     with overridden_dns({"httpbin.org": "127.0.0.1"}):
         cass_file = str(tmpdir.join("filter_qs.yaml"))
         with vcr.use_cassette(cass_file, ignore_hosts=["httpbin.org"], ignore_localhost=True) as cass:
-            urlopen("http://httpbin.org:{}".format(httpbin.port))
-            urlopen("http://localhost:{}".format(httpbin.port))
+            urlopen(f"http://httpbin.org:{httpbin.port}")
+            urlopen(f"http://localhost:{httpbin.port}")
             assert len(cass) == 0
@@ -56,12 +57,12 @@ def test_ignore_localhost_twice(tmpdir, httpbin):
     with overridden_dns({"httpbin.org": "127.0.0.1"}):
         cass_file = str(tmpdir.join("filter_qs.yaml"))
         with vcr.use_cassette(cass_file, ignore_localhost=True) as cass:
-            urlopen("http://localhost:{}".format(httpbin.port))
+            urlopen(f"http://localhost:{httpbin.port}")
             assert len(cass) == 0
-            urlopen("http://httpbin.org:{}".format(httpbin.port))
+            urlopen(f"http://httpbin.org:{httpbin.port}")
             assert len(cass) == 1
         with vcr.use_cassette(cass_file, ignore_localhost=True) as cass:
             assert len(cass) == 1
-            urlopen("http://localhost:{}".format(httpbin.port))
-            urlopen("http://httpbin.org:{}".format(httpbin.port))
+            urlopen(f"http://localhost:{httpbin.port}")
+            urlopen(f"http://httpbin.org:{httpbin.port}")
             assert len(cass) == 1
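The `ignore_localhost` / `ignore_hosts` options exercised above decide, per request host, whether vcrpy should bypass the cassette entirely. A rough sketch of that decision as a pure function (the function name and loopback alias set are illustrative, not vcrpy's internals):

```python
def should_bypass_cassette(host, ignore_hosts=(), ignore_localhost=False):
    """Return True when a request to `host` should skip recording/replay."""
    # Hosts listed explicitly are always bypassed; ignore_localhost adds
    # the usual loopback aliases on top of that list.
    localhost_aliases = {"localhost", "127.0.0.1", "::1"}
    if host in ignore_hosts:
        return True
    return ignore_localhost and host in localhost_aliases


print(should_bypass_cassette("localhost", ignore_localhost=True))           # True
print(should_bypass_cassette("httpbin.org", ignore_hosts=["httpbin.org"]))  # True
print(should_bypass_cassette("httpbin.org"))                                # False
```

This is why the tests above see `len(cass) == 0` after hitting an ignored host: the interaction is never appended to the cassette.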

View File

@@ -1,7 +1,8 @@
-import vcr
-import pytest
 from urllib.request import urlopen

+import pytest
+
+import vcr

 DEFAULT_URI = "http://httpbin.org/get?p1=q1&p2=q2"  # base uri for testing
@@ -35,7 +36,6 @@ def cassette(tmpdir, httpbin, httpbin_secure):
     ],
 )
 def test_matchers(httpbin, httpbin_secure, cassette, matcher, matching_uri, not_matching_uri):
     matching_uri = _replace_httpbin(matching_uri, httpbin, httpbin_secure)
     not_matching_uri = _replace_httpbin(not_matching_uri, httpbin, httpbin_secure)
     default_uri = _replace_httpbin(DEFAULT_URI, httpbin, httpbin_secure)
@@ -51,9 +51,11 @@ def test_matchers(httpbin, httpbin_secure, cassette, matcher, matching_uri, not_
         assert cass.play_count == 1

     # play cassette with not matching on uri, it should fail
-    with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException):
-        with vcr.use_cassette(cassette, match_on=[matcher]) as cass:
-            urlopen(not_matching_uri)
+    with (
+        pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException),
+        vcr.use_cassette(cassette, match_on=[matcher]) as cass,
+    ):
+        urlopen(not_matching_uri)


 def test_method_matcher(cassette, httpbin, httpbin_secure):
@@ -65,17 +67,23 @@ def test_method_matcher(cassette, httpbin, httpbin_secure):
         assert cass.play_count == 1

     # should fail if method does not match
-    with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException):
-        with vcr.use_cassette(cassette, match_on=["method"]) as cass:
-            # is a POST request
-            urlopen(default_uri, data=b"")
+    with (
+        pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException),
+        vcr.use_cassette(cassette, match_on=["method"]) as cass,
+    ):
+        # is a POST request
+        urlopen(default_uri, data=b"")


 @pytest.mark.parametrize(
-    "uri", [DEFAULT_URI, "http://httpbin.org/get?p2=q2&p1=q1", "http://httpbin.org/get?p2=q2&p1=q1"]
+    "uri",
+    (
+        DEFAULT_URI,
+        "http://httpbin.org/get?p2=q2&p1=q1",
+        "http://httpbin.org/get?p2=q2&p1=q1",
+    ),
 )
 def test_default_matcher_matches(cassette, uri, httpbin, httpbin_secure):
     uri = _replace_httpbin(uri, httpbin, httpbin_secure)
     with vcr.use_cassette(cassette) as cass:
@@ -94,14 +102,12 @@ def test_default_matcher_matches(cassette, uri, httpbin, httpbin_secure):
 )
 def test_default_matcher_does_not_match(cassette, uri, httpbin, httpbin_secure):
     uri = _replace_httpbin(uri, httpbin, httpbin_secure)
-    with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException):
-        with vcr.use_cassette(cassette):
-            urlopen(uri)
+    with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException), vcr.use_cassette(cassette):
+        urlopen(uri)


 def test_default_matcher_does_not_match_on_method(cassette, httpbin, httpbin_secure):
     default_uri = _replace_httpbin(DEFAULT_URI, httpbin, httpbin_secure)
-    with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException):
-        with vcr.use_cassette(cassette):
-            # is a POST request
-            urlopen(default_uri, data=b"")
+    with pytest.raises(vcr.errors.CannotOverwriteExistingCassetteException), vcr.use_cassette(cassette):
+        # is a POST request
+        urlopen(default_uri, data=b"")
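Matchers like the ones exercised above are plain callables that take two request objects and return a boolean. A hedged sketch of a query-order-insensitive URI matcher (the `uri` attribute follows vcrpy's request objects; the helper itself is illustrative, not part of vcrpy):

```python
from types import SimpleNamespace
from urllib.parse import parse_qs, urlsplit


def uri_ignoring_query_order(r1, r2):
    # Compare scheme/host/path exactly, but parse the query string so
    # that ?p1=q1&p2=q2 and ?p2=q2&p1=q1 are considered equal.
    u1, u2 = urlsplit(r1.uri), urlsplit(r2.uri)
    return (
        (u1.scheme, u1.netloc, u1.path) == (u2.scheme, u2.netloc, u2.path)
        and parse_qs(u1.query) == parse_qs(u2.query)
    )


# Stand-ins for vcrpy request objects, which expose a .uri attribute.
a = SimpleNamespace(uri="http://httpbin.org/get?p1=q1&p2=q2")
b = SimpleNamespace(uri="http://httpbin.org/get?p2=q2&p1=q1")
c = SimpleNamespace(uri="http://httpbin.org/get?p1=other")
print(uri_ignoring_query_order(a, b))  # True
print(uri_ignoring_query_order(a, c))  # False
```

Such a matcher would be registered with `my_vcr.register_matcher("uri_qs", uri_ignoring_query_order)` and selected with `match_on=["uri_qs"]`, mirroring the `register_matcher` calls elsewhere in this diff.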

View File

@@ -1,7 +1,10 @@
-import pytest
-import vcr
 from urllib.request import urlopen

+import pytest
+
+import vcr
+from vcr.errors import CannotOverwriteExistingCassetteException


 def test_making_extra_request_raises_exception(tmpdir, httpbin):
     # make two requests in the first request that are considered
@@ -16,5 +19,5 @@ def test_making_extra_request_raises_exception(tmpdir, httpbin):
     with vcr.use_cassette(str(tmpdir.join("test.json")), match_on=["method"]):
         assert urlopen(httpbin.url + "/status/200").getcode() == 200
         assert urlopen(httpbin.url + "/status/201").getcode() == 201
-        with pytest.raises(Exception):
+        with pytest.raises(CannotOverwriteExistingCassetteException):
             urlopen(httpbin.url + "/status/200")

View File

@@ -1,15 +1,13 @@
-# -*- coding: utf-8 -*-
 """Test using a proxy."""

-# External imports
-import multiprocessing
-import pytest
+import asyncio
 import http.server
 import socketserver
+import threading
 from urllib.request import urlopen

-# Internal imports
+import pytest
+
 import vcr

 # Conditional imports
@@ -32,20 +30,51 @@ class Proxy(http.server.SimpleHTTPRequestHandler):
         # In Python 2 the response is an addinfourl instance.
         status = upstream_response.code
         headers = upstream_response.info().items()
-        self.send_response(status, upstream_response.msg)
+        self.log_request(status)
+        self.send_response_only(status, upstream_response.msg)
         for header in headers:
             self.send_header(*header)
         self.end_headers()
         self.copyfile(upstream_response, self.wfile)

+    def do_CONNECT(self):
+        host, port = self.path.split(":")
+        asyncio.run(self._tunnel(host, port, self.connection))
+
+    async def _tunnel(self, host, port, client_sock):
+        target_r, target_w = await asyncio.open_connection(host=host, port=port)
+        self.send_response(http.HTTPStatus.OK)
+        self.end_headers()
+        source_r, source_w = await asyncio.open_connection(sock=client_sock)
+
+        async def channel(reader, writer):
+            while True:
+                data = await reader.read(1024)
+                if not data:
+                    break
+                writer.write(data)
+                await writer.drain()
+            writer.close()
+            await writer.wait_closed()
+
+        await asyncio.gather(
+            channel(target_r, source_w),
+            channel(source_r, target_w),
+        )
+

-@pytest.yield_fixture(scope="session")
+@pytest.fixture(scope="session")
 def proxy_server():
-    httpd = socketserver.ThreadingTCPServer(("", 0), Proxy)
-    proxy_process = multiprocessing.Process(target=httpd.serve_forever)
-    proxy_process.start()
-    yield "http://{}:{}".format(*httpd.server_address)
-    proxy_process.terminate()
-    proxy_process.join()
+    with socketserver.ThreadingTCPServer(("", 0), Proxy) as httpd:
+        proxy_process = threading.Thread(target=httpd.serve_forever)
+        proxy_process.start()
+        yield "http://{}:{}".format(*httpd.server_address)
+        httpd.shutdown()


 def test_use_proxy(tmpdir, httpbin, proxy_server):
@@ -53,8 +82,26 @@ def test_use_proxy(tmpdir, httpbin, proxy_server):
     with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))):
         response = requests.get(httpbin.url, proxies={"http": proxy_server})

-    with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))) as cassette:
+    with vcr.use_cassette(str(tmpdir.join("proxy.yaml")), mode="none") as cassette:
         cassette_response = requests.get(httpbin.url, proxies={"http": proxy_server})
     assert cassette_response.headers == response.headers
     assert cassette.play_count == 1
+
+
+def test_use_https_proxy(tmpdir, httpbin_secure, proxy_server):
+    """Ensure that it works with an HTTPS proxy."""
+    with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))):
+        response = requests.get(httpbin_secure.url, proxies={"https": proxy_server})
+
+    with vcr.use_cassette(str(tmpdir.join("proxy.yaml")), mode="none") as cassette:
+        cassette_response = requests.get(
+            httpbin_secure.url,
+            proxies={"https": proxy_server},
+        )
+    assert cassette_response.headers == response.headers
+    assert cassette.play_count == 1
+    # The cassette URL points to httpbin, not the proxy
+    assert cassette.requests[0].url == httpbin_secure.url + "/"

View File

@@ -1,7 +1,10 @@
-import pytest
-import vcr
 from urllib.request import urlopen

+import pytest
+
+import vcr
+from vcr.errors import CannotOverwriteExistingCassetteException


 def test_once_record_mode(tmpdir, httpbin):
     testfile = str(tmpdir.join("recordmode.yml"))
@@ -16,7 +19,7 @@ def test_once_record_mode(tmpdir, httpbin):
         # the first time, it's played from the cassette.
         # but, try to access something else from the same cassette, and an
         # exception is raised.
-        with pytest.raises(Exception):
+        with pytest.raises(CannotOverwriteExistingCassetteException):
             urlopen(httpbin.url + "/get").read()
@@ -92,7 +95,7 @@ def test_new_episodes_record_mode_two_times(tmpdir, httpbin):
         assert urlopen(url).read() == original_second_response
         # now that we are back in once mode, this should raise
        # an error.
-        with pytest.raises(Exception):
+        with pytest.raises(CannotOverwriteExistingCassetteException):
             urlopen(url).read()
@@ -121,9 +124,11 @@ def test_none_record_mode(tmpdir, httpbin):
     # Cassette file doesn't exist, yet we are trying to make a request.
     # raise hell.
     testfile = str(tmpdir.join("recordmode.yml"))
-    with vcr.use_cassette(testfile, record_mode=vcr.mode.NONE):
-        with pytest.raises(Exception):
-            urlopen(httpbin.url).read()
+    with (
+        vcr.use_cassette(testfile, record_mode=vcr.mode.NONE),
+        pytest.raises(CannotOverwriteExistingCassetteException),
+    ):
+        urlopen(httpbin.url).read()


 def test_none_record_mode_with_existing_cassette(tmpdir, httpbin):
@@ -138,5 +143,5 @@ def test_none_record_mode_with_existing_cassette(tmpdir, httpbin):
         urlopen(httpbin.url).read()
     assert cass.play_count == 1
     # but if I try to hit the net, raise an exception.
-    with pytest.raises(Exception):
+    with pytest.raises(CannotOverwriteExistingCassetteException):
         urlopen(httpbin.url + "/get").read()
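The record modes exercised above differ only in when replaying and recording are allowed. A simplified sketch of those rules (a condensation of vcrpy's documented behavior, not its actual implementation):

```python
def allowed_operations(record_mode, cassette_exists):
    """Return (can_play, can_record) for a cassette, simplified."""
    if record_mode == "none":
        # Replay only; any unmatched request raises.
        return (cassette_exists, False)
    if record_mode == "once":
        # Record only while the cassette file does not exist yet;
        # afterwards behave like "none".
        return (cassette_exists, not cassette_exists)
    if record_mode == "new_episodes":
        # Replay what is there, record anything new.
        return (cassette_exists, True)
    if record_mode == "all":
        # Always re-record; never replay.
        return (False, True)
    raise ValueError(record_mode)


print(allowed_operations("once", cassette_exists=False))  # (False, True)
print(allowed_operations("none", cassette_exists=True))   # (True, False)
```

This is why `none` mode with a missing cassette raises `CannotOverwriteExistingCassetteException` in the tests above: it can neither play nor record.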

View File

@@ -1,6 +1,9 @@
-import vcr
 from urllib.request import urlopen

+import pytest
+
+import vcr
+

 def true_matcher(r1, r2):
     return True
@@ -10,6 +13,7 @@ def false_matcher(r1, r2):
     return False


+@pytest.mark.online
 def test_registered_true_matcher(tmpdir, httpbin):
     my_vcr = vcr.VCR()
     my_vcr.register_matcher("true", true_matcher)
@@ -21,10 +25,11 @@ def test_registered_true_matcher(tmpdir, httpbin):
     with my_vcr.use_cassette(testfile, match_on=["true"]):
         # I can get the response twice even though I only asked for it once
-        urlopen(httpbin.url + "/get")
-        urlopen(httpbin.url + "/get")
+        urlopen(httpbin.url)
+        urlopen(httpbin.url)


+@pytest.mark.online
 def test_registered_false_matcher(tmpdir, httpbin):
     my_vcr = vcr.VCR()
     my_vcr.register_matcher("false", false_matcher)

View File

@@ -1,16 +1,17 @@
-# -*- coding: utf-8 -*-
 """Tests for cassettes with custom persistence"""

 # External imports
 import os
 from urllib.request import urlopen

+import pytest
+
 # Internal imports
 import vcr
-from vcr.persisters.filesystem import FilesystemPersister
+from vcr.persisters.filesystem import CassetteDecodeError, CassetteNotFoundError, FilesystemPersister


-class CustomFilesystemPersister(object):
+class CustomFilesystemPersister:
     """Behaves just like default FilesystemPersister but adds .test extension
     to the cassette file"""
@@ -25,6 +26,19 @@ class CustomFilesystemPersister(object):
         FilesystemPersister.save_cassette(cassette_path, cassette_dict, serializer)


+class BadPersister(FilesystemPersister):
+    """A bad persister that raises different errors."""
+
+    @staticmethod
+    def load_cassette(cassette_path, serializer):
+        if "nonexistent" in cassette_path:
+            raise CassetteNotFoundError()
+        elif "encoding" in cassette_path:
+            raise CassetteDecodeError()
+        else:
+            raise ValueError("buggy persister")
+
+
 def test_save_cassette_with_custom_persister(tmpdir, httpbin):
     """Ensure you can save a cassette using custom persister"""
     my_vcr = vcr.VCR()
@@ -52,4 +66,22 @@ def test_load_cassette_with_custom_persister(tmpdir, httpbin):
     with my_vcr.use_cassette(test_fixture, serializer="json"):
         response = urlopen(httpbin.url).read()
-    assert b"difficult sometimes" in response
+    assert b"HTTP Request & Response Service" in response
+
+
+def test_load_cassette_persister_exception_handling(tmpdir, httpbin):
+    """
+    Ensure expected errors from persister are swallowed while unexpected ones
+    are passed up the call stack.
+    """
+    my_vcr = vcr.VCR()
+    my_vcr.register_persister(BadPersister)
+
+    with my_vcr.use_cassette("bad/nonexistent") as cass:
+        assert len(cass) == 0
+
+    with my_vcr.use_cassette("bad/encoding") as cass:
+        assert len(cass) == 0
+
+    with pytest.raises(ValueError), my_vcr.use_cassette("bad/buggy") as cass:
+        pass
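A custom persister, as the tests above show, only needs static `load_cassette` / `save_cassette` methods, with `load_cassette` raising to signal a missing or unreadable cassette (`CassetteNotFoundError` / `CassetteDecodeError` in the new vcrpy code). An in-memory sketch of that interface; the class and its dict-backed storage are illustrative, not part of vcrpy:

```python
class InMemoryPersister:
    """Keeps cassettes in a dict instead of on disk (illustrative only)."""

    storage = {}

    @staticmethod
    def load_cassette(cassette_path, serializer):
        try:
            requests, responses = InMemoryPersister.storage[cassette_path]
        except KeyError:
            # Real vcrpy persisters raise CassetteNotFoundError here, which
            # the library swallows so the cassette simply starts out empty.
            raise FileNotFoundError(cassette_path) from None
        return requests, responses

    @staticmethod
    def save_cassette(cassette_path, cassette_dict, serializer):
        InMemoryPersister.storage[cassette_path] = (
            cassette_dict["requests"],
            cassette_dict["responses"],
        )


InMemoryPersister.save_cassette("demo", {"requests": ["r"], "responses": ["s"]}, None)
print(InMemoryPersister.load_cassette("demo", None))  # (['r'], ['s'])
```

It would be activated the same way as in the tests: `my_vcr.register_persister(InMemoryPersister)`. Any other exception type escaping `load_cassette` propagates to the caller, which is exactly what `test_load_cassette_persister_exception_handling` asserts with `ValueError`.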

View File

@@ -1,6 +1,7 @@
-import vcr
 from urllib.request import urlopen

+import vcr
+

 def test_recorded_request_uri_with_redirected_request(tmpdir, httpbin):
     with vcr.use_cassette(str(tmpdir.join("test.yml"))) as cass:

View File

@@ -1,10 +1,12 @@
 """Test requests' interaction with vcr"""

 import pytest

 import vcr
-from assertions import assert_cassette_empty, assert_is_json
+
+from ..assertions import assert_cassette_empty, assert_is_json_bytes

 requests = pytest.importorskip("requests")
-from requests.exceptions import ConnectionError  # noqa E402


 def test_status_code(httpbin_both, tmpdir):
@@ -113,22 +115,6 @@ def test_post_chunked_binary(tmpdir, httpbin):
     assert req1 == req2


-@pytest.mark.skipif("sys.version_info >= (3, 6)", strict=True, raises=ConnectionError)
-def test_post_chunked_binary_secure(tmpdir, httpbin_secure):
-    """Ensure that we can send chunked binary without breaking while trying to concatenate bytes with str."""
-    data1 = iter([b"data", b"to", b"send"])
-    data2 = iter([b"data", b"to", b"send"])
-    url = httpbin_secure.url + "/post"
-    with vcr.use_cassette(str(tmpdir.join("requests.yaml"))):
-        req1 = requests.post(url, data1).content
-        print(req1)
-
-    with vcr.use_cassette(str(tmpdir.join("requests.yaml"))):
-        req2 = requests.post(url, data2).content
-
-    assert req1 == req2
-
-
 def test_redirects(tmpdir, httpbin_both):
     """Ensure that we can handle redirects"""
     url = httpbin_both + "/redirect-to?url=bytes/1024"
@@ -143,6 +129,17 @@ def test_redirects(tmpdir, httpbin_both):
         assert cass.play_count == 2
+def test_raw_stream(tmpdir, httpbin):
+    expected_response = requests.get(httpbin.url, stream=True)
+    expected_content = b"".join(expected_response.raw.stream())
+
+    for _ in range(2):  # one for recording, one for cassette reply
+        with vcr.use_cassette(str(tmpdir.join("raw_stream.yaml"))):
+            actual_response = requests.get(httpbin.url, stream=True)
+            actual_content = b"".join(actual_response.raw.stream())
+            assert actual_content == expected_content
+
+
 def test_cross_scheme(tmpdir, httpbin_secure, httpbin):
     """Ensure that requests between schemes are treated separately"""
     # First fetch a url under http, and then again under https and then
@@ -155,20 +152,41 @@ def test_cross_scheme(tmpdir, httpbin_secure, httpbin):
         assert len(cass) == 2


-def test_gzip(tmpdir, httpbin_both):
+def test_gzip__decode_compressed_response_false(tmpdir, httpbin_both):
     """
     Ensure that requests (actually urllib3) is able to automatically decompress
     the response body
     """
+    for _ in range(2):  # one for recording, one for re-playing
+        with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
+            response = requests.get(httpbin_both + "/gzip")
+            assert response.headers["content-encoding"] == "gzip"  # i.e. not removed
+            assert_is_json_bytes(response.content)  # i.e. uncompressed bytes
+
+
+def test_gzip__decode_compressed_response_true(tmpdir, httpbin_both):
     url = httpbin_both + "/gzip"
-    response = requests.get(url)
-    with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
-        response = requests.get(url)
-        assert_is_json(response.content)
-
-    with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
-        assert_is_json(response.content)
+    expected_response = requests.get(url)
+    expected_content = expected_response.content
+    assert expected_response.headers["content-encoding"] == "gzip"  # self-test
+
+    with vcr.use_cassette(
+        str(tmpdir.join("decode_compressed.yaml")),
+        decode_compressed_response=True,
+    ) as cassette:
+        r = requests.get(url)
+        assert r.headers["content-encoding"] == "gzip"  # i.e. not removed
+        assert r.content == expected_content
+        # Has the cassette body been decompressed?
+        cassette_response_body = cassette.responses[0]["body"]["string"]
+        assert isinstance(cassette_response_body, str)
+
+    with vcr.use_cassette(str(tmpdir.join("decode_compressed.yaml")), decode_compressed_response=True):
+        r = requests.get(url)
+        assert "content-encoding" not in r.headers  # i.e. removed
+        assert r.content == expected_content
def test_session_and_connection_close(tmpdir, httpbin): def test_session_and_connection_close(tmpdir, httpbin):
@@ -248,7 +266,7 @@ def test_nested_cassettes_with_session_created_before_nesting(httpbin_both, tmpd
 def test_post_file(tmpdir, httpbin_both):
     """Ensure that we handle posting a file."""
     url = httpbin_both + "/post"
-    with vcr.use_cassette(str(tmpdir.join("post_file.yaml"))) as cass, open("tox.ini", "rb") as f:
+    with vcr.use_cassette(str(tmpdir.join("post_file.yaml"))) as cass, open(".editorconfig", "rb") as f:
         original_response = requests.post(url, f).content
 
     # This also tests that we do the right thing with matching the body when they are files.
@@ -256,10 +274,10 @@ def test_post_file(tmpdir, httpbin_both):
         str(tmpdir.join("post_file.yaml")),
         match_on=("method", "scheme", "host", "port", "path", "query", "body"),
     ) as cass:
-        with open("tox.ini", "rb") as f:
-            tox_content = f.read()
-        assert cass.requests[0].body.read() == tox_content
-        with open("tox.ini", "rb") as f:
+        with open(".editorconfig", "rb") as f:
+            editorconfig = f.read()
+        assert cass.requests[0].body.read() == editorconfig
+        with open(".editorconfig", "rb") as f:
             new_response = requests.post(url, f).content
         assert original_response == new_response

View File
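The new `test_gzip__decode_compressed_response_true` above verifies that `decode_compressed_response=True` rewrites the recorded body from gzip bytes into plain text before the cassette is written. At its core that transformation is ordinary gzip handling; a standalone sketch of the zlib mechanics it relies on (not vcrpy's actual implementation):

```python
import json
import zlib

# Simulate a gzip-encoded HTTP body, as a server would send it.
payload = json.dumps({"gzipped": True}).encode("utf-8")

# Compress inside a gzip container (wbits = MAX_WBITS | 16).
co = zlib.compressobj(wbits=zlib.MAX_WBITS | 16)
compressed = co.compress(payload) + co.flush()

# What decode_compressed_response effectively does before persisting:
decompressed = zlib.decompress(compressed, wbits=zlib.MAX_WBITS | 16)
assert decompressed == payload
```

The `wbits` offset of 16 is what selects the gzip (rather than raw zlib) framing, which is why the cassette body can then be stored as readable text.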

@@ -1,9 +1,10 @@
-import vcr
-import zlib
-import json
 import http.client as httplib
+import json
+import zlib
-from assertions import assert_is_json
+
+import vcr
+from ..assertions import assert_is_json_bytes
 
 
 def _headers_are_case_insensitive(host, port):
@@ -65,7 +66,7 @@ def test_original_decoded_response_is_not_modified(tmpdir, httpbin):
     # Assert that we do not modify the original response while appending
     # to the cassette.
-    assert "gzip" == inside.headers["content-encoding"]
+    assert inside.headers["content-encoding"] == "gzip"
 
     # They should effectively be the same response.
     inside_headers = (h for h in inside.headers.items() if h[0].lower() != "date")
@@ -83,7 +84,7 @@ def test_original_decoded_response_is_not_modified(tmpdir, httpbin):
     inside = conn.getresponse()
     assert "content-encoding" not in inside.headers
-    assert_is_json(inside.read())
+    assert_is_json_bytes(inside.read())
 
 
 def _make_before_record_response(fields, replacement="[REDACTED]"):
@@ -119,9 +120,9 @@ def test_original_response_is_not_modified_by_before_filter(tmpdir, httpbin):
     # The scrubbed field should be the same, because no cassette existed.
     # Furthermore, the responses should be identical.
-    inside_body = json.loads(inside.read().decode("utf-8"))
-    outside_body = json.loads(outside.read().decode("utf-8"))
-    assert not inside_body[field_to_scrub] == replacement
+    inside_body = json.loads(inside.read())
+    outside_body = json.loads(outside.read())
+    assert inside_body[field_to_scrub] != replacement
     assert inside_body[field_to_scrub] == outside_body[field_to_scrub]
 
     # Ensure that when a cassette exists, the scrubbed response is returned.
@@ -130,5 +131,5 @@ def test_original_response_is_not_modified_by_before_filter(tmpdir, httpbin):
     conn.request("GET", "/get")
     inside = conn.getresponse()
-    inside_body = json.loads(inside.read().decode("utf-8"))
+    inside_body = json.loads(inside.read())
     assert inside_body[field_to_scrub] == replacement

View File

@@ -1,33 +1,60 @@
-# -*- coding: utf-8 -*-
 """Test requests' interaction with vcr"""
 
+import asyncio
+import functools
+import inspect
 import json
+import os
+import ssl
 
 import pytest
 
 import vcr
 from vcr.errors import CannotOverwriteExistingCassetteException
-from assertions import assert_cassette_empty, assert_is_json
+
+from ..assertions import assert_cassette_empty, assert_is_json_bytes
 
 tornado = pytest.importorskip("tornado")
+gen = pytest.importorskip("tornado.gen")
 http = pytest.importorskip("tornado.httpclient")
 
 # whether the current version of Tornado supports the raise_error argument for
 # fetch().
 supports_raise_error = tornado.version_info >= (4,)
+raise_error_for_response_code_only = tornado.version_info >= (6,)
+
+
+def gen_test(func):
+    @functools.wraps(func)
+    def wrapper(*args, **kwargs):
+        async def coro():
+            return await gen.coroutine(func)(*args, **kwargs)
+
+        return asyncio.run(coro())
+
+    # Patch the signature so pytest can inject fixtures
+    # we can't use wrapt.decorator because it returns a generator function
+    wrapper.__signature__ = inspect.signature(func)
+    return wrapper
 
 
 @pytest.fixture(params=["simple", "curl", "default"])
 def get_client(request):
+    ca_bundle_path = os.environ.get("REQUESTS_CA_BUNDLE")
+    ssl_ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
+    ssl_ctx.load_verify_locations(cafile=ca_bundle_path)
+    ssl_ctx.verify_mode = ssl.CERT_REQUIRED
     if request.param == "simple":
         from tornado import simple_httpclient as simple
 
-        return lambda: simple.SimpleAsyncHTTPClient()
-    elif request.param == "curl":
+        return lambda: simple.SimpleAsyncHTTPClient(defaults={"ssl_options": ssl_ctx})
+    if request.param == "curl":
         curl = pytest.importorskip("tornado.curl_httpclient")
-        return lambda: curl.CurlAsyncHTTPClient()
-    else:
-        return lambda: http.AsyncHTTPClient()
+        return lambda: curl.CurlAsyncHTTPClient(defaults={"ca_certs": ca_bundle_path})
+    return lambda: http.AsyncHTTPClient(defaults={"ssl_options": ssl_ctx})
 
 
 def get(client, url, **kwargs):
@@ -44,67 +71,65 @@ def post(client, url, data=None, **kwargs):
     return client.fetch(http.HTTPRequest(url, method="POST", **kwargs))
 
 
-@pytest.fixture(params=["https", "http"])
-def scheme(request):
-    """Fixture that returns both http and https."""
-    return request.param
-
-
-@pytest.mark.gen_test
-def test_status_code(get_client, scheme, tmpdir):
+@pytest.mark.online
+@gen_test
+def test_status_code(get_client, tmpdir, httpbin_both):
     """Ensure that we can read the status code"""
-    url = scheme + "://httpbin.org/"
+    url = httpbin_both.url
     with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
         status_code = (yield get(get_client(), url)).code
 
     with vcr.use_cassette(str(tmpdir.join("atts.yaml"))) as cass:
         assert status_code == (yield get(get_client(), url)).code
-        assert 1 == cass.play_count
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_headers(get_client, scheme, tmpdir):
+@pytest.mark.online
+@gen_test
+def test_headers(get_client, httpbin_both, tmpdir):
     """Ensure that we can read the headers back"""
-    url = scheme + "://httpbin.org/"
+    url = httpbin_both.url
     with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
         headers = (yield get(get_client(), url)).headers
 
     with vcr.use_cassette(str(tmpdir.join("headers.yaml"))) as cass:
         assert headers == (yield get(get_client(), url)).headers
-        assert 1 == cass.play_count
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_body(get_client, tmpdir, scheme):
+@pytest.mark.online
+@gen_test
+def test_body(get_client, tmpdir, httpbin_both):
     """Ensure the responses are all identical enough"""
-    url = scheme + "://httpbin.org/bytes/1024"
+    url = httpbin_both.url + "/bytes/1024"
     with vcr.use_cassette(str(tmpdir.join("body.yaml"))):
         content = (yield get(get_client(), url)).body
 
     with vcr.use_cassette(str(tmpdir.join("body.yaml"))) as cass:
         assert content == (yield get(get_client(), url)).body
-        assert 1 == cass.play_count
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_effective_url(get_client, scheme, tmpdir):
+@gen_test
+def test_effective_url(get_client, tmpdir, httpbin):
     """Ensure that the effective_url is captured"""
-    url = scheme + "://mockbin.org/redirect/301?url=/html"
+    url = httpbin.url + "/redirect/1"
     with vcr.use_cassette(str(tmpdir.join("url.yaml"))):
         effective_url = (yield get(get_client(), url)).effective_url
-        assert effective_url == scheme + "://mockbin.org/redirect/301/0"
+        assert effective_url == httpbin.url + "/get"
 
     with vcr.use_cassette(str(tmpdir.join("url.yaml"))) as cass:
         assert effective_url == (yield get(get_client(), url)).effective_url
-        assert 1 == cass.play_count
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_auth(get_client, tmpdir, scheme):
+@pytest.mark.online
+@gen_test
+def test_auth(get_client, tmpdir, httpbin_both):
     """Ensure that we can handle basic auth"""
     auth = ("user", "passwd")
-    url = scheme + "://httpbin.org/basic-auth/user/passwd"
+    url = httpbin_both.url + "/basic-auth/user/passwd"
     with vcr.use_cassette(str(tmpdir.join("auth.yaml"))):
         one = yield get(get_client(), url, auth_username=auth[0], auth_password=auth[1])
@@ -112,14 +137,15 @@ def test_auth(get_client, tmpdir, scheme):
         two = yield get(get_client(), url, auth_username=auth[0], auth_password=auth[1])
         assert one.body == two.body
         assert one.code == two.code
-        assert 1 == cass.play_count
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_auth_failed(get_client, tmpdir, scheme):
+@pytest.mark.online
+@gen_test
+def test_auth_failed(get_client, tmpdir, httpbin_both):
     """Ensure that we can save failed auth statuses"""
     auth = ("user", "wrongwrongwrong")
-    url = scheme + "://httpbin.org/basic-auth/user/passwd"
+    url = httpbin_both.url + "/basic-auth/user/passwd"
     with vcr.use_cassette(str(tmpdir.join("auth-failed.yaml"))) as cass:
         # Ensure that this is empty to begin with
         assert_cassette_empty(cass)
@@ -135,14 +161,15 @@ def test_auth_failed(get_client, tmpdir, scheme):
         assert exc_info.value.code == 401
         assert one.body == two.body
         assert one.code == two.code == 401
-        assert 1 == cass.play_count
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_post(get_client, tmpdir, scheme):
+@pytest.mark.online
+@gen_test
+def test_post(get_client, tmpdir, httpbin_both):
     """Ensure that we can post and cache the results"""
     data = {"key1": "value1", "key2": "value2"}
-    url = scheme + "://httpbin.org/post"
+    url = httpbin_both.url + "/post"
     with vcr.use_cassette(str(tmpdir.join("requests.yaml"))):
         req1 = (yield post(get_client(), url, data)).body
@@ -150,13 +177,13 @@ def test_post(get_client, tmpdir, scheme):
         req2 = (yield post(get_client(), url, data)).body
 
     assert req1 == req2
-    assert 1 == cass.play_count
+    assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_redirects(get_client, tmpdir, scheme):
+@gen_test
+def test_redirects(get_client, tmpdir, httpbin):
     """Ensure that we can handle redirects"""
-    url = scheme + "://mockbin.org/redirect/301?url=bytes/1024"
+    url = httpbin + "/redirect-to?url=bytes/1024&status_code=301"
     with vcr.use_cassette(str(tmpdir.join("requests.yaml"))):
         content = (yield get(get_client(), url)).body
@@ -165,32 +192,38 @@ def test_redirects(get_client, tmpdir, scheme):
     assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_cross_scheme(get_client, tmpdir, scheme):
+@pytest.mark.online
+@gen_test
+def test_cross_scheme(get_client, tmpdir, httpbin, httpbin_secure):
     """Ensure that requests between schemes are treated separately"""
     # First fetch a url under http, and then again under https and then
     # ensure that we haven't served anything out of cache, and we have two
     # requests / response pairs in the cassette
+    url = httpbin.url
+    url_secure = httpbin_secure.url
     with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass:
-        yield get(get_client(), "https://httpbin.org/")
-        yield get(get_client(), "http://httpbin.org/")
+        yield get(get_client(), url)
+        yield get(get_client(), url_secure)
         assert cass.play_count == 0
         assert len(cass) == 2
 
     # Then repeat the same requests and ensure both were replayed.
     with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass:
-        yield get(get_client(), "https://httpbin.org/")
-        yield get(get_client(), "http://httpbin.org/")
+        yield get(get_client(), url)
+        yield get(get_client(), url_secure)
         assert cass.play_count == 2
 
 
-@pytest.mark.gen_test
-def test_gzip(get_client, tmpdir, scheme):
+@pytest.mark.online
+@gen_test
+def test_gzip(get_client, tmpdir, httpbin_both):
     """
     Ensure that httpclient is able to automatically decompress the response
     body
     """
-    url = scheme + "://httpbin.org/gzip"
+    url = httpbin_both + "/gzip"
 
     # use_gzip was renamed to decompress_response in 4.0
     kwargs = {}
@@ -201,36 +234,39 @@ def test_gzip(get_client, tmpdir, scheme):
     with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
         response = yield get(get_client(), url, **kwargs)
-        assert_is_json(response.body)
+        assert_is_json_bytes(response.body)
 
     with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))) as cass:
         response = yield get(get_client(), url, **kwargs)
-        assert_is_json(response.body)
-        assert 1 == cass.play_count
+        assert_is_json_bytes(response.body)
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_https_with_cert_validation_disabled(get_client, tmpdir):
+@pytest.mark.online
+@gen_test
+def test_https_with_cert_validation_disabled(get_client, tmpdir, httpbin_secure):
     cass_path = str(tmpdir.join("cert_validation_disabled.yaml"))
+    url = httpbin_secure.url
 
     with vcr.use_cassette(cass_path):
-        yield get(get_client(), "https://httpbin.org", validate_cert=False)
+        yield get(get_client(), url, validate_cert=False)
 
     with vcr.use_cassette(cass_path) as cass:
-        yield get(get_client(), "https://httpbin.org", validate_cert=False)
-        assert 1 == cass.play_count
+        yield get(get_client(), url, validate_cert=False)
+        assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_unsupported_features_raises_in_future(get_client, tmpdir):
+@gen_test
+def test_unsupported_features_raises_in_future(get_client, tmpdir, httpbin):
     """Ensure that the exception for an AsyncHTTPClient feature not being
     supported is raised inside the future."""
 
     def callback(chunk):
-        assert False, "Did not expect to be called."
+        raise AssertionError("Did not expect to be called.")
 
     with vcr.use_cassette(str(tmpdir.join("invalid.yaml"))):
-        future = get(get_client(), "http://httpbin.org", streaming_callback=callback)
+        future = get(get_client(), httpbin.url, streaming_callback=callback)
         with pytest.raises(Exception) as excinfo:
             yield future
@@ -239,60 +275,76 @@ def test_unsupported_features_raises_in_future(get_client, tmpdir):
 @pytest.mark.skipif(not supports_raise_error, reason="raise_error unavailable in tornado <= 3")
-@pytest.mark.gen_test
+@pytest.mark.skipif(
+    raise_error_for_response_code_only,
+    reason="raise_error only ignores HTTPErrors due to response code",
+)
+@gen_test
 def test_unsupported_features_raise_error_disabled(get_client, tmpdir):
     """Ensure that the exception for an AsyncHTTPClient feature not being
     supported is not raised if raise_error=False."""
 
     def callback(chunk):
-        assert False, "Did not expect to be called."
+        raise AssertionError("Did not expect to be called.")
 
     with vcr.use_cassette(str(tmpdir.join("invalid.yaml"))):
         response = yield get(
-            get_client(), "http://httpbin.org", streaming_callback=callback, raise_error=False
+            get_client(),
+            "http://httpbin.org",
+            streaming_callback=callback,
+            raise_error=False,
         )
         assert "not yet supported by VCR" in str(response.error)
 
 
-@pytest.mark.gen_test
-def test_cannot_overwrite_cassette_raises_in_future(get_client, tmpdir):
+@pytest.mark.online
+@gen_test
+def test_cannot_overwrite_cassette_raises_in_future(get_client, tmpdir, httpbin):
     """Ensure that CannotOverwriteExistingCassetteException is raised inside
     the future."""
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("overwrite.yaml"))):
-        yield get(get_client(), "http://httpbin.org/get")
-    with vcr.use_cassette(str(tmpdir.join("overwrite.yaml"))):
-        future = get(get_client(), "http://httpbin.org/headers")
+        yield get(get_client(), url + "/get")
+    with vcr.use_cassette(str(tmpdir.join("overwrite.yaml"))):
+        future = get(get_client(), url + "/headers")
         with pytest.raises(CannotOverwriteExistingCassetteException):
             yield future
 
 
 @pytest.mark.skipif(not supports_raise_error, reason="raise_error unavailable in tornado <= 3")
-@pytest.mark.gen_test
-def test_cannot_overwrite_cassette_raise_error_disabled(get_client, tmpdir):
+@pytest.mark.skipif(
+    raise_error_for_response_code_only,
+    reason="raise_error only ignores HTTPErrors due to response code",
+)
+@gen_test
+def test_cannot_overwrite_cassette_raise_error_disabled(get_client, tmpdir, httpbin):
     """Ensure that CannotOverwriteExistingCassetteException is not raised if
     raise_error=False in the fetch() call."""
+    url = httpbin.url
     with vcr.use_cassette(str(tmpdir.join("overwrite.yaml"))):
-        yield get(get_client(), "http://httpbin.org/get", raise_error=False)
-    with vcr.use_cassette(str(tmpdir.join("overwrite.yaml"))):
-        response = yield get(get_client(), "http://httpbin.org/headers", raise_error=False)
+        yield get(get_client(), url + "/get", raise_error=False)
+    with vcr.use_cassette(str(tmpdir.join("overwrite.yaml"))):
+        response = yield get(get_client(), url + "/headers", raise_error=False)
         assert isinstance(response.error, CannotOverwriteExistingCassetteException)
 
 
-@pytest.mark.gen_test
+@gen_test
 @vcr.use_cassette(path_transformer=vcr.default_vcr.ensure_suffix(".yaml"))
 def test_tornado_with_decorator_use_cassette(get_client):
     response = yield get_client().fetch(http.HTTPRequest("http://www.google.com/", method="GET"))
     assert response.body.decode("utf-8") == "not actually google"
 
 
-@pytest.mark.gen_test
+@gen_test
 @vcr.use_cassette(path_transformer=vcr.default_vcr.ensure_suffix(".yaml"))
 def test_tornado_exception_can_be_caught(get_client):
     try:
@@ -306,45 +358,53 @@ def test_tornado_exception_can_be_caught(get_client):
         assert e.code == 404
 
 
-@pytest.mark.gen_test
-def test_existing_references_get_patched(tmpdir):
+@pytest.mark.online
+@gen_test
+def test_existing_references_get_patched(tmpdir, httpbin):
     from tornado.httpclient import AsyncHTTPClient
 
+    url = httpbin.url + "/get"
     with vcr.use_cassette(str(tmpdir.join("data.yaml"))):
         client = AsyncHTTPClient()
-        yield get(client, "http://httpbin.org/get")
+        yield get(client, url)
 
     with vcr.use_cassette(str(tmpdir.join("data.yaml"))) as cass:
-        yield get(client, "http://httpbin.org/get")
+        yield get(client, url)
         assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_existing_instances_get_patched(get_client, tmpdir):
+@pytest.mark.online
+@gen_test
+def test_existing_instances_get_patched(get_client, tmpdir, httpbin):
     """Ensure that existing instances of AsyncHTTPClient get patched upon
     entering VCR context."""
+    url = httpbin.url + "/get"
     client = get_client()
     with vcr.use_cassette(str(tmpdir.join("data.yaml"))):
-        yield get(client, "http://httpbin.org/get")
+        yield get(client, url)
 
     with vcr.use_cassette(str(tmpdir.join("data.yaml"))) as cass:
-        yield get(client, "http://httpbin.org/get")
+        yield get(client, url)
         assert cass.play_count == 1
 
 
-@pytest.mark.gen_test
-def test_request_time_is_set(get_client, tmpdir):
+@pytest.mark.online
+@gen_test
+def test_request_time_is_set(get_client, tmpdir, httpbin):
     """Ensures that the request_time on HTTPResponses is set."""
+    url = httpbin.url + "/get"
     with vcr.use_cassette(str(tmpdir.join("data.yaml"))):
         client = get_client()
-        response = yield get(client, "http://httpbin.org/get")
+        response = yield get(client, url)
         assert response.request_time is not None
 
     with vcr.use_cassette(str(tmpdir.join("data.yaml"))) as cass:
         client = get_client()
-        response = yield get(client, "http://httpbin.org/get")
+        response = yield get(client, url)
         assert response.request_time is not None
         assert cass.play_count == 1

View File
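The `gen_test` helper introduced in the tornado diff above replaces the old `@pytest.mark.gen_test` mark: it turns a yield-based test into a plain synchronous function pytest can call, and copies the signature so fixture injection still works. The same pattern, reduced to the stdlib (the async function here stands in for tornado's `gen.coroutine`-wrapped test body):

```python
import asyncio
import functools
import inspect


def sync_test(func):
    """Run an async test function to completion inside a sync wrapper."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return asyncio.run(func(*args, **kwargs))

    # Expose the original signature so a test runner can inject arguments.
    wrapper.__signature__ = inspect.signature(func)
    return wrapper


@sync_test
async def fetch_value(base):
    await asyncio.sleep(0)  # stand-in for a real async HTTP call
    return base + 1


print(fetch_value(41))  # prints 42
```

Copying `__signature__` is the key step: `functools.wraps` alone would expose the wrapper's `*args, **kwargs`, which would hide the fixture names from pytest's collection.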

@@ -1,145 +0,0 @@
-# -*- coding: utf-8 -*-
-"""Integration tests with urllib2"""
-
-import ssl
-from urllib.request import urlopen
-from urllib.parse import urlencode
-
-import pytest_httpbin.certs
-
-# Internal imports
-import vcr
-
-from assertions import assert_cassette_has_one_response
-
-
-def urlopen_with_cafile(*args, **kwargs):
-    context = ssl.create_default_context(cafile=pytest_httpbin.certs.where())
-    context.check_hostname = False
-    kwargs["context"] = context
-    try:
-        return urlopen(*args, **kwargs)
-    except TypeError:
-        # python2/pypi don't let us override this
-        del kwargs["cafile"]
-        return urlopen(*args, **kwargs)
-
-
-def test_response_code(httpbin_both, tmpdir):
-    """Ensure we can read a response code from a fetch"""
-    url = httpbin_both.url
-    with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
-        code = urlopen_with_cafile(url).getcode()
-
-    with vcr.use_cassette(str(tmpdir.join("atts.yaml"))):
-        assert code == urlopen_with_cafile(url).getcode()
-
-
-def test_random_body(httpbin_both, tmpdir):
-    """Ensure we can read the content, and that it's served from cache"""
-    url = httpbin_both.url + "/bytes/1024"
-    with vcr.use_cassette(str(tmpdir.join("body.yaml"))):
-        body = urlopen_with_cafile(url).read()
-
-    with vcr.use_cassette(str(tmpdir.join("body.yaml"))):
-        assert body == urlopen_with_cafile(url).read()
-
-
-def test_response_headers(httpbin_both, tmpdir):
-    """Ensure we can get information from the response"""
-    url = httpbin_both.url
-    with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
-        open1 = urlopen_with_cafile(url).info().items()
-
-    with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
-        open2 = urlopen_with_cafile(url).info().items()
-        assert sorted(open1) == sorted(open2)
-
-
-def test_effective_url(tmpdir):
-    """Ensure that the effective_url is captured"""
-    url = "http://mockbin.org/redirect/301"
-    with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
-        effective_url = urlopen_with_cafile(url).geturl()
-        assert effective_url == "http://mockbin.org/redirect/301/0"
-
-    with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
-        assert effective_url == urlopen_with_cafile(url).geturl()
-
-
-def test_multiple_requests(httpbin_both, tmpdir):
-    """Ensure that we can cache multiple requests"""
-    urls = [httpbin_both.url, httpbin_both.url, httpbin_both.url + "/get", httpbin_both.url + "/bytes/1024"]
-    with vcr.use_cassette(str(tmpdir.join("multiple.yaml"))) as cass:
-        [urlopen_with_cafile(url) for url in urls]
-    assert len(cass) == len(urls)
-
-
-def test_get_data(httpbin_both, tmpdir):
-    """Ensure that it works with query data"""
-    data = urlencode({"some": 1, "data": "here"})
-    url = httpbin_both.url + "/get?" + data
-    with vcr.use_cassette(str(tmpdir.join("get_data.yaml"))):
-        res1 = urlopen_with_cafile(url).read()
-
-    with vcr.use_cassette(str(tmpdir.join("get_data.yaml"))):
-        res2 = urlopen_with_cafile(url).read()
-    assert res1 == res2
-
-
-def test_post_data(httpbin_both, tmpdir):
-    """Ensure that it works when posting data"""
-    data = urlencode({"some": 1, "data": "here"}).encode("utf-8")
-    url = httpbin_both.url + "/post"
-    with vcr.use_cassette(str(tmpdir.join("post_data.yaml"))):
-        res1 = urlopen_with_cafile(url, data).read()
-
-    with vcr.use_cassette(str(tmpdir.join("post_data.yaml"))) as cass:
-        res2 = urlopen_with_cafile(url, data).read()
-    assert len(cass) == 1
-    assert res1 == res2
-    assert_cassette_has_one_response(cass)
-
-
-def test_post_unicode_data(httpbin_both, tmpdir):
-    """Ensure that it works when posting unicode data"""
-    data = urlencode({"snowman": "".encode()}).encode("utf-8")
-    url = httpbin_both.url + "/post"
-    with vcr.use_cassette(str(tmpdir.join("post_data.yaml"))):
-        res1 = urlopen_with_cafile(url, data).read()
-
-    with vcr.use_cassette(str(tmpdir.join("post_data.yaml"))) as cass:
-        res2 = urlopen_with_cafile(url, data).read()
-    assert len(cass) == 1
-    assert res1 == res2
-    assert_cassette_has_one_response(cass)
-
-
-def test_cross_scheme(tmpdir, httpbin_secure, httpbin):
-    """Ensure that requests between schemes are treated separately"""
-    # First fetch a url under https, and then again under https and then
-    # ensure that we haven't served anything out of cache, and we have two
-    # requests / response pairs in the cassette
-    with vcr.use_cassette(str(tmpdir.join("cross_scheme.yaml"))) as cass:
-        urlopen_with_cafile(httpbin_secure.url)
-        urlopen_with_cafile(httpbin.url)
-        assert len(cass) == 2
-        assert cass.play_count == 0
-
-
-def test_decorator(httpbin_both, tmpdir):
-    """Test the decorator version of VCR.py"""
-    url = httpbin_both.url
-
-    @vcr.use_cassette(str(tmpdir.join("atts.yaml")))
-    def inner1():
-        return urlopen_with_cafile(url).getcode()
-
-    @vcr.use_cassette(str(tmpdir.join("atts.yaml")))
-    def inner2():
-        return urlopen_with_cafile(url).getcode()
-
-    assert inner1() == inner2()

View File
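The deleted `urlopen_with_cafile` helper above pinned `urlopen` to pytest-httpbin's self-signed CA by passing an explicit SSL context. Building such a context is plain stdlib `ssl`; a minimal sketch (in the real tests the CA file came from `pytest_httpbin.certs.where()`, omitted here so the snippet stands alone):

```python
import ssl

# Build a verification context like urlopen_with_cafile did; loading the test
# CA bundle via load_verify_locations(cafile=...) is left out in this sketch.
context = ssl.create_default_context()
context.check_hostname = False  # the local test server's cert lacks a matching hostname
assert context.verify_mode == ssl.CERT_REQUIRED  # chain verification stays on

# urlopen(url, context=context) would then trust only the loaded CA.
```

Note that disabling `check_hostname` does not disable certificate verification: `verify_mode` remains `CERT_REQUIRED`, so the chain is still checked against the loaded CA.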

@@ -4,9 +4,12 @@
 import pytest
 import pytest_httpbin
 
 import vcr
 from vcr.patch import force_reset
-from assertions import assert_cassette_empty, assert_is_json
+from vcr.stubs.compat import get_headers
+
+from ..assertions import assert_cassette_empty, assert_is_json_bytes
 
 urllib3 = pytest.importorskip("urllib3")
@@ -14,7 +17,8 @@ urllib3 = pytest.importorskip("urllib3")
 @pytest.fixture(scope="module")
 def verify_pool_mgr():
     return urllib3.PoolManager(
-        cert_reqs="CERT_REQUIRED", ca_certs=pytest_httpbin.certs.where()  # Force certificate check.
+        cert_reqs="CERT_REQUIRED",
+        ca_certs=pytest_httpbin.certs.where(),  # Force certificate check.
     )
@@ -40,7 +44,8 @@ def test_headers(tmpdir, httpbin_both, verify_pool_mgr):
         headers = verify_pool_mgr.request("GET", url).headers
 
     with vcr.use_cassette(str(tmpdir.join("headers.yaml"))):
-        assert headers == verify_pool_mgr.request("GET", url).headers
+        new_headers = verify_pool_mgr.request("GET", url).headers
+    assert sorted(get_headers(headers)) == sorted(get_headers(new_headers))
 
 
 def test_body(tmpdir, httpbin_both, verify_pool_mgr):
@@ -94,9 +99,10 @@ def test_post(tmpdir, httpbin_both, verify_pool_mgr):
assert req1 == req2 assert req1 == req2
def test_redirects(tmpdir, verify_pool_mgr): @pytest.mark.online
def test_redirects(tmpdir, verify_pool_mgr, httpbin):
"""Ensure that we can handle redirects""" """Ensure that we can handle redirects"""
url = "http://mockbin.org/redirect/301" url = httpbin.url + "/redirect/1"
with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))): with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))):
content = verify_pool_mgr.request("GET", url).data content = verify_pool_mgr.request("GET", url).data
@@ -132,10 +138,10 @@ def test_gzip(tmpdir, httpbin_both, verify_pool_mgr):
with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
response = verify_pool_mgr.request("GET", url) response = verify_pool_mgr.request("GET", url)
assert_is_json(response.data) assert_is_json_bytes(response.data)
with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))): with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))):
assert_is_json(response.data) assert_is_json_bytes(response.data)
def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr): def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr):
@@ -144,18 +150,18 @@ def test_https_with_cert_validation_disabled(tmpdir, httpbin_secure, pool_mgr):
def test_urllib3_force_reset(): def test_urllib3_force_reset():
cpool = urllib3.connectionpool conn = urllib3.connection
http_original = cpool.HTTPConnection http_original = conn.HTTPConnection
https_original = cpool.HTTPSConnection https_original = conn.HTTPSConnection
verified_https_original = cpool.VerifiedHTTPSConnection verified_https_original = conn.VerifiedHTTPSConnection
with vcr.use_cassette(path="test"): with vcr.use_cassette(path="test"):
first_cassette_HTTPConnection = cpool.HTTPConnection first_cassette_HTTPConnection = conn.HTTPConnection
first_cassette_HTTPSConnection = cpool.HTTPSConnection first_cassette_HTTPSConnection = conn.HTTPSConnection
first_cassette_VerifiedHTTPSConnection = cpool.VerifiedHTTPSConnection first_cassette_VerifiedHTTPSConnection = conn.VerifiedHTTPSConnection
with force_reset(): with force_reset():
assert cpool.HTTPConnection is http_original assert conn.HTTPConnection is http_original
assert cpool.HTTPSConnection is https_original assert conn.HTTPSConnection is https_original
assert cpool.VerifiedHTTPSConnection is verified_https_original assert conn.VerifiedHTTPSConnection is verified_https_original
assert cpool.HTTPConnection is first_cassette_HTTPConnection assert conn.HTTPConnection is first_cassette_HTTPConnection
assert cpool.HTTPSConnection is first_cassette_HTTPSConnection assert conn.HTTPSConnection is first_cassette_HTTPSConnection
assert cpool.VerifiedHTTPSConnection is first_cassette_VerifiedHTTPSConnection assert conn.VerifiedHTTPSConnection is first_cassette_VerifiedHTTPSConnection

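The `force_reset` assertions in the hunk above boil down to a save/patch/restore pattern on module attributes. A minimal stand-alone sketch of that pattern, using a stand-in namespace instead of `urllib3.connection` (so none of this is vcrpy's actual implementation):

```python
import types

# Stand-in for the urllib3.connection module.
conn = types.SimpleNamespace(HTTPConnection=object)
original = conn.HTTPConnection


class PatchedHTTPConnection:  # what an active cassette would install
    pass


conn.HTTPConnection = PatchedHTTPConnection
patched = conn.HTTPConnection

try:
    # force_reset(): temporarily put the original back ...
    conn.HTTPConnection, saved = original, conn.HTTPConnection
    assert conn.HTTPConnection is original
finally:
    # ... and restore the cassette's patch on exit.
    conn.HTTPConnection = saved

assert conn.HTTPConnection is patched
```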

@@ -1,12 +1,13 @@
 import http.client as httplib
 import multiprocessing
-import pytest
 from xmlrpc.client import ServerProxy
 from xmlrpc.server import SimpleXMLRPCServer

-requests = pytest.importorskip("requests")
+import pytest

-import vcr  # NOQA
+import vcr
+
+requests = pytest.importorskip("requests")

 def test_domain_redirect():
@@ -51,6 +52,7 @@ def test_flickr_multipart_upload(httpbin, tmpdir):
     assert cass.play_count == 1

+@pytest.mark.online
 def test_flickr_should_respond_with_200(tmpdir):
     testfile = str(tmpdir.join("flickr.yml"))
     with vcr.use_cassette(testfile):
@@ -60,14 +62,15 @@ def test_flickr_should_respond_with_200(tmpdir):
 def test_cookies(tmpdir, httpbin):
     testfile = str(tmpdir.join("cookies.yml"))
-    with vcr.use_cassette(testfile):
-        s = requests.Session()
+    with vcr.use_cassette(testfile), requests.Session() as s:
         s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2")
-        assert s.cookies.keys() == ["k1", "k2"]

         r2 = s.get(httpbin.url + "/cookies")
-        assert len(r2.json()["cookies"]) == 2
+        assert sorted(r2.json()["cookies"].keys()) == ["k1", "k2"]

+@pytest.mark.online
 def test_amazon_doctype(tmpdir):
     # amazon gzips its homepage. For some reason, in requests 2.7, it's not
     # getting gunzipped.
@@ -83,7 +86,7 @@ def start_rpc_server(q):
     httpd.serve_forever()

-@pytest.yield_fixture(scope="session")
+@pytest.fixture(scope="session")
 def rpc_server():
     q = multiprocessing.Queue()
     proxy_process = multiprocessing.Process(target=start_rpc_server, args=(q,))


@@ -7,9 +7,11 @@ from unittest import mock
 import pytest
 import yaml

 from vcr.cassette import Cassette
 from vcr.errors import UnhandledHTTPRequestError
 from vcr.patch import force_reset
+from vcr.request import Request
 from vcr.stubs import VCRHTTPSConnection
@@ -19,15 +21,31 @@ def test_cassette_load(tmpdir):
         yaml.dump(
             {
                 "interactions": [
-                    {"request": {"body": "", "uri": "foo", "method": "GET", "headers": {}}, "response": "bar"}
-                ]
-            }
-        )
+                    {
+                        "request": {"body": "", "uri": "foo", "method": "GET", "headers": {}},
+                        "response": "bar",
+                    },
+                ],
+            },
+        ),
     )
     a_cassette = Cassette.load(path=str(a_file))
     assert len(a_cassette) == 1

+def test_cassette_load_nonexistent():
+    a_cassette = Cassette.load(path="something/nonexistent.yml")
+    assert len(a_cassette) == 0
+
+def test_cassette_load_invalid_encoding(tmpdir):
+    a_file = tmpdir.join("invalid_encoding.yml")
+    with open(a_file, "wb") as fd:
+        fd.write(b"\xda")
+    a_cassette = Cassette.load(path=str(a_file))
+    assert len(a_cassette) == 0

 def test_cassette_not_played():
     a = Cassette("test")
     assert not a.play_count
@@ -96,7 +114,7 @@ def make_get_request():
 @mock.patch("vcr.stubs.VCRHTTPResponse")
 def test_function_decorated_with_use_cassette_can_be_invoked_multiple_times(*args):
     decorated_function = Cassette.use(path="test")(make_get_request)
-    for i in range(4):
+    for _ in range(4):
         decorated_function()
@@ -142,7 +160,7 @@ def test_cassette_allow_playback_repeats():
     a = Cassette("test", allow_playback_repeats=True)
     a.append("foo", "bar")
     a.append("other", "resp")
-    for x in range(10):
+    for _ in range(10):
         assert a.play_response("foo") == "bar"
     assert a.play_count == 10
     assert a.all_played is False
@@ -204,14 +222,16 @@ def test_nesting_cassette_context_managers(*args):
     with contextlib.ExitStack() as exit_stack:
         first_cassette = exit_stack.enter_context(Cassette.use(path="test"))
         exit_stack.enter_context(
-            mock.patch.object(first_cassette, "play_response", return_value=first_response)
+            mock.patch.object(first_cassette, "play_response", return_value=first_response),
         )
         assert_get_response_body_is("first_response")

         # Make sure a second cassette can supersede the first
-        with Cassette.use(path="test") as second_cassette:
-            with mock.patch.object(second_cassette, "play_response", return_value=second_response):
-                assert_get_response_body_is("second_response")
+        with (
+            Cassette.use(path="test") as second_cassette,
+            mock.patch.object(second_cassette, "play_response", return_value=second_response),
+        ):
+            assert_get_response_body_is("second_response")

         # Now the first cassette should be back in effect
         assert_get_response_body_is("first_response")
@@ -393,3 +413,25 @@ def test_find_requests_with_most_matches_many_similar_requests(mock_get_matchers
         (1, ["method", "path"], [("query", "failed : query")]),
         (3, ["method", "path"], [("query", "failed : query")]),
     ]

+def test_used_interactions(tmpdir):
+    interactions = [
+        {"request": {"body": "", "uri": "foo1", "method": "GET", "headers": {}}, "response": "bar1"},
+        {"request": {"body": "", "uri": "foo2", "method": "GET", "headers": {}}, "response": "bar2"},
+        {"request": {"body": "", "uri": "foo3", "method": "GET", "headers": {}}, "response": "bar3"},
+    ]
+
+    file = tmpdir.join("test_cassette.yml")
+    file.write(yaml.dump({"interactions": [interactions[0], interactions[1]]}))
+    cassette = Cassette.load(path=str(file))
+
+    request = Request._from_dict(interactions[1]["request"])
+    cassette.play_response(request)
+    assert len(cassette._played_interactions) < len(cassette._old_interactions)
+
+    request = Request._from_dict(interactions[2]["request"])
+    cassette.append(request, interactions[2]["response"])
+    assert len(cassette._new_interactions()) == 1
+
+    used_interactions = cassette._played_interactions + cassette._new_interactions()
+    assert len(used_interactions) == 2


@@ -55,15 +55,18 @@ from vcr.cassette import Cassette
     ],
 )
 def test_CannotOverwriteExistingCassetteException_get_message(
-    mock_find_requests_with_most_matches, most_matches, expected_message
+    mock_find_requests_with_most_matches,
+    most_matches,
+    expected_message,
 ):
     mock_find_requests_with_most_matches.return_value = most_matches
     cassette = Cassette("path")
     failed_request = "request"

     exception_message = errors.CannotOverwriteExistingCassetteException._get_message(cassette, "request")
     expected = (
-        "Can't overwrite existing cassette (%r) in your current record mode (%r).\n"
-        "No match for the request (%r) was found.\n"
-        "%s" % (cassette._path, cassette.record_mode, failed_request, expected_message)
+        f"Can't overwrite existing cassette ({cassette._path!r}) "
+        f"in your current record mode ({cassette.record_mode!r}).\n"
+        f"No match for the request ({failed_request!r}) was found.\n"
+        f"{expected_message}"
     )
     assert exception_message == expected

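The rewrite in the hunk above relies on `%r` interpolation and f-string `!r` conversion producing byte-identical output. A quick stand-alone check (with made-up placeholder values, not the cassette's real attributes):

```python
cassette_path, record_mode = "path", "once"

# Old style: printf-style %r interpolation.
old = "Can't overwrite existing cassette (%r) in your current record mode (%r).\n" % (
    cassette_path,
    record_mode,
)

# New style: f-string with the !r conversion flag.
new = (
    f"Can't overwrite existing cassette ({cassette_path!r}) "
    f"in your current record mode ({record_mode!r}).\n"
)

# Both call repr() on the value, so the rendered messages match exactly.
assert old == new
```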

@@ -1,18 +1,19 @@
-from io import BytesIO
-from vcr.filters import (
-    remove_headers,
-    replace_headers,
-    remove_query_parameters,
-    replace_query_parameters,
-    remove_post_data_parameters,
-    replace_post_data_parameters,
-    decode_response,
-)
-from vcr.request import Request
 import gzip
 import json
-from unittest import mock
 import zlib
+from io import BytesIO
+from unittest import mock
+
+from vcr.filters import (
+    decode_response,
+    remove_headers,
+    remove_post_data_parameters,
+    remove_query_parameters,
+    replace_headers,
+    replace_post_data_parameters,
+    replace_query_parameters,
+)
+from vcr.request import Request

 def test_replace_headers():
@@ -196,7 +197,7 @@ def test_replace_json_post_data_parameters():
             ("six", "doesntexist"),
         ],
     )
-    request_data = json.loads(request.body.decode("utf-8"))
+    request_data = json.loads(request.body)
     expected_data = json.loads('{"one": "keep", "three": "tada", "four": "SHOUT"}')
     assert request_data == expected_data
@@ -207,8 +208,8 @@ def test_remove_json_post_data_parameters():
     request = Request("POST", "http://google.com", body, {})
     request.headers["Content-Type"] = "application/json"
     remove_post_data_parameters(request, ["id"])
-    request_body_json = json.loads(request.body.decode("utf-8"))
-    expected_json = json.loads(b'{"foo": "bar", "baz": "qux"}'.decode("utf-8"))
+    request_body_json = json.loads(request.body)
+    expected_json = json.loads(b'{"foo": "bar", "baz": "qux"}')
     assert request_body_json == expected_json
@@ -297,6 +298,18 @@ def test_decode_response_deflate():
     assert decoded_response["headers"]["content-length"] == [str(len(body))]

+def test_decode_response_deflate_already_decompressed():
+    body = b"deflate message"
+    gzip_response = {
+        "body": {"string": body},
+        "headers": {
+            "content-encoding": ["deflate"],
+        },
+    }
+    decoded_response = decode_response(gzip_response)
+    assert decoded_response["body"]["string"] == body

 def test_decode_response_gzip():
     body = b"gzip message"
@@ -324,3 +337,15 @@ def test_decode_response_gzip():
     decoded_response = decode_response(gzip_response)
     assert decoded_response["body"]["string"] == body
     assert decoded_response["headers"]["content-length"] == [str(len(body))]

+def test_decode_response_gzip_already_decompressed():
+    body = b"gzip message"
+    gzip_response = {
+        "body": {"string": body},
+        "headers": {
+            "content-encoding": ["gzip"],
+        },
+    }
+    decoded_response = decode_response(gzip_response)
+    assert decoded_response["body"]["string"] == body

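The two `*_already_decompressed` tests above pin down a graceful-fallback behavior: if a cassette stored the body in decompressed form despite a `content-encoding` header, decoding must return it unchanged instead of raising. A minimal stdlib-only sketch of that fallback (an illustration of the idea, not vcrpy's actual `decode_response`):

```python
import gzip
import zlib


def decode_body(body: bytes, encoding: str) -> bytes:
    """Best-effort decode; if the body is already decompressed, return it as-is."""
    try:
        if encoding == "gzip":
            return gzip.decompress(body)
        if encoding == "deflate":
            try:
                return zlib.decompress(body)
            except zlib.error:
                # Some servers send a raw deflate stream without the zlib header.
                return zlib.decompress(body, -zlib.MAX_WBITS)
        return body
    except (OSError, zlib.error):
        # Not actually compressed data: the "already decompressed" case.
        return body


assert decode_body(gzip.compress(b"gzip message"), "gzip") == b"gzip message"
assert decode_body(b"gzip message", "gzip") == b"gzip message"
assert decode_body(b"deflate message", "deflate") == b"deflate message"
```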

@@ -1,6 +1,7 @@
 import pytest
-from vcr.serializers.jsonserializer import serialize
 from vcr.request import Request
+from vcr.serializers.jsonserializer import serialize

 def test_serialize_binary():


@@ -3,8 +3,7 @@ from unittest import mock
 import pytest
-from vcr import matchers
-from vcr import request
+from vcr import matchers, request

 # the dict contains requests with corresponding to its key difference
 # with 'base' request.
@@ -64,6 +63,9 @@ boto3_bytes_headers = {
     "Expect": b"100-continue",
     "Content-Length": "21",
 }

+chunked_headers = {
+    "Transfer-Encoding": "chunked",
+}

 @pytest.mark.parametrize(
@@ -75,10 +77,16 @@ boto3_bytes_headers = {
         ),
         (
             request.Request(
-                "POST", "http://host.com/", "a=1&b=2", {"Content-Type": "application/x-www-form-urlencoded"}
+                "POST",
+                "http://host.com/",
+                "a=1&b=2",
+                {"Content-Type": "application/x-www-form-urlencoded"},
             ),
             request.Request(
-                "POST", "http://host.com/", "b=2&a=1", {"Content-Type": "application/x-www-form-urlencoded"}
+                "POST",
+                "http://host.com/",
+                "b=2&a=1",
+                {"Content-Type": "application/x-www-form-urlencoded"},
             ),
         ),
@@ -87,23 +95,38 @@ boto3_bytes_headers = {
         ),
         (
             request.Request(
-                "POST", "http://host.com/", "a=1&b=2", {"Content-Type": "application/x-www-form-urlencoded"}
+                "POST",
+                "http://host.com/",
+                "a=1&b=2",
+                {"Content-Type": "application/x-www-form-urlencoded"},
             ),
             request.Request(
-                "POST", "http://host.com/", "b=2&a=1", {"Content-Type": "application/x-www-form-urlencoded"}
+                "POST",
+                "http://host.com/",
+                "b=2&a=1",
+                {"Content-Type": "application/x-www-form-urlencoded"},
             ),
         ),
         (
             request.Request(
-                "POST", "http://host.com/", '{"a": 1, "b": 2}', {"Content-Type": "application/json"}
+                "POST",
+                "http://host.com/",
+                '{"a": 1, "b": 2}',
+                {"Content-Type": "application/json"},
             ),
             request.Request(
-                "POST", "http://host.com/", '{"b": 2, "a": 1}', {"content-type": "application/json"}
+                "POST",
+                "http://host.com/",
+                '{"b": 2, "a": 1}',
+                {"content-type": "application/json"},
             ),
         ),
         (
             request.Request(
-                "POST", "http://host.com/", req1_body, {"User-Agent": "xmlrpclib", "Content-Type": "text/xml"}
+                "POST",
+                "http://host.com/",
+                req1_body,
+                {"User-Agent": "xmlrpclib", "Content-Type": "text/xml"},
             ),
             request.Request(
                 "POST",
@@ -114,10 +137,16 @@ boto3_bytes_headers = {
         ),
         (
             request.Request(
-                "POST", "http://host.com/", '{"a": 1, "b": 2}', {"Content-Type": "application/json"}
+                "POST",
+                "http://host.com/",
+                '{"a": 1, "b": 2}',
+                {"Content-Type": "application/json"},
            ),
             request.Request(
-                "POST", "http://host.com/", '{"b": 2, "a": 1}', {"content-type": "application/json"}
+                "POST",
+                "http://host.com/",
+                '{"b": 2, "a": 1}',
+                {"content-type": "application/json"},
             ),
         ),
         (
@@ -125,6 +154,36 @@ boto3_bytes_headers = {
             request.Request("POST", "http://aws.custom.com/", b"123", boto3_bytes_headers),
             request.Request("POST", "http://aws.custom.com/", b"123", boto3_bytes_headers),
         ),
+        (
+            # chunked transfer encoding: decoded bytes versus encoded bytes
+            request.Request("POST", "scheme1://host1.test/", b"123456789_123456", chunked_headers),
+            request.Request(
+                "GET",
+                "scheme2://host2.test/",
+                b"10\r\n123456789_123456\r\n0\r\n\r\n",
+                chunked_headers,
+            ),
+        ),
+        (
+            # chunked transfer encoding: bytes iterator versus string iterator
+            request.Request(
+                "POST",
+                "scheme1://host1.test/",
+                iter([b"123456789_", b"123456"]),
+                chunked_headers,
+            ),
+            request.Request("GET", "scheme2://host2.test/", iter(["123456789_", "123456"]), chunked_headers),
+        ),
+        (
+            # chunked transfer encoding: bytes iterator versus single byte iterator
+            request.Request(
+                "POST",
+                "scheme1://host1.test/",
+                iter([b"123456789_", b"123456"]),
+                chunked_headers,
+            ),
+            request.Request("GET", "scheme2://host2.test/", iter(b"123456789_123456"), chunked_headers),
+        ),
     ],
 )
 def test_body_matcher_does_match(r1, r2):
@@ -140,10 +199,16 @@ def test_body_matcher_does_match(r1, r2):
         ),
         (
             request.Request(
-                "POST", "http://host.com/", '{"a": 1, "b": 3}', {"Content-Type": "application/json"}
+                "POST",
+                "http://host.com/",
+                '{"a": 1, "b": 3}',
+                {"Content-Type": "application/json"},
             ),
             request.Request(
-                "POST", "http://host.com/", '{"b": 2, "a": 1}', {"content-type": "application/json"}
+                "POST",
+                "http://host.com/",
+                '{"b": 2, "a": 1}',
+                {"content-type": "application/json"},
             ),
         ),
         (

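The chunked fixtures added above (e.g. `b"10\r\n123456789_123456\r\n0\r\n\r\n"`) follow HTTP/1.1 chunked transfer framing: a hexadecimal chunk length, CRLF, the chunk bytes, CRLF, repeated until a zero-length terminator chunk. A tiny encoder showing how the fixture bytes arise (a sketch for illustration, not code from vcrpy):

```python
def chunk_encode(payload: bytes, chunk_size: int = 16) -> bytes:
    """Frame a payload using HTTP/1.1 chunked transfer encoding."""
    out = b""
    for i in range(0, len(payload), chunk_size):
        chunk = payload[i : i + chunk_size]
        # Hex length prefix, CRLF, chunk data, CRLF.
        out += f"{len(chunk):x}".encode() + b"\r\n" + chunk + b"\r\n"
    # Zero-length chunk terminates the body.
    return out + b"0\r\n\r\n"


# The 16-byte fixture body gets the hex length prefix "10" (16 decimal),
# exactly matching the encoded form in the test case above.
assert chunk_encode(b"123456789_123456") == b"10\r\n123456789_123456\r\n0\r\n\r\n"
```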

@@ -1,6 +1,7 @@
 import filecmp
 import json
 import shutil
+
 import yaml

 import vcr.migration
@@ -16,9 +17,9 @@ def test_try_migrate_with_json(tmpdir):
     cassette = tmpdir.join("cassette.json").strpath
     shutil.copy("tests/fixtures/migration/old_cassette.json", cassette)
     assert vcr.migration.try_migrate(cassette)
-    with open("tests/fixtures/migration/new_cassette.json", "r") as f:
+    with open("tests/fixtures/migration/new_cassette.json") as f:
         expected_json = json.load(f)
-    with open(cassette, "r") as f:
+    with open(cassette) as f:
         actual_json = json.load(f)
     assert actual_json == expected_json
@@ -27,9 +28,9 @@ def test_try_migrate_with_yaml(tmpdir):
     cassette = tmpdir.join("cassette.yaml").strpath
     shutil.copy("tests/fixtures/migration/old_cassette.yaml", cassette)
     assert vcr.migration.try_migrate(cassette)
-    with open("tests/fixtures/migration/new_cassette.yaml", "r") as f:
+    with open("tests/fixtures/migration/new_cassette.yaml") as f:
         expected_yaml = yaml.load(f, Loader=Loader)
-    with open(cassette, "r") as f:
+    with open(cassette) as f:
         actual_yaml = yaml.load(f, Loader=Loader)
     assert actual_yaml == expected_yaml


@@ -1,6 +1,6 @@
 import pytest
-from vcr.request import Request, HeadersDict
+from vcr.request import HeadersDict, Request

 @pytest.mark.parametrize(
@@ -60,7 +60,6 @@ def test_uri(method, uri):
 def test_HeadersDict():
     # Simple test of CaseInsensitiveDict
     h = HeadersDict()
-
     assert h == {}


@@ -1,4 +1,3 @@
-# coding: UTF-8
 import io

 from vcr.stubs import VCRHTTPResponse
@@ -89,11 +88,11 @@ def test_response_parses_correctly_and_fp_attribute_error_is_not_thrown():
             b"different types of cancer cells. Recently, the first HDACi was\n            "
             b"approved for the "
             b"treatment of cutaneous T cell lymphomas. Most HDACi currently in\n            "
-            b"clinical "
+            b"clinical ",
         },
     }
     vcr_response = VCRHTTPResponse(recorded_response)
-    handle = io.TextIOWrapper(io.BufferedReader(vcr_response), encoding="utf-8")
+    handle = io.TextIOWrapper(vcr_response, encoding="utf-8")
     handle = iter(handle)
-    articles = [line for line in handle]
+    articles = list(handle)
     assert len(articles) > 1

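The change above drops the intermediate `io.BufferedReader`: `io.TextIOWrapper` only requires a readable binary file object, which the response stub already provides. The same wrap-and-iterate pattern with a plain `BytesIO` standing in for the response:

```python
import io

# Any readable binary file object can be wrapped directly; an explicit
# BufferedReader layer is only needed for raw (unbuffered) streams.
raw = io.BytesIO("line one\nline two\n".encode("utf-8"))
handle = io.TextIOWrapper(raw, encoding="utf-8")

# Iterating a text wrapper yields decoded lines, newline included.
lines = list(handle)
assert lines == ["line one\n", "line two\n"]
```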

@@ -1,32 +1,29 @@
-# -*- encoding: utf-8 -*-
 from unittest import mock

 import pytest

 from vcr.request import Request
 from vcr.serialize import deserialize, serialize
-from vcr.serializers import yamlserializer, jsonserializer, compat
+from vcr.serializers import compat, jsonserializer, yamlserializer

 def test_deserialize_old_yaml_cassette():
-    with open("tests/fixtures/migration/old_cassette.yaml", "r") as f:
-        with pytest.raises(ValueError):
-            deserialize(f.read(), yamlserializer)
+    with open("tests/fixtures/migration/old_cassette.yaml") as f, pytest.raises(ValueError):
+        deserialize(f.read(), yamlserializer)

 def test_deserialize_old_json_cassette():
-    with open("tests/fixtures/migration/old_cassette.json", "r") as f:
-        with pytest.raises(ValueError):
-            deserialize(f.read(), jsonserializer)
+    with open("tests/fixtures/migration/old_cassette.json") as f, pytest.raises(ValueError):
+        deserialize(f.read(), jsonserializer)

 def test_deserialize_new_yaml_cassette():
-    with open("tests/fixtures/migration/new_cassette.yaml", "r") as f:
+    with open("tests/fixtures/migration/new_cassette.yaml") as f:
         deserialize(f.read(), yamlserializer)

 def test_deserialize_new_json_cassette():
-    with open("tests/fixtures/migration/new_cassette.json", "r") as f:
+    with open("tests/fixtures/migration/new_cassette.json") as f:
         deserialize(f.read(), jsonserializer)
@@ -77,7 +74,7 @@ def test_deserialize_py2py3_yaml_cassette(tmpdir, req_body, expect):
     cfile = tmpdir.join("test_cassette.yaml")
     cfile.write(REQBODY_TEMPLATE.format(req_body=req_body))
     with open(str(cfile)) as f:
-        (requests, responses) = deserialize(f.read(), yamlserializer)
+        (requests, _) = deserialize(f.read(), yamlserializer)
     assert requests[0].body == expect


@@ -1,8 +1,14 @@
+import contextlib
+import http.client as httplib
+from io import BytesIO
+from tempfile import NamedTemporaryFile
 from unittest import mock

-from vcr import mode
-from vcr.stubs import VCRHTTPSConnection
+from pytest import mark
+
+from vcr import mode, use_cassette
 from vcr.cassette import Cassette
+from vcr.stubs import VCRHTTPSConnection

 class TestVCRConnection:
@@ -11,9 +17,59 @@ class TestVCRConnection:
         vcr_connection.ssl_version = "example_ssl_version"
         assert vcr_connection.real_connection.ssl_version == "example_ssl_version"

+    @mark.online
     @mock.patch("vcr.cassette.Cassette.can_play_response_for", return_value=False)
     def testing_connect(*args):
-        vcr_connection = VCRHTTPSConnection("www.google.com")
-        vcr_connection.cassette = Cassette("test", record_mode=mode.ALL)
-        vcr_connection.real_connection.connect()
-        assert vcr_connection.real_connection.sock is not None
+        with contextlib.closing(VCRHTTPSConnection("www.google.com")) as vcr_connection:
+            vcr_connection.cassette = Cassette("test", record_mode=mode.ALL)
+            vcr_connection.real_connection.connect()
+            assert vcr_connection.real_connection.sock is not None
+
+    def test_body_consumed_once_stream(self, tmpdir, httpbin):
+        self._test_body_consumed_once(
+            tmpdir,
+            httpbin,
+            BytesIO(b"1234567890"),
+            BytesIO(b"9876543210"),
+            BytesIO(b"9876543210"),
+        )
+
+    def test_body_consumed_once_iterator(self, tmpdir, httpbin):
+        self._test_body_consumed_once(
+            tmpdir,
+            httpbin,
+            iter([b"1234567890"]),
+            iter([b"9876543210"]),
+            iter([b"9876543210"]),
+        )
+
+    # data2 and data3 should serve the same data, potentially as iterators
+    def _test_body_consumed_once(
+        self,
+        tmpdir,
+        httpbin,
+        data1,
+        data2,
+        data3,
+    ):
+        with NamedTemporaryFile(dir=tmpdir, suffix=".yml") as f:
+            testpath = f.name
+            # NOTE: ``use_cassette`` is not okay with the file existing
+            #       already, so we use ``.close()`` to not only close
+            #       but also delete the empty file before we start.
+            f.close()
+        host, port = httpbin.host, httpbin.port
+        match_on = ["method", "uri", "body"]
+        with use_cassette(testpath, match_on=match_on):
+            conn1 = httplib.HTTPConnection(host, port)
+            conn1.request("POST", "/anything", body=data1)
+            conn1.getresponse()
+            conn2 = httplib.HTTPConnection(host, port)
+            conn2.request("POST", "/anything", body=data2)
+            conn2.getresponse()
+        with use_cassette(testpath, match_on=match_on) as cass:
+            conn3 = httplib.HTTPConnection(host, port)
+            conn3.request("POST", "/anything", body=data3)
+            conn3.getresponse()
+            assert cass.play_counts[0] == 0
+            assert cass.play_counts[1] == 1

tests/unit/test_unittest.py (new file)

@@ -0,0 +1,199 @@
import os
from unittest import TextTestRunner, defaultTestLoader
from unittest.mock import MagicMock
from urllib.request import urlopen

import pytest

from vcr.unittest import VCRTestCase


def test_defaults():
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

    test = run_testcase(MyTest)[0][0]
    expected_path = os.path.join(os.path.dirname(__file__), "cassettes")
    expected_name = "MyTest.test_foo.yaml"
    assert os.path.dirname(test.cassette._path) == expected_path
    assert os.path.basename(test.cassette._path) == expected_name


def test_disabled():
    # Baseline vcr_enabled = True
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

    test = run_testcase(MyTest)[0][0]
    assert hasattr(test, "cassette")

    # Test vcr_enabled = False
    class MyTest(VCRTestCase):
        vcr_enabled = False

        def test_foo(self):
            pass

    test = run_testcase(MyTest)[0][0]
    assert not hasattr(test, "cassette")


def test_cassette_library_dir():
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

        def _get_cassette_library_dir(self):
            return "/testing"

    test = run_testcase(MyTest)[0][0]
    assert test.cassette._path.startswith("/testing/")


def test_cassette_name():
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

        def _get_cassette_name(self):
            return "my-custom-name"

    test = run_testcase(MyTest)[0][0]
    assert os.path.basename(test.cassette._path) == "my-custom-name"


def test_vcr_kwargs_overridden():
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

        def _get_vcr_kwargs(self):
            kwargs = super()._get_vcr_kwargs()
            kwargs["record_mode"] = "new_episodes"
            return kwargs

    test = run_testcase(MyTest)[0][0]
    assert test.cassette.record_mode == "new_episodes"


def test_vcr_kwargs_passed():
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

        def _get_vcr_kwargs(self):
            return super()._get_vcr_kwargs(
                record_mode="new_episodes",
            )

    test = run_testcase(MyTest)[0][0]
    assert test.cassette.record_mode == "new_episodes"


def test_vcr_kwargs_cassette_dir():
    # Test that _get_cassette_library_dir applies if cassette_library_dir
    # is absent from vcr kwargs.
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

        def _get_vcr_kwargs(self):
            return {
                "record_mode": "new_episodes",
            }

        _get_cassette_library_dir = MagicMock(return_value="/testing")

    test = run_testcase(MyTest)[0][0]
    assert test.cassette._path.startswith("/testing/")
    assert test._get_cassette_library_dir.call_count == 1

    # Test that _get_cassette_library_dir is ignored if cassette_library_dir
    # is present in vcr kwargs.
    class MyTest(VCRTestCase):
        def test_foo(self):
            pass

        def _get_vcr_kwargs(self):
            return {
                "cassette_library_dir": "/testing",
            }

        _get_cassette_library_dir = MagicMock(return_value="/ignored")

    test = run_testcase(MyTest)[0][0]
    assert test.cassette._path.startswith("/testing/")
    assert test._get_cassette_library_dir.call_count == 0


@pytest.mark.online
def test_get_vcr_with_matcher(tmpdir):
    cassette_dir = tmpdir.mkdir("cassettes")
    assert len(cassette_dir.listdir()) == 0

    mock_matcher = MagicMock(return_value=True, __name__="MockMatcher")

    class MyTest(VCRTestCase):
        def test_foo(self):
            self.response = urlopen("http://example.com").read()

        def _get_vcr(self):
            myvcr = super()._get_vcr()
            myvcr.register_matcher("mymatcher", mock_matcher)
            myvcr.match_on = ["mymatcher"]
            return myvcr

        def _get_cassette_library_dir(self):
            return str(cassette_dir)

    # First run to fill cassette.
    test = run_testcase(MyTest)[0][0]
    assert len(test.cassette.requests) == 1
    assert not mock_matcher.called  # nothing in cassette

    # Second run to call matcher.
    test = run_testcase(MyTest)[0][0]
    assert len(test.cassette.requests) == 1
    assert mock_matcher.called
    assert (
        repr(mock_matcher.mock_calls[0])
        == "call(<Request (GET) http://example.com>, <Request (GET) http://example.com>)"
    )


@pytest.mark.online
def test_testcase_playback(tmpdir):
    cassette_dir = tmpdir.mkdir("cassettes")
    assert len(cassette_dir.listdir()) == 0

    # First test actually reads from the web.
    class MyTest(VCRTestCase):
        def test_foo(self):
            self.response = urlopen("http://example.com").read()

        def _get_cassette_library_dir(self):
            return str(cassette_dir)

    test = run_testcase(MyTest)[0][0]
    assert b"Example Domain" in test.response
    assert len(test.cassette.requests) == 1
    assert test.cassette.play_count == 0

    # Second test reads from cassette.
    test2 = run_testcase(MyTest)[0][0]
    assert test.cassette is not test2.cassette
    assert b"Example Domain" in test.response
    assert len(test2.cassette.requests) == 1
    assert test2.cassette.play_count == 1


def run_testcase(testcase_class):
    """Run all the tests in a TestCase and return them."""
    suite = defaultTestLoader.loadTestsFromTestCase(testcase_class)
    tests = list(suite._tests)
    result = TextTestRunner().run(suite)
    return tests, result

tests/unit/test_util.py (new file)
@@ -0,0 +1,33 @@
from io import BytesIO, StringIO

import pytest

from vcr import request
from vcr.util import read_body


@pytest.mark.parametrize(
    "input_, expected_output",
    [
        (BytesIO(b"Stream"), b"Stream"),
        (StringIO("Stream"), b"Stream"),
        (iter(["StringIter"]), b"StringIter"),
        (iter(["String", "Iter"]), b"StringIter"),
        (iter([b"BytesIter"]), b"BytesIter"),
        (iter([b"Bytes", b"Iter"]), b"BytesIter"),
        (iter([70, 111, 111]), b"Foo"),
        (iter([]), b""),
        ("String", b"String"),
        (b"Bytes", b"Bytes"),
    ],
)
def test_read_body(input_, expected_output):
    r = request.Request("POST", "http://host.com/", input_, {})
    assert read_body(r) == expected_output


def test_unsupported_read_body():
    r = request.Request("POST", "http://host.com/", iter([[]]), {})
    with pytest.raises(ValueError) as excinfo:
        assert read_body(r)
    assert excinfo.value.args == ("Body type <class 'list'> not supported",)
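The parametrized cases above all reduce to one normalization rule: whatever shape the body takes (file-like object, iterator of str/bytes/ints, plain str or bytes), coerce it to bytes. A minimal stdlib-only sketch of that rule follows; it is illustrative only, not vcrpy's actual `read_body` implementation:

```python
def to_bytes(body):
    """Coerce a request body to bytes: file-like objects are read,
    iterables are joined element by element, str is UTF-8 encoded."""
    if hasattr(body, "read"):
        data = body.read()
        return data.encode("utf-8") if isinstance(data, str) else data
    if isinstance(body, bytes):
        return body
    if isinstance(body, str):
        return body.encode("utf-8")
    if hasattr(body, "__iter__"):
        chunks = list(body)
        if chunks and isinstance(chunks[0], int):
            return bytes(chunks)  # e.g. iter([70, 111, 111]) -> b"Foo"
        return b"".join(
            c.encode("utf-8") if isinstance(c, str) else c for c in chunks
        )
    raise ValueError(f"Body type {type(body)} not supported")


assert to_bytes(iter(["String", "Iter"])) == b"StringIter"
assert to_bytes(iter([70, 111, 111])) == b"Foo"
```

This mirrors why the test table covers both homogeneous and mixed chunk types: each branch above corresponds to one row group in the parametrization.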


@@ -1,20 +1,22 @@
-from unittest import mock
+import http.client as httplib
 import os
+from pathlib import Path
+from unittest import mock

 import pytest

-import http.client as httplib
 from vcr import VCR, mode, use_cassette
+from vcr.patch import _HTTPConnection, force_reset
 from vcr.request import Request
 from vcr.stubs import VCRHTTPSConnection
-from vcr.patch import _HTTPConnection, force_reset


 def test_vcr_use_cassette():
     record_mode = mock.Mock()
     test_vcr = VCR(record_mode=record_mode)
     with mock.patch(
-        "vcr.cassette.Cassette.load", return_value=mock.MagicMock(inject=False)
+        "vcr.cassette.Cassette.load",
+        return_value=mock.MagicMock(inject=False),
     ) as mock_cassette_load:

         @test_vcr.use_cassette("test")
@@ -40,7 +42,7 @@ def test_vcr_use_cassette():
 def test_vcr_before_record_request_params():
-    base_path = "http://httpbin.org/"
+    base_path = "http://whatever.test/"

     def before_record_cb(request):
         if request.path != "/get":
@@ -70,16 +72,19 @@ def test_vcr_before_record_request_params():
         # Test filter_headers
         request = Request(
-            "GET", base_path + "?foo=bar", "", {"cookie": "test", "other": "fun", "bert": "nobody"}
+            "GET",
+            base_path + "?foo=bar",
+            "",
+            {"cookie": "test", "other": "fun", "bert": "nobody"},
         )
         assert cassette.filter_request(request).headers == {"other": "fun", "bert": "ernie"}

         # Test ignore_hosts
-        request = Request("GET", "http://www.test.com" + "?foo=bar", "", {"cookie": "test", "other": "fun"})
+        request = Request("GET", "http://www.test.com?foo=bar", "", {"cookie": "test", "other": "fun"})
         assert cassette.filter_request(request) is None

         # Test ignore_localhost
-        request = Request("GET", "http://localhost:8000" + "?foo=bar", "", {"cookie": "test", "other": "fun"})
+        request = Request("GET", "http://localhost:8000?foo=bar", "", {"cookie": "test", "other": "fun"})
         assert cassette.filter_request(request) is None

     with test_vcr.use_cassette("test", before_record_request=None) as cassette:
@@ -95,7 +100,6 @@ def test_vcr_before_record_response_iterable():
     # Prevent actually saving the cassette
     with mock.patch("vcr.cassette.FilesystemPersister.save_cassette"):
-
         # Baseline: non-iterable before_record_response should work
         mock_filter = mock.Mock()
         vcr = VCR(before_record_response=mock_filter)
@@ -119,7 +123,6 @@ def test_before_record_response_as_filter():
     # Prevent actually saving the cassette
     with mock.patch("vcr.cassette.FilesystemPersister.save_cassette"):
-
         filter_all = mock.Mock(return_value=None)
         vcr = VCR(before_record_response=filter_all)
         with vcr.use_cassette("test") as cassette:
@@ -133,7 +136,6 @@ def test_vcr_path_transformer():
     # Prevent actually saving the cassette
     with mock.patch("vcr.cassette.FilesystemPersister.save_cassette"):
-
         # Baseline: path should be unchanged
         vcr = VCR()
         with vcr.use_cassette("test") as cassette:
@@ -261,7 +263,9 @@ def test_cassette_library_dir_with_decoration_and_super_explicit_path():
 def test_cassette_library_dir_with_path_transformer():
     library_dir = "/library_dir"
     vcr = VCR(
-        inject_cassette=True, cassette_library_dir=library_dir, path_transformer=lambda path: path + ".json"
+        inject_cassette=True,
+        cassette_library_dir=library_dir,
+        path_transformer=lambda path: path + ".json",
     )

     @vcr.use_cassette()
@@ -360,3 +364,27 @@ def test_dynamically_added(self):
     TestVCRClass.test_dynamically_added = test_dynamically_added
     del test_dynamically_added
+
+
+def test_path_class_as_cassette():
+    path = Path(__file__).parent.parent.joinpath(
+        "integration/cassettes/test_httpx_test_test_behind_proxy.yml",
+    )
+    with use_cassette(path):
+        pass
+
+
+def test_use_cassette_generator_return():
+    ret_val = object()
+    vcr = VCR()
+
+    @vcr.use_cassette("test")
+    def gen():
+        return ret_val
+        yield
+
+    with pytest.raises(StopIteration) as exc_info:
+        next(gen())
+    assert exc_info.value.value is ret_val


@@ -2,15 +2,10 @@ import sys
 def test_vcr_import_deprecation(recwarn):
     if "vcr" in sys.modules:
         # Remove imported module entry if already loaded in another test
         del sys.modules["vcr"]

     import vcr  # noqa: F401

-    if sys.version_info[0] == 2:
-        assert len(recwarn) == 1
-        assert issubclass(recwarn[0].category, DeprecationWarning)
-    else:
-        assert len(recwarn) == 0
+    assert len(recwarn) == 0

tox.ini (deleted)
@@ -1,105 +0,0 @@
[tox]
skip_missing_interpreters=true
envlist =
cov-clean,
lint,
{py37,py38,py39,py310}-{requests,httplib2,urllib3,tornado4,boto3,aiohttp,httpx},
{pypy3}-{requests,httplib2,urllib3,tornado4,boto3},
{py310}-httpx019,
cov-report
[gh-actions]
python =
3.7: py37, lint
3.8: py38
3.9: py39
3.10: py310
pypy-3: pypy3
# Coverage environment tasks: cov-clean and cov-report
# https://pytest-cov.readthedocs.io/en/latest/tox.html
[testenv:cov-clean]
deps = coverage
skip_install=true
commands = coverage erase
[testenv:cov-report]
deps = coverage
skip_install=true
commands =
coverage html
coverage report --fail-under=90
[testenv:lint]
skipsdist = True
commands =
black --version
black --check --diff .
flake8 --version
flake8 --exclude=./docs/conf.py,./.tox/
pyflakes ./docs/conf.py
deps =
flake8
black
basepython = python3.7
[testenv:docs]
# Running sphinx from inside the "docs" directory
# ensures it will not pick up any stray files that might
# get into a virtual environment under the top-level directory
# or other artifacts under build/
changedir = docs
# The only dependency is sphinx
# If we were using extensions packaged separately,
# we would specify them here.
# A better practice is to specify a specific version of sphinx.
deps =
sphinx
sphinx_rtd_theme
# This is the sphinx command to generate HTML.
# In other circumstances, we might want to generate a PDF or an ebook
commands =
sphinx-build -W -b html -d {envtmpdir}/doctrees . {envtmpdir}/html
# We use Python 3.7. Tox sometimes tries to autodetect it based on the name of
# the testenv, but "docs" does not give useful clues so we have to be explicit.
basepython = python3.7
[testenv]
# Need to use develop install so that paths
# for aggregate code coverage combine
usedevelop=true
commands =
./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append {posargs}
deps =
Werkzeug==2.0.3
pytest
git+https://github.com/immerrr/pytest-httpbin@fix-redirect-location-scheme-for-secure-server
pytest-cov
PyYAML
ipaddress
requests: requests>=2.22.0
httplib2: httplib2
urllib3: urllib3
boto3: boto3
boto3: urllib3
aiohttp: aiohttp
aiohttp: pytest-asyncio
aiohttp: pytest-aiohttp
httpx: httpx
{py37,py38,py39,py310}-{httpx}: httpx
{py37,py38,py39,py310}-{httpx}: pytest-asyncio
httpx: httpx>0.19
# httpx==0.19 is the latest version that supports allow_redirects, newer versions use follow_redirects
httpx019: httpx==0.19
{py37,py38,py39,py310}-{httpx}: pytest-asyncio
depends =
lint,{py37,py38,py39,py310,pypy3}-{requests,httplib2,urllib3,tornado4,boto3},{py37,py38,py39,py310}-{aiohttp},{py37,py38,py39,py310}-{httpx}: cov-clean
cov-report: lint,{py37,py38,py39,py310,pypy3}-{requests,httplib2,urllib3,tornado4,boto3},{py37,py38,py39,py310}-{aiohttp}
passenv =
AWS_ACCESS_KEY_ID
AWS_DEFAULT_REGION
AWS_SECRET_ACCESS_KEY
[flake8]
max_line_length = 110


@@ -1,9 +1,10 @@
 import logging
-from .config import VCR
 from logging import NullHandler
-from .record_mode import RecordMode as mode  # noqa import is not used in this file
-__version__ = "4.2.1"
+
+from .config import VCR
+from .record_mode import RecordMode as mode  # noqa: F401
+
+__version__ = "7.0.0"

 logging.getLogger(__name__).addHandler(NullHandler())


@@ -1,3 +1,3 @@
-async def handle_coroutine(vcr, fn):  # noqa: E999
+async def handle_coroutine(vcr, fn):
     with vcr as cassette:
-        return await fn(cassette)  # noqa: E999
+        return await fn(cassette)


@@ -1,28 +1,20 @@
 import collections
 import contextlib
 import copy
-import sys
 import inspect
 import logging
+from inspect import iscoroutinefunction

 import wrapt

-from .errors import UnhandledHTTPRequestError
-from .matchers import requests_match, uri, method, get_matchers_results
-from .patch import CassettePatcherBuilder
-from .serializers import yamlserializer
-from .persisters.filesystem import FilesystemPersister
-from .util import partition_dict
 from ._handle_coroutine import handle_coroutine
+from .errors import UnhandledHTTPRequestError
+from .matchers import get_matchers_results, method, requests_match, uri
+from .patch import CassettePatcherBuilder
+from .persisters.filesystem import CassetteDecodeError, CassetteNotFoundError, FilesystemPersister
 from .record_mode import RecordMode
-
-try:
-    from asyncio import iscoroutinefunction
-except ImportError:
-
-    def iscoroutinefunction(*args, **kwargs):
-        return False
+from .serializers import yamlserializer
+from .util import partition_dict

 log = logging.getLogger(__name__)
@@ -45,7 +37,11 @@ class CassetteContextDecorator:
     this class as a context manager in ``__exit__``.
     """

-    _non_cassette_arguments = ("path_transformer", "func_path_generator")
+    _non_cassette_arguments = (
+        "path_transformer",
+        "func_path_generator",
+        "record_on_exception",
+    )

     @classmethod
     def from_args(cls, cassette_class, **kwargs):
@@ -55,6 +51,7 @@ class CassetteContextDecorator:
         self.cls = cls
         self._args_getter = args_getter
         self.__finish = None
+        self.__cassette = None

     def _patch_generator(self, cassette):
         with contextlib.ExitStack() as exit_stack:
@@ -64,9 +61,6 @@ class CassetteContextDecorator:
             log.debug(log_format.format(action="Entering", path=cassette._path))
             yield cassette
             log.debug(log_format.format(action="Exiting", path=cassette._path))
-            # TODO(@IvanMalison): Hmmm. it kind of feels like this should be
-            # somewhere else.
-            cassette._save()

     def __enter__(self):
         # This assertion is here to prevent the dangerous behavior
@@ -79,15 +73,28 @@ class CassetteContextDecorator:
         #     pass
         assert self.__finish is None, "Cassette already open."
         other_kwargs, cassette_kwargs = partition_dict(
-            lambda key, _: key in self._non_cassette_arguments, self._args_getter()
+            lambda key, _: key in self._non_cassette_arguments,
+            self._args_getter(),
         )
         if other_kwargs.get("path_transformer"):
             transformer = other_kwargs["path_transformer"]
             cassette_kwargs["path"] = transformer(cassette_kwargs["path"])
-        self.__finish = self._patch_generator(self.cls.load(**cassette_kwargs))
+        self.__cassette = self.cls.load(**cassette_kwargs)
+        self.__finish = self._patch_generator(self.__cassette)
         return next(self.__finish)

-    def __exit__(self, *args):
+    def __exit__(self, *exc_info):
+        exception_was_raised = any(exc_info)
+        record_on_exception = self._args_getter().get("record_on_exception", True)
+        if record_on_exception or not exception_was_raised:
+            self.__cassette._save()
+            self.__cassette = None
+        # Fellow programmer, don't remove this `next`, if `self.__finish` is
+        # not consumed the unpatcher functions accumulated in the `exit_stack`
+        # object created in `_patch_generator` will not be called until
+        # `exit_stack` is not garbage collected.
+        # This works in CPython but not in Pypy, where the unpatchers will not
+        # be called until much later.
         next(self.__finish, None)
         self.__finish = None
@@ -118,20 +125,7 @@ class CassetteContextDecorator:
         duration of the generator.
         """
         with self as cassette:
-            coroutine = fn(cassette)
-            # We don't need to catch StopIteration. The caller (Tornado's
-            # gen.coroutine, for example) will handle that.
-            to_yield = next(coroutine)
-            while True:
-                try:
-                    to_send = yield to_yield
-                except Exception:
-                    to_yield = coroutine.throw(*sys.exc_info())
-                else:
-                    try:
-                        to_yield = coroutine.send(to_send)
-                    except StopIteration:
-                        break
+            return (yield from fn(cassette))

     def _handle_function(self, fn):
         with self as cassette:
@@ -183,6 +177,7 @@ class Cassette:
         custom_patches=(),
         inject=False,
         allow_playback_repeats=False,
+        drop_unused_requests=False,
     ):
         self._persister = persister or FilesystemPersister
         self._path = path
@@ -195,6 +190,7 @@ class Cassette:
         self.record_mode = record_mode
         self.custom_patches = custom_patches
         self.allow_playback_repeats = allow_playback_repeats
+        self.drop_unused_requests = drop_unused_requests

         # self.data is the list of (req, resp) tuples
         self.data = []
@@ -202,6 +198,10 @@ class Cassette:
         self.dirty = False
         self.rewound = False

+        # Subsets of self.data to store old and played interactions
+        self._old_interactions = []
+        self._played_interactions = []
+
     @property
     def play_count(self):
         return sum(self.play_counts.values())
@@ -221,14 +221,14 @@ class Cassette:
     @property
     def write_protected(self):
-        return self.rewound and self.record_mode == RecordMode.ONCE or self.record_mode == RecordMode.NONE
+        return (self.rewound and self.record_mode == RecordMode.ONCE) or self.record_mode == RecordMode.NONE

     def append(self, request, response):
         """Add a request, response pair to this cassette"""
-        log.info("Appending request %s and response %s", request, response)
         request = self._before_record_request(request)
         if not request:
             return
+        log.info("Appending request %s and response %s", request, response)
         # Deepcopy is here because mutation of `response` will corrupt the
         # real response.
         response = copy.deepcopy(response)
@@ -263,10 +263,11 @@ class Cassette:
         for index, response in self._responses(request):
             if self.play_counts[index] == 0 or self.allow_playback_repeats:
                 self.play_counts[index] += 1
+                self._played_interactions.append((request, response))
                 return response
         # The cassette doesn't contain the request asked for.
         raise UnhandledHTTPRequestError(
-            "The cassette (%r) doesn't contain the request (%r) asked for" % (self._path, request)
+            f"The cassette ({self._path!r}) doesn't contain the request ({request!r}) asked for",
         )

     def responses_of(self, request):
@@ -281,7 +282,7 @@ class Cassette:
             return responses
         # The cassette doesn't contain the request asked for.
         raise UnhandledHTTPRequestError(
-            "The cassette (%r) doesn't contain the request (%r) asked for" % (self._path, request)
+            f"The cassette ({self._path!r}) doesn't contain the request ({request!r}) asked for",
         )

     def rewind(self):
@@ -300,7 +301,7 @@ class Cassette:
         """
         best_matches = []
         request = self._before_record_request(request)
-        for index, (stored_request, response) in enumerate(self.data):
+        for _, (stored_request, _) in enumerate(self.data):
             successes, fails = get_matchers_results(request, stored_request, self._match_on)
             best_matches.append((len(successes), stored_request, successes, fails))
         best_matches.sort(key=lambda t: t[0], reverse=True)
@@ -323,26 +324,51 @@ class Cassette:
         return final_best_matches

+    def _new_interactions(self):
+        """List of new HTTP interactions (request/response tuples)"""
+        new_interactions = []
+        for request, response in self.data:
+            if all(
+                not requests_match(request, old_request, self._match_on)
+                for old_request, _ in self._old_interactions
+            ):
+                new_interactions.append((request, response))
+        return new_interactions
+
     def _as_dict(self):
         return {"requests": self.requests, "responses": self.responses}

+    def _build_used_interactions_dict(self):
+        interactions = self._played_interactions + self._new_interactions()
+        cassete_dict = {
+            "requests": [request for request, _ in interactions],
+            "responses": [response for _, response in interactions],
+        }
+        return cassete_dict
+
     def _save(self, force=False):
+        if self.drop_unused_requests and len(self._played_interactions) < len(self._old_interactions):
+            cassete_dict = self._build_used_interactions_dict()
+            force = True
+        else:
+            cassete_dict = self._as_dict()
         if force or self.dirty:
-            self._persister.save_cassette(self._path, self._as_dict(), serializer=self._serializer)
+            self._persister.save_cassette(self._path, cassete_dict, serializer=self._serializer)
             self.dirty = False

     def _load(self):
         try:
             requests, responses = self._persister.load_cassette(self._path, serializer=self._serializer)
-            for request, response in zip(requests, responses):
+            for request, response in zip(requests, responses, strict=False):
                 self.append(request, response)
+                self._old_interactions.append((request, response))
             self.dirty = False
             self.rewound = True
-        except ValueError:
+        except (CassetteDecodeError, CassetteNotFoundError):
             pass

     def __str__(self):
-        return "<Cassette containing {} recorded response(s)>".format(len(self))
+        return f"<Cassette containing {len(self)} recorded response(s)>"

     def __len__(self):
         """Return the number of request,response pairs stored in here"""
@@ -350,7 +376,7 @@ class Cassette:
     def __contains__(self, request):
         """Return whether or not a request has been stored"""
-        for index, response in self._responses(request):
+        for index, _ in self._responses(request):
             if self.play_counts[index] == 0 or self.allow_playback_repeats:
                 return True
         return False

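The `__exit__` change above moves cassette persistence out of the patch generator and gates it on `record_on_exception`: when the option is false and the block raised, nothing is written to disk. A toy, stdlib-only sketch of that save-on-exit rule (the class names here are illustrative, not vcrpy's API):

```python
class FakeCassette:
    """Stand-in for a cassette; only tracks whether _save ran."""

    def __init__(self):
        self.saved = False

    def _save(self):
        self.saved = True


class CassetteContext:
    """Toy context manager mirroring the save-on-exit logic."""

    def __init__(self, cassette, record_on_exception=True):
        self._cassette = cassette
        self._record_on_exception = record_on_exception

    def __enter__(self):
        return self._cassette

    def __exit__(self, *exc_info):
        exception_was_raised = any(exc_info)
        if self._record_on_exception or not exception_was_raised:
            self._cassette._save()
        return False  # never swallow the exception


failing = FakeCassette()
try:
    with CassetteContext(failing, record_on_exception=False):
        raise RuntimeError("simulated request failure")
except RuntimeError:
    pass
assert not failing.saved  # nothing persisted when the body raised

passing = FakeCassette()
with CassetteContext(passing, record_on_exception=False):
    pass
assert passing.saved  # a clean exit still persists
```

This is why `record_on_exception` is listed in `_non_cassette_arguments`: it configures the context decorator itself, not the cassette being recorded.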

@@ -1,20 +1,17 @@
 import copy
-from collections import abc as collections_abc
 import functools
 import inspect
 import os
 import types
+from collections import abc as collections_abc
+from pathlib import Path

-import six
-
+from . import filters, matchers
 from .cassette import Cassette
-from .serializers import yamlserializer, jsonserializer
 from .persisters.filesystem import FilesystemPersister
-from .util import compose, auto_decorate
 from .record_mode import RecordMode
-from . import matchers
-from . import filters
+from .serializers import jsonserializer, yamlserializer
+from .util import auto_decorate, compose


 class VCR:
@@ -50,6 +47,8 @@ class VCR:
         cassette_library_dir=None,
         func_path_generator=None,
         decode_compressed_response=False,
+        record_on_exception=True,
+        drop_unused_requests=False,
     ):
         self.serializer = serializer
         self.match_on = match_on
@@ -81,13 +80,15 @@ class VCR:
         self.path_transformer = path_transformer
         self.func_path_generator = func_path_generator
         self.decode_compressed_response = decode_compressed_response
+        self.record_on_exception = record_on_exception
         self._custom_patches = tuple(custom_patches)
+        self.drop_unused_requests = drop_unused_requests

     def _get_serializer(self, serializer_name):
         try:
             serializer = self.serializers[serializer_name]
         except KeyError:
-            raise KeyError("Serializer {} doesn't exist or isn't registered".format(serializer_name))
+            raise KeyError(f"Serializer {serializer_name} doesn't exist or isn't registered") from None
         return serializer

     def _get_matchers(self, matcher_names):
@@ -96,11 +97,11 @@ class VCR:
             for m in matcher_names:
                 matchers.append(self.matchers[m])
         except KeyError:
-            raise KeyError("Matcher {} doesn't exist or isn't registered".format(m))
+            raise KeyError(f"Matcher {m} doesn't exist or isn't registered") from None
         return matchers

     def use_cassette(self, path=None, **kwargs):
-        if path is not None and not isinstance(path, str):
+        if path is not None and not isinstance(path, (str, Path)):
             function = path
             # Assume this is an attempt to decorate a function
             return self._use_cassette(**kwargs)(function)
@@ -124,6 +125,7 @@ class VCR:
         func_path_generator = kwargs.get("func_path_generator", self.func_path_generator)
         cassette_library_dir = kwargs.get("cassette_library_dir", self.cassette_library_dir)
         additional_matchers = kwargs.get("additional_matchers", ())
+        record_on_exception = kwargs.get("record_on_exception", self.record_on_exception)

         if cassette_library_dir:
@@ -150,6 +152,8 @@ class VCR:
             "path_transformer": path_transformer,
             "func_path_generator": func_path_generator,
             "allow_playback_repeats": kwargs.get("allow_playback_repeats", False),
+            "record_on_exception": record_on_exception,
+            "drop_unused_requests": kwargs.get("drop_unused_requests", self.drop_unused_requests),
         }
         path = kwargs.get("path")
         if path:
@@ -159,7 +163,8 @@ class VCR:
     def _build_before_record_response(self, options):
         before_record_response = options.get("before_record_response", self.before_record_response)
         decode_compressed_response = options.get(
-            "decode_compressed_response", self.decode_compressed_response
+            "decode_compressed_response",
+            self.decode_compressed_response,
         )
         filter_functions = []
         if decode_compressed_response:
@@ -183,10 +188,12 @@ class VCR:
         filter_headers = options.get("filter_headers", self.filter_headers)
         filter_query_parameters = options.get("filter_query_parameters", self.filter_query_parameters)
         filter_post_data_parameters = options.get(
-            "filter_post_data_parameters", self.filter_post_data_parameters
+            "filter_post_data_parameters",
+            self.filter_post_data_parameters,
         )
         before_record_request = options.get(
-            "before_record_request", options.get("before_record", self.before_record_request)
+            "before_record_request",
+            options.get("before_record", self.before_record_request),
         )
         ignore_hosts = options.get("ignore_hosts", self.ignore_hosts)
         ignore_localhost = options.get("ignore_localhost", self.ignore_localhost)
@@ -196,12 +203,12 @@ class VCR:
         if filter_query_parameters:
             replacements = [p if isinstance(p, tuple) else (p, None) for p in filter_query_parameters]
             filter_functions.append(
-                functools.partial(filters.replace_query_parameters, replacements=replacements)
+                functools.partial(filters.replace_query_parameters, replacements=replacements),
             )
         if filter_post_data_parameters:
             replacements = [p if isinstance(p, tuple) else (p, None) for p in filter_post_data_parameters]
             filter_functions.append(
-                functools.partial(filters.replace_post_data_parameters, replacements=replacements)
+                functools.partial(filters.replace_post_data_parameters, replacements=replacements),
             )

         hosts_to_ignore = set(ignore_hosts)
@@ -216,7 +223,7 @@ class VCR:
             filter_functions.extend(before_record_request)

         def before_record_request(request):
-            request = copy.copy(request)
+            request = copy.deepcopy(request)
             for function in filter_functions:
                 if request is None:
                     break
@@ -250,5 +257,5 @@ class VCR:
     def test_case(self, predicate=None):
         predicate = predicate or self.is_test_method
-        # TODO: Remove this reference to `six` in favor of the Python3 equivalent
-        return six.with_metaclass(auto_decorate(self.use_cassette, predicate))
+        metaclass = auto_decorate(self.use_cassette, predicate)
+        return metaclass("temporary_class", (), {})

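The `drop_unused_requests` option wired through above prunes the cassette on save: interactions that were loaded from disk but never replayed are dropped, while replayed and newly recorded ones are kept. A minimal sketch of that selection rule with plain tuples and equality (the real implementation compares requests with the configured matchers, not `==`; `prune_interactions` is an illustrative name, not vcrpy's API):

```python
def prune_interactions(old, played, recorded):
    """Return the interactions worth keeping: everything that was
    actually replayed this run, plus interactions recorded for the
    first time (whose request is not in the loaded cassette)."""
    old_requests = {request for request, _ in old}
    new = [pair for pair in recorded if pair[0] not in old_requests]
    return played + new


old = [("GET /a", "A"), ("GET /b", "B")]       # loaded from disk
played = [("GET /a", "A")]                     # only /a was replayed
recorded = [("GET /a", "A"), ("GET /c", "C")]  # /c is brand new

# /b was never replayed, so it is dropped; /c is kept as new.
assert prune_interactions(old, played, recorded) == [("GET /a", "A"), ("GET /c", "C")]
```

This also explains why `_save` forces a write in that branch: the pruned dict can differ from `self.data` even when no new interaction marked the cassette dirty.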

@@ -13,30 +13,29 @@ class CannotOverwriteExistingCassetteException(Exception):
         best_matches = cassette.find_requests_with_most_matches(failed_request)
         if best_matches:
             # Build a comprehensible message to put in the exception.
-            best_matches_msg = "Found {} similar requests with {} different matcher(s) :\n".format(
-                len(best_matches), len(best_matches[0][2])
+            best_matches_msg = (
+                f"Found {len(best_matches)} similar requests "
+                f"with {len(best_matches[0][2])} different matcher(s) :\n"
             )
             for idx, best_match in enumerate(best_matches, start=1):
                 request, succeeded_matchers, failed_matchers_assertion_msgs = best_match
                 best_matches_msg += (
-                    "\n%s - (%r).\n"
-                    "Matchers succeeded : %s\n"
-                    "Matchers failed :\n" % (idx, request, succeeded_matchers)
+                    f"\n{idx} - ({request!r}).\n"
+                    f"Matchers succeeded : {succeeded_matchers}\n"
+                    "Matchers failed :\n"
                 )
                 for failed_matcher, assertion_msg in failed_matchers_assertion_msgs:
-                    best_matches_msg += "%s - assertion failure :\n" "%s\n" % (failed_matcher, assertion_msg)
+                    best_matches_msg += f"{failed_matcher} - assertion failure :\n{assertion_msg}\n"
         else:
             best_matches_msg = "No similar requests, that have not been played, found."
         return (
-            "Can't overwrite existing cassette (%r) in "
-            "your current record mode (%r).\n"
-            "No match for the request (%r) was found.\n"
-            "%s" % (cassette._path, cassette.record_mode, failed_request, best_matches_msg)
+            f"Can't overwrite existing cassette ({cassette._path!r}) in "
+            f"your current record mode ({cassette.record_mode!r}).\n"
+            f"No match for the request ({failed_request!r}) was found.\n"
+            f"{best_matches_msg}"
         )


 class UnhandledHTTPRequestError(KeyError):
     """Raised when a cassette does not contain the request we want."""
-
-    pass

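The `%`-formatting to f-string conversion above relies on `!r` conversions producing exactly the same text as the old `%r` placeholders. A quick check with illustrative values:

```python
# Hypothetical values standing in for cassette._path and cassette.record_mode.
path = "cassettes/example.yaml"
mode = "once"

old_style = "Can't overwrite existing cassette (%r) in your current record mode (%r).\n" % (path, mode)
new_style = f"Can't overwrite existing cassette ({path!r}) in your current record mode ({mode!r}).\n"

# %r and !r both call repr(), so the rendered messages are identical.
assert old_style == new_style
assert "'once'" in new_style  # repr() of a str keeps the quotes
```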
View File

@@ -1,8 +1,8 @@
-from io import BytesIO
-from urllib.parse import urlparse, urlencode, urlunparse
 import copy
 import json
 import zlib
+from io import BytesIO
+from urllib.parse import urlencode, urlparse, urlunparse

 from .util import CaseInsensitiveDict
@@ -95,7 +95,7 @@ def replace_post_data_parameters(request, replacements):
             new_body[k] = rv
         request.body = new_body
     elif request.headers.get("Content-Type") == "application/json":
-        json_data = json.loads(request.body.decode("utf-8"))
+        json_data = json.loads(request.body)
         for k, rv in replacements.items():
             if k in json_data:
                 ov = json_data.pop(k)
@@ -150,10 +150,18 @@ def decode_response(response):
         """Returns decompressed body according to encoding using zlib.
         to (de-)compress gzip format, use wbits = zlib.MAX_WBITS | 16
         """
+        if not body:
+            return ""
         if encoding == "gzip":
-            return zlib.decompress(body, zlib.MAX_WBITS | 16)
+            try:
+                return zlib.decompress(body, zlib.MAX_WBITS | 16)
+            except zlib.error:
+                return body  # assumes that the data was already decompressed
         else:  # encoding == 'deflate'
-            return zlib.decompress(body)
+            try:
+                return zlib.decompress(body)
+            except zlib.error:
+                return body  # assumes that the data was already decompressed

     # Deepcopy here in case `headers` contain objects that could
     # be mutated by a shallow copy and corrupt the real response.

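The `decode_response` change wraps both zlib paths in `try/except zlib.error` so that bodies which were already decompressed pass through unchanged. A sketch of the two container formats involved (`zlib.compressobj` is used to produce the gzip stream, since `zlib.compress` only grew a `wbits` argument in Python 3.11):

```python
import zlib

payload = b"hello vcrpy"

# gzip container: wbits = MAX_WBITS | 16 selects the gzip framing.
co = zlib.compressobj(wbits=zlib.MAX_WBITS | 16)
gzipped = co.compress(payload) + co.flush()
assert zlib.decompress(gzipped, zlib.MAX_WBITS | 16) == payload

# deflate (zlib container): plain compress/decompress, default wbits.
deflated = zlib.compress(payload)
assert zlib.decompress(deflated) == payload

# Already-decompressed data raises zlib.error, which the patched code now
# catches so it can fall back to returning the raw body.
try:
    body = zlib.decompress(payload, zlib.MAX_WBITS | 16)
except zlib.error:
    body = payload
assert body == payload
```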
View File

@@ -1,55 +1,74 @@
 import json
+import logging
 import urllib
 import xmlrpc.client
-from .util import read_body
-import logging
+from string import hexdigits
+
+from .util import read_body
+
+_HEXDIG_CODE_POINTS: set[int] = {ord(s.encode("ascii")) for s in hexdigits}

 log = logging.getLogger(__name__)


 def method(r1, r2):
-    assert r1.method == r2.method, "{} != {}".format(r1.method, r2.method)
+    if r1.method != r2.method:
+        raise AssertionError(f"{r1.method} != {r2.method}")


 def uri(r1, r2):
-    assert r1.uri == r2.uri, "{} != {}".format(r1.uri, r2.uri)
+    if r1.uri != r2.uri:
+        raise AssertionError(f"{r1.uri} != {r2.uri}")


 def host(r1, r2):
-    assert r1.host == r2.host, "{} != {}".format(r1.host, r2.host)
+    if r1.host != r2.host:
+        raise AssertionError(f"{r1.host} != {r2.host}")


 def scheme(r1, r2):
-    assert r1.scheme == r2.scheme, "{} != {}".format(r1.scheme, r2.scheme)
+    if r1.scheme != r2.scheme:
+        raise AssertionError(f"{r1.scheme} != {r2.scheme}")


 def port(r1, r2):
-    assert r1.port == r2.port, "{} != {}".format(r1.port, r2.port)
+    if r1.port != r2.port:
+        raise AssertionError(f"{r1.port} != {r2.port}")


 def path(r1, r2):
-    assert r1.path == r2.path, "{} != {}".format(r1.path, r2.path)
+    if r1.path != r2.path:
+        raise AssertionError(f"{r1.path} != {r2.path}")


 def query(r1, r2):
-    assert r1.query == r2.query, "{} != {}".format(r1.query, r2.query)
+    if r1.query != r2.query:
+        raise AssertionError(f"{r1.query} != {r2.query}")


 def raw_body(r1, r2):
-    assert read_body(r1) == read_body(r2)
+    if read_body(r1) != read_body(r2):
+        raise AssertionError


 def body(r1, r2):
-    transformer = _get_transformer(r1)
-    r2_transformer = _get_transformer(r2)
-    if transformer != r2_transformer:
-        transformer = _identity
-    assert transformer(read_body(r1)) == transformer(read_body(r2))
+    transformers = list(_get_transformers(r1))
+    if transformers != list(_get_transformers(r2)):
+        transformers = []
+
+    b1 = read_body(r1)
+    b2 = read_body(r2)
+    for transform in transformers:
+        b1 = transform(b1)
+        b2 = transform(b2)
+    if b1 != b2:
+        raise AssertionError


 def headers(r1, r2):
-    assert r1.headers == r2.headers, "{} != {}".format(r1.headers, r2.headers)
+    if r1.headers != r2.headers:
+        raise AssertionError(f"{r1.headers} != {r2.headers}")


 def _header_checker(value, header="Content-Type"):
@@ -62,17 +81,71 @@ def _header_checker(value, header="Content-Type"):
     return checker


+def _dechunk(body):
+    if isinstance(body, str):
+        body = body.encode("utf-8")
+    elif isinstance(body, bytearray):
+        body = bytes(body)
+    elif hasattr(body, "__iter__"):
+        body = list(body)
+        if body:
+            if isinstance(body[0], str):
+                body = ("".join(body)).encode("utf-8")
+            elif isinstance(body[0], bytes):
+                body = b"".join(body)
+            elif isinstance(body[0], int):
+                body = bytes(body)
+            else:
+                raise ValueError(f"Body chunk type {type(body[0])} not supported")
+        else:
+            body = None
+
+    if not isinstance(body, bytes):
+        return body
+
+    # Now decode chunked data format (https://en.wikipedia.org/wiki/Chunked_transfer_encoding)
+    # Example input: b"45\r\n<69 bytes>\r\n0\r\n\r\n" where int(b"45", 16) == 69.
+    CHUNK_GAP = b"\r\n"
+    BODY_LEN: int = len(body)
+
+    chunks: list[bytes] = []
+    pos: int = 0
+
+    while True:
+        for i in range(pos, BODY_LEN):
+            if body[i] not in _HEXDIG_CODE_POINTS:
+                break
+        if i == 0 or body[i : i + len(CHUNK_GAP)] != CHUNK_GAP:
+            if pos == 0:
+                return body  # i.e. assume non-chunk data
+            raise ValueError("Malformed chunked data")
+        size_bytes = int(body[pos:i], 16)
+        if size_bytes == 0:  # i.e. well-formed ending
+            return b"".join(chunks)
+        chunk_data_first = i + len(CHUNK_GAP)
+        chunk_data_after_last = chunk_data_first + size_bytes
+        if body[chunk_data_after_last : chunk_data_after_last + len(CHUNK_GAP)] != CHUNK_GAP:
+            raise ValueError("Malformed chunked data")
+        chunk_data = body[chunk_data_first:chunk_data_after_last]
+        chunks.append(chunk_data)
+        pos = chunk_data_after_last + len(CHUNK_GAP)
+
+
 def _transform_json(body):
-    # Request body is always a byte string, but json.loads() wants a text
-    # string. RFC 7159 says the default encoding is UTF-8 (although UTF-16
-    # and UTF-32 are also allowed: hmmmmm).
     if body:
-        return json.loads(body.decode("utf-8"))
+        return json.loads(body)


 _xml_header_checker = _header_checker("text/xml")
 _xmlrpc_header_checker = _header_checker("xmlrpc", header="User-Agent")
 _checker_transformer_pairs = (
+    (_header_checker("chunked", header="Transfer-Encoding"), _dechunk),
     (
         _header_checker("application/x-www-form-urlencoded"),
         lambda body: urllib.parse.parse_qs(body.decode("ascii")),
@@ -82,22 +155,16 @@ _checker_transformer_pairs = (
 )


-def _identity(x):
-    return x
-
-
-def _get_transformer(request):
+def _get_transformers(request):
     for checker, transformer in _checker_transformer_pairs:
         if checker(request.headers):
-            return transformer
-    else:
-        return _identity
+            yield transformer


 def requests_match(r1, r2, matchers):
-    successes, failures = get_matchers_results(r1, r2, matchers)
+    _, failures = get_matchers_results(r1, r2, matchers)
     if failures:
-        log.debug("Requests {} and {} differ.\n" "Failure details:\n" "{}".format(r1, r2, failures))
+        log.debug(f"Requests {r1} and {r2} differ.\nFailure details:\n{failures}")
     return len(failures) == 0

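The `_dechunk` transformer added above decodes the HTTP chunked transfer-encoding wire format: each chunk is a hex size line, CRLF, the data, CRLF, terminated by a zero-size chunk. A stripped-down decoder for the same layout (error handling omitted; not vcrpy's implementation, just the format):

```python
def dechunk(body: bytes) -> bytes:
    """Decode a chunked-transfer-encoded body: "<hex size>\\r\\n<data>\\r\\n"
    repeated, ending with "0\\r\\n\\r\\n"."""
    chunks = []
    pos = 0
    while True:
        gap = body.index(b"\r\n", pos)          # end of the hex size line
        size = int(body[pos:gap], 16)
        if size == 0:                            # terminating zero-size chunk
            return b"".join(chunks)
        start = gap + 2
        chunks.append(body[start:start + size])  # raw chunk data
        pos = start + size + 2                   # skip the trailing CRLF


# Chunk sizes 4, 7, and 0xB (11, data containing an embedded CRLF):
raw = b"4\r\nWiki\r\n7\r\npedia i\r\nB\r\nn \r\nchunks.\r\n0\r\n\r\n"
assert dechunk(raw) == b"Wikipedia in \r\nchunks."
```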
View File

@@ -7,7 +7,7 @@ It merges and deletes the request obsolete keys (protocol, host, port, path)
 into new 'uri' key.

 Usage::

-    python -m vcr.migration PATH
+    python3 -m vcr.migration PATH

 The PATH can be path to the directory with cassettes or cassette itself
 """
@@ -17,11 +17,12 @@ import os
 import shutil
 import sys
 import tempfile

 import yaml

-from .serializers import yamlserializer, jsonserializer
-from .serialize import serialize
 from . import request
+from .serialize import serialize
+from .serializers import jsonserializer, yamlserializer
 from .stubs.compat import get_httpmessage

 # Use the libYAML versions if possible
@@ -54,7 +55,7 @@ def build_uri(**parts):
     port = parts["port"]
     scheme = parts["protocol"]
     default_port = {"https": 443, "http": 80}[scheme]
-    parts["port"] = ":{}".format(port) if port != default_port else ""
+    parts["port"] = f":{port}" if port != default_port else ""
     return "{protocol}://{host}{port}{path}".format(**parts)
@@ -91,7 +92,7 @@ def migrate_json(in_fp, out_fp):

 def _list_of_tuples_to_dict(fs):
-    return {k: v for k, v in fs[0]}
+    return dict(fs[0])


 def _already_migrated(data):
@@ -117,7 +118,7 @@ def migrate(file_path, migration_fn):
     # because we assume that original files can be reverted
     # we will try to copy the content. (os.rename not needed)
     with tempfile.TemporaryFile(mode="w+") as out_fp:
-        with open(file_path, "r") as in_fp:
+        with open(file_path) as in_fp:
             if not migration_fn(in_fp, out_fp):
                 return False
         with open(file_path, "w") as in_fp:
@@ -129,7 +130,7 @@ def migrate(file_path, migration_fn):
 def try_migrate(path):
     if path.endswith(".json"):
         return migrate(path, migrate_json)
-    elif path.endswith(".yaml") or path.endswith(".yml"):
+    elif path.endswith((".yaml", ".yml")):
         return migrate(path, migrate_yml)
     return False
@@ -137,7 +138,7 @@ def try_migrate(path):
 def main():
     if len(sys.argv) != 2:
         raise SystemExit(
-            "Please provide path to cassettes directory or file. " "Usage: python -m vcr.migration PATH"
+            "Please provide path to cassettes directory or file. Usage: python3 -m vcr.migration PATH",
        )
     path = sys.argv[1]
@@ -149,7 +150,7 @@ def main():
     for file_path in files:
         migrated = try_migrate(file_path)
         status = "OK" if migrated else "FAIL"
-        sys.stderr.write("[{}] {}\n".format(status, file_path))
+        sys.stderr.write(f"[{status}] {file_path}\n")
     sys.stderr.write("Done.\n")

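`build_uri`'s default-port handling can be exercised directly; the function body below is copied from the diff above, so the behavior shown is the migrated one:

```python
def build_uri(**parts):
    """Omit the port from the URI when it matches the scheme default."""
    port = parts["port"]
    scheme = parts["protocol"]
    default_port = {"https": 443, "http": 80}[scheme]
    parts["port"] = f":{port}" if port != default_port else ""
    return "{protocol}://{host}{port}{path}".format(**parts)


# Default port is dropped; non-default ports are kept.
assert build_uri(protocol="http", host="example.com", port=80, path="/") == "http://example.com/"
assert build_uri(protocol="https", host="example.com", port=8443, path="/x") == "https://example.com:8443/x"
```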
View File

@@ -1,13 +1,13 @@
 """Utilities for patching in cassettes"""

 import contextlib
 import functools
+import http.client as httplib
 import itertools
+import logging
 from unittest import mock

 from .stubs import VCRHTTPConnection, VCRHTTPSConnection

-import http.client as httplib
-import logging
-
 log = logging.getLogger(__name__)

 # Save some of the original types for the purposes of unpatching
@@ -16,42 +16,47 @@ _HTTPSConnection = httplib.HTTPSConnection

 # Try to save the original types for boto3
 try:
-    from botocore.awsrequest import AWSHTTPSConnection, AWSHTTPConnection
-except ImportError:
+    from botocore.awsrequest import AWSHTTPConnection, AWSHTTPSConnection
+except ImportError as e:
     try:
-        import botocore.vendored.requests.packages.urllib3.connectionpool as cpool
+        import botocore.vendored.requests  # noqa: F401
     except ImportError:  # pragma: no cover
         pass
     else:
-        _Boto3VerifiedHTTPSConnection = cpool.VerifiedHTTPSConnection
-        _cpoolBoto3HTTPConnection = cpool.HTTPConnection
-        _cpoolBoto3HTTPSConnection = cpool.HTTPSConnection
+        raise RuntimeError(
+            "vcrpy >=4.2.2 and botocore <1.11.0 are not compatible"
+            "; please upgrade botocore (or downgrade vcrpy)",
+        ) from e
 else:
     _Boto3VerifiedHTTPSConnection = AWSHTTPSConnection
     _cpoolBoto3HTTPConnection = AWSHTTPConnection
     _cpoolBoto3HTTPSConnection = AWSHTTPSConnection

 cpool = None
+conn = None
 # Try to save the original types for urllib3
 try:
+    import urllib3.connection as conn
     import urllib3.connectionpool as cpool
 except ImportError:  # pragma: no cover
     pass
 else:
-    _VerifiedHTTPSConnection = cpool.VerifiedHTTPSConnection
-    _cpoolHTTPConnection = cpool.HTTPConnection
-    _cpoolHTTPSConnection = cpool.HTTPSConnection
+    _VerifiedHTTPSConnection = conn.VerifiedHTTPSConnection
+    _connHTTPConnection = conn.HTTPConnection
+    _connHTTPSConnection = conn.HTTPSConnection

 # Try to save the original types for requests
 try:
-    if not cpool:
-        import requests.packages.urllib3.connectionpool as cpool
+    import requests
 except ImportError:  # pragma: no cover
     pass
 else:
-    _VerifiedHTTPSConnection = cpool.VerifiedHTTPSConnection
-    _cpoolHTTPConnection = cpool.HTTPConnection
-    _cpoolHTTPSConnection = cpool.HTTPSConnection
+    if requests.__build__ < 0x021602:
+        raise RuntimeError(
+            "vcrpy >=4.2.2 and requests <2.16.2 are not compatible"
+            "; please upgrade requests (or downgrade vcrpy)",
+        )

 # Try to save the original types for httplib2
 try:
@@ -63,14 +68,6 @@ else:
     _HTTPSConnectionWithTimeout = httplib2.HTTPSConnectionWithTimeout
     _SCHEME_TO_CONNECTION = httplib2.SCHEME_TO_CONNECTION

-# Try to save the original types for boto
-try:
-    import boto.https_connection
-except ImportError:  # pragma: no cover
-    pass
-else:
-    _CertValidatingHTTPSConnection = boto.https_connection.CertValidatingHTTPSConnection
-
 # Try to save the original types for Tornado
 try:
     import tornado.simple_httpclient
@@ -95,12 +92,12 @@ else:

 try:
-    import httpx
+    import httpcore
 except ImportError:  # pragma: no cover
     pass
 else:
-    _HttpxSyncClient_send = httpx.Client.send
-    _HttpxAsyncClient_send = httpx.AsyncClient.send
+    _HttpcoreConnectionPool_handle_request = httpcore.ConnectionPool.handle_request
+    _HttpcoreAsyncConnectionPool_handle_async_request = httpcore.AsyncConnectionPool.handle_async_request


 class CassettePatcherBuilder:
@@ -122,10 +119,9 @@ class CassettePatcherBuilder:
             self._boto3(),
             self._urllib3(),
             self._httplib2(),
-            self._boto(),
             self._tornado(),
             self._aiohttp(),
-            self._httpx(),
+            self._httpcore(),
             self._build_patchers_from_mock_triples(self._cassette.custom_patches),
         )
@@ -140,7 +136,9 @@ class CassettePatcherBuilder:
             return
         return mock.patch.object(
-            obj, patched_attribute, self._recursively_apply_get_cassette_subclass(replacement_class)
+            obj,
+            patched_attribute,
+            self._recursively_apply_get_cassette_subclass(replacement_class),
         )

     def _recursively_apply_get_cassette_subclass(self, replacement_dict_or_obj):
@@ -182,9 +180,7 @@ class CassettePatcherBuilder:
         bases = (base_class,)
         if not issubclass(base_class, object):  # Check for old style class
             bases += (object,)
-        return type(
-            "{}{}".format(base_class.__name__, self._cassette._path), bases, dict(cassette=self._cassette)
-        )
+        return type(f"{base_class.__name__}{self._cassette._path}", bases, {"cassette": self._cassette})

     @_build_patchers_from_mock_triples_decorator
     def _httplib(self):
@@ -196,24 +192,15 @@ class CassettePatcherBuilder:
             from .stubs import requests_stubs
         except ImportError:  # pragma: no cover
             return ()
-        return self._urllib3_patchers(cpool, requests_stubs)
+        return self._urllib3_patchers(cpool, conn, requests_stubs)

     @_build_patchers_from_mock_triples_decorator
     def _boto3(self):
         try:
             # botocore using awsrequest
             import botocore.awsrequest as cpool
         except ImportError:  # pragma: no cover
-            try:
-                # botocore using vendored requests
-                import botocore.vendored.requests.packages.urllib3.connectionpool as cpool
-            except ImportError:  # pragma: no cover
-                pass
-            else:
-                from .stubs import boto3_stubs
-
-                yield self._urllib3_patchers(cpool, boto3_stubs)
+            pass
         else:
             from .stubs import boto3_stubs
@@ -255,12 +242,13 @@ class CassettePatcherBuilder:
     def _urllib3(self):
         try:
+            import urllib3.connection as conn
             import urllib3.connectionpool as cpool
         except ImportError:  # pragma: no cover
             return ()

         from .stubs import urllib3_stubs

-        return self._urllib3_patchers(cpool, urllib3_stubs)
+        return self._urllib3_patchers(cpool, conn, urllib3_stubs)

     @_build_patchers_from_mock_triples_decorator
     def _httplib2(self):
@@ -269,26 +257,18 @@ class CassettePatcherBuilder:
         except ImportError:  # pragma: no cover
             pass
         else:
-            from .stubs.httplib2_stubs import VCRHTTPConnectionWithTimeout
-            from .stubs.httplib2_stubs import VCRHTTPSConnectionWithTimeout
+            from .stubs.httplib2_stubs import VCRHTTPConnectionWithTimeout, VCRHTTPSConnectionWithTimeout

             yield cpool, "HTTPConnectionWithTimeout", VCRHTTPConnectionWithTimeout
             yield cpool, "HTTPSConnectionWithTimeout", VCRHTTPSConnectionWithTimeout
-            yield cpool, "SCHEME_TO_CONNECTION", {
-                "http": VCRHTTPConnectionWithTimeout,
-                "https": VCRHTTPSConnectionWithTimeout,
-            }
-
-    @_build_patchers_from_mock_triples_decorator
-    def _boto(self):
-        try:
-            import boto.https_connection as cpool
-        except ImportError:  # pragma: no cover
-            pass
-        else:
-            from .stubs.boto_stubs import VCRCertValidatingHTTPSConnection
-
-            yield cpool, "CertValidatingHTTPSConnection", VCRCertValidatingHTTPSConnection
+            yield (
+                cpool,
+                "SCHEME_TO_CONNECTION",
+                {
+                    "http": VCRHTTPConnectionWithTimeout,
+                    "https": VCRHTTPSConnectionWithTimeout,
+                },
+            )

     @_build_patchers_from_mock_triples_decorator
     def _tornado(self):
@@ -324,31 +304,34 @@ class CassettePatcherBuilder:
             yield client.ClientSession, "_request", new_request

     @_build_patchers_from_mock_triples_decorator
-    def _httpx(self):
+    def _httpcore(self):
         try:
-            import httpx
+            import httpcore
         except ImportError:  # pragma: no cover
             return
         else:
-            from .stubs.httpx_stubs import async_vcr_send, sync_vcr_send
+            from .stubs.httpcore_stubs import vcr_handle_async_request, vcr_handle_request

-            new_async_client_send = async_vcr_send(self._cassette, _HttpxAsyncClient_send)
-            yield httpx.AsyncClient, "send", new_async_client_send
+            new_handle_async_request = vcr_handle_async_request(
+                self._cassette,
+                _HttpcoreAsyncConnectionPool_handle_async_request,
+            )
+            yield httpcore.AsyncConnectionPool, "handle_async_request", new_handle_async_request

-            new_sync_client_send = sync_vcr_send(self._cassette, _HttpxSyncClient_send)
-            yield httpx.Client, "send", new_sync_client_send
+            new_handle_request = vcr_handle_request(self._cassette, _HttpcoreConnectionPool_handle_request)
+            yield httpcore.ConnectionPool, "handle_request", new_handle_request

-    def _urllib3_patchers(self, cpool, stubs):
+    def _urllib3_patchers(self, cpool, conn, stubs):
         http_connection_remover = ConnectionRemover(
-            self._get_cassette_subclass(stubs.VCRRequestsHTTPConnection)
+            self._get_cassette_subclass(stubs.VCRRequestsHTTPConnection),
         )
         https_connection_remover = ConnectionRemover(
-            self._get_cassette_subclass(stubs.VCRRequestsHTTPSConnection)
+            self._get_cassette_subclass(stubs.VCRRequestsHTTPSConnection),
         )
         mock_triples = (
-            (cpool, "VerifiedHTTPSConnection", stubs.VCRRequestsHTTPSConnection),
-            (cpool, "HTTPConnection", stubs.VCRRequestsHTTPConnection),
-            (cpool, "HTTPSConnection", stubs.VCRRequestsHTTPSConnection),
+            (conn, "VerifiedHTTPSConnection", stubs.VCRRequestsHTTPSConnection),
+            (conn, "HTTPConnection", stubs.VCRRequestsHTTPConnection),
+            (conn, "HTTPSConnection", stubs.VCRRequestsHTTPSConnection),
             (cpool, "is_connection_dropped", mock.Mock(return_value=False)),  # Needed on Windows only
             (cpool.HTTPConnectionPool, "ConnectionCls", stubs.VCRRequestsHTTPConnection),
             (cpool.HTTPSConnectionPool, "ConnectionCls", stubs.VCRRequestsHTTPSConnection),
@@ -393,10 +376,6 @@ class ConnectionRemover:
         if isinstance(connection, self._connection_class):
             self._connection_pool_to_connections.setdefault(pool, set()).add(connection)

-    def remove_connection_to_pool_entry(self, pool, connection):
-        if isinstance(connection, self._connection_class):
-            self._connection_pool_to_connections[self._connection_class].remove(connection)
-
     def __enter__(self):
         return self
@@ -407,10 +386,13 @@ class ConnectionRemover:
                 connection = pool.pool.get()
                 if isinstance(connection, self._connection_class):
                     connections.remove(connection)
-                    connection.close()
                 else:
                     readd_connections.append(connection)
             for connection in readd_connections:
                 pool._put_conn(connection)
+            for connection in connections:
+                connection.close()


 def reset_patchers():
@@ -418,69 +400,23 @@ def reset_patchers():
     yield mock.patch.object(httplib, "HTTPSConnection", _HTTPSConnection)

     try:
-        import requests
-
-        if requests.__build__ < 0x021603:
-            # Avoid double unmock if requests 2.16.3
-            # First, this is pointless, requests.packages.urllib3 *IS* urllib3 (see packages.py)
-            # Second, this is unmocking twice the same classes with different namespaces
-            # and is creating weird issues and bugs:
-            # > AssertionError: assert <class 'urllib3.connection.HTTPConnection'>
-            # >  is <class 'requests.packages.urllib3.connection.HTTPConnection'>
-            # This assert should work!!!
-            # Note that this also means that now, requests.packages is never imported
-            # if requests 2.16.3 or greater is used with VCRPy.
-            import requests.packages.urllib3.connectionpool as cpool
-        else:
-            raise ImportError("Skip requests not vendored anymore")
-    except ImportError:  # pragma: no cover
-        pass
-    else:
-        # unpatch requests v1.x
-        yield mock.patch.object(cpool, "VerifiedHTTPSConnection", _VerifiedHTTPSConnection)
-        yield mock.patch.object(cpool, "HTTPConnection", _cpoolHTTPConnection)
-        # unpatch requests v2.x
-        if hasattr(cpool.HTTPConnectionPool, "ConnectionCls"):
-            yield mock.patch.object(cpool.HTTPConnectionPool, "ConnectionCls", _cpoolHTTPConnection)
-            yield mock.patch.object(cpool.HTTPSConnectionPool, "ConnectionCls", _cpoolHTTPSConnection)
-        if hasattr(cpool, "HTTPSConnection"):
-            yield mock.patch.object(cpool, "HTTPSConnection", _cpoolHTTPSConnection)
-
-    try:
+        import urllib3.connection as conn
         import urllib3.connectionpool as cpool
     except ImportError:  # pragma: no cover
         pass
     else:
-        yield mock.patch.object(cpool, "VerifiedHTTPSConnection", _VerifiedHTTPSConnection)
-        yield mock.patch.object(cpool, "HTTPConnection", _cpoolHTTPConnection)
-        yield mock.patch.object(cpool, "HTTPSConnection", _cpoolHTTPSConnection)
+        yield mock.patch.object(conn, "VerifiedHTTPSConnection", _VerifiedHTTPSConnection)
+        yield mock.patch.object(conn, "HTTPConnection", _connHTTPConnection)
+        yield mock.patch.object(conn, "HTTPSConnection", _connHTTPSConnection)
         if hasattr(cpool.HTTPConnectionPool, "ConnectionCls"):
-            yield mock.patch.object(cpool.HTTPConnectionPool, "ConnectionCls", _cpoolHTTPConnection)
-            yield mock.patch.object(cpool.HTTPSConnectionPool, "ConnectionCls", _cpoolHTTPSConnection)
+            yield mock.patch.object(cpool.HTTPConnectionPool, "ConnectionCls", _connHTTPConnection)
+            yield mock.patch.object(cpool.HTTPSConnectionPool, "ConnectionCls", _connHTTPSConnection)

     try:
         # unpatch botocore with awsrequest
         import botocore.awsrequest as cpool
     except ImportError:  # pragma: no cover
-        try:
-            # unpatch botocore with vendored requests
-            import botocore.vendored.requests.packages.urllib3.connectionpool as cpool
-        except ImportError:  # pragma: no cover
-            pass
-        else:
-            # unpatch requests v1.x
-            yield mock.patch.object(cpool, "VerifiedHTTPSConnection", _Boto3VerifiedHTTPSConnection)
-            yield mock.patch.object(cpool, "HTTPConnection", _cpoolBoto3HTTPConnection)
-            # unpatch requests v2.x
-            if hasattr(cpool.HTTPConnectionPool, "ConnectionCls"):
-                yield mock.patch.object(cpool.HTTPConnectionPool, "ConnectionCls", _cpoolBoto3HTTPConnection)
-                yield mock.patch.object(
-                    cpool.HTTPSConnectionPool, "ConnectionCls", _cpoolBoto3HTTPSConnection
-                )
-            if hasattr(cpool, "HTTPSConnection"):
-                yield mock.patch.object(cpool, "HTTPSConnection", _cpoolBoto3HTTPSConnection)
+        pass
     else:
         if hasattr(cpool.AWSHTTPConnectionPool, "ConnectionCls"):
             yield mock.patch.object(cpool.AWSHTTPConnectionPool, "ConnectionCls", _cpoolBoto3HTTPConnection)
@@ -498,13 +434,6 @@ def reset_patchers():
         yield mock.patch.object(cpool, "HTTPSConnectionWithTimeout", _HTTPSConnectionWithTimeout)
         yield mock.patch.object(cpool, "SCHEME_TO_CONNECTION", _SCHEME_TO_CONNECTION)

-    try:
-        import boto.https_connection as cpool
-    except ImportError:  # pragma: no cover
-        pass
-    else:
-        yield mock.patch.object(cpool, "CertValidatingHTTPSConnection", _CertValidatingHTTPSConnection)
-
     try:
         import tornado.simple_httpclient as simple
     except ImportError:  # pragma: no cover

View File

@@ -1,25 +1,40 @@
 # .. _persister_example:
-import os
+from pathlib import Path

-from ..serialize import serialize, deserialize
+from ..serialize import deserialize, serialize
+
+
+class CassetteNotFoundError(FileNotFoundError):
+    pass
+
+
+class CassetteDecodeError(ValueError):
+    pass


 class FilesystemPersister:
     @classmethod
     def load_cassette(cls, cassette_path, serializer):
+        cassette_path = Path(cassette_path)  # if cassette path is already Path this is no operation
+        if not cassette_path.is_file():
+            raise CassetteNotFoundError()
         try:
-            with open(cassette_path) as f:
-                cassette_content = f.read()
-        except OSError:
-            raise ValueError("Cassette not found.")
-        cassette = deserialize(cassette_content, serializer)
-        return cassette
+            with cassette_path.open() as f:
+                data = f.read()
+        except UnicodeDecodeError as err:
+            raise CassetteDecodeError("Can't read Cassette, Encoding is broken") from err
+
+        return deserialize(data, serializer)

     @staticmethod
     def save_cassette(cassette_path, cassette_dict, serializer):
         data = serialize(cassette_dict, serializer)
-        dirname, filename = os.path.split(cassette_path)
-        if dirname and not os.path.exists(dirname):
-            os.makedirs(dirname)
-        with open(cassette_path, "w") as f:
+        cassette_path = Path(cassette_path)  # if cassette path is already Path this is no operation
+
+        cassette_folder = cassette_path.parent
+        if not cassette_folder.exists():
+            cassette_folder.mkdir(parents=True)
+        with cassette_path.open("w") as f:
             f.write(data)

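The `pathlib`-based persister replaces the `os.path.split`/`os.makedirs` dance with `Path.parent` and `mkdir(parents=True)`. A condensed sketch of the save path (this version passes `exist_ok=True` instead of the explicit `exists()` check used in the diff):

```python
import tempfile
from pathlib import Path


def save(cassette_path, data: str) -> None:
    """Write data to cassette_path, creating parent directories as needed."""
    cassette_path = Path(cassette_path)  # no-op if already a Path
    cassette_path.parent.mkdir(parents=True, exist_ok=True)
    cassette_path.write_text(data)


with tempfile.TemporaryDirectory() as tmp:
    target = Path(tmp) / "nested" / "dir" / "cassette.yaml"
    save(target, "interactions: []\n")
    content = target.read_text()

assert content == "interactions: []\n"
```

Because `Path(Path(...))` is a no-op, callers can keep passing plain strings while the persister works with `Path` objects internally.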
View File

@@ -1,8 +1,10 @@
import warnings
from io import BytesIO
from urllib.parse import urlparse, parse_qsl
from .util import CaseInsensitiveDict
import logging import logging
import warnings
from contextlib import suppress
from io import BytesIO
from urllib.parse import parse_qsl, urlparse
from .util import CaseInsensitiveDict, _is_nonsequence_iterator
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
@@ -16,13 +18,25 @@ class Request:
self.method = method self.method = method
self.uri = uri self.uri = uri
self._was_file = hasattr(body, "read") self._was_file = hasattr(body, "read")
self._was_iter = _is_nonsequence_iterator(body)
if self._was_file: if self._was_file:
self.body = body.read() self.body = body.read()
elif self._was_iter:
self.body = list(body)
else: else:
self.body = body self.body = body
self.headers = headers self.headers = headers
log.debug("Invoking Request %s", self.uri) log.debug("Invoking Request %s", self.uri)
@property
def uri(self):
return self._uri
@uri.setter
def uri(self, uri):
self._uri = uri
self.parsed_uri = urlparse(uri)
@property @property
def headers(self): def headers(self):
return self._headers return self._headers
@@ -35,7 +49,11 @@ class Request:
     @property
     def body(self):
-        return BytesIO(self._body) if self._was_file else self._body
+        if self._was_file:
+            return BytesIO(self._body)
+        if self._was_iter:
+            return iter(self._body)
+        return self._body

     @body.setter
     def body(self, value):
@@ -45,37 +63,36 @@ class Request:
     def add_header(self, key, value):
         warnings.warn(
-            "Request.add_header is deprecated. " "Please assign to request.headers instead.",
+            "Request.add_header is deprecated. Please assign to request.headers instead.",
             DeprecationWarning,
+            stacklevel=2,
         )
         self.headers[key] = value

     @property
     def scheme(self):
-        return urlparse(self.uri).scheme
+        return self.parsed_uri.scheme

     @property
     def host(self):
-        return urlparse(self.uri).hostname
+        return self.parsed_uri.hostname

     @property
     def port(self):
-        parse_uri = urlparse(self.uri)
-        port = parse_uri.port
+        port = self.parsed_uri.port
         if port is None:
-            try:
-                port = {"https": 443, "http": 80}[parse_uri.scheme]
-            except KeyError:
-                pass
+            with suppress(KeyError):
+                port = {"https": 443, "http": 80}[self.parsed_uri.scheme]
         return port

     @property
     def path(self):
-        return urlparse(self.uri).path
+        return self.parsed_uri.path

     @property
     def query(self):
-        q = urlparse(self.uri).query
+        q = self.parsed_uri.query
         return sorted(parse_qsl(q))

     # alias for backwards compatibility
@@ -89,7 +106,7 @@ class Request:
         return self.scheme

     def __str__(self):
-        return "<Request ({}) {}>".format(self.method, self.uri)
+        return f"<Request ({self.method}) {self.uri}>"

     def __repr__(self):
         return self.__str__()
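The default-port fallback in the `port` property above (a `suppress(KeyError)` over a scheme-to-port map, leaving `None` for unknown schemes) can be sketched independently:

```python
from contextlib import suppress
from urllib.parse import urlparse

def effective_port(uri):
    # Mirrors Request.port: explicit port wins, else fall back by scheme.
    parsed = urlparse(uri)
    port = parsed.port
    if port is None:
        with suppress(KeyError):
            port = {"https": 443, "http": 80}[parsed.scheme]
    return port

assert effective_port("http://example.com/path") == 80
assert effective_port("https://example.com:8443/") == 8443
assert effective_port("ftp://example.com/") is None  # unknown scheme stays None
```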

View File

@@ -1,7 +1,8 @@
-from vcr.serializers import compat
-from vcr.request import Request
 import yaml
+
+from vcr.request import Request
+from vcr.serializers import compat

 # version 1 cassettes started with VCR 1.0.x.
 # Before 1.0.x, there was no versioning.
 CASSETTE_FORMAT_VERSION = 1
@@ -27,7 +28,7 @@ def _warn_about_old_cassette_format():
         raise ValueError(
             "Your cassette files were generated in an older version "
             "of VCR. Delete your cassettes or run the migration script."
-            "See http://git.io/mHhLBg for more details."
+            "See http://git.io/mHhLBg for more details.",
         )
@@ -52,7 +53,7 @@ def serialize(cassette_dict, serializer):
             "request": compat.convert_to_unicode(request._to_dict()),
             "response": compat.convert_to_unicode(response),
         }
-        for request, response in zip(cassette_dict["requests"], cassette_dict["responses"])
+        for request, response in zip(cassette_dict["requests"], cassette_dict["responses"], strict=False)
     ]
     data = {"version": CASSETTE_FORMAT_VERSION, "interactions": interactions}
     return serializer.serialize(data)

View File

@@ -56,7 +56,7 @@ def convert_body_to_unicode(resp):
     If the request or responses body is bytes, decode it to a string
     (for python3 support)
     """
-    if type(resp) is not dict:
+    if not isinstance(resp, dict):
         # Some of the tests just serialize and deserialize a string.
         return _convert_string_to_unicode(resp)
     else:

View File

@@ -1,7 +1,4 @@
-try:
-    import simplejson as json
-except ImportError:
-    import json
+import json


 def deserialize(cassette_string):
@@ -17,13 +14,5 @@ def serialize(cassette_dict):
     try:
         return json.dumps(cassette_dict, indent=4) + "\n"
-    except UnicodeDecodeError as original:  # py2
-        raise UnicodeDecodeError(
-            original.encoding,
-            b"Error serializing cassette to JSON",
-            original.start,
-            original.end,
-            original.args[-1] + error_message,
-        )
-    except TypeError:  # py3
-        raise TypeError(error_message)
+    except TypeError:
+        raise TypeError(error_message) from None
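The dropped py2 branch above relied on `json.dumps` raising `UnicodeDecodeError`; on Python 3 a non-serializable cassette body (e.g. raw bytes) surfaces as `TypeError` instead, which is the only case the serializer still needs to handle:

```python
import json

# json.dumps cannot encode raw bytes on Python 3 -- it raises TypeError,
# which is why the UnicodeDecodeError branch could be removed.
try:
    json.dumps({"body": b"\x80binary"})
    raised = False
except TypeError:
    raised = True

assert raised
```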

View File

@@ -2,9 +2,10 @@ import yaml
# Use the libYAML versions if possible # Use the libYAML versions if possible
 try:
-    from yaml import CLoader as Loader, CDumper as Dumper
+    from yaml import CDumper as Dumper
+    from yaml import CLoader as Loader
 except ImportError:
-    from yaml import Loader, Dumper
+    from yaml import Dumper, Loader


 def deserialize(cassette_string):

View File

@@ -1,14 +1,13 @@
 """Stubs for patching HTTP and HTTPS requests"""

 import logging
+from contextlib import suppress
-from http.client import (
-    HTTPConnection,
-    HTTPSConnection,
-    HTTPResponse,
-)
+from http.client import HTTPConnection, HTTPResponse, HTTPSConnection
 from io import BytesIO

-from vcr.request import Request
 from vcr.errors import CannotOverwriteExistingCassetteException
+from vcr.request import Request

 from . import compat

 log = logging.getLogger(__name__)
@@ -49,8 +48,9 @@ def parse_headers(header_list):
 def serialize_headers(response):
+    headers = response.headers if response.msg is None else response.msg
     out = {}
-    for key, values in compat.get_headers(response.msg):
+    for key, values in compat.get_headers(headers):
         out.setdefault(key, [])
         out[key].extend(values)
     return out
@@ -67,8 +67,10 @@ class VCRHTTPResponse(HTTPResponse):
         self.reason = recorded_response["status"]["message"]
         self.status = self.code = recorded_response["status"]["code"]
         self.version = None
+        self.version_string = None
         self._content = BytesIO(self.recorded_response["body"]["string"])
         self._closed = False
+        self._original_response = self  # for requests.session.Session cookie extraction

         headers = self.recorded_response["headers"]
         # Since we are loading a response that has already been serialized, our
@@ -76,7 +78,7 @@ class VCRHTTPResponse(HTTPResponse):
         # libraries trying to process a chunked response. By removing the
         # transfer-encoding: chunked header, this should cause the downstream
         # libraries to process this as a non-chunked response.
-        te_key = [h for h in headers.keys() if h.upper() == "TRANSFER-ENCODING"]
+        te_key = [h for h in headers if h.upper() == "TRANSFER-ENCODING"]
         if te_key:
             del headers[te_key[0]]
         self.headers = self.msg = parse_headers(headers)
@@ -93,6 +95,9 @@ class VCRHTTPResponse(HTTPResponse):
     def read(self, *args, **kwargs):
         return self._content.read(*args, **kwargs)

+    def read1(self, *args, **kwargs):
+        return self._content.read1(*args, **kwargs)
+
     def readall(self):
         return self._content.readall()
@@ -145,6 +150,35 @@ class VCRHTTPResponse(HTTPResponse):
     def readable(self):
         return self._content.readable()

+    @property
+    def length_remaining(self):
+        return self._content.getbuffer().nbytes - self._content.tell()
+
+    def get_redirect_location(self):
+        """
+        Returns (a) redirect location string if we got a redirect
+        status code and valid location, (b) None if redirect status and
+        no location, (c) False if not a redirect status code.
+
+        See https://urllib3.readthedocs.io/en/stable/reference/urllib3.response.html .
+        """
+        if not (300 <= self.status <= 399):
+            return False
+        return self.getheader("Location")
+
+    @property
+    def data(self):
+        return self._content.getbuffer().tobytes()
+
+    def drain_conn(self):
+        pass
+
+    def stream(self, amt=65536, decode_content=None):
+        while True:
+            b = self._content.read(amt)
+            yield b
+            if not b:
+                break
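The three-way contract of `get_redirect_location` documented above (location string, `None`, or `False`) can be captured in a hypothetical minimal stand-in, decoupled from the response class:

```python
def redirect_location(status, location):
    # Hypothetical stand-in for VCRHTTPResponse.get_redirect_location:
    # False for non-redirect statuses, else the (possibly missing) Location header.
    if not (300 <= status <= 399):
        return False
    return location  # may be None when no Location header was recorded

assert redirect_location(200, None) is False
assert redirect_location(302, "https://example.com/next") == "https://example.com/next"
assert redirect_location(301, None) is None
```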
class VCRConnection: class VCRConnection:
# A reference to the cassette that's currently being patched in # A reference to the cassette that's currently being patched in
@@ -154,28 +188,40 @@ class VCRConnection:
         """
         Returns empty string for the default port and ':port' otherwise
         """
-        port = self.real_connection.port
+        port = (
+            self.real_connection.port
+            if not self.real_connection._tunnel_host
+            else self.real_connection._tunnel_port
+        )
         default_port = {"https": 443, "http": 80}[self._protocol]
-        return ":{}".format(port) if port != default_port else ""
+        return f":{port}" if port != default_port else ""

+    def _real_host(self):
+        """Returns the request host"""
+        if self.real_connection._tunnel_host:
+            # The real connection is to an HTTPS proxy
+            return self.real_connection._tunnel_host
+        else:
+            return self.real_connection.host
+
     def _uri(self, url):
         """Returns request absolute URI"""
         if url and not url.startswith("/"):
             # Then this must be a proxy request.
             return url
-        uri = "{}://{}{}{}".format(self._protocol, self.real_connection.host, self._port_postfix(), url)
+        uri = f"{self._protocol}://{self._real_host()}{self._port_postfix()}{url}"
         log.debug("Absolute URI: %s", uri)
         return uri

     def _url(self, uri):
         """Returns request selector url from absolute URI"""
-        prefix = "{}://{}{}".format(self._protocol, self.real_connection.host, self._port_postfix())
+        prefix = f"{self._protocol}://{self._real_host()}{self._port_postfix()}"
         return uri.replace(prefix, "", 1)

     def request(self, method, url, body=None, headers=None, *args, **kwargs):
         """Persist the request metadata in self._vcr_request"""
         self._vcr_request = Request(method=method, uri=self._uri(url), body=body, headers=headers or {})
-        log.debug("Got {}".format(self._vcr_request))
+        log.debug(f"Got {self._vcr_request}")

         # Note: The request may not actually be finished at this point, so
         # I'm not sending the actual request until getresponse(). This
@@ -191,7 +237,7 @@ class VCRConnection:
         of putheader() calls.
         """
         self._vcr_request = Request(method=method, uri=self._uri(url), body="", headers={})
-        log.debug("Got {}".format(self._vcr_request))
+        log.debug(f"Got {self._vcr_request}")

     def putheader(self, header, *values):
         self._vcr_request.headers[header] = values
@@ -223,19 +269,20 @@ class VCRConnection:
         # Check to see if the cassette has a response for this request. If so,
         # then return it
         if self.cassette.can_play_response_for(self._vcr_request):
-            log.info("Playing response for {} from cassette".format(self._vcr_request))
+            log.info(f"Playing response for {self._vcr_request} from cassette")
             response = self.cassette.play_response(self._vcr_request)
             return VCRHTTPResponse(response)
         else:
             if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request):
                 raise CannotOverwriteExistingCassetteException(
-                    cassette=self.cassette, failed_request=self._vcr_request
+                    cassette=self.cassette,
+                    failed_request=self._vcr_request,
                 )

             # Otherwise, we should send the request, then get the response
             # and return it.

-            log.info("{} not in cassette, sending to real server".format(self._vcr_request))
+            log.info(f"{self._vcr_request} not in cassette, sending to real server")

             # This is imported here to avoid circular import.
             # TODO(@IvanMalison): Refactor to allow normal import.
             from vcr.patch import force_reset
@@ -250,12 +297,13 @@ class VCRConnection:
             # get the response
             response = self.real_connection.getresponse()
+            response_data = response.data if hasattr(response, "data") else response.read()

             # put the response into the cassette
             response = {
                 "status": {"code": response.status, "message": response.reason},
                 "headers": serialize_headers(response),
-                "body": {"string": response.read()},
+                "body": {"string": response_data},
             }
             self.cassette.append(self._vcr_request, response)
         return VCRHTTPResponse(response)
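The `response_data` fallback introduced above prefers a cached `.data` attribute (as urllib3-style responses expose) and only falls back to `.read()`. A sketch with hypothetical stand-in response objects:

```python
class Urllib3Like:
    # Hypothetical response exposing a cached body, like urllib3's HTTPResponse.data
    data = b"cached"

class HttplibLike:
    # Hypothetical response exposing only a one-shot read(), like http.client
    def read(self):
        return b"streamed"

def body_of(response):
    # Same pattern as the diff: prefer .data when present, else consume .read().
    return response.data if hasattr(response, "data") else response.read()

assert body_of(Urllib3Like()) == b"cached"
assert body_of(HttplibLike()) == b"streamed"
```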
@@ -323,12 +371,8 @@ class VCRConnection:
         TODO: Separately setting the attribute on the two instances is not
         ideal. We should switch to a proxying implementation.
         """
-        try:
-            setattr(self.real_connection, name, value)
-        except AttributeError:
-            # raised if real_connection has not been set yet, such as when
-            # we're setting the real_connection itself for the first time
-            pass
+        with suppress(AttributeError):
+            setattr(self.real_connection, name, value)

         super().__setattr__(name, value)
@@ -355,6 +399,8 @@ class VCRHTTPConnection(VCRConnection):
     _baseclass = HTTPConnection
     _protocol = "http"
+    debuglevel = _baseclass.debuglevel
+    _http_vsn = _baseclass._http_vsn


 class VCRHTTPSConnection(VCRConnection):
@@ -363,3 +409,5 @@ class VCRHTTPSConnection(VCRConnection):
     _baseclass = HTTPSConnection
     _protocol = "https"
     is_verified = True
+    debuglevel = _baseclass.debuglevel
+    _http_vsn = _baseclass._http_vsn

View File

@@ -1,15 +1,15 @@
 """Stubs for aiohttp HTTP clients"""

 import asyncio
 import functools
-import logging
 import json
+import logging
+from collections.abc import Mapping
 from http.cookies import CookieError, Morsel, SimpleCookie

-from aiohttp import ClientConnectionError, ClientResponse, RequestInfo, streams
-from aiohttp import hdrs, CookieJar
+from aiohttp import ClientConnectionError, ClientResponse, CookieJar, RequestInfo, hdrs, streams
 from aiohttp.helpers import strip_auth_from_url
 from multidict import CIMultiDict, CIMultiDictProxy, MultiDict
-from typing import Union, Mapping
 from yarl import URL

 from vcr.errors import CannotOverwriteExistingCassetteException
@@ -36,7 +36,7 @@ class MockClientResponse(ClientResponse):
             session=None,
         )

-    async def json(self, *, encoding="utf-8", loads=json.loads, **kwargs):  # NOQA: E999
+    async def json(self, *, encoding="utf-8", loads=json.loads, **kwargs):
         stripped = self._body.strip()
         if not stripped:
             return None
@@ -67,7 +67,7 @@ def build_response(vcr_request, vcr_response, history):
         headers=_deserialize_headers(vcr_request.headers),
         real_url=URL(vcr_request.url),
     )
-    response = MockClientResponse(vcr_request.method, URL(vcr_response.get("url")), request_info=request_info)
+    response = MockClientResponse(vcr_request.method, URL(vcr_request.url), request_info=request_info)
     response.status = vcr_response["status"]["code"]
     response._body = vcr_response["body"].get("string", b"")
     response.reason = vcr_response["status"]["message"]
@@ -163,8 +163,7 @@ async def record_response(cassette, vcr_request, response):
     vcr_response = {
         "status": {"code": response.status, "message": response.reason},
         "headers": _serialize_headers(response.headers),
-        "body": body,  # NOQA: E999
-        "url": str(response.url),
+        "body": body,
     }

     cassette.append(vcr_request, vcr_response)
@@ -230,7 +229,7 @@ def _build_cookie_header(session, cookies, cookie_header, url):
     return c.output(header="", sep=";").strip()


-def _build_url_with_params(url_str: str, params: Mapping[str, Union[str, int, float]]) -> URL:
+def _build_url_with_params(url_str: str, params: Mapping[str, str | int | float]) -> URL:
     # This code is basically a copy&paste of aiohttp.
     # https://github.com/aio-libs/aiohttp/blob/master/aiohttp/client_reqrep.py#L225
     url = URL(url_str)
@@ -262,7 +261,7 @@ def vcr_request(cassette, real_request):
         vcr_request = Request(method, str(request_url), data, _serialize_headers(headers))

         if cassette.can_play_response_for(vcr_request):
-            log.info("Playing response for {} from cassette".format(vcr_request))
+            log.info(f"Playing response for {vcr_request} from cassette")
             response = play_responses(cassette, vcr_request, kwargs)
             for redirect in response.history:
                 self._cookie_jar.update_cookies(redirect.cookies, redirect.url)
@@ -274,7 +273,7 @@ def vcr_request(cassette, real_request):
         log.info("%s not in cassette, sending to real server", vcr_request)

-        response = await real_request(self, method, url, **kwargs)  # NOQA: E999
+        response = await real_request(self, method, url, **kwargs)
         await record_responses(cassette, vcr_request, response)
         return response

View File

@@ -1,17 +1,7 @@
 """Stubs for boto3"""

-try:
-    # boto using awsrequest
-    from botocore.awsrequest import AWSHTTPConnection as HTTPConnection
-    from botocore.awsrequest import AWSHTTPSConnection as VerifiedHTTPSConnection
-except ImportError:  # pragma: nocover
-    # boto using vendored requests
-    # urllib3 defines its own HTTPConnection classes, which boto3 goes ahead and assumes
-    # you're using. It includes some polyfills for newer features missing in older pythons.
-    try:
-        from urllib3.connectionpool import HTTPConnection, VerifiedHTTPSConnection
-    except ImportError:  # pragma: nocover
-        from requests.packages.urllib3.connectionpool import HTTPConnection, VerifiedHTTPSConnection
+from botocore.awsrequest import AWSHTTPConnection as HTTPConnection
+from botocore.awsrequest import AWSHTTPSConnection as VerifiedHTTPSConnection

 from ..stubs import VCRHTTPConnection, VCRHTTPSConnection

View File

@@ -1,8 +0,0 @@
-"""Stubs for boto"""
-
-from boto.https_connection import CertValidatingHTTPSConnection
-
-from ..stubs import VCRHTTPSConnection
-
-
-class VCRCertValidatingHTTPSConnection(VCRHTTPSConnection):
-    _baseclass = CertValidatingHTTPSConnection

View File

@@ -1,6 +1,5 @@
-from io import BytesIO
 import http.client
+from io import BytesIO

 """
 The python3 http.client api moved some stuff around, so this is an abstraction
@@ -13,7 +12,7 @@ def get_header(message, name):
 def get_header_items(message):
-    for (key, values) in get_headers(message):
+    for key, values in get_headers(message):
         for value in values:
             yield key, value

vcr/stubs/httpcore_stubs.py (new file)
View File

@@ -0,0 +1,215 @@
import asyncio
import functools
import logging
from collections import defaultdict
from collections.abc import AsyncIterable, Iterable
from httpcore import Response
from httpcore._models import ByteStream
from vcr.errors import CannotOverwriteExistingCassetteException
from vcr.filters import decode_response
from vcr.request import Request as VcrRequest
from vcr.serializers.compat import convert_body_to_bytes
_logger = logging.getLogger(__name__)
async def _convert_byte_stream(stream):
if isinstance(stream, Iterable):
return list(stream)
if isinstance(stream, AsyncIterable):
return [part async for part in stream]
raise TypeError(
f"_convert_byte_stream: stream must be Iterable or AsyncIterable, got {type(stream).__name__}",
)
def _serialize_headers(real_response):
"""
Some headers can appear multiple times, like "Set-Cookie".
Therefore serialize every header key to a list of values.
"""
headers = defaultdict(list)
for name, value in real_response.headers:
headers[name.decode("ascii")].append(value.decode("ascii"))
return dict(headers)
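The multi-value header serialization above (every header key maps to a list of values, so repeated headers like "Set-Cookie" survive the round trip) works on raw byte pairs; a standalone sketch with hypothetical input:

```python
from collections import defaultdict

# Hypothetical raw header pairs as httpcore exposes them (bytes tuples).
raw = [(b"set-cookie", b"a=1"), (b"set-cookie", b"b=2"), (b"host", b"example.com")]

headers = defaultdict(list)
for name, value in raw:
    headers[name.decode("ascii")].append(value.decode("ascii"))

# Both Set-Cookie values are preserved as a list.
assert dict(headers) == {"set-cookie": ["a=1", "b=2"], "host": ["example.com"]}
```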
async def _serialize_response(real_response):
# The reason_phrase may not exist
try:
reason_phrase = real_response.extensions["reason_phrase"].decode("ascii")
except KeyError:
reason_phrase = None
# Reading the response stream consumes the iterator, so we need to restore it afterwards
content = b"".join(await _convert_byte_stream(real_response.stream))
real_response.stream = ByteStream(content)
return {
"status": {"code": real_response.status, "message": reason_phrase},
"headers": _serialize_headers(real_response),
"body": {"string": content},
}
def _deserialize_headers(headers):
"""
httpcore accepts headers as list of tuples of header key and value.
"""
return [
(name.encode("ascii"), value.encode("ascii")) for name, values in headers.items() for value in values
]
def _deserialize_response(vcr_response):
# Cassette format generated for HTTPX requests by older versions of
# vcrpy. We restructure the content to resemble what a regular
# cassette looks like.
if "status_code" in vcr_response:
vcr_response = decode_response(
convert_body_to_bytes(
{
"headers": vcr_response["headers"],
"body": {"string": vcr_response["content"]},
"status": {"code": vcr_response["status_code"]},
},
),
)
extensions = None
else:
extensions = (
{"reason_phrase": vcr_response["status"]["message"].encode("ascii")}
if vcr_response["status"]["message"]
else None
)
return Response(
vcr_response["status"]["code"],
headers=_deserialize_headers(vcr_response["headers"]),
content=vcr_response["body"]["string"],
extensions=extensions,
)
async def _make_vcr_request(real_request):
# Reading the request stream consumes the iterator, so we need to restore it afterwards
body = b"".join(await _convert_byte_stream(real_request.stream))
real_request.stream = ByteStream(body)
uri = bytes(real_request.url).decode("ascii")
# As per HTTPX: If there are multiple headers with the same key, then we concatenate them with commas
headers = defaultdict(list)
for name, value in real_request.headers:
headers[name.decode("ascii")].append(value.decode("ascii"))
headers = {name: ", ".join(values) for name, values in headers.items()}
return VcrRequest(real_request.method.decode("ascii"), uri, body, headers)
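The request-side handling above differs from the response side: per the HTTPX note in the comment, repeated header keys are concatenated with commas rather than kept as lists. A sketch of that joining step with hypothetical input:

```python
from collections import defaultdict

# Hypothetical raw request header pairs with a repeated key.
raw = [(b"accept", b"text/html"), (b"accept", b"application/json")]

headers = defaultdict(list)
for name, value in raw:
    headers[name.decode("ascii")].append(value.decode("ascii"))
# Repeated keys are comma-joined into a single header value, as HTTPX does.
joined = {name: ", ".join(values) for name, values in headers.items()}

assert joined == {"accept": "text/html, application/json"}
```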
async def _vcr_request(cassette, real_request):
vcr_request = await _make_vcr_request(real_request)
if cassette.can_play_response_for(vcr_request):
return vcr_request, _play_responses(cassette, vcr_request)
if cassette.write_protected and cassette.filter_request(vcr_request):
raise CannotOverwriteExistingCassetteException(
cassette=cassette,
failed_request=vcr_request,
)
_logger.info("%s not in cassette, sending to real server", vcr_request)
return vcr_request, None
async def _record_responses(cassette, vcr_request, real_response):
cassette.append(vcr_request, await _serialize_response(real_response))
def _play_responses(cassette, vcr_request):
vcr_response = cassette.play_response(vcr_request)
real_response = _deserialize_response(vcr_response)
return real_response
async def _vcr_handle_async_request(
cassette,
real_handle_async_request,
self,
real_request,
):
vcr_request, vcr_response = await _vcr_request(cassette, real_request)
if vcr_response:
return vcr_response
real_response = await real_handle_async_request(self, real_request)
await _record_responses(cassette, vcr_request, real_response)
return real_response
def vcr_handle_async_request(cassette, real_handle_async_request):
@functools.wraps(real_handle_async_request)
def _inner_handle_async_request(self, real_request):
return _vcr_handle_async_request(
cassette,
real_handle_async_request,
self,
real_request,
)
return _inner_handle_async_request
def _run_async_function(sync_func, *args, **kwargs):
"""
Safely run an asynchronous function from a synchronous context.
Handles both cases:
- An event loop is already running.
- No event loop exists yet.
"""
try:
asyncio.get_running_loop()
except RuntimeError:
return asyncio.run(sync_func(*args, **kwargs))
else:
# If inside a running loop, create a task and wait for it
return asyncio.ensure_future(sync_func(*args, **kwargs))
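The `_run_async_function` helper above covers two situations: with no running event loop it blocks via `asyncio.run()`, and inside a running loop it schedules a task instead. A minimal standalone sketch of the same pattern, exercised from a plain synchronous context:

```python
import asyncio

async def double(x):
    return x * 2

def run_from_sync(coro_func, *args):
    # Same fallback as _run_async_function: block with asyncio.run() when no
    # loop is running, otherwise schedule a task on the active loop.
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        return asyncio.run(coro_func(*args))
    return asyncio.ensure_future(coro_func(*args))

# Called here with no loop running, so this blocks and returns the result.
assert run_from_sync(double, 21) == 42
```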
def _vcr_handle_request(cassette, real_handle_request, self, real_request):
vcr_request, vcr_response = _run_async_function(
_vcr_request,
cassette,
real_request,
)
if vcr_response:
return vcr_response
real_response = real_handle_request(self, real_request)
_run_async_function(_record_responses, cassette, vcr_request, real_response)
return real_response
def vcr_handle_request(cassette, real_handle_request):
@functools.wraps(real_handle_request)
def _inner_handle_request(self, real_request):
return _vcr_handle_request(cassette, real_handle_request, self, real_request)
return _inner_handle_request

View File

@@ -1,6 +1,7 @@
 """Stubs for httplib2"""

 from httplib2 import HTTPConnectionWithTimeout, HTTPSConnectionWithTimeout

 from ..stubs import VCRHTTPConnection, VCRHTTPSConnection
@@ -27,7 +28,6 @@ class VCRHTTPSConnectionWithTimeout(VCRHTTPSConnection, HTTPSConnectionWithTimeo
     _baseclass = HTTPSConnectionWithTimeout

     def __init__(self, *args, **kwargs):
-        # Delete the keyword arguments that HTTPSConnection would not recognize
         safe_keys = {
             "host",

View File

@@ -1,170 +0,0 @@
import functools
import logging
from unittest.mock import patch, MagicMock
import httpx
from vcr.request import Request as VcrRequest
from vcr.errors import CannotOverwriteExistingCassetteException
import inspect
_httpx_signature = inspect.signature(httpx.Client.request)
try:
HTTPX_REDIRECT_PARAM = _httpx_signature.parameters["follow_redirects"]
except KeyError:
HTTPX_REDIRECT_PARAM = _httpx_signature.parameters["allow_redirects"]
_logger = logging.getLogger(__name__)
def _transform_headers(httpx_response):
"""
Some headers can appear multiple times, like "Set-Cookie".
Therefore transform to every header key to list of values.
"""
out = {}
for key, var in httpx_response.headers.raw:
decoded_key = key.decode("utf-8")
out.setdefault(decoded_key, [])
out[decoded_key].append(var.decode("utf-8"))
return out
def _to_serialized_response(httpx_response):
return {
"status_code": httpx_response.status_code,
"http_version": httpx_response.http_version,
"headers": _transform_headers(httpx_response),
"content": httpx_response.content.decode("utf-8", "ignore"),
}
def _from_serialized_headers(headers):
"""
httpx accepts headers as list of tuples of header key and value.
"""
header_list = []
for key, values in headers.items():
for v in values:
header_list.append((key, v))
return header_list
@patch("httpx.Response.close", MagicMock())
@patch("httpx.Response.read", MagicMock())
def _from_serialized_response(request, serialized_response, history=None):
content = serialized_response.get("content").encode()
response = httpx.Response(
status_code=serialized_response.get("status_code"),
request=request,
headers=_from_serialized_headers(serialized_response.get("headers")),
content=content,
history=history or [],
)
response._content = content
return response
def _make_vcr_request(httpx_request, **kwargs):
body = httpx_request.read().decode("utf-8")
uri = str(httpx_request.url)
headers = dict(httpx_request.headers)
return VcrRequest(httpx_request.method, uri, body, headers)
def _shared_vcr_send(cassette, real_send, *args, **kwargs):
real_request = args[1]
vcr_request = _make_vcr_request(real_request, **kwargs)
if cassette.can_play_response_for(vcr_request):
return vcr_request, _play_responses(cassette, real_request, vcr_request, args[0], kwargs)
if cassette.write_protected and cassette.filter_request(vcr_request):
raise CannotOverwriteExistingCassetteException(cassette=cassette, failed_request=vcr_request)
_logger.info("%s not in cassette, sending to real server", vcr_request)
return vcr_request, None
def _record_responses(cassette, vcr_request, real_response):
for past_real_response in real_response.history:
past_vcr_request = _make_vcr_request(past_real_response.request)
cassette.append(past_vcr_request, _to_serialized_response(past_real_response))
if real_response.history:
# If there was a redirection keep we want the request which will hold the
# final redirect value
vcr_request = _make_vcr_request(real_response.request)
cassette.append(vcr_request, _to_serialized_response(real_response))
return real_response
def _play_responses(cassette, request, vcr_request, client, kwargs):
history = []
allow_redirects = kwargs.get(
HTTPX_REDIRECT_PARAM.name,
HTTPX_REDIRECT_PARAM.default,
)
vcr_response = cassette.play_response(vcr_request)
response = _from_serialized_response(request, vcr_response)
while allow_redirects and 300 <= response.status_code <= 399:
next_url = response.headers.get("location")
if not next_url:
break
vcr_request = VcrRequest("GET", next_url, None, dict(response.headers))
vcr_request = cassette.find_requests_with_most_matches(vcr_request)[0][0]
history.append(response)
# add cookies from response to session cookie store
client.cookies.extract_cookies(response)
vcr_response = cassette.play_response(vcr_request)
response = _from_serialized_response(vcr_request, vcr_response, history)
return response
async def _async_vcr_send(cassette, real_send, *args, **kwargs):
vcr_request, response = _shared_vcr_send(cassette, real_send, *args, **kwargs)
if response:
# add cookies from response to session cookie store
args[0].cookies.extract_cookies(response)
return response
real_response = await real_send(*args, **kwargs)
return _record_responses(cassette, vcr_request, real_response)
def async_vcr_send(cassette, real_send):
@functools.wraps(real_send)
def _inner_send(*args, **kwargs):
return _async_vcr_send(cassette, real_send, *args, **kwargs)
return _inner_send
def _sync_vcr_send(cassette, real_send, *args, **kwargs):
vcr_request, response = _shared_vcr_send(cassette, real_send, *args, **kwargs)
if response:
# add cookies from response to session cookie store
args[0].cookies.extract_cookies(response)
return response
real_response = real_send(*args, **kwargs)
return _record_responses(cassette, vcr_request, real_response)
def sync_vcr_send(cassette, real_send):
@functools.wraps(real_send)
def _inner_send(*args, **kwargs):
return _sync_vcr_send(cassette, real_send, *args, **kwargs)
return _inner_send
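The replay path above follows recorded redirects by reading each response's `location` header and looking up the next recorded exchange. A minimal, self-contained sketch of that loop, using plain dicts in place of the cassette and httpx objects (all names and URLs here are illustrative, not part of vcrpy's API):

```python
# Sketch of the redirect-replay loop in _play_responses, with plain dicts
# standing in for recorded cassette entries and responses.

def play_with_redirects(recorded, first_url, allow_redirects=True):
    """Replay recorded responses, following Location headers as above."""
    history = []
    response = recorded[first_url]
    while allow_redirects and 300 <= response["status_code"] <= 399:
        next_url = response["headers"].get("location")
        if not next_url:
            break
        # Each intermediate redirect response is kept in the history list.
        history.append(response)
        response = recorded[next_url]
    return response, history


recorded = {
    "https://example.test/start": {
        "status_code": 302,
        "headers": {"location": "https://example.test/final"},
    },
    "https://example.test/final": {"status_code": 200, "headers": {}},
}

final, history = play_with_redirects(recorded, "https://example.test/start")
```

As in `_play_responses`, when redirect following is disabled the first recorded response is returned unchanged and the history stays empty.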


@@ -1,9 +1,6 @@
"""Stubs for requests""" """Stubs for requests"""
try: from urllib3.connection import HTTPConnection, VerifiedHTTPSConnection
from urllib3.connectionpool import HTTPConnection, VerifiedHTTPSConnection
except ImportError:
from requests.packages.urllib3.connectionpool import HTTPConnection, VerifiedHTTPSConnection
from ..stubs import VCRHTTPConnection, VCRHTTPSConnection from ..stubs import VCRHTTPConnection, VCRHTTPSConnection


@@ -1,4 +1,5 @@
"""Stubs for tornado HTTP clients""" """Stubs for tornado HTTP clients"""
import functools import functools
from io import BytesIO from io import BytesIO
@@ -29,9 +30,9 @@ def vcr_fetch_impl(cassette, real_fetch_impl):
                 request,
                 599,
                 error=Exception(
-                    "The request (%s) uses AsyncHTTPClient functionality "
+                    f"The request ({request!r}) uses AsyncHTTPClient functionality "
                     "that is not yet supported by VCR.py. Please make the "
-                    "request outside a VCR.py context." % repr(request)
+                    "request outside a VCR.py context.",
                 ),
                 request_time=self.io_loop.time() - request.start_time,
             )
@@ -65,14 +66,15 @@ def vcr_fetch_impl(cassette, real_fetch_impl):
                 request,
                 599,
                 error=CannotOverwriteExistingCassetteException(
-                    cassette=cassette, failed_request=vcr_request
+                    cassette=cassette,
+                    failed_request=vcr_request,
                 ),
                 request_time=self.io_loop.time() - request.start_time,
             )
             return callback(response)

         def new_callback(response):
-            headers = [(k, response.headers.get_list(k)) for k in response.headers.keys()]
+            headers = [(k, response.headers.get_list(k)) for k in response.headers]
             vcr_response = {
                 "status": {"code": response.code, "message": response.reason},


@@ -1,6 +1,7 @@
"""Stubs for urllib3""" """Stubs for urllib3"""
from urllib3.connectionpool import HTTPConnection, VerifiedHTTPSConnection from urllib3.connection import HTTPConnection, VerifiedHTTPSConnection
from ..stubs import VCRHTTPConnection, VCRHTTPSConnection from ..stubs import VCRHTTPConnection, VCRHTTPSConnection
# urllib3 defines its own HTTPConnection classes. It includes some polyfills # urllib3 defines its own HTTPConnection classes. It includes some polyfills

vcr/unittest.py (new file, 39 lines)

@@ -0,0 +1,39 @@
import inspect
import os
import unittest

from .config import VCR


class VCRMixin:
    """A TestCase mixin that provides VCR integration."""

    vcr_enabled = True

    def setUp(self):
        super().setUp()
        if self.vcr_enabled:
            kwargs = self._get_vcr_kwargs()
            myvcr = self._get_vcr(**kwargs)
            cm = myvcr.use_cassette(self._get_cassette_name())
            self.cassette = cm.__enter__()
            self.addCleanup(cm.__exit__, None, None, None)

    def _get_vcr(self, **kwargs):
        if "cassette_library_dir" not in kwargs:
            kwargs["cassette_library_dir"] = self._get_cassette_library_dir()
        return VCR(**kwargs)

    def _get_vcr_kwargs(self, **kwargs):
        return kwargs

    def _get_cassette_library_dir(self):
        testdir = os.path.dirname(inspect.getfile(self.__class__))
        return os.path.join(testdir, "cassettes")

    def _get_cassette_name(self):
        return f"{self.__class__.__name__}.{self._testMethodName}.yaml"


class VCRTestCase(VCRMixin, unittest.TestCase):
    pass


@@ -1,9 +1,5 @@
 import types
-
-try:
-    from collections.abc import Mapping, MutableMapping
-except ImportError:
-    from collections import Mapping, MutableMapping
+from collections.abc import Mapping, MutableMapping

 # Shamelessly stolen from https://github.com/kennethreitz/requests/blob/master/requests/structures.py
@@ -31,7 +27,7 @@ class CaseInsensitiveDict(MutableMapping):
""" """
def __init__(self, data=None, **kwargs): def __init__(self, data=None, **kwargs):
self._store = dict() self._store = {}
if data is None: if data is None:
data = {} data = {}
self.update(data, **kwargs) self.update(data, **kwargs)
@@ -93,9 +89,28 @@ def compose(*functions):
     return composed


+def _is_nonsequence_iterator(obj):
+    return hasattr(obj, "__iter__") and not isinstance(
+        obj,
+        (bytearray, bytes, dict, list, str),
+    )
+
+
 def read_body(request):
     if hasattr(request.body, "read"):
         return request.body.read()
+    if _is_nonsequence_iterator(request.body):
+        body = list(request.body)
+        if body:
+            if isinstance(body[0], str):
+                return "".join(body).encode("utf-8")
+            elif isinstance(body[0], (bytes, bytearray)):
+                return b"".join(body)
+            elif isinstance(body[0], int):
+                return bytes(body)
+            else:
+                raise ValueError(f"Body type {type(body[0])} not supported")
+        return b""
     return request.body
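The new branch drains generator-style bodies (which `hasattr(obj, "__iter__")` matches but lists, dicts, and strings do not) and joins the chunks by type. The helper and `read_body` below are copied from the diff above and exercised with a stand-in request object (`SimpleNamespace` here is just a convenient dummy, not vcrpy's request type):

```python
from types import SimpleNamespace

# Copies of the helper and read_body from the diff above.

def _is_nonsequence_iterator(obj):
    return hasattr(obj, "__iter__") and not isinstance(
        obj,
        (bytearray, bytes, dict, list, str),
    )


def read_body(request):
    if hasattr(request.body, "read"):
        return request.body.read()
    if _is_nonsequence_iterator(request.body):
        body = list(request.body)
        if body:
            if isinstance(body[0], str):
                return "".join(body).encode("utf-8")
            elif isinstance(body[0], (bytes, bytearray)):
                return b"".join(body)
            elif isinstance(body[0], int):
                return bytes(body)
            else:
                raise ValueError(f"Body type {type(body[0])} not supported")
        return b""
    return request.body


# A generator of str chunks is joined and UTF-8 encoded...
str_body = read_body(SimpleNamespace(body=(chunk for chunk in ["ab", "cd"])))
# ...an iterator of bytes chunks is concatenated...
bytes_body = read_body(SimpleNamespace(body=iter([b"ab", b"cd"])))
# ...and plain sequence bodies still pass through unchanged.
plain_body = read_body(SimpleNamespace(body=b"raw"))
```

Note that a `list` body is deliberately excluded by `_is_nonsequence_iterator`, so only one-shot iterators (generators, file-like chunk streams) take the new path.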