mirror of https://github.com/kevin1024/vcrpy.git synced 2025-12-09 01:03:24 +00:00

Compare commits


164 Commits

Author SHA1 Message Date
dependabot[bot]
ac70eaa17f build(deps): bump astral-sh/setup-uv from 5 to 6
Bumps [astral-sh/setup-uv](https://github.com/astral-sh/setup-uv) from 5 to 6.
- [Release notes](https://github.com/astral-sh/setup-uv/releases)
- [Commits](https://github.com/astral-sh/setup-uv/compare/v5...v6)

---
updated-dependencies:
- dependency-name: astral-sh/setup-uv
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-11 16:57:56 -03:00
dependabot[bot]
d50f3385a6 build(deps): bump actions/checkout from 4 to 5
Bumps [actions/checkout](https://github.com/actions/checkout) from 4 to 5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '5'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-08-26 09:37:05 -03:00
Jair Henrique
14db4de224 Use uv on CI 2025-08-26 09:35:26 -03:00
Sebastian Pipping
2c4df79498 Merge pull request #917 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2025-08-01 19:17:52 +02:00
pre-commit
1456673cb4 pre-commit: Autoupdate 2025-08-01 16:06:54 +00:00
pre-commit
19bd4e012c pre-commit: Autoupdate 2025-03-23 16:52:07 -03:00
Karolina Surma
558c7fc625 Import iscoroutinefunction() from inspect rather than asyncio
The asyncio function is deprecated starting from Python 3.14 and
will be removed in Python 3.16.
2025-03-23 16:49:33 -03:00
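A minimal sketch of the import swap this commit describes (illustrative; the actual call sites inside vcr are not shown):

```py
# Deprecated since Python 3.14 and scheduled for removal in Python 3.16:
# from asyncio import iscoroutinefunction

# Same check, from the preferred module:
from inspect import iscoroutinefunction

async def fetch():
    ...

# Returns True for coroutine functions, just like the asyncio variant.
assert iscoroutinefunction(fetch)
```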
Sebastian Pipping
8217a4c21b Merge pull request #903 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2025-01-17 21:03:56 +01:00
pre-commit
bd0aa59cd2 pre-commit: Autoupdate 2025-01-17 20:50:18 +01:00
Sebastian Pipping
9a37817a3a Merge pull request #904 from kevin1024/fix-ci
Fix CI for error `unshare: write failed /proc/self/uid_map: Operation not permitted` with Ubuntu >=24.04
2025-01-17 20:49:41 +01:00
Sebastian Pipping
b4c65bd677 main.yml: Allow creation of user namespaces to unshare in Ubuntu >=24.04 2025-01-17 20:36:18 +01:00
Sebastian Pipping
93bc59508c Pin GitHub Actions and Read the Docs to explicit Ubuntu 24.04 2025-01-17 20:16:30 +01:00
pre-commit
e313a9cd52 pre-commit: Autoupdate 2025-01-12 09:28:43 -03:00
Sebastian Pipping
5f1b20c4ca Merge pull request #763 from danielnsilva/drop-unused-requests
Add an option to remove unused requests from cassette
2025-01-11 20:51:28 +01:00
Daniel Silva
cd31d71901 refactor: move logic for building used interactions dict before saving 2025-01-11 16:56:59 +00:00
Daniel Silva
4607ca1102 fix: add drop_unused_requests check in cassette saving logic 2025-01-11 16:55:04 +00:00
Daniel Silva
e3ced4385e docs: update example in advanced.rst
Co-authored-by: Sebastian Pipping <sebastian@pipping.org>
2025-01-11 11:21:55 -05:00
Jair Henrique
80099ac6d7 Clean pytest configurations 2025-01-08 16:16:56 -03:00
Sebastian Pipping
440bc20faf Merge pull request #809 from alga/alga-https-proxy
Fix HTTPS proxy handling
2025-01-08 00:20:54 +01:00
Albertas Agejevas
3ddff27cda Remove redundant assertions.
They are covered by the next line.
2025-01-07 21:48:24 +02:00
Albertas Agejevas
30b423e8c0 Use mode="none" in proxy tests as suggested by @hartwork. 2025-01-07 21:48:24 +02:00
Albertas Agejevas
752ba0b749 Fix HTTPS proxy handling. 2025-01-07 21:48:24 +02:00
Albertas Agejevas
c16e526d6a Integration test for HTTPS proxy handling. 2025-01-07 21:48:24 +02:00
Daniel Silva
d64cdd337b style: fix formatting issues to comply with pre-commit hooks 2025-01-04 23:45:43 +00:00
Martin Brunthaler
ac230b76af Call urllib.parse less frequently 2025-01-04 15:42:49 -03:00
pre-commit
965f3658d5 pre-commit: Autoupdate 2025-01-04 15:21:25 -03:00
Jair Henrique
6465a5995b Fix docs conf 2025-01-04 15:19:50 -03:00
dependabot[bot]
69ca261a88 build(deps): bump sphinx-rtd-theme from 2.0.0 to 3.0.2
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 2.0.0 to 3.0.2.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/2.0.0...3.0.2)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-04 15:19:50 -03:00
kevin1024
3278619dcc Release v7.0.0 2024-12-30 18:59:50 -05:00
Aleksei Kozharin
3fb62e0f9b fix: correctly handle asyncio.run when loop exists 2024-12-30 13:34:03 -03:00
dependabot[bot]
81978659f1 build(deps): update sphinx requirement from <8 to <9
Updates the requirements on [sphinx](https://github.com/sphinx-doc/sphinx) to permit the latest version.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/v8.0.2/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v0.1.61611...v8.0.2)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-12-28 14:03:41 -03:00
pre-commit
be651bd27c pre-commit: Autoupdate 2024-12-28 13:48:49 -03:00
Jair Henrique
a6698ed060 Fix aiohttp tests 2024-12-28 13:42:54 -03:00
Igor Gumenyuk
48d0a2e453 Fixed missing version_string attribute when used with urllib3>=2.3.0
urllib3 v2.3.0 introduced the attribute `version_string` (https://github.com/urllib3/urllib3/pull/3316/files). This attribute is missing from `VCRHTTPResponse`, which causes errors like AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

This fixes https://github.com/kevin1024/vcrpy/issues/888
2024-12-28 13:39:53 -03:00
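A hedged sketch of the kind of shim described above; `version_string` is the attribute urllib3 2.3.0 expects, but the property body here is an illustration, not necessarily the exact vcrpy patch:

```py
class VCRHTTPResponse:
    ...

    @property
    def version_string(self):
        # urllib3 >= 2.3.0 reads this attribute from responses; derive it
        # from the numeric HTTP version (11 -> "HTTP/1.1", 10 -> "HTTP/1.0").
        return "HTTP/1.1" if self.version == 11 else "HTTP/1.0"
```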
Jair Henrique
5b858b132d Fix lint 2024-12-28 13:25:04 -03:00
Jair Henrique
c8d99a99ec Fix ruff configuration 2024-12-28 13:21:15 -03:00
Sebastian Pipping
ce27c63685 Merge pull request #736 from kevin1024/drop-python38
[14 Oct 2024] Drop python 3.8 support
2024-10-13 04:05:13 +02:00
Jair Henrique
ab8944d3ca Drop python 3.8 support 2024-10-12 22:44:42 -03:00
Sebastian Pipping
c6a7f4ae15 Merge pull request #872 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-10-11 01:22:08 +02:00
Kevin McCarthy
1d100dda25 release v6.0.2 2024-10-07 08:55:44 -04:00
pre-commit
7275e5d65d pre-commit: Autoupdate 2024-10-04 16:05:32 +00:00
Sebastian Pipping
c6be705fb4 Merge pull request #871 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-09-20 22:37:01 +02:00
pre-commit
10b7f4efb3 pre-commit: Autoupdate 2024-09-20 16:05:30 +00:00
Sebastian Pipping
7a6ef00f4d Merge pull request #870 from kevin1024/dependabot/github_actions/peter-evans/create-pull-request-7
build(deps): bump peter-evans/create-pull-request from 6 to 7
2024-09-19 00:33:49 +02:00
Sebastian Pipping
3bf6ac7184 Merge pull request #867 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-09-19 00:32:45 +02:00
pre-commit
983b2202ed pre-commit: Autoupdate 2024-09-19 00:28:52 +02:00
dependabot[bot]
15a6b71997 build(deps): bump peter-evans/create-pull-request from 6 to 7
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 6 to 7.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](https://github.com/peter-evans/create-pull-request/compare/v6...v7)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-18 22:27:51 +00:00
Sebastian Pipping
1ca708dcff Merge pull request #862 from kevin1024/test-on-313
Start testing with CPython 3.13
2024-09-19 00:00:50 +02:00
Thomas Grainger
f5597fa6c1 use pytest_httpbin.certs.where() for cafile 2024-09-18 22:36:17 +01:00
Thomas Grainger
2b3247b3df remove redundant load_cert_chain 2024-09-18 22:35:15 +01:00
Thomas Grainger
d123a5e8d0 replace fixture with constant 2024-09-18 22:34:48 +01:00
Thomas Grainger
e2815fbc88 move httpbin_ssl_context fixture into the one place it's used 2024-09-18 22:31:36 +01:00
Thomas Grainger
f9d4500c6e test on 3.13 2024-09-18 19:24:37 +01:00
Sebastian Pipping
71eb624708 Merge pull request #851 from sathieu/consume-body-once2
Ensure body is consumed only once (alternative to #847)
2024-08-02 19:16:38 +02:00
Sebastian Pipping
dc449715c1 Merge pull request #864 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-08-02 19:15:52 +02:00
pre-commit
275b9085f3 pre-commit: Autoupdate 2024-08-02 16:05:06 +00:00
Thomas Grainger
35650b141b Merge pull request #856 from connortann/patch-1
Fix package install: remove use of deprecated setuptools.command.test
2024-07-29 17:35:23 +01:00
Thomas Grainger
9c8b679136 remove unused sys import from setup.py 2024-07-29 17:27:19 +01:00
connortann
fab082eff5 Update setup.py: remove use of deprecated setuptools.command.test 2024-07-29 11:10:18 +01:00
Sebastian Pipping
ffc04f9128 Merge pull request #853 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-07-27 01:25:13 +02:00
pre-commit
4d84da1809 pre-commit: Autoupdate 2024-07-26 16:05:08 +00:00
Mathieu Parent
241b0bbd91 Ensure body is consumed only once
Fixes: #846
Signed-off-by: Mathieu Parent <math.parent@gmail.com>
2024-07-21 22:53:45 +02:00
Sebastian Pipping
042e16c3e4 Merge pull request #850 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-07-13 21:52:05 +02:00
pre-commit
acef3f49bf pre-commit: Autoupdate 2024-07-05 16:05:04 +00:00
Sebastian Pipping
9cfa6c5173 Merge pull request #845 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-06-30 15:24:31 +02:00
Sebastian Pipping
39a86ba3cf pre-commit: Fix Ruff invocation for >=0.5.0
Related changes in Ruff:
https://github.com/astral-sh/ruff/pull/9687
2024-06-30 15:18:35 +02:00
pre-commit
543c72ba51 pre-commit: Autoupdate 2024-06-28 16:04:49 +00:00
Sebastian Pipping
86b114f2f5 Merge pull request #831 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-06-02 15:33:33 +02:00
Sebastian Pipping
4b06f3dba1 Merge pull request #836 from chuckwondo/fix-testing-instructions
Fix typos in testing instructions
2024-06-02 15:29:57 +02:00
pre-commit
1c6503526b pre-commit: Autoupdate 2024-06-02 15:27:57 +02:00
Sebastian Pipping
c9c05682cb Merge pull request #843 from kevin1024/fix-ci-through-recent-setuptools
Fix CI through recent Setuptools + start building once a week to notice CI issues earlier
2024-06-02 15:27:23 +02:00
Sebastian Pipping
39c8648aa7 main.yml: Make sure that build issues do not go unnoticed for >7 days 2024-06-02 15:23:51 +02:00
Sebastian Pipping
dfff84d5bb main.yml: Fix "pypy-3.9,urllib3<2" CI through recent Setuptools
Command "pip install codecov '.[tests]' 'urllib3<2'" was failing
with error:
> hpy 0.9.0 has requirement setuptools>=64.0, but you have
> setuptools 58.1.0.
2024-06-02 15:23:08 +02:00
Chuck Daniels
40ac0de652 Fix typos in test commands 2024-04-21 19:58:00 -04:00
Sebastian Pipping
f3147f574b Merge pull request #830 from pjonsson/permit-urllib3
Permit urllib3 >=2 for non-PyPy Python >=3.10 in order to help users of Poetry
2024-03-11 19:35:43 +01:00
Sebastian Pipping
298a6933ff Merge pull request #829 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-03-10 23:17:01 +01:00
Peter A. Jonsson
52da776b59 Permit urllib3 2.x for non-PyPy Python >=3.10
Poetry makes platform indepdent lock files,
so the PyPy marker is there even when using
CPython >= 3.10.

Add a third constraint that permits any urllib3
version when using Python >=3.10 and some
other implementation than PyPy.
2024-03-10 22:37:04 +01:00
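For reference, the three install_requires markers this commit describes match the setup.py hunk further down in this compare view:

```py
install_requires = [
    # keep urllib3 1.x on older CPython and on PyPy
    "urllib3 <2; python_version <'3.10'",
    "urllib3 <2; platform_python_implementation =='PyPy'",
    # the new third constraint: any urllib3 on non-PyPy Python >= 3.10
    "urllib3; platform_python_implementation !='PyPy' and python_version >='3.10'",
]
```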
pre-commit
8842fb1c3a pre-commit: Autoupdate 2024-03-08 16:04:43 +00:00
Sebastian Pipping
6c4ba172d8 Merge pull request #827 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-03-06 14:40:32 +01:00
pre-commit
c88f2c0dab pre-commit: Mass-apply ruff formatter 2024-03-06 14:35:01 +01:00
pre-commit
3fd6b1c0b4 pre-commit: Autoupdate 2024-03-01 16:04:42 +00:00
Sebastian Pipping
c6d87309f4 Merge pull request #823 from mgorny/httpbin-compat
Improve test compatibility with legacy httpbin index
2024-02-28 20:38:23 +01:00
Sebastian Pipping
1fb9179cf9 Merge pull request #813 from kevin1024/precommit-autoupdate
pre-commit: Autoupdate
2024-02-28 20:37:08 +01:00
pre-commit
a58e0d8830 pre-commit: Autoupdate 2024-02-23 16:04:44 +00:00
dependabot[bot]
acc101412d build(deps): bump pre-commit/action from 3.0.0 to 3.0.1
Bumps [pre-commit/action](https://github.com/pre-commit/action) from 3.0.0 to 3.0.1.
- [Release notes](https://github.com/pre-commit/action/releases)
- [Commits](https://github.com/pre-commit/action/compare/v3.0.0...v3.0.1)

---
updated-dependencies:
- dependency-name: pre-commit/action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-21 10:39:22 -03:00
Michał Górny
e60dafb8dc Improve test compatibility with legacy httpbin index
Make the tests slightly more flexible to match both the flasgger-based
and legacy httpbin index.  This is needed for compatibility with
https://github.com/psf/httpbin/pull/44 when flasgger is not installed
(e.g. on architectures that are not supported by Rust).
2024-02-16 19:33:41 +01:00
dependabot[bot]
3ce5979acb build(deps): bump peter-evans/create-pull-request from 5 to 6
Bumps [peter-evans/create-pull-request](https://github.com/peter-evans/create-pull-request) from 5 to 6.
- [Release notes](https://github.com/peter-evans/create-pull-request/releases)
- [Commits](https://github.com/peter-evans/create-pull-request/compare/v5...v6)

---
updated-dependencies:
- dependency-name: peter-evans/create-pull-request
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-06 10:13:52 -03:00
Kevin McCarthy
68038d0559 release v6.0.1 2024-01-25 11:13:56 -05:00
Thomas Grainger
f76289aa78 Merge pull request #811 from kevin1024/graingert-patch-1
return values from generator decorator
2024-01-24 13:22:32 +00:00
Thomas Grainger
6252b92f50 add test for generator decorator return 2024-01-23 17:29:35 +00:00
Kevin McCarthy
1e3a5ac753 Release v6.0.0 2024-01-23 10:54:46 -05:00
Thomas Grainger
b1c45cd249 return values from generator decorator 2024-01-23 15:07:23 +00:00
Thomas Grainger
3a5ff1c1ce Merge pull request #810 from graingert/fix-tornado-related-deprecation-warnings
fix tornado related warnings by replacing pytest-tornado with a simple coroutine runner
2024-01-23 14:02:31 +00:00
Thomas Grainger
bf80673454 fix tornado related warnings 2024-01-23 13:02:29 +00:00
Thomas Grainger
8028420cbb Merge pull request #801 from graingert/fix-resource-warning-2 2024-01-23 12:39:35 +00:00
Thomas Grainger
784b2dcb29 tornado test_redirects is no longer an online test 2024-01-23 12:33:45 +00:00
Thomas Grainger
42b4a5d2fa move off of mockbin on tornado tests also 2024-01-23 12:33:15 +00:00
Thomas Grainger
b7f6c2fce2 mark tornado tests as online 2024-01-23 12:24:48 +00:00
Thomas Grainger
6d7a842a33 fix test_tornado_exception_can_be_caught RuntimeError: generator raised StopIteration 2024-01-23 12:24:48 +00:00
Thomas Grainger
db1f5b0dee tornado 6 changes raise_error behaviour 2024-01-23 12:12:52 +00:00
Thomas Grainger
c6667ac56c restore scheme fixture for tests 2024-01-23 12:12:05 +00:00
Thomas Grainger
a093fb177d add new deprecation warnings for tornado tests 2024-01-23 12:09:39 +00:00
Thomas Grainger
666686b542 restore pytest-tornado 2024-01-23 11:12:02 +00:00
Thomas Grainger
5104b1f462 Merge branch 'master' of github.com:kevin1024/vcrpy into fix-resource-warning-2 2024-01-23 11:03:49 +00:00
Jair Henrique
62fe272a8e Remove tox reference from contributing docs 2024-01-22 23:17:25 -03:00
Jair Henrique
f9b69d8da7 Remove tox reference from runtests.sh file 2024-01-22 23:17:25 -03:00
Jair Henrique
cb77cb8f69 Remove tox.ini file 2024-01-22 23:17:25 -03:00
Jair Henrique
e37fc9ab6e Refactor ci main workflow to use matrix to run tests without tox 2024-01-22 23:17:25 -03:00
Jair Henrique
abbb50135f Organize dependencies for tests and extras requires 2024-01-22 23:17:25 -03:00
Jair Henrique
0594de9b3e Remove tox.ini from MANIFEST.in file 2024-01-22 23:17:25 -03:00
Jair Henrique
53f686aa5b Refactor test to not use tox.ini file 2024-01-22 23:17:25 -03:00
pre-commit
1677154f04 pre-commit: Autoupdate 2024-01-22 23:14:05 -03:00
Allan Crooks
54bc6467eb Run linters. 2024-01-22 23:13:10 -03:00
Allan Crooks
c5487384ee Fix handling of encoded content in HTTPX stub.
Also copied over and adjusted some of the tests from
test_requests.py relating to gzip handling to show
that the HTTPX stub behaves consistently with the
requests stub.
2024-01-22 23:13:10 -03:00
Allan Crooks
5cf23298ac HTTPX stub now generates cassettes in the same format as other stubs.
As part of this, I've removed the tests which inspect the
data type of the response content in the cassette. That
behaviour should be controlled via the inbuilt serializers.
2024-01-22 23:13:10 -03:00
Allan Crooks
5fa7010712 Allow HTTPX stub to read cassettes generated by other stubs.
The incompatibility was due to a custom format being defined in the HTTPX stub.
2024-01-22 23:13:10 -03:00
pre-commit
f1e0241673 pre-commit: Autoupdate 2024-01-05 16:36:48 -03:00
pre-commit
a3a255d606 pre-commit: Autoupdate 2024-01-02 10:58:59 -03:00
dependabot[bot]
0782382982 build(deps): bump actions/checkout from 3 to 4
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-18 12:16:06 -03:00
dependabot[bot]
395d2be295 build(deps): bump actions/setup-python from 4 to 5
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4 to 5.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-18 11:08:49 -03:00
Thomas Grainger
ee6e7905e9 Merge branch 'master' into fix-resource-warning-2 2023-12-15 19:13:39 +00:00
Thomas Grainger
cc4d03c62e close unremoved connections (pool already removed the connection) 2023-12-15 19:11:25 +00:00
Thomas Grainger
8e13af2ee9 use context manager for requests.Session 2023-12-15 18:38:56 +00:00
pre-commit
b522d3f0a3 pre-commit: Autoupdate 2023-12-15 14:46:28 -03:00
Thomas Grainger
d39c26b358 remember to close removed connections 2023-12-15 14:26:44 +00:00
Thomas Grainger
d76c243513 Revert "Fix ResourceWarning unclosed socket"
This reverts commit f4144359f6.
2023-12-15 14:01:14 +00:00
Thomas Grainger
5cff354ec8 Revert "fix a KeyError"
This reverts commit fa789e975b.
2023-12-15 14:01:10 +00:00
Thomas Grainger
80614dbd00 fix resource warning due to pytest-asyncio 2023-12-15 13:51:52 +00:00
Thomas Grainger
356ff4122c fix sync do_request().stream 2023-12-15 13:40:06 +00:00
Thomas Grainger
cf765928ac remove redundant contextlib import 2023-12-15 13:35:31 +00:00
Thomas Grainger
73d11e80eb fix httpx resource warnings 2023-12-15 13:34:52 +00:00
Thomas Grainger
97de8a0fce ignore warning from dateutil 2023-12-15 13:09:13 +00:00
Thomas Grainger
895ae205ca use asyncio.run to run coroutines 2023-12-15 11:39:51 +00:00
Thomas Grainger
f075c8b0b4 close aiohttp session on errors 2023-12-15 11:39:16 +00:00
Thomas Grainger
3919cb2573 remember to close the VCRHTTPSConnection 2023-12-15 11:08:49 +00:00
Thomas Grainger
bddec2e62a use socketserver.ThreadingTCPServer as a contextmanager 2023-12-15 11:08:31 +00:00
Thomas Grainger
fa789e975b fix a KeyError 2023-12-15 11:07:56 +00:00
Thomas Grainger
556fd0166c enable filterwarnings=error 2023-12-15 11:07:40 +00:00
Thomas Grainger
17c78bff9e Merge branch 'master' of github.com:kevin1024/vcrpy into fix-resource-warning-2 2023-12-15 10:48:27 +00:00
Sebastian Pipping
713cb36d35 Merge pull request #800 from kevin1024/pre-commit-ci
Start using pre-commit in CI
2023-12-12 20:28:19 +01:00
Sebastian Pipping
b0cb8765d5 pre-commit: Add --show-source to ruff
Suggested by Jair Henrique.
2023-12-12 20:01:09 +01:00
Sebastian Pipping
97ad51fe6c pre-commit: Enable trailing-whitespace 2023-12-12 20:01:09 +01:00
Sebastian Pipping
1dd9cbde8b pre-commit: Mass-apply trailing-whitespace 2023-12-12 20:01:09 +01:00
Sebastian Pipping
962284072b pre-commit: Enable end-of-file-fixer 2023-12-12 19:01:50 +01:00
Sebastian Pipping
e9102b2bb4 pre-commit: Mass-apply end-of-file-fixer 2023-12-12 19:01:50 +01:00
Sebastian Pipping
957c8bd7a3 pre-commit: Protect against accidental merge conflict markers 2023-12-12 19:01:50 +01:00
Sebastian Pipping
2d5f8a499e lint.yml: Drop as superseded by pre-commit.yml 2023-12-12 19:01:50 +01:00
Sebastian Pipping
e5555a5d5b pre-commit: Make CI keep the config up to date via pull requests 2023-12-12 19:01:50 +01:00
Sebastian Pipping
a542567e4a pre-commit: Integrate with GitHub Actions CI 2023-12-12 19:01:50 +01:00
Sebastian Pipping
3168e7813e pre-commit: Enable Ruff and Ruff Black-style formatting 2023-12-12 19:00:57 +01:00
Jair Henrique
88cf01aa14 Fix format code 2023-12-12 14:24:22 -03:00
Parker Hancock
85ae012d9c fix linting 2023-12-12 14:24:22 -03:00
Parker Hancock
db1e9e7180 make cassettes human readable 2023-12-12 14:24:22 -03:00
Jair Henrique
dbf7a3337b Show ruff diff errors 2023-12-12 13:58:25 -03:00
dependabot[bot]
dd97b02b72 build(deps): bump sphinx-rtd-theme from 1.3.0 to 2.0.0
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.3.0 to 2.0.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.3.0...2.0.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-11 11:27:04 -03:00
dependabot[bot]
e8346ad30e build(deps): bump actions/setup-python from 4 to 5
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4 to 5.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-12-11 10:42:13 -03:00
Sebastian Pipping
6a31904333 Merge pull request #789 from kevin1024/fix-imports-to-assertions-py
tests: Fix imports to `tests/assertions.py` (fixes #773)
2023-12-11 00:58:01 +01:00
Jair Henrique
796dc8de7e Move lint from tox to gh action 2023-12-10 20:48:01 -03:00
Sebastian Pipping
ecb5d84f0f tests: Fix imports to tests/assertions.py 2023-12-11 00:36:46 +01:00
Jair Henrique
cebdd45849 Use ruff to check code format instead of black 2023-12-10 20:12:09 -03:00
Mohammad Razavi
f4144359f6 Fix ResourceWarning unclosed socket
This PR fixes issue #710 by properly closing the underlying socket. It
first uses `pool._put_conn` to keep the connection in the pool, and
later removes and closes it when the context manager exits.

I was unsure about the exact purpose of the `ConnectionRemover` class,
so I made minimal changes to minimize the risk of breaking the code;
there may be better solutions for fixing this issue.

For example, `urllib3.connectionpool.HTTPConnectionPool` uses a
weakref to terminate pool connections. By appending our connection to
the pool, it will also take care of closing our connection. So another
solution could be to modify the `__exit__` method of
`patch.ConnectionRemover` and return our connections to the pool:
```py
class ConnectionRemover:
    ...

    def __exit__(self, *args):
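        # Hand each tracked connection back to its pool; per the note above,
        # urllib3's weakref-based pool cleanup will then close it for us.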
        for pool, connections in self._connection_pool_to_connections.items():
            for connection in connections:
                if isinstance(connection, self._connection_class):
                    pool._put_conn(connection)
```
2023-08-07 08:42:58 +02:00
Daniel Silva
36c7465cf7 docs: add drop_unused_requests option 2023-01-04 21:59:58 +00:00
Daniel Silva
010fa268d1 test: add tests to drop_unused_requests option 2023-01-04 20:09:31 +00:00
Daniel Silva
99c0384770 feat: add an option to exclude unused interactions
Introduce the `drop_unused_requests` option (False by default). If True, the `Cassette` is saved with only the old interactions that were played plus any new ones. As a result, unused old requests are dropped.

Add `_old_interactions`, `_played_interactions` and `_new_interactions()`. The `_old_interactions` are previously recorded interactions loaded from Cassette files. The `_played_interactions` is the set of old interactions that were marked as played. A new interaction is a (request, response) tuple in `self.data` that is not in the `_old_interactions` list.
2023-01-02 03:52:52 +00:00
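A short usage sketch of the option introduced here; it mirrors the example added to docs/advanced.rst later in this compare view:

```py
import vcr

my_vcr = vcr.VCR(drop_unused_requests=True)  # defaults to False

with my_vcr.use_cassette("fixtures/vcr_cassettes/synopsis.yaml"):
    ...  # only interactions actually played (plus any new ones) are saved back
```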
59 changed files with 978 additions and 421 deletions

View File

@@ -13,10 +13,10 @@ permissions:
jobs:
codespell:
name: Check for spelling errors
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
steps:
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v5
- name: Codespell
uses: codespell-project/actions-codespell@v2

View File

@@ -7,14 +7,14 @@ on:
jobs:
validate:
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v4
- uses: actions/checkout@v5
- uses: actions/setup-python@v5
with:
python-version: "3.12"
- name: Install build dependencies
run: pip install -r docs/requirements.txt
- name: Rendering HTML documentation

View File

@@ -5,39 +5,71 @@ on:
branches:
- master
pull_request:
schedule:
- cron: '0 16 * * 5' # Every Friday 4pm
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
runs-on: ubuntu-24.04
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12", "pypy-3.8", "pypy-3.9", "pypy-3.10"]
python-version:
- "3.9"
- "3.10"
- "3.11"
- "3.12"
- "3.13"
- "pypy-3.9"
- "pypy-3.10"
urllib3-requirement:
- "urllib3>=2"
- "urllib3<2"
exclude:
- python-version: "3.9"
urllib3-requirement: "urllib3>=2"
- python-version: "pypy-3.9"
urllib3-requirement: "urllib3>=2"
- python-version: "pypy-3.10"
urllib3-requirement: "urllib3>=2"
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v5
- name: Install uv
uses: astral-sh/setup-uv@v6
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
allow-prereleases: true
- name: Install project dependencies
run: |
pip3 install --upgrade pip
pip3 install codecov tox tox-gh-actions
uv pip install --system --upgrade pip setuptools
uv pip install --system codecov '.[tests]' '${{ matrix.urllib3-requirement }}'
uv pip check
- name: Run online tests with tox
run: tox -- -m online
- name: Allow creation of user namespaces (e.g. to the unshare command)
run: |
# .. so that we don't get error:
# unshare: write failed /proc/self/uid_map: Operation not permitted
# Idea from https://github.com/YoYoGames/GameMaker-Bugs/issues/6015#issuecomment-2135552784 .
sudo sysctl kernel.apparmor_restrict_unprivileged_userns=0
- name: Run offline tests with tox with no access to the Internet
- name: Run online tests
run: ./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append -m online
- name: Run offline tests with no access to the Internet
run: |
# We're using unshare to take Internet access
# away from tox so that we'll notice whenever some new test
# away so that we'll notice whenever some new test
# is missing @pytest.mark.online decoration in the future
unshare --map-root-user --net -- \
sh -c 'ip link set lo up; tox -- -m "not online"'
sh -c 'ip link set lo up; ./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append -m "not online"'
- name: Run coverage
run: codecov

View File

@@ -0,0 +1,62 @@
# Copyright (c) 2023 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT license
name: Detect outdated pre-commit hooks
on:
schedule:
- cron: '0 16 * * 5' # Every Friday 4pm
# NOTE: This will drop all permissions from GITHUB_TOKEN except metadata read,
# and then (re)add the ones listed below:
permissions:
contents: write
pull-requests: write
jobs:
pre_commit_detect_outdated:
name: Detect outdated pre-commit hooks
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v5
- name: Set up Python 3.12
uses: actions/setup-python@v5
with:
python-version: 3.12
- name: Install pre-commit
run: |-
pip install \
--disable-pip-version-check \
--no-warn-script-location \
--user \
pre-commit
echo "PATH=${HOME}/.local/bin:${PATH}" >> "${GITHUB_ENV}"
- name: Check for outdated hooks
run: |-
pre-commit autoupdate
git diff -- .pre-commit-config.yaml
- name: Create pull request from changes (if any)
id: create-pull-request
uses: peter-evans/create-pull-request@v7
with:
author: 'pre-commit <pre-commit@tools.invalid>'
base: master
body: |-
For your consideration.
:warning: Please **CLOSE AND RE-OPEN** this pull request so that [further workflow runs get triggered](https://github.com/peter-evans/create-pull-request/blob/main/docs/concepts-guidelines.md#triggering-further-workflow-runs) for this pull request.
branch: precommit-autoupdate
commit-message: "pre-commit: Autoupdate"
delete-branch: true
draft: true
labels: enhancement
title: "pre-commit: Autoupdate"
- name: Log pull request URL
if: "${{ steps.create-pull-request.outputs.pull-request-url }}"
run: |
echo "Pull request URL is: ${{ steps.create-pull-request.outputs.pull-request-url }}"

.github/workflows/pre-commit.yml (vendored, new file, 20 lines)
View File

@@ -0,0 +1,20 @@
# Copyright (c) 2023 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT license
name: Run pre-commit
on:
- pull_request
- push
- workflow_dispatch
jobs:
pre-commit:
name: Run pre-commit
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v5
- uses: actions/setup-python@v5
with:
python-version: 3.12
- uses: pre-commit/action@v3.0.1

.pre-commit-config.yaml (new file, 17 lines)
View File

@@ -0,0 +1,17 @@
# Copyright (c) 2023 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT license
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.12.7
hooks:
- id: ruff
args: ["--output-format=full"]
- id: ruff-format
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: check-merge-conflict
- id: end-of-file-fixer
- id: trailing-whitespace

View File

@@ -7,7 +7,7 @@ version: 2
# Set the version of Python and other tools you might need
build:
os: ubuntu-22.04
os: ubuntu-24.04
tools:
python: "3.12"

View File

@@ -1,6 +1,5 @@
include README.rst
include LICENSE.txt
include tox.ini
recursive-include tests *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]

View File

@@ -4,7 +4,7 @@ VCR.py 📼
###########
|PyPI| |Python versions| |Build Status| |CodeCov| |Gitter| |CodeStyleBlack|
|PyPI| |Python versions| |Build Status| |CodeCov| |Gitter|
----
@@ -70,6 +70,3 @@ more details
.. |CodeCov| image:: https://codecov.io/gh/kevin1024/vcrpy/branch/master/graph/badge.svg
:target: https://codecov.io/gh/kevin1024/vcrpy
:alt: Code Coverage Status
.. |CodeStyleBlack| image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
:alt: Code Style: black

View File

@@ -24,4 +24,4 @@
<stop offset="1" stop-color="#27DDA6"/>
</linearGradient>
</defs>
</svg>
</svg>

(SVG image: 6.2 KiB before and after)

View File

@@ -16,7 +16,7 @@ a nice addition. Here's an example:
with vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml') as cass:
response = urllib2.urlopen('http://www.zombo.com/').read()
# cass should have 1 request inside it
assert len(cass) == 1
assert len(cass) == 1
# the request uri should have been http://www.zombo.com/
assert cass.requests[0].uri == 'http://www.zombo.com/'
@@ -208,7 +208,7 @@ So these two calls are the same:
# original (still works)
vcr = VCR(filter_headers=['authorization'])
# new
vcr = VCR(filter_headers=[('authorization', None)])
@@ -218,7 +218,7 @@ Here are two examples of the new functionality:
# replace with a static value (most common)
vcr = VCR(filter_headers=[('authorization', 'XXXXXX')])
# replace with a callable, for example when testing
# lots of different kinds of authorization.
def replace_auth(key, value, request):
@@ -286,7 +286,7 @@ sensitive data from the response body:
before_record_response=scrub_string(settings.USERNAME, 'username'),
)
with my_vcr.use_cassette('test.yml'):
# your http code here
# your http code here
Decode compressed response
@@ -427,3 +427,16 @@ If you want to save the cassette only when the test succeeds, set the Cassette
# Since there was an exception, the cassette file hasn't been created.
assert not os.path.exists('fixtures/vcr_cassettes/synopsis.yaml')
Drop unused requests
--------------------
Even if an HTTP request is changed or removed from the tests, previously recorded
interactions remain in the cassette file. If you set the ``drop_unused_requests``
option to ``True``, VCR will not save old HTTP interactions that are no longer used.
.. code:: python
my_vcr = VCR(drop_unused_requests=True)
with my_vcr.use_cassette('fixtures/vcr_cassettes/synopsis.yaml'):
... # your HTTP interactions here

View File

@@ -7,6 +7,26 @@ For a full list of triaged issues, bugs and PRs and what release they are target
All help in providing PRs to close out bug issues is appreciated. Even if that is providing a repo that fully replicates issues. We have very generous contributors that have added these to bug issues which meant another contributor picked up the bug and closed it out.
- 7.0.0
- Drop support for python 3.8 (major version bump) - thanks @jairhenrique
- Various linting and test fixes - thanks @jairhenrique
- Bugfix for urllib3>=2.3.0 - missing version_string (#888)
- Bugfix for asyncio.run - thanks @alekeik1
- 6.0.2
- Ensure body is consumed only once (#846) - thanks @sathieu
- Permit urllib3 2.x for non-PyPy Python >=3.10
- Fix typos in test commands - thanks @chuckwondo
- Several test and workflow improvements - thanks @hartwork and @graingert
- 6.0.1
- Bugfix to Tornado cassette generator (thanks @graingert)
- 6.0.0
- BREAKING: Fix issue with httpx support (thanks @parkerhancock) in #784. NOTE: You may have to recreate some of your cassettes produced in previous releases due to the binary format being saved incorrectly in previous releases
- BREAKING: Drop support for `boto` (vcrpy still supports boto3, but is dropping the deprecated `boto` support in this release) (thanks @jairhenrique)
- Fix compatibility issue with Python 3.12 (thanks @hartwork)
- Drop simplejson (fixes some compatibility issues) (thanks @jairhenrique)
- Run CI on Python 3.12 and PyPy 3.9-3.10 (thanks @mgorny)
- Various linting and docs improvements (thanks @jairhenrique)
- Tornado fixes (thanks @graingert)
- 5.1.0
- Use ruff for linting (instead of current flake8/isort/pyflakes) - thanks @jairhenrique
- Enable rule B (flake8-bugbear) on ruff - thanks @jairhenrique
@@ -287,4 +307,3 @@ All help in providing PRs to close out bug issues is appreciated. Even if that i
- Add support for requests / urllib3
- 0.0.1
- Initial Release

View File

@@ -316,5 +316,5 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {"https://docs.python.org/": None}
intersphinx_mapping = {"python": ("https://docs.python.org/3", None)}
html_theme = "alabaster"

View File

@@ -24,7 +24,7 @@ So whilst reporting issues are valuable, please consider:
- contributing an issue with a toy repo that replicates the issue.
- contributing PRs is a more valuable donation of your time and effort.
Thanks again for your interest and support in VCRpy.
Thanks again for your interest and support in VCRpy.
We really appreciate it.
@@ -57,7 +57,7 @@ Simply adding these three labels for incoming issues means a lot for maintaining
- Which library does it affect? ``core``, ``aiohttp``, ``requests``, ``urllib3``, ``tornado4``, ``httplib2``
- If it is a bug, is it ``Verified Can Replicate`` or ``Requires Help Replicating``
- Thanking people for raising issues. Feedback is always appreciated.
- Politely asking if they are able to link to an example repo that replicates the issue if they haven't already. Being able to *clone and go* helps the next person and we like that. 😃
- Politely asking if they are able to link to an example repo that replicates the issue if they haven't already. Being able to *clone and go* helps the next person and we like that. 😃
**Maintainer:**
@@ -68,7 +68,7 @@ This involves creating PRs to address bugs and enhancement requests. It also mea
The PR reviewer is a second set of eyes to see if:
- Are there tests covering the code paths added/modified?
- Do the tests and modifications make sense and seem appropriate?
- Add specific feedback, even on approvals, why it is accepted. eg "I like how you use a context manager there. 😄 "
- Add specific feedback, even on approvals, why it is accepted. eg "I like how you use a context manager there. 😄 "
- Also make sure they add a line to `docs/changelog.rst` to claim credit for their contribution.
**Release Manager:**
@@ -83,39 +83,21 @@ The PR reviewer is a second set of eyes to see if:
Running VCR's test suite
------------------------
The tests are all run automatically on `Travis
CI <https://travis-ci.org/kevin1024/vcrpy>`__, but you can also run them
yourself using `pytest <http://pytest.org/>`__ and
`Tox <http://tox.testrun.org/>`__.
The tests are all run automatically on `Github Actions CI <https://github.com/kevin1024/vcrpy/actions>`__,
but you can also run them yourself using `pytest <http://pytest.org/>`__.
Tox will automatically run them in all environments VCR.py supports if they are available on your `PATH`. Alternatively you can use `tox-pyenv <https://pypi.org/project/tox-pyenv/>`_ with
`pyenv <https://github.com/pyenv/pyenv>`_.
We recommend you read the documentation for each and see the section further below.
The test suite is pretty big and slow, but you can tell tox to only run specific tests like this::
tox -e {pyNN}-{HTTP_LIBRARY} -- <pytest flags passed through>
tox -e py38-requests -- -v -k "'test_status_code or test_gzip'"
tox -e py38-requests -- -v --last-failed
This will run only tests that look like ``test_status_code`` or
``test_gzip`` in the test suite, and only in the python 3.8 environment
that has ``requests`` installed.
Also, in order for the boto3 tests to run, you will need an AWS key.
In order for the boto3 tests to run, you will need an AWS key.
Refer to the `boto3
documentation <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/index.html>`__
for how to set this up. I have marked the boto3 tests as optional in
Travis so you don't have to worry about them failing if you submit a
pull request.
Using PyEnv with VCR's test suite
Using Pyenv with VCR's test suite
---------------------------------
PyEnv is a tool for managing multiple installation of python on your system.
See the full documentation at their `github <https://github.com/pyenv/pyenv>`_
but we are also going to use `tox-pyenv <https://pypi.org/project/tox-pyenv/>`_
Pyenv is a tool for managing multiple installation of python on your system.
See the full documentation at their `github <https://github.com/pyenv/pyenv>`_
in this example::
git clone https://github.com/pyenv/pyenv ~/.pyenv
@@ -126,27 +108,21 @@ in this example::
# Setup shim paths
eval "$(pyenv init -)"
# Setup your local system tox tooling
pip3 install tox tox-pyenv
# Install supported versions (at time of writing), this does not activate them
pyenv install 3.8.0 pypy3.8
pyenv install 3.12.0 pypy3.10
# This activates them
pyenv local 3.8.0 pypy3.8
pyenv local 3.12.0 pypy3.10
# Run the whole test suite
tox
# Run the whole test suite or just part of it
tox -e lint
tox -e py38-requests
pip install .[tests]
./runtests.sh
Troubleshooting on MacOSX
-------------------------
If you have this kind of error when running tox :
If you have this kind of error when running tests :
.. code:: python

View File

@@ -9,7 +9,7 @@ with pip::
Compatibility
-------------
VCR.py supports Python 3.8+, and `pypy <http://pypy.org>`__.
VCR.py supports Python 3.9+, and `pypy <http://pypy.org>`__.
The following HTTP libraries are supported:

View File

@@ -1,2 +1,2 @@
sphinx<8
sphinx_rtd_theme==1.3.0
sphinx<9
sphinx_rtd_theme==3.0.2

View File

@@ -1,18 +1,17 @@
[tool.black]
line-length=110
[tool.codespell]
skip = '.git,*.pdf,*.svg,.tox'
ignore-regex = "\\\\[fnrstv]"
#
# ignore-words-list = ''
[tool.pytest.ini_options]
markers = [
"online",
]
addopts = ["--strict-config", "--strict-markers"]
asyncio_default_fixture_loop_scope = "function"
markers = ["online"]
[tool.ruff]
line-length = 110
target-version = "py39"
[tool.ruff.lint]
select = [
"B", # flake8-bugbear
"C4", # flake8-comprehensions
@@ -26,8 +25,6 @@ select = [
"UP", # pyupgrade
"W", # pycodestyle warning
]
line-length = 110
target-version = "py38"
[tool.ruff.isort]
known-first-party = [ "vcr" ]
[tool.ruff.lint.isort]
known-first-party = ["vcr"]

View File

@@ -1,7 +1,5 @@
#!/bin/bash
# https://blog.ionelmc.ro/2015/04/14/tox-tricks-and-patterns/#when-it-inevitably-leads-to-shell-scripts
# If you are getting an INVOCATION ERROR for this script then there is
# a good chance you are running on Windows.
# You can and should use WSL for running tox on Windows when it calls bash scripts.
# If you are getting an INVOCATION ERROR for this script then there is a good chance you are running on Windows.
# You can and should use WSL for running tests on Windows when it calls bash scripts.
REQUESTS_CA_BUNDLE=`python3 -m pytest_httpbin.certs` exec pytest "$@"

View File

@@ -3,10 +3,8 @@
import codecs
import os
import re
import sys
from setuptools import find_packages, setup
from setuptools.command.test import test as TestCommand
long_description = open("README.rst").read()
here = os.path.abspath(os.path.dirname(__file__))
@@ -28,20 +26,6 @@ def find_version(*file_paths):
raise RuntimeError("Unable to find version string.")
class PyTest(TestCommand):
def finalize_options(self):
TestCommand.finalize_options(self)
self.test_args = []
self.test_suite = True
def run_tests(self):
# import here, cause outside the eggs aren't loaded
import pytest
errno = pytest.main(self.test_args)
sys.exit(errno)
install_requires = [
"PyYAML",
"wrapt",
@@ -55,26 +39,34 @@ install_requires = [
"urllib3 <2; python_version <'3.10'",
# https://github.com/kevin1024/vcrpy/pull/775#issuecomment-1847849962
"urllib3 <2; platform_python_implementation =='PyPy'",
# Workaround for Poetry with CPython >= 3.10, problem description at:
# https://github.com/kevin1024/vcrpy/pull/826
"urllib3; platform_python_implementation !='PyPy' and python_version >='3.10'",
]
tests_require = [
"aiohttp",
"boto3",
"httplib2",
"httpx",
"pytest",
"pytest-aiohttp",
"pytest-httpbin",
"requests>=2.16.2",
"tornado",
# Needed to un-break httpbin 0.7.0. For httpbin >=0.7.1 and after,
# this pin and the dependency itself can be removed, provided
# that the related bug in httpbin has been fixed:
# https://github.com/kevin1024/vcrpy/issues/645#issuecomment-1562489489
# https://github.com/postmanlabs/httpbin/issues/673
# https://github.com/postmanlabs/httpbin/pull/674
"Werkzeug==2.0.3",
]
extras_require = {
"tests": [
"aiohttp",
"boto3",
"httplib2",
"httpx",
"pytest-aiohttp",
"pytest-asyncio",
"pytest-cov",
"pytest-httpbin",
"pytest",
"requests>=2.22.0",
"tornado",
"urllib3",
# Needed to un-break httpbin 0.7.0. For httpbin >=0.7.1 and after,
# this pin and the dependency itself can be removed, provided
# that the related bug in httpbin has been fixed:
# https://github.com/kevin1024/vcrpy/issues/645#issuecomment-1562489489
# https://github.com/postmanlabs/httpbin/issues/673
# https://github.com/postmanlabs/httpbin/pull/674
"Werkzeug==2.0.3",
],
}
setup(
name="vcrpy",
@@ -86,21 +78,22 @@ setup(
author_email="me@kevinmccarthy.org",
url="https://github.com/kevin1024/vcrpy",
packages=find_packages(exclude=["tests*"]),
python_requires=">=3.8",
python_requires=">=3.9",
install_requires=install_requires,
license="MIT",
tests_require=tests_require,
extras_require=extras_require,
tests_require=extras_require["tests"],
classifiers=[
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Intended Audience :: Developers",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",

tests/__init__.py (new empty file)
View File

View File

@@ -15,9 +15,9 @@
},
"response": {
"status": {
"message": "OK",
"message": "OK",
"code": 200
},
},
"headers": {
"access-control-allow-origin": ["*"],
"content-type": ["application/json"],
@@ -25,7 +25,7 @@
"server": ["gunicorn/0.17.4"],
"content-length": ["32"],
"connection": ["keep-alive"]
},
},
"body": {
"string": "{\n \"origin\": \"217.122.164.194\"\n}"
}

View File

@@ -2,7 +2,7 @@ version: 1
interactions:
- request:
body: null
headers:
headers:
accept: ['*/*']
accept-encoding: ['gzip, deflate, compress']
user-agent: ['python-requests/2.2.1 CPython/2.6.1 Darwin/10.8.0']

View File

@@ -1,31 +1,31 @@
[
{
"request": {
"body": null,
"protocol": "http",
"method": "GET",
"body": null,
"protocol": "http",
"method": "GET",
"headers": {
"accept-encoding": "gzip, deflate, compress",
"accept": "*/*",
"accept-encoding": "gzip, deflate, compress",
"accept": "*/*",
"user-agent": "python-requests/2.2.1 CPython/2.6.1 Darwin/10.8.0"
},
"host": "httpbin.org",
"path": "/ip",
},
"host": "httpbin.org",
"path": "/ip",
"port": 80
},
},
"response": {
"status": {
"message": "OK",
"message": "OK",
"code": 200
},
},
"headers": [
"access-control-allow-origin: *\r\n",
"content-type: application/json\r\n",
"date: Mon, 21 Apr 2014 23:13:40 GMT\r\n",
"server: gunicorn/0.17.4\r\n",
"content-length: 32\r\n",
"access-control-allow-origin: *\r\n",
"content-type: application/json\r\n",
"date: Mon, 21 Apr 2014 23:13:40 GMT\r\n",
"server: gunicorn/0.17.4\r\n",
"content-length: 32\r\n",
"connection: keep-alive\r\n"
],
],
"body": {
"string": "{\n \"origin\": \"217.122.164.194\"\n}"
}

View File

@@ -10,7 +10,7 @@ interactions:
uri: http://seomoz.org/
response:
body: {string: ''}
headers:
headers:
Location: ['http://moz.com/']
Server: ['BigIP']
Connection: ['Keep-Alive']

View File

@@ -5,24 +5,24 @@ import aiohttp
async def aiohttp_request(loop, method, url, output="text", encoding="utf-8", content_type=None, **kwargs):
session = aiohttp.ClientSession(loop=loop)
response_ctx = session.request(method, url, **kwargs)
async with aiohttp.ClientSession(loop=loop) as session:
response_ctx = session.request(method, url, **kwargs)
response = await response_ctx.__aenter__()
if output == "text":
content = await response.text()
elif output == "json":
content_type = content_type or "application/json"
content = await response.json(encoding=encoding, content_type=content_type)
elif output == "raw":
content = await response.read()
elif output == "stream":
content = await response.content.read()
response = await response_ctx.__aenter__()
if output == "text":
content = await response.text()
elif output == "json":
content_type = content_type or "application/json"
content = await response.json(encoding=encoding, content_type=content_type)
elif output == "raw":
content = await response.read()
elif output == "stream":
content = await response.content.read()
response_ctx._resp.close()
await session.close()
response_ctx._resp.close()
await session.close()
return response, content
return response, content
def aiohttp_app():

View File

@@ -0,0 +1,41 @@
interactions:
- request:
body: ''
headers:
accept:
- '*/*'
accept-encoding:
- gzip, deflate, br
connection:
- keep-alive
host:
- httpbin.org
user-agent:
- python-httpx/0.23.0
method: GET
uri: https://httpbin.org/gzip
response:
content: "{\n \"gzipped\": true, \n \"headers\": {\n \"Accept\": \"*/*\",
\n \"Accept-Encoding\": \"gzip, deflate, br\", \n \"Host\": \"httpbin.org\",
\n \"User-Agent\": \"python-httpx/0.23.0\", \n \"X-Amzn-Trace-Id\": \"Root=1-62a62a8d-5f39b5c50c744da821d6ea99\"\n
\ }, \n \"method\": \"GET\", \n \"origin\": \"146.200.25.115\"\n}\n"
headers:
Access-Control-Allow-Credentials:
- 'true'
Access-Control-Allow-Origin:
- '*'
Connection:
- keep-alive
Content-Encoding:
- gzip
Content-Length:
- '230'
Content-Type:
- application/json
Date:
- Sun, 12 Jun 2022 18:03:57 GMT
Server:
- gunicorn/19.9.0
http_version: HTTP/1.1
status_code: 200
version: 1

View File

@@ -0,0 +1,42 @@
interactions:
- request:
body: null
headers:
Accept:
- '*/*'
Accept-Encoding:
- gzip, deflate, br
Connection:
- keep-alive
User-Agent:
- python-requests/2.28.0
method: GET
uri: https://httpbin.org/gzip
response:
body:
string: !!binary |
H4sIAKwrpmIA/z2OSwrCMBCG956izLIkfQSxkl2RogfQA9R2bIM1iUkqaOndnYDIrGa+/zELDB9l
LfYgg5uRwYhtj86DXKDuOrQBJKR5Cuy38kZ3pld6oHu0sqTH29QGZMnVkepgtMYuKKNJcEe0vJ3U
C4mcjI9hpaiygqaUW7ETFYGLR8frAXXE9h1Go7nD54w++FxkYp8VsDJ4IBH6E47NmVzGqUHFkn8g
rJsvp2omYs8AAAA=
headers:
Access-Control-Allow-Credentials:
- 'true'
Access-Control-Allow-Origin:
- '*'
Connection:
- Close
Content-Encoding:
- gzip
Content-Length:
- '182'
Content-Type:
- application/json
Date:
- Sun, 12 Jun 2022 18:08:44 GMT
Server:
- Pytest-HTTPBIN/0.1.0
status:
code: 200
message: great
version: 1

View File

@@ -1,16 +0,0 @@
import os
import ssl
import pytest
@pytest.fixture
def httpbin_ssl_context():
ssl_ca_location = os.environ["REQUESTS_CA_BUNDLE"]
ssl_cert_location = os.environ["REQUESTS_CA_BUNDLE"].replace("cacert.pem", "cert.pem")
ssl_key_location = os.environ["REQUESTS_CA_BUNDLE"].replace("cacert.pem", "key.pem")
ssl_context = ssl.create_default_context(cafile=ssl_ca_location)
ssl_context.load_cert_chain(ssl_cert_location, ssl_key_location)
return ssl_context

View File

@@ -1,8 +1,10 @@
import contextlib
import logging
import ssl
import urllib.parse
import pytest
import pytest_httpbin.certs
import yarl
import vcr
@@ -12,12 +14,14 @@ aiohttp = pytest.importorskip("aiohttp")
from .aiohttp_utils import aiohttp_app, aiohttp_request # noqa: E402
HTTPBIN_SSL_CONTEXT = ssl.create_default_context(cafile=pytest_httpbin.certs.where())
def run_in_loop(fn):
with contextlib.closing(asyncio.new_event_loop()) as loop:
asyncio.set_event_loop(loop)
task = loop.create_task(fn(loop))
return loop.run_until_complete(task)
async def wrapper():
return await fn(asyncio.get_running_loop())
return asyncio.run(wrapper())
def request(method, url, output="text", **kwargs):
@@ -260,6 +264,12 @@ def test_aiohttp_test_client_json(aiohttp_client, tmpdir):
assert cassette.play_count == 1
def test_cleanup_from_pytest_asyncio():
# work around https://github.com/pytest-dev/pytest-asyncio/issues/724
asyncio.get_event_loop().close()
asyncio.set_event_loop(None)
@pytest.mark.online
def test_redirect(tmpdir, httpbin):
url = httpbin.url + "/redirect/2"
@@ -333,7 +343,7 @@ def test_double_requests(tmpdir, httpbin):
assert cassette.play_count == 2
def test_cookies(httpbin_both, httpbin_ssl_context, tmpdir):
def test_cookies(httpbin_both, tmpdir):
async def run(loop):
cookies_url = httpbin_both.url + (
"/response-headers?"
@@ -348,12 +358,12 @@ def test_cookies(httpbin_both, httpbin_ssl_context, tmpdir):
# ------------------------- Record -------------------------- #
with vcr.use_cassette(tmp) as cassette:
async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url, ssl=httpbin_ssl_context)
cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
home_resp = await session.get(
home_url,
cookies=req_cookies,
headers=req_headers,
ssl=httpbin_ssl_context,
ssl=HTTPBIN_SSL_CONTEXT,
)
assert cassette.play_count == 0
assert_responses(cookies_resp, home_resp)
@@ -361,12 +371,12 @@ def test_cookies(httpbin_both, httpbin_ssl_context, tmpdir):
# -------------------------- Play --------------------------- #
with vcr.use_cassette(tmp, record_mode=vcr.mode.NONE) as cassette:
async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url, ssl=httpbin_ssl_context)
cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
home_resp = await session.get(
home_url,
cookies=req_cookies,
headers=req_headers,
ssl=httpbin_ssl_context,
ssl=HTTPBIN_SSL_CONTEXT,
)
assert cassette.play_count == 2
assert_responses(cookies_resp, home_resp)
@@ -383,7 +393,7 @@ def test_cookies(httpbin_both, httpbin_ssl_context, tmpdir):
run_in_loop(run)
def test_cookies_redirect(httpbin_both, httpbin_ssl_context, tmpdir):
def test_cookies_redirect(httpbin_both, tmpdir):
async def run(loop):
# Sets cookie as provided by the query string and redirects
cookies_url = httpbin_both.url + "/cookies/set?Cookie_1=Val_1"
@@ -392,9 +402,9 @@ def test_cookies_redirect(httpbin_both, httpbin_ssl_context, tmpdir):
# ------------------------- Record -------------------------- #
with vcr.use_cassette(tmp) as cassette:
async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url, ssl=httpbin_ssl_context)
cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
assert not cookies_resp.cookies
cookies = session.cookie_jar.filter_cookies(cookies_url)
cookies = session.cookie_jar.filter_cookies(yarl.URL(cookies_url))
assert cookies["Cookie_1"].value == "Val_1"
assert cassette.play_count == 0
@@ -403,9 +413,9 @@ def test_cookies_redirect(httpbin_both, httpbin_ssl_context, tmpdir):
# -------------------------- Play --------------------------- #
with vcr.use_cassette(tmp, record_mode=vcr.mode.NONE) as cassette:
async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url, ssl=httpbin_ssl_context)
cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
assert not cookies_resp.cookies
cookies = session.cookie_jar.filter_cookies(cookies_url)
cookies = session.cookie_jar.filter_cookies(yarl.URL(cookies_url))
assert cookies["Cookie_1"].value == "Val_1"
assert cassette.play_count == 2
@@ -417,9 +427,9 @@ def test_cookies_redirect(httpbin_both, httpbin_ssl_context, tmpdir):
"Cookie_1=Val_1; Expires=Wed, 21 Oct 2015 07:28:00 GMT",
]
async with aiohttp.ClientSession(loop=loop, cookie_jar=aiohttp.CookieJar(unsafe=True)) as session:
cookies_resp = await session.get(cookies_url, ssl=httpbin_ssl_context)
cookies_resp = await session.get(cookies_url, ssl=HTTPBIN_SSL_CONTEXT)
assert not cookies_resp.cookies
cookies = session.cookie_jar.filter_cookies(cookies_url)
cookies = session.cookie_jar.filter_cookies(yarl.URL(cookies_url))
assert cookies["Cookie_1"].value == "Val_1"
run_in_loop(run)

View File

@@ -39,7 +39,7 @@ def test_basic_json_use(tmpdir, httpbin):
test_fixture = str(tmpdir.join("synopsis.json"))
with vcr.use_cassette(test_fixture, serializer="json"):
response = urlopen(httpbin.url).read()
assert b"A simple HTTP Request &amp; Response Service." in response
assert b"HTTP Request &amp; Response Service" in response
def test_patched_content(tmpdir, httpbin):

View File

@@ -5,6 +5,7 @@ from urllib.request import urlopen
import pytest
import vcr
from vcr.cassette import Cassette
@pytest.mark.online
@@ -85,3 +86,21 @@ def test_dont_record_on_exception(tmpdir, httpbin):
assert b"Not in content" in urlopen(httpbin.url).read()
assert not os.path.exists(str(tmpdir.join("dontsave2.yml")))
def test_set_drop_unused_requests(tmpdir, httpbin):
my_vcr = vcr.VCR(drop_unused_requests=True)
file = str(tmpdir.join("test.yaml"))
with my_vcr.use_cassette(file):
urlopen(httpbin.url)
urlopen(httpbin.url + "/get")
cassette = Cassette.load(path=file)
assert len(cassette) == 2
with my_vcr.use_cassette(file):
urlopen(httpbin.url)
cassette = Cassette.load(path=file)
assert len(cassette) == 1
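A minimal usage sketch (cassette path and URL are placeholders): the same behaviour can also be requested per cassette, mirroring the drop_unused_requests option added to the VCR constructor further below.
import vcr
from urllib.request import urlopen

with vcr.use_cassette("example.yaml", drop_unused_requests=True):
    urlopen("https://example.org")
# On exit, any previously recorded interaction that was neither replayed
# nor re-recorded in this session is dropped from example.yaml.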


@@ -5,10 +5,11 @@ from urllib.parse import urlencode
from urllib.request import Request, urlopen
import pytest
from assertions import assert_cassette_has_one_response, assert_is_json_bytes
import vcr
from ..assertions import assert_cassette_has_one_response, assert_is_json_bytes
def _request_with_auth(url, username, password):
request = Request(url)


@@ -1,12 +1,14 @@
"""Integration tests with httplib2"""
from urllib.parse import urlencode
import pytest
import pytest_httpbin.certs
from assertions import assert_cassette_has_one_response
import vcr
from ..assertions import assert_cassette_has_one_response
httplib2 = pytest.importorskip("httplib2")


@@ -1,7 +1,11 @@
import os
import pytest
import vcr
from ..assertions import assert_is_json_bytes
asyncio = pytest.importorskip("asyncio")
httpx = pytest.importorskip("httpx")
@@ -28,25 +32,37 @@ class DoSyncRequest(BaseDoRequest):
_client_class = httpx.Client
def __enter__(self):
self._client = self._make_client()
return self
def __exit__(self, *args):
pass
self._client.close()
del self._client
@property
def client(self):
try:
return self._client
except AttributeError:
self._client = self._make_client()
return self._client
except AttributeError as e:
raise ValueError('To access sync client, use "with do_request() as client"') from e
def __call__(self, *args, **kwargs):
return self.client.request(*args, timeout=60, **kwargs)
if hasattr(self, "_client"):
return self.client.request(*args, timeout=60, **kwargs)
# Use one-time context and dispose of the client afterwards
with self:
return self.client.request(*args, timeout=60, **kwargs)
def stream(self, *args, **kwargs):
with self.client.stream(*args, **kwargs) as response:
return b"".join(response.iter_bytes())
if hasattr(self, "_client"):
with self.client.stream(*args, **kwargs) as response:
return b"".join(response.iter_bytes())
# Use one-time context and dispose of the client afterwards
with self:
with self.client.stream(*args, **kwargs) as response:
return b"".join(response.iter_bytes())
class DoAsyncRequest(BaseDoRequest):
@@ -207,22 +223,6 @@ def test_redirect(httpbin, yml, do_request):
assert cassette_response.request.headers.items() == response.request.headers.items()
@pytest.mark.online
def test_work_with_gzipped_data(httpbin, do_request, yml):
url = httpbin.url + "/gzip?foo=bar"
headers = {"accept-encoding": "deflate, gzip"}
with vcr.use_cassette(yml):
do_request(headers=headers)("GET", url)
with vcr.use_cassette(yml) as cassette:
cassette_response = do_request(headers=headers)("GET", url)
assert cassette_response.headers["content-encoding"] == "gzip"
assert cassette_response.read()
assert cassette.play_count == 1
@pytest.mark.online
@pytest.mark.parametrize("url", ["https://github.com/kevin1024/vcrpy/issues/" + str(i) for i in range(3, 6)])
def test_simple_fetching(do_request, yml, url):
@@ -285,3 +285,77 @@ def test_stream(tmpdir, httpbin, do_request):
assert cassette_content == response_content
assert len(cassette_content) == 512
assert cassette.play_count == 1
# Regular cassette formats support the status reason,
# but the old HTTPX cassette format does not.
@pytest.mark.parametrize(
"cassette_name,reason",
[
("requests", "great"),
("httpx_old_format", "OK"),
],
)
def test_load_cassette_format(do_request, cassette_name, reason):
mydir = os.path.dirname(os.path.realpath(__file__))
yml = f"{mydir}/cassettes/gzip_{cassette_name}.yaml"
url = "https://httpbin.org/gzip"
with vcr.use_cassette(yml) as cassette:
cassette_response = do_request()("GET", url)
assert str(cassette_response.request.url) == url
assert cassette.play_count == 1
# Should be able to load up the JSON inside,
# regardless of whether the content is gzipped
# in the cassette or not.
json = cassette_response.json()
assert json["method"] == "GET", json
assert cassette_response.status_code == 200
assert cassette_response.reason_phrase == reason
def test_gzip__decode_compressed_response_false(tmpdir, httpbin, do_request):
"""
Ensure that httpx is able to automatically decompress the response body.
"""
for _ in range(2): # one for recording, one for re-playing
with vcr.use_cassette(str(tmpdir.join("gzip.yaml"))) as cassette:
response = do_request()("GET", httpbin + "/gzip")
assert response.headers["content-encoding"] == "gzip" # i.e. not removed
# The content stored in the cassette should be gzipped.
assert cassette.responses[0]["body"]["string"][:2] == b"\x1f\x8b"
assert_is_json_bytes(response.content) # i.e. uncompressed bytes
def test_gzip__decode_compressed_response_true(do_request, tmpdir, httpbin):
url = httpbin + "/gzip"
expected_response = do_request()("GET", url)
expected_content = expected_response.content
assert expected_response.headers["content-encoding"] == "gzip" # self-test
with vcr.use_cassette(
str(tmpdir.join("decode_compressed.yaml")),
decode_compressed_response=True,
) as cassette:
r = do_request()("GET", url)
assert r.headers["content-encoding"] == "gzip" # i.e. not removed
content_length = r.headers["content-length"]
assert r.content == expected_content
# Has the cassette body been decompressed?
cassette_response_body = cassette.responses[0]["body"]["string"]
assert isinstance(cassette_response_body, str)
# Content should be JSON.
assert cassette_response_body[0:1] == "{"
with vcr.use_cassette(str(tmpdir.join("decode_compressed.yaml")), decode_compressed_response=True):
r = httpx.get(url)
assert "content-encoding" not in r.headers # i.e. removed
assert r.content == expected_content
# As the content is uncompressed, it should have a bigger
# length than the compressed version.
assert r.headers["content-length"] > content_length


@@ -1,5 +1,6 @@
"""Test using a proxy."""
import asyncio
import http.server
import socketserver
import threading
@@ -36,15 +37,44 @@ class Proxy(http.server.SimpleHTTPRequestHandler):
self.end_headers()
self.copyfile(upstream_response, self.wfile)
def do_CONNECT(self):
host, port = self.path.split(":")
asyncio.run(self._tunnel(host, port, self.connection))
async def _tunnel(self, host, port, client_sock):
target_r, target_w = await asyncio.open_connection(host=host, port=port)
self.send_response(http.HTTPStatus.OK)
self.end_headers()
source_r, source_w = await asyncio.open_connection(sock=client_sock)
async def channel(reader, writer):
while True:
data = await reader.read(1024)
if not data:
break
writer.write(data)
await writer.drain()
writer.close()
await writer.wait_closed()
await asyncio.gather(
channel(target_r, source_w),
channel(source_r, target_w),
)
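A minimal sketch of what a client sends to exercise this CONNECT handler (proxy address and target host are placeholders; the 200 reply from send_response() above arrives before any TLS bytes flow).
import socket

with socket.create_connection(("127.0.0.1", 3128)) as sock:  # placeholder proxy address
    sock.sendall(b"CONNECT example.org:443 HTTP/1.1\r\nHost: example.org:443\r\n\r\n")
    print(sock.recv(1024))  # expect a b"HTTP/1.0 200 OK" status line, then the tunnel is live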
@pytest.fixture(scope="session")
def proxy_server():
httpd = socketserver.ThreadingTCPServer(("", 0), Proxy)
proxy_process = threading.Thread(target=httpd.serve_forever)
proxy_process.start()
yield "http://{}:{}".format(*httpd.server_address)
httpd.shutdown()
proxy_process.join()
with socketserver.ThreadingTCPServer(("", 0), Proxy) as httpd:
proxy_process = threading.Thread(target=httpd.serve_forever)
proxy_process.start()
yield "http://{}:{}".format(*httpd.server_address)
httpd.shutdown()
proxy_process.join()
def test_use_proxy(tmpdir, httpbin, proxy_server):
@@ -52,10 +82,26 @@ def test_use_proxy(tmpdir, httpbin, proxy_server):
with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))):
response = requests.get(httpbin.url, proxies={"http": proxy_server})
with vcr.use_cassette(str(tmpdir.join("proxy.yaml")), mode="once") as cassette:
with vcr.use_cassette(str(tmpdir.join("proxy.yaml")), mode="none") as cassette:
cassette_response = requests.get(httpbin.url, proxies={"http": proxy_server})
for key in set(cassette_response.headers.keys()) & set(response.headers.keys()):
assert cassette_response.headers[key] == response.headers[key]
assert cassette_response.headers == response.headers
assert cassette.play_count == 1
def test_use_https_proxy(tmpdir, httpbin_secure, proxy_server):
"""Ensure that it works with an HTTPS proxy."""
with vcr.use_cassette(str(tmpdir.join("proxy.yaml"))):
response = requests.get(httpbin_secure.url, proxies={"https": proxy_server})
with vcr.use_cassette(str(tmpdir.join("proxy.yaml")), mode="none") as cassette:
cassette_response = requests.get(
httpbin_secure.url,
proxies={"https": proxy_server},
)
assert cassette_response.headers == response.headers
assert cassette.play_count == 1
# The cassette URL points to httpbin, not the proxy
assert cassette.requests[0].url == httpbin_secure.url + "/"


@@ -66,7 +66,7 @@ def test_load_cassette_with_custom_persister(tmpdir, httpbin):
with my_vcr.use_cassette(test_fixture, serializer="json"):
response = urlopen(httpbin.url).read()
assert b"A simple HTTP Request &amp; Response Service." in response
assert b"HTTP Request &amp; Response Service" in response
def test_load_cassette_persister_exception_handling(tmpdir, httpbin):


@@ -1,9 +1,11 @@
"""Test requests' interaction with vcr"""
import pytest
from assertions import assert_cassette_empty, assert_is_json_bytes
import vcr
from ..assertions import assert_cassette_empty, assert_is_json_bytes
requests = pytest.importorskip("requests")
@@ -264,7 +266,7 @@ def test_nested_cassettes_with_session_created_before_nesting(httpbin_both, tmpd
def test_post_file(tmpdir, httpbin_both):
"""Ensure that we handle posting a file."""
url = httpbin_both + "/post"
with vcr.use_cassette(str(tmpdir.join("post_file.yaml"))) as cass, open("tox.ini", "rb") as f:
with vcr.use_cassette(str(tmpdir.join("post_file.yaml"))) as cass, open(".editorconfig", "rb") as f:
original_response = requests.post(url, f).content
# This also tests that we do the right thing with matching the body when they are files.
@@ -272,10 +274,10 @@ def test_post_file(tmpdir, httpbin_both):
str(tmpdir.join("post_file.yaml")),
match_on=("method", "scheme", "host", "port", "path", "query", "body"),
) as cass:
with open("tox.ini", "rb") as f:
tox_content = f.read()
assert cass.requests[0].body.read() == tox_content
with open("tox.ini", "rb") as f:
with open(".editorconfig", "rb") as f:
editorconfig = f.read()
assert cass.requests[0].body.read() == editorconfig
with open(".editorconfig", "rb") as f:
new_response = requests.post(url, f).content
assert original_response == new_response


@@ -2,10 +2,10 @@ import http.client as httplib
import json
import zlib
from assertions import assert_is_json_bytes
import vcr
from ..assertions import assert_is_json_bytes
def _headers_are_case_insensitive(host, port):
conn = httplib.HTTPConnection(host, port)


@@ -1,19 +1,45 @@
"""Test requests' interaction with vcr"""
import asyncio
import functools
import inspect
import json
import pytest
from assertions import assert_cassette_empty, assert_is_json_bytes
import vcr
from vcr.errors import CannotOverwriteExistingCassetteException
from ..assertions import assert_cassette_empty, assert_is_json_bytes
tornado = pytest.importorskip("tornado")
gen = pytest.importorskip("tornado.gen")
http = pytest.importorskip("tornado.httpclient")
# whether the current version of Tornado supports the raise_error argument for
# fetch().
supports_raise_error = tornado.version_info >= (4,)
raise_error_for_response_code_only = tornado.version_info >= (6,)
def gen_test(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
async def coro():
return await gen.coroutine(func)(*args, **kwargs)
return asyncio.run(coro())
# Patch the signature so pytest can inject fixtures.
# We can't use wrapt.decorator because it returns a generator function.
wrapper.__signature__ = inspect.signature(func)
return wrapper
@pytest.fixture(params=["https", "http"])
def scheme(request):
"""Fixture that returns both http and https."""
return request.param
@pytest.fixture(params=["simple", "curl", "default"])
@@ -43,7 +69,8 @@ def post(client, url, data=None, **kwargs):
return client.fetch(http.HTTPRequest(url, method="POST", **kwargs))
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_status_code(get_client, scheme, tmpdir):
"""Ensure that we can read the status code"""
url = scheme + "://httpbin.org/"
@@ -55,7 +82,8 @@ def test_status_code(get_client, scheme, tmpdir):
assert 1 == cass.play_count
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_headers(get_client, scheme, tmpdir):
"""Ensure that we can read the headers back"""
url = scheme + "://httpbin.org/"
@@ -67,7 +95,8 @@ def test_headers(get_client, scheme, tmpdir):
assert 1 == cass.play_count
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_body(get_client, tmpdir, scheme):
"""Ensure the responses are all identical enough"""
@@ -80,7 +109,7 @@ def test_body(get_client, tmpdir, scheme):
assert 1 == cass.play_count
@pytest.mark.gen_test
@gen_test
def test_effective_url(get_client, tmpdir, httpbin):
"""Ensure that the effective_url is captured"""
url = httpbin.url + "/redirect/1"
@@ -93,7 +122,8 @@ def test_effective_url(get_client, tmpdir, httpbin):
assert 1 == cass.play_count
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_auth(get_client, tmpdir, scheme):
"""Ensure that we can handle basic auth"""
auth = ("user", "passwd")
@@ -108,7 +138,8 @@ def test_auth(get_client, tmpdir, scheme):
assert 1 == cass.play_count
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_auth_failed(get_client, tmpdir, scheme):
"""Ensure that we can save failed auth statuses"""
auth = ("user", "wrongwrongwrong")
@@ -131,7 +162,8 @@ def test_auth_failed(get_client, tmpdir, scheme):
assert 1 == cass.play_count
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_post(get_client, tmpdir, scheme):
"""Ensure that we can post and cache the results"""
data = {"key1": "value1", "key2": "value2"}
@@ -146,10 +178,10 @@ def test_post(get_client, tmpdir, scheme):
assert 1 == cass.play_count
@pytest.mark.gen_test
def test_redirects(get_client, tmpdir, scheme):
@gen_test
def test_redirects(get_client, tmpdir, httpbin):
"""Ensure that we can handle redirects"""
url = scheme + "://mockbin.org/redirect/301?url=bytes/1024"
url = httpbin + "/redirect-to?url=bytes/1024&status_code=301"
with vcr.use_cassette(str(tmpdir.join("requests.yaml"))):
content = (yield get(get_client(), url)).body
@@ -158,7 +190,8 @@ def test_redirects(get_client, tmpdir, scheme):
assert cass.play_count == 1
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_cross_scheme(get_client, tmpdir, scheme):
"""Ensure that requests between schemes are treated separately"""
# First fetch a url under http, and then again under https and then
@@ -177,7 +210,8 @@ def test_cross_scheme(get_client, tmpdir, scheme):
assert cass.play_count == 2
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_gzip(get_client, tmpdir, scheme):
"""
Ensure that httpclient is able to automatically decompress the response
@@ -202,7 +236,8 @@ def test_gzip(get_client, tmpdir, scheme):
assert 1 == cass.play_count
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_https_with_cert_validation_disabled(get_client, tmpdir):
cass_path = str(tmpdir.join("cert_validation_disabled.yaml"))
@@ -214,7 +249,7 @@ def test_https_with_cert_validation_disabled(get_client, tmpdir):
assert 1 == cass.play_count
@pytest.mark.gen_test
@gen_test
def test_unsupported_features_raises_in_future(get_client, tmpdir):
"""Ensure that the exception for an AsyncHTTPClient feature not being
supported is raised inside the future."""
@@ -232,7 +267,11 @@ def test_unsupported_features_raises_in_future(get_client, tmpdir):
@pytest.mark.skipif(not supports_raise_error, reason="raise_error unavailable in tornado <= 3")
@pytest.mark.gen_test
@pytest.mark.skipif(
raise_error_for_response_code_only,
reason="raise_error only ignores HTTPErrors due to response code",
)
@gen_test
def test_unsupported_features_raise_error_disabled(get_client, tmpdir):
"""Ensure that the exception for an AsyncHTTPClient feature not being
supported is not raised if raise_error=False."""
@@ -251,7 +290,8 @@ def test_unsupported_features_raise_error_disabled(get_client, tmpdir):
assert "not yet supported by VCR" in str(response.error)
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_cannot_overwrite_cassette_raises_in_future(get_client, tmpdir):
"""Ensure that CannotOverwriteExistingCassetteException is raised inside
the future."""
@@ -267,7 +307,11 @@ def test_cannot_overwrite_cassette_raises_in_future(get_client, tmpdir):
@pytest.mark.skipif(not supports_raise_error, reason="raise_error unavailable in tornado <= 3")
@pytest.mark.gen_test
@pytest.mark.skipif(
raise_error_for_response_code_only,
reason="raise_error only ignores HTTPErrors due to response code",
)
@gen_test
def test_cannot_overwrite_cassette_raise_error_disabled(get_client, tmpdir):
"""Ensure that CannotOverwriteExistingCassetteException is not raised if
raise_error=False in the fetch() call."""
@@ -281,14 +325,14 @@ def test_cannot_overwrite_cassette_raise_error_disabled(get_client, tmpdir):
assert isinstance(response.error, CannotOverwriteExistingCassetteException)
@pytest.mark.gen_test
@gen_test
@vcr.use_cassette(path_transformer=vcr.default_vcr.ensure_suffix(".yaml"))
def test_tornado_with_decorator_use_cassette(get_client):
response = yield get_client().fetch(http.HTTPRequest("http://www.google.com/", method="GET"))
assert response.body.decode("utf-8") == "not actually google"
@pytest.mark.gen_test
@gen_test
@vcr.use_cassette(path_transformer=vcr.default_vcr.ensure_suffix(".yaml"))
def test_tornado_exception_can_be_caught(get_client):
try:
@@ -302,7 +346,8 @@ def test_tornado_exception_can_be_caught(get_client):
assert e.code == 404
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_existing_references_get_patched(tmpdir):
from tornado.httpclient import AsyncHTTPClient
@@ -315,7 +360,8 @@ def test_existing_references_get_patched(tmpdir):
assert cass.play_count == 1
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_existing_instances_get_patched(get_client, tmpdir):
"""Ensure that existing instances of AsyncHTTPClient get patched upon
entering VCR context."""
@@ -330,7 +376,8 @@ def test_existing_instances_get_patched(get_client, tmpdir):
assert cass.play_count == 1
@pytest.mark.gen_test
@pytest.mark.online
@gen_test
def test_request_time_is_set(get_client, tmpdir):
"""Ensures that the request_time on HTTPResponses is set."""


@@ -5,12 +5,13 @@ from urllib.parse import urlencode
from urllib.request import urlopen
import pytest_httpbin.certs
from assertions import assert_cassette_has_one_response
from pytest import mark
# Internal imports
import vcr
from ..assertions import assert_cassette_has_one_response
def urlopen_with_cafile(*args, **kwargs):
context = ssl.create_default_context(cafile=pytest_httpbin.certs.where())


@@ -4,12 +4,13 @@
import pytest
import pytest_httpbin
from assertions import assert_cassette_empty, assert_is_json_bytes
import vcr
from vcr.patch import force_reset
from vcr.stubs.compat import get_headers
from ..assertions import assert_cassette_empty, assert_is_json_bytes
urllib3 = pytest.importorskip("urllib3")


@@ -63,12 +63,12 @@ def test_flickr_should_respond_with_200(tmpdir):
def test_cookies(tmpdir, httpbin):
testfile = str(tmpdir.join("cookies.yml"))
with vcr.use_cassette(testfile):
s = requests.Session()
s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2")
assert s.cookies.keys() == ["k1", "k2"]
with requests.Session() as s:
s.get(httpbin.url + "/cookies/set?k1=v1&k2=v2")
assert s.cookies.keys() == ["k1", "k2"]
r2 = s.get(httpbin.url + "/cookies")
assert sorted(r2.json()["cookies"].keys()) == ["k1", "k2"]
r2 = s.get(httpbin.url + "/cookies")
assert sorted(r2.json()["cookies"].keys()) == ["k1", "k2"]
@pytest.mark.online


@@ -11,6 +11,7 @@ import yaml
from vcr.cassette import Cassette
from vcr.errors import UnhandledHTTPRequestError
from vcr.patch import force_reset
from vcr.request import Request
from vcr.stubs import VCRHTTPSConnection
@@ -410,3 +411,25 @@ def test_find_requests_with_most_matches_many_similar_requests(mock_get_matchers
(1, ["method", "path"], [("query", "failed : query")]),
(3, ["method", "path"], [("query", "failed : query")]),
]
def test_used_interactions(tmpdir):
interactions = [
{"request": {"body": "", "uri": "foo1", "method": "GET", "headers": {}}, "response": "bar1"},
{"request": {"body": "", "uri": "foo2", "method": "GET", "headers": {}}, "response": "bar2"},
{"request": {"body": "", "uri": "foo3", "method": "GET", "headers": {}}, "response": "bar3"},
]
file = tmpdir.join("test_cassette.yml")
file.write(yaml.dump({"interactions": [interactions[0], interactions[1]]}))
cassette = Cassette.load(path=str(file))
request = Request._from_dict(interactions[1]["request"])
cassette.play_response(request)
assert len(cassette._played_interactions) < len(cassette._old_interactions)
request = Request._from_dict(interactions[2]["request"])
cassette.append(request, interactions[2]["response"])
assert len(cassette._new_interactions()) == 1
used_interactions = cassette._played_interactions + cassette._new_interactions()
assert len(used_interactions) == 2


@@ -1,8 +1,12 @@
import contextlib
import http.client as httplib
from io import BytesIO
from tempfile import NamedTemporaryFile
from unittest import mock
from pytest import mark
from vcr import mode
from vcr import mode, use_cassette
from vcr.cassette import Cassette
from vcr.stubs import VCRHTTPSConnection
@@ -16,7 +20,56 @@ class TestVCRConnection:
@mark.online
@mock.patch("vcr.cassette.Cassette.can_play_response_for", return_value=False)
def testing_connect(*args):
vcr_connection = VCRHTTPSConnection("www.google.com")
vcr_connection.cassette = Cassette("test", record_mode=mode.ALL)
vcr_connection.real_connection.connect()
assert vcr_connection.real_connection.sock is not None
with contextlib.closing(VCRHTTPSConnection("www.google.com")) as vcr_connection:
vcr_connection.cassette = Cassette("test", record_mode=mode.ALL)
vcr_connection.real_connection.connect()
assert vcr_connection.real_connection.sock is not None
def test_body_consumed_once_stream(self, tmpdir, httpbin):
self._test_body_consumed_once(
tmpdir,
httpbin,
BytesIO(b"1234567890"),
BytesIO(b"9876543210"),
BytesIO(b"9876543210"),
)
def test_body_consumed_once_iterator(self, tmpdir, httpbin):
self._test_body_consumed_once(
tmpdir,
httpbin,
iter([b"1234567890"]),
iter([b"9876543210"]),
iter([b"9876543210"]),
)
# data2 and data3 should serve the same data, potentially as iterators
def _test_body_consumed_once(
self,
tmpdir,
httpbin,
data1,
data2,
data3,
):
with NamedTemporaryFile(dir=tmpdir, suffix=".yml") as f:
testpath = f.name
# NOTE: ``use_cassette`` is not okay with the file existing
# already. So we use ``.close()`` to not only
# close but also delete the empty file, before we start.
f.close()
host, port = httpbin.host, httpbin.port
match_on = ["method", "uri", "body"]
with use_cassette(testpath, match_on=match_on):
conn1 = httplib.HTTPConnection(host, port)
conn1.request("POST", "/anything", body=data1)
conn1.getresponse()
conn2 = httplib.HTTPConnection(host, port)
conn2.request("POST", "/anything", body=data2)
conn2.getresponse()
with use_cassette(testpath, match_on=match_on) as cass:
conn3 = httplib.HTTPConnection(host, port)
conn3.request("POST", "/anything", body=data3)
conn3.getresponse()
assert cass.play_counts[0] == 0
assert cass.play_counts[1] == 1
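A one-line illustration of why data2 and data3 must each supply the payload: a plain iterator is exhausted after a single pass, so the body can only be read once.
it = iter([b"9876543210"])
assert b"".join(it) == b"9876543210"
assert b"".join(it) == b""  # a second pass over the same iterator yields nothing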

tests/unit/test_util.py

@@ -0,0 +1,33 @@
from io import BytesIO, StringIO
import pytest
from vcr import request
from vcr.util import read_body
@pytest.mark.parametrize(
"input_, expected_output",
[
(BytesIO(b"Stream"), b"Stream"),
(StringIO("Stream"), b"Stream"),
(iter(["StringIter"]), b"StringIter"),
(iter(["String", "Iter"]), b"StringIter"),
(iter([b"BytesIter"]), b"BytesIter"),
(iter([b"Bytes", b"Iter"]), b"BytesIter"),
(iter([70, 111, 111]), b"Foo"),
(iter([]), b""),
("String", b"String"),
(b"Bytes", b"Bytes"),
],
)
def test_read_body(input_, expected_output):
r = request.Request("POST", "http://host.com/", input_, {})
assert read_body(r) == expected_output
def test_unsupported_read_body():
r = request.Request("POST", "http://host.com/", iter([[]]), {})
with pytest.raises(ValueError) as excinfo:
assert read_body(r)
assert excinfo.value.args == ("Body type <class 'list'> not supported",)


@@ -372,3 +372,19 @@ def test_path_class_as_cassette():
)
with use_cassette(path):
pass
def test_use_cassette_generator_return():
ret_val = object()
vcr = VCR()
@vcr.use_cassette("test")
def gen():
return ret_val
yield
with pytest.raises(StopIteration) as exc_info:
next(gen())
assert exc_info.value.value is ret_val

tox.ini

@@ -1,87 +0,0 @@
[tox]
skip_missing_interpreters=true
envlist =
cov-clean,
lint,
{py38,py39,py310,py311,py312}-{requests-urllib3-1,httplib2,urllib3-1,tornado4,boto3,aiohttp,httpx},
{py310,py311,py312}-{requests-urllib3-2,urllib3-2},
{pypy3}-{requests-urllib3-1,httplib2,urllib3-1,tornado4,boto3},
#{py310}-httpx019,
cov-report
[gh-actions]
python =
3.8: py38
3.9: py39
3.10: py310, lint
3.11: py311
3.12: py312
pypy-3: pypy3
# Coverage environment tasks: cov-clean and cov-report
# https://pytest-cov.readthedocs.io/en/latest/tox.html
[testenv:cov-clean]
deps = coverage
skip_install=true
commands = coverage erase
[testenv:cov-report]
deps = coverage
skip_install=true
commands =
coverage html
coverage report --fail-under=90
[testenv:lint]
skipsdist = True
commands =
black --version
black --check --diff .
ruff --version
ruff check .
deps =
black
ruff
basepython = python3.10
[testenv]
# Need to use develop install so that paths
# for aggregate code coverage combine
usedevelop=true
commands =
./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append {posargs}
allowlist_externals =
./runtests.sh
deps =
Werkzeug==2.0.3
pytest
pytest-httpbin>=1.0.1
pytest-cov
PyYAML
ipaddress
requests: requests>=2.22.0
httplib2: httplib2
urllib3-1: urllib3<2
urllib3-2: urllib3<3
boto3: boto3
aiohttp: aiohttp
aiohttp: pytest-asyncio
aiohttp: pytest-aiohttp
httpx: httpx
{py38,py39,py310}-{httpx}: httpx
{py38,py39,py310}-{httpx}: pytest-asyncio
httpx: httpx>0.19
httpx019: httpx==0.19
{py38,py39,py310}-{httpx}: pytest-asyncio
depends =
lint,{py38,py39,py310,py311,py312,pypy3}-{requests-urllib3-1,httplib2,urllib3-1,tornado4,boto3},{py310,py311,py312}-{requests-urllib3-2,urllib3-2},{py38,py39,py310,py311,py312}-{aiohttp},{py38,py39,py310,py311,py312}-{httpx}: cov-clean
cov-report: lint,{py38,py39,py310,py311,py312,pypy3}-{requests-urllib3-1,httplib2,urllib3-1,tornado4,boto3},{py310,py311,py312}-{requests-urllib3-2,urllib3-2},{py38,py39,py310,py311,py312}-{aiohttp}
passenv =
AWS_ACCESS_KEY_ID
AWS_DEFAULT_REGION
AWS_SECRET_ACCESS_KEY
setenv =
# workaround for broken C extension in aiohttp
# see: https://github.com/aio-libs/aiohttp/issues/7229
py312: AIOHTTP_NO_EXTENSIONS=1


@@ -4,7 +4,7 @@ from logging import NullHandler
from .config import VCR
from .record_mode import RecordMode as mode # noqa: F401
__version__ = "5.1.0"
__version__ = "7.0.0"
logging.getLogger(__name__).addHandler(NullHandler())


@@ -3,8 +3,7 @@ import contextlib
import copy
import inspect
import logging
import sys
from asyncio import iscoroutinefunction
from inspect import iscoroutinefunction
import wrapt
@@ -126,20 +125,7 @@ class CassetteContextDecorator:
duration of the generator.
"""
with self as cassette:
coroutine = fn(cassette)
# We don't need to catch StopIteration. The caller (Tornado's
# gen.coroutine, for example) will handle that.
to_yield = next(coroutine)
while True:
try:
to_send = yield to_yield
except Exception:
to_yield = coroutine.throw(*sys.exc_info())
else:
try:
to_yield = coroutine.send(to_send)
except StopIteration:
break
return (yield from fn(cassette))
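The one-liner relies on standard generator delegation; a minimal sketch of those semantics in plain Python (no vcr objects involved):
def inner():
    sent = yield "first"
    return sent * 2

def outer():
    return (yield from inner())

g = outer()
assert next(g) == "first"    # values yielded by inner() pass straight through
try:
    g.send(21)               # send() (and throw()) are forwarded to inner()
except StopIteration as stop:
    assert stop.value == 42  # inner()'s return value becomes outer()'s return value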
def _handle_function(self, fn):
with self as cassette:
@@ -191,6 +177,7 @@ class Cassette:
custom_patches=(),
inject=False,
allow_playback_repeats=False,
drop_unused_requests=False,
):
self._persister = persister or FilesystemPersister
self._path = path
@@ -203,6 +190,7 @@ class Cassette:
self.record_mode = record_mode
self.custom_patches = custom_patches
self.allow_playback_repeats = allow_playback_repeats
self.drop_unused_requests = drop_unused_requests
# self.data is the list of (req, resp) tuples
self.data = []
@@ -210,6 +198,10 @@ class Cassette:
self.dirty = False
self.rewound = False
# Subsets of self.data to store old and played interactions
self._old_interactions = []
self._played_interactions = []
@property
def play_count(self):
return sum(self.play_counts.values())
@@ -229,7 +221,7 @@ class Cassette:
@property
def write_protected(self):
return self.rewound and self.record_mode == RecordMode.ONCE or self.record_mode == RecordMode.NONE
return (self.rewound and self.record_mode == RecordMode.ONCE) or self.record_mode == RecordMode.NONE
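The added parentheses only make the existing operator precedence explicit, as a quick check shows:
# "a and b or c" already parses as "(a and b) or c" in Python
assert (False and True or True) is True
assert ((False and True) or True) is True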
def append(self, request, response):
"""Add a request, response pair to this cassette"""
@@ -271,6 +263,7 @@ class Cassette:
for index, response in self._responses(request):
if self.play_counts[index] == 0 or self.allow_playback_repeats:
self.play_counts[index] += 1
self._played_interactions.append((request, response))
return response
# The cassette doesn't contain the request asked for.
raise UnhandledHTTPRequestError(
@@ -331,12 +324,36 @@ class Cassette:
return final_best_matches
def _new_interactions(self):
"""List of new HTTP interactions (request/response tuples)"""
new_interactions = []
for request, response in self.data:
if all(
not requests_match(request, old_request, self._match_on)
for old_request, _ in self._old_interactions
):
new_interactions.append((request, response))
return new_interactions
def _as_dict(self):
return {"requests": self.requests, "responses": self.responses}
def _build_used_interactions_dict(self):
interactions = self._played_interactions + self._new_interactions()
cassette_dict = {
"requests": [request for request, _ in interactions],
"responses": [response for _, response in interactions],
}
return cassette_dict
def _save(self, force=False):
if self.drop_unused_requests and len(self._played_interactions) < len(self._old_interactions):
cassette_dict = self._build_used_interactions_dict()
force = True
else:
cassette_dict = self._as_dict()
if force or self.dirty:
self._persister.save_cassette(self._path, self._as_dict(), serializer=self._serializer)
self._persister.save_cassette(self._path, cassette_dict, serializer=self._serializer)
self.dirty = False
def _load(self):
@@ -344,6 +361,7 @@ class Cassette:
requests, responses = self._persister.load_cassette(self._path, serializer=self._serializer)
for request, response in zip(requests, responses):
self.append(request, response)
self._old_interactions.append((request, response))
self.dirty = False
self.rewound = True
except (CassetteDecodeError, CassetteNotFoundError):


@@ -48,6 +48,7 @@ class VCR:
func_path_generator=None,
decode_compressed_response=False,
record_on_exception=True,
drop_unused_requests=False,
):
self.serializer = serializer
self.match_on = match_on
@@ -81,6 +82,7 @@ class VCR:
self.decode_compressed_response = decode_compressed_response
self.record_on_exception = record_on_exception
self._custom_patches = tuple(custom_patches)
self.drop_unused_requests = drop_unused_requests
def _get_serializer(self, serializer_name):
try:
@@ -151,6 +153,7 @@ class VCR:
"func_path_generator": func_path_generator,
"allow_playback_repeats": kwargs.get("allow_playback_repeats", False),
"record_on_exception": record_on_exception,
"drop_unused_requests": kwargs.get("drop_unused_requests", self.drop_unused_requests),
}
path = kwargs.get("path")
if path:


@@ -3,11 +3,10 @@ import logging
import urllib
import xmlrpc.client
from string import hexdigits
from typing import List, Set
from .util import read_body
_HEXDIG_CODE_POINTS: Set[int] = {ord(s.encode("ascii")) for s in hexdigits}
_HEXDIG_CODE_POINTS: set[int] = {ord(s.encode("ascii")) for s in hexdigits}
log = logging.getLogger(__name__)
@@ -109,7 +108,7 @@ def _dechunk(body):
CHUNK_GAP = b"\r\n"
BODY_LEN: int = len(body)
chunks: List[bytes] = []
chunks: list[bytes] = []
pos: int = 0
while True:


@@ -1,4 +1,5 @@
"""Utilities for patching in cassettes"""
import contextlib
import functools
import http.client as httplib
@@ -260,10 +261,14 @@ class CassettePatcherBuilder:
yield cpool, "HTTPConnectionWithTimeout", VCRHTTPConnectionWithTimeout
yield cpool, "HTTPSConnectionWithTimeout", VCRHTTPSConnectionWithTimeout
yield cpool, "SCHEME_TO_CONNECTION", {
"http": VCRHTTPConnectionWithTimeout,
"https": VCRHTTPSConnectionWithTimeout,
}
yield (
cpool,
"SCHEME_TO_CONNECTION",
{
"http": VCRHTTPConnectionWithTimeout,
"https": VCRHTTPSConnectionWithTimeout,
},
)
@_build_patchers_from_mock_triples_decorator
def _tornado(self):
@@ -368,10 +373,6 @@ class ConnectionRemover:
if isinstance(connection, self._connection_class):
self._connection_pool_to_connections.setdefault(pool, set()).add(connection)
def remove_connection_to_pool_entry(self, pool, connection):
if isinstance(connection, self._connection_class):
self._connection_pool_to_connections[self._connection_class].remove(connection)
def __enter__(self):
return self
@@ -382,10 +383,13 @@ class ConnectionRemover:
connection = pool.pool.get()
if isinstance(connection, self._connection_class):
connections.remove(connection)
connection.close()
else:
readd_connections.append(connection)
for connection in readd_connections:
pool._put_conn(connection)
for connection in connections:
connection.close()
def reset_patchers():


@@ -3,7 +3,7 @@ import warnings
from io import BytesIO
from urllib.parse import parse_qsl, urlparse
from .util import CaseInsensitiveDict
from .util import CaseInsensitiveDict, _is_nonsequence_iterator
log = logging.getLogger(__name__)
@@ -17,13 +17,25 @@ class Request:
self.method = method
self.uri = uri
self._was_file = hasattr(body, "read")
self._was_iter = _is_nonsequence_iterator(body)
if self._was_file:
self.body = body.read()
elif self._was_iter:
self.body = list(body)
else:
self.body = body
self.headers = headers
log.debug("Invoking Request %s", self.uri)
@property
def uri(self):
return self._uri
@uri.setter
def uri(self, uri):
self._uri = uri
self.parsed_uri = urlparse(uri)
@property
def headers(self):
return self._headers
@@ -36,7 +48,11 @@ class Request:
@property
def body(self):
return BytesIO(self._body) if self._was_file else self._body
if self._was_file:
return BytesIO(self._body)
if self._was_iter:
return iter(self._body)
return self._body
@body.setter
def body(self, value):
@@ -54,30 +70,29 @@ class Request:
@property
def scheme(self):
return urlparse(self.uri).scheme
return self.parsed_uri.scheme
@property
def host(self):
return urlparse(self.uri).hostname
return self.parsed_uri.hostname
@property
def port(self):
parse_uri = urlparse(self.uri)
port = parse_uri.port
port = self.parsed_uri.port
if port is None:
try:
port = {"https": 443, "http": 80}[parse_uri.scheme]
port = {"https": 443, "http": 80}[self.parsed_uri.scheme]
except KeyError:
pass
return port
@property
def path(self):
return urlparse(self.uri).path
return self.parsed_uri.path
@property
def query(self):
q = urlparse(self.uri).query
q = self.parsed_uri.query
return sorted(parse_qsl(q))
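A toy sketch (hypothetical _UriHolder, not vcr's Request class) of the caching pattern the new uri setter introduces: every property above now reuses one urlparse() result instead of re-parsing on each access.
from urllib.parse import urlparse

class _UriHolder:
    @property
    def uri(self):
        return self._uri

    @uri.setter
    def uri(self, value):
        self._uri = value
        self.parsed_uri = urlparse(value)  # parsed once per assignment

r = _UriHolder()
r.uri = "https://example.org:8443/search?b=2&a=1"
assert (r.parsed_uri.scheme, r.parsed_uri.port, r.parsed_uri.path) == ("https", 8443, "/search")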
# alias for backwards compatibility


@@ -66,6 +66,7 @@ class VCRHTTPResponse(HTTPResponse):
self.reason = recorded_response["status"]["message"]
self.status = self.code = recorded_response["status"]["code"]
self.version = None
self.version_string = None
self._content = BytesIO(self.recorded_response["body"]["string"])
self._closed = False
self._original_response = self # for requests.session.Session cookie extraction
@@ -186,22 +187,34 @@ class VCRConnection:
"""
Returns empty string for the default port and ':port' otherwise
"""
port = self.real_connection.port
port = (
self.real_connection.port
if not self.real_connection._tunnel_host
else self.real_connection._tunnel_port
)
default_port = {"https": 443, "http": 80}[self._protocol]
return f":{port}" if port != default_port else ""
def _real_host(self):
"""Returns the request host"""
if self.real_connection._tunnel_host:
# The real connection is to an HTTPS proxy
return self.real_connection._tunnel_host
else:
return self.real_connection.host
def _uri(self, url):
"""Returns request absolute URI"""
if url and not url.startswith("/"):
# Then this must be a proxy request.
return url
uri = f"{self._protocol}://{self.real_connection.host}{self._port_postfix()}{url}"
uri = f"{self._protocol}://{self._real_host()}{self._port_postfix()}{url}"
log.debug("Absolute URI: %s", uri)
return uri
def _url(self, uri):
"""Returns request selector url from absolute URI"""
prefix = f"{self._protocol}://{self.real_connection.host}{self._port_postfix()}"
prefix = f"{self._protocol}://{self._real_host()}{self._port_postfix()}"
return uri.replace(prefix, "", 1)
def request(self, method, url, body=None, headers=None, *args, **kwargs):


@@ -1,10 +1,12 @@
"""Stubs for aiohttp HTTP clients"""
import asyncio
import functools
import json
import logging
from collections.abc import Mapping
from http.cookies import CookieError, Morsel, SimpleCookie
from typing import Mapping, Union
from typing import Union
from aiohttp import ClientConnectionError, ClientResponse, CookieJar, RequestInfo, hdrs, streams
from aiohttp.helpers import strip_auth_from_url


@@ -1,4 +1,5 @@
"""Stubs for boto3"""
from botocore.awsrequest import AWSHTTPConnection as HTTPConnection
from botocore.awsrequest import AWSHTTPSConnection as VerifiedHTTPSConnection


@@ -1,3 +1,4 @@
import asyncio
import functools
import inspect
import logging
@@ -6,7 +7,9 @@ from unittest.mock import MagicMock, patch
import httpx
from vcr.errors import CannotOverwriteExistingCassetteException
from vcr.filters import decode_response
from vcr.request import Request as VcrRequest
from vcr.serializers.compat import convert_body_to_bytes
_httpx_signature = inspect.signature(httpx.Client.request)
@@ -33,14 +36,29 @@ def _transform_headers(httpx_response):
return out
def _to_serialized_response(httpx_response):
return {
"status_code": httpx_response.status_code,
"http_version": httpx_response.http_version,
"headers": _transform_headers(httpx_response),
"content": httpx_response.content,
async def _to_serialized_response(resp, aread):
# The content shouldn't already have been read in by HTTPX.
assert not hasattr(resp, "_decoder")
# Retrieve the content, but without decoding it.
with patch.dict(resp.headers, {"Content-Encoding": ""}):
if aread:
await resp.aread()
else:
resp.read()
result = {
"status": {"code": resp.status_code, "message": resp.reason_phrase},
"headers": _transform_headers(resp),
"body": {"string": resp.content},
}
# As the content wasn't decoded, we restore the response to a state which
# will be capable of decoding the content for the consumer.
del resp._decoder
resp._content = resp._get_content_decoder().decode(resp.content)
return result
def _from_serialized_headers(headers):
"""
@@ -57,15 +75,32 @@ def _from_serialized_headers(headers):
@patch("httpx.Response.close", MagicMock())
@patch("httpx.Response.read", MagicMock())
def _from_serialized_response(request, serialized_response, history=None):
content = serialized_response.get("content")
# Cassette format generated for HTTPX requests by older versions of
# vcrpy. We restructure the content to resemble what a regular
# cassette looks like.
if "status_code" in serialized_response:
serialized_response = decode_response(
convert_body_to_bytes(
{
"headers": serialized_response["headers"],
"body": {"string": serialized_response["content"]},
"status": {"code": serialized_response["status_code"]},
},
),
)
extensions = None
else:
extensions = {"reason_phrase": serialized_response["status"]["message"].encode()}
response = httpx.Response(
status_code=serialized_response.get("status_code"),
status_code=serialized_response["status"]["code"],
request=request,
headers=_from_serialized_headers(serialized_response.get("headers")),
content=content,
headers=_from_serialized_headers(serialized_response["headers"]),
content=serialized_response["body"]["string"],
history=history or [],
extensions=extensions,
)
response._content = content
return response
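For orientation, a sketch of the two serialized shapes this branch distinguishes (field values are illustrative, not copied from a real cassette):
old_httpx_format = {   # written by older vcrpy HTTPX stubs
    "status_code": 200,
    "http_version": "HTTP/1.1",
    "headers": {"content-type": ["application/json"]},
    "content": "{}",
}

regular_format = {     # what _to_serialized_response() now produces
    "status": {"code": 200, "message": "OK"},
    "headers": {"content-type": ["application/json"]},
    "body": {"string": b"{}"},
}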
@@ -91,17 +126,17 @@ def _shared_vcr_send(cassette, real_send, *args, **kwargs):
return vcr_request, None
def _record_responses(cassette, vcr_request, real_response):
async def _record_responses(cassette, vcr_request, real_response, aread):
for past_real_response in real_response.history:
past_vcr_request = _make_vcr_request(past_real_response.request)
cassette.append(past_vcr_request, _to_serialized_response(past_real_response))
cassette.append(past_vcr_request, await _to_serialized_response(past_real_response, aread))
if real_response.history:
# If there was a redirection, we want to keep the request which will hold the
# final redirect value.
vcr_request = _make_vcr_request(real_response.request)
cassette.append(vcr_request, _to_serialized_response(real_response))
cassette.append(vcr_request, await _to_serialized_response(real_response, aread))
return real_response
@@ -119,8 +154,8 @@ async def _async_vcr_send(cassette, real_send, *args, **kwargs):
return response
real_response = await real_send(*args, **kwargs)
await real_response.aread()
return _record_responses(cassette, vcr_request, real_response)
await _record_responses(cassette, vcr_request, real_response, aread=True)
return real_response
def async_vcr_send(cassette, real_send):
@@ -131,6 +166,22 @@ def async_vcr_send(cassette, real_send):
return _inner_send
def _run_async_function(sync_func, *args, **kwargs):
"""
Safely run an asynchronous function from a synchronous context.
Handles both cases:
- An event loop is already running.
- No event loop exists yet.
"""
try:
asyncio.get_running_loop()
except RuntimeError:
return asyncio.run(sync_func(*args, **kwargs))
else:
# If inside a running loop, schedule the coroutine as a task on that loop
return asyncio.ensure_future(sync_func(*args, **kwargs))
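A minimal sketch of the two branches, using only standard asyncio (the coroutine here is a stand-in for _record_responses):
import asyncio

async def _demo():
    return "recorded"

try:
    asyncio.get_running_loop()
except RuntimeError:
    result = asyncio.run(_demo())            # no loop yet: drive the coroutine ourselves
else:
    result = asyncio.ensure_future(_demo())  # loop already running: only schedule the work

assert result == "recorded"  # a top-level script takes the RuntimeError branch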
def _sync_vcr_send(cassette, real_send, *args, **kwargs):
vcr_request, response = _shared_vcr_send(cassette, real_send, *args, **kwargs)
if response:
@@ -139,8 +190,8 @@ def _sync_vcr_send(cassette, real_send, *args, **kwargs):
return response
real_response = real_send(*args, **kwargs)
real_response.read()
return _record_responses(cassette, vcr_request, real_response)
_run_async_function(_record_responses, cassette, vcr_request, real_response, aread=False)
return real_response
def sync_vcr_send(cassette, real_send):


@@ -1,4 +1,5 @@
"""Stubs for tornado HTTP clients"""
import functools
from io import BytesIO


@@ -89,9 +89,28 @@ def compose(*functions):
return composed
def _is_nonsequence_iterator(obj):
return hasattr(obj, "__iter__") and not isinstance(
obj,
(bytearray, bytes, dict, list, str),
)
def read_body(request):
if hasattr(request.body, "read"):
return request.body.read()
if _is_nonsequence_iterator(request.body):
body = list(request.body)
if body:
if isinstance(body[0], str):
return "".join(body).encode("utf-8")
elif isinstance(body[0], (bytes, bytearray)):
return b"".join(body)
elif isinstance(body[0], int):
return bytes(body)
else:
raise ValueError(f"Body type {type(body[0])} not supported")
return b""
return request.body