Merge remote-tracking branch 'origin/master' into 4072-no-more-blocking-part-2

Itamar Turner-Trauring 2023-12-07 15:44:32 -05:00
commit ac5178269b
37 changed files with 231 additions and 469 deletions

View File

@ -46,8 +46,8 @@ export PIP_FIND_LINKS="file://${WHEELHOUSE_PATH}"
# setuptools 45 requires Python 3.5 or newer. Even though we upgraded pip
# above, it may still not be able to get us a compatible version unless we
# explicitly ask for one.
"${PIP}" install --upgrade setuptools==44.0.0 wheel
"${PIP}" install --upgrade setuptools wheel
# Just about every user of this image wants to use tox from the bootstrap
# virtualenv so go ahead and install it now.
"${PIP}" install "tox~=3.0"
"${PIP}" install "tox~=4.0"

View File

@ -45,16 +45,17 @@ jobs:
fail-fast: false
matrix:
include:
# On macOS don't bother with 3.8, just to get faster builds.
- os: macos-12
python-version: "3.9"
- os: macos-12
python-version: "3.11"
python-version: "3.12"
# We only support PyPy on Linux at the moment.
- os: ubuntu-latest
python-version: "pypy-3.8"
- os: ubuntu-latest
python-version: "pypy-3.9"
- os: ubuntu-latest
python-version: "3.12"
- os: windows-latest
python-version: "3.12"
steps:
# See https://github.com/actions/checkout. A fetch-depth of 0
@ -72,7 +73,7 @@ jobs:
- name: Install Python packages
run: |
pip install --upgrade "tox<4" tox-gh-actions setuptools
pip install --upgrade tox tox-gh-actions setuptools
pip list
- name: Display tool versions
@ -169,7 +170,7 @@ jobs:
- false
include:
- os: ubuntu-20.04
python-version: "3.10"
python-version: "3.12"
force-foolscap: true
steps:
@ -204,7 +205,7 @@ jobs:
- name: Install Python packages
run: |
pip install --upgrade "tox<4"
pip install --upgrade tox
pip list
- name: Display tool versions
@ -264,7 +265,7 @@ jobs:
- name: Install Python packages
run: |
pip install --upgrade "tox<4"
pip install --upgrade tox
pip list
- name: Display tool versions

View File

@ -73,10 +73,15 @@ key on this list.
~$1020
1DskmM8uCvmvTKjPbeDgfmVsGifZCmxouG
* Aspiration contract (first phase, 2019)
$300k-$350k
* Aspiration contract
$300k-$350k (first phase, 2019)
$800k (second phase, 2020)
1gDXYQNH4kCJ8Dk7kgiztfjNUaA1KJcHv
* OpenCollective development work (2023)
~$260k
1KZYr8UU2XjuEdSPzn2pF8eRPZZvffByDf
Historical Donation Addresses
=============================
@ -104,17 +109,17 @@ This document is signed by the Tahoe-LAFS Release-Signing Key (GPG keyid
(https://github.com/tahoe-lafs/tahoe-lafs.git) as `docs/donations.rst`.
Both actions require access to secrets held closely by Tahoe developers.
signed: Brian Warner, 27-Dec-2018
signed: Brian Warner, 25-Oct-2023
-----BEGIN PGP SIGNATURE-----
iQEzBAEBCAAdFiEE405i0G0Oac/KQXn/veDTHWhmanoFAlwlrdsACgkQveDTHWhm
anqEqQf/SdxMvI0+YbsZe+Gr/+lNWrNtfxAkjgLUZYRPmElZG6UKkNuPghXfsYRM
71nRbgbn05jrke7AGlulxNplTxYP/5LQVf5K1nvTE7yPI/LBMudIpAbM3wPiLKSD
qecrVZiqiIBPHWScyya91qirTHtJTJj39cs/N9937hD+Pm65paHWHDZhMkhStGH7
05WtvD0G+fFuAgs04VDBz/XVQlPbngkmdKjIL06jpIAgzC3H9UGFcqe55HKY66jK
W769TiRuGLLS07cOPqg8t2hPpE4wv9Gs02hfg1Jc656scsFuEkh5eMMj/MXcFsED
8vwn16kjJk1fkeg+UofnXsHeHIJalQ==
=/E+V
iQEzBAEBCAAdFiEE405i0G0Oac/KQXn/veDTHWhmanoFAmU5YZMACgkQveDTHWhm
anqt+ggAo2kulNmjrWA5VhqE8i6ckkxQMRVY4y0LAfiI0ho/505ZBZvpoh/Ze31x
ZJj4DczHmZM+m3L+fZyubT4ldagYEojtwkYmxHAQz2DIV4PrdjsUQWyvkNcTBZWu
y5mR5ATk3EYRa19xGEosWK1OzW2kgRbpAbznuWsdxxw9vNENBrolGRsyJqRQHCiV
/4UkrGiOegaJSFMKy2dCyDF3ExD6wT9+fdqC5xDJZjhD+SUDJnD4oWLYLroj//v1
sy4J+/ElNU9oaC0jDb9fx1ECk+u6B+YiaYlW/MrZNqzKCM/76yZ8sA2+ynsOHGtL
bPFpLJjX6gBwHkMqvkWhsJEojxkFVQ==
=gxlb
-----END PGP SIGNATURE-----

View File

@ -131,3 +131,54 @@ developer summit.
* acdfc299c35eed3bb27f7463ad8cdfcdcd4dcfd5184f290f87530c2be999de3e
1.41401086 (@$714.16) = $1009.83, plus 0.000133 tx-fee
Aspiration Contract
-------------------
In December 2018, we entered into an agreement with a non-profit named
Aspiration (https://aspirationtech.org/) to fund contractors for development
work. They handle payroll, taxes, and oversight, in exchange for an 8%
management fee. The first phase of work will extend through most of 2019.
* Recipient: Aspiration
* Address: 1gDXYQNH4kCJ8Dk7kgiztfjNUaA1KJcHv
These txids record the transfers from the primary 1Pxi address to the
Aspiration-specific 1gDXY subaddress. In some cases, leftover funds
were swept back into the main 1Pxi address after the transfers were
complete.
First phase, transfers performed 28-Dec-2018 - 31-Dec-2018, total 89
BTC, about $350K.
* 95c68d488bd92e8c164195370aaa516dff05aa4d8c543d3fb8cfafae2b811e7a
1.0 BTC plus 0.00002705 tx-fee
* c0a5b8e3a63c56c4365d4c3ded0821bc1170f6351502849168bc34e30a0582d7
89.0 BTC plus 0.00000633 tx-fee
* 421cff5f398509aaf48951520738e0e63dfddf1157920c15bdc72c34e24cf1cf
return 0.00005245 BTC to 1Pxi, less 0.00000211 tx-fee
In November 2020, we funded a second phase of the work: 51.38094 BTC,
about $800K.
* 7558cbf3b24e8d835809d2d6f01a8ba229190102efdf36280d0639abaa488721
1.0 BTC plus 0.00230766 tx-fee
* 9c78ae6bb7db62cbd6be82fd52d50a2f015285b562f05de0ebfb0e5afc6fd285
56.0 BTC plus 0.00057400 tx-fee
* fbee4332e8c7ffbc9c1bcaee773f063550e589e58d350d14f6daaa473966c368
returning 5.61906 BTC to 1Pxi, less 0.00012000 tx-fee
Open Collective
---------------
In August 2023, we started working with Open Collective to fund a
grant covering development work performed over the last year.
* Recipient: Open Collective (US)
* Address: 1KZYr8UU2XjuEdSPzn2pF8eRPZZvffByDf
The first phase transferred 7.5 BTC (about $260K).
* (txid)
(amount)

View File

@ -30,6 +30,7 @@ from allmydata.util.deferredutil import async_to_deferred
if sys.platform.startswith('win'):
pytest.skip('Skipping Tor tests on Windows', allow_module_level=True)
@pytest.mark.skipif(sys.version_info[:2] > (3, 11), reason='Chutney still does not support 3.12')
@pytest_twisted.inlineCallbacks
def test_onion_service_storage(reactor, request, temp_dir, flog_gatherer, tor_network, tor_introducer_furl):
"""
@ -140,6 +141,7 @@ def _create_anonymous_node(reactor, name, web_port, request, temp_dir, flog_gath
print("okay, launched")
return result
@pytest.mark.skipif(sys.version_info[:2] > (3, 11), reason='Chutney still does not support 3.12')
@pytest.mark.skipif(sys.platform.startswith('darwin'), reason='This test has issues on macOS')
@pytest_twisted.inlineCallbacks
def test_anonymous_client(reactor, request, temp_dir, flog_gatherer, tor_network, introducer_furl):

View File

@ -1,8 +1,7 @@
#! /usr/bin/env python
from __future__ import print_function
import locale, os, platform, subprocess, sys, traceback
from importlib.metadata import version, PackageNotFoundError
def foldlines(s, numlines=None):
@ -72,17 +71,10 @@ def print_as_ver():
traceback.print_exc(file=sys.stderr)
sys.stderr.flush()
def print_setuptools_ver():
try:
import pkg_resources
out = str(pkg_resources.require("setuptools"))
print("setuptools:", foldlines(out))
except (ImportError, EnvironmentError):
sys.stderr.write("\nGot exception using 'pkg_resources' to get the version of setuptools. Exception follows\n")
traceback.print_exc(file=sys.stderr)
sys.stderr.flush()
except pkg_resources.DistributionNotFound:
print("setuptools:", version("setuptools"))
except PackageNotFoundError:
print('setuptools: DistributionNotFound')
@ -91,14 +83,8 @@ def print_py_pkg_ver(pkgname, modulename=None):
modulename = pkgname
print()
try:
import pkg_resources
out = str(pkg_resources.require(pkgname))
print(pkgname + ': ' + foldlines(out))
except (ImportError, EnvironmentError):
sys.stderr.write("\nGot exception using 'pkg_resources' to get the version of %s. Exception follows.\n" % (pkgname,))
traceback.print_exc(file=sys.stderr)
sys.stderr.flush()
except pkg_resources.DistributionNotFound:
print(pkgname + ': ' + version(pkgname))
except PackageNotFoundError:
print(pkgname + ': DistributionNotFound')
try:
__import__(modulename)
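
A minimal sketch of the importlib.metadata pattern this hunk switches to, using an illustrative helper name that is not taken from the file:

    from importlib.metadata import version, PackageNotFoundError

    def report_version(pkgname: str) -> None:
        # importlib.metadata reads installed distribution metadata directly,
        # with no dependency on setuptools' pkg_resources.
        try:
            print(pkgname + ": " + version(pkgname))
        except PackageNotFoundError:
            print(pkgname + ": DistributionNotFound")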

View File

@ -0,0 +1 @@
Added support for Python 3.12, and updated to work with Eliot 1.15

newsfragments/4074.minor (new file, 0 lines)
View File

newsfragments/4075.minor (new file, 0 lines)
View File

pyproject.toml (new file, 3 lines)
View File

@ -0,0 +1,3 @@
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

View File

@ -32,9 +32,8 @@ VERSION_PY_FILENAME = 'src/allmydata/_version.py'
version = read_version_py(VERSION_PY_FILENAME)
install_requires = [
# we don't need much out of setuptools but the version checking stuff
# needs pkg_resources and PEP 440 version specifiers.
"setuptools >= 28.8.0",
# importlib.resources.files and friends are new in Python 3.9.
"importlib_resources; python_version < '3.9'",
"zfec >= 1.1.0",
@ -113,7 +112,7 @@ install_requires = [
"magic-wormhole >= 0.10.2",
# We want a new enough version to support custom JSON encoders.
"eliot >= 1.13.0",
"eliot >= 1.14.0",
"pyrsistent",
@ -158,10 +157,6 @@ install_requires = [
"filelock",
]
setup_requires = [
'setuptools >= 28.8.0', # for PEP-440 style versions
]
tor_requires = [
# 23.5 added support for custom TLS contexts in web_agent(), which is
# needed for the HTTP storage client to run over Tor.
@ -385,8 +380,8 @@ setup(name="tahoe-lafs", # also set in __init__.py
package_dir = {'':'src'},
packages=find_packages('src') + ['allmydata.test.plugins'],
classifiers=trove_classifiers,
# We support Python 3.8 or later, 3.12 is untested for now
python_requires=">=3.8, <3.12",
# We support Python 3.8 or later, 3.13 is untested for now
python_requires=">=3.8, <3.13",
install_requires=install_requires,
extras_require={
# Duplicate the Twisted pywin32 dependency here. See
@ -410,9 +405,8 @@ setup(name="tahoe-lafs", # also set in __init__.py
# selected here are just the current versions at the time.
# Bumping them to keep up with future releases is fine as long
# as those releases are known to actually work.
"pip==22.0.3",
"wheel==0.37.1",
"setuptools==60.9.1",
"pip==23.3.1",
"wheel==0.41.3",
"subunitreporter==23.8.0",
"python-subunit==1.4.2",
"junitxml==0.7",
@ -448,7 +442,6 @@ setup(name="tahoe-lafs", # also set in __init__.py
"allmydata": ["ported-modules.txt"],
},
include_package_data=True,
setup_requires=setup_requires,
entry_points={
'console_scripts': [
'tahoe = allmydata.scripts.runner:run',

View File

@ -3,16 +3,6 @@ Decentralized storage grid.
community web site: U{https://tahoe-lafs.org/}
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2, PY3
if PY2:
# Don't import future str() so we don't break Foolscap serialization on Python 2.
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, max, min # noqa: F401
from past.builtins import unicode as str
__all__ = [
"__version__",
@ -52,12 +42,6 @@ __appname__ = "tahoe-lafs"
# https://tahoe-lafs.org/trac/tahoe-lafs/wiki/Versioning
__full_version__ = __appname__ + '/' + str(__version__)
# Install Python 3 module locations in Python 2:
from future import standard_library
standard_library.install_aliases()
# Monkey-patch 3rd party libraries:
from ._monkeypatch import patch
patch()
@ -72,8 +56,7 @@ del patch
#
# Also note that BytesWarnings only happen if Python is run with -b option, so
# in practice this should only affect tests.
if PY3:
import warnings
# Error on BytesWarnings, to catch things like str(b""), but only for
# allmydata code.
warnings.filterwarnings("error", category=BytesWarning, module=".*allmydata.*")
import warnings
# Error on BytesWarnings, to catch things like str(b""), but only for
# allmydata code.
warnings.filterwarnings("error", category=BytesWarning, module=".*allmydata.*")

View File

@ -1,89 +0,0 @@
"""
Ported to Python 3.
"""
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
# Note: please minimize imports in this file. In particular, do not import
# any module from Tahoe-LAFS or its dependencies, and do not import any
# modules at all at global level. That includes setuptools and pkg_resources.
# It is ok to import modules from the Python Standard Library if they are
# always available, or the import is protected by try...except ImportError.
# Includes some indirect dependencies, but does not include allmydata.
# These are in the order they should be listed by --version, etc.
package_imports = [
# package name module name
('foolscap', 'foolscap'),
('zfec', 'zfec'),
('Twisted', 'twisted'),
('zope.interface', 'zope.interface'),
('python', None),
('platform', None),
('pyOpenSSL', 'OpenSSL'),
('OpenSSL', None),
('pyasn1', 'pyasn1'),
('service-identity', 'service_identity'),
('pyasn1-modules', 'pyasn1_modules'),
('cryptography', 'cryptography'),
('cffi', 'cffi'),
('six', 'six'),
('enum34', 'enum'),
('pycparser', 'pycparser'),
('PyYAML', 'yaml'),
('magic-wormhole', 'wormhole'),
('setuptools', 'setuptools'),
('eliot', 'eliot'),
('attrs', 'attr'),
('autobahn', 'autobahn'),
]
# Dependencies for which we don't know how to get a version number at run-time.
not_import_versionable = [
'zope.interface',
]
# Dependencies reported by pkg_resources that we can safely ignore.
ignorable = [
'argparse',
'distribute',
'twisted-web',
'twisted-core',
'twisted-conch',
]
# These are suppressed globally:
global_deprecation_messages = [
"BaseException.message has been deprecated as of Python 2.6",
"twisted.internet.interfaces.IFinishableConsumer was deprecated in Twisted 11.1.0: Please use IConsumer (and IConsumer.unregisterProducer) instead.",
"twisted.internet.interfaces.IStreamClientEndpointStringParser was deprecated in Twisted 14.0.0: This interface has been superseded by IStreamClientEndpointStringParserWithReactor.",
]
# These are suppressed while importing dependencies:
deprecation_messages = [
"the sha module is deprecated; use the hashlib module instead",
"object.__new__\(\) takes no parameters",
"The popen2 module is deprecated. Use the subprocess module.",
"the md5 module is deprecated; use hashlib instead",
"twisted.web.error.NoResource is deprecated since Twisted 9.0. See twisted.web.resource.NoResource.",
"the sets module is deprecated",
]
runtime_warning_messages = [
"Not using mpz_powm_sec. You should rebuild using libgmp >= 5 to avoid timing attack vulnerability.",
]
warning_imports = [
'twisted.persisted.sob',
'twisted.python.filepath',
]

View File

@ -12,6 +12,7 @@ from base64 import urlsafe_b64encode
from functools import partial
from configparser import NoSectionError
from six import ensure_text
from foolscap.furl import (
decode_furl,
)
@ -989,6 +990,9 @@ class _Client(node.Node, pollmixin.PollMixin):
static_servers = servers_yaml.get("storage", {})
log.msg("found %d static servers in private/servers.yaml" %
len(static_servers))
static_servers = {
ensure_text(key): value for (key, value) in static_servers.items()
}
self.storage_broker.set_static_servers(static_servers)
except EnvironmentError:
pass
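
The key normalization added above can be shown in isolation; the sample data here is hypothetical:

    from six import ensure_text

    # servers.yaml (or legacy callers) may hand us bytes keys, while the
    # storage broker expects text server IDs, so normalize before use.
    raw = {b"v0-abc123": {"ann": {"nickname": "storage1"}}}
    static_servers = {ensure_text(key): value for (key, value) in raw.items()}
    assert all(isinstance(key, str) for key in static_servers)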

View File

@ -11,6 +11,7 @@ from typing import (
Optional,
Union,
List,
IO
)
from twisted.python.filepath import FilePath
@ -178,6 +179,7 @@ def load_grid_manager(config_path: Optional[FilePath]):
:raises: ValueError if the configuration is invalid or IOError if
expected files can't be opened.
"""
config_file: Union[IO[bytes], IO[str]]
if config_path is None:
config_file = sys.stdin
else:
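
The new annotation is needed because the two branches yield different stream types; a reduced sketch of the pattern (the file-opening mode in the else branch is an assumption):

    import sys
    from typing import IO, Optional, Union

    from twisted.python.filepath import FilePath

    def open_config(config_path: Optional[FilePath]) -> Union[IO[bytes], IO[str]]:
        # stdin is a text stream while the file branch yields a bytes stream,
        # hence the union annotation on the variable.
        config_file: Union[IO[bytes], IO[str]]
        if config_path is None:
            config_file = sys.stdin
        else:
            config_file = config_path.open("r")  # assumption: actual mode may differ
        return config_file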

View File

@ -78,7 +78,7 @@ _READONLY_PEERS = Field(
def _serialize_existing_shares(existing_shares):
return {
server: list(shares)
ensure_str(server): list(shares)
for (server, shares)
in existing_shares.items()
}
@ -91,7 +91,7 @@ _EXISTING_SHARES = Field(
def _serialize_happiness_mappings(happiness_mappings):
return {
sharenum: base32.b2a(serverid)
str(sharenum): ensure_str(base32.b2a(serverid))
for (sharenum, serverid)
in happiness_mappings.items()
}
@ -112,7 +112,7 @@ _UPLOAD_TRACKERS = Field(
u"upload_trackers",
lambda trackers: list(
dict(
server=tracker.get_name(),
server=ensure_str(tracker.get_name()),
shareids=sorted(tracker.buckets.keys()),
)
for tracker
@ -123,7 +123,7 @@ _UPLOAD_TRACKERS = Field(
_ALREADY_SERVERIDS = Field(
u"already_serverids",
lambda d: d,
lambda d: {str(k): v for k, v in d.items()},
u"Some servers which are already holding some shares that we were interested in uploading.",
)
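
All of these serializer tweaks apply the same rule: JSON object keys and values must be text. A hedged sketch of how such an Eliot Field is declared (the field name and docstring below are made up):

    from eliot import Field
    from six import ensure_str

    # The serializer runs when the message is logged; returning only str keys
    # and values keeps the resulting dictionary JSON-encodable.
    _EXAMPLE_SHARES = Field(
        u"example_shares",
        lambda existing: {ensure_str(server): sorted(shares)
                          for (server, shares) in existing.items()},
        u"A mapping from server name to the share numbers it already holds.",
    )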

View File

@ -200,14 +200,14 @@ def read_config(basedir, portnumfile, generated_files: Iterable = (), _valid_con
config_path = FilePath(basedir).child("tahoe.cfg")
try:
config_str = config_path.getContent()
config_bytes = config_path.getContent()
except EnvironmentError as e:
if e.errno != errno.ENOENT:
raise
# The file is missing, just create empty ConfigParser.
config_str = u""
else:
config_str = config_str.decode("utf-8-sig")
config_str = config_bytes.decode("utf-8-sig")
return config_from_string(
basedir,
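
The rename makes the bytes-vs-text handoff explicit; decoding with "utf-8-sig" also strips a UTF-8 byte-order mark if an editor left one, as this small illustration (with made-up file contents) shows:

    # A tahoe.cfg saved by some Windows editors starts with a UTF-8 BOM.
    config_bytes = b"\xef\xbb\xbf[node]\nnickname = example\n"
    config_str = config_bytes.decode("utf-8-sig")
    assert config_str.startswith("[node]")  # plain "utf-8" would keep a leading "\ufeff"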

View File

@ -695,7 +695,7 @@ class HTTPServer(BaseApp):
if accept.best == CBOR_MIME_TYPE:
request.setHeader("Content-Type", CBOR_MIME_TYPE)
f = TemporaryFile()
cbor2.dump(data, f)
cbor2.dump(data, f) # type: ignore
def read_data(offset: int, length: int) -> bytes:
f.seek(offset)
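
The added "# type: ignore" only silences a stub mismatch; the underlying pattern of spooling CBOR to a temporary file and reading ranges back can be sketched on its own (the payload is arbitrary):

    from tempfile import TemporaryFile

    import cbor2

    data = {"version": 1, "shares": [1, 2, 3]}
    f = TemporaryFile()
    cbor2.dump(data, f)      # serialize straight into the file object
    body_length = f.tell()

    def read_data(offset: int, length: int) -> bytes:
        # Serve the encoded body in chunks instead of holding it all in memory.
        f.seek(offset)
        return f.read(length)

    chunks = [read_data(i, 64) for i in range(0, body_length, 64)]
    assert cbor2.loads(b"".join(chunks)) == data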

View File

@ -32,7 +32,6 @@ Ported to Python 3.
from __future__ import annotations
from six import ensure_text
from typing import Union, Callable, Any, Optional, cast, Dict
from os import urandom
import re
@ -273,7 +272,6 @@ class StorageFarmBroker(service.MultiService):
# doesn't really matter but it makes the logging behavior more
# predictable and easier to test (and at least one test does depend on
# this sorted order).
servers = {ensure_text(key): value for (key, value) in servers.items()}
for (server_id, server) in sorted(servers.items()):
try:
storage_server = self._make_storage_server(

View File

@ -125,5 +125,5 @@ if sys.platform == "win32":
initialize()
from eliot import to_file
from allmydata.util.eliotutil import eliot_json_encoder
to_file(open("eliot.log", "wb"), encoder=eliot_json_encoder)
from allmydata.util.jsonbytes import AnyBytesJSONEncoder
to_file(open("eliot.log", "wb"), encoder=AnyBytesJSONEncoder)

View File

@ -1352,6 +1352,26 @@ class _TestCaseMixin(object):
def assertRaises(self, *a, **kw):
return self._dummyCase.assertRaises(*a, **kw)
def failUnless(self, *args, **kwargs):
"""Backwards compatibility method."""
self.assertTrue(*args, **kwargs)
def failIf(self, *args, **kwargs):
"""Backwards compatibility method."""
self.assertFalse(*args, **kwargs)
def failIfEqual(self, *args, **kwargs):
"""Backwards compatibility method."""
self.assertNotEqual(*args, **kwargs)
def failUnlessEqual(self, *args, **kwargs):
"""Backwards compatibility method."""
self.assertEqual(*args, **kwargs)
def failUnlessReallyEqual(self, *args, **kwargs):
"""Backwards compatibility method."""
self.assertReallyEqual(*args, **kwargs)
class SyncTestCase(_TestCaseMixin, TestCase):
"""

View File

@ -29,6 +29,7 @@ from eliot import (
ILogger,
)
from eliot.testing import (
MemoryLogger,
swap_logger,
check_for_errors,
)
@ -37,8 +38,8 @@ from twisted.python.monkey import (
MonkeyPatcher,
)
from ..util.eliotutil import (
MemoryLogger,
from ..util.jsonbytes import (
AnyBytesJSONEncoder
)
_NAME = Field.for_types(
@ -146,7 +147,7 @@ def with_logging(
"""
@wraps(test_method)
def run_with_logging(*args, **kwargs):
validating_logger = MemoryLogger()
validating_logger = MemoryLogger(encoder=AnyBytesJSONEncoder)
original = swap_logger(None)
try:
swap_logger(_TwoLoggers(original, validating_logger))
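
With Eliot 1.14+ as the floor, the validating logger can be built directly with the project's encoder; a minimal sketch of that API in isolation (the message contents are arbitrary):

    from eliot.testing import MemoryLogger
    from allmydata.util.jsonbytes import AnyBytesJSONEncoder

    logger = MemoryLogger(encoder=AnyBytesJSONEncoder)
    logger.write({"message_type": "example", "payload": b"raw bytes"})
    logger.validate()  # raises only if a captured message fails JSON validation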

View File

@ -850,6 +850,7 @@ class StorageClients(SyncTestCase):
actionType=u"storage-client:broker:set-static-servers",
succeeded=True,
),
encoder_=json.AnyBytesJSONEncoder
)
def test_static_servers(self, logger):
"""
@ -884,6 +885,7 @@ class StorageClients(SyncTestCase):
actionType=u"storage-client:broker:make-storage-server",
succeeded=False,
),
encoder_=json.AnyBytesJSONEncoder
)
def test_invalid_static_server(self, logger):
"""

View File

@ -507,7 +507,7 @@ class TestUtil(unittest.TestCase):
"""
remove a simple prefix properly
"""
self.assertEquals(
self.assertEqual(
remove_prefix(b"foobar", b"foo"),
b"bar"
)
@ -523,7 +523,7 @@ class TestUtil(unittest.TestCase):
"""
removing a zero-length prefix does nothing
"""
self.assertEquals(
self.assertEqual(
remove_prefix(b"foobar", b""),
b"foobar",
)
@ -532,7 +532,7 @@ class TestUtil(unittest.TestCase):
"""
removing a prefix which is the whole string is empty
"""
self.assertEquals(
self.assertEqual(
remove_prefix(b"foobar", b"foobar"),
b"",
)

View File

@ -47,7 +47,6 @@ from eliot import (
Message,
MessageType,
fields,
FileDestination,
MemoryLogger,
)
from eliot.twisted import DeferredContext
@ -64,7 +63,6 @@ from twisted.internet.task import deferLater
from twisted.internet import reactor
from ..util.eliotutil import (
eliot_json_encoder,
log_call_deferred,
_parse_destination_description,
_EliotLogging,
@ -188,8 +186,8 @@ class ParseDestinationDescriptionTests(SyncTestCase):
"""
reactor = object()
self.assertThat(
_parse_destination_description("file:-")(reactor),
Equals(FileDestination(stdout, encoder=eliot_json_encoder)),
_parse_destination_description("file:-")(reactor).file,
Equals(stdout),
)

View File

@ -28,14 +28,14 @@ INTRODUCERS_CFG_FURLS_COMMENTED="""introducers:
class MultiIntroTests(unittest.TestCase):
def setUp(self):
async def setUp(self):
# setup tahoe.cfg and basedir/private/introducers
# create a custom tahoe.cfg
self.basedir = os.path.dirname(self.mktemp())
c = open(os.path.join(self.basedir, "tahoe.cfg"), "w")
config = {'hide-ip':False, 'listen': 'tcp',
'port': None, 'location': None, 'hostname': 'example.net'}
write_node_config(c, config)
await write_node_config(c, config)
c.write("[storage]\n")
c.write("enabled = false\n")
c.close()
@ -63,8 +63,7 @@ class MultiIntroTests(unittest.TestCase):
# assertions
self.failUnlessEqual(ic_count, len(connections["introducers"]))
@defer.inlineCallbacks
def test_read_introducer_furl_from_tahoecfg(self):
async def test_read_introducer_furl_from_tahoecfg(self):
"""
The deprecated [client]introducer.furl item is still read and respected.
"""
@ -72,7 +71,7 @@ class MultiIntroTests(unittest.TestCase):
c = open(os.path.join(self.basedir, "tahoe.cfg"), "w")
config = {'hide-ip':False, 'listen': 'tcp',
'port': None, 'location': None, 'hostname': 'example.net'}
write_node_config(c, config)
await write_node_config(c, config)
fake_furl = "furl1"
c.write("[client]\n")
c.write("introducer.furl = %s\n" % fake_furl)
@ -139,14 +138,14 @@ introducers:
"""
class NoDefault(unittest.TestCase):
def setUp(self):
async def setUp(self):
# setup tahoe.cfg and basedir/private/introducers
# create a custom tahoe.cfg
self.basedir = os.path.dirname(self.mktemp())
c = open(os.path.join(self.basedir, "tahoe.cfg"), "w")
config = {'hide-ip':False, 'listen': 'tcp',
'port': None, 'location': None, 'hostname': 'example.net'}
write_node_config(c, config)
await write_node_config(c, config)
c.write("[storage]\n")
c.write("enabled = false\n")
c.close()

View File

@ -26,7 +26,7 @@ from typing import Union, Callable, Tuple, Iterable
from queue import Queue
from cbor2 import dumps
from pycddl import ValidationError as CDDLValidationError
from hypothesis import assume, given, strategies as st
from hypothesis import assume, given, strategies as st, settings as hypothesis_settings
from fixtures import Fixture, TempDir, MonkeyPatch
from treq.testing import StubTreq
from klein import Klein
@ -442,6 +442,9 @@ class CustomHTTPServerTests(SyncTestCase):
result_of(client.get_version())
@given(length=st.integers(min_value=1, max_value=1_000_000))
# On Python 3.12 we're getting weird deadline issues in CI, so disabling
# for now.
@hypothesis_settings(deadline=None)
def test_limited_content_fits(self, length):
"""
``http_client.limited_content()`` returns the body if it is less than
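
Stacking a settings decorator on top of @given is the standard Hypothesis way to relax the per-example deadline; a standalone sketch:

    from hypothesis import given, settings, strategies as st

    @given(length=st.integers(min_value=1, max_value=1_000_000))
    @settings(deadline=None)  # slow CI machines should not turn timing jitter into failures
    def test_length_is_positive(length):
        assert length >= 1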

View File

@ -23,7 +23,7 @@ def assert_soup_has_favicon(testcase, soup):
``BeautifulSoup`` object ``soup`` contains the tahoe favicon link.
"""
links = soup.find_all(u'link', rel=u'shortcut icon')
testcase.assert_(
testcase.assertTrue(
any(t[u'href'] == u'/icon.png' for t in links), soup)
@ -92,6 +92,6 @@ def assert_soup_has_text(testcase, soup, text):
``BeautifulSoup`` object ``soup`` contains the passed in ``text`` anywhere
as a text node.
"""
testcase.assert_(
testcase.assertTrue(
soup.find_all(string=re.compile(re.escape(text))),
soup)

View File

@ -117,7 +117,7 @@ class TestStreamingLogs(AsyncTestCase):
proto.transport.loseConnection()
yield proto.is_closed
self.assertThat(len(messages), Equals(3))
self.assertThat(len(messages), Equals(3), messages)
self.assertThat(messages[0]["action_type"], Equals("test:cli:some-exciting-action"))
self.assertThat(messages[0]["arguments"],
Equals(["hello", "good-\\xff-day", 123, {"a": 35}, [None]]))

View File

@ -1,195 +0,0 @@
"""
Bring in some Eliot updates from newer versions of Eliot than we can
depend on in Python 2. The implementations are copied from Eliot 1.14 and
only changed enough to add Python 2 compatibility.
Every API in this module (except ``eliot_json_encoder``) should be obsolete as
soon as we depend on Eliot 1.14 or newer.
When that happens:
* replace ``capture_logging``
with ``partial(eliot.testing.capture_logging, encoder_=eliot_json_encoder)``
* replace ``validateLogging``
with ``partial(eliot.testing.validateLogging, encoder_=eliot_json_encoder)``
* replace ``MemoryLogger``
with ``partial(eliot.MemoryLogger, encoder=eliot_json_encoder)``
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2
if PY2:
from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import json as pyjson
from functools import wraps, partial
from eliot import (
MemoryLogger as _MemoryLogger,
)
from eliot.testing import (
check_for_errors,
swap_logger,
)
from .jsonbytes import AnyBytesJSONEncoder
# There are currently a number of log messages that include non-UTF-8 bytes.
# Allow these, at least for now. Later when the whole test suite has been
# converted to our SyncTestCase or AsyncTestCase it will be easier to turn
# this off and then attribute log failures to specific codepaths so they can
# be fixed (and then not regressed later) because those instances will result
# in test failures instead of only garbage being written to the eliot log.
eliot_json_encoder = AnyBytesJSONEncoder
class _CustomEncoderMemoryLogger(_MemoryLogger):
"""
Override message validation from the Eliot-supplied ``MemoryLogger`` to
use our chosen JSON encoder.
This is only necessary on Python 2 where we use an old version of Eliot
that does not parameterize the encoder.
"""
def __init__(self, encoder=eliot_json_encoder):
"""
@param encoder: A JSONEncoder subclass to use when encoding JSON.
"""
self._encoder = encoder
super(_CustomEncoderMemoryLogger, self).__init__()
def _validate_message(self, dictionary, serializer):
"""Validate an individual message.
As a side-effect, the message is replaced with its serialized contents.
@param dictionary: A message C{dict} to be validated. Might be mutated
by the serializer!
@param serializer: C{None} or a serializer.
@raises TypeError: If a field name is not unicode, or the dictionary
fails to serialize to JSON.
@raises eliot.ValidationError: If serializer was given and validation
failed.
"""
if serializer is not None:
serializer.validate(dictionary)
for key in dictionary:
if not isinstance(key, str):
if isinstance(key, bytes):
key.decode("utf-8")
else:
raise TypeError(dictionary, "%r is not unicode" % (key,))
if serializer is not None:
serializer.serialize(dictionary)
try:
pyjson.dumps(dictionary, cls=self._encoder)
except Exception as e:
raise TypeError("Message %s doesn't encode to JSON: %s" % (dictionary, e))
if PY2:
MemoryLogger = partial(_CustomEncoderMemoryLogger, encoder=eliot_json_encoder)
else:
MemoryLogger = partial(_MemoryLogger, encoder=eliot_json_encoder)
def validateLogging(
assertion, *assertionArgs, **assertionKwargs
):
"""
Decorator factory for L{unittest.TestCase} methods to add logging
validation.
1. The decorated test method gets a C{logger} keyword argument, a
L{MemoryLogger}.
2. All messages logged to this logger will be validated at the end of
the test.
3. Any unflushed logged tracebacks will cause the test to fail.
For example:
from unittest import TestCase
from eliot.testing import assertContainsFields, validateLogging
class MyTests(TestCase):
def assertFooLogging(self, logger):
assertContainsFields(self, logger.messages[0], {"key": 123})
@param assertion: A callable that will be called with the
L{unittest.TestCase} instance, the logger and C{assertionArgs} and
C{assertionKwargs} once the actual test has run, allowing for extra
logging-related assertions on the effects of the test. Use L{None} if you
want the cleanup assertions registered but no custom assertions.
@param assertionArgs: Additional positional arguments to pass to
C{assertion}.
@param assertionKwargs: Additional keyword arguments to pass to
C{assertion}.
@param encoder_: C{json.JSONEncoder} subclass to use when validating JSON.
"""
encoder_ = assertionKwargs.pop("encoder_", eliot_json_encoder)
def decorator(function):
@wraps(function)
def wrapper(self, *args, **kwargs):
skipped = False
kwargs["logger"] = logger = MemoryLogger(encoder=encoder_)
self.addCleanup(check_for_errors, logger)
# TestCase runs cleanups in reverse order, and we want this to
# run *before* tracebacks are checked:
if assertion is not None:
self.addCleanup(
lambda: skipped
or assertion(self, logger, *assertionArgs, **assertionKwargs)
)
try:
return function(self, *args, **kwargs)
except self.skipException:
skipped = True
raise
return wrapper
return decorator
# PEP 8 variant:
validate_logging = validateLogging
def capture_logging(
assertion, *assertionArgs, **assertionKwargs
):
"""
Capture and validate all logging that doesn't specify a L{Logger}.
See L{validate_logging} for details on the rest of its behavior.
"""
encoder_ = assertionKwargs.pop("encoder_", eliot_json_encoder)
def decorator(function):
@validate_logging(
assertion, *assertionArgs, encoder_=encoder_, **assertionKwargs
)
@wraps(function)
def wrapper(self, *args, **kwargs):
logger = kwargs["logger"]
previous_logger = swap_logger(logger)
def cleanup():
swap_logger(previous_logger)
self.addCleanup(cleanup)
return function(self, *args, **kwargs)
return wrapper
return decorator
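
With the Eliot floor raised to 1.14 in setup.py, the replacements promised by this removed module's docstring reduce to one-liners against upstream Eliot; a hedged sketch of the equivalents:

    from functools import partial

    import eliot
    import eliot.testing

    from allmydata.util.jsonbytes import AnyBytesJSONEncoder

    # What _eliot_updates.py used to provide, expressed with upstream APIs:
    MemoryLogger = partial(eliot.MemoryLogger, encoder=AnyBytesJSONEncoder)
    validateLogging = partial(eliot.testing.validateLogging, encoder_=AnyBytesJSONEncoder)
    capture_logging = partial(eliot.testing.capture_logging, encoder_=AnyBytesJSONEncoder)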

View File

@ -15,7 +15,7 @@ scheduler affinity or cgroups, but that's not the end of the world.
"""
import os
from typing import TypeVar, Callable
from typing import TypeVar, Callable, cast
from functools import partial
import threading
from typing_extensions import ParamSpec
@ -24,8 +24,9 @@ from unittest import TestCase
from twisted.python.threadpool import ThreadPool
from twisted.internet.threads import deferToThreadPool
from twisted.internet import reactor
from twisted.internet.interfaces import IReactorFromThreads
_CPU_THREAD_POOL = ThreadPool(minthreads=0, maxthreads=os.cpu_count(), name="TahoeCPU")
_CPU_THREAD_POOL = ThreadPool(minthreads=0, maxthreads=os.cpu_count() or 1, name="TahoeCPU")
if hasattr(threading, "_register_atexit"):
# This is a private API present in Python 3.8 or later, specifically
# designed for thread pool shutdown. Since it's private, it might go away
@ -64,7 +65,7 @@ async def defer_to_thread(f: Callable[P, R], *args: P.args, **kwargs: P.kwargs)
return f(*args, **kwargs)
# deferToThreadPool has no type annotations...
result = await deferToThreadPool(reactor, _CPU_THREAD_POOL, f, *args, **kwargs)
result = await deferToThreadPool(cast(IReactorFromThreads, reactor), _CPU_THREAD_POOL, f, *args, **kwargs)
return result
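
The cast only satisfies mypy; runtime behaviour is unchanged. For reference, a hedged sketch of how a caller uses this helper (the module path and hashing workload are assumptions):

    import hashlib

    from allmydata.util.cputhreadpool import defer_to_thread  # module path assumed

    async def digest_off_reactor(data: bytes) -> bytes:
        # CPU-bound work runs on the shared "TahoeCPU" pool so the reactor
        # thread stays free to service network I/O.
        return await defer_to_thread(lambda: hashlib.sha256(data).digest())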

View File

@ -3,17 +3,6 @@ Tools aimed at the interaction between Tahoe-LAFS implementation and Eliot.
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from __future__ import (
unicode_literals,
print_function,
absolute_import,
division,
)
__all__ = [
"MemoryLogger",
@ -26,11 +15,6 @@ __all__ = [
"capture_logging",
]
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
from six import ensure_text
from sys import (
stdout,
)
@ -42,6 +26,7 @@ from logging import (
)
from json import loads
from six import ensure_text
from zope.interface import (
implementer,
)
@ -61,6 +46,11 @@ from eliot import (
write_traceback,
start_action,
)
from eliot.testing import (
MemoryLogger,
capture_logging,
)
from eliot._validation import (
ValidationError,
)
@ -87,11 +77,8 @@ from twisted.internet.defer import (
)
from twisted.application.service import Service
from ._eliot_updates import (
MemoryLogger,
eliot_json_encoder,
capture_logging,
)
from .jsonbytes import AnyBytesJSONEncoder
def validateInstanceOf(t):
"""
@ -309,7 +296,7 @@ class _DestinationParser(object):
rotateLength=rotate_length,
maxRotatedFiles=max_rotated_files,
)
return lambda reactor: FileDestination(get_file(), eliot_json_encoder)
return lambda reactor: FileDestination(get_file(), encoder=AnyBytesJSONEncoder)
_parse_destination_description = _DestinationParser().parse

View File

@ -61,6 +61,9 @@ class UTF8BytesJSONEncoder(json.JSONEncoder):
"""
A JSON encoder that can also encode UTF-8 encoded strings.
"""
def default(self, o):
return bytes_to_unicode(False, o)
def encode(self, o, **kwargs):
return json.JSONEncoder.encode(
self, bytes_to_unicode(False, o), **kwargs)
@ -77,6 +80,9 @@ class AnyBytesJSONEncoder(json.JSONEncoder):
Bytes are decoded to strings using UTF-8, if that fails to decode then the
bytes are quoted.
"""
def default(self, o):
return bytes_to_unicode(True, o)
def encode(self, o, **kwargs):
return json.JSONEncoder.encode(
self, bytes_to_unicode(True, o), **kwargs)
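
Adding default() alongside encode() means these encoders also behave when json.dumps() reaches an unknown type via the default() fallback; an illustrative use:

    import json

    from allmydata.util.jsonbytes import AnyBytesJSONEncoder

    doc = {"utf8": b"hello", "raw": b"\xff\xfe"}
    print(json.dumps(doc, cls=AnyBytesJSONEncoder))
    # UTF-8 bytes decode to text; undecodable bytes are quoted instead of
    # raising, per the docstring above.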

View File

@ -4,7 +4,13 @@ Ported to Python 3.
from __future__ import annotations
from six import ensure_str
import sys
if sys.version_info[:2] >= (3, 9):
from importlib.resources import files as resource_files, as_file
else:
from importlib_resources import files as resource_files, as_file
from contextlib import ExitStack
import weakref
from typing import Optional, Union, TypeVar, overload
from typing_extensions import Literal
@ -29,6 +35,7 @@ from twisted.web import (
http,
resource,
template,
static,
)
from twisted.web.iweb import (
IRequest,
@ -852,3 +859,21 @@ def get_keypair(request: IRequest) -> tuple[PublicKey, PrivateKey] | None:
return None
privkey, pubkey = create_signing_keypair_from_string(urlsafe_b64decode(privkey_der))
return pubkey, privkey
def add_static_children(root: IResource):
"""
Add static files from C{allmydata.web} to the given resource.
Package resources may be on the filesystem, or they may be in a zip
or something, so we need to do a bit more work to serve them as
static files.
"""
temporary_file_manager = ExitStack()
static_dir = resource_files("allmydata.web") / "static"
for child in static_dir.iterdir():
child_path = child.name.encode("utf-8")
root.putChild(child_path, static.File(
str(temporary_file_manager.enter_context(as_file(child)))
))
weakref.finalize(root, temporary_file_manager.close)
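
A hedged usage sketch of the new helper, wiring the packaged static assets onto a root resource (the surrounding resource setup is simplified):

    from twisted.web.resource import Resource

    from allmydata.web.common import add_static_children

    root = Resource()
    add_static_children(root)
    # Children such as /icon.png now resolve to files extracted, if necessary,
    # from the allmydata.web static package data; temporary copies are cleaned
    # up when `root` is garbage collected, via the weakref.finalize above.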

View File

@ -1,26 +1,16 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import time, os
from pkg_resources import resource_filename
import time
from twisted.web.template import Element, XMLFile, renderElement, renderer
from twisted.python.filepath import FilePath
from twisted.web import static
import allmydata
from allmydata.util import idlib, jsonbytes as json
from allmydata.web.common import (
render_time,
MultiFormatResource,
SlotsSequenceElement,
add_static_children,
)
@ -38,9 +28,7 @@ class IntroducerRoot(MultiFormatResource):
self.introducer_service = introducer_node.getServiceNamed("introducer")
# necessary as a root Resource
self.putChild(b"", self)
static_dir = resource_filename("allmydata.web", "static")
for filen in os.listdir(static_dir):
self.putChild(filen.encode("utf-8"), static.File(os.path.join(static_dir, filen)))
add_static_children(self)
def _create_element(self):
"""

View File

@ -1,25 +1,13 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2, PY3
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import os
import time
from urllib.parse import quote as urlquote
from hyperlink import DecodedURL, URL
from pkg_resources import resource_filename
from twisted.web import (
http,
resource,
static,
)
from twisted.web.util import redirectTo, Redirect
from twisted.python.filepath import FilePath
@ -54,6 +42,7 @@ from allmydata.web.common import (
render_time_delta,
render_time,
render_time_attr,
add_static_children,
)
from allmydata.web.private import (
create_private_tree,
@ -251,15 +240,10 @@ class Root(MultiFormatResource):
self.putChild(b"named", FileHandler(client))
self.putChild(b"status", status.Status(client.get_history()))
self.putChild(b"statistics", status.Statistics(client.stats_provider))
static_dir = resource_filename("allmydata.web", "static")
for filen in os.listdir(static_dir):
child_path = filen
if PY3:
child_path = filen.encode("utf-8")
self.putChild(child_path, static.File(os.path.join(static_dir, filen)))
self.putChild(b"report_incident", IncidentReporter())
add_static_children(self)
@exception_to_child
def getChild(self, path, request):
if not path:

tox.ini (39 changed lines)
View File

@ -11,6 +11,7 @@ python =
3.9: py39-coverage
3.10: py310-coverage
3.11: py311-coverage
3.12: py312-coverage
pypy-3.8: pypy38
pypy-3.9: pypy39
@ -18,11 +19,14 @@ python =
twisted = 1
[tox]
envlist = typechecks,codechecks,py{38,39,310,311}-{coverage},pypy27,pypy38,pypy39,integration
minversion = 2.4
envlist = typechecks,codechecks,py{38,39,310,311,312}-{coverage},pypy27,pypy38,pypy39,integration
minversion = 4
[testenv]
passenv = TAHOE_LAFS_* PIP_* SUBUNITREPORTER_* USERPROFILE HOMEDRIVE HOMEPATH
# Install code the real way, for maximum realism.
usedevelop = False
passenv = TAHOE_LAFS_*,PIP_*,SUBUNITREPORTER_*,USERPROFILE,HOMEDRIVE,HOMEPATH,COLUMNS
deps =
# We pull in certifi *here* to avoid bug #2913. Basically if a
# `setup_requires=...` causes a package to be installed (with setuptools)
@ -40,10 +44,6 @@ deps =
# with the above pins.
certifi
# We add usedevelop=False because testing against a true installation gives
# more useful results.
usedevelop = False
extras =
# Get general testing environment dependencies so we can run the tests
# how we like.
@ -56,6 +56,7 @@ setenv =
# Define TEST_SUITE in the environment as an aid to constructing the
# correct test command below.
TEST_SUITE = allmydata
COLUMNS = 80
commands =
# As an aid to debugging, dump all of the Python packages and their
@ -81,6 +82,7 @@ commands =
coverage: coverage xml
[testenv:integration]
usedevelop = False
basepython = python3
platform = mylinux: linux
mymacos: darwin
@ -99,11 +101,7 @@ skip_install = true
deps =
# Pin a specific version so we get consistent outcomes; update this
# occasionally:
ruff == 0.0.287
# towncrier doesn't work with importlib_resources 6.0.0
# https://github.com/twisted/towncrier/issues/528
# Will be fixed in first version of Towncrier that is larger than 2023.6.
importlib_resources < 6.0.0
ruff == 0.1.6
towncrier
# On macOS, git inside of towncrier needs $HOME.
passenv = HOME
@ -135,28 +133,33 @@ deps =
types-pyOpenSSL
foolscap
# Upgrade when new releases come out:
Twisted==23.8.0
commands = mypy src
Twisted==23.10.0
commands =
# Different versions of Python have a different standard library, and we
# want to be compatible with all the variations. For speed's sake we only do
# the earliest and latest versions.
mypy --python-version=3.8 src
mypy --python-version=3.12 src
[testenv:draftnews]
passenv = TAHOE_LAFS_* PIP_* SUBUNITREPORTER_* USERPROFILE HOMEDRIVE HOMEPATH
passenv = TAHOE_LAFS_*,PIP_*,SUBUNITREPORTER_*,USERPROFILE,HOMEDRIVE,HOMEPATH,COLUMNS
deps =
# see comment in [testenv] about "certifi"
certifi
towncrier==21.3.0
towncrier==23.11.0
commands =
python -m towncrier --draft --config towncrier.toml
[testenv:news]
# On macOS, git invoked from Tox needs $HOME.
passenv = TAHOE_LAFS_* PIP_* SUBUNITREPORTER_* USERPROFILE HOMEDRIVE HOMEPATH HOME
passenv = TAHOE_LAFS_*,PIP_*,SUBUNITREPORTER_*,USERPROFILE,HOMEDRIVE,HOMEPATH,COLUMNS
whitelist_externals =
git
deps =
# see comment in [testenv] about "certifi"
certifi
towncrier==21.3.0
towncrier==23.11.0
commands =
python -m towncrier --yes --config towncrier.toml
# commit the changes