From 914e7ac013ac97adda9b8798200065f3328a3b76 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:38:46 -0500 Subject: [PATCH 001/186] news fragment --- newsfragments/3523.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3523.minor diff --git a/newsfragments/3523.minor b/newsfragments/3523.minor new file mode 100644 index 000000000..e69de29bb From 20dd4d55ad0052649ffb91fa7991ecd7c53590e1 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:44:12 -0500 Subject: [PATCH 002/186] Use "tahoe run" --- docs/anonymity-configuration.rst | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/docs/anonymity-configuration.rst b/docs/anonymity-configuration.rst index 5ad9ae740..d25f8ad41 100644 --- a/docs/anonymity-configuration.rst +++ b/docs/anonymity-configuration.rst @@ -273,7 +273,7 @@ Then, do the following: [connections] tcp = tor -* Launch the Tahoe server with ``tahoe start $NODEDIR`` +* Launch the Tahoe server with ``tahoe run $NODEDIR`` The ``tub.port`` section will cause the Tahoe server to listen on PORT, but bind the listening socket to the loopback interface, which is not reachable @@ -435,4 +435,3 @@ It is therefore important that your I2P router is sharing bandwidth with other routers, so that you can give back as you use I2P. This will never impair the performance of your Tahoe-LAFS node, because your I2P router will always prioritize your own traffic. - From 9fe3dddac4c76c9628bdb702db48717b8556e5f3 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:44:19 -0500 Subject: [PATCH 003/186] use "tahoe run" --- docs/frontends/CLI.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/frontends/CLI.rst b/docs/frontends/CLI.rst index e46936bad..007032e5d 100644 --- a/docs/frontends/CLI.rst +++ b/docs/frontends/CLI.rst @@ -85,7 +85,7 @@ Node Management "``tahoe create-node [NODEDIR]``" is the basic make-a-new-node command. 
It creates a new directory and populates it with files that -will allow the "``tahoe start``" and related commands to use it later +will allow the "``tahoe run``" and related commands to use it later on. ``tahoe create-node`` creates nodes that have client functionality (upload/download files), web API services (controlled by the '[node]web.port' configuration), and storage services (unless From 35a21b6b3667a2f23f49a4da8c1d6ec8f33f9243 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:44:28 -0500 Subject: [PATCH 004/186] don't use "tahoe start" and "tahoe stop" --- docs/running.rst | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/docs/running.rst b/docs/running.rst index ef6ba42ed..49fccdf9c 100644 --- a/docs/running.rst +++ b/docs/running.rst @@ -81,9 +81,7 @@ does not offer its disk space to other nodes. To configure other behavior, use “``tahoe create-node``” or see :doc:`configuration`. The “``tahoe run``” command above will run the node in the foreground. -On Unix, you can run it in the background instead by using the -“``tahoe start``” command. To stop a node started in this way, use -“``tahoe stop``”. ``tahoe --help`` gives a summary of all commands. +``tahoe --help`` gives a summary of all commands. Running a Server or Introducer From 44cb2d048fe6cc7e810ee53c31bb1dd935659211 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:44:36 -0500 Subject: [PATCH 005/186] use "tahoe run" --- docs/running.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/running.rst b/docs/running.rst index 49fccdf9c..0ed8983f7 100644 --- a/docs/running.rst +++ b/docs/running.rst @@ -97,7 +97,7 @@ and ``--location`` arguments. To construct an introducer, create a new base directory for it (the name of the directory is up to you), ``cd`` into it, and run “``tahoe create-introducer --hostname=example.net .``” (but using the hostname of -your VPS). 
Now run the introducer using “``tahoe start .``”. After it +your VPS). Now run the introducer using “``tahoe run .``”. After it starts, it will write a file named ``introducer.furl`` into the ``private/`` subdirectory of that base directory. This file contains the URL the other nodes must use in order to connect to this introducer. From 4e6077fae1c80d2910997c2d98319627a35998d0 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:45:03 -0500 Subject: [PATCH 006/186] Nope, actually it doesn't. It uses ~/.tahoe. --- docs/frontends/CLI.rst | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/docs/frontends/CLI.rst b/docs/frontends/CLI.rst index 007032e5d..9a001c479 100644 --- a/docs/frontends/CLI.rst +++ b/docs/frontends/CLI.rst @@ -94,8 +94,7 @@ on. ``tahoe create-node`` creates nodes that have client functionality NODEDIR defaults to ``~/.tahoe/`` , and newly-created nodes default to publishing a web server on port 3456 (limited to the loopback interface, at 127.0.0.1, to restrict access to other programs on the same host). All of the -other "``tahoe``" subcommands use corresponding defaults (with the exception -that "``tahoe run``" defaults to running a node in the current directory). +other "``tahoe``" subcommands use corresponding defaults. "``tahoe create-client [NODEDIR]``" creates a node with no storage service. That is, it behaves like "``tahoe create-node --no-storage [NODEDIR]``". From d9ea26b0e55a733594300a5905948cdf6076992a Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:45:32 -0500 Subject: [PATCH 007/186] It works now, that issue is resolved. --- docs/running.rst | 2 -- 1 file changed, 2 deletions(-) diff --git a/docs/running.rst b/docs/running.rst index 0ed8983f7..6d82a97f2 100644 --- a/docs/running.rst +++ b/docs/running.rst @@ -101,8 +101,6 @@ your VPS). Now run the introducer using “``tahoe run .``”. 
After it starts, it will write a file named ``introducer.furl`` into the ``private/`` subdirectory of that base directory. This file contains the URL the other nodes must use in order to connect to this introducer. -(Note that “``tahoe run .``” doesn't work for introducers, this is a -known issue: `#937`_.) You can distribute your Introducer fURL securely to new clients by using the ``tahoe invite`` command. This will prepare some JSON to send to the From aca5397d459f1ad06c9e4da466ab2e4ee4d74c2d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:58:30 -0500 Subject: [PATCH 008/186] Don't use "tahoe start", "tahoe stop", and "tahoe restart" --- docs/frontends/CLI.rst | 19 ------------------- 1 file changed, 19 deletions(-) diff --git a/docs/frontends/CLI.rst b/docs/frontends/CLI.rst index 9a001c479..0badede98 100644 --- a/docs/frontends/CLI.rst +++ b/docs/frontends/CLI.rst @@ -116,25 +116,6 @@ the same way on all platforms and logs to stdout. If you want to run the process as a daemon, it is recommended that you use your favourite daemonization tool. -The now-deprecated "``tahoe start [NODEDIR]``" command will launch a -previously-created node. It will launch the node into the background -using ``tahoe daemonize`` (and internal-only command, not for user -use). On some platforms (including Windows) this command is unable to -run a daemon in the background; in that case it behaves in the same -way as "``tahoe run``". ``tahoe start`` also monitors the logs for up -to 5 seconds looking for either a succesful startup message or for -early failure messages and produces an appropriate exit code. You are -encouraged to use ``tahoe run`` along with your favourite -daemonization tool instead of this. ``tahoe start`` is maintained for -backwards compatibility of users already using it; new scripts should -depend on ``tahoe run``. - -"``tahoe stop [NODEDIR]``" will shut down a running node. 
"``tahoe -restart [NODEDIR]``" will stop and then restart a running -node. Similar to above, you should use ``tahoe run`` instead alongside -your favourite daemonization tool. - - File Store Manipulation ======================= From db9eb6d80792d0c06ba9d9aec843253091138881 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:58:42 -0500 Subject: [PATCH 009/186] maybe "tahoe run" works on incident gatherers? I don't know. I can't actually create one. This feature seems broken in flogtool. So it probably doesn't matter. --- docs/logging.rst | 7 +++---- 1 file changed, 3 insertions(+), 4 deletions(-) diff --git a/docs/logging.rst b/docs/logging.rst index 88cdebc00..893981096 100644 --- a/docs/logging.rst +++ b/docs/logging.rst @@ -128,10 +128,9 @@ provided in ``misc/incident-gatherer/support_classifiers.py`` . There is roughly one category for each ``log.WEIRD``-or-higher level event in the Tahoe source code. -The incident gatherer is created with the "``flogtool -create-incident-gatherer WORKDIR``" command, and started with "``tahoe -start``". The generated "``gatherer.tac``" file should be modified to add -classifier functions. +The incident gatherer is created with the "``flogtool create-incident-gatherer +WORKDIR``" command, and started with "``tahoe run``". The generated +"``gatherer.tac``" file should be modified to add classifier functions. The incident gatherer writes incident names (which are simply the relative pathname of the ``incident-\*.flog.bz2`` file) into ``classified/CATEGORY``. 
From 9a10673e8a6f6b72eab1c947e567989a06109e3f Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 15:59:14 -0500 Subject: [PATCH 010/186] "tahoe start" doesn't even work on these but "twistd -ny" does --- docs/logging.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/logging.rst b/docs/logging.rst index 893981096..11835b283 100644 --- a/docs/logging.rst +++ b/docs/logging.rst @@ -174,7 +174,7 @@ things that happened on multiple machines (such as comparing a client node making a request with the storage servers that respond to that request). Create the Log Gatherer with the "``flogtool create-gatherer WORKDIR``" -command, and start it with "``tahoe start``". Then copy the contents of the +command, and start it with "``twistd -ny gatherer.tac``". Then copy the contents of the ``log_gatherer.furl`` file it creates into the ``BASEDIR/tahoe.cfg`` file (under the key ``log_gatherer.furl`` of the section ``[node]``) of all nodes that should be sending it log events. (See :doc:`configuration`) From ae35ba84de3cdfe42f6c3ee891f5eccbd1b85215 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 16:02:15 -0500 Subject: [PATCH 011/186] Use "tahoe run" not "tahoe start" --- docs/configuration.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/configuration.rst b/docs/configuration.rst index 5bd89eec9..540107cbb 100644 --- a/docs/configuration.rst +++ b/docs/configuration.rst @@ -365,7 +365,7 @@ set the ``tub.location`` option described below. also generally reduced when operating in private mode. 
When False, any of the following configuration problems will cause - ``tahoe start`` to throw a PrivacyError instead of starting the node: + ``tahoe run`` to throw a PrivacyError instead of starting the node: * ``[node] tub.location`` contains any ``tcp:`` hints From 7dda680cb28ae0f12e394b637290f93740843613 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 16:02:26 -0500 Subject: [PATCH 012/186] Already using "tahoe run", de-emphasize "tahoe start" --- integration/conftest.py | 10 ++++------ integration/util.py | 6 ++---- 2 files changed, 6 insertions(+), 10 deletions(-) diff --git a/integration/conftest.py b/integration/conftest.py index ca18230cd..f37ec9353 100644 --- a/integration/conftest.py +++ b/integration/conftest.py @@ -201,9 +201,8 @@ log_gatherer.furl = {log_furl} with open(join(intro_dir, 'tahoe.cfg'), 'w') as f: f.write(config) - # on windows, "tahoe start" means: run forever in the foreground, - # but on linux it means daemonize. "tahoe run" is consistent - # between platforms. + # "tahoe run" is consistent across Linux/macOS/Windows, unlike the old + # "start" command. protocol = _MagicTextProtocol('introducer running') transport = _tahoe_runner_optional_coverage( protocol, @@ -278,9 +277,8 @@ log_gatherer.furl = {log_furl} with open(join(intro_dir, 'tahoe.cfg'), 'w') as f: f.write(config) - # on windows, "tahoe start" means: run forever in the foreground, - # but on linux it means daemonize. "tahoe run" is consistent - # between platforms. + # "tahoe run" is consistent across Linux/macOS/Windows, unlike the old + # "start" command. 
protocol = _MagicTextProtocol('introducer running') transport = _tahoe_runner_optional_coverage( protocol, diff --git a/integration/util.py b/integration/util.py index f916240ff..eed073225 100644 --- a/integration/util.py +++ b/integration/util.py @@ -189,10 +189,8 @@ def _run_node(reactor, node_dir, request, magic_text): magic_text = "client running" protocol = _MagicTextProtocol(magic_text) - # on windows, "tahoe start" means: run forever in the foreground, - # but on linux it means daemonize. "tahoe run" is consistent - # between platforms. - + # "tahoe run" is consistent across Linux/macOS/Windows, unlike the old + # "start" command. transport = _tahoe_runner_optional_coverage( protocol, reactor, From a34fca8e7af67fd137afd71efd6dd5ef889a7e0b Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 18:21:26 -0500 Subject: [PATCH 013/186] Don't think about "tahoe start" --- src/allmydata/scripts/common.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/scripts/common.py b/src/allmydata/scripts/common.py index 29bb1d5f1..106dad3f2 100644 --- a/src/allmydata/scripts/common.py +++ b/src/allmydata/scripts/common.py @@ -37,7 +37,7 @@ class BaseOptions(usage.Options): super(BaseOptions, self).__init__() self.command_name = os.path.basename(sys.argv[0]) - # Only allow "tahoe --version", not e.g. "tahoe start --version" + # Only allow "tahoe --version", not e.g. 
"tahoe run --version" def opt_version(self): raise usage.UsageError("--version not allowed on subcommands") From 4d28b0ec27ac8631f5d4f2ff33a8dad0d29596ff Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 18:22:01 -0500 Subject: [PATCH 014/186] Get rid of "tahoe start", "tahoe daemonize", "tahoe stop", "tahoe restart" --- src/allmydata/scripts/runner.py | 11 +- src/allmydata/scripts/tahoe_daemonize.py | 16 -- src/allmydata/scripts/tahoe_restart.py | 21 -- src/allmydata/scripts/tahoe_start.py | 152 ------------- src/allmydata/scripts/tahoe_stop.py | 85 ------- src/allmydata/test/cli/test_daemonize.py | 202 ----------------- src/allmydata/test/cli/test_start.py | 273 ----------------------- 7 files changed, 1 insertion(+), 759 deletions(-) delete mode 100644 src/allmydata/scripts/tahoe_daemonize.py delete mode 100644 src/allmydata/scripts/tahoe_restart.py delete mode 100644 src/allmydata/scripts/tahoe_start.py delete mode 100644 src/allmydata/scripts/tahoe_stop.py delete mode 100644 src/allmydata/test/cli/test_daemonize.py delete mode 100644 src/allmydata/test/cli/test_start.py diff --git a/src/allmydata/scripts/runner.py b/src/allmydata/scripts/runner.py index 273a05af1..1f993fda1 100644 --- a/src/allmydata/scripts/runner.py +++ b/src/allmydata/scripts/runner.py @@ -9,8 +9,7 @@ from twisted.internet import defer, task, threads from allmydata.scripts.common import get_default_nodedir from allmydata.scripts import debug, create_node, cli, \ - admin, tahoe_daemonize, tahoe_start, \ - tahoe_stop, tahoe_restart, tahoe_run, tahoe_invite + admin, tahoe_run, tahoe_invite from allmydata.util.encodingutil import quote_output, quote_local_unicode_path, get_io_encoding from allmydata.util.eliotutil import ( opt_eliot_destination, @@ -37,19 +36,11 @@ if _default_nodedir: # XXX all this 'dispatch' stuff needs to be unified + fixed up _control_node_dispatch = { - "daemonize": tahoe_daemonize.daemonize, - "start": tahoe_start.start, "run": tahoe_run.run, - "stop": 
tahoe_stop.stop, - "restart": tahoe_restart.restart, } process_control_commands = [ ["run", None, tahoe_run.RunOptions, "run a node without daemonizing"], - ["daemonize", None, tahoe_daemonize.DaemonizeOptions, "(deprecated) run a node in the background"], - ["start", None, tahoe_start.StartOptions, "(deprecated) start a node in the background and confirm it started"], - ["stop", None, tahoe_stop.StopOptions, "(deprecated) stop a node"], - ["restart", None, tahoe_restart.RestartOptions, "(deprecated) restart a node"], ] diff --git a/src/allmydata/scripts/tahoe_daemonize.py b/src/allmydata/scripts/tahoe_daemonize.py deleted file mode 100644 index ad2f92355..000000000 --- a/src/allmydata/scripts/tahoe_daemonize.py +++ /dev/null @@ -1,16 +0,0 @@ -from .run_common import ( - RunOptions as _RunOptions, - run, -) - -__all__ = [ - "DaemonizeOptions", - "daemonize", -] - -class DaemonizeOptions(_RunOptions): - subcommand_name = "daemonize" - -def daemonize(config): - print("'tahoe daemonize' is deprecated; see 'tahoe run'") - return run(config) diff --git a/src/allmydata/scripts/tahoe_restart.py b/src/allmydata/scripts/tahoe_restart.py deleted file mode 100644 index 339db862f..000000000 --- a/src/allmydata/scripts/tahoe_restart.py +++ /dev/null @@ -1,21 +0,0 @@ -from __future__ import print_function - -from .tahoe_start import StartOptions, start -from .tahoe_stop import stop, COULD_NOT_STOP - - -class RestartOptions(StartOptions): - subcommand_name = "restart" - - -def restart(config): - print("'tahoe restart' is deprecated; see 'tahoe run'") - stderr = config.stderr - rc = stop(config) - if rc == COULD_NOT_STOP: - print("ignoring couldn't-stop", file=stderr) - rc = 0 - if rc: - print("not restarting", file=stderr) - return rc - return start(config) diff --git a/src/allmydata/scripts/tahoe_start.py b/src/allmydata/scripts/tahoe_start.py deleted file mode 100644 index bc076d1b7..000000000 --- a/src/allmydata/scripts/tahoe_start.py +++ /dev/null @@ -1,152 +0,0 @@ -from 
__future__ import print_function - -import os -import io -import sys -import time -import subprocess -from os.path import join, exists - -from allmydata.scripts.common import BasedirOptions -from allmydata.scripts.default_nodedir import _default_nodedir -from allmydata.util.encodingutil import quote_local_unicode_path - -from .run_common import MyTwistdConfig, identify_node_type - - -class StartOptions(BasedirOptions): - subcommand_name = "start" - optParameters = [ - ("basedir", "C", None, - "Specify which Tahoe base directory should be used." - " This has the same effect as the global --node-directory option." - " [default: %s]" % quote_local_unicode_path(_default_nodedir)), - ] - - def parseArgs(self, basedir=None, *twistd_args): - # This can't handle e.g. 'tahoe start --nodaemon', since '--nodaemon' - # looks like an option to the tahoe subcommand, not to twistd. So you - # can either use 'tahoe start' or 'tahoe start NODEDIR - # --TWISTD-OPTIONS'. Note that 'tahoe --node-directory=NODEDIR start - # --TWISTD-OPTIONS' also isn't allowed, unfortunately. - - BasedirOptions.parseArgs(self, basedir) - self.twistd_args = twistd_args - - def getSynopsis(self): - return ("Usage: %s [global-options] %s [options]" - " [NODEDIR [twistd-options]]" - % (self.command_name, self.subcommand_name)) - - def getUsage(self, width=None): - t = BasedirOptions.getUsage(self, width) + "\n" - twistd_options = str(MyTwistdConfig()).partition("\n")[2].partition("\n\n")[0] - t += twistd_options.replace("Options:", "twistd-options:", 1) - t += """ - -Note that if any twistd-options are used, NODEDIR must be specified explicitly -(not by default or using -C/--basedir or -d/--node-directory), and followed by -the twistd-options. -""" - return t - - -def start(config): - """ - Start a tahoe node (daemonize it and confirm startup) - - We run 'tahoe daemonize' with all the options given to 'tahoe - start' and then watch the log files for the correct text to appear - (e.g. "introducer started"). 
If that doesn't happen within a few - seconds, an error is printed along with all collected logs. - """ - print("'tahoe start' is deprecated; see 'tahoe run'") - out = config.stdout - err = config.stderr - basedir = config['basedir'] - quoted_basedir = quote_local_unicode_path(basedir) - print("STARTING", quoted_basedir, file=out) - if not os.path.isdir(basedir): - print("%s does not look like a directory at all" % quoted_basedir, file=err) - return 1 - nodetype = identify_node_type(basedir) - if not nodetype: - print("%s is not a recognizable node directory" % quoted_basedir, file=err) - return 1 - - # "tahoe start" attempts to monitor the logs for successful - # startup -- but we can't always do that. - - can_monitor_logs = False - if (nodetype in (u"client", u"introducer") - and "--nodaemon" not in config.twistd_args - and "--syslog" not in config.twistd_args - and "--logfile" not in config.twistd_args): - can_monitor_logs = True - - if "--help" in config.twistd_args: - return 0 - - if not can_monitor_logs: - print("Custom logging options; can't monitor logs for proper startup messages", file=out) - return 1 - - # before we spawn tahoe, we check if "the log file" exists or not, - # and if so remember how big it is -- essentially, we're doing - # "tail -f" to see what "this" incarnation of "tahoe daemonize" - # spews forth. - starting_offset = 0 - log_fname = join(basedir, 'logs', 'twistd.log') - if exists(log_fname): - with open(log_fname, 'r') as f: - f.seek(0, 2) - starting_offset = f.tell() - - # spawn tahoe. Note that since this daemonizes, it should return - # "pretty fast" and with a zero return-code, or else something - # Very Bad has happened. 
- try: - args = [sys.executable] if not getattr(sys, 'frozen', False) else [] - for i, arg in enumerate(sys.argv): - if arg in ['start', 'restart']: - args.append('daemonize') - else: - args.append(arg) - subprocess.check_call(args) - except subprocess.CalledProcessError as e: - return e.returncode - - # now, we have to determine if tahoe has actually started up - # successfully or not. so, we start sucking up log files and - # looking for "the magic string", which depends on the node type. - - magic_string = u'{} running'.format(nodetype) - with io.open(log_fname, 'r') as f: - f.seek(starting_offset) - - collected = u'' - overall_start = time.time() - while time.time() - overall_start < 60: - this_start = time.time() - while time.time() - this_start < 5: - collected += f.read() - if magic_string in collected: - if not config.parent['quiet']: - print("Node has started successfully", file=out) - return 0 - if 'Traceback ' in collected: - print("Error starting node; see '{}' for more:\n\n{}".format( - log_fname, - collected, - ), file=err) - return 1 - time.sleep(0.1) - print("Still waiting up to {}s for node startup".format( - 60 - int(time.time() - overall_start) - ), file=out) - - print("Something has gone wrong starting the node.", file=out) - print("Logs are available in '{}'".format(log_fname), file=out) - print("Collected for this run:", file=out) - print(collected, file=out) - return 1 diff --git a/src/allmydata/scripts/tahoe_stop.py b/src/allmydata/scripts/tahoe_stop.py deleted file mode 100644 index 28c0f8131..000000000 --- a/src/allmydata/scripts/tahoe_stop.py +++ /dev/null @@ -1,85 +0,0 @@ -from __future__ import print_function - -import os -import time -import signal - -from allmydata.scripts.common import BasedirOptions -from allmydata.util.encodingutil import quote_local_unicode_path -from .run_common import get_pidfile, get_pid_from_pidfile - -COULD_NOT_STOP = 2 - - -class StopOptions(BasedirOptions): - def parseArgs(self, basedir=None): - 
BasedirOptions.parseArgs(self, basedir) - - def getSynopsis(self): - return ("Usage: %s [global-options] stop [options] [NODEDIR]" - % (self.command_name,)) - - -def stop(config): - print("'tahoe stop' is deprecated; see 'tahoe run'") - out = config.stdout - err = config.stderr - basedir = config['basedir'] - quoted_basedir = quote_local_unicode_path(basedir) - print("STOPPING", quoted_basedir, file=out) - pidfile = get_pidfile(basedir) - pid = get_pid_from_pidfile(pidfile) - if pid is None: - print("%s does not look like a running node directory (no twistd.pid)" % quoted_basedir, file=err) - # we define rc=2 to mean "nothing is running, but it wasn't me who - # stopped it" - return COULD_NOT_STOP - elif pid == -1: - print("%s contains an invalid PID file" % basedir, file=err) - # we define rc=2 to mean "nothing is running, but it wasn't me who - # stopped it" - return COULD_NOT_STOP - - # kill it hard (SIGKILL), delete the twistd.pid file, then wait for the - # process itself to go away. If it hasn't gone away after 20 seconds, warn - # the user but keep waiting until they give up. 
- try: - os.kill(pid, signal.SIGKILL) - except OSError as oserr: - if oserr.errno == 3: - print(oserr.strerror) - # the process didn't exist, so wipe the pid file - os.remove(pidfile) - return COULD_NOT_STOP - else: - raise - try: - os.remove(pidfile) - except EnvironmentError: - pass - start = time.time() - time.sleep(0.1) - wait = 40 - first_time = True - while True: - # poll once per second until we see the process is no longer running - try: - os.kill(pid, 0) - except OSError: - print("process %d is dead" % pid, file=out) - return - wait -= 1 - if wait < 0: - if first_time: - print("It looks like pid %d is still running " - "after %d seconds" % (pid, - (time.time() - start)), file=err) - print("I will keep watching it until you interrupt me.", file=err) - wait = 10 - first_time = False - else: - print("pid %d still running after %d seconds" % \ - (pid, (time.time() - start)), file=err) - wait = 10 - time.sleep(1) - # control never reaches here: no timeout diff --git a/src/allmydata/test/cli/test_daemonize.py b/src/allmydata/test/cli/test_daemonize.py deleted file mode 100644 index b1365329a..000000000 --- a/src/allmydata/test/cli/test_daemonize.py +++ /dev/null @@ -1,202 +0,0 @@ -import os -from io import ( - BytesIO, -) -from os.path import dirname, join -from mock import patch, Mock -from six.moves import StringIO -from sys import getfilesystemencoding -from twisted.trial import unittest -from allmydata.scripts import runner -from allmydata.scripts.run_common import ( - identify_node_type, - DaemonizeTahoeNodePlugin, - MyTwistdConfig, -) -from allmydata.scripts.tahoe_daemonize import ( - DaemonizeOptions, -) - - -class Util(unittest.TestCase): - def setUp(self): - self.twistd_options = MyTwistdConfig() - self.twistd_options.parseOptions(["DaemonizeTahoeNode"]) - self.options = self.twistd_options.subOptions - - def test_node_type_nothing(self): - tmpdir = self.mktemp() - base = dirname(tmpdir).decode(getfilesystemencoding()) - - t = identify_node_type(base) - 
- self.assertIs(None, t) - - def test_node_type_introducer(self): - tmpdir = self.mktemp() - base = dirname(tmpdir).decode(getfilesystemencoding()) - with open(join(dirname(tmpdir), 'introducer.tac'), 'w') as f: - f.write("test placeholder") - - t = identify_node_type(base) - - self.assertEqual(u"introducer", t) - - def test_daemonize(self): - tmpdir = self.mktemp() - plug = DaemonizeTahoeNodePlugin('client', tmpdir) - - with patch('twisted.internet.reactor') as r: - def call(fn, *args, **kw): - fn() - r.stop = lambda: None - r.callWhenRunning = call - service = plug.makeService(self.options) - service.parent = Mock() - service.startService() - - self.assertTrue(service is not None) - - def test_daemonize_no_keygen(self): - tmpdir = self.mktemp() - stderr = BytesIO() - plug = DaemonizeTahoeNodePlugin('key-generator', tmpdir) - - with patch('twisted.internet.reactor') as r: - def call(fn, *args, **kw): - d = fn() - d.addErrback(lambda _: None) # ignore the error we'll trigger - r.callWhenRunning = call - service = plug.makeService(self.options) - service.stderr = stderr - service.parent = Mock() - # we'll raise ValueError because there's no key-generator - # .. BUT we do this in an async function called via - # "callWhenRunning" .. 
hence using a hook - d = service.set_hook('running') - service.startService() - def done(f): - self.assertIn( - "key-generator support removed", - stderr.getvalue(), - ) - return None - d.addBoth(done) - return d - - def test_daemonize_unknown_nodetype(self): - tmpdir = self.mktemp() - plug = DaemonizeTahoeNodePlugin('an-unknown-service', tmpdir) - - with patch('twisted.internet.reactor') as r: - def call(fn, *args, **kw): - fn() - r.stop = lambda: None - r.callWhenRunning = call - service = plug.makeService(self.options) - service.parent = Mock() - with self.assertRaises(ValueError) as ctx: - service.startService() - self.assertIn( - "unknown nodetype", - str(ctx.exception) - ) - - def test_daemonize_options(self): - parent = runner.Options() - opts = DaemonizeOptions() - opts.parent = parent - opts.parseArgs() - - # just gratuitous coverage, ensureing we don't blow up on - # these methods. - opts.getSynopsis() - opts.getUsage() - - -class RunDaemonizeTests(unittest.TestCase): - - def setUp(self): - # no test should change our working directory - self._working = os.path.abspath('.') - d = super(RunDaemonizeTests, self).setUp() - self._reactor = patch('twisted.internet.reactor') - self._reactor.stop = lambda: None - self._twistd = patch('allmydata.scripts.run_common.twistd') - self.node_dir = self.mktemp() - os.mkdir(self.node_dir) - for cm in [self._reactor, self._twistd]: - cm.__enter__() - return d - - def tearDown(self): - d = super(RunDaemonizeTests, self).tearDown() - for cm in [self._reactor, self._twistd]: - cm.__exit__(None, None, None) - # Note: if you raise an exception (e.g. via self.assertEqual - # or raise RuntimeError) it is apparently just ignored and the - # test passes anyway... 
- if self._working != os.path.abspath('.'): - print("WARNING: a test just changed the working dir; putting it back") - os.chdir(self._working) - return d - - def _placeholder_nodetype(self, nodetype): - fname = join(self.node_dir, '{}.tac'.format(nodetype)) - with open(fname, 'w') as f: - f.write("test placeholder") - - def test_daemonize_defaults(self): - self._placeholder_nodetype('introducer') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't much around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'daemonize', - ]) - i, o, e = StringIO(), StringIO(), StringIO() - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - runner.dispatch(config, i, o, e) - - self.assertEqual(0, exit_code[0]) - - def test_daemonize_wrong_nodetype(self): - self._placeholder_nodetype('invalid') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't much around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'daemonize', - ]) - i, o, e = StringIO(), StringIO(), StringIO() - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - runner.dispatch(config, i, o, e) - - self.assertEqual(0, exit_code[0]) - - def test_daemonize_run(self): - self._placeholder_nodetype('client') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't much around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'daemonize', - ]) - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - from allmydata.scripts.tahoe_daemonize import daemonize - daemonize(config) diff --git a/src/allmydata/test/cli/test_start.py b/src/allmydata/test/cli/test_start.py deleted file mode 100644 index 42c70f024..000000000 --- 
a/src/allmydata/test/cli/test_start.py +++ /dev/null @@ -1,273 +0,0 @@ -import os -import shutil -import subprocess -from os.path import join -from mock import patch -from six.moves import StringIO -from functools import partial - -from twisted.trial import unittest -from allmydata.scripts import runner - - -#@patch('twisted.internet.reactor') -@patch('allmydata.scripts.tahoe_start.subprocess') -class RunStartTests(unittest.TestCase): - - def setUp(self): - d = super(RunStartTests, self).setUp() - self.node_dir = self.mktemp() - os.mkdir(self.node_dir) - return d - - def _placeholder_nodetype(self, nodetype): - fname = join(self.node_dir, '{}.tac'.format(nodetype)) - with open(fname, 'w') as f: - f.write("test placeholder") - - def _pid_file(self, pid): - fname = join(self.node_dir, 'twistd.pid') - with open(fname, 'w') as f: - f.write(u"{}\n".format(pid)) - - def _logs(self, logs): - os.mkdir(join(self.node_dir, 'logs')) - fname = join(self.node_dir, 'logs', 'twistd.log') - with open(fname, 'w') as f: - f.write(logs) - - def test_start_defaults(self, _subprocess): - self._placeholder_nodetype('client') - self._pid_file(1234) - self._logs('one log\ntwo log\nred log\nblue log\n') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'start', - ]) - i, o, e = StringIO(), StringIO(), StringIO() - try: - with patch('allmydata.scripts.tahoe_start.os'): - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - - def launch(*args, **kw): - with open(join(self.node_dir, 'logs', 'twistd.log'), 'a') as f: - f.write('client running\n') # "the magic" - _subprocess.check_call = launch - runner.dispatch(config, i, o, e) - except Exception: - pass - - self.assertEqual([0], exit_code) - self.assertTrue('Node has started' in o.getvalue()) - - def test_start_fails(self, _subprocess): - 
self._placeholder_nodetype('client') - self._logs('existing log line\n') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'start', - ]) - - i, o, e = StringIO(), StringIO(), StringIO() - with patch('allmydata.scripts.tahoe_start.time') as t: - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - - thetime = [0] - def _time(): - thetime[0] += 0.1 - return thetime[0] - t.time = _time - - def launch(*args, **kw): - with open(join(self.node_dir, 'logs', 'twistd.log'), 'a') as f: - f.write('a new log line\n') - _subprocess.check_call = launch - - runner.dispatch(config, i, o, e) - - # should print out the collected logs and an error-code - self.assertTrue("a new log line" in o.getvalue()) - self.assertEqual([1], exit_code) - - def test_start_subprocess_fails(self, _subprocess): - self._placeholder_nodetype('client') - self._logs('existing log line\n') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'start', - ]) - - i, o, e = StringIO(), StringIO(), StringIO() - with patch('allmydata.scripts.tahoe_start.time'): - with patch('allmydata.scripts.runner.sys') as s: - # undo patch for the exception-class - _subprocess.CalledProcessError = subprocess.CalledProcessError - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - - def launch(*args, **kw): - raise subprocess.CalledProcessError(42, "tahoe") - _subprocess.check_call = launch - - runner.dispatch(config, i, o, e) - - # should get our "odd" error-code - self.assertEqual([42], exit_code) - - def test_start_help(self, _subprocess): - self._placeholder_nodetype('client') - - std = StringIO() - with patch('sys.stdout') as stdo: - stdo.write = std.write - try: - 
runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'start', - '--help', - ], stdout=std) - self.fail("Should get exit") - except SystemExit as e: - print(e) - - self.assertIn( - "Usage:", - std.getvalue() - ) - - def test_start_unknown_node_type(self, _subprocess): - self._placeholder_nodetype('bogus') - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'start', - ]) - - i, o, e = StringIO(), StringIO(), StringIO() - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - - runner.dispatch(config, i, o, e) - - # should print out the collected logs and an error-code - self.assertIn( - "is not a recognizable node directory", - e.getvalue() - ) - self.assertEqual([1], exit_code) - - def test_start_nodedir_not_dir(self, _subprocess): - shutil.rmtree(self.node_dir) - assert not os.path.isdir(self.node_dir) - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'start', - ]) - - i, o, e = StringIO(), StringIO(), StringIO() - with patch('allmydata.scripts.runner.sys') as s: - exit_code = [None] - def _exit(code): - exit_code[0] = code - s.exit = _exit - - runner.dispatch(config, i, o, e) - - # should print out the collected logs and an error-code - self.assertIn( - "does not look like a directory at all", - e.getvalue() - ) - self.assertEqual([1], exit_code) - - -class RunTests(unittest.TestCase): - """ - Tests confirming end-user behavior of CLI commands - """ - - def setUp(self): - d = super(RunTests, self).setUp() - self.addCleanup(partial(os.chdir, os.getcwd())) - self.node_dir = self.mktemp() - os.mkdir(self.node_dir) - return d - - 
@patch('twisted.internet.reactor') - def test_run_invalid_config(self, reactor): - """ - Configuration that's invalid should be obvious to the user - """ - - def cwr(fn, *args, **kw): - fn() - - def stop(*args, **kw): - stopped.append(None) - stopped = [] - reactor.callWhenRunning = cwr - reactor.stop = stop - - with open(os.path.join(self.node_dir, "client.tac"), "w") as f: - f.write('test') - - with open(os.path.join(self.node_dir, "tahoe.cfg"), "w") as f: - f.write( - "[invalid section]\n" - "foo = bar\n" - ) - - config = runner.parse_or_exit_with_explanation([ - # have to do this so the tests don't muck around in - # ~/.tahoe (the default) - '--node-directory', self.node_dir, - 'run', - ]) - - i, o, e = StringIO(), StringIO(), StringIO() - d = runner.dispatch(config, i, o, e) - - self.assertFailure(d, SystemExit) - - output = e.getvalue() - # should print out the collected logs and an error-code - self.assertIn( - "invalid section", - output, - ) - self.assertIn( - "Configuration error:", - output, - ) - # ensure reactor.stop was actually called - self.assertEqual([None], stopped) - return d From ca92fa4eb5b977dd879a7fd900e7f13259c481bf Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 18:22:14 -0500 Subject: [PATCH 015/186] Don't think about "tahoe start" --- src/allmydata/scripts/run_common.py | 8 ++++---- src/allmydata/test/test_runner.py | 13 ++++--------- 2 files changed, 8 insertions(+), 13 deletions(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index 71934414d..f62534434 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -73,10 +73,10 @@ class RunOptions(BasedirOptions): ] def parseArgs(self, basedir=None, *twistd_args): - # This can't handle e.g. 'tahoe start --nodaemon', since '--nodaemon' - # looks like an option to the tahoe subcommand, not to twistd. So you - # can either use 'tahoe start' or 'tahoe start NODEDIR - # --TWISTD-OPTIONS'. 
Note that 'tahoe --node-directory=NODEDIR start + # This can't handle e.g. 'tahoe run --reactor=foo', since + # '--reactor=foo' looks like an option to the tahoe subcommand, not to + # twistd. So you can either use 'tahoe run' or 'tahoe run NODEDIR + # --TWISTD-OPTIONS'. Note that 'tahoe --node-directory=NODEDIR run # --TWISTD-OPTIONS' also isn't allowed, unfortunately. BasedirOptions.parseArgs(self, basedir) diff --git a/src/allmydata/test/test_runner.py b/src/allmydata/test/test_runner.py index 2ec871231..a258542b9 100644 --- a/src/allmydata/test/test_runner.py +++ b/src/allmydata/test/test_runner.py @@ -252,15 +252,10 @@ class RunNode(common_util.SignalMixin, unittest.TestCase, pollmixin.PollMixin, RunBinTahoeMixin): """ exercise "tahoe run" for both introducer, client node, and key-generator, - by spawning "tahoe run" (or "tahoe start") as a subprocess. This doesn't - get us line-level coverage, but it does a better job of confirming that - the user can actually run "./bin/tahoe run" and expect it to work. This - verifies that bin/tahoe sets up PYTHONPATH and the like correctly. - - This doesn't work on cygwin (it hangs forever), so we skip this test - when we're on cygwin. It is likely that "tahoe start" itself doesn't - work on cygwin: twisted seems unable to provide a version of - spawnProcess which really works there. + by spawning "tahoe run" as a subprocess. This doesn't get us line-level + coverage, but it does a better job of confirming that the user can + actually run "./bin/tahoe run" and expect it to work. This verifies that + bin/tahoe sets up PYTHONPATH and the like correctly. 
""" def workdir(self, name): From d346c90c6e035d726b81db5a2761f88e6ed53e7b Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 8 Dec 2020 18:22:58 -0500 Subject: [PATCH 016/186] This is gonna take some work --- src/allmydata/test/check_grid.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/check_grid.py b/src/allmydata/test/check_grid.py index d3993ee5e..266b7688b 100644 --- a/src/allmydata/test/check_grid.py +++ b/src/allmydata/test/check_grid.py @@ -24,7 +24,7 @@ To set up the client node, do the following: tahoe create-client DIR populate DIR/introducer.furl - tahoe start DIR + tahoe start DIR XXXX tahoe add-alias -d DIR testgrid `tahoe mkdir -d DIR` pick a 10kB-ish test file, compute its md5sum tahoe put -d DIR FILE testgrid:old.MD5SUM @@ -117,7 +117,7 @@ class GridTester(object): def start_node(self): print("tahoe start", self.nodedir) - self.command(self.tahoe, "start", self.nodedir) + self.command(self.tahoe, "start", self.nodedir) # XXXX time.sleep(5) def stop_node(self): From 5b0190b9a1eef8a9735a8f46a72596178ba2b3c0 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 07:24:51 -0500 Subject: [PATCH 017/186] Remove some more test code related to start/restart/stop --- src/allmydata/test/cli/test_cli.py | 81 +++++++++--------------------- src/allmydata/test/cli_node_api.py | 40 +++++++-------- src/allmydata/test/test_runner.py | 20 ++------ 3 files changed, 46 insertions(+), 95 deletions(-) diff --git a/src/allmydata/test/cli/test_cli.py b/src/allmydata/test/cli/test_cli.py index 7f4f4140e..27af29520 100644 --- a/src/allmydata/test/cli/test_cli.py +++ b/src/allmydata/test/cli/test_cli.py @@ -20,14 +20,14 @@ from allmydata.scripts.common_http import socket_error import allmydata.scripts.common_http # Test that the scripts can be imported. 
-from allmydata.scripts import create_node, debug, tahoe_start, tahoe_restart, \ +from allmydata.scripts import create_node, debug, \ tahoe_add_alias, tahoe_backup, tahoe_check, tahoe_cp, tahoe_get, tahoe_ls, \ tahoe_manifest, tahoe_mkdir, tahoe_mv, tahoe_put, tahoe_unlink, tahoe_webopen, \ - tahoe_stop, tahoe_daemonize, tahoe_run -_hush_pyflakes = [create_node, debug, tahoe_start, tahoe_restart, tahoe_stop, + tahoe_run +_hush_pyflakes = [create_node, debug, tahoe_add_alias, tahoe_backup, tahoe_check, tahoe_cp, tahoe_get, tahoe_ls, tahoe_manifest, tahoe_mkdir, tahoe_mv, tahoe_put, tahoe_unlink, tahoe_webopen, - tahoe_daemonize, tahoe_run] + tahoe_run] from allmydata.scripts import common from allmydata.scripts.common import DEFAULT_ALIAS, get_aliases, get_alias, \ @@ -626,18 +626,6 @@ class Help(unittest.TestCase): help = str(cli.ListAliasesOptions()) self.failUnlessIn("[options]", help) - def test_start(self): - help = str(tahoe_start.StartOptions()) - self.failUnlessIn("[options] [NODEDIR [twistd-options]]", help) - - def test_stop(self): - help = str(tahoe_stop.StopOptions()) - self.failUnlessIn("[options] [NODEDIR]", help) - - def test_restart(self): - help = str(tahoe_restart.RestartOptions()) - self.failUnlessIn("[options] [NODEDIR [twistd-options]]", help) - def test_run(self): help = str(tahoe_run.RunOptions()) self.failUnlessIn("[options] [NODEDIR [twistd-options]]", help) @@ -1269,79 +1257,58 @@ class Options(ReallyEqualMixin, unittest.TestCase): self.failUnlessIn(allmydata.__full_version__, stdout.getvalue()) # but "tahoe SUBCOMMAND --version" should be rejected self.failUnlessRaises(usage.UsageError, self.parse, - ["start", "--version"]) + ["run", "--version"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["start", "--version-and-path"]) + ["run", "--version-and-path"]) def test_quiet(self): # accepted as an overall option, but not on subcommands - o = self.parse(["--quiet", "start"]) + o = self.parse(["--quiet", "run"]) 
self.failUnless(o.parent["quiet"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["start", "--quiet"]) + ["run", "--quiet"]) def test_basedir(self): # accept a --node-directory option before the verb, or a --basedir # option after, or a basedir argument after, but none in the wrong # place, and not more than one of the three. - o = self.parse(["start"]) + o = self.parse(["run"]) self.failUnlessReallyEqual(o["basedir"], os.path.join(fileutil.abspath_expanduser_unicode(u"~"), u".tahoe")) - o = self.parse(["start", "here"]) + o = self.parse(["run", "here"]) self.failUnlessReallyEqual(o["basedir"], fileutil.abspath_expanduser_unicode(u"here")) - o = self.parse(["start", "--basedir", "there"]) + o = self.parse(["run", "--basedir", "there"]) self.failUnlessReallyEqual(o["basedir"], fileutil.abspath_expanduser_unicode(u"there")) - o = self.parse(["--node-directory", "there", "start"]) + o = self.parse(["--node-directory", "there", "run"]) self.failUnlessReallyEqual(o["basedir"], fileutil.abspath_expanduser_unicode(u"there")) - o = self.parse(["start", "here", "--nodaemon"]) + o = self.parse(["run", "here", "--nodaemon"]) self.failUnlessReallyEqual(o["basedir"], fileutil.abspath_expanduser_unicode(u"here")) self.failUnlessRaises(usage.UsageError, self.parse, - ["--basedir", "there", "start"]) + ["--basedir", "there", "run"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["start", "--node-directory", "there"]) + ["run", "--node-directory", "there"]) self.failUnlessRaises(usage.UsageError, self.parse, ["--node-directory=there", - "start", "--basedir=here"]) + "run", "--basedir=here"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["start", "--basedir=here", "anywhere"]) + ["run", "--basedir=here", "anywhere"]) self.failUnlessRaises(usage.UsageError, self.parse, ["--node-directory=there", - "start", "anywhere"]) + "run", "anywhere"]) self.failUnlessRaises(usage.UsageError, self.parse, ["--node-directory=there", - "start", "--basedir=here", "anywhere"]) 
+ "run", "--basedir=here", "anywhere"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["--node-directory=there", "start", "--nodaemon"]) + ["--node-directory=there", "run", "--nodaemon"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["start", "--basedir=here", "--nodaemon"]) + ["run", "--basedir=here", "--nodaemon"]) -class Stop(unittest.TestCase): - def test_non_numeric_pid(self): - """ - If the pidfile exists but does not contain a numeric value, a complaint to - this effect is written to stderr and the non-success result is - returned. - """ - basedir = FilePath(self.mktemp().decode("ascii")) - basedir.makedirs() - basedir.child(u"twistd.pid").setContent(b"foo") - - config = tahoe_stop.StopOptions() - config.stdout = StringIO() - config.stderr = StringIO() - config['basedir'] = basedir.path - - result_code = tahoe_stop.stop(config) - self.assertEqual(2, result_code) - self.assertIn("invalid PID file", config.stderr.getvalue()) - - -class Start(unittest.TestCase): +class Run(unittest.TestCase): @patch('allmydata.scripts.run_common.os.chdir') @patch('allmydata.scripts.run_common.twistd') @@ -1355,13 +1322,13 @@ class Start(unittest.TestCase): basedir.child(u"twistd.pid").setContent(b"foo") basedir.child(u"tahoe-client.tac").setContent(b"") - config = tahoe_daemonize.DaemonizeOptions() + config = tahoe_run.RunOptions() config.stdout = StringIO() config.stderr = StringIO() config['basedir'] = basedir.path config.twistd_args = [] - result_code = tahoe_daemonize.daemonize(config) + result_code = tahoe_run.run(config) self.assertIn("invalid PID file", config.stderr.getvalue()) self.assertTrue(len(mock_twistd.mock_calls), 1) self.assertEqual(mock_twistd.mock_calls[0][0], 'runApp') diff --git a/src/allmydata/test/cli_node_api.py b/src/allmydata/test/cli_node_api.py index 8453fbca2..09e96bce9 100644 --- a/src/allmydata/test/cli_node_api.py +++ b/src/allmydata/test/cli_node_api.py @@ -14,6 +14,10 @@ from errno import ENOENT import attr +from eliot import ( 
+ log_call, +) + from twisted.internet.error import ( ProcessDone, ProcessTerminated, @@ -42,11 +46,9 @@ from twisted.internet.task import ( from ..client import ( _Client, ) -from ..scripts.tahoe_stop import ( - COULD_NOT_STOP, -) from ..util.eliotutil import ( inline_callbacks, + log_call_deferred, ) class Expect(Protocol, object): @@ -156,6 +158,7 @@ class CLINodeAPI(object): env=os.environ, ) + @log_call(action_type="test:cli-api:run", include_args=["extra_tahoe_args"]) def run(self, protocol, extra_tahoe_args=()): """ Start the node running. @@ -176,28 +179,21 @@ class CLINodeAPI(object): if ENOENT != e.errno: raise + @log_call_deferred(action_type="test:cli-api:stop") def stop(self, protocol): - self._execute( - protocol, - [u"stop", self.basedir.asTextMode().path], - ) + return self.stop_and_wait() + @log_call_deferred(action_type="test:cli-api:stop-and-wait") @inline_callbacks def stop_and_wait(self): - if platform.isWindows(): - # On Windows there is no PID file and no "tahoe stop". - if self.process is not None: - while True: - try: - self.process.signalProcess("TERM") - except ProcessExitedAlready: - break - else: - yield deferLater(self.reactor, 0.1, lambda: None) - else: - protocol, ended = wait_for_exit() - self.stop(protocol) - yield ended + if self.process is not None: + while True: + try: + self.process.signalProcess("TERM") + except ProcessExitedAlready: + break + else: + yield deferLater(self.reactor, 0.1, lambda: None) def active(self): # By writing this file, we get two minutes before the client will @@ -208,8 +204,6 @@ class CLINodeAPI(object): def _check_cleanup_reason(self, reason): # Let it fail because the process has already exited. 
reason.trap(ProcessTerminated) - if reason.value.exitCode != COULD_NOT_STOP: - return reason return None def cleanup(self): diff --git a/src/allmydata/test/test_runner.py b/src/allmydata/test/test_runner.py index a258542b9..636880621 100644 --- a/src/allmydata/test/test_runner.py +++ b/src/allmydata/test/test_runner.py @@ -34,6 +34,7 @@ from ._twisted_9607 import ( ) from ..util.eliotutil import ( inline_callbacks, + log_call_deferred, ) def get_root_from_file(src): @@ -54,6 +55,7 @@ rootdir = get_root_from_file(srcfile) class RunBinTahoeMixin(object): + @log_call_deferred(action_type="run-bin-tahoe") def run_bintahoe(self, args, stdin=None, python_options=[], env=None): command = sys.executable argv = python_options + ["-m", "allmydata.scripts.runner"] + args @@ -335,7 +337,7 @@ class RunNode(common_util.SignalMixin, unittest.TestCase, pollmixin.PollMixin, @inline_callbacks def test_client(self): """ - Test many things. + Test too many things. 0) Verify that "tahoe create-node" takes a --webport option and writes the value to the configuration file. @@ -343,9 +345,9 @@ class RunNode(common_util.SignalMixin, unittest.TestCase, pollmixin.PollMixin, 1) Verify that "tahoe run" writes a pid file and a node url file (on POSIX). 2) Verify that the storage furl file has a stable value across a - "tahoe run" / "tahoe stop" / "tahoe run" sequence. + "tahoe run" / stop / "tahoe run" sequence. - 3) Verify that the pid file is removed after "tahoe stop" succeeds (on POSIX). + 3) Verify that the pid file is removed after SIGTERM (on POSIX). """ basedir = self.workdir("test_client") c1 = os.path.join(basedir, "c1") @@ -449,18 +451,6 @@ class RunNode(common_util.SignalMixin, unittest.TestCase, pollmixin.PollMixin, "does not look like a directory at all" ) - def test_stop_bad_directory(self): - """ - If ``tahoe run`` is pointed at a directory where no node is running, it - reports an error and exits. 
- """ - return self._bad_directory_test( - u"test_stop_bad_directory", - "tahoe stop", - lambda tahoe, p: tahoe.stop(p), - "does not look like a running node directory", - ) - @inline_callbacks def _bad_directory_test(self, workdir, description, operation, expected_message): """ From bc9019943e5728132534e51300c65aede175b753 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 07:28:16 -0500 Subject: [PATCH 018/186] no more "tahoe restart" --- docs/frontends/webapi.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/frontends/webapi.rst b/docs/frontends/webapi.rst index 2ee348080..99fa44979 100644 --- a/docs/frontends/webapi.rst +++ b/docs/frontends/webapi.rst @@ -2145,7 +2145,7 @@ you could do the following:: tahoe debug dump-cap URI:CHK:n7r3m6wmomelk4sep3kw5cvduq:os7ijw5c3maek7pg65e5254k2fzjflavtpejjyhshpsxuqzhcwwq:3:20:14861 -> storage index: whpepioyrnff7orecjolvbudeu echo "whpepioyrnff7orecjolvbudeu my puppy told me to" >>$NODEDIR/access.blacklist - tahoe restart $NODEDIR + # ... restart the node to re-read configuration ... tahoe get URI:CHK:n7r3m6wmomelk4sep3kw5cvduq:os7ijw5c3maek7pg65e5254k2fzjflavtpejjyhshpsxuqzhcwwq:3:20:14861 -> error, 403 Access Prohibited: my puppy told me to From 179f0bb9ec942e27a49369461f73efad8573ce73 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 10:55:13 -0500 Subject: [PATCH 019/186] news fragment --- newsfragments/3523.minor | 0 newsfragments/3550.removed | 1 + 2 files changed, 1 insertion(+) delete mode 100644 newsfragments/3523.minor create mode 100644 newsfragments/3550.removed diff --git a/newsfragments/3523.minor b/newsfragments/3523.minor deleted file mode 100644 index e69de29bb..000000000 diff --git a/newsfragments/3550.removed b/newsfragments/3550.removed new file mode 100644 index 000000000..2074bf676 --- /dev/null +++ b/newsfragments/3550.removed @@ -0,0 +1 @@ +The deprecated ``tahoe`` start, restart, stop, and daemonize sub-commands have been removed. 
\ No newline at end of file From 74c39904561c1d0f1b3383c5057f8255760e6b6b Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 10:57:02 -0500 Subject: [PATCH 020/186] This extra stop complexity is no longer needed --- src/allmydata/test/cli_node_api.py | 24 +----------------------- 1 file changed, 1 insertion(+), 23 deletions(-) diff --git a/src/allmydata/test/cli_node_api.py b/src/allmydata/test/cli_node_api.py index 09e96bce9..34d73a199 100644 --- a/src/allmydata/test/cli_node_api.py +++ b/src/allmydata/test/cli_node_api.py @@ -5,7 +5,6 @@ __all__ = [ "on_stdout", "on_stdout_and_stderr", "on_different", - "wait_for_exit", ] import os @@ -19,7 +18,6 @@ from eliot import ( ) from twisted.internet.error import ( - ProcessDone, ProcessTerminated, ProcessExitedAlready, ) @@ -29,9 +27,6 @@ from twisted.internet.interfaces import ( from twisted.python.filepath import ( FilePath, ) -from twisted.python.runtime import ( - platform, -) from twisted.internet.protocol import ( Protocol, ProcessProtocol, @@ -180,7 +175,7 @@ class CLINodeAPI(object): raise @log_call_deferred(action_type="test:cli-api:stop") - def stop(self, protocol): + def stop(self): return self.stop_and_wait() @log_call_deferred(action_type="test:cli-api:stop-and-wait") @@ -210,20 +205,3 @@ class CLINodeAPI(object): stopping = self.stop_and_wait() stopping.addErrback(self._check_cleanup_reason) return stopping - - -class _WaitForEnd(ProcessProtocol, object): - def __init__(self, ended): - self._ended = ended - - def processEnded(self, reason): - if reason.check(ProcessDone): - self._ended.callback(None) - else: - self._ended.errback(reason) - - -def wait_for_exit(): - ended = Deferred() - protocol = _WaitForEnd(ended) - return protocol, ended From 8b890d255ee583463390244b78286fc6cfebb46c Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 11:02:29 -0500 Subject: [PATCH 021/186] extra news fragments --- newsfragments/3523.minor | 0 newsfragments/3524.minor | 0 2 files 
changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3523.minor create mode 100644 newsfragments/3524.minor diff --git a/newsfragments/3523.minor b/newsfragments/3523.minor new file mode 100644 index 000000000..e69de29bb diff --git a/newsfragments/3524.minor b/newsfragments/3524.minor new file mode 100644 index 000000000..e69de29bb From b11161a7aa6c56c6fd60850b378a22606527271f Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 12:47:07 -0500 Subject: [PATCH 022/186] Start porting to Python 3. --- src/allmydata/test/test_system.py | 18 ++++++++++-------- 1 file changed, 10 insertions(+), 8 deletions(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 7a7fe117b..8bbe26ad0 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1,5 +1,7 @@ from __future__ import print_function +from past.builtins import unicode + import os, re, sys, time, json from functools import partial @@ -54,7 +56,7 @@ from ..scripts.common import ( write_introducer, ) -LARGE_DATA = """ +LARGE_DATA = b""" This is some data to publish to the remote grid.., which needs to be large enough to not fit inside a LIT uri. 
""" @@ -761,7 +763,7 @@ class SystemTestMixin(pollmixin.PollMixin, testutil.StallMixin): return os.path.exists(sgf) d.addCallback(lambda junk: self.poll(check_for_furl, timeout=30)) def get_furl(junk): - self.stats_gatherer_furl = file(sgf, 'rb').read().strip() + self.stats_gatherer_furl = open(sgf, 'rb').read().strip() d.addCallback(get_furl) return d @@ -1026,7 +1028,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): def _test_upload_and_download(self, convergence): # we use 4000 bytes of data, which will result in about 400k written # to disk among all our simulated nodes - DATA = "Some data to upload\n" * 200 + DATA = b"Some data to upload\n" * 200 d = self.set_up_nodes() def _check_connections(res): for c in self.clients: @@ -1151,7 +1153,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return connected d.addCallback(lambda ign: self.poll(_has_helper)) - HELPER_DATA = "Data that needs help to upload" * 1000 + HELPER_DATA = b"Data that needs help to upload" * 1000 def _upload_with_helper(res): u = upload.Data(HELPER_DATA, convergence=convergence) d = self.extra_node.upload(u) @@ -1185,7 +1187,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): d.addCallback(fireEventually) def _upload_resumable(res): - DATA = "Data that needs help to upload and gets interrupted" * 1000 + DATA = b"Data that needs help to upload and gets interrupted" * 1000 u1 = CountingDataUploadable(DATA, convergence=convergence) u2 = CountingDataUploadable(DATA, convergence=convergence) @@ -1398,11 +1400,11 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): def test_mutable(self): self.basedir = "system/SystemTest/test_mutable" - DATA = "initial contents go here." # 25 bytes % 3 != 0 + DATA = b"initial contents go here." 
# 25 bytes % 3 != 0 DATA_uploadable = MutableData(DATA) - NEWDATA = "new contents yay" + NEWDATA = b"new contents yay" NEWDATA_uploadable = MutableData(NEWDATA) - NEWERDATA = "this is getting old" + NEWERDATA = b"this is getting old" NEWERDATA_uploadable = MutableData(NEWERDATA) d = self.set_up_nodes() From 1adb40cf3b81d1b35b13d1f3080699386dd200dc Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 12:52:53 -0500 Subject: [PATCH 023/186] Some more progress towards Python 3. --- src/allmydata/client.py | 2 +- src/allmydata/dirnode.py | 6 +++--- src/allmydata/nodemaker.py | 2 +- src/allmydata/test/test_system.py | 6 ++---- 4 files changed, 7 insertions(+), 9 deletions(-) diff --git a/src/allmydata/client.py b/src/allmydata/client.py index 55db7e690..4c7a29d8d 100644 --- a/src/allmydata/client.py +++ b/src/allmydata/client.py @@ -720,7 +720,7 @@ class _Client(node.Node, pollmixin.PollMixin): def get_long_nodeid(self): # this matches what IServer.get_longname() says about us elsewhere vk_string = ed25519.string_from_verifying_key(self._node_public_key) - return remove_prefix(vk_string, "pub-") + return remove_prefix(vk_string, b"pub-") def get_long_tubid(self): return idlib.nodeid_b2a(self.nodeid) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 59ebd73ba..7b6052919 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -179,7 +179,7 @@ class Adder(object): def modify(self, old_contents, servermap, first_time): children = self.node._unpack_contents(old_contents) now = time.time() - for (namex, (child, new_metadata)) in self.entries.iteritems(): + for (namex, (child, new_metadata)) in list(self.entries.items()): name = normalize(namex) precondition(IFilesystemNode.providedBy(child), child) @@ -221,7 +221,7 @@ def _encrypt_rw_uri(writekey, rw_uri): def pack_children(childrenx, writekey, deep_immutable=False): # initial_children must have metadata (i.e. 
{} instead of None) children = {} - for (namex, (node, metadata)) in childrenx.iteritems(): + for (namex, (node, metadata)) in list(childrenx.items()): precondition(isinstance(metadata, dict), "directory creation requires metadata to be a dict, not None", metadata) children[normalize(namex)] = (node, metadata) @@ -245,7 +245,7 @@ def _pack_normalized_children(children, writekey, deep_immutable=False): If deep_immutable is True, I will require that all my children are deeply immutable, and will raise a MustBeDeepImmutableError if not. """ - precondition((writekey is None) or isinstance(writekey, str), writekey) + precondition((writekey is None) or isinstance(writekey, bytes), writekey) has_aux = isinstance(children, AuxValueDict) entries = [] diff --git a/src/allmydata/nodemaker.py b/src/allmydata/nodemaker.py index c3ba1ba7b..ab455ce48 100644 --- a/src/allmydata/nodemaker.py +++ b/src/allmydata/nodemaker.py @@ -126,7 +126,7 @@ class NodeMaker(object): def create_new_mutable_directory(self, initial_children={}, version=None): # initial_children must have metadata (i.e. 
{} instead of None) - for (name, (node, metadata)) in initial_children.iteritems(): + for (name, (node, metadata)) in initial_children.items(): precondition(isinstance(metadata, dict), "create_new_mutable_directory requires metadata to be a dict, not None", metadata) node.raise_error() diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 8bbe26ad0..c151bb09e 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -788,7 +788,7 @@ class SystemTestMixin(pollmixin.PollMixin, testutil.StallMixin): self.helper_furl = helper_furl if self.numclients >= 4: - with open(os.path.join(basedirs[3], 'tahoe.cfg'), 'ab+') as f: + with open(os.path.join(basedirs[3], 'tahoe.cfg'), 'a+') as f: f.write( "[client]\n" "helper.furl = {}\n".format(helper_furl) @@ -831,8 +831,6 @@ class SystemTestMixin(pollmixin.PollMixin, testutil.StallMixin): def setconf(config, which, section, feature, value): if which in feature_matrix.get((section, feature), {which}): - if isinstance(value, unicode): - value = value.encode("utf-8") config.setdefault(section, {})[feature] = value setclient = partial(setconf, config, which, "client") @@ -1036,7 +1034,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): all_peerids = c.get_storage_broker().get_all_serverids() self.failUnlessEqual(len(all_peerids), self.numclients) sb = c.storage_broker - permuted_peers = sb.get_servers_for_psi("a") + permuted_peers = sb.get_servers_for_psi(b"a") self.failUnlessEqual(len(permuted_peers), self.numclients) d.addCallback(_check_connections) From 1ab1aaea47f342eedaeb36ff76abfa5106aedf53 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 12:59:37 -0500 Subject: [PATCH 024/186] Some more progress towards Python 3. 
--- src/allmydata/dirnode.py | 28 ++++++++++++++-------------- src/allmydata/test/test_system.py | 4 ++-- 2 files changed, 16 insertions(+), 16 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 7b6052919..1af655574 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -205,8 +205,8 @@ class Adder(object): return new_contents def _encrypt_rw_uri(writekey, rw_uri): - precondition(isinstance(rw_uri, str), rw_uri) - precondition(isinstance(writekey, str), writekey) + precondition(isinstance(rw_uri, bytes), rw_uri) + precondition(isinstance(writekey, bytes), writekey) salt = hashutil.mutable_rwcap_salt_hash(rw_uri) key = hashutil.mutable_rwcap_key_hash(salt, writekey) @@ -264,26 +264,26 @@ def _pack_normalized_children(children, writekey, deep_immutable=False): assert isinstance(metadata, dict) rw_uri = child.get_write_uri() if rw_uri is None: - rw_uri = "" - assert isinstance(rw_uri, str), rw_uri + rw_uri = b"" + assert isinstance(rw_uri, bytes), rw_uri # should be prevented by MustBeDeepImmutableError check above assert not (rw_uri and deep_immutable) ro_uri = child.get_readonly_uri() if ro_uri is None: - ro_uri = "" - assert isinstance(ro_uri, str), ro_uri + ro_uri = b"" + assert isinstance(ro_uri, bytes), ro_uri if writekey is not None: writecap = netstring(_encrypt_rw_uri(writekey, rw_uri)) else: writecap = ZERO_LEN_NETSTR - entry = "".join([netstring(name.encode("utf-8")), + entry = b"".join([netstring(name.encode("utf-8")), netstring(strip_prefix_for_ro(ro_uri, deep_immutable)), writecap, - netstring(json.dumps(metadata))]) + netstring(json.dumps(metadata).encode("utf-8"))]) entries.append(netstring(entry)) - return "".join(entries) + return b"".join(entries) @implementer(IDirectoryNode, ICheckable, IDeepCheckable) class DirectoryNode(object): @@ -352,7 +352,7 @@ class DirectoryNode(object): # cleartext. The 'name' is UTF-8 encoded, and should be normalized to NFC. 
# The rwcapdata is formatted as: # pack("16ss32s", iv, AES(H(writekey+iv), plaintext_rw_uri), mac) - assert isinstance(data, str), (repr(data), type(data)) + assert isinstance(data, bytes), (repr(data), type(data)) # an empty directory is serialized as an empty string if data == "": return AuxValueDict() @@ -555,8 +555,8 @@ class DirectoryNode(object): return d def set_uri(self, namex, writecap, readcap, metadata=None, overwrite=True): - precondition(isinstance(writecap, (str,type(None))), writecap) - precondition(isinstance(readcap, (str,type(None))), readcap) + precondition(isinstance(writecap, (bytes, type(None))), writecap) + precondition(isinstance(readcap, (bytes, type(None))), readcap) # We now allow packing unknown nodes, provided they are valid # for this type of directory. @@ -577,8 +577,8 @@ class DirectoryNode(object): else: assert len(e) == 3 writecap, readcap, metadata = e - precondition(isinstance(writecap, (str,type(None))), writecap) - precondition(isinstance(readcap, (str,type(None))), readcap) + precondition(isinstance(writecap, (bytes,type(None))), writecap) + precondition(isinstance(readcap, (bytes,type(None))), readcap) # We now allow packing unknown nodes, provided they are valid # for this type of directory. 
diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index c151bb09e..19adeff20 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1687,7 +1687,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): d1.addCallback(self.log, "publish finished") def _stash_uri(filenode): self.uri = filenode.get_uri() - assert isinstance(self.uri, str), (self.uri, filenode) + assert isinstance(self.uri, bytes), (self.uri, filenode) d1.addCallback(_stash_uri) return d1 d.addCallback(_made_subdir1) @@ -2174,7 +2174,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): filename = os.path.join(dirpath, filenames[0]) # peek at the magic to see if it is a chk share magic = open(filename, "rb").read(4) - if magic == '\x00\x00\x00\x01': + if magic == b'\x00\x00\x00\x01': break else: self.fail("unable to find any uri_extension files in %r" From b61b0a900151f8d18c2b12692671fd4f391b927b Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 13:02:29 -0500 Subject: [PATCH 025/186] Some more progress towards Python 3. 
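PATCH 025 pushes the str/bytes boundary out to the edges: CLI arguments arrive as text and are encoded once (`options.si_s.encode("ascii")`), while everything downstream stays `bytes`. A sketch of that boundary discipline (the function name here is hypothetical):

```python
def storage_index_dir(si_s):
    # CLI layer: accept text and encode exactly once at the boundary...
    si_bytes = si_s.encode("ascii")
    # ...internal layer: bytes only from here on.
    assert isinstance(si_bytes, bytes)
    return si_bytes.lower()

# On Python 3, mixing the two fails loudly instead of silently coercing:
mixed_fails = False
try:
    b"storage/shares/" + "abc123"
except TypeError:
    mixed_fails = True
```

That loud failure is why so many of these hunks are mechanical `b""` prefixes: Python 2 coerced silently, Python 3 raises `TypeError`.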
--- src/allmydata/scripts/debug.py | 2 +- src/allmydata/test/test_system.py | 10 +++++----- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/src/allmydata/scripts/debug.py b/src/allmydata/scripts/debug.py index fd3f2b87c..6afbce909 100644 --- a/src/allmydata/scripts/debug.py +++ b/src/allmydata/scripts/debug.py @@ -638,7 +638,7 @@ def find_shares(options): from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path out = options.stdout - sharedir = storage_index_to_dir(si_a2b(options.si_s)) + sharedir = storage_index_to_dir(si_a2b(options.si_s.encode("ascii"))) for d in options.nodedirs: d = os.path.join(d, "storage", "shares", sharedir) if os.path.exists(d): diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 19adeff20..f619077cd 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1021,7 +1021,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): def test_upload_and_download_convergent(self): self.basedir = "system/SystemTest/test_upload_and_download_convergent" - return self._test_upload_and_download(convergence="some convergence string") + return self._test_upload_and_download(convergence=b"some convergence string") def _test_upload_and_download(self, convergence): # we use 4000 bytes of data, which will result in about 400k written @@ -1057,7 +1057,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): theuri = results.get_uri() log.msg("upload finished: uri is %s" % (theuri,)) self.uri = theuri - assert isinstance(self.uri, str), self.uri + assert isinstance(self.uri, bytes), self.uri self.cap = uri.from_string(self.uri) self.n = self.clients[1].create_node_from_uri(self.uri) d.addCallback(_upload_done) @@ -1091,17 +1091,17 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): d.addCallback(lambda ign: n.read(MemoryConsumer(), offset=1, size=4)) def _read_portion_done(mc): - 
self.failUnlessEqual("".join(mc.chunks), DATA[1:1+4]) + self.failUnlessEqual(b"".join(mc.chunks), DATA[1:1+4]) d.addCallback(_read_portion_done) d.addCallback(lambda ign: n.read(MemoryConsumer(), offset=2, size=None)) def _read_tail_done(mc): - self.failUnlessEqual("".join(mc.chunks), DATA[2:]) + self.failUnlessEqual(b"".join(mc.chunks), DATA[2:]) d.addCallback(_read_tail_done) d.addCallback(lambda ign: n.read(MemoryConsumer(), size=len(DATA)+1000)) def _read_too_much(mc): - self.failUnlessEqual("".join(mc.chunks), DATA) + self.failUnlessEqual(b"".join(mc.chunks), DATA) d.addCallback(_read_too_much) return d From add26895cf10d35f64ddae265343801294fd465e Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 13:11:39 -0500 Subject: [PATCH 026/186] Another passing test on Python 3. --- src/allmydata/immutable/downloader/finder.py | 2 +- src/allmydata/test/test_system.py | 14 +++++++------- 2 files changed, 8 insertions(+), 8 deletions(-) diff --git a/src/allmydata/immutable/downloader/finder.py b/src/allmydata/immutable/downloader/finder.py index 6d222bc73..0636c3fe2 100644 --- a/src/allmydata/immutable/downloader/finder.py +++ b/src/allmydata/immutable/downloader/finder.py @@ -98,7 +98,7 @@ class ShareFinder(object): # internal methods def loop(self): - pending_s = ",".join([rt.server.get_name() + pending_s = ",".join([str(rt.server.get_name(), "utf-8") for rt in self.pending_requests]) # sort? 
self.log(format="ShareFinder loop: running=%(running)s" " hungry=%(hungry)s, pending=%(pending)s", diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index f619077cd..ed434bf57 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1,6 +1,6 @@ from __future__ import print_function -from past.builtins import unicode +from past.builtins import unicode, chr as byteschr, long import os, re, sys, time, json from functools import partial @@ -911,7 +911,7 @@ class SystemTestMixin(pollmixin.PollMixin, testutil.StallMixin): config = "[client]\n" if helper_furl: config += "helper.furl = %s\n" % helper_furl - basedir.child("tahoe.cfg").setContent(config) + basedir.child("tahoe.cfg").setContent(config.encode("utf-8")) private = basedir.child("private") private.makedirs() write_introducer( @@ -1316,10 +1316,10 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): #print("STATS") #from pprint import pprint #pprint(stats) - s = stats["stats"] - self.failUnlessEqual(s["storage_server.accepting_immutable_shares"], 1) - c = stats["counters"] - self.failUnless("storage_server.allocate" in c) + s = stats[b"stats"] + self.failUnlessEqual(s[b"storage_server.accepting_immutable_shares"], 1) + c = stats[b"counters"] + self.failUnless(b"storage_server.allocate" in c) d.addCallback(_got_stats) return d d.addCallback(_grab_stats) @@ -1605,7 +1605,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return d def flip_bit(self, good): - return good[:-1] + chr(ord(good[-1]) ^ 0x01) + return good[:-1] + byteschr(ord(good[-1:]) ^ 0x01) def mangle_uri(self, gooduri): # change the key, which changes the storage index, which means we'll From 5924da93d8f6f3f5e30a636c7f9dde7fb3e6adb7 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 13:18:45 -0500 Subject: [PATCH 027/186] More bytes. 
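PATCH 026 rewrites `flip_bit` because indexing `bytes` on Python 3 yields an `int`, not a length-1 string: the slice `good[-1:]` keeps bytes, and `past.builtins.chr` (`byteschr`) stands in for a bytes-producing `chr` in the 2/3-compatible code. A Python 3-only equivalent, for illustration:

```python
def flip_bit(good):
    # good[-1] is already an int on Python 3, so no ord() is needed;
    # bytes([...]) turns the flipped int back into a one-byte bytes.
    return good[:-1] + bytes([good[-1] ^ 0x01])
```

Applied to the CHK share magic checked in the test above, `flip_bit(b"\x00\x00\x00\x01")` corrupts only the final byte.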
--- src/allmydata/test/test_system.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index ed434bf57..c521e1e2f 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1705,7 +1705,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return res def _do_publish_private(self, res): - self.smalldata = "sssh, very secret stuff" + self.smalldata = b"sssh, very secret stuff" ut = upload.Data(self.smalldata, convergence=None) d = self.clients[0].create_dirnode() d.addCallback(self.log, "GOT private directory") @@ -1792,7 +1792,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): d1.addCallback(lambda res: self.shouldFail2(NotWriteableError, "mkdir(nope)", None, dirnode.create_subdirectory, u"nope")) d1.addCallback(self.log, "doing add_file(ro)") - ut = upload.Data("I will disappear, unrecorded and unobserved. The tragedy of my demise is made more poignant by its silence, but this beauty is not for you to ever know.", convergence="99i-p1x4-xd4-18yc-ywt-87uu-msu-zo -- completely and totally unguessable string (unless you read this)") + ut = upload.Data(b"I will disappear, unrecorded and unobserved. The tragedy of my demise is made more poignant by its silence, but this beauty is not for you to ever know.", convergence=b"99i-p1x4-xd4-18yc-ywt-87uu-msu-zo -- completely and totally unguessable string (unless you read this)") d1.addCallback(lambda res: self.shouldFail2(NotWriteableError, "add_file(nope)", None, dirnode.add_file, u"hope", ut)) d1.addCallback(self.log, "doing get(ro)") From 48bef7db9937ebbaf3440e7b3b6a2aa98e1871aa Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 13:42:35 -0500 Subject: [PATCH 028/186] Some straightforward changes to support Python 3. 
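PATCH 028's recurring change is `dict.iteritems()` → `list(d.items())`: `iteritems()` does not exist on Python 3, and wrapping the view in `list(...)` snapshots the entries so the loop body may mutate the dict. A small sketch:

```python
entries = {u"file1": ("node1", {}), u"file2": ("node2", {})}

# list(...) snapshots the items; iterating a bare .items() view would
# raise RuntimeError if the loop added or removed keys mid-iteration.
for (name, (node, metadata)) in list(entries.items()):
    if name == u"file2":
        del entries[name]
```

Where the loop does not mutate the dict, plain `.items()` is enough, which is why some hunks in this series use one form and some the other.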
--- src/allmydata/dirnode.py | 34 +++++++++++++++++----------------- 1 file changed, 17 insertions(+), 17 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 59ebd73ba..1af655574 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -179,7 +179,7 @@ class Adder(object): def modify(self, old_contents, servermap, first_time): children = self.node._unpack_contents(old_contents) now = time.time() - for (namex, (child, new_metadata)) in self.entries.iteritems(): + for (namex, (child, new_metadata)) in list(self.entries.items()): name = normalize(namex) precondition(IFilesystemNode.providedBy(child), child) @@ -205,8 +205,8 @@ class Adder(object): return new_contents def _encrypt_rw_uri(writekey, rw_uri): - precondition(isinstance(rw_uri, str), rw_uri) - precondition(isinstance(writekey, str), writekey) + precondition(isinstance(rw_uri, bytes), rw_uri) + precondition(isinstance(writekey, bytes), writekey) salt = hashutil.mutable_rwcap_salt_hash(rw_uri) key = hashutil.mutable_rwcap_key_hash(salt, writekey) @@ -221,7 +221,7 @@ def _encrypt_rw_uri(writekey, rw_uri): def pack_children(childrenx, writekey, deep_immutable=False): # initial_children must have metadata (i.e. {} instead of None) children = {} - for (namex, (node, metadata)) in childrenx.iteritems(): + for (namex, (node, metadata)) in list(childrenx.items()): precondition(isinstance(metadata, dict), "directory creation requires metadata to be a dict, not None", metadata) children[normalize(namex)] = (node, metadata) @@ -245,7 +245,7 @@ def _pack_normalized_children(children, writekey, deep_immutable=False): If deep_immutable is True, I will require that all my children are deeply immutable, and will raise a MustBeDeepImmutableError if not. 
""" - precondition((writekey is None) or isinstance(writekey, str), writekey) + precondition((writekey is None) or isinstance(writekey, bytes), writekey) has_aux = isinstance(children, AuxValueDict) entries = [] @@ -264,26 +264,26 @@ def _pack_normalized_children(children, writekey, deep_immutable=False): assert isinstance(metadata, dict) rw_uri = child.get_write_uri() if rw_uri is None: - rw_uri = "" - assert isinstance(rw_uri, str), rw_uri + rw_uri = b"" + assert isinstance(rw_uri, bytes), rw_uri # should be prevented by MustBeDeepImmutableError check above assert not (rw_uri and deep_immutable) ro_uri = child.get_readonly_uri() if ro_uri is None: - ro_uri = "" - assert isinstance(ro_uri, str), ro_uri + ro_uri = b"" + assert isinstance(ro_uri, bytes), ro_uri if writekey is not None: writecap = netstring(_encrypt_rw_uri(writekey, rw_uri)) else: writecap = ZERO_LEN_NETSTR - entry = "".join([netstring(name.encode("utf-8")), + entry = b"".join([netstring(name.encode("utf-8")), netstring(strip_prefix_for_ro(ro_uri, deep_immutable)), writecap, - netstring(json.dumps(metadata))]) + netstring(json.dumps(metadata).encode("utf-8"))]) entries.append(netstring(entry)) - return "".join(entries) + return b"".join(entries) @implementer(IDirectoryNode, ICheckable, IDeepCheckable) class DirectoryNode(object): @@ -352,7 +352,7 @@ class DirectoryNode(object): # cleartext. The 'name' is UTF-8 encoded, and should be normalized to NFC. 
# The rwcapdata is formatted as: # pack("16ss32s", iv, AES(H(writekey+iv), plaintext_rw_uri), mac) - assert isinstance(data, str), (repr(data), type(data)) + assert isinstance(data, bytes), (repr(data), type(data)) # an empty directory is serialized as an empty string if data == "": return AuxValueDict() @@ -555,8 +555,8 @@ class DirectoryNode(object): return d def set_uri(self, namex, writecap, readcap, metadata=None, overwrite=True): - precondition(isinstance(writecap, (str,type(None))), writecap) - precondition(isinstance(readcap, (str,type(None))), readcap) + precondition(isinstance(writecap, (bytes, type(None))), writecap) + precondition(isinstance(readcap, (bytes, type(None))), readcap) # We now allow packing unknown nodes, provided they are valid # for this type of directory. @@ -577,8 +577,8 @@ class DirectoryNode(object): else: assert len(e) == 3 writecap, readcap, metadata = e - precondition(isinstance(writecap, (str,type(None))), writecap) - precondition(isinstance(readcap, (str,type(None))), readcap) + precondition(isinstance(writecap, (bytes,type(None))), writecap) + precondition(isinstance(readcap, (bytes,type(None))), readcap) # We now allow packing unknown nodes, provided they are valid # for this type of directory. From 6b8fd2f29d321a16c552bd03d450eeaa4d977f31 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 13:45:31 -0500 Subject: [PATCH 029/186] Some progress towards passing tests on Python 3. --- src/allmydata/dirnode.py | 4 +-- src/allmydata/nodemaker.py | 2 +- src/allmydata/test/test_dirnode.py | 42 +++++++++++++++--------------- 3 files changed, 24 insertions(+), 24 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 1af655574..74d11a40f 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -384,8 +384,8 @@ class DirectoryNode(object): # ro_uri is treated in the same way for consistency. # rw_uri and ro_uri will be either None or a non-empty string. 
- rw_uri = rw_uri.rstrip(' ') or None - ro_uri = ro_uri.rstrip(' ') or None + rw_uri = rw_uri.rstrip(b' ') or None + ro_uri = ro_uri.rstrip(b' ') or None try: child = self._create_and_validate_node(rw_uri, ro_uri, name) diff --git a/src/allmydata/nodemaker.py b/src/allmydata/nodemaker.py index c3ba1ba7b..ab455ce48 100644 --- a/src/allmydata/nodemaker.py +++ b/src/allmydata/nodemaker.py @@ -126,7 +126,7 @@ class NodeMaker(object): def create_new_mutable_directory(self, initial_children={}, version=None): # initial_children must have metadata (i.e. {} instead of None) - for (name, (node, metadata)) in initial_children.iteritems(): + for (name, (node, metadata)) in initial_children.items(): precondition(isinstance(metadata, dict), "create_new_mutable_directory requires metadata to be a dict, not None", metadata) node.raise_error() diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 48ffff45a..1845c3ac3 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -48,16 +48,16 @@ class MemAccum(object): self.data = data self.producer.resumeProducing() -setup_py_uri = "URI:CHK:n7r3m6wmomelk4sep3kw5cvduq:os7ijw5c3maek7pg65e5254k2fzjflavtpejjyhshpsxuqzhcwwq:3:20:14861" -one_uri = "URI:LIT:n5xgk" # LIT for "one" -mut_write_uri = "URI:SSK:vfvcbdfbszyrsaxchgevhmmlii:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq" -mdmf_write_uri = "URI:MDMF:x533rhbm6kiehzl5kj3s44n5ie:4gif5rhneyd763ouo5qjrgnsoa3bg43xycy4robj2rf3tvmhdl3a" -empty_litdir_uri = "URI:DIR2-LIT:" -tiny_litdir_uri = "URI:DIR2-LIT:gqytunj2onug64tufqzdcosvkjetutcjkq5gw4tvm5vwszdgnz5hgyzufqydulbshj5x2lbm" # contains one child which is itself also LIT -mut_read_uri = "URI:SSK-RO:jf6wkflosyvntwxqcdo7a54jvm:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq" -mdmf_read_uri = "URI:MDMF-RO:d4cydxselputycfzkw6qgz4zv4:4gif5rhneyd763ouo5qjrgnsoa3bg43xycy4robj2rf3tvmhdl3a" -future_write_uri = "x-tahoe-crazy://I_am_from_the_future." 
-future_read_uri = "x-tahoe-crazy-readonly://I_am_from_the_future." +setup_py_uri = b"URI:CHK:n7r3m6wmomelk4sep3kw5cvduq:os7ijw5c3maek7pg65e5254k2fzjflavtpejjyhshpsxuqzhcwwq:3:20:14861" +one_uri = b"URI:LIT:n5xgk" # LIT for "one" +mut_write_uri = b"URI:SSK:vfvcbdfbszyrsaxchgevhmmlii:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq" +mdmf_write_uri = b"URI:MDMF:x533rhbm6kiehzl5kj3s44n5ie:4gif5rhneyd763ouo5qjrgnsoa3bg43xycy4robj2rf3tvmhdl3a" +empty_litdir_uri = b"URI:DIR2-LIT:" +tiny_litdir_uri = b"URI:DIR2-LIT:gqytunj2onug64tufqzdcosvkjetutcjkq5gw4tvm5vwszdgnz5hgyzufqydulbshj5x2lbm" # contains one child which is itself also LIT +mut_read_uri = b"URI:SSK-RO:jf6wkflosyvntwxqcdo7a54jvm:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq" +mdmf_read_uri = b"URI:MDMF-RO:d4cydxselputycfzkw6qgz4zv4:4gif5rhneyd763ouo5qjrgnsoa3bg43xycy4robj2rf3tvmhdl3a" +future_write_uri = b"x-tahoe-crazy://I_am_from_the_future." +future_read_uri = b"x-tahoe-crazy-readonly://I_am_from_the_future." future_nonascii_write_uri = u"x-tahoe-even-more-crazy://I_am_from_the_future_rw_\u263A".encode('utf-8') future_nonascii_read_uri = u"x-tahoe-even-more-crazy-readonly://I_am_from_the_future_ro_\u263A".encode('utf-8') @@ -423,7 +423,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, # moved on to stdlib "json" which doesn't have it either. 
d.addCallback(self.stall, 0.1) d.addCallback(lambda res: n.add_file(u"timestamps", - upload.Data("stamp me", convergence="some convergence string"))) + upload.Data(b"stamp me", convergence=b"some convergence string"))) d.addCallback(self.stall, 0.1) def _stop(res): self._stop_timestamp = time.time() @@ -472,11 +472,11 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.failUnlessReallyEqual(set(children.keys()), set([u"child"]))) - uploadable1 = upload.Data("some data", convergence="converge") + uploadable1 = upload.Data(b"some data", convergence=b"converge") d.addCallback(lambda res: n.add_file(u"newfile", uploadable1)) d.addCallback(lambda newnode: self.failUnless(IImmutableFileNode.providedBy(newnode))) - uploadable2 = upload.Data("some data", convergence="stuff") + uploadable2 = upload.Data(b"some data", convergence=b"stuff") d.addCallback(lambda res: self.shouldFail(ExistingChildError, "add_file-no", "child 'newfile' already exists", @@ -491,7 +491,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, d.addCallback(lambda metadata: self.failUnlessEqual(set(metadata.keys()), set(["tahoe"]))) - uploadable3 = upload.Data("some data", convergence="converge") + uploadable3 = upload.Data(b"some data", convergence=b"converge") d.addCallback(lambda res: n.add_file(u"newfile-metadata", uploadable3, {"key": "value"})) @@ -507,8 +507,8 @@ class Dirnode(GridTestMixin, unittest.TestCase, def _created2(subdir2): self.subdir2 = subdir2 # put something in the way, to make sure it gets overwritten - return subdir2.add_file(u"child", upload.Data("overwrite me", - "converge")) + return subdir2.add_file(u"child", upload.Data(b"overwrite me", + b"converge")) d.addCallback(_created2) d.addCallback(lambda res: @@ -1074,7 +1074,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, d.addCallback(_created_root) def _created_subdir(subdir): self._subdir = subdir - d = subdir.add_file(u"file1", upload.Data("data"*100, None)) + d = subdir.add_file(u"file1", upload.Data(b"data"*100, 
None)) d.addCallback(lambda res: subdir.set_node(u"link", self._rootnode)) d.addCallback(lambda res: c.create_dirnode()) d.addCallback(lambda dn: @@ -1250,7 +1250,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, nm = c.nodemaker filecap = make_chk_file_uri(1234) filenode = nm.create_from_cap(filecap) - uploadable = upload.Data("some data", convergence="some convergence string") + uploadable = upload.Data(b"some data", convergence=b"some convergence string") d = c.create_dirnode(version=version) def _created(rw_dn): @@ -1867,7 +1867,7 @@ class Deleter(GridTestMixin, testutil.ReallyEqualMixin, unittest.TestCase): self.set_up_grid(oneshare=True) c0 = self.g.clients[0] d = c0.create_dirnode() - small = upload.Data("Small enough for a LIT", None) + small = upload.Data(b"Small enough for a LIT", None) def _created_dir(dn): self.root = dn self.root_uri = dn.get_uri() @@ -1909,10 +1909,10 @@ class Adder(GridTestMixin, unittest.TestCase, testutil.ShouldFailMixin): # root/file1 # root/file2 # root/dir1 - d = root_node.add_file(u'file1', upload.Data("Important Things", + d = root_node.add_file(u'file1', upload.Data(b"Important Things", None)) d.addCallback(lambda res: - root_node.add_file(u'file2', upload.Data("Sekrit Codes", None))) + root_node.add_file(u'file2', upload.Data(b"Sekrit Codes", None))) d.addCallback(lambda res: root_node.create_subdirectory(u"dir1")) d.addCallback(lambda res: root_node) From 016240d6e6056601079cb80216e445b4a8f25c78 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 13:50:15 -0500 Subject: [PATCH 030/186] More passing tests on Python 3. 
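PATCH 030 replaces the Python 2-only `dict.has_key(k)` with `k in d`, and wraps `keys()` in `list(...)` before comparing, because Python 3's `keys()` returns a view rather than a list. For example:

```python
children = {u"subdir": object()}

in_dict = u"subdir" in children               # replaces children.has_key(u"subdir")
keys_as_list = list(children.keys())          # keys() is a view on Python 3
view_equals_list = children.keys() == [u"subdir"]  # a view never equals a list
```

The view/list distinction is why assertions like `failUnlessReallyEqual(kids.keys(), [u"o"])` had to gain a `list(...)` wrapper to keep passing.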
--- src/allmydata/dirnode.py | 6 ++--- src/allmydata/test/test_dirnode.py | 37 ++++++++++++++++++------------ 2 files changed, 25 insertions(+), 18 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 74d11a40f..137ff9699 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -468,7 +468,7 @@ class DirectoryNode(object): exists a child of the given name, False if not.""" name = normalize(namex) d = self._read() - d.addCallback(lambda children: children.has_key(name)) + d.addCallback(lambda children: name in children) return d def _get(self, children, name): @@ -569,7 +569,7 @@ class DirectoryNode(object): # this takes URIs a = Adder(self, overwrite=overwrite, create_readonly_node=self._create_readonly_node) - for (namex, e) in entries.iteritems(): + for (namex, e) in entries.items(): assert isinstance(namex, unicode), namex if len(e) == 2: writecap, readcap = e @@ -779,7 +779,7 @@ class DirectoryNode(object): # in the nodecache) seem to consume about 2000 bytes. 
dirkids = [] filekids = [] - for name, (child, metadata) in sorted(children.iteritems()): + for name, (child, metadata) in sorted(children.items()): childpath = path + [name] if isinstance(child, UnknownNode): walker.add_node(child, childpath) diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 1845c3ac3..68e774501 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -1,4 +1,11 @@ """Tests for the dirnode module.""" +# from __future__ import absolute_import +# from __future__ import division +# from __future__ import print_function +# from __future__ import unicode_literals + +from past.builtins import unicode, long + import six import time import unicodedata @@ -95,13 +102,13 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.failUnless(u) cap_formats = [] if mdmf: - cap_formats = ["URI:DIR2-MDMF:", - "URI:DIR2-MDMF-RO:", - "URI:DIR2-MDMF-Verifier:"] + cap_formats = [b"URI:DIR2-MDMF:", + b"URI:DIR2-MDMF-RO:", + b"URI:DIR2-MDMF-Verifier:"] else: - cap_formats = ["URI:DIR2:", - "URI:DIR2-RO", - "URI:DIR2-Verifier:"] + cap_formats = [b"URI:DIR2:", + b"URI:DIR2-RO", + b"URI:DIR2-Verifier:"] rw, ro, v = cap_formats self.failUnless(u.startswith(rw), u) u_ro = n.get_readonly_uri() @@ -149,7 +156,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.failUnless(isinstance(subdir, dirnode.DirectoryNode)) self.subdir = subdir new_v = subdir.get_verify_cap().to_string() - assert isinstance(new_v, str) + assert isinstance(new_v, bytes) self.expected_manifest.append( ((u"subdir",), subdir.get_uri()) ) self.expected_verifycaps.add(new_v) si = subdir.get_storage_index() @@ -182,7 +189,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, "largest-directory-children": 2, "largest-immutable-file": 0, } - for k,v in expected.iteritems(): + for k,v in expected.items(): self.failUnlessReallyEqual(stats[k], v, "stats[%s] was %s, not %s" % (k, stats[k], v)) @@ -272,8 +279,8 @@ class 
Dirnode(GridTestMixin, unittest.TestCase, { 'tahoe': {'linkcrtime': "bogus"}})) d.addCallback(lambda res: n.get_metadata_for(u"c2")) def _has_good_linkcrtime(metadata): - self.failUnless(metadata.has_key('tahoe')) - self.failUnless(metadata['tahoe'].has_key('linkcrtime')) + self.failUnless('tahoe' in metadata) + self.failUnless('linkcrtime' in metadata['tahoe']) self.failIfEqual(metadata['tahoe']['linkcrtime'], 'bogus') d.addCallback(_has_good_linkcrtime) @@ -918,7 +925,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.cap = cap return dn.list() d.addCallback(_created_small) - d.addCallback(lambda kids: self.failUnlessReallyEqual(kids.keys(), [u"o"])) + d.addCallback(lambda kids: self.failUnlessReallyEqual(list(kids.keys()), [u"o"])) # now test n.create_subdirectory(mutable=False) d.addCallback(lambda ign: c.create_dirnode()) @@ -928,7 +935,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, d.addCallback(_check_kids) d.addCallback(lambda ign: n.list()) d.addCallback(lambda children: - self.failUnlessReallyEqual(children.keys(), [u"subdir"])) + self.failUnlessReallyEqual(list(children.keys()), [u"subdir"])) d.addCallback(lambda ign: n.get(u"subdir")) d.addCallback(lambda sd: sd.list()) d.addCallback(_check_kids) @@ -1417,9 +1424,9 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): def _check_children(self, children): # Are all the expected child nodes there? - self.failUnless(children.has_key(u'file1')) - self.failUnless(children.has_key(u'file2')) - self.failUnless(children.has_key(u'file3')) + self.failUnless(u'file1' in children) + self.failUnless(u'file2' in children) + self.failUnless(u'file3' in children) # Are the metadata for child 3 right? file3_rocap = "URI:CHK:cmtcxq7hwxvfxan34yiev6ivhy:qvcekmjtoetdcw4kmi7b3rtblvgx7544crnwaqtiewemdliqsokq:3:10:5" From 59968d099cfa139cf10f4242af7818ecc5ad391c Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:05:03 -0500 Subject: [PATCH 031/186] More passing tests. 
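A recurring change in this series is `rstrip(' ')` → `rstrip(b' ')` on byte-string caps: on Python 3, passing a `str` strip-set to a bytes method raises `TypeError` rather than stripping. Using one of the test URIs from these hunks:

```python
spacedout_write_uri = b"lafs://from_the_future\t "

stripped = spacedout_write_uri.rstrip(b' ')   # trailing spaces, tab preserved

# A str strip-set on a bytes object is rejected on Python 3:
str_set_rejected = False
try:
    spacedout_write_uri.rstrip(' ')
except TypeError:
    str_set_rejected = True
```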
--- src/allmydata/dirnode.py | 4 ++-- src/allmydata/test/test_dirnode.py | 36 +++++++++++++++--------------- src/allmydata/unknown.py | 4 ++-- 3 files changed, 22 insertions(+), 22 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 137ff9699..7b0c97540 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -354,7 +354,7 @@ class DirectoryNode(object): # pack("16ss32s", iv, AES(H(writekey+iv), plaintext_rw_uri), mac) assert isinstance(data, bytes), (repr(data), type(data)) # an empty directory is serialized as an empty string - if data == "": + if data == b"": return AuxValueDict() writeable = not self.is_readonly() mutable = self.is_mutable() @@ -373,7 +373,7 @@ class DirectoryNode(object): # Therefore we normalize names going both in and out of directories. name = normalize(namex_utf8.decode("utf-8")) - rw_uri = "" + rw_uri = b"" if writeable: rw_uri = self._decrypt_rwcapdata(rwcapdata) diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 68e774501..23dcb3bbd 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -704,7 +704,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, set([u"short"]))) d2.addCallback(lambda ignored: tinylit_node.list()) d2.addCallback(lambda children: children[u"short"][0].read(MemAccum())) - d2.addCallback(lambda accum: self.failUnlessReallyEqual(accum.data, "The end.")) + d2.addCallback(lambda accum: self.failUnlessReallyEqual(accum.data, b"The end.")) return d2 d.addCallback(_check_kids) @@ -789,7 +789,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, rep = str(dn) self.failUnless("RO-IMM" in rep) cap = dn.get_cap() - self.failUnlessIn("CHK", cap.to_string()) + self.failUnlessIn(b"CHK", cap.to_string()) self.cap = cap return dn.list() d.addCallback(_created) @@ -901,8 +901,8 @@ class Dirnode(GridTestMixin, unittest.TestCase, rep = str(dn) self.failUnless("RO-IMM" in rep) cap = dn.get_cap() - 
self.failUnlessIn("LIT", cap.to_string()) - self.failUnlessReallyEqual(cap.to_string(), "URI:DIR2-LIT:") + self.failUnlessIn(b"LIT", cap.to_string()) + self.failUnlessReallyEqual(cap.to_string(), b"URI:DIR2-LIT:") self.cap = cap return dn.list() d.addCallback(_created_empty) @@ -919,9 +919,9 @@ class Dirnode(GridTestMixin, unittest.TestCase, rep = str(dn) self.failUnless("RO-IMM" in rep) cap = dn.get_cap() - self.failUnlessIn("LIT", cap.to_string()) + self.failUnlessIn(b"LIT", cap.to_string()) self.failUnlessReallyEqual(cap.to_string(), - "URI:DIR2-LIT:gi4tumj2n4wdcmz2kvjesosmjfkdu3rvpbtwwlbqhiwdeot3puwcy") + b"URI:DIR2-LIT:gi4tumj2n4wdcmz2kvjesosmjfkdu3rvpbtwwlbqhiwdeot3puwcy") self.cap = cap return dn.list() d.addCallback(_created_small) @@ -969,14 +969,14 @@ class Dirnode(GridTestMixin, unittest.TestCase, # It also tests that we store child names as UTF-8 NFC, and normalize # them again when retrieving them. - stripped_write_uri = "lafs://from_the_future\t" - stripped_read_uri = "lafs://readonly_from_the_future\t" - spacedout_write_uri = stripped_write_uri + " " - spacedout_read_uri = stripped_read_uri + " " + stripped_write_uri = b"lafs://from_the_future\t" + stripped_read_uri = b"lafs://readonly_from_the_future\t" + spacedout_write_uri = stripped_write_uri + b" " + spacedout_read_uri = stripped_read_uri + b" " child = nm.create_from_cap(spacedout_write_uri, spacedout_read_uri) self.failUnlessReallyEqual(child.get_write_uri(), spacedout_write_uri) - self.failUnlessReallyEqual(child.get_readonly_uri(), "ro." + spacedout_read_uri) + self.failUnlessReallyEqual(child.get_readonly_uri(), b"ro." 
+ spacedout_read_uri) child_dottedi = u"ch\u0131\u0307ld" @@ -1393,7 +1393,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, class MinimalFakeMutableFile(object): def get_writekey(self): - return "writekey" + return b"writekey" class Packing(testutil.ReallyEqualMixin, unittest.TestCase): # This is a base32-encoded representation of the directory tree @@ -1535,10 +1535,10 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): @implementer(IMutableFileNode) class FakeMutableFile(object): counter = 0 - def __init__(self, initial_contents=""): + def __init__(self, initial_contents=b""): data = self._get_initial_contents(initial_contents) self.data = data.read(data.get_size()) - self.data = "".join(self.data) + self.data = b"".join(self.data) counter = FakeMutableFile.counter FakeMutableFile.counter += 1 @@ -1547,10 +1547,10 @@ class FakeMutableFile(object): self.uri = uri.WriteableSSKFileURI(writekey, fingerprint) def _get_initial_contents(self, contents): - if isinstance(contents, str): + if isinstance(contents, bytes): return contents if contents is None: - return "" + return b"" assert callable(contents), "%s should be callable, not %s" % \ (contents, type(contents)) return contents(self) @@ -1568,7 +1568,7 @@ class FakeMutableFile(object): return defer.succeed(self.data) def get_writekey(self): - return "writekey" + return b"writekey" def is_readonly(self): return False @@ -1591,7 +1591,7 @@ class FakeMutableFile(object): return defer.succeed(None) class FakeNodeMaker(NodeMaker): - def create_mutable_file(self, contents="", keysize=None, version=None): + def create_mutable_file(self, contents=b"", keysize=None, version=None): return defer.succeed(FakeMutableFile(contents)) class FakeClient2(_Client): diff --git a/src/allmydata/unknown.py b/src/allmydata/unknown.py index 6c970e484..34d084136 100644 --- a/src/allmydata/unknown.py +++ b/src/allmydata/unknown.py @@ -31,8 +31,8 @@ class UnknownNode(object): def __init__(self, given_rw_uri, given_ro_uri, 
deep_immutable=False, name=u""): - assert given_rw_uri is None or isinstance(given_rw_uri, str) - assert given_ro_uri is None or isinstance(given_ro_uri, str) + assert given_rw_uri is None or isinstance(given_rw_uri, bytes) + assert given_ro_uri is None or isinstance(given_ro_uri, bytes) given_rw_uri = given_rw_uri or None given_ro_uri = given_ro_uri or None From ff6443228246276f2da454fb9299fe783e0ac243 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:14:07 -0500 Subject: [PATCH 032/186] More passing on Python 3.. --- src/allmydata/test/test_dirnode.py | 160 ++++++++++++++--------------- 1 file changed, 80 insertions(+), 80 deletions(-) diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 23dcb3bbd..b3deb9df8 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -673,22 +673,22 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.failUnless(fut_node.is_unknown()) self.failUnlessReallyEqual(fut_node.get_uri(), future_write_uri) - self.failUnlessReallyEqual(fut_node.get_readonly_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fut_node.get_readonly_uri(), b"ro." + future_read_uri) self.failUnless(isinstance(fut_metadata, dict), fut_metadata) self.failUnless(futna_node.is_unknown()) self.failUnlessReallyEqual(futna_node.get_uri(), future_nonascii_write_uri) - self.failUnlessReallyEqual(futna_node.get_readonly_uri(), "ro." + future_nonascii_read_uri) + self.failUnlessReallyEqual(futna_node.get_readonly_uri(), b"ro." + future_nonascii_read_uri) self.failUnless(isinstance(futna_metadata, dict), futna_metadata) self.failUnless(fro_node.is_unknown()) - self.failUnlessReallyEqual(fro_node.get_uri(), "ro." + future_read_uri) - self.failUnlessReallyEqual(fut_node.get_readonly_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fro_node.get_uri(), b"ro." + future_read_uri) + self.failUnlessReallyEqual(fut_node.get_readonly_uri(), b"ro." 
+ future_read_uri) self.failUnless(isinstance(fro_metadata, dict), fro_metadata) self.failUnless(frona_node.is_unknown()) - self.failUnlessReallyEqual(frona_node.get_uri(), "ro." + future_nonascii_read_uri) - self.failUnlessReallyEqual(futna_node.get_readonly_uri(), "ro." + future_nonascii_read_uri) + self.failUnlessReallyEqual(frona_node.get_uri(), b"ro." + future_nonascii_read_uri) + self.failUnlessReallyEqual(futna_node.get_readonly_uri(), b"ro." + future_nonascii_read_uri) self.failUnless(isinstance(frona_metadata, dict), frona_metadata) self.failIf(emptylit_node.is_unknown()) @@ -815,13 +815,13 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.failUnlessEqual(two_metadata["metakey"], "metavalue") self.failUnless(fut_node.is_unknown()) - self.failUnlessReallyEqual(fut_node.get_uri(), "imm." + future_read_uri) - self.failUnlessReallyEqual(fut_node.get_readonly_uri(), "imm." + future_read_uri) + self.failUnlessReallyEqual(fut_node.get_uri(), b"imm." + future_read_uri) + self.failUnlessReallyEqual(fut_node.get_readonly_uri(), b"imm." + future_read_uri) self.failUnless(isinstance(fut_metadata, dict), fut_metadata) self.failUnless(futna_node.is_unknown()) - self.failUnlessReallyEqual(futna_node.get_uri(), "imm." + future_nonascii_read_uri) - self.failUnlessReallyEqual(futna_node.get_readonly_uri(), "imm." + future_nonascii_read_uri) + self.failUnlessReallyEqual(futna_node.get_uri(), b"imm." + future_nonascii_read_uri) + self.failUnlessReallyEqual(futna_node.get_readonly_uri(), b"imm." 
+ future_nonascii_read_uri) self.failUnless(isinstance(futna_metadata, dict), futna_metadata) self.failIf(emptylit_node.is_unknown()) @@ -837,7 +837,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, set([u"short"]))) d2.addCallback(lambda ignored: tinylit_node.list()) d2.addCallback(lambda children: children[u"short"][0].read(MemAccum())) - d2.addCallback(lambda accum: self.failUnlessReallyEqual(accum.data, "The end.")) + d2.addCallback(lambda accum: self.failUnlessReallyEqual(accum.data, b"The end.")) return d2 d.addCallback(_check_kids) @@ -1010,7 +1010,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, self.failUnlessIn(name, kids_out) (expected_child, ign) = kids_out[name] self.failUnlessReallyEqual(rw_uri, expected_child.get_write_uri()) - self.failUnlessReallyEqual("ro." + ro_uri, expected_child.get_readonly_uri()) + self.failUnlessReallyEqual(b"ro." + ro_uri, expected_child.get_readonly_uri()) numkids += 1 self.failUnlessReallyEqual(numkids, len(kids_out)) @@ -1046,7 +1046,7 @@ class Dirnode(GridTestMixin, unittest.TestCase, child_node, child_metadata = children[u"child"] self.failUnlessReallyEqual(child_node.get_write_uri(), stripped_write_uri) - self.failUnlessReallyEqual(child_node.get_readonly_uri(), "ro." + stripped_read_uri) + self.failUnlessReallyEqual(child_node.get_readonly_uri(), b"ro." 
+ stripped_read_uri) d.addCallback(_check_kids) d.addCallback(lambda ign: nm.create_from_cap(self.cap.to_string())) @@ -1459,12 +1459,12 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): children[u'file1'][0].get_uri()) def _make_kids(self, nm, which): - caps = {"imm": "URI:CHK:n7r3m6wmomelk4sep3kw5cvduq:os7ijw5c3maek7pg65e5254k2fzjflavtpejjyhshpsxuqzhcwwq:3:20:14861", - "lit": "URI:LIT:n5xgk", # LIT for "one" - "write": "URI:SSK:vfvcbdfbszyrsaxchgevhmmlii:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq", - "read": "URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q", - "dirwrite": "URI:DIR2:n6x24zd3seu725yluj75q5boaa:mm6yoqjhl6ueh7iereldqxue4nene4wl7rqfjfybqrehdqmqskvq", - "dirread": "URI:DIR2-RO:b7sr5qsifnicca7cbk3rhrhbvq:mm6yoqjhl6ueh7iereldqxue4nene4wl7rqfjfybqrehdqmqskvq", + caps = {"imm": b"URI:CHK:n7r3m6wmomelk4sep3kw5cvduq:os7ijw5c3maek7pg65e5254k2fzjflavtpejjyhshpsxuqzhcwwq:3:20:14861", + "lit": b"URI:LIT:n5xgk", # LIT for "one" + "write": b"URI:SSK:vfvcbdfbszyrsaxchgevhmmlii:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq", + "read": b"URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q", + "dirwrite": b"URI:DIR2:n6x24zd3seu725yluj75q5boaa:mm6yoqjhl6ueh7iereldqxue4nene4wl7rqfjfybqrehdqmqskvq", + "dirread": b"URI:DIR2-RO:b7sr5qsifnicca7cbk3rhrhbvq:mm6yoqjhl6ueh7iereldqxue4nene4wl7rqfjfybqrehdqmqskvq", } kids = {} for name in which: @@ -1542,8 +1542,8 @@ class FakeMutableFile(object): counter = FakeMutableFile.counter FakeMutableFile.counter += 1 - writekey = hashutil.ssk_writekey_hash(str(counter)) - fingerprint = hashutil.ssk_pubkey_fingerprint_hash(str(counter)) + writekey = hashutil.ssk_writekey_hash(b"%d" % counter) + fingerprint = hashutil.ssk_pubkey_fingerprint_hash(b"%d" % counter) self.uri = uri.WriteableSSKFileURI(writekey, fingerprint) def _get_initial_contents(self, contents): @@ -1638,9 +1638,9 @@ class Dirnode2(testutil.ReallyEqualMixin, 
testutil.ShouldFailMixin, unittest.Tes # and to add an URI prefixed with "ro." or "imm." when it is given in a # write slot (or URL parameter). d.addCallback(lambda ign: self._node.set_uri(u"add-ro", - "ro." + future_read_uri, None)) + b"ro." + future_read_uri, None)) d.addCallback(lambda ign: self._node.set_uri(u"add-imm", - "imm." + future_imm_uri, None)) + b"imm." + future_imm_uri, None)) d.addCallback(lambda ign: self._node.list()) def _check(children): @@ -1649,25 +1649,25 @@ class Dirnode2(testutil.ReallyEqualMixin, testutil.ShouldFailMixin, unittest.Tes self.failUnless(isinstance(fn, UnknownNode), fn) self.failUnlessReallyEqual(fn.get_uri(), future_write_uri) self.failUnlessReallyEqual(fn.get_write_uri(), future_write_uri) - self.failUnlessReallyEqual(fn.get_readonly_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fn.get_readonly_uri(), b"ro." + future_read_uri) (fn2, metadata2) = children[u"add-pair"] self.failUnless(isinstance(fn2, UnknownNode), fn2) self.failUnlessReallyEqual(fn2.get_uri(), future_write_uri) self.failUnlessReallyEqual(fn2.get_write_uri(), future_write_uri) - self.failUnlessReallyEqual(fn2.get_readonly_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fn2.get_readonly_uri(), b"ro." + future_read_uri) (fn3, metadata3) = children[u"add-ro"] self.failUnless(isinstance(fn3, UnknownNode), fn3) - self.failUnlessReallyEqual(fn3.get_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fn3.get_uri(), b"ro." + future_read_uri) self.failUnlessReallyEqual(fn3.get_write_uri(), None) - self.failUnlessReallyEqual(fn3.get_readonly_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fn3.get_readonly_uri(), b"ro." + future_read_uri) (fn4, metadata4) = children[u"add-imm"] self.failUnless(isinstance(fn4, UnknownNode), fn4) - self.failUnlessReallyEqual(fn4.get_uri(), "imm." + future_imm_uri) + self.failUnlessReallyEqual(fn4.get_uri(), b"imm." 
+ future_imm_uri) self.failUnlessReallyEqual(fn4.get_write_uri(), None) - self.failUnlessReallyEqual(fn4.get_readonly_uri(), "imm." + future_imm_uri) + self.failUnlessReallyEqual(fn4.get_readonly_uri(), b"imm." + future_imm_uri) # We should also be allowed to copy the "future" UnknownNode, because # it contains all the information that was in the original directory @@ -1682,17 +1682,17 @@ class Dirnode2(testutil.ReallyEqualMixin, testutil.ShouldFailMixin, unittest.Tes self.failUnless(isinstance(fn, UnknownNode), fn) self.failUnlessReallyEqual(fn.get_uri(), future_write_uri) self.failUnlessReallyEqual(fn.get_write_uri(), future_write_uri) - self.failUnlessReallyEqual(fn.get_readonly_uri(), "ro." + future_read_uri) + self.failUnlessReallyEqual(fn.get_readonly_uri(), b"ro." + future_read_uri) d.addCallback(_check2) return d def test_unknown_strip_prefix_for_ro(self): - self.failUnlessReallyEqual(strip_prefix_for_ro("foo", False), "foo") - self.failUnlessReallyEqual(strip_prefix_for_ro("ro.foo", False), "foo") - self.failUnlessReallyEqual(strip_prefix_for_ro("imm.foo", False), "imm.foo") - self.failUnlessReallyEqual(strip_prefix_for_ro("foo", True), "foo") - self.failUnlessReallyEqual(strip_prefix_for_ro("ro.foo", True), "foo") - self.failUnlessReallyEqual(strip_prefix_for_ro("imm.foo", True), "foo") + self.failUnlessReallyEqual(strip_prefix_for_ro(b"foo", False), b"foo") + self.failUnlessReallyEqual(strip_prefix_for_ro(b"ro.foo", False), b"foo") + self.failUnlessReallyEqual(strip_prefix_for_ro(b"imm.foo", False), b"imm.foo") + self.failUnlessReallyEqual(strip_prefix_for_ro(b"foo", True), b"foo") + self.failUnlessReallyEqual(strip_prefix_for_ro(b"ro.foo", True), b"foo") + self.failUnlessReallyEqual(strip_prefix_for_ro(b"imm.foo", True), b"foo") def test_unknownnode(self): lit_uri = one_uri @@ -1704,58 +1704,58 @@ class Dirnode2(testutil.ReallyEqualMixin, testutil.ShouldFailMixin, unittest.Tes ] unknown_rw = [# These are errors because we're only given a rw_uri, and we 
can't # diminish it. - ( 2, UnknownNode("foo", None)), - ( 3, UnknownNode("foo", None, deep_immutable=True)), - ( 4, UnknownNode("ro.foo", None, deep_immutable=True)), - ( 5, UnknownNode("ro." + mut_read_uri, None, deep_immutable=True)), - ( 5.1, UnknownNode("ro." + mdmf_read_uri, None, deep_immutable=True)), - ( 6, UnknownNode("URI:SSK-RO:foo", None, deep_immutable=True)), - ( 7, UnknownNode("URI:SSK:foo", None)), + ( 2, UnknownNode(b"foo", None)), + ( 3, UnknownNode(b"foo", None, deep_immutable=True)), + ( 4, UnknownNode(b"ro.foo", None, deep_immutable=True)), + ( 5, UnknownNode(b"ro." + mut_read_uri, None, deep_immutable=True)), + ( 5.1, UnknownNode(b"ro." + mdmf_read_uri, None, deep_immutable=True)), + ( 6, UnknownNode(b"URI:SSK-RO:foo", None, deep_immutable=True)), + ( 7, UnknownNode(b"URI:SSK:foo", None)), ] must_be_ro = [# These are errors because a readonly constraint is not met. - ( 8, UnknownNode("ro." + mut_write_uri, None)), - ( 8.1, UnknownNode("ro." + mdmf_write_uri, None)), - ( 9, UnknownNode(None, "ro." + mut_write_uri)), - ( 9.1, UnknownNode(None, "ro." + mdmf_write_uri)), + ( 8, UnknownNode(b"ro." + mut_write_uri, None)), + ( 8.1, UnknownNode(b"ro." + mdmf_write_uri, None)), + ( 9, UnknownNode(None, b"ro." + mut_write_uri)), + ( 9.1, UnknownNode(None, b"ro." + mdmf_write_uri)), ] must_be_imm = [# These are errors because an immutable constraint is not met. - (10, UnknownNode(None, "ro.URI:SSK-RO:foo", deep_immutable=True)), - (11, UnknownNode(None, "imm.URI:SSK:foo")), - (12, UnknownNode(None, "imm.URI:SSK-RO:foo")), - (13, UnknownNode("bar", "ro.foo", deep_immutable=True)), - (14, UnknownNode("bar", "imm.foo", deep_immutable=True)), - (15, UnknownNode("bar", "imm." + lit_uri, deep_immutable=True)), - (16, UnknownNode("imm." + mut_write_uri, None)), - (16.1, UnknownNode("imm." + mdmf_write_uri, None)), - (17, UnknownNode("imm." + mut_read_uri, None)), - (17.1, UnknownNode("imm." 
+ mdmf_read_uri, None)), - (18, UnknownNode("bar", "imm.foo")), + (10, UnknownNode(None, b"ro.URI:SSK-RO:foo", deep_immutable=True)), + (11, UnknownNode(None, b"imm.URI:SSK:foo")), + (12, UnknownNode(None, b"imm.URI:SSK-RO:foo")), + (13, UnknownNode(b"bar", b"ro.foo", deep_immutable=True)), + (14, UnknownNode(b"bar", b"imm.foo", deep_immutable=True)), + (15, UnknownNode(b"bar", b"imm." + lit_uri, deep_immutable=True)), + (16, UnknownNode(b"imm." + mut_write_uri, None)), + (16.1, UnknownNode(b"imm." + mdmf_write_uri, None)), + (17, UnknownNode(b"imm." + mut_read_uri, None)), + (17.1, UnknownNode(b"imm." + mdmf_read_uri, None)), + (18, UnknownNode(b"bar", b"imm.foo")), ] bad_uri = [# These are errors because the URI is bad once we've stripped the prefix. - (19, UnknownNode("ro.URI:SSK-RO:foo", None)), - (20, UnknownNode("imm.URI:CHK:foo", None, deep_immutable=True)), - (21, UnknownNode(None, "URI:CHK:foo")), - (22, UnknownNode(None, "URI:CHK:foo", deep_immutable=True)), + (19, UnknownNode(b"ro.URI:SSK-RO:foo", None)), + (20, UnknownNode(b"imm.URI:CHK:foo", None, deep_immutable=True)), + (21, UnknownNode(None, b"URI:CHK:foo")), + (22, UnknownNode(None, b"URI:CHK:foo", deep_immutable=True)), ] ro_prefixed = [# These are valid, and the readcap should end up with a ro. prefix. - (23, UnknownNode(None, "foo")), - (24, UnknownNode(None, "ro.foo")), - (25, UnknownNode(None, "ro." + lit_uri)), - (26, UnknownNode("bar", "foo")), - (27, UnknownNode("bar", "ro.foo")), - (28, UnknownNode("bar", "ro." + lit_uri)), - (29, UnknownNode("ro.foo", None)), - (30, UnknownNode("ro." + lit_uri, None)), + (23, UnknownNode(None, b"foo")), + (24, UnknownNode(None, b"ro.foo")), + (25, UnknownNode(None, b"ro." + lit_uri)), + (26, UnknownNode(b"bar", b"foo")), + (27, UnknownNode(b"bar", b"ro.foo")), + (28, UnknownNode(b"bar", b"ro." + lit_uri)), + (29, UnknownNode(b"ro.foo", None)), + (30, UnknownNode(b"ro." 
+ lit_uri, None)), ] imm_prefixed = [# These are valid, and the readcap should end up with an imm. prefix. - (31, UnknownNode(None, "foo", deep_immutable=True)), - (32, UnknownNode(None, "ro.foo", deep_immutable=True)), - (33, UnknownNode(None, "imm.foo")), - (34, UnknownNode(None, "imm.foo", deep_immutable=True)), - (35, UnknownNode("imm." + lit_uri, None)), - (36, UnknownNode("imm." + lit_uri, None, deep_immutable=True)), - (37, UnknownNode(None, "imm." + lit_uri)), - (38, UnknownNode(None, "imm." + lit_uri, deep_immutable=True)), + (31, UnknownNode(None, b"foo", deep_immutable=True)), + (32, UnknownNode(None, b"ro.foo", deep_immutable=True)), + (33, UnknownNode(None, b"imm.foo")), + (34, UnknownNode(None, b"imm.foo", deep_immutable=True)), + (35, UnknownNode(b"imm." + lit_uri, None)), + (36, UnknownNode(b"imm." + lit_uri, None, deep_immutable=True)), + (37, UnknownNode(None, b"imm." + lit_uri)), + (38, UnknownNode(None, b"imm." + lit_uri, deep_immutable=True)), ] error = unknown_rw + must_be_ro + must_be_imm + bad_uri ok = ro_prefixed + imm_prefixed @@ -1787,10 +1787,10 @@ class Dirnode2(testutil.ReallyEqualMixin, testutil.ShouldFailMixin, unittest.Tes self.failIf(n.get_readonly_uri() is None, i) for (i, n) in ro_prefixed: - self.failUnless(n.get_readonly_uri().startswith("ro."), i) + self.failUnless(n.get_readonly_uri().startswith(b"ro."), i) for (i, n) in imm_prefixed: - self.failUnless(n.get_readonly_uri().startswith("imm."), i) + self.failUnless(n.get_readonly_uri().startswith(b"imm."), i) From b1800c457dc4f598585ecb3753413ecff968734f Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:33:56 -0500 Subject: [PATCH 033/186] All tests pass on Python 3. 
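The bulk of this patch converts capability-string literals from `str` to `bytes`. A minimal illustration (hypothetical, not part of the patch itself) of why this is required under Python 3, where mixing text and bytes operands raises `TypeError` instead of silently coercing as Python 2 did:

```python
# Capability strings are handled as bytes throughout the codebase.
write_uri = b"URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q"

# On Python 2, `"URI:SSK-RO" in write_uri` worked; on Python 3 it raises
# TypeError ("a bytes-like object is required"), so both operands must be bytes.
assert b"URI:SSK-RO" in write_uri
assert write_uri.startswith(b"URI:")

# Prefixing a readcap likewise needs bytes literals on both sides of the "+".
readonly = b"ro." + write_uri
assert readonly.startswith(b"ro.URI:")
```

This is why the test assertions above change `"ro." + future_read_uri` to `b"ro." + future_read_uri` and so on, rather than changing any behavior.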
--- src/allmydata/test/test_dirnode.py | 20 ++++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index b3deb9df8..f37926fa4 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -1412,7 +1412,7 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): nodemaker = NodeMaker(None, None, None, None, None, {"k": 3, "n": 10}, None, None) - write_uri = "URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" + write_uri = b"URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" filenode = nodemaker.create_from_cap(write_uri) node = dirnode.DirectoryNode(filenode, nodemaker, None) children = node._unpack_contents(known_tree) @@ -1429,8 +1429,8 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): self.failUnless(u'file3' in children) # Are the metadata for child 3 right? - file3_rocap = "URI:CHK:cmtcxq7hwxvfxan34yiev6ivhy:qvcekmjtoetdcw4kmi7b3rtblvgx7544crnwaqtiewemdliqsokq:3:10:5" - file3_rwcap = "URI:CHK:cmtcxq7hwxvfxan34yiev6ivhy:qvcekmjtoetdcw4kmi7b3rtblvgx7544crnwaqtiewemdliqsokq:3:10:5" + file3_rocap = b"URI:CHK:cmtcxq7hwxvfxan34yiev6ivhy:qvcekmjtoetdcw4kmi7b3rtblvgx7544crnwaqtiewemdliqsokq:3:10:5" + file3_rwcap = b"URI:CHK:cmtcxq7hwxvfxan34yiev6ivhy:qvcekmjtoetdcw4kmi7b3rtblvgx7544crnwaqtiewemdliqsokq:3:10:5" file3_metadata = {'ctime': 1246663897.4336269, 'tahoe': {'linkmotime': 1246663897.4336269, 'linkcrtime': 1246663897.4336269}, 'mtime': 1246663897.4336269} self.failUnlessEqual(file3_metadata, children[u'file3'][1]) self.failUnlessReallyEqual(file3_rocap, @@ -1439,8 +1439,8 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): children[u'file3'][0].get_uri()) # Are the metadata for child 2 right? 
- file2_rocap = "URI:CHK:apegrpehshwugkbh3jlt5ei6hq:5oougnemcl5xgx4ijgiumtdojlipibctjkbwvyygdymdphib2fvq:3:10:4" - file2_rwcap = "URI:CHK:apegrpehshwugkbh3jlt5ei6hq:5oougnemcl5xgx4ijgiumtdojlipibctjkbwvyygdymdphib2fvq:3:10:4" + file2_rocap = b"URI:CHK:apegrpehshwugkbh3jlt5ei6hq:5oougnemcl5xgx4ijgiumtdojlipibctjkbwvyygdymdphib2fvq:3:10:4" + file2_rwcap = b"URI:CHK:apegrpehshwugkbh3jlt5ei6hq:5oougnemcl5xgx4ijgiumtdojlipibctjkbwvyygdymdphib2fvq:3:10:4" file2_metadata = {'ctime': 1246663897.430218, 'tahoe': {'linkmotime': 1246663897.430218, 'linkcrtime': 1246663897.430218}, 'mtime': 1246663897.430218} self.failUnlessEqual(file2_metadata, children[u'file2'][1]) self.failUnlessReallyEqual(file2_rocap, @@ -1449,8 +1449,8 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): children[u'file2'][0].get_uri()) # Are the metadata for child 1 right? - file1_rocap = "URI:CHK:olxtimympo7f27jvhtgqlnbtn4:emzdnhk2um4seixozlkw3qx2nfijvdkx3ky7i7izl47yedl6e64a:3:10:10" - file1_rwcap = "URI:CHK:olxtimympo7f27jvhtgqlnbtn4:emzdnhk2um4seixozlkw3qx2nfijvdkx3ky7i7izl47yedl6e64a:3:10:10" + file1_rocap = b"URI:CHK:olxtimympo7f27jvhtgqlnbtn4:emzdnhk2um4seixozlkw3qx2nfijvdkx3ky7i7izl47yedl6e64a:3:10:10" + file1_rwcap = b"URI:CHK:olxtimympo7f27jvhtgqlnbtn4:emzdnhk2um4seixozlkw3qx2nfijvdkx3ky7i7izl47yedl6e64a:3:10:10" file1_metadata = {'ctime': 1246663897.4275661, 'tahoe': {'linkmotime': 1246663897.4275661, 'linkcrtime': 1246663897.4275661}, 'mtime': 1246663897.4275661} self.failUnlessEqual(file1_metadata, children[u'file1'][1]) self.failUnlessReallyEqual(file1_rocap, @@ -1492,7 +1492,7 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): name: (LiteralFileNode(uri.from_string(one_uri)), {}), } packed = dirnode.pack_children(kids, fn.get_writekey(), deep_immutable=False) - write_uri = "URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" + write_uri = 
b"URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" filenode = nm.create_from_cap(write_uri) dn = dirnode.DirectoryNode(filenode, nm, None) unkids = dn._unpack_contents(packed) @@ -1505,11 +1505,11 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): kids = self._make_kids(nm, ["imm", "lit", "write", "read", "dirwrite", "dirread"]) packed = dirnode.pack_children(kids, fn.get_writekey(), deep_immutable=False) - self.failUnlessIn("lit", packed) + self.failUnlessIn(b"lit", packed) kids = self._make_kids(nm, ["imm", "lit"]) packed = dirnode.pack_children(kids, fn.get_writekey(), deep_immutable=True) - self.failUnlessIn("lit", packed) + self.failUnlessIn(b"lit", packed) kids = self._make_kids(nm, ["imm", "lit", "write"]) self.failUnlessRaises(dirnode.MustBeDeepImmutableError, From 5cba8a438008387d52eecd9942a715ba0fee2b87 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:45:07 -0500 Subject: [PATCH 034/186] Port to Python 3. --- src/allmydata/test/test_dirnode.py | 22 +++++++++++++++------- src/allmydata/util/_python3.py | 1 + 2 files changed, 16 insertions(+), 7 deletions(-) diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index f37926fa4..586d89731 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -1,10 +1,18 @@ -"""Tests for the dirnode module.""" -# from __future__ import absolute_import -# from __future__ import division -# from __future__ import print_function -# from __future__ import unicode_literals +"""Tests for the dirnode module. -from past.builtins import unicode, long +Ported to Python 3. 
+""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from past.builtins import long + +from future.utils import PY2 +if PY2: + # Skip list() since it results in spurious test failures + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, object, range, str, max, min # noqa: F401 import six import time @@ -1468,7 +1476,7 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): } kids = {} for name in which: - kids[unicode(name)] = (nm.create_from_cap(caps[name]), {}) + kids[str(name)] = (nm.create_from_cap(caps[name]), {}) return kids @given(text(min_size=1, max_size=20)) diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 9763c35d7..7b454cbb7 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -134,6 +134,7 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_crypto", "allmydata.test.test_deferredutil", "allmydata.test.test_dictutil", + "allmydata.test.test_dirnode", "allmydata.test.test_download", "allmydata.test.test_encode", "allmydata.test.test_encodingutil", From bb06067c33d6a226fbc5d7233e309db63ccf27e5 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:48:33 -0500 Subject: [PATCH 035/186] Port to Python 3. --- src/allmydata/dirnode.py | 21 +++++++++++++++++---- src/allmydata/util/_python3.py | 1 + 2 files changed, 18 insertions(+), 4 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 7b0c97540..6c4af7461 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -1,4 +1,15 @@ -"""Directory Node implementation.""" +"""Directory Node implementation. + +Ported to Python 3. 
+""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 from past.builtins import unicode import time @@ -37,6 +48,8 @@ from eliot.twisted import ( NAME = Field.for_types( u"name", + # Make sure this works on Python 2; with str, it gets Future str which + # breaks Eliot. [unicode], u"The name linking the parent to this node.", ) @@ -250,7 +263,7 @@ def _pack_normalized_children(children, writekey, deep_immutable=False): has_aux = isinstance(children, AuxValueDict) entries = [] for name in sorted(children.keys()): - assert isinstance(name, unicode) + assert isinstance(name, str) entry = None (child, metadata) = children[name] child.raise_error() @@ -543,7 +556,7 @@ class DirectoryNode(object): else: pathx = pathx.split("/") for p in pathx: - assert isinstance(p, unicode), p + assert isinstance(p, str), p childnamex = pathx[0] remaining_pathx = pathx[1:] if remaining_pathx: @@ -570,7 +583,7 @@ class DirectoryNode(object): a = Adder(self, overwrite=overwrite, create_readonly_node=self._create_readonly_node) for (namex, e) in entries.items(): - assert isinstance(namex, unicode), namex + assert isinstance(namex, str), namex if len(e) == 2: writecap, readcap = e metadata = None diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 7b454cbb7..79f09af02 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -34,6 +34,7 @@ PORTED_MODULES = [ "allmydata.crypto.error", "allmydata.crypto.rsa", "allmydata.crypto.util", + "allmydata.dirnode", "allmydata.hashtree", "allmydata.immutable.downloader", "allmydata.immutable.downloader.common", From ba424837419482de7498731c06c5eb56f767dadc Mon Sep 17 00:00:00 2001 From: 
Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:50:28 -0500 Subject: [PATCH 036/186] News file. --- newsfragments/3553.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3553.minor diff --git a/newsfragments/3553.minor b/newsfragments/3553.minor new file mode 100644 index 000000000..e69de29bb From 96fd1861d2ffb35c9bc81230ab6f57c3aff493b3 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 9 Dec 2020 14:58:54 -0500 Subject: [PATCH 037/186] Port to Python 3. --- src/allmydata/dirnode.py | 8 +++++--- src/allmydata/nodemaker.py | 12 ++++++++++++ src/allmydata/test/test_dirnode.py | 4 ---- src/allmydata/util/_python3.py | 1 + 4 files changed, 18 insertions(+), 7 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index 6c4af7461..fd6a9cc8c 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -9,7 +9,8 @@ from __future__ import unicode_literals from future.utils import PY2 if PY2: - from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 + # Skip dict so it doesn't break things. 
+ from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, list, object, range, str, max, min # noqa: F401 from past.builtins import unicode import time @@ -268,8 +269,9 @@ def _pack_normalized_children(children, writekey, deep_immutable=False): (child, metadata) = children[name] child.raise_error() if deep_immutable and not child.is_allowed_in_immutable_directory(): - raise MustBeDeepImmutableError("child %s is not allowed in an immutable directory" % - quote_output(name, encoding='utf-8'), name) + raise MustBeDeepImmutableError( + "child %r is not allowed in an immutable directory" % (name,), + name) if has_aux: entry = children.get_aux(name) if not entry: diff --git a/src/allmydata/nodemaker.py b/src/allmydata/nodemaker.py index ab455ce48..6b0b77c5c 100644 --- a/src/allmydata/nodemaker.py +++ b/src/allmydata/nodemaker.py @@ -1,3 +1,15 @@ +""" +Ported to Python 3. +""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 + import weakref from zope.interface import implementer from allmydata.util.assertutil import precondition diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 586d89731..3a073dafe 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -14,7 +14,6 @@ if PY2: # Skip list() since it results in spurious test failures from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, object, range, str, max, min # noqa: F401 -import six import time import unicodedata from zope.interface import implementer @@ -46,9 +45,6 @@ import allmydata.test.common_util as 
testutil from hypothesis import given from hypothesis.strategies import text -if six.PY3: - long = int - @implementer(IConsumer) class MemAccum(object): diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 79f09af02..604e2ccab 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -66,6 +66,7 @@ PORTED_MODULES = [ "allmydata.mutable.retrieve", "allmydata.mutable.servermap", "allmydata.node", + "allmydata.nodemaker", "allmydata.storage_client", "allmydata.storage.common", "allmydata.storage.crawler", From 46f18fbbc30747dd299f44f6647a80cc78cbc088 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 15:47:43 -0500 Subject: [PATCH 038/186] news fragment --- newsfragments/3532.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3532.minor diff --git a/newsfragments/3532.minor b/newsfragments/3532.minor new file mode 100644 index 000000000..e69de29bb From e2963856d34e29fa15439bc9ff5e2088aae021e5 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 15:48:40 -0500 Subject: [PATCH 039/186] Dependency Injection for _tub_portlocation --- src/allmydata/node.py | 19 +++- src/allmydata/test/test_connections.py | 30 +----- src/allmydata/test/test_node.py | 134 ++++++++++++++++--------- 3 files changed, 103 insertions(+), 80 deletions(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index 0fe62c50b..f6e5d1fd1 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -705,8 +705,15 @@ def _convert_tub_port(s): return us -def _tub_portlocation(config): +def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port): """ + Figure out the network location of the main tub for some configuration. + + :param get_local_addresses_sync: A function like + ``iputil.get_local_addresses_sync``. + + :param allocate_tcp_port: A function like ``iputil.allocate_tcp_port``. 
+ :returns: None or tuple of (port, location) for the main tub based on the given configuration. May raise ValueError or PrivacyError if there are problems with the config @@ -746,7 +753,7 @@ def _tub_portlocation(config): file_tubport = fileutil.read(config.portnum_fname).strip() tubport = _convert_tub_port(file_tubport) else: - tubport = "tcp:%d" % iputil.allocate_tcp_port() + tubport = "tcp:%d" % (allocate_tcp_port(),) fileutil.write_atomically(config.portnum_fname, tubport + "\n", mode="") else: @@ -766,7 +773,7 @@ def _tub_portlocation(config): if "AUTO" in split_location: if not reveal_ip: raise PrivacyError("tub.location uses AUTO") - local_addresses = iputil.get_local_addresses_sync() + local_addresses = get_local_addresses_sync() # tubport must be like "tcp:12345" or "tcp:12345:morestuff" local_portnum = int(tubport.split(":")[1]) new_locations = [] @@ -821,7 +828,11 @@ def create_main_tub(config, tub_options, :param tor_provider: None, or a _Provider instance if txtorcon + Tor are installed. """ - portlocation = _tub_portlocation(config) + portlocation = _tub_portlocation( + config, + iputil.get_local_addresses_sync, + iputil.allocate_tcp_port, + ) certfile = config.get_private_path("node.pem") # FIXME? 
"node.pem" was the CERTFILE option/thing tub = create_tub(tub_options, default_connection_handlers, foolscap_connection_handlers, diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 220815203..4c028297f 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -6,7 +6,7 @@ from twisted.internet.interfaces import IStreamClientEndpoint from foolscap.connections import tcp from ..node import PrivacyError, config_from_string from ..node import create_connection_handlers -from ..node import create_main_tub, _tub_portlocation +from ..node import create_main_tub from ..util.i2p_provider import create as create_i2p_provider from ..util.tor_provider import create as create_tor_provider @@ -438,31 +438,3 @@ class Privacy(unittest.TestCase): str(ctx.exception), "tub.location uses AUTO", ) - - def test_tub_location_tcp(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[node]\nreveal-IP-address = false\ntub.location=tcp:hostname:1234\n", - ) - with self.assertRaises(PrivacyError) as ctx: - _tub_portlocation(config) - self.assertEqual( - str(ctx.exception), - "tub.location includes tcp: hint", - ) - - def test_tub_location_legacy_tcp(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[node]\nreveal-IP-address = false\ntub.location=hostname:1234\n", - ) - - with self.assertRaises(PrivacyError) as ctx: - _tub_portlocation(config) - - self.assertEqual( - str(ctx.exception), - "tub.location includes tcp: hint", - ) diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index be3be51b9..32c610af6 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -39,6 +39,7 @@ import foolscap.logging.log from twisted.application import service from allmydata.node import ( + PrivacyError, create_tub_options, create_main_tub, create_node_dir, @@ -513,25 +514,24 @@ class 
TestCase(testutil.SignalMixin, unittest.TestCase): class TestMissingPorts(unittest.TestCase): """ - Test certain error-cases for ports setup + Test certain ``_tub_portlocation`` error cases for ports setup. """ - def setUp(self): self.basedir = self.mktemp() create_node_dir(self.basedir, "testing") + def _get_addr(self): + return ["LOCAL"] + + def _alloc_port(self): + return 999 + def test_parsing_tcp(self): """ - parse explicit tub.port with explicitly-default tub.location + When ``tub.port`` is given and ``tub.location`` is **AUTO** the port + number from ``tub.port`` is used as the port number for the value + constructed for ``tub.location``. """ - get_addr = mock.patch( - "allmydata.util.iputil.get_local_addresses_sync", - return_value=["LOCAL"], - ) - alloc_port = mock.patch( - "allmydata.util.iputil.allocate_tcp_port", - return_value=999, - ) config_data = ( "[node]\n" "tub.port = tcp:777\n" @@ -539,8 +539,11 @@ class TestMissingPorts(unittest.TestCase): ) config = config_from_string(self.basedir, "portnum", config_data) - with get_addr, alloc_port: - tubport, tublocation = _tub_portlocation(config) + tubport, tublocation = _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertEqual(tubport, "tcp:777") self.assertEqual(tublocation, b"tcp:LOCAL:777") @@ -548,21 +551,16 @@ class TestMissingPorts(unittest.TestCase): """ parse empty config, check defaults """ - get_addr = mock.patch( - "allmydata.util.iputil.get_local_addresses_sync", - return_value=["LOCAL"], - ) - alloc_port = mock.patch( - "allmydata.util.iputil.allocate_tcp_port", - return_value=999, - ) config_data = ( "[node]\n" ) config = config_from_string(self.basedir, "portnum", config_data) - with get_addr, alloc_port: - tubport, tublocation = _tub_portlocation(config) + tubport, tublocation = _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertEqual(tubport, "tcp:999") self.assertEqual(tublocation, b"tcp:LOCAL:999") @@ -570,22 +568,17 @@ class 
TestMissingPorts(unittest.TestCase): """ location with two options (including defaults) """ - get_addr = mock.patch( - "allmydata.util.iputil.get_local_addresses_sync", - return_value=["LOCAL"], - ) - alloc_port = mock.patch( - "allmydata.util.iputil.allocate_tcp_port", - return_value=999, - ) config_data = ( "[node]\n" "tub.location = tcp:HOST:888,AUTO\n" ) config = config_from_string(self.basedir, "portnum", config_data) - with get_addr, alloc_port: - tubport, tublocation = _tub_portlocation(config) + tubport, tublocation = _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertEqual(tubport, "tcp:999") self.assertEqual(tublocation, b"tcp:HOST:888,tcp:LOCAL:999") @@ -593,14 +586,6 @@ class TestMissingPorts(unittest.TestCase): """ parse config with both port + location disabled """ - get_addr = mock.patch( - "allmydata.util.iputil.get_local_addresses_sync", - return_value=["LOCAL"], - ) - alloc_port = mock.patch( - "allmydata.util.iputil.allocate_tcp_port", - return_value=999, - ) config_data = ( "[node]\n" "tub.port = disabled\n" @@ -608,8 +593,11 @@ class TestMissingPorts(unittest.TestCase): ) config = config_from_string(self.basedir, "portnum", config_data) - with get_addr, alloc_port: - res = _tub_portlocation(config) + res = _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertTrue(res is None) def test_empty_tub_port(self): @@ -623,7 +611,11 @@ class TestMissingPorts(unittest.TestCase): config = config_from_string(self.basedir, "portnum", config_data) with self.assertRaises(ValueError) as ctx: - _tub_portlocation(config) + _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertIn( "tub.port must not be empty", str(ctx.exception) @@ -640,7 +632,11 @@ class TestMissingPorts(unittest.TestCase): config = config_from_string(self.basedir, "portnum", config_data) with self.assertRaises(ValueError) as ctx: - _tub_portlocation(config) + _tub_portlocation( + config, + self._get_addr, + 
self._alloc_port, + ) self.assertIn( "tub.location must not be empty", str(ctx.exception) @@ -658,7 +654,11 @@ class TestMissingPorts(unittest.TestCase): config = config_from_string(self.basedir, "portnum", config_data) with self.assertRaises(ValueError) as ctx: - _tub_portlocation(config) + _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertIn( "tub.port is disabled, but not tub.location", str(ctx.exception) @@ -676,12 +676,52 @@ class TestMissingPorts(unittest.TestCase): config = config_from_string(self.basedir, "portnum", config_data) with self.assertRaises(ValueError) as ctx: - _tub_portlocation(config) + _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) self.assertIn( "tub.location is disabled, but not tub.port", str(ctx.exception) ) + def test_tub_location_tcp(self): + config = config_from_string( + "fake.port", + "no-basedir", + "[node]\nreveal-IP-address = false\ntub.location=tcp:hostname:1234\n", + ) + with self.assertRaises(PrivacyError) as ctx: + _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) + self.assertEqual( + str(ctx.exception), + "tub.location includes tcp: hint", + ) + + def test_tub_location_legacy_tcp(self): + config = config_from_string( + "fake.port", + "no-basedir", + "[node]\nreveal-IP-address = false\ntub.location=hostname:1234\n", + ) + + with self.assertRaises(PrivacyError) as ctx: + _tub_portlocation( + config, + self._get_addr, + self._alloc_port, + ) + + self.assertEqual( + str(ctx.exception), + "tub.location includes tcp: hint", + ) + BASE_CONFIG = """ [tor] From 89441d91694ba3a61e4a0262e95acb49c32af33d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 9 Dec 2020 16:18:48 -0500 Subject: [PATCH 040/186] Refactor create_connection_handlers so we don't need Tor and I2P mocks --- src/allmydata/node.py | 57 +++++++++++++++++++++------------ src/allmydata/test/test_node.py | 42 +++++++++++++++++------- 2 files changed, 67 insertions(+), 32 deletions(-) 
diff --git a/src/allmydata/node.py b/src/allmydata/node.py index f6e5d1fd1..344f463e5 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -616,28 +616,20 @@ def _make_tcp_handler(): return default() -def create_connection_handlers(reactor, config, i2p_provider, tor_provider): +def create_default_connection_handlers(reactor, config, handlers): """ - :returns: 2-tuple of default_connection_handlers, foolscap_connection_handlers + :return: A dictionary giving the default connection handlers. The keys + are strings like "tcp" and the values are strings like "tor" or + ``None``. """ reveal_ip = config.get_config("node", "reveal-IP-address", True, boolean=True) - # We store handlers for everything. None means we were unable to - # create that handler, so hints which want it will be ignored. - handlers = foolscap_connection_handlers = { - "tcp": _make_tcp_handler(), - "tor": tor_provider.get_tor_handler(), - "i2p": i2p_provider.get_i2p_handler(), - } - log.msg( - format="built Foolscap connection handlers for: %(known_handlers)s", - known_handlers=sorted([k for k,v in handlers.items() if v]), - facility="tahoe.node", - umid="PuLh8g", - ) - - # then we remember the default mappings from tahoe.cfg - default_connection_handlers = {"tor": "tor", "i2p": "i2p"} + # Remember the default mappings from tahoe.cfg + default_connection_handlers = { + name: name + for name + in handlers + } tcp_handler_name = config.get_config("connections", "tcp", "tcp").lower() if tcp_handler_name == "disabled": default_connection_handlers["tcp"] = None @@ -662,10 +654,35 @@ def create_connection_handlers(reactor, config, i2p_provider, tor_provider): if not reveal_ip: if default_connection_handlers.get("tcp") == "tcp": - raise PrivacyError("tcp = tcp, must be set to 'tor' or 'disabled'") - return default_connection_handlers, foolscap_connection_handlers + raise PrivacyError("tcp = tcp, must not be set to 'tcp'") + return default_connection_handlers +def 
create_connection_handlers(reactor, config, i2p_provider, tor_provider): + """ + :returns: 2-tuple of default_connection_handlers, foolscap_connection_handlers + """ + # We store handlers for everything. None means we were unable to + # create that handler, so hints which want it will be ignored. + handlers = { + "tcp": _make_tcp_handler(), + } + handlers.update({ + "tor": tor_provider.get_tor_handler(), + "i2p": i2p_provider.get_i2p_handler(), + }) + log.msg( + format="built Foolscap connection handlers for: %(known_handlers)s", + known_handlers=sorted([k for k,v in handlers.items() if v]), + facility="tahoe.node", + umid="PuLh8g", + ) + return create_default_connection_handlers( + reactor, + config, + handlers, + ), handlers + def create_tub(tub_options, default_connection_handlers, foolscap_connection_handlers, handler_overrides={}, **kwargs): diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index 32c610af6..e921e0408 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -36,6 +36,7 @@ from twisted.trial import unittest from twisted.internet import defer import foolscap.logging.log +from foolscap.connections.tcp import default as make_tcp_handler from twisted.application import service from allmydata.node import ( @@ -43,6 +44,7 @@ from allmydata.node import ( create_tub_options, create_main_tub, create_node_dir, + create_default_connection_handlers, create_connection_handlers, config_from_string, read_config, @@ -778,14 +780,19 @@ class Listeners(unittest.TestCase): f.write("tub.location = AUTO\n") config = client.read_config(basedir, "client.port") - i2p_provider = mock.Mock() - tor_provider = mock.Mock() - dfh, fch = create_connection_handlers(None, config, i2p_provider, tor_provider) + fch = {"tcp": make_tcp_handler()} + dfh = create_default_connection_handlers( + None, + config, + fch, + ) tub_options = create_tub_options(config) t = FakeTub() with mock.patch("allmydata.node.Tub", 
return_value=t): with self.assertRaises(ValueError) as ctx: + i2p_provider = mock.Mock() + tor_provider = mock.Mock() create_main_tub(config, tub_options, dfh, fch, i2p_provider, tor_provider) self.assertIn( "you must choose", @@ -818,13 +825,18 @@ class Listeners(unittest.TestCase): f.write("tub.location = %s\n" % location) config = client.read_config(basedir, "client.port") - i2p_provider = mock.Mock() - tor_provider = mock.Mock() - dfh, fch = create_connection_handlers(None, config, i2p_provider, tor_provider) + fch = {"tcp": make_tcp_handler()} + dfh = create_default_connection_handlers( + None, + config, + fch, + ) tub_options = create_tub_options(config) t = FakeTub() with mock.patch("allmydata.node.Tub", return_value=t): + i2p_provider = mock.Mock() + tor_provider = mock.Mock() create_main_tub(config, tub_options, dfh, fch, i2p_provider, tor_provider) self.assertEqual(t.listening_ports, ["tcp:%d:interface=127.0.0.1" % port1, @@ -844,11 +856,16 @@ class Listeners(unittest.TestCase): tub_options = create_tub_options(config) t = FakeTub() - i2p_provider = mock.Mock() - tor_provider = mock.Mock() - dfh, fch = create_connection_handlers(None, config, i2p_provider, tor_provider) + fch = {"tcp": make_tcp_handler()} + dfh = create_default_connection_handlers( + None, + config, + fch, + ) with mock.patch("allmydata.node.Tub", return_value=t): + i2p_provider = mock.Mock() + tor_provider = mock.Mock() create_main_tub(config, tub_options, dfh, fch, i2p_provider, tor_provider) self.assertEqual(i2p_provider.get_listener.mock_calls, [mock.call()]) @@ -990,8 +1007,9 @@ class CreateConnectionHandlers(unittest.TestCase): tcp = disabled """)) reactor = object() # it's not actually used?! 
- provider = FakeProvider() - default_handlers, _ = create_connection_handlers( - reactor, config, provider, provider + default_handlers = create_default_connection_handlers( + reactor, + config, + {}, ) self.assertIs(default_handlers["tcp"], None) From d29d9c57e7df1a3260e7105c8f7476ba933de6e4 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Thu, 10 Dec 2020 06:59:41 -0500 Subject: [PATCH 041/186] These values aren't used for the exercised codepaths So just use None instead. Kind of a weak fix but a fix nonetheless. --- src/allmydata/test/test_node.py | 8 ++------ 1 file changed, 2 insertions(+), 6 deletions(-) diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index e921e0408..5a2acb690 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -791,9 +791,7 @@ class Listeners(unittest.TestCase): with mock.patch("allmydata.node.Tub", return_value=t): with self.assertRaises(ValueError) as ctx: - i2p_provider = mock.Mock() - tor_provider = mock.Mock() - create_main_tub(config, tub_options, dfh, fch, i2p_provider, tor_provider) + create_main_tub(config, tub_options, dfh, fch, None, None) self.assertIn( "you must choose", str(ctx.exception), @@ -835,9 +833,7 @@ class Listeners(unittest.TestCase): t = FakeTub() with mock.patch("allmydata.node.Tub", return_value=t): - i2p_provider = mock.Mock() - tor_provider = mock.Mock() - create_main_tub(config, tub_options, dfh, fch, i2p_provider, tor_provider) + create_main_tub(config, tub_options, dfh, fch, None, None) self.assertEqual(t.listening_ports, ["tcp:%d:interface=127.0.0.1" % port1, "tcp:%d:interface=127.0.0.1" % port2]) From fb621f438832cad4600e4e299df4a0abd3e56be9 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 10:11:43 -0500 Subject: [PATCH 042/186] Port idlib to Python 3, making its behavior consistent across Python 2 and 3. 
--- src/allmydata/test/test_util.py | 4 +++- src/allmydata/util/_python3.py | 1 + src/allmydata/util/idlib.py | 24 ++++++++++++++++++++++-- 3 files changed, 26 insertions(+), 3 deletions(-) diff --git a/src/allmydata/test/test_util.py b/src/allmydata/test/test_util.py index c671caa31..c556eb4b9 100644 --- a/src/allmydata/test/test_util.py +++ b/src/allmydata/test/test_util.py @@ -33,7 +33,9 @@ if six.PY3: class IDLib(unittest.TestCase): def test_nodeid_b2a(self): - self.failUnlessEqual(idlib.nodeid_b2a(b"\x00"*20), "a"*32) + result = idlib.nodeid_b2a(b"\x00"*20) + self.assertEqual(result, "a"*32) + self.assertIsInstance(result, str) class MyList(list): diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 604e2ccab..e81157a98 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -94,6 +94,7 @@ PORTED_MODULES = [ "allmydata.util.happinessutil", "allmydata.util.hashutil", "allmydata.util.humanreadable", + "allmydata.util.idlib", "allmydata.util.iputil", "allmydata.util.jsonbytes", "allmydata.util.log", diff --git a/src/allmydata/util/idlib.py b/src/allmydata/util/idlib.py index 5e44b9d82..eafcbc388 100644 --- a/src/allmydata/util/idlib.py +++ b/src/allmydata/util/idlib.py @@ -1,9 +1,29 @@ +""" +Ported to Python 3. +""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 +from six import ensure_text from foolscap import base32 + + def nodeid_b2a(nodeid): - # we display nodeids using the same base32 alphabet that Foolscap uses - return base32.encode(nodeid) + """ + We display nodeids using the same base32 alphabet that Foolscap uses. + + Returns a Unicode string. 
+ """ + return ensure_text(base32.encode(nodeid)) def shortnodeid_b2a(nodeid): + """ + Short version of nodeid_b2a() output, Unicode string. + """ return nodeid_b2a(nodeid)[:8] From 36bf9224e68c8391098c958bf66aa5c6383f03e9 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 10:52:07 -0500 Subject: [PATCH 043/186] More progress on Python 3, unbreak Python 2. --- src/allmydata/client.py | 4 ---- src/allmydata/immutable/upload.py | 2 +- src/allmydata/test/common_web.py | 9 +++++++-- src/allmydata/test/test_system.py | 14 +++++++------- 4 files changed, 15 insertions(+), 14 deletions(-) diff --git a/src/allmydata/client.py b/src/allmydata/client.py index 4c7a29d8d..c60c7d4c9 100644 --- a/src/allmydata/client.py +++ b/src/allmydata/client.py @@ -904,10 +904,6 @@ class _Client(node.Node, pollmixin.PollMixin): if helper_furl in ("None", ""): helper_furl = None - # FURLs need to be bytes: - if helper_furl is not None: - helper_furl = helper_furl.encode("utf-8") - DEP = self.encoding_params DEP["k"] = int(self.config.get_config("client", "shares.needed", DEP["k"])) DEP["n"] = int(self.config.get_config("client", "shares.total", DEP["n"])) diff --git a/src/allmydata/immutable/upload.py b/src/allmydata/immutable/upload.py index e77cbb30b..855f38919 100644 --- a/src/allmydata/immutable/upload.py +++ b/src/allmydata/immutable/upload.py @@ -1825,7 +1825,7 @@ class Uploader(service.MultiService, log.PrefixingLogMixin): def startService(self): service.MultiService.startService(self) if self._helper_furl: - self.parent.tub.connectTo(self._helper_furl, + self.parent.tub.connectTo(self._helper_furl.encode("utf-8"), self._got_helper) def _got_helper(self, helper): diff --git a/src/allmydata/test/common_web.py b/src/allmydata/test/common_web.py index a6278810b..faf546f7c 100644 --- a/src/allmydata/test/common_web.py +++ b/src/allmydata/test/common_web.py @@ -47,12 +47,17 @@ class VerboseError(Error): @inlineCallbacks def do_http(method, url, **kwargs): + """ + 
Run HTTP query, return Deferred of body as Unicode. + """ response = yield treq.request(method, url, persistent=False, **kwargs) - body = yield treq.content(response) + body = yield treq.text_content(response, "utf-8") # TODO: replace this with response.fail_for_status when # https://github.com/twisted/treq/pull/159 has landed if 400 <= response.code < 600: - raise VerboseError(response.code, response=body) + raise VerboseError( + response.code, response="For request {} to {}, got: {}".format( + method, url, body)) returnValue(body) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index c521e1e2f..5bdcd6df9 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -632,7 +632,7 @@ def _render_config(config): """ Convert a ``dict`` of ``dict`` of ``bytes`` to an ini-format string. """ - return "\n\n".join(list( + return u"\n\n".join(list( _render_config_section(k, v) for (k, v) in config.items() @@ -643,7 +643,7 @@ def _render_config_section(heading, values): Convert a ``bytes`` heading and a ``dict`` of ``bytes`` to an ini-format section as ``bytes``. """ - return "[{}]\n{}\n".format( + return u"[{}]\n{}\n".format( heading, _render_section_values(values) ) @@ -652,8 +652,8 @@ def _render_section_values(values): Convert a ``dict`` of ``bytes`` to the body of an ini-format section as ``bytes``. 
""" - return "\n".join(list( - "{} = {}".format(k, v) + return u"\n".join(list( + u"{} = {}".format(k, v) for (k, v) in sorted(values.items()) )) @@ -1856,7 +1856,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): "largest-directory-children": 3, "largest-immutable-file": 112, } - for k,v in expected.iteritems(): + for k,v in expected.items(): self.failUnlessEqual(stats[k], v, "stats[%s] was %s, not %s" % (k, stats[k], v)) @@ -1939,7 +1939,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return do_http("post", url, data=body, headers=headers) def _test_web(self, res): - public = "uri/" + self._root_directory_uri + public = "uri/" + unicode(self._root_directory_uri, "ascii") d = self.GET("") def _got_welcome(page): html = page.replace('\n', ' ') @@ -1948,7 +1948,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): "I didn't see the right '%s' message in:\n%s" % (connected_re, page)) # nodeids/tubids don't have any regexp-special characters nodeid_re = r'Node ID:\s*%s' % ( - self.clients[0].get_long_tubid(), self.clients[0].get_long_nodeid()) + self.clients[0].get_long_tubid(), unicode(self.clients[0].get_long_nodeid(), "ascii")) self.failUnless(re.search(nodeid_re, html), "I didn't see the right '%s' message in:\n%s" % (nodeid_re, page)) self.failUnless("Helper: 0 active uploads" in page) From c356ced49b93fea2a126229438a85f2a350a2828 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 10:56:41 -0500 Subject: [PATCH 044/186] Another passing test on Python 3. 
--- src/allmydata/test/test_system.py | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 5bdcd6df9..9d4f0257f 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1342,7 +1342,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): assert pieces[-5].startswith("client") client_num = int(pieces[-5][-1]) storage_index_s = pieces[-1] - storage_index = si_a2b(storage_index_s) + storage_index = si_a2b(storage_index_s.encode("ascii")) for sharename in filenames: shnum = int(sharename) filename = os.path.join(dirpath, sharename) @@ -1375,7 +1375,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): elif which == "signature": signature = self.flip_bit(signature) elif which == "share_hash_chain": - nodenum = share_hash_chain.keys()[0] + nodenum = list(share_hash_chain.keys())[0] share_hash_chain[nodenum] = self.flip_bit(share_hash_chain[nodenum]) elif which == "block_hash_tree": block_hash_tree[-1] = self.flip_bit(block_hash_tree[-1]) @@ -1451,7 +1451,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): self.failUnless(" share_hash_chain: " in output) self.failUnless(" block_hash_tree: 1 nodes\n" in output) expected = (" verify-cap: URI:SSK-Verifier:%s:" % - base32.b2a(storage_index)) + unicode(base32.b2a(storage_index), "ascii")) self.failUnless(expected in output) except unittest.FailTest: print() @@ -1580,9 +1580,9 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): def _check_empty_file(res): # make sure we can create empty files, this usually screws up the # segsize math - d1 = self.clients[2].create_mutable_file(MutableData("")) + d1 = self.clients[2].create_mutable_file(MutableData(b"")) d1.addCallback(lambda newnode: newnode.download_best_version()) - d1.addCallback(lambda res: self.failUnlessEqual("", res)) + d1.addCallback(lambda res: 
self.failUnlessEqual(b"", res)) return d1 d.addCallback(_check_empty_file) From 5c1d904f57cac23371a6752a3c2af3323e3c00f9 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 11:00:15 -0500 Subject: [PATCH 045/186] Skip test on Python 3 for now, since that is not going to work in short term. --- src/allmydata/test/test_system.py | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 9d4f0257f..2f4a503f4 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1,9 +1,12 @@ from __future__ import print_function +from future.utils import PY3 + from past.builtins import unicode, chr as byteschr, long import os, re, sys, time, json from functools import partial +from unittest import skipIf from bs4 import BeautifulSoup @@ -2606,6 +2609,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return d + @skipIf(PY3, "Python 3 CLI support hasn't happened yet.") def test_filesystem_with_cli_in_subprocess(self): # We do this in a separate test so that test_filesystem doesn't skip if we can't run bin/tahoe. From 37554162b0a020f2e37c7f3001cfa55244d08134 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 11:17:42 -0500 Subject: [PATCH 046/186] News file. --- newsfragments/3552.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3552.minor diff --git a/newsfragments/3552.minor b/newsfragments/3552.minor new file mode 100644 index 000000000..e69de29bb From ba9e0db66eb83533c69d3f00ae4691884e9536bb Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 11:17:46 -0500 Subject: [PATCH 047/186] Skip test_filesystem on Python 3 for now. 
--- src/allmydata/test/test_system.py | 1 + 1 file changed, 1 insertion(+) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 2f4a503f4..ed9ffb2e2 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1629,6 +1629,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): # the key, which should cause the download to fail the post-download # plaintext_hash check. + @skipIf(PY3, "Python 3 web support hasn't happened yet.") def test_filesystem(self): self.basedir = "system/SystemTest/test_filesystem" self.data = LARGE_DATA From a2e2ee596b22718fb088e31e454c9a3bf73dbcbe Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 10 Dec 2020 11:47:02 -0500 Subject: [PATCH 048/186] Some progress(?) towards passing tests. --- src/allmydata/scripts/debug.py | 2 +- src/allmydata/test/common_util.py | 4 ++ src/allmydata/test/test_system.py | 73 +++++++++++++++++-------------- src/allmydata/util/base32.py | 3 ++ 4 files changed, 49 insertions(+), 33 deletions(-) diff --git a/src/allmydata/scripts/debug.py b/src/allmydata/scripts/debug.py index 6afbce909..fd3f2b87c 100644 --- a/src/allmydata/scripts/debug.py +++ b/src/allmydata/scripts/debug.py @@ -638,7 +638,7 @@ def find_shares(options): from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path out = options.stdout - sharedir = storage_index_to_dir(si_a2b(options.si_s.encode("ascii"))) + sharedir = storage_index_to_dir(si_a2b(options.si_s)) for d in options.nodedirs: d = os.path.join(d, "storage", "shares", sharedir) if os.path.exists(d): diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py index e3f5cf750..35dc8a191 100644 --- a/src/allmydata/test/common_util.py +++ b/src/allmydata/test/common_util.py @@ -17,6 +17,8 @@ from allmydata.util.encodingutil import unicode_platform, get_filesystem_encodin from future.utils import bord, bchr, binary_type from past.builtins import 
unicode +from ..util.encodingutil import unicode_to_argv + def skip_if_cannot_represent_filename(u): precondition(isinstance(u, unicode)) @@ -36,6 +38,8 @@ def skip_if_cannot_represent_argv(u): raise unittest.SkipTest("A non-ASCII argv could not be encoded on this platform.") def run_cli(verb, *args, **kwargs): + args = [(unicode_to_argv(arg) if isinstance(arg, unicode) else arg) + for arg in args] precondition(not [True for arg in args if not isinstance(arg, str)], "arguments to do_cli must be strs -- convert using unicode_to_argv", args=args) nodeargs = kwargs.get("nodeargs", []) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index ed9ffb2e2..7a9ba50b9 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -1,8 +1,18 @@ +""" +Ported to Python 3, partially: test_filesystem* will be done in a future round. +""" from __future__ import print_function +from __future__ import absolute_import +from __future__ import division +from __future__ import unicode_literals -from future.utils import PY3 +from future.utils import PY2, PY3 +if PY2: + # Don't import bytes since it causes issues on (so far unported) modules on Python 2. 
+ from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, dict, list, object, range, max, min, str # noqa: F401 -from past.builtins import unicode, chr as byteschr, long +from past.builtins import chr as byteschr, long +from six import ensure_text import os, re, sys, time, json from functools import partial @@ -1454,7 +1464,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): self.failUnless(" share_hash_chain: " in output) self.failUnless(" block_hash_tree: 1 nodes\n" in output) expected = (" verify-cap: URI:SSK-Verifier:%s:" % - unicode(base32.b2a(storage_index), "ascii")) + str(base32.b2a(storage_index), "ascii")) self.failUnless(expected in output) except unittest.FailTest: print() @@ -1533,7 +1543,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): for (client_num, storage_index, filename, shnum) in shares ]) assert len(where) == 10 # this test is designed for 3-of-10 - for shnum, filename in where.items(): + for shnum, filename in list(where.items()): # shares 7,8,9 are left alone. read will check # (share_hash_chain, block_hash_tree, share_data). 
New # seqnum+R pairs will trigger a check of (seqnum, R, IV, @@ -1860,7 +1870,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): "largest-directory-children": 3, "largest-immutable-file": 112, } - for k,v in expected.items(): + for k,v in list(expected.items()): self.failUnlessEqual(stats[k], v, "stats[%s] was %s, not %s" % (k, stats[k], v)) @@ -1909,33 +1919,33 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return do_http("get", self.webish_url + urlpath) def POST(self, urlpath, use_helper=False, **fields): - sepbase = "boogabooga" - sep = "--" + sepbase + sepbase = b"boogabooga" + sep = b"--" + sepbase form = [] form.append(sep) - form.append('Content-Disposition: form-data; name="_charset"') - form.append('') - form.append('UTF-8') + form.append(b'Content-Disposition: form-data; name="_charset"') + form.append(b'') + form.append(b'UTF-8') form.append(sep) - for name, value in fields.iteritems(): + for name, value in fields.items(): if isinstance(value, tuple): filename, value = value - form.append('Content-Disposition: form-data; name="%s"; ' - 'filename="%s"' % (name, filename.encode("utf-8"))) + form.append(b'Content-Disposition: form-data; name="%s"; ' + b'filename="%s"' % (name, filename.encode("utf-8"))) else: - form.append('Content-Disposition: form-data; name="%s"' % name) - form.append('') - form.append(str(value)) + form.append(b'Content-Disposition: form-data; name="%s"' % name) + form.append(b'') + form.append(b"%s" % (value,)) form.append(sep) - form[-1] += "--" - body = "" + form[-1] += b"--" + body = b"" headers = {} if fields: - body = "\r\n".join(form) + "\r\n" - headers["content-type"] = "multipart/form-data; boundary=%s" % sepbase + body = b"\r\n".join(form) + b"\r\n" + headers["content-type"] = "multipart/form-data; boundary=%s" % str(sepbase, "ascii") return self.POST2(urlpath, body, headers, use_helper) - def POST2(self, urlpath, body="", headers={}, use_helper=False): + def POST2(self, 
urlpath, body=b"", headers={}, use_helper=False): if use_helper: url = self.helper_webish_url + urlpath else: @@ -1943,7 +1953,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return do_http("post", url, data=body, headers=headers) def _test_web(self, res): - public = "uri/" + unicode(self._root_directory_uri, "ascii") + public = "uri/" + str(self._root_directory_uri, "ascii") d = self.GET("") def _got_welcome(page): html = page.replace('\n', ' ') @@ -1952,7 +1962,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): "I didn't see the right '%s' message in:\n%s" % (connected_re, page)) # nodeids/tubids don't have any regexp-special characters nodeid_re = r'Node ID:\s*%s' % ( - self.clients[0].get_long_tubid(), unicode(self.clients[0].get_long_nodeid(), "ascii")) + self.clients[0].get_long_tubid(), str(self.clients[0].get_long_nodeid(), "ascii")) self.failUnless(re.search(nodeid_re, html), "I didn't see the right '%s' message in:\n%s" % (nodeid_re, page)) self.failUnless("Helper: 0 active uploads" in page) @@ -2013,7 +2023,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): # upload a file with PUT d.addCallback(self.log, "about to try PUT") d.addCallback(lambda res: self.PUT(public + "/subdir3/new.txt", - "new.txt contents")) + b"new.txt contents")) d.addCallback(lambda res: self.GET(public + "/subdir3/new.txt")) d.addCallback(self.failUnlessEqual, "new.txt contents") # and again with something large enough to use multiple segments, @@ -2024,23 +2034,23 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): c.encoding_params['happy'] = 1 d.addCallback(_new_happy_semantics) d.addCallback(lambda res: self.PUT(public + "/subdir3/big.txt", - "big" * 500000)) # 1.5MB + b"big" * 500000)) # 1.5MB d.addCallback(lambda res: self.GET(public + "/subdir3/big.txt")) d.addCallback(lambda res: self.failUnlessEqual(len(res), 1500000)) # can we replace files in place? 
d.addCallback(lambda res: self.PUT(public + "/subdir3/new.txt", - "NEWER contents")) + b"NEWER contents")) d.addCallback(lambda res: self.GET(public + "/subdir3/new.txt")) d.addCallback(self.failUnlessEqual, "NEWER contents") # test unlinked POST - d.addCallback(lambda res: self.POST("uri", t="upload", - file=("new.txt", "data" * 10000))) + d.addCallback(lambda res: self.POST("uri", t=b"upload", + file=("new.txt", b"data" * 10000))) # and again using the helper, which exercises different upload-status # display code - d.addCallback(lambda res: self.POST("uri", use_helper=True, t="upload", - file=("foo.txt", "data2" * 10000))) + d.addCallback(lambda res: self.POST("uri", use_helper=True, t=b"upload", + file=("foo.txt", b"data2" * 10000))) # check that the status page exists d.addCallback(lambda res: self.GET("status")) @@ -2164,7 +2174,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): # exercise some of the diagnostic tools in runner.py # find a share - for (dirpath, dirnames, filenames) in os.walk(unicode(self.basedir)): + for (dirpath, dirnames, filenames) in os.walk(ensure_text(self.basedir)): if "storage" not in dirpath: continue if not filenames: @@ -2211,7 +2221,6 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): # 'find-shares' tool sharedir, shnum = os.path.split(filename) storagedir, storage_index_s = os.path.split(sharedir) - storage_index_s = str(storage_index_s) nodedirs = [self.getdir("client%d" % i) for i in range(self.numclients)] rc,out,err = yield run_cli("debug", "find-shares", storage_index_s, *nodedirs) diff --git a/src/allmydata/util/base32.py b/src/allmydata/util/base32.py index 287d214ea..75625c817 100644 --- a/src/allmydata/util/base32.py +++ b/src/allmydata/util/base32.py @@ -133,6 +133,9 @@ def a2b(cs): """ @param cs the base-32 encoded data (as bytes) """ + # Workaround Future newbytes issues by converting to real bytes on Python 2: + if hasattr(cs, "__native__"): + cs = cs.__native__() 
precondition(could_be_base32_encoded(cs), "cs is required to be possibly base32 encoded data.", cs=cs) precondition(isinstance(cs, bytes), cs) From 733223c8d706f946538ab410638e9e17fa81f24b Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 11 Dec 2020 10:34:30 -0500 Subject: [PATCH 049/186] Refactor create_main_tub to make testing tub location logic easier Then take advantage of this and simplify the tub location logic test --- src/allmydata/node.py | 75 +++++++++++++++++++++------------ src/allmydata/test/test_node.py | 48 ++++++++------------- 2 files changed, 65 insertions(+), 58 deletions(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index 344f463e5..24ef6f6a6 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -722,6 +722,10 @@ def _convert_tub_port(s): return us +class PortAssignmentRequired(Exception): + pass + + def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port): """ Figure out the network location of the main tub for some configuration. @@ -778,7 +782,7 @@ def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port): for port in tubport.split(","): if port in ("0", "tcp:0"): - raise ValueError("tub.port cannot be 0: you must choose") + raise PortAssignmentRequired() if cfg_location is None: cfg_location = "AUTO" @@ -821,6 +825,38 @@ def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port): return tubport, location +def set_tub_locations(i2p_provider, tor_provider, tub, portlocation): + """ + Assign a Tub its listener locations. + + :param i2p_provider: See ``allmydata.util.i2p_provider.create``. + :param tor_provider: See ``allmydata.util.tor_provider.create``. 
+ """ + if portlocation is None: + log.msg("Tub is not listening") + else: + tubport, location = portlocation + for port in tubport.split(","): + if port == "listen:i2p": + # the I2P provider will read its section of tahoe.cfg and + # return either a fully-formed Endpoint, or a descriptor + # that will create one, so we don't have to stuff all the + # options into the tub.port string (which would need a lot + # of escaping) + port_or_endpoint = i2p_provider.get_listener() + elif port == "listen:tor": + port_or_endpoint = tor_provider.get_listener() + else: + port_or_endpoint = port + # Foolscap requires native strings: + if isinstance(port_or_endpoint, (bytes, str)): + port_or_endpoint = ensure_str(port_or_endpoint) + tub.listenOn(port_or_endpoint) + tub.setLocation(location) + log.msg("Tub location set to %s" % (location,)) + # the Tub is now ready for tub.registerReference() + + def create_main_tub(config, tub_options, default_connection_handlers, foolscap_connection_handlers, i2p_provider, tor_provider, @@ -851,34 +887,17 @@ def create_main_tub(config, tub_options, iputil.allocate_tcp_port, ) - certfile = config.get_private_path("node.pem") # FIXME? 
"node.pem" was the CERTFILE option/thing - tub = create_tub(tub_options, default_connection_handlers, foolscap_connection_handlers, - handler_overrides=handler_overrides, certFile=certfile) - - if portlocation: - tubport, location = portlocation - for port in tubport.split(","): - if port == "listen:i2p": - # the I2P provider will read its section of tahoe.cfg and - # return either a fully-formed Endpoint, or a descriptor - # that will create one, so we don't have to stuff all the - # options into the tub.port string (which would need a lot - # of escaping) - port_or_endpoint = i2p_provider.get_listener() - elif port == "listen:tor": - port_or_endpoint = tor_provider.get_listener() - else: - port_or_endpoint = port - # Foolscap requires native strings: - if isinstance(port_or_endpoint, (bytes, str)): - port_or_endpoint = ensure_str(port_or_endpoint) - tub.listenOn(port_or_endpoint) - tub.setLocation(location) - log.msg("Tub location set to %s" % (location,)) - # the Tub is now ready for tub.registerReference() - else: - log.msg("Tub is not listening") + # FIXME? 
"node.pem" was the CERTFILE option/thing
+    certfile = config.get_private_path("node.pem")
+    tub = create_tub(
+        tub_options,
+        default_connection_handlers,
+        foolscap_connection_handlers,
+        handler_overrides=handler_overrides,
+        certFile=certfile,
+    )
+    set_tub_locations(i2p_provider, tor_provider, tub, portlocation)
     return tub
diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py
index 5a2acb690..d5280a72b 100644
--- a/src/allmydata/test/test_node.py
+++ b/src/allmydata/test/test_node.py
@@ -40,7 +40,9 @@ from foolscap.connections.tcp import default as make_tcp_handler
 from twisted.application import service
 
 from allmydata.node import (
+    PortAssignmentRequired,
     PrivacyError,
+    set_tub_locations,
     create_tub_options,
     create_main_tub,
     create_node_dir,
@@ -528,6 +530,22 @@ class TestMissingPorts(unittest.TestCase):
     def _alloc_port(self):
         return 999
 
+    def test_listen_on_zero(self):
+        """
+        ``_tub_portlocation`` raises ``PortAssignmentRequired`` when called
+        with a listen address including port 0.
+        """
+        config_data = (
+            "[node]\n"
+            "tub.port = tcp:0\n"
+        )
+        config = config_from_string(self.basedir, "portnum", config_data)
+
+        # XXX The implementation has some shortcomings.  For example, how
+        # about tcp:localhost:0?
+ with self.assertRaises(PortAssignmentRequired): + _tub_portlocation(config, None, None) + def test_parsing_tcp(self): """ When ``tub.port`` is given and ``tub.location`` is **AUTO** the port @@ -767,36 +785,6 @@ class FakeTub(object): class Listeners(unittest.TestCase): - def test_listen_on_zero(self): - """ - Trying to listen on port 0 should be an error - """ - basedir = self.mktemp() - create_node_dir(basedir, "testing") - with open(os.path.join(basedir, "tahoe.cfg"), "w") as f: - f.write(BASE_CONFIG) - f.write("[node]\n") - f.write("tub.port = tcp:0\n") - f.write("tub.location = AUTO\n") - - config = client.read_config(basedir, "client.port") - fch = {"tcp": make_tcp_handler()} - dfh = create_default_connection_handlers( - None, - config, - fch, - ) - tub_options = create_tub_options(config) - t = FakeTub() - - with mock.patch("allmydata.node.Tub", return_value=t): - with self.assertRaises(ValueError) as ctx: - create_main_tub(config, tub_options, dfh, fch, None, None) - self.assertIn( - "you must choose", - str(ctx.exception), - ) - # Randomly allocate a couple distinct port numbers to try out. The test # never actually binds these port numbers so we don't care if they're "in # use" on the system or not. 
We just want a couple distinct values we can From 0cdf66a99127187ce343b6edb1862a9eae222377 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 11 Dec 2020 10:35:22 -0500 Subject: [PATCH 050/186] Make the Tor/I2P "provider" interfaces explicit --- src/allmydata/interfaces.py | 18 ++++++++++++++++++ src/allmydata/util/i2p_provider.py | 13 ++++++++++++- src/allmydata/util/tor_provider.py | 14 ++++++++++++-- 3 files changed, 42 insertions(+), 3 deletions(-) diff --git a/src/allmydata/interfaces.py b/src/allmydata/interfaces.py index b9a757b08..4c83444e3 100644 --- a/src/allmydata/interfaces.py +++ b/src/allmydata/interfaces.py @@ -3173,3 +3173,21 @@ class IAnnounceableStorageServer(Interface): :type: ``IReferenceable`` provider """ ) + + +class IAddressFamily(Interface): + """ + Support for one specific address family. + + This stretches the definition of address family to include things like Tor + and I2P. + """ + def get_listener(): + """ + Return a string endpoint description or an ``IStreamServerEndpoint``. + """ + + def get_client_endpoint(): + """ + Return an ``IStreamClientEndpoint``. 
+ """ diff --git a/src/allmydata/util/i2p_provider.py b/src/allmydata/util/i2p_provider.py index 37789c428..46c57704a 100644 --- a/src/allmydata/util/i2p_provider.py +++ b/src/allmydata/util/i2p_provider.py @@ -2,11 +2,18 @@ from __future__ import absolute_import, print_function, with_statement import os +from zope.interface import ( + implementer, +) + from twisted.internet.defer import inlineCallbacks, returnValue from twisted.internet.endpoints import clientFromString from twisted.internet.error import ConnectionRefusedError, ConnectError from twisted.application import service +from ..interfaces import ( + IAddressFamily, +) def create(reactor, config): """ @@ -135,6 +142,7 @@ def create_config(reactor, cli_config): returnValue((tahoe_config_i2p, i2p_port, i2p_location)) +@implementer(IAddressFamily) class _Provider(service.MultiService): def __init__(self, config, reactor): service.MultiService.__init__(self) @@ -160,7 +168,7 @@ class _Provider(service.MultiService): (privkeyfile, external_port, escaped_sam_port) return i2p_port - def get_i2p_handler(self): + def get_client_endpoint(self): enabled = self._get_i2p_config("enabled", True, boolean=True) if not enabled: return None @@ -188,6 +196,9 @@ class _Provider(service.MultiService): return self._i2p.default(self._reactor, keyfile=keyfile) + # Backwards compatibility alias + get_i2p_handler = get_client_endpoint + def check_dest_config(self): if self._get_i2p_config("dest", False, boolean=True): if not self._txi2p: diff --git a/src/allmydata/util/tor_provider.py b/src/allmydata/util/tor_provider.py index d0ed75c3f..59039b6b7 100644 --- a/src/allmydata/util/tor_provider.py +++ b/src/allmydata/util/tor_provider.py @@ -2,6 +2,10 @@ from __future__ import absolute_import, print_function, with_statement import os +from zope.interface import ( + implementer, +) + from twisted.internet.defer import inlineCallbacks, returnValue from twisted.internet.endpoints import clientFromString, TCP4ServerEndpoint from 
twisted.internet.error import ConnectionRefusedError, ConnectError @@ -9,7 +13,9 @@ from twisted.application import service from .observer import OneShotObserverList from .iputil import allocate_tcp_port - +from ..interfaces import ( + IAddressFamily, +) def create(reactor, config): """ @@ -209,6 +215,7 @@ def create_config(reactor, cli_config): returnValue((tahoe_config_tor, tor_port, tor_location)) +@implementer(IAddressFamily) class _Provider(service.MultiService): def __init__(self, config, reactor): service.MultiService.__init__(self) @@ -228,7 +235,7 @@ class _Provider(service.MultiService): ep = TCP4ServerEndpoint(self._reactor, local_port, interface="127.0.0.1") return ep - def get_tor_handler(self): + def get_client_endpoint(self): enabled = self._get_tor_config("enabled", True, boolean=True) if not enabled: return None @@ -253,6 +260,9 @@ class _Provider(service.MultiService): return self._tor.default_socks() + # Backwards compatibility alias + get_tor_handler = get_client_endpoint + @inlineCallbacks def _make_control_endpoint(self, reactor, update_status): # this will only be called when tahoe.cfg has "[tor] launch = true" From 9259264d27634375c10c6097853d92e753646863 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 11 Dec 2020 10:38:15 -0500 Subject: [PATCH 051/186] Get rid of the remaining mocks --- src/allmydata/test/common.py | 35 ++++++++++++-- src/allmydata/test/test_node.py | 81 ++++++++------------------------- 2 files changed, 51 insertions(+), 65 deletions(-) diff --git a/src/allmydata/test/common.py b/src/allmydata/test/common.py index f93272540..4d313cac7 100644 --- a/src/allmydata/test/common.py +++ b/src/allmydata/test/common.py @@ -62,10 +62,16 @@ from twisted.internet.endpoints import AdoptedStreamServerEndpoint from twisted.trial.unittest import TestCase as _TrialTestCase from allmydata import uri -from allmydata.interfaces import IMutableFileNode, IImmutableFileNode,\ - NotEnoughSharesError, ICheckable, \ - 
IMutableUploadable, SDMF_VERSION, \
-                                 MDMF_VERSION
+from allmydata.interfaces import (
+    IMutableFileNode,
+    IImmutableFileNode,
+    NotEnoughSharesError,
+    ICheckable,
+    IMutableUploadable,
+    SDMF_VERSION,
+    MDMF_VERSION,
+    IAddressFamily,
+)
 from allmydata.check_results import CheckResults, CheckAndRepairResults, \
      DeepCheckResults, DeepCheckAndRepairResults
 from allmydata.storage_client import StubServer
@@ -1145,6 +1151,28 @@ def _corrupt_uri_extension(data, debug=False):
     return corrupt_field(data, 0x0c+uriextoffset, uriextlen)
 
 
+@attr.s
+@implementer(IAddressFamily)
+class FakeProvider(object):
+    """
+    Pretend to provide support for some address family but just hand out
+    canned responses.
+    """
+    _listener = attr.ib(default=None)
+    _handler = attr.ib(default=None)
+
+    def get_listener(self):
+        if self._listener is None:
+            raise Exception("{!r} has no listener.".format(self))
+        return self._listener
+
+    def get_client_endpoint(self):
+        if self._handler is None:
+            raise Exception("{!r} has no client endpoint.".format(self))
+        return self._handler
+
+
 class _TestCaseMixin(object):
     """
    A mixin for ``TestCase`` which collects helpful behaviors for subclasses.
diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index d5280a72b..eead51897 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -15,7 +15,6 @@ import os import stat import sys import time -import mock from textwrap import dedent import configparser @@ -36,7 +35,6 @@ from twisted.trial import unittest from twisted.internet import defer import foolscap.logging.log -from foolscap.connections.tcp import default as make_tcp_handler from twisted.application import service from allmydata.node import ( @@ -69,6 +67,9 @@ from allmydata.util.i2p_provider import create as create_i2p_provider from allmydata.util.tor_provider import create as create_tor_provider import allmydata.test.common_util as testutil +from .common import ( + FakeProvider, +) def port_numbers(): return integers(min_value=1, max_value=2 ** 16 - 1) @@ -796,70 +797,38 @@ class Listeners(unittest.TestCase): ``tub.location`` configuration, the node's *main* port listens on all of them. 
""" - basedir = self.mktemp() - config_fname = os.path.join(basedir, "tahoe.cfg") - os.mkdir(basedir) - os.mkdir(os.path.join(basedir, "private")) port1, port2 = iter(ports) port = ("tcp:%d:interface=127.0.0.1,tcp:%d:interface=127.0.0.1" % (port1, port2)) location = "tcp:localhost:%d,tcp:localhost:%d" % (port1, port2) - with open(config_fname, "w") as f: - f.write(BASE_CONFIG) - f.write("[node]\n") - f.write("tub.port = %s\n" % port) - f.write("tub.location = %s\n" % location) - - config = client.read_config(basedir, "client.port") - fch = {"tcp": make_tcp_handler()} - dfh = create_default_connection_handlers( - None, - config, - fch, - ) - tub_options = create_tub_options(config) t = FakeTub() - - with mock.patch("allmydata.node.Tub", return_value=t): - create_main_tub(config, tub_options, dfh, fch, None, None) + set_tub_locations(None, None, t, (port, location)) self.assertEqual(t.listening_ports, ["tcp:%d:interface=127.0.0.1" % port1, "tcp:%d:interface=127.0.0.1" % port2]) def test_tor_i2p_listeners(self): - basedir = self.mktemp() - config_fname = os.path.join(basedir, "tahoe.cfg") - os.mkdir(basedir) - os.mkdir(os.path.join(basedir, "private")) - with open(config_fname, "w") as f: - f.write(BASE_CONFIG) - f.write("[node]\n") - f.write("tub.port = listen:i2p,listen:tor\n") - f.write("tub.location = tcp:example.org:1234\n") - config = client.read_config(basedir, "client.port") - tub_options = create_tub_options(config) + """ + When configured to listen on an "i2p" or "tor" address, + ``set_tub_locations`` tells the Tub to listen on endpoints supplied by + the given Tor and I2P providers. 
+ """ t = FakeTub() - fch = {"tcp": make_tcp_handler()} - dfh = create_default_connection_handlers( - None, - config, - fch, + i2p_listener = object() + i2p_provider = FakeProvider(i2p_listener) + tor_listener = object() + tor_provider = FakeProvider(tor_listener) + + set_tub_locations( + i2p_provider, + tor_provider, + t, + ("listen:i2p,listen:tor", "tcp:example.org:1234"), ) - - with mock.patch("allmydata.node.Tub", return_value=t): - i2p_provider = mock.Mock() - tor_provider = mock.Mock() - create_main_tub(config, tub_options, dfh, fch, i2p_provider, tor_provider) - - self.assertEqual(i2p_provider.get_listener.mock_calls, [mock.call()]) - self.assertEqual(tor_provider.get_listener.mock_calls, [mock.call()]) self.assertEqual( t.listening_ports, - [ - i2p_provider.get_listener(), - tor_provider.get_listener(), - ] + [i2p_listener, tor_listener], ) @@ -967,16 +936,6 @@ class Configuration(unittest.TestCase): -class FakeProvider(object): - """Emulate Tor and I2P providers.""" - - def get_tor_handler(self): - return "TORHANDLER!" - - def get_i2p_handler(self): - return "I2PHANDLER!" - - class CreateConnectionHandlers(unittest.TestCase): """ Tests for create_connection_handlers(). From b1f478c5dfe3ae009c8a782fda359efd41d55744 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 11 Dec 2020 10:48:34 -0500 Subject: [PATCH 052/186] Note test_system.py is only partially ported. --- src/allmydata/util/_python3.py | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index e81157a98..2d02e96bd 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -163,6 +163,12 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_storage", "allmydata.test.test_storage_client", "allmydata.test.test_storage_web", + + # Only partially ported, test_filesystem_with_cli_in_subprocess and + # test_filesystem methods aren't ported yet, should be done once CLI and + # web are ported respectively. 
+ "allmydata.test.test_system", + "allmydata.test.test_time_format", "allmydata.test.test_upload", "allmydata.test.test_uri", From 36e53caaeb31547fa7117d8d35e640adc124c019 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 11 Dec 2020 11:14:50 -0500 Subject: [PATCH 053/186] Add test coverage for packing UnknownNode with missing read-only URI. --- src/allmydata/test/test_dirnode.py | 24 ++++++++++++++++++++++++ src/allmydata/unknown.py | 8 ++++++++ 2 files changed, 32 insertions(+) diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 3a073dafe..1c265492b 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -1475,6 +1475,30 @@ class Packing(testutil.ReallyEqualMixin, unittest.TestCase): kids[str(name)] = (nm.create_from_cap(caps[name]), {}) return kids + def test_pack_unpack_unknown(self): + """ + Minimal testing for roundtripping unknown URIs. + """ + nm = NodeMaker(None, None, None, None, None, {"k": 3, "n": 10}, None, None) + fn = MinimalFakeMutableFile() + # UnknownNode has massively complex rules about when it's an error. + # Just force it not to be an error. 
+ unknown_rw = UnknownNode(b"whatevs://write", None) + unknown_rw.error = None + unknown_ro = UnknownNode(None, b"whatevs://readonly") + unknown_ro.error = None + kids = { + "unknown_rw": (unknown_rw, {}), + "unknown_ro": (unknown_ro, {}) + } + packed = dirnode.pack_children(kids, fn.get_writekey(), deep_immutable=False) + + write_uri = b"URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" + filenode = nm.create_from_cap(write_uri) + dn = dirnode.DirectoryNode(filenode, nm, None) + unkids = dn._unpack_contents(packed) + self.assertEqual(kids, unkids) + @given(text(min_size=1, max_size=20)) def test_pack_unpack_unicode_hypothesis(self, name): """ diff --git a/src/allmydata/unknown.py b/src/allmydata/unknown.py index 34d084136..f79c88415 100644 --- a/src/allmydata/unknown.py +++ b/src/allmydata/unknown.py @@ -182,3 +182,11 @@ class UnknownNode(object): def check_and_repair(self, monitor, verify, add_lease): return defer.succeed(None) + + def __eq__(self, other): + if not isinstance(other, UnknownNode): + return False + return other.ro_uri == self.ro_uri and other.rw_uri == self.rw_uri + + def __ne__(self, other): + return not (self == other) From efac902e57e01059bdd9f06fb99f0f8b759a7396 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 11 Dec 2020 11:22:39 -0500 Subject: [PATCH 054/186] Slightly better user-facing privacy error message here --- src/allmydata/node.py | 5 ++++- src/allmydata/test/test_connections.py | 3 ++- 2 files changed, 6 insertions(+), 2 deletions(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index 24ef6f6a6..faa8d61c0 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -654,7 +654,10 @@ def create_default_connection_handlers(reactor, config, handlers): if not reveal_ip: if default_connection_handlers.get("tcp") == "tcp": - raise PrivacyError("tcp = tcp, must not be set to 'tcp'") + raise PrivacyError( + "Privacy requested with `reveal-IP-address = false` " + "but 
`tcp = tcp` conflicts with this.", + ) return default_connection_handlers diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 4c028297f..b5b571b0c 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -412,7 +412,8 @@ class Privacy(unittest.TestCase): self.assertEqual( str(ctx.exception), - "tcp = tcp, must be set to 'tor' or 'disabled'", + "Privacy requested with `reveal-IP-address = false` " + "but `tcp = tcp` conflicts with this.", ) def test_connections_tcp_disabled(self): From 9a8f72202ddd88535ce7888df5b4feceea93a1b7 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 11 Dec 2020 11:23:10 -0500 Subject: [PATCH 055/186] Check for these exceptions and report them better --- src/allmydata/scripts/run_common.py | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index fa19c2076..67670ff7e 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -10,7 +10,11 @@ from twisted.application.service import Service from allmydata.scripts.default_nodedir import _default_nodedir from allmydata.util import fileutil -from allmydata.node import read_config +from allmydata.node import ( + PortAssignmentRequired, + PrivacyError, + read_config, +) from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path from allmydata.util.configutil import UnknownConfigError from allmydata.util.deferredutil import HookMixin @@ -147,6 +151,10 @@ class DaemonizeTheRealService(Service, HookMixin): def handle_config_error(fail): if fail.check(UnknownConfigError): self.stderr.write("\nConfiguration error:\n{}\n\n".format(fail.value)) + elif fail.check(PortAssignmentRequired): + self.stderr.write("\ntub.port cannot be 0: you must choose.\n\n") + elif fail.check(PrivacyError): + self.stderr.write("\n{}\n\n".format(fail.value)) else: 
self.stderr.write("\nUnknown error\n") fail.printTraceback(self.stderr) From 36f18e0afb06168688934d87f5170c7266a60637 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 11 Dec 2020 12:30:12 -0500 Subject: [PATCH 056/186] Fix test_filesystem on Python 2. --- src/allmydata/test/test_system.py | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 7a9ba50b9..9fbf53099 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -12,7 +12,7 @@ if PY2: from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, dict, list, object, range, max, min, str # noqa: F401 from past.builtins import chr as byteschr, long -from six import ensure_text +from six import ensure_text, ensure_str import os, re, sys, time, json from functools import partial @@ -2244,7 +2244,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): # allmydata.control (mostly used for performance tests) c0 = self.clients[0] control_furl_file = c0.config.get_private_path("control.furl") - control_furl = open(control_furl_file, "r").read().strip() + control_furl = ensure_str(open(control_furl_file, "r").read().strip()) # it doesn't really matter which Tub we use to connect to the client, # so let's just use our IntroducerNode's d = self.introducer.tub.getReference(control_furl) @@ -2276,7 +2276,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): # sure that works, before we add other aliases. 
root_file = os.path.join(client0_basedir, "private", "root_dir.cap") - f = open(root_file, "w") + f = open(root_file, "wb") f.write(private_uri) f.close() @@ -2669,7 +2669,7 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): return d def _test_checker(self, res): - ut = upload.Data("too big to be literal" * 200, convergence=None) + ut = upload.Data(b"too big to be literal" * 200, convergence=None) d = self._personal_node.add_file(u"big file", ut) d.addCallback(lambda res: self._personal_node.check(Monitor())) From cf6206ca4226ec06eda85f977fe989d26e6330b0 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 11 Dec 2020 12:37:23 -0500 Subject: [PATCH 057/186] Fix test_filesystem_with_cli_in_subprocess on Python 2. --- src/allmydata/test/test_system.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 9fbf53099..d3e6151dd 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -2642,12 +2642,12 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase): out, err, rc_or_sig = res self.failUnlessEqual(rc_or_sig, 0, str(res)) if check_stderr: - self.failUnlessEqual(err, "") + self.failUnlessEqual(err, b"") d.addCallback(_run_in_subprocess, "create-alias", "newalias") d.addCallback(_check_succeeded) - STDIN_DATA = "This is the file to upload from stdin." + STDIN_DATA = b"This is the file to upload from stdin." d.addCallback(_run_in_subprocess, "put", "-", "newalias:tahoe-file", stdin=STDIN_DATA) d.addCallback(_check_succeeded, check_stderr=False) From 42f2f2318cd1077cbd65eba3405e42f83f92cbce Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 11 Dec 2020 13:05:21 -0500 Subject: [PATCH 058/186] Fix some Python 3 tests. 
--- src/allmydata/immutable/downloader/finder.py | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/src/allmydata/immutable/downloader/finder.py b/src/allmydata/immutable/downloader/finder.py index 0636c3fe2..4f6d1aa14 100644 --- a/src/allmydata/immutable/downloader/finder.py +++ b/src/allmydata/immutable/downloader/finder.py @@ -9,6 +9,7 @@ from __future__ import unicode_literals from future.utils import PY2 if PY2: from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 +from six import ensure_str import time now = time.time @@ -98,7 +99,7 @@ class ShareFinder(object): # internal methods def loop(self): - pending_s = ",".join([str(rt.server.get_name(), "utf-8") + pending_s = ",".join([ensure_str(rt.server.get_name()) for rt in self.pending_requests]) # sort? self.log(format="ShareFinder loop: running=%(running)s" " hungry=%(hungry)s, pending=%(pending)s", From 9bf221dea422b9850483f9853cd39c099cbdd9b5 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 11 Dec 2020 13:10:56 -0500 Subject: [PATCH 059/186] Match Foolscap better. 
--- src/allmydata/immutable/upload.py | 3 ++- src/allmydata/introducer/client.py | 6 ++---- 2 files changed, 4 insertions(+), 5 deletions(-) diff --git a/src/allmydata/immutable/upload.py b/src/allmydata/immutable/upload.py index 855f38919..6dae825ac 100644 --- a/src/allmydata/immutable/upload.py +++ b/src/allmydata/immutable/upload.py @@ -11,6 +11,7 @@ from future.utils import PY2, native_str if PY2: from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 from past.builtins import long, unicode +from six import ensure_str import os, time, weakref, itertools from zope.interface import implementer @@ -1825,7 +1826,7 @@ class Uploader(service.MultiService, log.PrefixingLogMixin): def startService(self): service.MultiService.startService(self) if self._helper_furl: - self.parent.tub.connectTo(self._helper_furl.encode("utf-8"), + self.parent.tub.connectTo(ensure_str(self._helper_furl), self._got_helper) def _got_helper(self, helper): diff --git a/src/allmydata/introducer/client.py b/src/allmydata/introducer/client.py index fa1e1efe8..584475524 100644 --- a/src/allmydata/introducer/client.py +++ b/src/allmydata/introducer/client.py @@ -11,7 +11,7 @@ if PY2: from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 from past.builtins import long -from six import ensure_text +from six import ensure_text, ensure_str import time from zope.interface import implementer @@ -39,8 +39,6 @@ class IntroducerClient(service.Service, Referenceable): nickname, my_version, oldest_supported, sequencer, cache_filepath): self._tub = tub - if isinstance(introducer_furl, str): - introducer_furl = introducer_furl.encode("utf-8") self.introducer_furl = introducer_furl assert isinstance(nickname, str) @@ -96,7 +94,7 @@ class IntroducerClient(service.Service, Referenceable): def 
startService(self): service.Service.startService(self) self._introducer_error = None - rc = self._tub.connectTo(self.introducer_furl, self._got_introducer) + rc = self._tub.connectTo(ensure_str(self.introducer_furl), self._got_introducer) self._introducer_reconnector = rc def connect_failed(failure): self.log("Initial Introducer connection failed: perhaps it's down", From 51e50671e5a6c986b41d235b5d541337dcf2e40d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 11 Dec 2020 15:32:24 -0500 Subject: [PATCH 060/186] Get rid of the "tahoe start" and "tahoe stop" and fix the obvious problems This just requires the client node to already be running now. --- src/allmydata/test/check_grid.py | 43 ++++++++------------------------ 1 file changed, 11 insertions(+), 32 deletions(-) diff --git a/src/allmydata/test/check_grid.py b/src/allmydata/test/check_grid.py index 266b7688b..82323a5c5 100644 --- a/src/allmydata/test/check_grid.py +++ b/src/allmydata/test/check_grid.py @@ -23,15 +23,15 @@ down the node after the test finishes. 
To set up the client node, do the following: tahoe create-client DIR - populate DIR/introducer.furl - tahoe start DIR XXXX - tahoe add-alias -d DIR testgrid `tahoe mkdir -d DIR` - pick a 10kB-ish test file, compute its md5sum - tahoe put -d DIR FILE testgrid:old.MD5SUM - tahoe put -d DIR FILE testgrid:recent.MD5SUM - tahoe put -d DIR FILE testgrid:recentdir/recent.MD5SUM - echo "" | tahoe put -d DIR --mutable testgrid:log - echo "" | tahoe put -d DIR --mutable testgrid:recentlog + populate DIR/private/introducers.yaml + $DAEMONIZE tahoe run DIR + tahoe -d DIR create-alias testgrid + # pick a 10kB-ish test file, compute its md5sum + tahoe -d DIR put FILE testgrid:old.MD5SUM + tahoe -d DIR put FILE testgrid:recent.MD5SUM + tahoe -d DIR put FILE testgrid:recentdir/recent.MD5SUM + echo "" | tahoe -d DIR put --mutable - testgrid:log + echo "" | tahoe -d DIR put --mutable - testgrid:recentlog This script will perform the following steps (the kind of compatibility that is being tested is in [brackets]): @@ -104,26 +104,13 @@ class GridTester(object): def cli(self, cmd, *args, **kwargs): print("tahoe", cmd, " ".join(args)) - stdout, stderr = self.command(self.tahoe, cmd, "-d", self.nodedir, + stdout, stderr = self.command(self.tahoe, "-d", self.nodedir, cmd, *args, **kwargs) if not kwargs.get("ignore_stderr", False) and stderr != "": raise CommandFailed("command '%s' had stderr: %s" % (" ".join(args), stderr)) return stdout - def stop_old_node(self): - print("tahoe stop", self.nodedir, "(force)") - self.command(self.tahoe, "stop", self.nodedir, expected_rc=None) - - def start_node(self): - print("tahoe start", self.nodedir) - self.command(self.tahoe, "start", self.nodedir) # XXXX - time.sleep(5) - - def stop_node(self): - print("tahoe stop", self.nodedir) - self.command(self.tahoe, "stop", self.nodedir) - def read_and_check(self, f): expected_md5_s = f[f.find(".")+1:] out = self.cli("get", "testgrid:" + f) @@ -204,19 +191,11 @@ class GridTester(object): fn = prefix + "." 
+ md5sum return fn, data - def run(self): - self.stop_old_node() - self.start_node() - try: - self.do_test() - finally: - self.stop_node() - def main(): config = GridTesterOptions() config.parseOptions() gt = GridTester(config) - gt.run() + gt.do_test() if __name__ == "__main__": main() From f17a5dfafc3420582dc9c05ad29c45c5ca183003 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 17:40:13 -0500 Subject: [PATCH 061/186] key-generator was apparently removed long ago --- src/allmydata/scripts/run_common.py | 5 ++--- src/allmydata/test/test_runner.py | 14 +++++++------- 2 files changed, 9 insertions(+), 10 deletions(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index f62534434..da02c6517 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -46,8 +46,7 @@ def get_pid_from_pidfile(pidfile): def identify_node_type(basedir): """ - :return unicode: None or one of: 'client', 'introducer', or - 'key-generator' + :return unicode: None or one of: 'client' or 'introducer'. """ tac = u'' try: @@ -58,7 +57,7 @@ def identify_node_type(basedir): except OSError: return None - for t in (u"client", u"introducer", u"key-generator"): + for t in (u"client", u"introducer"): if t in tac: return t return None diff --git a/src/allmydata/test/test_runner.py b/src/allmydata/test/test_runner.py index 636880621..ef2b99a19 100644 --- a/src/allmydata/test/test_runner.py +++ b/src/allmydata/test/test_runner.py @@ -144,8 +144,8 @@ class BinTahoe(common_util.SignalMixin, unittest.TestCase, RunBinTahoeMixin): class CreateNode(unittest.TestCase): - # exercise "tahoe create-node", create-introducer, and - # create-key-generator by calling the corresponding code as a subroutine. + # exercise "tahoe create-node" and "tahoe create-introducer" by calling + # the corresponding code as a subroutine. 
def workdir(self, name): basedir = os.path.join("test_runner", "CreateNode", name) @@ -253,11 +253,11 @@ class CreateNode(unittest.TestCase): class RunNode(common_util.SignalMixin, unittest.TestCase, pollmixin.PollMixin, RunBinTahoeMixin): """ - exercise "tahoe run" for both introducer, client node, and key-generator, - by spawning "tahoe run" as a subprocess. This doesn't get us line-level - coverage, but it does a better job of confirming that the user can - actually run "./bin/tahoe run" and expect it to work. This verifies that - bin/tahoe sets up PYTHONPATH and the like correctly. + exercise "tahoe run" for both introducer and client node, by spawning + "tahoe run" as a subprocess. This doesn't get us line-level coverage, but + it does a better job of confirming that the user can actually run + "./bin/tahoe run" and expect it to work. This verifies that bin/tahoe sets + up PYTHONPATH and the like correctly. """ def workdir(self, name): From d8da61205522e28921185c1ae103c98a2905f33d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:20:09 -0500 Subject: [PATCH 062/186] Move the --nodaemon logic out of tahoe_run We're always going to --nodaemon from now on --- src/allmydata/scripts/run_common.py | 8 +------- src/allmydata/scripts/tahoe_run.py | 3 --- 2 files changed, 1 insertion(+), 10 deletions(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index da02c6517..da2e93363 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -195,13 +195,7 @@ def run(config): # Now prepare to turn into a twistd process. This os.chdir is the point # of no return. 
os.chdir(basedir) - twistd_args = [] - if (nodetype in (u"client", u"introducer") - and "--nodaemon" not in config.twistd_args - and "--syslog" not in config.twistd_args - and "--logfile" not in config.twistd_args): - fileutil.make_dirs(os.path.join(basedir, u"logs")) - twistd_args.extend(["--logfile", os.path.join("logs", "twistd.log")]) + twistd_args = ["--nodaemon"] twistd_args.extend(config.twistd_args) twistd_args.append("DaemonizeTahoeNode") # point at our DaemonizeTahoeNodePlugin diff --git a/src/allmydata/scripts/tahoe_run.py b/src/allmydata/scripts/tahoe_run.py index 0a921cc71..cea9bda27 100644 --- a/src/allmydata/scripts/tahoe_run.py +++ b/src/allmydata/scripts/tahoe_run.py @@ -10,6 +10,3 @@ __all__ = [ class RunOptions(_RunOptions): subcommand_name = "run" - - def postOptions(self): - self.twistd_args += ("--nodaemon",) From ed2152e2c822b83163e1972cbcf566691f39f33a Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:20:26 -0500 Subject: [PATCH 063/186] We don't need to check this condition. We're always running. --- src/allmydata/scripts/run_common.py | 34 ++--------------------------- 1 file changed, 2 insertions(+), 32 deletions(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index da2e93363..86d0330a5 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -217,38 +217,8 @@ def run(config): print("found invalid PID file in %s - deleting it" % basedir, file=err) os.remove(pidfile) - # On Unix-like platforms: - # Unless --nodaemon was provided, the twistd.runApp() below spawns off a - # child process, and the parent calls os._exit(0), so there's no way for - # us to get control afterwards, even with 'except SystemExit'. If - # application setup fails (e.g. ImportError), runApp() will raise an - # exception. 
- # - # So if we wanted to do anything with the running child, we'd have two - # options: - # - # * fork first, and have our child wait for the runApp() child to get - # running. (note: just fork(). This is easier than fork+exec, since we - # don't have to get PATH and PYTHONPATH set up, since we're not - # starting a *different* process, just cloning a new instance of the - # current process) - # * or have the user run a separate command some time after this one - # exits. - # - # For Tahoe, we don't need to do anything with the child, so we can just - # let it exit. - # - # On Windows: - # twistd does not fork; it just runs in the current process whether or not - # --nodaemon is specified. (As on Unix, --nodaemon does have the side effect - # of causing us to log to stdout/stderr.) - - if "--nodaemon" in twistd_args or sys.platform == "win32": - verb = "running" - else: - verb = "starting" - - print("%s node in %s" % (verb, quoted_basedir), file=out) + # We always pass --nodaemon so twistd.runApp does not daemonize. 
+ print("running node in %s" % (quoted_basedir,), file=out) twistd.runApp(twistd_config) # we should only reach here if --nodaemon or equivalent was used return 0 From 692285ada3758d6c966cfec6d6ceced141d64489 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:21:16 -0500 Subject: [PATCH 064/186] key-generator was removed --- src/allmydata/scripts/run_common.py | 4 ---- 1 file changed, 4 deletions(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index 86d0330a5..7abb9dd32 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -126,14 +126,10 @@ class DaemonizeTheRealService(Service, HookMixin): def startService(self): - def key_generator_removed(): - return fail(ValueError("key-generator support removed, see #2783")) - def start(): node_to_instance = { u"client": lambda: maybeDeferred(namedAny("allmydata.client.create_client"), self.basedir), u"introducer": lambda: maybeDeferred(namedAny("allmydata.introducer.server.create_introducer"), self.basedir), - u"key-generator": key_generator_removed, } try: From 25c98d742150cd532e1cc7723a005c105b69465a Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:21:35 -0500 Subject: [PATCH 065/186] don't shadow the global --- src/allmydata/scripts/run_common.py | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index 7abb9dd32..8978d892a 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -137,12 +137,12 @@ class DaemonizeTheRealService(Service, HookMixin): except KeyError: raise ValueError("unknown nodetype %s" % self.nodetype) - def handle_config_error(fail): - if fail.check(UnknownConfigError): - self.stderr.write("\nConfiguration error:\n{}\n\n".format(fail.value)) + def handle_config_error(reason): + if reason.check(UnknownConfigError): + 
self.stderr.write("\nConfiguration error:\n{}\n\n".format(reason.value)) else: self.stderr.write("\nUnknown error\n") - fail.printTraceback(self.stderr) + reason.printTraceback(self.stderr) reactor.stop() d = service_factory() From 39631a90bf0ed261bcd0fb09fb9484c09fcecddd Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:23:01 -0500 Subject: [PATCH 066/186] we always use --nodaemon --- src/allmydata/scripts/run_common.py | 1 - 1 file changed, 1 deletion(-) diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py index 8978d892a..698680e78 100644 --- a/src/allmydata/scripts/run_common.py +++ b/src/allmydata/scripts/run_common.py @@ -216,5 +216,4 @@ def run(config): # We always pass --nodaemon so twistd.runApp does not daemonize. print("running node in %s" % (quoted_basedir,), file=out) twistd.runApp(twistd_config) - # we should only reach here if --nodaemon or equivalent was used return 0 From b58b07a9d7d3df20b774298186321e3674700c6f Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:26:22 -0500 Subject: [PATCH 067/186] Fold run_common into tahoe_run since there are no other run-like commands anymore --- src/allmydata/scripts/run_common.py | 219 --------------------------- src/allmydata/scripts/tahoe_run.py | 223 +++++++++++++++++++++++++++- src/allmydata/test/cli/test_cli.py | 4 +- 3 files changed, 220 insertions(+), 226 deletions(-) delete mode 100644 src/allmydata/scripts/run_common.py diff --git a/src/allmydata/scripts/run_common.py b/src/allmydata/scripts/run_common.py deleted file mode 100644 index 698680e78..000000000 --- a/src/allmydata/scripts/run_common.py +++ /dev/null @@ -1,219 +0,0 @@ -from __future__ import print_function - -import os, sys -from allmydata.scripts.common import BasedirOptions -from twisted.scripts import twistd -from twisted.python import usage -from twisted.python.reflect import namedAny -from twisted.internet.defer import maybeDeferred, fail -from 
twisted.application.service import Service - -from allmydata.scripts.default_nodedir import _default_nodedir -from allmydata.util import fileutil -from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path -from allmydata.util.configutil import UnknownConfigError -from allmydata.util.deferredutil import HookMixin - - -def get_pidfile(basedir): - """ - Returns the path to the PID file. - :param basedir: the node's base directory - :returns: the path to the PID file - """ - return os.path.join(basedir, u"twistd.pid") - -def get_pid_from_pidfile(pidfile): - """ - Tries to read and return the PID stored in the node's PID file - (twistd.pid). - :param pidfile: try to read this PID file - :returns: A numeric PID on success, ``None`` if PID file absent or - inaccessible, ``-1`` if PID file invalid. - """ - try: - with open(pidfile, "r") as f: - pid = f.read() - except EnvironmentError: - return None - - try: - pid = int(pid) - except ValueError: - return -1 - - return pid - -def identify_node_type(basedir): - """ - :return unicode: None or one of: 'client' or 'introducer'. - """ - tac = u'' - try: - for fn in listdir_unicode(basedir): - if fn.endswith(u".tac"): - tac = fn - break - except OSError: - return None - - for t in (u"client", u"introducer"): - if t in tac: - return t - return None - - -class RunOptions(BasedirOptions): - optParameters = [ - ("basedir", "C", None, - "Specify which Tahoe base directory should be used." - " This has the same effect as the global --node-directory option." - " [default: %s]" % quote_local_unicode_path(_default_nodedir)), - ] - - def parseArgs(self, basedir=None, *twistd_args): - # This can't handle e.g. 'tahoe run --reactor=foo', since - # '--reactor=foo' looks like an option to the tahoe subcommand, not to - # twistd. So you can either use 'tahoe run' or 'tahoe run NODEDIR - # --TWISTD-OPTIONS'. Note that 'tahoe --node-directory=NODEDIR run - # --TWISTD-OPTIONS' also isn't allowed, unfortunately. 
- - BasedirOptions.parseArgs(self, basedir) - self.twistd_args = twistd_args - - def getSynopsis(self): - return ("Usage: %s [global-options] %s [options]" - " [NODEDIR [twistd-options]]" - % (self.command_name, self.subcommand_name)) - - def getUsage(self, width=None): - t = BasedirOptions.getUsage(self, width) + "\n" - twistd_options = str(MyTwistdConfig()).partition("\n")[2].partition("\n\n")[0] - t += twistd_options.replace("Options:", "twistd-options:", 1) - t += """ - -Note that if any twistd-options are used, NODEDIR must be specified explicitly -(not by default or using -C/--basedir or -d/--node-directory), and followed by -the twistd-options. -""" - return t - - -class MyTwistdConfig(twistd.ServerOptions): - subCommands = [("DaemonizeTahoeNode", None, usage.Options, "node")] - - stderr = sys.stderr - - -class DaemonizeTheRealService(Service, HookMixin): - """ - this HookMixin should really be a helper; our hooks: - - - 'running': triggered when startup has completed; it triggers - with None of successful or a Failure otherwise. 
- """ - stderr = sys.stderr - - def __init__(self, nodetype, basedir, options): - super(DaemonizeTheRealService, self).__init__() - self.nodetype = nodetype - self.basedir = basedir - # setup for HookMixin - self._hooks = { - "running": None, - } - self.stderr = options.parent.stderr - - def startService(self): - - def start(): - node_to_instance = { - u"client": lambda: maybeDeferred(namedAny("allmydata.client.create_client"), self.basedir), - u"introducer": lambda: maybeDeferred(namedAny("allmydata.introducer.server.create_introducer"), self.basedir), - } - - try: - service_factory = node_to_instance[self.nodetype] - except KeyError: - raise ValueError("unknown nodetype %s" % self.nodetype) - - def handle_config_error(reason): - if reason.check(UnknownConfigError): - self.stderr.write("\nConfiguration error:\n{}\n\n".format(reason.value)) - else: - self.stderr.write("\nUnknown error\n") - reason.printTraceback(self.stderr) - reactor.stop() - - d = service_factory() - - def created(srv): - srv.setServiceParent(self.parent) - d.addCallback(created) - d.addErrback(handle_config_error) - d.addBoth(self._call_hook, 'running') - return d - - from twisted.internet import reactor - reactor.callWhenRunning(start) - - -class DaemonizeTahoeNodePlugin(object): - tapname = "tahoenode" - def __init__(self, nodetype, basedir): - self.nodetype = nodetype - self.basedir = basedir - - def makeService(self, so): - return DaemonizeTheRealService(self.nodetype, self.basedir, so) - - -def run(config): - """ - Runs a Tahoe-LAFS node in the foreground. - - Sets up the IService instance corresponding to the type of node - that's starting and uses Twisted's twistd runner to disconnect our - process from the terminal. 
- """ - out = config.stdout - err = config.stderr - basedir = config['basedir'] - quoted_basedir = quote_local_unicode_path(basedir) - print("'tahoe {}' in {}".format(config.subcommand_name, quoted_basedir), file=out) - if not os.path.isdir(basedir): - print("%s does not look like a directory at all" % quoted_basedir, file=err) - return 1 - nodetype = identify_node_type(basedir) - if not nodetype: - print("%s is not a recognizable node directory" % quoted_basedir, file=err) - return 1 - # Now prepare to turn into a twistd process. This os.chdir is the point - # of no return. - os.chdir(basedir) - twistd_args = ["--nodaemon"] - twistd_args.extend(config.twistd_args) - twistd_args.append("DaemonizeTahoeNode") # point at our DaemonizeTahoeNodePlugin - - twistd_config = MyTwistdConfig() - twistd_config.stdout = out - twistd_config.stderr = err - try: - twistd_config.parseOptions(twistd_args) - except usage.error as ue: - # these arguments were unsuitable for 'twistd' - print(config, file=err) - print("tahoe %s: usage error from twistd: %s\n" % (config.subcommand_name, ue), file=err) - return 1 - twistd_config.loadedPlugins = {"DaemonizeTahoeNode": DaemonizeTahoeNodePlugin(nodetype, basedir)} - - # handle invalid PID file (twistd might not start otherwise) - pidfile = get_pidfile(basedir) - if get_pid_from_pidfile(pidfile) == -1: - print("found invalid PID file in %s - deleting it" % basedir, file=err) - os.remove(pidfile) - - # We always pass --nodaemon so twistd.runApp does not daemonize. 
- print("running node in %s" % (quoted_basedir,), file=out) - twistd.runApp(twistd_config) - return 0 diff --git a/src/allmydata/scripts/tahoe_run.py b/src/allmydata/scripts/tahoe_run.py index cea9bda27..3ee1bf3d9 100644 --- a/src/allmydata/scripts/tahoe_run.py +++ b/src/allmydata/scripts/tahoe_run.py @@ -1,12 +1,225 @@ -from .run_common import ( - RunOptions as _RunOptions, - run, -) +from __future__ import print_function __all__ = [ "RunOptions", "run", ] -class RunOptions(_RunOptions): +import os, sys +from allmydata.scripts.common import BasedirOptions +from twisted.scripts import twistd +from twisted.python import usage +from twisted.python.reflect import namedAny +from twisted.internet.defer import maybeDeferred +from twisted.application.service import Service + +from allmydata.scripts.default_nodedir import _default_nodedir +from allmydata.util.encodingutil import listdir_unicode, quote_local_unicode_path +from allmydata.util.configutil import UnknownConfigError +from allmydata.util.deferredutil import HookMixin + + +def get_pidfile(basedir): + """ + Returns the path to the PID file. + :param basedir: the node's base directory + :returns: the path to the PID file + """ + return os.path.join(basedir, u"twistd.pid") + +def get_pid_from_pidfile(pidfile): + """ + Tries to read and return the PID stored in the node's PID file + (twistd.pid). + :param pidfile: try to read this PID file + :returns: A numeric PID on success, ``None`` if PID file absent or + inaccessible, ``-1`` if PID file invalid. + """ + try: + with open(pidfile, "r") as f: + pid = f.read() + except EnvironmentError: + return None + + try: + pid = int(pid) + except ValueError: + return -1 + + return pid + +def identify_node_type(basedir): + """ + :return unicode: None or one of: 'client' or 'introducer'. 
+ """ + tac = u'' + try: + for fn in listdir_unicode(basedir): + if fn.endswith(u".tac"): + tac = fn + break + except OSError: + return None + + for t in (u"client", u"introducer"): + if t in tac: + return t + return None + + +class RunOptions(BasedirOptions): subcommand_name = "run" + + optParameters = [ + ("basedir", "C", None, + "Specify which Tahoe base directory should be used." + " This has the same effect as the global --node-directory option." + " [default: %s]" % quote_local_unicode_path(_default_nodedir)), + ] + + def parseArgs(self, basedir=None, *twistd_args): + # This can't handle e.g. 'tahoe run --reactor=foo', since + # '--reactor=foo' looks like an option to the tahoe subcommand, not to + # twistd. So you can either use 'tahoe run' or 'tahoe run NODEDIR + # --TWISTD-OPTIONS'. Note that 'tahoe --node-directory=NODEDIR run + # --TWISTD-OPTIONS' also isn't allowed, unfortunately. + + BasedirOptions.parseArgs(self, basedir) + self.twistd_args = twistd_args + + def getSynopsis(self): + return ("Usage: %s [global-options] %s [options]" + " [NODEDIR [twistd-options]]" + % (self.command_name, self.subcommand_name)) + + def getUsage(self, width=None): + t = BasedirOptions.getUsage(self, width) + "\n" + twistd_options = str(MyTwistdConfig()).partition("\n")[2].partition("\n\n")[0] + t += twistd_options.replace("Options:", "twistd-options:", 1) + t += """ + +Note that if any twistd-options are used, NODEDIR must be specified explicitly +(not by default or using -C/--basedir or -d/--node-directory), and followed by +the twistd-options. +""" + return t + + +class MyTwistdConfig(twistd.ServerOptions): + subCommands = [("DaemonizeTahoeNode", None, usage.Options, "node")] + + stderr = sys.stderr + + +class DaemonizeTheRealService(Service, HookMixin): + """ + this HookMixin should really be a helper; our hooks: + + - 'running': triggered when startup has completed; it triggers + with None of successful or a Failure otherwise. 
+ """ + stderr = sys.stderr + + def __init__(self, nodetype, basedir, options): + super(DaemonizeTheRealService, self).__init__() + self.nodetype = nodetype + self.basedir = basedir + # setup for HookMixin + self._hooks = { + "running": None, + } + self.stderr = options.parent.stderr + + def startService(self): + + def start(): + node_to_instance = { + u"client": lambda: maybeDeferred(namedAny("allmydata.client.create_client"), self.basedir), + u"introducer": lambda: maybeDeferred(namedAny("allmydata.introducer.server.create_introducer"), self.basedir), + } + + try: + service_factory = node_to_instance[self.nodetype] + except KeyError: + raise ValueError("unknown nodetype %s" % self.nodetype) + + def handle_config_error(reason): + if reason.check(UnknownConfigError): + self.stderr.write("\nConfiguration error:\n{}\n\n".format(reason.value)) + else: + self.stderr.write("\nUnknown error\n") + reason.printTraceback(self.stderr) + reactor.stop() + + d = service_factory() + + def created(srv): + srv.setServiceParent(self.parent) + d.addCallback(created) + d.addErrback(handle_config_error) + d.addBoth(self._call_hook, 'running') + return d + + from twisted.internet import reactor + reactor.callWhenRunning(start) + + +class DaemonizeTahoeNodePlugin(object): + tapname = "tahoenode" + def __init__(self, nodetype, basedir): + self.nodetype = nodetype + self.basedir = basedir + + def makeService(self, so): + return DaemonizeTheRealService(self.nodetype, self.basedir, so) + + +def run(config): + """ + Runs a Tahoe-LAFS node in the foreground. + + Sets up the IService instance corresponding to the type of node + that's starting and uses Twisted's twistd runner to disconnect our + process from the terminal. 
+ """ + out = config.stdout + err = config.stderr + basedir = config['basedir'] + quoted_basedir = quote_local_unicode_path(basedir) + print("'tahoe {}' in {}".format(config.subcommand_name, quoted_basedir), file=out) + if not os.path.isdir(basedir): + print("%s does not look like a directory at all" % quoted_basedir, file=err) + return 1 + nodetype = identify_node_type(basedir) + if not nodetype: + print("%s is not a recognizable node directory" % quoted_basedir, file=err) + return 1 + # Now prepare to turn into a twistd process. This os.chdir is the point + # of no return. + os.chdir(basedir) + twistd_args = ["--nodaemon"] + twistd_args.extend(config.twistd_args) + twistd_args.append("DaemonizeTahoeNode") # point at our DaemonizeTahoeNodePlugin + + twistd_config = MyTwistdConfig() + twistd_config.stdout = out + twistd_config.stderr = err + try: + twistd_config.parseOptions(twistd_args) + except usage.error as ue: + # these arguments were unsuitable for 'twistd' + print(config, file=err) + print("tahoe %s: usage error from twistd: %s\n" % (config.subcommand_name, ue), file=err) + return 1 + twistd_config.loadedPlugins = {"DaemonizeTahoeNode": DaemonizeTahoeNodePlugin(nodetype, basedir)} + + # handle invalid PID file (twistd might not start otherwise) + pidfile = get_pidfile(basedir) + if get_pid_from_pidfile(pidfile) == -1: + print("found invalid PID file in %s - deleting it" % basedir, file=err) + os.remove(pidfile) + + # We always pass --nodaemon so twistd.runApp does not daemonize. 
+ print("running node in %s" % (quoted_basedir,), file=out) + twistd.runApp(twistd_config) + return 0 diff --git a/src/allmydata/test/cli/test_cli.py b/src/allmydata/test/cli/test_cli.py index 27af29520..2d1e9f57a 100644 --- a/src/allmydata/test/cli/test_cli.py +++ b/src/allmydata/test/cli/test_cli.py @@ -1310,8 +1310,8 @@ class Options(ReallyEqualMixin, unittest.TestCase): class Run(unittest.TestCase): - @patch('allmydata.scripts.run_common.os.chdir') - @patch('allmydata.scripts.run_common.twistd') + @patch('allmydata.scripts.tahoe_run.os.chdir') + @patch('allmydata.scripts.tahoe_run.twistd') def test_non_numeric_pid(self, mock_twistd, chdir): """ If the pidfile exists but does not contain a numeric value, a complaint to From 34cd1efaa477ba28e0bd7ce15605db3c72631b7a Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Sat, 12 Dec 2020 18:34:49 -0500 Subject: [PATCH 068/186] For the sake of clarity, stop talking about daemons here --- src/allmydata/test/cli/test_cli.py | 14 +++++++++++--- 1 file changed, 11 insertions(+), 3 deletions(-) diff --git a/src/allmydata/test/cli/test_cli.py b/src/allmydata/test/cli/test_cli.py index 2d1e9f57a..2b1bc1c86 100644 --- a/src/allmydata/test/cli/test_cli.py +++ b/src/allmydata/test/cli/test_cli.py @@ -1272,6 +1272,14 @@ class Options(ReallyEqualMixin, unittest.TestCase): # accept a --node-directory option before the verb, or a --basedir # option after, or a basedir argument after, but none in the wrong # place, and not more than one of the three. + + # Here is some option twistd recognizes but we don't. Depending on + # where it appears, it should be passed through to twistd. It doesn't + # really matter which option it is (it doesn't even have to be a valid + # option). This test does not actually run any of the twistd argument + # parsing. 
+ some_twistd_option = "--spew" + o = self.parse(["run"]) self.failUnlessReallyEqual(o["basedir"], os.path.join(fileutil.abspath_expanduser_unicode(u"~"), u".tahoe")) @@ -1282,7 +1290,7 @@ class Options(ReallyEqualMixin, unittest.TestCase): o = self.parse(["--node-directory", "there", "run"]) self.failUnlessReallyEqual(o["basedir"], fileutil.abspath_expanduser_unicode(u"there")) - o = self.parse(["run", "here", "--nodaemon"]) + o = self.parse(["run", "here", some_twistd_option]) self.failUnlessReallyEqual(o["basedir"], fileutil.abspath_expanduser_unicode(u"here")) self.failUnlessRaises(usage.UsageError, self.parse, @@ -1303,9 +1311,9 @@ class Options(ReallyEqualMixin, unittest.TestCase): "run", "--basedir=here", "anywhere"]) self.failUnlessRaises(usage.UsageError, self.parse, - ["--node-directory=there", "run", "--nodaemon"]) + ["--node-directory=there", "run", some_twistd_option]) self.failUnlessRaises(usage.UsageError, self.parse, - ["run", "--basedir=here", "--nodaemon"]) + ["run", "--basedir=here", some_twistd_option]) class Run(unittest.TestCase): From b787de0acc6c1b9d1491a54539bf3ed458732a19 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Sun, 13 Dec 2020 06:49:49 -0500 Subject: [PATCH 069/186] Fix BeautifulSoup's GuessedAtParserWarning --- newsfragments/3557.minor | 0 src/allmydata/test/test_checker.py | 2 +- src/allmydata/test/web/test_common.py | 2 +- 3 files changed, 2 insertions(+), 2 deletions(-) create mode 100644 newsfragments/3557.minor diff --git a/newsfragments/3557.minor b/newsfragments/3557.minor new file mode 100644 index 000000000..e69de29bb diff --git a/src/allmydata/test/test_checker.py b/src/allmydata/test/test_checker.py index 20135688d..936d270ae 100644 --- a/src/allmydata/test/test_checker.py +++ b/src/allmydata/test/test_checker.py @@ -144,7 +144,7 @@ class WebResultsRendering(unittest.TestCase): @staticmethod def remove_tags(html): - return BeautifulSoup(html).get_text(separator=" ") + return BeautifulSoup(html, 
'html5lib').get_text(separator=" ") def create_fake_client(self): sb = StorageFarmBroker(True, None, EMPTY_CLIENT_CONFIG) diff --git a/src/allmydata/test/web/test_common.py b/src/allmydata/test/web/test_common.py index 2a0ebd3d8..5261c412f 100644 --- a/src/allmydata/test/web/test_common.py +++ b/src/allmydata/test/web/test_common.py @@ -160,7 +160,7 @@ class RenderExceptionTests(SyncTestCase): MatchesPredicate( lambda value: assert_soup_has_tag_with_attributes( self, - BeautifulSoup(value), + BeautifulSoup(value, 'html5lib'), "meta", {"http-equiv": "refresh", "content": "0;URL={}".format(loc.encode("ascii")), From 0a1c2386b975ae0de1b618055d3d4e72a973898d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 09:37:04 -0500 Subject: [PATCH 070/186] client must be running already --- src/allmydata/test/check_grid.py | 5 +---- 1 file changed, 1 insertion(+), 4 deletions(-) diff --git a/src/allmydata/test/check_grid.py b/src/allmydata/test/check_grid.py index 82323a5c5..5c46c41a8 100644 --- a/src/allmydata/test/check_grid.py +++ b/src/allmydata/test/check_grid.py @@ -16,9 +16,7 @@ that this script does not import anything from tahoe directly, so it doesn't matter what its PYTHONPATH is, as long as the bin/tahoe that it uses is functional. -This script expects that the client node will be not running when the script -starts, but it will forcibly shut down the node just to be sure. It will shut -down the node after the test finishes. +This script expects the client node to be running already. To set up the client node, do the following: @@ -52,7 +50,6 @@ is being tested is in [brackets]): This script will also keep track of speeds and latencies and will write them in a machine-readable logfile. 
- """ import time, subprocess, md5, os.path, random From a0931f4999ec0e5f589080e1583046d497d9e115 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 09:38:06 -0500 Subject: [PATCH 071/186] You can pass the introducer on the command line --- src/allmydata/test/check_grid.py | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/src/allmydata/test/check_grid.py b/src/allmydata/test/check_grid.py index 5c46c41a8..f4bfcae31 100644 --- a/src/allmydata/test/check_grid.py +++ b/src/allmydata/test/check_grid.py @@ -20,9 +20,8 @@ This script expects the client node to be running already. To set up the client node, do the following: - tahoe create-client DIR - populate DIR/private/introducers.yaml $DAEMONIZE tahoe run DIR + tahoe create-client --introducer=INTRODUCER_FURL DIR tahoe -d DIR create-alias testgrid # pick a 10kB-ish test file, compute its md5sum tahoe -d DIR put FILE testgrid:old.MD5SUM From bdb7c50fac2d7cab17e6974390d247cfafe11472 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 09:38:16 -0500 Subject: [PATCH 072/186] You can just use multiple terminals If you know how to daemonize stuff you can figure it out yourself I guess. --- src/allmydata/test/check_grid.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/check_grid.py b/src/allmydata/test/check_grid.py index f4bfcae31..0a68ed899 100644 --- a/src/allmydata/test/check_grid.py +++ b/src/allmydata/test/check_grid.py @@ -20,8 +20,8 @@ This script expects the client node to be running already. 
 To set up the client node, do the following:
 
- $DAEMONIZE tahoe run DIR
 tahoe create-client --introducer=INTRODUCER_FURL DIR
+ tahoe run DIR
 tahoe -d DIR create-alias testgrid
 # pick a 10kB-ish test file, compute its md5sum
 tahoe -d DIR put FILE testgrid:old.MD5SUM

From 0357eeb924d0691a9153125b99070a56a8cbd5fb Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 09:24:57 -0500
Subject: [PATCH 073/186] news fragment

---
 newsfragments/3558.minor | 0
 1 file changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 newsfragments/3558.minor

diff --git a/newsfragments/3558.minor b/newsfragments/3558.minor
new file mode 100644
index 000000000..e69de29bb

From 41fc7d2c3c55e499dd1b29fa7618d31a39b82b10 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 10:24:26 -0500
Subject: [PATCH 074/186] Can we force it to report for the right project?

---
 .circleci/config.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.circleci/config.yml b/.circleci/config.yml
index afa3fafa1..5039b0d7f 100644
--- a/.circleci/config.yml
+++ b/.circleci/config.yml
@@ -270,7 +270,7 @@ jobs:
           name: "Submit coverage results"
           command: |
             if [ -n "${UPLOAD_COVERAGE}" ]; then
-              /tmp/venv/bin/codecov
+              /tmp/venv/bin/codecov --slug tahoe-lafs/tahoe-lafs
             fi

From 85c1b2472972d7ed8ec90257cfcab041d21888d9 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 10:36:24 -0500
Subject: [PATCH 075/186] That didn't work.

Using `--slug tahoe-lafs/tahoe-lafs` does not fix the issue.
It causes codecov to fail with `Commit sha does not match Circle build.`

---
 .circleci/config.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.circleci/config.yml b/.circleci/config.yml
index 5039b0d7f..afa3fafa1 100644
--- a/.circleci/config.yml
+++ b/.circleci/config.yml
@@ -270,7 +270,7 @@ jobs:
           name: "Submit coverage results"
           command: |
             if [ -n "${UPLOAD_COVERAGE}" ]; then
-              /tmp/venv/bin/codecov --slug tahoe-lafs/tahoe-lafs
+              /tmp/venv/bin/codecov
             fi

From 7be13f5c7f462af15d09c65ac3eef93c2a66ae1e Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 10:36:45 -0500
Subject: [PATCH 076/186] Try providing an upload token instead

---
 .codecov.yml | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/.codecov.yml b/.codecov.yml
index 57abf7c0a..b596def25 100644
--- a/.codecov.yml
+++ b/.codecov.yml
@@ -32,3 +32,11 @@ coverage:
   patch:
     default:
       threshold: 1%
+
+
+codecov:
+  # This is a public repository so supposedly we don't "need" to use an upload
+  # token. However, using one makes sure that CI jobs running against forked
+  # repositories have coverage uploaded to the right place in codecov so
+  # they're reports aren't incomplete.
+  token: "abf679b6-e2e6-4b33-b7b5-6cfbd41ee691"

From f1be6a50a2a4c3d5d543c24450a8d27eec3e4900 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 11:04:31 -0500
Subject: [PATCH 077/186] So embarrassing

---
 .codecov.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.codecov.yml b/.codecov.yml
index b596def25..df1eb5e01 100644
--- a/.codecov.yml
+++ b/.codecov.yml
@@ -38,5 +38,5 @@ codecov:
   # This is a public repository so supposedly we don't "need" to use an upload
   # token. However, using one makes sure that CI jobs running against forked
   # repositories have coverage uploaded to the right place in codecov so
-  # they're reports aren't incomplete.
+  # their reports aren't incomplete.
   token: "abf679b6-e2e6-4b33-b7b5-6cfbd41ee691"

From 28f46e9b06a96071fe06501b2073adc933c7ed3b Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Mon, 14 Dec 2020 11:07:37 -0500
Subject: [PATCH 078/186] test_system.py passes on both Python 2 and Python 3.

---
 src/allmydata/test/common_util.py | 36 ++++++++++++++++++-------------
 src/allmydata/test/test_system.py | 11 +++++++---
 2 files changed, 29 insertions(+), 18 deletions(-)

diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py
index cd7d81eba..6b4e8513f 100644
--- a/src/allmydata/test/common_util.py
+++ b/src/allmydata/test/common_util.py
@@ -1,5 +1,8 @@
 from __future__ import print_function
 
+from future.utils import PY2, native_str
+from future.builtins import str as future_str
+
 import os
 import time
 import signal
@@ -50,24 +53,23 @@ def _getvalue(io):
     return io.read()
 
 
-def run_cli_bytes(verb, *args, **kwargs):
+def run_cli_native(verb, *args, **kwargs):
     """
-    Run a Tahoe-LAFS CLI command specified as bytes.
+    Run a Tahoe-LAFS CLI command specified as bytes (on Python 2) or Unicode
+    (on Python 3); basically, it accepts a native string.
 
     Most code should prefer ``run_cli_unicode`` which deals with all the
-    necessary encoding considerations. This helper still exists so that novel
-    misconfigurations can be explicitly tested (for example, receiving UTF-8
-    bytes when the system encoding claims to be ASCII).
+    necessary encoding considerations.
 
-    :param bytes verb: The command to run. For example, ``b"create-node"``.
+    :param native_str verb: The command to run. For example, ``b"create-node"``.
 
-    :param [bytes] args: The arguments to pass to the command. For example,
+    :param [native_str] args: The arguments to pass to the command. For example,
         ``(b"--hostname=localhost",)``.
 
-    :param [bytes] nodeargs: Extra arguments to pass to the Tahoe executable
+    :param [native_str] nodeargs: Extra arguments to pass to the Tahoe executable
         before ``verb``.
-    :param bytes stdin: Text to pass to the command via stdin.
+    :param native_str stdin: Text to pass to the command via stdin.
 
     :param NoneType|str encoding: The name of an encoding which stdout and
         stderr will be configured to use. ``None`` means stdout and stderr
@@ -77,8 +79,8 @@
     nodeargs = kwargs.pop("nodeargs", [])
     encoding = kwargs.pop("encoding", None)
     precondition(
-        all(isinstance(arg, bytes) for arg in [verb] + nodeargs + list(args)),
-        "arguments to run_cli must be bytes -- convert using unicode_to_argv",
+        all(isinstance(arg, native_str) for arg in [verb] + nodeargs + list(args)),
+        "arguments to run_cli must be a native string -- convert using unicode_to_argv",
         verb=verb,
         args=args,
         nodeargs=nodeargs,
@@ -137,15 +139,19 @@ def run_cli_unicode(verb, argv, nodeargs=None, stdin=None, encoding=None):
     if nodeargs is None:
         nodeargs = []
     precondition(
-        all(isinstance(arg, unicode) for arg in [verb] + nodeargs + argv),
+        all(isinstance(arg, future_str) for arg in [verb] + nodeargs + argv),
         "arguments to run_cli_unicode must be unicode",
         verb=verb,
         nodeargs=nodeargs,
        argv=argv,
     )
     codec = encoding or "ascii"
-    encode = lambda t: None if t is None else t.encode(codec)
-    d = run_cli_bytes(
+    if PY2:
+        encode = lambda t: None if t is None else t.encode(codec)
+    else:
+        # On Python 3 command-line parsing expects Unicode!
+        encode = lambda t: t
+    d = run_cli_native(
         encode(verb),
         nodeargs=list(encode(arg) for arg in nodeargs),
         stdin=encode(stdin),
@@ -163,7 +169,7 @@ def run_cli_unicode(verb, argv, nodeargs=None, stdin=None, encoding=None):
     return d
 
 
-run_cli = run_cli_bytes
+run_cli = run_cli_native
 
 
 def parse_cli(*argv):
diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py
index f84edabfc..04e52b319 100644
--- a/src/allmydata/test/test_system.py
+++ b/src/allmydata/test/test_system.py
@@ -63,7 +63,7 @@ from .web.common import (
 # TODO: move this to common or common_util
 from allmydata.test.test_runner import RunBinTahoeMixin
 from . import common_util as testutil
-from .common_util import run_cli_bytes
+from .common_util import run_cli_unicode
 from ..scripts.common import (
     write_introducer,
 )
@@ -72,7 +72,10 @@ def run_cli(*args, **kwargs):
     """
     Backwards compatible version so we don't have to change all the tests.
     """
-    return run_cli_bytes(*(ensure_binary(a) for a in args), **kwargs)
+    nodeargs = [ensure_text(a) for a in kwargs.pop("nodeargs", [])]
+    kwargs["nodeargs"] = nodeargs
+    return run_cli_unicode(
+        ensure_text(args[0]), [ensure_text(a) for a in args[1:]], **kwargs)
 
 
@@ -1287,7 +1290,9 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase):
             s = stats["stats"]
             self.failUnlessEqual(s["storage_server.accepting_immutable_shares"], 1)
             c = stats["counters"]
-            self.failUnless("storage_server.allocate" in c)
+            # Probably this should be Unicode eventually? But we haven't ported
+            # stats code yet.
+            self.failUnless(b"storage_server.allocate" in c)
         d.addCallback(_grab_stats)
         return d

From c7759cb82ccf504c687601d76895ab12f24d5204 Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Mon, 14 Dec 2020 13:53:12 -0500
Subject: [PATCH 079/186] Try to fix test_web.py on Python 2.
---
 src/allmydata/test/common_web.py  | 4 ++--
 src/allmydata/test/test_system.py | 7 ++++++-
 2 files changed, 8 insertions(+), 3 deletions(-)

diff --git a/src/allmydata/test/common_web.py b/src/allmydata/test/common_web.py
index faf546f7c..d6bef702c 100644
--- a/src/allmydata/test/common_web.py
+++ b/src/allmydata/test/common_web.py
@@ -48,10 +48,10 @@ class VerboseError(Error):
 @inlineCallbacks
 def do_http(method, url, **kwargs):
     """
-    Run HTTP query, return Deferred of body as Unicode.
+    Run HTTP query, return Deferred of body as bytes.
     """
     response = yield treq.request(method, url, persistent=False, **kwargs)
-    body = yield treq.text_content(response, "utf-8")
+    body = yield treq.content(response)
     # TODO: replace this with response.fail_for_status when
     # https://github.com/twisted/treq/pull/159 has landed
     if 400 <= response.code < 600:
diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py
index 04e52b319..2a0a6eb00 100644
--- a/src/allmydata/test/test_system.py
+++ b/src/allmydata/test/test_system.py
@@ -55,7 +55,7 @@ from .common import (
     TEST_RSA_KEY_SIZE,
     SameProcessStreamEndpointAssigner,
 )
-from .common_web import do_http, Error
+from .common_web import do_http as do_http_bytes, Error
 from .web.common import (
     assert_soup_has_tag_with_attributes
 )
@@ -78,6 +78,11 @@ def run_cli(*args, **kwargs):
         ensure_text(args[0]), [ensure_text(a) for a in args[1:]], **kwargs)
 
 
+def do_http(*args, **kwargs):
+    """Wrapper for do_http() that returns Unicode."""
+    return do_http_bytes(*args, **kwargs).addCallback(
+        lambda b: str(b, "utf-8"))
+
 
 LARGE_DATA = b"""
 This is some data to publish to the remote grid.., which needs to be large

From 01507e4f93f87a0cc0c19c6e758cd38b1877099e Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 16:57:20 -0500
Subject: [PATCH 080/186] some direct tests for DaemonizeTheRealService

---
 src/allmydata/test/cli/test_run.py | 131 +++++++++++++++++++++++++++++
 1 file changed, 131 insertions(+)
 create mode 100644 src/allmydata/test/cli/test_run.py

diff --git a/src/allmydata/test/cli/test_run.py b/src/allmydata/test/cli/test_run.py
new file mode 100644
index 000000000..5968a3a5b
--- /dev/null
+++ b/src/allmydata/test/cli/test_run.py
@@ -0,0 +1,131 @@
+"""
+Tests for ``allmydata.scripts.tahoe_run``.
+"""
+
+from six.moves import (
+    StringIO,
+)
+
+from testtools.matchers import (
+    Contains,
+    Equals,
+    MatchesAll,
+    IsInstance,
+    AfterPreprocessing,
+)
+
+from twisted.python.filepath import (
+    FilePath,
+)
+from twisted.internet.testing import (
+    MemoryReactor,
+)
+from twisted.internet.test.modulehelpers import (
+    AlternateReactor,
+)
+
+
+from ...scripts.tahoe_run import (
+    DaemonizeTheRealService,
+)
+
+from ...scripts.runner import (
+    parse_options
+)
+from ..common import (
+    SyncTestCase,
+)
+
+class DaemonizeTheRealServiceTests(SyncTestCase):
+    """
+    Tests for ``DaemonizeTheRealService``.
+    """
+    def _verify_error(self, config, expected):
+        """
+        Assert that when ``DaemonizeTheRealService`` is started using the given
+        configuration it writes the given message to stderr and stops the
+        reactor.
+
+        :param bytes config: The contents of a ``tahoe.cfg`` file to give to
+            the service.
+
+        :param bytes expected: A string to assert appears in stderr after the
+            service starts.
+        """
+        nodedir = FilePath(self.mktemp())
+        nodedir.makedirs()
+        nodedir.child("tahoe.cfg").setContent(config)
+        nodedir.child("tahoe-client.tac").touch()
+
+        options = parse_options(["run", nodedir.path])
+        stdout = options.stdout = StringIO()
+        stderr = options.stderr = StringIO()
+        run_options = options.subOptions
+
+        reactor = MemoryReactor()
+        with AlternateReactor(reactor):
+            service = DaemonizeTheRealService(
+                "client",
+                nodedir.path,
+                run_options,
+            )
+            service.startService()
+
+            # We happen to know that the service uses reactor.callWhenRunning
+            # to schedule all its work (though I couldn't tell you *why*).
+            # Make sure those scheduled calls happen.
+            waiting = reactor.whenRunningHooks[:]
+            del reactor.whenRunningHooks[:]
+            for f, a, k in waiting:
+                f(*a, **k)
+
+        self.assertThat(
+            reactor.hasStopped,
+            Equals(True),
+        )
+
+        self.assertThat(
+            stdout.getvalue(),
+            Equals(""),
+        )
+
+        self.assertThat(
+            stderr.getvalue(),
+            Contains(expected),
+        )
+
+    def test_unknown_config(self):
+        """
+        If there are unknown items in the node configuration file then a short
+        message introduced with ``"Configuration error:"`` is written to
+        stderr.
+        """
+        self._verify_error("[invalid-section]\n", "Configuration error:")
+
+    def test_port_assignment_required(self):
+        """
+        If ``tub.port`` is configured to use port 0 then a short message rejecting
+        this configuration is written to stderr.
+        """
+        self._verify_error(
+            """
+            [node]
+            tub.port = 0
+            """,
+            "tub.port cannot be 0",
+        )
+
+    def test_privacy_error(self):
+        """
+        If ``reveal-IP-address`` is set to false and the tub is not configured in
+        a way that avoids revealing the node's IP address, a short message
+        about privacy is written to stderr.
+ """ + self._verify_error( + """ + [node] + tub.port = AUTO + reveal-IP-address = false + """, + "Privacy requested", + ) From b6ea3f47c8aa88509b4c1d67e0b627b0bafd041a Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 16:58:22 -0500 Subject: [PATCH 081/186] unused imports --- src/allmydata/test/cli/test_run.py | 4 ---- 1 file changed, 4 deletions(-) diff --git a/src/allmydata/test/cli/test_run.py b/src/allmydata/test/cli/test_run.py index 5968a3a5b..d27791f34 100644 --- a/src/allmydata/test/cli/test_run.py +++ b/src/allmydata/test/cli/test_run.py @@ -9,9 +9,6 @@ from six.moves import ( from testtools.matchers import ( Contains, Equals, - MatchesAll, - IsInstance, - AfterPreprocessing, ) from twisted.python.filepath import ( @@ -24,7 +21,6 @@ from twisted.internet.test.modulehelpers import ( AlternateReactor, ) - from ...scripts.tahoe_run import ( DaemonizeTheRealService, ) From 18c18a0e1d88893a74f18c4ed7566e6bbcdfea08 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 17:33:58 -0500 Subject: [PATCH 082/186] explain the inconsistent naming --- src/allmydata/interfaces.py | 3 +++ 1 file changed, 3 insertions(+) diff --git a/src/allmydata/interfaces.py b/src/allmydata/interfaces.py index 8e27bb3ea..6d0938dd5 100644 --- a/src/allmydata/interfaces.py +++ b/src/allmydata/interfaces.py @@ -3153,6 +3153,9 @@ class IAddressFamily(Interface): def get_listener(): """ Return a string endpoint description or an ``IStreamServerEndpoint``. + + This would be named ``get_server_endpoint`` if not for historical + reasons. 
""" def get_client_endpoint(): From fcbe56ba0a5d2d8f03e9ad4846199b0b23687697 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 17:36:35 -0500 Subject: [PATCH 083/186] docstrings for the Tor and I2P address family implementations --- src/allmydata/util/i2p_provider.py | 7 +++++++ src/allmydata/util/tor_provider.py | 6 ++++++ 2 files changed, 13 insertions(+) diff --git a/src/allmydata/util/i2p_provider.py b/src/allmydata/util/i2p_provider.py index 46c57704a..22575b4ca 100644 --- a/src/allmydata/util/i2p_provider.py +++ b/src/allmydata/util/i2p_provider.py @@ -169,6 +169,13 @@ class _Provider(service.MultiService): return i2p_port def get_client_endpoint(self): + """ + Get an ``IStreamClientEndpoint`` which will set up a connection to an I2P + address. + + If I2P is not enabled or the dependencies are not available, return + ``None`` instead. + """ enabled = self._get_i2p_config("enabled", True, boolean=True) if not enabled: return None diff --git a/src/allmydata/util/tor_provider.py b/src/allmydata/util/tor_provider.py index 59039b6b7..42700b3bc 100644 --- a/src/allmydata/util/tor_provider.py +++ b/src/allmydata/util/tor_provider.py @@ -236,6 +236,12 @@ class _Provider(service.MultiService): return ep def get_client_endpoint(self): + """ + Get an ``IStreamClientEndpoint`` which will set up a connection using Tor. + + If Tor is not enabled or the dependencies are not available, return + ``None`` instead. 
+ """ enabled = self._get_tor_config("enabled", True, boolean=True) if not enabled: return None From 4b1c6a2815699e1770dd92c27ae5ecf9884f4331 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 17:42:30 -0500 Subject: [PATCH 084/186] Remove these reactor parameters --- src/allmydata/client.py | 2 +- src/allmydata/introducer/server.py | 2 +- src/allmydata/node.py | 5 ++--- src/allmydata/test/test_connections.py | 16 ++++++++-------- src/allmydata/test/test_node.py | 8 +++----- 5 files changed, 15 insertions(+), 18 deletions(-) diff --git a/src/allmydata/client.py b/src/allmydata/client.py index ce5c570fc..75e20951b 100644 --- a/src/allmydata/client.py +++ b/src/allmydata/client.py @@ -270,7 +270,7 @@ def create_client_from_config(config, _client_factory=None, _introducer_factory= i2p_provider = create_i2p_provider(reactor, config) tor_provider = create_tor_provider(reactor, config) - handlers = node.create_connection_handlers(reactor, config, i2p_provider, tor_provider) + handlers = node.create_connection_handlers(config, i2p_provider, tor_provider) default_connection_handlers, foolscap_connection_handlers = handlers tub_options = node.create_tub_options(config) diff --git a/src/allmydata/introducer/server.py b/src/allmydata/introducer/server.py index cd3d4a68a..237c30315 100644 --- a/src/allmydata/introducer/server.py +++ b/src/allmydata/introducer/server.py @@ -70,7 +70,7 @@ def create_introducer(basedir=u"."): i2p_provider = create_i2p_provider(reactor, config) tor_provider = create_tor_provider(reactor, config) - default_connection_handlers, foolscap_connection_handlers = create_connection_handlers(reactor, config, i2p_provider, tor_provider) + default_connection_handlers, foolscap_connection_handlers = create_connection_handlers(config, i2p_provider, tor_provider) tub_options = create_tub_options(config) # we don't remember these because the Introducer doesn't make diff --git a/src/allmydata/node.py b/src/allmydata/node.py index 
faa8d61c0..e5a11d91c 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -616,7 +616,7 @@ def _make_tcp_handler(): return default() -def create_default_connection_handlers(reactor, config, handlers): +def create_default_connection_handlers(config, handlers): """ :return: A dictionary giving the default connection handlers. The keys are strings like "tcp" and the values are strings like "tor" or @@ -661,7 +661,7 @@ def create_default_connection_handlers(reactor, config, handlers): return default_connection_handlers -def create_connection_handlers(reactor, config, i2p_provider, tor_provider): +def create_connection_handlers(config, i2p_provider, tor_provider): """ :returns: 2-tuple of default_connection_handlers, foolscap_connection_handlers """ @@ -681,7 +681,6 @@ def create_connection_handlers(reactor, config, i2p_provider, tor_provider): umid="PuLh8g", ) return create_default_connection_handlers( - reactor, config, handlers, ), handlers diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index b5b571b0c..30aac8446 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -22,7 +22,7 @@ class TCP(unittest.TestCase): "no-basedir", BASECONFIG, ) - _, foolscap_handlers = create_connection_handlers(None, config, mock.Mock(), mock.Mock()) + _, foolscap_handlers = create_connection_handlers(config, mock.Mock(), mock.Mock()) self.assertIsInstance( foolscap_handlers['tcp'], tcp.DefaultTCP, @@ -341,7 +341,7 @@ class Connections(unittest.TestCase): self.config = config_from_string("fake.port", self.basedir, BASECONFIG) def test_default(self): - default_connection_handlers, _ = create_connection_handlers(None, self.config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers(self.config, mock.Mock(), mock.Mock()) self.assertEqual(default_connection_handlers["tcp"], "tcp") self.assertEqual(default_connection_handlers["tor"], "tor") 
self.assertEqual(default_connection_handlers["i2p"], "i2p") @@ -352,7 +352,7 @@ class Connections(unittest.TestCase): "no-basedir", BASECONFIG + "[connections]\ntcp = tor\n", ) - default_connection_handlers, _ = create_connection_handlers(None, config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers(config, mock.Mock(), mock.Mock()) self.assertEqual(default_connection_handlers["tcp"], "tor") self.assertEqual(default_connection_handlers["tor"], "tor") @@ -368,7 +368,7 @@ class Connections(unittest.TestCase): ) with self.assertRaises(ValueError) as ctx: tor_provider = create_tor_provider(reactor, self.config) - default_connection_handlers, _ = create_connection_handlers(None, self.config, mock.Mock(), tor_provider) + default_connection_handlers, _ = create_connection_handlers(self.config, mock.Mock(), tor_provider) self.assertEqual( str(ctx.exception), "'tahoe.cfg [connections] tcp='" @@ -383,7 +383,7 @@ class Connections(unittest.TestCase): BASECONFIG + "[connections]\ntcp = unknown\n", ) with self.assertRaises(ValueError) as ctx: - create_connection_handlers(None, config, mock.Mock(), mock.Mock()) + create_connection_handlers(config, mock.Mock(), mock.Mock()) self.assertIn("'tahoe.cfg [connections] tcp='", str(ctx.exception)) self.assertIn("uses unknown handler type 'unknown'", str(ctx.exception)) @@ -393,7 +393,7 @@ class Connections(unittest.TestCase): "no-basedir", BASECONFIG + "[connections]\ntcp = disabled\n", ) - default_connection_handlers, _ = create_connection_handlers(None, config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers(config, mock.Mock(), mock.Mock()) self.assertEqual(default_connection_handlers["tcp"], None) self.assertEqual(default_connection_handlers["tor"], "tor") self.assertEqual(default_connection_handlers["i2p"], "i2p") @@ -408,7 +408,7 @@ class Privacy(unittest.TestCase): ) with self.assertRaises(PrivacyError) as ctx: - create_connection_handlers(None, 
config, mock.Mock(), mock.Mock()) + create_connection_handlers(config, mock.Mock(), mock.Mock()) self.assertEqual( str(ctx.exception), @@ -423,7 +423,7 @@ class Privacy(unittest.TestCase): BASECONFIG + "[connections]\ntcp = disabled\n" + "[node]\nreveal-IP-address = false\n", ) - default_connection_handlers, _ = create_connection_handlers(None, config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers(config, mock.Mock(), mock.Mock()) self.assertEqual(default_connection_handlers["tcp"], None) def test_tub_location_auto(self): diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index eead51897..11e65380f 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -91,7 +91,7 @@ def testing_tub(config_data=''): i2p_provider = create_i2p_provider(reactor, config) tor_provider = create_tor_provider(reactor, config) - handlers = create_connection_handlers(reactor, config, i2p_provider, tor_provider) + handlers = create_connection_handlers(config, i2p_provider, tor_provider) default_connection_handlers, foolscap_connection_handlers = handlers tub_options = create_tub_options(config) @@ -936,9 +936,9 @@ class Configuration(unittest.TestCase): -class CreateConnectionHandlers(unittest.TestCase): +class CreateDefaultConnectionHandlersTests(unittest.TestCase): """ - Tests for create_connection_handlers(). + Tests for create_default_connection_handlers(). """ def test_tcp_disabled(self): @@ -949,9 +949,7 @@ class CreateConnectionHandlers(unittest.TestCase): [connections] tcp = disabled """)) - reactor = object() # it's not actually used?! 
         default_handlers = create_default_connection_handlers(
-            reactor,
             config,
             {},
         )

From 49330d1e4aa2969c8112355c2576dab09ff5af17 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 18:25:58 -0500
Subject: [PATCH 085/186] docstring

---
 src/allmydata/node.py | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/src/allmydata/node.py b/src/allmydata/node.py
index e5a11d91c..700924cca 100644
--- a/src/allmydata/node.py
+++ b/src/allmydata/node.py
@@ -725,7 +725,9 @@ def _convert_tub_port(s):
 
 
 class PortAssignmentRequired(Exception):
-    pass
+    """
+    A Tub port number was configured to be 0 where this is not allowed.
+    """
 
 
 def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port):

From b77f43e360ea3b71ef2552ba237767c20eda7483 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 18:27:02 -0500
Subject: [PATCH 086/186] Simplify handler initialization

---
 src/allmydata/node.py | 2 --
 1 file changed, 2 deletions(-)

diff --git a/src/allmydata/node.py b/src/allmydata/node.py
index 700924cca..84abf11d1 100644
--- a/src/allmydata/node.py
+++ b/src/allmydata/node.py
@@ -669,8 +669,6 @@ def create_connection_handlers(config, i2p_provider, tor_provider):
     # create that handler, so hints which want it will be ignored.
     handlers = {
         "tcp": _make_tcp_handler(),
-    }
-    handlers.update({
         "tor": tor_provider.get_tor_handler(),
         "i2p": i2p_provider.get_i2p_handler(),
     })

From 677e62e73eabb4016776c8314b595da3a66c21c0 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 18:29:50 -0500
Subject: [PATCH 087/186] Return the canned handler

---
 src/allmydata/test/common.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/src/allmydata/test/common.py b/src/allmydata/test/common.py
index 0a1f2a98d..fbbf2000a 100644
--- a/src/allmydata/test/common.py
+++ b/src/allmydata/test/common.py
@@ -1172,6 +1172,7 @@ class FakeProvider(object):
     def get_client_endpoint(self):
         if self._handler is None:
             raise Exception("{!r} has no client endpoint.")
+        return self._handler
 
 
 class _TestCaseMixin(object):

From 5c6e0a2bb46daac6944018f9e9267fcf4b40c6a5 Mon Sep 17 00:00:00 2001
From: Jean-Paul Calderone
Date: Mon, 14 Dec 2020 18:33:28 -0500
Subject: [PATCH 088/186] docstrings

---
 src/allmydata/test/test_node.py | 10 ++++++++++
 1 file changed, 10 insertions(+)

diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py
index 11e65380f..a991caa08 100644
--- a/src/allmydata/test/test_node.py
+++ b/src/allmydata/test/test_node.py
@@ -708,6 +708,11 @@ class TestMissingPorts(unittest.TestCase):
         )
 
     def test_tub_location_tcp(self):
+        """
+        If ``reveal-IP-address`` is set to false and ``tub.location`` includes a
+        **tcp** hint then ``_tub_portlocation`` raises ``PrivacyError`` because
+        TCP leaks IP addresses.
+        """
         config = config_from_string(
             "fake.port",
             "no-basedir",
@@ -725,6 +730,11 @@ class TestMissingPorts(unittest.TestCase):
         )
 
     def test_tub_location_legacy_tcp(self):
+        """
+        If ``reveal-IP-address`` is set to false and ``tub.location`` includes a
+        "legacy" hint with no explicit type (which means it is a **tcp** hint)
+        then the behavior is the same as for an explicit **tcp** hint.
+ """ config = config_from_string( "fake.port", "no-basedir", From a97184868d1bb71e02f72f3c884882dc6f9c8166 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Mon, 14 Dec 2020 18:40:18 -0500 Subject: [PATCH 089/186] oops syntax error so what is the good of pre-commit? --- src/allmydata/node.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index 84abf11d1..d859fd725 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -671,7 +671,7 @@ def create_connection_handlers(config, i2p_provider, tor_provider): "tcp": _make_tcp_handler(), "tor": tor_provider.get_tor_handler(), "i2p": i2p_provider.get_i2p_handler(), - }) + } log.msg( format="built Foolscap connection handlers for: %(known_handlers)s", known_handlers=sorted([k for k,v in handlers.items() if v]), From 6a295688883496fdb5e41e87f5dce615591cf0ad Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 15 Dec 2020 09:16:50 -0500 Subject: [PATCH 090/186] Fix flakes. 
---
 src/allmydata/test/common_util.py | 2 --
 src/allmydata/test/test_system.py | 2 +-
 2 files changed, 1 insertion(+), 3 deletions(-)

diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py
index 6b4e8513f..1ad0f8242 100644
--- a/src/allmydata/test/common_util.py
+++ b/src/allmydata/test/common_util.py
@@ -24,8 +24,6 @@ from allmydata.util.encodingutil import unicode_platform, get_filesystem_encodin
 from future.utils import bord, bchr, binary_type
 from past.builtins import unicode
 
-from ..util.encodingutil import unicode_to_argv
-
 
 def skip_if_cannot_represent_filename(u):
     precondition(isinstance(u, unicode))
diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py
index 2a0a6eb00..192b82e78 100644
--- a/src/allmydata/test/test_system.py
+++ b/src/allmydata/test/test_system.py
@@ -12,7 +12,7 @@ if PY2:
     from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, dict, list, object, range, max, min, str  # noqa: F401
 from past.builtins import chr as byteschr, long
 
-from six import ensure_text, ensure_str, ensure_binary
+from six import ensure_text, ensure_str
 
 import os, re, sys, time, json
 from functools import partial

From 8ee07d58ee162a23140c91f5a120e4de3e06b88a Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Tue, 15 Dec 2020 09:50:09 -0500
Subject: [PATCH 091/186] Fix Nix build.

---
 nix/tahoe-lafs.nix | 5 -----
 1 file changed, 5 deletions(-)

diff --git a/nix/tahoe-lafs.nix b/nix/tahoe-lafs.nix
index cd7027988..eff4ad257 100644
--- a/nix/tahoe-lafs.nix
+++ b/nix/tahoe-lafs.nix
@@ -23,17 +23,12 @@ python.pkgs.buildPythonPackage rec {
     # This list is over-zealous because it's more work to disable individual
     # tests with in a module.
 
-    # test_system is a lot of integration-style tests that do a lot of real
-    # networking between many processes. They sometimes fail spuriously.
-    rm src/allmydata/test/test_system.py
-
     # Many of these tests don't properly skip when i2p or tor dependencies are
     # not supplied (and we are not supplying them).
     rm src/allmydata/test/test_i2p_provider.py
     rm src/allmydata/test/test_connections.py
     rm src/allmydata/test/cli/test_create.py
     rm src/allmydata/test/test_client.py
-    rm src/allmydata/test/test_runner.py
 
     # Some eliot code changes behavior based on whether stdout is a tty or not
     # and fails when it is not.

From 4a587836a5fda6321099b585678c9d29d02275a4 Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Tue, 15 Dec 2020 10:13:46 -0500
Subject: [PATCH 092/186] Port eliotutil and tests to Python 3.

---
 newsfragments/3560.minor             |  0
 nix/tahoe-lafs.nix                   |  4 ----
 src/allmydata/test/test_eliotutil.py |  8 +++++++-
 src/allmydata/util/_python3.py       |  2 ++
 src/allmydata/util/eliotutil.py      | 13 ++++++++++++-
 5 files changed, 21 insertions(+), 6 deletions(-)
 create mode 100644 newsfragments/3560.minor

diff --git a/newsfragments/3560.minor b/newsfragments/3560.minor
new file mode 100644
index 000000000..e69de29bb
diff --git a/nix/tahoe-lafs.nix b/nix/tahoe-lafs.nix
index cd7027988..148763c2f 100644
--- a/nix/tahoe-lafs.nix
+++ b/nix/tahoe-lafs.nix
@@ -34,10 +34,6 @@ python.pkgs.buildPythonPackage rec {
     rm src/allmydata/test/cli/test_create.py
     rm src/allmydata/test/test_client.py
     rm src/allmydata/test/test_runner.py
-
-    # Some eliot code changes behavior based on whether stdout is a tty or not
-    # and fails when it is not.
-    rm src/allmydata/test/test_eliotutil.py
   '';
diff --git a/src/allmydata/test/test_eliotutil.py b/src/allmydata/test/test_eliotutil.py
index b382b7289..1a8c2f801 100644
--- a/src/allmydata/test/test_eliotutil.py
+++ b/src/allmydata/test/test_eliotutil.py
@@ -1,5 +1,7 @@
 """
-Tests for ``allmydata.test.eliotutil``.
+Tests for ``allmydata.util.eliotutil``.
+
+Ported to Python 3.
""" from __future__ import ( @@ -9,6 +11,10 @@ from __future__ import ( division, ) +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 + from sys import stdout import logging diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index f027a5b53..903567983 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -90,6 +90,7 @@ PORTED_MODULES = [ "allmydata.util.connection_status", "allmydata.util.deferredutil", "allmydata.util.dictutil", + "allmydata.util.eliotutil", "allmydata.util.encodingutil", "allmydata.util.fileutil", "allmydata.util.gcutil", @@ -140,6 +141,7 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_dictutil", "allmydata.test.test_dirnode", "allmydata.test.test_download", + "allmydata.test.test_eliotutil", "allmydata.test.test_encode", "allmydata.test.test_encodingutil", "allmydata.test.test_filenode", diff --git a/src/allmydata/util/eliotutil.py b/src/allmydata/util/eliotutil.py index f6f40945d..096356dfa 100644 --- a/src/allmydata/util/eliotutil.py +++ b/src/allmydata/util/eliotutil.py @@ -1,6 +1,12 @@ """ Tools aimed at the interaction between Tahoe-LAFS implementation and Eliot. + +Ported to Python 3. 
""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals from __future__ import ( unicode_literals, @@ -18,6 +24,11 @@ __all__ = [ "validateSetMembership", ] +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 +from six import ensure_text + from sys import ( stdout, ) @@ -228,7 +239,7 @@ def _stdlib_logging_to_eliot_configuration(stdlib_logger, eliot_logger=None): class _DestinationParser(object): def parse(self, description): - description = description.decode(u"ascii") + description = ensure_text(description) try: kind, args = description.split(u":", 1) From 29f0ae05540479e0d9655e50155a0d3706e559a6 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 13:30:58 -0500 Subject: [PATCH 093/186] These don't need to be methods. Also docstrings are nice. --- src/allmydata/test/test_node.py | 62 +++++++++++++++++++-------------- 1 file changed, 35 insertions(+), 27 deletions(-) diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index a991caa08..d9b22e38d 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -517,6 +517,20 @@ class TestCase(testutil.SignalMixin, unittest.TestCase): new_config.get_config("foo", "bar") +def _stub_get_local_addresses_sync(): + """ + A function like ``allmydata.util.iputil.get_local_addresses_sync``. + """ + return ["LOCAL"] + + +def _stub_allocate_tcp_port(): + """ + A function like ``allmydata.util.iputil.allocate_tcp_port``. + """ + return 999 + + class TestMissingPorts(unittest.TestCase): """ Test certain ``_tub_portlocation`` error cases for ports setup. 
@@ -525,16 +539,10 @@ class TestMissingPorts(unittest.TestCase): self.basedir = self.mktemp() create_node_dir(self.basedir, "testing") - def _get_addr(self): - return ["LOCAL"] - - def _alloc_port(self): - return 999 - def test_listen_on_zero(self): """ ``set_tub_locations`` raises ``PortAssignmentRequired`` called with a - listen address including port 0. + listen address including port 0 and no interface. """ config_data = ( "[node]\n" @@ -562,8 +570,8 @@ class TestMissingPorts(unittest.TestCase): tubport, tublocation = _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertEqual(tubport, "tcp:777") self.assertEqual(tublocation, b"tcp:LOCAL:777") @@ -579,8 +587,8 @@ class TestMissingPorts(unittest.TestCase): tubport, tublocation = _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertEqual(tubport, "tcp:999") self.assertEqual(tublocation, b"tcp:LOCAL:999") @@ -597,8 +605,8 @@ class TestMissingPorts(unittest.TestCase): tubport, tublocation = _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertEqual(tubport, "tcp:999") self.assertEqual(tublocation, b"tcp:HOST:888,tcp:LOCAL:999") @@ -616,8 +624,8 @@ class TestMissingPorts(unittest.TestCase): res = _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertTrue(res is None) @@ -634,8 +642,8 @@ class TestMissingPorts(unittest.TestCase): with self.assertRaises(ValueError) as ctx: _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertIn( "tub.port must not be empty", @@ -655,8 +663,8 @@ class TestMissingPorts(unittest.TestCase): with self.assertRaises(ValueError) as ctx: _tub_portlocation( config, - 
self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertIn( "tub.location must not be empty", @@ -677,8 +685,8 @@ class TestMissingPorts(unittest.TestCase): with self.assertRaises(ValueError) as ctx: _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertIn( "tub.port is disabled, but not tub.location", @@ -699,8 +707,8 @@ class TestMissingPorts(unittest.TestCase): with self.assertRaises(ValueError) as ctx: _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertIn( "tub.location is disabled, but not tub.port", @@ -721,8 +729,8 @@ class TestMissingPorts(unittest.TestCase): with self.assertRaises(PrivacyError) as ctx: _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertEqual( str(ctx.exception), @@ -744,8 +752,8 @@ class TestMissingPorts(unittest.TestCase): with self.assertRaises(PrivacyError) as ctx: _tub_portlocation( config, - self._get_addr, - self._alloc_port, + _stub_get_local_addresses_sync, + _stub_allocate_tcp_port, ) self.assertEqual( From be559ab3a5c5c081dd9f6644f1a693c3e09af486 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 13:31:18 -0500 Subject: [PATCH 094/186] Turn the XXX into a TODO'd test and a ticket --- src/allmydata/test/test_node.py | 21 +++++++++++++++++---- 1 file changed, 17 insertions(+), 4 deletions(-) diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index d9b22e38d..a932a9923 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -6,7 +6,7 @@ from __future__ import division from __future__ import print_function from __future__ import unicode_literals -from future.utils import PY2 +from future.utils import PY2, native_str if PY2: from future.builtins 
import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 @@ -549,12 +549,25 @@ class TestMissingPorts(unittest.TestCase): "tub.port = tcp:0\n" ) config = config_from_string(self.basedir, "portnum", config_data) - - # XXX The implementation has some shortcomings. For example, how - # about tcp:localhost:0? with self.assertRaises(PortAssignmentRequired): _tub_portlocation(config, None, None) + def test_listen_on_zero_with_host(self): + """ + ``set_tub_locations`` raises ``PortAssignmentRequired`` called with a + listen address including port 0 and an interface. + """ + config_data = ( + "[node]\n" + "tub.port = tcp:0:interface=127.0.0.1\n" + ) + config = config_from_string(self.basedir, "portnum", config_data) + with self.assertRaises(PortAssignmentRequired): + _tub_portlocation(config, None, None) + test_listen_on_zero_with_host.todo = native_str( + "https://tahoe-lafs.org/trac/tahoe-lafs/ticket/3563" + ) + def test_parsing_tcp(self): """ When ``tub.port`` is given and ``tub.location`` is **AUTO** the port From c2dc2b39da29764943aadd93721b55317a9c3397 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 13:34:04 -0500 Subject: [PATCH 095/186] A better name --- src/allmydata/test/common.py | 2 +- src/allmydata/test/test_node.py | 6 +++--- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/src/allmydata/test/common.py b/src/allmydata/test/common.py index fbbf2000a..b2fe8cac8 100644 --- a/src/allmydata/test/common.py +++ b/src/allmydata/test/common.py @@ -1156,7 +1156,7 @@ def _corrupt_uri_extension(data, debug=False): @attr.s @implementer(IAddressFamily) -class FakeProvider(object): +class ConstantAddresses(object): """ Pretend to provide support for some address family but just hand out canned responses. 
diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index a932a9923..2ea43f9ea 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -68,7 +68,7 @@ from allmydata.util.tor_provider import create as create_tor_provider import allmydata.test.common_util as testutil from .common import ( - FakeProvider, + ConstantAddresses, ) def port_numbers(): @@ -847,9 +847,9 @@ class Listeners(unittest.TestCase): t = FakeTub() i2p_listener = object() - i2p_provider = FakeProvider(i2p_listener) + i2p_provider = ConstantAddresses(i2p_listener) tor_listener = object() - tor_provider = FakeProvider(tor_listener) + tor_provider = ConstantAddresses(tor_listener) set_tub_locations( i2p_provider, From 53b782aca4206a460e8ec8ec0b3169b0ff587ef5 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 13:40:12 -0500 Subject: [PATCH 096/186] get the function name right --- src/allmydata/test/test_node.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index 2ea43f9ea..f5d61efb5 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -541,7 +541,7 @@ class TestMissingPorts(unittest.TestCase): def test_listen_on_zero(self): """ - ``set_tub_locations`` raises ``PortAssignmentRequired`` called with a + ``_tub_portlocation`` raises ``PortAssignmentRequired`` called with a listen address including port 0 and no interface. """ config_data = ( @@ -554,7 +554,7 @@ class TestMissingPorts(unittest.TestCase): def test_listen_on_zero_with_host(self): """ - ``set_tub_locations`` raises ``PortAssignmentRequired`` called with a + ``_tub_portlocation`` raises ``PortAssignmentRequired`` called with a listen address including port 0 and an interface. 
""" config_data = ( From 7dbcb4d712b190bfd6ec3fdaff4079fd30eb203a Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 13:41:01 -0500 Subject: [PATCH 097/186] Make set_tub_locations a nicer function --- src/allmydata/node.py | 56 ++++++++++++++++++--------------- src/allmydata/test/test_node.py | 5 +-- 2 files changed, 34 insertions(+), 27 deletions(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index d859fd725..90f0742f0 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -827,36 +827,31 @@ def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port): return tubport, location -def set_tub_locations(i2p_provider, tor_provider, tub, portlocation): +def set_tub_locations(i2p_provider, tor_provider, tub, tubport, location): """ Assign a Tub its listener locations. :param i2p_provider: See ``allmydata.util.i2p_provider.create``. :param tor_provider: See ``allmydata.util.tor_provider.create``. """ - if portlocation is None: - log.msg("Tub is not listening") - else: - tubport, location = portlocation - for port in tubport.split(","): - if port == "listen:i2p": - # the I2P provider will read its section of tahoe.cfg and - # return either a fully-formed Endpoint, or a descriptor - # that will create one, so we don't have to stuff all the - # options into the tub.port string (which would need a lot - # of escaping) - port_or_endpoint = i2p_provider.get_listener() - elif port == "listen:tor": - port_or_endpoint = tor_provider.get_listener() - else: - port_or_endpoint = port - # Foolscap requires native strings: - if isinstance(port_or_endpoint, (bytes, str)): - port_or_endpoint = ensure_str(port_or_endpoint) - tub.listenOn(port_or_endpoint) - tub.setLocation(location) - log.msg("Tub location set to %s" % (location,)) - # the Tub is now ready for tub.registerReference() + for port in tubport.split(","): + if port == "listen:i2p": + # the I2P provider will read its section of tahoe.cfg and + # return either a 
fully-formed Endpoint, or a descriptor + # that will create one, so we don't have to stuff all the + # options into the tub.port string (which would need a lot + # of escaping) + port_or_endpoint = i2p_provider.get_listener() + elif port == "listen:tor": + port_or_endpoint = tor_provider.get_listener() + else: + port_or_endpoint = port + # Foolscap requires native strings: + if isinstance(port_or_endpoint, (bytes, str)): + port_or_endpoint = ensure_str(port_or_endpoint) + tub.listenOn(port_or_endpoint) + # This last step makes the Tub ready for tub.registerReference() + tub.setLocation(location) def create_main_tub(config, tub_options, @@ -899,7 +894,18 @@ def create_main_tub(config, tub_options, handler_overrides=handler_overrides, certFile=certfile, ) - set_tub_locations(i2p_provider, tor_provider, tub, portlocation) + if portlocation is None: + log.msg("Tub is not listening") + else: + tubport, location = portlocation + set_tub_locations( + i2p_provider, + tor_provider, + tub, + tubport, + location, + ) + log.msg("Tub location set to %s" % (location,)) return tub diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index f5d61efb5..cd3eb1eab 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -833,7 +833,7 @@ class Listeners(unittest.TestCase): (port1, port2)) location = "tcp:localhost:%d,tcp:localhost:%d" % (port1, port2) t = FakeTub() - set_tub_locations(None, None, t, (port, location)) + set_tub_locations(None, None, t, port, location) self.assertEqual(t.listening_ports, ["tcp:%d:interface=127.0.0.1" % port1, "tcp:%d:interface=127.0.0.1" % port2]) @@ -855,7 +855,8 @@ class Listeners(unittest.TestCase): i2p_provider, tor_provider, t, - ("listen:i2p,listen:tor", "tcp:example.org:1234"), + "listen:i2p,listen:tor", + "tcp:example.org:1234", ) self.assertEqual( t.listening_ports, From fee8c55f01b8b4b9d5f1d0392d43b76f5a426371 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 
13:42:20 -0500 Subject: [PATCH 098/186] The listenOn is probably the most interesting part --- src/allmydata/node.py | 4 ++-- src/allmydata/test/test_node.py | 12 ++++++------ 2 files changed, 8 insertions(+), 8 deletions(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index 90f0742f0..c5433c33c 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -827,7 +827,7 @@ def _tub_portlocation(config, get_local_addresses_sync, allocate_tcp_port): return tubport, location -def set_tub_locations(i2p_provider, tor_provider, tub, tubport, location): +def tub_listen_on(i2p_provider, tor_provider, tub, tubport, location): """ Assign a Tub its listener locations. @@ -898,7 +898,7 @@ def create_main_tub(config, tub_options, log.msg("Tub is not listening") else: tubport, location = portlocation - set_tub_locations( + tub_listen_on( i2p_provider, tor_provider, tub, diff --git a/src/allmydata/test/test_node.py b/src/allmydata/test/test_node.py index cd3eb1eab..1e0f3020c 100644 --- a/src/allmydata/test/test_node.py +++ b/src/allmydata/test/test_node.py @@ -40,7 +40,7 @@ from twisted.application import service from allmydata.node import ( PortAssignmentRequired, PrivacyError, - set_tub_locations, + tub_listen_on, create_tub_options, create_main_tub, create_node_dir, @@ -833,16 +833,16 @@ class Listeners(unittest.TestCase): (port1, port2)) location = "tcp:localhost:%d,tcp:localhost:%d" % (port1, port2) t = FakeTub() - set_tub_locations(None, None, t, port, location) + tub_listen_on(None, None, t, port, location) self.assertEqual(t.listening_ports, ["tcp:%d:interface=127.0.0.1" % port1, "tcp:%d:interface=127.0.0.1" % port2]) def test_tor_i2p_listeners(self): """ - When configured to listen on an "i2p" or "tor" address, - ``set_tub_locations`` tells the Tub to listen on endpoints supplied by - the given Tor and I2P providers. 
+ When configured to listen on an "i2p" or "tor" address, ``tub_listen_on`` + tells the Tub to listen on endpoints supplied by the given Tor and I2P + providers. """ t = FakeTub() @@ -851,7 +851,7 @@ class Listeners(unittest.TestCase): tor_listener = object() tor_provider = ConstantAddresses(tor_listener) - set_tub_locations( + tub_listen_on( i2p_provider, tor_provider, t, From c45290c55fc2d6aae620be09a0ee858b6a7322a8 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 18:29:43 -0500 Subject: [PATCH 099/186] news fragment --- newsfragments/3533.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3533.minor diff --git a/newsfragments/3533.minor b/newsfragments/3533.minor new file mode 100644 index 000000000..e69de29bb From 2b1ea5c60429e7f39d7aef8898d484796da757c2 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 15 Dec 2020 18:30:12 -0500 Subject: [PATCH 100/186] Remove the client Mock object It wasn't used by anything so that was easy. Clean up the test as long as we're here. 
--- src/allmydata/test/web/common.py | 3 ++- src/allmydata/test/web/test_root.py | 40 +++++++++++++++++++++-------- 2 files changed, 31 insertions(+), 12 deletions(-) diff --git a/src/allmydata/test/web/common.py b/src/allmydata/test/web/common.py index 1f568ad8d..00a40e3c5 100644 --- a/src/allmydata/test/web/common.py +++ b/src/allmydata/test/web/common.py @@ -25,7 +25,8 @@ def assert_soup_has_tag_with_attributes(testcase, soup, tag_name, attrs): tags = soup.find_all(tag_name) for tag in tags: if all(v in tag.attrs.get(k, []) for k, v in attrs.items()): - return # we found every attr in this tag; done + # we found every attr in this tag; done + return tag testcase.fail( u"No <{}> tags contain attributes: {}".format(tag_name, attrs) ) diff --git a/src/allmydata/test/web/test_root.py b/src/allmydata/test/web/test_root.py index 139441a6c..0715c8102 100644 --- a/src/allmydata/test/web/test_root.py +++ b/src/allmydata/test/web/test_root.py @@ -1,7 +1,13 @@ -from mock import Mock - import time +from urllib import ( + quote, +) + +from bs4 import ( + BeautifulSoup, +) + from twisted.trial import unittest from twisted.web.template import Tag from twisted.web.test.requesthelper import DummyRequest @@ -16,6 +22,9 @@ from ...util.connection_status import ConnectionStatus from allmydata.web.root import URIHandler from allmydata.client import _Client +from .common import ( + assert_soup_has_tag_with_attributes, +) from ..common_web import ( render, ) @@ -30,28 +39,37 @@ class RenderSlashUri(unittest.TestCase): """ def setUp(self): - self.client = Mock() + self.client = object() self.res = URIHandler(self.client) - def test_valid(self): + def test_valid_query_redirect(self): """ - A valid capbility does not result in error + A syntactically valid capability given in the ``uri`` query argument + results in a redirect. 
""" - query_args = {b"uri": [ + cap = ( b"URI:CHK:nt2xxmrccp7sursd6yh2thhcky:" b"mukesarwdjxiyqsjinbfiiro6q7kgmmekocxfjcngh23oxwyxtzq:2:5:5874882" - ]} + ) + query_args = {b"uri": [cap]} response_body = self.successResultOf( render(self.res, query_args), ) - self.assertNotEqual( - response_body, - "Invalid capability", + soup = BeautifulSoup(response_body, 'html5lib') + tag = assert_soup_has_tag_with_attributes( + self, + soup, + u"meta", + {u"http-equiv": "refresh"}, + ) + self.assertIn( + quote(cap, safe=""), + tag.attrs.get(u"content"), ) def test_invalid(self): """ - A (trivially) invalid capbility is an error + A syntactically invalid capbility results in an error. """ query_args = {b"uri": [b"not a capability"]} response_body = self.successResultOf( From eeebd15c42936bab9987d24977e5e0fbc58c6f80 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 09:15:37 -0500 Subject: [PATCH 101/186] Take Mock out of ``allmydata.test.test_connections.TCP`` --- src/allmydata/node.py | 4 +-- src/allmydata/test/test_connections.py | 50 ++++++++++++++++++++++---- 2 files changed, 45 insertions(+), 9 deletions(-) diff --git a/src/allmydata/node.py b/src/allmydata/node.py index c5433c33c..e08c07508 100644 --- a/src/allmydata/node.py +++ b/src/allmydata/node.py @@ -669,8 +669,8 @@ def create_connection_handlers(config, i2p_provider, tor_provider): # create that handler, so hints which want it will be ignored. 
handlers = { "tcp": _make_tcp_handler(), - "tor": tor_provider.get_tor_handler(), - "i2p": i2p_provider.get_i2p_handler(), + "tor": tor_provider.get_client_endpoint(), + "i2p": i2p_provider.get_client_endpoint(), } log.msg( format="built Foolscap connection handlers for: %(known_handlers)s", diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 30aac8446..a5746ced1 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -1,31 +1,67 @@ import os import mock + from twisted.trial import unittest from twisted.internet import reactor, endpoints, defer from twisted.internet.interfaces import IStreamClientEndpoint + from foolscap.connections import tcp + +from testtools.matchers import ( + MatchesDict, + IsInstance, + Equals, +) + from ..node import PrivacyError, config_from_string from ..node import create_connection_handlers from ..node import create_main_tub from ..util.i2p_provider import create as create_i2p_provider from ..util.tor_provider import create as create_tor_provider +from .common import ( + SyncTestCase, + ConstantAddresses, +) + BASECONFIG = "" -class TCP(unittest.TestCase): - - def test_default(self): +class CreateConnectionHandlersTests(SyncTestCase): + """ + Tests for the Foolscap connection handlers returned by + ``create_connection_handlers``. + """ + def test_foolscap_handlers(self): + """ + ``create_connection_handlers`` returns a Foolscap connection handlers + dictionary mapping ``"tcp"`` to + ``foolscap.connections.tcp.DefaultTCP``, ``"tor"`` to the supplied Tor + provider's handler, and ``"i2p"`` to the supplied I2P provider's + handler. 
+ """ config = config_from_string( "fake.port", "no-basedir", BASECONFIG, ) - _, foolscap_handlers = create_connection_handlers(config, mock.Mock(), mock.Mock()) - self.assertIsInstance( - foolscap_handlers['tcp'], - tcp.DefaultTCP, + tor_endpoint = object() + tor = ConstantAddresses(handler=tor_endpoint) + i2p_endpoint = object() + i2p = ConstantAddresses(handler=i2p_endpoint) + _, foolscap_handlers = create_connection_handlers( + config, + i2p, + tor, + ) + self.assertThat( + foolscap_handlers, + MatchesDict({ + "tcp": IsInstance(tcp.DefaultTCP), + "i2p": Equals(i2p_endpoint), + "tor": Equals(tor_endpoint), + }), ) From 052b3d9fb1fd88f54bcdce5c7e88a4ed3e7ecc7a Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 16 Dec 2020 10:06:59 -0500 Subject: [PATCH 102/186] Re-enable logging validation on Python 3. --- newsfragments/3564.minor | 0 setup.py | 4 +++- src/allmydata/test/eliotutil.py | 16 ++++++++++++---- 3 files changed, 15 insertions(+), 5 deletions(-) create mode 100644 newsfragments/3564.minor diff --git a/newsfragments/3564.minor b/newsfragments/3564.minor new file mode 100644 index 000000000..e69de29bb diff --git a/setup.py b/setup.py index c26805684..345d6aa08 100644 --- a/setup.py +++ b/setup.py @@ -111,7 +111,9 @@ install_requires = [ # Eliot is contemplating dropping Python 2 support. Stick to a version we # know works on Python 2.7. - "eliot ~= 1.7", + "eliot ~= 1.7 ; python_version < '3.0'", + # On Python 3, we want a new enough version to support custom JSON encoders. + "eliot >= 1.13.0 ; python_version > '3.0'", # Pyrsistent 0.17.0 (which we use by way of Eliot) has dropped # Python 2 entirely; stick to the version known to work for us. 
diff --git a/src/allmydata/test/eliotutil.py b/src/allmydata/test/eliotutil.py index 0c72502a7..88b4ec90e 100644 --- a/src/allmydata/test/eliotutil.py +++ b/src/allmydata/test/eliotutil.py @@ -31,6 +31,9 @@ from twisted.internet.defer import ( maybeDeferred, ) +from ..util.jsonbytes import BytesJSONEncoder + + _NAME = Field.for_types( u"name", [str], @@ -61,6 +64,14 @@ def eliot_logged_test(f): class storage(object): pass + + # On Python 3, we want to use our custom JSON encoder when validating + # messages can be encoded to JSON: + if PY3: + capture = lambda f : capture_logging(None, encoder_=BytesJSONEncoder)(f) + else: + capture = lambda f : capture_logging(None)(f) + @wraps(f) def run_and_republish(self, *a, **kw): # Unfortunately the only way to get at the global/default logger... @@ -85,7 +96,7 @@ def eliot_logged_test(f): # can finish the test's action. storage.action.finish() - @capture_logging(None) + @capture def run(self, logger): # Record the MemoryLogger for later message extraction. 
storage.logger = logger @@ -165,9 +176,6 @@ class EliotLoggedRunTest(object): @eliot_logged_test def run(self, result=None): - # Workaround for https://github.com/itamarst/eliot/issues/456 - if PY3: - self.case.eliot_logger._validate_message = lambda *args, **kwargs: None return self._run_tests_with_factory( self.case, self.handlers, From d3f8839f1bb4d2f4d1f7f6e12052bef126b28774 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 10:34:07 -0500 Subject: [PATCH 103/186] Duplicate of allmydata.test.test_tor_provider.Provider.test_handler_disabled --- src/allmydata/test/test_connections.py | 10 ---------- 1 file changed, 10 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index a5746ced1..66319066c 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -67,16 +67,6 @@ class CreateConnectionHandlersTests(SyncTestCase): class Tor(unittest.TestCase): - def test_disabled(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[tor]\nenabled = false\n", - ) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - self.assertEqual(h, None) - def test_unimportable(self): with mock.patch("allmydata.util.tor_provider._import_tor", return_value=None): From 17d9988d4560a8e78a6d820eb087cef0fdf47691 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 10:34:33 -0500 Subject: [PATCH 104/186] Duplicate of allmydata.test.test_tor_provider.Provider.test_handler_no_tor --- src/allmydata/test/test_connections.py | 8 -------- 1 file changed, 8 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 66319066c..6c5085593 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -67,14 +67,6 @@ class CreateConnectionHandlersTests(SyncTestCase): class Tor(unittest.TestCase): - def 
test_unimportable(self): - with mock.patch("allmydata.util.tor_provider._import_tor", - return_value=None): - config = config_from_string("fake.port", "no-basedir", BASECONFIG) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - self.assertEqual(h, None) - def test_default(self): h1 = mock.Mock() with mock.patch("foolscap.connections.tor.default_socks", From b5d4a2579b172dfff9c147d09f5914cb1b5f86bb Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 10:37:41 -0500 Subject: [PATCH 105/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_disabled --- src/allmydata/test/test_connections.py | 10 ---------- 1 file changed, 10 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 6c5085593..d59c415aa 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -212,16 +212,6 @@ class Tor(unittest.TestCase): class I2P(unittest.TestCase): - def test_disabled(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[i2p]\nenabled = false\n", - ) - i2p_provider = create_i2p_provider(None, config) - h = i2p_provider.get_i2p_handler() - self.assertEqual(h, None) - def test_unimportable(self): config = config_from_string( "fake.port", From ec9851f6d89ff0ff434a2428e815c67ea0571c69 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 10:38:13 -0500 Subject: [PATCH 106/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_no_i2p --- src/allmydata/test/test_connections.py | 12 ------------ 1 file changed, 12 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index d59c415aa..2e936a2df 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -212,18 +212,6 @@ class Tor(unittest.TestCase): class I2P(unittest.TestCase): - def 
test_unimportable(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG, - ) - with mock.patch("allmydata.util.i2p_provider._import_i2p", - return_value=None): - i2p_provider = create_i2p_provider(reactor, config) - h = i2p_provider.get_i2p_handler() - self.assertEqual(h, None) - def test_default(self): config = config_from_string("fake.port", "no-basedir", BASECONFIG) h1 = mock.Mock() From 71ced4c228f799aa96985e8eee5b763c7a08d0cd Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:03:37 -0500 Subject: [PATCH 107/186] Duplicate of allmydata.test.test_tor_provider.Provider.test_handler_socks_endpoint --- src/allmydata/test/test_connections.py | 11 ----------- src/allmydata/test/test_tor_provider.py | 4 ++++ 2 files changed, 4 insertions(+), 11 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 2e936a2df..6e4a84e5e 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -67,17 +67,6 @@ class CreateConnectionHandlersTests(SyncTestCase): class Tor(unittest.TestCase): - def test_default(self): - h1 = mock.Mock() - with mock.patch("foolscap.connections.tor.default_socks", - return_value=h1) as f: - - config = config_from_string("fake.port", "no-basedir", BASECONFIG) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - self.assertEqual(f.mock_calls, [mock.call()]) - self.assertIdentical(h, h1) - def _do_test_launch(self, executable): # the handler is created right away config = BASECONFIG+"[tor]\nlaunch = true\n" diff --git a/src/allmydata/test/test_tor_provider.py b/src/allmydata/test/test_tor_provider.py index bfc962831..c6d950632 100644 --- a/src/allmydata/test/test_tor_provider.py +++ b/src/allmydata/test/test_tor_provider.py @@ -349,6 +349,10 @@ class Provider(unittest.TestCase): cfs2.assert_called_with(reactor, ep_desc) def test_handler_socks_endpoint(self): + """ + If 
not configured otherwise, the Tor provider returns a Socks-based + handler. + """ tor = mock.Mock() handler = object() tor.socks_endpoint = mock.Mock(return_value=handler) From 61778bc799d147bee8c01f36a7b90e53076b4efd Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:05:51 -0500 Subject: [PATCH 108/186] Duplicate of allmydata.test.test_tor_provider.CreateOnion.test_launch --- src/allmydata/test/test_connections.py | 3 --- 1 file changed, 3 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 6e4a84e5e..22ad4a4ca 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -103,9 +103,6 @@ class Tor(unittest.TestCase): cfs.assert_called_with(reactor, "ep_desc") self.assertIs(cep, tcep) - def test_launch(self): - self._do_test_launch(None) - def test_launch_executable(self): self._do_test_launch("/special/tor") From 01b31e0680d81da0b749e230bc7e7afe703a26fc Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:06:57 -0500 Subject: [PATCH 109/186] Duplicate of allmydata.test.test_tor_provider.CreateOnion.test_launch_executable --- src/allmydata/test/test_connections.py | 39 -------------------------- 1 file changed, 39 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 22ad4a4ca..34bb3d9a8 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -67,45 +67,6 @@ class CreateConnectionHandlersTests(SyncTestCase): class Tor(unittest.TestCase): - def _do_test_launch(self, executable): - # the handler is created right away - config = BASECONFIG+"[tor]\nlaunch = true\n" - if executable: - config += "tor.executable = %s\n" % executable - h1 = mock.Mock() - with mock.patch("foolscap.connections.tor.control_endpoint_maker", - return_value=h1) as f: - - config = config_from_string("fake.port", ".", config) - tp = 
create_tor_provider("reactor", config) - h = tp.get_tor_handler() - - private_dir = config.get_config_path("private") - exp = mock.call(tp._make_control_endpoint, - takes_status=True) - self.assertEqual(f.mock_calls, [exp]) - self.assertIdentical(h, h1) - - # later, when Foolscap first connects, Tor should be launched - reactor = "reactor" - tcp = object() - tcep = object() - launch_tor = mock.Mock(return_value=defer.succeed(("ep_desc", tcp))) - cfs = mock.Mock(return_value=tcep) - with mock.patch("allmydata.util.tor_provider._launch_tor", launch_tor): - with mock.patch("allmydata.util.tor_provider.clientFromString", cfs): - d = tp._make_control_endpoint(reactor, - update_status=lambda status: None) - cep = self.successResultOf(d) - launch_tor.assert_called_with(reactor, executable, - os.path.abspath(private_dir), - tp._txtorcon) - cfs.assert_called_with(reactor, "ep_desc") - self.assertIs(cep, tcep) - - def test_launch_executable(self): - self._do_test_launch("/special/tor") - def test_socksport_unix_endpoint(self): h1 = mock.Mock() with mock.patch("foolscap.connections.tor.socks_endpoint", From f7c92bf4c94f2319c680d2d3267bae8c473a2569 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:11:22 -0500 Subject: [PATCH 110/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_default --- src/allmydata/test/test_connections.py | 10 ---------- 1 file changed, 10 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 34bb3d9a8..8ec6da619 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -159,16 +159,6 @@ class Tor(unittest.TestCase): class I2P(unittest.TestCase): - def test_default(self): - config = config_from_string("fake.port", "no-basedir", BASECONFIG) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.default", - return_value=h1) as f: - i2p_provider = create_i2p_provider(reactor, config) - h = 
i2p_provider.get_i2p_handler() - self.assertEqual(f.mock_calls, [mock.call(reactor, keyfile=None)]) - self.assertIdentical(h, h1) - def test_samport(self): config = config_from_string( "fake.port", From ececae2ce95caf70702e5a0bc6908a3b399b467d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:12:36 -0500 Subject: [PATCH 111/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_sam_endpoint --- src/allmydata/test/test_connections.py | 17 ----------------- 1 file changed, 17 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 8ec6da619..f6b1af4c2 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -159,23 +159,6 @@ class Tor(unittest.TestCase): class I2P(unittest.TestCase): - def test_samport(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[i2p]\nsam.port = tcp:localhost:1234\n", - ) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.sam_endpoint", - return_value=h1) as f: - i2p_provider = create_i2p_provider(reactor, config) - h = i2p_provider.get_i2p_handler() - - self.assertEqual(len(f.mock_calls), 1) - ep = f.mock_calls[0][1][0] - self.assertIsInstance(ep, endpoints.TCP4ClientEndpoint) - self.assertIdentical(h, h1) - def test_samport_and_launch(self): config = config_from_string( "no-basedir", From acc36c34d05f53f2bc7fc61ccdf9ef859c7e0e03 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 16 Dec 2020 11:13:32 -0500 Subject: [PATCH 112/186] Tests pass on Python 2 and Python 3. 
--- src/allmydata/test/web/test_common.py | 2 +- src/allmydata/web/common.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/web/test_common.py b/src/allmydata/test/web/test_common.py index 5261c412f..6f0d51d5a 100644 --- a/src/allmydata/test/web/test_common.py +++ b/src/allmydata/test/web/test_common.py @@ -163,7 +163,7 @@ class RenderExceptionTests(SyncTestCase): BeautifulSoup(value, 'html5lib'), "meta", {"http-equiv": "refresh", - "content": "0;URL={}".format(loc.encode("ascii")), + "content": "0;URL={}".format(loc), }, ) # The assertion will raise if it has a problem, otherwise diff --git a/src/allmydata/web/common.py b/src/allmydata/web/common.py index d970cc918..a1fd06c1c 100644 --- a/src/allmydata/web/common.py +++ b/src/allmydata/web/common.py @@ -562,7 +562,7 @@ def _finish(result, render, request): Message.log( message_type=u"allmydata:web:common-render:DecodedURL", ) - _finish(redirectTo(str(result), request), render, request) + _finish(redirectTo(result.to_text().encode("utf-8"), request), render, request) elif result is None: Message.log( message_type=u"allmydata:web:common-render:None", From e84860ef15c1ed0d7b652dec84c5369b6f4075b4 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:13:52 -0500 Subject: [PATCH 113/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_launch --- src/allmydata/test/test_connections.py | 15 --------------- 1 file changed, 15 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index f6b1af4c2..f03c0ea76 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -174,21 +174,6 @@ class I2P(unittest.TestCase): str(ctx.exception) ) - def test_launch(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[i2p]\nlaunch = true\n", - ) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.launch", - 
return_value=h1) as f: - i2p_provider = create_i2p_provider(reactor, config) - h = i2p_provider.get_i2p_handler() - exp = mock.call(i2p_configdir=None, i2p_binary=None) - self.assertEqual(f.mock_calls, [exp]) - self.assertIdentical(h, h1) - def test_launch_executable(self): config = config_from_string( "fake.port", From 6d66be43b9279f0a7bee018b7cb3f23a4cbda306 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:14:46 -0500 Subject: [PATCH 114/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_launch_configdir --- src/allmydata/test/test_connections.py | 15 --------------- 1 file changed, 15 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index f03c0ea76..0496eb05c 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -189,21 +189,6 @@ class I2P(unittest.TestCase): self.assertEqual(f.mock_calls, [exp]) self.assertIdentical(h, h1) - def test_launch_configdir(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[i2p]\nlaunch = true\n" + "i2p.configdir = cfg\n", - ) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.launch", - return_value=h1) as f: - i2p_provider = create_i2p_provider(reactor, config) - h = i2p_provider.get_i2p_handler() - exp = mock.call(i2p_configdir="cfg", i2p_binary=None) - self.assertEqual(f.mock_calls, [exp]) - self.assertIdentical(h, h1) - def test_launch_configdir_and_executable(self): config = config_from_string( "no-basedir", From f7362dc1efb8cf6cf0f0857f696dc8241ca40717 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 16 Dec 2020 11:14:55 -0500 Subject: [PATCH 115/186] Port to Python 3. 
--- src/allmydata/test/web/test_common.py | 10 ++++++++++ src/allmydata/util/_python3.py | 1 + 2 files changed, 11 insertions(+) diff --git a/src/allmydata/test/web/test_common.py b/src/allmydata/test/web/test_common.py index 6f0d51d5a..84ab5cab2 100644 --- a/src/allmydata/test/web/test_common.py +++ b/src/allmydata/test/web/test_common.py @@ -1,6 +1,16 @@ """ Tests for ``allmydata.web.common``. + +Ported to Python 3. """ +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 import gc diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 903567983..169c7b413 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -171,4 +171,5 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_upload", "allmydata.test.test_uri", "allmydata.test.test_util", + "allmydata.test.web.test_common", ] From 81b684b583c3304b995e8ff635b34aeee6405fe2 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:15:21 -0500 Subject: [PATCH 116/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_launch_configdir_executable --- src/allmydata/test/test_connections.py | 16 ---------------- 1 file changed, 16 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 0496eb05c..c276393b2 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -189,22 +189,6 @@ class I2P(unittest.TestCase): self.assertEqual(f.mock_calls, [exp]) self.assertIdentical(h, h1) - def test_launch_configdir_and_executable(self): - config = config_from_string( - "no-basedir", - "fake.port", - BASECONFIG + "[i2p]\nlaunch = true\n" + - 
"i2p.executable = i2p\n" + "i2p.configdir = cfg\n", - ) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.launch", - return_value=h1) as f: - i2p_provider = create_i2p_provider(reactor, config) - h = i2p_provider.get_i2p_handler() - exp = mock.call(i2p_configdir="cfg", i2p_binary="i2p") - self.assertEqual(f.mock_calls, [exp]) - self.assertIdentical(h, h1) - def test_configdir(self): config = config_from_string( "fake.port", From 8271dbf3e65360d1be875d6aaebd13ca92416f59 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:15:51 -0500 Subject: [PATCH 117/186] Duplicate of allmydata.test.test_i2p_provider.Provider.test_handler_configdir --- src/allmydata/test/test_connections.py | 14 -------------- 1 file changed, 14 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index c276393b2..bfb84536c 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -189,20 +189,6 @@ class I2P(unittest.TestCase): self.assertEqual(f.mock_calls, [exp]) self.assertIdentical(h, h1) - def test_configdir(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[i2p]\ni2p.configdir = cfg\n", - ) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.local_i2p", - return_value=h1) as f: - i2p_provider = create_i2p_provider(None, config) - h = i2p_provider.get_i2p_handler() - - self.assertEqual(f.mock_calls, [mock.call("cfg")]) - self.assertIdentical(h, h1) class Connections(unittest.TestCase): From 61c76902cae05a19e484a8119a4ca236382b46aa Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 16 Dec 2020 11:16:34 -0500 Subject: [PATCH 118/186] Port to Python 3. 
--- src/allmydata/test/test_json_metadata.py | 11 +++++++++++ src/allmydata/util/_python3.py | 1 + 2 files changed, 12 insertions(+) diff --git a/src/allmydata/test/test_json_metadata.py b/src/allmydata/test/test_json_metadata.py index 75d4e1567..a0cb9c142 100644 --- a/src/allmydata/test/test_json_metadata.py +++ b/src/allmydata/test/test_json_metadata.py @@ -1,3 +1,14 @@ +""" +Ported to Python 3. +""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 from twisted.trial.unittest import TestCase diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 169c7b413..98e8568e2 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -153,6 +153,7 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_immutable", "allmydata.test.test_introducer", "allmydata.test.test_iputil", + "allmydata.test.test_json_metadata", "allmydata.test.test_log", "allmydata.test.test_monitor", "allmydata.test.test_netstring", From 7eb9f2ce5467143b95a71a5804a0b131e2edfe22 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:20:45 -0500 Subject: [PATCH 119/186] Moved into allmydata.test.test_i2p_provider This follows the local convention of using mock even though I'm trying to get rid of mock. This is because it keeps the test_i2p_provider suite consistent which means it won't make removing mock from test_i2p_provider later much harder and lets me avoid doing that work now. 
--- src/allmydata/test/test_connections.py | 16 ---------------- src/allmydata/test/test_i2p_provider.py | 14 ++++++++++++++ 2 files changed, 14 insertions(+), 16 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index bfb84536c..b35de60e1 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -174,22 +174,6 @@ class I2P(unittest.TestCase): str(ctx.exception) ) - def test_launch_executable(self): - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[i2p]\nlaunch = true\n" + "i2p.executable = i2p\n", - ) - h1 = mock.Mock() - with mock.patch("foolscap.connections.i2p.launch", - return_value=h1) as f: - i2p_provider = create_i2p_provider(reactor, config) - h = i2p_provider.get_i2p_handler() - exp = mock.call(i2p_configdir=None, i2p_binary="i2p") - self.assertEqual(f.mock_calls, [exp]) - self.assertIdentical(h, h1) - - class Connections(unittest.TestCase): def setUp(self): diff --git a/src/allmydata/test/test_i2p_provider.py b/src/allmydata/test/test_i2p_provider.py index a724b300e..37f2333f5 100644 --- a/src/allmydata/test/test_i2p_provider.py +++ b/src/allmydata/test/test_i2p_provider.py @@ -277,6 +277,20 @@ class Provider(unittest.TestCase): i2p.local_i2p.assert_called_with("configdir") self.assertIs(h, handler) + def test_handler_launch_executable(self): + i2p = mock.Mock() + handler = object() + i2p.launch = mock.Mock(return_value=handler) + reactor = object() + + with mock_i2p(i2p): + p = i2p_provider.create(reactor, + FakeConfig(launch=True, + **{"i2p.executable": "myi2p"})) + h = p.get_i2p_handler() + self.assertIs(h, handler) + i2p.launch.assert_called_with(i2p_configdir=None, i2p_binary="myi2p") + def test_handler_default(self): i2p = mock.Mock() handler = object() From 468895c74dc3823cea58de999903610817aaf058 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:28:36 -0500 Subject: [PATCH 120/186] Duplicate of 
allmydata.test.test_tor_provider.Provider.test_handler_control_endpoint --- src/allmydata/test/test_connections.py | 16 ---------------- 1 file changed, 16 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index b35de60e1..2989c10eb 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -141,22 +141,6 @@ class Tor(unittest.TestCase): str(ctx.exception) ) - def test_controlport(self): - h1 = mock.Mock() - with mock.patch("foolscap.connections.tor.control_endpoint", - return_value=h1) as f: - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[tor]\ncontrol.port = tcp:localhost:1234\n", - ) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - self.assertEqual(len(f.mock_calls), 1) - ep = f.mock_calls[0][1][0] - self.assertIsInstance(ep, endpoints.TCP4ClientEndpoint) - self.assertIdentical(h, h1) - class I2P(unittest.TestCase): def test_samport_and_launch(self): From 3d564f97d59619bac5c1ee976fec6a0ba76379f1 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:48:33 -0500 Subject: [PATCH 121/186] Switch away from mock in a few more simple cases in test_connections.py --- src/allmydata/test/test_connections.py | 48 +++++++++++++++++++++----- 1 file changed, 39 insertions(+), 9 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 2989c10eb..59652730f 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -1,8 +1,7 @@ -import os import mock from twisted.trial import unittest -from twisted.internet import reactor, endpoints, defer +from twisted.internet import reactor from twisted.internet.interfaces import IStreamClientEndpoint from foolscap.connections import tcp @@ -165,7 +164,11 @@ class Connections(unittest.TestCase): self.config = config_from_string("fake.port", self.basedir, 
BASECONFIG) def test_default(self): - default_connection_handlers, _ = create_connection_handlers(self.config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers( + self.config, + ConstantAddresses(handler=object()), + ConstantAddresses(handler=object()), + ) self.assertEqual(default_connection_handlers["tcp"], "tcp") self.assertEqual(default_connection_handlers["tor"], "tor") self.assertEqual(default_connection_handlers["i2p"], "i2p") @@ -176,7 +179,11 @@ class Connections(unittest.TestCase): "no-basedir", BASECONFIG + "[connections]\ntcp = tor\n", ) - default_connection_handlers, _ = create_connection_handlers(config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers( + config, + ConstantAddresses(handler=object()), + ConstantAddresses(handler=object()), + ) self.assertEqual(default_connection_handlers["tcp"], "tor") self.assertEqual(default_connection_handlers["tor"], "tor") @@ -207,7 +214,11 @@ class Connections(unittest.TestCase): BASECONFIG + "[connections]\ntcp = unknown\n", ) with self.assertRaises(ValueError) as ctx: - create_connection_handlers(config, mock.Mock(), mock.Mock()) + create_connection_handlers( + config, + ConstantAddresses(handler=object()), + ConstantAddresses(handler=object()), + ) self.assertIn("'tahoe.cfg [connections] tcp='", str(ctx.exception)) self.assertIn("uses unknown handler type 'unknown'", str(ctx.exception)) @@ -217,7 +228,11 @@ class Connections(unittest.TestCase): "no-basedir", BASECONFIG + "[connections]\ntcp = disabled\n", ) - default_connection_handlers, _ = create_connection_handlers(config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers( + config, + ConstantAddresses(handler=object()), + ConstantAddresses(handler=object()), + ) self.assertEqual(default_connection_handlers["tcp"], None) self.assertEqual(default_connection_handlers["tor"], "tor") self.assertEqual(default_connection_handlers["i2p"], "i2p") @@ 
-232,7 +247,11 @@ class Privacy(unittest.TestCase): ) with self.assertRaises(PrivacyError) as ctx: - create_connection_handlers(config, mock.Mock(), mock.Mock()) + create_connection_handlers( + config, + ConstantAddresses(handler=object()), + ConstantAddresses(handler=object()), + ) self.assertEqual( str(ctx.exception), @@ -247,7 +266,11 @@ class Privacy(unittest.TestCase): BASECONFIG + "[connections]\ntcp = disabled\n" + "[node]\nreveal-IP-address = false\n", ) - default_connection_handlers, _ = create_connection_handlers(config, mock.Mock(), mock.Mock()) + default_connection_handlers, _ = create_connection_handlers( + config, + ConstantAddresses(handler=object()), + ConstantAddresses(handler=object()), + ) self.assertEqual(default_connection_handlers["tcp"], None) def test_tub_location_auto(self): @@ -258,7 +281,14 @@ class Privacy(unittest.TestCase): ) with self.assertRaises(PrivacyError) as ctx: - create_main_tub(config, {}, {}, {}, mock.Mock(), mock.Mock()) + create_main_tub( + config, + tub_options={}, + default_connection_handlers={}, + foolscap_connection_handlers={}, + i2p_provider=ConstantAddresses(), + tor_provider=ConstantAddresses(), + ) self.assertEqual( str(ctx.exception), "tub.location uses AUTO", From 3d82ca0d25a6bf6facaac0896d9f9e938c103d78 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 11:50:35 -0500 Subject: [PATCH 122/186] Use boring old dependency injection to replace mocks in this test --- src/allmydata/test/test_connections.py | 30 +++++++++++++------ src/allmydata/util/tor_provider.py | 41 ++++++++++++++------------ 2 files changed, 43 insertions(+), 28 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 59652730f..107d99dc6 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -190,16 +190,28 @@ class Connections(unittest.TestCase): self.assertEqual(default_connection_handlers["i2p"], "i2p") def 
test_tor_unimportable(self): - with mock.patch("allmydata.util.tor_provider._import_tor", - return_value=None): - self.config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[connections]\ntcp = tor\n", + """ + If the configuration calls for substituting Tor for TCP and + ``foolscap.connections.tor`` is not importable then + ``create_connection_handlers`` raises ``ValueError`` with a message + explaining this makes Tor unusable. + """ + self.config = config_from_string( + "fake.port", + "no-basedir", + BASECONFIG + "[connections]\ntcp = tor\n", + ) + tor_provider = create_tor_provider( + reactor, + self.config, + import_tor=lambda: None, + ) + with self.assertRaises(ValueError) as ctx: + default_connection_handlers, _ = create_connection_handlers( + self.config, + i2p_provider=ConstantAddresses(handler=object()), + tor_provider=tor_provider, ) - with self.assertRaises(ValueError) as ctx: - tor_provider = create_tor_provider(reactor, self.config) - default_connection_handlers, _ = create_connection_handlers(self.config, mock.Mock(), tor_provider) self.assertEqual( str(ctx.exception), "'tahoe.cfg [connections] tcp='" diff --git a/src/allmydata/util/tor_provider.py b/src/allmydata/util/tor_provider.py index 42700b3bc..7b832735d 100644 --- a/src/allmydata/util/tor_provider.py +++ b/src/allmydata/util/tor_provider.py @@ -17,23 +17,7 @@ from ..interfaces import ( IAddressFamily, ) -def create(reactor, config): - """ - Create a new _Provider service (this is an IService so must be - hooked up to a parent or otherwise started). - - If foolscap.connections.tor or txtorcon are not installed, then - Provider.get_tor_handler() will return None. If tahoe.cfg wants - to start an onion service too, then this `create()` method will - throw a nice error (and startService will throw an ugly error). 
- """ - provider = _Provider(config, reactor) - provider.check_onion_config() - return provider - - def _import_tor(): - # this exists to be overridden by unit tests try: from foolscap.connections import tor return tor @@ -47,6 +31,25 @@ def _import_txtorcon(): except ImportError: # pragma: no cover return None +def create(reactor, config, import_tor=None, import_txtorcon=None): + """ + Create a new _Provider service (this is an IService so must be + hooked up to a parent or otherwise started). + + If foolscap.connections.tor or txtorcon are not installed, then + Provider.get_tor_handler() will return None. If tahoe.cfg wants + to start an onion service too, then this `create()` method will + throw a nice error (and startService will throw an ugly error). + """ + if import_tor is None: + import_tor = _import_tor + if import_txtorcon is None: + import_txtorcon = _import_txtorcon + provider = _Provider(config, reactor, import_tor(), import_txtorcon()) + provider.check_onion_config() + return provider + + def data_directory(private_dir): return os.path.join(private_dir, "tor-statedir") @@ -217,14 +220,14 @@ def create_config(reactor, cli_config): @implementer(IAddressFamily) class _Provider(service.MultiService): - def __init__(self, config, reactor): + def __init__(self, config, reactor, tor, txtorcon): service.MultiService.__init__(self) self._config = config self._tor_launched = None self._onion_ehs = None self._onion_tor_control_proto = None - self._tor = _import_tor() - self._txtorcon = _import_txtorcon() + self._tor = tor + self._txtorcon = txtorcon self._reactor = reactor def _get_tor_config(self, *args, **kwargs): From 9f28ccb2a4babf7a1ec9cef6a70c5049ff1f1033 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 12:07:54 -0500 Subject: [PATCH 123/186] Move the last three mock-using tests to test_tor_provider where they can be rewritten later --- src/allmydata/test/test_connections.py | 44 ------------------------- 
src/allmydata/test/test_tor_provider.py | 40 ++++++++++++++++++++++ 2 files changed, 40 insertions(+), 44 deletions(-) diff --git a/src/allmydata/test/test_connections.py b/src/allmydata/test/test_connections.py index 107d99dc6..7a24ac794 100644 --- a/src/allmydata/test/test_connections.py +++ b/src/allmydata/test/test_connections.py @@ -1,8 +1,6 @@ -import mock from twisted.trial import unittest from twisted.internet import reactor -from twisted.internet.interfaces import IStreamClientEndpoint from foolscap.connections import tcp @@ -66,48 +64,6 @@ class CreateConnectionHandlersTests(SyncTestCase): class Tor(unittest.TestCase): - def test_socksport_unix_endpoint(self): - h1 = mock.Mock() - with mock.patch("foolscap.connections.tor.socks_endpoint", - return_value=h1) as f: - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[tor]\nsocks.port = unix:/var/lib/fw-daemon/tor_socks.socket\n", - ) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - self.assertTrue(IStreamClientEndpoint.providedBy(f.mock_calls[0][1][0])) - self.assertIdentical(h, h1) - - def test_socksport_endpoint(self): - h1 = mock.Mock() - with mock.patch("foolscap.connections.tor.socks_endpoint", - return_value=h1) as f: - config = config_from_string( - "fake.port", - "no-basedir", - BASECONFIG + "[tor]\nsocks.port = tcp:127.0.0.1:1234\n", - ) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - self.assertTrue(IStreamClientEndpoint.providedBy(f.mock_calls[0][1][0])) - self.assertIdentical(h, h1) - - def test_socksport_endpoint_otherhost(self): - h1 = mock.Mock() - with mock.patch("foolscap.connections.tor.socks_endpoint", - return_value=h1) as f: - config = config_from_string( - "no-basedir", - "fake.port", - BASECONFIG + "[tor]\nsocks.port = tcp:otherhost:1234\n", - ) - tor_provider = create_tor_provider(reactor, config) - h = tor_provider.get_tor_handler() - 
self.assertTrue(IStreamClientEndpoint.providedBy(f.mock_calls[0][1][0]))
-        self.assertIdentical(h, h1)
-
     def test_socksport_bad_endpoint(self):
         config = config_from_string(
             "fake.port",
diff --git a/src/allmydata/test/test_tor_provider.py b/src/allmydata/test/test_tor_provider.py
index c6d950632..f5dd2e29c 100644
--- a/src/allmydata/test/test_tor_provider.py
+++ b/src/allmydata/test/test_tor_provider.py
@@ -369,6 +369,46 @@ class Provider(unittest.TestCase):
         tor.socks_endpoint.assert_called_with(ep)
         self.assertIs(h, handler)
 
+    def test_handler_socks_unix_endpoint(self):
+        """
+        ``socks.port`` can be configured as a UNIX client endpoint.
+        """
+        tor = mock.Mock()
+        handler = object()
+        tor.socks_endpoint = mock.Mock(return_value=handler)
+        ep = object()
+        cfs = mock.Mock(return_value=ep)
+        reactor = object()
+
+        with mock_tor(tor):
+            p = tor_provider.create(reactor,
+                                    FakeConfig(**{"socks.port": "unix:path"}))
+            with mock.patch("allmydata.util.tor_provider.clientFromString", cfs):
+                h = p.get_tor_handler()
+        cfs.assert_called_with(reactor, "unix:path")
+        tor.socks_endpoint.assert_called_with(ep)
+        self.assertIs(h, handler)
+
+    def test_handler_socks_tcp_endpoint(self):
+        """
+        ``socks.port`` can be configured as a TCP client endpoint.
+ """ + tor = mock.Mock() + handler = object() + tor.socks_endpoint = mock.Mock(return_value=handler) + ep = object() + cfs = mock.Mock(return_value=ep) + reactor = object() + + with mock_tor(tor): + p = tor_provider.create(reactor, + FakeConfig(**{"socks.port": "tcp:127.0.0.1:1234"})) + with mock.patch("allmydata.util.tor_provider.clientFromString", cfs): + h = p.get_tor_handler() + cfs.assert_called_with(reactor, "tcp:127.0.0.1:1234") + tor.socks_endpoint.assert_called_with(ep) + self.assertIs(h, handler) + def test_handler_control_endpoint(self): tor = mock.Mock() handler = object() From 8bb4d10d7f91a5d388621c120f626d200ed8432b Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 12:28:29 -0500 Subject: [PATCH 124/186] news fragment --- newsfragments/3529.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3529.minor diff --git a/newsfragments/3529.minor b/newsfragments/3529.minor new file mode 100644 index 000000000..e69de29bb From 67c0a4ac8430ff96fd4526a969cb9575c1cf9005 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 16 Dec 2020 13:53:49 -0500 Subject: [PATCH 125/186] Port another test module to Python 3. --- src/allmydata/test/web/test_util.py | 12 ++++++++++++ src/allmydata/util/_python3.py | 1 + src/allmydata/web/common.py | 18 +++++++++--------- src/allmydata/web/common_py3.py | 10 +++++----- 4 files changed, 27 insertions(+), 14 deletions(-) diff --git a/src/allmydata/test/web/test_util.py b/src/allmydata/test/web/test_util.py index 24f865ebc..5f4d6bb88 100644 --- a/src/allmydata/test/web/test_util.py +++ b/src/allmydata/test/web/test_util.py @@ -1,3 +1,15 @@ +""" +Ported to Python 3. 
+""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 + from twisted.trial import unittest from allmydata.web import status, common from ..common import ShouldFailMixin diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 98e8568e2..31796a71b 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -173,4 +173,5 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_uri", "allmydata.test.test_util", "allmydata.test.web.test_common", + "allmydata.test.web.test_util", ] diff --git a/src/allmydata/web/common.py b/src/allmydata/web/common.py index a1fd06c1c..4e8cab89d 100644 --- a/src/allmydata/web/common.py +++ b/src/allmydata/web/common.py @@ -210,26 +210,26 @@ def compute_rate(bytes, seconds): def abbreviate_rate(data): # 21.8kBps, 554.4kBps 4.37MBps if data is None: - return "" + return u"" r = float(data) if r > 1000000: - return "%1.2fMBps" % (r/1000000) + return u"%1.2fMBps" % (r/1000000) if r > 1000: - return "%.1fkBps" % (r/1000) - return "%.0fBps" % r + return u"%.1fkBps" % (r/1000) + return u"%.0fBps" % r def abbreviate_size(data): # 21.8kB, 554.4kB 4.37MB if data is None: - return "" + return u"" r = float(data) if r > 1000000000: - return "%1.2fGB" % (r/1000000000) + return u"%1.2fGB" % (r/1000000000) if r > 1000000: - return "%1.2fMB" % (r/1000000) + return u"%1.2fMB" % (r/1000000) if r > 1000: - return "%.1fkB" % (r/1000) - return "%.0fB" % r + return u"%.1fkB" % (r/1000) + return u"%.0fB" % r def plural(sequence_or_length): if isinstance(sequence_or_length, int): diff --git a/src/allmydata/web/common_py3.py b/src/allmydata/web/common_py3.py index 22f235790..89efd4d5d 100644 --- 
a/src/allmydata/web/common_py3.py +++ b/src/allmydata/web/common_py3.py @@ -97,14 +97,14 @@ class MultiFormatResource(resource.Resource, object): def abbreviate_time(data): # 1.23s, 790ms, 132us if data is None: - return "" + return u"" s = float(data) if s >= 10: return abbreviate.abbreviate_time(data) if s >= 1.0: - return "%.2fs" % s + return u"%.2fs" % s if s >= 0.01: - return "%.0fms" % (1000*s) + return u"%.0fms" % (1000*s) if s >= 0.001: - return "%.1fms" % (1000*s) - return "%.0fus" % (1000000*s) + return u"%.1fms" % (1000*s) + return u"%.0fus" % (1000000*s) From b5f2afe39cce90d3863934f1756336399be7fd75 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 16 Dec 2020 14:13:46 -0500 Subject: [PATCH 126/186] WIP porting test_status.py. --- src/allmydata/test/web/test_web.py | 10 +++++----- src/allmydata/web/status.py | 7 ++++--- 2 files changed, 9 insertions(+), 8 deletions(-) diff --git a/src/allmydata/test/web/test_web.py b/src/allmydata/test/web/test_web.py index 326569a26..f2b6d8f13 100644 --- a/src/allmydata/test/web/test_web.py +++ b/src/allmydata/test/web/test_web.py @@ -90,7 +90,7 @@ class FakeNodeMaker(NodeMaker): return FakeMutableFileNode(None, None, self.encoding_params, None, self.all_contents).init_from_cap(cap) - def create_mutable_file(self, contents="", keysize=None, + def create_mutable_file(self, contents=b"", keysize=None, version=SDMF_VERSION): n = FakeMutableFileNode(None, None, self.encoding_params, None, self.all_contents) @@ -105,7 +105,7 @@ class FakeUploader(service.Service): d = uploadable.get_size() d.addCallback(lambda size: uploadable.read(size)) def _got_data(datav): - data = "".join(datav) + data = b"".join(datav) n = create_chk_filenode(data, self.all_contents) ur = upload.UploadResults(file_size=len(data), ciphertext_fetched=0, @@ -130,9 +130,9 @@ def build_one_ds(): ds = DownloadStatus("storage_index", 1234) now = time.time() - serverA = StubServer(hashutil.tagged_hash("foo", "serverid_a")[:20]) - serverB = 
StubServer(hashutil.tagged_hash("foo", "serverid_b")[:20]) - storage_index = hashutil.storage_index_hash("SI") + serverA = StubServer(hashutil.tagged_hash(b"foo", b"serverid_a")[:20]) + serverB = StubServer(hashutil.tagged_hash(b"foo", b"serverid_b")[:20]) + storage_index = hashutil.storage_index_hash(b"SI") e0 = ds.add_segment_request(0, now) e0.activate(now+0.5) e0.deliver(now+1, 0, 100, 0.5) # when, start,len, decodetime diff --git a/src/allmydata/web/status.py b/src/allmydata/web/status.py index ec55b73eb..3284dfda6 100644 --- a/src/allmydata/web/status.py +++ b/src/allmydata/web/status.py @@ -1,3 +1,4 @@ +from past.builtins import long, unicode import pprint import itertools @@ -1297,6 +1298,7 @@ class Status(MultiFormatResource): except ValueError: raise WebError("no '-' in '{}'".format(path)) count = int(count_s) + stype = unicode(stype, "ascii") if stype == "up": for s in itertools.chain(h.list_all_upload_statuses(), h.list_all_helper_statuses()): @@ -1335,7 +1337,7 @@ class Status(MultiFormatResource): active = [s for s in self._get_all_statuses() if s.get_active()] - active.sort(lambda a, b: cmp(a.get_started(), b.get_started())) + active.sort(key=lambda a: a.get_started()) active.reverse() return active @@ -1343,7 +1345,7 @@ class Status(MultiFormatResource): recent = [s for s in self._get_all_statuses() if not s.get_active()] - recent.sort(lambda a, b: cmp(a.get_started(), b.get_started())) + recent.sort(key=lambda a: a.get_started()) recent.reverse() return recent @@ -1373,7 +1375,6 @@ class StatusElement(Element): started_s = render_time(op.get_started()) result["started"] = started_s - si_s = base32.b2a_or_none(op.get_storage_index()) if si_s is None: si_s = "(None)" From 83ebaef86cb90aa01b1216a007b171b8a92ced52 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 15:24:33 -0500 Subject: [PATCH 127/186] Stop mocking safe_load The comment implies this will cause something to break on some platform. Let's find out. 
--- src/allmydata/test/test_introducer.py | 15 ++++----------- 1 file changed, 4 insertions(+), 11 deletions(-) diff --git a/src/allmydata/test/test_introducer.py b/src/allmydata/test/test_introducer.py index b14b66ffb..ac587e51b 100644 --- a/src/allmydata/test/test_introducer.py +++ b/src/allmydata/test/test_introducer.py @@ -15,7 +15,7 @@ from six import ensure_binary, ensure_text import os, re, itertools from base64 import b32decode import json -from mock import Mock, patch +from mock import Mock from testtools.matchers import ( Is, @@ -94,17 +94,10 @@ class Node(testutil.SignalMixin, testutil.ReallyEqualMixin, AsyncTestCase): f.write(u'---\n') os.chmod(yaml_fname, 0o000) self.addCleanup(lambda: os.chmod(yaml_fname, 0o700)) - # just mocking the yaml failure, as "yamlutil.safe_load" only - # returns None on some platforms for unreadable files - with patch("allmydata.client.yamlutil") as p: - p.safe_load = Mock(return_value=None) - - fake_tub = Mock() - config = read_config(basedir, "portnum") - - with self.assertRaises(EnvironmentError): - create_introducer_clients(config, fake_tub) + config = read_config(basedir, "portnum") + with self.assertRaises(EnvironmentError): + create_introducer_clients(config, Tub()) @defer.inlineCallbacks def test_furl(self): From 3513e9b4fce500a358f5c517708cc33837b4abf7 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 15:25:11 -0500 Subject: [PATCH 128/186] news fragment --- newsfragments/3534.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3534.minor diff --git a/newsfragments/3534.minor b/newsfragments/3534.minor new file mode 100644 index 000000000..e69de29bb From 60e401ca697abfd910dc841c900025524b90846f Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 16:19:33 -0500 Subject: [PATCH 129/186] Make ObserverList synchronous, reentrant, and exception safe with tests --- src/allmydata/test/test_observer.py | 40 +++++++++++++++++++++++++++++ 
src/allmydata/util/observer.py | 15 ++++++++--- 2 files changed, 52 insertions(+), 3 deletions(-) diff --git a/src/allmydata/test/test_observer.py b/src/allmydata/test/test_observer.py index 0db13db58..73eece98e 100644 --- a/src/allmydata/test/test_observer.py +++ b/src/allmydata/test/test_observer.py @@ -101,3 +101,43 @@ class Observer(unittest.TestCase): d.addCallback(_step2) d.addCallback(_check2) return d + + def test_observer_list_reentrant(self): + """ + ``ObserverList`` is reentrant. + """ + observed = [] + + def observer_one(): + obs.unsubscribe(observer_one) + + def observer_two(): + observed.append(None) + + obs = observer.ObserverList() + obs.subscribe(observer_one) + obs.subscribe(observer_two) + obs.notify() + + self.assertEqual([None], observed) + + def test_observer_list_observer_errors(self): + """ + An error in an earlier observer does not prevent notification from being + delivered to a later observer. + """ + observed = [] + + def observer_one(): + raise Exception("Some problem here") + + def observer_two(): + observed.append(None) + + obs = observer.ObserverList() + obs.subscribe(observer_one) + obs.subscribe(observer_two) + obs.notify() + + self.assertEqual([None], observed) + self.assertEqual(1, len(self.flushLoggedErrors(Exception))) diff --git a/src/allmydata/util/observer.py b/src/allmydata/util/observer.py index 432aabb87..ad55e65a5 100644 --- a/src/allmydata/util/observer.py +++ b/src/allmydata/util/observer.py @@ -16,6 +16,9 @@ if PY2: import weakref from twisted.internet import defer from foolscap.api import eventually +from twisted.logger import ( + Logger, +) """The idiom we use is for the observed object to offer a method named 'when_something', which returns a deferred. 
That deferred will be fired when @@ -97,7 +100,10 @@ class LazyOneShotObserverList(OneShotObserverList): self._fire(self._get_result()) class ObserverList(object): - """A simple class to distribute events to a number of subscribers.""" + """ + Immediately distribute events to a number of subscribers. + """ + _logger = Logger() def __init__(self): self._watchers = [] @@ -109,8 +115,11 @@ class ObserverList(object): self._watchers.remove(observer) def notify(self, *args, **kwargs): - for o in self._watchers: - eventually(o, *args, **kwargs) + for o in self._watchers[:]: + try: + o(*args, **kwargs) + except: + self._logger.failure("While notifying {o!r}", o=o) class EventStreamObserver(object): """A simple class to distribute multiple events to a single subscriber. From b2c9296f6befc5e0046c165bb5b0d1b628cadf69 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 16:20:00 -0500 Subject: [PATCH 130/186] Use ObserverList instead of an ad hoc reimplementation --- src/allmydata/introducer/client.py | 23 +++++++++++++---------- 1 file changed, 13 insertions(+), 10 deletions(-) diff --git a/src/allmydata/introducer/client.py b/src/allmydata/introducer/client.py index fa1e1efe8..60a06ce2d 100644 --- a/src/allmydata/introducer/client.py +++ b/src/allmydata/introducer/client.py @@ -24,6 +24,9 @@ from allmydata.introducer.common import sign_to_foolscap, unsign_from_foolscap,\ get_tubid_string_from_ann from allmydata.util import log, yamlutil, connection_status from allmydata.util.rrefutil import add_version_to_remote_reference +from allmydata.util.observer import ( + ObserverList, +) from allmydata.crypto.error import BadSignature from allmydata.util.assertutil import precondition @@ -64,8 +67,7 @@ class IntroducerClient(service.Service, Referenceable): self._publisher = None self._since = None - self._local_subscribers = [] # (servicename,cb,args,kwargs) tuples - self._subscribed_service_names = set() + self._local_subscribers = {} # {servicename: 
ObserverList} self._subscriptions = set() # requests we've actually sent # _inbound_announcements remembers one announcement per @@ -179,21 +181,21 @@ class IntroducerClient(service.Service, Referenceable): return log.msg(*args, **kwargs) def subscribe_to(self, service_name, cb, *args, **kwargs): - self._local_subscribers.append( (service_name,cb,args,kwargs) ) - self._subscribed_service_names.add(service_name) + obs = self._local_subscribers.setdefault(service_name, ObserverList()) + obs.subscribe(lambda key_s, ann: cb(key_s, ann, *args, **kwargs)) self._maybe_subscribe() for index,(ann,key_s,when) in list(self._inbound_announcements.items()): precondition(isinstance(key_s, bytes), key_s) servicename = index[0] if servicename == service_name: - eventually(cb, key_s, ann, *args, **kwargs) + obs.notify(key_s, ann) def _maybe_subscribe(self): if not self._publisher: self.log("want to subscribe, but no introducer yet", level=log.NOISY) return - for service_name in self._subscribed_service_names: + for service_name in self._local_subscribers: if service_name in self._subscriptions: continue self._subscriptions.add(service_name) @@ -272,7 +274,7 @@ class IntroducerClient(service.Service, Referenceable): precondition(isinstance(key_s, bytes), key_s) self._debug_counts["inbound_announcement"] += 1 service_name = str(ann["service-name"]) - if service_name not in self._subscribed_service_names: + if service_name not in self._local_subscribers: self.log("announcement for a service we don't care about [%s]" % (service_name,), level=log.UNUSUAL, umid="dIpGNA") self._debug_counts["wrong_service"] += 1 @@ -343,9 +345,10 @@ class IntroducerClient(service.Service, Referenceable): def _deliver_announcements(self, key_s, ann): precondition(isinstance(key_s, bytes), key_s) service_name = str(ann["service-name"]) - for (service_name2,cb,args,kwargs) in self._local_subscribers: - if service_name2 == service_name: - eventually(cb, key_s, ann, *args, **kwargs) + + obs = 
self._local_subscribers.get(service_name) + if obs is not None: + obs.notify(key_s, ann) def connection_status(self): assert self.running # startService builds _introducer_reconnector From 98000c2b66a7a4a5b68e0abddb41abfc2ea9544d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 16:20:38 -0500 Subject: [PATCH 131/186] re-implement test_unsigned_announcement without mock and to make assertions about public behavior instead of private implementation details --- src/allmydata/test/test_introducer.py | 57 +++++++++++++++++++++------ 1 file changed, 45 insertions(+), 12 deletions(-) diff --git a/src/allmydata/test/test_introducer.py b/src/allmydata/test/test_introducer.py index ac587e51b..668d577db 100644 --- a/src/allmydata/test/test_introducer.py +++ b/src/allmydata/test/test_introducer.py @@ -15,7 +15,12 @@ from six import ensure_binary, ensure_text import os, re, itertools from base64 import b32decode import json -from mock import Mock +from operator import ( + setitem, +) +from functools import ( + partial, +) from testtools.matchers import ( Is, @@ -84,7 +89,8 @@ class Node(testutil.SignalMixin, testutil.ReallyEqualMixin, AsyncTestCase): def test_introducer_clients_unloadable(self): """ - Error if introducers.yaml exists but we can't read it + ``create_introducer_clients`` raises ``EnvironmentError`` if + ``introducers.yaml`` exists but we can't read it. """ basedir = u"introducer.IntroducerNode.test_introducer_clients_unloadable" os.mkdir(basedir) @@ -1030,23 +1036,50 @@ class Signatures(SyncTestCase): unsign_from_foolscap, (bad_msg, sig, b"v999-key")) def test_unsigned_announcement(self): - ed25519.verifying_key_from_string(b"pub-v0-wodst6ly4f7i7akt2nxizsmmy2rlmer6apltl56zctn67wfyu5tq") - mock_tub = Mock() + """ + An incorrectly signed announcement is not delivered to subscribers. 
+ """ + private_key, public_key = ed25519.create_signing_keypair() + public_key_str = ed25519.string_from_verifying_key(public_key) + ic = IntroducerClient( - mock_tub, + Tub(), "pb://", u"fake_nick", "0.0.0", "1.2.3", (0, u"i am a nonce"), - "invalid", + FilePath(self.mktemp()), + ) + received = {} + ic.subscribe_to("good-stuff", partial(setitem, received)) + + # Deliver a good message to prove our test code is valid. + ann = {"service-name": "good-stuff", "payload": "hello"} + ann_t = sign_to_foolscap(ann, private_key) + ic.got_announcements([ann_t]) + + self.assertEqual( + {public_key_str[len("pub-"):]: ann}, + received, + ) + received.clear() + + # Now deliver one without a valid signature and observe that it isn't + # delivered to the subscriber. + ann = {"service-name": "good-stuff", "payload": "bad stuff"} + (msg, sig, key) = sign_to_foolscap(ann, private_key) + # Invalidate the signature a little + sig = sig.replace(b"2", b"3") + ann_t = (msg, sig, key) + ic.got_announcements([ann_t]) + + # The received announcements dict should remain empty because we + # should not receive the announcement with the invalid signature. 
+ self.assertEqual( + {}, + received, ) - self.assertEqual(0, ic._debug_counts["inbound_announcement"]) - ic.got_announcements([ - (b"message", b"v0-aaaaaaa", b"v0-wodst6ly4f7i7akt2nxizsmmy2rlmer6apltl56zctn67wfyu5tq") - ]) - # we should have rejected this announcement due to a bad signature - self.assertEqual(0, ic._debug_counts["inbound_announcement"]) # add tests of StorageFarmBroker: if it receives duplicate announcements, it From b200075246ee8d69e19d852fd44cdaa87b117908 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 16:23:05 -0500 Subject: [PATCH 132/186] whitespace --- src/allmydata/introducer/client.py | 1 - 1 file changed, 1 deletion(-) diff --git a/src/allmydata/introducer/client.py b/src/allmydata/introducer/client.py index 60a06ce2d..af84d12c3 100644 --- a/src/allmydata/introducer/client.py +++ b/src/allmydata/introducer/client.py @@ -345,7 +345,6 @@ class IntroducerClient(service.Service, Referenceable): def _deliver_announcements(self, key_s, ann): precondition(isinstance(key_s, bytes), key_s) service_name = str(ann["service-name"]) - obs = self._local_subscribers.get(service_name) if obs is not None: obs.notify(key_s, ann) From 4117beba6a3eb2f142e089c55fc04a840d1d1e24 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 16:25:51 -0500 Subject: [PATCH 133/186] remove unused import yaaay --- src/allmydata/introducer/client.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/introducer/client.py b/src/allmydata/introducer/client.py index af84d12c3..de24f9cf3 100644 --- a/src/allmydata/introducer/client.py +++ b/src/allmydata/introducer/client.py @@ -16,7 +16,7 @@ from six import ensure_text import time from zope.interface import implementer from twisted.application import service -from foolscap.api import Referenceable, eventually +from foolscap.api import Referenceable from allmydata.interfaces import InsufficientVersionError from allmydata.introducer.interfaces import 
IIntroducerClient, \ RIIntroducerSubscriberClient_v2 From a223f6bb60a9e9c3e7a25613c479c36d2d2ca74f Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 17:31:06 -0500 Subject: [PATCH 134/186] More reliably corrupt the signature --- src/allmydata/test/test_introducer.py | 7 +++++-- 1 file changed, 5 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/test_introducer.py b/src/allmydata/test/test_introducer.py index 668d577db..1ba928257 100644 --- a/src/allmydata/test/test_introducer.py +++ b/src/allmydata/test/test_introducer.py @@ -1069,8 +1069,11 @@ class Signatures(SyncTestCase): # delivered to the subscriber. ann = {"service-name": "good-stuff", "payload": "bad stuff"} (msg, sig, key) = sign_to_foolscap(ann, private_key) - # Invalidate the signature a little - sig = sig.replace(b"2", b"3") + # Drop a base32 word from the middle of the key to invalidate the + # signature. + sig_l = list(sig) + sig_l[20:22] = [] + sig = b"".join(sig_l) ann_t = (msg, sig, key) ic.got_announcements([ann_t]) From 895ba55cf7759ed7a76eb77b435bbd038fe6a759 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 18:17:14 -0500 Subject: [PATCH 135/186] Python 3 compatibility --- src/allmydata/test/test_introducer.py | 6 +++--- src/allmydata/util/base32.py | 4 +++- 2 files changed, 6 insertions(+), 4 deletions(-) diff --git a/src/allmydata/test/test_introducer.py b/src/allmydata/test/test_introducer.py index 1ba928257..0475d3f6c 100644 --- a/src/allmydata/test/test_introducer.py +++ b/src/allmydata/test/test_introducer.py @@ -1071,9 +1071,9 @@ class Signatures(SyncTestCase): (msg, sig, key) = sign_to_foolscap(ann, private_key) # Drop a base32 word from the middle of the key to invalidate the # signature. 
- sig_l = list(sig) - sig_l[20:22] = [] - sig = b"".join(sig_l) + sig_a = bytearray(sig) + sig_a[20:22] = [] + sig = bytes(sig_a) ann_t = (msg, sig, key) ic.got_announcements([ann_t]) diff --git a/src/allmydata/util/base32.py b/src/allmydata/util/base32.py index 287d214ea..41c0f0413 100644 --- a/src/allmydata/util/base32.py +++ b/src/allmydata/util/base32.py @@ -140,7 +140,9 @@ def a2b(cs): # Add padding back, to make Python's base64 module happy: while (len(cs) * 5) % 8 != 0: cs += b"=" - return base64.b32decode(cs) + # Let newbytes come through and still work on Python 2, where the base64 + # module gets confused by them. + return base64.b32decode(backwardscompat_bytes(cs)) __all__ = ["b2a", "a2b", "b2a_or_none", "BASE32CHAR_3bits", "BASE32CHAR_1bits", "BASE32CHAR", "BASE32STR_anybytes", "could_be_base32_encoded"] From 0ffbc7870ee4f0f319f594acd8cb47aa289d51e7 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Wed, 16 Dec 2020 20:32:04 -0500 Subject: [PATCH 136/186] Okay, let KeyboardInterrupt through --- src/allmydata/test/test_observer.py | 13 +++++++++++++ src/allmydata/util/observer.py | 2 +- 2 files changed, 14 insertions(+), 1 deletion(-) diff --git a/src/allmydata/test/test_observer.py b/src/allmydata/test/test_observer.py index 73eece98e..134876be3 100644 --- a/src/allmydata/test/test_observer.py +++ b/src/allmydata/test/test_observer.py @@ -141,3 +141,16 @@ class Observer(unittest.TestCase): self.assertEqual([None], observed) self.assertEqual(1, len(self.flushLoggedErrors(Exception))) + + def test_observer_list_propagate_keyboardinterrupt(self): + """ + ``KeyboardInterrupt`` escapes ``ObserverList.notify``. 
+ """ + def observer_one(): + raise KeyboardInterrupt() + + obs = observer.ObserverList() + obs.subscribe(observer_one) + + with self.assertRaises(KeyboardInterrupt): + obs.notify() diff --git a/src/allmydata/util/observer.py b/src/allmydata/util/observer.py index ad55e65a5..4a39fe014 100644 --- a/src/allmydata/util/observer.py +++ b/src/allmydata/util/observer.py @@ -118,7 +118,7 @@ class ObserverList(object): for o in self._watchers[:]: try: o(*args, **kwargs) - except: + except Exception: self._logger.failure("While notifying {o!r}", o=o) class EventStreamObserver(object): From 33392502d3e1e96a27fa187f39d076feadecdb87 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 17 Dec 2020 09:41:14 -0500 Subject: [PATCH 137/186] server IDs/node IDS should be bytes. --- src/allmydata/storage_client.py | 1 + src/allmydata/test/web/test_status.py | 10 +++++----- src/allmydata/test/web/test_web.py | 4 ++-- 3 files changed, 8 insertions(+), 7 deletions(-) diff --git a/src/allmydata/storage_client.py b/src/allmydata/storage_client.py index 294a2d215..eb1572dcb 100644 --- a/src/allmydata/storage_client.py +++ b/src/allmydata/storage_client.py @@ -464,6 +464,7 @@ class StorageFarmBroker(service.MultiService): @implementer(IDisplayableServer) class StubServer(object): def __init__(self, serverid): + assert isinstance(serverid, bytes) self.serverid = serverid # binary tubid def get_serverid(self): return self.serverid diff --git a/src/allmydata/test/web/test_status.py b/src/allmydata/test/web/test_status.py index 5685a3938..d2ab5a606 100644 --- a/src/allmydata/test/web/test_status.py +++ b/src/allmydata/test/web/test_status.py @@ -143,12 +143,12 @@ class DownloadStatusElementTests(TrialTestCase): See if we can render the page almost fully. 
""" status = FakeDownloadStatus( - "si-1", 123, - ["s-1", "s-2", "s-3"], - {"s-1": "unknown problem"}, - {"s-1": [1], "s-2": [1,2], "s-3": [2,3]}, + b"si-1", 123, + [b"s-1", b"s-2", b"s-3"], + {b"s-1": "unknown problem"}, + {b"s-1": [1], b"s-2": [1,2], b"s-3": [2,3]}, {"fetch_per_server": - {"s-1": [1], "s-2": [2,3], "s-3": [3,2]}} + {b"s-1": [1], b"s-2": [2,3], b"s-3": [3,2]}} ) result = self._render_download_status_element(status) diff --git a/src/allmydata/test/web/test_web.py b/src/allmydata/test/web/test_web.py index f2b6d8f13..48c477a47 100644 --- a/src/allmydata/test/web/test_web.py +++ b/src/allmydata/test/web/test_web.py @@ -261,7 +261,7 @@ class FakeClient(_Client): # minimal subset service.MultiService.__init__(self) self.all_contents = {} - self.nodeid = "fake_nodeid" + self.nodeid = b"fake_nodeid" self.nickname = u"fake_nickname \u263A" self.introducer_furls = [] self.introducer_clients = [] @@ -277,7 +277,7 @@ class FakeClient(_Client): # fake knowledge of another server self.storage_broker.test_add_server("other_nodeid", FakeDisplayableServer( - serverid="other_nodeid", nickname=u"other_nickname \u263B", connected = True, + serverid=b"other_nodeid", nickname=u"other_nickname \u263B", connected = True, last_connect_time = 10, last_loss_time = 20, last_rx_time = 30)) self.storage_broker.test_add_server("disconnected_nodeid", FakeDisplayableServer( From 3ac64e42f71355aafb6ceb97720eec5303522f7d Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 17 Dec 2020 09:54:04 -0500 Subject: [PATCH 138/186] Web test_status tests pass on Python 3. 
--- src/allmydata/test/web/test_web.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/web/test_web.py b/src/allmydata/test/web/test_web.py index 48c477a47..1908afdeb 100644 --- a/src/allmydata/test/web/test_web.py +++ b/src/allmydata/test/web/test_web.py @@ -127,7 +127,7 @@ class FakeUploader(service.Service): def build_one_ds(): - ds = DownloadStatus("storage_index", 1234) + ds = DownloadStatus(b"storage_index", 1234) now = time.time() serverA = StubServer(hashutil.tagged_hash(b"foo", b"serverid_a")[:20]) From 6e12cce1e49ac90f2925ed53953c24c8751d908e Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 17 Dec 2020 09:55:35 -0500 Subject: [PATCH 139/186] Port to Python 3. --- src/allmydata/test/web/test_status.py | 10 ++++++++++ src/allmydata/util/_python3.py | 1 + 2 files changed, 11 insertions(+) diff --git a/src/allmydata/test/web/test_status.py b/src/allmydata/test/web/test_status.py index d2ab5a606..414925446 100644 --- a/src/allmydata/test/web/test_status.py +++ b/src/allmydata/test/web/test_status.py @@ -1,6 +1,16 @@ """ Tests for ```allmydata.web.status```. + +Ported to Python 3. 
""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 from bs4 import BeautifulSoup from twisted.web.template import flattenString diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index 803835fae..af771cd5a 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -181,4 +181,5 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_util", "allmydata.test.web.test_common", "allmydata.test.web.test_util", + "allmydata.test.web.test_status", ] From 48b9ffe2a5985691e56ecb78cdef56b20eba7c2c Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Thu, 17 Dec 2020 09:55:48 -0500 Subject: [PATCH 140/186] News file. --- newsfragments/3565.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3565.minor diff --git a/newsfragments/3565.minor b/newsfragments/3565.minor new file mode 100644 index 000000000..e69de29bb From f0359f106c2631e7ac671c9a496a8d6de0fc675d Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Thu, 17 Dec 2020 10:20:17 -0500 Subject: [PATCH 141/186] news fragment --- newsfragments/3567.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3567.minor diff --git a/newsfragments/3567.minor b/newsfragments/3567.minor new file mode 100644 index 000000000..e69de29bb From 8e6c52b61e2208a43ec0bcf1098f76c6b227eed2 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Thu, 17 Dec 2020 10:20:22 -0500 Subject: [PATCH 142/186] pre-assign a listening socket to the main tub to avoid the error --- src/allmydata/test/test_storage_client.py | 11 ++++++++--- 1 file changed, 8 insertions(+), 3 deletions(-) diff --git 
a/src/allmydata/test/test_storage_client.py b/src/allmydata/test/test_storage_client.py index 18caccc5d..3a21dfd9e 100644 --- a/src/allmydata/test/test_storage_client.py +++ b/src/allmydata/test/test_storage_client.py @@ -457,7 +457,8 @@ class StoragePluginWebPresence(AsyncTestCase): self.storage_plugin = u"tahoe-lafs-dummy-v1" from twisted.internet import reactor - _, port_endpoint = self.port_assigner.assign(reactor) + _, webport_endpoint = self.port_assigner.assign(reactor) + tubport_location, tubport_endpoint = self.port_assigner.assign(reactor) tempdir = TempDir() self.useFixture(tempdir) @@ -468,8 +469,12 @@ class StoragePluginWebPresence(AsyncTestCase): "web": "1", }, node_config={ - "tub.location": "127.0.0.1:1", - "web.port": ensure_text(port_endpoint), + # We don't really need the main Tub listening but if we + # disable it then we also have to disable storage (because + # config validation policy). + "tub.port": tubport_endpoint, + "tub.location": tubport_location, + "web.port": ensure_text(webport_endpoint), }, storage_plugin=self.storage_plugin, basedir=self.basedir, From 05699722124f1fa41eae1a23133b0a2468528d11 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Thu, 17 Dec 2020 12:40:12 -0500 Subject: [PATCH 143/186] news fragment --- newsfragments/3568.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3568.minor diff --git a/newsfragments/3568.minor b/newsfragments/3568.minor new file mode 100644 index 000000000..e69de29bb From ad1e38eea432dd2041be26ee7df0d29404b0d5a0 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Thu, 17 Dec 2020 12:40:20 -0500 Subject: [PATCH 144/186] Can we make codecov wait a while? 
--- .codecov.yml | 3 +++ 1 file changed, 3 insertions(+) diff --git a/.codecov.yml b/.codecov.yml index df1eb5e01..310a40743 100644 --- a/.codecov.yml +++ b/.codecov.yml @@ -40,3 +40,6 @@ codecov: # repositories have coverage uploaded to the right place in codecov so # their reports aren't incomplete. token: "abf679b6-e2e6-4b33-b7b5-6cfbd41ee691" + + notify: + after_n_builds: 23 From 29d54aee78fc43222e12a4cacf123a95c0ade0dc Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Thu, 17 Dec 2020 13:08:33 -0500 Subject: [PATCH 145/186] What does this setting do? --- .codecov.yml | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/.codecov.yml b/.codecov.yml index 310a40743..166190c5e 100644 --- a/.codecov.yml +++ b/.codecov.yml @@ -42,4 +42,7 @@ codecov: token: "abf679b6-e2e6-4b33-b7b5-6cfbd41ee691" notify: - after_n_builds: 23 + # The reference documentation suggests that this is the default setting: + # https://docs.codecov.io/docs/codecovyml-reference#codecovnotifywait_for_ci + # However observation suggests otherwise. + wait_for_ci: true From b24a9f70832984a942a017a3190cd95c0df75df2 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 11:21:04 -0500 Subject: [PATCH 146/186] Trying to get test_grid working on Python 3. 
--- src/allmydata/test/no_network.py | 1 - src/allmydata/test/web/test_grid.py | 123 ++++++++++++++-------------- src/allmydata/web/common.py | 9 +- src/allmydata/web/common_py3.py | 8 ++ src/allmydata/web/filenode.py | 14 ++-- src/allmydata/web/root.py | 5 +- src/allmydata/webish.py | 3 +- 7 files changed, 89 insertions(+), 74 deletions(-) diff --git a/src/allmydata/test/no_network.py b/src/allmydata/test/no_network.py index 59ab807bb..9a2713830 100644 --- a/src/allmydata/test/no_network.py +++ b/src/allmydata/test/no_network.py @@ -614,7 +614,6 @@ class GridTestMixin(object): method="GET", clientnum=0, **kwargs): # if return_response=True, this fires with (data, statuscode, # respheaders) instead of just data. - assert not isinstance(urlpath, unicode) url = self.client_baseurls[clientnum] + urlpath response = yield treq.request(method, url, persistent=False, diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 8f61781d4..04c3edbac 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -1,6 +1,7 @@ from __future__ import print_function -import os.path, re, urllib +import os.path, re +from urllib.parse import quote as url_quote import json from six.moves import StringIO @@ -37,7 +38,7 @@ DIR_HTML_TAG = '' class CompletelyUnhandledError(Exception): pass -class ErrorBoom(object, resource.Resource): +class ErrorBoom(resource.Resource): @render_exception def render(self, req): raise CompletelyUnhandledError("whoops") @@ -54,25 +55,25 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.set_up_grid() c0 = self.g.clients[0] self.uris = {} - DATA = "data" * 100 - d = c0.upload(upload.Data(DATA, convergence="")) + DATA = b"data" * 100 + d = c0.upload(upload.Data(DATA, convergence=b"")) def _stash_uri(ur, which): self.uris[which] = ur.get_uri() d.addCallback(_stash_uri, "good") d.addCallback(lambda ign: - c0.upload(upload.Data(DATA+"1", convergence=""))) + 
c0.upload(upload.Data(DATA+b"1", convergence=b""))) d.addCallback(_stash_uri, "sick") d.addCallback(lambda ign: - c0.upload(upload.Data(DATA+"2", convergence=""))) + c0.upload(upload.Data(DATA+b"2", convergence=b""))) d.addCallback(_stash_uri, "dead") def _stash_mutable_uri(n, which): self.uris[which] = n.get_uri() - assert isinstance(self.uris[which], str) + assert isinstance(self.uris[which], bytes) d.addCallback(lambda ign: - c0.create_mutable_file(publish.MutableData(DATA+"3"))) + c0.create_mutable_file(publish.MutableData(DATA+b"3"))) d.addCallback(_stash_mutable_uri, "corrupt") d.addCallback(lambda ign: - c0.upload(upload.Data("literal", convergence=""))) + c0.upload(upload.Data("literal", convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: c0.create_immutable_dirnode({})) d.addCallback(_stash_mutable_uri, "smalldir") @@ -80,7 +81,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def _compute_fileurls(ignored): self.fileurls = {} for which in self.uris: - self.fileurls[which] = "uri/" + urllib.quote(self.uris[which]) + self.fileurls[which] = "uri/" + url_quote(self.uris[which]) d.addCallback(_compute_fileurls) def _clobber_shares(ignored): @@ -203,28 +204,28 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.set_up_grid() c0 = self.g.clients[0] self.uris = {} - DATA = "data" * 100 - d = c0.upload(upload.Data(DATA, convergence="")) + DATA = b"data" * 100 + d = c0.upload(upload.Data(DATA, convergence=b"")) def _stash_uri(ur, which): self.uris[which] = ur.get_uri() d.addCallback(_stash_uri, "good") d.addCallback(lambda ign: - c0.upload(upload.Data(DATA+"1", convergence=""))) + c0.upload(upload.Data(DATA+b"1", convergence=b""))) d.addCallback(_stash_uri, "sick") d.addCallback(lambda ign: - c0.upload(upload.Data(DATA+"2", convergence=""))) + c0.upload(upload.Data(DATA+b"2", convergence=b""))) d.addCallback(_stash_uri, "dead") def _stash_mutable_uri(n, which): 
self.uris[which] = n.get_uri() - assert isinstance(self.uris[which], str) + assert isinstance(self.uris[which], bytes) d.addCallback(lambda ign: - c0.create_mutable_file(publish.MutableData(DATA+"3"))) + c0.create_mutable_file(publish.MutableData(DATA+b"3"))) d.addCallback(_stash_mutable_uri, "corrupt") def _compute_fileurls(ignored): self.fileurls = {} for which in self.uris: - self.fileurls[which] = "uri/" + urllib.quote(self.uris[which]) + self.fileurls[which] = "uri/" + url_quote(self.uris[which]) d.addCallback(_compute_fileurls) def _clobber_shares(ignored): @@ -286,8 +287,8 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.set_up_grid() c0 = self.g.clients[0] self.uris = {} - DATA = "data" * 100 - d = c0.upload(upload.Data(DATA+"1", convergence="")) + DATA = b"data" * 100 + d = c0.upload(upload.Data(DATA+b"1", convergence=b"")) def _stash_uri(ur, which): self.uris[which] = ur.get_uri() d.addCallback(_stash_uri, "sick") @@ -295,7 +296,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def _compute_fileurls(ignored): self.fileurls = {} for which in self.uris: - self.fileurls[which] = "uri/" + urllib.quote(self.uris[which]) + self.fileurls[which] = "uri/" + url_quote(self.uris[which]) d.addCallback(_compute_fileurls) def _clobber_shares(ignored): @@ -329,7 +330,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.fileurls = {} # the future cap format may contain slashes, which must be tolerated - expected_info_url = "uri/%s?t=info" % urllib.quote(unknown_rwcap, + expected_info_url = "uri/%s?t=info" % url_quote(unknown_rwcap, safe="") if immutable: @@ -343,8 +344,8 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def _stash_root_and_create_file(n): self.rootnode = n - self.rooturl = "uri/" + urllib.quote(n.get_uri()) - self.rourl = "uri/" + urllib.quote(n.get_readonly_uri()) + self.rooturl = "uri/" + 
url_quote(n.get_uri()) + self.rourl = "uri/" + url_quote(n.get_readonly_uri()) if not immutable: return self.rootnode.set_node(name, future_node) d.addCallback(_stash_root_and_create_file) @@ -510,7 +511,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.failUnlessIn("CHK", cap.to_string()) self.cap = cap self.rootnode = dn - self.rooturl = "uri/" + urllib.quote(dn.get_uri()) + self.rooturl = "uri/" + url_quote(dn.get_uri()) return download_to_data(dn._node) d.addCallback(_created) @@ -559,7 +560,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi if td.text != u"FILE": continue a = td.findNextSibling()(u"a")[0] - self.assertIn(urllib.quote(lonely_uri), a[u"href"]) + self.assertIn(url_quote(lonely_uri), a[u"href"]) self.assertEqual(u"lonely", a.text) self.assertEqual(a[u"rel"], [u"noreferrer"]) self.assertEqual(u"{}".format(len("one")), td.findNextSibling().findNextSibling().text) @@ -573,7 +574,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi if a.text == u"More Info" ) self.assertEqual(1, len(infos)) - self.assertTrue(infos[0].endswith(urllib.quote(lonely_uri) + "?t=info")) + self.assertTrue(infos[0].endswith(url_quote(lonely_uri) + "?t=info")) d.addCallback(_check_html) # ... and in JSON. 
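A recurring change in the test hunks above is turning every literal that touches upload data into bytes: `DATA = "data" * 100` becomes `b"data" * 100`, `convergence=""` becomes `convergence=b""`, and `DATA+"1"` becomes `DATA+b"1"`. On Python 2, `str` and `bytes` were the same type, so the old spellings worked; on Python 3, mixing them raises `TypeError`. A minimal sketch of the failure mode and the fixed pattern (`make_test_payloads` is an illustrative helper, not part of Tahoe):

```python
# On Python 3, concatenating str and bytes fails loudly: this is why
# every DATA+"1" in the diffs above becomes DATA+b"1".
try:
    b"data" * 100 + "1"           # the old Python 2 spelling
except TypeError:
    pass                          # raises TypeError on Python 3

def make_test_payloads(base):
    """Illustrative helper mirroring the DATA, DATA+b"1", ... variants."""
    if not isinstance(base, bytes):
        raise TypeError("upload data must be bytes")
    return [base + suffix for suffix in (b"", b"1", b"2", b"3")]

payloads = make_test_payloads(b"data" * 100)
```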
@@ -596,12 +597,12 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi c0 = self.g.clients[0] self.uris = {} self.fileurls = {} - DATA = "data" * 100 + DATA = b"data" * 100 d = c0.create_dirnode() def _stash_root_and_create_file(n): self.rootnode = n - self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) - return n.add_file(u"good", upload.Data(DATA, convergence="")) + self.fileurls["root"] = "uri/" + url_quote(n.get_uri()) + return n.add_file(u"good", upload.Data(DATA, convergence=b"")) d.addCallback(_stash_root_and_create_file) def _stash_uri(fn, which): self.uris[which] = fn.get_uri() @@ -610,12 +611,12 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: self.rootnode.add_file(u"small", upload.Data("literal", - convergence=""))) + convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: self.rootnode.add_file(u"sick", - upload.Data(DATA+"1", - convergence=""))) + upload.Data(DATA+b"1", + convergence=b""))) d.addCallback(_stash_uri, "sick") # this tests that deep-check and stream-manifest will ignore @@ -695,8 +696,8 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_stash_uri, "subdir") d.addCallback(lambda subdir_node: subdir_node.add_file(u"grandchild", - upload.Data(DATA+"2", - convergence=""))) + upload.Data(DATA+b"2", + convergence=b""))) d.addCallback(_stash_uri, "grandchild") d.addCallback(lambda ign: @@ -770,12 +771,12 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi c0 = self.g.clients[0] self.uris = {} self.fileurls = {} - DATA = "data" * 100 + DATA = b"data" * 100 d = c0.create_dirnode() def _stash_root_and_create_file(n): self.rootnode = n - self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) - return n.add_file(u"good", upload.Data(DATA, convergence="")) + self.fileurls["root"] = "uri/" + url_quote(n.get_uri()) + return n.add_file(u"good", 
upload.Data(DATA, convergence=b"")) d.addCallback(_stash_root_and_create_file) def _stash_uri(fn, which): self.uris[which] = fn.get_uri() @@ -783,17 +784,17 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: self.rootnode.add_file(u"small", upload.Data("literal", - convergence=""))) + convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: self.rootnode.add_file(u"sick", - upload.Data(DATA+"1", - convergence=""))) + upload.Data(DATA+b"1", + convergence=b""))) d.addCallback(_stash_uri, "sick") #d.addCallback(lambda ign: # self.rootnode.add_file(u"dead", - # upload.Data(DATA+"2", - # convergence=""))) + # upload.Data(DATA+b"2", + # convergence=b""))) #d.addCallback(_stash_uri, "dead") #d.addCallback(lambda ign: c0.create_mutable_file("mutable")) @@ -888,25 +889,25 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.set_up_grid(num_clients=2, oneshare=True) c0 = self.g.clients[0] self.uris = {} - DATA = "data" * 100 - d = c0.upload(upload.Data(DATA, convergence="")) + DATA = b"data" * 100 + d = c0.upload(upload.Data(DATA, convergence=b"")) def _stash_uri(ur, which): self.uris[which] = ur.get_uri() d.addCallback(_stash_uri, "one") d.addCallback(lambda ign: - c0.upload(upload.Data(DATA+"1", convergence=""))) + c0.upload(upload.Data(DATA+b"1", convergence=b""))) d.addCallback(_stash_uri, "two") def _stash_mutable_uri(n, which): self.uris[which] = n.get_uri() - assert isinstance(self.uris[which], str) + assert isinstance(self.uris[which], bytes) d.addCallback(lambda ign: - c0.create_mutable_file(publish.MutableData(DATA+"2"))) + c0.create_mutable_file(publish.MutableData(DATA+b"2"))) d.addCallback(_stash_mutable_uri, "mutable") def _compute_fileurls(ignored): self.fileurls = {} for which in self.uris: - self.fileurls[which] = "uri/" + urllib.quote(self.uris[which]) + self.fileurls[which] = "uri/" + url_quote(self.uris[which]) 
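The `urllib.quote` to `url_quote` substitutions running through these hunks reflect the `urllib` reorganization in Python 3: the function now lives at `urllib.parse.quote`. The tests quote a capability into a single URL path segment, passing `safe=""` where slashes must be escaped too. A sketch of the pattern (`cap_to_url_path` is an illustrative name, not a Tahoe API):

```python
from urllib.parse import quote as url_quote  # was urllib.quote on Python 2

def cap_to_url_path(cap):
    """Build a "uri/<quoted-cap>" path for a capability string (sketch)."""
    if isinstance(cap, bytes):
        cap = cap.decode("ascii")   # caps are ASCII-safe bytes internally
    # safe="" escapes "/" as well, since future cap formats may contain
    # slashes that must not be treated as path separators
    return "uri/" + url_quote(cap, safe="")
```

For example, `cap_to_url_path(b"URI:CHK:abc/def")` yields `"uri/URI%3ACHK%3Aabc%2Fdef"`, with both the colons and the slash percent-encoded.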
d.addCallback(_compute_fileurls) d.addCallback(self._count_leases, "one") @@ -982,13 +983,13 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi c0 = self.g.clients[0] self.uris = {} self.fileurls = {} - DATA = "data" * 100 + DATA = b"data" * 100 d = c0.create_dirnode() def _stash_root_and_create_file(n): self.rootnode = n self.uris["root"] = n.get_uri() - self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) - return n.add_file(u"one", upload.Data(DATA, convergence="")) + self.fileurls["root"] = "uri/" + url_quote(n.get_uri()) + return n.add_file(u"one", upload.Data(DATA, convergence=b"")) d.addCallback(_stash_root_and_create_file) def _stash_uri(fn, which): self.uris[which] = fn.get_uri() @@ -996,7 +997,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: self.rootnode.add_file(u"small", upload.Data("literal", - convergence=""))) + convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: @@ -1051,34 +1052,34 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi c0 = self.g.clients[0] c0.encoding_params['happy'] = 2 self.fileurls = {} - DATA = "data" * 100 + DATA = b"data" * 100 d = c0.create_dirnode() def _stash_root(n): - self.fileurls["root"] = "uri/" + urllib.quote(n.get_uri()) + self.fileurls["root"] = "uri/" + url_quote(n.get_uri()) self.fileurls["imaginary"] = self.fileurls["root"] + "/imaginary" return n d.addCallback(_stash_root) - d.addCallback(lambda ign: c0.upload(upload.Data(DATA, convergence=""))) + d.addCallback(lambda ign: c0.upload(upload.Data(DATA, convergence=b""))) def _stash_bad(ur): - self.fileurls["1share"] = "uri/" + urllib.quote(ur.get_uri()) + self.fileurls["1share"] = "uri/" + url_quote(ur.get_uri()) self.delete_shares_numbered(ur.get_uri(), range(1,10)) u = uri.from_string(ur.get_uri()) u.key = testutil.flip_bit(u.key, 0) baduri = u.to_string() - self.fileurls["0shares"] = "uri/" + 
urllib.quote(baduri) + self.fileurls["0shares"] = "uri/" + url_quote(baduri) d.addCallback(_stash_bad) d.addCallback(lambda ign: c0.create_dirnode()) def _mangle_dirnode_1share(n): u = n.get_uri() - url = self.fileurls["dir-1share"] = "uri/" + urllib.quote(u) + url = self.fileurls["dir-1share"] = "uri/" + url_quote(u) self.fileurls["dir-1share-json"] = url + "?t=json" self.delete_shares_numbered(u, range(1,10)) d.addCallback(_mangle_dirnode_1share) d.addCallback(lambda ign: c0.create_dirnode()) def _mangle_dirnode_0share(n): u = n.get_uri() - url = self.fileurls["dir-0share"] = "uri/" + urllib.quote(u) + url = self.fileurls["dir-0share"] = "uri/" + url_quote(u) self.fileurls["dir-0share-json"] = url + "?t=json" self.delete_shares_numbered(u, range(0,10)) d.addCallback(_mangle_dirnode_0share) @@ -1269,9 +1270,9 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi c0 = self.g.clients[0] fn = c0.config.get_config_path("access.blacklist") self.uris = {} - DATA = "off-limits " * 50 + DATA = b"off-limits " * 50 - d = c0.upload(upload.Data(DATA, convergence="")) + d = c0.upload(upload.Data(DATA, convergence=b"")) def _stash_uri_and_create_dir(ur): self.uri = ur.get_uri() self.url = "uri/"+self.uri diff --git a/src/allmydata/web/common.py b/src/allmydata/web/common.py index 4e8cab89d..98f915a1e 100644 --- a/src/allmydata/web/common.py +++ b/src/allmydata/web/common.py @@ -1,4 +1,5 @@ from past.builtins import unicode +from six import ensure_text import time import json @@ -99,11 +100,13 @@ def get_filenode_metadata(filenode): def boolean_of_arg(arg): # TODO: "" + arg = ensure_text(arg) if arg.lower() not in ("true", "t", "1", "false", "f", "0", "on", "off"): raise WebError("invalid boolean argument: %r" % (arg,), http.BAD_REQUEST) return arg.lower() in ("true", "t", "1", "on") def parse_replace_arg(replace): + replace = ensure_text(replace) if replace.lower() == "only-files": return replace try: @@ -118,11 +121,11 @@ def get_format(req, 
default="CHK"): if boolean_of_arg(get_arg(req, "mutable", "false")): return "SDMF" return default - if arg.upper() == "CHK": + if arg.upper() == b"CHK": return "CHK" - elif arg.upper() == "SDMF": + elif arg.upper() == b"SDMF": return "SDMF" - elif arg.upper() == "MDMF": + elif arg.upper() == b"MDMF": return "MDMF" else: raise WebError("Unknown format: %s, I know CHK, SDMF, MDMF" % arg, diff --git a/src/allmydata/web/common_py3.py b/src/allmydata/web/common_py3.py index 89efd4d5d..05fc2b50b 100644 --- a/src/allmydata/web/common_py3.py +++ b/src/allmydata/web/common_py3.py @@ -4,6 +4,8 @@ Common utilities that are available from Python 3. Can eventually be merged back into allmydata.web.common. """ +from past.builtins import unicode + from twisted.web import resource, http from allmydata.util import abbreviate @@ -23,7 +25,13 @@ def get_arg(req, argname, default=None, multiple=False): empty), starting with all those in the query args. :param TahoeLAFSRequest req: The request to consider. + + :return: Either bytes or tuple of bytes. """ + if isinstance(argname, unicode): + argname = argname.encode("ascii") + if isinstance(default, unicode): + default = default.encode("ascii") results = [] if argname in req.args: results.extend(req.args[argname]) diff --git a/src/allmydata/web/filenode.py b/src/allmydata/web/filenode.py index f65977460..d90449e9f 100644 --- a/src/allmydata/web/filenode.py +++ b/src/allmydata/web/filenode.py @@ -133,9 +133,9 @@ class PlaceHolderNodeHandler(Resource, ReplaceMeMixin): @render_exception def render_POST(self, req): - t = get_arg(req, "t", "").strip() - replace = boolean_of_arg(get_arg(req, "replace", "true")) - if t == "upload": + t = get_arg(req, b"t", b"").strip() + replace = boolean_of_arg(get_arg(req, b"replace", b"true")) + if t == b"upload": # like PUT, but get the file data from an HTML form's input field. 
# We could get here from POST /uri/mutablefilecap?t=upload, # or POST /uri/path/file?t=upload, or @@ -290,11 +290,11 @@ class FileNodeHandler(Resource, ReplaceMeMixin, object): @render_exception def render_POST(self, req): - t = get_arg(req, "t", "").strip() - replace = boolean_of_arg(get_arg(req, "replace", "true")) - if t == "check": + t = get_arg(req, b"t", b"").strip() + replace = boolean_of_arg(get_arg(req, b"replace", b"true")) + if t == b"check": d = self._POST_check(req) - elif t == "upload": + elif t == b"upload": # like PUT, but get the file data from an HTML form's input field # We could get here from POST /uri/mutablefilecap?t=upload, # or POST /uri/path/file?t=upload, or diff --git a/src/allmydata/web/root.py b/src/allmydata/web/root.py index cb5ddc070..5d45d41b3 100644 --- a/src/allmydata/web/root.py +++ b/src/allmydata/web/root.py @@ -226,7 +226,10 @@ class Root(MultiFormatResource): self._client = client self._now_fn = now_fn - self.putChild("uri", URIHandler(client)) + # Children need to be bytes; for now just doing these to make specific + # tests pass on Python 3, but eventually will do all them when this + # module is ported to Python 3 (if not earlier). + self.putChild(b"uri", URIHandler(client)) self.putChild("cap", URIHandler(client)) # Handler for everything beneath "/private", an area of the resource diff --git a/src/allmydata/webish.py b/src/allmydata/webish.py index b5e310fbc..a7f44514f 100644 --- a/src/allmydata/webish.py +++ b/src/allmydata/webish.py @@ -110,6 +110,7 @@ def _get_client_ip(request): def _logFormatter(logDateTime, request): + print("REQUEST: {}".format(request.uri)) # we build up a log string that hides most of the cap, to preserve # user privacy. We retain the query args so we can identify things # like t=json. Then we send it to the flog. We make no attempt to @@ -128,7 +129,7 @@ def _logFormatter(logDateTime, request): # sure we censor these too. 
if queryargs.startswith(b"uri="): queryargs = b"uri=[CENSORED]" - queryargs = "?" + queryargs + queryargs = b"?" + queryargs if path.startswith(b"/uri/"): path = b"/uri/[CENSORED]" elif path.startswith(b"/file/"): From 8f4a0379eacb19320fe1c276c6e32f3313f1a988 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 11:26:10 -0500 Subject: [PATCH 147/186] Correct examples. --- src/allmydata/test/common_util.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py index 1ad0f8242..63c39d39a 100644 --- a/src/allmydata/test/common_util.py +++ b/src/allmydata/test/common_util.py @@ -59,10 +59,10 @@ def run_cli_native(verb, *args, **kwargs): Most code should prefer ``run_cli_unicode`` which deals with all the necessary encoding considerations. - :param native_str verb: The command to run. For example, ``b"create-node"``. + :param native_str verb: The command to run. For example, ``"create-node"``. :param [native_str] args: The arguments to pass to the command. For example, - ``(b"--hostname=localhost",)``. + ``("--hostname=localhost",)``. :param [native_str] nodeargs: Extra arguments to pass to the Tahoe executable before ``verb``. From 721b02b262a538de05bd757c655dc87f274e1950 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 11:29:51 -0500 Subject: [PATCH 148/186] Use the function I specifically wrote for this! 
--- src/allmydata/util/base32.py | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/src/allmydata/util/base32.py b/src/allmydata/util/base32.py index 75625c817..efad58841 100644 --- a/src/allmydata/util/base32.py +++ b/src/allmydata/util/base32.py @@ -134,8 +134,7 @@ def a2b(cs): @param cs the base-32 encoded data (as bytes) """ # Workaround Future newbytes issues by converting to real bytes on Python 2: - if hasattr(cs, "__native__"): - cs = cs.__native__() + cs = backwardscompat_bytes(cs) precondition(could_be_base32_encoded(cs), "cs is required to be possibly base32 encoded data.", cs=cs) precondition(isinstance(cs, bytes), cs) From 865f3fd7d0d0fa82dcb9ed4a75f7f12ee02004fa Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 11:33:24 -0500 Subject: [PATCH 149/186] Improve the docstring. --- src/allmydata/test/test_system.py | 8 +++++++- 1 file changed, 7 insertions(+), 1 deletion(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 192b82e78..0a5996cdc 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -70,7 +70,13 @@ from ..scripts.common import ( def run_cli(*args, **kwargs): """ - Backwards compatible version so we don't have to change all the tests. + Run a Tahoe-LAFS CLI utility, but inline. + + Version of run_cli_unicode() that takes any kind of string, and the + command-line args inline instead of as verb + list. + + Backwards compatible version so we don't have to change all the tests that + expected this API. """ nodeargs = [ensure_text(a) for a in kwargs.pop("nodeargs", [])] kwargs["nodeargs"] = nodeargs From 50a794a91154259d7d948e5fb110e520f7d40588 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 11:34:08 -0500 Subject: [PATCH 150/186] More accurate docstring. 
--- src/allmydata/test/test_system.py | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py index 0a5996cdc..e8b27fa69 100644 --- a/src/allmydata/test/test_system.py +++ b/src/allmydata/test/test_system.py @@ -664,7 +664,7 @@ def flush_but_dont_ignore(res): def _render_config(config): """ - Convert a ``dict`` of ``dict`` of ``bytes`` to an ini-format string. + Convert a ``dict`` of ``dict`` of ``unicode`` to an ini-format string. """ return u"\n\n".join(list( _render_config_section(k, v) @@ -674,8 +674,8 @@ def _render_config(config): def _render_config_section(heading, values): """ - Convert a ``bytes`` heading and a ``dict`` of ``bytes`` to an ini-format - section as ``bytes``. + Convert a ``unicode`` heading and a ``dict`` of ``unicode`` to an ini-format + section as ``unicode``. """ return u"[{}]\n{}\n".format( heading, _render_section_values(values) @@ -683,8 +683,8 @@ def _render_config_section(heading, values): def _render_section_values(values): """ - Convert a ``dict`` of ``bytes`` to the body of an ini-format section as - ``bytes``. + Convert a ``dict`` of ``unicode`` to the body of an ini-format section as + ``unicode``. """ return u"\n".join(list( u"{} = {}".format(k, v) From f964ae1782076dcb99aaad886393bbddb416ecbf Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 15:43:27 -0500 Subject: [PATCH 151/186] Docstrings. 
--- src/allmydata/web/common.py | 20 ++++++++++++++++++-- src/allmydata/web/common_py3.py | 7 +++++++ 2 files changed, 25 insertions(+), 2 deletions(-) diff --git a/src/allmydata/web/common.py b/src/allmydata/web/common.py index 4e8cab89d..929db4f6e 100644 --- a/src/allmydata/web/common.py +++ b/src/allmydata/web/common.py @@ -208,7 +208,15 @@ def compute_rate(bytes, seconds): return 1.0 * bytes / seconds def abbreviate_rate(data): - # 21.8kBps, 554.4kBps 4.37MBps + """ + Convert number of bytes/second into human readable strings (unicode). + + Uses metric measures, so 1000 not 1024, e.g. 21.8kBps, 554.4kBps, 4.37MBps. + + :param data: Either ``None`` or integer. + + :return: Unicode string. + """ if data is None: return u"" r = float(data) @@ -219,7 +227,15 @@ def abbreviate_rate(data): return u"%.0fBps" % r def abbreviate_size(data): - # 21.8kB, 554.4kB 4.37MB + """ + Convert number of bytes into human readable strings (unicode). + + Uses metric measures, so 1000 not 1024, e.g. 21.8kB, 554.4kB, 4.37MB. + + :param data: Either ``None`` or integer. + + :return: Unicode string. + """ if data is None: return u"" r = float(data) diff --git a/src/allmydata/web/common_py3.py b/src/allmydata/web/common_py3.py index 89efd4d5d..41b6939f3 100644 --- a/src/allmydata/web/common_py3.py +++ b/src/allmydata/web/common_py3.py @@ -95,6 +95,13 @@ class MultiFormatResource(resource.Resource, object): def abbreviate_time(data): + """ + Convert number of seconds into human readable string. + + :param data: Either ``None`` or integer or float, seconds. + + :return: Unicode string. + """ # 1.23s, 790ms, 132us if data is None: return u"" From c71acf93fd056e1ea9ff6e6ac7d2094ceaedb221 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 16:10:23 -0500 Subject: [PATCH 152/186] Bytes, alas. 
--- src/allmydata/web/filenode.py | 15 ++++++++------- 1 file changed, 8 insertions(+), 7 deletions(-) diff --git a/src/allmydata/web/filenode.py b/src/allmydata/web/filenode.py index d90449e9f..b58cc71f9 100644 --- a/src/allmydata/web/filenode.py +++ b/src/allmydata/web/filenode.py @@ -1,3 +1,4 @@ +from past.builtins import unicode import json @@ -117,7 +118,7 @@ class PlaceHolderNodeHandler(Resource, ReplaceMeMixin): @render_exception def render_PUT(self, req): - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() replace = parse_replace_arg(get_arg(req, "replace", "true")) assert self.parentnode and self.name @@ -179,7 +180,7 @@ class FileNodeHandler(Resource, ReplaceMeMixin, object): @render_exception def render_GET(self, req): - t = get_arg(req, "t", "").strip() + t = unicode(get_arg(req, b"t", b"").strip(), "ascii") # t=info contains variable ophandles, so is not allowed an ETag. FIXED_OUTPUT_TYPES = ["", "json", "uri", "readonly-uri"] @@ -237,19 +238,19 @@ class FileNodeHandler(Resource, ReplaceMeMixin, object): @render_exception def render_HEAD(self, req): - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() if t: raise WebError("HEAD file: bad t=%s" % t) - filename = get_arg(req, "filename", self.name) or "unknown" + filename = get_arg(req, b"filename", self.name) or "unknown" d = self.node.get_best_readable_version() d.addCallback(lambda dn: FileDownloader(dn, filename)) return d @render_exception def render_PUT(self, req): - t = get_arg(req, "t", "").strip() - replace = parse_replace_arg(get_arg(req, "replace", "true")) - offset = parse_offset_arg(get_arg(req, "offset", None)) + t = get_arg(req, b"t", b"").strip() + replace = parse_replace_arg(get_arg(req, b"replace", b"true")) + offset = parse_offset_arg(get_arg(req, b"offset", None)) if not t: if not replace: From 2ec7d52d09a0a44f3e76ba1ee40aeb24852431af Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Fri, 18 Dec 2020 16:10:31 -0500 Subject: [PATCH 
153/186] Some progress towards passing tests on Python 3. --- src/allmydata/test/no_network.py | 3 ++- src/allmydata/test/web/test_grid.py | 8 +++++--- 2 files changed, 7 insertions(+), 4 deletions(-) diff --git a/src/allmydata/test/no_network.py b/src/allmydata/test/no_network.py index 9a2713830..e1f04b864 100644 --- a/src/allmydata/test/no_network.py +++ b/src/allmydata/test/no_network.py @@ -24,6 +24,7 @@ from future.utils import PY2 if PY2: from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 from past.builtins import unicode +from six import ensure_text import os from base64 import b32encode @@ -614,7 +615,7 @@ class GridTestMixin(object): method="GET", clientnum=0, **kwargs): # if return_response=True, this fires with (data, statuscode, # respheaders) instead of just data. - url = self.client_baseurls[clientnum] + urlpath + url = self.client_baseurls[clientnum] + ensure_text(urlpath) response = yield treq.request(method, url, persistent=False, allow_redirects=followRedirect, diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 04c3edbac..652d29953 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -1,5 +1,7 @@ from __future__ import print_function +from past.builtins import unicode + import os.path, re from urllib.parse import quote as url_quote import json @@ -48,7 +50,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def CHECK(self, ign, which, args, clientnum=0): fileurl = self.fileurls[which] url = fileurl + "?" 
+ args - return self.GET(url, method="POST", clientnum=clientnum) + return self.GET(url, method="POST", clientnum=clientnum).addCallback(unicode, "utf-8") def test_filecheck(self): self.basedir = "web/Grid/filecheck" @@ -1275,7 +1277,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d = c0.upload(upload.Data(DATA, convergence=b"")) def _stash_uri_and_create_dir(ur): self.uri = ur.get_uri() - self.url = "uri/"+self.uri + self.url = b"uri/"+self.uri u = uri.from_string_filenode(self.uri) self.si = u.get_storage_index() childnode = c0.create_node_from_uri(self.uri, None) @@ -1284,7 +1286,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def _stash_dir(node): self.dir_node = node self.dir_uri = node.get_uri() - self.dir_url = "uri/"+self.dir_uri + self.dir_url = b"uri/"+self.dir_uri d.addCallback(_stash_dir) d.addCallback(lambda ign: self.GET(self.dir_url, followRedirect=True)) def _check_dir_html(body): From 98c71e51e1aaf66f8dac9c950fd9a5818b333325 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 21 Dec 2020 10:04:27 -0500 Subject: [PATCH 154/186] More progress towards passing tests. 
--- src/allmydata/blacklist.py | 4 ++-- src/allmydata/test/common.py | 5 +++-- src/allmydata/test/web/test_grid.py | 11 +++++----- src/allmydata/web/directory.py | 34 ++++++++++++++--------------- src/allmydata/web/filenode.py | 2 +- src/allmydata/web/root.py | 1 + 6 files changed, 30 insertions(+), 27 deletions(-) diff --git a/src/allmydata/blacklist.py b/src/allmydata/blacklist.py index 89ee81a96..1ee507117 100644 --- a/src/allmydata/blacklist.py +++ b/src/allmydata/blacklist.py @@ -34,10 +34,10 @@ class Blacklist(object): try: if self.last_mtime is None or current_mtime > self.last_mtime: self.entries.clear() - with open(self.blacklist_fn, "r") as f: + with open(self.blacklist_fn, "rb") as f: for line in f: line = line.strip() - if not line or line.startswith("#"): + if not line or line.startswith(b"#"): continue si_s, reason = line.split(None, 1) si = base32.a2b(si_s) # must be valid base32 diff --git a/src/allmydata/test/common.py b/src/allmydata/test/common.py index b2fe8cac8..5bf90731d 100644 --- a/src/allmydata/test/common.py +++ b/src/allmydata/test/common.py @@ -825,11 +825,12 @@ class WebErrorMixin(object): code=None, substring=None, response_substring=None, callable=None, *args, **kwargs): # returns a Deferred with the response body - assert substring is None or isinstance(substring, str) + assert substring is None or isinstance(substring, bytes) + assert substring is None or isinstance(response_substring, bytes) assert callable def _validate(f): if code is not None: - self.failUnlessEqual(f.value.status, str(code), which) + self.failUnlessEqual(f.value.status, b"%d" % code, which) if substring: code_string = str(f) self.failUnless(substring in code_string, diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 652d29953..6fcc2c06f 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -1290,6 +1290,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, 
testutil.ReallyEqualMi d.addCallback(_stash_dir) d.addCallback(lambda ign: self.GET(self.dir_url, followRedirect=True)) def _check_dir_html(body): + body = unicode(body, "utf-8") self.failUnlessIn(DIR_HTML_TAG, body) self.failUnlessIn("blacklisted.txt", body) d.addCallback(_check_dir_html) @@ -1301,14 +1302,14 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi f.write(" # this is a comment\n") f.write(" \n") f.write("\n") # also exercise blank lines - f.write("%s %s\n" % (base32.b2a(self.si), "off-limits to you")) + f.write("%s off-limits to you\n" % (unicode(base32.b2a(self.si), "ascii"),)) f.close() # clients should be checking the blacklist each time, so we don't # need to restart the client d.addCallback(_blacklist) d.addCallback(lambda ign: self.shouldHTTPError("get_from_blacklisted_uri", - 403, "Forbidden", - "Access Prohibited: off-limits", + 403, b"Forbidden", + b"Access Prohibited: off-limits", self.GET, self.url)) # We should still be able to list the parent directory, in HTML... @@ -1376,8 +1377,8 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda body: self.failUnlessEqual(DATA, body)) def _block_dir(ign): - f = open(fn, "w") - f.write("%s %s\n" % (self.dir_si_b32, "dir-off-limits to you")) + f = open(fn, "wb") + f.write(b"%s %s\n" % (self.dir_si_b32, b"dir-off-limits to you")) f.close() self.g.clients[0].blacklist.last_mtime -= 2.0 d.addCallback(_block_dir) diff --git a/src/allmydata/web/directory.py b/src/allmydata/web/directory.py index f83defd6a..31afd4187 100644 --- a/src/allmydata/web/directory.py +++ b/src/allmydata/web/directory.py @@ -1,6 +1,6 @@ import json -import urllib +from urllib.parse import quote as url_quote from datetime import timedelta from zope.interface import implementer @@ -109,7 +109,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): # or no further children) renders "this" page. 
We also need # to reject "/uri/URI:DIR2:..//", so we look at postpath. name = name.decode('utf8') - if not name and req.postpath != ['']: + if not name and req.postpath != [b'']: return self # Rejecting URIs that contain empty path pieces (for example: @@ -135,7 +135,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): terminal = (req.prepath + req.postpath)[-1].decode('utf8') == name nonterminal = not terminal #len(req.postpath) > 0 - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() if isinstance(node_or_failure, Failure): f = node_or_failure f.trap(NoSuchChildError) @@ -217,7 +217,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): @render_exception def render_GET(self, req): # This is where all of the directory-related ?t=* code goes. - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() # t=info contains variable ophandles, t=rename-form contains the name # of the child being renamed. Neither is allowed an ETag. 
@@ -255,7 +255,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): @render_exception def render_PUT(self, req): - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() replace = parse_replace_arg(get_arg(req, "replace", "true")) if t == "mkdir": @@ -275,7 +275,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): @render_exception def render_POST(self, req): - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() if t == "mkdir": d = self._POST_mkdir(req) @@ -732,7 +732,7 @@ class DirectoryAsHTML(Element): return "" rocap = self.node.get_readonly_uri() root = get_root(req) - uri_link = "%s/uri/%s/" % (root, urllib.quote(rocap)) + uri_link = "%s/uri/%s/" % (root, url_quote(rocap)) return tag(tags.a("Read-Only Version", href=uri_link)) @renderer @@ -754,10 +754,10 @@ class DirectoryAsHTML(Element): called by the 'children' renderer) """ name = name.encode("utf-8") - nameurl = urllib.quote(name, safe="") # encode any slashes too + nameurl = url_quote(name, safe="") # encode any slashes too root = get_root(req) - here = "{}/uri/{}/".format(root, urllib.quote(self.node.get_uri())) + here = "{}/uri/{}/".format(root, url_quote(self.node.get_uri())) if self.node.is_unknown() or self.node.is_readonly(): unlink = "-" rename = "-" @@ -814,7 +814,7 @@ class DirectoryAsHTML(Element): assert IFilesystemNode.providedBy(target), target target_uri = target.get_uri() or "" - quoted_uri = urllib.quote(target_uri, safe="") # escape slashes too + quoted_uri = url_quote(target_uri, safe="") # escape slashes too if IMutableFileNode.providedBy(target): # to prevent javascript in displayed .html files from stealing a @@ -835,7 +835,7 @@ class DirectoryAsHTML(Element): elif IDirectoryNode.providedBy(target): # directory - uri_link = "%s/uri/%s/" % (root, urllib.quote(target_uri)) + uri_link = "%s/uri/%s/" % (root, url_quote(target_uri)) slots["filename"] = tags.a(name, href=uri_link) if not target.is_mutable(): dirtype = 
"DIR-IMM" @@ -871,7 +871,7 @@ class DirectoryAsHTML(Element): slots["size"] = "-" # use a directory-relative info link, so we can extract both the # writecap and the readcap - info_link = "%s?t=info" % urllib.quote(name) + info_link = "%s?t=info" % url_quote(name) if info_link: slots["info"] = tags.a("More Info", href=info_link) @@ -888,7 +888,7 @@ class DirectoryAsHTML(Element): # because action="." doesn't get us back to the dir page (but # instead /uri itself) root = get_root(req) - here = "{}/uri/{}/".format(root, urllib.quote(self.node.get_uri())) + here = "{}/uri/{}/".format(root, url_quote(self.node.get_uri())) if self.node.is_readonly(): return tags.div("No upload forms: directory is read-only") @@ -1166,13 +1166,13 @@ def _cap_to_link(root, path, cap): if isinstance(cap_obj, (CHKFileURI, WriteableSSKFileURI, ReadonlySSKFileURI)): uri_link = root_url.child( u"file", - u"{}".format(urllib.quote(cap)), - u"{}".format(urllib.quote(path[-1])), + u"{}".format(url_quote(cap)), + u"{}".format(url_quote(path[-1])), ) else: uri_link = root_url.child( u"uri", - u"{}".format(urllib.quote(cap, safe="")), + u"{}".format(url_quote(cap, safe="")), ) return tags.a(cap, href=uri_link.to_text()) else: @@ -1464,7 +1464,7 @@ class UnknownNodeHandler(Resource, object): @render_exception def render_GET(self, req): - t = get_arg(req, "t", "").strip() + t = get_arg(req, b"t", b"").strip() if t == "info": return MoreInfo(self.node) if t == "json": diff --git a/src/allmydata/web/filenode.py b/src/allmydata/web/filenode.py index b58cc71f9..a69690b41 100644 --- a/src/allmydata/web/filenode.py +++ b/src/allmydata/web/filenode.py @@ -1,4 +1,4 @@ -from past.builtins import unicode +from past.builtins import unicode, long import json diff --git a/src/allmydata/web/root.py b/src/allmydata/web/root.py index 5d45d41b3..fa0a1c381 100644 --- a/src/allmydata/web/root.py +++ b/src/allmydata/web/root.py @@ -159,6 +159,7 @@ class URIHandler(resource.Resource, object): node = 
self.client.create_node_from_uri(name) return directory.make_handler_for(node, self.client) except (TypeError, AssertionError): + raise raise WebError( "'{}' is not a valid file- or directory- cap".format(name) ) From 15c7af8e72a59f4702dff0470514369d02fed9d8 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 21 Dec 2020 10:21:39 -0500 Subject: [PATCH 155/186] Another passing test. --- src/allmydata/test/common.py | 12 ++++++++---- src/allmydata/test/web/test_grid.py | 29 +++++++++++++++++------------ src/allmydata/web/directory.py | 8 ++++---- src/allmydata/web/filenode.py | 4 ++-- src/allmydata/web/root.py | 3 +-- 5 files changed, 32 insertions(+), 24 deletions(-) diff --git a/src/allmydata/test/common.py b/src/allmydata/test/common.py index 5bf90731d..f1dbf651d 100644 --- a/src/allmydata/test/common.py +++ b/src/allmydata/test/common.py @@ -11,7 +11,7 @@ __all__ = [ "skipIf", ] -from past.builtins import chr as byteschr +from past.builtins import chr as byteschr, unicode import os, random, struct import six @@ -825,14 +825,18 @@ class WebErrorMixin(object): code=None, substring=None, response_substring=None, callable=None, *args, **kwargs): # returns a Deferred with the response body - assert substring is None or isinstance(substring, bytes) - assert substring is None or isinstance(response_substring, bytes) + if isinstance(substring, bytes): + substring = unicode(substring, "ascii") + if isinstance(response_substring, unicode): + response_substring = response_substring.encode("ascii") + assert substring is None or isinstance(substring, unicode) + assert response_substring is None or isinstance(response_substring, bytes) assert callable def _validate(f): if code is not None: self.failUnlessEqual(f.value.status, b"%d" % code, which) if substring: - code_string = str(f) + code_string = unicode(f) self.failUnless(substring in code_string, "%s: substring '%s' not in '%s'" % (which, substring, code_string)) diff --git 
a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 6fcc2c06f..c78e0261f 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -52,6 +52,12 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi url = fileurl + "?" + args return self.GET(url, method="POST", clientnum=clientnum).addCallback(unicode, "utf-8") + def GET_string(self, *args, **kwargs): + """Send an HTTP request, but convert result to Unicode string.""" + d = GridTestMixin.GET(self, *args, **kwargs) + d.addCallback(unicode, "utf-8") + return d + def test_filecheck(self): self.basedir = "web/Grid/filecheck" self.set_up_grid() @@ -1288,9 +1294,8 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.dir_uri = node.get_uri() self.dir_url = b"uri/"+self.dir_uri d.addCallback(_stash_dir) - d.addCallback(lambda ign: self.GET(self.dir_url, followRedirect=True)) + d.addCallback(lambda ign: self.GET_string(self.dir_url, followRedirect=True)) def _check_dir_html(body): - body = unicode(body, "utf-8") self.failUnlessIn(DIR_HTML_TAG, body) self.failUnlessIn("blacklisted.txt", body) d.addCallback(_check_dir_html) @@ -1308,19 +1313,19 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi # need to restart the client d.addCallback(_blacklist) d.addCallback(lambda ign: self.shouldHTTPError("get_from_blacklisted_uri", - 403, b"Forbidden", - b"Access Prohibited: off-limits", + 403, "Forbidden", + "Access Prohibited: off-limits", self.GET, self.url)) # We should still be able to list the parent directory, in HTML... - d.addCallback(lambda ign: self.GET(self.dir_url, followRedirect=True)) + d.addCallback(lambda ign: self.GET_string(self.dir_url, followRedirect=True)) def _check_dir_html2(body): self.failUnlessIn(DIR_HTML_TAG, body) self.failUnlessIn("blacklisted.txt", body) d.addCallback(_check_dir_html2) # ... and in JSON (used by CLI). 
- d.addCallback(lambda ign: self.GET(self.dir_url+"?t=json", followRedirect=True)) + d.addCallback(lambda ign: self.GET(self.dir_url+b"?t=json", followRedirect=True)) def _check_dir_json(res): data = json.loads(res) self.failUnless(isinstance(data, list), data) @@ -1359,14 +1364,14 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_add_dir) def _get_dircap(dn): self.dir_si_b32 = base32.b2a(dn.get_storage_index()) - self.dir_url_base = "uri/"+dn.get_write_uri() - self.dir_url_json1 = "uri/"+dn.get_write_uri()+"?t=json" - self.dir_url_json2 = "uri/"+dn.get_write_uri()+"?t=json" - self.dir_url_json_ro = "uri/"+dn.get_readonly_uri()+"?t=json" - self.child_url = "uri/"+dn.get_readonly_uri()+"/child" + self.dir_url_base = b"uri/"+dn.get_write_uri() + self.dir_url_json1 = b"uri/"+dn.get_write_uri()+b"?t=json" + self.dir_url_json2 = b"uri/"+dn.get_write_uri()+b"?t=json" + self.dir_url_json_ro = b"uri/"+dn.get_readonly_uri()+b"?t=json" + self.child_url = b"uri/"+dn.get_readonly_uri()+b"/child" d.addCallback(_get_dircap) d.addCallback(lambda ign: self.GET(self.dir_url_base, followRedirect=True)) - d.addCallback(lambda body: self.failUnlessIn(DIR_HTML_TAG, body)) + d.addCallback(lambda body: self.failUnlessIn(DIR_HTML_TAG, unicode(body, "utf-8"))) d.addCallback(lambda ign: self.GET(self.dir_url_json1)) d.addCallback(lambda res: json.loads(res)) # just check it decodes d.addCallback(lambda ign: self.GET(self.dir_url_json2)) diff --git a/src/allmydata/web/directory.py b/src/allmydata/web/directory.py index 31afd4187..d66f664c9 100644 --- a/src/allmydata/web/directory.py +++ b/src/allmydata/web/directory.py @@ -1,5 +1,5 @@ +from past.builtins import unicode -import json from urllib.parse import quote as url_quote from datetime import timedelta @@ -20,7 +20,7 @@ from twisted.web.template import ( from hyperlink import URL from twisted.python.filepath import FilePath -from allmydata.util import base32 +from allmydata.util import 
base32, jsonbytes as json from allmydata.util.encodingutil import ( to_bytes, quote_output, @@ -217,7 +217,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): @render_exception def render_GET(self, req): # This is where all of the directory-related ?t=* code goes. - t = get_arg(req, b"t", b"").strip() + t = unicode(get_arg(req, b"t", b"").strip(), "ascii") # t=info contains variable ophandles, t=rename-form contains the name # of the child being renamed. Neither is allowed an ETag. @@ -1005,7 +1005,7 @@ def _directory_json_metadata(req, dirnode): d = dirnode.list() def _got(children): kids = {} - for name, (childnode, metadata) in children.iteritems(): + for name, (childnode, metadata) in children.items(): assert IFilesystemNode.providedBy(childnode), childnode rw_uri = childnode.get_write_uri() ro_uri = childnode.get_readonly_uri() diff --git a/src/allmydata/web/filenode.py b/src/allmydata/web/filenode.py index a69690b41..5bd575631 100644 --- a/src/allmydata/web/filenode.py +++ b/src/allmydata/web/filenode.py @@ -1,7 +1,5 @@ from past.builtins import unicode, long -import json - from twisted.web import http, static from twisted.internet import defer from twisted.web.resource import ( @@ -42,6 +40,8 @@ from allmydata.web.check_results import ( LiteralCheckResultsRenderer, ) from allmydata.web.info import MoreInfo +from allmydata.util import jsonbytes as json + class ReplaceMeMixin(object): def replace_me_with_a_child(self, req, client, replace): diff --git a/src/allmydata/web/root.py b/src/allmydata/web/root.py index fa0a1c381..969f22a1b 100644 --- a/src/allmydata/web/root.py +++ b/src/allmydata/web/root.py @@ -1,6 +1,5 @@ import os import time -import json import urllib from hyperlink import DecodedURL, URL @@ -21,7 +20,7 @@ from twisted.web.template import ( ) import allmydata # to display import path -from allmydata.util import log +from allmydata.util import log, jsonbytes as json from allmydata.interfaces import IFileNode from allmydata.web 
import ( filenode, From d8197d95541b34fd3b8c62583160c547e8f4d733 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 21 Dec 2020 10:52:31 -0500 Subject: [PATCH 156/186] Another passing test. --- src/allmydata/test/web/test_grid.py | 4 ++-- src/allmydata/web/directory.py | 8 ++++---- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index c78e0261f..1bcf528cc 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -1004,12 +1004,12 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_stash_uri, "one") d.addCallback(lambda ign: self.rootnode.add_file(u"small", - upload.Data("literal", + upload.Data(b"literal", convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: - c0.create_mutable_file(publish.MutableData("mutable"))) + c0.create_mutable_file(publish.MutableData(b"mutable"))) d.addCallback(lambda fn: self.rootnode.set_node(u"mutable", fn)) d.addCallback(_stash_uri, "mutable") diff --git a/src/allmydata/web/directory.py b/src/allmydata/web/directory.py index d66f664c9..2ae5eac17 100644 --- a/src/allmydata/web/directory.py +++ b/src/allmydata/web/directory.py @@ -275,7 +275,7 @@ class DirectoryNodeHandler(ReplaceMeMixin, Resource, object): @render_exception def render_POST(self, req): - t = get_arg(req, b"t", b"").strip() + t = unicode(get_arg(req, b"t", b"").strip(), "ascii") if t == "mkdir": d = self._POST_mkdir(req) @@ -1441,7 +1441,7 @@ class DeepCheckStreamer(dirnode.DeepStats): def write_line(self, data): j = json.dumps(data, ensure_ascii=True) assert "\n" not in j - self.req.write(j+"\n") + self.req.write(j.encode("utf-8")+b"\n") def finish(self): stats = dirnode.DeepStats.get_results(self) @@ -1450,8 +1450,8 @@ class DeepCheckStreamer(dirnode.DeepStats): } j = json.dumps(d, ensure_ascii=True) assert "\n" not in j - self.req.write(j+"\n") - return "" + 
self.req.write(j.encode("utf-8")+b"\n") + return b"" class UnknownNodeHandler(Resource, object): From a2f042845d6f8e0ca37c12bbf0e4aa7f85a81a2c Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 21 Dec 2020 10:58:09 -0500 Subject: [PATCH 157/186] Another passing test. --- src/allmydata/test/web/test_grid.py | 6 +++--- src/allmydata/web/directory.py | 8 ++++---- 2 files changed, 7 insertions(+), 7 deletions(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 1bcf528cc..cadd9e224 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -81,7 +81,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi c0.create_mutable_file(publish.MutableData(DATA+b"3"))) d.addCallback(_stash_mutable_uri, "corrupt") d.addCallback(lambda ign: - c0.upload(upload.Data("literal", convergence=b""))) + c0.upload(upload.Data(b"literal", convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: c0.create_immutable_dirnode({})) d.addCallback(_stash_mutable_uri, "smalldir") @@ -618,7 +618,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_stash_uri, "good") d.addCallback(lambda ign: self.rootnode.add_file(u"small", - upload.Data("literal", + upload.Data(b"literal", convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: @@ -791,7 +791,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_stash_uri, "good") d.addCallback(lambda ign: self.rootnode.add_file(u"small", - upload.Data("literal", + upload.Data(b"literal", convergence=b""))) d.addCallback(_stash_uri, "small") d.addCallback(lambda ign: diff --git a/src/allmydata/web/directory.py b/src/allmydata/web/directory.py index 2ae5eac17..56ad9a85c 100644 --- a/src/allmydata/web/directory.py +++ b/src/allmydata/web/directory.py @@ -225,7 +225,7 @@ class 
DirectoryNodeHandler(ReplaceMeMixin, Resource, object): if not self.node.is_mutable() and t in FIXED_OUTPUT_TYPES: si = self.node.get_storage_index() if si and req.setETag('DIR:%s-%s' % (base32.b2a(si), t or "")): - return "" + return b"" if not t: # render the directory as HTML @@ -1363,7 +1363,7 @@ class ManifestStreamer(dirnode.DeepStats): j = json.dumps(d, ensure_ascii=True) assert "\n" not in j - self.req.write(j+"\n") + self.req.write(j.encode("utf-8")+b"\n") def finish(self): stats = dirnode.DeepStats.get_results(self) @@ -1372,8 +1372,8 @@ class ManifestStreamer(dirnode.DeepStats): } j = json.dumps(d, ensure_ascii=True) assert "\n" not in j - self.req.write(j+"\n") - return "" + self.req.write(j.encode("utf-8")+b"\n") + return b"" @implementer(IPushProducer) class DeepCheckStreamer(dirnode.DeepStats): From 2737229895bb55437cdd7140fd3e23a0eedf54fa Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 21 Dec 2020 11:12:52 -0500 Subject: [PATCH 158/186] Another passing test. 
--- src/allmydata/test/common_util.py | 14 +++++--------- src/allmydata/test/web/test_grid.py | 17 +++++++++++++---- 2 files changed, 18 insertions(+), 13 deletions(-) diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py index 1ad0f8242..168714492 100644 --- a/src/allmydata/test/common_util.py +++ b/src/allmydata/test/common_util.py @@ -1,7 +1,8 @@ from __future__ import print_function -from future.utils import PY2, native_str +from future.utils import PY2, native_str, bchr, binary_type from future.builtins import str as future_str +from past.builtins import unicode import os import time @@ -20,9 +21,6 @@ from twisted.trial import unittest from ..util.assertutil import precondition from ..scripts import runner from allmydata.util.encodingutil import unicode_platform, get_filesystem_encoding, get_io_encoding -# Imported for backwards compatibility: -from future.utils import bord, bchr, binary_type -from past.builtins import unicode def skip_if_cannot_represent_filename(u): @@ -183,13 +181,11 @@ def insecurerandstr(n): return b''.join(map(bchr, map(randrange, [0]*n, [256]*n))) def flip_bit(good, which): - # TODO Probs need to update with bchr/bord as with flip_one_bit, below. 
- # flip the low-order bit of good[which] if which == -1: - pieces = good[:which], good[-1:], "" + pieces = good[:which], good[-1:], b"" else: pieces = good[:which], good[which:which+1], good[which+1:] - return pieces[0] + chr(ord(pieces[1]) ^ 0x01) + pieces[2] + return pieces[0] + bchr(ord(pieces[1]) ^ 0x01) + pieces[2] def flip_one_bit(s, offset=0, size=None): """ flip one random bit of the string s, in a byte greater than or equal to offset and less @@ -198,7 +194,7 @@ def flip_one_bit(s, offset=0, size=None): if size is None: size=len(s)-offset i = randrange(offset, offset+size) - result = s[:i] + bchr(bord(s[i])^(0x01<Return to file', res) @@ -1101,6 +1103,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 410, "Gone", "NoSharesError", self.GET, self.fileurls["0shares"])) def _check_zero_shares(body): + body = unicode(body, "utf-8") self.failIfIn("", body) body = " ".join(body.strip().split()) exp = ("NoSharesError: no shares could be found. " @@ -1118,6 +1121,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 410, "Gone", "NotEnoughSharesError", self.GET, self.fileurls["1share"])) def _check_one_share(body): + body = unicode(body, "utf-8") self.failIfIn("", body) body = " ".join(body.strip().split()) msgbase = ("NotEnoughSharesError: This indicates that some " @@ -1142,10 +1146,11 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 404, "Not Found", None, self.GET, self.fileurls["imaginary"])) def _missing_child(body): + body = unicode(body, "utf-8") self.failUnlessIn("No such child: imaginary", body) d.addCallback(_missing_child) - d.addCallback(lambda ignored: self.GET(self.fileurls["dir-0share"])) + d.addCallback(lambda ignored: self.GET_string(self.fileurls["dir-0share"])) def _check_0shares_dir_html(body): self.failUnlessIn(DIR_HTML_TAG, body) # we should see the regular page, but without the child table or @@ -1164,7 +1169,7 @@ class Grid(GridTestMixin, 
WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.failUnlessIn("No upload forms: directory is unreadable", body) d.addCallback(_check_0shares_dir_html) - d.addCallback(lambda ignored: self.GET(self.fileurls["dir-1share"])) + d.addCallback(lambda ignored: self.GET_string(self.fileurls["dir-1share"])) def _check_1shares_dir_html(body): # at some point, we'll split UnrecoverableFileError into 0-shares # and some-shares like we did for immutable files (since there @@ -1191,6 +1196,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, self.fileurls["dir-0share-json"])) def _check_unrecoverable_file(body): + body = unicode(body, "utf-8") self.failIfIn("", body) body = " ".join(body.strip().split()) exp = ("UnrecoverableFileError: the directory (or mutable file) " @@ -1218,7 +1224,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi # attach a webapi child that throws a random error, to test how it # gets rendered. 
w = c0.getServiceNamed("webish") - w.root.putChild("ERRORBOOM", ErrorBoom()) + w.root.putChild(b"ERRORBOOM", ErrorBoom()) # "Accept: */*" : should get a text/html stack trace # "Accept: text/plain" : should get a text/plain stack trace @@ -1231,6 +1237,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, "ERRORBOOM", headers={"accept": "*/*"})) def _internal_error_html1(body): + body = unicode(body, "utf-8") self.failUnlessIn("", "expected HTML, not '%s'" % body) d.addCallback(_internal_error_html1) @@ -1240,6 +1247,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, "ERRORBOOM", headers={"accept": "text/plain"})) def _internal_error_text2(body): + body = unicode(body, "utf-8") self.failIfIn("", body) self.failUnless(body.startswith("Traceback "), body) d.addCallback(_internal_error_text2) @@ -1251,6 +1259,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, "ERRORBOOM", headers={"accept": CLI_accepts})) def _internal_error_text3(body): + body = unicode(body, "utf-8") self.failIfIn("", body) self.failUnless(body.startswith("Traceback "), body) d.addCallback(_internal_error_text3) @@ -1260,7 +1269,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 500, "Internal Server Error", None, self.GET, "ERRORBOOM")) def _internal_error_html4(body): - self.failUnlessIn("", body) + self.failUnlessIn(b"", body) d.addCallback(_internal_error_html4) def _flush_errors(res): From c25dd5776888b0d3ec9e0ecf24d16d180376890c Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 21 Dec 2020 13:12:01 -0500 Subject: [PATCH 159/186] Make sure we can handle bytes, plus a couple other fixes. 
--- src/allmydata/dirnode.py | 3 +-- src/allmydata/scripts/debug.py | 6 ++++-- src/allmydata/test/web/test_grid.py | 2 -- src/allmydata/web/introweb.py | 3 +-- src/allmydata/web/status.py | 3 +-- src/allmydata/web/storage.py | 4 ++-- src/allmydata/webish.py | 2 +- 7 files changed, 10 insertions(+), 13 deletions(-) diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index fd6a9cc8c..e8b80b9ad 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -18,7 +18,6 @@ import time from zope.interface import implementer from twisted.internet import defer from foolscap.api import fireEventually -import json from allmydata.crypto import aes from allmydata.deep_stats import DeepStats @@ -31,7 +30,7 @@ from allmydata.interfaces import IFilesystemNode, IDirectoryNode, IFileNode, \ from allmydata.check_results import DeepCheckResults, \ DeepCheckAndRepairResults from allmydata.monitor import Monitor -from allmydata.util import hashutil, base32, log +from allmydata.util import hashutil, base32, log, jsonbytes as json from allmydata.util.encodingutil import quote_output, normalize from allmydata.util.assertutil import precondition from allmydata.util.netstring import netstring, split_netstring diff --git a/src/allmydata/scripts/debug.py b/src/allmydata/scripts/debug.py index fd3f2b87c..451b1d661 100644 --- a/src/allmydata/scripts/debug.py +++ b/src/allmydata/scripts/debug.py @@ -1,5 +1,7 @@ from __future__ import print_function +from future.utils import bchr + # do not import any allmydata modules at this level. Do that from inside # individual functions instead. 
import struct, time, os, sys @@ -905,7 +907,7 @@ def corrupt_share(options): f = open(fn, "rb+") f.seek(offset) d = f.read(1) - d = chr(ord(d) ^ 0x01) + d = bchr(ord(d) ^ 0x01) f.seek(offset) f.write(d) f.close() @@ -920,7 +922,7 @@ def corrupt_share(options): f.seek(m.DATA_OFFSET) data = f.read(2000) # make sure this slot contains an SMDF share - assert data[0] == b"\x00", "non-SDMF mutable shares not supported" + assert data[0:1] == b"\x00", "non-SDMF mutable shares not supported" f.close() (version, ig_seqnum, ig_roothash, ig_IV, ig_k, ig_N, ig_segsize, diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 0b58e2956..ecdc454b6 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -109,7 +109,6 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(self.CHECK, "good", "t=check") def _got_html_good(res): - res = unicode(res, "utf-8") self.failUnlessIn("Healthy", res) self.failIfIn("Not Healthy", res) soup = BeautifulSoup(res, 'html5lib') @@ -118,7 +117,6 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_got_html_good) d.addCallback(self.CHECK, "good", "t=check&return_to=somewhere") def _got_html_good_return_to(res): - res = unicode(res, "utf-8") self.failUnlessIn("Healthy", res) self.failIfIn("Not Healthy", res) self.failUnlessIn('Return to file', res) diff --git a/src/allmydata/web/introweb.py b/src/allmydata/web/introweb.py index 380b6efd4..6ec558e82 100644 --- a/src/allmydata/web/introweb.py +++ b/src/allmydata/web/introweb.py @@ -5,8 +5,7 @@ from twisted.web.template import Element, XMLFile, renderElement, renderer from twisted.python.filepath import FilePath from twisted.web import static import allmydata -import json -from allmydata.util import idlib +from allmydata.util import idlib, jsonbytes as json from allmydata.web.common import ( render_time, MultiFormatResource, diff --git 
a/src/allmydata/web/status.py b/src/allmydata/web/status.py index 3284dfda6..2002b2fdf 100644 --- a/src/allmydata/web/status.py +++ b/src/allmydata/web/status.py @@ -3,7 +3,6 @@ from past.builtins import long, unicode import pprint import itertools import hashlib -import json from twisted.internet import defer from twisted.python.filepath import FilePath from twisted.web.resource import Resource @@ -14,7 +13,7 @@ from twisted.web.template import ( renderElement, tags, ) -from allmydata.util import base32, idlib +from allmydata.util import base32, idlib, jsonbytes as json from allmydata.web.common import ( abbreviate_time, abbreviate_rate, diff --git a/src/allmydata/web/storage.py b/src/allmydata/web/storage.py index 51624a409..82c789d9b 100644 --- a/src/allmydata/web/storage.py +++ b/src/allmydata/web/storage.py @@ -1,6 +1,6 @@ from future.utils import PY2 -import time, json +import time from twisted.python.filepath import FilePath from twisted.web.template import ( Element, @@ -14,7 +14,7 @@ from allmydata.web.common_py3 import ( MultiFormatResource ) from allmydata.util.abbreviate import abbreviate_space -from allmydata.util import time_format, idlib +from allmydata.util import time_format, idlib, jsonbytes as json def remove_prefix(s, prefix): diff --git a/src/allmydata/webish.py b/src/allmydata/webish.py index a7f44514f..3bfd073a4 100644 --- a/src/allmydata/webish.py +++ b/src/allmydata/webish.py @@ -110,7 +110,7 @@ def _get_client_ip(request): def _logFormatter(logDateTime, request): - print("REQUEST: {}".format(request.uri)) + print("REQUEST: {} {}".format(request.method, request.uri)) # we build up a log string that hides most of the cap, to preserve # user privacy. We retain the query args so we can identify things # like t=json. Then we send it to the flog. 
We make no attempt to

From 7fc64fdf45bed96eba70f4b8075e2e2df7441336 Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Mon, 21 Dec 2020 13:20:14 -0500
Subject: [PATCH 160/186] Also handle bytes when serializing production Eliot
 log messages on Python 3.

---
 src/allmydata/test/test_eliotutil.py | 5 ++++-
 src/allmydata/util/eliotutil.py      | 5 ++++-
 2 files changed, 8 insertions(+), 2 deletions(-)

diff --git a/src/allmydata/test/test_eliotutil.py b/src/allmydata/test/test_eliotutil.py
index 1a8c2f801..0073a7675 100644
--- a/src/allmydata/test/test_eliotutil.py
+++ b/src/allmydata/test/test_eliotutil.py
@@ -57,11 +57,14 @@ from ..util.eliotutil import (
     _parse_destination_description,
     _EliotLogging,
 )
+from ..util.jsonbytes import BytesJSONEncoder
+
 from .common import (
     SyncTestCase,
     AsyncTestCase,
 )
 
+
 class EliotLoggedTestTests(AsyncTestCase):
     def test_returns_none(self):
         Message.log(hello="world")
@@ -94,7 +97,7 @@ class ParseDestinationDescriptionTests(SyncTestCase):
         reactor = object()
         self.assertThat(
             _parse_destination_description("file:-")(reactor),
-            Equals(FileDestination(stdout)),
+            Equals(FileDestination(stdout, encoder=BytesJSONEncoder)),
         )
 
diff --git a/src/allmydata/util/eliotutil.py b/src/allmydata/util/eliotutil.py
index 096356dfa..9e3cdd3e1 100644
--- a/src/allmydata/util/eliotutil.py
+++ b/src/allmydata/util/eliotutil.py
@@ -86,6 +86,9 @@ from twisted.internet.defer import (
 )
 from twisted.application.service import Service
 
+from .jsonbytes import BytesJSONEncoder
+
+
 def validateInstanceOf(t):
     """
     Return an Eliot validator that requires values to be instances of ``t``.
@@ -302,7 +305,7 @@ class _DestinationParser(object):
             rotateLength=rotate_length,
             maxRotatedFiles=max_rotated_files,
         )
-        return lambda reactor: FileDestination(get_file())
+        return lambda reactor: FileDestination(get_file(), BytesJSONEncoder)
 
 
 _parse_destination_description = _DestinationParser().parse

From f30376ade6357acf343eca506530deb963c18551 Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Tue, 22 Dec 2020 10:47:25 -0500
Subject: [PATCH 161/186] Make sure test output can encode bytes correctly too.

---
 src/allmydata/test/__init__.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/src/allmydata/test/__init__.py b/src/allmydata/test/__init__.py
index abbde919f..19c046eca 100644
--- a/src/allmydata/test/__init__.py
+++ b/src/allmydata/test/__init__.py
@@ -113,4 +113,5 @@ if sys.platform == "win32":
     initialize()
 
 from eliot import to_file
-to_file(open("eliot.log", "w"))
+from allmydata.util.jsonbytes import BytesJSONEncoder
+to_file(open("eliot.log", "w"), encoder=BytesJSONEncoder)

From 0534979e617ba57aa011eaf57b0c076c67dd538c Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Tue, 22 Dec 2020 11:03:23 -0500
Subject: [PATCH 162/186] Another passing test on Python 3.

---
 src/allmydata/web/common_py3.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/src/allmydata/web/common_py3.py b/src/allmydata/web/common_py3.py
index 05fc2b50b..9d4435706 100644
--- a/src/allmydata/web/common_py3.py
+++ b/src/allmydata/web/common_py3.py
@@ -70,6 +70,8 @@ class MultiFormatResource(resource.Resource, object):
         :return: The result of the selected renderer.
         """
         t = get_arg(req, self.formatArgument, self.formatDefault)
+        if isinstance(t, bytes):
+            t = unicode(t, "ascii")
         renderer = self._get_renderer(t)
         return renderer(req)
 

From 74c08883f5c8eaaa171a396bb89db04b0c4bc2e8 Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Tue, 22 Dec 2020 11:36:52 -0500
Subject: [PATCH 163/186] Another passing test on Python 3.
--- src/allmydata/test/web/test_grid.py | 22 ++++++++++++---------- src/allmydata/web/directory.py | 2 +- 2 files changed, 13 insertions(+), 11 deletions(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index ecdc454b6..f140bc69f 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -361,18 +361,19 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi # make sure directory listing tolerates unknown nodes d.addCallback(lambda ign: self.GET(self.rooturl)) def _check_directory_html(res, expected_type_suffix): - pattern = re.compile(r'\?%s[ \t\n\r]*' - '%s' % (expected_type_suffix, str(name)), + pattern = re.compile(br'\?%s[ \t\n\r]*' + b'%s' % ( + expected_type_suffix, name.encode("ascii")), re.DOTALL) self.failUnless(re.search(pattern, res), res) # find the More Info link for name, should be relative - mo = re.search(r'More Info', res) + mo = re.search(br'More Info', res) info_url = mo.group(1) - self.failUnlessReallyEqual(info_url, "%s?t=info" % (str(name),)) + self.failUnlessReallyEqual(info_url, b"%s?t=info" % (name.encode("ascii"),)) if immutable: - d.addCallback(_check_directory_html, "-IMM") + d.addCallback(_check_directory_html, b"-IMM") else: - d.addCallback(_check_directory_html, "") + d.addCallback(_check_directory_html, b"") d.addCallback(lambda ign: self.GET(self.rooturl+"?t=json")) def _check_directory_json(res, expect_rw_uri): @@ -392,7 +393,6 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(_check_directory_json, expect_rw_uri=not immutable) def _check_info(res, expect_rw_uri, expect_ro_uri): - self.failUnlessIn("Object Type: unknown", res) if expect_rw_uri: self.failUnlessIn(unknown_rwcap, res) if expect_ro_uri: @@ -402,6 +402,8 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.failUnlessIn(unknown_rocap, res) else: self.failIfIn(unknown_rocap, res) + res 
= unicode(res, "utf-8") + self.failUnlessIn("Object Type: unknown", res) self.failIfIn("Raw data as", res) self.failIfIn("Directory writecap", res) self.failIfIn("Checker Operations", res) @@ -413,7 +415,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: self.GET(expected_info_url)) d.addCallback(_check_info, expect_rw_uri=False, expect_ro_uri=False) - d.addCallback(lambda ign: self.GET("%s/%s?t=info" % (self.rooturl, str(name)))) + d.addCallback(lambda ign: self.GET("%s/%s?t=info" % (self.rooturl, name))) d.addCallback(_check_info, expect_rw_uri=False, expect_ro_uri=True) def _check_json(res, expect_rw_uri): @@ -445,9 +447,9 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi # or not future_node was immutable. d.addCallback(lambda ign: self.GET(self.rourl)) if immutable: - d.addCallback(_check_directory_html, "-IMM") + d.addCallback(_check_directory_html, b"-IMM") else: - d.addCallback(_check_directory_html, "-RO") + d.addCallback(_check_directory_html, b"-RO") d.addCallback(lambda ign: self.GET(self.rourl+"?t=json")) d.addCallback(_check_directory_json, expect_rw_uri=False) diff --git a/src/allmydata/web/directory.py b/src/allmydata/web/directory.py index 56ad9a85c..981c8ef56 100644 --- a/src/allmydata/web/directory.py +++ b/src/allmydata/web/directory.py @@ -1464,7 +1464,7 @@ class UnknownNodeHandler(Resource, object): @render_exception def render_GET(self, req): - t = get_arg(req, b"t", b"").strip() + t = unicode(get_arg(req, "t", "").strip(), "ascii") if t == "info": return MoreInfo(self.node) if t == "json": From 013388981cf85a9082e35c8da15aefafb59bc615 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 22 Dec 2020 13:04:53 -0500 Subject: [PATCH 164/186] Fix intermittent failing test on PyPy hopefully. PyPy does not have refcounts, so files were only being closed when GC happened, which meant their buffered writes never hit disk. 
---
 newsfragments/3572.minor          |  0
 src/allmydata/test/test_system.py | 18 ++++++++++++------
 2 files changed, 12 insertions(+), 6 deletions(-)
 create mode 100644 newsfragments/3572.minor

diff --git a/newsfragments/3572.minor b/newsfragments/3572.minor
new file mode 100644
index 000000000..e69de29bb
diff --git a/src/allmydata/test/test_system.py b/src/allmydata/test/test_system.py
index e8b27fa69..235361cf8 100644
--- a/src/allmydata/test/test_system.py
+++ b/src/allmydata/test/test_system.py
@@ -2327,7 +2327,8 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase):
             files.append(fn)
             data = "data to be uploaded: file%d\n" % i
             datas.append(data)
-            open(fn,"wb").write(data)
+            with open(fn, "wb") as f:
+                f.write(data)
 
         def _check_stdout_against(out_and_err, filenum=None, data=None):
             (out, err) = out_and_err
@@ -2505,13 +2506,18 @@ class SystemTest(SystemTestMixin, RunBinTahoeMixin, unittest.TestCase):
         # recursive copy: setup
         dn = os.path.join(self.basedir, "dir1")
         os.makedirs(dn)
-        open(os.path.join(dn, "rfile1"), "wb").write("rfile1")
-        open(os.path.join(dn, "rfile2"), "wb").write("rfile2")
-        open(os.path.join(dn, "rfile3"), "wb").write("rfile3")
+        with open(os.path.join(dn, "rfile1"), "wb") as f:
+            f.write("rfile1")
+        with open(os.path.join(dn, "rfile2"), "wb") as f:
+            f.write("rfile2")
+        with open(os.path.join(dn, "rfile3"), "wb") as f:
+            f.write("rfile3")
         sdn2 = os.path.join(dn, "subdir2")
         os.makedirs(sdn2)
-        open(os.path.join(sdn2, "rfile4"), "wb").write("rfile4")
-        open(os.path.join(sdn2, "rfile5"), "wb").write("rfile5")
+        with open(os.path.join(sdn2, "rfile4"), "wb") as f:
+            f.write("rfile4")
+        with open(os.path.join(sdn2, "rfile5"), "wb") as f:
+            f.write("rfile5")
 
         # from disk into tahoe
         d.addCallback(run, "cp", "-r", dn, "tahoe:")

From baa2cff29ceb50c821aa63b6aa8926dd5f1f3d77 Mon Sep 17 00:00:00 2001
From: Itamar Turner-Trauring
Date: Tue, 22 Dec 2020 13:14:18 -0500
Subject: [PATCH 165/186] Unbreak Python 2.
--- src/allmydata/test/web/test_grid.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index f140bc69f..c4d212133 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -1112,7 +1112,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi "severe corruption. You should perform a filecheck on " "this object to learn more. The full error message is: " "no shares (need 3). Last failure: None") - self.failUnlessReallyEqual(exp, body) + self.assertEqual(exp, body) d.addCallback(_check_zero_shares) From 8881728ca52c7fb1bfdf8d74fd8a9a939c308cb9 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 22 Dec 2020 13:17:07 -0500 Subject: [PATCH 166/186] Another passing test on Python 3. --- src/allmydata/test/web/test_grid.py | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index c4d212133..67496eb1e 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -473,9 +473,9 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.uris = {} self.fileurls = {} - lonely_uri = "URI:LIT:n5xgk" # LIT for "one" - mut_write_uri = "URI:SSK:vfvcbdfbszyrsaxchgevhmmlii:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq" - mut_read_uri = "URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" + lonely_uri = b"URI:LIT:n5xgk" # LIT for "one" + mut_write_uri = b"URI:SSK:vfvcbdfbszyrsaxchgevhmmlii:euw4iw7bbnkrrwpzuburbhppuxhc3gwxv26f6imekhz7zyw2ojnq" + mut_read_uri = b"URI:SSK-RO:e3mdrzfwhoq42hy5ubcz6rp3o4:ybyibhnp3vvwuq2vaw2ckjmesgkklfs6ghxleztqidihjyofgw7q" # This method tests mainly dirnode, but we'd have to duplicate code in order to # test the dirnode and web layers separately. 
@@ -518,7 +518,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi rep = str(dn) self.failUnlessIn("RO-IMM", rep) cap = dn.get_cap() - self.failUnlessIn("CHK", cap.to_string()) + self.failUnlessIn(b"CHK", cap.to_string()) self.cap = cap self.rootnode = dn self.rooturl = "uri/" + url_quote(dn.get_uri()) @@ -537,7 +537,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi entry = entries[0] (name_utf8, ro_uri, rwcapdata, metadata_s), subpos = split_netstring(entry, 4) name = name_utf8.decode("utf-8") - self.failUnlessEqual(rwcapdata, "") + self.failUnlessEqual(rwcapdata, b"") self.failUnlessIn(name, kids) (expected_child, ign) = kids[name] self.failUnlessReallyEqual(ro_uri, expected_child.get_readonly_uri()) @@ -564,7 +564,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: self.GET(self.rooturl)) def _check_html(res): soup = BeautifulSoup(res, 'html5lib') - self.failIfIn("URI:SSK", res) + self.failIfIn(b"URI:SSK", res) found = False for td in soup.find_all(u"td"): if td.text != u"FILE": From cbf348f21b379799376dd861d3af69eed76e2a60 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 22 Dec 2020 13:17:46 -0500 Subject: [PATCH 167/186] Get rid of debug print. --- src/allmydata/webish.py | 1 - 1 file changed, 1 deletion(-) diff --git a/src/allmydata/webish.py b/src/allmydata/webish.py index 3bfd073a4..f32f56714 100644 --- a/src/allmydata/webish.py +++ b/src/allmydata/webish.py @@ -110,7 +110,6 @@ def _get_client_ip(request): def _logFormatter(logDateTime, request): - print("REQUEST: {} {}".format(request.method, request.uri)) # we build up a log string that hides most of the cap, to preserve # user privacy. We retain the query args so we can identify things # like t=json. Then we send it to the flog. 
We make no attempt to From 3ca17454c4a2f277dc1dc826b640007dee863560 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 22 Dec 2020 13:18:07 -0500 Subject: [PATCH 168/186] News file. --- newsfragments/3566.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3566.minor diff --git a/newsfragments/3566.minor b/newsfragments/3566.minor new file mode 100644 index 000000000..e69de29bb From 1c7956bc1aad33fd3f3a666b7845ad6ea619a922 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 22 Dec 2020 13:19:59 -0500 Subject: [PATCH 169/186] Port to Python 3. --- src/allmydata/test/web/test_grid.py | 42 +++++++++++++++++------------ src/allmydata/util/_python3.py | 1 + 2 files changed, 26 insertions(+), 17 deletions(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 67496eb1e..7e5468143 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -1,6 +1,14 @@ +""" +Ported to Python 3. +""" from __future__ import print_function +from __future__ import absolute_import +from __future__ import division +from __future__ import unicode_literals -from past.builtins import unicode +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 import os.path, re from urllib.parse import quote as url_quote @@ -50,12 +58,12 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def CHECK(self, ign, which, args, clientnum=0): fileurl = self.fileurls[which] url = fileurl + "?" 
+ args - return self.GET(url, method="POST", clientnum=clientnum).addCallback(unicode, "utf-8") + return self.GET(url, method="POST", clientnum=clientnum).addCallback(str, "utf-8") def GET_string(self, *args, **kwargs): """Send an HTTP request, but convert result to Unicode string.""" d = GridTestMixin.GET(self, *args, **kwargs) - d.addCallback(unicode, "utf-8") + d.addCallback(str, "utf-8") return d def test_filecheck(self): @@ -402,7 +410,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.failUnlessIn(unknown_rocap, res) else: self.failIfIn(unknown_rocap, res) - res = unicode(res, "utf-8") + res = str(res, "utf-8") self.failUnlessIn("Object Type: unknown", res) self.failIfIn("Raw data as", res) self.failIfIn("Directory writecap", res) @@ -712,7 +720,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: self.delete_shares_numbered(self.uris["subdir"], - range(1, 10))) + list(range(1, 10)))) # root # root/good @@ -1072,7 +1080,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi d.addCallback(lambda ign: c0.upload(upload.Data(DATA, convergence=b""))) def _stash_bad(ur): self.fileurls["1share"] = "uri/" + url_quote(ur.get_uri()) - self.delete_shares_numbered(ur.get_uri(), range(1,10)) + self.delete_shares_numbered(ur.get_uri(), list(range(1,10))) u = uri.from_string(ur.get_uri()) u.key = testutil.flip_bit(u.key, 0) @@ -1084,14 +1092,14 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi u = n.get_uri() url = self.fileurls["dir-1share"] = "uri/" + url_quote(u) self.fileurls["dir-1share-json"] = url + "?t=json" - self.delete_shares_numbered(u, range(1,10)) + self.delete_shares_numbered(u, list(range(1,10))) d.addCallback(_mangle_dirnode_1share) d.addCallback(lambda ign: c0.create_dirnode()) def _mangle_dirnode_0share(n): u = n.get_uri() url = self.fileurls["dir-0share"] = "uri/" + url_quote(u) 
self.fileurls["dir-0share-json"] = url + "?t=json" - self.delete_shares_numbered(u, range(0,10)) + self.delete_shares_numbered(u, list(range(0,10))) d.addCallback(_mangle_dirnode_0share) # NotEnoughSharesError should be reported sensibly, with a @@ -1103,7 +1111,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 410, "Gone", "NoSharesError", self.GET, self.fileurls["0shares"])) def _check_zero_shares(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failIfIn("", body) body = " ".join(body.strip().split()) exp = ("NoSharesError: no shares could be found. " @@ -1121,7 +1129,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 410, "Gone", "NotEnoughSharesError", self.GET, self.fileurls["1share"])) def _check_one_share(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failIfIn("", body) body = " ".join(body.strip().split()) msgbase = ("NotEnoughSharesError: This indicates that some " @@ -1146,7 +1154,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi 404, "Not Found", None, self.GET, self.fileurls["imaginary"])) def _missing_child(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failUnlessIn("No such child: imaginary", body) d.addCallback(_missing_child) @@ -1196,7 +1204,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, self.fileurls["dir-0share-json"])) def _check_unrecoverable_file(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failIfIn("", body) body = " ".join(body.strip().split()) exp = ("UnrecoverableFileError: the directory (or mutable file) " @@ -1237,7 +1245,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, "ERRORBOOM", headers={"accept": "*/*"})) def _internal_error_html1(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failUnlessIn("", "expected HTML, 
not '%s'" % body) d.addCallback(_internal_error_html1) @@ -1247,7 +1255,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, "ERRORBOOM", headers={"accept": "text/plain"})) def _internal_error_text2(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failIfIn("", body) self.failUnless(body.startswith("Traceback "), body) d.addCallback(_internal_error_text2) @@ -1259,7 +1267,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, "ERRORBOOM", headers={"accept": CLI_accepts})) def _internal_error_text3(body): - body = unicode(body, "utf-8") + body = str(body, "utf-8") self.failIfIn("", body) self.failUnless(body.startswith("Traceback "), body) d.addCallback(_internal_error_text3) @@ -1316,7 +1324,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi f.write(" # this is a comment\n") f.write(" \n") f.write("\n") # also exercise blank lines - f.write("%s off-limits to you\n" % (unicode(base32.b2a(self.si), "ascii"),)) + f.write("%s off-limits to you\n" % (str(base32.b2a(self.si), "ascii"),)) f.close() # clients should be checking the blacklist each time, so we don't # need to restart the client @@ -1380,7 +1388,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.child_url = b"uri/"+dn.get_readonly_uri()+b"/child" d.addCallback(_get_dircap) d.addCallback(lambda ign: self.GET(self.dir_url_base, followRedirect=True)) - d.addCallback(lambda body: self.failUnlessIn(DIR_HTML_TAG, unicode(body, "utf-8"))) + d.addCallback(lambda body: self.failUnlessIn(DIR_HTML_TAG, str(body, "utf-8"))) d.addCallback(lambda ign: self.GET(self.dir_url_json1)) d.addCallback(lambda res: json.loads(res)) # just check it decodes d.addCallback(lambda ign: self.GET(self.dir_url_json2)) diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index af771cd5a..a7b77001a 100644 --- 
a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -180,6 +180,7 @@ PORTED_TEST_MODULES = [ "allmydata.test.test_uri", "allmydata.test.test_util", "allmydata.test.web.test_common", + "allmydata.test.web.test_grid", "allmydata.test.web.test_util", "allmydata.test.web.test_status", ] From c5b403bd2f35717ee9cecf968696ee534295ff88 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 23 Dec 2020 09:17:39 -0500 Subject: [PATCH 170/186] Make the class new style again on Python 2. --- src/allmydata/test/web/test_grid.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 7e5468143..f85538296 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -48,7 +48,7 @@ DIR_HTML_TAG = '' class CompletelyUnhandledError(Exception): pass -class ErrorBoom(resource.Resource): +class ErrorBoom(resource.Resource, object): @render_exception def render(self, req): raise CompletelyUnhandledError("whoops") From f736dc6f7b1339b97594d8389617afa11c8bffea Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 23 Dec 2020 09:34:48 -0500 Subject: [PATCH 171/186] Fix some tests caused by unicode rendering. 
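The unicode-rendering fix above is driven by `%r` formatting: on Python 3 a bytes value reprs with a `b` prefix, so a message interpolated with `%r` differs between interpreters unless the value is normalized first. A minimal sketch (the message text is illustrative):

```python
def describe(value):
    # repr() of bytes carries a b'' prefix on Python 3, which is why the
    # test now matches "Unknown t value:" and the quoted value separately,
    # and why the server side normalizes with ensure_str() before %r.
    return "Unknown t value: %r" % (value,)

assert describe(b"foo") == "Unknown t value: b'foo'"
assert describe(u"foo") == "Unknown t value: 'foo'"
```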
--- src/allmydata/test/web/test_web.py | 5 ++++- src/allmydata/web/common.py | 4 ++-- 2 files changed, 6 insertions(+), 3 deletions(-) diff --git a/src/allmydata/test/web/test_web.py b/src/allmydata/test/web/test_web.py index 1908afdeb..e975464d3 100644 --- a/src/allmydata/test/web/test_web.py +++ b/src/allmydata/test/web/test_web.py @@ -746,7 +746,10 @@ class MultiFormatResourceTests(TrialTestCase): "400 - Bad Format", response_body, ) self.assertIn( - "Unknown t value: 'foo'", response_body, + "Unknown t value:", response_body, + ) + self.assertIn( + "'foo'", response_body, ) diff --git a/src/allmydata/web/common.py b/src/allmydata/web/common.py index 2832cc6a8..57118d1d4 100644 --- a/src/allmydata/web/common.py +++ b/src/allmydata/web/common.py @@ -1,5 +1,5 @@ from past.builtins import unicode -from six import ensure_text +from six import ensure_text, ensure_str import time import json @@ -112,7 +112,7 @@ def parse_replace_arg(replace): try: return boolean_of_arg(replace) except WebError: - raise WebError("invalid replace= argument: %r" % (replace,), http.BAD_REQUEST) + raise WebError("invalid replace= argument: %r" % (ensure_str(replace),), http.BAD_REQUEST) def get_format(req, default="CHK"): From 58cb757816bdf84f369d32d029da1d730c916bea Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 23 Dec 2020 09:42:42 -0500 Subject: [PATCH 172/186] Sometimes these values are more extended Unicode than ASCII. --- src/allmydata/web/common_py3.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/allmydata/web/common_py3.py b/src/allmydata/web/common_py3.py index b91f53a3a..5aa338e4d 100644 --- a/src/allmydata/web/common_py3.py +++ b/src/allmydata/web/common_py3.py @@ -29,9 +29,9 @@ def get_arg(req, argname, default=None, multiple=False): :return: Either bytes or tuple of bytes. 
""" if isinstance(argname, unicode): - argname = argname.encode("ascii") + argname = argname.encode("utf-8") if isinstance(default, unicode): - default = default.encode("ascii") + default = default.encode("utf-8") results = [] if argname in req.args: results.extend(req.args[argname]) From 3c8550b666c15937029184241e51a12b6a2e3739 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 23 Dec 2020 09:54:35 -0500 Subject: [PATCH 173/186] Python 3 fix: direct indexing of bytes returns an int. --- src/allmydata/test/common_util.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py index 16e744902..74f15a464 100644 --- a/src/allmydata/test/common_util.py +++ b/src/allmydata/test/common_util.py @@ -194,7 +194,7 @@ def flip_one_bit(s, offset=0, size=None): if size is None: size=len(s)-offset i = randrange(offset, offset+size) - result = s[:i] + bchr(ord(s[i])^(0x01< Date: Wed, 23 Dec 2020 10:01:44 -0500 Subject: [PATCH 174/186] Fix tests on Python 3. 
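Patch 173 above fixes a classic porting pitfall: on Python 3, indexing a bytes object returns an int (Python 2 returned a one-byte string), while slicing still returns bytes. A short illustration of why the `ord()` call had to go:

```python
s = b"abc"

# Indexing yields the integer byte value on Python 3...
assert s[0] == 97
assert isinstance(s[0], int)

# ...so ord(s[0]) would raise TypeError; slicing still yields bytes.
assert s[0:1] == b"a"

# XOR-ing the int directly is the fixed flip-a-bit idiom:
flipped = s[:1] + bytes([s[1] ^ 0x01]) + s[2:]
assert flipped == b"acc"
```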
--- src/allmydata/test/test_checker.py | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/src/allmydata/test/test_checker.py b/src/allmydata/test/test_checker.py index 936d270ae..a7042468a 100644 --- a/src/allmydata/test/test_checker.py +++ b/src/allmydata/test/test_checker.py @@ -173,7 +173,7 @@ class WebResultsRendering(unittest.TestCase): return c def render_json(self, resource): - return self.successResultOf(render(resource, {"output": ["json"]})) + return self.successResultOf(render(resource, {b"output": [b"json"]})) def render_element(self, element, args=None): if args is None: @@ -186,7 +186,7 @@ class WebResultsRendering(unittest.TestCase): html = self.render_element(lcr) self.failUnlessIn(b"Literal files are always healthy", html) - html = self.render_element(lcr, args={"return_to": ["FOOURL"]}) + html = self.render_element(lcr, args={b"return_to": [b"FOOURL"]}) self.failUnlessIn(b"Literal files are always healthy", html) self.failUnlessIn(b'Return to file.', html) @@ -269,7 +269,7 @@ class WebResultsRendering(unittest.TestCase): self.failUnlessIn("File Check Results for SI=2k6avp", s) # abbreviated self.failUnlessIn("Not Recoverable! : rather dead", s) - html = self.render_element(w, args={"return_to": ["FOOURL"]}) + html = self.render_element(w, args={b"return_to": [b"FOOURL"]}) self.failUnlessIn(b'Return to file/directory.', html) From eb8837a4c8b0a22d607692248254f3220c1faf8b Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 23 Dec 2020 10:09:37 -0500 Subject: [PATCH 175/186] More things that need to be bytes. --- src/allmydata/test/test_storage_web.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/test_storage_web.py b/src/allmydata/test/test_storage_web.py index ca0cd85fc..b3f5fac98 100644 --- a/src/allmydata/test/test_storage_web.py +++ b/src/allmydata/test/test_storage_web.py @@ -70,7 +70,7 @@ def renderJSON(resource): """ Render a JSON from the given resource. 
""" - return render(resource, {"t": ["json"]}) + return render(resource, {b"t": [b"json"]}) class MyBucketCountingCrawler(BucketCountingCrawler): def finished_prefix(self, cycle, prefix): From 9e83343335a683d348cf68d523dbd35f4681e663 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 29 Dec 2020 10:47:58 -0500 Subject: [PATCH 176/186] news fragment --- newsfragments/3575.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3575.minor diff --git a/newsfragments/3575.minor b/newsfragments/3575.minor new file mode 100644 index 000000000..e69de29bb From 30b37e17dd44559f05242ec66925f9f6b2905d79 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 29 Dec 2020 10:48:03 -0500 Subject: [PATCH 177/186] More of a storage_index_hash test --- src/allmydata/test/test_hashutil.py | 30 +++++++++++++++++++++++++++-- 1 file changed, 28 insertions(+), 2 deletions(-) diff --git a/src/allmydata/test/test_hashutil.py b/src/allmydata/test/test_hashutil.py index abcd4f0fb..6ec861c9f 100644 --- a/src/allmydata/test/test_hashutil.py +++ b/src/allmydata/test/test_hashutil.py @@ -102,9 +102,35 @@ class HashUtilTests(unittest.TestCase): got_a = base32.b2a(got) self.failUnlessEqual(got_a, expected_a) - def test_known_answers(self): - # assert backwards compatibility + def test_storage_index_hash_known_answers(self): + """ + Verify backwards compatibility by comparing ``storage_index_hash`` outputs + for some well-known (to us) inputs. + """ + # This is a marginal case. b"" is not a valid aes 128 key. The + # implementation does nothing to avoid producing a result for it, + # though. self._testknown(hashutil.storage_index_hash, b"qb5igbhcc5esa6lwqorsy7e6am", b"") + + # This is a little bit more realistic though clearly this is a poor key choice. + self._testknown(hashutil.storage_index_hash, b"wvggbrnrezdpa5yayrgiw5nzja", b"x" * 16) + + # Here's a much more realistic key that I generated by reading some + # bytes from /dev/urandom. 
I computed the expected hash value twice. + # First using hashlib.sha256 and then with sha256sum(1). The input + # string given to the hash function was "43:," + # in each case. + self._testknown( + hashutil.storage_index_hash, + b"aarbseqqrpsfowduchcjbonscq", + base32.a2b(b"2ckv3dfzh6rgjis6ogfqhyxnzy"), + ) + + def test_known_answers(self): + """ + Verify backwards compatibility by comparing hash outputs for some + well-known (to us) inputs. + """ self._testknown(hashutil.block_hash, b"msjr5bh4evuh7fa3zw7uovixfbvlnstr5b65mrerwfnvjxig2jvq", b"") self._testknown(hashutil.uri_extension_hash, b"wthsu45q7zewac2mnivoaa4ulh5xvbzdmsbuyztq2a5fzxdrnkka", b"") self._testknown(hashutil.plaintext_hash, b"5lz5hwz3qj3af7n6e3arblw7xzutvnd3p3fjsngqjcb7utf3x3da", b"") From 5a543fd49776f8fd9828883ba510262639c2ce57 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 29 Dec 2020 13:21:59 -0500 Subject: [PATCH 178/186] news fragment --- newsfragments/3384.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3384.minor diff --git a/newsfragments/3384.minor b/newsfragments/3384.minor new file mode 100644 index 000000000..e69de29bb From ae87d53e4943beefdde3c20923a193b5c2ba2fe0 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Tue, 29 Dec 2020 13:22:03 -0500 Subject: [PATCH 179/186] Let us have a ~5ish coverage --- setup.py | 5 +---- 1 file changed, 1 insertion(+), 4 deletions(-) diff --git a/setup.py b/setup.py index 345d6aa08..0e5a43dba 100644 --- a/setup.py +++ b/setup.py @@ -385,10 +385,7 @@ setup(name="tahoe-lafs", # also set in __init__.py # this version from time to time, but we will do it # intentionally. "pyflakes == 2.2.0", - # coverage 5.0 breaks the integration tests in some opaque way. - # This probably needs to be addressed in a more permanent way - # eventually... 
- "coverage ~= 4.5", + "coverage ~= 5.0", "mock", "tox", "pytest", From c0358b3e03b8d500dcbb3ddcaaa6100f546e98d8 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 1 Jan 2021 15:14:47 -0500 Subject: [PATCH 180/186] Fold _encode_tail_segment in to _encode_segment --- src/allmydata/immutable/encode.py | 53 ++++++++++++++----------------- 1 file changed, 23 insertions(+), 30 deletions(-) diff --git a/src/allmydata/immutable/encode.py b/src/allmydata/immutable/encode.py index 9351df501..a9835b6b8 100644 --- a/src/allmydata/immutable/encode.py +++ b/src/allmydata/immutable/encode.py @@ -255,11 +255,11 @@ class Encoder(object): # captures the slot, not the value #d.addCallback(lambda res: self.do_segment(i)) # use this form instead: - d.addCallback(lambda res, i=i: self._encode_segment(i)) + d.addCallback(lambda res, i=i: self._encode_segment(i, is_tail=False)) d.addCallback(self._send_segment, i) d.addCallback(self._turn_barrier) last_segnum = self.num_segments - 1 - d.addCallback(lambda res: self._encode_tail_segment(last_segnum)) + d.addCallback(lambda res: self._encode_segment(last_segnum, is_tail=True)) d.addCallback(self._send_segment, last_segnum) d.addCallback(self._turn_barrier) @@ -317,8 +317,24 @@ class Encoder(object): dl.append(d) return self._gather_responses(dl) - def _encode_segment(self, segnum): - codec = self._codec + def _encode_segment(self, segnum, is_tail): + """ + Encode one segment of input into the configured number of shares. + + :param segnum: Ostensibly, the number of the segment to encode. In + reality, this parameter is ignored and the *next* segment is + encoded and returned. + + :param bool is_tail: ``True`` if this is the last segment, ``False`` + otherwise. + + :return: A ``Deferred`` which fires with a two-tuple. The first + element is a list of string-y objects representing the encoded + segment data for one of the shares. 
The second element is a list + of integers giving the share numbers of the shares in the first + element. + """ + codec = self._tail_codec if is_tail else self._codec start = time.time() # the ICodecEncoder API wants to receive a total of self.segment_size @@ -350,9 +366,11 @@ class Encoder(object): # footprint to 430KiB at the expense of more hash-tree overhead. d = self._gather_data(self.required_shares, input_piece_size, - crypttext_segment_hasher) + crypttext_segment_hasher, allow_short=is_tail) def _done_gathering(chunks): for c in chunks: + # If is_tail then a short trailing chunk will have been padded + # by _gather_data assert len(c) == input_piece_size self._crypttext_hashes.append(crypttext_segment_hasher.digest()) # during this call, we hit 5*segsize memory @@ -365,31 +383,6 @@ class Encoder(object): d.addCallback(_done) return d - def _encode_tail_segment(self, segnum): - - start = time.time() - codec = self._tail_codec - input_piece_size = codec.get_block_size() - - crypttext_segment_hasher = hashutil.crypttext_segment_hasher() - - d = self._gather_data(self.required_shares, input_piece_size, - crypttext_segment_hasher, allow_short=True) - def _done_gathering(chunks): - for c in chunks: - # a short trailing chunk will have been padded by - # _gather_data - assert len(c) == input_piece_size - self._crypttext_hashes.append(crypttext_segment_hasher.digest()) - return codec.encode(chunks) - d.addCallback(_done_gathering) - def _done(res): - elapsed = time.time() - start - self._times["cumulative_encoding"] += elapsed - return res - d.addCallback(_done) - return d - def _gather_data(self, num_chunks, input_chunk_size, crypttext_segment_hasher, allow_short=False): From 244089d7852f059991f6d3ce954d271853526158 Mon Sep 17 00:00:00 2001 From: Jean-Paul Calderone Date: Fri, 1 Jan 2021 15:15:06 -0500 Subject: [PATCH 181/186] news fragment --- newsfragments/3578.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3578.minor 
diff --git a/newsfragments/3578.minor b/newsfragments/3578.minor new file mode 100644 index 000000000..e69de29bb From a46a7dc7f8049f32dfd5a19dc27779a22f0953a1 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 4 Jan 2021 14:23:12 -0500 Subject: [PATCH 182/186] Log, don't raise. --- src/allmydata/web/root.py | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/src/allmydata/web/root.py b/src/allmydata/web/root.py index 969f22a1b..fdc72ab71 100644 --- a/src/allmydata/web/root.py +++ b/src/allmydata/web/root.py @@ -157,8 +157,9 @@ class URIHandler(resource.Resource, object): try: node = self.client.create_node_from_uri(name) return directory.make_handler_for(node, self.client) - except (TypeError, AssertionError): - raise + except (TypeError, AssertionError) as e: + log.msg(format="Failed to parse cap, perhaps due to bug: %(e)s", + e=e, level=log.WEIRD) raise WebError( "'{}' is not a valid file- or directory- cap".format(name) ) From d7db34f27a0fc3c11f5169d09168044963185d42 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 4 Jan 2021 14:33:06 -0500 Subject: [PATCH 183/186] Add explanation for if statement. --- src/allmydata/web/common_py3.py | 1 + 1 file changed, 1 insertion(+) diff --git a/src/allmydata/web/common_py3.py b/src/allmydata/web/common_py3.py index 5aa338e4d..3e9eb8379 100644 --- a/src/allmydata/web/common_py3.py +++ b/src/allmydata/web/common_py3.py @@ -70,6 +70,7 @@ class MultiFormatResource(resource.Resource, object): :return: The result of the selected renderer. """ t = get_arg(req, self.formatArgument, self.formatDefault) + # It's either bytes or None. if isinstance(t, bytes): t = unicode(t, "ascii") renderer = self._get_renderer(t) From 6f0838e2e9a2c8f9b7366c82d4b2e46d4f00d671 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 4 Jan 2021 14:34:18 -0500 Subject: [PATCH 184/186] Docstring. 
--- src/allmydata/test/common_util.py | 1 + 1 file changed, 1 insertion(+) diff --git a/src/allmydata/test/common_util.py b/src/allmydata/test/common_util.py index 74f15a464..2a70cff3a 100644 --- a/src/allmydata/test/common_util.py +++ b/src/allmydata/test/common_util.py @@ -181,6 +181,7 @@ def insecurerandstr(n): return b''.join(map(bchr, map(randrange, [0]*n, [256]*n))) def flip_bit(good, which): + """Flip the low-order bit of good[which].""" if which == -1: pieces = good[:which], good[-1:], b"" else: From 961ad123cc57b46580a1f77dbc44da67a5440588 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 4 Jan 2021 14:35:33 -0500 Subject: [PATCH 185/186] Better name. --- src/allmydata/test/web/test_grid.py | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index f85538296..135957ed4 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -60,7 +60,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi url = fileurl + "?" 
+ args return self.GET(url, method="POST", clientnum=clientnum).addCallback(str, "utf-8") - def GET_string(self, *args, **kwargs): + def GET_unicode(self, *args, **kwargs): """Send an HTTP request, but convert result to Unicode string.""" d = GridTestMixin.GET(self, *args, **kwargs) d.addCallback(str, "utf-8") @@ -1158,7 +1158,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.failUnlessIn("No such child: imaginary", body) d.addCallback(_missing_child) - d.addCallback(lambda ignored: self.GET_string(self.fileurls["dir-0share"])) + d.addCallback(lambda ignored: self.GET_unicode(self.fileurls["dir-0share"])) def _check_0shares_dir_html(body): self.failUnlessIn(DIR_HTML_TAG, body) # we should see the regular page, but without the child table or @@ -1177,7 +1177,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.failUnlessIn("No upload forms: directory is unreadable", body) d.addCallback(_check_0shares_dir_html) - d.addCallback(lambda ignored: self.GET_string(self.fileurls["dir-1share"])) + d.addCallback(lambda ignored: self.GET_unicode(self.fileurls["dir-1share"])) def _check_1shares_dir_html(body): # at some point, we'll split UnrecoverableFileError into 0-shares # and some-shares like we did for immutable files (since there @@ -1311,7 +1311,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.dir_uri = node.get_uri() self.dir_url = b"uri/"+self.dir_uri d.addCallback(_stash_dir) - d.addCallback(lambda ign: self.GET_string(self.dir_url, followRedirect=True)) + d.addCallback(lambda ign: self.GET_unicode(self.dir_url, followRedirect=True)) def _check_dir_html(body): self.failUnlessIn(DIR_HTML_TAG, body) self.failUnlessIn("blacklisted.txt", body) @@ -1335,7 +1335,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi self.GET, self.url)) # We should still be able to list the parent directory, in HTML... 
- d.addCallback(lambda ign: self.GET_string(self.dir_url, followRedirect=True)) + d.addCallback(lambda ign: self.GET_unicode(self.dir_url, followRedirect=True)) def _check_dir_html2(body): self.failUnlessIn(DIR_HTML_TAG, body) self.failUnlessIn("blacklisted.txt", body) From bc19ccc77a358f968547f3d7500d117a925be1c0 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Mon, 4 Jan 2021 14:36:02 -0500 Subject: [PATCH 186/186] Use method that already does this. --- src/allmydata/test/web/test_grid.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/allmydata/test/web/test_grid.py b/src/allmydata/test/web/test_grid.py index 135957ed4..ef2718df4 100644 --- a/src/allmydata/test/web/test_grid.py +++ b/src/allmydata/test/web/test_grid.py @@ -58,7 +58,7 @@ class Grid(GridTestMixin, WebErrorMixin, ShouldFailMixin, testutil.ReallyEqualMi def CHECK(self, ign, which, args, clientnum=0): fileurl = self.fileurls[which] url = fileurl + "?" + args - return self.GET(url, method="POST", clientnum=clientnum).addCallback(str, "utf-8") + return self.GET_unicode(url, method="POST", clientnum=clientnum) def GET_unicode(self, *args, **kwargs): """Send an HTTP request, but convert result to Unicode string."""
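A recurring pattern across this series is replacing Python 2's `unicode(body, "utf-8")` with `str(body, "utf-8")`, which also works as a Deferred callback taking an extra argument — hence `d.addCallback(str, "utf-8")` in `GET_unicode`. A minimal sketch:

```python
body = b"Object Type: unknown"

# str() with an encoding argument is the Python 3 spelling of
# unicode(bytes, "utf-8"): it decodes the bytes rather than
# stringifying them to "b'...'".
text = str(body, "utf-8")
assert text == "Object Type: unknown"
assert isinstance(text, str)

# Passed positionally to addCallback, the same callable decodes a
# Deferred's result: d.addCallback(str, "utf-8") invokes
# str(result, "utf-8") when d fires.
```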