From d28e172b5fc9e6327bc1bb8e2e6efe6d177aab85 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 9 Mar 2021 15:02:32 -0500 Subject: [PATCH 01/29] Update notes about reading/writing docs --- docs/README.txt | 39 +++++++++++++++++++++++++++++++++------ newsfragments/3630.minor | 0 2 files changed, 33 insertions(+), 6 deletions(-) create mode 100644 newsfragments/3630.minor diff --git a/docs/README.txt b/docs/README.txt index 87c9583d9..063d2d772 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -1,7 +1,34 @@ +If you are reading Tahoe-LAFS documentation +------------------------------------------- -Note: http://tahoe-lafs.readthedocs.io/en/latest/ is the preferred place to -read this documentation (GitHub doesn't render cross-document links or -images). If you're reading this on https://github.com/tahoe-lafs/tahoe-lafs , -or from a checked-out source tree, then either run `tox -e docs` and open -_build/html/index.html in your browser, or view the pre-rendered trunk copy -at http://tahoe-lafs.readthedocs.io/en/latest/ +If you are reading Tahoe-LAFS documentation at a code hosting site or +from a checked-out source tree, the preferred place to view the docs +is http://tahoe-lafs.readthedocs.io/en/latest/. Code-hosting sites do +not render cross-document links or images correctly. + + +If you are writing Tahoe-LAFS documentation +------------------------------------------- + +To edit Tahoe-LAFS docs, you will need a checked-out source tree. You +can edit the `.rst` files in this directory using a text editor, and +then generate HTML output using Sphinx, a program that can produce its +output in HTML and other formats. + +The files with `.rst` extension use reStructuredText markup format, +which is the format Sphinx natively handles. To learn more about +Sphinx, and for a friendly primer on reStructuredText, please see +Sphinx project's documentation, available at: + +https://www.sphinx-doc.org/ + +If you have tox installed, you can run `tox -e docs` and then open the +resulting docs/_build/html/index.html in your web browser. + +If you have Sphinx and Make installed, you can also run `make html` +within docs directory. You may also need to install some additional +Python packages, such as setuptools and recommonmark. + +Note that Sphinx can also process comments in Python source code to +generate API documentation. Tahoe-LAFS currently does not use Sphinx +for this purpose. diff --git a/newsfragments/3630.minor b/newsfragments/3630.minor new file mode 100644 index 000000000..e69de29bb From 8b1aa50b6967f32aaa6f8e19b5db129f7b1349b6 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:17:20 -0400 Subject: [PATCH 02/29] Use one space after period --- docs/README.txt | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/README.txt b/docs/README.txt index 063d2d772..571e28bdd 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -3,20 +3,20 @@ If you are reading Tahoe-LAFS documentation If you are reading Tahoe-LAFS documentation at a code hosting site or from a checked-out source tree, the preferred place to view the docs -is http://tahoe-lafs.readthedocs.io/en/latest/. Code-hosting sites do +is http://tahoe-lafs.readthedocs.io/en/latest/. Code-hosting sites do not render cross-document links or images correctly. If you are writing Tahoe-LAFS documentation ------------------------------------------- -To edit Tahoe-LAFS docs, you will need a checked-out source tree. You +To edit Tahoe-LAFS docs, you will need a checked-out source tree. 
You can edit the `.rst` files in this directory using a text editor, and then generate HTML output using Sphinx, a program that can produce its output in HTML and other formats. The files with `.rst` extension use reStructuredText markup format, -which is the format Sphinx natively handles. To learn more about +which is the format Sphinx natively handles. To learn more about Sphinx, and for a friendly primer on reStructuredText, please see Sphinx project's documentation, available at: @@ -26,9 +26,9 @@ If you have tox installed, you can run `tox -e docs` and then open the resulting docs/_build/html/index.html in your web browser. If you have Sphinx and Make installed, you can also run `make html` -within docs directory. You may also need to install some additional +within docs directory. You may also need to install some additional Python packages, such as setuptools and recommonmark. Note that Sphinx can also process comments in Python source code to -generate API documentation. Tahoe-LAFS currently does not use Sphinx +generate API documentation. Tahoe-LAFS currently does not use Sphinx for this purpose. From c71b72758e8b368981d75330b9ff5324f6c776f5 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:18:25 -0400 Subject: [PATCH 03/29] "The files with ..." to "Files with ..." --- docs/README.txt | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/README.txt b/docs/README.txt index 571e28bdd..868e4cd2b 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -15,10 +15,10 @@ can edit the `.rst` files in this directory using a text editor, and then generate HTML output using Sphinx, a program that can produce its output in HTML and other formats. -The files with `.rst` extension use reStructuredText markup format, -which is the format Sphinx natively handles. To learn more about -Sphinx, and for a friendly primer on reStructuredText, please see -Sphinx project's documentation, available at: +Files with `.rst` extension use reStructuredText markup format, which +is the format Sphinx natively handles. To learn more about Sphinx, and +for a friendly primer on reStructuredText, please see Sphinx project's +documentation, available at: https://www.sphinx-doc.org/ From 72ffc6211cdd06ee6299eb15ce2629411166d35c Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:34:46 -0400 Subject: [PATCH 04/29] tox to `tox` --- docs/README.txt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/README.txt b/docs/README.txt index 868e4cd2b..d710470f7 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -22,8 +22,8 @@ documentation, available at: https://www.sphinx-doc.org/ -If you have tox installed, you can run `tox -e docs` and then open the -resulting docs/_build/html/index.html in your web browser. +If you have `tox` installed, you can run `tox -e docs` and then open +the resulting docs/_build/html/index.html in your web browser. If you have Sphinx and Make installed, you can also run `make html` within docs directory. 
You may also need to install some additional From db02800232d2394154837384a2d049d774c28b1b Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:35:30 -0400 Subject: [PATCH 05/29] `docs directory` to `the docs directory` --- docs/README.txt | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/README.txt b/docs/README.txt index d710470f7..987014518 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -26,8 +26,8 @@ If you have `tox` installed, you can run `tox -e docs` and then open the resulting docs/_build/html/index.html in your web browser. If you have Sphinx and Make installed, you can also run `make html` -within docs directory. You may also need to install some additional -Python packages, such as setuptools and recommonmark. +within the docs directory. You may also need to install some +additional Python packages, such as setuptools and recommonmark. Note that Sphinx can also process comments in Python source code to generate API documentation. Tahoe-LAFS currently does not use Sphinx From 517cfad4df33d667cdef411026140e7a041eaffc Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:36:06 -0400 Subject: [PATCH 06/29] `setuptools` and `recommonmark` --- docs/README.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/README.txt b/docs/README.txt index 987014518..25460bbaa 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -27,7 +27,7 @@ the resulting docs/_build/html/index.html in your web browser. If you have Sphinx and Make installed, you can also run `make html` within the docs directory. You may also need to install some -additional Python packages, such as setuptools and recommonmark. +additional Python packages, such as `setuptools` and `recommonmark`. Note that Sphinx can also process comments in Python source code to generate API documentation. Tahoe-LAFS currently does not use Sphinx From 5ac52da14c07800772cec480f1914e97d19e83f6 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:45:56 -0400 Subject: [PATCH 07/29] `comments in Python source code` to `Python docstrings` --- docs/README.txt | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/README.txt b/docs/README.txt index 25460bbaa..53f905e90 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -29,6 +29,6 @@ If you have Sphinx and Make installed, you can also run `make html` within the docs directory. You may also need to install some additional Python packages, such as `setuptools` and `recommonmark`. -Note that Sphinx can also process comments in Python source code to -generate API documentation. Tahoe-LAFS currently does not use Sphinx -for this purpose. +Note that Sphinx can also process Python docstrings to generate API +documentation. Tahoe-LAFS currently does not use Sphinx for this +purpose. From 5b39940d47239726a3ee3328503fc1c55fd2ff4f Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Tue, 23 Mar 2021 10:52:51 -0400 Subject: [PATCH 08/29] Drop the note about running `make html` in docs --- docs/README.txt | 4 ---- 1 file changed, 4 deletions(-) diff --git a/docs/README.txt b/docs/README.txt index 53f905e90..b571a2077 100644 --- a/docs/README.txt +++ b/docs/README.txt @@ -25,10 +25,6 @@ https://www.sphinx-doc.org/ If you have `tox` installed, you can run `tox -e docs` and then open the resulting docs/_build/html/index.html in your web browser. -If you have Sphinx and Make installed, you can also run `make html` -within the docs directory. 
You may also need to install some -additional Python packages, such as `setuptools` and `recommonmark`. - Note that Sphinx can also process Python docstrings to generate API documentation. Tahoe-LAFS currently does not use Sphinx for this purpose. From 0c278a5f3c1eb012817b2ec0306291caea55b0d5 Mon Sep 17 00:00:00 2001 From: May-Lee Sia Date: Thu, 25 Mar 2021 17:56:59 +0100 Subject: [PATCH 09/29] Add May-Lee Added my name and my pronouns JIC. --- docs/CODE_OF_CONDUCT.md | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/CODE_OF_CONDUCT.md b/docs/CODE_OF_CONDUCT.md index 3b14f2d70..174157e8c 100644 --- a/docs/CODE_OF_CONDUCT.md +++ b/docs/CODE_OF_CONDUCT.md @@ -45,6 +45,7 @@ The following community members have made themselves available for conduct issue - Jean-Paul Calderone (jean-paul at leastauthority dot com) - meejah (meejah at meejah dot ca) +- May-Lee Sia(she/her) (tahoe dot lafs dot community at gmail dot com) This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.3.0, available at From bfef77a39683d698450d69151059bc15ac507c45 Mon Sep 17 00:00:00 2001 From: "Jason R. Coombs" Date: Mon, 15 Feb 2021 14:38:11 -0500 Subject: [PATCH 10/29] Port runner to Python 3. --- src/allmydata/scripts/runner.py | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/src/allmydata/scripts/runner.py b/src/allmydata/scripts/runner.py index ada3c2dfc..9d5df478d 100644 --- a/src/allmydata/scripts/runner.py +++ b/src/allmydata/scripts/runner.py @@ -1,5 +1,11 @@ from __future__ import print_function +from __future__ import absolute_import +from __future__ import division +from __future__ import unicode_literals +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 import warnings import os, sys from six.moves import StringIO From 110e77b56031b111f48f1827826927dfc9b0f26f Mon Sep 17 00:00:00 2001 From: "Jason R. Coombs" Date: Fri, 26 Mar 2021 09:48:46 -0400 Subject: [PATCH 11/29] Mark module as ported --- src/allmydata/util/_python3.py | 1 + 1 file changed, 1 insertion(+) diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index f26b77185..7b3226bc5 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -77,6 +77,7 @@ PORTED_MODULES = [ "allmydata.node", "allmydata.nodemaker", "allmydata.scripts.create_node", + "allmydata.scripts.runner", "allmydata.scripts.types_", "allmydata.stats", "allmydata.storage_client", From a6147b05b1d52a32e981521e6e3fa07e339ff394 Mon Sep 17 00:00:00 2001 From: "Jason R. Coombs" Date: Fri, 26 Mar 2021 11:24:39 -0400 Subject: [PATCH 12/29] Fix test failure in test_unicode_arguments_and_output on Python 2. 
--- src/allmydata/scripts/runner.py | 23 ++++++++++++++++++++--- 1 file changed, 20 insertions(+), 3 deletions(-) diff --git a/src/allmydata/scripts/runner.py b/src/allmydata/scripts/runner.py index 9d5df478d..dc83e2ff0 100644 --- a/src/allmydata/scripts/runner.py +++ b/src/allmydata/scripts/runner.py @@ -22,7 +22,7 @@ from twisted.internet import defer, task, threads from allmydata.scripts.common import get_default_nodedir from allmydata.scripts import debug, create_node, cli, \ admin, tahoe_run, tahoe_invite -from allmydata.util.encodingutil import quote_local_unicode_path +from allmydata.util.encodingutil import quote_local_unicode_path, argv_to_unicode from allmydata.util.eliotutil import ( opt_eliot_destination, opt_help_eliot_destinations, @@ -118,6 +118,22 @@ def parse_options(argv, config=None): config.parseOptions(argv) # may raise usage.error return config + +def _safety_encode(msg): + """ + Previously, on Python 2, argv was utf-8 encoded bytes and + Options.parseOptions would pass through those unicode + characters (such as in a UsageError unknown command). + + In a Unicode-preferred world, the argv is decoded to + Unicode early and that's what's passed through, but + print still needs an encoded value. + """ + if PY2: + return msg.encode('utf-8') + return msg + + def parse_or_exit_with_explanation(argv, stdout=sys.stdout): config = Options() try: @@ -127,7 +143,7 @@ def parse_or_exit_with_explanation(argv, stdout=sys.stdout): while hasattr(c, 'subOptions'): c = c.subOptions print(str(c), file=stdout) - print("%s: %s\n" % (sys.argv[0], e), file=stdout) + print(_safety_encode("%s: %s\n" % (sys.argv[0], e)), file=stdout) sys.exit(1) return config @@ -235,7 +251,8 @@ def _run_with_reactor(reactor): _setup_coverage(reactor) - d = defer.maybeDeferred(parse_or_exit_with_explanation, sys.argv[1:]) + argv = list(map(argv_to_unicode, sys.argv[1:])) + d = defer.maybeDeferred(parse_or_exit_with_explanation, argv) d.addCallback(_maybe_enable_eliot_logging, reactor) d.addCallback(dispatch) def _show_exception(f): From f8dddcd1d031b6e187e215a464a4118882bf8612 Mon Sep 17 00:00:00 2001 From: "Jason R. Coombs" Date: Fri, 26 Mar 2021 11:27:53 -0400 Subject: [PATCH 13/29] Add newsfragment --- newsfragments/3603.minor.2 | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3603.minor.2 diff --git a/newsfragments/3603.minor.2 b/newsfragments/3603.minor.2 new file mode 100644 index 000000000..e69de29bb From 32b2e93302393b8b5b8b1cb59ad48ed1d01001a3 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Sat, 27 Mar 2021 10:03:12 -0400 Subject: [PATCH 14/29] Use newer version of coveralls for reporting --- .github/workflows/ci.yml | 20 ++++++++++++++++++-- 1 file changed, 18 insertions(+), 2 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index e14268f23..1b75f95b9 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -87,10 +87,26 @@ jobs: # Action for this, as of Jan 2021 it does not support Python coverage # files - only lcov files. Therefore, we use coveralls-python, the # coveralls.io-supplied Python reporter, for this. + # + # It is coveralls-python 1.x that has maintained compatibility + # with Python 2, while coveralls-python 3.x is compatible with + # Python 3. Sadly we can't use them both in the same workflow. + # + # The two versions of coveralls-python are somewhat mutually + # incompatible. 
Mixing these two different versions when + # reporting coverage to coveralls.io will lead to grief, since + # they get job IDs in different fashion. If we use both + # versions of coveralls in the same workflow, the finalizing + # step will be able to mark only part of the jobs as done, and + # the other part will be left hanging, never marked as done. It + # doesn't matter if we make an API call or `coveralls --finish`. + # + # So we just use the newer coveralls-python that is available to + # Python 3 throughout this workflow. - name: "Report Coverage to Coveralls" run: | - pip install coveralls - python -m coveralls + pip3 install --upgrade coveralls==3.0.1 + python3 -m coveralls env: # Some magic value required for some magic reason. GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}" From 626eba53b81f6fcbc2a7b6366c370fba9ffaa1c3 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Sat, 27 Mar 2021 10:17:22 -0400 Subject: [PATCH 15/29] Run `coveralls --finish` to indicate completion --- .github/workflows/ci.yml | 77 +++++----------------------------------- 1 file changed, 9 insertions(+), 68 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 1b75f95b9..8eaef0d07 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -129,80 +129,21 @@ jobs: # a single report, we have to tell Coveralls when we've uploaded all of the # data files. This does it. We make sure it runs last by making it depend # on *all* of the coverage-collecting jobs. + # + # See notes about parallel builds on GitHub Actions at + # https://coveralls-python.readthedocs.io/en/latest/usage/configuration.html finish-coverage-report: - # There happens to just be one coverage-collecting job at the moment. If - # the coverage reports are broken and someone added more - # coverage-collecting jobs to this workflow but didn't update this, that's - # why. - needs: - - "coverage" - runs-on: "ubuntu-latest" + needs: coverage + runs-on: ubuntu-latest + container: python:3-slim steps: - - name: "Check out Tahoe-LAFS sources" - uses: "actions/checkout@v2" - - - name: "Finish Coveralls Reporting" + - name: "Indicate completion to coveralls.io" run: | - # coveralls-python does have a `--finish` option but it doesn't seem - # to work, at least for us. - # https://github.com/coveralls-clients/coveralls-python/issues/248 - # - # But all it does is this simple POST so we can just send it - # ourselves. The only hard part is guessing what the POST - # parameters mean. And I've done that for you already. - # - # Since the build is done I'm going to guess that "done" is a fine - # value for status. - # - # That leaves "build_num". The coveralls documentation gives some - # hints about it. It suggests using $CIRCLE_WORKFLOW_ID if your job - # is on CircleCI. CircleCI documentation says this about - # CIRCLE_WORKFLOW_ID: - # - # Observation of the coveralls.io web interface, logs from the - # coveralls command in action, and experimentation suggests the - # value for PRs is something more like: - # - # -PR- - # - # For branches, it's just the git branch tip hash. - - # For pull requests, refs/pull//merge was just checked out - # by so HEAD will refer to the right revision. For branches, HEAD - # is also the tip of the branch. - REV=$(git rev-parse HEAD) - - # We can get the PR number from the "context". - # - # https://docs.github.com/en/free-pro-team@latest/developers/webhooks-and-events/webhook-events-and-payloads#pull_request - # - # (via ). 
- # - # If this is a pull request, `github.event` is a `pull_request` - # structure which has `number` right in it. - # - # If this is a push, `github.event` is a `push` instead but we only - # need the revision to construct the build_num. - - PR=${{ github.event.number }} - - if [ "${PR}" = "" ]; then - BUILD_NUM=$REV - else - BUILD_NUM=$REV-PR-$PR - fi - REPO_NAME=$GITHUB_REPOSITORY - - curl \ - -k \ - https://coveralls.io/webhook?repo_token=$COVERALLS_REPO_TOKEN \ - -d \ - "payload[build_num]=$BUILD_NUM&payload[status]=done&payload[repo_name]=$REPO_NAME" + pip3 install --upgrade coveralls==3.0.1 + python3 -m coveralls --finish env: # Some magic value required for some magic reason. GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}" - # Help coveralls identify our project. - COVERALLS_REPO_TOKEN: "JPf16rLB7T2yjgATIxFzTsEgMdN1UNq6o" integration: runs-on: ${{ matrix.os }} From ed203817d72972054121078a3d4d692eef0bd0fd Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Sat, 27 Mar 2021 10:17:42 -0400 Subject: [PATCH 16/29] Add newsfragment --- newsfragments/3653.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3653.minor diff --git a/newsfragments/3653.minor b/newsfragments/3653.minor new file mode 100644 index 000000000..e69de29bb From 4e4faee5fd21f21604c7c6b64d8e610c39855dad Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Sat, 27 Mar 2021 10:21:34 -0400 Subject: [PATCH 17/29] Update notes about coveralls versions --- .github/workflows/ci.yml | 11 +++++++---- 1 file changed, 7 insertions(+), 4 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 8eaef0d07..d565ede47 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -98,11 +98,14 @@ jobs: # they get job IDs in different fashion. If we use both # versions of coveralls in the same workflow, the finalizing # step will be able to mark only part of the jobs as done, and - # the other part will be left hanging, never marked as done. It - # doesn't matter if we make an API call or `coveralls --finish`. + # the other part will be left hanging, never marked as done: it + # does not matter if we make an API call or `coveralls --finish` + # to indicate that CI has finished running. # - # So we just use the newer coveralls-python that is available to - # Python 3 throughout this workflow. + # So we try to use the newer coveralls-python that is available + # via Python 3 (which is present in GitHub Actions tool cache, + # even when we're running Python 2.7 tests) throughout this + # workflow. - name: "Report Coverage to Coveralls" run: | pip3 install --upgrade coveralls==3.0.1 From 43f1f115cbe8b288a83a6bfe4b47121a0ab565a8 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 30 Mar 2021 09:46:56 -0400 Subject: [PATCH 18/29] Simplify. --- src/allmydata/scripts/runner.py | 20 ++++---------------- 1 file changed, 4 insertions(+), 16 deletions(-) diff --git a/src/allmydata/scripts/runner.py b/src/allmydata/scripts/runner.py index dc83e2ff0..7122e499e 100644 --- a/src/allmydata/scripts/runner.py +++ b/src/allmydata/scripts/runner.py @@ -119,21 +119,6 @@ def parse_options(argv, config=None): return config -def _safety_encode(msg): - """ - Previously, on Python 2, argv was utf-8 encoded bytes and - Options.parseOptions would pass through those unicode - characters (such as in a UsageError unknown command). - - In a Unicode-preferred world, the argv is decoded to - Unicode early and that's what's passed through, but - print still needs an encoded value. 
- """ - if PY2: - return msg.encode('utf-8') - return msg - - def parse_or_exit_with_explanation(argv, stdout=sys.stdout): config = Options() try: @@ -143,7 +128,10 @@ def parse_or_exit_with_explanation(argv, stdout=sys.stdout): while hasattr(c, 'subOptions'): c = c.subOptions print(str(c), file=stdout) - print(_safety_encode("%s: %s\n" % (sys.argv[0], e)), file=stdout) + # On Python 2 the string may turn into a unicode string, e.g. the error + # may be unicode, in which case it will print funny. Once we're on + # Python 3 we can just drop the ensure_str(). + print(six.ensure_str("%s: %s\n" % (sys.argv[0], e)), file=stdout) sys.exit(1) return config From e97043eeae0fc38202d5f3f56760a81b511ca5f0 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 30 Mar 2021 09:58:22 -0400 Subject: [PATCH 19/29] News file. --- newsfragments/3655.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3655.minor diff --git a/newsfragments/3655.minor b/newsfragments/3655.minor new file mode 100644 index 000000000..e69de29bb From 90c393b8b22607a986ce596b4e70f8a7ce707843 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 30 Mar 2021 10:04:28 -0400 Subject: [PATCH 20/29] Port __init__.py modules to Python 3 (or just mark them as ported if they're empty). --- src/allmydata/introducer/__init__.py | 13 +++++++++++++ src/allmydata/test/__init__.py | 12 ++++++++++-- src/allmydata/util/_python3.py | 14 ++++++++++++++ 3 files changed, 37 insertions(+), 2 deletions(-) diff --git a/src/allmydata/introducer/__init__.py b/src/allmydata/introducer/__init__.py index c4a644e76..bfc960e05 100644 --- a/src/allmydata/introducer/__init__.py +++ b/src/allmydata/introducer/__init__.py @@ -1,3 +1,16 @@ +""" +Ported to Python 3. +""" + +from __future__ import unicode_literals +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function + +from future.utils import PY2 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 + from allmydata.introducer.server import create_introducer diff --git a/src/allmydata/test/__init__.py b/src/allmydata/test/__init__.py index e3ac48290..a1cbb76a1 100644 --- a/src/allmydata/test/__init__.py +++ b/src/allmydata/test/__init__.py @@ -12,9 +12,17 @@ Some setup that should apply across the entire test suite. Rather than defining interesting APIs for other code to use, this just causes some side-effects which make things better when the test suite runs. -""" -from future.utils import PY3 +Ported to Python 3. 
+""" +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2, PY3 +if PY2: + from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401 import warnings from traceback import extract_stack, format_list diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index f26b77185..db2b3307a 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -40,9 +40,11 @@ PORTED_MODULES = [ "allmydata.crypto.util", "allmydata.deep_stats", "allmydata.dirnode", + "allmydata.frontends", "allmydata.frontends.sftpd", "allmydata.hashtree", "allmydata.history", + "allmydata.immutable", "allmydata.immutable.checker", "allmydata.immutable.downloader", "allmydata.immutable.downloader.common", @@ -61,11 +63,13 @@ PORTED_MODULES = [ "allmydata.immutable.repairer", "allmydata.immutable.upload", "allmydata.interfaces", + "allmydata.introducer", "allmydata.introducer.client", "allmydata.introducer.common", "allmydata.introducer.interfaces", "allmydata.introducer.server", "allmydata.monitor", + "allmydata.mutable", "allmydata.mutable.checker", "allmydata.mutable.common", "allmydata.mutable.filenode", @@ -76,10 +80,12 @@ PORTED_MODULES = [ "allmydata.mutable.servermap", "allmydata.node", "allmydata.nodemaker", + "allmydata.scripts", "allmydata.scripts.create_node", "allmydata.scripts.types_", "allmydata.stats", "allmydata.storage_client", + "allmydata.storage", "allmydata.storage.common", "allmydata.storage.crawler", "allmydata.storage.expirer", @@ -88,12 +94,18 @@ PORTED_MODULES = [ "allmydata.storage.mutable", "allmydata.storage.server", "allmydata.storage.shares", + "allmydata.test", + "allmydata.test.cli", "allmydata.test.no_network", "allmydata.test.matchers", + "allmydata.test.mutable", "allmydata.test.mutable.util", + "allmydata.test.web", + "allmydata.testing", "allmydata.testing.web", "allmydata.unknown", "allmydata.uri", + "allmydata.util", "allmydata.util._python3", "allmydata.util.abbreviate", "allmydata.util.assertutil", @@ -125,6 +137,7 @@ PORTED_MODULES = [ "allmydata.util.statistics", "allmydata.util.time_format", "allmydata.util.tor_provider", + "allmydata.web", "allmydata.web.check_results", "allmydata.web.common", "allmydata.web.directory", @@ -140,6 +153,7 @@ PORTED_MODULES = [ "allmydata.web.storage_plugins", "allmydata.web.unlinked", "allmydata.webish", + "allmydata.windows", ] PORTED_TEST_MODULES = [ From 45e21f8f703f26a8e98af04c3042428de5d80e1b Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Tue, 30 Mar 2021 11:05:49 -0400 Subject: [PATCH 21/29] Port to Python 3. --- src/allmydata/__init__.py | 10 ++++++++++ src/allmydata/test/__init__.py | 2 +- src/allmydata/util/_python3.py | 1 + 3 files changed, 12 insertions(+), 1 deletion(-) diff --git a/src/allmydata/__init__.py b/src/allmydata/__init__.py index 3157c8c80..b29868c05 100644 --- a/src/allmydata/__init__.py +++ b/src/allmydata/__init__.py @@ -3,6 +3,16 @@ Decentralized storage grid. community web site: U{https://tahoe-lafs.org/} """ +from __future__ import absolute_import +from __future__ import division +from __future__ import print_function +from __future__ import unicode_literals + +from future.utils import PY2 +if PY2: + # Don't import future str() so we don't break Foolscap serialization on Python 2. 
+ from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, max, min # noqa: F401 + from past.builtins import unicode as str __all__ = [ "__version__", diff --git a/src/allmydata/test/__init__.py b/src/allmydata/test/__init__.py index a1cbb76a1..c75f8d003 100644 --- a/src/allmydata/test/__init__.py +++ b/src/allmydata/test/__init__.py @@ -132,4 +132,4 @@ if sys.platform == "win32": from eliot import to_file from allmydata.util.jsonbytes import BytesJSONEncoder -to_file(open("eliot.log", "w"), encoder=BytesJSONEncoder) +to_file(open("eliot.log", "wb"), encoder=BytesJSONEncoder) diff --git a/src/allmydata/util/_python3.py b/src/allmydata/util/_python3.py index db2b3307a..0d7f1561c 100644 --- a/src/allmydata/util/_python3.py +++ b/src/allmydata/util/_python3.py @@ -24,6 +24,7 @@ if PY2: # Keep these sorted alphabetically, to reduce merge conflicts: PORTED_MODULES = [ + "allmydata", "allmydata.__main__", "allmydata._auto_deps", "allmydata._monkeypatch", From cef53c40d12e27472e0f1f0e3e44ed302e327d92 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 31 Mar 2021 09:05:18 -0400 Subject: [PATCH 22/29] New ticket number. --- newsfragments/{3603.minor.2 => 3656.minor} | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename newsfragments/{3603.minor.2 => 3656.minor} (100%) diff --git a/newsfragments/3603.minor.2 b/newsfragments/3656.minor similarity index 100% rename from newsfragments/3603.minor.2 rename to newsfragments/3656.minor From 0d0dd4dee9d8c88575c8d26aaf5f88d1aea6a431 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 31 Mar 2021 10:35:25 -0400 Subject: [PATCH 23/29] Rip out all references to the unused IProgress API. --- src/allmydata/blacklist.py | 2 +- src/allmydata/dirnode.py | 4 +-- src/allmydata/immutable/encode.py | 12 +------ src/allmydata/immutable/filenode.py | 4 +-- src/allmydata/immutable/literal.py | 5 +-- src/allmydata/immutable/offloaded.py | 4 +-- src/allmydata/immutable/upload.py | 30 +++++------------ src/allmydata/interfaces.py | 48 +++------------------------- src/allmydata/mutable/filenode.py | 16 +++++----- src/allmydata/test/common.py | 10 +++--- src/allmydata/test/test_dirnode.py | 2 +- src/allmydata/util/consumer.py | 11 +++---- src/allmydata/util/progress.py | 37 --------------------- 13 files changed, 39 insertions(+), 146 deletions(-) delete mode 100644 src/allmydata/util/progress.py diff --git a/src/allmydata/blacklist.py b/src/allmydata/blacklist.py index 570d4723e..43eb36cc6 100644 --- a/src/allmydata/blacklist.py +++ b/src/allmydata/blacklist.py @@ -142,7 +142,7 @@ class ProhibitedNode(object): def get_best_readable_version(self): raise FileProhibited(self.reason) - def download_best_version(self, progress=None): + def download_best_version(self): raise FileProhibited(self.reason) def get_best_mutable_version(self): diff --git a/src/allmydata/dirnode.py b/src/allmydata/dirnode.py index ea8e84e05..fdf373b45 100644 --- a/src/allmydata/dirnode.py +++ b/src/allmydata/dirnode.py @@ -646,7 +646,7 @@ class DirectoryNode(object): return d - def add_file(self, namex, uploadable, metadata=None, overwrite=True, progress=None): + def add_file(self, namex, uploadable, metadata=None, overwrite=True): """I upload a file (using the given IUploadable), then attach the resulting FileNode to the directory at the given name. 
I return a Deferred that fires (with the IFileNode of the uploaded file) when @@ -657,7 +657,7 @@ class DirectoryNode(object): d = DeferredContext(defer.fail(NotWriteableError())) else: # XXX should pass reactor arg - d = DeferredContext(self._uploader.upload(uploadable, progress=progress)) + d = DeferredContext(self._uploader.upload(uploadable)) d.addCallback(lambda results: self._create_and_validate_node(results.get_uri(), None, name)) diff --git a/src/allmydata/immutable/encode.py b/src/allmydata/immutable/encode.py index bbb7f8123..42fc18077 100644 --- a/src/allmydata/immutable/encode.py +++ b/src/allmydata/immutable/encode.py @@ -90,7 +90,7 @@ PiB=1024*TiB @implementer(IEncoder) class Encoder(object): - def __init__(self, log_parent=None, upload_status=None, progress=None): + def __init__(self, log_parent=None, upload_status=None): object.__init__(self) self.uri_extension_data = {} self._codec = None @@ -102,7 +102,6 @@ class Encoder(object): self._log_number = log.msg("creating Encoder %s" % self, facility="tahoe.encoder", parent=log_parent) self._aborted = False - self._progress = progress def __repr__(self): if hasattr(self, "_storage_index"): @@ -123,8 +122,6 @@ class Encoder(object): def _got_size(size): self.log(format="file size: %(size)d", size=size) self.file_size = size - if self._progress: - self._progress.set_progress_total(self.file_size) d.addCallback(_got_size) d.addCallback(lambda res: eu.get_all_encoding_parameters()) d.addCallback(self._got_all_encoding_parameters) @@ -462,13 +459,6 @@ class Encoder(object): dl = self._gather_responses(dl) - def do_progress(ign): - done = self.segment_size * (segnum + 1) - if self._progress: - self._progress.set_progress(done) - return ign - dl.addCallback(do_progress) - def _logit(res): self.log("%s uploaded %s / %s bytes (%d%%) of your file." % (self, diff --git a/src/allmydata/immutable/filenode.py b/src/allmydata/immutable/filenode.py index 9e13e1337..6962c31a4 100644 --- a/src/allmydata/immutable/filenode.py +++ b/src/allmydata/immutable/filenode.py @@ -337,13 +337,13 @@ class ImmutableFileNode(object): """ return defer.succeed(self) - def download_best_version(self, progress=None): + def download_best_version(self): """ Download the best version of this file, returning its contents as a bytestring. Since there is only one version of an immutable file, we download and return the contents of this file. 
""" - d = consumer.download_to_data(self, progress=progress) + d = consumer.download_to_data(self) return d # for an immutable file, download_to_data (specified in IReadable) diff --git a/src/allmydata/immutable/literal.py b/src/allmydata/immutable/literal.py index 6ed5571b9..544a205e1 100644 --- a/src/allmydata/immutable/literal.py +++ b/src/allmydata/immutable/literal.py @@ -113,10 +113,7 @@ class LiteralFileNode(_ImmutableFileNodeBase): return defer.succeed(self) - def download_best_version(self, progress=None): - if progress is not None: - progress.set_progress_total(len(self.u.data)) - progress.set_progress(len(self.u.data)) + def download_best_version(self): return defer.succeed(self.u.data) diff --git a/src/allmydata/immutable/offloaded.py b/src/allmydata/immutable/offloaded.py index 2c4b9db78..8ce51782c 100644 --- a/src/allmydata/immutable/offloaded.py +++ b/src/allmydata/immutable/offloaded.py @@ -154,8 +154,8 @@ class CHKUploadHelper(Referenceable, upload.CHKUploader): # type: ignore # warn def __init__(self, storage_index, helper, storage_broker, secret_holder, incoming_file, encoding_file, - log_number, progress=None): - upload.CHKUploader.__init__(self, storage_broker, secret_holder, progress=progress) + log_number): + upload.CHKUploader.__init__(self, storage_broker, secret_holder) self._storage_index = storage_index self._helper = helper self._incoming_file = incoming_file diff --git a/src/allmydata/immutable/upload.py b/src/allmydata/immutable/upload.py index 3d794abf1..cb332dfdf 100644 --- a/src/allmydata/immutable/upload.py +++ b/src/allmydata/immutable/upload.py @@ -48,7 +48,7 @@ from allmydata.util.rrefutil import add_version_to_remote_reference from allmydata.interfaces import IUploadable, IUploader, IUploadResults, \ IEncryptedUploadable, RIEncryptedUploadable, IUploadStatus, \ NoServersError, InsufficientVersionError, UploadUnhappinessError, \ - DEFAULT_MAX_SEGMENT_SIZE, IProgress, IPeerSelector + DEFAULT_MAX_SEGMENT_SIZE, IPeerSelector from allmydata.immutable import layout from io import BytesIO @@ -945,7 +945,7 @@ class EncryptAnUploadable(object): IEncryptedUploadable.""" CHUNKSIZE = 50*1024 - def __init__(self, original, log_parent=None, progress=None, chunk_size=None): + def __init__(self, original, log_parent=None, chunk_size=None): """ :param chunk_size: The number of bytes to read from the uploadable at a time, or None for some default. 
@@ -962,7 +962,6 @@ class EncryptAnUploadable(object): self._file_size = None self._ciphertext_bytes_read = 0 self._status = None - self._progress = progress if chunk_size is not None: self.CHUNKSIZE = chunk_size @@ -985,8 +984,6 @@ class EncryptAnUploadable(object): self._file_size = size if self._status: self._status.set_size(size) - if self._progress: - self._progress.set_progress_total(size) return size d.addCallback(_got_size) return d @@ -1229,7 +1226,7 @@ class UploadStatus(object): class CHKUploader(object): - def __init__(self, storage_broker, secret_holder, progress=None, reactor=None): + def __init__(self, storage_broker, secret_holder, reactor=None): # server_selector needs storage_broker and secret_holder self._storage_broker = storage_broker self._secret_holder = secret_holder @@ -1239,7 +1236,6 @@ class CHKUploader(object): self._upload_status = UploadStatus() self._upload_status.set_helper(False) self._upload_status.set_active(True) - self._progress = progress self._reactor = reactor # locate_all_shareholders() will create the following attribute: @@ -1294,7 +1290,6 @@ class CHKUploader(object): self._encoder = encode.Encoder( self._log_number, self._upload_status, - progress=self._progress, ) # this just returns itself yield self._encoder.set_encrypted_uploadable(eu) @@ -1428,13 +1423,12 @@ def read_this_many_bytes(uploadable, size, prepend_data=[]): class LiteralUploader(object): - def __init__(self, progress=None): + def __init__(self): self._status = s = UploadStatus() s.set_storage_index(None) s.set_helper(False) s.set_progress(0, 1.0) s.set_active(False) - self._progress = progress def start(self, uploadable): uploadable = IUploadable(uploadable) @@ -1442,8 +1436,6 @@ class LiteralUploader(object): def _got_size(size): self._size = size self._status.set_size(size) - if self._progress: - self._progress.set_progress_total(size) return read_this_many_bytes(uploadable, size) d.addCallback(_got_size) d.addCallback(lambda data: uri.LiteralFileURI(b"".join(data))) @@ -1467,8 +1459,6 @@ class LiteralUploader(object): self._status.set_progress(1, 1.0) self._status.set_progress(2, 1.0) self._status.set_results(ur) - if self._progress: - self._progress.set_progress(self._size) return ur def close(self): @@ -1867,13 +1857,12 @@ class Uploader(service.MultiService, log.PrefixingLogMixin): name = "uploader" URI_LIT_SIZE_THRESHOLD = 55 - def __init__(self, helper_furl=None, stats_provider=None, history=None, progress=None): + def __init__(self, helper_furl=None, stats_provider=None, history=None): self._helper_furl = helper_furl self.stats_provider = stats_provider self._history = history self._helper = None self._all_uploads = weakref.WeakKeyDictionary() # for debugging - self._progress = progress log.PrefixingLogMixin.__init__(self, facility="tahoe.immutable.upload") service.MultiService.__init__(self) @@ -1907,13 +1896,12 @@ class Uploader(service.MultiService, log.PrefixingLogMixin): return (self._helper_furl, bool(self._helper)) - def upload(self, uploadable, progress=None, reactor=None): + def upload(self, uploadable, reactor=None): """ Returns a Deferred that will fire with the UploadResults instance. 
""" assert self.parent assert self.running - assert progress is None or IProgress.providedBy(progress) uploadable = IUploadable(uploadable) d = uploadable.get_size() @@ -1922,15 +1910,13 @@ class Uploader(service.MultiService, log.PrefixingLogMixin): precondition(isinstance(default_params, dict), default_params) precondition("max_segment_size" in default_params, default_params) uploadable.set_default_encoding_parameters(default_params) - if progress: - progress.set_progress_total(size) if self.stats_provider: self.stats_provider.count('uploader.files_uploaded', 1) self.stats_provider.count('uploader.bytes_uploaded', size) if size <= self.URI_LIT_SIZE_THRESHOLD: - uploader = LiteralUploader(progress=progress) + uploader = LiteralUploader() return uploader.start(uploadable) else: eu = EncryptAnUploadable(uploadable, self._parentmsgid) @@ -1943,7 +1929,7 @@ class Uploader(service.MultiService, log.PrefixingLogMixin): else: storage_broker = self.parent.get_storage_broker() secret_holder = self.parent._secret_holder - uploader = CHKUploader(storage_broker, secret_holder, progress=progress, reactor=reactor) + uploader = CHKUploader(storage_broker, secret_holder, reactor=reactor) d2.addCallback(lambda x: uploader.start(eu)) self._all_uploads[uploader] = None diff --git a/src/allmydata/interfaces.py b/src/allmydata/interfaces.py index 96d3e813c..2e055a888 100644 --- a/src/allmydata/interfaces.py +++ b/src/allmydata/interfaces.py @@ -733,38 +733,6 @@ class MustNotBeUnknownRWError(CapConstraintError): """Cannot add an unknown child cap specified in a rw_uri field.""" -class IProgress(Interface): - """ - Remembers progress measured in arbitrary units. Users of these - instances must call ``set_progress_total`` at least once before - progress can be valid, and must use the same units for both - ``set_progress_total`` and ``set_progress calls``. - - See also: - :class:`allmydata.util.progress.PercentProgress` - """ - - progress = Attribute( - "Current amount of progress (in percentage)" - ) - - def set_progress(value): - """ - Sets the current amount of progress. - - Arbitrary units, but must match units used for - set_progress_total. - """ - - def set_progress_total(value): - """ - Sets the total amount of expected progress - - Arbitrary units, but must be same units as used when calling - set_progress() on this instance).. - """ - - class IReadable(Interface): """I represent a readable object -- either an immutable file, or a specific version of a mutable file. @@ -794,11 +762,9 @@ class IReadable(Interface): def get_size(): """Return the length (in bytes) of this readable object.""" - def download_to_data(progress=None): + def download_to_data(): """Download all of the file contents. I return a Deferred that fires with the contents as a byte string. - - :param progress: None or IProgress implementer """ def read(consumer, offset=0, size=None): @@ -1106,13 +1072,11 @@ class IFileNode(IFilesystemNode): the Deferred will errback with an UnrecoverableFileError. """ - def download_best_version(progress=None): + def download_best_version(): """Download the contents of the version that would be returned by get_best_readable_version(). This is equivalent to calling download_to_data() on the IReadable given by that method. - progress is anything that implements IProgress - I return a Deferred that fires with a byte string when the file has been fully downloaded. To support streaming download, use the 'read' method of IReadable. 
If no version is recoverable, @@ -1258,7 +1222,7 @@ class IMutableFileNode(IFileNode): everything) to get increased visibility. """ - def upload(new_contents, servermap, progress=None): + def upload(new_contents, servermap): """Replace the contents of the file with new ones. This requires a servermap that was previously updated with MODE_WRITE. @@ -1279,8 +1243,6 @@ class IMutableFileNode(IFileNode): operation. If I do not signal UncoordinatedWriteError, then I was able to write the new version without incident. - ``progress`` is either None or an IProgress provider - I return a Deferred that fires (with a PublishStatus object) when the publish has completed. I will update the servermap in-place with the location of all new shares. @@ -1471,14 +1433,12 @@ class IDirectoryNode(IFilesystemNode): equivalent to calling set_node() multiple times, but is much more efficient.""" - def add_file(name, uploadable, metadata=None, overwrite=True, progress=None): + def add_file(name, uploadable, metadata=None, overwrite=True): """I upload a file (using the given IUploadable), then attach the resulting ImmutableFileNode to the directory at the given name. I set metadata the same way as set_uri and set_node. The child name must be a unicode string. - ``progress`` either provides IProgress or is None - I return a Deferred that fires (with the IFileNode of the uploaded file) when the operation completes.""" diff --git a/src/allmydata/mutable/filenode.py b/src/allmydata/mutable/filenode.py index baedacb23..cd8cb0dc7 100644 --- a/src/allmydata/mutable/filenode.py +++ b/src/allmydata/mutable/filenode.py @@ -418,21 +418,21 @@ class MutableFileNode(object): return d.addCallback(_get_version, version) - def download_best_version(self, progress=None): + def download_best_version(self): """ I return a Deferred that fires with the contents of the best version of this mutable file. """ - return self._do_serialized(self._download_best_version, progress=progress) + return self._do_serialized(self._download_best_version) - def _download_best_version(self, progress=None): + def _download_best_version(self): """ I am the serialized sibling of download_best_version. """ d = self.get_best_readable_version() d.addCallback(self._record_size) - d.addCallback(lambda version: version.download_to_data(progress=progress)) + d.addCallback(lambda version: version.download_to_data()) # It is possible that the download will fail because there # aren't enough shares to be had. If so, we will try again after @@ -447,7 +447,7 @@ class MutableFileNode(object): d = self.get_best_mutable_version() d.addCallback(self._record_size) - d.addCallback(lambda version: version.download_to_data(progress=progress)) + d.addCallback(lambda version: version.download_to_data()) return d d.addErrback(_maybe_retry) @@ -564,7 +564,7 @@ class MutableFileNode(object): return d - def upload(self, new_contents, servermap, progress=None): + def upload(self, new_contents, servermap): """ I overwrite the contents of the best recoverable version of this mutable file with new_contents, using servermap instead of @@ -951,13 +951,13 @@ class MutableFileVersion(object): return self._servermap.size_of_version(self._version) - def download_to_data(self, fetch_privkey=False, progress=None): # type: ignore # fixme + def download_to_data(self, fetch_privkey=False): # type: ignore # fixme """ I return a Deferred that fires with the contents of this readable object as a byte string. 
""" - c = consumer.MemoryConsumer(progress=progress) + c = consumer.MemoryConsumer() d = self.read(c, fetch_privkey=fetch_privkey) d.addCallback(lambda mc: b"".join(mc.chunks)) return d diff --git a/src/allmydata/test/common.py b/src/allmydata/test/common.py index 94a59f5c9..d7f00554d 100644 --- a/src/allmydata/test/common.py +++ b/src/allmydata/test/common.py @@ -539,8 +539,8 @@ class FakeCHKFileNode(object): # type: ignore # incomplete implementation return defer.succeed(self) - def download_to_data(self, progress=None): - return download_to_data(self, progress=progress) + def download_to_data(self): + return download_to_data(self) download_best_version = download_to_data @@ -717,11 +717,11 @@ class FakeMutableFileNode(object): # type: ignore # incomplete implementation d.addCallback(_done) return d - def download_best_version(self, progress=None): - return defer.succeed(self._download_best_version(progress=progress)) + def download_best_version(self): + return defer.succeed(self._download_best_version()) - def _download_best_version(self, ignored=None, progress=None): + def _download_best_version(self, ignored=None): if isinstance(self.my_uri, uri.LiteralFileURI): return self.my_uri.data if self.storage_index not in self.all_contents: diff --git a/src/allmydata/test/test_dirnode.py b/src/allmydata/test/test_dirnode.py index 23a0fd76b..67d331430 100644 --- a/src/allmydata/test/test_dirnode.py +++ b/src/allmydata/test/test_dirnode.py @@ -1592,7 +1592,7 @@ class FakeMutableFile(object): # type: ignore # incomplete implementation def get_write_uri(self): return self.uri.to_string() - def download_best_version(self, progress=None): + def download_best_version(self): return defer.succeed(self.data) def get_writekey(self): diff --git a/src/allmydata/util/consumer.py b/src/allmydata/util/consumer.py index 393d7dec5..a8eededcc 100644 --- a/src/allmydata/util/consumer.py +++ b/src/allmydata/util/consumer.py @@ -9,10 +9,9 @@ from twisted.internet.interfaces import IConsumer @implementer(IConsumer) class MemoryConsumer(object): - def __init__(self, progress=None): + def __init__(self): self.chunks = [] self.done = False - self._progress = progress def registerProducer(self, p, streaming): self.producer = p @@ -25,16 +24,14 @@ class MemoryConsumer(object): def write(self, data): self.chunks.append(data) - if self._progress is not None: - self._progress.set_progress(sum([len(c) for c in self.chunks])) def unregisterProducer(self): self.done = True -def download_to_data(n, offset=0, size=None, progress=None): +def download_to_data(n, offset=0, size=None): """ - :param progress: None or an IProgress implementer + Return Deferred that fires with results of reading from the given filenode. """ - d = n.read(MemoryConsumer(progress=progress), offset, size) + d = n.read(MemoryConsumer(), offset, size) d.addCallback(lambda mc: b"".join(mc.chunks)) return d diff --git a/src/allmydata/util/progress.py b/src/allmydata/util/progress.py deleted file mode 100644 index 3618c88dd..000000000 --- a/src/allmydata/util/progress.py +++ /dev/null @@ -1,37 +0,0 @@ -""" -Utilities relating to computing progress information. 
- -Ties in with the "consumer" module also -""" - -from allmydata.interfaces import IProgress -from zope.interface import implementer - - -@implementer(IProgress) -class PercentProgress(object): - """ - Represents progress as a percentage, from 0.0 to 100.0 - """ - - def __init__(self, total_size=None): - self._value = 0.0 - self.set_progress_total(total_size) - - def set_progress(self, value): - "IProgress API" - self._value = value - - def set_progress_total(self, size): - "IProgress API" - if size is not None: - size = float(size) - self._total_size = size - - @property - def progress(self): - if self._total_size is None: - return 0 # or 1.0? - if self._total_size <= 0.0: - return 0 - return (self._value / self._total_size) * 100.0 From cf63a3a83aac548b1fff5ad923b7b7b807312c36 Mon Sep 17 00:00:00 2001 From: Itamar Turner-Trauring Date: Wed, 31 Mar 2021 10:35:51 -0400 Subject: [PATCH 24/29] News file. --- newsfragments/3658.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3658.minor diff --git a/newsfragments/3658.minor b/newsfragments/3658.minor new file mode 100644 index 000000000..e69de29bb From 88e3005abbea9df8d57104352c122e90d785d771 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Wed, 31 Mar 2021 11:53:02 -0400 Subject: [PATCH 25/29] Quote some YAML strings Following PR review feedback. Some parts of GitHub Actions configuration follows convention in GA's documentation, in which YAML strings are not quoted, but that probably is not a good idea. We also don't want to change all the strings in this unrelated set of changes. So we will quote strings as we go, in the blocks we touch. --- .github/workflows/ci.yml | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index d565ede47..39cd114d3 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -136,9 +136,10 @@ jobs: # See notes about parallel builds on GitHub Actions at # https://coveralls-python.readthedocs.io/en/latest/usage/configuration.html finish-coverage-report: - needs: coverage - runs-on: ubuntu-latest - container: python:3-slim + needs: + - "coverage" + runs-on: "ubuntu-latest" + container: "python:3-slim" steps: - name: "Indicate completion to coveralls.io" run: | From bb8295ac6111da904b49a771e155d05a929cd9f9 Mon Sep 17 00:00:00 2001 From: May-Lee Sia Date: Wed, 31 Mar 2021 21:14:50 +0200 Subject: [PATCH 26/29] Add newsfragment --- newsfragments/3651.minor | 0 1 file changed, 0 insertions(+), 0 deletions(-) create mode 100644 newsfragments/3651.minor diff --git a/newsfragments/3651.minor b/newsfragments/3651.minor new file mode 100644 index 000000000..e69de29bb From add2be1b44e847d6d9a2f754861b1c0f8c264163 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Fri, 2 Apr 2021 10:24:43 -0400 Subject: [PATCH 27/29] Pin decorator CI broke when decorator 5.0.1 (a dependency via pytest-twisted) with dropped Python 2.7 compatibility was released. 
--- newsfragments/3662.minor | 0 setup.py | 4 ++++ 2 files changed, 4 insertions(+) create mode 100644 newsfragments/3662.minor diff --git a/newsfragments/3662.minor b/newsfragments/3662.minor new file mode 100644 index 000000000..e69de29bb diff --git a/setup.py b/setup.py index f5ca91b2d..df770fadc 100644 --- a/setup.py +++ b/setup.py @@ -389,6 +389,10 @@ setup(name="tahoe-lafs", # also set in __init__.py "tox", "pytest", "pytest-twisted", + # XXX: decorator isn't a direct dependency, but pytest-twisted + # depends on decorator, and decorator 5.x isn't compatible with + # Python 2.7. + "decorator < 5", "hypothesis >= 3.6.1", "treq", "towncrier", From 1c3b1d0d276ce45788cb74c719aa8a03d18a29e8 Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Fri, 2 Apr 2021 17:25:46 -0400 Subject: [PATCH 28/29] Add newsfragment --- newsfragments/3663.other | 1 + 1 file changed, 1 insertion(+) create mode 100644 newsfragments/3663.other diff --git a/newsfragments/3663.other b/newsfragments/3663.other new file mode 100644 index 000000000..62abf2666 --- /dev/null +++ b/newsfragments/3663.other @@ -0,0 +1 @@ +You can run `make livehtml` in docs directory to invoke sphinx-autobuild. From c13fb8d7effb40aba19d5068495323f15139152c Mon Sep 17 00:00:00 2001 From: Sajith Sasidharan Date: Fri, 2 Apr 2021 17:27:32 -0400 Subject: [PATCH 29/29] Add a make rule to invoke sphinx-autobuild when building docs --- docs/Makefile | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/docs/Makefile b/docs/Makefile index ed9e59186..3d7b51f7f 100644 --- a/docs/Makefile +++ b/docs/Makefile @@ -214,3 +214,7 @@ pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." + +.PHONY: livehtml +livehtml: + sphinx-autobuild -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
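
For reference, a rough sketch of how the new `livehtml` target is intended to be used from a checked-out source tree. This assumes `sphinx-autobuild` is installed in the environment that builds the docs; the install step and the serve address mentioned below are illustrative assumptions, not part of the patch itself:

    # assumed prerequisites (not pinned by this patch series)
    pip install sphinx sphinx-autobuild
    cd docs
    make livehtml    # rebuilds the HTML whenever a .rst file changes and
                     # serves it (sphinx-autobuild listens on
                     # http://127.0.0.1:8000 by default)

Like `tox -e docs`, this renders the `.rst` sources under docs/ with Sphinx, but it keeps rebuilding as files change instead of producing a one-off docs/_build/html tree.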