Merge remote-tracking branch 'origin/master' into 3645.gbs-expanded-motivation

Jean-Paul Calderone 2021-04-03 07:44:58 -04:00
commit 37fe06da76
31 changed files with 171 additions and 226 deletions

@ -87,10 +87,29 @@ jobs:
# Action for this, as of Jan 2021 it does not support Python coverage
# files - only lcov files. Therefore, we use coveralls-python, the
# coveralls.io-supplied Python reporter, for this.
#
# It is coveralls-python 1.x that has maintained compatibility
# with Python 2, while coveralls-python 3.x is compatible with
# Python 3. Sadly we can't use them both in the same workflow.
#
# The two versions of coveralls-python are somewhat mutually
# incompatible. Mixing these two different versions when
# reporting coverage to coveralls.io will lead to grief, since
# they derive job IDs in different ways. If we use both
# versions of coveralls in the same workflow, the finalizing
# step will be able to mark only part of the jobs as done, and
# the other part will be left hanging, never marked as done,
# whether we make an API call or run `coveralls --finish`
# to indicate that CI has finished running.
#
# So we try to use the newer coveralls-python that is available
# via Python 3 (which is present in the GitHub Actions tool cache,
# even when we're running Python 2.7 tests) throughout this
# workflow.
- name: "Report Coverage to Coveralls"
run: |
pip install coveralls
python -m coveralls
pip3 install --upgrade coveralls==3.0.1
python3 -m coveralls
env:
# Some magic value required for some magic reason.
GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
@ -113,80 +132,22 @@ jobs:
# a single report, we have to tell Coveralls when we've uploaded all of the
# data files. This does it. We make sure it runs last by making it depend
# on *all* of the coverage-collecting jobs.
#
# See notes about parallel builds on GitHub Actions at
# https://coveralls-python.readthedocs.io/en/latest/usage/configuration.html
finish-coverage-report:
# There happens to just be one coverage-collecting job at the moment. If
# the coverage reports are broken and someone added more
# coverage-collecting jobs to this workflow but didn't update this, that's
# why.
needs:
- "coverage"
runs-on: "ubuntu-latest"
container: "python:3-slim"
steps:
- name: "Check out Tahoe-LAFS sources"
uses: "actions/checkout@v2"
- name: "Finish Coveralls Reporting"
- name: "Indicate completion to coveralls.io"
run: |
# coveralls-python does have a `--finish` option but it doesn't seem
# to work, at least for us.
# https://github.com/coveralls-clients/coveralls-python/issues/248
#
# But all it does is this simple POST so we can just send it
# ourselves. The only hard part is guessing what the POST
# parameters mean. And I've done that for you already.
#
# Since the build is done I'm going to guess that "done" is a fine
# value for status.
#
# That leaves "build_num". The coveralls documentation gives some
# hints about it. It suggests using $CIRCLE_WORKFLOW_ID if your job
# is on CircleCI. CircleCI documentation says this about
# CIRCLE_WORKFLOW_ID:
#
# Observation of the coveralls.io web interface, logs from the
# coveralls command in action, and experimentation suggests the
# value for PRs is something more like:
#
# <GIT MERGE COMMIT HASH>-PR-<PR NUM>
#
# For branches, it's just the git branch tip hash.
# For pull requests, refs/pull/<PR NUM>/merge was just checked
# out, so HEAD will refer to the right revision. For branches,
# HEAD is also the tip of the branch.
REV=$(git rev-parse HEAD)
# We can get the PR number from the "context".
#
# https://docs.github.com/en/free-pro-team@latest/developers/webhooks-and-events/webhook-events-and-payloads#pull_request
#
# (via <https://github.community/t/github-ref-is-inconsistent/17728/3>).
#
# If this is a pull request, `github.event` is a `pull_request`
# structure which has `number` right in it.
#
# If this is a push, `github.event` is a `push` instead but we only
# need the revision to construct the build_num.
PR=${{ github.event.number }}
if [ "${PR}" = "" ]; then
BUILD_NUM=$REV
else
BUILD_NUM=$REV-PR-$PR
fi
REPO_NAME=$GITHUB_REPOSITORY
curl \
-k \
https://coveralls.io/webhook?repo_token=$COVERALLS_REPO_TOKEN \
-d \
"payload[build_num]=$BUILD_NUM&payload[status]=done&payload[repo_name]=$REPO_NAME"
pip3 install --upgrade coveralls==3.0.1
python3 -m coveralls --finish
env:
# Some magic value required for some magic reason.
GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
# Help coveralls identify our project.
COVERALLS_REPO_TOKEN: "JPf16rLB7T2yjgATIxFzTsEgMdN1UNq6o"
integration:
runs-on: ${{ matrix.os }}
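For reference, a sketch in Python of the webhook call the removed shell script performed. The URL and payload fields come straight from the script above; the use of `requests` and the `PR_NUMBER` environment variable are illustrative, not part of the workflow.

```python
import os
import subprocess

import requests  # illustrative; the workflow used curl

# Reconstruct "build_num" the way the removed script did: the HEAD
# revision, suffixed with "-PR-<num>" when this is a pull request.
rev = subprocess.check_output(
    ["git", "rev-parse", "HEAD"], text=True
).strip()
pr = os.environ.get("PR_NUMBER", "")  # the script read github.event.number
build_num = rev if pr == "" else "%s-PR-%s" % (rev, pr)

# Tell coveralls.io that all parallel jobs for this build are done.
requests.post(
    "https://coveralls.io/webhook",
    params={"repo_token": os.environ["COVERALLS_REPO_TOKEN"]},
    data={
        "payload[build_num]": build_num,
        "payload[status]": "done",
        "payload[repo_name]": os.environ["GITHUB_REPOSITORY"],
    },
)
```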

@ -45,6 +45,7 @@ The following community members have made themselves available for conduct issue
- Jean-Paul Calderone (jean-paul at leastauthority dot com)
- meejah (meejah at meejah dot ca)
- May-Lee Sia (she/her) (tahoe dot lafs dot community at gmail dot com)
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 1.3.0, available at

@ -214,3 +214,7 @@ pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
.PHONY: livehtml
livehtml:
sphinx-autobuild -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html

@ -1,7 +1,30 @@
If you are reading Tahoe-LAFS documentation
-------------------------------------------
Note: http://tahoe-lafs.readthedocs.io/en/latest/ is the preferred place to
read this documentation (GitHub doesn't render cross-document links or
images). If you're reading this on https://github.com/tahoe-lafs/tahoe-lafs ,
or from a checked-out source tree, then either run `tox -e docs` and open
_build/html/index.html in your browser, or view the pre-rendered trunk copy
at http://tahoe-lafs.readthedocs.io/en/latest/
If you are reading Tahoe-LAFS documentation at a code hosting site or
from a checked-out source tree, the preferred place to view the docs
is http://tahoe-lafs.readthedocs.io/en/latest/. Code-hosting sites do
not render cross-document links or images correctly.
If you are writing Tahoe-LAFS documentation
-------------------------------------------
To edit Tahoe-LAFS docs, you will need a checked-out source tree. You
can edit the `.rst` files in this directory using a text editor, and
then generate HTML output using Sphinx, a program that can produce its
output in HTML and other formats.
Files with the `.rst` extension use the reStructuredText markup format,
which is the format Sphinx natively handles. To learn more about
Sphinx, and for a friendly primer on reStructuredText, please see the
Sphinx project's documentation, available at:
https://www.sphinx-doc.org/
If you have `tox` installed, you can run `tox -e docs` and then open
the resulting docs/_build/html/index.html in your web browser.
Note that Sphinx can also process Python docstrings to generate API
documentation. Tahoe-LAFS currently does not use Sphinx for this
purpose.

0
newsfragments/3630.minor Normal file

0
newsfragments/3651.minor Normal file

0
newsfragments/3653.minor Normal file

0
newsfragments/3655.minor Normal file

0
newsfragments/3656.minor Normal file

0
newsfragments/3658.minor Normal file

0
newsfragments/3662.minor Normal file

1
newsfragments/3663.other Normal file

@ -0,0 +1 @@
You can run `make livehtml` in the docs directory to invoke sphinx-autobuild.

@ -389,6 +389,10 @@ setup(name="tahoe-lafs", # also set in __init__.py
"tox",
"pytest",
"pytest-twisted",
# XXX: decorator isn't a direct dependency, but pytest-twisted
# depends on decorator, and decorator 5.x isn't compatible with
# Python 2.7.
"decorator < 5",
"hypothesis >= 3.6.1",
"treq",
"towncrier",

@ -3,6 +3,16 @@ Decentralized storage grid.
community web site: U{https://tahoe-lafs.org/}
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2
if PY2:
# Don't import future str() so we don't break Foolscap serialization on Python 2.
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, max, min # noqa: F401
from past.builtins import unicode as str
__all__ = [
"__version__",

@ -142,7 +142,7 @@ class ProhibitedNode(object):
def get_best_readable_version(self):
raise FileProhibited(self.reason)
def download_best_version(self, progress=None):
def download_best_version(self):
raise FileProhibited(self.reason)
def get_best_mutable_version(self):

@ -646,7 +646,7 @@ class DirectoryNode(object):
return d
def add_file(self, namex, uploadable, metadata=None, overwrite=True, progress=None):
def add_file(self, namex, uploadable, metadata=None, overwrite=True):
"""I upload a file (using the given IUploadable), then attach the
resulting FileNode to the directory at the given name. I return a
Deferred that fires (with the IFileNode of the uploaded file) when
@ -657,7 +657,7 @@ class DirectoryNode(object):
d = DeferredContext(defer.fail(NotWriteableError()))
else:
# XXX should pass reactor arg
d = DeferredContext(self._uploader.upload(uploadable, progress=progress))
d = DeferredContext(self._uploader.upload(uploadable))
d.addCallback(lambda results:
self._create_and_validate_node(results.get_uri(), None,
name))

@ -90,7 +90,7 @@ PiB=1024*TiB
@implementer(IEncoder)
class Encoder(object):
def __init__(self, log_parent=None, upload_status=None, progress=None):
def __init__(self, log_parent=None, upload_status=None):
object.__init__(self)
self.uri_extension_data = {}
self._codec = None
@ -102,7 +102,6 @@ class Encoder(object):
self._log_number = log.msg("creating Encoder %s" % self,
facility="tahoe.encoder", parent=log_parent)
self._aborted = False
self._progress = progress
def __repr__(self):
if hasattr(self, "_storage_index"):
@ -123,8 +122,6 @@ class Encoder(object):
def _got_size(size):
self.log(format="file size: %(size)d", size=size)
self.file_size = size
if self._progress:
self._progress.set_progress_total(self.file_size)
d.addCallback(_got_size)
d.addCallback(lambda res: eu.get_all_encoding_parameters())
d.addCallback(self._got_all_encoding_parameters)
@ -462,13 +459,6 @@ class Encoder(object):
dl = self._gather_responses(dl)
def do_progress(ign):
done = self.segment_size * (segnum + 1)
if self._progress:
self._progress.set_progress(done)
return ign
dl.addCallback(do_progress)
def _logit(res):
self.log("%s uploaded %s / %s bytes (%d%%) of your file." %
(self,

@ -337,13 +337,13 @@ class ImmutableFileNode(object):
"""
return defer.succeed(self)
def download_best_version(self, progress=None):
def download_best_version(self):
"""
Download the best version of this file, returning its contents
as a bytestring. Since there is only one version of an immutable
file, we download and return the contents of this file.
"""
d = consumer.download_to_data(self, progress=progress)
d = consumer.download_to_data(self)
return d
# for an immutable file, download_to_data (specified in IReadable)

@ -113,10 +113,7 @@ class LiteralFileNode(_ImmutableFileNodeBase):
return defer.succeed(self)
def download_best_version(self, progress=None):
if progress is not None:
progress.set_progress_total(len(self.u.data))
progress.set_progress(len(self.u.data))
def download_best_version(self):
return defer.succeed(self.u.data)

@ -154,8 +154,8 @@ class CHKUploadHelper(Referenceable, upload.CHKUploader): # type: ignore # warn
def __init__(self, storage_index,
helper, storage_broker, secret_holder,
incoming_file, encoding_file,
log_number, progress=None):
upload.CHKUploader.__init__(self, storage_broker, secret_holder, progress=progress)
log_number):
upload.CHKUploader.__init__(self, storage_broker, secret_holder)
self._storage_index = storage_index
self._helper = helper
self._incoming_file = incoming_file

@ -48,7 +48,7 @@ from allmydata.util.rrefutil import add_version_to_remote_reference
from allmydata.interfaces import IUploadable, IUploader, IUploadResults, \
IEncryptedUploadable, RIEncryptedUploadable, IUploadStatus, \
NoServersError, InsufficientVersionError, UploadUnhappinessError, \
DEFAULT_MAX_SEGMENT_SIZE, IProgress, IPeerSelector
DEFAULT_MAX_SEGMENT_SIZE, IPeerSelector
from allmydata.immutable import layout
from io import BytesIO
@ -945,7 +945,7 @@ class EncryptAnUploadable(object):
IEncryptedUploadable."""
CHUNKSIZE = 50*1024
def __init__(self, original, log_parent=None, progress=None, chunk_size=None):
def __init__(self, original, log_parent=None, chunk_size=None):
"""
:param chunk_size: The number of bytes to read from the uploadable at a
time, or None for some default.
@ -962,7 +962,6 @@ class EncryptAnUploadable(object):
self._file_size = None
self._ciphertext_bytes_read = 0
self._status = None
self._progress = progress
if chunk_size is not None:
self.CHUNKSIZE = chunk_size
@ -985,8 +984,6 @@ class EncryptAnUploadable(object):
self._file_size = size
if self._status:
self._status.set_size(size)
if self._progress:
self._progress.set_progress_total(size)
return size
d.addCallback(_got_size)
return d
@ -1229,7 +1226,7 @@ class UploadStatus(object):
class CHKUploader(object):
def __init__(self, storage_broker, secret_holder, progress=None, reactor=None):
def __init__(self, storage_broker, secret_holder, reactor=None):
# server_selector needs storage_broker and secret_holder
self._storage_broker = storage_broker
self._secret_holder = secret_holder
@ -1239,7 +1236,6 @@ class CHKUploader(object):
self._upload_status = UploadStatus()
self._upload_status.set_helper(False)
self._upload_status.set_active(True)
self._progress = progress
self._reactor = reactor
# locate_all_shareholders() will create the following attribute:
@ -1294,7 +1290,6 @@ class CHKUploader(object):
self._encoder = encode.Encoder(
self._log_number,
self._upload_status,
progress=self._progress,
)
# this just returns itself
yield self._encoder.set_encrypted_uploadable(eu)
@ -1428,13 +1423,12 @@ def read_this_many_bytes(uploadable, size, prepend_data=[]):
class LiteralUploader(object):
def __init__(self, progress=None):
def __init__(self):
self._status = s = UploadStatus()
s.set_storage_index(None)
s.set_helper(False)
s.set_progress(0, 1.0)
s.set_active(False)
self._progress = progress
def start(self, uploadable):
uploadable = IUploadable(uploadable)
@ -1442,8 +1436,6 @@ class LiteralUploader(object):
def _got_size(size):
self._size = size
self._status.set_size(size)
if self._progress:
self._progress.set_progress_total(size)
return read_this_many_bytes(uploadable, size)
d.addCallback(_got_size)
d.addCallback(lambda data: uri.LiteralFileURI(b"".join(data)))
@ -1467,8 +1459,6 @@ class LiteralUploader(object):
self._status.set_progress(1, 1.0)
self._status.set_progress(2, 1.0)
self._status.set_results(ur)
if self._progress:
self._progress.set_progress(self._size)
return ur
def close(self):
@ -1867,13 +1857,12 @@ class Uploader(service.MultiService, log.PrefixingLogMixin):
name = "uploader"
URI_LIT_SIZE_THRESHOLD = 55
def __init__(self, helper_furl=None, stats_provider=None, history=None, progress=None):
def __init__(self, helper_furl=None, stats_provider=None, history=None):
self._helper_furl = helper_furl
self.stats_provider = stats_provider
self._history = history
self._helper = None
self._all_uploads = weakref.WeakKeyDictionary() # for debugging
self._progress = progress
log.PrefixingLogMixin.__init__(self, facility="tahoe.immutable.upload")
service.MultiService.__init__(self)
@ -1907,13 +1896,12 @@ class Uploader(service.MultiService, log.PrefixingLogMixin):
return (self._helper_furl, bool(self._helper))
def upload(self, uploadable, progress=None, reactor=None):
def upload(self, uploadable, reactor=None):
"""
Returns a Deferred that will fire with the UploadResults instance.
"""
assert self.parent
assert self.running
assert progress is None or IProgress.providedBy(progress)
uploadable = IUploadable(uploadable)
d = uploadable.get_size()
@ -1922,15 +1910,13 @@ class Uploader(service.MultiService, log.PrefixingLogMixin):
precondition(isinstance(default_params, dict), default_params)
precondition("max_segment_size" in default_params, default_params)
uploadable.set_default_encoding_parameters(default_params)
if progress:
progress.set_progress_total(size)
if self.stats_provider:
self.stats_provider.count('uploader.files_uploaded', 1)
self.stats_provider.count('uploader.bytes_uploaded', size)
if size <= self.URI_LIT_SIZE_THRESHOLD:
uploader = LiteralUploader(progress=progress)
uploader = LiteralUploader()
return uploader.start(uploadable)
else:
eu = EncryptAnUploadable(uploadable, self._parentmsgid)
@ -1943,7 +1929,7 @@ class Uploader(service.MultiService, log.PrefixingLogMixin):
else:
storage_broker = self.parent.get_storage_broker()
secret_holder = self.parent._secret_holder
uploader = CHKUploader(storage_broker, secret_holder, progress=progress, reactor=reactor)
uploader = CHKUploader(storage_broker, secret_holder, reactor=reactor)
d2.addCallback(lambda x: uploader.start(eu))
self._all_uploads[uploader] = None

@ -733,38 +733,6 @@ class MustNotBeUnknownRWError(CapConstraintError):
"""Cannot add an unknown child cap specified in a rw_uri field."""
class IProgress(Interface):
"""
Remembers progress measured in arbitrary units. Users of these
instances must call ``set_progress_total`` at least once before
progress can be valid, and must use the same units for both
``set_progress_total`` and ``set_progress`` calls.
See also:
:class:`allmydata.util.progress.PercentProgress`
"""
progress = Attribute(
"Current amount of progress (in percentage)"
)
def set_progress(value):
"""
Sets the current amount of progress.
Arbitrary units, but must match units used for
set_progress_total.
"""
def set_progress_total(value):
"""
Sets the total amount of expected progress.
Arbitrary units, but must be the same units as used when calling
set_progress() on this instance.
"""
class IReadable(Interface):
"""I represent a readable object -- either an immutable file, or a
specific version of a mutable file.
@ -794,11 +762,9 @@ class IReadable(Interface):
def get_size():
"""Return the length (in bytes) of this readable object."""
def download_to_data(progress=None):
def download_to_data():
"""Download all of the file contents. I return a Deferred that fires
with the contents as a byte string.
:param progress: None or IProgress implementer
"""
def read(consumer, offset=0, size=None):
@ -1106,13 +1072,11 @@ class IFileNode(IFilesystemNode):
the Deferred will errback with an UnrecoverableFileError.
"""
def download_best_version(progress=None):
def download_best_version():
"""Download the contents of the version that would be returned
by get_best_readable_version(). This is equivalent to calling
download_to_data() on the IReadable given by that method.
progress is anything that implements IProgress
I return a Deferred that fires with a byte string when the file
has been fully downloaded. To support streaming download, use
the 'read' method of IReadable. If no version is recoverable,
@ -1258,7 +1222,7 @@ class IMutableFileNode(IFileNode):
everything) to get increased visibility.
"""
def upload(new_contents, servermap, progress=None):
def upload(new_contents, servermap):
"""Replace the contents of the file with new ones. This requires a
servermap that was previously updated with MODE_WRITE.
@ -1279,8 +1243,6 @@ class IMutableFileNode(IFileNode):
operation. If I do not signal UncoordinatedWriteError, then I was
able to write the new version without incident.
``progress`` is either None or an IProgress provider
I return a Deferred that fires (with a PublishStatus object) when the
publish has completed. I will update the servermap in-place with the
location of all new shares.
@ -1471,14 +1433,12 @@ class IDirectoryNode(IFilesystemNode):
equivalent to calling set_node() multiple times, but is much more
efficient."""
def add_file(name, uploadable, metadata=None, overwrite=True, progress=None):
def add_file(name, uploadable, metadata=None, overwrite=True):
"""I upload a file (using the given IUploadable), then attach the
resulting ImmutableFileNode to the directory at the given name. I set
metadata the same way as set_uri and set_node. The child name must be
a unicode string.
``progress`` either provides IProgress or is None
I return a Deferred that fires (with the IFileNode of the uploaded
file) when the operation completes."""

@ -1,3 +1,16 @@
"""
Ported to Python 3.
"""
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
from allmydata.introducer.server import create_introducer

@ -418,21 +418,21 @@ class MutableFileNode(object):
return d.addCallback(_get_version, version)
def download_best_version(self, progress=None):
def download_best_version(self):
"""
I return a Deferred that fires with the contents of the best
version of this mutable file.
"""
return self._do_serialized(self._download_best_version, progress=progress)
return self._do_serialized(self._download_best_version)
def _download_best_version(self, progress=None):
def _download_best_version(self):
"""
I am the serialized sibling of download_best_version.
"""
d = self.get_best_readable_version()
d.addCallback(self._record_size)
d.addCallback(lambda version: version.download_to_data(progress=progress))
d.addCallback(lambda version: version.download_to_data())
# It is possible that the download will fail because there
# aren't enough shares to be had. If so, we will try again after
@ -447,7 +447,7 @@ class MutableFileNode(object):
d = self.get_best_mutable_version()
d.addCallback(self._record_size)
d.addCallback(lambda version: version.download_to_data(progress=progress))
d.addCallback(lambda version: version.download_to_data())
return d
d.addErrback(_maybe_retry)
@ -564,7 +564,7 @@ class MutableFileNode(object):
return d
def upload(self, new_contents, servermap, progress=None):
def upload(self, new_contents, servermap):
"""
I overwrite the contents of the best recoverable version of this
mutable file with new_contents, using servermap instead of
@ -951,13 +951,13 @@ class MutableFileVersion(object):
return self._servermap.size_of_version(self._version)
def download_to_data(self, fetch_privkey=False, progress=None): # type: ignore # fixme
def download_to_data(self, fetch_privkey=False): # type: ignore # fixme
"""
I return a Deferred that fires with the contents of this
readable object as a byte string.
"""
c = consumer.MemoryConsumer(progress=progress)
c = consumer.MemoryConsumer()
d = self.read(c, fetch_privkey=fetch_privkey)
d.addCallback(lambda mc: b"".join(mc.chunks))
return d

@ -1,5 +1,11 @@
from __future__ import print_function
from __future__ import absolute_import
from __future__ import division
from __future__ import unicode_literals
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import warnings
import os, sys
from six.moves import StringIO
@ -16,7 +22,7 @@ from twisted.internet import defer, task, threads
from allmydata.scripts.common import get_default_nodedir
from allmydata.scripts import debug, create_node, cli, \
admin, tahoe_run, tahoe_invite
from allmydata.util.encodingutil import quote_local_unicode_path
from allmydata.util.encodingutil import quote_local_unicode_path, argv_to_unicode
from allmydata.util.eliotutil import (
opt_eliot_destination,
opt_help_eliot_destinations,
@ -112,6 +118,7 @@ def parse_options(argv, config=None):
config.parseOptions(argv) # may raise usage.error
return config
def parse_or_exit_with_explanation(argv, stdout=sys.stdout):
config = Options()
try:
@ -121,7 +128,10 @@ def parse_or_exit_with_explanation(argv, stdout=sys.stdout):
while hasattr(c, 'subOptions'):
c = c.subOptions
print(str(c), file=stdout)
print("%s: %s\n" % (sys.argv[0], e), file=stdout)
# On Python 2 the formatted string may be unicode, e.g. when the
# error message is unicode, in which case it will print incorrectly.
# Once we're on Python 3 we can just drop the ensure_str().
print(six.ensure_str("%s: %s\n" % (sys.argv[0], e)), file=stdout)
sys.exit(1)
return config
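A tiny sketch of the Python 2 failure mode that comment guards against; the example strings are invented.

```python
import six

# On Python 2, "%s" interpolation with a unicode operand yields a
# unicode string; printing it to a stdout with an unsuitable encoding
# can produce mojibake or a UnicodeEncodeError.  six.ensure_str()
# converts it back to the native str type first.  On Python 3 it is
# effectively a no-op for str, so the call is harmless there.
message = six.ensure_str(u"%s: %s\n" % (u"tahoe", u"erreur d'analyse"))
print(message)
```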
@ -229,7 +239,8 @@ def _run_with_reactor(reactor):
_setup_coverage(reactor)
d = defer.maybeDeferred(parse_or_exit_with_explanation, sys.argv[1:])
argv = list(map(argv_to_unicode, sys.argv[1:]))
d = defer.maybeDeferred(parse_or_exit_with_explanation, argv)
d.addCallback(_maybe_enable_eliot_logging, reactor)
d.addCallback(dispatch)
def _show_exception(f):

@ -12,9 +12,17 @@ Some setup that should apply across the entire test suite.
Rather than defining interesting APIs for other code to use, this just causes
some side-effects which make things better when the test suite runs.
"""
from future.utils import PY3
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2, PY3
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import warnings
from traceback import extract_stack, format_list
@ -124,4 +132,4 @@ if sys.platform == "win32":
from eliot import to_file
from allmydata.util.jsonbytes import BytesJSONEncoder
to_file(open("eliot.log", "w"), encoder=BytesJSONEncoder)
to_file(open("eliot.log", "wb"), encoder=BytesJSONEncoder)

@ -539,8 +539,8 @@ class FakeCHKFileNode(object): # type: ignore # incomplete implementation
return defer.succeed(self)
def download_to_data(self, progress=None):
return download_to_data(self, progress=progress)
def download_to_data(self):
return download_to_data(self)
download_best_version = download_to_data
@ -717,11 +717,11 @@ class FakeMutableFileNode(object): # type: ignore # incomplete implementation
d.addCallback(_done)
return d
def download_best_version(self, progress=None):
return defer.succeed(self._download_best_version(progress=progress))
def download_best_version(self):
return defer.succeed(self._download_best_version())
def _download_best_version(self, ignored=None, progress=None):
def _download_best_version(self, ignored=None):
if isinstance(self.my_uri, uri.LiteralFileURI):
return self.my_uri.data
if self.storage_index not in self.all_contents:

@ -1592,7 +1592,7 @@ class FakeMutableFile(object): # type: ignore # incomplete implementation
def get_write_uri(self):
return self.uri.to_string()
def download_best_version(self, progress=None):
def download_best_version(self):
return defer.succeed(self.data)
def get_writekey(self):

@ -24,6 +24,7 @@ if PY2:
# Keep these sorted alphabetically, to reduce merge conflicts:
PORTED_MODULES = [
"allmydata",
"allmydata.__main__",
"allmydata._auto_deps",
"allmydata._monkeypatch",
@ -40,9 +41,11 @@ PORTED_MODULES = [
"allmydata.crypto.util",
"allmydata.deep_stats",
"allmydata.dirnode",
"allmydata.frontends",
"allmydata.frontends.sftpd",
"allmydata.hashtree",
"allmydata.history",
"allmydata.immutable",
"allmydata.immutable.checker",
"allmydata.immutable.downloader",
"allmydata.immutable.downloader.common",
@ -61,11 +64,13 @@ PORTED_MODULES = [
"allmydata.immutable.repairer",
"allmydata.immutable.upload",
"allmydata.interfaces",
"allmydata.introducer",
"allmydata.introducer.client",
"allmydata.introducer.common",
"allmydata.introducer.interfaces",
"allmydata.introducer.server",
"allmydata.monitor",
"allmydata.mutable",
"allmydata.mutable.checker",
"allmydata.mutable.common",
"allmydata.mutable.filenode",
@ -76,10 +81,13 @@ PORTED_MODULES = [
"allmydata.mutable.servermap",
"allmydata.node",
"allmydata.nodemaker",
"allmydata.scripts",
"allmydata.scripts.create_node",
"allmydata.scripts.runner",
"allmydata.scripts.types_",
"allmydata.stats",
"allmydata.storage_client",
"allmydata.storage",
"allmydata.storage.common",
"allmydata.storage.crawler",
"allmydata.storage.expirer",
@ -88,12 +96,18 @@ PORTED_MODULES = [
"allmydata.storage.mutable",
"allmydata.storage.server",
"allmydata.storage.shares",
"allmydata.test",
"allmydata.test.cli",
"allmydata.test.no_network",
"allmydata.test.matchers",
"allmydata.test.mutable",
"allmydata.test.mutable.util",
"allmydata.test.web",
"allmydata.testing",
"allmydata.testing.web",
"allmydata.unknown",
"allmydata.uri",
"allmydata.util",
"allmydata.util._python3",
"allmydata.util.abbreviate",
"allmydata.util.assertutil",
@ -125,6 +139,7 @@ PORTED_MODULES = [
"allmydata.util.statistics",
"allmydata.util.time_format",
"allmydata.util.tor_provider",
"allmydata.web",
"allmydata.web.check_results",
"allmydata.web.common",
"allmydata.web.directory",
@ -140,6 +155,7 @@ PORTED_MODULES = [
"allmydata.web.storage_plugins",
"allmydata.web.unlinked",
"allmydata.webish",
"allmydata.windows",
]
PORTED_TEST_MODULES = [

@ -9,10 +9,9 @@ from twisted.internet.interfaces import IConsumer
@implementer(IConsumer)
class MemoryConsumer(object):
def __init__(self, progress=None):
def __init__(self):
self.chunks = []
self.done = False
self._progress = progress
def registerProducer(self, p, streaming):
self.producer = p
@ -25,16 +24,14 @@ class MemoryConsumer(object):
def write(self, data):
self.chunks.append(data)
if self._progress is not None:
self._progress.set_progress(sum([len(c) for c in self.chunks]))
def unregisterProducer(self):
self.done = True
def download_to_data(n, offset=0, size=None, progress=None):
def download_to_data(n, offset=0, size=None):
"""
:param progress: None or an IProgress implementer
Return a Deferred that fires with the result of reading from the given filenode.
"""
d = n.read(MemoryConsumer(progress=progress), offset, size)
d = n.read(MemoryConsumer(), offset, size)
d.addCallback(lambda mc: b"".join(mc.chunks))
return d
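For orientation, a minimal sketch of calling the simplified helper; `read_whole_file` is a hypothetical wrapper, not part of the codebase, and assumes its argument provides the `read(consumer, offset, size)` interface shown above.

```python
from allmydata.util.consumer import download_to_data

def read_whole_file(filenode):
    # download_to_data() attaches a fresh MemoryConsumer to the
    # filenode and joins the collected chunks into one byte string.
    # Note there is no progress= argument any more.
    d = download_to_data(filenode)
    d.addCallback(lambda data: print("read %d bytes" % len(data)))
    return d
```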

@ -1,37 +0,0 @@
"""
Utilities relating to computing progress information.
Ties in with the "consumer" module also
"""
from allmydata.interfaces import IProgress
from zope.interface import implementer
@implementer(IProgress)
class PercentProgress(object):
"""
Represents progress as a percentage, from 0.0 to 100.0
"""
def __init__(self, total_size=None):
self._value = 0.0
self.set_progress_total(total_size)
def set_progress(self, value):
"IProgress API"
self._value = value
def set_progress_total(self, size):
"IProgress API"
if size is not None:
size = float(size)
self._total_size = size
@property
def progress(self):
if self._total_size is None:
return 0 # or 1.0?
if self._total_size <= 0.0:
return 0
return (self._value / self._total_size) * 100.0
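To make the deleted API concrete, a small sketch of how PercentProgress satisfied the IProgress contract quoted earlier in this diff (runnable only against a pre-removal tree, naturally):

```python
from allmydata.util.progress import PercentProgress  # the module this commit deletes

# Per the removed IProgress docstring: set_progress_total() must be
# called at least once before .progress is meaningful, and both calls
# must use the same units (bytes, in this example).
p = PercentProgress()
p.set_progress_total(1000)
p.set_progress(250)
assert p.progress == 25.0  # exposed as a percentage from 0.0 to 100.0
```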