mirror of https://github.com/tahoe-lafs/tahoe-lafs.git
synced 2025-01-02 19:26:44 +00:00

commit c624135524
Merge 'origin/master' into 3636.doc-toc-reorg
Changed files:
.github/workflows
docs
misc/checkers
newsfragments
    3616.minor, 3645.minor, 3651.minor, 3657.minor, 3659.documentation,
    3666.documentation, 3667.minor, 3669.minor, 3670.minor, 3671.minor, 3674.minor
setup.py
src/allmydata
    introducer
    scripts
    test
        _twisted_9607.py, _win_subprocess.py
        cli
            cli_node_api.py, common.py, common_util.py, common_web.py, eliotutil.py,
            status.py, storage_plugin.py, strategies.py, test_consumer.py,
            test_multi_introducers.py, test_python2_regressions.py, test_system.py,
            test_util.py, web
        util
.github/workflows/ci.yml (vendored): 2 changes
@@ -20,8 +20,10 @@ jobs:
        os:
          - macos-latest
          - windows-latest
          - ubuntu-latest
        python-version:
          - 2.7
          - 3.6
    steps:
@@ -65,6 +65,7 @@ preserving your privacy and security.
    contributing
    CODE_OF_CONDUCT
    developer-guide
+   ticket-triage
    release-checklist
    desert-island
@@ -13,6 +13,61 @@ Specifically, it should be possible to implement a Tahoe-LAFS storage server wit
The Tahoe-LAFS client will also need to change but it is not expected that it will be noticeably simplified by this change
(though this may be the first step towards simplifying it).

Motivation
----------

Foolscap
~~~~~~~~

Foolscap is a remote method invocation protocol with several distinctive features.
At its core it allows separate processes to refer to each other's objects and methods using a capability-based model.
This allows for extremely fine-grained access control in a system that remains highly securable without becoming overwhelmingly complicated.
Supporting this is a flexible and extensible serialization system which allows data to be exchanged between processes in carefully controlled ways.

Tahoe-LAFS avails itself of only a small portion of these features.
A Tahoe-LAFS storage server typically only exposes one object with a fixed set of methods to clients.
A Tahoe-LAFS introducer node does roughly the same.
Tahoe-LAFS exchanges simple data structures that have many common, standard serialized representations.

In exchange for this slight use of Foolscap's sophisticated mechanisms,
Tahoe-LAFS pays a substantial price:

* Foolscap is implemented only for Python.
  Tahoe-LAFS is thus limited to being implemented only in Python.
* There is only one Python implementation of Foolscap.
  That implementation is therefore the de facto standard, and understanding the protocol often requires understanding that implementation.
* The Foolscap developer community is very small.
  The implementation therefore advances very little, and a non-trivial part of the maintenance cost falls on the Tahoe-LAFS project.
* The extensible serialization system imposes substantial complexity compared to the simple data structures Tahoe-LAFS actually exchanges.

HTTP
~~~~

HTTP is a request/response protocol that has become the lingua franca of the internet.
Combined with the principles of Representational State Transfer (REST) it is widely employed to create, update, and delete data in collections on the internet.
HTTP itself provides only modest functionality in comparison to Foolscap.
However, its simplicity and widespread use have led to a diverse and almost overwhelming ecosystem of libraries, frameworks, toolkits, and so on.

By adopting HTTP in place of Foolscap, Tahoe-LAFS can realize the following concrete benefits:

* Practically every language or runtime has an HTTP protocol implementation (or a dozen of them) available.
  This change paves the way for new Tahoe-LAFS implementations using tools better suited for certain situations
  (mobile client implementations, high-performance server implementations, easily distributed desktop clients, etc).
* The simplicity of, and vast quantity of resources about, HTTP make it a very easy protocol to learn and use.
  This change reduces the barrier to entry for developers to contribute improvements to Tahoe-LAFS's network interactions.
* For any given language there is very likely an HTTP implementation with a large and active developer community.
  Tahoe-LAFS can therefore benefit from the large effort being put into making better libraries for using HTTP.
* One of the core features of HTTP is the mundane transfer of bulk data, and implementations are often capable of doing this with extreme efficiency.
  Because bulk data transfer is also a core activity of Tahoe-LAFS, this alignment removes a substantial barrier to improved Tahoe-LAFS runtime performance.
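The bulk-transfer point above can be illustrated with nothing but the standard library. The sketch below stands up a throwaway in-memory HTTP server and round-trips a megabyte through it; the ``/demo-share`` path and the in-memory store are hypothetical illustrations for this document, not part of any actual Tahoe-LAFS or Great Black Swamp API.

```python
# Minimal sketch: bulk upload and download over HTTP using only the Python
# standard library. The path and storage scheme are invented for the demo.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

STORE = {}  # path -> bytes, a stand-in for real share storage

class ShareHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        length = int(self.headers["Content-Length"])
        STORE[self.path] = self.rfile.read(length)  # store the uploaded body
        self.send_response(201)
        self.end_headers()

    def do_GET(self):
        body = STORE.get(self.path, b"")
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = ThreadingHTTPServer(("127.0.0.1", 0), ShareHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

data = b"\x00" * (1024 * 1024)  # one megabyte of share data
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("PUT", "/demo-share", body=data)
resp = conn.getresponse()
resp.read()  # drain the response so the connection can be reused
assert resp.status == 201
conn.request("GET", "/demo-share")
assert conn.getresponse().read() == data
server.shutdown()
```

That an upload-plus-download round trip fits in a few dozen lines, in any language with an HTTP library, is the ecosystem advantage the bullet list describes.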

TLS
~~~

The Foolscap-based protocol provides *some* of Tahoe-LAFS's confidentiality, integrity, and authentication properties by leveraging TLS.
An HTTP-based protocol can make use of TLS in largely the same way to provide the same properties.
Provision of these properties *is* dependent on implementers following Great Black Swamp's rules for x509 certificate validation
(rather than the standard "web" rules for validation).
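One common shape for non-web certificate validation is pinning: comparing a digest of the server's certificate against an expected value rather than chaining to a web CA. The sketch below is an assumption about the general shape of such a check, not the protocol's actual specification; the function name and the choice of SHA-256 are illustrative only.

```python
# Hypothetical sketch: accept a server certificate when the SHA-256 digest of
# its DER encoding matches a previously shared pin, instead of applying
# standard web-PKI chain validation. Names here are illustrative.
import hashlib
import hmac

def certificate_matches(der_cert_bytes, expected_hex_digest):
    """Return True when the certificate's SHA-256 digest matches the pin."""
    actual = hashlib.sha256(der_cert_bytes).hexdigest()
    # hmac.compare_digest gives a timing-safe equality check.
    return hmac.compare_digest(actual, expected_hex_digest)

# Demonstration with stand-in bytes rather than a real DER certificate:
fake_cert = b"not-a-real-DER-certificate"
pin = hashlib.sha256(fake_cert).hexdigest()
assert certificate_matches(fake_cert, pin)
assert not certificate_matches(b"tampered", pin)
```

Under this model the pin travels with the server's address (out of band from TLS itself), so validation does not depend on any certificate authority.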

Requirements
------------
docs/ticket-triage.rst (new file, 27 lines)
@@ -0,0 +1,27 @@
=============
Ticket Triage
=============

Ticket triage is a weekly, informal ritual that is meant to solve the problem of
tickets getting opened and then forgotten about. It is simple, keeps project
momentum going, and prevents ticket cruft.

It fosters conversation around project tasks and philosophies as they relate to
milestones.

Process
-------
- The role of Ticket Triager rotates regularly-ish, and is assigned ad hoc
- The Triager needs a ``Trac`` account
- The Triager looks at all the tickets that have been created in the last week (or month, etc.)

  - They can use a custom query or do this as the week progresses
  - BONUS ROUND: Dig up a stale ticket from the past

- Assign each ticket to a milestone on the Roadmap
- The following situations merit discussion:

  - A ticket doesn't have an appropriate milestone and we should create one
  - A ticket, in vanishingly rare circumstances, should be deleted

    - The ticket is spam
    - The ticket contains sensitive information and harm will come to one or more people if it continues to be distributed

  - A ticket could be assigned to multiple milestones
  - There is another question about a ticket

- These tickets will be brought as necessary to one of our meetings (currently Tuesdays) for discussion
newsfragments/3616.minor (new file, empty)
newsfragments/3645.minor (new file, empty)
@@ -0,0 +1 @@
+We added documentation detailing the project's ticket triage process
newsfragments/3657.minor (new file, empty)
newsfragments/3659.documentation (new file, empty)
newsfragments/3666.documentation (new file, 1 line)
@@ -0,0 +1 @@
+`tox -e docs` will treat warnings about docs as errors.
newsfragments/3667.minor (new file, empty)
newsfragments/3669.minor (new file, empty)
newsfragments/3670.minor (new file, empty)
newsfragments/3671.minor (new file, empty)
newsfragments/3674.minor (new file, empty)
setup.py: 3 changes
@@ -11,6 +11,7 @@ import sys
 # See the docs/about.rst file for licensing information.

 import os, subprocess, re
+from io import open

 basedir = os.path.dirname(os.path.abspath(__file__))

@@ -357,7 +358,7 @@ if version:

 setup(name="tahoe-lafs", # also set in __init__.py
       description='secure, decentralized, fault-tolerant file store',
-      long_description=open('README.rst', 'rU').read(),
+      long_description=open('README.rst', 'r', encoding='utf-8').read(),
       author='the Tahoe-LAFS project',
       author_email='tahoe-dev@tahoe-lafs.org',
       url='https://tahoe-lafs.org/',
@@ -11,9 +11,11 @@ if PY2:
     from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

 import re

+from foolscap.furl import decode_furl
 from allmydata.crypto.util import remove_prefix
 from allmydata.crypto import ed25519
-from allmydata.util import base32, rrefutil, jsonbytes as json
+from allmydata.util import base32, jsonbytes as json


 def get_tubid_string_from_ann(ann):
@@ -123,10 +125,10 @@ class AnnouncementDescriptor(object):
         self.service_name = ann_d["service-name"]
         self.version = ann_d.get("my-version", "")
         self.nickname = ann_d.get("nickname", u"")
-        (service_name, key_s) = index
+        (_, key_s) = index
         self.serverid = key_s
         furl = ann_d.get("anonymous-storage-FURL")
         if furl:
-            self.connection_hints = rrefutil.connection_hints_for_furl(furl)
+            _, self.connection_hints, _ = decode_furl(furl)
         else:
             self.connection_hints = []
|
@ -24,11 +24,12 @@ except ImportError:
|
||||
from zope.interface import implementer
|
||||
from twisted.application import service
|
||||
from twisted.internet import defer
|
||||
from twisted.internet.address import IPv4Address
|
||||
from twisted.python.failure import Failure
|
||||
from foolscap.api import Referenceable
|
||||
import allmydata
|
||||
from allmydata import node
|
||||
from allmydata.util import log, rrefutil, dictutil
|
||||
from allmydata.util import log, dictutil
|
||||
from allmydata.util.i2p_provider import create as create_i2p_provider
|
||||
from allmydata.util.tor_provider import create as create_tor_provider
|
||||
from allmydata.introducer.interfaces import \
|
||||
@ -148,6 +149,15 @@ class _IntroducerNode(node.Node):
|
||||
ws = IntroducerWebishServer(self, webport, nodeurl_path, staticdir)
|
||||
ws.setServiceParent(self)
|
||||
|
||||
|
||||
def stringify_remote_address(rref):
|
||||
remote = rref.getPeer()
|
||||
if isinstance(remote, IPv4Address):
|
||||
return "%s:%d" % (remote.host, remote.port)
|
||||
# loopback is a non-IPv4Address
|
||||
return str(remote)
|
||||
|
||||
|
||||
@implementer(RIIntroducerPublisherAndSubscriberService_v2)
|
||||
class IntroducerService(service.MultiService, Referenceable):
|
||||
name = "introducer"
|
||||
@ -216,7 +226,7 @@ class IntroducerService(service.MultiService, Referenceable):
|
||||
# tubid will be None. Also, subscribers do not tell us which
|
||||
# pubkey they use; only publishers do that.
|
||||
tubid = rref.getRemoteTubID() or "?"
|
||||
remote_address = rrefutil.stringify_remote_address(rref)
|
||||
remote_address = stringify_remote_address(rref)
|
||||
# these three assume subscriber_info["version"]==0, but
|
||||
# should tolerate other versions
|
||||
nickname = subscriber_info.get("nickname", u"?")
|
||||
|
@@ -351,7 +351,7 @@ class BackupOptions(FileStoreOptions):
         line. The file is assumed to be in the argv encoding."""
         abs_filepath = argv_to_abspath(filepath)
         try:
-            exclude_file = file(abs_filepath)
+            exclude_file = open(abs_filepath)
         except:
             raise BackupConfigurationError('Error opening exclude file %s.' % quote_local_unicode_path(abs_filepath))
         try:
@@ -1,7 +1,6 @@
 # coding: utf-8

 from __future__ import print_function
-from six import ensure_str

 import os, sys, textwrap
 import codecs
@@ -22,11 +21,13 @@ from yaml import (
 from future.utils import PY2
 if PY2:
     from future.builtins import str  # noqa: F401
+else:
+    from typing import Union

 from twisted.python import usage

 from allmydata.util.assertutil import precondition
-from allmydata.util.encodingutil import unicode_to_url, quote_output, \
+from allmydata.util.encodingutil import quote_output, \
     quote_local_unicode_path, argv_to_abspath
 from allmydata.scripts.default_nodedir import _default_nodedir

@@ -274,18 +275,27 @@ def get_alias(aliases, path_unicode, default):
     return uri.from_string_dirnode(aliases[alias]).to_string(), path[colon+1:]

 def escape_path(path):
-    # type: (str) -> str
+    # type: (Union[str,bytes]) -> str
     u"""
     Return path quoted to US-ASCII, valid URL characters.

     >>> path = u'/føö/bar/☃'
     >>> escaped = escape_path(path)
-    >>> str(escaped)
-    '/f%C3%B8%C3%B6/bar/%E2%98%83'
+    >>> escaped.encode('ascii').decode('ascii') == escaped
+    True
+    >>> escaped
+    u'/f%C3%B8%C3%B6/bar/%E2%98%83'
     """
-    segments = path.split("/")
-    result = "/".join([urllib.parse.quote(unicode_to_url(s)) for s in segments])
-    result = ensure_str(result, "ascii")
+    if isinstance(path, str):
+        path = path.encode("utf-8")
+    segments = path.split(b"/")
+    result = str(
+        b"/".join([
+            urllib.parse.quote(s).encode("ascii") for s in segments
+        ]),
+        "ascii"
+    )
+    # Eventually (i.e. as part of Python 3 port) we want this to always return
+    # Unicode strings. However, to reduce diff sizes in the short term it'll
+    # return native string (i.e. bytes) on Python 2.
+    if PY2:
+        result = result.encode("ascii").__native__()
     return result
@@ -449,12 +449,13 @@ def create_node(config):
             v = remote_config.get(k, None)
             if v is not None:
                 # we're faking usually argv-supplied options :/
+                v_orig = v
                 if isinstance(v, str):
                     v = v.encode(get_io_encoding())
                 config[k] = v
-                if k not in sensitive_keys:
-                    print("  {}: {}".format(k, v), file=out)
+                if k not in ['shares-happy', 'shares-total', 'shares-needed']:
+                    print("  {}: {}".format(k, v_orig), file=out)
                 else:
                     print("  {}: [sensitive data; see tahoe.cfg]".format(k), file=out)
@@ -1,14 +1,16 @@
 from __future__ import print_function

+from past.builtins import unicode
+
 import os.path
 import time
-import urllib
-import json
+from urllib.parse import quote as url_quote
 import datetime

 from allmydata.scripts.common import get_alias, escape_path, DEFAULT_ALIAS, \
     UnknownAliasError
 from allmydata.scripts.common_http import do_http, HTTPError, format_http_error
-from allmydata.util import time_format
+from allmydata.util import time_format, jsonbytes as json
 from allmydata.scripts import backupdb
 from allmydata.util.encodingutil import listdir_unicode, quote_output, \
     quote_local_unicode_path, to_bytes, FilenameEncodingError, unicode_to_url
@@ -52,7 +54,7 @@ def mkdir(contents, options):

 def put_child(dirurl, childname, childcap):
     assert dirurl[-1] != "/"
-    url = dirurl + "/" + urllib.quote(unicode_to_url(childname)) + "?t=uri"
+    url = dirurl + "/" + url_quote(unicode_to_url(childname)) + "?t=uri"
     resp = do_http("PUT", url, childcap)
     if resp.status not in (200, 201):
         raise HTTPError("Error during put_child", resp)
@@ -97,7 +99,7 @@ class BackerUpper(object):
         except UnknownAliasError as e:
             e.display(stderr)
             return 1
-        to_url = nodeurl + "uri/%s/" % urllib.quote(rootcap)
+        to_url = nodeurl + "uri/%s/" % url_quote(rootcap)
         if path:
             to_url += escape_path(path)
         if not to_url.endswith("/"):
@@ -165,7 +167,7 @@ class BackerUpper(object):
         if must_create:
             self.verboseprint(" creating directory for %s" % quote_local_unicode_path(path))
             newdircap = mkdir(create_contents, self.options)
-            assert isinstance(newdircap, str)
+            assert isinstance(newdircap, bytes)
             if r:
                 r.did_create(newdircap)
             return True, newdircap
@@ -192,7 +194,7 @@ class BackerUpper(object):
         filecap = r.was_uploaded()
         self.verboseprint("checking %s" % quote_output(filecap))
         nodeurl = self.options['node-url']
-        checkurl = nodeurl + "uri/%s?t=check&output=JSON" % urllib.quote(filecap)
+        checkurl = nodeurl + "uri/%s?t=check&output=JSON" % url_quote(filecap)
         self._files_checked += 1
         resp = do_http("POST", checkurl)
         if resp.status != 200:
@@ -225,7 +227,7 @@ class BackerUpper(object):
         dircap = r.was_created()
         self.verboseprint("checking %s" % quote_output(dircap))
         nodeurl = self.options['node-url']
-        checkurl = nodeurl + "uri/%s?t=check&output=JSON" % urllib.quote(dircap)
+        checkurl = nodeurl + "uri/%s?t=check&output=JSON" % url_quote(dircap)
         self._directories_checked += 1
         resp = do_http("POST", checkurl)
         if resp.status != 200:
@@ -345,7 +347,7 @@ class FileTarget(object):
             target = PermissionDeniedTarget(self._path, isdir=False)
             return target.backup(progress, upload_file, upload_directory)
         else:
-            assert isinstance(childcap, str)
+            assert isinstance(childcap, bytes)
             if created:
                 return progress.created_file(self._path, childcap, metadata)
             return progress.reused_file(self._path, childcap, metadata)
@@ -525,12 +527,12 @@ class BackupProgress(object):
         return self, {
             os.path.basename(create_path): create_value
             for (create_path, create_value)
-            in self._create_contents.iteritems()
+            in self._create_contents.items()
             if os.path.dirname(create_path) == dirpath
         }, {
             os.path.basename(compare_path): compare_value
             for (compare_path, compare_value)
-            in self._compare_contents.iteritems()
+            in self._compare_contents.items()
             if os.path.dirname(compare_path) == dirpath
         }
@@ -1,6 +1,6 @@
 from __future__ import print_function

-import urllib
+from urllib.parse import quote as url_quote
 from allmydata.scripts.common import get_alias, DEFAULT_ALIAS, escape_path, \
     UnknownAliasError
 from allmydata.scripts.common_http import do_http, format_http_error
@@ -20,7 +20,7 @@ def get(options):
     except UnknownAliasError as e:
         e.display(stderr)
         return 1
-    url = nodeurl + "uri/%s" % urllib.quote(rootcap)
+    url = nodeurl + "uri/%s" % url_quote(rootcap)
     if path:
         url += "/" + escape_path(path)

@@ -30,6 +30,10 @@ def get(options):
             outf = open(to_file, "wb")
         else:
             outf = stdout
+            # Make sure we can write bytes; on Python 3 stdout is Unicode by
+            # default.
+            if getattr(outf, "encoding", None) is not None:
+                outf = outf.buffer
         while True:
             data = resp.read(4096)
             if not data:
@@ -1,6 +1,10 @@
 from __future__ import print_function

-import urllib, time
+from past.builtins import unicode
+from six import ensure_text, ensure_str
+
+import time
+from urllib.parse import quote as url_quote
 import json
 from allmydata.scripts.common import get_alias, DEFAULT_ALIAS, escape_path, \
     UnknownAliasError
@@ -23,7 +27,7 @@ def list(options):
     except UnknownAliasError as e:
         e.display(stderr)
         return 1
-    url = nodeurl + "uri/%s" % urllib.quote(rootcap)
+    url = nodeurl + "uri/%s" % url_quote(rootcap)
     if path:
         # move where.endswith check here?
         url += "/" + escape_path(path)
@@ -149,9 +153,9 @@ def list(options):
             line.append(quote_output(name) + classify)

         if options["uri"]:
-            line.append(uri)
+            line.append(ensure_str(uri))
         if options["readonly-uri"]:
-            line.append(quote_output(ro_uri or "-", quotemarks=False))
+            line.append(quote_output(ensure_str(ro_uri) or "-", quotemarks=False))

         rows.append((encoding_error, line))

@@ -164,7 +168,7 @@ def list(options):
             while len(left_justifys) <= i:
                 left_justifys.append(False)
             max_widths[i] = max(max_widths[i], len(cell))
-            if cell.startswith("URI"):
+            if ensure_text(cell).startswith("URI"):
                 left_justifys[i] = True
         if len(left_justifys) == 1:
             left_justifys[0] = True
@@ -1,74 +0,0 @@ (file deleted)
"""
A copy of the implementation of Twisted's ``getProcessOutputAndValue``
with the fix for Twisted #9607 (support for stdinBytes) patched in.
"""

from __future__ import (
    division,
    absolute_import,
    print_function,
    unicode_literals,
)

from io import BytesIO

from twisted.internet import protocol, defer


class _EverythingGetter(protocol.ProcessProtocol, object):

    def __init__(self, deferred, stdinBytes=None):
        self.deferred = deferred
        self.outBuf = BytesIO()
        self.errBuf = BytesIO()
        self.outReceived = self.outBuf.write
        self.errReceived = self.errBuf.write
        self.stdinBytes = stdinBytes

    def connectionMade(self):
        if self.stdinBytes is not None:
            self.transport.writeToChild(0, self.stdinBytes)
            # The only compelling reason not to _always_ close stdin here is
            # backwards compatibility.
            self.transport.closeStdin()

    def processEnded(self, reason):
        out = self.outBuf.getvalue()
        err = self.errBuf.getvalue()
        e = reason.value
        code = e.exitCode
        if e.signal:
            self.deferred.errback((out, err, e.signal))
        else:
            self.deferred.callback((out, err, code))


def _callProtocolWithDeferred(protocol, executable, args, env, path,
                              reactor=None, protoArgs=()):
    if reactor is None:
        from twisted.internet import reactor

    d = defer.Deferred()
    p = protocol(d, *protoArgs)
    reactor.spawnProcess(p, executable, (executable,)+tuple(args), env, path)
    return d


def getProcessOutputAndValue(executable, args=(), env={}, path=None,
                             reactor=None, stdinBytes=None):
    """Spawn a process and returns a Deferred that will be called back with
    its output (from stdout and stderr) and its exit code as (out, err, code)
    If a signal is raised, the Deferred will errback with the stdout and
    stderr up to that point, along with the signal, as (out, err, signalNum)
    """
    return _callProtocolWithDeferred(
        _EverythingGetter,
        executable,
        args,
        env,
        path,
        reactor,
        protoArgs=(stdinBytes,),
    )
@@ -1,3 +1,13 @@
+"""
+This module is only necessary on Python 2. Once Python 2 code is dropped, it
+can be deleted.
+"""
+
+from future.utils import PY3
+if PY3:
+    raise RuntimeError("Just use subprocess.Popen")
+
+
 # -*- coding: utf-8 -*-

 ## Copyright (C) 2021 Valentin Lab
@ -1,3 +1,18 @@
|
||||
"""
|
||||
Ported to Python 3.
|
||||
"""
|
||||
from __future__ import absolute_import
|
||||
from __future__ import division
|
||||
from __future__ import print_function
|
||||
from __future__ import unicode_literals
|
||||
|
||||
from future.utils import PY2
|
||||
if PY2:
|
||||
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
|
||||
import __builtin__ as builtins
|
||||
else:
|
||||
import builtins
|
||||
|
||||
import os.path
|
||||
from six.moves import cStringIO as StringIO
|
||||
from datetime import timedelta
|
||||
@ -6,7 +21,6 @@ import re
|
||||
from twisted.trial import unittest
|
||||
from twisted.python.monkey import MonkeyPatcher
|
||||
|
||||
import __builtin__
|
||||
from allmydata.util import fileutil
|
||||
from allmydata.util.fileutil import abspath_expanduser_unicode
|
||||
from allmydata.util.encodingutil import get_io_encoding, unicode_to_argv
|
||||
@ -86,7 +100,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
d.addCallback(lambda res: do_backup(True))
|
||||
def _check0(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertEqual(len(err), 0, err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
(
|
||||
files_uploaded,
|
||||
@ -143,40 +157,40 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
d.addCallback(lambda res: self.do_cli("ls", "--uri", "tahoe:backups"))
|
||||
def _check1(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertEqual(len(err), 0, err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
lines = out.split("\n")
|
||||
children = dict([line.split() for line in lines if line])
|
||||
latest_uri = children["Latest"]
|
||||
self.failUnless(latest_uri.startswith("URI:DIR2-CHK:"), latest_uri)
|
||||
childnames = children.keys()
|
||||
childnames = list(children.keys())
|
||||
self.failUnlessReallyEqual(sorted(childnames), ["Archives", "Latest"])
|
||||
d.addCallback(_check1)
|
||||
d.addCallback(lambda res: self.do_cli("ls", "tahoe:backups/Latest"))
|
||||
def _check2(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertEqual(len(err), 0, err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
self.failUnlessReallyEqual(sorted(out.split()), ["empty", "parent"])
|
||||
d.addCallback(_check2)
|
||||
d.addCallback(lambda res: self.do_cli("ls", "tahoe:backups/Latest/empty"))
|
||||
def _check2a(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertEqual(len(err), 0, err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
self.failUnlessReallyEqual(out.strip(), "")
|
||||
self.assertFalse(out.strip())
|
||||
d.addCallback(_check2a)
|
||||
d.addCallback(lambda res: self.do_cli("get", "tahoe:backups/Latest/parent/subdir/foo.txt"))
|
||||
def _check3(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
self.failUnlessReallyEqual(out, "foo")
|
||||
self.assertEqual(out, "foo")
|
||||
d.addCallback(_check3)
|
||||
d.addCallback(lambda res: self.do_cli("ls", "tahoe:backups/Archives"))
|
||||
def _check4(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
self.old_archives = out.split()
|
||||
self.failUnlessReallyEqual(len(self.old_archives), 1)
|
||||
@ -189,7 +203,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
# second backup should reuse everything, if the backupdb is
|
||||
# available
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
fu, fr, fs, dc, dr, ds = self.count_output(out)
|
||||
# foo.txt, bar.txt, blah.txt
|
||||
@ -221,7 +235,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
# the directories should have been changed, so we should
|
||||
# re-use all of them too.
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
fu, fr, fs, dc, dr, ds = self.count_output(out)
|
||||
fchecked, dchecked = self.count_output2(out)
|
||||
@ -238,7 +252,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
d.addCallback(lambda res: self.do_cli("ls", "tahoe:backups/Archives"))
|
||||
def _check5(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
self.new_archives = out.split()
|
||||
self.failUnlessReallyEqual(len(self.new_archives), 3, out)
|
||||
@ -265,7 +279,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
# second backup should reuse bar.txt (if backupdb is available),
|
||||
# and upload the rest. None of the directories can be reused.
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
fu, fr, fs, dc, dr, ds = self.count_output(out)
|
||||
# new foo.txt, surprise file, subfile, empty
|
||||
@ -281,7 +295,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
d.addCallback(lambda res: self.do_cli("ls", "tahoe:backups/Archives"))
|
||||
def _check6(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
self.assertFalse(err)
|
||||
self.failUnlessReallyEqual(rc, 0)
|
||||
self.new_archives = out.split()
|
||||
self.failUnlessReallyEqual(len(self.new_archives), 4)
|
||||
@ -291,17 +305,17 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
|
||||
d.addCallback(lambda res: self.do_cli("get", "tahoe:backups/Latest/parent/subdir/foo.txt"))
|
||||
def _check7(args):
|
||||
(rc, out, err) = args
|
||||
self.failUnlessReallyEqual(err, "")
|
||||
            self.assertFalse(err)
            self.failUnlessReallyEqual(rc, 0)
            self.failUnlessReallyEqual(out, "FOOF!")
            self.assertEqual(out, "FOOF!")
            # the old snapshot should not be modified
            return self.do_cli("get", "tahoe:backups/Archives/%s/parent/subdir/foo.txt" % self.old_archives[0])
        d.addCallback(_check7)
        def _check8(args):
            (rc, out, err) = args
            self.failUnlessReallyEqual(err, "")
            self.assertFalse(err)
            self.failUnlessReallyEqual(rc, 0)
            self.failUnlessReallyEqual(out, "foo")
            self.assertEqual(out, "foo")
        d.addCallback(_check8)

        return d
@ -382,7 +396,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
        self._check_filtering(filtered, root_listdir, (u'_darcs', u'subdir'),
                              (nice_doc, u'lib.a'))
        # read exclude patterns from file
        exclusion_string = doc_pattern_arg + "\nlib.?"
        exclusion_string = doc_pattern_arg + b"\nlib.?"
        excl_filepath = os.path.join(basedir, 'exclusion')
        fileutil.write(excl_filepath, exclusion_string)
        backup_options = parse(['--exclude-from', excl_filepath, 'from', 'to'])
@ -407,12 +421,16 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):

        ns = Namespace()
        ns.called = False
        original_open = open
        def call_file(name, *args):
            ns.called = True
            self.failUnlessEqual(name, abspath_expanduser_unicode(exclude_file))
            return StringIO()
            if name.endswith("excludes.dummy"):
                ns.called = True
                self.failUnlessEqual(name, abspath_expanduser_unicode(exclude_file))
                return StringIO()
            else:
                return original_open(name, *args)

        patcher = MonkeyPatcher((__builtin__, 'file', call_file))
        patcher = MonkeyPatcher((builtins, 'open', call_file))
        patcher.runWithPatches(parse_options, basedir, "backup", ['--exclude-from', unicode_to_argv(exclude_file), 'from', 'to'])
        self.failUnless(ns.called)

@ -584,7 +602,7 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
            (rc, out, err) = args
            self.failUnlessReallyEqual(rc, 1)
            self.failUnlessIn("error:", err)
            self.failUnlessReallyEqual(out, "")
            self.assertEqual(len(out), 0)
        d.addCallback(_check)
        return d

@ -600,6 +618,6 @@ class Backup(GridTestMixin, CLITestMixin, StallMixin, unittest.TestCase):
            self.failUnlessReallyEqual(rc, 1)
            self.failUnlessIn("error:", err)
            self.failUnlessIn("nonexistent", err)
            self.failUnlessReallyEqual(out, "")
            self.assertEqual(len(out), 0)
        d.addCallback(_check)
        return d
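The hunk above swaps Python 2's `file` builtin for Python 3's `builtins.open` as the monkeypatch target. The same interception idea can be sketched with the stdlib's `unittest.mock.patch` in place of Twisted's `MonkeyPatcher`; the path and pattern contents below are hypothetical stand-ins for the test's fixtures:

```python
import builtins
import io
from unittest import mock

calls = []
real_open = builtins.open

def fake_open(name, *args, **kwargs):
    # Intercept only the sentinel path; defer everything else to the real open()
    if str(name).endswith("excludes.dummy"):
        calls.append(name)
        return io.StringIO("doc_pattern\nlib.?\n")
    return real_open(name, *args, **kwargs)

def read_excludes(path):
    with open(path) as f:
        return f.read().splitlines()

with mock.patch("builtins.open", fake_open):
    patterns = read_excludes("/tmp/excludes.dummy")

assert patterns == ["doc_pattern", "lib.?"]
assert calls == ["/tmp/excludes.dummy"]
```

Patching `builtins.open` works on Python 3 because module-level `open(...)` calls resolve through the builtins namespace, which is exactly why the diff can drop the Python-2-only `file` builtin.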
@ -37,10 +37,26 @@ from allmydata.util import jsonbytes as json

from ..no_network import GridTestMixin
from ..common_web import do_http
from ..status import FakeStatus
from .common import CLITestMixin


class FakeStatus(object):
    def __init__(self):
        self.status = []

    def setServiceParent(self, p):
        pass

    def get_status(self):
        return self.status

    def get_storage_index(self):
        return None

    def get_size(self):
        return None


class ProgressBar(unittest.TestCase):

    def test_ascii0(self):
@ -1,3 +1,14 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

__all__ = [
    "CLINodeAPI",
@ -81,7 +92,7 @@ class _ProcessProtocolAdapter(ProcessProtocol, object):
        self._fds = fds

    def connectionMade(self):
        for proto in self._fds.values():
        for proto in list(self._fds.values()):
            proto.makeConnection(self.transport)

    def childDataReceived(self, childFD, data):
@ -94,7 +105,7 @@ class _ProcessProtocolAdapter(ProcessProtocol, object):

    def processEnded(self, reason):
        notified = set()
        for proto in self._fds.values():
        for proto in list(self._fds.values()):
            if proto not in notified:
                proto.connectionLost(reason)
                notified.add(proto)
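The two hunks above wrap `self._fds.values()` in `list(...)` because on Python 3 `.values()` returns a live view, and a callback that mutates the dictionary mid-loop makes the view raise; materializing a snapshot first avoids that. A minimal demonstration with plain dicts (no Twisted involved):

```python
fds = {1: "a", 2: "b"}

# Python 3: mutating a dict while iterating its live .values() view raises.
try:
    for v in fds.values():
        fds[3] = "c"  # mutation during iteration
    view_survived = True
except RuntimeError:
    view_survived = False

# Materializing first, as the ported code does, makes mutation safe.
fds = {1: "a", 2: "b"}
for v in list(fds.values()):
    fds[len(fds) + 1] = v.upper()

assert view_survived is False
assert fds == {1: "a", 2: "b", 3: "A", 4: "B"}
```

On Python 2 `.values()` already returned a list, so the diff simply makes the old snapshot semantics explicit.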
@ -1,4 +1,15 @@
"""
Ported to Python 3.
"""
from __future__ import print_function
from __future__ import absolute_import
from __future__ import division
from __future__ import unicode_literals

from future.utils import PY2, native_str
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
from past.builtins import chr as byteschr

__all__ = [
    "SyncTestCase",
@ -15,8 +26,6 @@ __all__ = [
    "PIPE",
]

from past.builtins import chr as byteschr, unicode

import sys
import os, random, struct
import six
@ -106,7 +115,7 @@ from .eliotutil import (
)
from .common_util import ShouldFailMixin  # noqa: F401

if sys.platform == "win32":
if sys.platform == "win32" and PY2:
    # Python 2.7 doesn't have good options for launching a process with
    # non-ASCII in its command line.  So use this alternative that does a
    # better job.  However, only use it on Windows because it doesn't work
@ -253,7 +262,7 @@ class UseNode(object):
    plugin_config = attr.ib()
    storage_plugin = attr.ib()
    basedir = attr.ib(validator=attr.validators.instance_of(FilePath))
    introducer_furl = attr.ib(validator=attr.validators.instance_of(str),
    introducer_furl = attr.ib(validator=attr.validators.instance_of(native_str),
                              converter=six.ensure_str)
    node_config = attr.ib(default=attr.Factory(dict))

@ -264,7 +273,7 @@ class UseNode(object):
        return "\n".join(
            " = ".join((key, value))
            for (key, value)
            in config.items()
            in list(config.items())
        )

        if self.plugin_config is None:
@ -849,17 +858,17 @@ class WebErrorMixin(object):
                       callable=None, *args, **kwargs):
        # returns a Deferred with the response body
        if isinstance(substring, bytes):
            substring = unicode(substring, "ascii")
        if isinstance(response_substring, unicode):
            substring = str(substring, "ascii")
        if isinstance(response_substring, str):
            response_substring = response_substring.encode("ascii")
        assert substring is None or isinstance(substring, unicode)
        assert substring is None or isinstance(substring, str)
        assert response_substring is None or isinstance(response_substring, bytes)
        assert callable
        def _validate(f):
            if code is not None:
                self.failUnlessEqual(f.value.status, b"%d" % code, which)
            if substring:
                code_string = unicode(f)
                code_string = str(f)
                self.failUnless(substring in code_string,
                                "%s: substring '%s' not in '%s'"
                                % (which, substring, code_string))
@ -882,7 +891,7 @@ class WebErrorMixin(object):
        body = yield response.content()
        self.assertEquals(response.code, code)
        if response_substring is not None:
            if isinstance(response_substring, unicode):
            if isinstance(response_substring, str):
                response_substring = response_substring.encode("utf-8")
            self.assertIn(response_substring, body)
        returnValue(body)

@ -1,8 +1,15 @@
"""
Ported to Python 3.
"""
from __future__ import print_function
from __future__ import absolute_import
from __future__ import division
from __future__ import unicode_literals

from future.utils import PY2, bchr, binary_type
from future.builtins import str as future_str
from past.builtins import unicode
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, dict, list, object, range, str, max, min  # noqa: F401

import os
import time
@ -20,11 +27,11 @@ from twisted.trial import unittest

from ..util.assertutil import precondition
from ..scripts import runner
from allmydata.util.encodingutil import unicode_platform, get_filesystem_encoding, get_io_encoding, argv_type, unicode_to_argv
from allmydata.util.encodingutil import unicode_platform, get_filesystem_encoding, argv_type, unicode_to_argv


def skip_if_cannot_represent_filename(u):
    precondition(isinstance(u, unicode))
    precondition(isinstance(u, str))

    enc = get_filesystem_encoding()
    if not unicode_platform():
@ -33,13 +40,6 @@ def skip_if_cannot_represent_filename(u):
    except UnicodeEncodeError:
        raise unittest.SkipTest("A non-ASCII filename could not be encoded on this platform.")

def skip_if_cannot_represent_argv(u):
    precondition(isinstance(u, unicode))
    try:
        u.encode(get_io_encoding())
    except UnicodeEncodeError:
        raise unittest.SkipTest("A non-ASCII argv could not be encoded on this platform.")


def _getvalue(io):
    """
@ -51,7 +51,7 @@ def _getvalue(io):

def maybe_unicode_to_argv(o):
    """Convert object to argv form if necessary."""
    if isinstance(o, unicode):
    if isinstance(o, str):
        return unicode_to_argv(o)
    return o

@ -94,13 +94,18 @@ def run_cli_native(verb, *args, **kwargs):
    argv = nodeargs + [verb] + list(args)
    stdin = kwargs.get("stdin", "")
    if encoding is None:
        # The original behavior, the Python 2 behavior, is to accept either
        # bytes or unicode and try to automatically encode or decode as
        # necessary.  This works okay for ASCII and if LANG is set
        # appropriately.  These aren't great constraints so we should move
        # away from this behavior.
        stdout = StringIO()
        stderr = StringIO()
        if PY2:
            # The original behavior, the Python 2 behavior, is to accept either
            # bytes or unicode and try to automatically encode or decode as
            # necessary.  This works okay for ASCII and if LANG is set
            # appropriately.  These aren't great constraints so we should move
            # away from this behavior.
            stdout = StringIO()
            stderr = StringIO()
        else:
            # Default on Python 3 is accepting text.
            stdout = TextIOWrapper(BytesIO(), "utf-8")
            stderr = TextIOWrapper(BytesIO(), "utf-8")
    else:
        # The new behavior, the Python 3 behavior, is to accept unicode and
        # encode it using a specific encoding.  For older versions of Python
@ -188,7 +193,7 @@ class DevNullDictionary(dict):
        return

def insecurerandstr(n):
    return b''.join(map(bchr, map(randrange, [0]*n, [256]*n)))
    return b''.join(map(bchr, list(map(randrange, [0]*n, [256]*n))))

def flip_bit(good, which):
    """Flip the low-order bit of good[which]."""
@ -218,9 +223,9 @@ class ReallyEqualMixin(object):
        # type.  They're equal, and _logically_ the same type, but have
        # different types in practice.
        if a.__class__ == future_str:
            a = unicode(a)
            a = str(a)
        if b.__class__ == future_str:
            b = unicode(b)
            b = str(b)
        self.assertEqual(type(a), type(b), "a :: %r (%s), b :: %r (%s), %r" % (a, type(a), b, type(b), msg))


@ -304,7 +309,7 @@ class ShouldFailMixin(object):
        of the message wrapped by this Failure, or the test will fail.
        """

        assert substring is None or isinstance(substring, (bytes, unicode))
        assert substring is None or isinstance(substring, (bytes, str))
        d = defer.maybeDeferred(callable, *args, **kwargs)
        def done(res):
            if isinstance(res, failure.Failure):
@ -395,28 +400,8 @@ class TimezoneMixin(object):
        return hasattr(time, 'tzset')


try:
    import win32file
    import win32con
    def make_readonly(path):
        win32file.SetFileAttributes(path, win32con.FILE_ATTRIBUTE_READONLY)
    def make_accessible(path):
        win32file.SetFileAttributes(path, win32con.FILE_ATTRIBUTE_NORMAL)
except ImportError:
    import stat
    def _make_readonly(path):
        os.chmod(path, stat.S_IREAD)
        os.chmod(os.path.dirname(path), stat.S_IREAD)
    def _make_accessible(path):
        os.chmod(os.path.dirname(path), stat.S_IWRITE | stat.S_IEXEC | stat.S_IREAD)
        os.chmod(path, stat.S_IWRITE | stat.S_IEXEC | stat.S_IREAD)
    make_readonly = _make_readonly
    make_accessible = _make_accessible


__all__ = [
    "make_readonly", "make_accessible", "TestMixin", "ShouldFailMixin",
    "StallMixin", "skip_if_cannot_represent_argv", "run_cli", "parse_cli",
    "TestMixin", "ShouldFailMixin", "StallMixin", "run_cli", "parse_cli",
    "DevNullDictionary", "insecurerandstr", "flip_bit", "flip_one_bit",
    "SignalMixin", "skip_if_cannot_represent_filename", "ReallyEqualMixin"
]
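In `run_cli_native` above, the Python 3 branch builds `stdout`/`stderr` as `TextIOWrapper(BytesIO(), "utf-8")` so the CLI code writes text while the test harness can still inspect the encoded bytes underneath. A small sketch of that capture pattern:

```python
from io import BytesIO, TextIOWrapper

# Text-mode capture buffer whose raw bytes stay accessible: write str on top,
# read UTF-8 bytes from the wrapped BytesIO underneath.
stdout = TextIOWrapper(BytesIO(), encoding="utf-8")
stdout.write("h\u00e9llo\n")
stdout.flush()  # push the encoded text down into the BytesIO

raw = stdout.buffer.getvalue()
assert raw == "h\u00e9llo\n".encode("utf-8")
assert raw.decode("utf-8") == "h\u00e9llo\n"
```

The explicit `flush()` matters: `TextIOWrapper` buffers, so without it the underlying `BytesIO` may still be empty when inspected.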
@ -1,3 +1,15 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

from six import ensure_str

__all__ = [
@ -1,12 +1,21 @@
"""
Tools aimed at the interaction between tests and Eliot.

Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

# Python 2 compatibility
# Can't use `builtins.str` because it's not JSON encodable:
# `exceptions.TypeError: <class 'future.types.newstr.newstr'> is not JSON-encodeable`
from past.builtins import unicode as str
from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, max, min  # noqa: F401

from six import ensure_text

__all__ = [
@ -1,16 +0,0 @@

class FakeStatus(object):
    def __init__(self):
        self.status = []

    def setServiceParent(self, p):
        pass

    def get_status(self):
        return self.status

    def get_storage_index(self):
        return None

    def get_size(self):
        return None
@ -1,8 +1,17 @@
"""
A storage server plugin the test suite can use to validate the
functionality.
"""

Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
from future.utils import native_str, native_str_to_bytes
from six import ensure_str

@ -1,6 +1,16 @@
"""
Hypothesis strategies use for testing Tahoe-LAFS.

Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

from hypothesis.strategies import (
    one_of,
84 src/allmydata/test/test_consumer.py Normal file
@ -0,0 +1,84 @@
"""
Tests for allmydata.util.consumer.

Ported to Python 3.
"""

from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

from zope.interface import implementer
from twisted.trial.unittest import TestCase
from twisted.internet.interfaces import IPushProducer, IPullProducer

from allmydata.util.consumer import MemoryConsumer


@implementer(IPushProducer)
@implementer(IPullProducer)
class Producer(object):
    """Can be used as either streaming or non-streaming producer.

    If used as streaming, the test should call iterate() manually.
    """

    def __init__(self, consumer, data):
        self.data = data
        self.consumer = consumer
        self.done = False

    def resumeProducing(self):
        """Kick off streaming."""
        self.iterate()

    def iterate(self):
        """Do another iteration of writing."""
        if self.done:
            raise RuntimeError(
                "There's a bug somewhere, shouldn't iterate after being done"
            )
        if self.data:
            self.consumer.write(self.data.pop(0))
        else:
            self.done = True
            self.consumer.unregisterProducer()


class MemoryConsumerTests(TestCase):
    """Tests for MemoryConsumer."""

    def test_push_producer(self):
        """
        A MemoryConsumer accumulates all data sent by a streaming producer.
        """
        consumer = MemoryConsumer()
        producer = Producer(consumer, [b"abc", b"def", b"ghi"])
        consumer.registerProducer(producer, True)
        self.assertEqual(consumer.chunks, [b"abc"])
        producer.iterate()
        producer.iterate()
        self.assertEqual(consumer.chunks, [b"abc", b"def", b"ghi"])
        self.assertEqual(consumer.done, False)
        producer.iterate()
        self.assertEqual(consumer.chunks, [b"abc", b"def", b"ghi"])
        self.assertEqual(consumer.done, True)

    def test_pull_producer(self):
        """
        A MemoryConsumer accumulates all data sent by a non-streaming producer.
        """
        consumer = MemoryConsumer()
        producer = Producer(consumer, [b"abc", b"def", b"ghi"])
        consumer.registerProducer(producer, False)
        self.assertEqual(consumer.chunks, [b"abc", b"def", b"ghi"])
        self.assertEqual(consumer.done, True)


# download_to_data() is effectively tested by some of the filenode tests, e.g.
# test_immutable.py.
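The new test module above exercises `MemoryConsumer` against both producer styles. The producer/consumer handshake it relies on can be sketched without Twisted; `MiniMemoryConsumer` and `MiniProducer` below are simplified stand-ins (in real Twisted streaming the reactor, not a loop, drives `resumeProducing`, and one write per call mirrors the test's `iterate()`):

```python
class MiniMemoryConsumer:
    """Simplified analog of allmydata's MemoryConsumer (no Twisted needed)."""
    def __init__(self):
        self.chunks = []
        self.done = False
        self.producer = None

    def registerProducer(self, producer, streaming):
        self.producer = producer
        if streaming:
            producer.resumeProducing()   # push: producer drives, one write per kick
        else:
            while not self.done:         # pull: consumer drains until done
                producer.resumeProducing()

    def write(self, data):
        self.chunks.append(data)

    def unregisterProducer(self):
        self.done = True


class MiniProducer:
    def __init__(self, consumer, data):
        self.consumer, self.data = consumer, list(data)

    def resumeProducing(self):
        if self.data:
            self.consumer.write(self.data.pop(0))
        else:
            self.consumer.unregisterProducer()


# Pull (non-streaming): registration alone drains everything.
pull = MiniMemoryConsumer()
pull.registerProducer(MiniProducer(pull, [b"abc", b"def"]), streaming=False)
assert pull.chunks == [b"abc", b"def"] and pull.done

# Push (streaming): each kick delivers one chunk, as in the test's iterate().
push = MiniMemoryConsumer()
prod = MiniProducer(push, [b"abc", b"def"])
push.registerProducer(prod, streaming=True)
prod.resumeProducing()
prod.resumeProducing()   # exhausts the data and unregisters
assert push.chunks == [b"abc", b"def"] and push.done
```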
@ -1,4 +1,16 @@
#!/usr/bin/python
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401
from six import ensure_binary

import os

from twisted.python.filepath import FilePath
@ -43,7 +55,7 @@ class MultiIntroTests(unittest.TestCase):
                u'intro2':{ 'furl': 'furl4' },
            },
        }
        self.yaml_path.setContent(yamlutil.safe_dump(connections))
        self.yaml_path.setContent(ensure_binary(yamlutil.safe_dump(connections)))
        # get a client and count of introducer_clients
        myclient = yield create_client(self.basedir)
        ic_count = len(myclient.introducer_clients)
@ -73,7 +85,7 @@ class MultiIntroTests(unittest.TestCase):
        tahoe_cfg_furl = myclient.introducer_clients[0].introducer_furl

        # assertions
        self.failUnlessEqual(fake_furl, tahoe_cfg_furl)
        self.failUnlessEqual(fake_furl, str(tahoe_cfg_furl, "utf-8"))
        self.assertEqual(
            list(
                warning["message"]
@ -97,10 +109,10 @@ class MultiIntroTests(unittest.TestCase):
                u'default': { 'furl': 'furl1' },
            },
        }
        self.yaml_path.setContent(yamlutil.safe_dump(connections))
        self.yaml_path.setContent(ensure_binary(yamlutil.safe_dump(connections)))
        FilePath(self.basedir).child("tahoe.cfg").setContent(
            "[client]\n"
            "introducer.furl = furl1\n"
            b"[client]\n"
            b"introducer.furl = furl1\n"
        )

        with self.assertRaises(ValueError) as ctx:
@ -112,7 +124,7 @@ class MultiIntroTests(unittest.TestCase):
            "please fix impossible configuration.",
        )

SIMPLE_YAML = """
SIMPLE_YAML = b"""
introducers:
  one:
    furl: furl1
@ -121,7 +133,7 @@ introducers:
# this format was recommended in docs/configuration.rst in 1.12.0, but it
# isn't correct (the "furl = furl1" line is recorded as the string value of
# the ["one"] key, instead of being parsed as a single-key dictionary).
EQUALS_YAML = """
EQUALS_YAML = b"""
introducers:
  one: furl = furl1
"""
@ -147,17 +159,17 @@ class NoDefault(unittest.TestCase):
        connections = {'introducers': {
            u'one': { 'furl': 'furl1' },
        }}
        self.yaml_path.setContent(yamlutil.safe_dump(connections))
        self.yaml_path.setContent(ensure_binary(yamlutil.safe_dump(connections)))
        myclient = yield create_client(self.basedir)
        tahoe_cfg_furl = myclient.introducer_clients[0].introducer_furl
        self.assertEquals(tahoe_cfg_furl, 'furl1')
        self.assertEquals(tahoe_cfg_furl, b'furl1')

    @defer.inlineCallbacks
    def test_real_yaml(self):
        self.yaml_path.setContent(SIMPLE_YAML)
        myclient = yield create_client(self.basedir)
        tahoe_cfg_furl = myclient.introducer_clients[0].introducer_furl
        self.assertEquals(tahoe_cfg_furl, 'furl1')
        self.assertEquals(tahoe_cfg_furl, b'furl1')

    @defer.inlineCallbacks
    def test_invalid_equals_yaml(self):
@ -172,6 +184,6 @@ class NoDefault(unittest.TestCase):
    @defer.inlineCallbacks
    def test_introducerless(self):
        connections = {'introducers': {} }
        self.yaml_path.setContent(yamlutil.safe_dump(connections))
        self.yaml_path.setContent(ensure_binary(yamlutil.safe_dump(connections)))
        myclient = yield create_client(self.basedir)
        self.assertEquals(len(myclient.introducer_clients), 0)
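The recurring `ensure_binary(yamlutil.safe_dump(...))` change above exists because `yaml.safe_dump()` returns a native `str` on Python 3 while `FilePath.setContent()` requires bytes. A minimal stand-in for `six.ensure_binary` shows the conversion rule; the `dumped` literal is only a hand-written imitation of what `safe_dump` would emit:

```python
def ensure_binary(s, encoding="utf-8"):
    # Minimal stand-in for six.ensure_binary: bytes pass through, text is encoded.
    if isinstance(s, bytes):
        return s
    if isinstance(s, str):
        return s.encode(encoding)
    raise TypeError("not a string: %r" % (s,))

# yaml.safe_dump() yields str on Python 3, but setContent() wants bytes.
dumped = "introducers:\n  one:\n    furl: furl1\n"
assert ensure_binary(dumped) == b"introducers:\n  one:\n    furl: furl1\n"
assert ensure_binary(b"already-bytes") == b"already-bytes"
```

Wrapping at the call site, rather than changing `safe_dump`, keeps the same code path working on both Python 2 (where `safe_dump` already returned bytes) and Python 3.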
@ -15,7 +15,6 @@ from testtools.matchers import (

BLACKLIST = {
    "allmydata.scripts.types_",
    "allmydata.test.check_load",
    "allmydata.test._win_subprocess",
    "allmydata.windows.registry",
    "allmydata.windows.fixups",

@ -50,8 +50,7 @@ from twisted.python.failure import Failure
from twisted.python.filepath import (
    FilePath,
)

from ._twisted_9607 import (
from twisted.internet.utils import (
    getProcessOutputAndValue,
)

@ -17,15 +17,17 @@ import yaml
import json

from twisted.trial import unittest
from foolscap.api import Violation, RemoteException

from allmydata.util import idlib, mathutil
from allmydata.util import fileutil
from allmydata.util import jsonbytes
from allmydata.util import pollmixin
from allmydata.util import yamlutil
from allmydata.util import rrefutil
from allmydata.util.fileutil import EncryptedTemporaryFile
from allmydata.test.common_util import ReallyEqualMixin

from .no_network import fireNow, LocalWrapper

if six.PY3:
    long = int
@ -480,7 +482,12 @@ class EqButNotIs(object):

class YAML(unittest.TestCase):
    def test_convert(self):
        data = yaml.safe_dump(["str", u"unicode", u"\u1234nicode"])
        """
        Unicode and (ASCII) native strings get roundtripped to Unicode strings.
        """
        data = yaml.safe_dump(
            [six.ensure_str("str"), u"unicode", u"\u1234nicode"]
        )
        back = yamlutil.safe_load(data)
        self.assertIsInstance(back[0], str)
        self.assertIsInstance(back[1], str)
@ -521,3 +528,38 @@ class JSONBytes(unittest.TestCase):
        encoded = jsonbytes.dumps_bytes(x)
        self.assertIsInstance(encoded, bytes)
        self.assertEqual(json.loads(encoded, encoding="utf-8"), x)


class FakeGetVersion(object):
    """Emulate an object with a get_version."""

    def __init__(self, result):
        self.result = result

    def remote_get_version(self):
        if isinstance(self.result, Exception):
            raise self.result
        return self.result


class RrefUtilTests(unittest.TestCase):
    """Tests for rrefutil."""

    def test_version_returned(self):
        """If get_version() succeeded, it is set on the rref."""
        rref = LocalWrapper(FakeGetVersion(12345), fireNow)
        result = self.successResultOf(
            rrefutil.add_version_to_remote_reference(rref, "default")
        )
        self.assertEqual(result.version, 12345)
        self.assertIdentical(result, rref)

    def test_exceptions(self):
        """If get_version() failed, default version is set on the rref."""
        for exception in (Violation(), RemoteException(ValueError())):
            rref = LocalWrapper(FakeGetVersion(exception), fireNow)
            result = self.successResultOf(
                rrefutil.add_version_to_remote_reference(rref, "Default")
            )
            self.assertEqual(result.version, "Default")
            self.assertIdentical(result, rref)
@ -1,3 +1,14 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

import re

@ -1,3 +1,15 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

import attr

from testtools.matchers import Mismatch
@ -1394,8 +1394,8 @@ class Web(WebMixin, WebErrorMixin, testutil.StallMixin, testutil.ReallyEqualMixi
        def _got(res_and_status_and_headers):
            (res, status, headers) = res_and_status_and_headers
            self.failUnlessReallyEqual(res, "")
            self.failUnlessReallyEqual(headers.getRawHeaders("content-length")[0],
                                       str(len(self.BAR_CONTENTS)))
            self.failUnlessReallyEqual(int(headers.getRawHeaders("content-length")[0]),
                                       len(self.BAR_CONTENTS))
            self.failUnlessReallyEqual(headers.getRawHeaders("content-type"),
                                       ["text/plain"])
        d.addCallback(_got)
@ -3015,8 +3015,8 @@ class Web(WebMixin, WebErrorMixin, testutil.StallMixin, testutil.ReallyEqualMixi
        def _got_headers(res_and_status_and_headers):
            (res, status, headers) = res_and_status_and_headers
            self.failUnlessReallyEqual(res, "")
            self.failUnlessReallyEqual(headers.getRawHeaders("content-length")[0],
                                       str(len(NEW2_CONTENTS)))
            self.failUnlessReallyEqual(int(headers.getRawHeaders("content-length")[0]),
                                       len(NEW2_CONTENTS))
            self.failUnlessReallyEqual(headers.getRawHeaders("content-type"),
                                       ["text/plain"])
        d.addCallback(_got_headers)
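The two hunks above switch to comparing `content-length` numerically rather than against `str(len(...))`: on Python 3 raw header values come back as bytes, and `int()` accepts ASCII digits in either type, so the comparison becomes version-neutral. A small illustration (the header-dict shape is an assumed simplification of what the test's HTTP client returns):

```python
BAR_CONTENTS = b"bar data"

# Simplified raw-headers shape: bytes keys mapping to lists of bytes values.
raw_headers = {b"content-length": [b"%d" % len(BAR_CONTENTS)]}

value = raw_headers[b"content-length"][0]
assert value == b"8"                    # bytes, so comparing to str "8" would fail
assert int(value) == len(BAR_CONTENTS)  # int() parses ASCII bytes directly
```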
@ -98,11 +98,21 @@ PORTED_MODULES = [
    "allmydata.storage.shares",
    "allmydata.test",
    "allmydata.test.cli",
    "allmydata.test.cli_node_api",
    "allmydata.test.common",
    "allmydata.test.common_util",
    "allmydata.test.common_web",
    "allmydata.test.eliotutil",
    "allmydata.test.no_network",
    "allmydata.test.matchers",
    "allmydata.test.mutable",
    "allmydata.test.mutable.util",
    "allmydata.test.python3_tests",
    "allmydata.test.storage_plugin",
    "allmydata.test.strategies",
    "allmydata.test.web",
    "allmydata.test.web.common",
    "allmydata.test.web.matchers",
    "allmydata.testing",
    "allmydata.testing.web",
    "allmydata.unknown",
@ -115,6 +125,8 @@ PORTED_MODULES = [
    "allmydata.util.base62",
    "allmydata.util.configutil",
    "allmydata.util.connection_status",
    "allmydata.util.consumer",
    "allmydata.util.dbutil",
    "allmydata.util.deferredutil",
    "allmydata.util.dictutil",
    "allmydata.util.eliotutil",
@ -135,10 +147,12 @@ PORTED_MODULES = [
    "allmydata.util.observer",
    "allmydata.util.pipeline",
    "allmydata.util.pollmixin",
    "allmydata.util.rrefutil",
    "allmydata.util.spans",
    "allmydata.util.statistics",
    "allmydata.util.time_format",
    "allmydata.util.tor_provider",
    "allmydata.util.yamlutil",
    "allmydata.web",
    "allmydata.web.check_results",
    "allmydata.web.common",
@ -160,6 +174,7 @@ PORTED_MODULES = [

PORTED_TEST_MODULES = [
    "allmydata.test.cli.test_alias",
    "allmydata.test.cli.test_backup",
    "allmydata.test.cli.test_backupdb",
    "allmydata.test.cli.test_create",
    "allmydata.test.cli.test_invite",
@ -191,6 +206,7 @@ PORTED_TEST_MODULES = [
    "allmydata.test.test_configutil",
    "allmydata.test.test_connections",
    "allmydata.test.test_connection_status",
    "allmydata.test.test_consumer",
    "allmydata.test.test_crawler",
    "allmydata.test.test_crypto",

@ -219,6 +235,7 @@ PORTED_TEST_MODULES = [
    "allmydata.test.test_json_metadata",
    "allmydata.test.test_log",
    "allmydata.test.test_monitor",
    "allmydata.test.test_multi_introducers",
    "allmydata.test.test_netstring",
    "allmydata.test.test_no_network",
    "allmydata.test.test_node",
@ -1,11 +1,22 @@

"""This file defines a basic download-to-memory consumer, suitable for use in
a filenode's read() method. See download_to_data() for an example of its use.
"""
This file defines a basic download-to-memory consumer, suitable for use in
a filenode's read() method. See download_to_data() for an example of its use.

Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

from zope.interface import implementer
from twisted.internet.interfaces import IConsumer


@implementer(IConsumer)
class MemoryConsumer(object):

@ -28,6 +39,7 @@ class MemoryConsumer(object):
    def unregisterProducer(self):
        self.done = True


def download_to_data(n, offset=0, size=None):
    """
    Return Deferred that fires with results of reading from the given filenode.
@ -1,9 +1,23 @@
"""
SQLite3 utilities.

Test coverage currently provided by test_backupdb.py.

Ported to Python 3.
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

import os, sys

import sqlite3
from sqlite3 import IntegrityError
[IntegrityError]


class DBError(Exception):

@ -12,7 +26,7 @@ class DBError(Exception):

def get_db(dbfile, stderr=sys.stderr,
           create_version=(None, None), updaters={}, just_create=False, dbname="db",
           journal_mode=None, synchronous=None):
           ):
    """Open or create the given db file. The parent directory must exist.
    create_version=(SCHEMA, VERNUM), and SCHEMA must have a 'version' table.
    Updaters is a {newver: commands} mapping, where e.g. updaters[2] is used
@ -32,12 +46,6 @@ def get_db(dbfile, stderr=sys.stderr,
    # The default is unspecified according to <http://www.sqlite.org/foreignkeys.html#fk_enable>.
    c.execute("PRAGMA foreign_keys = ON;")

    if journal_mode is not None:
        c.execute("PRAGMA journal_mode = %s;" % (journal_mode,))

    if synchronous is not None:
        c.execute("PRAGMA synchronous = %s;" % (synchronous,))

    if must_create:
        c.executescript(schema)
        c.execute("INSERT INTO version (version) VALUES (?)", (target_version,))
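The hunk above removes the optional `journal_mode`/`synchronous` PRAGMAs from `get_db`. That pattern is easy to demonstrate standalone with only the stdlib `sqlite3` module; `open_db` here is a hypothetical minimal helper, not Tahoe's `get_db`:

```python
import sqlite3

def open_db(dbfile, journal_mode=None, synchronous=None):
    # Open (or create) the database and apply the same PRAGMAs the
    # hunk shows: foreign keys always on, journal/synchronous modes
    # only when the caller asks for them.
    db = sqlite3.connect(dbfile)
    c = db.cursor()
    c.execute("PRAGMA foreign_keys = ON;")
    if journal_mode is not None:
        c.execute("PRAGMA journal_mode = %s;" % (journal_mode,))
    if synchronous is not None:
        c.execute("PRAGMA synchronous = %s;" % (synchronous,))
    return db

db = open_db(":memory:", journal_mode="MEMORY", synchronous="OFF")
db.execute("CREATE TABLE version (version INTEGER)")
db.execute("INSERT INTO version (version) VALUES (?)", (1,))
```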
@ -1,6 +1,16 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from twisted.internet import address
from foolscap.api import Violation, RemoteException, SturdyRef
from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

from foolscap.api import Violation, RemoteException


def add_version_to_remote_reference(rref, default):

@ -18,24 +28,3 @@ def add_version_to_remote_reference(rref, default):
        return rref
    d.addCallbacks(_got_version, _no_get_version)
    return d


def connection_hints_for_furl(furl):
    hints = []
    for h in SturdyRef(furl).locationHints:
        # Foolscap-0.2.5 and earlier used strings in .locationHints, 0.2.6
        # through 0.6.4 used tuples of ("ipv4",host,port), 0.6.5 through
        # 0.8.0 used tuples of ("tcp",host,port), and >=0.9.0 uses strings
        # again. Tolerate them all.
        if isinstance(h, tuple):
            hints.append(":".join([str(s) for s in h]))
        else:
            hints.append(h)
    return hints

def stringify_remote_address(rref):
    remote = rref.getPeer()
    if isinstance(remote, address.IPv4Address):
        return "%s:%d" % (remote.host, remote.port)
    # loopback is a non-IPv4Address
    return str(remote)
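The loop in `connection_hints_for_furl` tolerates the several historical Foolscap hint formats. The normalization step alone can be sketched without Foolscap (`normalize_hints` is a hypothetical name for this illustration):

```python
def normalize_hints(location_hints):
    # Foolscap <=0.2.5 used plain strings, 0.2.6-0.6.4 used
    # ("ipv4", host, port) tuples, 0.6.5-0.8.0 used
    # ("tcp", host, port) tuples, and >=0.9.0 uses strings again.
    # Join tuple elements with ":" so every hint comes out a string.
    hints = []
    for h in location_hints:
        if isinstance(h, tuple):
            hints.append(":".join(str(s) for s in h))
        else:
            hints.append(h)
    return hints
```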
@ -1,24 +0,0 @@
import os
import sys
from twisted.python.util import sibpath as tsibpath

def sibpath(path, sibling):
    """
    Looks for a named sibling relative to the given path.  If such a file
    exists, its path will be returned, otherwise a second search will be
    made for the named sibling relative to the path of the executable
    currently running.  This is useful in the case that something built
    with py2exe, for example, needs to find data files relative to its
    install.  Note hence that care should be taken not to search for
    private package files whose names might collide with files which might
    be found installed alongside the python interpreter itself.  If no
    file is found in either place, the sibling relative to the given path
    is returned, likely leading to a file not found error.
    """
    sib = tsibpath(path, sibling)
    if not os.path.exists(sib):
        exe_sib = tsibpath(sys.executable, sibling)
        if os.path.exists(exe_sib):
            return exe_sib
    return sib
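The deleted helper wraps Twisted's `sibpath` with an executable-relative fallback. Roughly the same behavior can be sketched with `os.path` alone, assuming Twisted's `sibpath(path, sibling)` semantics of joining the sibling name onto the path's directory:

```python
import os
import sys

def sibpath(path, sibling):
    # Look for `sibling` next to `path`; if absent, fall back to a
    # sibling of the running executable (useful for py2exe-style
    # installs). If neither exists, return the path-relative sibling
    # anyway, as the deleted helper did.
    sib = os.path.join(os.path.dirname(os.path.abspath(path)), sibling)
    if not os.path.exists(sib):
        exe_sib = os.path.join(
            os.path.dirname(os.path.abspath(sys.executable)), sibling)
        if os.path.exists(exe_sib):
            return exe_sib
    return sib
```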
@ -1,336 +0,0 @@
"""
"Rational" version definition and parsing for DistutilsVersionFight
discussion at PyCon 2009.

Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

import re


class IrrationalVersionError(Exception):
    """This is an irrational version."""
    pass

class HugeMajorVersionNumError(IrrationalVersionError):
    """An irrational version because the major version number is huge
    (often because a year or date was used).

    See `error_on_huge_major_num` option in `NormalizedVersion` for details.
    This guard can be disabled by setting that option False.
    """
    pass

# A marker used in the second and third parts of the `parts` tuple, for
# versions that don't have those segments, to sort properly. An example
# of versions in sort order ('highest' last):
#   1.0b1                 ((1,0), ('b',1), ('f',))
#   1.0.dev345            ((1,0), ('f',), ('dev', 345))
#   1.0                   ((1,0), ('f',), ('f',))
#   1.0.post256.dev345    ((1,0), ('f',), ('f', 'post', 256, 'dev', 345))
#   1.0.post345           ((1,0), ('f',), ('f', 'post', 345, 'f'))
#                                   ^        ^                 ^
#   'b' < 'f' ---------------------/         |                 |
#                                            |                 |
#   'dev' < 'f' < 'post' -------------------/                  |
#                                                              |
#   'dev' < 'f' ----------------------------------------------/
# Other letters would do, but 'f' for 'final' is kind of nice.
FINAL_MARKER = ('f',)

VERSION_RE = re.compile(r'''
    ^
    (?P<version>\d+\.\d+)          # minimum 'N.N'
    (?P<extraversion>(?:\.\d+)*)   # any number of extra '.N' segments
    (?:
        (?P<prerel>[abc]|rc)       # 'a'=alpha, 'b'=beta, 'c'=release candidate
                                   # 'rc'= alias for release candidate
        (?P<prerelversion>\d+(?:\.\d+)*)
    )?
    (?P<postdev>(\.post(?P<post>\d+))?(\.dev(?P<dev>\d+))?)?
    $''', re.VERBOSE)

class NormalizedVersion(object):
    """A rational version.

    Good:
        1.2         # equivalent to "1.2.0"
        1.2.0
        1.2a1
        1.2.3a2
        1.2.3b1
        1.2.3c1
        1.2.3.4
    TODO: fill this out

    Bad:
        1           # minimum two numbers
        1.2a        # release level must have a release serial
        1.2.3b
    """
    def __init__(self, s, error_on_huge_major_num=True):
        """Create a NormalizedVersion instance from a version string.

        @param s {str} The version string.
        @param error_on_huge_major_num {bool} Whether to consider an
            apparent use of a year or full date as the major version number
            an error. Default True. One of the observed patterns on PyPI before
            the introduction of `NormalizedVersion` was version numbers like this:
                2009.01.03
                20040603
                2005.01
            This guard is here to strongly encourage the package author to
            use an alternate version, because a release deployed into PyPI
            and, e.g. downstream Linux package managers, will forever remove
            the possibility of using a version number like "1.0" (i.e.
            where the major number is less than that huge major number).
        """
        self._parse(s, error_on_huge_major_num)

    @classmethod
    def from_parts(cls, version, prerelease=FINAL_MARKER,
                   devpost=FINAL_MARKER):
        return cls(cls.parts_to_str((version, prerelease, devpost)))

    def _parse(self, s, error_on_huge_major_num=True):
        """Parses a string version into parts."""
        match = VERSION_RE.search(s)
        if not match:
            raise IrrationalVersionError(s)

        groups = match.groupdict()
        parts = []

        # main version
        block = self._parse_numdots(groups['version'], s, False, 2)
        extraversion = groups.get('extraversion')
        if extraversion not in ('', None):
            block += self._parse_numdots(extraversion[1:], s)
        parts.append(tuple(block))

        # prerelease
        prerel = groups.get('prerel')
        if prerel is not None:
            block = [prerel]
            block += self._parse_numdots(groups.get('prerelversion'), s,
                                         pad_zeros_length=1)
            parts.append(tuple(block))
        else:
            parts.append(FINAL_MARKER)

        # postdev
        if groups.get('postdev'):
            post = groups.get('post')
            dev = groups.get('dev')
            postdev = []
            if post is not None:
                postdev.extend([FINAL_MARKER[0], 'post', int(post)])
                if dev is None:
                    postdev.append(FINAL_MARKER[0])
            if dev is not None:
                postdev.extend(['dev', int(dev)])
            parts.append(tuple(postdev))
        else:
            parts.append(FINAL_MARKER)
        self.parts = tuple(parts)
        if error_on_huge_major_num and self.parts[0][0] > 1980:
            raise HugeMajorVersionNumError("huge major version number, %r, "
                "which might cause future problems: %r" % (self.parts[0][0], s))

    def _parse_numdots(self, s, full_ver_str, drop_trailing_zeros=True,
                       pad_zeros_length=0):
        """Parse 'N.N.N' sequences, return a list of ints.

        @param s {str} 'N.N.N...' sequence to be parsed
        @param full_ver_str {str} The full version string from which this
            comes. Used for error strings.
        @param drop_trailing_zeros {bool} Whether to drop trailing zeros
            from the returned list. Default True.
        @param pad_zeros_length {int} The length to which to pad the
            returned list with zeros, if necessary. Default 0.
        """
        nums = []
        for n in s.split("."):
            if len(n) > 1 and n[0] == '0':
                raise IrrationalVersionError("cannot have leading zero in "
                    "version number segment: '%s' in %r" % (n, full_ver_str))
            nums.append(int(n))
        if drop_trailing_zeros:
            while nums and nums[-1] == 0:
                nums.pop()
        while len(nums) < pad_zeros_length:
            nums.append(0)
        return nums

    def __str__(self):
        return self.parts_to_str(self.parts)

    @classmethod
    def parts_to_str(cls, parts):
        """Transforms a version expressed in tuple into its string
        representation."""
        # XXX This doesn't check for invalid tuples
        main, prerel, postdev = parts
        s = '.'.join(str(v) for v in main)
        if prerel is not FINAL_MARKER:
            s += prerel[0]
            s += '.'.join(str(v) for v in prerel[1:])
        if postdev and postdev is not FINAL_MARKER:
            if postdev[0] == 'f':
                postdev = postdev[1:]
            i = 0
            while i < len(postdev):
                if i % 2 == 0:
                    s += '.'
                s += str(postdev[i])
                i += 1
        return s

    def __repr__(self):
        return "%s('%s')" % (self.__class__.__name__, self)

    def _cannot_compare(self, other):
        raise TypeError("cannot compare %s and %s"
                        % (type(self).__name__, type(other).__name__))

    def __eq__(self, other):
        if not isinstance(other, NormalizedVersion):
            self._cannot_compare(other)
        return self.parts == other.parts

    def __lt__(self, other):
        if not isinstance(other, NormalizedVersion):
            self._cannot_compare(other)
        return self.parts < other.parts

    def __ne__(self, other):
        return not self.__eq__(other)

    def __gt__(self, other):
        return not (self.__lt__(other) or self.__eq__(other))

    def __le__(self, other):
        return self.__eq__(other) or self.__lt__(other)

    def __ge__(self, other):
        return self.__eq__(other) or self.__gt__(other)

def suggest_normalized_version(s):
    """Suggest a normalized version close to the given version string.

    If you have a version string that isn't rational (i.e. NormalizedVersion
    doesn't like it) then you might be able to get an equivalent (or close)
    rational version from this function.

    This does a number of simple normalizations to the given string, based
    on observation of versions currently in use on PyPI. Given a dump of
    those versions during PyCon 2009, 4287 of them:
    - 2312 (53.93%) match NormalizedVersion without change
      - with the automatic suggestion
    - 3474 (81.04%) match when using this suggestion method

    @param s {str} An irrational version string.
    @returns A rational version string, or None, if couldn't determine one.
    """
    try:
        NormalizedVersion(s)
        return s   # already rational
    except IrrationalVersionError:
        pass

    rs = s.lower()

    # part of this could use maketrans
    for orig, repl in (('-alpha', 'a'), ('-beta', 'b'), ('alpha', 'a'),
                       ('beta', 'b'), ('rc', 'c'), ('-final', ''),
                       ('-pre', 'c'),
                       ('-release', ''), ('.release', ''), ('-stable', ''),
                       ('+', '.'), ('_', '.'), (' ', ''), ('.final', ''),
                       ('final', '')):
        rs = rs.replace(orig, repl)

    # if something ends with dev or pre, we add a 0
    rs = re.sub(r"pre$", r"pre0", rs)
    rs = re.sub(r"dev$", r"dev0", rs)

    # if we have something like "b-2" or "a.2" at the end of the
    # version, that is probably beta, alpha, etc
    # let's remove the dash or dot
    rs = re.sub(r"([abc]|rc)[\-\.](\d+)$", r"\1\2", rs)

    # 1.0-dev-r371 -> 1.0.dev371
    # 0.1-dev-r79 -> 0.1.dev79
    rs = re.sub(r"[\-\.](dev)[\-\.]?r?(\d+)$", r".\1\2", rs)

    # Clean: 2.0.a.3, 2.0.b1, 0.9.0~c1
    rs = re.sub(r"[.~]?([abc])\.?", r"\1", rs)

    # Clean: v0.3, v1.0
    if rs.startswith('v'):
        rs = rs[1:]

    # Clean leading '0's on numbers.
    #TODO: unintended side-effect on, e.g., "2003.05.09"
    # PyPI stats: 77 (~2%) better
    rs = re.sub(r"\b0+(\d+)(?!\d)", r"\1", rs)

    # Clean a/b/c with no version. E.g. "1.0a" -> "1.0a0". Setuptools infers
    # zero.
    # PyPI stats: 245 (7.56%) better
    rs = re.sub(r"(\d+[abc])$", r"\g<1>0", rs)

    # the 'dev-rNNN' tag is a dev tag
    rs = re.sub(r"\.?(dev-r|dev\.r)\.?(\d+)$", r".dev\2", rs)

    # clean the - when used as a pre delimiter
    rs = re.sub(r"-(a|b|c)(\d+)$", r"\1\2", rs)

    # a terminal "dev" or "devel" can be changed into ".dev0"
    rs = re.sub(r"[\.\-](dev|devel)$", r".dev0", rs)

    # a terminal "dev" can be changed into ".dev0"
    rs = re.sub(r"(?![\.\-])dev$", r".dev0", rs)

    # a terminal "final" or "stable" can be removed
    rs = re.sub(r"(final|stable)$", "", rs)

    # The 'r' and the '-' tags are post release tags
    #   0.4a1.r10       ->  0.4a1.post10
    #   0.9.33-17222    ->  0.9.33.post17222
    #   0.9.33-r17222   ->  0.9.33.post17222
    rs = re.sub(r"\.?(r|-|-r)\.?(\d+)$", r".post\2", rs)

    # Clean 'r' instead of 'dev' usage:
    #   0.9.33+r17222   ->  0.9.33.dev17222
    #   1.0dev123       ->  1.0.dev123
    #   1.0.git123      ->  1.0.dev123
    #   1.0.bzr123      ->  1.0.dev123
    #   0.1a0dev.123    ->  0.1a0.dev123
    # PyPI stats:  ~150 (~4%) better
    rs = re.sub(r"\.?(dev|git|bzr)\.?(\d+)$", r".dev\2", rs)

    # Clean '.pre' (normalized from '-pre' above) instead of 'c' usage:
    #   0.2.pre1        ->  0.2c1
    #   0.2-c1          ->  0.2c1
    #   1.0preview123   ->  1.0c123
    # PyPI stats: ~21 (0.62%) better
    rs = re.sub(r"\.?(pre|preview|-c)(\d+)$", r"c\g<2>", rs)

    # Tcl/Tk uses "px" for their post release markers
    rs = re.sub(r"p(\d+)$", r".post\1", rs)

    try:
        NormalizedVersion(rs)
        return rs   # already rational
    except IrrationalVersionError:
        pass
    return None
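The FINAL_MARKER comment block in the deleted verlib module explains why `('f',)` makes plain releases sort between pre-releases and post-releases. That claim can be checked directly with plain tuple comparison on hand-built `parts` tuples, copied from the comment:

```python
FINAL = ('f',)

# parts tuples exactly as documented in the FINAL_MARKER comment,
# keyed by the version string they encode
parts = {
    "1.0b1":       ((1, 0), ('b', 1), FINAL),
    "1.0.dev345":  ((1, 0), FINAL, ('dev', 345)),
    "1.0":         ((1, 0), FINAL, FINAL),
    "1.0.post345": ((1, 0), FINAL, ('f', 'post', 345, 'f')),
}

# plain lexicographic tuple comparison reproduces the documented
# order, 'highest' last: 'b' < 'f' puts the beta first, 'dev' < 'f'
# puts the dev release next, and ('f',) is a prefix of the longer
# post-release tuple, so the final release sorts just before it
ordered = sorted(parts, key=parts.get)
# → ['1.0b1', '1.0.dev345', '1.0', '1.0.post345']
```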
@ -1,11 +1,39 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

import yaml

# Announcements contain unicode, because they come from JSON. We tell PyYAML
# to give us unicode instead of str/bytes.
def construct_unicode(loader, node):
    return node.value
yaml.SafeLoader.add_constructor("tag:yaml.org,2002:str",
                                construct_unicode)

if PY2:
    # On Python 2 the way pyyaml deals with Unicode strings is inconsistent.
    #
    # >>> yaml.safe_load(yaml.safe_dump(u"hello"))
    # 'hello'
    # >>> yaml.safe_load(yaml.safe_dump(u"hello\u1234"))
    # u'hello\u1234'
    #
    # In other words, Unicode strings get roundtripped to byte strings, but
    # only sometimes.
    #
    # In order to ensure unicode stays unicode, we add a configuration saying
    # that the YAML String Language-Independent Type ("a sequence of zero or
    # more Unicode characters") should be the underlying Unicode string object,
    # rather than converting to bytes when possible.
    #
    # Reference: https://yaml.org/type/str.html
    def construct_unicode(loader, node):
        return node.value
    yaml.SafeLoader.add_constructor("tag:yaml.org,2002:str",
                                    construct_unicode)

def safe_load(f):
    return yaml.safe_load(f)
6
tox.ini
@ -112,8 +112,8 @@ commands =
    # If towncrier.check fails, you forgot to add a towncrier news
    # fragment explaining the change in this branch. Create one at
    # `newsfragments/<ticket>.<change type>` with some text for the news
    # file. See pyproject.toml for legal <change type> values.
    python -m towncrier.check --pyproject towncrier.pyproject.toml
    # file. See towncrier.pyproject.toml for legal <change type> values.
    python -m towncrier.check --config towncrier.pyproject.toml


[testenv:typechecks]

@ -234,7 +234,7 @@ deps =
# normal install is not needed for docs, and slows things down
skip_install = True
commands =
    sphinx-build -b html -d {toxinidir}/docs/_build/doctrees {toxinidir}/docs {toxinidir}/docs/_build/html
    sphinx-build -W -b html -d {toxinidir}/docs/_build/doctrees {toxinidir}/docs {toxinidir}/docs/_build/html

[testenv:pyinstaller]
# We override this to pass --no-use-pep517 because pyinstaller (3.4, at least)