Merge remote-tracking branch 'origin/master' into 3636.doc-toc-reorg

Sajith Sasidharan 2021-06-05 11:01:09 -04:00
commit 8e600d9fef
223 changed files with 1137 additions and 417 deletions


@ -18,15 +18,17 @@ jobs:
fail-fast: false
matrix:
os:
- macos-latest
- windows-latest
- macos-latest
- ubuntu-latest
python-version:
- 2.7
- 3.6
- 3.7
- 3.8
- 3.9
steps:
# See https://github.com/actions/checkout. A fetch-depth of 0
# fetches all tags and branches.
- name: Check out Tahoe-LAFS sources
@ -36,17 +38,15 @@ jobs:
- name: Set up Python ${{ matrix.python-version }}
if: ${{ matrix.os != 'windows-latest' }}
uses: actions/setup-python@v1
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
# See note below about need for using 32-bit Python 2.7 on
# Windows. The extra handling here for Python 3.6 on Windows is
# because I could not figure out the right GitHub Actions
# expression to do this in a better way.
# Windows.
- name: Set up Python ${{ matrix.python-version }} [Windows x64]
if: ${{ ( matrix.os == 'windows-latest' ) && ( matrix.python-version == '3.6' ) }}
uses: actions/setup-python@v1
if: ${{ ( matrix.os == 'windows-latest' ) && ( matrix.python-version != '2.7' ) }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
architecture: 'x64'
@ -85,14 +85,14 @@ jobs:
- name: Install Python packages
run: |
pip install --upgrade codecov tox setuptools
pip install --upgrade codecov tox tox-gh-actions setuptools
pip list
- name: Display tool versions
run: python misc/build_helpers/show-tool-versions.py
- name: Run "tox -e py27-coverage"
run: tox -e py27-coverage
- name: Run tox for corresponding Python version
run: python -m tox
- name: Upload eliot.log in case of failure
uses: actions/upload-artifact@v1
@ -177,6 +177,9 @@ jobs:
- windows-latest
python-version:
- 2.7
include:
- os: ubuntu-latest
python-version: 3.6
steps:
@ -234,9 +237,14 @@ jobs:
- name: Display tool versions
run: python misc/build_helpers/show-tool-versions.py
- name: Run "tox -e integration"
- name: Run "Python 2 integration tests"
if: ${{ matrix.python-version == '2.7' }}
run: tox -e integration
- name: Run "Python 3 integration tests"
if: ${{ matrix.python-version != '2.7' }}
run: tox -e integration3
- name: Upload eliot.log in case of failure
uses: actions/upload-artifact@v1
if: failure()

CONTRIBUTORS.rst (new file)

@ -0,0 +1,42 @@
Contributor Checklist
=====================
* Create a ``Trac`` ticket, fill it out and assign it to yourself (contact exarkun if you don't have an account):
``https://tahoe-lafs.org/trac/tahoe-lafs/newticket``
* Use the ticket number to name your branch (example):
``3003.contributor-guide``
* Good idea to add tests at the same time you write your code.
* Add a file to the ``/newsfragments`` folder, named with the ticket number and the type of patch (example):
``newsfragments/3651.minor``
* ``towncrier`` recognizes the following types:
``incompat``, ``feature``, ``bugfix``, ``installation``, ``configuration``, ``documentation``, ``removed``, ``other``, ``minor``
* Add one sentence to ``newsfragments/<ticket-number>.<towncrier-type>`` describing the change (example):
``The integration test suite has been updated to use pytest-twisted instead of deprecated pytest APIs.``
* Run the test suite with ``tox``, ``tox -e codechecks`` and ``tox -e typechecks``
* Push your branch to Github with your ticket number in the merge commit message (example):
``Fixes ticket:3003``
This makes the ``Trac`` ticket close when your PR gets approved.
* Request appropriate review - we suggest asking `Tahoe Committers <https://github.com/orgs/tahoe-lafs/teams/tahoe-committers>`__
References
----------
This checklist is a summary of `this page on contributing Patches <https://tahoe-lafs.org/trac/tahoe-lafs/wiki/Patches>`__
Before authoring or reviewing a patch, please familiarize yourself with the `Coding Standard <https://tahoe-lafs.org/trac/tahoe-lafs/wiki/CodingStandards>`__
and the `Contributor Code of Conduct <docs/CODE_OF_CONDUCT.md>`__.

CREDITS

@ -204,6 +204,34 @@ E: meejah@meejah.ca
P: 0xC2602803128069A7, 9D5A 2BD5 688E CB88 9DEB CD3F C260 2803 1280 69A7
D: various bug-fixes and features
N: Chad Whitacre
E: chad@zetaweb.com
D: Python3 porting
N: Itamar Turner-Trauring
E: itamar@pythonspeed.com
D: Python3 porting
N: Jason R. Coombs
E: jaraco@jaraco.com
D: Python3 porting
N: Maciej Fijalkowski
E: fijall@gmail.com
D: Python3 porting
N: Ross Patterson
E: me@rpatterson.net
D: Python3 porting
N: Sajith Sasidharan
E: sajith@hcoop.net
D: Python3 porting
N: Pete Fein
E: pete@snake.dev
D: Python3 porting
N: Viktoriia Savchuk
W: https://twitter.com/viktoriiasvchk
D: Developer community focused improvements on the README file.

File diff suppressed because one or more lines are too long


@ -6,7 +6,7 @@ Free and Open decentralized data store
`Tahoe-LAFS <https://www.tahoe-lafs.org>`__ (Tahoe Least-Authority File Store) is the first free software / open-source storage technology that distributes your data across multiple servers. Even if some servers fail or are taken over by an attacker, the entire file store continues to function correctly, preserving your privacy and security.
|Contributor Covenant| |readthedocs| |travis| |circleci| |coveralls|
|Contributor Covenant| |readthedocs| |circleci| |githubactions| |coveralls|
Table of contents
@ -72,7 +72,7 @@ You can find the full Tahoe-LAFS documentation at our `documentation site <http:
Get involved with the Tahoe-LAFS community:
- Chat with Tahoe-LAFS developers at #tahoe-lafs chat on irc.freenode.net or `Slack <https://join.slack.com/t/tahoe-lafs/shared_invite/zt-jqfj12r5-ZZ5z3RvHnubKVADpP~JINQ>`__.
- Chat with Tahoe-LAFS developers at ``#tahoe-lafs`` channel on `libera.chat <https://libera.chat/>`__ IRC network or `Slack <https://join.slack.com/t/tahoe-lafs/shared_invite/zt-jqfj12r5-ZZ5z3RvHnubKVADpP~JINQ>`__.
- Join our `weekly conference calls <https://www.tahoe-lafs.org/trac/tahoe-lafs/wiki/WeeklyMeeting>`__ with core developers and interested community members.
@ -93,6 +93,10 @@ As a community-driven open source project, Tahoe-LAFS welcomes contributions of
Before authoring or reviewing a patch, please familiarize yourself with the `Coding Standard <https://tahoe-lafs.org/trac/tahoe-lafs/wiki/CodingStandards>`__ and the `Contributor Code of Conduct <docs/CODE_OF_CONDUCT.md>`__.
🤝 Supporters
--------------
We would like to thank `Fosshost <https://fosshost.org>`__ for supporting us with hosting services. If your open source project needs help, you can apply for their support.
❓ FAQ
------
@ -118,13 +122,12 @@ See `TGPPL.PDF <https://tahoe-lafs.org/~zooko/tgppl.pdf>`__ for why the TGPPL ex
:alt: documentation status
:target: http://tahoe-lafs.readthedocs.io/en/latest/?badge=latest
.. |travis| image:: https://travis-ci.org/tahoe-lafs/tahoe-lafs.png?branch=master
:alt: build status
:target: https://travis-ci.org/tahoe-lafs/tahoe-lafs
.. |circleci| image:: https://circleci.com/gh/tahoe-lafs/tahoe-lafs.svg?style=svg
:target: https://circleci.com/gh/tahoe-lafs/tahoe-lafs
.. |githubactions| image:: https://github.com/tahoe-lafs/tahoe-lafs/actions/workflows/ci.yml/badge.svg
:target: https://github.com/tahoe-lafs/tahoe-lafs/actions
.. |coveralls| image:: https://coveralls.io/repos/github/tahoe-lafs/tahoe-lafs/badge.svg
:alt: code coverage
:target: https://coveralls.io/github/tahoe-lafs/tahoe-lafs


@ -165,7 +165,7 @@ from PyPI with ``venv/bin/pip install tahoe-lafs``. After installation, run
Successfully installed ...
% venv/bin/tahoe --version
tahoe-lafs: 1.14.0
tahoe-lafs: 1.15.1
foolscap: ...
%
@ -180,15 +180,15 @@ following instructions with the local filename.
% virtualenv venv
New python executable in ~/venv/bin/python2.7
Installing setuptools, pip, wheel...done.
% venv/bin/pip install https://tahoe-lafs.org/downloads/tahoe-lafs-1.14.0.tar.bz2
Collecting https://tahoe-lafs.org/downloads/tahoe-lafs-1.14.0.tar.bz2
% venv/bin/pip install https://tahoe-lafs.org/downloads/tahoe-lafs-1.15.1.tar.bz2
Collecting https://tahoe-lafs.org/downloads/tahoe-lafs-1.15.1.tar.bz2
...
Installing collected packages: ...
Successfully installed ...
% venv/bin/tahoe --version
tahoe-lafs: 1.14.0
tahoe-lafs: 1.15.1
...
.. _verifying_signatures:
@ -256,7 +256,7 @@ the additional libraries needed to run the unit tests::
Successfully installed ...
% venv/bin/tahoe --version
tahoe-lafs: 1.14.0.post34.dev0
tahoe-lafs: 1.15.1
...
This way, you won't have to re-run the ``pip install`` step each time you
@ -305,7 +305,7 @@ result in an "all tests passed" message::
% tox
GLOB sdist-make: ~/tahoe-lafs/setup.py
py27 recreate: ~/tahoe-lafs/.tox/py27
py27 inst: ~/tahoe-lafs/.tox/dist/tahoe-lafs-1.14.0.post8.dev0.zip
py27 inst: ~/tahoe-lafs/.tox/dist/tahoe-lafs-1.15.1.zip
py27 runtests: commands[0] | tahoe --version
py27 runtests: commands[1] | trial --rterrors allmydata
allmydata.test.test_auth

Binary file not shown (image changed: 4.3 KiB before, 7.6 KiB after).


@ -514,10 +514,10 @@ Command Examples
the pattern will be matched against any level of the directory tree;
it's still impossible to specify absolute path exclusions.
``tahoe backup --exclude-from=/path/to/filename ~ work:backups``
``tahoe backup --exclude-from-utf-8=/path/to/filename ~ work:backups``
``--exclude-from`` is similar to ``--exclude``, but reads exclusion
patterns from ``/path/to/filename``, one per line.
``--exclude-from-utf-8`` is similar to ``--exclude``, but reads exclusion
patterns from a UTF-8-encoded ``/path/to/filename``, one per line.
``tahoe backup --exclude-vcs ~ work:backups``


@ -59,6 +59,10 @@ Create Branch and Apply Updates
- summarize major changes
- commit it
- update "nix/tahoe-lafs.nix"
- change the value given for `version` from `OLD.post1` to `NEW.post1`
- update "CREDITS"
- are there any new contributors in this release?
@ -189,11 +193,16 @@ is appropriate.
Once a release-candidate has marinated for some time then it can be
made into the actual release.
XXX Write this section when doing 1.15.0 actual release
(In general, this means dropping the "rcX" part of the release and the
tag, uploading those artifacts, uploading to PyPI, ... )
The actual release follows the same steps as above, with some differences:
- there is no "-rcX" on the end of release names
- the release is uploaded to PyPI (using Twine)
- the version is tagged in Git (ideally using "the tahoe release key"
but can be done with any of the authorized core developers' personal
key)
- the release-candidate branches must be merged back to master after
the release is official (e.g. causing newsfragments to be deleted on
master, etc)
Announcing the Release


@ -235,7 +235,7 @@ Socialize
=========
You can chat with other users of and hackers of this software on the
#tahoe-lafs IRC channel at ``irc.freenode.net``, or on the `tahoe-dev mailing
#tahoe-lafs IRC channel at ``irc.libera.chat``, or on the `tahoe-dev mailing
list`_.
.. _tahoe-dev mailing list: https://tahoe-lafs.org/cgi-bin/mailman/listinfo/tahoe-dev


@ -1,5 +1,15 @@
"""
Ported to Python 3.
"""
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import sys
import shutil
from time import sleep
@ -28,7 +38,7 @@ from twisted.internet.error import (
import pytest
import pytest_twisted
from util import (
from .util import (
_CollectOutputProtocol,
_MagicTextProtocol,
_DumpOutputProtocol,


@ -5,6 +5,15 @@
# You can safely skip any of these tests, it'll just appear to "take
# longer" to start the first test as the fixtures get built
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
def test_create_flogger(flog_gatherer):
print("Created flog_gatherer")


@ -0,0 +1,64 @@
"""
Integration tests for getting and putting files, including reading from stdin
and stdout.
"""
from subprocess import Popen, PIPE
import pytest
from .util import run_in_thread, cli
DATA = b"abc123 this is not utf-8 decodable \xff\x00\x33 \x11"
try:
DATA.decode("utf-8")
except UnicodeDecodeError:
pass # great, what we want
else:
raise ValueError("BUG, the DATA string was decoded from UTF-8")
@pytest.fixture(scope="session")
def get_put_alias(alice):
cli(alice, "create-alias", "getput")
def read_bytes(path):
with open(path, "rb") as f:
return f.read()
@run_in_thread
def test_put_from_stdin(alice, get_put_alias, tmpdir):
"""
It's possible to upload a file via `tahoe put`'s STDIN, and then download
it to a file.
"""
tempfile = str(tmpdir.join("file"))
p = Popen(
["tahoe", "--node-directory", alice.node_dir, "put", "-", "getput:fromstdin"],
stdin=PIPE
)
p.stdin.write(DATA)
p.stdin.close()
assert p.wait() == 0
cli(alice, "get", "getput:fromstdin", tempfile)
assert read_bytes(tempfile) == DATA
def test_get_to_stdout(alice, get_put_alias, tmpdir):
"""
It's possible to upload a file, and then download it to stdout.
"""
tempfile = tmpdir.join("file")
with tempfile.open("wb") as f:
f.write(DATA)
cli(alice, "put", str(tempfile), "getput:tostdout")
p = Popen(
["tahoe", "--node-directory", alice.node_dir, "get", "getput:tostdout", "-"],
stdout=PIPE
)
assert p.stdout.read() == DATA
assert p.wait() == 0


@ -1,9 +1,21 @@
"""
Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import sys
from os.path import join
from twisted.internet.error import ProcessTerminated
import util
from . import util
import pytest_twisted
@ -42,4 +54,4 @@ def test_upload_immutable(reactor, temp_dir, introducer_furl, flog_gatherer, sto
assert isinstance(e, ProcessTerminated)
output = proto.output.getvalue()
assert "shares could be placed on only" in output
assert b"shares could be placed on only" in output


@ -1,3 +1,6 @@
"""
Ported to Python 3.
"""
from __future__ import (
print_function,
unicode_literals,
@ -5,12 +8,18 @@ from __future__ import (
division,
)
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
from six import ensure_text
import json
from os.path import (
join,
)
from urlparse import (
from urllib.parse import (
urlsplit,
)
@ -68,7 +77,7 @@ def _connect_client(reactor, api_auth_token, ws_url):
factory = WebSocketClientFactory(
url=ws_url,
headers={
"Authorization": "{} {}".format(SCHEME, api_auth_token),
"Authorization": "{} {}".format(str(SCHEME, "ascii"), api_auth_token),
}
)
factory.protocol = _StreamingLogClientProtocol
@ -127,7 +136,7 @@ def _test_streaming_logs(reactor, temp_dir, alice):
node_url = cfg.get_config_from_file("node.url")
api_auth_token = cfg.get_private_config("api_auth_token")
ws_url = node_url.replace("http://", "ws://")
ws_url = ensure_text(node_url).replace("http://", "ws://")
log_url = ws_url + "private/logs/v1"
print("Connecting to {}".format(log_url))


@ -1,12 +1,22 @@
"""
Ported to Python 3.
"""
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import sys
from os.path import join
import pytest
import pytest_twisted
import util
from . import util
from twisted.python.filepath import (
FilePath,
@ -55,7 +65,7 @@ def test_onion_service_storage(reactor, request, temp_dir, flog_gatherer, tor_ne
cap = proto.output.getvalue().strip().split()[-1]
print("TEH CAP!", cap)
proto = util._CollectOutputProtocol()
proto = util._CollectOutputProtocol(capture_stderr=False)
reactor.spawnProcess(
proto,
sys.executable,
@ -68,7 +78,7 @@ def test_onion_service_storage(reactor, request, temp_dir, flog_gatherer, tor_ne
yield proto.done
dave_got = proto.output.getvalue().strip()
assert dave_got == open(gold_path, 'r').read().strip()
assert dave_got == open(gold_path, 'rb').read().strip()
@pytest_twisted.inlineCallbacks
@ -100,7 +110,7 @@ def _create_anonymous_node(reactor, name, control_port, request, temp_dir, flog_
# Which services should this client connect to?
write_introducer(node_dir, "default", introducer_furl)
with node_dir.child('tahoe.cfg').open('w') as f:
f.write('''
node_config = '''
[node]
nickname = %(name)s
web.port = %(web_port)s
@ -125,7 +135,9 @@ shares.total = 2
'log_furl': flog_gatherer,
'control_port': control_port,
'local_port': control_port + 1000,
})
}
node_config = node_config.encode("utf-8")
f.write(node_config)
print("running")
yield util._run_node(reactor, node_dir.path, request, None)


@ -7,15 +7,26 @@ Most of the tests have cursory asserts and encode 'what the WebAPI did
at the time of testing' -- not necessarily a cohesive idea of what the
WebAPI *should* do in every situation. It's not clear the latter
exists anywhere, however.
Ported to Python 3.
"""
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import time
import json
import urllib2
from urllib.parse import unquote as url_unquote, quote as url_quote
import allmydata.uri
from allmydata.util import jsonbytes as json
import util
from . import util
import requests
import html5lib
@ -64,7 +75,7 @@ def test_upload_download(alice):
u"filename": u"boom",
}
)
assert data == FILE_CONTENTS
assert str(data, "utf-8") == FILE_CONTENTS
def test_put(alice):
@ -95,7 +106,7 @@ def test_helper_status(storage_nodes):
resp = requests.get(url)
assert resp.status_code >= 200 and resp.status_code < 300
dom = BeautifulSoup(resp.content, "html5lib")
assert unicode(dom.h1.string) == u"Helper Status"
assert str(dom.h1.string) == u"Helper Status"
def test_deep_stats(alice):
@ -115,10 +126,10 @@ def test_deep_stats(alice):
# when creating a directory, we'll be re-directed to a URL
# containing our writecap..
uri = urllib2.unquote(resp.url)
uri = url_unquote(resp.url)
assert 'URI:DIR2:' in uri
dircap = uri[uri.find("URI:DIR2:"):].rstrip('/')
dircap_uri = util.node_url(alice.node_dir, "uri/{}".format(urllib2.quote(dircap)))
dircap_uri = util.node_url(alice.node_dir, "uri/{}".format(url_quote(dircap)))
# POST a file into this directory
FILE_CONTENTS = u"a file in a directory"
@ -145,7 +156,7 @@ def test_deep_stats(alice):
k, data = d
assert k == u"dirnode"
assert len(data['children']) == 1
k, child = data['children'].values()[0]
k, child = list(data['children'].values())[0]
assert k == u"filenode"
assert child['size'] == len(FILE_CONTENTS)
@ -196,11 +207,11 @@ def test_status(alice):
print("Uploaded data, cap={}".format(cap))
resp = requests.get(
util.node_url(alice.node_dir, u"uri/{}".format(urllib2.quote(cap))),
util.node_url(alice.node_dir, u"uri/{}".format(url_quote(cap))),
)
print("Downloaded {} bytes of data".format(len(resp.content)))
assert resp.content == FILE_CONTENTS
assert str(resp.content, "ascii") == FILE_CONTENTS
resp = requests.get(
util.node_url(alice.node_dir, "status"),
@ -219,12 +230,12 @@ def test_status(alice):
continue
resp = requests.get(util.node_url(alice.node_dir, href))
if href.startswith(u"/status/up"):
assert "File Upload Status" in resp.content
if "Total Size: {}".format(len(FILE_CONTENTS)) in resp.content:
assert b"File Upload Status" in resp.content
if b"Total Size: %d" % (len(FILE_CONTENTS),) in resp.content:
found_upload = True
elif href.startswith(u"/status/down"):
assert "File Download Status" in resp.content
if "Total Size: {}".format(len(FILE_CONTENTS)) in resp.content:
assert b"File Download Status" in resp.content
if b"Total Size: %d" % (len(FILE_CONTENTS),) in resp.content:
found_download = True
# download the specialized event information
@ -297,7 +308,7 @@ def test_directory_deep_check(alice):
print("Uploaded data1, cap={}".format(cap1))
resp = requests.get(
util.node_url(alice.node_dir, u"uri/{}".format(urllib2.quote(cap0))),
util.node_url(alice.node_dir, u"uri/{}".format(url_quote(cap0))),
params={u"t": u"info"},
)
@ -398,9 +409,9 @@ def test_directory_deep_check(alice):
for _ in range(5):
resp = requests.get(deepcheck_uri)
dom = BeautifulSoup(resp.content, "html5lib")
if dom.h1 and u'Results' in unicode(dom.h1.string):
if dom.h1 and u'Results' in str(dom.h1.string):
break
if dom.h2 and dom.h2.a and u"Reload" in unicode(dom.h2.a.string):
if dom.h2 and dom.h2.a and u"Reload" in str(dom.h2.a.string):
dom = None
time.sleep(1)
assert dom is not None, "Operation never completed"
@ -438,7 +449,7 @@ def test_introducer_info(introducer):
resp = requests.get(
util.node_url(introducer.node_dir, u""),
)
assert "Introducer" in resp.content
assert b"Introducer" in resp.content
resp = requests.get(
util.node_url(introducer.node_dir, u""),
@ -511,6 +522,6 @@ def test_mkdir_with_children(alice):
params={u"t": "mkdir-with-children"},
data=json.dumps(meta),
)
assert resp.startswith("URI:DIR2")
assert resp.startswith(b"URI:DIR2")
cap = allmydata.uri.from_string(resp)
assert isinstance(cap, allmydata.uri.DirectoryURI)
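
The ``test_web.py`` port above repeatedly swaps text assertions for bytes ones, since ``requests`` exposes response bodies as bytes. A small self-contained sketch of that pattern; the response object here is a stand-in, not a real Tahoe node response::

    FILE_CONTENTS = u"a file in a directory"

    class FakeResponse(object):
        """Mimics just the .content attribute the tests touch."""
        content = FILE_CONTENTS.encode("ascii")   # .content is always bytes

    resp = FakeResponse()

    # Compare bytes with bytes, or decode explicitly before comparing text.
    assert b"a file" in resp.content
    assert str(resp.content, "ascii") == FILE_CONTENTS

    # Bytes %-formatting (Python 3.5+) keeps the whole expression in bytes,
    # which is why the port writes b"Total Size: %d" % (size,).
    assert b"Total Size: %d" % (len(FILE_CONTENTS),) == b"Total Size: 21"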


@ -1,9 +1,21 @@
"""
Ported to Python 3.
"""
from __future__ import unicode_literals
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
import sys
import time
import json
from os import mkdir, environ
from os.path import exists, join
from six.moves import StringIO
from io import StringIO, BytesIO
from functools import partial
from subprocess import check_output
@ -55,9 +67,10 @@ class _CollectOutputProtocol(ProcessProtocol):
self.output, and callback's on done with all of it after the
process exits (for any reason).
"""
def __init__(self):
def __init__(self, capture_stderr=True):
self.done = Deferred()
self.output = StringIO()
self.output = BytesIO()
self.capture_stderr = capture_stderr
def processEnded(self, reason):
if not self.done.called:
@ -71,8 +84,9 @@ class _CollectOutputProtocol(ProcessProtocol):
self.output.write(data)
def errReceived(self, data):
print("ERR: {}".format(data))
self.output.write(data)
print("ERR: {!r}".format(data))
if self.capture_stderr:
self.output.write(data)
class _DumpOutputProtocol(ProcessProtocol):
@ -92,9 +106,11 @@ class _DumpOutputProtocol(ProcessProtocol):
self.done.errback(reason)
def outReceived(self, data):
data = str(data, sys.stdout.encoding)
self._out.write(data)
def errReceived(self, data):
data = str(data, sys.stdout.encoding)
self._out.write(data)
@ -114,6 +130,7 @@ class _MagicTextProtocol(ProcessProtocol):
self.exited.callback(None)
def outReceived(self, data):
data = str(data, sys.stdout.encoding)
sys.stdout.write(data)
self._output.write(data)
if not self.magic_seen.called and self._magic_text in self._output.getvalue():
@ -121,6 +138,7 @@ class _MagicTextProtocol(ProcessProtocol):
self.magic_seen.callback(self)
def errReceived(self, data):
data = str(data, sys.stderr.encoding)
sys.stdout.write(data)
@ -261,9 +279,9 @@ def _create_node(reactor, request, temp_dir, introducer_furl, flog_gatherer, nam
'--hostname', 'localhost',
'--listen', 'tcp',
'--webport', web_port,
'--shares-needed', unicode(needed),
'--shares-happy', unicode(happy),
'--shares-total', unicode(total),
'--shares-needed', str(needed),
'--shares-happy', str(happy),
'--shares-total', str(total),
'--helper',
]
if not storage:
@ -280,7 +298,7 @@ def _create_node(reactor, request, temp_dir, introducer_furl, flog_gatherer, nam
config,
u'node',
u'log_gatherer.furl',
flog_gatherer.decode("utf-8"),
flog_gatherer,
)
write_config(FilePath(config_path), config)
created_d.addCallback(created)
@ -526,7 +544,8 @@ def generate_ssh_key(path):
key = RSAKey.generate(2048)
key.write_private_key_file(path)
with open(path + ".pub", "wb") as f:
f.write(b"%s %s" % (key.get_name(), key.get_base64()))
s = "%s %s" % (key.get_name(), key.get_base64())
f.write(s.encode("ascii"))
def run_in_thread(f):
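
To make the ``_CollectOutputProtocol`` changes above concrete, here is a trimmed-down, hand-driven sketch: output now accumulates as bytes, and stderr is only mixed into the collected output when ``capture_stderr`` is True (the default). No real process is spawned, so this stays runnable outside a Tahoe checkout::

    from io import BytesIO

    from twisted.internet.defer import Deferred
    from twisted.internet.protocol import ProcessProtocol

    class _CollectOutputProtocol(ProcessProtocol):
        """Trimmed copy of the helper in the diff above (processEnded omitted)."""
        def __init__(self, capture_stderr=True):
            self.done = Deferred()
            self.output = BytesIO()
            self.capture_stderr = capture_stderr

        def outReceived(self, data):
            self.output.write(data)

        def errReceived(self, data):
            if self.capture_stderr:
                self.output.write(data)

    proto = _CollectOutputProtocol(capture_stderr=False)
    proto.outReceived(b"stdout bytes\n")
    proto.errReceived(b"stderr noise we do not want captured\n")
    assert proto.output.getvalue() == b"stdout bytes\n"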


@ -52,6 +52,8 @@ system where Tahoe is installed, or in a source tree with setup.py like this:
setup.py run_with_pythonpath -p -c 'misc/make-canary-files.py ARGS..'
"""
from past.builtins import cmp
import os, hashlib
from twisted.python import usage
from allmydata.immutable import upload


@ -18,7 +18,7 @@ def factorial(n):
factorial(n) with n<0 is -factorial(abs(n))
"""
result = 1
for i in xrange(1, abs(n)+1):
for i in range(1, abs(n)+1):
result *= i
assert n >= 0
return result
@ -30,7 +30,7 @@ def binomial(n, k):
# calculate n!/k! as one product, avoiding factors that
# just get canceled
P = k+1
for i in xrange(k+2, n+1):
for i in range(k+2, n+1):
P *= i
# if you are paranoid:
# C, rem = divmod(P, factorial(n-k))


@ -79,7 +79,7 @@ def make_candidate(B, K, K1, K2, q, T, T_min, L_hash, lg_N, sig_bytes, c_sign, c
# Winternitz with B < 4 is never optimal. For example, going from B=4 to B=2 halves the
# chain depth, but that is cancelled out by doubling (roughly) the number of digits.
range_B = xrange(4, 33)
range_B = range(4, 33)
M = pow(2, lg_M)
@ -100,7 +100,7 @@ def calculate(K, K1, K2, q_max, L_hash, trees):
T_min = ceil_div(lg_M - lg_K1, lg_K)
last_q = None
for T in xrange(T_min, T_min+21):
for T in range(T_min, T_min+21):
# lg(total number of leaf private keys)
lg_S = lg_K1 + lg_K*T
lg_N = lg_S + lg_K2
@ -137,14 +137,14 @@ def calculate(K, K1, K2, q_max, L_hash, trees):
# We approximate lg(M-x) as lg(M)
lg_px_step = lg_M + lg_p - lg_1_p
for x in xrange(1, j):
for x in range(1, j):
lg_px[x] = lg_px[x-1] - lg(x) + lg_px_step
q = None
# Find the minimum acceptable value of q.
for q_cand in xrange(1, q_max+1):
for q_cand in range(1, q_max+1):
lg_q = lg(q_cand)
lg_pforge = [lg_px[x] + (lg_q*x - lg_K2)*q_cand for x in xrange(1, j)]
lg_pforge = [lg_px[x] + (lg_q*x - lg_K2)*q_cand for x in range(1, j)]
if max(lg_pforge) < -L_hash + lg(j) and lg_px[j-1] + 1.0 < -L_hash:
#print("K = %d, K1 = %d, K2 = %d, L_hash = %d, lg_K2 = %.3f, q = %d, lg_pforge_1 = %.3f, lg_pforge_2 = %.3f, lg_pforge_3 = %.3f"
# % (K, K1, K2, L_hash, lg_K2, q, lg_pforge_1, lg_pforge_2, lg_pforge_3))
@ -246,13 +246,13 @@ def search():
K_max = 50
c2 = compressions(2*L_hash)
c3 = compressions(3*L_hash)
for dau in xrange(0, 10):
for dau in range(0, 10):
a = pow(2, dau)
for tri in xrange(0, ceil_log(30-dau, 3)):
for tri in range(0, ceil_log(30-dau, 3)):
x = int(a*pow(3, tri))
h = dau + 2*tri
c_x = int(sum_powers(2, dau)*c2 + a*sum_powers(3, tri)*c3)
for y in xrange(1, x+1):
for y in range(1, x+1):
if tri > 0:
# If the bottom level has arity 3, then for every 2 nodes by which the tree is
# imperfect, we can save c3 compressions by pruning 3 leaves back to their parent.
@ -267,16 +267,16 @@ def search():
if y not in trees or (h, c_y, (dau, tri)) < trees[y]:
trees[y] = (h, c_y, (dau, tri))
#for x in xrange(1, K_max+1):
#for x in range(1, K_max+1):
# print(x, trees[x])
candidates = []
progress = 0
fuzz = 0
complete = (K_max-1)*(2200-200)/100
for K in xrange(2, K_max+1):
for K2 in xrange(200, 2200, 100):
for K1 in xrange(max(2, K-fuzz), min(K_max, K+fuzz)+1):
for K in range(2, K_max+1):
for K2 in range(200, 2200, 100):
for K1 in range(max(2, K-fuzz), min(K_max, K+fuzz)+1):
candidates += calculate(K, K1, K2, q_max, L_hash, trees)
progress += 1
print("searching: %3d %% \r" % (100.0 * progress / complete,), end=' ', file=stderr)
@ -285,7 +285,7 @@ def search():
step = 2.0
bins = {}
limit = floor_div(limit_cost, step)
for bin in xrange(0, limit+2):
for bin in range(0, limit+2):
bins[bin] = []
for c in candidates:
@ -296,7 +296,7 @@ def search():
# For each in a range of signing times, find the best candidate.
best = []
for bin in xrange(0, limit):
for bin in range(0, limit):
candidates = bins[bin] + bins[bin+1] + bins[bin+2]
if len(candidates) > 0:
best += [min(candidates, key=lambda c: c['sig_bytes'])]


@ -4,6 +4,8 @@
from __future__ import print_function
from past.builtins import cmp
import random
SERVER_CAPACITY = 10**12


@ -2,6 +2,11 @@
from __future__ import print_function
from future.utils import PY2
if PY2:
from future.builtins import input
import random, math, re
from twisted.python import usage
@ -205,7 +210,7 @@ def graph():
series["alacrity"][file_size] = s.bytes_until_some_data
g.plot([ (fs, series["overhead"][fs])
for fs in sizes ])
raw_input("press return")
input("press return")
if __name__ == '__main__':


@ -1 +0,0 @@
PyPy is now a supported platform.


@ -1 +0,0 @@
The Tahoe-LAFS project has adopted a formal code of conduct.


@ -1 +0,0 @@
The Magic Folder frontend has been split out into a stand-alone project. The functionality is no longer part of Tahoe-LAFS itself. Learn more at <https://github.com/LeastAuthority/magic-folder>.


@ -1 +0,0 @@
Tahoe-LAFS now supports CentOS 8 and no longer supports CentOS 7.


@ -1 +0,0 @@
Make directory page links work.


@ -1 +0,0 @@
Replace nevow with twisted.web in web.operations.OphandleTable


@ -1 +0,0 @@
Replace nevow with twisted.web in web.operations.ReloadMixin


@ -1 +0,0 @@
Port checker result pages' rendering from nevow to twisted web templates.


@ -1 +0,0 @@
allmydata.testing.web, a new module, now offers a supported Python API for testing Tahoe-LAFS web API clients.


@ -1 +0,0 @@
Slackware 14.2 is no longer a Tahoe-LAFS supported platform.


@ -1 +0,0 @@
Tahoe-LAFS now supports Ubuntu 20.04.


@ -1 +0,0 @@
Use last known revision of Chutney that is known to work with Python 2 for Tor integration tests.


@ -1 +0,0 @@
Mutable files now use RSA exponent 65537


@ -1 +0,0 @@


@ -1 +0,0 @@
The "coverage" tox environment has been replaced by the "py27-coverage" and "py36-coverage" environments.


@ -1 +0,0 @@


@ -1 +0,0 @@


@ -1 +0,0 @@
Added pre-commit config to run flake8 checks on commit/push.

Some files were not shown because too many files have changed in this diff Show More