Mirror of https://github.com/tahoe-lafs/tahoe-lafs.git
Merge remote-tracking branch 'origin/master' into 3504.private-introducer-furl
Commit: a0d46c6f09
.github/workflows/ci.yml (vendored; 19 lines changed)

@@ -21,7 +21,7 @@ jobs:
     steps:

-      # Get vcpython27 on Windows + Python 2.7, to build zfec
+      # Get vcpython27 on Windows + Python 2.7, to build netifaces
       # extension. See https://chocolatey.org/packages/vcpython27 and
       # https://github.com/crazy-max/ghaction-chocolatey
       - name: Install MSVC 9.0 for Python 2.7 [Windows]
@@ -78,6 +78,15 @@ jobs:
     steps:

+      # Get vcpython27 for Windows + Python 2.7, to build netifaces
+      # extension. See https://chocolatey.org/packages/vcpython27 and
+      # https://github.com/crazy-max/ghaction-chocolatey
+      - name: Install MSVC 9.0 for Python 2.7 [Windows]
+        if: matrix.os == 'windows-latest' && matrix.python-version == '2.7'
+        uses: crazy-max/ghaction-chocolatey@v1
+        with:
+          args: install vcpython27
+
       - name: Install Tor [Ubuntu]
         if: matrix.os == 'ubuntu-latest'
         run: sudo apt install tor
@@ -92,12 +101,6 @@ jobs:
         with:
          args: install tor

-      - name: Install MSVC 9.0 for Python 2.7 [Windows]
-        if: matrix.os == 'windows-latest' && matrix.python-version == '2.7'
-        uses: crazy-max/ghaction-chocolatey@v1
-        with:
-          args: install vcpython27
-
       - name: Check out Tahoe-LAFS sources
         uses: actions/checkout@v2

@@ -141,7 +144,7 @@ jobs:
     steps:

-      # Get vcpython27 on Windows + Python 2.7, to build zfec
+      # Get vcpython27 for Windows + Python 2.7, to build netifaces
       # extension. See https://chocolatey.org/packages/vcpython27 and
       # https://github.com/crazy-max/ghaction-chocolatey
       - name: Install MSVC 9.0 for Python 2.7 [Windows]
@@ -39,9 +39,7 @@ If you are on Windows, please see :doc:`windows` for platform-specific
 instructions.

 If you are on a Mac, you can either follow these instructions, or use the
-pre-packaged bundle described in :doc:`OS-X`. The Tahoe project hosts
-pre-compiled "wheels" for all dependencies, so use the ``--find-links=``
-option described below to avoid needing a compiler.
+pre-packaged bundle described in :doc:`OS-X`.

 Many Linux distributions include Tahoe-LAFS packages. Debian and Ubuntu users
 can ``apt-get install tahoe-lafs``. See `OSPackages`_ for other
@@ -54,9 +52,14 @@ Preliminaries
 =============

 If you don't use a pre-packaged copy of Tahoe, you can build it yourself.
-You'll need Python2.7, pip, and virtualenv. On unix-like platforms, you will
-need a C compiler, the Python development headers, and some libraries
-(libffi-dev and libssl-dev).
+You'll need Python2.7, pip, and virtualenv.
+Tahoe-LAFS depends on some libraries which require a C compiler to build.
+However, for many platforms, PyPI hosts already-built packages of libraries.
+
+If there is no already-built package for your platform,
+you will need a C compiler,
+the Python development headers,
+and some libraries (libffi-dev and libssl-dev).

 On a modern Debian/Ubuntu-derived distribution, this command will get you
 everything you need::
@@ -64,8 +67,7 @@ everything you need::
   apt-get install build-essential python-dev libffi-dev libssl-dev libyaml-dev python-virtualenv

 On OS-X, install pip and virtualenv as described below. If you want to
-compile the dependencies yourself (instead of using ``--find-links`` to take
-advantage of the pre-compiled ones we host), you'll also need to install
+compile the dependencies yourself, you'll also need to install
 Xcode and its command-line tools.

 **Note** that Tahoe-LAFS depends on `openssl 1.1.1c` or greater.
@@ -150,30 +152,24 @@ from PyPI with ``venv/bin/pip install tahoe-lafs``. After installation, run
   % virtualenv venv
   New python executable in ~/venv/bin/python2.7
   Installing setuptools, pip, wheel...done.

   % venv/bin/pip install -U pip setuptools
   Downloading/unpacking pip from https://pypi.python.org/...
   ...
   Successfully installed pip setuptools

   % venv/bin/pip install tahoe-lafs
   Collecting tahoe-lafs
   ...
   Installing collected packages: ...
   Successfully installed ...

   % venv/bin/tahoe --version
   tahoe-lafs: 1.14.0
   foolscap: ...

   %

-On OS-X, instead of ``pip install tahoe-lafs``, use this command to take
-advantage of the hosted pre-compiled wheels::
-
-  venv/bin/pip install --find-links=https://tahoe-lafs.org/deps tahoe-lafs

 Install From a Source Tarball
 -----------------------------

@@ -182,13 +178,13 @@ You can also install directly from the source tarball URL::

   % virtualenv venv
   New python executable in ~/venv/bin/python2.7
   Installing setuptools, pip, wheel...done.

   % venv/bin/pip install https://tahoe-lafs.org/downloads/tahoe-lafs-1.14.0.tar.bz2
   Collecting https://tahoe-lafs.org/downloads/tahoe-lafs-1.14.0.tar.bz2
   ...
   Installing collected packages: ...
   Successfully installed ...

   % venv/bin/tahoe --version
   tahoe-lafs: 1.14.0
   ...
@@ -213,16 +209,16 @@ with the ``--editable`` flag. You should also use the ``[test]`` extra to get
 the additional libraries needed to run the unit tests::

   % git clone https://github.com/tahoe-lafs/tahoe-lafs.git

   % cd tahoe-lafs

   % virtualenv venv

   % venv/bin/pip install --editable .[test]
   Obtaining file::~/tahoe-lafs
   ...
   Successfully installed ...

   % venv/bin/tahoe --version
   tahoe-lafs: 1.14.0.post34.dev0
   ...
@@ -282,7 +278,7 @@ result in an "all tests passed" message::

   test_missing_signature ... [OK]
   ...
   Ran 1186 tests in 423.179s

   PASSED (skips=7, expectedFailures=3, successes=1176)
   __________________________ summary ___________________________________
   py27: commands succeeded
@@ -1,110 +0,0 @@ (docs/how_to_make_a_tahoe-lafs_release.org deleted)
How to Make a Tahoe-LAFS Release

Any developer with push privileges can do most of these steps, but a
"Release Maintainer" is required for some signing operations -- these
steps are marked with (Release Maintainer). Currently, the following
people are Release Maintainers:

- Brian Warner (https://github.com/warner)

* select features/PRs for new release [0/2]
  - [ ] made sure they are tagged/labeled
  - [ ] merged all release PRs

* basic quality checks [0/3]
  - [ ] all travis CI checks pass
  - [ ] all appveyor checks pass
  - [ ] all buildbot workers pass their checks

* freeze master branch [0/1]
  - [ ] announced the freeze of the master branch on IRC (i.e. non-release PRs won't be merged until after release)

* sync documentation [0/7]

  - [ ] NEWS.rst: (run "tox -e news")
  - [ ] added final release name and date to top-most item in NEWS.rst
  - [ ] updated relnotes.txt (change next, last versions; summarize NEWS)
  - [ ] updated CREDITS
  - [ ] updated docs/known_issues.rst
  - [ ] docs/INSTALL.rst only points to current tahoe-lafs-X.Y.Z.tar.gz source code file
  - [ ] updated https://tahoe-lafs.org/hacktahoelafs/

* sign + build the tag [0/8]

  - [ ] code passes all checks / tests (i.e. all CI is green)
  - [ ] documentation is ready (see above)
  - [ ] (Release Maintainer): git tag -s -u 0xE34E62D06D0E69CFCA4179FFBDE0D31D68666A7A -m "release Tahoe-LAFS-X.Y.Z" tahoe-lafs-X.Y.Z
  - [ ] build code locally:
        tox -e py27,codechecks,deprecations,docs,integration,upcoming-deprecations
  - [ ] created tarballs (they'll be in dist/ for later comparison)
        tox -e tarballs
  - [ ] release version is reporting itself as intended version
        ls dist/
  - [ ] 'git pull' doesn't pull anything
  - [ ] pushed tag to trigger buildslaves
        git push official master TAGNAME
  - [ ] confirmed Dockerhub built successfully:
        https://hub.docker.com/r/tahoelafs/base/builds/

* sign the release artifacts [0/8]

  - [ ] (Release Maintainer): pushed signed tag (should trigger Buildbot builders)
  - [ ] Buildbot workers built all artifacts successfully
  - [ ] downloaded upstream tarballs+wheels
  - [ ] announce on IRC that master is unlocked
  - [ ] compared upstream tarballs+wheels against local copies
  - [ ] (Release Maintainer): signed each upstream artifact with "gpg -ba -u 0xE34E62D06D0E69CFCA4179FFBDE0D31D68666A7A FILE"
  - [ ] added to relnotes.txt: [0/3]
    - [ ] prefix with SHA256 of tarballs
    - [ ] release pubkey
    - [ ] git revision hash
  - [ ] GPG-signed the release email with release key (write to
        relnotes.txt.asc) Ideally this is a Release Maintainer, but could
        be any developer

* publish release artifacts [0/9]

  - [ ] uploaded to PyPI via: twine upload dist/*
  - [ ] uploaded *.asc to org ~source/downloads/
  - [ ] test install works properly: pip install tahoe-lafs
  - [ ] copied the release tarballs and signatures to tahoe-lafs.org: ~source/downloads/
  - [ ] moved old release out of ~source/downloads (to downloads/old/?)
  - [ ] ensured readthedocs.org updated
  - [ ] uploaded wheels to https://tahoe-lafs.org/deps/
  - [ ] uploaded release to https://github.com/tahoe-lafs/tahoe-lafs/releases

* check release downloads [0/]

  - [ ] test PyPI via: pip install tahoe-lafs
  - [ ] https://github.com/tahoe-lafs/tahoe-lafs/releases
  - [ ] https://tahoe-lafs.org/downloads/
  - [ ] https://tahoe-lafs.org/deps/

* document release in trac [0/]

  - [ ] closed the Milestone on the trac Roadmap

* unfreeze master branch [0/]

  - [ ] announced on IRC that new PRs will be looked at/merged

* announce new release [0/]

  - [ ] sent release email and relnotes.txt.asc to tahoe-announce@tahoe-lafs.org
  - [ ] sent release email and relnotes.txt.asc to tahoe-dev@tahoe-lafs.org
  - [ ] updated Wiki front page: version on download link, News column
  - [ ] updated Wiki "Doc": parade of release notes (with rev of NEWS.rst)
  - [ ] make an "announcement of new release" on freshmeat (XXX still a thing?)
  - [ ] make an "announcement of new release" on launchpad
  - [ ] tweeted as @tahoelafs
  - [ ] emailed relnotes.txt.asc to below listed mailing-lists/organizations
  - [ ] also announce release to (trimmed from previous version of this doc):
    - twisted-python@twistedmatrix.com
    - liberationtech@lists.stanford.edu
    - lwn@lwn.net
    - p2p-hackers@lists.zooko.com
    - python-list@python.org
    - http://listcultures.org/pipermail/p2presearch_listcultures.org/
    - cryptopp-users@googlegroups.com
    - (others?)
docs/release-checklist.rst (new file, 197 lines)

@@ -0,0 +1,197 @@
=================
Release Checklist
=================

These instructions were produced while making the 1.15.0 release. They
are based on the original instructions (in old revisions in the file
`docs/how_to_make_a_tahoe-lafs_release.org`).

Any contributor can do the first part of the release preparation. Only
certain contributors can perform other parts. These are the two main
sections of this checklist (and could be done by different people).

A final section describes how to announce the release.


Any Contributor
---------------

Anyone who can create normal PRs should be able to complete this
portion of the release process.


Prepare for the Release
```````````````````````

The `master` branch should always be releasable.

It may be worth asking (on IRC or the mailing-list) if anything will be
merged imminently (for example, "I will prepare a release this coming
Tuesday if you want to get anything in").

- Create a ticket for the release in Trac
- Ticket number needed in next section


Create Branch and Apply Updates
```````````````````````````````

A condensed sketch of these steps appears after the list.

- Create a branch for release-candidates (e.g. `XXXX.release-1.15.0.rc0`)
- run `tox -e news` to produce a new NEWS.txt file (this does a commit)
- create the news for the release
  - newsfragments/<ticket number>.minor
  - commit it
- manually fix NEWS.txt
  - proper title for latest release ("Release 1.15.0" instead of "Release ...post1432")
  - double-check date (maybe release will be in the future)
  - spot-check the release notes (these come from the newsfragments
    files though so don't do heavy editing)
  - commit these changes
- update "relnotes.txt"
  - update all mentions of 1.14.0 -> 1.15.0
  - update "previous release" statement and date
  - summarize major changes
  - commit it
- update "CREDITS"
  - are there any new contributors in this release?
  - one way: git log release-1.14.0.. | grep Author | sort | uniq
  - commit it
- update "docs/known_issues.rst" if appropriate
- update "docs/INSTALL.rst" references to the new release
- Push the branch to github
- Create a (draft) PR; this should trigger CI (note that github
  doesn't let you create a PR without some changes on the branch so
  running + committing the NEWS.txt file achieves that without changing
  any code)
- Confirm CI runs successfully on all platforms
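
Assuming a hypothetical 1.15.0 release prepared on ticket 9999, the
command-line portion of the steps above might look roughly like the
following sketch (the branch name, ticket number, and versions are
illustrative only)::

   % git checkout -b 9999.release-1.15.0.rc0
   % tox -e news                         # regenerates and commits NEWS.txt
   % echo "Tag the release." > newsfragments/9999.minor
   % git add newsfragments/9999.minor && git commit -m "news for release"
   % $EDITOR NEWS.txt relnotes.txt CREDITS docs/known_issues.rst docs/INSTALL.rst
   % git commit -am "update release documents"
   % git log release-1.14.0.. | grep Author | sort | uniq   # any new contributors?
   % git push origin 9999.release-1.15.0.rc0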

Create Release Candidate
````````````````````````

Before "officially" tagging any release, we will make a
release-candidate available. So there will be at least 1.15.0rc0 (for
example). If there are any problems, an rc1 or rc2 etc may also be
released. Anyone can sign these releases (ideally they'd be signed
"officially" as well, but it's better to get them out than to wait for
that).

Typically expert users will be the ones testing release candidates and
they will need to evaluate which contributors' signatures they trust.

- (all steps above are completed)
- sign the release
  - git tag -s -u 0xE34E62D06D0E69CFCA4179FFBDE0D31D68666A7A -m "release Tahoe-LAFS-1.15.0rc0" tahoe-lafs-1.15.0rc0
  - (replace the key-id above with your own)
- build all code locally
  - these should all pass:
    - tox -e py27,codechecks,docs,integration
  - these can fail (ideally they should not, of course):
    - tox -e deprecations,upcoming-deprecations
- build tarballs
  - tox -e tarballs
  - confirm it at least exists:
    - ls dist/ | grep 1.15.0rc0
- inspect and test the tarballs
  - install each in a fresh virtualenv
  - run `tahoe` command
- when satisfied, sign the tarballs:
  - gpg --pinentry=loopback --armor --sign dist/tahoe_lafs-1.15.0rc0-py2-none-any.whl
  - gpg --pinentry=loopback --armor --sign dist/tahoe_lafs-1.15.0rc0.tar.bz2
  - gpg --pinentry=loopback --armor --sign dist/tahoe_lafs-1.15.0rc0.tar.gz
  - gpg --pinentry=loopback --armor --sign dist/tahoe_lafs-1.15.0rc0.zip
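
For the "inspect and test the tarballs" step, a minimal check in a
fresh virtualenv might look like this sketch (paths and the exact
artifact name are illustrative only)::

   % virtualenv /tmp/rc0-check
   % /tmp/rc0-check/bin/pip install dist/tahoe_lafs-1.15.0rc0.tar.gz
   % /tmp/rc0-check/bin/tahoe --version
   tahoe-lafs: 1.15.0rc0
   ...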

Privileged Contributor
----------------------

Steps in this portion require special access to keys or
infrastructure. For example, **access to tahoe-lafs.org** to upload
binaries or edit HTML.


Hack Tahoe-LAFS
```````````````

Did anyone contribute a hack since the last release? If so, then
https://tahoe-lafs.org/hacktahoelafs/ needs to be updated.


Upload Artifacts
````````````````

Any release-candidate or actual release plus signature (.asc file)
needs to be uploaded to https://tahoe-lafs.org in `~source/downloads`.

- secure-copy all release artifacts to the download area on the
  tahoe-lafs.org host machine. `~source/downloads` on there maps to
  https://tahoe-lafs.org/downloads/ on the Web.
- scp dist/*1.15.0* username@tahoe-lafs.org:/home/source/downloads
- the following developers have access to do this:
  - exarkun
  - meejah
  - warner

For the actual release, the tarball and signature files need to be
uploaded to PyPI as well.

- how to do this?
- (original guide says only "twine upload dist/*")
- the following developers have access to do this:
  - warner
  - exarkun (partial?)
  - meejah (partial?)


Announcing the Release Candidate
````````````````````````````````

The release-candidate should be announced by posting to the
mailing-list (tahoe-dev@tahoe-lafs.org). For example:
https://tahoe-lafs.org/pipermail/tahoe-dev/2020-October/009995.html


Is The Release Done Yet?
````````````````````````

If anyone reports a problem with a release-candidate then a new
release-candidate should be made once a fix has been merged to
master. Repeat the above instructions with `rc1` or `rc2` or whatever
is appropriate.

Once a release-candidate has marinated for some time then it can be
made into the actual release.

XXX Write this section when doing 1.15.0 actual release

(In general, this means dropping the "rcX" part of the release and the
tag, uploading those artifacts, uploading to PyPI, ... )


Announcing the Release
----------------------


mailing-lists
`````````````

A new Tahoe release is traditionally announced on our mailing-list
(tahoe-dev@tahoe-lafs.org). The former version of these instructions
also announced the release on the following other lists:

- tahoe-announce@tahoe-lafs.org
- twisted-python@twistedmatrix.com
- liberationtech@lists.stanford.edu
- lwn@lwn.net
- p2p-hackers@lists.zooko.com
- python-list@python.org
- http://listcultures.org/pipermail/p2presearch_listcultures.org/
- cryptopp-users@googlegroups.com


wiki
````

Edit the "News" section of the front page of https://tahoe-lafs.org
with a link to the mailing-list archive of the announcement message.
@@ -8,6 +8,7 @@ the data formats used by Tahoe.
   :maxdepth: 2

   outline
+  url
   uri
   file-encoding
   URI-extension
docs/specifications/url.rst (new file, 165 lines)

@@ -0,0 +1,165 @@
URLs
====

The goal of this document is to completely specify the construction and use of the URLs by Tahoe-LAFS for service location.
This includes, but is not limited to, the original Foolscap-based URLs.
These are not to be confused with the URI-like capabilities Tahoe-LAFS uses to refer to stored data.
An attempt is also made to outline the rationale for certain choices about these URLs.
The intended audience for this document is Tahoe-LAFS maintainers and other developers interested in interoperating with Tahoe-LAFS or these URLs.

Background
----------

Tahoe-LAFS first used Foolscap_ for network communication.
Foolscap connection setup takes as input a Foolscap URL, or *fURL*.
A fURL includes three components:

* the base32-encoded SHA1 hash of the DER form of an x509v3 certificate
* zero or more network addresses [1]_
* an object identifier

A Foolscap client tries to connect to each network address in turn.
If a connection is established then TLS is negotiated.
The server is authenticated by matching its certificate against the hash in the fURL.
A matching certificate serves as proof that the handshaking peer is the correct server.
This is the process by which the client authenticates the server.

The client can then exercise further Foolscap functionality using the fURL's object identifier.
If the object identifier is an unguessable, secret string then it serves as a capability.
This unguessable identifier is sometimes called a `swiss number`_ (or swissnum).
The client's use of the swissnum is what allows the server to authorize the client.

.. _`swiss number`: http://wiki.erights.org/wiki/Swiss_number

NURLs
-----

The authentication and authorization properties of fURLs are a good fit for Tahoe-LAFS' requirements.
These properties are not inherently tied to the Foolscap protocol itself.
In particular they are beneficial to :doc:`../proposed/http-storage-node-protocol` which uses HTTP instead of Foolscap.
It is conceivable they will also be used with WebSockets at some point.

Continuing to refer to these URLs as fURLs when they are being used for other protocols may cause confusion.
Therefore,
this document coins the name **NURL** for these URLs.
This can be considered to expand to "**N**\ ew URLs" or "Authe\ **N**\ ticating URLs" or "Authorizi\ **N**\ g URLs" as the reader prefers.

The anticipated use for a **NURL** will still be to establish a TLS connection to a peer.
The protocol run over that TLS connection could be Foolscap though it is more likely to be an HTTP-based protocol (such as GBS).

Syntax
------

The EBNF for a NURL is as follows::

   nurl = scheme, hash, "@", net-loc-list, "/", swiss-number, [ version1 ]

   scheme = "pb://"

   hash = unreserved

   net-loc-list = net-loc, [ { ",", net-loc } ]
   net-loc = tcp-loc | tor-loc | i2p-loc

   tcp-loc = [ "tcp:" ], hostname, [ ":" port ]
   tor-loc = "tor:", hostname, [ ":" port ]
   i2p-loc = "i2p:", i2p-addr, [ ":" port ]

   i2p-addr = { unreserved }, ".i2p"
   hostname = domain | IPv4address | IPv6address

   swiss-number = segment

   version1 = "#v=1"

See https://tools.ietf.org/html/rfc3986#section-3.3 for the definition of ``segment``.
See https://tools.ietf.org/html/rfc2396#appendix-A for the definition of ``unreserved``.
See https://tools.ietf.org/html/draft-main-ipaddr-text-rep-02#section-3.1 for the definition of ``IPv4address``.
See https://tools.ietf.org/html/draft-main-ipaddr-text-rep-02#section-3.2 for the definition of ``IPv6address``.
See https://tools.ietf.org/html/rfc1035#section-2.3.1 for the definition of ``domain``.
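
As a worked illustration of this grammar,
here is a hypothetical sketch in Python that splits a NURL into its components with a regular expression.
It is not a validating parser
(for instance, it accepts only a common subset of the RFC 2396 ``unreserved`` characters and does not check the individual location hints),
and the names are illustrative only::

   import re

   NURL_RE = re.compile(
       r"^pb://"
       r"(?P<hash>[A-Za-z0-9_.~-]+)"    # a common subset of 'unreserved'
       r"@(?P<hints>[^/]*)"             # comma-separated location hints
       r"/(?P<swissnum>[^/#?]+)"        # RFC 3986 'segment'
       r"(?:#v=(?P<version>\d+))?$"
   )

   def parse_nurl(nurl):
       m = NURL_RE.match(nurl)
       if m is None:
           raise ValueError("not a NURL: {!r}".format(nurl))
       fields = m.groupdict()
       # The hints part may be empty; see footnote [1] below.
       fields["hints"] = fields["hints"].split(",") if fields["hints"] else []
       return fields

   # e.g., for the first version 0 example below, parse_nurl() yields
   # hints ["tcp:127.1:34399"] and version None.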

Versions
--------

Though all NURLs are syntactically compatible,
some semantic differences are allowed.
These differences are separated into distinct versions.

Version 0
---------

A Foolscap fURL is considered the canonical definition of a version 0 NURL.
Notably,
the hash component is defined as the base32-encoded SHA1 hash of the DER form of an x509v3 certificate.
A version 0 NURL is identified by the absence of the ``v=1`` fragment.
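
A hypothetical sketch of this hash computation in Python,
assuming a recent version of the third-party ``cryptography`` package and Foolscap's lowercase,
unpadded base32 rendering
(the function name is illustrative only)::

   from base64 import b32encode
   from hashlib import sha1

   from cryptography import x509
   from cryptography.hazmat.primitives.serialization import Encoding

   def nurl_v0_hash(cert_pem):
       cert = x509.load_pem_x509_certificate(cert_pem)
       # SHA1 over the whole DER-encoded certificate; the 20 digest bytes
       # base32-encode to exactly 32 characters, as in the examples below.
       der = cert.public_bytes(Encoding.DER)
       return b32encode(sha1(der).digest()).decode("ascii").lower()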

Examples
~~~~~~~~

* ``pb://sisi4zenj7cxncgvdog7szg3yxbrnamy@tcp:127.1:34399/xphmwz6lx24rh2nxlinni``
* ``pb://2uxmzoqqimpdwowxr24q6w5ekmxcymby@localhost:47877/riqhpojvzwxujhna5szkn``

Version 1
---------

The hash component of a version 1 NURL differs in three ways from the prior version.

1. The hash function used is SHA3-224 instead of SHA1.
   The security of SHA1 `continues to be eroded`_.
   Contrariwise,
   SHA3 is currently the most recent addition to the SHA family by NIST.
   The 224 bit instance is chosen to keep the output short and because it offers greater collision resistance than SHA1 was thought to offer even at its inception
   (prior to security research showing actual collision resistance is lower).
2. The hash is computed over the certificate's SPKI instead of the whole certificate.
   This allows certificate re-generation so long as the public key remains the same.
   This is useful for updating contact information or extending the validity period.
   Use of an SPKI hash has also been `explored by the web community`_ during its flirtation with using it for HTTPS certificate pinning
   (though this is now largely abandoned).

   .. note::
      *Only* the certificate's keypair is pinned by the SPKI hash.
      The freedom to change every other part of the certificate is coupled with the fact that all other parts of the certificate contain arbitrary information set by the private key holder.
      It is neither guaranteed nor expected that a certificate-issuing authority has validated this information.
      Therefore,
      *all* certificate fields should be considered within the context of the relationship identified by the SPKI hash.

3. The hash is encoded using urlsafe-base64 (without padding) instead of base32.
   This provides a more compact representation and minimizes the usability impacts of switching from a 160 bit hash to a 224 bit hash.

A version 1 NURL is identified by the presence of the ``v=1`` fragment.
Though the length of the hash string (38 bytes) could also be used to differentiate it from a version 0 NURL,
there is no guarantee that this will be effective in differentiating it from future versions so this approach should not be used.

It is possible for a client to unilaterally upgrade a version 0 NURL to a version 1 NURL.
After establishing and authenticating a connection the client will have received a copy of the server's certificate.
This is sufficient to compute the new hash and rewrite the NURL to upgrade it to version 1.
This provides stronger authentication assurances for future uses but it is not required.
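
A hypothetical sketch of the version 1 hash computation in Python,
under the same ``cryptography`` assumption as the version 0 sketch above
(names are again illustrative only)::

   from base64 import urlsafe_b64encode
   from hashlib import sha3_224

   from cryptography import x509
   from cryptography.hazmat.primitives.serialization import (
       Encoding,
       PublicFormat,
   )

   def nurl_v1_hash(cert_pem):
       cert = x509.load_pem_x509_certificate(cert_pem)
       # Hash only the SubjectPublicKeyInfo so the certificate can be
       # re-generated around the same keypair.
       spki = cert.public_key().public_bytes(
           Encoding.DER,
           PublicFormat.SubjectPublicKeyInfo,
       )
       # The 28 digest bytes encode to 38 urlsafe-base64 characters once
       # the two padding characters are stripped, matching the hash
       # string length noted above.
       digest = sha3_224(spki).digest()
       return urlsafe_b64encode(digest).decode("ascii").rstrip("=")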

Examples
~~~~~~~~

* ``pb://1WUX44xKjKdpGLohmFcBNuIRN-8rlv1Iij_7rQ@tcp:127.1:34399/jhjbc3bjbhk#v=1``
* ``pb://azEu8vlRpnEeYm0DySQDeNY3Z2iJXHC_bsbaAw@localhost:47877/64i4aokv4ej#v=1``

.. _`continues to be eroded`: https://en.wikipedia.org/wiki/SHA-1#Cryptanalysis_and_validation
.. _`explored by the web community`: https://www.imperialviolet.org/2011/05/04/pinning.html
.. _Foolscap: https://github.com/warner/foolscap

.. [1] ``foolscap.furl.decode_furl`` is taken as the canonical definition of the syntax of a fURL.
   The **location hints** part of the fURL,
   as it is referred to in Foolscap,
   is matched by the regular expression fragment ``([^/]*)``.
   Since this matches the empty string,
   no network addresses are required to form a fURL.
   The supporting code around the regular expression also takes extra steps to allow an empty string to match here.

Open Questions
--------------

1. Should we make a hard recommendation that all certificate fields are ignored?
   The system makes no guarantees about validation of these fields.
   Is it just an unnecessary risk to let a user see them?

2. Should the version specifier be a query-arg-alike or a fragment-alike?
   The value is only necessary on the client side which makes it similar to an HTTP URL fragment.
   The current Tahoe-LAFS configuration parsing code has special handling of the fragment character (``#``) which makes it unusable.
   However,
   the configuration parsing code is easily changed.
@@ -33,7 +33,7 @@ You can use whatever name you like for the virtualenv, but example uses
 3: Use the virtualenv's ``pip`` to install the latest release of Tahoe-LAFS
 into this virtualenv::

-  PS C:\Users\me> venv\Scripts\pip install --find-links=https://tahoe-lafs.org/deps/ tahoe-lafs
+  PS C:\Users\me> venv\Scripts\pip install tahoe-lafs
   Collecting tahoe-lafs
   ...
   Installing collected packages: ...
@@ -69,7 +69,7 @@ The ``pip install tahoe-lafs`` command above will install the latest release
 the following command (using pip from the virtualenv, from the root of your
 git checkout)::

-  $ venv\Scripts\pip install --find-links=https://tahoe-lafs.org/deps/ .
+  $ venv\Scripts\pip install .

 If you're planning to hack on the source code, you might want to add
 ``--editable`` so you won't have to re-install each time you make a change.
@@ -77,12 +77,7 @@ If you're planning to hack on the source code, you might want to add
 Dependencies
 ------------

-Tahoe-LAFS depends upon several packages that use compiled C code
-(such as zfec). This code must be built separately for each platform
-(Windows, OS-X, and different flavors of Linux).
-
-Pre-compiled "wheels" of all Tahoe's dependencies are hosted on the
-tahoe-lafs.org website in the ``deps/`` directory. The ``--find-links=``
-argument (used in the examples above) instructs ``pip`` to look at that URL
-for dependencies. This should avoid the need for anything to be compiled
-during the install.
+Tahoe-LAFS depends upon several packages that use compiled C code (such as zfec).
+This code must be built separately for each platform (Windows, OS-X, and different flavors of Linux).
+Fortunately, this is now done by upstream packages for most platforms.
+The result is that a C compiler is usually not required to install Tahoe-LAFS.
(File diff suppressed because it is too large.)
@@ -6,6 +6,9 @@ from os.path import exists, join
 from six.moves import StringIO
 from functools import partial

+from twisted.python.filepath import (
+    FilePath,
+)
 from twisted.internet.defer import Deferred, succeed
 from twisted.internet.protocol import ProcessProtocol
 from twisted.internet.error import ProcessExitedAlready, ProcessDone
@@ -257,8 +260,13 @@ def _create_node(reactor, request, temp_dir, introducer_furl, flog_gatherer, nam
     def created(_):
         config_path = join(node_dir, 'tahoe.cfg')
         config = get_config(config_path)
-        set_config(config, 'node', 'log_gatherer.furl', flog_gatherer)
-        write_config(config_path, config)
+        set_config(
+            config,
+            u'node',
+            u'log_gatherer.furl',
+            flog_gatherer.decode("utf-8"),
+        )
+        write_config(FilePath(config_path), config)
     created_d.addCallback(created)

     d = Deferred()
newsfragments/1549.installation (new file):

    Tahoe-LAFS now requires Twisted 19.10.0 or newer. As a result, it now has a transitive dependency on bcrypt.

newsfragments/3477.minor (new empty file)

newsfragments/3478.minor (new file containing a single blank line)

newsfragments/3497.installation (new file):

    The Tahoe-LAFS project no longer commits to maintaining binary packages for all dependencies at <https://tahoe-lafs.org/deps>. Please use PyPI instead.

newsfragments/3502.minor (new empty file)

newsfragments/3503.other (new file):

    The specification section of the Tahoe-LAFS documentation now includes explicit discussion of the security properties of Foolscap "fURLs" on which it depends.

newsfragments/3509.bugfix (new file):

    Fix regression that broke flogtool results on Python 2.

newsfragments/3510.bugfix (new file):

    Fix a logging regression on Python 2 involving unicode strings.

newsfragments/3511.minor (new empty file)

newsfragments/3513.minor (new empty file)

newsfragments/3517.minor (new empty file)

newsfragments/3518.removed (new file):

    Announcements delivered through the introducer system are no longer automatically annotated with copious information about the Tahoe-LAFS software version nor the versions of its dependencies.

newsfragments/3537.minor (new empty file)

newsfragments/3542.minor (new empty file)
@@ -15,6 +15,9 @@ self: super: {
       # Need version of pyutil that supports Python 3. The version in 19.09
       # is too old.
       pyutil = python-super.callPackage ./pyutil.nix { };
+
+      # Need a newer version of Twisted, too.
+      twisted = python-super.callPackage ./twisted.nix { };
     };
   };
 }
nix/twisted.nix (new file, 63 lines)

@@ -0,0 +1,63 @@
{ stdenv
, buildPythonPackage
, fetchPypi
, python
, zope_interface
, incremental
, automat
, constantly
, hyperlink
, pyhamcrest
, attrs
, pyopenssl
, service-identity
, setuptools
, idna
, bcrypt
}:
buildPythonPackage rec {
  pname = "Twisted";
  version = "19.10.0";

  src = fetchPypi {
    inherit pname version;
    extension = "tar.bz2";
    sha256 = "7394ba7f272ae722a74f3d969dcf599bc4ef093bc392038748a490f1724a515d";
  };

  propagatedBuildInputs = [ zope_interface incremental automat constantly hyperlink pyhamcrest attrs setuptools bcrypt ];

  passthru.extras.tls = [ pyopenssl service-identity idna ];

  # Patch t.p._inotify to point to libc. Without this,
  # twisted.python.runtime.platform.supportsINotify() == False
  patchPhase = stdenv.lib.optionalString stdenv.isLinux ''
    substituteInPlace src/twisted/python/_inotify.py --replace \
      "ctypes.util.find_library('c')" "'${stdenv.glibc.out}/lib/libc.so.6'"
  '';

  # Generate Twisted's plug-in cache. Twisted users must do it as well. See
  # http://twistedmatrix.com/documents/current/core/howto/plugin.html#auto3
  # and http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=477103 for
  # details.
  postFixup = ''
    $out/bin/twistd --help > /dev/null
  '';

  checkPhase = ''
    ${python.interpreter} -m unittest discover -s twisted/test
  '';
  # Tests require network
  doCheck = false;

  meta = with stdenv.lib; {
    homepage = https://twistedmatrix.com/;
    description = "Twisted, an event-driven networking engine written in Python";
    longDescription = ''
      Twisted is an event-driven networking engine written in Python
      and licensed under the MIT license.
    '';
    license = licenses.mit;
    maintainers = [ ];
  };
}
setup.py (4 lines changed)

@@ -98,7 +98,9 @@ install_requires = [
     # `pip install tahoe-lafs[sftp]` would not install requirements
     # specified by Twisted[conch]. Since this would be the *whole point* of
     # an sftp extra in Tahoe-LAFS, there is no point in having one.
-    "Twisted[tls,conch] >= 18.4.0",
+    # * Twisted 19.10 introduces Site.getContentFile which we use to get
+    #   temporary upload files placed into a per-node temporary directory.
+    "Twisted[tls,conch] >= 19.10.0",

     "PyYAML >= 3.11",
@@ -32,6 +32,7 @@ from allmydata.introducer.client import IntroducerClient
 from allmydata.util import (
     hashutil, base32, pollmixin, log, idlib,
     yamlutil, configutil,
+    fileutil,
 )
 from allmydata.util.encodingutil import get_filesystem_encoding
 from allmydata.util.abbreviate import parse_abbreviated_size
@@ -472,7 +473,6 @@ def create_introducer_clients(config, main_tub, _introducer_factory=None):
         config.nickname,
         str(allmydata.__full_version__),
         str(_Client.OLDEST_SUPPORTED_VERSION),
-        list(node.get_app_versions()),
         partial(_sequencer, config),
         cache_path,
     )
@@ -1003,6 +1003,21 @@ class _Client(node.Node, pollmixin.PollMixin):
     def set_default_mutable_keysize(self, keysize):
         self._key_generator.set_default_keysize(keysize)

+    def _get_tempdir(self):
+        """
+        Determine the path to the directory where temporary files for this node
+        should be written.
+
+        :return bytes: The path which will exist and be a directory.
+        """
+        tempdir_config = self.config.get_config("node", "tempdir", "tmp")
+        if isinstance(tempdir_config, bytes):
+            tempdir_config = tempdir_config.decode('utf-8')
+        tempdir = self.config.get_config_path(tempdir_config)
+        if not os.path.exists(tempdir):
+            fileutil.make_dirs(tempdir)
+        return tempdir
+
     def init_web(self, webport):
         self.log("init_web(webport=%s)", args=(webport,))

@@ -1010,7 +1025,13 @@ class _Client(node.Node, pollmixin.PollMixin):
         nodeurl_path = self.config.get_config_path("node.url")
         staticdir_config = self.config.get_config("node", "web.static", "public_html")
         staticdir = self.config.get_config_path(staticdir_config)
-        ws = WebishServer(self, webport, nodeurl_path, staticdir)
+        ws = WebishServer(
+            self,
+            webport,
+            self._get_tempdir(),
+            nodeurl_path,
+            staticdir,
+        )
         ws.setServiceParent(self)

     def init_ftp_server(self):
@@ -24,7 +24,7 @@ class IntroducerClient(service.Service, Referenceable):

     def __init__(self, tub, introducer_furl,
                  nickname, my_version, oldest_supported,
-                 app_versions, sequencer, cache_filepath):
+                 sequencer, cache_filepath):
         self._tub = tub
         self.introducer_furl = introducer_furl

@@ -32,13 +32,12 @@ class IntroducerClient(service.Service, Referenceable):
         self._nickname = nickname
         self._my_version = my_version
         self._oldest_supported = oldest_supported
-        self._app_versions = app_versions
         self._sequencer = sequencer
         self._cache_filepath = cache_filepath

         self._my_subscriber_info = { "version": 0,
                                      "nickname": self._nickname,
-                                     "app-versions": self._app_versions,
+                                     "app-versions": [],
                                      "my-version": self._my_version,
                                      "oldest-supported": self._oldest_supported,
                                      }
@@ -190,7 +189,7 @@ class IntroducerClient(service.Service, Referenceable):
         # "seqnum" and "nonce" will be populated with new values in
         # publish(), each time we make a change
         "nickname": self._nickname,
-        "app-versions": self._app_versions,
+        "app-versions": [],
         "my-version": self._my_version,
         "oldest-supported": self._oldest_supported,
@@ -1,3 +1,15 @@
+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    # Omit dict so Python 3 changes don't leak into API callers on Python 2.
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, list, object, range, str, max, min  # noqa: F401
 from past.utils import old_div

 import struct
@@ -1744,7 +1756,7 @@ class MDMFSlotReadProxy(object):


     def _read(self, readvs, force_remote=False):
-        unsatisfiable = list(filter(lambda x: x[0] + x[1] > len(self._data), readvs))
+        unsatisfiable = [x for x in readvs if x[0] + x[1] > len(self._data)]
         # TODO: It's entirely possible to tweak this so that it just
         # fulfills the requests that it can, and not demand that all
         # requests are satisfiable before running it.
@@ -1,3 +1,14 @@
+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min  # noqa: F401

 from zope.interface import implementer
 from twisted.internet import defer
@@ -1,4 +1,15 @@
-from past.builtins import unicode
+"""
+Ported to Python 3.
+"""
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import print_function
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    # Don't import bytes and str, to prevent API leakage
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, dict, list, object, range, max, min  # noqa: F401

 import time

@@ -749,9 +760,9 @@ class Retrieve(object):

         blockhashes = dict(enumerate(blockhashes))
         self.log("the reader gave me the following blockhashes: %s" % \
-                 blockhashes.keys())
+                 list(blockhashes.keys()))
         self.log("the reader gave me the following sharehashes: %s" % \
-                 sharehashes.keys())
+                 list(sharehashes.keys()))
         bht = self._block_hash_trees[reader.shnum]

         if bht.needed_hashes(segnum, include_leaf=True):
@@ -908,7 +919,7 @@ class Retrieve(object):


     def notify_server_corruption(self, server, shnum, reason):
-        if isinstance(reason, unicode):
+        if isinstance(reason, str):
             reason = reason.encode("utf-8")
         storage_server = server.get_storage_server()
         storage_server.advise_corrupt_share(
@@ -1,5 +1,15 @@
+"""
+Ported to Python 3.
+"""
 from __future__ import print_function
+from __future__ import absolute_import
+from __future__ import division
+from __future__ import unicode_literals
+
+from future.utils import PY2
+if PY2:
+    # Doesn't import str to prevent API leakage on Python 2
+    from future.builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, max, min  # noqa: F401
 from past.builtins import unicode

 import sys, time, copy
@@ -188,7 +198,7 @@ class ServerMap(object):
     def dump(self, out=sys.stdout):
         print("servermap:", file=out)

-        for ( (server, shnum), (verinfo, timestamp) ) in self._known_shares.items():
+        for ( (server, shnum), (verinfo, timestamp) ) in list(self._known_shares.items()):
             (seqnum, root_hash, IV, segsize, datalength, k, N, prefix,
              offsets_tuple) = verinfo
             print("[%s]: sh#%d seq%d-%s %d-of-%d len%d" %
@@ -226,7 +236,7 @@ class ServerMap(object):
         """Return a dict that maps versionid to sets of (shnum, server,
         timestamp) tuples."""
         versionmap = DictOfSets()
-        for ( (server, shnum), (verinfo, timestamp) ) in self._known_shares.items():
+        for ( (server, shnum), (verinfo, timestamp) ) in list(self._known_shares.items()):
             versionmap.add(verinfo, (shnum, server, timestamp))
         return versionmap

@@ -245,7 +255,7 @@ class ServerMap(object):
         (num_distinct_shares, k, N) tuples."""
         versionmap = self.make_versionmap()
         all_shares = {}
-        for verinfo, shares in versionmap.items():
+        for verinfo, shares in list(versionmap.items()):
             s = set()
             for (shnum, server, timestamp) in shares:
                 s.add(shnum)
@@ -271,7 +281,7 @@ class ServerMap(object):
         """Return a string describing which versions we know about."""
         versionmap = self.make_versionmap()
         bits = []
-        for (verinfo, shares) in versionmap.items():
+        for (verinfo, shares) in list(versionmap.items()):
             vstr = self.summarize_version(verinfo)
             shnums = set([shnum for (shnum, server, timestamp) in shares])
             bits.append("%d*%s" % (len(shnums), vstr))
@@ -282,7 +292,7 @@ class ServerMap(object):
         recoverable."""
         versionmap = self.make_versionmap()
         recoverable_versions = set()
-        for (verinfo, shares) in versionmap.items():
+        for (verinfo, shares) in list(versionmap.items()):
             (seqnum, root_hash, IV, segsize, datalength, k, N, prefix,
              offsets_tuple) = verinfo
             shnums = set([shnum for (shnum, server, timestamp) in shares])
@@ -298,7 +308,7 @@ class ServerMap(object):
         versionmap = self.make_versionmap()

         unrecoverable_versions = set()
-        for (verinfo, shares) in versionmap.items():
+        for (verinfo, shares) in list(versionmap.items()):
             (seqnum, root_hash, IV, segsize, datalength, k, N, prefix,
              offsets_tuple) = verinfo
             shnums = set([shnum for (shnum, server, timestamp) in shares])
@@ -332,7 +342,7 @@ class ServerMap(object):
         healths = {} # maps verinfo to (found,k)
         unrecoverable = set()
         highest_recoverable_seqnum = -1
-        for (verinfo, shares) in versionmap.items():
+        for (verinfo, shares) in list(versionmap.items()):
             (seqnum, root_hash, IV, segsize, datalength, k, N, prefix,
              offsets_tuple) = verinfo
             shnums = set([shnum for (shnum, server, timestamp) in shares])
@@ -667,7 +677,7 @@ class ServermapUpdater(object):

         ds = []

-        for shnum,datav in datavs.items():
+        for shnum,datav in list(datavs.items()):
             data = datav[0]
             reader = MDMFSlotReadProxy(ss,
                                        storage_index,
@ -19,24 +19,27 @@ import os.path
|
||||
import re
|
||||
import types
|
||||
import errno
|
||||
import tempfile
|
||||
from base64 import b32decode, b32encode
|
||||
from errno import ENOENT, EPERM
|
||||
from warnings import warn
|
||||
|
||||
import attr
|
||||
|
||||
# On Python 2 this will be the backported package.
|
||||
import configparser
|
||||
|
||||
from twisted.python.filepath import FilePath
|
||||
from twisted.python.filepath import (
|
||||
FilePath,
|
||||
)
|
||||
from twisted.python import log as twlog
|
||||
from twisted.application import service
|
||||
from twisted.python.failure import Failure
|
||||
from foolscap.api import Tub, app_versions
|
||||
from foolscap.api import Tub
|
||||
|
||||
import foolscap.logging.log
|
||||
from allmydata.version_checks import get_package_versions, get_package_versions_string
|
||||
|
||||
from allmydata.util import log
|
||||
from allmydata.util import fileutil, iputil
|
||||
from allmydata.util.assertutil import _assert
|
||||
from allmydata.util.fileutil import abspath_expanduser_unicode
|
||||
from allmydata.util.encodingutil import get_filesystem_encoding, quote_output
|
||||
from allmydata.util import configutil
|
||||
@ -44,6 +47,10 @@ from allmydata.util.yamlutil import (
|
||||
safe_load,
|
||||
)
|
||||
|
||||
from . import (
|
||||
__full_version__,
|
||||
)
|
||||
|
||||
def _common_valid_config():
|
||||
return configutil.ValidConfiguration({
|
||||
"connections": (
|
||||
@ -84,11 +91,6 @@ def _common_valid_config():
|
||||
),
|
||||
})
|
||||
|
||||
# Add our application versions to the data that Foolscap's LogPublisher
|
||||
# reports.
|
||||
for thing, things_version in list(get_package_versions().items()):
|
||||
app_versions.add_version(thing, things_version)
|
||||
|
||||
# group 1 will be addr (dotted quad string), group 3 if any will be portnum (string)
|
||||
ADDR_RE = re.compile("^([1-9][0-9]*\.[1-9][0-9]*\.[1-9][0-9]*\.[1-9][0-9]*)(:([1-9][0-9]*))?$")
|
||||
|
||||
@ -198,25 +200,27 @@ def read_config(basedir, portnumfile, generated_files=[], _valid_config=None):
|
||||
# canonicalize the portnum file
|
||||
portnumfile = os.path.join(basedir, portnumfile)
|
||||
|
||||
# (try to) read the main config file
|
||||
config_fname = os.path.join(basedir, "tahoe.cfg")
|
||||
config_path = FilePath(basedir).child("tahoe.cfg")
|
||||
try:
|
||||
parser = configutil.get_config(config_fname)
|
||||
config_str = config_path.getContent()
|
||||
except EnvironmentError as e:
|
||||
if e.errno != errno.ENOENT:
|
||||
raise
|
||||
# The file is missing, just create empty ConfigParser.
|
||||
parser = configutil.get_config_from_string(u"")
|
||||
config_str = u""
|
||||
else:
|
||||
config_str = config_str.decode("utf-8-sig")
|
||||
|
||||
configutil.validate_config(config_fname, parser, _valid_config)
|
||||
|
||||
# make sure we have a private configuration area
|
||||
fileutil.make_dirs(os.path.join(basedir, "private"), 0o700)
|
||||
|
||||
return _Config(parser, portnumfile, basedir, config_fname)
|
||||
return config_from_string(
|
||||
basedir,
|
||||
portnumfile,
|
||||
config_str,
|
||||
_valid_config,
|
||||
config_path,
|
||||
)
|
||||
|
||||
|
||||
def config_from_string(basedir, portnumfile, config_str, _valid_config=None):
|
||||
def config_from_string(basedir, portnumfile, config_str, _valid_config=None, fpath=None):
|
||||
"""
|
||||
load and validate configuration from in-memory string
|
||||
"""
|
||||
@ -229,16 +233,19 @@ def config_from_string(basedir, portnumfile, config_str, _valid_config=None):
|
||||
# load configuration from in-memory string
|
||||
parser = configutil.get_config_from_string(config_str)
|
||||
|
||||
fname = "<in-memory>"
|
||||
configutil.validate_config(fname, parser, _valid_config)
|
||||
return _Config(parser, portnumfile, basedir, fname)
|
||||
configutil.validate_config(
|
||||
"<string>" if fpath is None else fpath.path,
|
||||
parser,
|
||||
_valid_config,
|
||||
)
|
||||
|
||||
|
||||
def get_app_versions():
|
||||
"""
|
||||
:returns: dict of versions important to Foolscap
|
||||
"""
|
||||
return dict(app_versions.versions)
|
||||
return _Config(
|
||||
parser,
|
||||
portnumfile,
|
||||
basedir,
|
||||
fpath,
|
||||
_valid_config,
|
||||
)
|
||||
|
||||
|
||||
def _error_about_old_config_files(basedir, generated_files):
|
||||
@ -266,6 +273,7 @@ def _error_about_old_config_files(basedir, generated_files):
|
||||
raise e
|
||||
|
||||
|
||||
@attr.s
|
||||
class _Config(object):
|
||||
"""
|
||||
Manages configuration of a Tahoe 'node directory'.
|
||||
@ -274,30 +282,47 @@ class _Config(object):
|
||||
class; names and funtionality have been kept the same while moving
|
||||
the code. It probably makes sense for several of these APIs to
|
||||
have better names.
|
||||
|
||||
:ivar ConfigParser config: The actual configuration values.
|
||||
|
||||
:ivar str portnum_fname: filename to use for the port-number file (a
|
||||
relative path inside basedir).
|
||||
|
||||
:ivar str _basedir: path to our "node directory", inside which all
|
||||
configuration is managed.
|
||||
|
||||
:ivar (FilePath|NoneType) config_path: The path actually used to create
|
||||
the configparser (might be ``None`` if using in-memory data).
|
||||
|
||||
:ivar ValidConfiguration valid_config_sections: The validator for the
|
||||
values in this configuration.
|
||||
"""
|
||||
config = attr.ib(validator=attr.validators.instance_of(configparser.ConfigParser))
|
||||
portnum_fname = attr.ib()
|
||||
_basedir = attr.ib(
|
||||
converter=lambda basedir: abspath_expanduser_unicode(ensure_text(basedir)),
|
||||
)
|
||||
config_path = attr.ib(
|
||||
validator=attr.validators.optional(
|
||||
attr.validators.instance_of(FilePath),
|
||||
),
|
||||
)
|
||||
valid_config_sections = attr.ib(
|
||||
default=configutil.ValidConfiguration.everything(),
|
||||
validator=attr.validators.instance_of(configutil.ValidConfiguration),
|
||||
)
|
||||
|
||||
def __init__(self, configparser, portnum_fname, basedir, config_fname):
|
||||
"""
|
||||
:param configparser: a ConfigParser instance
|
||||
@property
|
||||
def nickname(self):
|
||||
nickname = self.get_config("node", "nickname", u"<unspecified>")
|
||||
assert isinstance(nickname, str)
|
||||
return nickname
|
||||
|
||||
:param portnum_fname: filename to use for the port-number file
|
||||
(a relative path inside basedir)
|
||||
|
||||
:param basedir: path to our "node directory", inside which all
|
||||
configuration is managed
|
||||
|
||||
:param config_fname: the pathname actually used to create the
|
||||
configparser (might be 'fake' if using in-memory data)
|
||||
"""
|
||||
self.portnum_fname = portnum_fname
|
||||
self._basedir = abspath_expanduser_unicode(ensure_text(basedir))
|
||||
self._config_fname = config_fname
|
||||
self.config = configparser
|
||||
self.nickname = self.get_config("node", "nickname", u"<unspecified>")
|
||||
assert isinstance(self.nickname, str)
|
||||
|
||||
def validate(self, valid_config_sections):
|
||||
configutil.validate_config(self._config_fname, self.config, valid_config_sections)
|
||||
@property
|
||||
def _config_fname(self):
|
||||
if self.config_path is None:
|
||||
return "<string>"
|
||||
return self.config_path.path
|
||||
|
||||
def write_config_file(self, name, value, mode="w"):
|
||||
"""
|
||||
@ -342,6 +367,34 @@ class _Config(object):
|
||||
)
|
||||
return default
|
||||
|
||||
def set_config(self, section, option, value):
|
||||
"""
|
||||
Set a config option in a section and re-write the tahoe.cfg file
|
||||
|
||||
:param str section: The name of the section in which to set the
|
||||
option.
|
||||
|
||||
:param str option: The name of the option to set.
|
||||
|
||||
:param str value: The value of the option.
|
||||
|
||||
:raise UnescapedHashError: If the option holds a fURL and there is a
|
||||
``#`` in the value.
|
||||
"""
|
||||
if option.endswith(".furl") and "#" in value:
|
||||
raise UnescapedHashError(section, option, value)
|
||||
|
||||
copied_config = configutil.copy_config(self.config)
|
||||
configutil.set_config(copied_config, section, option, value)
|
||||
configutil.validate_config(
|
||||
self._config_fname,
|
||||
copied_config,
|
||||
self.valid_config_sections,
|
||||
)
|
||||
if self.config_path is not None:
|
||||
configutil.write_config(self.config_path, copied_config)
|
||||
self.config = copied_config

    def get_config_from_file(self, name, required=False):
        """Get the (string) contents of a config file, or None if the file
        did not exist. If required=True, raise an exception rather than
@@ -837,8 +890,6 @@ class Node(service.MultiService):
        self._i2p_provider = i2p_provider
        self._tor_provider = tor_provider

        self.init_tempdir()

        self.create_log_tub()
        self.logSource = "Node"
        self.setup_logging()
@@ -856,7 +907,7 @@ class Node(service.MultiService):
        if self.control_tub is not None:
            self.control_tub.setServiceParent(self)

        self.log("Node constructed. " + get_package_versions_string())
        self.log("Node constructed. " + __full_version__)
        iputil.increase_rlimits()

    def _is_tub_listening(self):
@@ -865,25 +916,6 @@ class Node(service.MultiService):
        """
        return len(self.tub.getListeners()) > 0

    def init_tempdir(self):
        """
        Initialize/create a directory for temporary files.
        """
        tempdir_config = self.config.get_config("node", "tempdir", "tmp")
        if isinstance(tempdir_config, bytes):
            tempdir_config = tempdir_config.decode('utf-8')
        tempdir = self.config.get_config_path(tempdir_config)
        if not os.path.exists(tempdir):
            fileutil.make_dirs(tempdir)
        tempfile.tempdir = tempdir
        # this should cause twisted.web.http (which uses
        # tempfile.TemporaryFile) to put large request bodies in the given
        # directory. Without this, the default temp dir is usually /tmp/,
        # which is frequently too small.
        temp_fd, test_name = tempfile.mkstemp()
        _assert(os.path.dirname(test_name) == tempdir, test_name, tempdir)
        os.close(temp_fd) # avoid leak of unneeded fd
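Assigning ``tempfile.tempdir`` redirects every subsequent stdlib temp-file helper in the process, which is exactly the self-check ``init_tempdir`` performs above. The effect in isolation (path hypothetical)::

    import os
    import tempfile

    tempfile.tempdir = "/var/lib/tahoe/tmp"   # directory must already exist
    fd, name = tempfile.mkstemp()             # created under the new default
    assert os.path.dirname(name) == "/var/lib/tahoe/tmp"
    os.close(fd)
    os.unlink(name)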

    # pull this outside of Node's __init__ too, see:
    # https://tahoe-lafs.org/trac/tahoe-lafs/ticket/2948
    def create_log_tub(self):

@@ -7,7 +7,6 @@ import six
from twisted.python import usage
from twisted.internet import defer, task, threads

from allmydata.version_checks import get_package_versions_string
from allmydata.scripts.common import get_default_nodedir
from allmydata.scripts import debug, create_node, cli, \
    stats_gatherer, admin, tahoe_daemonize, tahoe_start, \
@@ -19,6 +18,10 @@ from allmydata.util.eliotutil import (
    eliot_logging_service,
)

from .. import (
    __full_version__,
)

_default_nodedir = get_default_nodedir()

NODEDIR_HELP = ("Specify which Tahoe node directory should be used. The "
@@ -77,12 +80,10 @@ class Options(usage.Options):
    ]

    def opt_version(self):
        print(get_package_versions_string(debug=True), file=self.stdout)
        print(__full_version__, file=self.stdout)
        self.no_command_needed = True

    def opt_version_and_path(self):
        print(get_package_versions_string(show_paths=True, debug=True), file=self.stdout)
        self.no_command_needed = True
    opt_version_and_path = opt_version

    opt_eliot_destination = opt_eliot_destination
    opt_help_eliot_destinations = opt_help_eliot_destinations

@@ -1266,7 +1266,7 @@ class Options(ReallyEqualMixin, unittest.TestCase):
        # "tahoe --version" dumps text to stdout and exits
        stdout = StringIO()
        self.failUnlessRaises(SystemExit, self.parse, ["--version"], stdout)
        self.failUnlessIn(allmydata.__appname__ + ":", stdout.getvalue())
        self.failUnlessIn(allmydata.__full_version__, stdout.getvalue())
        # but "tahoe SUBCOMMAND --version" should be rejected
        self.failUnlessRaises(usage.UsageError, self.parse,
                              ["start", "--version"])

@@ -52,13 +52,8 @@ class Config(unittest.TestCase):
            create_node.write_node_config(f, opts)
            create_node.write_client_config(f, opts)

        config = configutil.get_config(fname)
        # should succeed, no exceptions
        configutil.validate_config(
            fname,
            config,
            client._valid_config(),
        )
        client.read_config(d, "")

    @defer.inlineCallbacks
    def test_client(self):

@@ -113,7 +113,6 @@ class MemoryIntroducerClient(object):
    nickname = attr.ib()
    my_version = attr.ib()
    oldest_supported = attr.ib()
    app_versions = attr.ib()
    sequencer = attr.ib()
    cache_filepath = attr.ib()

@@ -1157,8 +1156,9 @@ class _TestCaseMixin(object):
      test (including setUp and tearDown messages).
    * trial-compatible mktemp method
    * unittest2-compatible assertRaises helper
    * Automatic cleanup of tempfile.tempdir mutation (pervasive through the
      Tahoe-LAFS test suite).
    * Automatic cleanup of tempfile.tempdir mutation (once pervasive through
      the Tahoe-LAFS test suite, perhaps gone now but someone should verify
      this).
    """
    def setUp(self):
        # Restore the original temporary directory.  Node ``init_tempdir``

@@ -357,7 +357,7 @@ class NoNetworkGrid(service.MultiService):
        to complete properly
        """
        if self._setup_errors:
            raise self._setup_errors[0].value
            self._setup_errors[0].raiseException()

    @defer.inlineCallbacks
    def make_client(self, i, write_config=True):

@@ -41,9 +41,6 @@ import allmydata.util.log

from allmydata.node import OldConfigError, UnescapedHashError, create_node_dir
from allmydata.frontends.auth import NeedRootcapLookupScheme
from allmydata.version_checks import (
    get_package_versions_string,
)
from allmydata import client
from allmydata.storage_client import (
    StorageClientConfig,
@@ -618,8 +615,6 @@ class Basic(testutil.ReallyEqualMixin, unittest.TestCase):
        self.failIfEqual(str(allmydata.__version__), "unknown")
        self.failUnless("." in str(allmydata.__full_version__),
                        "non-numeric version in '%s'" % allmydata.__version__)
        all_versions = get_package_versions_string()
        self.failUnless(allmydata.__appname__ in all_versions)
        # also test stats
        stats = c.get_stats()
        self.failUnless("node.uptime" in stats)

@@ -14,12 +14,89 @@ if PY2:
    from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, list, object, range, str, max, min # noqa: F401

import os.path
from configparser import (
    ConfigParser,
)
from functools import (
    partial,
)

from hypothesis import (
    given,
)
from hypothesis.strategies import (
    dictionaries,
    text,
    characters,
)

from twisted.python.filepath import (
    FilePath,
)
from twisted.trial import unittest

from allmydata.util import configutil


def arbitrary_config_dicts(
        min_sections=0,
        max_sections=3,
        max_section_name_size=8,
        max_items_per_section=3,
        max_item_length=8,
        max_value_length=8,
):
    """
    Build ``dict[str, dict[str, str]]`` instances populated with arbitrary
    configurations.
    """
    identifier_text = partial(
        text,
        # Don't allow most control characters or spaces
        alphabet=characters(
            blacklist_categories=('Cc', 'Cs', 'Zs'),
        ),
    )
    return dictionaries(
        identifier_text(
            min_size=1,
            max_size=max_section_name_size,
        ),
        dictionaries(
            identifier_text(
                min_size=1,
                max_size=max_item_length,
            ),
            text(max_size=max_value_length),
            max_size=max_items_per_section,
        ),
        min_size=min_sections,
        max_size=max_sections,
    )
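``arbitrary_config_dicts`` returns a Hypothesis *strategy*, not data: each test invocation draws a fresh nested dict within the size bounds above. A sketch of consuming it together with ``to_configparser`` (defined just below); the property body is illustrative::

    from hypothesis import given

    @given(arbitrary_config_dicts(min_sections=1))
    def test_sections_survive_round_trip(cfgdict):
        cp = to_configparser(cfgdict)
        assert set(cp.sections()) == set(cfgdict.keys())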


def to_configparser(dictconfig):
    """
    Take a ``dict[str, dict[str, str]]`` and turn it into the corresponding
    populated ``ConfigParser`` instance.
    """
    cp = ConfigParser()
    for section, items in dictconfig.items():
        cp.add_section(section)
        for k, v in items.items():
            cp.set(
                section,
                k,
                # ConfigParser has a feature that everyone knows and loves
                # where it will use %-style interpolation to substitute
                # values from one part of the config into another part of
                # the config.  Escape all our `%`s to avoid hitting this
                # and complicating things.
                v.replace("%", "%%"),
            )
    return cp
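The ``%`` doubling matters because ``ConfigParser``'s default ``BasicInterpolation`` rejects a bare ``%`` that does not introduce a ``%(name)s`` reference. The failure mode being avoided, under Python 3 semantics::

    from configparser import ConfigParser

    cp = ConfigParser()
    cp.add_section("node")
    try:
        cp.set("node", "nickname", "100%")       # bare "%" is invalid syntax
    except ValueError:
        pass
    cp.set("node", "nickname", "100%%")          # escaped form is accepted
    assert cp.get("node", "nickname") == "100%"  # and reads back un-escaped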


class ConfigUtilTests(unittest.TestCase):
    def setUp(self):
        super(ConfigUtilTests, self).setUp()
@@ -55,7 +132,7 @@ enabled = false

        # test that set_config can mutate an existing option
        configutil.set_config(config, "node", "nickname", "Alice!")
        configutil.write_config(tahoe_cfg, config)
        configutil.write_config(FilePath(tahoe_cfg), config)

        config = configutil.get_config(tahoe_cfg)
        self.failUnlessEqual(config.get("node", "nickname"), "Alice!")
@@ -63,19 +140,21 @@ enabled = false
        # test that set_config can set a new option
        descriptor = "Twas brillig, and the slithy toves Did gyre and gimble in the wabe"
        configutil.set_config(config, "node", "descriptor", descriptor)
        configutil.write_config(tahoe_cfg, config)
        configutil.write_config(FilePath(tahoe_cfg), config)

        config = configutil.get_config(tahoe_cfg)
        self.failUnlessEqual(config.get("node", "descriptor"), descriptor)

    def test_config_validation_success(self):
        fname = self.create_tahoe_cfg('[node]\nvalid = foo\n')

        config = configutil.get_config(fname)
        """
        ``configutil.validate_config`` returns ``None`` when the configuration it
        is given has nothing more than the static sections and items defined
        by the validator.
        """
        # should succeed, no exceptions
        configutil.validate_config(
            fname,
            config,
            "<test_config_validation_success>",
            to_configparser({"node": {"valid": "foo"}}),
            self.static_valid_config,
        )

@@ -85,24 +164,20 @@ enabled = false
        validation but are matched by the dynamic validation is considered
        valid.
        """
        fname = self.create_tahoe_cfg('[node]\nvalid = foo\n')

        config = configutil.get_config(fname)
        # should succeed, no exceptions
        configutil.validate_config(
            fname,
            config,
            "<test_config_dynamic_validation_success>",
            to_configparser({"node": {"valid": "foo"}}),
            self.dynamic_valid_config,
        )

    def test_config_validation_invalid_item(self):
        fname = self.create_tahoe_cfg('[node]\nvalid = foo\ninvalid = foo\n')

        config = configutil.get_config(fname)
        config = to_configparser({"node": {"valid": "foo", "invalid": "foo"}})
        e = self.assertRaises(
            configutil.UnknownConfigError,
            configutil.validate_config,
            fname, config,
            "<test_config_validation_invalid_item>",
            config,
            self.static_valid_config,
        )
        self.assertIn("section [node] contains unknown option 'invalid'", str(e))
@@ -112,13 +187,12 @@ enabled = false
        A configuration with a section that is matched by neither the static nor
        dynamic validators is rejected.
        """
        fname = self.create_tahoe_cfg('[node]\nvalid = foo\n[invalid]\n')

        config = configutil.get_config(fname)
        config = to_configparser({"node": {"valid": "foo"}, "invalid": {}})
        e = self.assertRaises(
            configutil.UnknownConfigError,
            configutil.validate_config,
            fname, config,
            "<test_config_validation_invalid_section>",
            config,
            self.static_valid_config,
        )
        self.assertIn("contains unknown section [invalid]", str(e))
@@ -128,13 +202,12 @@ enabled = false
        A configuration with a section that is matched by neither the static nor
        dynamic validators is rejected.
        """
        fname = self.create_tahoe_cfg('[node]\nvalid = foo\n[invalid]\n')

        config = configutil.get_config(fname)
        config = to_configparser({"node": {"valid": "foo"}, "invalid": {}})
        e = self.assertRaises(
            configutil.UnknownConfigError,
            configutil.validate_config,
            fname, config,
            "<test_config_dynamic_validation_invalid_section>",
            config,
            self.dynamic_valid_config,
        )
        self.assertIn("contains unknown section [invalid]", str(e))
@@ -144,13 +217,12 @@ enabled = false
        A configuration with a section, item pair that is matched by neither the
        static nor dynamic validators is rejected.
        """
        fname = self.create_tahoe_cfg('[node]\nvalid = foo\ninvalid = foo\n')

        config = configutil.get_config(fname)
        config = to_configparser({"node": {"valid": "foo", "invalid": "foo"}})
        e = self.assertRaises(
            configutil.UnknownConfigError,
            configutil.validate_config,
            fname, config,
            "<test_config_dynamic_validation_invalid_item>",
            config,
            self.dynamic_valid_config,
        )
        self.assertIn("section [node] contains unknown option 'invalid'", str(e))
@@ -163,3 +235,61 @@ enabled = false
        config = configutil.get_config(fname)
        self.assertEqual(config.get("node", "a"), "foo")
        self.assertEqual(config.get("node", "b"), "bar")

    @given(arbitrary_config_dicts())
    def test_everything_valid(self, cfgdict):
        """
        ``validate_config`` returns ``None`` when the validator is
        ``ValidConfiguration.everything()``.
        """
        cfg = to_configparser(cfgdict)
        self.assertIs(
            configutil.validate_config(
                "<test_everything_valid>",
                cfg,
                configutil.ValidConfiguration.everything(),
            ),
            None,
        )

    @given(arbitrary_config_dicts(min_sections=1))
    def test_nothing_valid(self, cfgdict):
        """
        ``validate_config`` raises ``UnknownConfigError`` when the validator is
        ``ValidConfiguration.nothing()`` for all non-empty configurations.
        """
        cfg = to_configparser(cfgdict)
        with self.assertRaises(configutil.UnknownConfigError):
            configutil.validate_config(
                "<test_everything_valid>",
                cfg,
                configutil.ValidConfiguration.nothing(),
            )

    def test_nothing_empty_valid(self):
        """
        ``validate_config`` returns ``None`` when the validator is
        ``ValidConfiguration.nothing()`` if the configuration is empty.
        """
        cfg = ConfigParser()
        self.assertIs(
            configutil.validate_config(
                "<test_everything_valid>",
                cfg,
                configutil.ValidConfiguration.nothing(),
            ),
            None,
        )

    @given(arbitrary_config_dicts())
    def test_copy_config(self, cfgdict):
        """
        ``copy_config`` creates a new ``ConfigParser`` object containing the same
        values as its input.
        """
        cfg = to_configparser(cfgdict)
        copied = configutil.copy_config(cfg)
        # Should be equal
        self.assertEqual(cfg, copied)
        # But not because they're the same object.
        self.assertIsNot(cfg, copied)
@@ -158,7 +158,7 @@ class ServiceMixin(object):
class Introducer(ServiceMixin, AsyncTestCase):
    def test_create(self):
        ic = IntroducerClient(None, "introducer.furl", u"my_nickname",
                              "my_version", "oldest_version", {}, fakeseq,
                              "my_version", "oldest_version", fakeseq,
                              FilePath(self.mktemp()))
        self.failUnless(isinstance(ic, IntroducerClient))

@@ -191,13 +191,13 @@ class Client(AsyncTestCase):
    def test_duplicate_receive_v2(self):
        ic1 = IntroducerClient(None,
                               "introducer.furl", u"my_nickname",
                               "ver23", "oldest_version", {}, fakeseq,
                               "ver23", "oldest_version", fakeseq,
                               FilePath(self.mktemp()))
        # we use a second client just to create a different-looking
        # announcement
        ic2 = IntroducerClient(None,
                               "introducer.furl", u"my_nickname",
                               "ver24","oldest_version",{}, fakeseq,
                               "ver24","oldest_version",fakeseq,
                               FilePath(self.mktemp()))
        announcements = []
        def _received(key_s, ann):
@@ -301,7 +301,7 @@ class Server(AsyncTestCase):
        i = IntroducerService()
        ic1 = IntroducerClient(None,
                               "introducer.furl", u"my_nickname",
                               "ver23", "oldest_version", {}, realseq,
                               "ver23", "oldest_version", realseq,
                               FilePath(self.mktemp()))
        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:36106/gydnp"

@@ -399,7 +399,7 @@ class Queue(SystemTestMixin, AsyncTestCase):
        tub2 = Tub()
        tub2.setServiceParent(self.parent)
        c = IntroducerClient(tub2, ifurl,
                             u"nickname", "version", "oldest", {}, fakeseq,
                             u"nickname", "version", "oldest", fakeseq,
                             FilePath(self.mktemp()))
        furl1 = "pb://onug64tu@127.0.0.1:123/short" # base32("short")
        private_key, _ = ed25519.create_signing_keypair()
@@ -480,7 +480,7 @@ class SystemTest(SystemTestMixin, AsyncTestCase):
            c = IntroducerClient(tub, self.introducer_furl,
                                 NICKNAME % str(i),
                                 "version", "oldest",
                                 {"component": "component-v1"}, fakeseq,
                                 fakeseq,
                                 FilePath(self.mktemp()))
            received_announcements[c] = {}
            def got(key_s_or_tubid, ann, announcements):
@@ -740,9 +740,8 @@ class ClientInfo(AsyncTestCase):
    def test_client_v2(self):
        introducer = IntroducerService()
        tub = introducer_furl = None
        app_versions = {"whizzy": "fizzy"}
        client_v2 = IntroducerClient(tub, introducer_furl, NICKNAME % u"v2",
                                     "my_version", "oldest", app_versions,
                                     "my_version", "oldest",
                                     fakeseq, FilePath(self.mktemp()))
        #furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"
        #ann_s = make_ann_t(client_v2, furl1, None, 10)
@@ -754,7 +753,6 @@ class ClientInfo(AsyncTestCase):
        self.failUnlessEqual(len(subs), 1)
        s0 = subs[0]
        self.failUnlessEqual(s0.service_name, "storage")
        self.failUnlessEqual(s0.app_versions, app_versions)
        self.failUnlessEqual(s0.nickname, NICKNAME % u"v2")
        self.failUnlessEqual(s0.version, "my_version")

@@ -763,9 +761,8 @@ class Announcements(AsyncTestCase):
    def test_client_v2_signed(self):
        introducer = IntroducerService()
        tub = introducer_furl = None
        app_versions = {"whizzy": "fizzy"}
        client_v2 = IntroducerClient(tub, introducer_furl, u"nick-v2",
                                     "my_version", "oldest", app_versions,
                                     "my_version", "oldest",
                                     fakeseq, FilePath(self.mktemp()))
        furl1 = "pb://62ubehyunnyhzs7r6vdonnm2hpi52w6y@127.0.0.1:0/swissnum"

@@ -779,7 +776,6 @@ class Announcements(AsyncTestCase):
        self.failUnlessEqual(len(a), 1)
        self.assertThat(a[0].canary, Is(canary0))
        self.failUnlessEqual(a[0].index, ("storage", public_key_str))
        self.failUnlessEqual(a[0].announcement["app-versions"], app_versions)
        self.failUnlessEqual(a[0].nickname, u"nick-v2")
        self.failUnlessEqual(a[0].service_name, "storage")
        self.failUnlessEqual(a[0].version, "my_version")
@@ -863,7 +859,7 @@ class Announcements(AsyncTestCase):
        # test loading
        yield flushEventualQueue()
        ic2 = IntroducerClient(None, "introducer.furl", u"my_nickname",
                               "my_version", "oldest_version", {}, fakeseq,
                               "my_version", "oldest_version", fakeseq,
                               ic._cache_filepath)
        announcements = {}
        def got(key_s, ann):
@@ -960,7 +956,7 @@ class NonV1Server(SystemTestMixin, AsyncTestCase):
        tub.setServiceParent(self.parent)
        listenOnUnused(tub)
        c = IntroducerClient(tub, self.introducer_furl,
                             u"nickname-client", "version", "oldest", {},
                             u"nickname-client", "version", "oldest",
                             fakeseq, FilePath(self.mktemp()))
        announcements = {}
        def got(key_s, ann):
@@ -1033,7 +1029,6 @@ class Signatures(SyncTestCase):
            u"fake_nick",
            "0.0.0",
            "1.2.3",
            {},
            (0, u"i am a nonce"),
            "invalid",
        )
@@ -9,7 +9,7 @@ from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

from future.utils import PY2
from future.utils import PY2, native_str
if PY2:
    from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401

@@ -154,3 +154,17 @@ class Log(unittest.TestCase):
        obj.log("four")
        self.assertEqual([m[2] for m in self.messages],
                         ["grand", "par1", "par2", "msg1", "msg1"])

    def test_native_string_keys(self):
        """Keyword argument keys are all native strings."""
        class LoggingObject17(tahoe_log.PrefixingLogMixin):
            pass

        obj = LoggingObject17()
        # Native string by default:
        obj.log(hello="world")
        # Will be Unicode on Python 2:
        obj.log(**{"my": "message"})
        for message in self.messages:
            for k in message[-1].keys():
                self.assertIsInstance(k, native_str)

@@ -29,6 +29,9 @@ from hypothesis.strategies import (

from unittest import skipIf

from twisted.python.filepath import (
    FilePath,
)
from twisted.trial import unittest
from twisted.internet import defer

@@ -52,7 +55,11 @@ from allmydata import client

from allmydata.util import fileutil, iputil
from allmydata.util.namespace import Namespace
from allmydata.util.configutil import UnknownConfigError
from allmydata.util.configutil import (
    ValidConfiguration,
    UnknownConfigError,
)

from allmydata.util.i2p_provider import create as create_i2p_provider
from allmydata.util.tor_provider import create as create_tor_provider
import allmydata.test.common_util as testutil
@@ -431,6 +438,78 @@ class TestCase(testutil.SignalMixin, unittest.TestCase):
        yield client.create_client(basedir)
        self.failUnless(ns.called)

    def test_set_config_unescaped_furl_hash(self):
        """
        ``_Config.set_config`` raises ``UnescapedHashError`` if the item being set
        is a furl and the value includes ``"#"`` and does not set the value.
        """
        basedir = self.mktemp()
        new_config = config_from_string(basedir, "", "")
        with self.assertRaises(UnescapedHashError):
            new_config.set_config("foo", "bar.furl", "value#1")
        with self.assertRaises(MissingConfigEntry):
            new_config.get_config("foo", "bar.furl")

    def test_set_config_new_section(self):
        """
        ``_Config.set_config`` can be called with the name of a section that does
        not already exist to create that section and set an item in it.
        """
        basedir = self.mktemp()
        new_config = config_from_string(basedir, "", "", ValidConfiguration.everything())
        new_config.set_config("foo", "bar", "value1")
        self.assertEqual(
            new_config.get_config("foo", "bar"),
            "value1"
        )

    def test_set_config_replace(self):
        """
        ``_Config.set_config`` can be called with a section and item that already
        exists to change an existing value to a new one.
        """
        basedir = self.mktemp()
        new_config = config_from_string(basedir, "", "", ValidConfiguration.everything())
        new_config.set_config("foo", "bar", "value1")
        new_config.set_config("foo", "bar", "value2")
        self.assertEqual(
            new_config.get_config("foo", "bar"),
            "value2"
        )

    def test_set_config_write(self):
        """
        ``_Config.set_config`` persists the configuration change so it can be
        re-loaded later.
        """
        # Let our nonsense config through
        valid_config = ValidConfiguration.everything()
        basedir = FilePath(self.mktemp())
        basedir.makedirs()
        cfg = basedir.child(b"tahoe.cfg")
        cfg.setContent(b"")
        new_config = read_config(basedir.path, "", [], valid_config)
        new_config.set_config("foo", "bar", "value1")
        loaded_config = read_config(basedir.path, "", [], valid_config)
        self.assertEqual(
            loaded_config.get_config("foo", "bar"),
            "value1",
        )

    def test_set_config_rejects_invalid_config(self):
        """
        ``_Config.set_config`` raises ``UnknownConfigError`` if the section or
        item is not recognized by the validation object and does not set the
        value.
        """
        # Make everything invalid.
        valid_config = ValidConfiguration.nothing()
        new_config = config_from_string(self.mktemp(), "", "", valid_config)
        with self.assertRaises(UnknownConfigError):
            new_config.set_config("foo", "bar", "baz")
        with self.assertRaises(MissingConfigEntry):
            new_config.get_config("foo", "bar")


class TestMissingPorts(unittest.TestCase):
    """

@@ -12,7 +12,6 @@ from twisted.internet import reactor
from twisted.python import usage
from twisted.internet.defer import (
    inlineCallbacks,
    returnValue,
    DeferredList,
)
from twisted.python.filepath import FilePath
@@ -20,12 +19,9 @@ from twisted.python.runtime import (
    platform,
)
from allmydata.util import fileutil, pollmixin
from allmydata.util.encodingutil import unicode_to_argv, unicode_to_output, \
    get_filesystem_encoding
from allmydata.util.encodingutil import unicode_to_argv, unicode_to_output
from allmydata.test import common_util
from allmydata.version_checks import normalized_version
import allmydata
from allmydata import __appname__
from .common_util import parse_cli, run_cli
from .cli_node_api import (
    CLINodeAPI,
@@ -58,17 +54,6 @@ rootdir = get_root_from_file(srcfile)


class RunBinTahoeMixin(object):

    @inlineCallbacks
    def find_import_location(self):
        res = yield self.run_bintahoe(["--version-and-path"])
        out, err, rc_or_sig = res
        self.assertEqual(rc_or_sig, 0, res)
        lines = out.splitlines()
        tahoe_pieces = lines[0].split()
        self.assertEqual(tahoe_pieces[0], "%s:" % (__appname__,), (tahoe_pieces, res))
        returnValue(tahoe_pieces[-1].strip("()"))

    def run_bintahoe(self, args, stdin=None, python_options=[], env=None):
        command = sys.executable
        argv = python_options + ["-m", "allmydata.scripts.runner"] + args
@@ -86,64 +71,6 @@ class RunBinTahoeMixin(object):


class BinTahoe(common_util.SignalMixin, unittest.TestCase, RunBinTahoeMixin):
    @inlineCallbacks
    def test_the_right_code(self):
        # running "tahoe" in a subprocess should find the same code that
        # holds this test file, else something is weird
        test_path = os.path.dirname(os.path.dirname(os.path.normcase(os.path.realpath(srcfile))))
        bintahoe_import_path = yield self.find_import_location()

        same = (bintahoe_import_path == test_path)
        if not same:
            msg = ("My tests and my 'tahoe' executable are using different paths.\n"
                   "tahoe: %r\n"
                   "tests: %r\n"
                   "( according to the test source filename %r)\n" %
                   (bintahoe_import_path, test_path, srcfile))

            if (not isinstance(rootdir, unicode) and
                rootdir.decode(get_filesystem_encoding(), 'replace') != rootdir):
                msg += ("However, this may be a false alarm because the import path\n"
                        "is not representable in the filesystem encoding.")
                raise unittest.SkipTest(msg)
            else:
                msg += "Please run the tests in a virtualenv that includes both the Tahoe-LAFS library and the 'tahoe' executable."
                self.fail(msg)

    def test_path(self):
        d = self.run_bintahoe(["--version-and-path"])
        def _cb(res):
            out, err, rc_or_sig = res
            self.failUnlessEqual(rc_or_sig, 0, str(res))

            # Fail unless the __appname__ package is *this* version *and*
            # was loaded from *this* source directory.

            required_verstr = str(allmydata.__version__)

            self.failIfEqual(required_verstr, "unknown",
                             "We don't know our version, because this distribution didn't come "
                             "with a _version.py and 'setup.py update_version' hasn't been run.")

            srcdir = os.path.dirname(os.path.dirname(os.path.normcase(os.path.realpath(srcfile))))
            info = repr((res, allmydata.__appname__, required_verstr, srcdir))

            appverpath = out.split(')')[0]
            (appverfull, path) = appverpath.split('] (')
            (appver, comment) = appverfull.split(' [')
            (branch, full_version) = comment.split(': ')
            (app, ver) = appver.split(': ')

            self.failUnlessEqual(app, allmydata.__appname__, info)
            norm_ver = normalized_version(ver)
            norm_required = normalized_version(required_verstr)
            self.failUnlessEqual(norm_ver, norm_required, info)
            self.failUnlessEqual(path, srcdir, info)
            self.failUnlessEqual(branch, allmydata.branch)
            self.failUnlessEqual(full_version, allmydata.full_version)
        d.addCallback(_cb)
        return d

    def test_unicode_arguments_and_output(self):
        tricky = u"\u2621"
        try:
@@ -165,8 +92,8 @@ class BinTahoe(common_util.SignalMixin, unittest.TestCase, RunBinTahoeMixin):
        d = self.run_bintahoe(["--version"], python_options=["-t"])
        def _cb(res):
            out, err, rc_or_sig = res
            self.failUnlessEqual(rc_or_sig, 0, str(res))
            self.failUnless(out.startswith(allmydata.__appname__+':'), str(res))
            self.assertEqual(rc_or_sig, 0, str(res))
            self.assertTrue(out.startswith(allmydata.__appname__ + '/'), str(res))
        d.addCallback(_cb)
        return d

@@ -1,275 +0,0 @@
"""
Tests for allmydata.util.verlib and allmydata.version_checks.

Ported to Python 3.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from future.utils import PY2
if PY2:
    from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401

import sys
import pkg_resources
from operator import (
    setitem,
)
from twisted.trial import unittest

from allmydata.version_checks import (
    _cross_check as cross_check,
    _extract_openssl_version as extract_openssl_version,
    _get_package_versions_and_locations as get_package_versions_and_locations,
)
from allmydata.util.verlib import NormalizedVersion as V, \
    IrrationalVersionError, \
    suggest_normalized_version as suggest


class MockSSL(object):
    SSLEAY_VERSION = 0
    SSLEAY_CFLAGS = 2

    def __init__(self, version, compiled_without_heartbeats=False):
        self.opts = {
            self.SSLEAY_VERSION: version,
            self.SSLEAY_CFLAGS: compiled_without_heartbeats and 'compiler: gcc -DOPENSSL_NO_HEARTBEATS'
                                or 'compiler: gcc',
        }

    def SSLeay_version(self, which):
        return self.opts[which]


class CheckRequirement(unittest.TestCase):
    def test_packages_from_pkg_resources(self):
        if hasattr(sys, 'frozen'):
            raise unittest.SkipTest("This test doesn't apply to frozen builds.")

        class MockPackage(object):
            def __init__(self, project_name, version, location):
                self.project_name = project_name
                self.version = version
                self.location = location

        def call_pkg_resources_require(*args):
            return [MockPackage("Foo", "1.0", "/path")]
        self.patch(pkg_resources, 'require', call_pkg_resources_require)

        (packages, errors) = get_package_versions_and_locations()
        self.failUnlessIn(("foo", ("1.0", "/path", "according to pkg_resources")), packages)
        self.failIfEqual(errors, [])
        self.failUnlessEqual([e for e in errors if "was not found by pkg_resources" not in e], [])

    def test_cross_check_unparseable_versions(self):
        # The bug in #1355 is triggered when a version string from either pkg_resources or import
        # is not parseable at all by normalized_version.

        res = cross_check({"foo": ("unparseable", "")}, [("foo", ("1.0", "", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("1.0", "")}, [("foo", ("unparseable", "", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("unparseable", "")}, [("foo", ("unparseable", "", None))])
        self.failUnlessEqual(res, [])

    def test_cross_check(self):
        res = cross_check({}, [])
        self.failUnlessEqual(res, [])

        res = cross_check({}, [("tahoe-lafs", ("1.0", "", "blah"))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("unparseable", "")}, [])
        self.failUnlessEqual(res, [])

        res = cross_check({"argparse": ("unparseable", "")}, [])
        self.failUnlessEqual(res, [])

        res = cross_check({}, [("foo", ("unparseable", "", None))])
        self.failUnlessEqual(len(res), 1)
        self.assertTrue(("version 'unparseable'" in res[0]) or ("version u'unparseable'" in res[0]))
        self.failUnlessIn("was not found by pkg_resources", res[0])

        res = cross_check({"distribute": ("1.0", "/somewhere")}, [("setuptools", ("2.0", "/somewhere", "distribute"))])
        self.failUnlessEqual(res, [])

        res = cross_check({"distribute": ("1.0", "/somewhere")}, [("setuptools", ("2.0", "/somewhere", None))])
        self.failUnlessEqual(len(res), 1)
        self.failUnlessIn("location mismatch", res[0])

        res = cross_check({"distribute": ("1.0", "/somewhere")}, [("setuptools", ("2.0", "/somewhere_different", None))])
        self.failUnlessEqual(len(res), 1)
        self.failUnlessIn("location mismatch", res[0])

        res = cross_check({"zope.interface": ("1.0", "")}, [("zope.interface", ("unknown", "", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"zope.interface": ("unknown", "")}, [("zope.interface", ("unknown", "", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("1.0", "")}, [("foo", ("unknown", "", None))])
        self.failUnlessEqual(len(res), 1)
        self.failUnlessIn("could not find a version number", res[0])

        res = cross_check({"foo": ("unknown", "")}, [("foo", ("unknown", "", None))])
        self.failUnlessEqual(res, [])

        # When pkg_resources and import both find a package, there is only a warning if both
        # the version and the path fail to match.

        res = cross_check({"foo": ("1.0", "/somewhere")}, [("foo", ("2.0", "/somewhere", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("1.0", "/somewhere")}, [("foo", ("1.0", "/somewhere_different", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("1.0-r123", "/somewhere")}, [("foo", ("1.0.post123", "/somewhere_different", None))])
        self.failUnlessEqual(res, [])

        res = cross_check({"foo": ("1.0", "/somewhere")}, [("foo", ("2.0", "/somewhere_different", None))])
        self.failUnlessEqual(len(res), 1)
        self.assertTrue(("but version '2.0'" in res[0]) or ("but version u'2.0'" in res[0]))

    def test_extract_openssl_version(self):
        self.failUnlessEqual(extract_openssl_version(MockSSL("")),
                             ("", None, None))
        self.failUnlessEqual(extract_openssl_version(MockSSL("NotOpenSSL a.b.c foo")),
                             ("NotOpenSSL", None, "a.b.c foo"))
        self.failUnlessEqual(extract_openssl_version(MockSSL("OpenSSL a.b.c")),
                             ("a.b.c", None, None))
        self.failUnlessEqual(extract_openssl_version(MockSSL("OpenSSL 1.0.1e 11 Feb 2013")),
                             ("1.0.1e", None, "11 Feb 2013"))
        self.failUnlessEqual(extract_openssl_version(MockSSL("OpenSSL 1.0.1e 11 Feb 2013", compiled_without_heartbeats=True)),
                             ("1.0.1e", None, "11 Feb 2013, no heartbeats"))


# based on https://bitbucket.org/tarek/distutilsversion/src/17df9a7d96ef/test_verlib.py

class VersionTestCase(unittest.TestCase):
    versions = ((V('1.0'), '1.0'),
                (V('1.1'), '1.1'),
                (V('1.2.3'), '1.2.3'),
                (V('1.2'), '1.2'),
                (V('1.2.3a4'), '1.2.3a4'),
                (V('1.2c4'), '1.2c4'),
                (V('1.2.3.4'), '1.2.3.4'),
                (V('1.2.3.4.0b3'), '1.2.3.4b3'),
                (V('1.2.0.0.0'), '1.2'),
                (V('1.0.dev345'), '1.0.dev345'),
                (V('1.0.post456.dev623'), '1.0.post456.dev623'))

    def test_basic_versions(self):
        for v, s in self.versions:
            self.failUnlessEqual(str(v), s)

    def test_from_parts(self):
        for v, s in self.versions:
            parts = v.parts
            v2 = V.from_parts(*parts)
            self.failUnlessEqual(v, v2)
            self.failUnlessEqual(str(v), str(v2))

    def test_irrational_versions(self):
        irrational = ('1', '1.2a', '1.2.3b', '1.02', '1.2a03',
                      '1.2a3.04', '1.2.dev.2', '1.2dev', '1.2.dev',
                      '1.2.dev2.post2', '1.2.post2.dev3.post4')

        for s in irrational:
            self.failUnlessRaises(IrrationalVersionError, V, s)

    def test_comparison(self):
        self.failUnlessRaises(TypeError, lambda: V('1.2.0') == '1.2')

        self.failUnlessEqual(V('1.2.0'), V('1.2'))
        self.failIfEqual(V('1.2.0'), V('1.2.3'))
        self.failUnless(V('1.2.0') < V('1.2.3'))
        self.failUnless(V('1.0') > V('1.0b2'))
        self.failUnless(V('1.0') > V('1.0c2') > V('1.0c1') > V('1.0b2') > V('1.0b1')
                        > V('1.0a2') > V('1.0a1'))
        self.failUnless(V('1.0.0') > V('1.0.0c2') > V('1.0.0c1') > V('1.0.0b2') > V('1.0.0b1')
                        > V('1.0.0a2') > V('1.0.0a1'))

        self.failUnless(V('1.0') < V('1.0.post456.dev623'))
        self.failUnless(V('1.0.post456.dev623') < V('1.0.post456') < V('1.0.post1234'))

        self.failUnless(V('1.0a1')
                        < V('1.0a2.dev456')
                        < V('1.0a2')
                        < V('1.0a2.1.dev456')  # e.g. need to do a quick post release on 1.0a2
                        < V('1.0a2.1')
                        < V('1.0b1.dev456')
                        < V('1.0b2')
                        < V('1.0c1')
                        < V('1.0c2.dev456')
                        < V('1.0c2')
                        < V('1.0.dev7')
                        < V('1.0.dev18')
                        < V('1.0.dev456')
                        < V('1.0.dev1234')
                        < V('1.0')
                        < V('1.0.post456.dev623')  # development version of a post release
                        < V('1.0.post456'))

    def test_suggest_normalized_version(self):
        self.failUnlessEqual(suggest('1.0'), '1.0')
        self.failUnlessEqual(suggest('1.0-alpha1'), '1.0a1')
        self.failUnlessEqual(suggest('1.0c2'), '1.0c2')
        self.failUnlessEqual(suggest('walla walla washington'), None)
        self.failUnlessEqual(suggest('2.4c1'), '2.4c1')

        # from setuptools
        self.failUnlessEqual(suggest('0.4a1.r10'), '0.4a1.post10')
        self.failUnlessEqual(suggest('0.7a1dev-r66608'), '0.7a1.dev66608')
        self.failUnlessEqual(suggest('0.6a9.dev-r41475'), '0.6a9.dev41475')
        self.failUnlessEqual(suggest('2.4preview1'), '2.4c1')
        self.failUnlessEqual(suggest('2.4pre1'), '2.4c1')
        self.failUnlessEqual(suggest('2.1-rc2'), '2.1c2')

        # from pypi
        self.failUnlessEqual(suggest('0.1dev'), '0.1.dev0')
        self.failUnlessEqual(suggest('0.1.dev'), '0.1.dev0')

        # we want to be able to parse Twisted
        # development versions are like post releases in Twisted
        self.failUnlessEqual(suggest('9.0.0+r2363'), '9.0.0.post2363')

        # pre-releases are using markers like "pre1"
        self.failUnlessEqual(suggest('9.0.0pre1'), '9.0.0c1')

        # we want to be able to parse Tcl-TK
        # they use "p1" "p2" for post releases
        self.failUnlessEqual(suggest('1.4p1'), '1.4.post1')

        # from darcsver
        self.failUnlessEqual(suggest('1.8.1-r4956'), '1.8.1.post4956')

        # zetuptoolz
        self.failUnlessEqual(suggest('0.6c16dev3'), '0.6c16.dev3')


class T(unittest.TestCase):
    def test_report_import_error(self):
        """
        get_package_versions_and_locations reports a dependency if a dependency
        cannot be imported.
        """
        # Make sure we don't leave the system in a bad state.
        self.addCleanup(
            lambda foolscap=sys.modules["foolscap"]: setitem(
                sys.modules,
                "foolscap",
                foolscap,
            ),
        )
        # Make it look like Foolscap isn't installed.
        sys.modules["foolscap"] = None
        vers_and_locs, errors = get_package_versions_and_locations()

        foolscap_stuffs = [stuff for (pkg, stuff) in vers_and_locs if pkg == 'foolscap']
        self.failUnlessEqual(len(foolscap_stuffs), 1)
        self.failUnless([e for e in errors if "'foolscap' could not be imported" in e])

@@ -127,7 +127,7 @@ class IntroducerWeb(unittest.TestCase):
        assert_soup_has_text(
            self,
            soup,
            u"%s: %s" % (allmydata.__appname__, allmydata.__version__),
            allmydata.__full_version__,
        )
        assert_soup_has_text(self, soup, u"no peers!")
        assert_soup_has_text(self, soup, u"subscribers!")

@@ -6,6 +6,9 @@ import treq

from bs4 import BeautifulSoup

from twisted.python.filepath import (
    FilePath,
)
from twisted.application import service
from twisted.internet import defer
from twisted.internet.defer import inlineCallbacks, returnValue
@@ -316,8 +319,16 @@ class WebMixin(TimezoneMixin):
        self.staticdir = self.mktemp()
        self.clock = Clock()
        self.fakeTime = 86460 # 1d 0h 1m 0s
        self.ws = webish.WebishServer(self.s, "0", staticdir=self.staticdir,
                                      clock=self.clock, now_fn=lambda:self.fakeTime)
        tempdir = FilePath(self.mktemp())
        tempdir.makedirs()
        self.ws = webish.WebishServer(
            self.s,
            "0",
            tempdir=tempdir.path,
            staticdir=self.staticdir,
            clock=self.clock,
            now_fn=lambda:self.fakeTime,
        )
        self.ws.setServiceParent(self.s)
        self.webish_port = self.ws.getPortnum()
        self.webish_url = self.ws.getURL()

@@ -5,6 +5,19 @@ Tests for ``allmydata.webish``.
from uuid import (
    uuid4,
)
from errno import (
    EACCES,
)
from io import (
    BytesIO,
)

from hypothesis import (
    given,
)
from hypothesis.strategies import (
    integers,
)

from testtools.matchers import (
    AfterPreprocessing,
@@ -12,8 +25,13 @@ from testtools.matchers import (
    Equals,
    MatchesAll,
    Not,
    IsInstance,
    HasLength,
)

from twisted.python.runtime import (
    platform,
)
from twisted.python.filepath import (
    FilePath,
)
@@ -30,7 +48,7 @@ from ..common import (

from ...webish import (
    TahoeLAFSRequest,
    tahoe_lafs_site,
    TahoeLAFSSite,
)


@@ -96,7 +114,7 @@ class TahoeLAFSRequestTests(SyncTestCase):

class TahoeLAFSSiteTests(SyncTestCase):
    """
    Tests for the ``Site`` created by ``tahoe_lafs_site``.
    Tests for ``TahoeLAFSSite``.
    """
    def _test_censoring(self, path, censored):
        """
@@ -112,7 +130,7 @@ class TahoeLAFSSiteTests(SyncTestCase):
        """
        logPath = self.mktemp()

        site = tahoe_lafs_site(Resource(), logPath=logPath)
        site = TahoeLAFSSite(self.mktemp(), Resource(), logPath=logPath)
        site.startFactory()

        channel = DummyChannel()
@@ -170,6 +188,106 @@ class TahoeLAFSSiteTests(SyncTestCase):
            b"/uri?uri=[CENSORED]",
        )

    def _create_request(self, tempdir):
        """
        Create and return a new ``TahoeLAFSRequest`` hooked up to a
        ``TahoeLAFSSite``.

        :param bytes tempdir: The temporary directory to give to the site.

        :return TahoeLAFSRequest: The new request instance.
        """
        site = TahoeLAFSSite(tempdir.path, Resource(), logPath=self.mktemp())
        site.startFactory()

        channel = DummyChannel()
        channel.site = site
        request = TahoeLAFSRequest(channel)
        return request

    @given(integers(min_value=0, max_value=1024 * 1024 - 1))
    def test_small_content(self, request_body_size):
        """
        A request body smaller than 1 MiB is kept in memory.
        """
        tempdir = FilePath(self.mktemp())
        request = self._create_request(tempdir)
        request.gotLength(request_body_size)
        self.assertThat(
            request.content,
            IsInstance(BytesIO),
        )

    def _large_request_test(self, request_body_size):
        """
        Assert that when a request with a body of the given size is received
        its content is written to the directory the ``TahoeLAFSSite`` is
        configured with.
        """
        tempdir = FilePath(self.mktemp())
        tempdir.makedirs()
        request = self._create_request(tempdir)

        # So.  Bad news.  The temporary file for the uploaded content is
        # unnamed (and this isn't even necessarily a bad thing since it is how
        # you get automatic on-process-exit cleanup behavior on POSIX).  It's
        # not visible by inspecting the filesystem.  It has no name we can
        # discover.  Then how do we verify it is written to the right place?
        # The question itself is meaningless if we try to be too precise.  It
        # *has* no filesystem location.  However, it is still stored *on* some
        # filesystem.  We still want to make sure it is on the filesystem we
        # specified because otherwise it might be on a filesystem that's too
        # small or undesirable in some other way.
        #
        # I don't know of any way to ask a file descriptor which filesystem
        # it's on, either, though.  It might be the case that the [f]statvfs()
        # result could be compared somehow to infer the filesystem but
        # ... it's not clear what the failure modes might be there, across
        # different filesystems and runtime environments.
        #
        # Another approach is to make the temp directory unwriteable and
        # observe the failure when an attempt is made to create a file there.
        # This is hardly a lovely solution but at least it's kind of simple.
        #
        # It would be nice if it worked consistently cross-platform but on
        # Windows os.chmod is more or less broken.
        if platform.isWindows():
            request.gotLength(request_body_size)
            self.assertThat(
                tempdir.children(),
                HasLength(1),
            )
        else:
            tempdir.chmod(0o550)
            with self.assertRaises(OSError) as ctx:
                request.gotLength(request_body_size)
                raise Exception(
                    "OSError not raised, instead tempdir.children() = {}".format(
                        tempdir.children(),
                    ),
                )

            self.assertThat(
                ctx.exception.errno,
                Equals(EACCES),
            )

    def test_unknown_request_size(self):
        """
        A request body with an unknown size is written to a file in the temporary
        directory passed to ``TahoeLAFSSite``.
        """
        self._large_request_test(None)

    @given(integers(min_value=1024 * 1024))
    def test_large_request(self, request_body_size):
        """
        A request body of 1 MiB or more is written to a file in the temporary
        directory passed to ``TahoeLAFSSite``.
        """
        self._large_request_test(request_body_size)


def param(name, value):
    return u"; {}={}".format(name, value)

@@ -56,7 +56,11 @@ PORTED_MODULES = [
    "allmydata.mutable.checker",
    "allmydata.mutable.common",
    "allmydata.mutable.filenode",
    "allmydata.mutable.layout",
    "allmydata.mutable.publish",
    "allmydata.mutable.repairer",
    "allmydata.mutable.retrieve",
    "allmydata.mutable.servermap",
    "allmydata.node",
    "allmydata.storage_client",
    "allmydata.storage.common",
@@ -155,5 +159,4 @@ PORTED_TEST_MODULES = [
    "allmydata.test.test_upload",
    "allmydata.test.test_uri",
    "allmydata.test.test_util",
    "allmydata.test.test_version",
]

@@ -20,6 +20,10 @@ from configparser import ConfigParser

import attr

from twisted.python.runtime import (
    platform,
)


class UnknownConfigError(Exception):
    """
@@ -59,8 +63,25 @@ def set_config(config, section, option, value):
    assert config.get(section, option) == value

def write_config(tahoe_cfg, config):
    with open(tahoe_cfg, "w") as f:
        config.write(f)
    """
    Write a configuration to a file.

    :param FilePath tahoe_cfg: The path to which to write the config.

    :param ConfigParser config: The configuration to write.

    :return: ``None``
    """
    tmp = tahoe_cfg.temporarySibling()
    # FilePath.open can only open files in binary mode which does not work
    # with ConfigParser.write.
    with open(tmp.path, "wt") as fp:
        config.write(fp)
    # Windows doesn't have atomic overwrite semantics for moveTo.  Thus we end
    # up slightly less than atomic.
    if platform.isWindows():
        tahoe_cfg.remove()
    tmp.moveTo(tahoe_cfg)
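``write_config`` now writes to a sibling file and renames it into place, so a crash mid-write leaves the old ``tahoe.cfg`` intact; only the Windows pre-remove step narrows that guarantee. The bare pattern, assuming only the standard library::

    import os
    import tempfile

    def atomic_write(path, data):
        # Create the scratch file next to the target so the final rename
        # never crosses a filesystem boundary.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
        with os.fdopen(fd, "w") as fp:
            fp.write(data)
        os.replace(tmp, path)  # atomic overwrite where the OS supports it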

def validate_config(fname, cfg, valid_config):
    """
@@ -102,10 +123,34 @@ class ValidConfiguration(object):
    an item name as bytes and returns True if that section, item pair is
    valid, False otherwise.
    """
    _static_valid_sections = attr.ib()
    _static_valid_sections = attr.ib(
        validator=attr.validators.instance_of(dict)
    )
    _is_valid_section = attr.ib(default=lambda section_name: False)
    _is_valid_item = attr.ib(default=lambda section_name, item_name: False)

    @classmethod
    def everything(cls):
        """
        Create a validator which considers everything valid.
        """
        return cls(
            {},
            lambda section_name: True,
            lambda section_name, item_name: True,
        )

    @classmethod
    def nothing(cls):
        """
        Create a validator which considers nothing valid.
        """
        return cls(
            {},
            lambda section_name: False,
            lambda section_name, item_name: False,
        )
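``everything()`` and ``nothing()`` are the two degenerate validators and exist mostly as fixtures: the first accepts any section/item pair, the second rejects all of them. Expected behaviour, assuming ``is_valid_section``/``is_valid_item`` consult the dynamic callables as the defaults suggest::

    permissive = ValidConfiguration.everything()
    assert permissive.is_valid_section("anything")
    assert permissive.is_valid_item("anything", "at-all")

    strict = ValidConfiguration.nothing()
    assert not strict.is_valid_section("node")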

    def is_valid_section(self, section_name):
        """
        :return: True if the given section name is valid, False otherwise.
@@ -136,6 +181,23 @@ class ValidConfiguration(object):
        )


def copy_config(old):
    """
    Return a brand new ``ConfigParser`` containing the same values as
    the given object.

    :param ConfigParser old: The configuration to copy.

    :return ConfigParser: The new object containing the same configuration.
    """
    new = ConfigParser()
    for section_name in old.sections():
        new.add_section(section_name)
        for k, v in old.items(section_name):
            new.set(section_name, k, v.replace("%", "%%"))
    return new
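Note the ``%`` re-escaping: ``old.items(section_name)`` hands back values with interpolation already applied, so ``copy_config`` must double any ``%`` again before ``set`` will accept the value. Usage is plain::

    original = ConfigParser()
    original.add_section("node")
    original.set("node", "nickname", "alice")

    duplicate = copy_config(original)
    duplicate.set("node", "nickname", "bob")  # mutating the copy...
    assert original.get("node", "nickname") == "alice"  # ...leaves the original alone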


def _either(f, g):
    """
    :return: A function which returns True if either f or g returns True.

@@ -11,6 +11,7 @@ from __future__ import unicode_literals
from future.utils import PY2
if PY2:
    from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
from six import ensure_str

from pyutil import nummedobj

@@ -55,6 +56,7 @@ class LogMixin(object):
        pmsgid = self._parentmsgid
        if pmsgid is None:
            pmsgid = self._grandparentmsgid
        kwargs = {ensure_str(k): v for (k, v) in kwargs.items()}
        msgid = log.msg(msg, facility=facility, parent=pmsgid, *args, **kwargs)
        if self._parentmsgid is None:
            self._parentmsgid = msgid
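``ensure_str`` normalises each keyword name to the *native* string type (byte-string on Python 2, text on Python 3), which is what Foolscap's ``log.msg`` expects for ``**kwargs`` expansion. In isolation::

    from six import ensure_str

    kwargs = {u"facility": u"tahoe.test", u"level": 10}
    native = {ensure_str(k): v for (k, v) in kwargs.items()}
    assert all(type(k) is str for k in native)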
|
||||
|
@ -1,334 +0,0 @@
|
||||
"""
|
||||
Produce reports about the versions of Python software in use by Tahoe-LAFS
|
||||
for debugging and auditing purposes.
|
||||
|
||||
Ported to Python 3.
|
||||
"""
|
||||
from __future__ import absolute_import
|
||||
from __future__ import division
|
||||
from __future__ import print_function
|
||||
from __future__ import unicode_literals
|
||||
|
||||
from future.utils import PY2
|
||||
if PY2:
|
||||
from builtins import filter, map, zip, ascii, chr, hex, input, next, oct, open, pow, round, super, bytes, dict, list, object, range, str, max, min # noqa: F401
|
||||
|
||||
__all__ = [
|
||||
"PackagingError",
|
||||
"get_package_versions",
|
||||
"get_package_versions_string",
|
||||
"normalized_version",
|
||||
]
|
||||
|
||||
import os, platform, re, sys, traceback, pkg_resources
|
||||
|
||||
import six
|
||||
|
||||
import distro
|
||||
|
||||
from . import (
|
||||
__appname__,
|
||||
full_version,
|
||||
branch,
|
||||
)
|
||||
from .util import (
|
||||
verlib,
|
||||
)
|
||||
|
||||
if getattr(sys, 'frozen', None):
|
||||
# "Frozen" python interpreters (i.e., standalone executables
|
||||
# generated by PyInstaller and other, similar utilities) run
|
||||
# independently of a traditional setuptools-based packaging
|
||||
# environment, and so pkg_resources.get_distribution() cannot be
|
||||
# used in such cases to gather a list of requirements at runtime
|
||||
# (and because a frozen application is one that has already been
|
||||
# "installed", an empty list suffices here).
|
||||
_INSTALL_REQUIRES = []
|
||||
else:
|
||||
_INSTALL_REQUIRES = list(
|
||||
str(req)
|
||||
for req
|
||||
in pkg_resources.get_distribution(__appname__).requires()
|
||||
)
|
||||
|
||||
class PackagingError(EnvironmentError):
|
||||
"""
|
||||
Raised when there is an error in packaging of Tahoe-LAFS or its
|
||||
dependencies which makes it impossible to proceed safely.
|
||||
"""
|
||||
|
||||
def get_package_versions():
|
||||
return dict([(k, v) for k, (v, l, c) in _vers_and_locs_list])
|
||||
|
||||
def get_package_versions_string(show_paths=False, debug=False):
|
||||
res = []
|
||||
for p, (v, loc, comment) in _vers_and_locs_list:
|
||||
info = str(p) + ": " + str(v)
|
||||
if comment:
|
||||
info = info + " [%s]" % str(comment)
|
||||
if show_paths:
|
||||
info = info + " (%s)" % str(loc)
|
||||
res.append(info)
|
||||
|
||||
output = "\n".join(res) + "\n"
|
||||
|
||||
if _cross_check_errors:
|
||||
output += _get_error_string(_cross_check_errors, debug=debug)
|
||||
|
||||
return output
|
||||
|
||||
_distributor_id_cmdline_re = re.compile("(?:Distributor ID:)\s*(.*)", re.I)
|
||||
_release_cmdline_re = re.compile("(?:Release:)\s*(.*)", re.I)
|
||||
|
||||
_distributor_id_file_re = re.compile("(?:DISTRIB_ID\s*=)\s*(.*)", re.I)
|
||||
_release_file_re = re.compile("(?:DISTRIB_RELEASE\s*=)\s*(.*)", re.I)
|
||||
|
||||
_distname = None
|
||||
_version = None
|
||||
|
||||
def normalized_version(verstr, what=None):
|
||||
try:
|
||||
suggested = verlib.suggest_normalized_version(verstr) or verstr
|
||||
return verlib.NormalizedVersion(suggested)
|
||||
except verlib.IrrationalVersionError:
|
||||
raise
|
||||
except Exception:
|
||||
cls, value, trace = sys.exc_info()
|
||||
new_exc = PackagingError("could not parse %s due to %s: %s"
|
||||
% (what or repr(verstr), cls.__name__, value))
|
||||
six.reraise(cls, new_exc, trace)
|
||||
|
||||
def _get_error_string(errors, debug=False):

    msg = "\n%s\n" % ("\n".join(errors),)
    if debug:
        msg += (
            "\n"
            "For debugging purposes, the PYTHONPATH was\n"
            "  %r\n"
            "install_requires was\n"
            "  %r\n"
            "sys.path after importing pkg_resources was\n"
            "  %s\n"
            % (
                os.environ.get('PYTHONPATH'),
                _INSTALL_REQUIRES,
                (os.pathsep+"\n  ").join(sys.path),
            )
        )
    return msg

def _cross_check(pkg_resources_vers_and_locs, imported_vers_and_locs_list):
    """This function returns a list of errors due to any failed cross-checks."""

    from ._auto_deps import not_import_versionable

    errors = []
    not_pkg_resourceable = ['python', 'platform', __appname__.lower(), 'openssl']

    for name, (imp_ver, imp_loc, imp_comment) in imported_vers_and_locs_list:
        name = name.lower()
        if name not in not_pkg_resourceable:
            if name not in pkg_resources_vers_and_locs:
                if name == "setuptools" and "distribute" in pkg_resources_vers_and_locs:
                    pr_ver, pr_loc = pkg_resources_vers_and_locs["distribute"]
                    if not (os.path.normpath(os.path.realpath(pr_loc)) == os.path.normpath(os.path.realpath(imp_loc))
                            and imp_comment == "distribute"):
                        errors.append("Warning: dependency 'setuptools' found to be version %r of 'distribute' from %r "
                                      "by pkg_resources, but 'import setuptools' gave version %r [%s] from %r. "
                                      "A version mismatch is expected, but a location mismatch is not."
                                      % (pr_ver, pr_loc, imp_ver, imp_comment or 'probably *not* distribute', imp_loc))
                else:
                    errors.append("Warning: dependency %r (version %r imported from %r) was not found by pkg_resources."
                                  % (name, imp_ver, imp_loc))
                continue

            pr_ver, pr_loc = pkg_resources_vers_and_locs[name]
            if imp_ver is None and imp_loc is None:
                errors.append("Warning: dependency %r could not be imported. pkg_resources thought it should be possible "
                              "to import version %r from %r.\nThe exception trace was %r."
                              % (name, pr_ver, pr_loc, imp_comment))
                continue

            # If the pkg_resources version is identical to the imported version, don't attempt
            # to normalize them, since it is unnecessary and may fail (ticket #2499).
            if imp_ver != 'unknown' and pr_ver == imp_ver:
                continue

            try:
                pr_normver = normalized_version(pr_ver)
            except verlib.IrrationalVersionError:
                continue
            except Exception as e:
                errors.append("Warning: version number %r found for dependency %r by pkg_resources could not be parsed. "
                              "The version found by import was %r from %r. "
                              "pkg_resources thought it should be found at %r. "
                              "The exception was %s: %s"
                              % (pr_ver, name, imp_ver, imp_loc, pr_loc, e.__class__.__name__, e))
            else:
                if imp_ver == 'unknown':
                    if name not in not_import_versionable:
                        errors.append("Warning: unexpectedly could not find a version number for dependency %r imported from %r. "
                                      "pkg_resources thought it should be version %r at %r."
                                      % (name, imp_loc, pr_ver, pr_loc))
                else:
                    try:
                        imp_normver = normalized_version(imp_ver)
                    except verlib.IrrationalVersionError:
                        continue
                    except Exception as e:
                        errors.append("Warning: version number %r found for dependency %r (imported from %r) could not be parsed. "
                                      "pkg_resources thought it should be version %r at %r. "
                                      "The exception was %s: %s"
                                      % (imp_ver, name, imp_loc, pr_ver, pr_loc, e.__class__.__name__, e))
                    else:
                        if pr_ver == 'unknown' or (pr_normver != imp_normver):
                            if not os.path.normpath(os.path.realpath(pr_loc)) == os.path.normpath(os.path.realpath(imp_loc)):
                                errors.append("Warning: dependency %r found to have version number %r (normalized to %r, from %r) "
                                              "by pkg_resources, but version %r (normalized to %r, from %r) by import."
                                              % (name, pr_ver, str(pr_normver), pr_loc, imp_ver, str(imp_normver), imp_loc))

    return errors
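
A hedged sketch of the two shapes ``_cross_check()`` consumes, a
``pkg_resources`` table and an imported-modules list, using made-up package
data. It assumes the call happens where ``allmydata._auto_deps`` is
importable. A version-plus-location mismatch like this yields a single
warning string::

    # Hypothetical data, not part of this commit:
    pr_table = {"foolscap": ("0.13.1", "/usr/lib/python2.7/site-packages")}
    imported = [("foolscap",
                 ("0.13.2", "/home/user/.local/lib/python2.7/site-packages", None))]
    for warning in _cross_check(pr_table, imported):
        print(warning)  # "Warning: dependency 'foolscap' found to have version number ..."
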
def _get_openssl_version():
    try:
        from OpenSSL import SSL
        return _extract_openssl_version(SSL)
    except Exception:
        return ("unknown", None, None)

def _extract_openssl_version(ssl_module):
    openssl_version = ssl_module.SSLeay_version(ssl_module.SSLEAY_VERSION)
    if openssl_version.startswith('OpenSSL '):
        openssl_version = openssl_version[8 :]

    (version, _, comment) = openssl_version.partition(' ')

    try:
        openssl_cflags = ssl_module.SSLeay_version(ssl_module.SSLEAY_CFLAGS)
        if '-DOPENSSL_NO_HEARTBEATS' in openssl_cflags.split(' '):
            comment += ", no heartbeats"
    except Exception:
        pass

    return (version, None, comment if comment else None)
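
Splitting the parsing out of ``_get_openssl_version()`` makes it exercisable
without a real pyOpenSSL. A sketch with a stand-in for the ``SSL`` module;
the constants and version strings below are illustrative, not pyOpenSSL's
actual values::

    # Hypothetical stub, not part of this commit:
    class FakeSSL(object):
        SSLEAY_VERSION = 0
        SSLEAY_CFLAGS = 2
        def SSLeay_version(self, which):
            if which == self.SSLEAY_VERSION:
                return "OpenSSL 1.0.2k  26 Jan 2017"
            return "compiler: gcc -DOPENSSL_NO_HEARTBEATS"

    print(_extract_openssl_version(FakeSSL()))
    # -> ('1.0.2k', None, ' 26 Jan 2017, no heartbeats')
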
def _get_platform():
    # Our version of platform.platform(), telling us both less and more than the
    # Python Standard Library's version does.
    # We omit details such as the Linux kernel version number, but we add a
    # more detailed and correct rendition of the Linux distribution and
    # distribution-version.
    if "linux" in platform.system().lower():
        return (
            platform.system() + "-" +
            "_".join(distro.linux_distribution()[:2]) + "-" +
            platform.machine() + "-" +
            "_".join([x for x in platform.architecture() if x])
        )
    else:
        return platform.platform()
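
For instance, on a hypothetical Ubuntu 20.04 x86-64 host the pieces above
combine to something like the following; elsewhere the stock string is
returned::

    print(_get_platform())
    # e.g. "Linux-Ubuntu_20.04-x86_64-64bit_ELF" on Linux,
    # or whatever platform.platform() reports on other systems.
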
def _get_package_versions_and_locations():
    import warnings
    from ._auto_deps import package_imports, global_deprecation_messages, deprecation_messages, \
        runtime_warning_messages, warning_imports, ignorable

    def package_dir(srcfile):
        return os.path.dirname(os.path.dirname(os.path.normcase(os.path.realpath(srcfile))))

    # pkg_resources.require returns the distribution that pkg_resources attempted to put
    # on sys.path, which can differ from the one that we actually import due to #1258,
    # or any other bug that causes sys.path to be set up incorrectly. Therefore we
    # must import the packages in order to check their versions and paths.

    # This is to suppress all UserWarnings and various DeprecationWarnings and RuntimeWarnings
    # (listed in _auto_deps.py).
    warnings.filterwarnings("ignore", category=UserWarning, append=True)

    for msg in global_deprecation_messages + deprecation_messages:
        warnings.filterwarnings("ignore", category=DeprecationWarning, message=msg, append=True)
    for msg in runtime_warning_messages:
        warnings.filterwarnings("ignore", category=RuntimeWarning, message=msg, append=True)
    try:
        for modulename in warning_imports:
            try:
                __import__(modulename)
            except (ImportError, SyntaxError):
                pass
    finally:
        # Leave suppressions for UserWarnings and global_deprecation_messages active.
        for _ in runtime_warning_messages + deprecation_messages:
            warnings.filters.pop()

    packages = []
    pkg_resources_vers_and_locs = dict()

    if not hasattr(sys, 'frozen'):
        pkg_resources_vers_and_locs = {
            p.project_name.lower(): (str(p.version), p.location)
            for p
            in pkg_resources.require(_INSTALL_REQUIRES)
        }

    def get_version(module):
        if hasattr(module, '__version__'):
            return str(getattr(module, '__version__'))
        elif hasattr(module, 'version'):
            ver = getattr(module, 'version')
            if isinstance(ver, tuple):
                return '.'.join(map(str, ver))
            else:
                return str(ver)
        else:
            return 'unknown'

    for pkgname, modulename in [(__appname__, 'allmydata')] + package_imports:
        if modulename:
            try:
                __import__(modulename)
                module = sys.modules[modulename]
            except (ImportError, SyntaxError):
                etype, emsg, etrace = sys.exc_info()
                trace_info = (etype, str(emsg), ([None] + traceback.extract_tb(etrace))[-1])
                packages.append( (pkgname, (None, None, trace_info)) )
            else:
                comment = None
                if pkgname == __appname__:
                    comment = "%s: %s" % (branch, full_version)
                elif pkgname == 'setuptools' and hasattr(module, '_distribute'):
                    # distribute does not report its version in any module variables
                    comment = 'distribute'
                ver = get_version(module)
                loc = package_dir(module.__file__)
                if ver == "unknown" and pkgname in pkg_resources_vers_and_locs:
                    (pr_ver, pr_loc) = pkg_resources_vers_and_locs[pkgname]
                    if loc == os.path.normcase(os.path.realpath(pr_loc)):
                        ver = pr_ver
                packages.append( (pkgname, (ver, loc, comment)) )
        elif pkgname == 'python':
            packages.append( (pkgname, (platform.python_version(), sys.executable, None)) )
        elif pkgname == 'platform':
            packages.append( (pkgname, (_get_platform(), None, None)) )
        elif pkgname == 'OpenSSL':
            packages.append( (pkgname, _get_openssl_version()) )

    cross_check_errors = []

    if len(pkg_resources_vers_and_locs) > 0:
        imported_packages = set([p.lower() for (p, _) in packages])
        extra_packages = []

        for pr_name, (pr_ver, pr_loc) in pkg_resources_vers_and_locs.items():
            if pr_name not in imported_packages and pr_name not in ignorable:
                extra_packages.append( (pr_name, (pr_ver, pr_loc, "according to pkg_resources")) )

        cross_check_errors = _cross_check(pkg_resources_vers_and_locs, packages)
        packages += extra_packages

    return packages, cross_check_errors


_vers_and_locs_list, _cross_check_errors = _get_package_versions_and_locations()
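
Note that the version table and any cross-check errors are computed once, at
module import time; the public helpers above only format them. A hedged
usage sketch, with illustrative output::

    from allmydata.version_checks import (
        get_package_versions,
        get_package_versions_string,
    )
    print(get_package_versions().get("foolscap"))      # e.g. "0.13.1"
    print(get_package_versions_string(show_paths=True))
    # e.g. "foolscap: 0.13.1 (/usr/lib/python2.7/site-packages)" ...
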
@ -6,7 +6,6 @@ from twisted.python.filepath import FilePath
from twisted.web import static
import allmydata
import json
from allmydata.version_checks import get_package_versions_string
from allmydata.util import idlib
from allmydata.web.common import (
    render_time,
@ -89,7 +88,7 @@ class IntroducerRootElement(Element):
        self.introducer_service = introducer_service
        self.node_data_dict = {
            "my_nodeid": idlib.nodeid_b2a(self.introducer_node.nodeid),
            "version": get_package_versions_string(),
            "version": allmydata.__full_version__,
            "import_path": str(allmydata).replace("/", "/ "), # XXX kludge for wrapping
            "rendered_at": render_time(time.time()),
        }
@ -21,7 +21,6 @@ from twisted.web.template import (
)

import allmydata # to display import path
from allmydata.version_checks import get_package_versions_string
from allmydata.util import log
from allmydata.interfaces import IFileNode
from allmydata.web import (
@ -566,7 +565,7 @@ class RootElement(Element):

    @renderer
    def version(self, req, tag):
        return tag(get_package_versions_string())
        return tag(allmydata.__full_version__)

    @renderer
    def import_path(self, req, tag):
@ -1,13 +1,13 @@
from six import ensure_str

import re, time
import re, time, tempfile

from functools import (
    partial,
)
from cgi import (
    FieldStorage,
)
from io import (
    BytesIO,
)

from twisted.application import service, strports, internet
from twisted.web import static
@ -150,17 +150,34 @@ def _logFormatter(logDateTime, request):
    )


tahoe_lafs_site = partial(
    Site,
    requestFactory=TahoeLAFSRequest,
    logFormatter=_logFormatter,
)
class TahoeLAFSSite(Site, object):
    """
    The HTTP protocol factory used by Tahoe-LAFS.

    Among the behaviors provided:

    * A configurable temporary directory where large request bodies can be
      written so they don't stay in memory.

    * A log formatter that writes some access logs but omits capability
      strings to help keep them secret.
    """
    requestFactory = TahoeLAFSRequest

    def __init__(self, tempdir, *args, **kwargs):
        Site.__init__(self, *args, logFormatter=_logFormatter, **kwargs)
        self._tempdir = tempdir

    def getContentFile(self, length):
        if length is None or length >= 1024 * 1024:
            return tempfile.TemporaryFile(dir=self._tempdir)
        return BytesIO()
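
The 1 MiB threshold in ``getContentFile()`` is the whole spooling policy:
small request bodies stay in memory, while large or unknown-length bodies go
to a temporary file in the configured directory. A rough sketch, assuming
the scratch directory exists; ``Resource()`` is only a placeholder root::

    from io import BytesIO
    from twisted.web.resource import Resource

    site = TahoeLAFSSite("/tmp/tahoe-scratch", Resource())
    assert isinstance(site.getContentFile(1024), BytesIO)  # small: in memory
    site.getContentFile(10 * 1024 * 1024)  # large: TemporaryFile on disk
    site.getContentFile(None)              # unknown length: also on disk
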
class WebishServer(service.MultiService):
    name = "webish"

    def __init__(self, client, webport, nodeurl_path=None, staticdir=None,
    def __init__(self, client, webport, tempdir, nodeurl_path=None, staticdir=None,
                 clock=None, now_fn=time.time):
        service.MultiService.__init__(self)
        # the 'data' argument to all render() methods default to the Client
@ -170,7 +187,7 @@ class WebishServer(service.MultiService):
        # time in a deterministic manner.

        self.root = root.Root(client, clock, now_fn)
        self.buildServer(webport, nodeurl_path, staticdir)
        self.buildServer(webport, tempdir, nodeurl_path, staticdir)

        # If set, clock is a twisted.internet.task.Clock that the tests
        # use to test ophandle expiration.
@ -180,9 +197,9 @@ class WebishServer(service.MultiService):

        self.root.putChild(b"storage-plugins", StoragePlugins(client))

    def buildServer(self, webport, nodeurl_path, staticdir):
    def buildServer(self, webport, tempdir, nodeurl_path, staticdir):
        self.webport = webport
        self.site = tahoe_lafs_site(self.root)
        self.site = TahoeLAFSSite(tempdir, self.root)
        self.staticdir = staticdir # so tests can check
        if staticdir:
            self.root.putChild("static", static.File(staticdir))
@ -260,4 +277,4 @@ class IntroducerWebishServer(WebishServer):
    def __init__(self, introducer, webport, nodeurl_path=None, staticdir=None):
        service.MultiService.__init__(self)
        self.root = introweb.IntroducerRoot(introducer)
        self.buildServer(webport, nodeurl_path, staticdir)
        self.buildServer(webport, tempfile.tempdir, nodeurl_path, staticdir)
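
One subtlety in the introducer case: ``tempfile.tempdir`` is ``None`` unless
something has assigned it, and ``TemporaryFile(dir=None)`` falls back to
``tempfile.gettempdir()``, so the introducer spools large request bodies to
the platform default temporary directory::

    import tempfile
    print(tempfile.tempdir)       # typically None
    print(tempfile.gettempdir())  # e.g. "/tmp" on Linux
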