This should fix #506, but it means that if (for some weird reason) Twisted can't be auto-installed and the find_trial.py script doesn't work, the user will get a confusing failure message instead of a clean one explaining that trial couldn't be found. Oh well.
Chris Galvan is working on a much nicer fix to all these issues -- see #505.
This hopefully works around a bug in Twisted v8.1.0 when used with
pyOpenSSL v0.7 which causes some unit tests to spuriously fail -- see
known_issues.txt r2788:
http://allmydata.org/trac/tahoe/browser/docs/known_issues.txt?rev=2788#L122
It is also consistent with the fact that --reactor=poll is required on cygwin.
Remove docs/install-details.html and README.win32 for now (see #282).
Remove checks for pywin32 and pyOpenSSL from the Makefile -- that is (or will be) automated by setuptools.
Remove Twisted from setup_requires. This causes the problem in which Nevow doesn't declare its dependency on Twisted (#440) to yield a clear ImportError that mentions Twisted and to fail consistently, rather than yielding a confusing ImportError and then working on a second, identical attempt.
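For concreteness, the general shape of that distinction in a setup.py (an
illustrative sketch only, not Tahoe's actual setup.py; the project name and
dependency list are made up):

from setuptools import setup

setup(
    name='example-project',
    version='0.1',
    # runtime dependencies go here; setuptools fetches them at install time
    install_requires=['Twisted', 'Nevow'],
    # only helpers needed while running setup.py itself belong here --
    # keeping Twisted out of this list is the change described above
    setup_requires=[],
)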
Fix the Makefile to set PATH so that trial and twistd can be found by "make test" after Twisted has been installed into support/ during "make".
Looking over Brian's shoulder the other day, I saw a message that said "Please seedocs/install.html".
Looking at the source code of the Makefile, it seems like it should say
@echo "ERROR: Not all of Tahoe's dependencies are in place. Please see \
docs/install.html for help on installing dependencies."
instead of its current:
@echo "ERROR: Not all of Tahoe's dependencies are in place. Please see\
docs/install.html for help on installing dependencies."
But I remember changing this more than once in the past, so either there is a different version of make somewhere that interprets trailing backslashes differently, or someone (possibly me) has repeatedly gotten confused about this issue. (Inside double quotes the shell removes a backslash-newline pair, so without a space before the backslash the two words run together as "seedocs".) Anyway, this time around, I'm trying:
@echo "ERROR: Not all of Tahoe's dependencies are in place. Please see docs/install.html for help on installing dependencies."
Even though it is > 80 chars.
It's evil and wrong to call something a "Makefile" when it contains code that can't be interpreted by POSIX make and requires GNU make.
But everyone else is doing it. ;-)
The make mac-upload target now requires an UPLOAD_DEST argument, which is
the rsync destination (including trailing '/') into which the
version-stamped directory containing the .dmg should be placed. The account
the build runs as (e.g. 'buildslave') should have ssh access to the account
specified in that destination. One might also consider locking the key down
to the target directory by adding something like
command="rsync --server -vlogDtpr . /home/amduser/public_html/dist/mac-blah/"
to the corresponding authorized_keys entry on the target machine.
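For example (the account, host, and path here are made up), an invocation
might look like: make mac-upload UPLOAD_DEST=buildslave@dist.example.com:public_html/dist/mac/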
We want to collect runtime statistics from multiple nodes, primarily for
server-monitoring purposes. This patch provides a simple implementation of
such a system, as a skeleton to build more sophistication upon.
Each client now looks for a 'stats_gatherer.furl' config file. If it has
been configured to use a stats gatherer, then it internally instantiates a
StatsProvider. This is the central place for code which wishes to offer
stats for monitoring to report them to, either by calling
stats_provider.count('stat.name', value) to increment a counter, or by
registering a class as a stats producer with sp.register_producer(obj).
The StatsProvider connects to the StatsGatherer server and offers its
provider to it upon startup. The StatsGatherer is then responsible for
periodically polling the attached providers to retrieve the data they
provide. The provider queries each registered producer whenever the
gatherer queries the provider; both the internal 'counters' and the
queried 'stats' are then reported to the gatherer.
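As a rough illustration of that reporting interface (a minimal sketch with
stub names, not the actual Tahoe classes; the producer's get_stats() method
name and the stat names are assumptions):

class StubStatsProvider:
    """Stand-in for the node's StatsProvider, for illustration only."""
    def __init__(self):
        self.counters = {}
        self.producers = []
    def count(self, name, delta=1):
        # increment a named counter
        self.counters[name] = self.counters.get(name, 0) + delta
    def register_producer(self, producer):
        # the producer will be polled whenever the gatherer polls us
        self.producers.append(producer)
    def get_stats(self):
        # what gets reported to the gatherer: counters plus polled stats
        stats = {}
        for p in self.producers:
            stats.update(p.get_stats())
        return {'counters': dict(self.counters), 'stats': stats}

class UploadStats:
    # a stats producer: returns a dict of current values when polled
    def get_stats(self):
        return {'uploader.bytes_uploaded': 123456}

sp = StubStatsProvider()
sp.count('uploader.files_uploaded', 1)
sp.register_producer(UploadStats())
print(sp.get_stats())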
This also provides a simple gatherer app (cf. 'make stats-gatherer-run')
which prints its furl and listens for incoming connections. Once a minute,
the gatherer polls all connected providers and writes the retrieved data
into a pickle file.
Also included is a munin plugin which knows how to read the gatherer's
stats.pickle and output data munin can interpret. This plugin,
tahoe-stats.py, can be symlinked under multiple different names within
munin's 'plugins' directory; it inspects argv to determine which data to
display, doing a lookup in a table within the file. It looks in the
environment for 'statsfile' to determine the path to the gatherer's
stats.pickle. An example plugins-conf.d file is provided.
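The rough shape of such a plugin (a hedged sketch, not the shipped
tahoe-stats.py; the table contents, stat names, and the layout assumed for
stats.pickle are all made up for illustration):

import os, pickle, sys

# hypothetical table mapping symlink names to (stat key, munin label)
PLUGINS = {
    'tahoe_storage_bytes': ('storage_server.bytes_added', 'bytes added'),
}

name = os.path.basename(sys.argv[0])
stat_key, label = PLUGINS[name]

if len(sys.argv) > 1 and sys.argv[1] == 'config':
    # munin invokes the plugin with 'config' to ask how to label the graph
    print('graph_title Tahoe %s' % label)
    print('%s.label %s' % (name, label))
else:
    # 'statsfile' points at the gatherer's stats.pickle; the layout assumed
    # here is {nodeid: {'counters': {...}, 'stats': {...}}}
    data = pickle.load(open(os.environ['statsfile'], 'rb'))
    total = sum(node['counters'].get(stat_key, 0) for node in data.values())
    print('%s.value %s' % (name, total))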
This patch adds support for a native Mac build.
At the moment it's a fairly simple .app - i.e. so simple as to be
unacceptable for a shipping product, but OK for testing and experimentation
at this point. Notably, once launched, the app's UI does not respond at
all, although its dock icon does allow it to be force-quit.
This produces a single .app bundle which, when run, will look for a node
basedir in ~/.tahoe. If one is not found, one will be created in
~/Library/Application Support/Allmydata Tahoe, and that will be symlinked
to ~/.tahoe.
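A sketch of that basedir resolution (illustrative only, not the code in
this patch):

import os

dot_tahoe = os.path.expanduser('~/.tahoe')
app_support = os.path.expanduser(
    '~/Library/Application Support/Allmydata Tahoe')

if not os.path.exists(dot_tahoe):
    # no existing basedir: create one under Application Support and
    # point ~/.tahoe at it
    if not os.path.isdir(app_support):
        os.makedirs(app_support)
    os.symlink(app_support, dot_tahoe)
basedir = dot_tahoe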
If the basedir lacks basic config (introducer.furl and root_dir.cap), then
the wx config wizard will be launched to log into an account and set up
those files.
If a webport file is not found, the default value of 8123 will be written
into a new one.
Once the node has started running, a web browser will be opened to the
webish interface at the user's root_dir.
Note that, once configured, the node runs as the main thread of the .app;
no daemonisation is done and twistd is not involved.
The binary itself within the .app bundle, i.e.
"Allmydata Tahoe.app/Contents/MacOS/Allmydata Tahoe"
can be used from the command line and functions as the 'tahoe' executable
would in a unix environment, with one exception: when launched with no args
it triggers the default behaviour of running a node (and, if necessary, the
config wizard), as if the user had launched the .app.
One other gotcha to be aware of is that symlinking to this binary from some
other place in one's $PATH will most likely not work. When I tried this,
something - wx, I believe - exploded, since it seems to use argv[0] to
figure out where the necessary libraries reside and fails if argv[0] isn't
inside the .app bundle. It's pretty easy to set up a wrapper script
instead, a la
#!/bin/bash
/Blah/blah/blah/Allmydata\ Tahoe.app/Contents/MacOS/Allmydata\ Tahoe "${@}"
Using pkg_resources.require() like this also apparently allows people to install multiple different versions of a package on their system, and Tahoe (if pkg_resources is available to it) will import the version of the package that it requires. I haven't tested this feature.
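For reference, the usual pattern is something like the following (the
package and version pin are just an example, not a claim about what Tahoe
requires):

import pkg_resources
# raises DistributionNotFound or VersionConflict if no suitable version is
# installed; otherwise ensures the matching version is the one on sys.path
pkg_resources.require("Twisted >= 8.1.0")
from twisted.internet import reactor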