We used to export 'LC_COLLATE=C' for the environment of the Genode
tools. This was meant to ensure that sorting is always done C-style and
does not depend on the user's locale settings. This is required, for
instance, to ensure that the same archive always yields the same hash.
However, 'export LC_COLLATE=C' is not sufficient as it can be overridden
by an 'LC_ALL' setting in the user's environment. The manual of 'sort'
recommends setting 'LC_ALL=C' locally if you want reliable results, and
this is what this commit does. Furthermore, it removes the former
'export LC_COLLATE=C' directives.
Note that I couldn't find a way to set 'LC_ALL' locally for 'exec ...
sort' in TCL. This is why I set it globally instead, using TCL's 'env'
array.
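For illustration, a minimal sketch of both variants (file names are
examples, not the tool's actual code):

  # shell: set LC_ALL only for the sort invocation
  LC_ALL=C sort input.list > sorted.list

  # TCL: no per-exec equivalent found, hence the global env array
  set env(LC_ALL) C
  exec sort input.list > sorted.list
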
Note that the Make function '$(sort ...)' and the TCL command 'lsort',
unlike the shell command 'sort', are not affected by the user's locale
settings.
Fixes #4144
The check was accidentally rendered ineffective in commit 66d44289e1
'tool/ports: streamline hash tool usage' because '$(call VAR, ...)'
just expands to nothing, without an error, if VAR is undefined.
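This make behaviour can be demonstrated with a one-liner (the function
name CHECK is a placeholder):

  # prints '[]' because make expands the call to the undefined
  # function CHECK to the empty string, without any error
  printf 'all:\n\t@echo [$(call CHECK,foo)]\n' | make -f -
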
Unfortunately, some sites do not support the HTTP/1.1 Range header and
just serve the whole file, which takes its time and may result in a
timeout.
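For reference, such an availability check could request only the first
byte of the file (options and the URL variable are illustrative, not the
tool's exact invocation):

  # ask for the first byte only; servers without range support
  # may reply with the whole file instead
  curl --fail --silent --location --range 0-0 --output /dev/null "$URL"
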
Fixes #2819
Our nightly run did not detect the current stale hashes because only
updated ports (in the sense of a changed hash) are prepared. Other ports
were left untouched apart from the check_port_source step. Now,
check_hash also checks for missing hash-file updates.
SHA1 is susceptible to collision attacks and is generally deprecated.
Source-code archives are particularly vulnerable because the hash digest
can be tweaked by hiding arbitrary data in code comments and in files
not processed during the build.
With this in mind, the 'prepare_port' tool now attempts to verify
digests as SHA256 with a fallback to SHA1. When CHECK_HASH=no is set,
the tool will refuse to verify digests as SHA1. The use of SHA1 for
creating unique port versions is retained because those hashes are
produced locally from inputs stored in a Git history.
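A minimal sketch of this verification order (file names are
placeholders, not the tool's actual code):

  # try SHA256 first, fall back to the deprecated SHA1
  hash=$(cat archive.hash)
  if echo "$hash  archive.tar.gz" | sha256sum -c - >/dev/null 2>&1; then
    echo "verified as SHA256"
  elif echo "$hash  archive.tar.gz" | sha1sum -c - >/dev/null 2>&1; then
    echo "verified as SHA1 (deprecated)"
  else
    echo "hash verification failed" >&2
  fi
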
Issue #2767
We call curl a second time if the first check fails. This gives download
sites time to reconsider their response and helps, for example, to check
the qemu-usb port.
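In shell terms, the retry amounts to something like this (the URL
variable and options are placeholders for the tool's actual check):

  # probe the download site, retry once on failure
  curl --fail --silent --head "$URL" >/dev/null ||
    curl --fail --silent --head "$URL" >/dev/null
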
To improve readability when preparing multiple ports in parallel, we now
also prefix the output of 'git clone' with the port name, coloured dark
yellow. To achieve this, we pipe the git output through sed. In sed,
'\x1b[' resolves to an escape sequence whereas '\033[', which we
normally use, does not. The echo command, on the other hand, resolves
both to escape sequences. Thus, we use the sed-compatible version
throughout. This commit inhibits the progress output of 'git clone' as
it cannot be redirected to sed.
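For illustration, the prefixing boils down to something like this (port
name, colour code, and variables are examples, not the exact tool code):

  # prefix each line of the git output with the dark-yellow port name;
  # \x1b[ is understood by sed, \033[ is not
  git clone "$URL" "$DIR" 2>&1 |
    sed -e "s/^/\x1b[00;33mqemu-usb\x1b[0m> /"
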
Ref #1872
The tool/prepare_port tool is now able to handle a list of ports that
shall be prepared. Additionally, the -j parameter allows one to state
the maximum number of ports to prepare in parallel. If -j is not set by
the user, the tool acts as with -j1. The previous implementation, which
prepares only a single port, was moved to
tool/ports/mk/prepare_single_port.mk and acts as the back end of the new
prepare_port tool. The interface of the new prepare_port tool is
backwards compatible: when called for one port only, the behaviour is
the same as when calling tool/ports/mk/prepare_single_port.mk directly.
Removes "usage" rule from prepare_single_port.mk. Removes shebang line
from prepare_single_port.mk.
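Example invocations (port names are placeholders, the exact argument
handling is sketched from the description above):

  # prepare three ports with at most two preparations in parallel
  tool/prepare_port libc zlib curl -j2

  # single port, same behaviour as before (implies -j1)
  tool/prepare_port libc
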
Ref #1872
The 'check_port_source' tool checks whether all remote sources defined
for a given port are currently available. It returns zero when all
remote sources are available.
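Usage could look like this (path and port name are illustrative):

  # exit code 0 means all remote sources of the port are reachable
  if tool/ports/check_port_source libc; then
    echo "all remote sources available"
  fi
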
Fix #1430
The patch supports both a download-specific UNZIP_OPT(download) and a
general UNZIP_OPT that can be defined across downloads.
UNZIP_OPT(download) overrides UNZIP_OPT.
Note that the '--strip-components=1' argument is not required for unzip.
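A hypothetical port-description excerpt illustrating the override (the
download name 'foo' and the unzip options are made up):

  # general options applied to every download of the port
  UNZIP_OPT      := -o

  # takes precedence over UNZIP_OPT for the download named 'foo'
  UNZIP_OPT(foo) := -o -x 'docs/*'
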
Issue #1357
tool/ports/shortcut
  create symbolic link from 'contrib/<port-name>-<hash>' to
  'contrib/<port-name>'

tool/ports/current
  print the current contrib directory of a port
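Possible usage (port name is an example, argument handling assumed):

  # create the shortcut link for the 'libc' port
  tool/ports/shortcut libc

  # print the current contrib directory of the 'libc' port
  tool/ports/current libc
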
Fixes #1345.
Some downloads are available via HTTPS only, but wget < 1.14 does not
support Server Name Indication (SNI), which is used by some sites.
Therefore, we disable certificate checking in wget and check the
validity of the download via its SIG or SHA instead.
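In wget terms (the URL variable is a placeholder):

  # skip certificate verification; integrity is verified afterwards
  # against the port's SIG or SHA entry
  wget --no-check-certificate "$URL"
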
Fixes #1334.