Remove stale site definitions from the @APACHE, @KERNEL etc. aliases.
* Remove a site that had dropped APACHE
* Remove a @KERNEL site leading to the wrong directory
* Remove dead sites
* Convert FTP/HTTP URLs to HTTPS where possible. Remove duplicates.
Signed-off-by: Hannu Nyman <hannu.nyman@iki.fi>
Remove stale sites from @GNOME alias:
* remove 2 sites that carry stale, 3-year-old content
* remove 2 sites that have dropped GNOME
* convert 2 sites from FTP to HTTP
Signed-off-by: Hannu Nyman <hannu.nyman@iki.fi>
The aria2c command tries to load config from
${XDG_CONFIG_HOME:-${HOME}/.config}/aria2/aria2.conf by default,
which may result in unexpected behavior.
As a replacement, users can set the ARIA2C_OPTIONS environment
variable to customize the arguments passed to aria2c, mirroring how
curl and wget are handled below.
Including --conf-path=/path/to/config.conf in ARIA2C_OPTIONS also
makes it easy to set a custom config file path if needed.
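A sketch of how the variable could be consumed, assuming it is split
into arguments with Text::ParseWords the same way CURL_OPTIONS and
WGET_OPTIONS are:

  use Text::ParseWords;

  # e.g. ARIA2C_OPTIONS='--conf-path=/path/to/config.conf --timeout=20'
  # yields ('--conf-path=/path/to/config.conf', '--timeout=20')
  my @aria2c_opts = $ENV{ARIA2C_OPTIONS} ? shellwords($ENV{ARIA2C_OPTIONS}) : ();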
Signed-off-by: Zhang Hua <zhanghuadedn@gmail.com>
Introduce a new option in the "Advanced configuration options" to
configure a custom download tool.
By declaring a string in "Use custom download tool", a user can force
which command is used to download packages. When the string is empty,
the default tool is curl, with wget as a fallback if curl is not
available.
download.pl officially supports 3 tools: aria2c, curl and wget.
If one of these tools is set in this config option, download.pl will
use its default arguments for it.
If the provided string is something other than aria2c, curl or wget,
the command is used as-is and the download URL is appended at the end
of it.
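A minimal sketch of that selection; pick_download_cmd() and
default_args_for() are illustrative names, not the script's actual
identifiers:

  # stand-in for the script's built-in per-tool argument sets
  sub default_args_for { my ($tool, $url) = @_; return "$tool $url"; }

  # returns the full command line for a given custom-tool string and URL
  sub pick_download_cmd {
      my ($custom, $url) = @_;
      if ($custom eq '') {
          # empty string: default to curl, with wget as fallback
          $custom = system('command -v curl >/dev/null 2>&1') == 0 ? 'curl' : 'wget';
      }
      if ($custom =~ /^(?:aria2c|curl|wget)$/) {
          # officially supported tool: use the script's default arguments
          return default_args_for($custom, $url);
      }
      # anything else is used as-is, with the download URL appended
      return "$custom $url";
  }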
While at it, also tweak the tool selection logic to choose the tool
only once per script invocation, and move the aria2c-specific
variables into the relevant section.
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
Generalize the download tool check and skip the remaining checks once
a download tool has been found.
While at it, also reintroduce c836ca84e8
that was previously dropped with aria2c support.
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
Currently we place the aria2c tmp file in /dev/shm, which is not
present on macOS. Use the OpenWrt tmp directory instead of the
Linux-only /dev/shm to keep compatibility with more operating systems.
Fixes: d391236269 ("download.pl: add aria2c support")
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
With the introduction of aria2c support, curl and wget no longer try to
download the file from mirrors. Fix this regression by emptying the
remaining mirrors list only when aria2c is used.
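Roughly, the corrected logic (variable names are illustrative):

  my $download_tool = 'aria2c';                               # illustrative
  my @urls    = ('https://example.org/foo.tar.xz');           # illustrative
  my @mirrors = ('https://mirror1.example/foo.tar.xz');       # illustrative

  # aria2c takes the whole mirror list in a single invocation, so only
  # then may the remaining list be emptied; curl and wget keep iterating
  if ($download_tool eq 'aria2c') {
      push @urls, @mirrors;
      @mirrors = ();
  }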
Fixes: d391236269 ("download.pl: add aria2c support")
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
Fix whitespace in mirror URLs and replace the for loop with join+map
logic.
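For illustration, the join+map form (assuming a tab-separated URI
list, the format aria2c's input file expects for mirrors of one file):

  my $url_filename = 'foo.tar.xz';                            # illustrative
  my @mirrors = ('https://a.example/dl', 'https://b.example/dl');

  # one line of tab-separated URIs, with no stray whitespace
  my $uris = join("\t", map { "$_/$url_filename" } @mirrors);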
Fixes: d391236269 ("download.pl: add aria2c support")
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
Use the aria2c download tool by default for package downloads if it is
available on the system.
aria2c can make use of multiple mirrors and may improve download speed
in special contexts where servers are hard to reach.
Co-authored-by: Christian Marangi <ansuelsmth@gmail.com>
Signed-off-by: Bradford Zhang <zyc@zyc.name>
[ fix wrong var in the script and improve commit description ]
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
Several users relying on wget for downloads (when curl is not
available on the system) have reported broken download functionality:
wget --tries=5 --timeout=20 --output-document=- https://cdn.kernel.org/pub/linux/kernel/v5.x/linux-5.10.142.tar.xz
http://: Invalid host name.
That happens because '' was passed as an argument, which later got
expanded to http://.
In the context of a list constructor, '' is not nothing; it is an
empty string element. So fix it by using (), as that yields "nothing"
and thus does not introduce an empty string element.
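A sketch of the change (the surrounding code is paraphrased):

  use Text::ParseWords;

  # before: shellwords('') in list context yields (''), i.e. one empty
  # string element, which wget later expanded into the bogus URL http://
  # after: () yields nothing, so no empty argument reaches wget
  my @options = $ENV{WGET_OPTIONS} ? shellwords($ENV{WGET_OPTIONS}) : ();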
Fixes: #10692
Fixes: 90c6e3aedf ("scripts: always check certificates")
Signed-off-by: Jo-Philipp Wich <jo@mein.io> [shellwords() -> ()]
Signed-off-by: Petr Štetiar <ynezz@true.cz>
When running a build in verbose mode (`make V=s`), we can see a lot of
the following warnings when curl is not available on the system:
Can't exec "curl": No such file or directory at scripts/download.pl line 77.
So let's fix it by redirecting stderr to /dev/null.
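A sketch of the fix:

  # run the probe through the shell and silence its stderr, so a missing
  # curl binary no longer prints a warning during the availability check
  my $version = `curl --version 2>/dev/null`;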
Signed-off-by: Petr Štetiar <ynezz@true.cz>
Remove flags from wget and curl instructing them to ignore bad server
certificates. Although other mechanisms can protect against malicious
modifications of downloads, other vectors of attack may be available
to an adversary.
TLS certificate verification can still be disabled by turning off the
"Enable TLS certificate verification during package download" option,
which is enabled by default under "Global build settings" in
"make menuconfig".
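For reference, the flags in question are wget's --no-check-certificate
and curl's --insecure; a sketch with an illustrative $check_certificate
knob standing in for the menuconfig option:

  my $check_certificate = 1;   # illustrative stand-in for the config symbol
  my (@wget_args, @curl_args);

  # only ignore bad server certificates when the user opted out of checks
  push @wget_args, '--no-check-certificate' unless $check_certificate;
  push @curl_args, '--insecure'             unless $check_certificate;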
Signed-off-by: Josh Roys <roysjosh@gmail.com>
[ add additional info on how to disable this option ]
Signed-off-by: Christian Marangi <ansuelsmth@gmail.com>
Ubuntu has started to flag 'which' as deprecated, and it seems 'which'
is not really standard and may vary across distros.
Drop the use of which and use the standard 'command -v'
for this simple task.
'which' is still present in the prereq check in case some
package/script still uses it.
A utility script called command_all.sh is implemented that mimics the
output of 'which -a'.
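command_all.sh itself is a shell script; what such a 'which -a'
replacement has to do, sketched here in Perl:

  # report every executable match for a name across $PATH, not just the
  # first one, mimicking 'which -a'
  my $name = 'curl';
  for my $dir (split /:/, $ENV{PATH}) {
      print "$dir/$name\n" if -x "$dir/$name";
  }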
Signed-off-by: Ansuel Smith <ansuelsmth@gmail.com>
Before this commit, it was assumed that mkhash is in the PATH. While
this was fine for the normal build workflow, this led to some issues if
make TOPDIR="$(pwd)" -C "$pkgdir" compile
was called manually. In most cases, I just saw warnings like this:
make: Entering directory '/home/.../package/gluon-status-page'
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
bash: line 1: mkhash: command not found
[...]
While these were only warnings and the package still compiled
successfully, I also observed that some packages even fail to build
because of this.
After applying this commit, the variable $(MKHASH) is introduced. This
variable points to $(STAGING_DIR_HOST)/bin/mkhash, which is always the
correct path.
Signed-off-by: Leonardo Mörlein <me@irrelefant.net>
Multiple sources are hosted only on OpenWrt's source server. The
source URLs pointing to the server vary based on different epochs of
OpenWrt's history.
Replace them all with @OPENWRT, which is an "empty" mirror and
therefore uses the fallback servers sources.cdn.openwrt.org and
sources.openwrt.org.
Signed-off-by: Paul Spooren <mail@aparcar.org>
In case the default sources for a package fail, use the CDN rather
than our own mirror. In case the CDN fails, fall back to our mirror.
Also remove mirror1, which isn't available anymore.
Signed-off-by: Paul Spooren <mail@aparcar.org>
It seems that after a build the dl/ dir now contains a .hash file for
each source file due to improper cleanup, so fix it by removing those
intermediate files before leaving the download action.
Fixes: 4e19cbc553 ("download: handle possibly invalid local tarballs")
Reported-by: Hannu Nyman <hannu.nyman@iki.fi>
Signed-off-by: Petr Štetiar <ynezz@true.cz>
Currently it's assumed that already downloaded tarballs are always
fine, so no checksum checking is performed and the tarball is used
even if it might be corrupted.
From now on, we're going to always check the downloaded tarballs before
considering them valid.
Steps to reproduce:
1. Remove cached tarball
rm dl/libubox-2020-08-06-9e52171d.tar.xz
2. Download valid tarball again
make package/libubox/download
3. Invalidate the tarball
sed -i 's/PKG_MIRROR_HASH:=../PKG_MIRROR_HASH:=ff/' package/libs/libubox/Makefile
4. Now compile with corrupt tarball source
make package/libubox/{clean,compile}
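A sketch of the new behaviour; verify_hash() is a hypothetical helper
standing in for the script's checksum logic:

  # stand-in for the script's actual md5/sha256 verification
  sub verify_hash { my ($file, $hash) = @_; return 1; }

  my ($target, $filename, $file_hash) = ('dl', 'foo.tar.xz', 'ff...');  # illustrative

  # an existing tarball is only trusted once its checksum matches;
  # otherwise it is removed and fetched again
  if (-f "$target/$filename" && !verify_hash("$target/$filename", $file_hash)) {
      unlink "$target/$filename";
  }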
Signed-off-by: Petr Štetiar <ynezz@true.cz>
With this commit, the download script will try downloading source files
using the filename instead of the url-filename in case the previous
download attempt using the url-filename failed.
This is required because the OpenWrt sources mirrors serve files under
the filename they might be renamed to after downloading. If the
original mirror of a file whose url-filename and filename do not match
goes down, the download failed prior to this patch.
Further improvement can be done by performing this only for the
OpenWrt sources mirrors.
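A sketch of the fallback; try_download() is a hypothetical helper
wrapping one fetch attempt:

  # stand-in for a single fetch attempt returning success/failure
  sub try_download { my ($mirror, $name) = @_; return 0; }

  my ($mirror, $url_filename, $filename) =
      ('https://example.org/dl', 'v1.0.tar.gz', 'foo-1.0.tar.gz');  # illustrative

  # first try the name embedded in the URL, then fall back to the final
  # (renamed) filename under which the OpenWrt mirrors serve the file
  my $ok = try_download($mirror, $url_filename);
  $ok = try_download($mirror, $filename)
      if !$ok && $filename ne $url_filename;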
Signed-off-by: David Bauer <mail@david-bauer.net>
This reverts commit c737a9ee6a.
The source CDN has been discontinued in its current form and will take a
while to be reestablished. Even then it makes little sense to put a CDN
before other CDNs such as kernel.org, apache.org, sourceforge etc.
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
OpenWrt now has a CDN for sources at sources.cdn.openwrt.org which
mirrors sources.openwrt.org.
Downloading sources outside Europe or the US (mainland) could result
in low throughput, severely slowing down the first compilation of the
build system.
This patch adds sources.cdn.openwrt.org as the first mirror to offer
worldwide fast download speeds by default. If the CDN goes down for
whatever reason, the script jumps to the next available mirror and
downloads requested files as before (at regionally varying speeds).
Signed-off-by: Paul Spooren <mail@aparcar.org>
Acked-by: Eneas U de Queiroz <cotequeiroz@gmail.com>
Apache mirrors hold only the latest releases; to download older
releases, one must use archive.apache.org.
Signed-off-by: Jiri Kastner <cz172638@gmail.com>
SourceForge has supported HTTPS for its downloads for a long time now.
I have not been able to see any failures resulting from this change.
Signed-off-by: Rosen Penev <rosenp@gmail.com>
When calling a download target, hash verification is now completely
skipped if we set PKG_HASH=skip.
This allows to easily bump package version:
$ make package/<mypackage>/download PKG_HASH=skip V=s
$ make package/<mypackage>/check FIXUP=1 V=s
This will download the new version of the package, and then automatically
update PKG_HASH with the hash of the new version. Of course, it is still
the responsibility of the packager to ensure that the new tarball is
legitimate, because it is downloaded from a possibly untrusted source.
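Inside the script this amounts to an early bail-out, roughly:

  # 'skip' short-circuits verification before any algorithm is chosen
  sub hash_ok {
      my ($file_hash) = @_;
      return 1 if $file_hash eq 'skip';
      # the usual md5/sha256 verification would continue here
      ...
  }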
Fixes: b30ba14e ("scripts/download.pl: fail loudly if provided hash is unsupported")
Signed-off-by: Baptiste Jonglez <git@bitsofnetworks.org>
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
Acked-by: Stijn Tintel <stijn@linux-ipv6.be>
Signed-off-by: John Crispin <john@phrozen.org>
Currently, if the provided hash is unsupported (length different from 32
or 64 bytes), we happily download the requested file without any kind of
checksum verification.
This is quite dangerous and may provide a false sense of security, because
a single typo in the hash (e.g. one character deleted by mistake) may skip
checksum verification entirely.
Instead, fail immediately if we don't support the provided hash.
In particular, if an external package repository decides to change the
hash algorithm one day, we will now fail loudly instead of skipping
checksum verification without complaints.
Note: if some users of scripts/download.pl knowingly provide an empty hash
because they don't need checksum verification, this change will break
them. This does not seem to be the case currently, but if this feature is
ever needed, an option should be added to download.pl instead of relying
on the hash being empty.
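The check amounts to mapping hash length to algorithm and dying
otherwise; a sketch (the exact command strings may differ from the
script's):

  sub hash_cmd {
      my ($file_hash) = @_;
      return 'mkhash md5'    if length($file_hash) == 32;   # MD5
      return 'mkhash sha256' if length($file_hash) == 64;   # SHA-256
      # anything else is unsupported: fail loudly rather than skip checks
      die "Unsupported hash: $file_hash\n";
  }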
Fixes: eaa4eba10a ("scripts/download.pl: add SHA-256 support")
Signed-off-by: Baptiste Jonglez <git@bitsofnetworks.org>
If CONFIG_DOWNLOAD_FOLDER is set to for example "~/dl", the download
script fails to create the .hash and .dl files with the following
errors:
Cannot create file ~/dl/dropbear-2017.75.tar.bz2.dl: No such file or directory
sh: 1: cannot create ~/dl/dropbear-2017.75.tar.bz2.hash: Directory nonexistent
If the tarball already exists in the ~/dl dir, it's properly found and
used, so this issue only affects the download.pl script.
This patch calls glob() on the target dir parameter, which will expand `~`.
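The fix is essentially a one-liner, since Perl's glob() performs tilde
expansion the way a shell would:

  # '~/dl' expands to '/home/<user>/dl'; paths without '~' pass through
  my $target = glob(shift @ARGV);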
Signed-off-by: Zoltan Gyarmati <mr.zoltan.gyarmati@gmail.com>
Internet2 isn't considered a trusted issuer, meaning that HTTPS links
to rit.edu will fail.
The host mirror.csclub.uwaterloo.ca has a trusted SSL cert and good
peering, so it can replace rit.edu without performance issues.
Signed-off-by: Daniel Engberg <daniel.engberg.lists@pyret.net>
[Jo-Philipp Wich: rewrapped commit message]
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
Because wget doesn't know how to do Negotiate authentication with a proxy
and curl does, use curl if it's present. The user is expected to have a
~/.curlrc that sets the options necessary for any proxy authentication.
A ~/.curlrc is completely optional however and curl will work in exactly
the same manner as wget without one.
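A sketch of the resulting invocation (see the rework notes below); the
vararg form of open() bypasses the shell entirely:

  use Text::ParseWords;

  my $url = 'https://example.org/foo.tar.xz';   # illustrative

  # decompose CURL_OPTIONS into words, then run curl directly (no shell)
  my @opts = $ENV{CURL_OPTIONS} ? shellwords($ENV{CURL_OPTIONS}) : ();
  open(my $curl, '-|', 'curl', @opts, $url)
      or die "Cannot launch curl: $!\n";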
Signed-off-by: Brian J. Murrell <brian@interlinx.bc.ca>
[Jo-Philipp Wich: Rework code to detect curl usability by checking --version,
Use vararg style open() to bypass the shell when downloading,
Use Text::ParseWords to decompose env vars into arguments]
Signed-off-by: Jo-Philipp Wich <jo@mein.io>
In the build system, flock will prevent multiple concurrent downloads
for the same file. However, if one download request for the same file is
waiting for another one to finish, it will result in downloading the
same file twice consecutively.
Prevent this issue by exiting immediately if the file has already been
downloaded.
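A sketch of the early exit (paths illustrative):

  my ($target, $filename) = ('dl', 'foo.tar.xz');   # illustrative

  # a concurrent download may have produced the file while this process
  # waited on the lock; if it exists now, exit instead of fetching again
  exit 0 if -f "$target/$filename";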
Signed-off-by: Felix Fietkau <nbd@nbd.name>
Provide HTTPS URLs when possible; try to keep 8 mirrors per entry,
spread over several locations around the world. Since most active
contributors are in US/CA and/or EU, prioritize mirrors within those
regions when possible.
Signed-off-by: Daniel Engberg <daniel.engberg.lists@pyret.net>
The Apache Software Foundation offers diverse download mirrors.
For packaging Apache software, a new alias @APACHE is defined.
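Inside download.pl an alias like this fans out into concrete mirror
URLs; a sketch of the idea (the mirror host shown is illustrative, not
the actual list):

  my @mirrors;
  my $mirror = '@APACHE/apr';   # as used in a package's PKG_SOURCE_URL

  if ($mirror =~ m!^\@APACHE/(.+)$!) {
      # each alias entry expands to one or more real mirror URLs
      push @mirrors, "https://archive.apache.org/dist/$1";
  }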
Signed-off-by: Heinrich Schuchardt <xypron.glpk@gmx.de>
SVN-Revision: 48270