In the process of converting to autoconf, the kconfig options
were not properly translated.
Fix that.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Dumping the backtrace has been broken since changeset #652e56d6d35a:
scripts: execute each steps in a subshell
We can spawn sub-sub-shells in some cases.
The way the fault handler works is to dump the backtrace, but to avoid
printing it once for every sub-shell (which could get quite confusing),
it simply exits when it detects that it is being run in a sub-shell,
leaving it to the top-level shell to dump the backtrace.
Because each step is executed in its own sub-shell, the variable arrays
that contain the step name, the source file and line number, are lost
when exiting the per-step sub-shell.
Hence, the backtrace is currently limited to printing only the top-level
main procedure of the shell.
Fix this as follows:
- when dumping the backtraces for the steps and the functions, remember
that it was dumped, and only dump it if it was not already dumped
- at the top-level shell, print the hints
Also, rename the top-level step label.
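A minimal sketch of the dump-once idea (illustrative only, not the actual
crosstool-NG code; a marker file is used here because a plain variable
would not survive the sub-shell exits):

    # $$ is inherited by sub-shells, so all nested shells agree on the file
    backtrace_marker="/tmp/ct-backtrace-dumped.$$"

    dump_backtrace() {
        if [ ! -e "${backtrace_marker}" ]; then
            local i
            for (( i=1; i<${#FUNCNAME[@]}; i++ )); do
                printf '  in %s() at %s:%s\n' \
                       "${FUNCNAME[i]}" "${BASH_SOURCE[i]}" "${BASH_LINENO[i-1]}"
            done
            : > "${backtrace_marker}"       # remember that it was dumped
        fi
        # Sub-shells just exit; only the top-level shell prints the hints
        if [ ${BASH_SUBSHELL} -gt 0 ]; then
            exit 1
        fi
        echo ">> For more hints, look at the build log."
    }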
Reported-by: Benoît Thébaudeau <benoit.thebaudeau@advansee.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Some architectures support a mixed hard/soft floating point, where
the compiler emits hardware floating point instructions, but passes
the operands in core (aka integer) registers.
For example, ARM supports this mode (to come in the next changeset).
Add support for softfp cross compilers to the GCC and GLIBC
configuration. This is needed for Ubuntu and other distros that use softfp.
Signed-off-by: Michael Hope <michael.hope@linaro.org>
[yann.morin.1998@anciens.enib.fr: split the original patch]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
When hardfloat is selected, we need to pass that selection down to
./configure and into the CFLAGS.
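As an illustration only (the CT_ARCH_FLOAT_HW and CT_TARGET_CFLAGS names
and the exact option spellings are assumptions, not the actual change):

    if [ "${CT_ARCH_FLOAT_HW}" = "y" ]; then
        extra_config+=( "--with-float=hard" )                      # for ./configure
        CT_TARGET_CFLAGS="${CT_TARGET_CFLAGS} -mfloat-abi=hard"    # for the CFLAGS
    fi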
Signed-off-by: Michael Hope <michael.hope@linaro.org>
[yann.morin.1998@anciens.enib.fr: split the original patch]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
With the upcoming softfp support, the case..esac test would become
a bit convoluted if it were to test three different booleans.
Introduce a new blind string config option that defaults to the
selected floating point type.
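For illustration, with a single string holding the float mode, the build
scripts can switch on one value instead of testing three booleans (a
sketch; the CT_ARCH_FLOAT name, its values and the configure option are
assumptions):

    case "${CT_ARCH_FLOAT}" in
        hard)   float_config="--with-float=hard"   ;;
        softfp) float_config="--with-float=softfp" ;;
        soft)   float_config="--with-float=soft"   ;;
    esac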
Signed-off-by: Michael Hope <michael.hope@linaro.org>
[yann.morin.1998@anciens.enib.fr: split the original patch]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Change CT_ExtractGit so that it clones the repository, instead of just
symlinking it. After cloning, any given ref is checked out, or if no
ref is given, the HEAD of the repository is checked out.
This makes CT_Extract behave for git repositories much like it does
for tarballs, so that it can, for example, be used to pass glibc-ports
as a git repository.
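A minimal sketch of that behaviour (function and argument names are
illustrative, not the actual CT_ExtractGit code):

    do_git_extract() {
        local repo="$1" dest="$2" ref="$3"
        git clone "${repo}" "${dest}"
        if [ -n "${ref}" ]; then
            git -C "${dest}" checkout "${ref}"   # check out the given ref
        fi                                       # no ref: stay on the clone's HEAD
    }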
Signed-off-by: "Esben Haabendal" <esben.haabendal@prevas.dk>
[yann.morin.1998@anciens.enib.fr: fix incomplete var rename]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Virtually all FTP servers available on-line support passive FTP.
At least, this is the case for the servers crosstool-NG needs to
connect to.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Currently, we use either wget or curl, whichever is installed.
If both are installed, both are used, which means that it
takes a while to try all the extensions.
Remove use of wget, and use only curl.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Some packages are available as LZMA tarballs. LZMA is a relatively recent
compression algorithm; it's slightly better than bzip2, but offers much
faster decompression. LZMA is now deprecated in favor of XZ, but some
packages switched to LZMA when XZ was not yet available, or still in its
infancy. The latest XZ (which totally obsoletes LZMA) offers a backward
LZMA-compatible utility, so we can check for 'lzma' nonetheless.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
./configure does check for the presence of gzip and bzip2, so we can
safely use them in the build scripts.
On the other hand, more recent formats (eg. XZ) are not yet widely
available, and we neither want to, nor can, force the user to install
them as a pre-requisite.
So, build up a list of allowed tarball formats based on the available
decompressors. For now, this is a static list, but the upcoming XZ
support will conditionally add to this list.
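For illustration, such a list could be built like this (a sketch; the
variable name and the exact extensions are assumptions):

    # gzip and bzip2 are guaranteed by ./configure
    tar_formats=".tar.gz .tgz .tar.bz2"
    # More recent formats are only advertised if a decompressor is found
    if command -v xz >/dev/null 2>&1; then
        tar_formats=".tar.xz ${tar_formats}"
    fi
    if command -v lzma >/dev/null 2>&1; then
        tar_formats=".tar.lzma ${tar_formats}"
    fi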
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
When downloading via svn/cvs/... an attempt to retrieve from the
mirror is made. If the mirror does not have the required tarball,
an error message is printed. This is misleading, as the download
may later succeed via svn/cvs/...
Remove the messages about failed downloads altogether.
At the same time, use "if ... then ... fi" instead of "... && ..."
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
This is needed later, when we will conditionally use both the
upstream and the mirror URLs.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Currently, the mirror can be used either:
- as a fallback in case upstream is unavailable (default behavior)
- as the preferred source for downloads
But the most common use-case seems to be to provide a truly local LAN
mirror to speed up downloads in big corporate networks, and/or to
provide a 'trusted' source for the tarballs.
So, make the following changes:
- if a mirror is specified, always try that before trying upstream
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
The cvs download helper looks in the local tarballs dir to see if it
can find a pre-downloaded tarball and, if it does not find one, does
the actual fetch from upstream via cvs.
In the process, it does not even try to get a tarball from the local
mirror, which can be useful if the mirror has been pre-populated
manually (or with a previously downloaded tree).
Fake a tarball get with the standard tarball-download helper, but
without specifying any upstream URL, which makes the helper directly
try the LAN mirror.
Of course, if no mirror is specified, no URL will be available, and
the standard cvs retrieval will kick in.
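A sketch of the trick (CT_GetFile is the helper mentioned elsewhere in
this log; the surrounding names are illustrative):

    do_cvs_get() {
        local basename="$1"
        # Ask the tarball helper with no upstream URL: it can then only
        # try the LAN mirror (if one is configured).
        if CT_GetFile "${basename}"; then
            return 0
        fi
        # Otherwise, do the real cvs checkout (hypothetical helper)
        do_real_cvs_checkout "${basename}"
    }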
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
The svn download helper looks in the local tarballs dir to see if it
can find a pre-downloaded tarball and, if it does not find one, does
the actual fetch from upstream via svn.
In the process, it does not even try to get a tarball from the local
mirror, which can be useful if the mirror has been pre-populated
manually (or with a previously downloaded tree).
Fake a tarball get with the standard tarball-download helper, but
without specifying any upstream URL, which makes the helper directly
try the LAN mirror.
Of course, if no mirror is specified, no URL will be available, and
the standard svn retrieval will kick in.
Reported-by: ANDY KENNEDY <ANDY.KENNEDY@adtran.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
When retrieving tarballs from upstream, if no URL was given, do not
fail; simply ignore that fact.
This will be used later, when the SVN helper calls the standard
helper to try the LAN mirror before trying svn.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Some archives, like those of the 2011.07 revisions of Linaro GCC, contain a
folder named differently from the archive basename, which leads to errors
afterwards, e.g. when patching:
gcc-linaro-4.5-2011.07.tar.bz2 extracts to gcc-linaro-4.5-2011.07-0/
This patch changes CT_Extract() to force the extraction of all archives into a
folder named after the archive basename. E.g.:
gcc-linaro-4.5-2011.07.tar.bz2 now extracts to gcc-linaro-4.5-2011.07/
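One way to do this with GNU tar is sketched below (paths are illustrative;
the actual CT_Extract() change may differ):

    base="gcc-linaro-4.5-2011.07"
    mkdir -p "${CT_SRC_DIR}/${base}"
    # Whatever the top-level folder inside the tarball is called, the
    # files end up under a folder named after the archive basename
    tar -x -j -f "${base}.tar.bz2" \
        -C "${CT_SRC_DIR}/${base}" --strip-components=1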
Signed-off-by: "Benoît THÉBAUDEAU" <benoit.thebaudeau@advansee.com>
In case of eglibc, some add-ons that were previously external are
now internal (bundled with the main sources).
So we do not want to fail if an add-on can't be downloaded; we
want to postpone the check until we can extract the main archive.
So:
- try to retrieve the add-on
- if it fails, print a warning instead of calling CT_Abort
- return 1
So, components that want to catch and handle the error can do so,
while components that do not will gracefully fail thanks to our
catching every error.
Bonus: it works without changing any existing retrieval procedure! :-)
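A sketch of the warn-and-return pattern (CT_GetFile and CT_DoLog appear
elsewhere in this log; the WARN level and the remaining names are
assumptions):

    do_addon_get() {
        local addon="$1"; shift
        # Remaining arguments are the upstream URLs (possibly none)
        if ! CT_GetFile "${addon}" "$@"; then
            CT_DoLog WARN "Could not retrieve add-on '${addon}'"
            return 1    # let the caller decide whether this is fatal
        fi
    }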
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
For glibc/eglibc, if the add-on can not be extracted, we want to catch
the error to print a meaningful error message.
So:
- try to extract the tarball
- if it fails, print a warning instead of calling CT_Abort
- return 1
So, components that want to catch and handle the error can do so,
while components that do not will gracefully fail thanks to our
catching every error.
Bonus: it works without changing any existing extract procedure! :-)
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
In case of glibc/eglibc, some add-ons that were previously external are
now internal (bundled with the main sources).
So we do not want to fail if an add-on tarball can't be downloaded; we
want to postpone the check until we can extract the main archive.
So:
- try to download the tarball
- if it fails, print a warning instead of calling CT_Abort
- return 1
So, components that want to catch and handle the error can do so,
while components that do not will gracefully fail thanks to our
catching every error.
Bonus: it works without changing any existing retrieval procedure! :-)
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Reformat the error messages:
- strip ${CT_LIB_DIR} from scripts path names
- strip ${CT_TOP_DIR} from build.log path and docs path
- overall shorter lines
- point to the known-issues file
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
In case date(1) does not support nanosecond resolution, it does
not interpret '%N' and leaves it as-is, so we have to remove it.
Note that some versions replace '%N' with 'N', so we have to
take this into account as well.
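For illustration, the clean-up can look like this (the exact expression
used in scripts/functions may differ):

    now="$( date '+%s%N' )"
    # Strip a trailing '%N' or 'N' left over by date(1) implementations
    # without nanosecond support; a purely numeric value is untouched.
    now="$( printf '%s' "${now}" | sed -e 's/%\{0,1\}N$//' )"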
Reported-by: Kyle Grieb <grieb.kyle@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Using CT-NG with the progress bar disabled still prints CR ('\r')
characters to the output. When the output is captured to a file as part
of an automated build, this shows up as extra empty lines.
For example:
------------------------------------------------------------
[INFO ] Performing some trivial sanity checks
[INFO ] Build started 20110404.113619
[INFO ] Building environment variables
[EXTRA] Preparing working directories
[EXTRA] Installing user-supplied crosstool-NG configuration
------------------------------------------------------------
Signed-off-by: Javier Viguera <javier.viguera@digi.com>
Managing the shared version of the companion libraries
has become cumbersome.
Also, it will one day be possible to use the companion
libraries from the host distribution, and then we will
be able to easily use either shared or static libs.
As a side note, while working on the canadian-rework
series, it has become quite a bit more complex to properly
handle shared companion libraries, as they need to be
built both for the build and host systems. That's not
easy to handle. At all.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
In a lot of places, we need to call commands with specific
variable settings, à la:
var1=val1 var2=val2 /foo/bar/buz opt1 opt2
Unfortunately, we currently can not log the variable settings.
Enhance CT_DoExecLog with a crude heuristic that works pretty well
and that can also log the variable settings.
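As an illustration, a crude version of such a heuristic (a sketch, not the
actual CT_DoExecLog code): leading words that look like VAR=value are
treated as variable settings, logged, and passed to env.

    log_and_exec() {
        local -a settings
        # Peel off leading VAR=value words
        while [ $# -gt 0 ]; do
            case "$1" in
                *=*) settings+=( "$1" ); shift ;;
                *)   break ;;
            esac
        done
        printf 'Running: %s %s\n' "${settings[*]}" "$*"
        env "${settings[@]}" "$@"
    }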
Reported-by: ANDY KENNEDY <ANDY.KENNEDY@adtran.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Users tend to look for the build log in the current working directory,
rather than in the toolchain's installation dir. While bundling the build
log in the toolchain installation dir is nice for distribution and review,
it can be easier to have the build log readily available in the working
directory, as it is quicker to get to it.
So, the build log stays in the working directory until the toolchain is
completely and successfully built, and then a (compressed) copy is made.
Reported-by: Trevor Woerner <twoerner@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Doing a chmod on the whole source dir after every package
is extracted can take a hell of a lot of time.
The offending packages are far from legion, and they now
have their own chmod u+w to clean up their own mess...
Reported-by: ANDY KENNEDY <ANDY.KENNEDY@adtran.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Consider the buildtools install directory as a prefix directory,
that is, install buildtools in prefix/bin/, not in prefix/.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
There is absolutely *no* reason for the buildtools (wrappers to gcc, g++,
as, ld... for the local machine) to be in the toolchain directory. Moreover,
they are removed after the build completes.
Move them out of the toolchain directory, and into the build directory
(but still in the part specific to the current toolchain). This means we
no longer need to explicitly remove them either, BTW, but we do need to
save/restore them for the restart feature.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Computed paths may contain double slashes.
This is not an issue but it is just ugly to look at.
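For example, a purely cosmetic clean-up could look like this (a sketch):

    dir="/usr//local///bin"
    dir="$( printf '%s' "${dir}" | sed -e 's:/\{2,\}:/:g' )"
    echo "${dir}"    # -> /usr/local/bin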
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Download to an intermediate temp file, and rename it to its final
name only if the download succeeds.
This catches both a failed download and the case where the user
interrupts the download. Thus, a partial download gets discarded,
and we no longer try to extract a partial tarball, which we would
previously have done.
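A sketch of the idea (curl is used as an example downloader; the names are
illustrative):

    do_download() {
        local url="$1" dest="$2" tmp="$2.tmp"
        if curl -f -L -o "${tmp}" "${url}"; then
            mv "${tmp}" "${dest}"   # only a complete download gets its final name
        else
            rm -f "${tmp}"          # discard the partial file
            return 1
        fi
    }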
Suggested by Thomas PETAZZONI.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
It can happen, in some circumstances, that one can succeed where
the other would fail. Those cases involve convoluted enterprise
networks with proxies playing tricks.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
The save/restore state output is voluminous; using this flag allows us
to quickly see or ignore when something is just being saved.
[Yann E. MORIN: this is a blind log level, and is used only to search
in the build-log afterward.]
Signed-off-by: Anthony Foiani <anthony.foiani@gmail.com>
I ran into some minor difficulties looking through the build log for a
particular file: I wasn't interested in seeing it unpacked, but only
when it is built or installed. Adding these two levels allows me to
differentiate between those cases.
[Yann E. MORIN: Those are blind log levels, and are used only to search
in the build-log afterward.]
Signed-off-by: Anthony Foiani <anthony.foiani@gmail.com>
To decide whether we need to backup the companion libraries,
do not rely on the !shared case. In the future, other cases
may require not saving the companion libraries (eg. when using
the ones provided by the host distro).
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
It happens from time to time that the server mis-behaves, and breaks the
connection right in the middle of nowhere, for no good reason, leaving us
with a partial file, on which the extract pass would choke.
Remove partial downloads, to fail early.
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Even when parallel (//) downloads are not enabled, aria2 can
fail on some servers (eg. uclibc.org).
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Replace the over-engineered and buggy test in CT_SanitizePath
with a straightforward string pattern match, and also
handle empty PATH elements, which are equivalent to ".".
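A sketch of such a pattern-match approach (the real CT_SanitizePath may
differ in its details):

    sanitize_path() {
        local p new_path
        local IFS=:
        for p in ${PATH}; do
            case "${p}" in
                ""|.|/tmp)
                    ;;          # drop empty, '.' and /tmp entries
                *)
                    if [ -d "${p}" ]; then
                        new_path="${new_path:+${new_path}:}${p}"
                    fi
                    ;;
            esac
        done
        PATH="${new_path}"
    }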
Thanks-To: Arnaud Lacombe <lacombar@gmail.com>
Signed-off-by: Johannes Stezenbach <js@sig21.net>
Add CT_SanitizePath function which removes entries referring to ., /tmp
and non-existing directories from $PATH, and call it early in the
build script.
If . is in PATH, gcc-4.4.4 build breaks:
[ALL ] checking what assembler to use...
/tmp/build/targets/arm-unknown-linux-uclibcgnueabi/build/gcc-core-static/arm-unknown-linux-uclibcgnueabi/bin/as
...
[ALL ] config.status: creating as
i.e. "as" is supposed to be the arm-unknown-linux-uclibcgnueabi cross assembler,
but config.status creates a local "as" script which is calling the
host assembler.
Signed-off-by: Johannes Stezenbach <js@sig21.net>
[Yann E. MORIN: style fixes + explanations]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@anciens.enib.fr>
Choose the call to get the directory mode depending on $CT_SYS_OS.
yann.morin.1998@anciens.enib.fr:
CT_SYS_OS has changed on Linux systems: it now only contains the kernel
name, "Linux", and not the system name with the "GNU/" prefix.
Saving and restoring the steps requires saving/restoring multiple
directories. Depending on the configuration, some may not exist.
Add a wrapper that checks before creating/extracting the tarballs.
Not all target tuples consist of a VENDOR, KERNEL and SYSTEM part; build up
the tuple in such a way that no extra or trailing dashes are added to CT_TARGET.
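For illustration (the CT_TARGET_* part names follow the usual convention
but are assumptions here):

    CT_TARGET="${CT_TARGET_ARCH}"
    for part in "${CT_TARGET_VENDOR}" "${CT_TARGET_KERNEL}" "${CT_TARGET_SYS}"; do
        # Only append non-empty parts, so no spurious '-' sneaks in
        if [ -n "${part}" ]; then
            CT_TARGET="${CT_TARGET}-${part}"
        fi
    done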
Signed-off-by: Bart vdr Meulen <bartvdrmeulen@gmail.com>
On some systems (eg. *BSD and Darwin), date does not support nanoseconds
(%N) precision. Instead of printing '%N' in this case, it just prints 'N'.
Fix the sed expression to handle this case.
Add a git wrapper to retrieve components from their git tree.
Add a git wrapper to create a working copy (in our tarballs dir).
Recognise git trees when searching for local copies.
By default, curl doesn't follow redirects, which breaks sourceforge
downloads. Add the -L option to curl to fix this.
Curl also saves the html error page as a file even when it gets a 404,
which breaks http downloads when using the fallback system. Add the -f
option to curl to fix this.
Signed-off-by: Richard Strand <richard.strand@icomera.com>
In case the remote file does not exist (and probably for some
other reasons as well), aria2 nonetheless creates an empty file
(or a non-empty one in some other cases).
The solution is to delete the file whenever aria2 fails.
aria2 is a powerful downloader that is capable of chunked and
parallel retrieval.
Due to limitations in the crosstool-NG retrieval facilities, it is not
possible to take full advantage of aria2. It might happen that, in the
future, those limitations get lifted, so we can use features such as
parallel downloading from more than one server at the same time. For now,
it should still speed up downloads thanks to parallel downloading of chunks.
The newlib "team" rolls new releases about once a year (December).
That is quite a long time between releases in case code has been fixed.
So, allow the user to use a CVS snapshot to benefit early from fixes
and enhancements to newlib.
Some projects' modules (eg. newlib) are checked out into a subdirectory
rather than into their own directory. Handle this case in the CT_GetCVS
function.
They do not belong in here; just let the user
configure his/her system appropriately.
-------- diffstat follows --------
/trunk/scripts/build/libc/eglibc.sh | 1 0 1 0 -
/trunk/scripts/functions | 100 0 100 0 -----------------------------
/trunk/config/global/download.in | 148 0 148 0 -------------------------------------------
3 files changed, 249 deletions(-)
- introduce the config dir, where components can store their config files
- move the munged uClibc config file to the config dir
- now, the state dir really is an indication that a build can be restarted
Thanks to Groleo Marius <groleo@gmail.com> for spotting the inconsistency
of the state dir usage, and suggesting this change.
/trunk/scripts/build/libc/uClibc.sh | 6 3 3 0 +++---
/trunk/scripts/crosstool-NG.sh.in | 9 7 2 0 +++++++--
/trunk/scripts/functions | 15 12 3 0 ++++++++++++---
3 files changed, 22 insertions(+), 8 deletions(-)
"chmod u+w" the full src tree: because of nochdir and cvs snapshots, we can't reliably know were we are...
/trunk/scripts/functions | 11 3 8 0 +++--------
1 file changed, 3 insertions(+), 8 deletions(-)
... I added a step after
"debug" called "finish", and moved the code that came after the
step-processing loop in crosstool.sh into a do_finish function
in scripts/functions. Thus, it is now
possible to restart after the "debug" step to re-do the
final few things (clean and compress).
/trunk/scripts/crosstool-NG.sh | 38 0 38 0 --------------------------------------
/trunk/scripts/functions | 42 42 0 0 ++++++++++++++++++++++++++++++++++++++++++
/trunk/steps.mk | 3 2 1 0 ++-
3 files changed, 44 insertions(+), 39 deletions(-)
- renaming the directory in CT_ExtractAndPatch is wrong:
- patches against the C library add-ons may be built against the short *or* the long name... :-(
- the symlink is more robust, even if less nice
- renaming the directory _after_ CT_ExtractAndPatch is too late:
- if patches are against the short name, and we renamed to the long name, patches don't apply
- so we'll never reach the point where we rename
/trunk/scripts/build/libc/glibc.sh | 1 0 1 0 -
/trunk/scripts/build/libc/eglibc.sh | 1 0 1 0 -
/trunk/scripts/functions | 2 1 1 0 +-
3 files changed, 1 insertion(+), 3 deletions(-)
CT_LIBC_FILE:
- that one was not easy, as it had sneaked into CT_ExtractAndPatch
- which in turn made CT_ExtractAndPatch have references to C library addons
- which in turn relieved the C library _extract functions from doing their own job
- which in turn imposed some nasty tricks in CT_ExtractAndPatch
- which in turn made life easier for the DUMA _get and _extract functions
- which unveiled some bizarre behavior of pushd and popd (see the
demonstration right after this list):
- if using something like 'pushd foo |bar':
- the directory is *neither* changed
- *nor* is it pushed onto the stack
- which made popd fail
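A minimal demonstration of the pushd/popd point (in bash, each side of a
pipe runs in its own sub-shell):

    pushd /tmp | cat    # pushd runs in a sub-shell: neither the current
                        # directory nor the directory stack is changed
    pwd                 # still the original directory
    popd                # fails: the directory stack is empty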
CT_MakeAbsolutePath:
- used only to make CT_LOCAL_TARBALLS_DIR canonical
- which is (almost) useless:
- hopefully, the user entered a full path already
- if it's not the case, too bad...
/trunk/scripts/build/debug/200-duma.sh | 5 1 4 0 +--
/trunk/scripts/build/libc/glibc.sh | 61 32 29 0 +++++++++++++++++---------------
/trunk/scripts/build/libc/uClibc.sh | 16 10 6 0 +++++---
/trunk/scripts/build/libc/eglibc.sh | 48 26 22 0 ++++++++++++++-----------
/trunk/scripts/crosstool.sh | 8 0 8 0 ----
/trunk/scripts/functions | 77 15 62 0 ++++++++--------------------------------
6 files changed, 84 insertions(+), 131 deletions(-)
- retrieve from local storage (CT_GetLocal)
- save to local storage (CT_SaveLocal)
- retrieve from CVS (CT_GetCVS)
- make CT_GetFile and CT_GetCVS use CT_GetLocal and CT_SaveLocal
/trunk/scripts/functions | 126 91 35 0 +++++++++++++++++++++++++++++++++++++++++++++-----------------
1 file changed, 91 insertions(+), 35 deletions(-)
- vendor and alias must not contain spaces
- vendor must not contain dashes '-'
- sed_expr must not generate an alias with a space in it
/trunk/scripts/functions | 17 16 1 0 ++++++++++++++++-
/trunk/config/toolchain.in | 1 1 0 0 +
2 files changed, 17 insertions(+), 1 deletion(-)
We need GNU Awk? Then check for, and use 'gawk', not plain 'awk'.
Be a little more verbose if a tool was not found.
/trunk/configure | 7 4 3 0 ++++---
/trunk/scripts/build/kernel/linux.sh | 2 1 1 0 +-
/trunk/scripts/functions | 16 8 8 0 ++++++++--------
/trunk/scripts/saveSample.sh | 4 2 2 0 ++--
4 files changed, 15 insertions(+), 14 deletions(-)
Be a little less verbose when extracting (and patching) files.
/trunk/scripts/functions | 7 2 5 0 ++-----
1 file changed, 2 insertions(+), 5 deletions(-)
Use a little bit more of CT_DoExecLog.
/trunk/scripts/functions | 35 18 17 0 ++++++++++++++++++-----------------
1 file changed, 18 insertions(+), 17 deletions(-)
Simplify CT_DoExecLog: it does not support variable assignments before the command, anyway.
/trunk/scripts/functions | 5 1 4 0 +----
1 file changed, 1 insertion(+), 4 deletions(-)
Print the time at which a step was finished (along with the time it took to complete).
/trunk/scripts/functions | 7 5 2 0 +++++--
1 file changed, 5 insertions(+), 2 deletions(-)
It is similar to CT_DoLog, but instead of printing its arguments, it uses them as a command, and logs the output of that command.
/trunk/scripts/functions | 8 8 0 0 ++++++++
1 file changed, 8 insertions(+)
It seems to help gcc somewhat in telling the correct endianness to ld, which otherwise sticks with little-endian even when the target is big-endian (eg. armeb-unknown-linux-uclibcgnueabi).
There's still work to do, especially finishing the gcc part, which is not in this commit.
/trunk/scripts/functions | 9 7 2 0 +++++++--
1 file changed, 7 insertions(+), 2 deletions(-)
network, most probably due to proxies. Have downloaders (wget and curl)
time out on connections that are too slow (they don't by default).
scripts/functions | 17 12 5 0 ++++++++++++-----
1 file changed, 12 insertions(+), 5 deletions(-)
Rationale:
Most of the time, soft-float problems are caused by this sucker of gcc:
it has support for soft-float for all of the targets I've tried so far,
but does not activate that code until you delve into half a dozen
files to convince it to build and link the support code...
So, yes: gcc has soft-float support. And again, yes: gcc is a sucker.
If you select debugging of ct-ng, you then have two new options:
- DEBUG_CT_PAUSE_STEPS : pause between each step,
- DEBUG_CT_SAVE_STEPS : save the state between each step.
To restart a saved state, just set the RESTART make variable when calling make:
- make RESTART=<step_name>
- pipe size in Linux is only 8*512=4096 bytes
- pipe size is not settable
- when the feeding process spits out data faster than the eating
process can read it, then the feeding process stalls after 4KiB
of data sent to the pipe
- for us, the progress bar would spawn a sub-shell for every line,
and the sub-shell would in turn spawn a 'date' command.
Which was sloooww as hell, and would cause some kind of a
starvation: the pipe was full most of the time, and the
feeding process was stalled all this time.
Now, we use internal variables and a little hack based on an offset
to determine the elapsed time. Much faster this way, but still
CPU-intensive.
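For illustration, the elapsed time can be derived from shell-internal
state instead of forking date(1) for every line (a sketch using bash's
SECONDS; the names and output format are illustrative, not the actual
progress-bar code):

    start=${SECONDS}
    while read -r line; do
        elapsed=$(( SECONDS - start ))
        printf '\r[%02d:%02d] %s' $(( elapsed / 60 )) $(( elapsed % 60 )) "${line}"
    done
    printf '\n'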
Add a uClibc-0.9.29 patch directory with one patch (from me!).
Update the armeb-unknown-linux-uclibc sample to uClibc-0.9.29.
Some eyecandy in the gdb build process.
- add a framework to easily add new ones
- add gdb as a first debug facility
- add patches for gdb
After the kernel checked its installed headers, clean up the mess of .checked.* files.
Reorder scripts/crosstool.sh:
- dump the configuration early
- renice early
- get info about build system early, when setting up the environment
- when building a cross or native toolchain (and only in this case), the host tools are those of the build system
- elapsed time calculations moved to scripts/functions
Remove handling of the color: it's gone once and for all.
Update tools/addToolVersion.sh:
- handle debug facilities
- commonalise some code
- remove dead tools (cygwin, tcc)
Point to my address for bug reports.
- the tarball directory is considered a local copy, and tarballs are copied to a working area,
- the sources and build directories (CT_SRC_DIR and CT_BUILD_DIR) are now computed, and no longer an option,
- the build dir has been renamed from 'build' to 'targets'.
That should ease preparing a tarball of the resulting target.