curl: increase maximum write-buffer size

The original size of 16 KiB impedes the batched processing of network
packets. Increasing the value to 256 KiB reduces the number of context
switches when downloading large files and thereby improves throughput
by more than 25% (base-hw on qemu_x86_64, using fetchurl to download a
100-MiB file via the NIC router from lighttpd).

Issue #4697
Authored by Norman Feske, 2022-12-03 20:52:32 +01:00; committed by Christian Helmuth
parent 8a9974b6f9
commit 0584ac195c
4 changed files with 14 additions and 2 deletions


@@ -1 +1 @@
-ba3c2049149311d614a70359426f5b0a49ea239f
+b2103a900cd655abca670fc02d1574be4785585a


@@ -8,7 +8,7 @@ SIG(curl) := ${URL(curl)}.asc
 KEY(curl) := daniel@haxx.se
 DIR(curl) := src/lib/curl
-PATCHES := src/lib/curl/curl_h.patch
+PATCHES := $(addprefix src/lib/curl/,no_socketpair.patch max_write_size.patch)
 DIRS := include
 DIR_CONTENT(include) = src/lib/curl/include/curl


@@ -0,0 +1,12 @@
++++ src/lib/curl/include/curl/curl.h
+@@ -247,6 +247,10 @@
+ #define CURL_MAX_WRITE_SIZE 16384
+ #endif
++/* Genode: override the default to foster the batching of network packets */
++#undef CURL_MAX_WRITE_SIZE
++#define CURL_MAX_WRITE_SIZE 262144
++
+ #ifndef CURL_MAX_HTTP_HEADER
+ /* The only reason to have a max limit for this is to avoid the risk of a bad
+    server feeding libcurl with a never-ending header that will cause reallocs