Curl 7.51.0 Released

curl.haxx.se

181 points by emillon 9 years ago · 34 comments

neic 9 years ago

I see that Ubuntu 16.04 LTS has version 7.47.0 [1]. It's been 9 months, 9 releases and at least 15 CVEs since then. I can also see that some of the CVEs were reported to distros@openwall [2]. I (naively) assumed that once a CVE was reported there, the package maintainers would update the package and push a release at the same time as the original developer made a public statement. Then I could just update my system and be done with it.

Where is the fault in this chain? How can I, as a maintainer of a few servers, be sure my servers are secure without manually patching every package?

[1] http://packages.ubuntu.com/xenial/libcurl3 [2] http://oss-security.openwall.org/wiki/mailing-lists/distros

EDIT: changed "12 CVEs" to "at least 15 CVEs". The changelog doesn't have CVE numbers in the title for all of them.

  • hannob 9 years ago

    It's the concept of LTS distributions to stick with one version and only patch important bugfixes and security vulnerabilities.

    So if the Ubuntu security team does its job properly then you shouldn't have a reason to worry.

    (However, given the number of security vulns these days, it's often challenging for LTS distributions to backport all security fixes. There are already breakdowns of the LTS concept, e.g. sticking with the latest upstream versions for packages like Chromium where backporting is not realistic.)
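
    In practice, the way to consume those backported fixes promptly is to turn on automatic security updates. A minimal sketch for Ubuntu/Debian (assuming the stock unattended-upgrades package, whose default policy installs security updates only):

      # install and enable automatic security updates
      sudo apt-get install unattended-upgrades
      sudo dpkg-reconfigure --priority=low unattended-upgrades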

  • 0x0 9 years ago

    Debian similarly backports security fixes.

    You can see the status of all known CVEs and which .deb updates patch them here: https://security-tracker.debian.org/tracker/source-package/c...

    The best way to stay on top of things is to subscribe to your distro's security advisory mailing list, for example https://lists.debian.org/debian-security-announce/

  • geofft 9 years ago

    If you go to the "Ubuntu Changelog" link on the right, you can see that they've backported three security fixes (CVE-2016-5419, CVE-2016-5420, and CVE-2016-5421) since the 16.04 LTS release.

    You are trusting Ubuntu's judgment that the remaining 12 CVEs aren't that important. Ubuntu's security team is pretty good, but I don't think there is any distro that is extremely good. In part this is because a distro is on the hook for compatibility of all the software they ship, and expected to prioritize compatibility over security. Anything other than targeted security fixes can cause regressions.
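
    You can inspect what's been backported locally, too. Something like this (assuming apt's changelog support is available on your release) lists the CVE identifiers mentioned in the package changelog:

      # list unique CVE references in the curl package changelog
      apt-get changelog libcurl3 | grep -o 'CVE-[0-9]\+-[0-9]\+' | sort -u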

  • sdeziel 9 years ago

    You can get an overview of the various CVEs affecting curl: https://people.canonical.com/~ubuntu-security/cve/pkg/curl.h...

mitchtbaum 9 years ago

Those looking at this from a "reimplement it in Rust" angle may like:

* https://github.com/hyperium/hyper/

* https://github.com/lukaszwawrzyk/rust-wget

* https://github.com/tokio-rs/tokio-curl

fuhrysteve 9 years ago

Does anyone have an abbreviated explanation of the security vulnerabilities that were addressed here? I recall there was a very ominous post saying to look out for this release because of some nasty stuff they found.

jamies888888 9 years ago

I love cURL. Keep up the good work.

du_bing 9 years ago

What is the biggest usage of curl? I am new to Linux, sorry.

  • hannob 9 years ago

    The biggest usage is probably not the tool itself, but the library.

    It's the de facto standard HTTP library (although it does more than just HTTP). E.g. if you have a PHP script that downloads some data from another webpage, it very often does this with curl.

  • thejosh 9 years ago

    Apart from the oodles of software that depends on curl/libcurl, my favourite thing is to "Copy as cURL" from a Chrome request.
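
    The copied command replays the browser's exact request, headers and cookies included. An illustrative (hypothetical) example of what Chrome produces:

      curl 'https://example.com/api/items' \
        -H 'User-Agent: Mozilla/5.0' \
        -H 'Accept: application/json' \
        -H 'Cookie: session=abc123' \
        --compressed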

  • SturgeonsLaw 9 years ago

    It's handy for giving an API endpoint a slap without needing a scripting language.
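
    For example, a quick POST against a hypothetical endpoint:

      curl -X POST https://api.example.com/v1/items \
        -H 'Content-Type: application/json' \
        -d '{"name": "test"}'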

  • 72deluxe 9 years ago

    The underlying library (libcurl) is extremely useful on all platforms, not just Linux.

  • 0xmohit 9 years ago

    I find it very useful for debugging. For example, when you have a web server that redirects requests, running:

      curl --verbose --location URL
    
    or even

      curl --trace - --location URL
    
    provides the desired information.

  • AstroJetson 9 years ago

    Data transfer over many protocols, using URLs. See their website https://curl.haxx.se for details.

    Super tool; we use it all the time for file transfers.
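
    For example, uploading a file and fetching it back over FTP (host and credentials are illustrative):

      # -T uploads; -o writes the download to a local file
      curl -T report.txt ftp://ftp.example.com/incoming/ --user alice:secret
      curl ftp://ftp.example.com/incoming/report.txt --user alice:secret -o copy.txt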

  • gnur 9 years ago

    I use it for a few things, almost every day.

    Getting my outgoing IP address (to check net connectivity): curl ip.mydomain.net or (to force IPv4) curl -4 ip.mydomain.net

    To check how a page redirects: curl -v example.com (shows headers)

    Interacting with Elasticsearch, mainly showing indexes and their health: curl localhost:9200/_cat/indices

  • dorfsmay 9 years ago

    It's used to send HTTP/HTTPS requests while controlling every aspect of them, from headers to cookies to ignoring (or not) SSL certs, and it has advanced debugging options to show you the entire dialog (-sv) and to resolve a name to a different IP address (--resolve, to test your firewalls, LBs etc...).
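
    For example, this pins the hostname to a specific backend address (useful for testing one server behind a load balancer) while showing the full exchange; the name and IP are illustrative:

      curl -sv --resolve example.com:443:10.0.0.5 https://example.com/health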

    • sglane 9 years ago

      It sure does quite a lot and I love it, but I wish it could handle SIP and WebSockets without too much trouble.

  • erelde 9 years ago

    File transfers, getting HTTP headers, getting the HTML content of a page and piping it to a script to clean it up and read later (long articles); see the example below.

    That's what I use it for, I think, mainly.

    Edit: also between PHP scripts
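
    For example (the URL is illustrative and the cleanup script is hypothetical):

      # -I fetches only the headers; -s silences the progress meter
      curl -sI https://example.com/long-article
      curl -s https://example.com/long-article | ./strip-markup.sh > article.txt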

  • SFJulie 9 years ago

    curl is a trusted way to execute unsafe code on the internet

    https://gnu.moe/wallofshame.md

    • geofft 9 years ago

      "curl | sh" is no worse than "wget && tar xf && ./configure". I have yet to see anyone who knows how to audit a configure script generated by GNU autoconf, which is generally a multiple-tens-of-thousands-of-line monstrosity that generates a bunch of C files and compiles and runs them. FUD about "curl | sh" isn't rooted in any sensible security modeling.

      As a general rule of thumb, if you think everyone around you is independently doing something stupid, you should first pursue the hypothesis that it is your reasoning that is flawed and not the entire rest of the world's.

      • feld 9 years ago

        But you're verifying the signature of the tarball first, right?

        • geofft 9 years ago

          Almost all of the examples on that page use curl https:// | sh. Which, again, makes it a superior option to wget && gpg --verify && ./configure; I have yet to see anyone who is better at PGP fingerprint verification than their OS's SSL stack is at TLS certificate verification. (There are a very small number of people who are as good, but not better.)

kinow 9 years ago

Changelog for this release:

Fixed in 7.51.0 - November 2 2016

Changes:

    nss: additional cipher suites are now accepted by CURLOPT_SSL_CIPHER_LIST
    New option: CURLOPT_KEEP_SENDING_ON_ERROR

Bugfixes:

    CVE-2016-8615: cookie injection for other servers
    CVE-2016-8616: case insensitive password comparison
    CVE-2016-8617: OOB write via unchecked multiplication
    CVE-2016-8618: double-free in curl_maprintf
    CVE-2016-8619: double-free in krb5 code
    CVE-2016-8620: glob parser write/read out of bounds
    CVE-2016-8621: curl_getdate read out of bounds
    CVE-2016-8622: URL unescape heap overflow via integer truncation
    CVE-2016-8623: Use-after-free via shared cookies
    CVE-2016-8624: invalid URL parsing with '#'
    CVE-2016-8625: IDNA 2003 makes curl use wrong host
    openssl: fix per-thread memory leak using 1.0.1 or 1.0.2
    http: accept "Transfer-Encoding: chunked" for HTTP/2 as well
    LICENSE-MIXING.md: update with mbedTLS dual licensing
    examples/imap-append: Set size of data to be uploaded
    test2048: fix url
    darwinssl: disable RC4 cipher-suite support
    CURLOPT_PINNEDPUBLICKEY.3: fix the AVAILABILITY formatting
    openssl: don't call CRYPTO_cleanup_all_ex_data
    libressl: fix version output
    easy: Reset all statistical session info in curl_easy_reset
    curl_global_cleanup.3: don't unload the lib with sub threads running
    dist: add CurlSymbolHiding.cmake to the tarball
    docs: Remove that --proto is just used for initial retrieval
    configure: Fixed builds with libssh2 in a custom location
    curl.1: --trace supports % for sending to stderr!
    cookies: same domain handling changed to match browser behavior
    formpost: trying to attach a directory no longer crashes
    CURLOPT_DEBUGFUNCTION.3: fixed unused argument warning
    formpost: avoid silent snprintf() truncation
    ftp: fix Curl_ftpsendf
    mprintf: return error on too many arguments
    smb: properly check incoming packet boundaries
    GIT-INFO: remove the Mac 10.1-specific details
    resolve: add error message when resolving using SIGALRM
    cmake: add nghttp2 support
    dist: remove PDF and HTML converted docs from the releases
    configure: disable poll() in macOS builds
    vtls: only re-use session-ids using the same scheme
    pipelining: skip to-be-closed connections when pipelining
    win: fix Universal Windows Platform build
    curl: do not set CURLOPT_SSLENGINE to DEFAULT automatically
    maketgz: make it support "only" generating version info
    Curl_socket_check: add extra check to avoid integer overflow
    gopher: properly return error for poll failures
    curl: set INTERLEAVEDATA too
    polarssl: clear thread array at init
    polarssl: fix unaligned SSL session-id lock
    polarssl: reduce #ifdef madness with a macro
    curl_multi_add_handle: set timeouts in closure handles
    configure: set min version flags for builds on mac
    INSTALL: converted to markdown => INSTALL.md
    curl_multi_remove_handle: fix a double-free
    multi: fix infinite loop in curl_multi_cleanup()
    nss: fix tight loop in non-blocking TLS handshake over proxy
    mk-ca-bundle: Change URL retrieval to HTTPS-only by default
    mbedtls: stop using deprecated include file
    docs: fix req->data in multi-uv example
    configure: Fix test syntax for monotonic clock_gettime
    CURLMOPT_MAX_PIPELINE_LENGTH.3: Clarify it's not for HTTP/2

  • teh_klev 9 years ago

    Why repeat this here in an inferior format?

    • IsmaOlvey 9 years ago

      Because the commenter perceived that some people only read the comments (or read the comments first).

    • LeonM 9 years ago

      I think this format is superior for mobile users, it gives them only the relevant information in just a few kB of data.

      In this case, haxx.se is not that bad, but many news sites present so many ads, overlays, non-responsive UIs, dark patterns, etc. that most mobile browsers crash, and loading takes forever due to 10+ MB of ads on a 3G connection, just to display 20 lines of information.

      • teh_klev 9 years ago

        I read this first on my Android phone and can assure you that your copy-paste is inferior, especially due to the use of a fixed-width font causing line truncation.

        Also this kinda thing is about as popular here as "tldr" posts which are quickly suppressed.

        My friendly advice is to not make a habit of this.
