A reported Free Download Manager supply chain attack redirected Linux users to a malicious Debian package repository that installed information-stealing malware.

The malware used in this campaign establishes a reverse shell to a C2 server and installs a Bash stealer that collects user data and account credentials.

Kaspersky discovered the potential supply chain compromise while investigating suspicious domains, finding that the campaign had been underway for over three years.

    • @30p87@feddit.de

And via a website, too. That’s like pushing a car. One of the main strengths of Linux is its open repositories, maintained by reputable sources and checked by thousands of reputable people. Packages are checksummed and therefore can’t be swapped out by malicious parties. Even the AUR is arguably a safer and more regulated source. And FDM is actually in there.

    • @TrustingZebra@lemmy.one

It’s still my favorite download manager on Windows. It often downloads files significantly faster than the download manager built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.

      Do you know of a good download manager for Linux?

      • @FredericChopin_@feddit.uk

        How much faster are we talking?

I’ve honestly never looked at my downloads and thought “huh, you should be quicker.” Well, maybe in the ’90s.

        • no banana

Right? I haven’t thought about download speeds since the 2000s.

        • @TrustingZebra@lemmy.one

FDM does some clever things to boost download speeds. It splits a download into chunks and somehow downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).

          • @somedaysoon@lemmy.world

It only makes a difference if the server is capping the speed per connection. If it’s not, it won’t make a difference.

            • @TrustingZebra@lemmy.one

I guess many servers are capping speeds, then. Makes sense, since I almost never see downloads actually take advantage of my gigabit internet speeds.

              • @somedaysoon@lemmy.world

                It’s interesting to me people still download things in that fashion. What are you downloading?

                I occasionally download something from a web server, but not enough to care about using a download manager that might make it marginally faster. Most larger files I’m downloading are either TV shows and movies from torrents and usenet, or games on steam. All of which will easily saturate a 1Gbps connection.

          • @FredericChopin_@feddit.uk

I’m curious as to how it would achieve that.

            It can’t split a file before it has the file. And all downloads are split up. They’re called packets.

            Not saying it doesn’t do it, just wondering how.

            • @everett@lemmy.ml

              It could make multiple requests to the server, asking each request to resume starting at a certain byte.

                • @drspod@lemmy.mlOP

                  The key thing to know is that a client can do an HTTP HEAD request to get just the Content-Length of the file, and then perform GET requests with the Range request header to fetch a specific chunk of a file.

                  This mechanism was introduced in HTTP 1.1 (byte-serving).
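That flow can be sketched in a few lines of Python. This is a minimal illustration of byte-serving, not FDM’s actual code; the URL and chunk count are placeholders, and it assumes the server answers Range requests with 206 Partial Content:

```python
# Sketch of a parallel ranged download using HTTP/1.1 byte-serving.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

def chunk_ranges(length, parts):
    """Split `length` bytes into `parts` inclusive (start, end) ranges."""
    step = length // parts
    return [(i * step, length - 1 if i == parts - 1 else (i + 1) * step - 1)
            for i in range(parts)]

def fetch_range(url, start, end):
    """GET one chunk; the server should reply 206 Partial Content."""
    req = Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urlopen(req) as resp:
        return resp.read()

def parallel_download(url, parts=8):
    # HEAD request first, to learn the total Content-Length
    with urlopen(Request(url, method="HEAD")) as resp:
        length = int(resp.headers["Content-Length"])
    # Fetch all chunks concurrently, then reassemble in order
    with ThreadPoolExecutor(max_workers=parts) as pool:
        chunks = pool.map(lambda r: fetch_range(url, *r),
                          chunk_ranges(length, parts))
    return b"".join(chunks)
```

A real download manager would also persist partial chunks to disk so an interrupted transfer can resume later.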

        • arglebargle

Just grabbed a gig file: it would take about 8 minutes with a standard download in Firefox. Use a manager like axel and it takes 30 seconds. Then again, speed isn’t everything; it’s also nice to have auto-retry and completion.

          • PenguinCoder

I was just going to recommend this too; use axel, aria2, or even ancient hget.

      • @Xirup@lemmy.dbzer0.com

        JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed source libraries), kGet, etc.

      • arglebargle

axel. Use axel -n8 to open 8 connections/segments, which it will reassemble when done.

    • Gotta admit, it was me. I’ve only used a computer for a short time.
      I got my first laptop 3 years ago, and it broke after just 2 months. And anyway, with its AMD Athlon 64 it greatly struggled with a browser. So really I only started seriously using a computer at the start of 2021, when I got another, usable laptop. And that’s when I downloaded freedownloadmanager.deb. Thankfully, I didn’t get that redirect, so it was a legitimate file.

    • @Hamartiogonic@sopuli.xyz

      Oh, I know someone who adds the word “free” to various search words like “free pdf reader” or “free flash player” (happened a very long time ago). He’s also the kind of person who I can imagine having a bunch of viruses and malware on his computer.

    • @Honytawk@lemmy.zip

      People not well versed in Linux.

You know, the non-techies, who the Linux community claims should know such things but who obviously don’t.

    • @gaael@lemmy.world

      I’ve installed and used it, and still do.

My internet connection is not that reliable, and when I download big files that aren’t torrents (say, >1000 MB) and the download is interrupted by a disconnect, Firefox often has trouble resuming it while FDM doesn’t.

      FDM also lets me set download speed limits, which means I can still browse the internet while downloading.

      It’s not my main tool for downloading stuff, but it has its uses.

  • @drspod@lemmy.mlOP

    The article mentions how to check for infection:

    If you have installed the Linux version of the Free Download Manager between 2020 and 2022, you should check and see if the malicious version was installed.

    To do this, look for the following files dropped by the malware, and if found, delete them:

    /etc/cron.d/collect
    /var/tmp/crond
    /var/tmp/bs
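That check can be automated with a small script. A sketch using the exact paths from the article; run it as root if you want it to delete the files:

```python
# Check for (and optionally remove) the FDM malware artifacts
# listed in the article. Deleting them requires root privileges.
import os

IOC_PATHS = [
    "/etc/cron.d/collect",
    "/var/tmp/crond",
    "/var/tmp/bs",
]

def find_artifacts(paths=IOC_PATHS):
    """Return the known malware paths that exist on this system."""
    return [p for p in paths if os.path.exists(p)]

if __name__ == "__main__":
    found = find_artifacts()
    if found:
        print("Possible infection, deleting:", ", ".join(found))
        for p in found:
            os.remove(p)
    else:
        print("No known FDM malware artifacts found.")
```

Note that deleting the dropped files removes the known artifacts but is no substitute for rotating any credentials the stealer may have collected.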
    
  • Hairyblue

    What is a free download manager and why would someone need one?

    • lemmyvore

Back in the day, when most stuff was on FTP and HTTP and your connection was crap and could drop at any time, you’d use a download manager to smooth things along. It could resume downloads when the connection dropped, keep a download going for days on end, and work around the bandwidth limitations of the source site by using multiple parallel connections that pulled different file chunks. In some ways it was very similar to how we use BitTorrent today.

      It was also useful to keep a history of stuff you’d downloaded in case you needed it again, manage the associated files etc.

      • @drspod@lemmy.mlOP

work around the bandwidth limitations of the source site by using multiple parallel connections that pulled different file chunks

        Also for files which had multiple different mirror sites you could download chunks from multiple mirrors concurrently which would allow you to max out your bandwidth even if individual mirrors were limiting download speeds.
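The multi-mirror trick boils down to spreading the byte ranges of one file over several URLs. A toy sketch (the mirror URLs are hypothetical, and it assumes every mirror serves an identical copy and supports Range requests):

```python
# Round-robin byte ranges of one file across several mirrors.
def assign_chunks(ranges, mirrors):
    """Pair each (start, end) byte range with a mirror URL, round-robin."""
    return [(mirrors[i % len(mirrors)], r) for i, r in enumerate(ranges)]

plan = assign_chunks(
    [(0, 9), (10, 19), (20, 29)],
    ["https://mirror-a.example/f.iso", "https://mirror-b.example/f.iso"],
)
# Each entry in `plan` says which mirror to ask for which chunk,
# so a slow mirror only throttles its own share of the file.
```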

    • Dhs92

It’s a download client that can pause/resume downloads, as well as use multiple connections to download files.

      • Hairyblue

Like BitTorrent?

        I guess I just don’t download that much stuff.

        • @db2@sopuli.xyz

BitTorrent basically works in chunks and can download them nonlinearly. Downloading from a site in the basic way gets the file from start to finish; a download manager lets you stop and pick up where you left off, as long as the server you’re getting the file from is configured to allow it.

          https://github.com/agalwood/Motrix

          (Note: I don’t use that or any other download manager and haven’t since Windows 95, it’s linked as example only)

    • @puffy@lemmy.world

Back in the 2000s, browsers were really bad at downloading big things over slow connections since they couldn’t resume; a brief disconnect could destroy hours of progress. But I don’t think you need this anymore.

  • @gabriele97@lemmy.g97.top

How is it possible that users noticed strange behavior (new cron jobs) but didn’t check the scripts launched by those jobs 😱
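The check really is quick: list the system-wide cron entries and read what each one runs. A sketch (it only covers /etc/cron.d; per-user crontabs and /etc/crontab would need the same treatment):

```python
# Read every file in /etc/cron.d and show the commands it schedules.
from pathlib import Path

def list_cron_jobs(cron_dir="/etc/cron.d"):
    """Map each cron file name to its non-comment lines."""
    jobs = {}
    root = Path(cron_dir)
    if not root.is_dir():
        return jobs
    for f in sorted(root.iterdir()):
        if f.is_file():
            jobs[f.name] = [line for line in f.read_text().splitlines()
                            if line.strip() and not line.lstrip().startswith("#")]
    return jobs

if __name__ == "__main__":
    for name, lines in list_cron_jobs().items():
        print(name, "->", lines)
```

On an infected machine this would have surfaced the /etc/cron.d/collect entry pointing at the dropped binary.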

    • @jsdz@lemmy.ml

      Linux popularity going up means the percentage of users who know what cron is goes down.

        • @d3Xt3r@lemmy.nzM

They actually are, kind of. It’s called Tron: Ares and it’s been in production hell for some years, the most recent delay being due to the ongoing writers’ strike. Filming is expected to start after the strike is over, but personally my enthusiasm for the movie died after they announced Jared Leto as part of the cast.

      • @Ferk@lemmy.ml

        If they were complaining about cronjobs being created (like the post says), then they must have known what cron is.

  • _cnt0

    malicious Debian package repository

    *laughs in RPM*

    This comment was presented by the fedora gang.

    • @puffy@lemmy.world

Right, but you could do the same with RPM. Not everyone is aware of this, but installing a package executes scripts with root access to your system.
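You can inspect those scripts before installing: dpkg-deb -e extracts a .deb’s control area, including the preinst/postinst maintainer scripts that dpkg runs as root, without installing anything. A sketch (the package path is a placeholder):

```python
# Inspect the maintainer scripts a .deb would run as root, without
# installing it. Relies on the dpkg-deb CLI being available.
import os
import subprocess
import tempfile

MAINTAINER_SCRIPTS = ("preinst", "postinst", "prerm", "postrm")

def read_scripts(control_dir):
    """Return the maintainer scripts found in an extracted control dir."""
    scripts = {}
    for name in MAINTAINER_SCRIPTS:
        path = os.path.join(control_dir, name)
        if os.path.exists(path):
            with open(path) as fh:
                scripts[name] = fh.read()
    return scripts

def deb_scripts(deb_path):
    """Extract a package's control area and return its maintainer scripts."""
    with tempfile.TemporaryDirectory() as d:
        # dpkg-deb -e <archive> <dir> unpacks only the control information
        subprocess.run(["dpkg-deb", "-e", deb_path, d], check=True)
        return read_scripts(d)
```

For RPM, `rpm -qp --scripts package.rpm` prints the equivalent scriptlets.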

  • @rufus@discuss.tchncs.de

    Mmmh. You kinda deserve being infected if you do things like this. Every beginner tutorial specifically tells you not to download random stuff from the internet and ‘sudo’ install it. Every Wiki with helpful information has these boxes that tell you not to do it. I’m okay if you do it anyways. But don’t blame anyone else for the consequences. And don’t tell me you haven’t been warned.

Also, I wonder about the impact this had. It went unnoticed for 3 years, so I can’t imagine it affected many people; the text says it affected few, and it didn’t have any real impact.

    But supply chain attacks are real. Don’t get fooled. And don’t install random stuff. Install the download manager from your package repository instead.

    • @ipkpjersi@lemmy.ml

I kind of disagree. Applications often require root permissions to install themselves, since regular users can’t write to certain folders like /opt.

      Also, do you really think that people would actually read the source and then compile all their software themselves? Do you do the same?

Generally, though, I do agree: you’re probably fine installing software from your distro’s repos. But even that’s not bulletproof, and it’s not like third-party repos are uncommon either.

      • @rufus@discuss.tchncs.de

Yes. I do it the correct way: I use my favourite distro’s package manager to install software. This way it’s tested, a few people have had a look at the changes, and sometimes a CI script automatically determines whether the installer affects other parts of the system. I go to great lengths to avoid doing it any other way. (I’ve been using some Flatpaks recently, though. And sometimes I install software only for a separate user account, mainly when it’s proprietary or niche.)

It is super rare that I install random stuff from the internet, or ‘curl’ and then pipe the installer script into a root shell. And when I do, I put in some effort to see if it’s okay. I think I had a quick glance at most of the install .sh scripts before continuing. So yes, I kinda do my best. And I isolate that stuff and don’t put it in the same container that does my email.

Most of the time you can avoid doing it the ‘stupid way’. And even the programming package managers like ‘npm’, ‘cargo’, … have started to take supply chain attacks seriously.

  • @rurb@lemmy.ml

I had to essentially read the same thing four times before there was any new information in this post. Not sure if that’s a Jerboa thing or what, but it probably could have been avoided.

    • @drspod@lemmy.mlOP

      Yeah I agree, sorry about that. I thought that the body-text field was mandatory to fill in, so I used the introductory paragraph from the article so as not to editorialize.

    • PenguinCoder

      curl --proto '=https' --tlsv1.2 -sSf https://download-more-ram.sh | sh

      PHEW thanks, I’m safe.