Since moving over to Linux from Windows, I have been able to find a suitable replacement for literally every application except a download manager. On Windows I used IDM, which is quite well known. Some may wonder: why bother with a download manager at all? Well, it's useful for those sites that serve large files over a slow connection. It can multiply the download speed by 8x or more, and I find real value in that, especially when downloading large ISO files from slow servers.
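For anyone wondering how that 8x happens: these managers open several connections and fetch different byte ranges of the same file in parallel. On Linux, aria2c and axel do the same trick from the command line. A sketch (the URL here is made up):

```shell
# Hypothetical mirror link - substitute a real ISO URL before running.
url='https://example.com/distro.iso'

# aria2c mimics IDM: -x sets connections per server, -s splits the
# file into that many byte ranges and fetches them in parallel.
cmd="aria2c -x 8 -s 8 $url"
echo "$cmd"   # shown instead of run; paste it once the URL is real

# axel does the same thing with a single flag:
#   axel -n 8 "$url"
```

Whether you actually see a speedup depends on the server: it only helps when the bottleneck is per-connection throttling, not your own line.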
I have found some download managers through search engines, but they all function in a wonky way and look terrible at the same time. Nothing as good as IDM so far.
If not, well, let me introduce you. It's not necessarily wonky in function, but it is kinda ugly. But guess what - it works! Most of the time, at least.
Some sites have hardened their protocols in response to AI scraping (for training models like DALL·E, Stable Diffusion, GPT, etc.), which makes it harder for things like jDownloader to get every result you are looking for. So it's not the fault of the app per se.
However, you only mentioned ISOs, and these work 99.99% of the time. Only some specific, usually closed-source sites don’t work.
It’s flagged as out-of-date, but it’s actually a self-updating app. Though I must say, sometimes those updates are too frequent, and lately you have to try the update a few times to get it to work. Apart from that, it’s everything I need in a download manager, and I’ve been using it since I learned of its existence close to 10 years ago.
Yes, I just ran another quick test (feeling lazy) to find out whether wget really is the best…
I tested axel and aria2c: wget definitely gives the nicest output, and nothing was quicker.
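For what it's worth, a re-run of that comparison could look like the following. The URL is a placeholder, and the parallel-connection flags only matter against servers that throttle each connection:

```shell
# Placeholder URL; point it at a real mirror to reproduce the test.
url='https://example.com/test.iso'

# Throwaway comparison; download to /dev/null where possible so disk
# speed doesn't skew the timings:
#   time wget -O /dev/null "$url"
#   time axel -n 8 -o /dev/null "$url"        # 8 connections
#   time aria2c -x 8 -s 8 -o test.iso "$url"  # 8 connections, 8 segments
```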
Well, the main thing I learn from this thread is that most folks using Linux just can't be bothered: they'll investigate with a quick web search or whatever, but they can't be doing with the extra layers of complexity, plugins, or whatever else is needed.
It’s trivial to copy a URL and wget (paste) an ISO - or yt-dlp (paste) for video.
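That workflow really is just one command each way. With made-up URLs it looks like this, and wget's -c flag even resumes a partial download, which covers most of what a download manager does:

```shell
iso='https://example.com/distro.iso'          # hypothetical direct link
video='https://example.com/watch?v=abc123'    # hypothetical video page

# -c continues a partially-downloaded file instead of starting over:
#   wget -c "$iso"
# yt-dlp works out the actual media URL behind the page:
#   yt-dlp "$video"
```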
Despite being a newer Linux user, I don't mind getting my hands dirty and using the terminal. Quite frankly, I really like the terminal. If wget can do what I just asked @Kresimir, I will use it.
Edit: I will say, however, that I think it's unfortunate that KDE's 'kget' is not integrated with Firefox. It seems like such a no-brainer to keep that alive, but for some reason it isn't. An AUR package for the integration did exist, but it was abandoned a few years ago.
I tried gwget - I’m mostly a UI guy, what can I say.
It would seem that it can't fetch files from multiple URLs at once. That is, I use a Firefox add-on that copies the URLs of all open tabs; when I pasted them into the field, it just gave me a "file name too long" error - meaning it either:

- thinks the URLs are all one file name,
- simply can't read multiple URLs, or
- needs them pasted in some more specific format, so it didn't add them to the download list.
Doing the same thing with jDownloader2 (using the Firefox copy-URLs add-on while jDownloader2 is open), it automatically detects that I just copied several links and adds them to the app, waiting for me to hit download. ← This can be turned off, by the way.
I’m not saying it needs to work like jDownloader2, but this is a core functionality for me. Is it possible to do this? Maybe I need to use another UI or maybe it has flags I don’t know about?
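Plain wget does have a flag for exactly this: -i (--input-file) reads a file with one URL per line, which matches what the tab-copying add-on produces. A sketch with made-up URLs:

```shell
# Simulate what the Firefox add-on puts on the clipboard:
# several URLs, one per line (all hypothetical).
cat > urls.txt <<'EOF'
https://example.com/a.iso
https://example.com/b.iso
https://example.com/c.iso
EOF

# -i makes wget fetch every URL in the list, one after another:
#   wget -i urls.txt
```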
It's the Java UI, and you may have heard that it has ads in it. The AUR package strips out the popup-style ads. Now it just has ONE unobtrusive ad that I forgot about until I started writing this. Very negligible.
In this case, I need to ensure there is only a space between the URLs, as opposed to a newline.
However, so far wget doesn't work with sites as such; it works with specific files on sites, which means you need the direct link.
Willing to do more testing though because wget has a gazillion flags.
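Two of those flags cover the cases above. xargs sidesteps the space-vs-newline problem by turning each line into its own argument, and recursive mode (-r) is the closest wget gets to working with a whole site rather than one link. Sketched here with made-up URLs:

```shell
# One URL per line, as the tab-copying add-on provides (hypothetical links):
printf '%s\n' 'https://example.com/a.iso' 'https://example.com/b.iso' > urls.txt

# xargs passes each line to wget as a separate argument,
# so the separator no longer matters:
#   xargs wget < urls.txt

# For a directory listing rather than a direct link, wget can crawl:
# -r recurse, -np never ascend above the start dir, -A iso keep only .iso
#   wget -r -np -A iso https://example.com/isos/
```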