
command line alternatives to wget and so much better!

September 30th, 2010

Most people, myself included, are hooked on wget for whatever quick jobs need doing on our servers. I use it in my scripts, crontab entries, and even for site mirroring and web crawling.


If you’ve used it extensively, you’ve probably started to see its limitations when it comes to downloading large files quickly, multi-threading, caching, and web crawling. Yes, much of that is possible with wget, but it isn’t powerful enough to be worth sticking with for those jobs.
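For context, here is the sort of everyday wget usage I have in mind; the URLs are just placeholders, and the flags are the standard ones from the wget man page:

    # Resume a partially downloaded file
    wget -c http://example.com/big-file.iso

    # Mirror a site for offline browsing
    wget --mirror --convert-links --no-parent http://example.com/docs/

    # Throttle bandwidth so a cron job doesn't hog the line
    wget --limit-rate=500k http://example.com/nightly-backup.tar.gz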

The following is a list of alternatives that’ll blow wget away. All of them work on Unix/Linux, MacOS, Cygwin and probably everything else if you can get them compiled. Example commands for a few of them follow the list.

  1. I’ve come across a little program called Axel. It tries to accelerate downloads by using multiple connections for one download: it opens more than one HTTP/FTP connection and each connection transfers its own, separate part of the file. This comes in handy when a site limits the speed of each connection, since opening several connections at once multiplies the allowed bandwidth. Once all parts of the file are downloaded, they are seamlessly stitched back into the original file. This makes Axel a better-qualified tool than wget for downloading large files quickly.
  2. The second tool that rocks is Aget. It is a multi-threaded download accelerator that supports HTTP downloads and runs from the console. From the Aget website: tests show that Aget succeeds in its objectives; a file of 36,347,010 bytes took 14 minutes 28 seconds via wget, whereas it took 3 minutes 15 seconds via Aget! Amazing indeed.
  3. The third super getter is Prozilla, a download accelerator for Linux. It makes multiple server connections to download a file and also supports FTP search for faster mirrors. It handles both FTP and HTTP, and unless you specify otherwise it downloads as fast as your bandwidth allows.
  4. Then there is Curl, an awesome command line tool for transferring data and manipulating it in ways one couldn’t imagine. Not enough can be said about Curl; it’s an awesome alternative to wget.
  5. The fifth rock star is Manda, a threaded download accelerator. It downloads files over HTTP by splitting them into parts, which speeds up the download. I don’t have much info on it since I didn’t get to test it, but it’s in the same rank as Axel and Aget.
  6. The sixth punk star is GetFast, a multi-threaded file download accelerator that can also fetch Web pages and their sub-links. GetFast is an animal like the rest: it fetches anything and everything at lightning speed.
  7. How about giving Aria2 a try? It’s a lightweight, multi-protocol, multi-source, cross-platform download utility operated from the command line. You can even use it as a BitTorrent client!
  8. And then there is Mulk. It’s not in the same league as the ones above, but it does the same job: a multi-connection command line tool for downloading Internet sites. It is similar to wget and cURL, but manages up to 50 parallel connections. Main features: recursive fetching, Metalink retrieval, segmented downloading, and image filtering by width and height.
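To give you a taste of what a few of these look like in practice, here are some example invocations. The URLs are placeholders and the flags are the common ones from each tool’s documentation, so double-check them against your version’s man page:

    # Axel: 10 connections, alternate progress bar, explicit output file
    axel -n 10 -a -o big-file.iso http://example.com/big-file.iso

    # Curl: follow redirects and save under the remote file name
    curl -L -O http://example.com/big-file.iso

    # Aria2: split the download into 8 segments over up to 8 connections per server
    aria2c -x 8 -s 8 http://example.com/big-file.iso

    # Aria2 as a BitTorrent client
    aria2c file.torrent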

Wget is an amazing tool with lots of options and is perfect for many things, but if you’re looking for speed and getting large things done in a short time, give these apps a try. And the best part is, these apps can pick up from where they left off if the download process gets interrupted.
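For resuming specifically, wget’s -c flag was shown earlier; these are the rough equivalents for the others, again with placeholder URLs:

    # Curl: resume based on the size of the existing local file
    curl -C - -L -O http://example.com/big-file.iso

    # Aria2: continue a previously interrupted download (it keeps a .aria2 control file)
    aria2c -c http://example.com/big-file.iso

    # Axel keeps its progress in a .st state file, so re-running the same
    # command is expected to pick up where it left off
    axel -n 10 -o big-file.iso http://example.com/big-file.iso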


I am probably missing other download accelerators in the same rank as the ones above, so please mention them in the comment area and I will update the article to include them.

Comments
  1. Aekold
    October 1st, 2010 at 12:27

    Good article, thanx. Have some notes though. wget has a -c flag and can continue a download where it stopped, and I must say, unlike most of the other tools mentioned. I couldn’t get aria2c to resume a download, ever, and the other apps I tried had to throw away huge chunks of data because they weren’t sure where to start. I don’t know, maybe it’s my aura, but mostly it’s better for me to download in a single thread just so I know where to continue if it fails.

  2. October 1st, 2010 at 14:56

    Thanks, I’ve tried axel and it works very well. Here is my modified pacman.conf so pacman uses axel instead of wget, in case anyone needs it:

    XferCommand = /usr/bin/axel -a -o %o %u

  3. October 15th, 2011 at 16:19

    Thanks, I really like Axel, thanks for the suggestion!

  4. January 22nd, 2013 at 00:26

    I’d like to mention httpie, a really great web client. Not designed for parallel downloads, but for easily making complex requests. https://github.com/jkbr/httpie http://blogs.operationaldynamics.com/andrew/software/research/testing-restful-apis-with-httpie
