Download files with threads easily

You have a file with a list of URLs that you want to download. You already know the wget trick:

wget -i down.txt

However, wget fetches the URLs one after another, so downloading a lot of files can be slow.

Well, let’s launch wget instances in parallel and fetch those files quickly. With concurrent.futures, it’s just a few lines:

#!/usr/bin/env python

import os
import concurrent.futures
from threading import Lock

lock = Lock()
INPUT = "down.txt"
THREADS = 10  # number of parallel wget instances

def download(url):
    cmd = "wget -q {url}".format(url=url)
    with lock:
        print(cmd)
    os.system(cmd)

def main():
    with concurrent.futures.ThreadPoolExecutor(max_workers=THREADS) as ex:
        with open(INPUT) as f:
            for line in f:
                line = line.rstrip("\n")
                ex.submit(download, line)


if __name__ == "__main__":
    main()
Thanks to defnull on reddit who directed me towards concurrent.futures.
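If you'd rather not shell out to wget at all, the same pattern works with the standard library alone. Here is a minimal sketch using urllib.request with the same thread pool idea; the filename_from_url helper and the THREADS value are my own additions, not part of the original script:

```python
#!/usr/bin/env python3

import concurrent.futures
import os
import urllib.parse
import urllib.request

INPUT = "down.txt"
THREADS = 10  # assumed pool size, tune to taste

def filename_from_url(url):
    # Mimic wget's default: use the last path component,
    # falling back to index.html for bare directory URLs.
    name = os.path.basename(urllib.parse.urlparse(url).path)
    return name or "index.html"

def download(url):
    # Fetch the URL and save it under its derived local name.
    urllib.request.urlretrieve(url, filename_from_url(url))

def main():
    with open(INPUT) as f:
        urls = [line.strip() for line in f if line.strip()]
    with concurrent.futures.ThreadPoolExecutor(max_workers=THREADS) as ex:
        ex.map(download, urls)

if __name__ == "__main__" and os.path.exists(INPUT):
    main()
```

No lock is needed here since nothing is printed from the workers; urlretrieve writes each file independently.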
