What is the most efficient language for web scraping?
We ran this small test to find out which programming language is the most efficient (in terms of speed, CPU, and RAM usage) for web scraping. To keep the comparison fair, we wrote all scraping scripts in the same manner and ran each of them in a single thread. Each scraper ran for 10 minutes on the same machine, at almost the same time: Linux Ubuntu 14.04 (under VirtualBox), 1 CPU core, 4 GB RAM.
We compared the following languages: Diggernaut meta-language (based on a Golang backend), Perl, PHP5, Python 2.7, Python + Scrapy, and Ruby. As the target we used the U.S. Department of Health & Human Services website.
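For reference, the core loop of such a single-threaded scraper can be sketched in Python. The `fetch` and `extract_links` helpers here are placeholders for whatever HTTP client and HTML parser a given script uses, not the code from our actual test scripts (those are attached at the end of the article):

```python
import time
from collections import deque

def crawl(seed_urls, fetch, extract_links, budget_seconds=600):
    """Single-threaded crawl with a fixed time budget (600 s = 10 min),
    mirroring the test setup. `fetch` downloads a page's HTML and
    `extract_links` returns the URLs found in it; both are supplied
    by the caller. Returns the number of pages fetched."""
    deadline = time.monotonic() + budget_seconds
    queue = deque(seed_urls)
    seen = set(seed_urls)   # avoid fetching the same URL twice
    pages = 0
    while queue and time.monotonic() < deadline:
        url = queue.popleft()
        html = fetch(url)   # e.g. requests.get(url).text
        pages += 1
        for link in extract_links(html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return pages
```

Every tested script followed this same fetch-parse-enqueue pattern, so the measured differences come from the language runtimes and libraries rather than the algorithm.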
Let's look at the speed chart:
As you can see, there are three leaders: Diggernaut fetched almost 3,000 pages, Ruby about 2,500, and Python + Scrapy about 1,500. The other languages are far slower.
However, the CPU usage chart paints a somewhat different picture:
First place goes to PHP5, which used just 2.5% of CPU, followed by Diggernaut with 3.5% and Perl with about 4%. The other languages are close behind, except Python + Scrapy: 11% is far too much, in our view.
The last parameter we measured is RAM usage:
The winner here is Diggernaut with 26 MB, followed by Perl with 29 MB and PHP5 with 39 MB. Ruby is the outsider with 154 MB of RAM used.
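If you want to reproduce the RAM numbers from inside a Python script (the article doesn't prescribe a sampling method, so this is just one option), the standard-library `resource` module reports the process's peak resident set size:

```python
import resource

def peak_rss_mb():
    """Peak resident set size of the current process, in MB.
    On Linux, ru_maxrss is reported in kilobytes (on macOS it is
    in bytes, so adjust the divisor there)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0
```

Calling this at the end of a scraper run gives a single peak-memory figure comparable to the ones above.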
To summarize the measurements, we scored each language on a 100-point scale. We scored each metric separately (the best result gets 100 points, the worst gets 0) and then took the average across metrics.
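This kind of best-gets-100, worst-gets-0 scoring is plain min-max normalization. A small sketch, assuming linear interpolation between the best and worst result (the article doesn't state the exact formula used):

```python
def score(values, lower_is_better=True):
    """Linearly map raw measurements to a 0-100 scale:
    the best result scores 100 points, the worst scores 0."""
    lo, hi = min(values), max(values)
    span = float(hi - lo)
    if lower_is_better:
        return [100 * (hi - v) / span for v in values]
    return [100 * (v - lo) / span for v in values]

# Example with the RAM figures quoted above (MB, lower is better):
ram = {"Diggernaut": 26, "Perl": 29, "PHP5": 39, "Ruby": 154}
ram_scores = dict(zip(ram, score(list(ram.values()))))
# Diggernaut scores 100.0 and Ruby scores 0.0
```

Doing the same for the speed and CPU charts and averaging the three per-language scores yields the final ranking.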
Diggernaut with its Golang backend is the clear winner of this run. We should also mention that developing the Diggernaut scraper generally took 1.5-2 times less time.
We have attached the files we used for the test, so you can try it yourself: scripts