This week we were given the opportunity to conduct our IT VET work placement at 2pi Software. During this time we set up a small web server cluster with an Odroid as the master system, running an Nginx load balancer to spread the traffic from incoming requests evenly.
A load balancer is a device that distributes incoming traffic evenly between nodes. This increases the number of requests the system can handle and improves the reliability of the service, which is critical to a company's online presence. Google, for example, relies on load balancing: the massive volume of requests arriving at its servers has to be distributed evenly across all of them.
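As an illustrative sketch (not the exact configuration from our placement), a round-robin Nginx load balancer over three nodes can be defined with an `upstream` block like this; the IP addresses here are hypothetical:

```nginx
# /etc/nginx/nginx.conf (fragment) -- node addresses are made up for illustration
http {
    upstream pi_nodes {
        # Nginx distributes requests round-robin across these servers by default
        server 192.168.0.101;
        server 192.168.0.102;
        server 192.168.0.103;
    }

    server {
        listen 80;
        location / {
            # Forward each incoming request to the pool of backend nodes
            proxy_pass http://pi_nodes;
        }
    }
}
```

Other balancing strategies (such as `least_conn` or weighted servers) can be selected inside the `upstream` block if even round-robin distribution isn't the right fit.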
The requests were then distributed across three Raspberry Pi systems: low-cost microcomputers used for everything from learning basic coding to more advanced projects such as building a mini arcade machine.
The setup was then benchmarked to see how much stress the server could take under a heavy load of requests. We recorded our results and compared the data to make sure the nodes were running at the same efficiency.
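To give a sense of how benchmarks like this work, here is a small self-contained sketch that times a batch of HTTP requests and reports response times at several percentiles, similar to the tables below. It uses only Python's standard library and a throwaway local server as a stand-in for the real cluster; it is not the tool we used on the placement.

```python
import http.server
import socketserver
import threading
import time
import urllib.request

# Spin up a throwaway local HTTP server to act as the system under test.
server = socketserver.TCPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Time a batch of requests, collecting each response time in milliseconds.
times_ms = []
for _ in range(50):
    start = time.perf_counter()
    urllib.request.urlopen(f"http://127.0.0.1:{port}/").read()
    times_ms.append((time.perf_counter() - start) * 1000)

# Sort the samples and read off percentile response times,
# e.g. "100%" is the longest request observed.
times_ms.sort()
for pct in (50, 90, 100):
    idx = min(len(times_ms) - 1, int(len(times_ms) * pct / 100))
    print(f"{pct}% {times_ms[idx]:.0f} ms")

server.shutdown()
```

Real benchmarking tools add concurrency, warm-up runs, and many more samples, but the core idea is the same: record every response time, sort them, and read off the percentiles.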
Our setup looks like this:
We ran multiple tests using different types of media, including:
- Static HTML
- Static HTML with an image
- PHP and MySQL
The results we collected are below; the data presented ranges from the 50th to the 100th percentile. The number to the right of each percentage is the response time in milliseconds.
| Configuration | 100% (longest request) |
| --- | --- |
| Odroid w/ PHP & MySQL (3 nodes) | 2690 ms |
| Odroid w/ static HTML | 1493 ms |
| Odroid w/ PHP | 2288 ms |
| Odroid w/ image HTML | 1273 ms |
These results show that using more complex web tools slows the web server down significantly: the Odroid with PHP & MySQL recorded a longest request roughly double that of the static HTML test.