Blocking IP Ranges to Protect Server Resources

March 25, 2025 · 5 min read · 835 words

Learn effective techniques to block malicious IPs and bots, preventing server resource overload and abuse. This guide covers methods for both Apache and Nginx servers with practical code examples.

Author: Extraparse

Introduction

Protecting your server from malicious IPs and bots is crucial to maintain optimal performance and security. This guide explores six effective techniques to block unwanted traffic, ensuring your server resources are safeguarded against abuse and overload.

1. Geolocation-Based IP Blocking

Geolocation blocking allows you to restrict access from specific countries or regions. This method is effective if you notice malicious traffic originating from particular locations.

Apache Configuration Example

<IfModule mod_geoip.c>
    GeoIPEnable On
    GeoIPDBFile /usr/share/GeoIP/GeoIP.dat
    SetEnvIf GEOIP_COUNTRY_CODE CN BlockCountry
    <RequireAll>
        Require all granted
        Require not env BlockCountry
    </RequireAll>
</IfModule>

Nginx Configuration Example

http {
    # Requires ngx_http_geoip_module and a GeoIP country database
    geoip_country /usr/share/GeoIP/GeoIP.dat;
    map $geoip_country_code $blocked_country {
        default 0;
        CN 1;
        RU 1;
    }
    server {
        if ($blocked_country) {
            return 403;
        }
        # ... other configurations ...
    }
}

2. Blocking IPs Using .htaccess (Apache)

The .htaccess file allows you to block specific IP addresses or ranges, preventing them from accessing your server.

Example:

<RequireAll>
    Require all granted
    Require not ip 192.168.1.10
    Require not ip 203.0.113.0/24
</RequireAll>
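
These directives only take effect if the server configuration permits .htaccess overrides for the directory in question. A minimal sketch of the corresponding server-level setting, assuming a document root of /var/www/html (adjust the path to your setup):

<Directory "/var/www/html">
    # AuthConfig allows the Require/RequireAll directives above to be used in .htaccess
    AllowOverride AuthConfig
</Directory>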

3. Nginx Configuration for IP Blocking

Nginx provides directives to block individual IPs or entire ranges directly within its configuration files.

Example:

http {
    deny 192.168.1.10;
    deny 203.0.113.0/24;
    allow all;
    server {
        # ... server configurations ...
    }
}
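
For longer lists, it is often cleaner to keep the deny rules in a separate file and pull them in with Nginx's include directive. A minimal sketch, using a hypothetical /etc/nginx/blocklist.conf that you maintain yourself:

# /etc/nginx/blocklist.conf -- one rule per line
deny 192.168.1.10;
deny 203.0.113.0/24;

# nginx.conf -- load the blocklist into the http context
http {
    include /etc/nginx/blocklist.conf;
    server {
        # ... server configurations ...
    }
}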

4. Utilizing Firewall Rules (e.g., iptables)

Implementing firewall rules provides a robust layer of security by controlling incoming and outgoing traffic at the network level.

Example:

# Block a single IP
sudo iptables -A INPUT -s 192.168.1.10 -j DROP

# Block an IP range
sudo iptables -A INPUT -s 203.0.113.0/24 -j DROP

# Save iptables rules
sudo iptables-save | sudo tee /etc/iptables/rules.v4
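
Once the blocklist grows to hundreds of ranges, a long chain of individual iptables rules becomes slow to match and tedious to maintain. A common alternative is to load the ranges into an ipset set and match it with a single rule; a minimal sketch, assuming the ipset package is installed and using "blocklist" as an illustrative set name:

# Create a set that holds IP addresses and CIDR ranges
sudo ipset create blocklist hash:net

# Add offending addresses and ranges to the set
sudo ipset add blocklist 192.168.1.10
sudo ipset add blocklist 203.0.113.0/24

# A single iptables rule drops any packet whose source matches the set
sudo iptables -I INPUT -m set --match-set blocklist src -j DROP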

5. Leveraging Security Modules or Services

Security modules like ModSecurity for Apache or services like Fail2Ban can automate the detection and blocking of malicious IPs based on suspicious activity.

Example with Fail2Ban:

# Install Fail2Ban
sudo apt-get install fail2ban

# Open the local jail configuration for editing
sudo nano /etc/fail2ban/jail.local

# Add the following jail to /etc/fail2ban/jail.local
# (the apache-auth filter parses the Apache error log)
[apache-auth]
enabled = true
port = http,https
filter = apache-auth
logpath = /var/log/apache*/*error.log
maxretry = 5
bantime = 3600
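
After editing the jail configuration, restart the service and confirm the jail is active. On a systemd-based system, something along these lines should work:

# Apply the new jail configuration
sudo systemctl restart fail2ban

# List active jails and inspect the apache-auth jail
sudo fail2ban-client status
sudo fail2ban-client status apache-auth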

6. Blocking User Agents and Specific Bots

In addition to blocking IP ranges, it's important to restrict access from user agents and specific bots that generate unwanted traffic or attempt to exploit your server. Identifying and blocking malicious user agents further strengthens your server's security.

Blocking Bots and User Agents with .htaccess (Apache)

The .htaccess file can be configured to block specific bots and user agents using mod_rewrite conditions on the User-Agent header. Below are examples of how to implement these blocks.

Basic Bot Blocking

# Block specific bots by their User-Agent string
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (MJ12bot|VeBot|BLP_bbot|datagnionbot|YandexBot|PetalBot|YandexImages|bingbot|AspiegelBot|dotbot|Baiduspider) [NC]
RewriteRule .* - [F,L]

Advanced Bot Blocking

<IfModule mod_rewrite.c>
    RewriteEngine On
    # Block a comprehensive list of bots, known bad hosts, and spam referrers
    RewriteCond %{HTTP_USER_AGENT} (alexibot|majestic|mj12bot|rogerbot|econtext|eolasbot|eventures|liebaofast|nominet|oppo\sa33|acapbot|acoonbot|asterias|attackbot|backdorbot|becomebot|binlar|blackwidow|blekkobot|blexbot|blowfish|bullseye|bunnys|butterfly|careerbot|casper|checkpriv|cheesebot|cherrypick|chinaclaw|choppy|clshttp|cmsworld|copernic|copyrightcheck|cosmos|crescent|cy_cho|datacha|demon|diavol|discobot|dittospyder|dotbot|dotnetdotcom|dumbot|emailcollector|emailsiphon|emailwolf|extract|eyenetie|feedfinder|flaming|flashget|flicky|foobot|g00g1e|getright|gigabot|go-ahead-got|gozilla|grabnet|grafula|harvest|heritrix|httrack|icarus6j|jetbot|jetcar|jikespider|kmccrew|leechftp|libweb|linkextractor|linkscan|linkwalker|loader|masscan|miner|mechanize|morfeus|moveoverbot|netmechanic|netspider|nicerspro|nikto|ninja|nutch|octopus|pagegrabber|petalbot|planetwork|postrank|proximic|purebot|pycurl|python|queryn|queryseeker|radian6|radiation|realdownload|scooter|seekerspider|semalt|siclab|sindice|sistrix|sitebot|siteexplorer|sitesnagger|skygrid|smartdownload|snoopy|sosospider|spankbot|spbot|sqlmap|stackrambler|stripper|sucker|surftbot|sux0r|suzukacz|suzuran|takeout|teleport|telesoft|true_robots|turingos|turnit|vampire|vikspider|voideye|webleacher|webreaper|webstripper|webvac|webviewer|webwhacker|winhttp|wwwoffle|woxbot|xaldon|xxxyy|yamanalab|yioopbot|youda|zeus|zmeu|zune|zyborg) [NC,OR]
    RewriteCond %{REMOTE_HOST} (163data|amazonaws|colocrossing|crimea|g00g1e|justhost|kanagawa|loopia|masterhost|onlinehome|poneytel|sprintdatacenter|reverse.softlayer|safenet|ttnet|woodpecker|wowrack) [NC,OR]
    RewriteCond %{HTTP_REFERER} (semalt\.com|todaperfeita) [NC,OR]
    RewriteCond %{HTTP_REFERER} (blue\spill|cocaine|ejaculat|erectile|erections|hoodia|huronriveracres|impotence|levitra|libido|lipitor|phentermin|pro[sz]ac|sandyauer|tramadol|troyhamby|ultram|unicauca|valium|viagra|vicodin|xanax|ypxaieo) [NC]
    RewriteRule .* - [F,L]
</IfModule>

Blocking Bots and User Agents with Nginx

Nginx can also be configured to block specific user agents and bots by modifying the server configuration files.

Basic Bot Blocking

http {
    map $http_user_agent $bad_bot {
        default 0;
        ~*(MJ12bot|VeBot|BLP_bbot|datagnionbot|YandexBot|PetalBot|YandexImages|bingbot|AspiegelBot|dotbot|Baiduspider) 1;
    }
    server {
        if ($bad_bot) {
            return 403;
        }
        # ... other configurations ...
    }
}

Advanced Bot Blocking

http {
    map $http_user_agent $bad_bot {
        default 0;
        ~*(alexibot|majestic|mj12bot|rogerbot|econtext|eolasbot|eventures|liebaofast|nominet|oppo\sa33|acapbot|acoonbot|asterias|attackbot|backdorbot|becomebot|binlar|blackwidow|blekkobot|blexbot|blowfish|bullseye|bunnys|butterfly|careerbot|casper|checkpriv|cheesebot|cherrypick|chinaclaw|choppy|clshttp|cmsworld|copernic|copyrightcheck|cosmos|crescent|cy_cho|datacha|demon|diavol|discobot|dittospyder|dotbot|dotnetdotcom|dumbot|emailcollector|emailsiphon|emailwolf|extract|eyenetie|feedfinder|flaming|flashget|flicky|foobot|g00g1e|getright|gigabot|go-ahead-got|gozilla|grabnet|grafula|harvest|heritrix|httrack|icarus6j|jetbot|jetcar|jikespider|kmccrew|leechftp|libweb|linkextractor|linkscan|linkwalker|loader|masscan|miner|mechanize|morfeus|moveoverbot|netmechanic|netspider|nicerspro|nikto|ninja|nutch|octopus|pagegrabber|petalbot|planetwork|postrank|proximic|purebot|pycurl|python|queryn|queryseeker|radian6|radiation|realdownload|scooter|seekerspider|semalt|siclab|sindice|sistrix|sitebot|siteexplorer|sitesnagger|skygrid|smartdownload|snoopy|sosospider|spankbot|spbot|sqlmap|stackrambler|stripper|sucker|surftbot|sux0r|suzukacz|suzuran|takeout|teleport|telesoft|true_robots|turingos|turnit|vampire|vikspider|voideye|webleacher|webreaper|webstripper|webvac|webviewer|webwhacker|winhttp|wwwoffle|woxbot|xaldon|xxxyy|yamanalab|yioopbot|youda|zeus|zmeu|zune|zyborg) 1;
    }
    server {
        if ($bad_bot) {
            return 403;
        }
        # ... other configurations ...
    }
}

Additional Firewall Rules for Enhanced Protection

Beyond blocking user agents, additional access rules in .htaccess can provide an extra layer of protection by denying known malicious IP addresses and filtering suspicious request patterns. Below are examples of how to implement these rules.

# Block malicious IP addresses (Apache 2.4 syntax)
<RequireAll>
    Require all granted
    Require not ip 47.128.124.223
    Require not ip 47.128.98.51
    Require not ip 4.227.36.106
    Require not ip 4.227.36.46
    Require not ip 89.64.100.148
    Require not ip 109.206.213.203
    Require not ip 37.248.154.66
    Require not ip 78.8.13.175
    Require not ip 85.128.143.191
    Require not ip 31.11.210.169
    Require not ip 85.128.143.197
    Require not ip 83.27.210.133
    Require not ip 109.243.130.198
    Require not ip 77.222.255.47
    Require not ip 85.128.143.49
    Require not ip 4.227.36.63
    Require not ip 4.227.36
    Require not ip 20.171.207.226
    Require not ip 20.171.207
</RequireAll>

# Block suspicious query strings and request URIs to stop common exploit probes
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{QUERY_STRING} (((/|%2f){3,3})|((\.|%2e){3,3})|((\.|%2e){2,2})(/|%2f|%u2215)) [NC,OR]
    RewriteCond %{QUERY_STRING} (/|%2f)(:|%3a)(/|%2f) [NC,OR]
    RewriteCond %{QUERY_STRING} (/|%2f)(\*|%2a)(\*|%2a)(/|%2f) [NC,OR]
    RewriteCond %{QUERY_STRING} (absolute_|base|root_)(dir|path)(=|%3d)(ftp|https?) [NC,OR]
    RewriteCond %{QUERY_STRING} (/|%2f)(=|%3d|$&|_mm|cgi(\.|-)|inurl(:|%3a)(/|%2f)|(mod|path)(=|%3d)(\.|%2e)) [NC,OR]
    RewriteCond %{REQUEST_URI} (\^|`|<|>|\|\|) [NC,OR]
    RewriteCond %{REQUEST_URI} ([a-z0-9]{2000,}) [NC]
    RewriteRule .* - [F,L]
</IfModule>

Best Practices

  • Regular Updates: Continuously monitor and update your list of blocked user agents and IP addresses to adapt to new threats.
  • Testing Rules: After implementing new rules, test your website to ensure that legitimate traffic isn't inadvertently blocked; a quick check is shown in the sketch after this list.
  • Use Security Modules: Combine these techniques with security modules like ModSecurity or services like Fail2Ban for automated protection.
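
A minimal sketch of such a test, assuming your site is reachable at https://example.com (replace with your own domain). The first request, sent with a blocked user agent, should return 403; the second, sent with a normal browser user agent, should still return 200.

# Should be rejected with 403 (MJ12bot is on the blocklists above)
curl -I -A "MJ12bot" https://example.com/

# Should still return 200 for a regular browser user agent
curl -I -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" https://example.com/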

Conclusion

Implementing these IP blocking techniques, along with strategies to block malicious user agents and specific bots, enhances your server's security by mitigating the risks posed by malicious actors and automated threats. Regularly update your blocking rules and monitor server traffic to maintain optimal protection.
