Introduction
Protecting your server from malicious IPs and bots is crucial for maintaining optimal performance and security. This guide explores six effective techniques for blocking unwanted traffic, ensuring your server resources are safeguarded against abuse and overload.
1. Geolocation-Based IP Blocking
Geolocation blocking allows you to restrict access from specific countries or regions. This method is effective if you notice malicious traffic originating from particular locations.
Apache Configuration Example
```apache
<IfModule mod_geoip.c>
    GeoIPEnable On
    GeoIPDBFile /usr/share/GeoIP/GeoIP.dat
    SetEnvIf GEOIP_COUNTRY_CODE CN BlockCountry
    <RequireAll>
        Require all granted
        Require not env BlockCountry
    </RequireAll>
</IfModule>
```
Nginx Configuration Example
Note that country codes are matched with `map` on `$geoip_country_code` (the `geo` directive matches client IP addresses, not countries), so this requires ngx_http_geoip_module and a country database:

```nginx
http {
    geoip_country /usr/share/GeoIP/GeoIP.dat;
    map $geoip_country_code $blocked_country {
        default 0;
        CN 1;
        RU 1;
    }
    server {
        if ($blocked_country) {
            return 403;
        }
        # ... other configurations ...
    }
}
```
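The legacy GeoIP databases are no longer updated, so on newer stacks you may prefer GeoIP2. A rough sketch of the equivalent lookup, assuming ngx_http_geoip2_module is compiled in and a GeoLite2-Country database is installed (the database path is illustrative):

```nginx
# Sketch: GeoIP2 country lookup (assumes ngx_http_geoip2_module)
geoip2 /usr/share/GeoIP/GeoLite2-Country.mmdb {
    $geoip2_country_code country iso_code;
}

map $geoip2_country_code $blocked_country {
    default 0;
    CN 1;
    RU 1;
}
```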
2. Blocking IPs Using .htaccess (Apache)
The .htaccess file allows you to block specific IP addresses or ranges, preventing them from accessing your server.
Example:
```apache
<RequireAll>
    Require all granted
    Require not ip 192.168.1.10
    Require not ip 203.0.113.0/24
</RequireAll>
```
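The `Require` syntax is Apache 2.4. On Apache 2.2 (or on 2.4 with mod_access_compat enabled), the older `Order`/`Deny` directives achieve the same result:

```apache
Order allow,deny
Allow from all
Deny from 192.168.1.10
Deny from 203.0.113.0/24
```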
3. Nginx Configuration for IP Blocking
Nginx provides directives to block individual IPs or entire ranges directly within its configuration files.
Example:
```nginx
http {
    deny 192.168.1.10;
    deny 203.0.113.0/24;
    allow all;
    server {
        # ... server configurations ...
    }
}
```
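As a deny list grows, it is easier to maintain it in a standalone file that nginx pulls in with an `include` directive. A minimal sketch (the file path is illustrative):

```shell
# Write deny rules, one per line, into a separate blocklist file
printf 'deny %s;\n' 192.168.1.10 203.0.113.0/24 > /tmp/blocklist.conf
cat /tmp/blocklist.conf
```

In nginx.conf you would then reference the file with `include /tmp/blocklist.conf;` inside the http block, keeping the blocklist editable without touching the main configuration.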
4. Utilizing Firewall Rules (e.g., iptables)
Implementing firewall rules provides a robust layer of security by controlling incoming and outgoing traffic at the network level.
Example:
```shell
# Block a single IP
sudo iptables -A INPUT -s 192.168.1.10 -j DROP
# Block an IP range
sudo iptables -A INPUT -s 203.0.113.0/24 -j DROP
# Save iptables rules
sudo iptables-save | sudo tee /etc/iptables/rules.v4
```
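When you have many addresses to block, it can help to keep them in a plain-text blocklist and generate the iptables commands from it, so the rules can be reviewed before they are run. A sketch (file names are made up):

```shell
# Create a sample blocklist: one IP or CIDR range per line
cat > /tmp/blocklist.txt <<'EOF'
192.168.1.10
203.0.113.0/24
EOF

# Generate one DROP rule per entry for review before executing
while read -r ip; do
  echo "iptables -A INPUT -s $ip -j DROP"
done < /tmp/blocklist.txt > /tmp/block.sh

cat /tmp/block.sh
```

For very large lists, an `ipset` set referenced by a single iptables rule is more efficient than one rule per address.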
5. Leveraging Security Modules or Services
Security modules like ModSecurity for Apache or services like Fail2Ban can automate the detection and blocking of malicious IPs based on suspicious activity.
Example with Fail2Ban:
```shell
# Install Fail2Ban
sudo apt-get install fail2ban
# Configure Fail2Ban for Apache
sudo nano /etc/fail2ban/jail.local
```

Add the following jail to jail.local (the apache-auth filter parses authentication failures from the Apache error log):

```ini
[apache]
enabled  = true
port     = http,https
filter   = apache-auth
logpath  = /var/log/apache*/*error.log
maxretry = 5
bantime  = 3600
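Under the hood, Fail2Ban counts failures per IP and bans any address that reaches `maxretry` within the ban window. A toy illustration of that counting logic (this is not Fail2Ban itself; the log lines and file names are fabricated):

```shell
# Fabricated sample log: one auth-failure line per event
cat > /tmp/sample_auth.log <<'EOF'
203.0.113.7 auth failure
203.0.113.7 auth failure
203.0.113.7 auth failure
203.0.113.7 auth failure
203.0.113.7 auth failure
198.51.100.2 auth failure
EOF

# Print any IP appearing 5 or more times (the maxretry threshold above)
awk '{count[$1]++} END {for (ip in count) if (count[ip] >= 5) print ip}' \
  /tmp/sample_auth.log > /tmp/banned.txt
cat /tmp/banned.txt
```

In the real service, `fail2ban-client status apache` shows the currently banned addresses for the jail.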
6. Blocking User Agents and Specific Bots
In addition to blocking IP ranges, it's essential to restrict access based on user agents, since many bots that generate unwanted traffic or probe for vulnerabilities identify themselves in the User-Agent header. Note that some crawlers in the lists below (for example bingbot and YandexBot) belong to legitimate search engines; block them only if you are sure you don't want those engines indexing your site.
Blocking Bots and User Agents with .htaccess (Apache)
The .htaccess file can be configured to block specific bots and user agents by leveraging rewrite rules and environment variables. Below are examples of how to implement these blocks.
Basic Bot Blocking
```apache
# Block specific bots
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (MJ12bot|VeBot|BLP_bbot|datagnionbot|YandexBot|PetalBot|YandexImages|bingbot|AspiegelBot|dotbot|Baiduspider) [NC]
RewriteRule .* - [R=403,L]
```
Advanced Bot Blocking
Each condition below ends with [OR] (except the last) so that matching any one of them, user agent, remote host, or referrer, is enough to trigger the block:

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
# Block a comprehensive list of bots
RewriteCond %{HTTP_USER_AGENT} (alexibot|majestic|mj12bot|rogerbot|econtext|eolasbot|eventures|liebaofast|nominet|oppo\sa33|acapbot|acoonbot|asterias|attackbot|backdorbot|becomebot|binlar|blackwidow|blekkobot|blexbot|blowfish|bullseye|bunnys|butterfly|careerbot|casper|checkpriv|cheesebot|cherrypick|chinaclaw|choppy|clshttp|cmsworld|copernic|copyrightcheck|cosmos|crescent|cy_cho|datacha|demon|diavol|discobot|dittospyder|dotbot|dotnetdotcom|dumbot|emailcollector|emailsiphon|emailwolf|extract|eyenetie|feedfinder|flaming|flashget|flicky|foobot|g00g1e|getright|gigabot|go-ahead-got|gozilla|grabnet|grafula|harvest|heritrix|httrack|icarus6j|jetbot|jetcar|jikespider|kmccrew|leechftp|libweb|linkextractor|linkscan|linkwalker|loader|masscan|miner|mechanize|morfeus|moveoverbot|netmechanic|netspider|nicerspro|nikto|ninja|nutch|octopus|pagegrabber|petalbot|planetwork|postrank|proximic|purebot|pycurl|python|queryn|queryseeker|radian6|radiation|realdownload|scooter|seekerspider|semalt|siclab|sindice|sistrix|sitebot|siteexplorer|sitesnagger|skygrid|smartdownload|snoopy|sosospider|spankbot|spbot|sqlmap|stackrambler|stripper|sucker|surftbot|sux0r|suzukacz|suzuran|takeout|teleport|telesoft|true_robots|turingos|turnit|vampire|vikspider|voideye|webleacher|webreaper|webstripper|webvac|webviewer|webwhacker|winhttp|wwwoffle|woxbot|xaldon|xxxyy|yamanalab|yioopbot|youda|zeus|zmeu|zune|zyborg) [NC,OR]
RewriteCond %{REMOTE_HOST} (163data|amazonaws|colocrossing|crimea|g00g1e|justhost|kanagawa|loopia|masterhost|onlinehome|poneytel|sprintdatacenter|reverse.softlayer|safenet|ttnet|woodpecker|wowrack) [NC,OR]
RewriteCond %{HTTP_REFERER} (semalt\.com|todaperfeita) [NC,OR]
RewriteCond %{HTTP_REFERER} (blue\spill|cocaine|ejaculat|erectile|erections|hoodia|huronriveracres|impotence|levitra|libido|lipitor|phentermin|pro[sz]ac|sandyauer|tramadol|troyhamby|ultram|unicauca|valium|viagra|vicodin|xanax|ypxaieo) [NC]
RewriteRule .* - [F,L]
</IfModule>
```
Blocking Bots and User Agents with Nginx
Nginx can also be configured to block specific user agents and bots by modifying the server configuration files.
Basic Bot Blocking
```nginx
http {
    map $http_user_agent $bad_bot {
        default 0;
        ~*(MJ12bot|VeBot|BLP_bbot|datagnionbot|YandexBot|PetalBot|YandexImages|bingbot|AspiegelBot|dotbot|Baiduspider) 1;
    }
    server {
        if ($bad_bot) {
            return 403;
        }
        # ... other configurations ...
    }
}
```
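Before deploying a pattern like this, you can sanity-check which User-Agent strings it catches locally; `grep -i -E` approximates the map's case-insensitive regex. The sample agent strings below are illustrative:

```shell
# A subset of the bot pattern, tested against sample User-Agent strings
pattern='MJ12bot|VeBot|BLP_bbot|datagnionbot|YandexBot|PetalBot'
printf '%s\n' \
  'Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)' \
  'Mozilla/5.0 (Windows NT 10.0) Chrome/120.0' \
  > /tmp/agents.txt
# Only agents matching the pattern end up in the flagged list
grep -i -E "$pattern" /tmp/agents.txt > /tmp/flagged.txt
cat /tmp/flagged.txt
```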
Advanced Bot Blocking
```nginx
http {
    map $http_user_agent $bad_bot {
        default 0;
        ~*(alexibot|majestic|mj12bot|rogerbot|econtext|eolasbot|eventures|liebaofast|nominet|oppo\sa33|acapbot|acoonbot|asterias|attackbot|backdorbot|becomebot|binlar|blackwidow|blekkobot|blexbot|blowfish|bullseye|bunnys|butterfly|careerbot|casper|checkpriv|cheesebot|cherrypick|chinaclaw|choppy|clshttp|cmsworld|copernic|copyrightcheck|cosmos|crescent|cy_cho|datacha|demon|diavol|discobot|dittospyder|dotbot|dotnetdotcom|dumbot|emailcollector|emailsiphon|emailwolf|extract|eyenetie|feedfinder|flaming|flashget|flicky|foobot|g00g1e|getright|gigabot|go-ahead-got|gozilla|grabnet|grafula|harvest|heritrix|httrack|icarus6j|jetbot|jetcar|jikespider|kmccrew|leechftp|libweb|linkextractor|linkscan|linkwalker|loader|masscan|miner|mechanize|morfeus|moveoverbot|netmechanic|netspider|nicerspro|nikto|ninja|nutch|octopus|pagegrabber|petalbot|planetwork|postrank|proximic|purebot|pycurl|python|queryn|queryseeker|radian6|radiation|realdownload|scooter|seekerspider|semalt|siclab|sindice|sistrix|sitebot|siteexplorer|sitesnagger|skygrid|smartdownload|snoopy|sosospider|spankbot|spbot|sqlmap|stackrambler|stripper|sucker|surftbot|sux0r|suzukacz|suzuran|takeout|teleport|telesoft|true_robots|turingos|turnit|vampire|vikspider|voideye|webleacher|webreaper|webstripper|webvac|webviewer|webwhacker|winhttp|wwwoffle|woxbot|xaldon|xxxyy|yamanalab|yioopbot|youda|zeus|zmeu|zune|zyborg) 1;
    }
    server {
        if ($bad_bot) {
            return 403;
        }
        # ... other configurations ...
    }
}
```
Additional Firewall Rules for Enhanced Protection
Beyond blocking user agents, incorporating firewall rules can provide an extra layer of security. Below are examples of how to implement these rules in .htaccess.
```apache
# Block malicious IP addresses
deny from 47.128.124.223
deny from 47.128.98.51
deny from 4.227.36.106
deny from 4.227.36.46
deny from 89.64.100.148
deny from 109.206.213.203
deny from 37.248.154.66
deny from 78.8.13.175
deny from 85.128.143.191
deny from 31.11.210.169
deny from 85.128.143.197
deny from 83.27.210.133
deny from 109.243.130.198
deny from 77.222.255.47
deny from 85.128.143.49
deny from 4.227.36.63
deny from 4.227.36
deny from 20.171.207.226
deny from 20.171.207

# Block query strings in URLs to prevent malicious requests
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{QUERY_STRING} (((/|%2f){3,3})|((\.|%2e){3,3})|((\.|%2e){2,2})(/|%2f|%u2215)) [NC,OR]
RewriteCond %{QUERY_STRING} (/|%2f)(:|%3a)(/|%2f) [NC,OR]
RewriteCond %{QUERY_STRING} (/|%2f)(\*|%2a)(\*|%2a)(/|%2f) [NC,OR]
RewriteCond %{QUERY_STRING} (absolute_|base|root_)(dir|path)(=|%3d)(ftp|https?) [NC,OR]
RewriteCond %{QUERY_STRING} (/|%2f)(=|%3d|$&|_mm|cgi(\.|-)|inurl(:|%3a)(/|%2f)|(mod|path)(=|%3d)(\.|%2e)) [NC,OR]
RewriteCond %{REQUEST_URI} (\^|`|<|>|\|\|) [NC,OR]
RewriteCond %{REQUEST_URI} ([a-z0-9]{2000,}) [NC]
RewriteRule .* - [F,L]
</IfModule>
```
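To see what the directory-traversal condition catches, you can test its core sub-pattern against sample query strings with `grep -i -E`, which mirrors the case-insensitive matching of the [NC] flag (the sample query strings are fabricated):

```shell
# The (\.|%2e){2,2}(/|%2f) fragment matches "../" both literally and
# percent-encoded; sample query strings below are made up for the test.
pattern='(\.|%2e){2,2}(/|%2f)'
printf '%s\n' \
  '?page=../../etc/passwd' \
  '?page=%2e%2e%2fconfig' \
  '?page=home' \
  > /tmp/queries.txt
grep -i -E "$pattern" /tmp/queries.txt > /tmp/caught.txt
cat /tmp/caught.txt
```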
Best Practices
- Regular Updates: Continuously monitor and update your list of blocked user agents and IP addresses to adapt to new threats.
- Testing Rules: After implementing new rules, test your website to ensure that legitimate traffic isn't inadvertently blocked.
- Use Security Modules: Combine these techniques with security modules like ModSecurity or services like Fail2Ban for automated protection.
Conclusion
Implementing these IP blocking techniques, along with strategies to block malicious user agents and specific bots, enhances your server's security by mitigating the risks posed by malicious actors and automated threats. Regularly update your blocking rules and monitor server traffic to maintain optimal protection.