Backstory
So, I'm sitting there, glued to the charts. Yes, I am obsessed with looking at charts and graphs, especially when they're trending upwards. :) I'm seeing some incredible numbers for views and bounce rates.

This is what the site's monthly traffic graph looks like (from August 19, 2025, to September 18, 2025):
Plus, interestingly enough, it's mostly the pagination pages that are visited. Of course, they visit other pages too, but not so often. Here's the graph:

Naturally, I got worried. I started looking for the cause because I'd heard about traffic from China. Additionally, my site isn't adapted to Asian culture at all, not to mention translations. I could have made a joke like:

But I don't find it funny, and what if there really are real Chinese people there? If so, no offense - just a joke.
My investigation into where and why they came
First, I decided to find out where they were coming from. Maybe Google started sending this kind of traffic, or Yandex. Considering the friendship between Russia and China, I wouldn't be surprised. Or maybe Baidu got to me? No, no, and no. It's all direct traffic, and no search engine has anything to do with it.
And there could be several options:
- Either DDoS
- Or clickers
- Or parsers
- Or content copying
Let's figure it out.
DDoS attack
A DDoS attack floods a server with requests from many machines at once; a plain DoS attack is the same thing, except the requests originate from a single machine. Since Metrica records hits from many different IP addresses, a DDoS attack seems more likely. But not quite.
You see, there aren't that many requests. There should be thousands, tens of thousands... per hour. I'm sure it's not a DDoS attack. On graphs, a DDoS attack might look something like this:

Clickers, or a type of negative SEO
Clear and obvious signs of clickers are:
- Abnormally low or high click depth
- Short time on site
- The source of traffic is a search engine.
The most important thing here is that the source must be a search engine; otherwise, it's just an annoying bot clogging your statistics. That's all.
As you remember, my traffic is direct. And the pages this bot views are pagination pages, meaning pages that generally don't contribute to rankings.
Therefore, I don't think it's a clicker, or at least not the smartest one.
Perhaps it's a parser?
And I'm probably right in saying that this is most likely it. The thing is, parsers tend to favor paginated pages. After all, they allow you to find all the content on a website.
But a couple of questions still arise. For example:
- Why not look at the sitemap.xml if you really need that list of articles on the website?
- Why make it so complicated? Why rotate user agents and use proxies? My anti-parser protection isn't that strong.
My solution
In any case, it doesn't really matter why someone decided to sic a bunch of bots on my site. What matters more is what I do about it. Will I be a softie and just tolerate these bots, or can I take some countermeasures to make life harder for them (and for whoever sicced them on me)?
What I can do:
- Block an entire country via the .htaccess file on the server.
- Change the site structure to make it harder to parse the content.
- Use some kind of bot check, like adding a captcha or something like that.
Well, since I'm in a hurry to minimize the damage to my site, whatever it may be, I've decided to block an entire country from visiting it. At least that will hold them off temporarily.
How to block an entire country using an .htaccess file
Before we start doing things that could harm the site, let's do some investigation. Firstly, we need to know what an .htaccess file is. Secondly, how it works. Thirdly, why we'd use it at all, at least until we find another, more flexible solution.
Directives consist of three parts:
1. Evaluation order
- Order deny,allow - Deny rules are evaluated first, then Allow rules; a request that matches neither is allowed.
- Order allow,deny - Allow rules are evaluated first, then Deny rules; a request that matches neither is denied.
2. Action
- Deny from - Refuse to respond to requests from the listed addresses.
- Allow from - Respond to requests from the listed addresses.
3. Address/Range/Classless addressing
- 1.0.8.0 - A specific IP address.
- 1.0.8 - Blocks every IP from 1.0.8.0 to 1.0.8.255.
- 1.0.8.0/24 - CIDR (classless inter-domain routing) notation, which lets you specify the desired ranges flexibly.
In my specific case, blocking all addresses from China looks like this:
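The original snippet isn't reproduced here, so this is a rough reconstruction based on the explanation that follows; the Passenger directives, paths, and exact ranges are placeholders, not the author's actual file:

```apache
# Phusion Passenger serves the Python application (placeholder paths)
PassengerEnabled On
PassengerPython /home/user/myapp/venv/bin/python

# Access rules: respond to everyone, then deny the listed ranges
Order allow,deny
Allow from all
Deny from 1.0.1.0/24
Deny from 1.0.2.0/23
Deny from 1.0.8.0/21
# ...and so on for the remaining Chinese ranges
```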
Some clarification is needed. So, the first line activates the web server for Python applications – Phusion Passenger. Then, for this web server, we select a virtual environment. But that's a different topic. Let's move on to the web content access rules.
- Order allow,deny – allows access to the web resource to everyone except those listed below.
- Allow from all – allows access to everyone.
- Deny from ADDRESS – denies access to the resource to specific addresses.
You can reverse it and, for example, deny access to everyone except those listed, like this:
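Reversed, the rules could look something like this (the range shown is just an illustration):

```apache
Order deny,allow
Deny from all
Allow from 1.0.1.0/24
```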
Now only Chinese visitors will be able to view my content :)
But there are a lot of Chinese IP addresses; how can you fit them all into one file? And not only are there a lot of them, they also change over time, so tracking hundreds of millions of addresses by hand becomes an impossible task.
Cloud services like CloudFlare are suggested as a solution. But I have slightly different suggestions. The first is fairly fast but temporary. The second takes longer to implement and requires individual configuration on each hosting provider, but it provides an up-to-date list of IP addresses.
First, you need to find the corresponding IP addresses and their ranges by the countries they're located in. I recommend this website; it not only provides a convenient way to quickly copy addresses but also allows for parsing, making it very convenient for custom solutions like mine.
A simple way: blocking traffic from a specific country
On this website, you can download the file you need with IP addresses by country, then copy its contents directly into your .htaccess file.
Save the file and restart the web server. If everything is done correctly, nothing will change for you, but access to the site will be blocked for Chinese users. However, you could have accidentally blocked yourself and your country; this is what it would look like:

This is if you accidentally blocked yourself and your country
If there was an error in the syntax of the .htaccess file, you will see something like this:

Something is wrong on the server's side.
I'd like to point out that my site implements its own custom 403 (Forbidden) and 500 (Server Error) pages. Those custom pages are not displayed when blocking is done this way.
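That makes sense: a Deny rule stops the request at the web-server level, before it ever reaches the application that renders those custom pages. If you want a custom page even for denied requests, Apache's ErrorDocument directive in the same .htaccess file is one way to do it (the paths below are placeholders, not the site's actual ones):

```apache
ErrorDocument 403 /errors/403.html
ErrorDocument 500 /errors/500.html
```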
The hard way: writing a custom bash script
The core of this method is essentially the same; only here the entire process is automated. I'll write a Bash script and create a task to run it at the beginning of each day. The script below will:
- Download and update the list of IP ranges.
- Prefix each range with "Deny from".
- Insert the result into the .htaccess file.
For example, if you run the command below:
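The script itself lives in the author's repository and isn't reproduced here, so below is my own sketch of the approach. The data source (ipdeny.com's aggregated zone files), the marker comments, and the function names are assumptions for illustration, not the original code:

```shell
#!/bin/sh
# block-country.sh - sketch of the automation described above.
# Assumptions (mine, not the article's): the CIDR ranges come from
# ipdeny.com's aggregated zone files, and the generated rules live
# between marker comments so each run can replace them cleanly.

# Download the aggregated CIDR list for a two-letter country code ("cn").
fetch_ranges() {
    curl -fsS "https://www.ipdeny.com/ipblocks/data/aggregated/$1-aggregated.zone"
}

# Turn a list of CIDR ranges, one per line, into "Deny from" directives.
to_deny_rules() {
    sed 's/^/Deny from /'
}

# Replace the managed block in .htaccess with freshly generated rules.
update_htaccess() {
    htaccess="$1"; country="$2"; rules="$3"
    tmp=$(mktemp)
    # Keep everything except the previously generated block.
    sed '/# BEGIN block-country/,/# END block-country/d' "$htaccess" > "$tmp"
    {
        cat "$tmp"
        echo "# BEGIN block-country ($country)"
        echo "Order allow,deny"
        echo "Allow from all"
        printf '%s\n' "$rules"
        echo "# END block-country"
    } > "$htaccess"
    rm -f "$tmp"
}

# Hypothetical invocation: ./block-country.sh cn /path/to/.htaccess
if [ $# -ge 1 ]; then
    rules=$(fetch_ranges "$1" | to_deny_rules)
    update_htaccess "${2:-.htaccess}" "$1" "$rules"
fi
```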
Then you'll block all traffic from China. You can find more commands and options in my repository, where you can download this script.
Now, to keep the list of IP addresses constantly updated, you can create a Cron task that will run the block-country.sh script daily, for example.
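A crontab entry for a daily run at midnight could look something like this (the paths are placeholders):

```
0 0 * * * /home/user/scripts/block-country.sh cn /home/user/public_html/.htaccess
```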
Conclusions and results
After successfully adding the Chinese IP ranges to .htaccess, traffic dropped as expected, and both the average bounce rate and the average time on site improved dramatically.

So, it was Sunday, and I blocked China at 12:00.
That's how it is. It's unpleasant, but there's more to come. :) I hope this article helped you figure out what's wrong with your site, why you're suddenly so inundated with Chinese traffic, and what can be done about it.