Checks for "list.txt" on websites - why?

General discussion and chat (archived)
John connor

Re: Checks for "list.txt" on websites - why?

Post by John connor » 2017-09-20, 22:57

satrow wrote: Excepting the Indonesian (ID) IP 36.78.100.89, which might be too new on the scene to be sure of, the IPs are all flagged as static broadband from regular ISPs, and as proxy servers or network-sharing devices.

These are all most likely infected routers. I see this all the time on my websites.

John connor

Re: Checks for "list.txt" on websites - why?

Post by John connor » 2017-09-20, 23:01

Moonchild wrote:
John connor wrote: What was the user agent for these requests?
Various, so likely spoofed from a table of known ones. No referrers were sent, either.

193.188.254.67 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
61.7.186.96 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
182.160.124.29 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
212.22.86.114 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
188.211.224.162 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
201.249.88.35 - "Mozilla/5.0 (compatible; Googlebot/2.1;+http://www.google.com/bot.html)"
180.183.104.120 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; InfoPath.1)"
185.64.220.113 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
182.253.178.194 - "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; en) Opera 8.50"
193.93.228.209 - "Mozilla/5.0 (Windows NT 5.1; U) Opera 7.54 [ru]"
Those all look like old UAs. If you were to use CIDRAM, you could add its ancient-UA blocks module and they would all get served a 403. That fake Googlebot would have been blocked as well.
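
Outside of CIDRAM, the same idea can be approximated at the web server itself. A minimal nginx sketch (hypothetical server block and patterns, written to cover the UAs logged above) might look like:

Code:

map $http_user_agent $ancient_ua {
    default                0;
    "~MSIE [1-6]\."        1;  # IE 6 and older
    "~Opera [1-8]\."       1;  # Opera 8 and older
    "~Googlebot/2\.1;\+"   1;  # spoofed Googlebot; the real UA has "; +http"
}

server {
    listen 80;
    server_name example.com;  # hypothetical

    if ($ancient_ua) {
        return 403;
    }
}

Using a map keeps the if down to a bare return, which is one of the few if patterns the nginx documentation treats as safe.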

I'll ask about this on another forum I'm a member of and see what they say. Perhaps they can shed some light on this list.txt.

Moonchild
Pale Moon guru

Re: Checks for "list.txt" on websites - why?

Post by Moonchild » 2017-09-20, 23:51

I'm not interested in CIDRAM -- it sits at the PHP level, which is not where this should be handled.
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite

John connor

Re: Checks for "list.txt" on websites - why?

Post by John connor » 2017-09-21, 15:22

So I asked on a forum I'm a member of about these list.txt requests, and despite doing some research as well, nothing came up. The consensus was that it could just be a bot looking for anything that might be of value. A lot of these bots are dumb, or just probing for vulnerabilities. list.txt is a very common filename, so it's hard to determine exactly what the bot was after.

Another member offered this iptables rule that you might be interested in.

Code:

iptables -A INPUT -m string --from 40 --to 80 --algo bm --string "list.txt" -p tcp -j DROP
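# -m string --algo bm: Boyer-Moore search of the raw packet payload;
# --from 40 --to 80 restricts the scan to the byte range where the
# HTTP request line (and thus the path) usually sits. Because it
# inspects packet bytes directly, it cannot match inside HTTPS traffic.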
If you use Fail2Ban, he offered code for that as well, but he mentioned it was for use with Apache, and you indicated that you don't use Apache.
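
For anyone running nginx instead, a roughly equivalent Fail2Ban filter against the access log might look like the following -- a hypothetical sketch (filter name, regex, and paths all assumed), not the member's code:

Code:

# /etc/fail2ban/filter.d/list-txt-probe.conf  (hypothetical name)
[Definition]
# Match any request for /list.txt in a combined-format access log
failregex = ^<HOST> .* "(GET|POST|HEAD) /list\.txt
ignoreregex =

# jail.local excerpt enabling the filter above
[list-txt-probe]
enabled  = true
port     = http,https
filter   = list-txt-probe
logpath  = /var/log/nginx/access.log
maxretry = 1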

Moonchild
Pale Moon guru

Re: Checks for "list.txt" on websites - why?

Post by Moonchild » 2017-09-22, 08:37

Thanks for the iptables rule, but really, just responding with a 404 is best, unless you are literally being DoS-flooded with requests and it's interfering with the web server (which isn't likely to happen with nginx unless the volume would cause problems anyway).
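
In nginx that can be made explicit with a one-off location -- a small sketch; the return 404 is what nginx would do anyway for a missing file, and access_log off is an optional extra to cut log noise:

Code:

location = /list.txt {
    access_log off;
    return 404;
}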
"Sometimes, the best way to get what you want is to be a good person." -- Louis Rossmann
"Seek wisdom, not knowledge. Knowledge is of the past; wisdom is of the future." -- Native American proverb
"Linux makes everything difficult." -- Lyceus Anubite
