178.33.226.69 - - [31/Oct/2021:21:06:40 -0400] "POST /admin/ HTTP/1.1" 503 4726 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
24.86.170.153 - - [31/Oct/2021:21:06:41 -0400] "POST /admin/ HTTP/1.1" 503 4725 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
192.248.56.16 - - [31/Oct/2021:21:06:42 -0400] "POST /admin/ HTTP/1.1" 503 4726 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
140.210.93.135 - - [31/Oct/2021:21:06:42 -0400] "POST /admin/ HTTP/1.1" 503 4725 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
192.99.45.85 - - [31/Oct/2021:21:06:44 -0400] "POST /admin/ HTTP/1.1" 503 4725 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
I added the following to robots.txt:
User-agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0
Disallow: /
But it has no effect. Doing this for other crawlers worked before. Why is that, and how can I fix it?
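For context on why a server-side rule is the usual fix here: the `User-agent` field in robots.txt matches a crawler's short token name (e.g. `Googlebot`), not a full HTTP `User-Agent` header string, and robots.txt is only honored voluntarily by well-behaved crawlers anyway. A bot repeatedly POSTing to /admin/, as in the log above, is almost certainly not reading robots.txt at all. A minimal sketch of a server-level block, assuming nginx (the matching substring `Firefox/62.0` is taken from the logged UA and is only an illustration):

```nginx
# Reject any request whose User-Agent header contains the string
# seen in the access log. Note: matching on a real browser UA like
# this can also block legitimate Firefox 62 visitors.
if ($http_user_agent ~* "Firefox/62\.0") {
    return 403;
}
```

Placed inside the relevant `server` block, this returns 403 regardless of whether the client ever consults robots.txt; an equivalent Apache rule could use `mod_rewrite` or `SetEnvIf` on `User-Agent`.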