Author |
Message |
giantmidget
Regular
Joined: Nov 27, 2005
Posts: 58
|
Posted:
Tue Nov 07, 2006 2:57 pm |
|
None, only a couple redirects. |
|
|
|
|
montego
Site Admin
Joined: Aug 29, 2004
Posts: 9457
Location: Arizona
|
Posted:
Wed Nov 08, 2006 6:30 am |
|
Then Rewrite is not allowed by your host (I think). I would check with them. These statements work as they are a straight copy-and-paste from my own .htaccess file.
Anyone else have an issue with the statements I posted? |
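One way to guard against a host that doesn't load mod_rewrite at all is to wrap the statements in an IfModule block, roughly like this (a sketch; it won't help if the host disallows these directives via AllowOverride):
Code:
<IfModule mod_rewrite.c>
RewriteEngine On
# RewriteCond / RewriteRule statements go here
</IfModule>
|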
|
|
|
Tao_Man
Involved
Joined: Jul 15, 2004
Posts: 252
Location: OKC, OK
|
Posted:
Wed Nov 08, 2006 2:38 pm |
|
I tried it and got a 500 error. I played around with it and there seems to be only one line that is the problem:
Code:
RewriteCond %{HTTP_USER_AGENT} ^f*** [NC,OR]
|
If I comment out that one line, it works just fine. |
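That is, prefixing the line with a hash so Apache skips it:
Code:
# RewriteCond %{HTTP_USER_AGENT} ^f*** [NC,OR]
|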
_________________ ------------------------------------------
To strive, to seek, to find, but not to yield!
I don't know Kara-te but I do know cra-zy, and I WILL use it! |
|
|
|
xxxSNIPExxx
New Member
Joined: Mar 10, 2005
Posts: 5
Location: Netherlands, Alkmaar
|
Posted:
Wed Nov 08, 2006 4:44 pm |
|
Tao_Man wrote: | I tried it and got a 500 error. I played around with it and there seems to be only one line that is the problem:
Code:
RewriteCond %{HTTP_USER_AGENT} ^f*** [NC,OR]
|
If I comment out that one line, it works just fine. |
For me too... Thanks for the solution. |
|
|
|
|
montego
|
Posted:
Thu Nov 09, 2006 6:22 am |
|
Uuugghhhh... I see what has happened now! The blasted NUKE censorship function! What I posted is completely correct, however, do you see the three asterisks? Guess what that is supposed to be. Yes, it is a four letter word which starts with "f" and ends in "k".
Sorry about not catching that! |
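So the line should presumably read (spelling out the word montego describes, with the censor bypassed):
Code:
RewriteCond %{HTTP_USER_AGENT} ^fuck [NC,OR]
|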
|
|
|
|
Tao_Man
|
Posted:
Thu Nov 09, 2006 10:40 am |
|
Kinda wondered if that was the case. |
|
|
|
|
xxxSNIPExxx
|
Posted:
Fri Nov 10, 2006 12:12 am |
|
I still get visitors like these:
Code:libwww-perl/5.65
libwww-perl/5.805
libwww-perl/5.79
|
This rule should keep them out... right?
Code:RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/[0-9].[0-9]* [OR]
|
Those I added manually:
Code:RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/5.65 [OR]
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/5.805 [OR]
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/5.79 [OR]
|
Those work... does somebody know why the first pattern doesn't?
Must I add something like the following, for example, to block them all?
Code:RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/[0-9].[0-9][0-9]* [OR]
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/[0-9].[0-9][0-9][0-9]* [OR]
|
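For what it's worth, a single pattern along these lines should cover any version number (a sketch; note the backslash before the dot, so it matches a literal dot rather than any character):
Code:
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/[0-9]+\.[0-9]+ [OR]
|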
|
|
|
|
|
montego
|
Posted:
Fri Nov 10, 2006 6:13 am |
|
I'll have to look through my logs as I hadn't seen anything come through on this...
Hardcoding as you have done should work just fine, but I can't let the other go. I'll check into it. |
|
|
|
|
montego
|
Posted:
Fri Nov 10, 2006 6:34 am |
|
Very odd. I'm not seeing any libwww-perl/n.nnn anything.
I also just now checked the pattern against these examples and I am getting exact hits, so that should be working. I check them here:
http://regexlib.com/RETester.aspx
Excellent tool! |
|
|
|
|
evaders99
Former Moderator in Good Standing
Joined: Apr 30, 2004
Posts: 3221
|
Posted:
Fri Nov 10, 2006 8:57 am |
|
I don't want anyone to use any version of libwww-perl, so I block the entire thing
Just ^libwww-perl |
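One way to act on that prefix match is to send back a 403 Forbidden (a sketch; not the redirect approach the thread settles on further down):
Code:
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
RewriteRule ^.*$ - [F,L]
|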
|
|
|
xxxSNIPExxx
|
Posted:
Fri Nov 10, 2006 10:21 am |
|
Just as simple as that. Thanks... |
|
|
|
|
malrock1
Hangin' Around
Joined: Nov 04, 2006
Posts: 47
Location: Wanaka New Zealand
|
Posted:
Sat Nov 11, 2006 4:25 pm |
|
I have this in my .htaccess
***********************************
# -------------------------------------------
# Start of NukeSentinel(tm) admin.php Auth
# -------------------------------------------
#<Files .ftaccess>
# deny from all
#</Files>
#<Files .staccess>
# deny from all
#</Files>
#<Files admin.php>
# <Limit GET POST PUT>
# require valid-user
# </Limit>
# AuthName "Restricted"
# AuthType Basic
# AuthUserFile http://www.verticalresource.org/.staccess
#</Files>
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl/*.* [OR]
# -------------------------------------------
# Start of NukeSentinel(tm) DENY FROM area
# -------------------------------------------
************************************
I am still getting libwww-perl scripts trying to access my BB...
How can I stop it? |
_________________ Life's a rock
go climb it
www.verticalresources.org |
|
|
|
jakec
Site Admin
Joined: Feb 06, 2006
Posts: 3048
Location: United Kingdom
|
Posted:
Sat Nov 11, 2006 4:53 pm |
|
I think all you need to have is
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
RewriteEngine off
I'm sure someone will correct me if I'm wrong. |
|
|
|
|
montego
|
Posted:
Sat Nov 11, 2006 5:32 pm |
|
I have tested both now using the regex tester and both will work; however, evaders' (which is jakec's) is the simplest.
malrock1, if jakec's additional RewriteEngine On/Off statements don't do it, then maybe your host doesn't have that module turned on? You might want to check with them. phpinfo() might not even show you whether it is loaded or not. |
|
|
|
|
malrock1
|
Posted:
Sun Nov 12, 2006 1:50 pm |
|
No, I'm still getting user agents coming at me with libwww-perl.
I had another 15 last night:
*************************
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
RewriteEngine off
************************** |
|
|
|
|
jakec
|
Posted:
Sun Nov 12, 2006 2:11 pm |
|
Just want to check: as Montego suggested, have you checked with your host whether that module is turned on?
Are all the IP addresses the same? You could block the IP addresses. |
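Blocking by IP in .htaccess would look roughly like this (a sketch with placeholder addresses):
Code:
order allow,deny
allow from all
deny from 203.0.113.45
deny from 198.51.100.7
|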
|
|
|
|
evaders99
|
Posted:
Sun Nov 12, 2006 3:17 pm |
|
You'll need both a RewriteCond and a RewriteRule
Code:
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
RewriteRule ^.*$ - [L]
|
|
|
|
|
|
montego
|
Posted:
Tue Nov 14, 2006 7:13 am |
|
Evaders, I found this not to work, and so did Tao_Man. I changed my RewriteRule to what you show and started getting these again. As soon as I changed it back to the following, it worked again (and no libwww-perl):
RewriteRule ^.*$ http://127.0.0.1 [R,L]
Here is where Tao_Man and I discussed this:
http://www.ravenphpscripts.com/postt11804.html |
|
|
|
|
evaders99
|
Posted:
Tue Nov 14, 2006 8:07 am |
|
Ah sorry, I thought the dash was to stop Apache from processing.
We'll just send the redirect as you did, back to the site itself
Code:
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
RewriteRule ^.*$ http://127.0.0.1 [R,L]
|
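Putting the pieces from this thread together, the full block presumably ends up as (a sketch; RewriteEngine On only needs to appear once per .htaccess):
Code:
RewriteEngine On
# Any libwww-perl user agent gets bounced back to its own machine
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
RewriteRule ^.*$ http://127.0.0.1 [R,L]
|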
|
|
|
|
|
technocrat
Life Cycles Becoming CPU Cycles
Joined: Jul 07, 2005
Posts: 511
|
Posted:
Tue Nov 14, 2006 9:06 am |
|
That's a smart change to make! Good idea guys. |
|
|
|
evaders99
|
Posted:
Tue Nov 14, 2006 2:32 pm |
|
I was thinking, maybe do an internal redirect to a script that generates gibberish nonstop. It would keep sending data until the connection times out (or would it, through libwww-perl?)
Most of the scripts don't follow external redirects unless you set them to. I wonder if anyone knows enough about libwww-perl.
We could seriously slow these bot machines down if it's possible to send them junk and hog the connection.
EDIT:
From what I'm reading, the timeout default is 180 seconds.
http://cpan.uwinnipeg.ca/htdocs/libwww-perl/LWP/UserAgent.html
But it can easily be changed to a smaller amount. So it would be really easy to bypass. Oh well |
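For reference, the internal-redirect idea as an .htaccess sketch, pointing at a hypothetical /tarpit.php, would be something like:
Code:
RewriteCond %{HTTP_USER_AGENT} ^libwww-perl
# Internal rewrite (no R flag), so there's no external redirect for the bot to ignore
RewriteRule ^.*$ /tarpit.php [L]
|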
|
|
|
|
|