kguske
Site Admin

Joined: Jun 04, 2004
Posts: 6437
Posted: Sun Apr 23, 2006 6:39 am
This might be an enhancement request...
There are 3 or 4 attacks that I've seen on my sites repeatedly from different IP addresses. I guess the mentally-handicapped morons think: "Uh, well I was blocked once by NukeSentinel, but, uh, maybe it won't block me the next 50 times."
In some cases, you know an email address (typically when they try to add an admin user). In others, you know a website address (when they try to load a script running on their sites).
I've thought of creating a page of these email addresses and redirecting harvesters to that page. Of course, an attacker could use someone else's email address when attacking, so you wouldn't want to automatically add the address to that page.
By the same token, you could redirect attacks to the sites of repeated attackers, to waste their bandwidth (let attackers hurt each other).
A third idea is to add a special blocker for repeat attacks. This would check the blocked table to see if that specific attack has been blocked before, and would override the normal blocker so that the rectally-obsessed get a special reward for repeat business - a sort of reverse frequent-attacker program.
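Something like this rough sketch is what I have in mind (the table and column names are invented for illustration - not NukeSentinel's actual schema):

    global $db, $prefix;
    // Hash the offending request so identical attacks match regardless of attacker IP.
    // $attack_string is hypothetical: whatever payload tripped the blocker.
    $sig = md5($attack_string);
    $result = $db->sql_query("SELECT COUNT(*) FROM " . $prefix . "_blocked_attacks
                              WHERE attack_sig = '$sig'");
    list($hits) = $db->sql_fetchrow($result);
    if ($hits > 0) {
        // Repeat business: hand out the special reward instead of the normal block page.
    }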
I would, of course, want the ability to import and export the attacks (minus the domain) so that when someone attacks one of my sites - or the site of anyone else willing to share their attack list or use mine - they would get that special treatment on the next site...
What do you think? Do you have other ideas for annoying these duplicating dunderheads?
_________________ I search, therefore I exist...
Raven
Site Admin/Owner

Joined: Aug 27, 2002
Posts: 17088
Posted: Sun Apr 23, 2006 8:46 am
While I agree, in principle, with fighting back, I think it would just provoke more attacks. I still get about 6 U-N-I-O-N attacks daily, with all kinds of contortions. The new kiddies, fresh out of diapers, have just gotten a new 'puter from mommy and daddy and have discovered copy-n-paste from a h@ck3r web site. I figure by the time they no longer need their pacifiers they will have moved on.
kguske

Posted: Sun Apr 23, 2006 9:03 am
That's true, but there are probably around 5 who keep using the same attacks month after month after month... Sometimes it's possible to get the site hosting the bad script shut down, but that requires some effort. As for provoking more attacks: I normally don't use PC Killer, but if you're stupid enough to try the same attack twice...
evaders99
Former Moderator in Good Standing

Joined: Apr 30, 2004
Posts: 3221
Posted: Sun Apr 23, 2006 5:13 pm
Sounds like a good idea to me... I don't know how effective it would be, though. Most don't even use a standard browser, just a bunch of robot scripts. I'm compiling quite a list of blocks.
kguske

Posted: Mon Apr 24, 2006 5:19 pm
Good point, evaders99. I haven't looked at that, but it would certainly explain the repeated attacks. What can we do to affect robots? I'd rather do nothing than waste time (especially if it's Raven's!)...
evaders99

Posted: Mon Apr 24, 2006 5:27 pm
Very little, honestly. Block and move on. I'm working on integrating some of my code with Sentinel, to block all those robots I've seen.
montego
Site Admin

Joined: Aug 29, 2004
Posts: 9457
Location: Arizona
Posted: Tue Apr 25, 2006 6:14 am
And I wonder if some type of "service", such as what Guardian is starting with his SpamList database, would be useful for several different things, such as:
- Bad Referrers
- Spam Bots
- Protected ranges (the "good guys" in terms of search robots)
- etc....
It would be awesome if something like this could be community driven, but "moderated" and made available either as NS data replacements and/or as an externally hosted service.
Ok, I am going to go back to bed or get more coffee... bed? coffee? bed? coffee? ...
evaders99

Posted: Tue Apr 25, 2006 7:42 am
That is what I was planning. I just have to figure out a way to distribute and update it without putting too many requests on my server.
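One cheap option might be plain HTTP conditional GETs, so an unchanged list costs almost nothing to serve - a rough sketch of the serving side (the file name is illustrative):

    $file  = 'blocklist.txt';
    $mtime = filemtime($file);
    // If the client's cached copy is current, answer 304 and send no body.
    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
        strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
    header('Content-Type: text/plain');
    readfile($file);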
srhh
Involved


Joined: Dec 27, 2005
Posts: 296
Posted: Tue Apr 25, 2006 12:30 pm
This doesn't relate to the bots issue, but an idea would be to redirect Sentinel, after a repeated attack, to some computer crimes page on the FBI website.
They'll never know whether a report was made or not. (Yes, kiddies, it is a federal offense. Sleep tight.)
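If Sentinel ever grows such a hook, the redirect itself is trivial - a sketch, where $is_repeat is a hypothetical flag from a blocked-table lookup like the one kguske described:

    // Bounce repeat offenders to the feds instead of the normal block page.
    if ($is_repeat) {
        header('Location: http://www.fbi.gov/');
        exit;
    }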
evaders99

Posted: Tue Apr 25, 2006 12:48 pm
One thing I'm thinking of is publishing revisions the way SVN does... transmitting only the necessary changes. But that means keeping things in a file format.
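Roughly what I picture, assuming a made-up format where each revision file holds lines like "+1.2.3.4" (add) and "-5.6.7.8" (remove), and the URLs are just placeholders:

    $local = (int) trim(@file_get_contents('blocklist.rev'));   // last applied revision
    $list  = file('blocklist.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $rev   = $local + 1;
    // Fetch each newer revision until the server has no more.
    while (($delta = @file_get_contents("http://example.com/revs/$rev.txt")) !== false) {
        foreach (explode("\n", trim($delta)) as $line) {
            if ($line === '') continue;
            $ip = substr($line, 1);
            if ($line[0] == '+') {
                $list[] = $ip;                           // addition
            } else {
                $list = array_diff($list, array($ip));   // removal
            }
        }
        $rev++;
    }
    file_put_contents('blocklist.txt', implode("\n", array_unique($list)) . "\n");
    file_put_contents('blocklist.rev', (string) ($rev - 1));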
kguske

Posted: Tue Apr 25, 2006 8:33 pm
I'd consider a SOAP server - and maybe charging a small annual fee for updates...
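With PHP 5's SOAP extension the client side could be as small as this sketch (the endpoint, URN, method name and subscription key are all hypothetical):

    $client = new SoapClient(null, array(
        'location' => 'http://example.com/blocklist/soap.php',
        'uri'      => 'urn:blocklist',
    ));
    // Ask the server for everything newer than our local revision.
    $updates = $client->getUpdates('my-subscription-key', 42);
    foreach ($updates as $entry) {
        // apply each returned block entry locally
    }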
Guardian2003
Site Admin

Joined: Aug 28, 2003
Posts: 6799
Location: Ha Noi, Viet Nam
Posted: Wed Apr 26, 2006 2:28 am
At the moment, my module holds the 'bad referer' data in my own database and updates a simple delimited text file. The distributed module connects to that file so the user can update the list, which the module writes at the beginning of the user's htaccess file.
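The htaccess entries it ends up writing are along these lines (the referer patterns here are just for illustration):

    RewriteEngine On
    # Deny any request whose Referer matches a listed bad site.
    RewriteCond %{HTTP_REFERER} badsite\.example [NC,OR]
    RewriteCond %{HTTP_REFERER} casino-pills\.example [NC]
    RewriteRule .* - [F]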
There is also a means to 'report' a bad referer for investigation.
All very basic stuff, and just about within my 'skill' level lol.
I have not seen any negative impact from serving the list to the 50-odd users who have downloaded the module, but by default, bad referers on a user's site are redirected to an 'advice' page on mine - you should see my referer list - WOW!
So the theory certainly works!
Yes, I think it would be great if lists like this could be incorporated not just into Sentinel but into 'nuke' itself.
By that I mean we could block bad referers, bad IPs, bad bots and website downloaders, and stop known spammer email addresses from registering on nuke sites - the potential is enormous.
evaders99 - you have far more experience with PHP than me and could probably do in 10 minutes what would take me months to accomplish, so, trying to stay 'on topic', here is something of my 'vision'.
Eventually..........
The user's module would take the initial list and move the data into a DB table dedicated to this purpose.
The module would show whether or not an update was available via an XML file (a rough sketch follows below), and when the user clicks the link, the user's DB would be modified to include additional data or drop old data, checking for duplicates etc. before writing to/amending the htaccess file.
The user should have the facility to 'report' a bad IP/referer/bot etc. to the 'master list', where it is held 'pending' until verified/moderated.
When the user submits this report, it should also add the data to a local (user) file so the user can block whatever the data pertains to, even if the submitted 'report' is denied entry into the master list.
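The XML update check itself could be as simple as this sketch (the feed URL and element name are invented for illustration):

    // Compare the master list's published revision against our local one.
    $feed  = @simplexml_load_file('http://example.com/blocklist/version.xml');
    $local = (int) trim(@file_get_contents('blocklist.rev'));
    if ($feed !== false && (int) $feed->revision > $local) {
        echo 'Blocklist update available: revision ' . (int) $feed->revision;
    }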
evaders99

Posted: Wed Apr 26, 2006 10:20 am
I have seen your bad referer database - good job. XML may be the way to go here; some routines from Sentinel should help with the IP parts.
Definitely want to include a reporting feature.
What I've been looking into is the way spam blocklists work. They all work from DNS now: you ask the DNS server about an IP and it tells you whether or not it is listed. It is quite effective, but it means an IP check on every request, and I'm not sure if it is even cached. Those like SORBS and SPEWS are quite good... but each block list has its own way of managing things. Some keep only a known list of reports... some will allow people to remove entries, while others will not. While this may not affect us so much, some block lists use automated systems to find open relays, etc.
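The lookup trick is neat: you reverse the IP's octets, append the blocklist zone, and check whether an A record comes back. A sketch (the zone name is illustrative - you'd plug in a real DNSBL zone):

    function dnsbl_listed($ip, $zone = 'dnsbl.example.org')
    {
        // 1.2.3.4 becomes 4.3.2.1.dnsbl.example.org
        $query = implode('.', array_reverse(explode('.', $ip))) . '.' . $zone;
        return checkdnsrr($query, 'A');   // an A record means the IP is listed
    }

    if (dnsbl_listed($_SERVER['REMOTE_ADDR'])) {
        // treat the visitor as a known bad source
    }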
Some of the same blocks used against spammers could be used against hackers - the same people use open proxies, dynamic IPs, spam-friendly "bulletproof" hosting, etc.
As a policy, I would say be strict on submissions. They must have all relevant data... down to the last detail. We wouldn't want people to submit fakes. I think a human eye is good here... we'll want an actual person to review submissions before anything gets blocked.
If we identify a repeat offender, we notify their ISP or host. What we may consider is a no-second-chances policy... if we notify and the ISP does nothing, we keep the block permanent until that IP range is reassigned. That would cut down on a lot of delisting/relisting churn.
Sorry this is so long... At any rate, these are just the ideas floating through my head. I have a couple of other models I want to integrate: proper HTTP rules like Bad Behavior for blogs/wikis, reporting models such as SpamCop, etc...
montego

Posted: Wed Apr 26, 2006 8:09 pm
Stop, please, you guys are getting me excited!
Guardian2003

Posted: Thu Apr 27, 2006 3:03 am
evaders, do you have any links to SORBS or SPEWS so I can do some homework?
I totally agree that the human element in making the final decision on what gets included in a 'block' list is essential - for the same reason I have kept away from 'automation' thus far, the list is only as good as the 'block' data is valid.
It only takes one rogue entry to make the whole exercise worthless.
Take, for example, 'proxies' - we all know that undesirable people use proxies as a way to hide their tracks, and many people advocate 'blocking proxies' in Sentinel. So what then becomes of, say, the AOL users who may very well be legitimate users of a site but, through no fault of their own, are routed through a proxy...
evaders99

Posted: Thu Apr 27, 2006 7:58 am
Wikipedia is a good start on how DNS blocklists work:
http://en.wikipedia.org/wiki/DNSBL
Some other good reads:
SORBS - http://www.us.sorbs.net/faq/
SPEWS - http://www.spews.org/faq.html
I'm also reading more about hacker and robot detection. The Honeynet Project is quite good at that, making "honey pots" of servers they can use to analyze hackers' work without any damage being done to a real server. http://www.honeynet.org/
I doubt we'd need something as complicated as a fake server, but it may be useful to have certain sites act as "honey pots"... not actually blocking IPs, but sending the reports on to be analyzed (a rough sketch follows below). My plan is to use several of my sites for that.
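On the honey pot sites the handler would do little more than this sketch (the report URL is made up):

    // Log the attempt and forward it for analysis instead of banning the IP.
    $report = array(
        'ip'    => $_SERVER['REMOTE_ADDR'],
        'uri'   => $_SERVER['REQUEST_URI'],
        'agent' => isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
        'time'  => gmdate('Y-m-d H:i:s'),
    );
    @file_get_contents('http://example.com/honeypot/report.php?' . http_build_query($report));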
All just preliminary work... for when I can actually get time to work on it. My robot script so far isn't extensive, but I have about 200 IPs banned at the moment just from it.
Guardian2003

Posted: Thu Apr 27, 2006 10:01 am
I'm getting a few trapped by including a partial SQL injection URL in robots.txt and letting Sentinel do some work.
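For anyone who wants to try it, the entry is along these lines (the path is illustrative): polite crawlers skip the disallowed URL, while harvesters that mine robots.txt for targets request it and trip Sentinel's union blocker.

    User-agent: *
    Disallow: /modules.php?name=Search&query='UNION%20SELECT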