IP/host filtering in log files?

 
senthil sen
Ranch Hand
Posts: 184
Hello All,

I am trying to find repeated IP hits to the server in a log file: how can I find which IP has the highest number of hits to the server, or which one has done the most damage? And not just IPs, but also which host name or user has the most entries in the server log?

Thanks,
 
Charles Lyons
Author
Ranch Hand
Posts: 836
You're welcome. Despite the question marks, though, there isn't actually a question here, just a set of statements about what you want to do. So what have you thought about or tried, and what problems have you been having? Then we'll go from there...
 
Tim Holloway
Saloon Keeper
Posts: 18156
Ah well, I'm not exactly fluent in Malayalam, Tamil, or Urdu. I think I can see the question in there, even if it doesn't match the question marks.

First, the number of attacks isn't related to the amount of damage done. I typically see about 10,000 attempts on one of my FTP servers every day. So far they have all bounced. But it would take only one successful attack on any IP or port to get my machine owned.

On the Fedora/Red Hat architecture, there's a package called logwatch that scans your log files every day and emails a summary of this and other activity. By default the report goes to the root account on the local machine, but you can override that and send it to whomever and wherever you prefer.

Here's a small typical excerpt:


You can see that after all these years, the SQL Slammer is still alive and well!
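
Since the original question was how to find which IP or host has the most entries in a log, here's a minimal sketch in Python (not logwatch itself, just a rough "top talkers" count). It assumes a common/combined-format access log where the client IP or host name is the first whitespace-separated field; the file name access.log is hypothetical, so point it at your real log and adjust the field index for other layouts.

#!/usr/bin/env python3
# Minimal sketch: count which client IPs/hosts appear most often in a log.
# Assumes the client IP/host is the first whitespace-separated field
# (true for common/combined web access logs); adjust the index otherwise.
from collections import Counter

LOG_FILE = "access.log"   # hypothetical name; point this at your real log
TOP_N = 10                # how many of the busiest clients to report

counts = Counter()
with open(LOG_FILE) as log:
    for line in log:
        fields = line.split()
        if fields:                  # skip blank lines
            counts[fields[0]] += 1  # first field = client IP or host name

# Busiest clients first
for client, hits in counts.most_common(TOP_N):
    print(f"{hits:8d}  {client}")

The output is simply a hit count per IP or host, largest first. As noted above, whether any of those hits actually did damage is a separate question; you still have to read what those entries were.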
 