Unusual traffic patterns
In this page
- Introduction
- Examples of unusual traffic patterns in Apache server
- How to identify malicious users in your Apache server
All the data transiting in and out of your network constitutes the network traffic. Firewall and web server logs help you record the details of this traffic, and analyzing the log data for unusual traffic patterns helps you detect, or even prevent, malicious activities in your network.
This article discusses how to detect unusual traffic patterns by analyzing Apache web server logs.
Here’s a sample Apache server log entry:
86.28.187.243 - - [11/Mar/2013:15:16:41 +0000]
"GET /siteimages/TBP/css/dynamiccontent.css HTTP/1.1" 200 3554 0
When you parse this log data, either manually or using a log management solution, you can infer the following:
- 86.28.187.243 is the IP address of the client that made the request to the server.
- [11/Mar/2013:15:16:41 +0000] is the time at which the server finished processing the request.
- "GET /siteimages/TBP/css/dynamiccontent.css HTTP/1.1" is the request line from the client, which includes the resource requested and the method used.
- 200 is the status code that the server sent back to the client.
- 3554 is the size of the requested file object in bytes.
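The fields above can be pulled out programmatically. Here is a minimal sketch, assuming the Common Log Format shown in the sample; the group names (client_ip, timestamp, request, status, size) are our own labels, not standard Apache terms.

```python
import re

# Minimal parser for a Common Log Format line like the sample above.
LOG_PATTERN = re.compile(
    r'(?P<client_ip>\S+) \S+ \S+ '      # client IP, identd, userid
    r'\[(?P<timestamp>[^\]]+)\] '       # request completion time
    r'"(?P<request>[^"]*)" '            # request line (method, path, protocol)
    r'(?P<status>\d{3}) (?P<size>\d+|-)'  # status code and response size
)

line = ('86.28.187.243 - - [11/Mar/2013:15:16:41 +0000] '
        '"GET /siteimages/TBP/css/dynamiccontent.css HTTP/1.1" 200 3554')

match = LOG_PATTERN.match(line)
fields = match.groupdict() if match else {}
print(fields["client_ip"], fields["status"], fields["size"])
```

A log management solution does essentially this parsing at scale, across every line in the access log.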
Information such as the IP address, the size of the requested file object, and more can help you detect unusual traffic. Here are a few examples of unusual traffic patterns.
- Web traffic due to robots
According to Google, any traffic generated by the search activities of robots, computer programs, or automated services is deemed unusual. Not all bots are harmful, but harmful bots can inflict considerable damage, ranging from stealing information to infecting hosts as part of a DDoS attack. The best way to detect robot traffic is to compile a list of robot IPs and filter requests against it.
- Geographical irregularities
Another indicator of compromise is a geographical irregularity. When your network traffic is inbound from, or outbound to, a country with which your organization has no connections, it could be a sign of malicious activity. There is no fool-proof method of determining a client's exact geographic location, but tools that approximate it from WHOIS records are available. Deploying threat intelligence systems is of immense help in this regard.
- Unusual command and control traffic
Command and control (C&C) servers play a vital role in botnet attacks. An organization's network can fall prey to a botnet infection, wherein a large number of systems, often belonging to a single network, are infected and used to carry out illegal activities or infect more systems. C&C servers are at the helm of botnet operations. Therefore, a large volume of packets travelling into the network as C&C traffic is suspicious.
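The IP-filtering approach described above for robot traffic can be sketched in a few lines. This is an illustrative sketch only: the IPs in KNOWN_BOT_IPS are made-up placeholder values, and a real deployment would load such a list from a maintained feed.

```python
# Flag access-log lines whose client IP appears in a known-bot list.
# The IPs below are illustrative placeholders, not a real bot list.
KNOWN_BOT_IPS = {"66.249.66.1", "157.55.39.10"}

def is_bot_request(log_line: str) -> bool:
    # The client IP is the first whitespace-separated field of a CLF line.
    client_ip = log_line.split(" ", 1)[0]
    return client_ip in KNOWN_BOT_IPS

lines = [
    '66.249.66.1 - - [11/Mar/2013:15:16:41 +0000] "GET / HTTP/1.1" 200 3554',
    '86.28.187.243 - - [11/Mar/2013:15:16:41 +0000] "GET / HTTP/1.1" 200 3554',
]
bot_hits = [line for line in lines if is_bot_request(line)]
print(len(bot_hits))
```

The same pattern extends to geographic filtering: replace the IP set with a lookup that maps each client IP to a country and flag countries your organization has no dealings with.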
Identifying malicious users
If the access logs point to scraping of large amounts of data from the website, it is most likely an attempt at making it unresponsive, in other words, a DoS attack. This information can be parsed from the logs using the following query, which totals the kilobytes transferred per client IP and lists the top 20 consumers.
_sourceCategory=Apache/Access | parse regex
"(?<client_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})" | parse regex
"HTTP/1\.1\" \d+ (?<size>\d+)" | (size/1024) as kbytes | sum(kbytes) as kbytes by client_ip | sort by kbytes | limit 20
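For readers without a log management tool, the same aggregation can be sketched in plain Python: total the kilobytes served per client IP and list the heaviest consumers first. The function and variable names here are our own.

```python
import re
from collections import defaultdict

# Matches a CLF line: capture the client IP and the response size in bytes.
PATTERN = re.compile(r'^(\d{1,3}(?:\.\d{1,3}){3}) .*" \d{3} (\d+)')

def top_consumers(log_lines, n=20):
    """Return the top-n client IPs by total kilobytes transferred."""
    kbytes = defaultdict(float)
    for line in log_lines:
        m = PATTERN.match(line)
        if m:
            ip, size = m.group(1), int(m.group(2))
            kbytes[ip] += size / 1024
    return sorted(kbytes.items(), key=lambda kv: kv[1], reverse=True)[:n]

logs = [
    '86.28.187.243 - - [11/Mar/2013:15:16:41 +0000] "GET /a HTTP/1.1" 200 3554',
    '198.51.100.9 - - [11/Mar/2013:15:16:42 +0000] "GET /b HTTP/1.1" 200 40960',
]
for ip, kb in top_consumers(logs):
    print(ip, round(kb, 1))
```

An IP that dominates this list over a short window is a candidate for rate limiting or blocking.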
Error logs also contain a wealth of information. Error 404, for example, means that a client established a connection to your server but requested a nonexistent page. A large number of error-generating requests can bring down your website, which is why it is essential to identify the IPs of such clients. They can be parsed using the following query, which extracts clients whose requests produced severe error levels:
_sourceCategory=Apache/Error | parse regex
"(?<client_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})" | parse regex
"\[.*:(?<log_level>[a-z]+)\]" | where client_ip !="" AND log_level in ("emerg", "alert", "crit") | count by client_ip | top 10 client_ip by _count
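The same idea in plain Python: pull the client IP and log level out of each error-log line, keep only the severe levels, and report the noisiest clients. This is a sketch assuming the classic Apache 2.2 error-log layout shown in the sample lines; real error logs vary by Apache version and configuration.

```python
import re
from collections import Counter

# Extract the log level and client IP from an Apache 2.2-style error-log line.
LINE_RE = re.compile(r'\[(?P<level>\w+)\] \[client (?P<ip>[\d.]+)\]')
SEVERE = {"emerg", "alert", "crit"}

def top_offenders(lines, n=10):
    """Return the n client IPs with the most severe error entries."""
    hits = Counter(
        m.group("ip")
        for m in (LINE_RE.search(line) for line in lines)
        if m and m.group("level") in SEVERE
    )
    return hits.most_common(n)

logs = [
    "[Wed Mar 11 15:16:41 2013] [crit] [client 203.0.113.7] ...",
    "[Wed Mar 11 15:16:42 2013] [crit] [client 203.0.113.7] ...",
    "[Wed Mar 11 15:16:43 2013] [error] [client 86.28.187.243] File does not exist",
]
print(top_offenders(logs))
```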
Once identified, these IPs should be blocked. This scenario can play in your favour only if you have real-time monitoring.
Stay vigilant and protected by regularly monitoring the activities in your network. EventLog Analyzer, a comprehensive log management solution, is capable of auditing Apache servers in real time. It provides error and attack reports that help you investigate and ensure the security of your websites. Also, with the custom log parser, a simple yet powerful feature, you can deeply analyze logs, parse out new fields, and constantly be on the lookout for new attack patterns.