
Posts Tagged ‘C&C Server’

The Next Step of Botnets

September 15, 2012 Leave a comment

A BlackHole (Exploit Kit) absorbing an Onion (Ring), the future of Botnets?

This week in information security has offered many interesting developments: the brand new CRIME attack against SSL/TLS, the release of BlackHole Exploit Kit 2.0, which promises new stealth vectors for drive-by download infections, the takedown of the emerging Nitol botnet by Microsoft, and, last but not least, the first (?) known example of a new generation of C&C server leveraging the anonymization granted by the Tor service.

The latter is, in my opinion, the news with the most important consequences for the Information Security community, since it delineates the next step of botnet evolution, after the common, consolidated C&C communication schema and its natural evolution into Peer-to-Peer (P2P) communication.

The first (I wonder if it really is the first) discovery of a botnet command server hidden in Tor, using the IRC protocol to communicate with its zombies, has been announced in a blog post by G-Data. The advantages of such a communication scheme are quite simple: the botnet may use the anonymity granted by the Deep Web to prevent the identification, and hence the likely takedown, of the server, and the encryption of the Tor protocol to make traffic identification by traditional layers of defense harder. Two advantages that greatly outweigh Tor's latency, which is the weakness of this communication scheme.
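To make the scheme concrete, here is a minimal, hypothetical sketch of the IRC exchange such a bot might have with a C&C channel hosted as a Tor hidden service. The nickname, channel name, and the "!command" convention are my own illustrative assumptions, not details from the G-Data analysis; in a real deployment the socket would simply be tunnelled through Tor's local SOCKS proxy (by default 127.0.0.1:9050), which is what hides the server's location.

```python
# Illustrative sketch only: the message format a Tor-hidden IRC C&C
# scheme could use. No network code; names are hypothetical.

def irc_handshake(nick: str, channel: str) -> bytes:
    """Build the IRC registration and channel-join sequence a bot
    would send once connected (through the Tor SOCKS proxy)."""
    lines = [
        f"NICK {nick}",
        f"USER {nick} 0 * :{nick}",
        f"JOIN {channel}",
    ]
    return ("\r\n".join(lines) + "\r\n").encode()

def parse_command(line: str):
    """Extract a C&C order from a channel PRIVMSG.

    Assumed convention: the operator posts orders as '!verb arg ...',
    e.g. ':op!u@h PRIVMSG #chan :!update http://...'. Returns the
    tokenized order, or None for ordinary IRC traffic."""
    if "PRIVMSG" not in line:
        return None
    payload = line.split(":", 2)[-1].strip()
    return payload.split() if payload.startswith("!") else None
```

From the defender's point of view, the interesting part is that all of this rides inside Tor's encrypted tunnel, so a perimeter device sees only a connection to a Tor entry node, not the IRC chatter itself.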

Maybe it was only a matter of time; in any case it is not a coincidence that in the same weeks researchers have discovered BlackHole 2.0 and the first (maybe) C&C infrastructure hidden inside the Deep Web: cyber criminals are continuously developing increasingly sophisticated methods to elude law enforcement agencies and to evade the security controls of the traditional bastions, and botnets are proving more than ever to be the modern biblical plague of the Web…

And even if every now and then the good guys are able to obtain a victory (such as the Nitol takedown), the war is far from over.


I, BOT (Coming To A C&C Server Near You)

May 22, 2012 3 comments

A few days ago I discovered that the city I live in (Rome) ranks second in the world for the number of BOT infections, at least according to the Symantec Internet Security Threat Report, Edition XVII.

Of course reports must be taken with caution, but it is undoubted that bot infections are becoming a huge problem for the Information Security community (a modern biblical plague), so huge as to deserve the attention of the Federal Communications Commission. As a matter of fact, in March 2012 the FCC, working with communications companies including Verizon, Cox, and Comcast, passed a voluntary code that delineates the steps ISPs should take to combat botnets. As you probably know, botnets may be used by cybercrooks to make money from different criminal activities, ranging from information theft to the execution of DDoS attacks: have a look at this interview with a botnet operator to get an idea (and to discover that botnets are also used to counterfeit virtual currency).

Such a plague is pushing a major change to the traditional security paradigm, a change that can be summarized in a few words: if yesterday the refrain for system administrators was “Beware of what enters your network” (so all the security warfare was focused on checking the ingress traffic), today it is becoming: “Beware of what leaves your network”.

This is nothing else than a consequence of the fact that traditional endpoint technologies are proving not to be so effective against bots, so a new approach is needed, one which aims to control the egress traffic generated by compromised endpoints as it leaves the organization. The effectiveness of traditional endpoint technologies is not optimal since new variants (capable of evading antivirus controls) come out much faster than the related signatures developed by vendors: have a look at the average antivirus detection rate against Zeus (the god of bots), and you will probably be disappointed to notice that it is stable at a poor 38%. On the other hand, recognizing the communication patterns at the perimeter is a more profitable strategy, since the different variants generally do not deeply change the communication protocols with the C&C server (unless a P2P protocol is used, see below).
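In practice, perimeter pattern matching of this kind can be as simple as checking outbound request URLs against known C&C check-in signatures. The sketch below is a deliberately simplified illustration; the two patterns (Zeus's classic "gate.php" drop point and an encrypted config download) are commonly cited examples, not a production signature set:

```python
import re

# Simplified egress-inspection sketch: flag outbound HTTP requests whose
# URL path matches a known C&C check-in pattern. Patterns are
# illustrative, not an exhaustive or current signature set.
CC_PATTERNS = [
    re.compile(r"/gate\.php(\?|$)"),   # Zeus-style check-in endpoint
    re.compile(r"/config\.bin$"),      # encrypted bot-config download
]

def is_suspicious_egress(path: str) -> bool:
    """Return True if an outbound request path matches a C&C signature."""
    return any(p.search(path) for p in CC_PATTERNS)
```

The point of the paragraph above is exactly this: the bot binary mutates constantly, but the URL it phones home to tends to stay recognizable far longer, so the perimeter check keeps working while endpoint signatures lag behind.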

The strategy to mitigate botnets relies on the fact that each botnet has (in theory) a single point of failure: the C&C server, to which cyber hunters and law enforcement agencies address their takeover attempts, either to take the botnet down definitively or to turn the server into a sinkhole for studying the exact morphology and extension of the infection. Depending on the botnet configuration, each infected endpoint polls the C&C server for new instructions at a given time interval, and that is the point of the process at which the good guys may act: detecting (and blocking) that traffic makes it possible to identify infected machines (and my experience indicates that too often those machines are equipped with an updated, and blind, antivirus).
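That fixed polling interval is itself a detectable signal: machine-driven check-ins are far more regular than human browsing. A minimal sketch of this idea, flagging a host whose outbound connections are near-periodic (the jitter threshold is an illustrative assumption, not a tuned value):

```python
from statistics import mean, pstdev

def looks_like_beaconing(timestamps, max_jitter=0.1):
    """Flag near-periodic outbound connections as likely C&C polling.

    timestamps: sorted connection times in seconds for one host.
    A low coefficient of variation of the inter-arrival gaps suggests
    an automated poll loop; the 0.1 threshold is purely illustrative.
    """
    if len(timestamps) < 4:
        return False  # too few samples to judge periodicity
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return pstdev(gaps) / mean(gaps) < max_jitter
```

A host checking in every five minutes on the dot stands out this way even when the payload itself is encrypted, which is why beacon analysis complements signature matching at the perimeter.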

For the record, the C&C server is only a theoretical single point of failure, since C&C servers are generally highly volatile and dynamic, so it is not so easy to intercept and block them (the only way to take down a botnet); hence, in my opinion, it would be more correct to say that a botnet has many single points of failure (an information security oxymoron!).

As if that were not enough, in order to make life harder for the good guys, next-generation botnets are deploying P2P protocols to decentralize the C&C function and make their takedown even tougher.

But the good guys have a further weapon in this cat-and-mouse game: cloud intelligence. Even if I am not a cloud enthusiast, I must confess that this technology is proving to be a crucial element in thwarting botnets, since it makes it possible to collect real-time information about new threats and to centralize the “intelligence” needed to dynamically (and quickly) classify them. Real-time information is collected directly from the enforcement points placed at the perimeter, which analyze the egress traffic from organizations containing compromised machines. After successful analysis and classification, the new patterns may be shared among the enforcement points across the five continents in order to provide real-time detection (and hence protection) against new threats. This approach is clearly much more efficient than endpoint-based enforcement (which would need to share the information among a far larger number of devices), provided the enforcement points are positioned adequately, that is, they are capable of monitoring all the egress traffic.
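The loop just described, one enforcement point classifies a new pattern, the cloud pushes the verdict to every other point, can be sketched as a toy publish/subscribe model. All class and method names here are my own illustrative assumptions, not any vendor's API:

```python
# Toy sketch of the cloud-intelligence loop: perimeter enforcement
# points report verdicts on new egress patterns to a central service,
# which distributes them to every subscribed point. Hypothetical names.

class CloudIntel:
    def __init__(self):
        self.verdicts = {}   # pattern -> "malicious" / "benign"
        self.points = []     # subscribed enforcement points

    def subscribe(self, point):
        self.points.append(point)

    def report(self, pattern, verdict):
        """One point classifies a pattern once; all points learn it."""
        self.verdicts[pattern] = verdict
        if verdict == "malicious":
            for p in self.points:
                p.blocklist.add(pattern)

class EnforcementPoint:
    def __init__(self, cloud):
        self.blocklist = set()
        cloud.subscribe(self)

    def allow(self, pattern):
        return pattern not in self.blocklist
```

The efficiency argument in the paragraph above falls out of the structure: classification happens once, centrally, and distribution scales with the number of perimeter devices rather than with the (far larger) number of endpoints.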

The combination of egress traffic analysis and cloud intelligence is a good starting point for mitigating the effects of botnets (for sure it is necessary to identify infected machines) but, as usual, do not forget that the user is the first barrier, so a good level of education is a key factor, together with consolidated processes and procedures to handle infections.
