We have not yet fully digested the BEAST vulnerability, and here comes, from Bochum, Germany, another serious flaw involving encryption, or rather, involving XML Encryption.
XML Encryption is a W3C standard widely used to securely transmit information in application-to-application web service connections. It was believed to be a robust standard mechanism for protecting data exchanged between a wide class of applications using web services, deployed in different sectors, for instance business, e-commerce, finance, healthcare, government and military applications. For the average user, a typical scenario is, for example, the encryption of credit card information for a payment within an XML-based purchase order.
Unfortunately it looks like the mechanism is not as robust as it was supposed to be, and the discovery comes from Juraj Somorovsky and Tibor Jager, two researchers at the Ruhr University of Bochum (RUB), who succeeded in cracking parts of the XML Encryption process used in web services, making it possible to decrypt encrypted data. They demonstrated their work at the ACM Conference on Computer and Communications Security in Chicago this week.
As far as the attack technique is concerned, once again CBC (Cipher-Block Chaining) is indicted, since the newly discovered vulnerability, as in the case of the BEAST attack, is exploitable only in this encryption mode.
The attack strategy is very similar to the one behind the BEAST attack: it decrypts data encrypted with AES-CBC by sending modified ciphertexts to the server and gathering information from the resulting error messages. It exploits a cryptographic weakness of CBC mode, in particular the fact that, by conveniently manipulating the IV, a ciphertext encrypted in CBC mode can be modified so that the resulting plaintext is related to the original one in a predictable way (see the description of the BEAST attack for an example).
So, with a chosen ciphertext, the attacker is able to recover the entire plaintext; the only prerequisite is the availability of what the researchers call an “oracle”, that is, a pattern telling the attacker whether a given ciphertext contains a “correctly formed” plaintext, i.e. a valid encoding (e.g. in UTF-8 or ASCII) of a message. Even worse, XML Signature is not able to mitigate the attack.
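To illustrate the CBC weakness the attack builds on, here is a minimal sketch. The “block cipher” below is a toy XOR stand-in (not AES), and the message is invented; the point is only the structural property P1 = Decrypt(C1) XOR IV, which makes the first plaintext block malleable through the IV without any knowledge of the key:

```python
# Toy illustration of CBC malleability. The "block cipher" is a plain XOR
# with a fixed key -- NOT a real cipher, just enough to expose the CBC
# structure: plaintext_block = Decrypt(cipher_block) XOR previous_block.
# Any bit the attacker flips in the IV flips the same bit in block 1.

BLOCK = 16
KEY = bytes(range(BLOCK))  # toy key, illustration only

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(iv, pt):
    out, prev = [], iv
    for i in range(0, len(pt), BLOCK):
        c = xor(xor(pt[i:i + BLOCK], prev), KEY)  # E(P XOR prev)
        out.append(c)
        prev = c
    return b"".join(out)

def cbc_decrypt(iv, ct):
    out, prev = [], iv
    for i in range(0, len(ct), BLOCK):
        c = ct[i:i + BLOCK]
        out.append(xor(xor(c, KEY), prev))        # D(C) XOR prev
        prev = c
    return b"".join(out)

iv = bytes(BLOCK)                       # all-zero IV for simplicity
ct = cbc_encrypt(iv, b"PayTo=Alice;1EUR")
# Attacker flips one bit of the IV: the same bit flips in the plaintext.
evil_iv = bytes([iv[0] ^ 0x10]) + iv[1:]
print(cbc_decrypt(evil_iv, ct))         # first byte 'P' becomes '@'
```

The real attack combines this controlled malleability with the error-message oracle to test candidate plaintext bytes one by one.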
In their paper the authors showed that a moderately optimized implementation of the attack was able to decrypt 160 bytes of encrypted data within 10 seconds by issuing 2,137 queries to the web service. Moreover, the complexity of the attack grows only linearly with the ciphertext size, so recovering a larger plaintext of 1,600 bytes takes about 100 seconds and 23,000 queries.
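As a quick back-of-the-envelope check, using only the figures quoted above, the two reported data points indeed imply a roughly constant number of oracle queries per decrypted byte, consistent with the claimed linear scaling:

```python
# Consistency check on the reported attack cost (figures from the paper).
q1, b1 = 2_137, 160     # queries, bytes decrypted in ~10 seconds
q2, b2 = 23_000, 1_600  # queries, bytes decrypted in ~100 seconds

print(q1 / b1)  # ~13.4 queries per byte
print(q2 / b2)  # ~14.4 queries per byte -- roughly constant, i.e. linear cost
```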
The proof of concept was performed on a web service based on the Apache Axis2 XML framework and verified on JBoss, but many other vendors are affected, which is why the two researchers reported the vulnerability to the W3C XML Encryption Working Group in February 2011. Affected vendors include the above-mentioned Apache Software Foundation (Apache Axis2) and Red Hat (JBoss), but also IBM and Microsoft.
Unfortunately, fixing the flaw will not be that easy: the only suitable way is to replace CBC mode with a symmetric cryptographic primitive providing both confidentiality and integrity, which means changing the XML Encryption standard!
Advanced Persistent Threats are probably the most remarkable events for information security in 2011, since they are redefining the infosec landscape from both a technology and a market perspective.
I consider the recent shopping spree in the SIEM arena by IBM and McAfee a sign of the times and a demonstration of this trend. This is not a coincidence: as a matter of fact, the only way to stop an APT before it reaches its goal (the organization's data) is an accurate analysis and correlation of the data collected by security devices. An APT attack deploys different stages with different tactics, different techniques and different timeframes, which moreover affect different portions of the infrastructure. As a consequence, a holistic view and holistic information management are needed in order to correlate pieces of information spread across different parts of the network and collected by different, somewhat heterogeneous and apparently unrelated, security devices.
Consider for instance the typical cycle of an attack carried out by an APT:
Of course the picture does not take into consideration the user, who is the greatest vulnerability (but unfortunately a user does not generate logs, except in a verbal format not so easy for a SIEM to analyze). Moreover, the model should be multiplied by the number of victims, since such an attack is unlikely to be performed on a single user at a time.
In the end, however, it is clear that an APT affects different components of the information security infrastructure at different times with different threat vectors:
- Stage 1 of an APT attack usually involves a spear-phishing e-mail with an appealing subject and argument, and a malicious payload in the form of an attachment or a link. In both cases the e-mail antivirus or antispam is impacted in the ingress stream (and would be expected to detect the attack; am I naive if I suggest that a DNS lookup could have avoided attacks like this?). The impacted security devices produce some logs, even if they are not straightforward to interpret when the malicious e-mail has not been flagged as a possible threat, or has been flagged only with a low confidence score. In this stage of the attack, the interval between the receipt of the e-mail and its reading can range from a few minutes up to several hours.
- The following stage involves user interaction. Unfortunately there is no human firewall so far (it is something we are working on), only user education (a very rare gift). As a consequence the victim is lured into following the malicious link or clicking on the malicious attachment. In the first scenario the user is directed to a compromised (or crafted) web site where he downloads and installs malware (or enters some credentials, which are then used to steal his identity, for instance for a remote access login). In the second scenario the user clicks on the attached file, which exploits a 0-day vulnerability to install a Remote Administration Tool. The interval between reading the malicious e-mail and installing the RAT likely takes only a few seconds. In any case, endpoint security tools may help to prevent surfing to the malicious site or, if they leverage behavioral analysis, to detect an anomalous pattern from an application (a 0-day is always a 0-day, and they are often released only after making reasonably sure they will not be detected by traditional AV). Hopefully, in both cases some suspicious logs are generated by the endpoint.
- RAT control is the following stage: after installation, the malware uses the HTTP protocol to fetch commands from a remote C&C server. Of course the malicious traffic is forged so that it can hide inside legitimate traffic. In any case the traffic passes through firewalls and NIDS at the perimeter (matching allowed rules), and both kinds of devices are expected to produce related logs;
- Once the attacker is in full control, the compromised machine is used as a hop to reach other hosts (now he is inside) or to sweep the internal network looking for the target data. In this case a NIDS/anomaly detector should be able to detect the attack by monitoring, for instance, the number of attempted authentications or wrong logins: that is how Lockheed Martin prevented an attack perpetrated by means of compromised RSA seeds, and also how, during the infamous breach, RSA detected the attack using anomaly-detection technology from NetWitness (acquired by EMC, RSA's parent company, immediately after the event).
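The last stage above, detecting an internal sweep through bursts of failed logins, can be sketched as a toy correlation rule of the kind a SIEM or anomaly detector would run. The event format, the 5-minute window and the threshold of 10 failures are all invented for illustration, not taken from any actual product:

```python
# Toy correlation rule: flag any source host producing an unusual burst of
# failed logins within a sliding time window (a possible internal sweep).
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)   # invented window
THRESHOLD = 10                  # invented threshold: failures per host per window

def sweep_suspects(events):
    """events: iterable of (timestamp, source_host, outcome) tuples."""
    failures = defaultdict(list)
    suspects = set()
    for ts, host, outcome in sorted(events):
        if outcome != "failure":
            continue
        failures[host].append(ts)
        # keep only the failures still inside the sliding window
        failures[host] = [t for t in failures[host] if ts - t <= WINDOW]
        if len(failures[host]) >= THRESHOLD:
            suspects.add(host)
    return suspects

base = datetime(2011, 10, 20, 9, 0)
# one noisy host hammering logins every 10 seconds, one quiet host
events = [(base + timedelta(seconds=10 * i), "10.0.0.42", "failure") for i in range(12)]
events += [(base + timedelta(minutes=i), "10.0.0.7", "failure") for i in range(3)]
print(sweep_suspects(events))  # {'10.0.0.42'}
```

The hard part in practice is not this rule but feeding it coherent, normalized events from the heterogeneous devices touched in the earlier stages, which is exactly the correlation problem discussed below.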
At this point it should be clear that this lethal blend of threats is pushing security firms to redefine their product strategies, since they face a double crucial challenge: to dramatically improve not only their 0-day detection capability, but also their capability to manage and correlate the data collected from their security solutions.
As far as 0-day detection is concerned, next-generation technologies will include processor-assisted endpoint security or a new class of network devices such as DNS firewalls (thanks to @nientenomi for reporting the article).
As far as data management and correlation are concerned, yes, of course a SIEM is a beautiful concept… until one faces the issue of correlation, which means that SIEM projects often become useless because the correlation patterns are too complex and far from straightforward. This is the reason why the leading vendors are rushing to include an integrated SIEM technology in their product portfolios, in order to provide an out-of-the-box correlation engine optimized for their own products. The price to pay will probably be a segmentation and verticalization of the SIEM market, in which the leading vendors will each have their own solution (not so optimized for competitors' technologies) at the expense of generalist SIEM vendors.
On the other hand, APTs are alive and kicking, keep on targeting US defense contractors (Raytheon is the latest) and are also learning to fly through the clouds. Moreover, they are well hidden, considering that, according to the Security Intelligence Report Volume 11 issued by Microsoft, less than one per cent of exploits in the first half of 2011 were against zero-day vulnerabilities. That 1% makes the difference! And it is a big difference!
- Information, The Next Battlefield (paulsparrows.wordpress.com)
Today the information security arena has been shaken by two separate, although similar, events: IBM and McAfee, two giants in this troubled market, have separately decided to make a decisive move into the Security Information and Event Management (SIEM) market by acquiring two privately held leading companies in this sector.
As a matter of fact, almost simultaneously today, IBM officially announced the acquisition of Q1 Labs while McAfee officially declared its intent to acquire the privately owned company NitroSecurity.
Although part of different tactics, the two moves follow, in my opinion, the same strategy, which aims to build a unified and self-consistent security model: a complete security framework must not only provide information but also the intelligence to manage it. Information is power, and security is no exception to this rule.
But in order to be real power, information must be structured, and here comes the key point. Both vendors are leading providers of network and host intrusion prevention solutions, a heritage of the acquisitions of ISS by IBM and IntruShield by McAfee, and hence have the ability to capture security events from endpoints and networks. They definitely have the ability to provide the information, but they lack the adequate intelligence to correlate and manage it in order to make it structured.
This is completely true for McAfee, which (at least until today) lacked a SIEM solution in its portfolio and needed to rely on SIA-certified SIEM partners (of course NitroSecurity was certified as a Sales Teaming Partner, the highest level). But in part it is also true for IBM, which, despite the Micromuse acquisition and its troubled integration with Tivoli, was never able to become a credible player in this market, confined to the boundaries of the various (magic) quadrants.
Now they can make a decisive change to their positioning and also leverage a powerful Trojan horse (information management) to push their technologies and conquer new customers and market segments.
Is it merely a coincidence that another leading provider of SIEM solutions (ArcSight) is part of a company (HP) which also has in its portfolio TippingPoint (part of the 3Com acquisition), a leading provider of network IPS?
Event detection and event correlations (and management) are converging in the new Unified Security Model, general SIEM vendors are advised…
The intention of UK-headquartered company Sophos to acquire Astaro, the privately held security company co-headquartered in Karlsruhe, Germany and Wilmington, Massachusetts (USA), is simply the latest effect of the process of vendor consolidation at work in the information security market. It is also the trigger for some random thoughts…
In the last two years a profound transformation of the market has taken place, which has seen the birth (and subsequent growth) of several giant security vendors capable of gathering the different areas of information security under their protective wings.
The security model is rapidly converging toward a framework that tends to collect the different domains of information security under a unified management function; according to my personal end-to-end model, they may be summarized as follows:
- Endpoint Security, including the functions of antivirus, personal firewall/network access control, host IPS, DLP and encryption. This component of the model is rapidly converging toward a single concept of endpoint covering all types of devices: server, desktop, laptop and mobile;
- Network & Content Security, including the functions of firewall, IPS, and web and e-mail protection;
- Application Security, including the areas of web/XML/database firewalls and (why not) proactive code analysis;
- Compliance, including the functions of assessment and verification of the security posture of devices and applications;
- Identity & Access Management, including the functions of authentication and secure data access;
- Management, including the capability to manage all the above domains from a single location, with an RBAC model.
All the major players are moving quickly toward such a unified model, starting from their traditional battlefields: some vendors, such as McAfee and Symantec, initially moved from the endpoint domain, which is their traditional strong point. Other vendors, such as Check Point, Fortinet, Cisco and Juniper, moved from the network, filling all the domains of the model directly with their own technology, or by means of dedicated acquisitions or tailored strategic alliances. A third category is composed of the “generalist” vendors that were not initially focused on information security but became so by means of specific acquisitions. This is the case of HP, IBM and Microsoft (in rigorous alphabetical order), which come from a different technological culture but are trying to become key players through strategic acquisitions.
It is clear that in such a complicated market the position and role of the smaller, vertical players is becoming harder and harder. They may “hope” to become the prey of “bigger fishes”, or make acquisitions themselves in order to reach the “critical mass” necessary to survive.
The acquisition of Astaro by Sophos should be viewed in this scenario: from a strategic perspective, Sophos resides permanently among the leaders inside the Gartner Magic Quadrant, but two of its three companions (Symantec and McAfee; the third is Trend Micro) are rapidly expanding toward the other domains (meanwhile, McAfee has been acquired by Intel). In any case, all the competitors are significantly larger than Sophos, which is reflected in revenues: in FY 2010 they were respectively $6.05B, $2.06B and $1.04B, far bigger than Sophos, whose FY 2010 revenues were approximately $260M, about one fourth of the smallest of the three above (Trend Micro).
In perspective, the acquisition may be even more appealing and interesting for Astaro, which is considered one of the most visionary players in the UTM arena, with a primary role in the European market. Its position with respect to the competition is also more complicated, since its main competitors are firms such as Fortinet, Check Point and SonicWall, all of which are much bigger (as an example, Check Point's revenues were about $1.13B in FY 2010, which sounds impressive compared with the $56M made by Astaro in the same fiscal year).
In this scenario, the combined company aims to reach $500 million in 2012.
Last but not least, both companies are based in Europe (in England and Germany respectively) and can rely on a US headquarters in Massachusetts.
From a technological perspective, the two vendors are complementary, and the strategy of the acquisition is well summarized by the following phrase contained in the Acquisition FAQ:
Our strategy is to provide complete data and threat protection for IT, regardless of device type, user location, or network boundaries. Today, we [Sophos] offer solutions for endpoint security, data protection, and email and web security gateways. The combination of Sophos and Astaro can deliver a next generation set of endpoint and network security solutions to better meet growing customer needs […]. With the addition of Astaro’s network security, we will be the first provider to deliver truly coordinated threat protection, data protection and policy from any endpoint to any network boundary.
Sophos lacks a network security solution in its portfolio, and the technology from Astaro could easily fill the gap. On the other hand, Astaro does not own a home-built antivirus technology for its products (so far it uses the ClamAV and Avira engines to offer a double layer of protection), and the adoption of Sophos technology (considered one of the best OEM antivirus engines) could be ideal for its portfolio of UTM solutions.
Moreover, the two technologies fit well together to build an end-to-end security model: as a matter of fact, information security is breaking the boundary between endpoint and network (as the threats already did). Forced to adapt to the new blended threats, which often use old traditional methods to exploit 0-day vulnerabilities on the endpoint, some technologies, like intrusion prevention, DLP and network access control, typically cut across different elements of the infrastructure. This explains the rush of several players (as Sophos did in this circumstance) to enrich their security portfolios with solutions capable of covering all the information security domains.
Just to get an idea, have a look at some of the acquisitions made by the main security players in the last few years (sorry for the Italian comments). Meanwhile, the other lonely dancers (that is, the companies currently facing the market on their own) are advised…
- Sophos to acquire Astaro – some reactions (nakedsecurity.sophos.com)
- Sophos Acquires Internet Security Appliance Maker Astaro (techcrunch.com)
- Application Security: What’s Next? (paulsparrows.wordpress.com)
How many times, stuck in traffic on a hot August day, have we hoped for a pair of wings to fly through the clouds, free from the wreckage of burning metal.
Unfortunately, at least for me (even if my second name in English would sound exactly like Sparrows), no wings have miraculously popped up to save me so far. Nevertheless I am quite confident that, in the near future, I will be saved by the clouds even if I am not able to fly; or better said, I will be saved by cloud technologies that will help me, and the other poor drivers bottled between the asphalt and the hot metal under the ruthless August sun, to avoid unnecessary endless traffic jams on Friday afternoons.
Some giants of information technology (Cisco and IBM in primis) are exploring and experimenting with such solutions, aimed at providing car drivers with real-time information about traffic and congestion in order to suggest the optimal route. In this way they will benefit the individual, sparing him a further amount of unnecessary stress, and the community as well, helping to fight pollution and making the whole environment more livable and enjoyable.
The main ingredients of this miraculous technological recipe are mobile technologies and cloud technologies, and the reasons are easy to understand: everybody always carries a smartphone, which is an invaluable real-time probe and a source of the precious data needed to model a traffic jam (assuming it will ever be possible to model a traffic jam in the middle of the Big Ring of Rome): as a matter of fact, a smartphone can provide real-time traffic information correlated with important parameters such as GPS position, average speed, etc.
Cloud technologies provide the engine to correlate information coming from the mobile (and embedded) devices belonging to many different vehicles, supplying the computational (dynamically allocated) resources needed to aggregate, and make coherent, data from many moving sources in different points of the same city or of different interconnected cities. Cloud technologies may act as a single, independent point of collection for the data gathered on each device, dynamically allocating resources on demand (suppose there are, at the same moment, two different jams, one of which is growing at an exponential rate and progressively requires more and more computational resources), and providing the individual (and the city administrators) with a comprehensive real-time framework, coherent and up to date (nobody would want to be led by his navigator to a diversion with a traffic jam much worse than the one which caused the diversion in the first place).
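The aggregation step described above can be sketched in a few lines: probes report a road segment and an instantaneous speed, and the cloud side reduces them to an average speed per segment to spot congestion. Segment names, the congestion threshold and the probe format are all invented for illustration; a real system would obviously be far more elaborate:

```python
# Toy sketch of cloud-side probe aggregation: reduce (segment, speed)
# readings from many vehicles to an average speed per road segment.
from collections import defaultdict

CONGESTED_BELOW = 15.0  # km/h, invented congestion threshold

def segment_speeds(probes):
    """probes: iterable of (segment_id, speed_kmh) from many vehicles."""
    totals = defaultdict(lambda: [0.0, 0])
    for segment, speed in probes:
        totals[segment][0] += speed
        totals[segment][1] += 1
    return {seg: total / count for seg, (total, count) in totals.items()}

def congested(probes):
    return {seg for seg, avg in segment_speeds(probes).items()
            if avg < CONGESTED_BELOW}

probes = [("GRA-km12", 8.0), ("GRA-km12", 11.0),
          ("A24-km3", 92.0), ("A24-km3", 88.0)]
print(segment_speeds(probes))  # {'GRA-km12': 9.5, 'A24-km3': 90.0}
print(congested(probes))       # {'GRA-km12'}
```

The interesting engineering problem is precisely what the text describes: keeping this reduction coherent while probe volume spikes unevenly across segments, which is where the elastic allocation of cloud resources comes in.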
Of course, many consumer navigators already offer real-time traffic information; however, the wide adoption of cloud technologies will offer an unprecedented level of flexibility, together with the ability to deal with a huge amount of data and to correlate the collected information with other data sources (for instance V2V, Vehicle-to-Vehicle, and V2I, Vehicle-to-Infrastructure). From the city administration's perspective, the collected data will be invaluable for identifying the most congested points (and driving the subsequent targeted corrective actions), and moreover for supporting a coherent and sustainable development of the city.
Cisco and IBM are working hard to make this dream come true within a few years, with different approaches converging on the cloud: Cisco is leveraging network intelligence for a pilot project in the Korean city of Busan (3.6 million inhabitants). Cisco's vision aims, in the third phase of the project scheduled before the end of 2014, to provide citizens with many different mobile services in the cloud, with a Software-as-a-Service approach. These services are dedicated to improving urban mobility, distance, energy management and safety. A similar project has recently been announced for the Spanish city of Barcelona.
The IBM project, more focused on applications, is called The Smarter City and aims to integrate all aspects of city management (traffic, safety, public services, etc.) within a common IT infrastructure. A few days ago it was announced that some major cities of the globe, for instance Washington and Waterloo (Ontario), will participate in the initiative.
Even if the cloud provides computing power, dynamism, flexibility and the ability to aggregate heterogeneous data sources at an abstract layer, a serious doubt remains, represented by the security issues of the infrastructure… Apart from return-on-investment considerations (for which there are not yet consistent statistics, given the relative youth of the case studies depicted above), similar initiatives will succeed in their purpose only if supported by a robust security and privacy model. I have already described in several posts the threats related to mobile devices, but in this case the cloud definitely makes the picture even worse, because of the centralization of the information (which, paradoxically, may also be an advantage if one is able to protect it well) and the coexistence of heterogeneous data, even though logically separated, on the same infrastructure. As a consequence, compromising the single point that contains all the data coming from the heterogeneous sources that govern the physiological processes of a city could have devastating impacts, since the system would be affected at different levels and the users in different services. Not to mention, moreover, in the event of wider adoption of these technologies, the ambitions of cyberterrorists who could, with a single computer attack, cripple major cities around the globe.