Gartner recently highlighted the top ten technologies for information security and their implications for security organizations in 2014.
“Enterprises are dedicating increasing resources to security and risk. Nevertheless, attacks are increasing in frequency and sophistication. Advanced targeted attacks and security vulnerabilities in software only add to the headaches brought by the disruptiveness of the Nexus of Forces, which brings mobile, cloud, social, and big data together to deliver new business opportunities,” says Neil MacDonald, vice president and Gartner Fellow. “With the opportunities of the Nexus come risks. Security and risk leaders need to fully engage with the latest technology trends if they are to define, achieve, and maintain effective security and risk management programs that simultaneously enable business opportunities and manage risk.”
The top 10 technologies for information security are:
Cloud Access Security Brokers
Cloud access security brokers are on-premises or cloud-based security policy enforcement points placed between cloud services consumers and cloud services providers to interject enterprise security policies as the cloud-based resources are accessed. In many cases, initial adoption of cloud-based services has occurred outside the control of IT, and cloud access security brokers enable enterprises to gain visibility and control as their users access cloud resources.
Adaptive Access Control
Adaptive access control is a form of context-aware access control that acts to balance the level of trust against risk at the moment of access, using some combination of trust elevation and other dynamic risk mitigation techniques. Context awareness means that access decisions reflect current conditions, and dynamic risk mitigation means that access can be safely allowed where otherwise it would have been blocked. Use of an adaptive access management architecture enables an enterprise to allow access from any device, anywhere, and allows for social ID access to a range of corporate assets with mixed risk profiles.
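The trust-versus-risk balancing described above can be sketched as a simple scoring function. The signals, weights, and thresholds below are purely illustrative assumptions, not drawn from any particular product:

```python
# Sketch of an adaptive access decision: combine contextual risk signals
# into a score, then choose allow / step-up / deny. All weights and
# thresholds here are hypothetical, for illustration only.

def risk_score(context):
    """Sum weighted risk contributions from the access context."""
    score = 0
    if not context.get("managed_device"):  # unmanaged device adds risk
        score += 30
    if context.get("new_location"):        # unfamiliar location adds risk
        score += 25
    if context.get("social_identity"):     # social ID gives weaker assurance
        score += 20
    return score

def access_decision(context, asset_risk):
    """Balance contextual risk against the sensitivity of the asset."""
    score = risk_score(context) + asset_risk
    if score < 40:
        return "allow"
    elif score < 70:
        return "step-up-auth"  # trust elevation, e.g. require a second factor
    return "deny"
```

A low-risk request (managed device, low-sensitivity asset) is allowed outright; a riskier one triggers trust elevation instead of an outright block, which is exactly the "dynamic risk mitigation" idea.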
Pervasive Sandboxing (Content Detonation) and IOC (Indicator of Compromise) Confirmation
Some attacks will inevitably bypass traditional blocking and prevention security protection mechanisms, in which case it is key to detect the intrusion in as short a time as possible to minimize the hacker’s ability to inflict damage or exfiltrate sensitive information. Many security platforms now include embedded capabilities to run (“detonate”) executables and content in virtual machines (VMs) and observe the VMs for indications of compromise. This capability is rapidly becoming a feature of a more-capable platform, not a stand-alone product or market. Once a potential incident has been detected, it needs to be confirmed by correlating indicators of compromise across different entities—for example, comparing what a network-based threat detection system sees in a sandboxed environment to what is being observed on actual endpoints in terms of processes, behaviors, registry entries and so on.
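The confirmation step described above can be sketched as matching sandbox-produced indicators against endpoint telemetry. The indicator format and the "at least two corroborating indicators" rule are illustrative assumptions:

```python
# Sketch of IOC confirmation: an incident flagged by a network sandbox is
# confirmed only if the indicators it produced are also observed on a real
# endpoint. Indicator strings and the confirmation rule are hypothetical.

def confirm_incident(sandbox_iocs, endpoint_observations):
    """Return the sandbox indicators that were also seen on the endpoint."""
    return sandbox_iocs & endpoint_observations

# Indicators produced by detonating a suspicious file in a VM
sandbox_iocs = {
    "process:svch0st.exe",           # suspicious look-alike process name
    "registry:HKCU\\Run\\updater",   # persistence registry entry
    "dns:badhost.example",           # callback domain
}

# What endpoint monitoring actually recorded on one workstation
endpoint = {
    "process:svch0st.exe",
    "process:explorer.exe",
    "registry:HKCU\\Run\\updater",
}

hits = confirm_incident(sandbox_iocs, endpoint)
confirmed = len(hits) >= 2  # require multiple corroborating indicators
```

Two of the three sandbox indicators match real endpoint activity here, so the incident is treated as confirmed rather than a sandbox-only detonation artifact.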
Endpoint Detection and Response Solutions
The endpoint detection and response market is an emerging market created to satisfy the need for continuous protection from advanced threats at endpoints (desktops, servers, tablets and laptops)—most notably significantly improved security monitoring, threat detection and incident response capabilities. These tools record numerous endpoint and network events and store this information in a centralized database. Analytics tools are then used to continually search the database to identify tasks that can improve the security state to deflect common attacks, to provide early identification of ongoing attacks (including insider threats), and to rapidly respond to those attacks. These tools also help with rapid investigation into the scope of attacks, and provide remediation capability.
Big Data Security Analytics at the Heart of Next-generation Security Platforms
Going forward, all effective security protection platforms will include domain-specific embedded analytics as a core capability. An enterprise’s continuous monitoring of all computing entities and layers will generate a greater volume, velocity, and variety of data than traditional security information and event management systems can effectively analyze. Gartner predicts that by 2020, 40% of enterprises will have established a “security data warehouse” for the storage of this monitoring data to support retrospective analysis. By storing and analyzing the data over time, and by incorporating context and including outside threat and community intelligence, patterns of “normal” can be established and data analytics can be used to identify when meaningful deviations from normal have occurred.
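The "establish normal, then flag meaningful deviations" pattern can be sketched with a simple baseline over stored monitoring data. The mean/standard-deviation rule and the sample data are illustrative stand-ins for the far richer analytics the text describes:

```python
# Sketch of baseline-driven anomaly detection over stored monitoring data:
# learn "normal" from history, then flag meaningful deviations. The
# 3-standard-deviation rule and sample figures are illustrative only.

import statistics

def baseline(history):
    """Summarize historical behavior as mean and standard deviation."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomalous(value, history, threshold=3.0):
    """Flag values more than `threshold` standard deviations from normal."""
    mean, stdev = baseline(history)
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

# e.g. daily outbound traffic (GB) for one server over two weeks
history = [1.1, 0.9, 1.0, 1.2, 1.0, 0.8, 1.1,
           1.0, 0.9, 1.1, 1.0, 1.2, 0.9, 1.0]

print(is_anomalous(1.1, history))  # within the established normal range
print(is_anomalous(9.5, history))  # exfiltration-like spike, far outside it
```

The same idea scales up in a security data warehouse: the baseline is computed per entity over long windows, and context and threat intelligence refine what counts as a meaningful deviation.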
Machine-readable Threat Intelligence, Including Reputation Services
The ability to integrate with external context and intelligence feeds is a critical differentiator for next-generation security platforms. Third-party sources for machine-readable threat intelligence are growing in number and include a number of reputation feed alternatives. Reputation services offer a form of dynamic, real-time “trustability” rating that can be factored into security decisions. For example, user and device reputation as well as URL and internet protocol address reputation scoring can be used in end-user access decisions.
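Factoring a reputation feed into a decision can be sketched as below. The feed contents, the 0–100 score scale, and the threshold are hypothetical; real services expose vendor-specific APIs and scales:

```python
# Sketch of using an IP reputation feed in an access decision. The score
# scale (0 = known bad, 100 = fully trusted), the feed contents, and the
# threshold are illustrative assumptions.

REPUTATION_FEED = {           # would be refreshed from a third-party service
    "198.51.100.7": 5,        # e.g. a known command-and-control address
    "203.0.113.10": 85,       # e.g. a well-established business partner
}

def reputation(ip, default=50):
    """Unknown addresses get a neutral score rather than implicit trust."""
    return REPUTATION_FEED.get(ip, default)

def allow_connection(ip, minimum=30):
    """Block connections whose reputation falls below the minimum."""
    return reputation(ip) >= minimum
```

Because the feed is machine-readable, the score can be re-evaluated on every access rather than baked into a static block list, which is what makes the rating dynamic.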
Containment and Isolation as a Foundational Security Strategy
In a world where signatures are increasingly ineffective in stopping attacks, an alternative strategy is to treat everything that is unknown as untrusted and isolate its handling and execution so that it cannot cause permanent damage to the system it is running on and cannot be used as a vector for attacks on other enterprise systems. Virtualization, isolation, abstraction, and remote presentation techniques can be used to create this containment so that, ideally, the end result is similar to using a separate “air-gapped” system to handle untrusted content and applications. Virtualization and containment strategies will become a common element of a defense-in-depth protection strategy for enterprise systems, reaching 20% adoption by 2016 from nearly no widespread adoption in 2014.
Software-Defined Security
“Software defined” refers to the capabilities enabled as we decouple and abstract infrastructure elements that were previously tightly coupled in our data centers: servers, storage, networking, security, and so on. As with networking, compute, and storage, the impact on security will be transformational. Software-defined security doesn’t mean that some dedicated security hardware isn’t still needed—it is. However, as with software-defined networking, the value and intelligence move into software.
Interactive Application Security Testing
Interactive application security testing (IAST) combines static application security testing (SAST) and dynamic application security testing (DAST) techniques in a single solution, aiming to increase the accuracy of application security testing through the interaction of the two. This approach makes it possible to confirm or disprove the exploitability of a detected vulnerability and to determine its point of origin in the application code.
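The core correlation idea can be sketched as cross-referencing static findings with dynamic test results. The finding structures below are illustrative, not any tool's actual output format:

```python
# Sketch of the IAST correlation idea: a statically detected weakness is
# marked confirmed when a dynamic test triggered the same vulnerability
# class. All field names and finding records here are hypothetical.

static_findings = [
    {"id": "SQLI-1", "file": "orders.py", "line": 42, "type": "sql-injection"},
    {"id": "XSS-3", "file": "views.py", "line": 17, "type": "xss"},
]

dynamic_findings = [
    {"type": "sql-injection", "url": "/orders", "payload": "' OR 1=1--"},
]

def confirm(static_findings, dynamic_findings):
    """Mark each static finding confirmed if dynamic testing hit its class."""
    dynamic_types = {d["type"] for d in dynamic_findings}
    return [dict(f, confirmed=f["type"] in dynamic_types)
            for f in static_findings]

results = confirm(static_findings, dynamic_findings)
```

The SQL injection is confirmed exploitable with a known point of origin (`orders.py`, line 42), while the unconfirmed XSS finding can be deprioritized, which is the accuracy gain IAST aims for.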
Security Gateways, Brokers, and Firewalls to Deal with the Internet of Things
Enterprises, especially those in asset-intensive industries like manufacturing or utilities, have operational technology (OT) systems provided by equipment manufacturers that are moving from proprietary communications and networks to standards-based, internet protocol-based technologies. More enterprise assets are being automated by OT systems based on commercial software products. The end result is that these embedded software assets need to be managed, secured, and provisioned appropriately for enterprise-class use. OT is considered to be the industrial subset of the “Internet of Things,” which will include billions of interconnected sensors, devices, and systems, many of which will communicate without human involvement and that will need to be protected and secured.