Threat Intelligence Without the Noise

You’ve undoubtedly heard the saying, “You can’t see the forest for the trees.” Its meaning is essentially this: Allowing yourself to become too focused on the small details of a task may lead you to miss the big picture—your goal or intended end state. You may have also heard the famous Mark Twain quote, “The secret of getting ahead is getting started. The secret of getting started is breaking your complex overwhelming tasks into small manageable tasks, and starting on the first one.” Paradoxically, both sayings are true when it comes to managing the wealth of data produced and presented by today’s security tools.

Over time, companies’ deployed security tooling has grown at such a rapid pace that one could teasingly adapt Moore’s Law to read, “security vendor implementations double every 18 months.” OK, so that’s a bit of hyperbole, but the image is surely not lost on security professionals. Every year new categories of vendor tools emerge on the scene, promising to prevent attacks, stop attacks in progress, visualize the attack surface, quantify risk, and so on. In an honest effort to defend their organizations’ networks against growing cyber security risks amid an ever-expanding flood of data, security teams adopt these tools whenever possible to gain every advantage against the adversary.

This piling on of tools, though, often leads security and operations staff to simultaneously:

  1. feel overwhelmed by the amount of data and the number of tools needed to access that data; and
  2. zoom in on a small set of data that seems most problematic.

However, the big picture for all security teams should be prioritizing imminent, targeted threats to the organization, then acting on those threats to reduce organizational risk. What’s the best way to accomplish this? Utopia for most security pros is one central solution that collects, correlates, and normalizes data from all their security tools, then spits out actionable, prioritized recommendations for knocking down threats. A form of Cyber Ninja Warrior, if you will.

When buzzwords become product segments

This was the vision the ThreatQuotient team had in mind when they founded the company in 2013 as a threat intelligence platform. At the time, threat intelligence—actionable, relevant threat intelligence—was the hottest industry buzzword, and many companies were jumping into the fray. The founding team built a solid version 1 of the product, but by 2016 they knew they needed to make revisions to scale and adapt to market needs.

Last week, Chief Marketing Officer Marc Solomon explained the company’s current offering to me and shared where the product shines. “Because the company launched in the threat intelligence heyday,” he said, “we’ve been labeled as a threat intelligence platform (TIP). But our goal is to change market perception of the supported use cases and demonstrate that ThreatQ is a broader platform, where Security Orchestration, Automation, and Response (SOAR) is ultimately a subset of capabilities,” one anchored in data management. The company wants to position itself as a security operations platform that puts data at the forefront.

Their approach to the product, he said, is based on three layers: the data layer, the analytical layer, and the operational layer. Tying the whole thing together is the company’s integrations network, the sought-after ability to gather data from the different sources deployed in a network and aggregate it into something meaningful. Because ThreatQuotient runs its own Open Exchange of more than 200 integration partners, the amount of data to be processed and analyzed is vast. The crux of it all is getting to the right data while reducing the noise that plagues so many security tools today.
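To make that aggregation idea concrete, here is a minimal sketch of how records arriving from different integrations might be normalized into one common shape before anything else happens. The field names, the Observation record, and the normalize_event helper are invented purely for illustration; ThreatQuotient’s actual schemas and APIs will differ.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Observation:
    """A hypothetical normalized record shared by all downstream layers."""
    indicator: str                   # an IP address, domain, file hash, ...
    indicator_type: str              # "ip", "domain", "sha256", ...
    source: str                      # which integration produced it
    first_seen: Optional[str] = None


def normalize_event(source: str, raw: dict) -> Observation:
    """Map a source-specific payload onto the common Observation shape.

    Each integration exposes the same facts under different keys, so the
    mapping is per source; only two illustrative sources are shown here.
    """
    if source == "edr":
        return Observation(raw["file_hash"], "sha256", source, raw.get("detected_at"))
    if source == "siem":
        return Observation(raw["src_ip"], "ip", source, raw.get("@timestamp"))
    raise ValueError(f"no mapping defined for source {source!r}")


# Two very different payloads end up in one comparable form.
events = [
    normalize_event("edr", {"file_hash": "9f86d081884c7d65", "detected_at": "2021-06-01"}),
    normalize_event("siem", {"src_ip": "203.0.113.7", "@timestamp": "2021-06-01T12:00:00Z"}),
]
```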

I’ve heard companies across the vendor spectrum use the phrase “it’s all about getting the right data at the right time,” so I asked Solomon what ThreatQuotient means when they say they collect the “right data” for their customers. He explained that the core of the data management layer is pulling internal telemetry first—a combination of data from integrated tools like the customer’s SIEM, ticketing system, and EDR—and enriching it with external data. This data set is contained in ThreatQ’s Threat Library, which the company describes as a “central repository of relevant and contextualized intelligence,” scored with a criticality rating that each company can customize based on its priorities and business needs.
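As a rough illustration of what “enrich internal telemetry with external data and score it” could look like, here is a hedged sketch continuing the example above. The external_feed lookup, the scoring weights, and the point values are all made up for the example; ThreatQ’s Threat Library scoring model is configurable per customer and certainly more involved than this.

```python
def enrich(indicator: str, external_feed: dict) -> dict:
    """Attach whatever external context the feeds hold for this indicator."""
    return {"indicator": indicator, "context": external_feed.get(indicator, {})}


def criticality(enriched: dict, org_weights: dict) -> int:
    """Score 0-100 using organization-specific weights (a made-up model).

    org_weights is how each company expresses its own priorities, e.g.
    weighting indicators tied to its sector or to active campaigns.
    """
    ctx, score = enriched["context"], 0
    if ctx.get("seen_in_active_campaign"):
        score += org_weights.get("active_campaign", 40)
    if ctx.get("targets_our_sector"):
        score += org_weights.get("sector_match", 30)
    score += min(ctx.get("feed_confidence", 0), 30)  # cap the feed's self-reported confidence
    return min(score, 100)


# The same indicator ranks differently under different business priorities.
feed = {"203.0.113.7": {"seen_in_active_campaign": True, "feed_confidence": 20}}
item = enrich("203.0.113.7", feed)
print(criticality(item, {"active_campaign": 50}))   # 70
print(criticality(item, {"active_campaign": 20}))   # 40
```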

Once the system has the right data, that data passes through the analytical layer, where customers can review scoring and manage alerts, and then to the operational layer, where intelligence analysts and SOC operators can automate workflows.
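The hand-off between those two layers can be pictured as a simple filter-and-route step: intelligence that clears a review threshold triggers an automated workflow, and the rest waits in an analyst’s queue. The threshold value and the block/ticket/queue actions below are placeholders for whatever playbooks a team actually wires up through its integrations, not anything ThreatQ prescribes.

```python
REVIEW_THRESHOLD = 70   # a hypothetical cut-off the SOC agrees on


def block_indicator(item: dict) -> None:
    # Stand-in for pushing a block to a firewall or EDR via an integration.
    print(f"blocking {item['indicator']} (score {item['score']})")


def open_ticket(item: dict) -> None:
    # Stand-in for creating a case in the ticketing system.
    print(f"opening ticket for {item['indicator']}")


def queue_for_review(item: dict) -> None:
    # Stand-in for leaving the item in an analyst's review queue.
    print(f"queued {item['indicator']} for analyst review")


def route(scored_items: list) -> None:
    """Send high-criticality items to automated workflows, queue the rest."""
    for item in scored_items:
        if item["score"] >= REVIEW_THRESHOLD:
            block_indicator(item)
            open_ticket(item)
        else:
            queue_for_review(item)


route([{"indicator": "203.0.113.7", "score": 85},
       {"indicator": "198.51.100.2", "score": 35}])
```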

Cyber simulations put to the test

The most differentiated part of the solution, in my opinion, is ThreatQ Investigations—a cyber security situation room—an online tool that allows analysts, researchers, and incident handlers to collaborate in real time using the intelligence collected. Cross-functional teams can manipulate and visualize various threat scenarios, whether real or simulated, to dig deeper into certain aspects of a potential threat or to investigate what happened during a prior incident. I can envision teams using the situation room for threat hunting and forensics, which seems rather obvious, but also to run drills on how various scenarios would play out were they to come to fruition, much like a more traditional incident response test. Only, in the situation room, teams can apply specific threat data or attack types to the test.

I look forward to hearing more from ThreatQuotient as the company builds on its already robust capabilities. If you’re looking for a flexible, customizable threat intelligence tool, reach out to the team and see what they have to say. And as always, if you do test drive the tool, please let us know what you think of this interesting solution.