The landscape of threat intelligence presents many challenges. Data format should not be one of them. Yet it is.
As many already know, proprietary data formats cause both integration and scalability issues. In today's market of seemingly endless cybersecurity solutions, data interoperability is paramount, and we as an industry are spending too much time, money, and effort reinventing the wheel just to communicate across systems and products.
Because threat intelligence is complex by nature, consuming a threat data format will never be as simple as, say, pairing a Bluetooth device. Even so, it is in the industry's best interest to move rigorously toward a data format that can help vendors, governments, and sharing groups achieve data interoperability.
Enter, STIX. More specifically, STIX 2.1.
STIX began life as rigid XML. STIX 2.1 is the product of a full specification migration to an improved JSON data format, making threat data easier for machines and humans alike to understand and consume.
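To make the JSON format concrete, here is a minimal sketch of a STIX 2.1 Indicator object built by hand in Python. The property names follow the STIX 2.1 specification; the pattern value, name, and timestamps are hypothetical, chosen purely for illustration.

```python
import json
import uuid

# Illustrative STIX 2.1 Indicator. Required properties per the spec
# include type, spec_version, id, created, modified, pattern,
# pattern_type, and valid_from; the values below are made up.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",           # STIX IDs are "type--UUID"
    "created": "2022-07-01T00:00:00.000Z",
    "modified": "2022-07-01T00:00:00.000Z",
    "name": "Example malicious IP watch",
    "pattern": "[ipv4-addr:value = '198.51.100.7']",  # STIX patterning language
    "pattern_type": "stix",
    "valid_from": "2022-07-01T00:00:00.000Z",
}

print(json.dumps(indicator, indent=2))
```

Because the object is plain JSON, any system that can parse JSON can at least read it; the harder part, as discussed below, is agreeing on how the objects relate to one another.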
At CTA, we were early adopters of STIX 2.0, and we understand firsthand that the evolution of STIX has also come at a cost. The spec is large and dynamic, leaving intel orchestration open to interpretation. This creates friction in two areas: first, in communication between subject matter experts and software engineers; second, within sharing groups.
Despite this, STIX appears to be at least loosely adopted. Sharing communities are doing their best to implement it, but to achieve more widespread adoption, I think the industry needs more vendors to make STIX part of their organic workflow. That is, a company's internal interoperability would rely on STIX, eliminating today's excessive data format conversions, which introduce expensive custom coding as well as opportunities for both implementation and operational issues.
With all of this in mind, we find ourselves at a crossroads. We have a viable format to adopt, but it comes with a steeper-than-average learning curve. Then again, for a challenge this difficult, a steeper-than-average learning curve should be expected. In my estimation, closing the gap between learning and implementing requires better visual training and prototyping tools. I recently open-sourced a STIX 2.1 data modeling user interface to encourage just that.
These kinds of tools will certainly help, but change will not come overnight. It is going to take a concerted effort from the entire industry to make standardization work, but if we want to reduce the cost of operation, increase software interoperability, and eventually get in front of the learning curve, I believe it is imperative we start adopting data format standards sooner rather than later.
Author: Jason Minnick