Threat intelligence presents many challenges. Data format should not be one of them. However, it is.
As many already know, proprietary data formats cause both integration and scalability issues. In today’s market of seemingly endless cybersecurity solutions, data interoperability is paramount, and we as an industry are spending too much time, money, and effort reinventing the wheel to communicate across systems and products.
Because threat intelligence is complex by nature, consuming a data format will never be as simple as, say, pairing a Bluetooth device. Even so, it is in the industry’s best interest to move rigorously toward a data format that helps vendors, governments, and sharing groups achieve data interoperability.
Enter, STIX. More specifically, STIX 2.1.
STIX began life as rigid XML. STIX 2.1 is the product of a full specification migration to an improved JSON data format, making threat data easier for machines and humans alike to understand and consume.
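To make the JSON shift concrete, here is a minimal sketch of a STIX 2.1 Indicator object built with only Python's standard library. The property names and the `indicator--` ID prefix follow the STIX 2.1 specification; the indicator name and the domain in the pattern are placeholder values for illustration, not data from any real feed.

```python
import json
import uuid
from datetime import datetime, timezone

# STIX 2.1 timestamps use RFC 3339 format in UTC, precise to the millisecond.
dt = datetime.now(timezone.utc)
now = dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"

# A minimal STIX 2.1 Indicator: type, spec_version, id, created, modified,
# pattern, pattern_type, and valid_from are the required properties.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",   # object type + UUID
    "created": now,
    "modified": now,
    "name": "Example malicious domain",   # placeholder
    "pattern": "[domain-name:value = 'malicious.example.com']",  # placeholder
    "pattern_type": "stix",
    "valid_from": now,
}

print(json.dumps(indicator, indent=2))
```

Compared with the XML of STIX 1.x, an object like this can be produced and parsed by virtually any language's standard JSON tooling, which is much of what makes the 2.x line easier to integrate.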
At CTA, we were early adopters of STIX 2.0 and understand firsthand that the evolution of STIX has also come at a cost. The spec is large and dynamic, leaving intel orchestration open to interpretation. This creates friction in two areas: communication between subject matter experts and software engineers, and coordination within sharing groups.
Despite this, it appears that STIX is becoming, at a minimum, loosely adopted. Sharing communities are doing their best to implement it, but to achieve more widespread adoption, I think the industry needs more vendors to make STIX part of their organic workflow. In other words, a company’s internal interoperability would rely on STIX, eliminating today’s excessive need for data format conversion, which introduces expensive custom coding as well as opportunities for both implementation and operational issues.
With all of this in mind, we find ourselves at a crossroads. We have a viable format to adopt, but it comes with a steeper-than-average learning curve. That said, for a challenge this difficult, a steeper-than-average learning curve should be expected. In my estimation, closing the gap between learning and implementing requires better visual training and prototyping tools. I recently open-sourced a STIX 2.1 data modeling user interface to encourage just that.
These kinds of tools will certainly help, but change will not come overnight. It is going to take a concerted effort from the entire industry to make standardization work, but if we want to reduce the cost of operation, increase software interoperability, and eventually get in front of the learning curve, I believe it is imperative we start adopting data format standards sooner rather than later.
Author: Jason Minnick