Introduction
Cyber-security organizations of every size face challenges, to varying degrees, in engaging their internal customers (e.g., Business Functions, Product Teams, Information Technology: IT) for a security review of a new solution, product, feature, and/or service. Reviews are often performed late in the implementation life-cycle, in a haphazard manner, or sometimes not at all.
To remediate this recurring challenge, the following text will serve as a guide for internal Cyber-security teams to execute security reviews. It does so by establishing the present challenges and providing best practices and industry guidance on which engagement options may work best. However, we must first define what constitutes a security review.
A security review is a formal process in which a Cyber-security professional, or a team of said professionals, reviews the logical/conceptual model of a new solution, product, feature, and/or service against a documented risk model (e.g., NIST RMF, ENISA). The review also takes into consideration industry best practices, vendor guidance, and an organization’s security requirements, reference architectures (RAs), policies, procedures, guidelines, and standards. It is not a manual code walk-through, a vendor review (to see whether a potential/existing vendor’s Cyber posture aligns with expectations), or a security assessment (vulnerability scan/penetration test/Red Team exercise). Also, note that a security review is not a regulatory and/or contractual compliance gap analysis (e.g., PCI DSS, HIPAA, GxP, GDPR, NYDFS, GLBA, NERC CIP). In short: a security review evaluates a design against risk and policy; a code walk-through inspects implementation; a vendor review evaluates a third party’s security posture; a security assessment tests running systems; and a compliance gap analysis measures against regulatory or contractual obligations.
Ultimately, keeping these differences in mind will assist with establishing a separation of duties within the Cyber function. That said, many organizations struggle to establish this separation.
Current Challenges
Many organizations have transitioned to an Agile project management methodology, particularly for software development efforts, which does not include a formal, monolithic design phase. As a result, many organizations find that a security review hampers, or slows down, the velocity of a project. Additionally, many organizations do not have a formal governance process, so tracking and process optimization are off the radar for leadership.
Speaking of efficiencies, or the lack thereof, many Cyber-security Architecture/Assurance teams conduct their reviews in a siloed manner, which bypasses knowledge sharing across the function. These teams seldom request feedback, focusing instead on output/productivity. This creates a snowball effect of eroding trust and deteriorating collaboration within the enterprise. As a result, leadership becomes entrenched and resistant to change due to skill-set gaps, constant personnel changes, and/or a lack of (industry/functional) knowledge of best practices.
This dilemma may also result in a one-size-fits-all mindset for security reviews, whereas a broader view would allow the incorporation of threat modeling for (web/mobile) applications and/or cloud systems. Also, if physical devices or facilities are in scope, then effort must be made to review (and usually assess) those as well, which requires a different skill-set. Finally, there is often friction over the need for an internal review of Cyber's own projects; effort must be made to ensure that Cyber governs its own work when it comes to security reviews.
Anecdotal Best Practices
Popular opinion among Cyber-security Architects is that security reviews are a qualitative work-stream that cannot be commoditized. However, it is generally agreed that industry guidance would help produce a higher-quality deliverable with enhanced consistency. Therefore, the following thought leadership has been created to foster a discussion about how best to move forward.
Engagement Options
First In, First Out (FIFO)
Driven in a reactive, event-driven fashion via Change Advisory Board (CAB) reviews and/or privileged account requests, FIFO engagements are most often found within small, immature Cyber organizations. Tracking mechanisms and metrics are often lacking, if present at all. This option, while aligned with ITIL/ITOps best practices, is also often coupled to an individual resource.
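The intake pattern above can be sketched in a few lines of Python. This is a minimal illustration, not a real ticketing integration; the request fields and function names are assumptions chosen for the example.

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewRequest:
    ticket_id: str   # e.g., a CAB change record or privileged-account request
    received: str    # ISO date the request arrived

# FIFO engagement: requests are worked strictly in arrival order,
# typically by a single architect (the "coupled" individual resource).
intake: deque = deque()

def submit(request: ReviewRequest) -> None:
    intake.append(request)  # newest requests go to the back of the queue

def next_review() -> Optional[ReviewRequest]:
    return intake.popleft() if intake else None  # oldest request first

submit(ReviewRequest("CHG-1001", "2024-01-02"))
submit(ReviewRequest("CHG-1002", "2024-01-03"))
assert next_review().ticket_id == "CHG-1001"  # first in, first out
```

Note that nothing in this model records how long a review took or who performed it, which mirrors the lack of tracking and metrics described above.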
Boardroom
Reviews are conducted upon request on a recurring (e.g., weekly) basis before a large virtual/physical audience in a gated manner. This option follows a Waterfall project management methodology and often requires multiple cycles for approval. While not Agile, this approach is best suited for teams of Cyber-security Architects who serve as generalists or who have varying experience.
ODA Model
Named after U.S. Army Special Forces detachments (Operational Detachment-Alpha: ODA, or A-Teams), this model uses more of a peer-based, collaborative approach. Engagement is proactive, with paired Cyber-security Architects who have a specialized skill-set (e.g., DevSecOps/GitOps, Cloud, Data/MLOps/AIOps, NetSec/Telecom, IoT, Crypto, IAM/IdM, CyberOps/ITOps) acting in a collaborative manner. After-action reviews (AARs), and the subsequent lessons learned, allow the Cyber-security Architects to mentor and elevate the skill-set of embedded subject matter experts (SMEs) within a specific product team, function, and/or line of business (LOB), thus serving as force multipliers.
Outsourced
Outsourced security reviews are not as common as third-party vendor reviews; however, they are an option. Organizations that provide such services include managed security service providers (MSSPs), independent consultants, and virtual Chief Information Security Officer (vCISO) providers. If utilized, the outsourced service provider should establish a standardized, monitored process that is as platform/vendor-independent as possible. Performance and pay metrics should also be agreed upon before the commencement of services.
Hybrid
Larger enterprises seeking to mature their Cyber engagement model may utilize a hybrid model where a Senior Cyber-security Architect would leverage the FIFO approach augmented by Outsourced resources. The intent here is to ensure that reviews are conducted while additional staff come on-board and are trained accordingly. While expensive at first, this model will allow for an adequate pace towards maturity.
It is found that mature enterprises work best with the ODA Model, though that level of maturity is fairly rare. Smaller organizations, meanwhile, should embrace the outsourced model in a cost-effective manner.
Maturity Models
Level 1: Ad-Hoc
Reviews are performed in a reactive manner, verbally or via whiteboard sessions, and, if tracked at all, are tracked via email, JIRA tickets, or ServiceNow (SNOW) tickets.
Level 2: Defined
Reviews follow a documented, governed process using standardized outputs (e.g., templates, forms, diagrams via a consistent format [e.g., C4]). Reviews are tracked, mostly for audit purposes, via JIRA/SNOW tickets. Metrics (e.g., time to complete, function/LOB quantity, risk levels, status: open/closed/orphaned) are collected via spreadsheets if done at all.
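The metrics named above (time to complete, function/LOB quantity, risk levels, status) can be derived from a ticket export with a short script. This is a minimal sketch assuming a hypothetical export format; the field names are illustrative, not a real JIRA or ServiceNow schema.

```python
from collections import Counter
from datetime import date

# Hypothetical export of security-review tickets (fields are assumptions).
tickets = [
    {"id": "SEC-1", "lob": "Finance", "risk": "High", "status": "closed",
     "opened": date(2024, 1, 2), "closed": date(2024, 1, 9)},
    {"id": "SEC-2", "lob": "Retail", "risk": "Low", "status": "open",
     "opened": date(2024, 1, 5), "closed": None},
]

def review_metrics(rows):
    """Compute the Level 2 metrics from a list of ticket records."""
    closed = [r for r in rows if r["status"] == "closed"]
    return {
        # Average days from open to close, over closed reviews only.
        "avg_days_to_complete": (
            sum((r["closed"] - r["opened"]).days for r in closed) / len(closed)
            if closed else None),
        "reviews_per_lob": dict(Counter(r["lob"] for r in rows)),
        "risk_levels": dict(Counter(r["risk"] for r in rows)),
        "status": dict(Counter(r["status"] for r in rows)),
    }

m = review_metrics(tickets)
```

Even this small step beyond spreadsheets makes the metrics repeatable, which is the main difference between Level 2 and Level 3.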
Level 3: Measured
Reviews follow a documented, governed process using standardized outputs (e.g., templates, forms, diagrams via a consistent format [e.g., C4]) and a risk-based approach, with high-risk reviews requiring further detail (i.e., threat models via a consistent tool [e.g., the Microsoft Threat Modeling Tool]). Governance is consistent, though workflows are antiquated (e.g., risk/issue tracking, remediation-plan follow-ups).
Level 4: Optimized
Reviews follow a documented, governed process using standardized outputs (e.g., templates, forms, diagrams via a consistent format [e.g., C4]) and a risk-based approach, with high-risk reviews requiring additional detail (i.e., threat models via a consistent tool [e.g., the Microsoft Threat Modeling Tool]). Governance and process efficiencies have been realized and are revisited on a recurring basis to ensure alignment.
Reaching Level 4, which most organizations have not done, requires both flexibility and governance. While the vast majority of organizations suffer from resource constraints, the goal is to establish processes and ensure that they are executed with the proper level of governance and standardization.