Thursday, June 4, 2020

A Practical Guide to Cyber-security Engagement

Introduction

Cyber-security organizations of every size have challenges, to varying degrees, with engaging their internal customers (e.g., Business Functions, Product Teams, Information Technology [IT]) for a security review of a new solution, product, feature, and/or service. Often reviews are performed late in the implementation life-cycle, in a haphazard manner, or sometimes not at all.
To remediate this recurring challenge, the following text will serve as a guide for internal Cyber-security teams to execute security reviews. It does so by establishing the present challenges and providing best practices and industry guidance on which engagement options may work best. However, we must first define what constitutes a security review.
A security review is a formal process in which a Cyber-security professional, or a team of such professionals, reviews the logical/conceptual model of a new solution, product, feature, and/or service against a documented risk model (e.g., NIST RMF, ENISA). The review also takes into consideration industry best practices, vendor guidance, and an organization’s security requirements, reference architectures (RAs), policies, procedures, guidelines, and standards. It is not a manual code walk-through, a vendor review (to see whether a potential/existing vendor’s Cyber posture aligns with expectations), or a security assessment (vulnerability scan/penetration test/Red Team exercise). Also note that a security review is not a regulatory and/or contractual compliance gap analysis (e.g., PCI DSS, HIPAA, GxP, GDPR, NYDFS, GLBA, NERC CIP). The following illustrates the differences between these workstreams:

Ultimately, keeping these differences in mind will assist with establishing a separation of duties within the Cyber function. With that said, many organizations have their challenges establishing this separation.

Current Challenges

Many organizations have transitioned to an Agile project management methodology, particularly for software development efforts, which does not include a formal, monolithic design phase. As a result, many teams see a security review as hampering, or slowing down, the velocity of the project. Additionally, many organizations do not have a formal governance process, so tracking and process optimization are off the radar for leadership.
Speaking of efficiencies, or the lack thereof, many Cyber-security Architecture/Assurance teams conduct their reviews in a siloed manner, which bypasses knowledge sharing across the function. These teams seldom request feedback, focusing instead on output/productivity. This creates a snowball effect of eroding trust and deteriorating collaboration within the enterprise. As a result, leadership becomes stuck and resistant to change due to skill-set gaps, constant personnel changes, and/or a lack of (industry/functional) knowledge regarding best practices.
This dilemma may also result in a one-size-fits-all mindset for security reviews, whereas a broader view would allow the incorporation of threat modeling for (web/mobile) applications and/or cloud systems. Also, if physical devices or facilities are in scope, then effort must be made to review (and usually assess) those as well, which requires another, different skill-set. Finally, there is often friction around the need for an internal review of Cyber's own projects; effort must be made to ensure that Cyber governs its own work when it comes to security reviews.

Anecdotal Best Practices

Popular opinion among Cyber-security Architects is that security reviews are a qualitative work-stream that cannot be commoditized. However, it is generally agreed that industry guidance would help produce a higher-quality, more consistent deliverable. Therefore, the following thought leadership has been created to foster a discussion about how best to move forward.

Engagement Options

First In, First Out (FIFO)

Driven reactively by events such as Change Advisory Board (CAB) reviews and/or privileged account requests, FIFO engagements are most often found within small, immature Cyber organizations. Tracking mechanisms and metrics are typically lacking, if present at all. This option, while aligned with ITIL/ITOps best practices, is also often coupled to a single individual.

Boardroom

Reviews are conducted upon request on a recurring (e.g., weekly) basis before a large virtual/physical audience in a gated manner. This option follows a Waterfall project management methodology and often requires multiple cycles for approval. While not Agile, this approach is best suited for teams of Cyber-security Architects who serve as generalists or who have varying experience.

ODA Model

Named after U.S. Army Special Forces detachments (Operational Detachment-Alpha: ODA, or A-Teams), this model uses a peer-based, collaborative approach. Engagement is proactive, with paired Cyber-security Architects who have specialized skill-sets (e.g., DevSecOps/GitOps, Cloud, Data/MLOps/AIOps, NetSec/Telecom, IoT, Crypto, IAM/IdM, CyberOps/ITOps) working collaboratively. After-action reviews (AARs), and the subsequent lessons learned, allow the Cyber-security Architects to mentor and elevate the skill-sets of embedded subject matter experts (SMEs) within a specific product team, function, and/or line of business (LOB), thus serving as force multipliers.

Outsourced

Outsourced security reviews are not as common as third-party-provided vendor reviews; however, they are an option. Organizations that provide such services include managed security service providers (MSSPs), independent consultants, and virtual Chief Information Security Officer (vCISO) providers. If utilized, the outsourced service provider should establish a standardized, monitored process that is as platform/vendor-independent as possible. Performance and payment metrics should also be agreed upon before the commencement of services.

Hybrid

Larger enterprises seeking to mature their Cyber engagement model may utilize a hybrid model, where a Senior Cyber-security Architect leverages the FIFO approach augmented by outsourced resources. The intent is to ensure that reviews are conducted while additional staff come on board and are trained accordingly. While expensive at first, this model allows for an adequate pace towards maturity.

Mature enterprises tend to work best with the ODA Model, though it is fairly rare to find in practice. Smaller organizations, meanwhile, should embrace the outsourced model as a cost-effective approach.

Maturity Models

Level 1: Ad-Hoc

Reviews are performed reactively, verbally or via whiteboard sessions, and are tracked, if at all, via email, JIRA tickets, or ServiceNow (SNOW) tickets.

Level 2: Defined

Reviews follow a documented, governed process using standardized outputs (e.g., templates, forms, diagrams via a consistent format [e.g., C4]). Reviews are tracked, mostly for audit purposes, via JIRA/SNOW tickets. Metrics (e.g., time to complete, quantity per function/LOB, risk levels, status: open/closed/orphaned) are collected via spreadsheets, if at all.
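
To make the Level 2 metrics concrete, the following is a minimal sketch (in Python, standard library only) of computing time-to-complete and counts by status, LOB, and risk from a ticket export. The file name and column names (opened, closed, lob, risk, status) are assumptions for illustration, not an actual JIRA/SNOW export schema.

    # Minimal sketch: compute security-review metrics from a hypothetical CSV ticket export.
    # Column names (opened, closed, lob, risk, status) are assumptions, not a JIRA/SNOW schema.
    import csv
    from collections import Counter
    from datetime import datetime
    from statistics import mean

    def review_metrics(path):
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        fmt = "%Y-%m-%d"
        days_to_complete = [
            (datetime.strptime(r["closed"], fmt) - datetime.strptime(r["opened"], fmt)).days
            for r in rows
            if r["status"] == "closed"
        ]
        return {
            "total_reviews": len(rows),
            "avg_days_to_complete": mean(days_to_complete) if days_to_complete else None,
            "by_status": Counter(r["status"] for r in rows),
            "by_lob": Counter(r["lob"] for r in rows),
            "by_risk": Counter(r["risk"] for r in rows),
        }

    if __name__ == "__main__":
        print(review_metrics("security_reviews.csv"))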

Level 3: Measured

Reviews follow a documented, governed process using standardized outputs (e.g., templates, forms, diagrams via a consistent format [e.g., C4]) via a risk-based approach, with high-risk reviews requiring further detail (i.e., threat models via a consistent tool [e.g., Microsoft]). Governance is consistent, though workflows are antiquated (e.g., risk/issue tracking, remediation plan follow-ups).

Level 4: Optimized

Reviews follow a documented, governed process using standardized outputs (e.g., templates, forms, diagrams via a consistent format [e.g., C4]) and a risk-based approach, with high-risk reviews requiring additional detail (i.e., threat models via a consistent tool [e.g., Microsoft]). Governance and process efficiencies have been realized and are revisited on a recurring basis to ensure alignment.

Reaching Level 4, which most organizations have not, requires both flexibility and governance. While the vast majority of organizations suffer from resource constraints, the goal is to establish processes and to ensure that they are executed with the proper level of governance and standardization.

Monday, October 8, 2018

Native Versus Generic Security Baselines for Cloud

For a while now, certain providers (SecurityScorecard, BitSight) have offered security benchmarking of a client's ecosystem / vendors.

While that is useful, these algorithms have been generic in nature rather than taking cloud security nuances (e.g., AWS S3 utilization) into consideration.

To fill that gap, cloud service providers (CSPs) have now added their own benchmarks (e.g., AWS Trusted Advisor, Azure Secure Score) that baseline a specific account rather than the entire cloud ecosystem.
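
As a small, hedged illustration of the AWS side, Trusted Advisor results can be pulled programmatically via the AWS Support API (this requires a Business or Enterprise support plan, and the Support endpoint is served from us-east-1). The sketch below simply lists security-category checks and their current status; it is illustrative, not a full account baseline.

    # Minimal sketch: list Trusted Advisor security-category check statuses via boto3.
    # Requires a Business/Enterprise support plan; the Support API lives in us-east-1.
    import boto3

    support = boto3.client("support", region_name="us-east-1")

    for check in support.describe_trusted_advisor_checks(language="en")["checks"]:
        if check["category"] != "security":
            continue
        result = support.describe_trusted_advisor_check_result(
            checkId=check["id"], language="en"
        )["result"]
        flagged = len(result.get("flaggedResources", []))
        print(f'{check["name"]}: {result["status"]} ({flagged} flagged resources)')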

One would think that partnerships, perhaps in conjunction with the Cloud Security Alliance's (CSA) Security, Trust & Assurance Registry (STAR) program, would give cloud consumers a holistic view of their security maturity.

Wednesday, September 19, 2018

Using AWS X-Ray to Assist in Code Walk-throughs

Fancy a manual code walk-through?  Well, some assistance never hurt...

I leveraged AWS X-Ray to simplify understanding the sources and sinks.  Did it work? Yes.  Is it useful for anything other than microservices (e.g., ERP / EHR / EMR, trading, AI)? Not really.
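
For anyone curious, the instrumentation needed to get that view is light. Below is a minimal sketch of what wiring a Python (Flask) microservice to X-Ray might look like; the service name, route, and table name are placeholders rather than details from the actual walk-through. The middleware traces inbound requests (sources), while patch_all() surfaces downstream boto3/requests calls (sinks) on the service map.

    # Minimal sketch: instrument a Flask microservice so X-Ray maps sources and sinks.
    # The service name, route, and table name are placeholders for illustration only.
    import boto3
    from flask import Flask, jsonify
    from aws_xray_sdk.core import xray_recorder, patch_all
    from aws_xray_sdk.ext.flask.middleware import XRayMiddleware

    app = Flask(__name__)
    xray_recorder.configure(service="orders-service")
    XRayMiddleware(app, xray_recorder)  # traces inbound HTTP requests (sources)
    patch_all()                         # traces outbound boto3/requests calls (sinks)

    dynamodb = boto3.resource("dynamodb")

    @app.route("/orders/<order_id>")
    def get_order(order_id):
        # This DynamoDB call appears as a downstream node in the X-Ray service map.
        item = dynamodb.Table("orders").get_item(Key={"id": order_id}).get("Item")
        return jsonify(item or {})

    if __name__ == "__main__":
        app.run()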

Friday, September 14, 2018

Smart Contract Security

As more orgs look at embracing blockchain, there will be a need to assess the security of Smart Contracts, particularly for Ethereum-based blockchains.

Look for vendors to develop solutions, and for custom professional services firms to cater to this niche.

Sunday, August 26, 2018

Transitioning Technical Professional Service & Payment for Outcomes

With the advent of bug bounties, the shift of healthcare charges to outcome-based models, and the history of legal services tied to outcomes, technical professional services should likewise transition to being aligned to outcomes.

Now, one may argue that firm-fixed-price (FFP) "packaged" projects are already tied to outcomes, though those are few and far between.

So, eventually the industry should tie compensation to outcomes, and we may then see greater efficiency from the larger consulting firms.

Saturday, August 25, 2018

Cloud Architecture: Build (IaaS) versus Buy (PaaS)

Cloud providers are introducing many new services to their portfolios.  So, organizations now have a decision to make regarding build vs buy.

Here are pros & cons for each:

PaaS / Buy: (pros) faster time to market, CapEx reductions; (cons) usually multi-tenant, requires skill-set updates, vendor lock-in, OpEx increases

IaaS / Build: (pros) familiar ITSM / ITIL model, CapEx focus, single tenant, existing skill-set, increased portability; (cons) slower agility

Monday, June 25, 2018

Cloud Visibility

With more organizations going to the cloud, with shadow IT, and with GDPR requirements, cloud visibility seems to be the latest fad...

Microsoft & Amazon picked up on this several years ago, hence Azure Information Protection (AIP) and AWS Macie; but that does not cover them together, nor Google / Salesforce / Rackspace.
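
To illustrate that single-provider scope, a minimal sketch such as the one below (using the current macie2 API, which post-dates the original Macie discussed here) lists recent Macie findings per S3 bucket; an equivalent single pane across Azure, Google, or Salesforce does not come out of the box.

    # Minimal sketch: list recent Amazon Macie findings per S3 bucket (macie2 API).
    # Illustrative only; Macie's visibility stops at the AWS/S3 boundary.
    import boto3

    macie = boto3.client("macie2")

    finding_ids = macie.list_findings(maxResults=50)["findingIds"]
    if finding_ids:
        for finding in macie.get_findings(findingIds=finding_ids)["findings"]:
            bucket = finding.get("resourcesAffected", {}).get("s3Bucket", {}).get("name", "n/a")
            severity = finding.get("severity", {}).get("description", "n/a")
            print(f'{bucket}: {finding["type"]} (severity: {severity})')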

So, expect this area to gain traction for several more years...