App Supply Chain Risk: how much is too much?
Last week, in Does software governance need to catch-up to pharmaceuticals?, we touched on the unavoidable tension between an enterprise’s objectives and those of its software suppliers — a tension rooted in conflicting business objectives. That leads to the next question:
When do App Supplier “tensions” translate into material risk?
The obvious (but not necessarily straightforward) response is to analyze the potential risk using your favorite risk analysis methodology/process.
All Apps are NOT created equal – and your ISV suppliers can’t help.
In fact, the very same application will almost certainly yield entirely different risk ratings for each of its clients, because “Threat event frequency” and all of the “Loss magnitude” variables are, by definition, entity-specific (not shared across enterprises).
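To make that concrete, here is a minimal sketch in the spirit of the FAIR decomposition (risk as loss event frequency times loss magnitude). The numbers and the two example enterprises are entirely hypothetical; the point is only that identical software produces very different risk figures once entity-specific inputs are plugged in.

```python
# Minimal FAIR-style sketch with hypothetical numbers: annualized risk for
# the SAME application licensed by two different enterprises. Threat Event
# Frequency (TEF) and loss magnitude are entity-specific, so the identical
# product yields entirely different risk ratings.

def annualized_risk(tef_per_year, p_vulnerable, loss_magnitude):
    """Loss Event Frequency x Loss Magnitude, per the FAIR decomposition."""
    loss_event_frequency = tef_per_year * p_vulnerable
    return loss_event_frequency * loss_magnitude

# Same app, same vulnerability, two clients with different threat profiles
# and loss exposure (illustrative values only).
bank = annualized_risk(tef_per_year=40, p_vulnerable=0.25, loss_magnitude=2_000_000)
retailer = annualized_risk(tef_per_year=5, p_vulnerable=0.25, loss_magnitude=50_000)

print(f"Bank:     ${bank:,.0f}/yr")      # 40 * 0.25 * 2,000,000 = $20,000,000/yr
print(f"Retailer: ${retailer:,.0f}/yr")  # 5 * 0.25 * 50,000 = $62,500/yr
```

The spread — $20M versus $62.5K annualized for the same product — is the reason an ISV cannot hand its customers a single, shared risk rating.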
Vulnerability exposure: Stepping through an entire assessment process is well beyond the scope of this post, but I want to focus on another important manifestation of the aforementioned enterprise/supplier disconnect: vulnerability exposure.
The following table highlights the entirely disparate vulnerability exposures that are most closely associated WITH THE SAME THREAT EVENTS when viewed from an enterprise’s perspective versus the ISV supplier.
Threat event: A threat actor acts in a manner that has the potential to cause harm.
Vulnerability (event): a control condition changes or threat capability changes.
(Freund, Jack. Measuring and Managing Information Risk. Elsevier Science.)
Who cares if you manage these risks or not?
Well, either directly or indirectly, virtually every modern information privacy regulation and statute published (or updated) in the past five years cares. Here are three examples — a tiny sliver of the long list that could be cited.
The Commonwealth of Massachusetts: 201 CMR 17.00: STANDARDS FOR THE PROTECTION OF PERSONAL INFORMATION OF RESIDENTS OF THE COMMONWEALTH
17.03 (2) f Oversee service providers, by:
1. Taking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with 201 CMR 17.00 and any applicable federal regulations; and
2. Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information;
Note: 201 CMR 17.03 (2)(f)(2) suggests that these obligations must be reflected in license agreements, which stands in stark contrast to the limited-liability clauses found in virtually every software license agreement in the marketplace today.
New York State: 23 NYCRR 500: CYBERSECURITY REQUIREMENTS FOR FINANCIAL SERVICES COMPANIES
Section 500.11 (a) Third Party Service Provider Policy. Each Covered Entity shall implement written policies and procedures designed to ensure the security of Information Systems and Nonpublic Information that are accessible to, or held by, Third Party Service Providers. Such policies and procedures shall be based on the Risk Assessment of the Covered Entity and shall address to the extent applicable:
(1) the identification and risk assessment of Third Party Service Providers;
(2) minimum cybersecurity practices required to be met by such Third Party Service Providers in order for them to do business with the Covered Entity;
(3) due diligence processes used to evaluate the adequacy of cybersecurity practices of such Third Party Service Providers;
Note: 500.11 (a)(2) imposes a risk-assessment-based standard on the suppliers a Covered Entity may do business with.
The European Union: The General Data Protection Regulation (EU) 2016/679 (GDPR) Recital 78 Appropriate technical and organisational measures
Recital 78: The protection of the rights and freedoms of natural persons with regard to the processing of personal data require that appropriate technical and organisational measures be taken to ensure that the requirements of this Regulation are met. … When developing, designing, selecting and using applications, services and products … producer should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations. The principles of data protection by design and by default should also be taken into consideration in the context of public tenders.
Note: The GDPR speaks specifically to HOW applications are designed and developed, stresses that these standards must be applied to vendor selection (“public tenders”), and applies a higher standard than “reasonable” — namely, “state of the art.”
What do I do with this?
Even though these tensions are effectively universal, the Loss Magnitude (and by extension, materiality) associated with these threat events is, blessedly, unique to each organization and highly variable.
Is there a way to segment and identify the licensed software that may pose the most significant risk?
Can established risk assessment frameworks be applied to streamline this process?
Of course, they can.
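As one illustration of how that segmentation might look in practice, here is a hypothetical triage sketch. The scoring factors, weights, and tier thresholds are my own illustrative assumptions — not a published standard — but the pattern (score each licensed app on entity-specific factors, then tier the portfolio so assessment effort lands on the riskiest suppliers first) mirrors how established frameworks streamline the process.

```python
# Hypothetical sketch: segment a licensed-app portfolio into risk tiers so
# that full assessments are focused on the highest-risk suppliers first.
# Factor names, scales (1-3), and thresholds are illustrative assumptions.

def risk_score(app):
    """Simple ordinal score: data sensitivity x exposure x privilege."""
    return (app["data_sensitivity"]
            * app["internet_facing_exposure"]
            * app["privilege_level"])

def tier(score):
    """Map a score onto an assessment tier (thresholds are illustrative)."""
    if score >= 18:
        return "Tier 1 - full assessment"
    if score >= 8:
        return "Tier 2 - questionnaire"
    return "Tier 3 - monitor"

portfolio = [
    {"name": "HR platform",   "data_sensitivity": 3, "internet_facing_exposure": 2, "privilege_level": 3},
    {"name": "CAD viewer",    "data_sensitivity": 1, "internet_facing_exposure": 1, "privilege_level": 2},
    {"name": "Payments SaaS", "data_sensitivity": 3, "internet_facing_exposure": 3, "privilege_level": 3},
]

# Rank the portfolio, riskiest first.
for app in sorted(portfolio, key=risk_score, reverse=True):
    print(f"{app['name']:<14} score={risk_score(app):>2}  {tier(risk_score(app))}")
```

Even a coarse sketch like this separates the handful of applications that warrant a full, FAIR-style assessment from the long tail that merely needs monitoring.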