Third party risks and Halloween costumes have something in common.
You’ve got your traditional risks – think cybersecurity, resilience or regulatory matters – that come knocking year after year, just like ghosts, goblins, and witches. There are also newly important third party risks whose magnitude has increased in the past 12 months. Think of environmental, social and governance (ESG) risks, artificial intelligence (AI) risks, and post-quantum cryptography (PQC) – the equivalent of someone dressing up this year as the Ever Given, avocado toast, or Grogu from The Mandalorian.
1. Evolving ESG Reporting Requirements
Three of the five current members of the U.S. Securities and Exchange Commission (SEC) have signaled that they plan to adopt climate-related disclosure requirements. While those rules likely would not come online until sometime next year, a steadily growing number of companies voluntarily adhere to ESG disclosure frameworks, such as those developed by Dow Jones, the Sustainability Accounting Standards Board (SASB), the Global Reporting Initiative (GRI) and others.

“If you’re a third-party risk manager, you should be monitoring ESG developments, including changing reporting requirements. Scope 3 emissions should be of particular interest to those firms with a climate change focus,” Roboff notes.

Think of Scope 3 emissions as third party carbon footprints that result from “activities and from assets not owned or controlled by the reporting organization, but that the organization indirectly impacts in its value chain,” according to the U.S. Environmental Protection Agency. For example, a manufacturer that assembles complex products from internationally sourced components must account for the carbon impact of all the components used in that assembly, even when it does not build those components in-house.

As is evident in the U.S.’ on-again, off-again engagement with the Paris Climate Accords, the politically contentious nature of climate change and ESG regulation means that U.S. ESG rules could swing back and forth based on election outcomes. As such, third party risk managers may well face challenges keeping up with evolving ESG regulatory requirements. Reporting climate-related data also poses challenges because much of this data has not previously been collected and scrutinized by organizations. Greenwashing is an emerging risk. “From a third party risk management perspective, there are significant questions about the accuracy of much ESG data,” Roboff notes.
“Some of these questions will be very difficult for individual organizations to address on their own. Although the demand for trusted third party ESG due diligence is intense, those capabilities are currently in short supply.”
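The EPA definition above can be made concrete with a toy calculation. The component names, suppliers, and emission factors below are invented for illustration; real Scope 3 accounting under the GHG Protocol spans fifteen categories and relies on supplier-reported or estimated factors, not hard-coded numbers. A minimal sketch of the “purchased goods” piece:

```python
from dataclasses import dataclass

@dataclass
class Component:
    """A hypothetical sourced component with supplier-reported emissions."""
    name: str
    supplier: str
    kg_co2e_per_unit: float  # cradle-to-gate emissions reported by the supplier
    units: int

def scope3_purchased_goods(components: list[Component]) -> float:
    """Sum supplier-reported emissions across a bill of materials.

    This covers only the 'purchased goods and services' category of
    Scope 3; a full inventory would add transport, use-phase, end-of-life
    and other categories.
    """
    return sum(c.kg_co2e_per_unit * c.units for c in components)

# Illustrative bill of materials for one production run (invented figures).
bom = [
    Component("display panel", "Supplier A", 12.5, 1_000),
    Component("chassis", "Supplier B", 3.2, 1_000),
    Component("battery", "Supplier C", 18.0, 1_000),
]
total = scope3_purchased_goods(bom)
print(f"Scope 3 (purchased goods): {total:,.0f} kg CO2e")
```

The point of even a simple roll-up like this is that every input comes from a third party, which is where the data-accuracy questions Roboff raises come in: the calculation is only as trustworthy as the supplier-reported factors feeding it.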
2. Edge Computing
Edge computing describes the storage and processing of data at or near the source from which it is harvested (in contrast to transmitting the same data back to centralized servers and systems for analysis). The approach – which is being deployed much more frequently in tandem with 5G and IoT adoption – delivers quicker insights and response times while consuming less network bandwidth. But edge computing also poses cybersecurity risks. Centralized systems tend to have stout, continuously monitored information security protections. Computing conducted atop buildings, in vehicles and in other field locations is more vulnerable to cybersecurity breaches. “Digital transformation initiatives have accelerated, introducing more products and data which are being used in places that they’ve never been used before,” Miller points out. “That raises troubling questions about the extent to which the devices and software being used are developed and managed in a secure fashion.”
3. Diverging Regulatory Guidance
Shared Assessments monitors, and helps influence, global regulatory developments, including the high volume of rulemaking in the financial services industry. “We’ve been seeing more divergence among international financial services regulators’ guidance, and it’s concerning,” Roboff reports. One recent example: U.S., U.K., and E.U. financial services regulators have long shared the view that an outsourcer is responsible for managing risks throughout the entire third party supply chain – meaning its third, fourth, fifth and Nth parties. That risk management has several components, including contractual obligations requiring third parties to uphold the outsourcer’s hygiene expectations when they outsource in turn, as well as some degree of Nth party due diligence as circumstances reasonably permit. That changed in March of this year, when the U.K.’s Prudential Regulation Authority (PRA) issued guidance indicating that it does not expect outsourcers to monitor any vendors beyond their third parties (that is, fourth, fifth, and Nth parties). The lack of uniform guidance means that global companies and their third party risk management (TPRM) teams may increasingly want to default to the highest regulatory expectations.
4. Post-Quantum Cryptography (PQC)
“The new cyber warfare is attacking an increasingly large cross section of our society,” Roboff asserts. “That’s one of many important reasons why post-quantum cryptography is so important and why more organizations should be paying attention to it.” Quantum computing continues to advance and will one day make the strongest public-key cryptography at the heart of current cyber defenses vulnerable to breaches. That explains why the National Institute of Standards and Technology (NIST) launched a PQC initiative five years ago. The effort centers on developing new cryptographic systems that are secure against both quantum and classical computers while preserving the ability to interoperate with existing communications protocols and networks. NIST has completed three rounds of evaluations for a new PQC standard. More recently, the NIST National Cybersecurity Center of Excellence (NCCoE) initiated the development of practices to ease the migration from the current set of public-key cryptographic algorithms to replacement algorithms that are resistant to quantum computer-based attacks.
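A recurring theme in migration planning of this kind is “crypto-agility”: structuring systems so an algorithm can be swapped out without rewriting application code. The sketch below illustrates that idea only; the registry, function names, and the HMAC stand-in are inventions of this example, not NIST artifacts, and HMAC is a symmetric construction used here purely to keep the sketch dependency-free. A quantum-resistant signature scheme would, once libraries expose one, register under a new name and callers would switch by configuration rather than by code change.

```python
import hashlib
import hmac
from typing import Callable, NamedTuple

class Scheme(NamedTuple):
    """A pluggable signing scheme: sign(key, msg) and verify(key, msg, sig)."""
    sign: Callable[[bytes, bytes], bytes]
    verify: Callable[[bytes, bytes, bytes], bool]

# Registry of schemes. Crypto-agility means application code names an
# algorithm rather than hard-coding one, so a replacement (e.g. a
# post-quantum scheme) can be registered without touching callers.
SCHEMES: dict[str, Scheme] = {}

def register(name: str, scheme: Scheme) -> None:
    SCHEMES[name] = scheme

# Stand-in scheme: HMAC-SHA256. Symmetric, not a public-key signature;
# it exists here only so the sketch runs with the standard library.
register("hmac-sha256", Scheme(
    sign=lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    verify=lambda key, msg, sig: hmac.compare_digest(
        hmac.new(key, msg, hashlib.sha256).digest(), sig),
))

def sign(alg: str, key: bytes, msg: bytes) -> bytes:
    return SCHEMES[alg].sign(key, msg)

def verify(alg: str, key: bytes, msg: bytes, sig: bytes) -> bool:
    return SCHEMES[alg].verify(key, msg, sig)

tag = sign("hmac-sha256", b"secret", b"hello")
print(verify("hmac-sha256", b"secret", b"hello", tag))
```

For third party risk managers, the practical question this pattern raises is whether vendors’ products are built this way: software that hard-codes today’s public-key algorithms will be far harder to migrate when quantum-resistant replacements arrive.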
5. Advanced Automation
In a recent Bloomberg Businessweek Q&A, Google’s former head of ethical AI research took U.S. regulators to task for not taking an active role in overseeing AI products and usage. “Products have to be regulated,” Timnit Gebru asserted. “Government agencies’ jobs should be expanded to investigate and audit these companies, and there should be standards that have to be followed if you’re going to use AI in high-stakes scenarios.” Regardless of whether and when AI regulations appear, AI, machine learning, and other forms of advanced automation require human oversight to minimize bias (intentional or unintentional). Miller notes that this oversight requires talent that is becoming ever more difficult to find, recruit and keep amid the talent shortages specific to cyber, data science, and technology. “We always hear how important it is to keep a human in the loop,” he notes, “but the problem is that humans don’t always scale as quickly as the technology does, and that poses risks.”
Third party risks, like Halloween costumes, come in many varieties. At least that was the case until this change-riddled year when supply chain snafus – something third party risk managers know a thing or two about — relegated many of the most popular get-ups to “back order” or “sold out” status.