As technology advances and cybersecurity grows in importance, compliance requirements continue to increase. Legacy approaches to compliance — asking system admins for point-in-time screenshots multiple times a year — have become inefficient and outdated. In order to evolve the security compliance profession, we need a new way of performing compliance activities through continuous automation. Watch Omer Singer, Head of Cybersecurity Strategy at Snowflake, and Richard Marcus, Vice President, Information Security at AuditBoard, discuss the way forward for compliance teams in today’s ever-evolving cyber risk landscape, including:
- The building blocks for a data-driven compliance program — and some common pitfalls to avoid.
- Practical examples of how the AuditBoard and Snowflake teams are approaching automation and continuous monitoring for evidence collection and vendor reviews.
- Keys to continuous compliance success, and next steps to unlock even more value for your organization.
Watch the full conversation, and read the can’t-miss highlights below.
Building a Data-Driven Compliance Program: Key Fundamentals
Richard Marcus: “We know the future’s going to have data and technology. Let’s get practical: For me, the foundation of an automated and data-driven compliance program is using technology to start to bring together your controls inventory — one place where you can manage all of your requirements and can trace the relationships and justification for those requirements. A huge pain point for a lot of people is having different sets of regulatory frameworks that you’re bound by. Being able to deduplicate those requirements and align them with your control library means that you’re only implementing things once. You can apply the ‘audit once, comply many’ concept. It’s also the single source of truth for your asset owners, whether they’re system or application owners, control owners, or risk owners, so they start to understand why you might be asking for certain things. They understand the scope of those requirements, what assets are in scope of that requirement, and then ultimately, who’s accountable. Having one single, common source of truth, where everything is laid out and organized, makes everything that you build on top of it more efficient over time.”
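To make that concrete, here is a minimal sketch of what such a controls inventory could look like as a data model. The table and column names (controls, framework_requirements, control_requirement_map) are hypothetical, not AuditBoard’s actual schema; the point is the many-to-many mapping that makes “audit once, comply many” possible.

```sql
-- Hypothetical schema for a controls inventory (illustrative only).
CREATE TABLE controls (
    control_id      VARCHAR PRIMARY KEY,
    description     VARCHAR,
    control_owner   VARCHAR            -- who is ultimately accountable
);

CREATE TABLE framework_requirements (
    requirement_id  VARCHAR PRIMARY KEY,
    framework       VARCHAR,           -- e.g. 'PCI DSS', 'ISO 27001', 'SOC 2'
    requirement_ref VARCHAR            -- the clause or requirement number
);

-- One control can satisfy requirements in many frameworks ("audit once, comply many").
CREATE TABLE control_requirement_map (
    control_id      VARCHAR REFERENCES controls(control_id),
    requirement_id  VARCHAR REFERENCES framework_requirements(requirement_id)
);

-- Which frameworks does a single control cover, and who owns it?
SELECT c.control_id, c.control_owner, f.framework, f.requirement_ref
FROM controls c
JOIN control_requirement_map m ON m.control_id = c.control_id
JOIN framework_requirements f  ON f.requirement_id = m.requirement_id
WHERE c.control_id = 'IAM-01';
```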
Omer Singer: “Having the controls identified and scoped out is super important. I see some teams who want to be data-driven start by focusing on the sources and take a sort of hoarder’s mentality: ‘I want it all. Bring me all the data.’ It’s better to start with understanding what it is that we really care about. From there, it’s a question of what data you’ll bring in and how you’ll automate it. For anybody starting out on this journey, I’d say start with scope and start with identifying the controls that you really care about.”
Richard Marcus: “I like to say that if you automate a bad process, you’re going to end up with more bad output. Once you have these basics scoped out in your controls inventory, the next thing that we tend to think about is evidence collection or control testing. This is another of those fundamental building blocks — before you start automating things, do you understand what you want to do with these controls? Do you have procedures outlined for what kind of evidence you’re hoping to get to satisfy these control requirements? I strongly recommend that you consider going out and manually testing controls for one or two cycles to get those procedures down correctly, so you know who to go to for the information and what systems it’s coming from. Then, if you have good compliance people on your teams, they’re going to start to get bored doing the same thing over and over again, and will figure out a way to automate themselves out of this job. But having the clarity and continuity of the requirements and the process down, making sure it’s repeatable and consistent — those are all key things to make sure you have in place before you can start to think about continuous monitoring or automating this process.”
Omer Singer: “I don’t want to assume that everybody is familiar with Snowflake. Snowflake is a data cloud company — we’ve built a data platform that is cloud native; think of it as a big database in the sky. It supports all three of the major clouds, there are no limits on how much data you can collect in a day or how long you can keep it, and it’s very cheap to store. Companies are using it for lots of different use cases, and it supports an open architecture so you can plug applications into it. We’re seeing companies like AuditBoard saying if there’s already a lot of relevant business data in Snowflake, and it’s easy for people to bring in evidence, we can provide a fast track to effective continuous compliance automation by running AuditBoard as the compliance automation layer on top of the existing Snowflake data platform… Our job is to make it cost effective and fast to ask questions of the data, and AuditBoard focuses on the compliance use cases.”
AuditBoard & Snowflake Example: Evidence Collection
Richard Marcus: “It probably goes without saying, but Snowflake and AuditBoard are customers of each other. We’re going to show some examples of continuous monitoring use cases that our teams have been working on for our own internal compliance activities that may help you conceptualize how this type of technology and architecture might be useful within your own teams. The first example that I’m going to share has to do with evidence collection that we’re pulling in from our AWS environment. We’ve mapped all of the CIS benchmarks for AWS hardening. These are technical controls that can be mapped or aligned with any of the compliance frameworks that you might be following — maybe you’re a NIST shop or maybe you follow PCI or ISO or SOC 2. We’ve imported them into our control library and mapped them to the various controls and frameworks that we follow. Step two would be connecting a monitor back to that AWS data, whether you’re asking AWS directly or you’re pulling the data from AWS into a data warehouse like Snowflake. In this case, what we’re monitoring for is a specific requirement around password policies that says passwords have to expire and be reset within 90 days. We’re just asking a question of the data: Do I have any users in the user database with a password that’s older than 90 days? Pretty simple question, and the answer’s in the data, so we’re creating a monitor to go out and fetch that data automatically. Where this becomes really powerful is that this is something that can be scheduled. You can ask this question much more frequently if the bots are doing it than if you had a human trying to answer it with any degree of urgency.”
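As a rough illustration of the kind of monitor Richard describes, assuming the AWS IAM credential report has already been landed in a Snowflake table: the first query asks the 90-day question, and the Snowflake task puts it on a schedule. The table, column, and warehouse names (aws_iam_credential_report, password_policy_findings, compliance_wh) are placeholders for illustration, not the actual AuditBoard or Snowflake objects.

```sql
-- Which users have a password older than 90 days? (hypothetical table/column names)
SELECT user_name, password_last_changed
FROM aws_iam_credential_report
WHERE password_enabled = TRUE
  AND password_last_changed < DATEADD(day, -90, CURRENT_TIMESTAMP());

-- Run the check on a schedule instead of waiting for an audit cycle.
-- (New tasks are created suspended; ALTER TASK password_age_monitor RESUME; starts it.)
CREATE OR REPLACE TASK password_age_monitor
  WAREHOUSE = compliance_wh
  SCHEDULE  = 'USING CRON 0 6 * * * UTC'
AS
  INSERT INTO password_policy_findings
  SELECT user_name, password_last_changed, CURRENT_TIMESTAMP() AS detected_at
  FROM aws_iam_credential_report
  WHERE password_enabled = TRUE
    AND password_last_changed < DATEADD(day, -90, CURRENT_TIMESTAMP());
```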
Omer Singer: “The opportunity of the security data lake and having all this data centralized is that a lot of times the information you need to automate is not present in the record itself that you collected off that API, which may have just one dimension of the data. In Richard’s example, you have a password policy for a certain user. Is this user on the cloud engineering team, or are they on the sales engineering team? Have they recently been fired or changed teams? These are all considerations that could be very relevant. If you didn’t have the data centralized, you might have to take the initial finding and have somebody review it before passing it along to the people who actually fix the issue, which adds a human into the loop and kind of breaks down the whole continuous compliance process. With this model of AuditBoard pointing to Snowflake, anytime you find that you need to have a human in the loop, you can ask, ‘Why does this human need to be in the loop?’ In this case, because they need to check information that is not available to the continuous compliance system yet. Then the next step is to talk to our data engineering friends and bring in that additional data set, so that we can move that task out of the analysts’ day-to-day and into the automation. That is very powerful. That gives me a lot of hope that we’re going to be able to do much better in the future as an industry and hopefully never fail an audit again, because if you’re continuously asking the questions that the auditors are going to ask, you’re going to be successful.”
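A sketch of what that next step could look like once the additional data set is in the platform: the same 90-day check, joined against a hypothetical hr_directory table so the monitor itself carries the context a human used to add, such as team and employment status.

```sql
-- Enrich the raw finding with context that used to require a human in the loop.
-- hr_directory is a hypothetical table loaded from the HR system.
SELECT
    f.user_name,
    f.password_last_changed,
    h.department,                        -- cloud engineering vs. sales engineering, etc.
    h.employment_status
FROM aws_iam_credential_report f
LEFT JOIN hr_directory h
       ON h.username = f.user_name
WHERE f.password_enabled = TRUE
  AND f.password_last_changed < DATEADD(day, -90, CURRENT_TIMESTAMP())
  AND COALESCE(h.employment_status, 'UNKNOWN') <> 'TERMINATED';
```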
Richard Marcus: “What I really like about this approach is that if you’re collecting data to answer a particular question, you can start to build a repository of that evidence over time. If you have your controls mapped on the back end, now that evidence becomes useful for lots of people in your organization who might be trying to answer similar questions. So maybe you collect it for a PCI audit, but it’s also useful for the SOX team or internal audit or any other team doing another type of audit. You can pull the data in and create that audit trail that becomes really useful for other teams.”
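One way to picture that evidence repository, continuing the hypothetical names from the sketches above: each monitor run writes its findings alongside the mapped control and a timestamp, so the same rows can later back a PCI request, a SOX test, or an internal audit.

```sql
-- Hypothetical evidence repository: every monitor run is stored with its control mapping.
CREATE TABLE IF NOT EXISTS evidence_repository (
    control_id    VARCHAR,         -- ties back to the controls inventory
    monitor_name  VARCHAR,
    result        VARIANT,         -- the raw finding, kept as semi-structured data
    collected_at  TIMESTAMP_NTZ
);

INSERT INTO evidence_repository
SELECT
    'IAM-01',
    'password_age_monitor',
    OBJECT_CONSTRUCT('user_name', user_name, 'password_last_changed', password_last_changed),
    CURRENT_TIMESTAMP()
FROM aws_iam_credential_report
WHERE password_enabled = TRUE
  AND password_last_changed < DATEADD(day, -90, CURRENT_TIMESTAMP());
```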
AuditBoard & Snowflake Example: Vendor Reviews
Richard Marcus: “Everybody’s got to do vendor reviews. Most organizations have a requirement to do a risk assessment on new third parties that they engage with. Where people usually fall down is on the annual reassessment — or whatever period you have to go and reassess those relationships over time. There is a great monitor that someone on our team built to look at all of the different vendor information that we have in our database, check the last time each vendor was reviewed, and notify us when it’s time to review that vendor again. [The video shows] the SQL for those that are interested. We’ve got the data in various tables. We know where that information is, we know what risk tier a vendor might be in, and we know the dates when they were last reviewed. You end up with a query that says ‘Tell me where you’ve got vendors that are outside of, or maybe approaching, that threshold,’ so that this becomes actionable information that the compliance team can use.”
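The SQL from the video is not reproduced here, but a reassessment monitor along those lines might look roughly like the following, with hypothetical table names and risk-tier review windows.

```sql
-- Which vendors are overdue, or approaching, their reassessment threshold?
-- vendor_inventory, its columns, and the interval values are placeholders.
SELECT
    v.vendor_name,
    v.risk_tier,
    v.last_reviewed_at,
    DATEDIFF(day, v.last_reviewed_at, CURRENT_DATE()) AS days_since_review
FROM vendor_inventory v
WHERE v.last_reviewed_at IS NULL
   OR DATEDIFF(day, v.last_reviewed_at, CURRENT_DATE()) >=
      CASE v.risk_tier
          WHEN 'HIGH'   THEN 180    -- reassess high-risk vendors every 6 months
          WHEN 'MEDIUM' THEN 365
          ELSE 730
      END - 30                      -- also surface vendors approaching the threshold
ORDER BY days_since_review DESC;
```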
Omer Singer: “Speaking of this specific use case around vendor risk management, what we’ve been doing at Snowflake is talking to the providers of vendor information and asking them to make their data available on the data marketplace. How much easier is it when you can get the information about the vendors through a data share, no API integration required? One click within the marketplace, it’s requested, and we’re seeing the vendors responding… Ask the vendors that you work with to start playing a role in this ecosystem — just a little work can make your lives much easier, and we’re seeing vendors happy to do it.”
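For context, consuming a share from the Snowflake Marketplace or a direct share really is a single statement on the consumer side rather than an API integration; the account, share, and table names below are placeholders.

```sql
-- Mount a provider's share as a read-only database (placeholder account/share names).
CREATE DATABASE vendor_risk_data FROM SHARE provider_account.vendor_risk_share;

-- The shared tables can then be joined directly into the vendor review monitor.
SELECT *
FROM vendor_risk_data.public.vendor_security_ratings   -- schema/table depend on the provider
LIMIT 10;
```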
Keys to Continuous Monitoring Success
Richard Marcus: “Be thoughtful about the workflow that you wrap around some of the continuous monitoring and automation. I think in the legacy model, there’s a lot of focus on how the evidence was collected. Did you sit over the shoulder of the engineer and take screenshots? What does the chain of custody of that evidence look like, and how do you know it came from the source system that you’re auditing? In a more automated or continuous workflow, you still have to be prepared to visualize and explain to an auditor where the data came from, and to enforce proper version control and change management around those workflows. For example, if a regulation or your framework requirement changes, you may have to change the query. Is that query the final approved version that we all agree is the proper testing procedure for this particular control? Those kinds of responsibilities don’t just go away because you’ve automated things, but having a platform makes it a lot easier to maintain that integrity and show stakeholders how the workflow works.”
Omer Singer: “The first thing is to decide that we’re going to take a data-driven approach and that we’re not going at it by ourselves. We have data experts inside the company, and there’s an ecosystem geared to enable it. This used to be pretty hard, and now it’s gotten much easier. Reach out to the people that use the data platform at your company, whether it’s Snowflake or a different one, and talk to them about things that are challenging for you or that you have on the roadmap for this year — and start brainstorming. Maybe the data’s already present in the data platform, and if it’s not, they can probably bring it in pretty easily. When it comes to actually mapping the controls to the automation, that’s where AuditBoard has you covered off the shelf. I think it’s an amazing approach, and it can just plug into that existing Snowflake environment. Then once you do get the insights, go back to those stakeholders and make sure that you’re giving them access to those dashboards, to empower everybody to own their responsibilities and be successful.”
Richard Marcus: “As long as you’re making the effort to move to a continuous monitoring approach and automate evidence collection, you might as well take on continuous reporting and issue management too, right? This is one of the biggest ways that you can unlock value for your organization. In a point-in-time audit, maybe at the end of the audit you share the results with your team and your leadership team, and you try to drive the right behaviors and the right changes. That no longer has to be a once-a-year or once-a-quarter activity. I think there’s also a lot of value and opportunity in integrating dashboards into this work, so that you can show, on a real-time basis, how you’re doing against the different compliance frameworks that you’re beholden to. I’m not big on public shaming, but you can’t change what you don’t measure. So putting this data in front of people will drive the right behaviors across your organization as well.”
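As a last sketch of what those dashboards could sit on top of, again using the hypothetical tables from earlier: a roll-up of passing controls per framework gives a near-real-time view of posture instead of a once-a-year snapshot. The control_test_results table and its latest_status column are assumptions for illustration.

```sql
-- Hypothetical roll-up for a compliance dashboard: pass rate per framework.
SELECT
    f.framework,
    COUNT(DISTINCT c.control_id) AS controls_in_scope,
    COUNT(DISTINCT CASE WHEN t.latest_status = 'PASS' THEN c.control_id END) AS controls_passing,
    ROUND(100.0 * COUNT(DISTINCT CASE WHEN t.latest_status = 'PASS' THEN c.control_id END)
          / NULLIF(COUNT(DISTINCT c.control_id), 0), 1) AS pass_rate_pct
FROM framework_requirements f
JOIN control_requirement_map m   ON m.requirement_id = f.requirement_id
JOIN controls c                  ON c.control_id     = m.control_id
LEFT JOIN control_test_results t ON t.control_id     = c.control_id
GROUP BY f.framework
ORDER BY pass_rate_pct;
```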
Looking for more thought leadership? Check out our on-demand webinar library, and stay tuned for more videos featuring audit, risk, and compliance leaders discussing industry issues, insights, and experiences.