Early Insight into a PCI DSS v4.0 Assessment
It seems like Apple releases a new version of the iPhone every year these days, and despite all the new iterations featuring similar looks, builds, and functions, there’s always that period where everyone has to get used to the new thing.
It’s the same with any new innovation, and the latest version of the PCI DSS is no different. Though it’s been some time now since v4.0 was released for public consumption by the payments community, the industry is still getting used to the new standard and its various nuances.
At Schellman, our experienced PCI QSA team has had the opportunity to complete multiple assessments against PCI DSS v4.0, and to help ease your own transition, we’re going to offer some takeaways in this article.
Though you should keep in mind that these may change as we all become more accustomed to the new approach, insight like this could still help you right now as you ease into compliance with the new requirements.
7 Takeaways from a PCI DSS v4.0 Assessment
1. Your trepidation about the new version is valid, but you’re likely in better shape regarding the first transition deadline than you think.
We understand your nerves about v4.0 and you’re not alone—whenever a new standard is introduced, many worry that their current approach will not be sufficient to meet the rigor of a new compliance framework.
But now that we’ve gone through an assessment, we can say that the new requirements you’ll need to meet before the first “go-live” date of March 31, 2024, are not especially onerous. Most revolve around the documentation of roles and responsibilities, and coming up with documentation that can support those needs at the outset shouldn’t be too difficult.
Now, that’s not to say there is no work to be done or that these mandates should be taken lightly; however, documenting roles and responsibilities is probably something you’ve already been doing, so formalizing them to meet these requirements is not likely to be a heavy lift.
2. However, be wary of the requirements that “go live” on March 31, 2025.
That said, the requirements that become effective a year later, on March 31, 2025, appear to be a different, more difficult story. When performing our first readiness assessments and full v4.0 assessment, we received a lot of blank looks regarding these.
For instance, Requirement 6 asks organizations to maintain an inventory of their bespoke and custom software. The organizations we’ve evaluated were surprised to find their existing lists insufficient, and yes, these can make for massive lists, but the intent is to get a better handle on which software risks apply to your environment. (You cannot do that if you do not know what software is deployed where.)
2025 might seem far away, but organizations have already stumbled in implementing some of these new requirements, so get started now as you’ll likely need every minute of runway available to achieve compliance here.
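To make that inventory expectation more concrete, below is a minimal sketch in Python of what one entry in a bespoke and custom software inventory might capture. The field names and example data are purely our own illustrative assumptions rather than anything prescribed by the standard, and many organizations will track this in a CMDB or spreadsheet instead of code; the point is simply the level of detail involved.

```python
# Hypothetical sketch only: field names and values are illustrative assumptions,
# not prescribed by PCI DSS v4.0.
from dataclasses import dataclass, field


@dataclass
class SoftwareInventoryEntry:
    name: str                   # application or component name
    version: str                # currently deployed version
    software_type: str          # e.g., "bespoke" or "custom"
    deployed_to: list[str]      # systems or environments where it runs
    owner: str                  # team accountable for maintenance and patching
    third_party_components: list[str] = field(default_factory=list)  # bundled libraries


# Example entry for a (fictional) cardholder data environment.
inventory = [
    SoftwareInventoryEntry(
        name="payments-api",
        version="2.4.1",
        software_type="bespoke",
        deployed_to=["prod-cde-web01", "prod-cde-web02"],
        owner="Payments Engineering",
        third_party_components=["openssl 3.0.x", "log4j 2.20.x"],
    ),
]

# A simple pre-assessment sanity check: print what is deployed where.
for entry in inventory:
    print(f"{entry.name} {entry.version} ({entry.software_type}) -> {', '.join(entry.deployed_to)}")
```

However you choose to keep it, what matters is that the inventory stays current and covers every piece of bespoke and custom software in scope.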
3. Double-check whether the Customized Approach is right for you.
You likely already know that the new standard allows organizations to meet the “spirit and intent” of a requirement rather than meeting it exactly as stated. That may sound incredibly appealing, and, in fact, there may be many business reasons that you would want to go this route.
However, now that we’ve been through an assessment, we feel that most organizations likely won’t need to use that approach, given the way the requirements are written and the fact that compensating controls are still very much in play. Plus, the Customized Approach requires extensive documentation, including a Targeted Risk Analysis (TRA), which may make some shy away from using even one customized control, let alone many.
4. Get clear on whether you’re a “multi-tenant service provider.”
“You’re not a shared hosting provider? Are you sure?”
We understand that it’s not necessarily easy to discern based on the definition in the standard; as assessors, we receive pushback all the time on whether these requirements are truly applicable to the organizations being assessed.
If your assessor has asked about this, be ready, because it likely means they saw something in your environment that may qualify you as a multi-tenant service provider (yes, that means you too, cloud providers). Be prepared to meet these requirements and the rigor they demand.
5. Consider further training that encompasses new approaches and technologies.
While you absolutely should start with reading through the requirements, it certainly could help complete your understanding to also invite the perspective of a professional who has been doing this for many years and has the experience and expertise to answer questions about the new nuances.
At Schellman, we pride ourselves on our ability to both perform readiness assessments and provide training sessions for our customers. These can be done separately or together as part of an engagement to help bolster your grasp of the details, especially given that a new standard like this is largely uncharted waters.
New questions and new approaches come up every time we do one of these brainstorming-type sessions, and we’ve seen firsthand how immensely helpful it is when auditee personnel walk through a training exercise with us and ask questions about how they might approach a requirement.
6. Expect a longer—but cleaner—report.
Point blank: the new reporting template is longer, but that’s actually a good thing.
In version 3.2.1, there were two issues:
- The report was repetitive; and
- The reporting instructions were a bit vague about what documentation was expected.
Now, the reporting instructions allow assessors to reference evidence, interviews, and process walkthroughs more simply. Though it takes longer to write on our side, the structure and format should make for easier, more productive reading for your organization.
7. There are different signature requirements regarding deliverables.
For assessments against PCI DSS v4.0, your QSA will need to sign their name to the ROC. That might seem odd, as many assessors have never done so before; there may have been a lot of names listed on reports against v3.2.1, just not the assessor signing off on the report. But this is a good move: it holds your QSA accountable and encourages good, quality assessment work.
Next Steps for Your PCI DSS v4.0 Compliance
As with any new concept or invention, it takes time to get used to all the different facets and new hurdles. With PCI DSS v4.0, the stakes are incredibly high: you need to get your compliance right even though the standard has only recently been released and come into use.
But now, with this insight from us after early assessments against this version, you know a little more about some particulars you and your assessor should look out for as you proceed through your transition.
For more help deconstructing the PCI DSS v4.0, make sure you check out our detailed breakdowns of different aspects and changes so that you can get started in adjusting your environment:
- How to Define Time in PCI DSS 4.0
- Understanding the Updates to Risk Management in PCI DSS v4.0
- Scoping Validation Requirements in PCI DSS 4.0: What’s Changed?
- New Multi-Factor Authentication Requirements in PCI DSS 4.0
- How to Keep Your Legacy Systems Compliant Under PCI DSS v4.0
And, if you’re interested in engaging our team for one of the aforementioned learning sessions—or, if you just have some lingering questions about this standard—please feel free to contact us so that we can do our best to set your mind at ease and prepare you.
About JOE O'DONNELL
Joe O'Donnell is a Senior Manager with Schellman mainly dedicated to the PCI and PCI specialty service lines. Prior to joining Schellman in 2015, Joe worked in industry within an Enterprise Risk Management consulting practice, where he managed IT reviews in support of financial audits and assisted with various engagements, including SOC reports, penetration testing and vulnerability scanning, SOX, HIPAA, and bank audits. Before focusing his career on IT auditing services, Joe worked as an Enterprise Operations Computing Analyst, where he gained experience in IT systems analysis and data center operations.