Why Access Reviews Fail Consistently
SOC 2 CC6.3 requires that access to systems and data is appropriate for each user's role, and that access is reviewed on a regular basis. ISO 27001 Annex A.8.2 similarly requires review of privileged access rights. Both frameworks specify that the review must happen, that it must be documented, and that excess access identified in the review must be revoked.
In theory, quarterly access reviews should be straightforward: pull a list of users and their permissions, have each team manager confirm that the access is still appropriate, remove access that has been granted but is no longer needed. In practice, this process fails regularly across companies of all sizes.
The failure patterns are consistent enough to be predictable:
- Reviews happen inconsistently (one quarter has documentation, the next does not)
- Reviewers approve access without actually reviewing it ("rubber-stamp approvals")
- Access is not actually revoked after a reviewer marks it as excessive
- Evidence quality is poor (a screenshot of a spreadsheet does not establish a clear chain of custody)
The Four Most Common Failure Modes
1. The Scheduling Problem
Access reviews rely on someone remembering to initiate them. In practice, this means they get done when compliance pressure is highest — during audit prep — and missed or delayed during normal operations. A company that does access reviews in February and November, skips Q2 and Q3, and then tries to assemble quarterly evidence before an audit has a gap problem.
SOC 2 Type II auditors sample access review records across the observation period. If you have two reviews in a year instead of four, the auditor notes the gap. If the gaps occur in months that had high IAM activity — onboarding a large class of employees, granting temporary elevated access for a migration — the gap is more significant.
2. The Rubber-Stamp Problem
When access reviews are sent as spreadsheets via email, reviewers rarely scrutinize each row. A manager presented with a 200-row spreadsheet of user permissions at 4 PM on a Thursday will click "approve" without examining whether every role assignment is still appropriate. This produces documentation that looks complete but does not represent actual review.
Auditors test for this. A common technique is to check whether any access was removed as a result of the review. If every review cycle ends with zero access removals across dozens of users and dozens of systems, the auditor questions whether the reviews are substantive. Well-designed access reviews result in removals 10–15% of the time — not because security is poor, but because role changes, project transitions, and team restructuring constantly create mismatches between current access and current need.
3. The Follow-Through Problem
Even when reviewers genuinely mark access as excessive, the access is often not revoked in a timely manner. The review generates a task. The task sits in someone's queue. The access stays in place for weeks or months after the reviewer confirmed it should be removed.
CC6.3 requires not just that access is reviewed but that the results of the review are acted upon. An access review that identifies 12 excess permissions but only revokes 3 of them within 30 days has a measurable completion rate problem. Auditors who find this pattern will probe for additional examples of access management failures.
4. The Evidence Quality Problem
Manual access reviews typically produce evidence in formats that are difficult to audit: email threads with spreadsheet attachments, screenshots of admin consoles, PDF exports from HR systems that do not match the actual system state at the time of review. When an auditor needs to verify that a specific user's access was reviewed and approved by a named manager on a specific date, these formats often cannot provide that level of specificity.
Chain of custody matters for access review evidence. The auditor needs to verify that: the permission list reviewed was accurate as of the review date, the review was completed by an authorized approver, and any access removed was removed as a direct result of the review decision. A screenshot of a spreadsheet satisfies none of these requirements independently.
What a Functional Automated Access Review Looks Like
An automated access review workflow solves each of these failure modes with specific mechanisms:
Automated Scheduling and Initiation
The review schedule is configured once, and reviews launch automatically on the defined cadence — quarterly, semi-annually, or according to a custom schedule. Each review is initiated by the platform, not by someone remembering. The compliance team sees the upcoming review on their dashboard 2 weeks in advance, the review launches on the scheduled date, and reviewers receive their assignments automatically.
This eliminates the scheduling problem: reviews happen on the configured cadence regardless of what else is going on in the organization.
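As a minimal sketch of the scheduling logic, assume a platform computes launch dates from a configured cadence and surfaces each review on the dashboard two weeks ahead (function names and the 14-day lead are illustrative, not a real product API):

```python
from datetime import date, timedelta

def quarterly_review_dates(year: int, launch_day: int = 1) -> list[date]:
    """Return the four scheduled launch dates for a quarterly cadence."""
    return [date(year, month, launch_day) for month in (1, 4, 7, 10)]

def dashboard_notice_date(launch: date, lead_days: int = 14) -> date:
    """Date the upcoming review appears on the compliance dashboard."""
    return launch - timedelta(days=lead_days)
```

Because the dates are derived from configuration rather than from someone's memory, a skipped quarter becomes impossible by construction.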
Reviewer-Specific Assignment with Structured Questions
Rather than a single spreadsheet sent to one compliance owner, automated access reviews generate reviewer-specific assignments. Each manager sees only the users and permissions within their scope. The assignment includes specific yes/no questions for each permission set rather than asking the reviewer to assess a raw permission list they may not understand.
The structured format produces more meaningful reviews: a manager who knows their team can answer "Is this user still in their role? Do they still need this level of access?" more accurately than they can evaluate a raw IAM policy name. Engagement is higher when the task is scoped and structured rather than open-ended and overwhelming.
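The assignment-splitting step can be sketched as follows: take a flat permission export and group it into per-manager assignments, attaching the structured questions to each item (the row shape and question wording are hypothetical):

```python
from collections import defaultdict

QUESTIONS = (
    "Is this user still in the role this access was granted for?",
    "Do they still need this level of access?",
)

def build_assignments(permission_rows: list[dict]) -> dict[str, list[dict]]:
    """Split a flat permission export into per-manager review assignments.

    Each input row is assumed to look like:
    {"user": ..., "permission": ..., "manager": ...}
    """
    assignments: dict[str, list[dict]] = defaultdict(list)
    for row in permission_rows:
        assignments[row["manager"]].append({
            "user": row["user"],
            "permission": row["permission"],
            "questions": QUESTIONS,  # structured yes/no prompts per item
        })
    return dict(assignments)
```

Each manager then receives only their own slice, scoped to people and systems they can actually assess.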
Automatic Escalation and Deadline Tracking
Automated systems send reminders as deadlines approach and escalate to a backup reviewer if the primary reviewer does not respond within the review window. This addresses both the non-response problem and the rubber-stamp problem — a reviewer who knows that non-response will escalate to their manager is more likely to complete the review genuinely.
Completion rate metrics are tracked and visible to the compliance team throughout the review period. If a review is running at 40% completion with three days left, the compliance team sees it and can intervene directly rather than discovering the gap during audit prep.
Direct Integration with Deprovisioning
When a reviewer marks access as excessive, the system generates a deprovisioning task linked directly to the review decision. The task is assigned to the relevant system administrator with the specific access to remove, the deadline for removal, and the link to the review decision that triggered it. Completion of the deprovisioning task is logged and attached to the access review record.
This creates the chain of custody the auditor needs: reviewer decision → deprovisioning task → completion record — all linked together in the evidence record. The auditor can verify the full lifecycle without asking follow-up questions.
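A minimal sketch of that linkage, assuming a simple task record keyed back to the review decision that created it (all field and function names hypothetical):

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DeprovisionTask:
    review_decision_id: str   # link back to the review decision (chain of custody)
    system: str               # e.g. "AWS IAM"
    user: str
    permission: str
    assignee: str             # the admin responsible for removal
    task_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    completed_at: Optional[datetime] = None

def revoke_from_decision(decision_id: str, system: str, user: str,
                         permission: str, admin: str) -> DeprovisionTask:
    """Create a deprovisioning task linked to the triggering review decision."""
    return DeprovisionTask(decision_id, system, user, permission, admin)

def mark_complete(task: DeprovisionTask) -> DeprovisionTask:
    """Log completion; the timestamp attaches to the access review record."""
    task.completed_at = datetime.now(timezone.utc)
    return task
```

Because the `review_decision_id` travels with the task, the lifecycle (decision, task, completion) can be reconstructed from a single record.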
What the Evidence Record Should Show
A well-constructed access review evidence record for SOC 2 CC6.3 includes:
- The complete list of users and permissions reviewed, pulled directly from the source system (Okta, AWS IAM, GitHub) at the review date — not manually assembled
- The name and role of each reviewer, and the date they completed their assignment
- The specific approval or revocation decision for each permission item
- For any access marked for removal: the resulting deprovisioning task, the assignee, the completion date, and the system record confirming removal
- The total completion rate for the review cycle and the date all open items were closed
This level of evidence is difficult to produce manually and straightforward to produce with a platform that automates the process. CompliRun pulls access lists directly from Okta, AWS IAM, GitHub, and other integrated systems at the scheduled review date, ensuring the reviewed list matches the actual system state. All reviewer decisions, timestamps, and follow-up actions are stored in a single auditor-readable record per review cycle.
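As a sketch of what assembling such a record might look like, the fields listed above can be combined into one JSON document per review cycle (field names are illustrative, not CompliRun's actual schema):

```python
import json

def build_evidence_record(review_id: str, source_system: str,
                          pulled_at: str, items: list[dict],
                          closed_date: str) -> str:
    """Assemble one auditor-readable evidence record per review cycle.

    Each item is assumed to carry: user, permission, reviewer,
    reviewer_role, decided_at, decision, and (for revocations) a
    linked deprovision entry.
    """
    decided = [i for i in items if i.get("decision") is not None]
    record = {
        "review_id": review_id,
        "source_system": source_system,          # e.g. Okta, AWS IAM, GitHub
        "permission_list_pulled_at": pulled_at,  # list matches system state
        "items": items,
        "completion_rate": len(decided) / len(items) if items else 1.0,
        "all_items_closed_on": closed_date,
    }
    return json.dumps(record, indent=2, default=str)
```

Every bullet in the list above maps to a field the auditor can read directly, without reconstructing it from email threads or screenshots.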
Privileged Access: Higher Stakes, Same Failures
CC6.3 specifically calls out privileged access — administrative roles, production system access, database administrator rights, and similar elevated permissions — as requiring more frequent or more rigorous review. ISO 27001 A.8.2 similarly emphasizes privileged access rights as a distinct category requiring regular review.
In practice, privileged access reviews fail in the same ways as regular access reviews, but the consequences of failure are more significant. An unreviewed production database administrator role that should have been revoked when an engineer changed teams represents a more material exposure than an unreviewed read-only dashboard access. Auditors sample privileged access reviews with additional scrutiny.
The same automated approach applies: scheduled reviews, structured reviewer assignments, deadline escalation, and linked deprovisioning tasks. The difference is the cadence (privileged access reviews typically run monthly or every other month rather than quarterly) and the scrutiny applied to each decision.
The SOC 2 Audit Interaction
During fieldwork, auditors sample access review records from across the observation period — typically 2–3 review cycles from different quarters. For each sampled review, they verify: that the review was completed by an authorized reviewer, that the user and permission list was accurate, that excess access was actually removed, and that the removal happened within the timeframe specified in your access management policy.
Companies with automated access review platforms consistently fare better in this sampling than companies with manual processes. The evidence is more complete, the chain of custody is clearer, and follow-through on deprovisioning is demonstrably higher. The auditor interview also goes differently: an engineer who knows their access reviews are automated can describe the specific workflow and show the platform record rather than explaining why the Q2 spreadsheet is missing from the shared drive.
Automate your access reviews with CompliRun
CompliRun schedules access reviews, sends reviewer assignments, tracks completion, and logs deprovisioning outcomes — all organized as CC6.3 evidence. Setup takes under an hour.
Request a Demo