AI in Client Acceptance and Continuance (A&C): What the PCAOB Thinks

Artificial intelligence is actively reshaping research, planning, and risk assessment. For audit quality and compliance leaders, the most pressing question is how to use AI in A&C without triggering inspection risks.

The PCAOB’s Stance: AI Is an Assistive Tool, Not an Auditor

The PCAOB does not prohibit the use of AI, but it is clear on one point: technology is not a replacement for professional judgment. There is no “AI exception” to professional responsibility.

In its July 2024 Spotlight, PCAOB staff observed that while firms are investing heavily in generative AI, the most effective implementations focus on administrative and research tasks, with human partners retaining responsibility for final conclusions. Because A&C sits at the intersection of independence, ethics, and firm risk, it remains a high‑judgment area subject to heightened inspection scrutiny.

Bridging the Gap with Qualified Third Parties

Many firms bridge the gap between AI-driven efficiency and human expertise by engaging qualified third parties to perform A&C due diligence. However, delegating the task does not delegate the responsibility.

  • Supervision Standards (AS 1201):
    Lead auditors must supervise auditor‑engaged specialists. Firms cannot simply file a third‑party report; they must evaluate the specialist’s methods and assess the sufficiency and appropriateness of the evidence obtained.
  • The QC 1000 Factor:
    The PCAOB’s new Quality Control standard (QC 1000), effective December 15, 2025, places greater emphasis on managing “external resources.” Firms must implement robust controls to ensure that third‑party providers and any AI tools they use meet the firm’s standards for competence, objectivity, and reliability.

Navigating Inspection Risks

When it comes to PCAOB inspections, how AI is used in A&C matters just as much as whether it is used at all. Common red flags include:

  • Allowing AI tools to automatically determine “accept” or “decline” decisions
  • Relying on AI outputs that are not explainable or cannot be defended
  • Treating third‑party reports as final without a meaningful review
  • Succumbing to automation bias by blindly trusting a software-generated score

The Documentation Mandate

From a PCAOB inspector’s perspective, “the system recommended it” is not a defensible rationale. Documentation must be audit‑ready and clearly demonstrate:

  1. The Role of AI:
    Whether AI was used for research, drafting, data analysis, or other support functions.
  2. The Inputs:
    The data, sources, and prompts provided to the AI tool or third party.
  3. The Challenge:
    How the engagement team evaluated, corroborated, or challenged the AI or third‑party output.
  4. Professional Skepticism:
    Evidence that a human partner applied judgment and took responsibility for the final A&C decision.


Disclaimer: This communication is for general informational purposes only and does not constitute legal advice. The summary provided in this alert does not, and cannot, cover in detail everything firms need to know about using AI in client acceptance and continuance or how to incorporate PCAOB requirements into their policies and procedures. No recipient should act or refrain from acting based on any information provided here without advice from a qualified attorney licensed in the applicable jurisdiction.