In December 2023, Honolulu's Department of Planning and Permitting had a six-month backlog on building permit pre-screening. Applicants submitted plans and waited half a year before anyone even checked whether the paperwork was complete. By January 2026, that wait was down to seven days.
Two things made that possible. Robotic process automation handled completeness checks starting in 2022. Then CivCheck, an AI platform, took over code compliance review, flagging zoning conflicts, fire code issues, and building code violations before a human reviewer ever opened the file.
The speed improvement is real. So is a question nobody in Honolulu, or anywhere else, has answered: when the AI clears a plan that contains a structural deficiency, and that deficiency causes a wall to fail or a foundation to crack, who writes the check?
Fourteen Cities and Counting
Honolulu is not an outlier. Pueblo County, Colorado partnered with Blitz AI in January 2026 for automated plan review. Lancaster, California, where Mayor R. Rex Parris described permitting as a "bureaucratic pain point," adopted AI tools that he says have turned the process into a "driver of growth." Bellevue, Washington uses Govstream.ai in its planning department. Naples, Florida signed with Blitz AI and CityView in April 2026. Louisville, Kentucky is running a pilot. San Jose is deploying AI review to speed approvals amid its housing crisis.
At least three vendors are competing for municipal contracts: CivCheck, Blitz AI, and Govstream.ai. Someone has filed a patent (US12229480B1) on "AI-assisted code compliance checking for building and land development," which tells you something about how large the vendors expect this market to become.
Every one of these deployments uses the same phrase: "human-in-the-loop." CivCheck CEO Dheekshita Kumar put it this way in January 2026: "Our AI doesn't make decisions. It helps humans make better decisions faster by organizing information, flagging relevant code sections and surfacing issues that still require professional judgment."
That sentence does a lot of legal work.
Three Doors, All Locked
Imagine a scenario. Your architect submits plans for a custom home. The city's AI platform reviews them, flags twelve minor issues (a missing egress window dimension, a setback annotation that's unclear), and marks the structural engineering as compliant. A human plan reviewer sees the AI's green light on structural, spends her limited time on the twelve flagged items, and approves the permit. Eighteen months later, your engineered floor system deflects beyond allowable limits. Remediation costs $87,000.
You have three parties to pursue. None of them will pay easily.
Door 1: The city. In most states, municipalities enjoy sovereign immunity or governmental immunity for discretionary functions, and building plan review has historically been classified as a discretionary governmental function. Courts in Texas, Florida, California, and New York have all held, in various formulations, that a building department's failure to catch a code violation during plan review does not create municipal liability. The rationale: plan review is a public service, not a guarantee. The city did not design your house. It reviewed it as a courtesy to public safety.
AI does not change this legal shield. If anything, it reinforces it. The city can argue that it exercised reasonable discretion by adopting state-of-the-art review tools, and that the AI's miss was not negligence but an inherent limitation of the technology.
Door 2: The AI vendor. SaaS companies sell software, not professional services. Their end-user license agreements almost universally disclaim liability for consequential, incidental, and special damages. Read the terms of any enterprise software contract and you will find a clause capping the vendor's total liability at the amount the customer paid in the preceding twelve months. For a municipality paying $50,000 annually for an AI plan review license, that cap leaves your $87,000 remediation bill uncovered even in the unlikely event you could pierce the contractual privity barrier as a third-party homeowner.
The vendors are careful. Kumar's "our AI doesn't make decisions" framing is not just marketing. It is a liability architecture. If the AI only "surfaces issues" and "flags relevant code sections," the decision to approve remains with the human reviewer, and the vendor has no duty of care to you.
Door 3: Your design professionals. Your architect and structural engineer carry errors and omissions insurance. They are the parties with a direct contractual obligation to deliver code-compliant plans. In theory, they remain liable regardless of whether a human or an AI reviewed the permit application.
In practice, something shifts when AI enters the loop. If the AI review flagged twelve issues and marked structural as compliant, the architect may argue reliance. "The city's own review system confirmed my structural calculations. If the system found twelve other problems but not this one, that is a system failure, not a design failure." Whether that argument holds up depends on jurisdiction and the specific facts, but it introduces ambiguity that did not exist when a human reviewer caught or missed the same violation.
The Holding Cost Arithmetic
Permit delays are expensive. A 2021 NAHB study found that regulatory costs account for 24.3% of the final price of a new single-family home, averaging $93,870 per unit. NAHB Chairman Carl Harris told Congress in February 2025 that federal permits alone can take "upwards of one year," with Endangered Species Act consultations adding years more.
On a $420,000 new home financed with a construction loan at 8.5%, every month of permit delay costs the builder approximately $2,975 in carrying charges. A six-month backlog, like Honolulu's pre-AI reality, burns $17,850 before a single shovel hits dirt.
Cutting that to one week saves roughly $17,150 per home. Across Honolulu's roughly 2,500 annual residential permits, the aggregate savings approach $42.9 million per year in carrying costs alone. That is a staggering number. It explains why cities are adopting these tools fast and asking liability questions later.
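The arithmetic behind these figures is simple enough to check directly. The sketch below reproduces it; the loan amount, interest rate, and permit count are the article's illustrative figures, not measured data, and the one-week carry is approximated as 12/52 of a month.

```python
# Back-of-envelope check of the carrying-cost arithmetic above.
# Inputs are the article's illustrative figures, not measured data.

LOAN_PRINCIPAL = 420_000   # construction loan on a new home, USD
ANNUAL_RATE = 0.085        # 8.5% construction loan rate
PERMITS_PER_YEAR = 2_500   # rough annual residential permits (Honolulu)

monthly_carry = LOAN_PRINCIPAL * ANNUAL_RATE / 12      # interest per month of delay
six_month_delay = monthly_carry * 6                    # pre-AI backlog cost per home
one_week_delay = monthly_carry * 12 / 52               # ~one week of carry
savings_per_home = six_month_delay - one_week_delay
aggregate = savings_per_home * PERMITS_PER_YEAR

print(f"Monthly carrying cost:    ${monthly_carry:,.0f}")     # $2,975
print(f"Six-month delay cost:     ${six_month_delay:,.0f}")   # $17,850
print(f"Savings per home:         ${savings_per_home:,.0f}")  # ~$17,163
print(f"Aggregate annual savings: ${aggregate / 1e6:,.1f}M")  # ~$42.9M
```

The per-home savings lands at roughly $17,163 rather than the article's rounded $17,150; the aggregate still comes out near $42.9 million either way.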
But those savings assume the AI reviews are at least as accurate as human reviews. If AI introduces new categories of error, or if human reviewers become less thorough because they defer to the AI's initial pass, the cost of remediation on the back end could offset the savings on the front end. Nobody has published data on AI plan review error rates versus human error rates. The technology is too new. Honolulu's CivCheck deployment is barely two years old.
What AIA and ConsensusDocs Don't Say
Jane Kutepova, counsel at Michelman & Robinson LLP, wrote in Area Development's Q4 2025 issue that "construction contracts, by and large, don't address the use of AI, which is becoming dangerous."
She identified four questions that standard AIA and ConsensusDocs forms fail to answer: Who is responsible for selecting and configuring AI tools? Who owns the data and digital content these systems generate? Who assumes liability for actions or decisions made by autonomous systems? And should AI malfunctions, cyberattacks, or system outages be treated as force majeure events?
Philip Stein of Bilzin Sumberg made a similar argument in August 2025: "Contracts that incorporate AI need to explicitly allocate liability for such errors. Builders must determine whether fault lies with the contractor, the AI vendor, or both."
Right now, nobody is making that determination. Builders are signing contracts written before AI plan review existed, submitting to review processes they did not choose and cannot negotiate, and assuming that the permit stamp means what it has always meant. It does not. The stamp now means an algorithm found no issues and a reviewer, who may or may not have independently verified the AI's work, concurred.
The Strongest Argument For AI Review
A tired plan reviewer working through her forty-second set of residential plans this week may miss a header span that exceeds the prescriptive tables in IRC Section R602.7. She is human. She has been doing this for twenty-three years and she is good at it, but she is also carrying a caseload that her department staffed for in 2019, before pandemic-era building applications surged 30%.
An AI system checks every line of code. Every setback. Every egress dimension. Every structural span. It does not get tired. It does not skip the mechanical room because it is Friday at 4:30 PM. If the system is properly trained on the applicable version of the IRC, IBC, and local amendments, it may catch more violations than any individual reviewer, not fewer.
This is the strongest case against worrying about the liability gap: it may be theoretical. If AI review reduces errors in absolute terms, there are fewer violations to litigate, and the question of who pays becomes less urgent because fewer things go wrong.
The problem is that "may" is doing heavy lifting. No jurisdiction using AI plan review has published comparative error rates. No independent audit has measured whether AI catches more, fewer, or different types of violations than human reviewers. We are running a nationwide experiment on building safety with no control group and no published results.
What You Can Do
If you are building a custom home: Ask your building department whether it uses AI-assisted plan review. If yes, ask whether the AI reviews structural plans or only checks for completeness and zoning compliance. Ask whether the human reviewer independently verifies structural calculations or relies on the AI's initial assessment. You probably will not get a clear answer, but the question puts the department on notice that you expect a substantive review, not a rubber stamp.
If you are an architect or structural engineer: Do not assume that AI-reviewed permits reduce your liability exposure. They may increase it by introducing a reliance defense for builders who argue that the city's system confirmed code compliance. Document your own code compliance analysis independently of the permit review process. If you submit plans through an AI pre-screening portal and it flags issues, fix them. If it misses something you later discover, document that too.
If you are a general contractor: Add an AI clause to your construction contracts. Kutepova and Stein are right: the standard forms do not cover this. At minimum, your contract should specify that the owner acknowledges the city may use AI-assisted plan review, that such review does not constitute a warranty of code compliance, and that the contractor's obligation to build to code exists independently of the permit review process.
If you are a homeowner who just closed on a new build: Your homeowner's warranty likely covers structural defects for ten years (in states that have adopted the structural warranty statute framework). That coverage does not depend on how the permit was reviewed. But if you discover a defect and need to pursue a claim, understand that the chain of liability now has a new link in it, and that link is designed to deflect responsibility rather than accept it.
What This Analysis Did Not Prove
Sovereign immunity doctrine varies state by state. Some states have waived immunity for ministerial (as opposed to discretionary) governmental functions, and a court could potentially classify AI-assisted review as ministerial rather than discretionary, since the AI applies rules mechanically. That classification would open municipalities to liability in those states. No court has ruled on this question yet.
I assumed that AI vendor contracts disclaim consequential damages based on standard SaaS contracting practices. The actual terms between CivCheck, Blitz AI, Govstream.ai, and their municipal customers are not public. A municipality with strong procurement counsel may have negotiated better indemnification terms than I have assumed.
The $42.9 million carrying cost savings figure for Honolulu assumes all 2,500 annual residential permits were delayed six months pre-AI and are now delayed one week. In reality, the six-month backlog applied to the pre-screening queue, not to the entire permit timeline. Overall permit issuance times are still longer than one week. The figure illustrates magnitude, not precision.
No actual lawsuit has been filed involving an AI-reviewed building permit that missed a code violation. This entire analysis is prospective. The technology is new enough that the first test cases have not yet reached a courtroom. When they do, the outcomes will depend on facts and jurisdictions that this article cannot predict.
Sources
- GovTech, January 2026: Honolulu DPP pre-screening backlog from six months to seven days; CivCheck deployment; 174 projects in prescreen
- Route Fifty, May 2025: Honolulu DPP automation and CivCheck partnership details; "human-in-the-loop" quotes from CEO Kumar
- GovTech, January 2026: Pueblo County, CO and Blitz AI partnership; OpenGov integration
- Area Development, Q4 2025: Jane Kutepova, Michelman & Robinson, on AI gaps in standard construction contracts; AIA and ConsensusDocs shortcomings
- Bilzin Sumberg, August 2025: Philip Stein on contractual liability, IP, data privacy, and employment risks of AI in homebuilding
- NAHB, February 2025: Congressional testimony on permitting roadblocks; 2021 study finding regulatory costs at 24.3% of home price ($93,870 average)
- US Patent US12229480B1: AI-assisted code compliance checking for building and land development
- EINNews, April 2026: Blitz AI and CityView partnership for Naples, FL