Chapter 6: The Guardrails
A compliance call from Henderson's forces Sam to confront the governance gap - and build the policies, data controls, and monitoring that every automated business needs.
The call comes at 7:45 AM on a Tuesday.
It’s Henderson’s. Not the procurement manager who sends the POs. Not the accounts payable person who sends the payments. It’s someone from their legal and compliance department whose name Sam has never heard before.
“Ms. Laurent? This is David Kim from Henderson’s vendor compliance team. I need to ask you some questions about your data handling practices.”
Sam’s stomach drops, though she doesn’t yet know why.
David Kim explains: Henderson’s is a publicly traded grocery chain. They have vendor data requirements. Specifically, they have a clause in their supplier agreement, Section 8.3, which Sam signed three years ago and has not read since, that states all vendor employees must follow Henderson’s Acceptable Data Use Policy when handling Henderson’s proprietary information. This includes order data, pricing, product mix, and promotional plans.
“We’ve become aware,” David Kim says, in the careful tone of someone reading from a script, “that your organization may have been inputting Henderson’s data into third-party AI tools without authorization.”
Sam’s mouth goes dry.
“Can you confirm or deny whether Henderson’s purchase order data, pricing information, or order volumes have been shared with external AI platforms?”
Sam cannot confirm or deny because she genuinely doesn’t know. She tells David Kim she’ll investigate and call back within 24 hours.
She hangs up. Her hands are shaking.
Here is what happened.
Jordan, who has been using ChatGPT to build productivity shortcuts for weeks, created a particularly useful tool: a prompt that analyzes Henderson’s weekly POs and identifies trends. “They’re ordering 15% more Amber this quarter,” he’d told Sam proudly. “I think they’re prepping for a fall promotion.”
The analysis was correct. The problem was how Jordan got it. He had pasted six weeks of Henderson’s purchase orders, complete with product codes, volumes, per-unit pricing, and distribution center addresses, into ChatGPT. On the consumer version. With no enterprise agreement. No data processing agreement. No guarantee that the data wouldn’t be used to train future models.
Henderson’s found out because their own compliance monitoring flagged it. They run periodic checks on supplier data exposure, and Jordan’s ChatGPT session, or rather, the patterns it generated, triggered an alert.
Sam doesn’t know any of this yet. She only knows that her biggest customer’s legal team is on the phone and the words “data handling” and “AI tools” are being used in the same sentence.
Sam calls Jordan into the boardroom.
“Have you been putting Henderson’s data into ChatGPT?”
Jordan goes pale. Not because he was hiding it, but because he genuinely didn’t think it was a problem. “I was just analyzing their order patterns. I wasn’t sending it to anyone. It was just… ChatGPT.”
“What exactly did you paste in?”
Jordan pulls up his conversation history. Sam reads the prompts. She sees Henderson’s PO numbers, their product-specific pricing (which is under NDA), their distribution center locations, their order volumes going back two months.
Sam’s first instinct is anger. Her second instinct, the one that makes her a good leader, is honesty.
This isn’t Jordan’s fault. Jordan was never told not to do this. There is no AI policy at MapleCo. There are no guidelines about what data can and can’t go into which tools. Sam never thought to create one because, until six weeks ago, “AI” wasn’t part of how her business operated.
The truth is that employees everywhere are doing exactly what Jordan did: using consumer AI tools with consumer terms of service to handle business data their employers don’t know about.
Sam didn’t have a shadow AI problem. She had a governance vacuum.
After Sam leaves the boardroom, Jordan sits at his desk and opens his ChatGPT history. He scrolls through weeks of conversations. Henderson’s PO numbers. Pricing data. Distribution center addresses. Each one felt like progress when he typed it. Each one now looks like evidence.
He opens a new email. Types “Resignation” in the subject line. Stares at it for ten minutes. Doesn’t send it. Closes the tab. Then he starts making a list of every tool he’s used and every dataset he’s touched, because if Sam asks — and she will — he wants to have the answer ready.
Sam calls Oscar.
She expects him to be surprised. He isn’t.
“This was going to happen,” he says. “Not because Jordan did something wrong. Because you didn’t have guardrails. Every company I work with either creates an AI policy before something goes wrong or creates one after. The ‘after’ version is more expensive.”
Sam waits for the “I told you so.” It doesn’t come. Instead, Oscar asks a practical question.
“How bad is the Henderson’s situation?”
“I don’t know. Their compliance guy wants to talk tomorrow. I don’t even know what to tell him.”
“Tell him the truth. It was unauthorized use by an employee who wasn’t given guidance. It wasn’t malicious. You’re implementing a policy immediately. And you’re putting data controls in place.”
“Will that be enough?”
“I don’t know. But lying is worse. Henderson’s is a public company. They’ve seen this before. They know every supplier in 2026 has employees using AI tools. What they want to know is whether you’re aware of it and whether you’re doing something about it.”
But the Henderson’s situation is only half the crisis.
The other half arrives that same afternoon. Sam gets an email from a customer, a small restaurant chain in Niagara, saying they received an invoice dated for this month with last month’s pricing. The old, lower pricing. The invoice was generated by the new automation system, which pulled prices from the master list. But the master list hadn’t been updated to reflect the price increase that Sam communicated verbally to Lisa two weeks ago, which Lisa had been manually overriding in QuickBooks on a per-order basis, which the automation system doesn’t know about because nobody told it.
The system did exactly what it was designed to do — it just didn’t know what it didn’t know.
Sam issues a corrected invoice. The dollar amount is small. But it’s the principle: the automation confidently sent out wrong data because the source data was wrong. And nobody caught it because the whole point of automation is that you don’t have to check every transaction.
That night, Sam sits in her office long after everyone has gone home.
The PO automation: brilliant. Transformed her mornings. Freed Lisa. Impressed Henderson’s.
Her own manual override: humiliating. She caused the exact error the system was designed to prevent, because she couldn’t let go.
The governance gap: dangerous. One employee’s well-intentioned ChatGPT use almost cost them their biggest account.
The pricing error: embarrassing. The system did exactly what it was told, and what it was told was wrong.
Sam wonders: How many of these failures are the system’s, and how many are mine?
She picks up the phone. It’s 9 PM. She calls Oscar anyway.
He picks up on the second ring. “I was wondering when you’d call.”
“How did you know?”
“Because this is the part that always happens. The first win feels like the end of the story. It’s actually the middle.”
“You had four failures this month,” he says. “Can you separate them?”
Sam thinks. “Jordan’s ChatGPT thing. That’s governance.”
“Good. What else?”
“The pricing error. The system used the wrong price list because we have three versions that don’t agree. That’s… data integrity? The system exposed a problem that was already there.”
“Good. What else?”
“Nobody caught the pricing error before it went to the customer. We need monitoring. Sanity checks.”
“Good. And the fourth one?”
Sam is quiet for a long time.
“The fourth one is me,” she says. “I was processing Henderson’s orders by hand because I didn’t trust the system. I caused the error. The system had it right. I overrode it.”
Oscar doesn’t say anything for a moment. When he does, his voice is different. Softer. “That one’s the most important, Sam. The first three are structural. Policy, data, monitoring. Those are problems I can help you build solutions for. The fourth one is about whether you’re willing to actually let those solutions work.”
“Lisa said something similar. Less diplomatically.”
“Lisa’s been watching you hold the whole company together with your bare hands for twelve years. She knows what it costs. She also knows you can’t keep doing it.”
Sam stares at the ceiling.
“Every automation that breaks teaches you something a successful one never could,” Oscar says. “And the hardest lesson is that sometimes the bottleneck is the person who built the business. Fixing that one is harder than fixing any system.”
Sam doesn’t respond. She doesn’t need to. He’s right, and they both know it.
“So what do I do?”
“You do what you did after every other failure, but smarter. You don’t stop. You don’t go backwards. You build the guardrails. And this time, you build them around yourself too.”
Over the next two weeks, Sam and Oscar build three things.
The AI Policy. It’s one page. It says: MapleCo employees may use AI tools for internal productivity. No customer data, financial data, or competitive information may be entered into any AI tool without written approval. The approved tool list is maintained by Sam and reviewed quarterly. When in doubt, ask before pasting.
Sam shows it to Lisa. Lisa says, “This should have existed six months ago.” Sam agrees.
She shows it to Jordan. Jordan reads it, nods, and says, “I’m sorry about the Henderson’s thing. I really didn’t know.” Sam believes him. She also gives him a new job: Jordan is now responsible for maintaining the approved tool list and onboarding the team when new tools are added. His Zapier instincts and ChatGPT curiosity, it turns out, make him the perfect person to own this, as long as there are guardrails.
Oscar adds one more principle. “Your policy covers data. But not every process carries the same risk, and your team needs to know where the line is.”
He draws it simply: “Administrative repetition — reading a PO, extracting fields, matching against a database, sending a tracking notification — that’s safe to automate with monitoring. The system does the work, a person reviews the exceptions.”
“But pricing exceptions, contract-sensitive communication, compliance reporting, anything where a wrong answer damages a customer relationship or creates legal exposure — that stays human-reviewed. The system does the prep. It organizes, flags, summarizes. But a person makes the call.”
Sam thinks about Henderson’s. “So the PO extraction is on one side of that line. But if Henderson’s asks for a custom pricing agreement…”
“A person handles that. Every time. The system can pull the data and draft the comparison, but the decision and the communication stay human. That’s not a limitation. That’s the design.”
Sam adds it to the policy. One paragraph. It takes five minutes. It will save them from a conversation they don’t want to have with David Kim again.
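The line Oscar draws can be sketched as a simple routing rule. This is an illustrative sketch only; the task names and categories are hypothetical, not MapleCo's actual system:

```python
# Hypothetical sketch of the automate-vs-human line Oscar describes.
# Task names are illustrative examples, not a real taxonomy.

AUTOMATE_WITH_MONITORING = {
    "po_extraction",            # reading a PO, extracting fields
    "database_matching",        # matching against a database
    "tracking_notification",    # sending a shipment notification
}

HUMAN_REVIEWED = {
    "pricing_exception",        # custom pricing agreements
    "contract_communication",   # contract-sensitive messages
    "compliance_report",        # anything with legal exposure
}

def route(task_type):
    """Decide whether the system acts, or only preps for a person."""
    if task_type in AUTOMATE_WITH_MONITORING:
        return "automate"        # system does the work; a person reviews exceptions
    if task_type in HUMAN_REVIEWED:
        return "prepare_only"    # system organizes, flags, drafts; a person decides
    # Anything unclassified defaults to human review: when in doubt, a person.
    return "prepare_only"
```

The default matters most: a task nobody thought to classify falls on the human side of the line, which is exactly the "when in doubt, ask" principle from the policy.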
The Single Source of Truth. Oscar helps Sam consolidate her pricing into one database that feeds both QuickBooks and the automation system. No more three-way disagreements between the Google Sheet, QuickBooks, and customer-specific side deals. When a price changes, it changes once, in one place, and everything downstream updates automatically.
This is harder than it sounds. It takes a full week. But when it’s done, the leaky tubing is sealed.
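The mechanics of a single source of truth are simple even when the migration isn't. A minimal sketch, assuming both invoicing and the PO automation read prices through one shared lookup (class and method names here are invented for illustration):

```python
# Illustrative sketch of a single-source-of-truth price list.
# "PriceBook" and its methods are hypothetical names, not a real product.

class PriceBook:
    """One authoritative price list; every downstream system reads from here."""

    def __init__(self):
        self._prices = {}     # product_code -> base unit price
        self._overrides = {}  # (customer_id, product_code) -> negotiated price

    def set_price(self, product_code, price):
        # A price changes once, in one place; everything downstream follows.
        self._prices[product_code] = price

    def set_customer_price(self, customer_id, product_code, price):
        # Customer-specific side deals live here too, not in a spreadsheet.
        self._overrides[(customer_id, product_code)] = price

    def price_for(self, customer_id, product_code):
        # Both invoicing and the PO automation call this same function,
        # so they can never disagree about a price.
        override = self._overrides.get((customer_id, product_code))
        return override if override is not None else self._prices[product_code]
```

The point is not the data structure; it is that there is exactly one write path. A verbal price change that only ever reaches one system, the failure mode behind the Niagara invoice, becomes structurally impossible.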
The Monitoring Layer. Oscar adds automated checks to the PO and invoicing systems. Any invoice total that deviates more than 10 percent from the historical average for that customer gets flagged. Any new customer gets flagged for manual review. Any order with a product code that doesn’t match the catalog gets flagged.
Lisa reviews the flags each morning. It takes ten minutes. It catches the things that previously would have slipped through until a customer called to complain.
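The checks themselves are a few lines of logic. A minimal sketch of what such a monitoring pass might look like, assuming order history is available as a list of past invoice totals per customer (the function name and invoice fields are assumptions for illustration):

```python
def flag_invoice(invoice, history, catalog, known_customers, tolerance=0.10):
    """Return the list of reasons this invoice needs a human look.

    Hypothetical sketch: `invoice` is a dict with customer_id,
    product_codes, and total; `history` maps customer_id to past totals.
    """
    flags = []

    # Any new customer gets flagged for manual review.
    if invoice["customer_id"] not in known_customers:
        flags.append("new customer")

    # Any product code that doesn't match the catalog gets flagged.
    for code in invoice["product_codes"]:
        if code not in catalog:
            flags.append(f"unknown product code: {code}")

    # Any total deviating more than `tolerance` (10%) from the
    # customer's historical average gets flagged.
    past = history.get(invoice["customer_id"], [])
    if past:
        avg = sum(past) / len(past)
        if abs(invoice["total"] - avg) > tolerance * avg:
            flags.append(f"total {invoice['total']:.2f} deviates from average {avg:.2f}")

    return flags
```

An empty list means the invoice goes out untouched; anything else lands in Lisa's morning review queue. The checks don't decide whether the invoice is right, only whether it looks unusual enough to be worth ten seconds of human attention.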
Sam calls David Kim at Henderson’s.
She tells the truth. An employee, without malicious intent, used a consumer AI tool to analyze Henderson’s order data. The employee was not given guidance because MapleCo did not have an AI policy. That has been corrected. Here is the policy. Here are the controls. It will not happen again.
David Kim listens. He asks a few questions. Then he says: “Ms. Laurent, I appreciate your transparency. Most suppliers we flag on this either deny it or scramble to hire a lawyer.”
Sam waits for the but.
“But transparency doesn’t undo the exposure. We’re placing MapleCo on a 90-day vendor probation. During that period, we’ll be monitoring your data handling practices. We’ll need a copy of your AI policy, a signed attestation from every employee who handles Henderson’s data, and a quarterly compliance report. If everything checks out at 90 days, you return to standard vendor status.”
“And if it doesn’t?”
“Then we have a different conversation.”
Henderson’s also reduces their Q4 order volume by 15% during the probationary period. Not as punishment, David Kim explains. As risk management. They’re diversifying supply until they’re satisfied MapleCo’s controls are in place.
Sam hangs up. The account isn’t lost. But it’s wounded. Fifteen percent of Henderson’s is real money, and the probationary status means every interaction for the next three months carries extra weight. She can’t afford a single mistake.
It’s not the worst outcome. But it’s not the clean save she hoped for.
The 90-day clock hasn’t reset. It’s been replaced by a different 90-day clock. This one has paperwork.
That weekend, Sam sits on the bleachers at her daughter’s hockey practice.
She doesn’t bring her laptop. She doesn’t check her phone every five minutes. She watches the kids skate and she thinks about the last three months.
Every mistake she made was reasonable. Every mistake was wrong. And the hardest lesson wasn’t about technology or process or governance. It was about herself.
That her need to be essential — to personally touch every Henderson’s order, to be the one the business couldn’t survive without — wasn’t strength. It was the thing preventing the business from outgrowing her.
Sam isn’t sure she’s fully fixed that yet. But she’s no longer afraid to try.
Her phone buzzes. It’s an automated notification from the PO system: “Saturday batch: 3 POs received, 3 processed, 0 exceptions.”
She reads it, smiles, and puts the phone back in her pocket.