Some defense contractors are accidentally sabotaging their own CMMC compliance—not through poor implementation, but through perfectly logical procurement decisions.
The same vendor selection instincts that work brilliantly for qualifying suppliers, evaluating technical capabilities, and making risk-based purchasing decisions actively mislead when selecting CMMC partners. Price variations that normally signal quality differences actually reflect scope confusion. Brand recognition that usually indicates reliability means nothing when all authorized C3PAOs issue identical certificates. And confident recommendations that demonstrate expertise often reveal partners with exactly one solution regardless of your problem.
Three CMMC practitioners who've collectively guided hundreds of defense contractors through certification—Joe Devine (President, Axiotrop), Derrich Phillips (Lead CMMC Assessor, Aspire Cyber), and Pat Garcia (CEO, Kompleye)—gathered to discuss partner selection with a unified message: Traditional procurement logic doesn't just fail in CMMC partner selection. It inverts. And with fewer than 800 companies worldwide having achieved Level 2 certification at the time of their discussion, the ecosystem is too immature for conventional evaluation methods to work.
Here's what you need to know to choose partners who actually deliver.
This blog recaps the DCMMC panel "Pass once, Protect always: Choosing CMMC L2 Partners Who Deliver."
Watch the full panel above or keep reading for the recap.
The CMMC partner ecosystem operates under conditions that break normal vendor evaluation. With such a small population of certified companies, even partners who've successfully guided multiple organizations through certification still have limited pattern recognition across different industries, sizes, and technical environments. Statistical noise drowns out quality signals.
This creates three compounding problems:
Price variations are meaningless without scope clarity. When C3PAO assessment quotes vary by $30,000-$40,000 for the same company, it's almost never about capability differences. Pat Garcia, whose firm has conducted 20 assessments, explained that scoping differences, not capability differences, account for most price variation between proposals. One C3PAO might assume your enclave includes physical CUI requiring additional objectives. Another might not have asked. You're not comparing quality—you're comparing different interpretations of the same requirement.
Brand premium buys you nothing. Phillips, who has helped six companies achieve Level 2 certification, noted that you don't necessarily need to go with a big-name C3PAO; many lesser-known C3PAOs are just as qualified. Since all authorized C3PAOs issue the same certificate, paying for name recognition often means longer wait times as popular assessors juggle larger client loads.
Confidence signals the wrong thing. The consultant who immediately recommends MS365 GCC High without asking about your workflows isn't demonstrating expertise; they're revealing they have one solution regardless of your problem. Devine described it as "the elephant in the room"—a solution that works for many companies and dominates conversations. But it's an enterprise-level approach. Smaller companies needing enclave strategies might find it operationally incompatible with their collaboration patterns or cost structures.
The solution isn't to abandon evaluation; it's to evaluate completely differently.
Forget capability statements and marketing materials. Here's how to reveal whether a partner actually knows what they're doing.
Real expertise manifests as specificity about YOUR environment, not comprehensive claims about universal capability.
For identity and access management, Phillips outlined what to look for: Are they certified on your specific platform—Microsoft Entra ID, Active Directory, or whatever you're using? Can they discuss role-based access control, privileged access, and ensuring admins have separate accounts from standard users in the context of your actual infrastructure? Generic best-practices answers suggest generic experience.
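Phillips's separate-accounts point is easy to verify mechanically. A minimal sketch, assuming you can export two lists of usernames from your directory; the account names and sets below are hypothetical, not from any specific platform:

```python
# Hypothetical account inventory (usernames are made up); a quick check that
# no one uses the same account for both admin and day-to-day work.
standard_accounts = {"jdoe", "asmith", "blee"}
admin_accounts = {"jdoe", "adm-asmith"}  # "jdoe" wrongly holds both roles

def dual_use_accounts(standard, admin):
    """Return accounts that appear in both sets, violating the
    separate-admin-account rule."""
    return sorted(standard & admin)
```

A partner with real IAM experience should be able to describe how they'd run this kind of check against your actual directory, not just recite the principle.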
For logging and SIEM capabilities, Garcia identified three capabilities that reveal real implementation experience: centralized log visibility that ingests information from different sources across on-prem, cloud, and endpoints; alert monitoring capability that helps you respond in a timely manner to different events; and documentation requirements that help comply with the 90-day window for retaining records should a reportable incident get identified. But the key isn't whether partners can list these—it's whether they can explain how they'd implement them in your specific environment.
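Garcia's three logging capabilities can be framed as checkable questions: is every source centralized, and can each one cover the 90-day window? A minimal sketch under invented assumptions; the source names, field names, and dates are illustrative, not from any real SIEM:

```python
from datetime import date, timedelta

# Hypothetical inventory of log sources spanning on-prem, cloud, and
# endpoints; names and fields are illustrative only.
log_sources = [
    {"name": "on-prem-firewall", "forwarded_to_siem": True,  "oldest_record": date(2025, 1, 1)},
    {"name": "cloud-audit",      "forwarded_to_siem": True,  "oldest_record": date(2025, 3, 1)},
    {"name": "endpoint-edr",     "forwarded_to_siem": False, "oldest_record": date(2025, 2, 1)},
]

RETENTION = timedelta(days=90)  # the 90-day window discussed in the panel

def audit_logging(sources, today):
    """Flag sources that aren't centralized or can't cover the 90-day window."""
    findings = []
    for src in sources:
        if not src["forwarded_to_siem"]:
            findings.append(f"{src['name']}: not forwarded to central SIEM")
        if today - src["oldest_record"] < RETENTION:
            findings.append(f"{src['name']}: retention shorter than 90 days")
    return findings
```

The point isn't this particular script; it's that a partner with implementation experience should be able to walk through exactly this kind of source-by-source inventory for your environment.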
For enclave strategies, Devine emphasized that assessments always start with network diagrams and data flow diagrams. With an enclave solution, you should expect to see both enterprise and enclave pieces that are separate and distinct. If partners can't readily produce these or haven't considered whether your solutions are FedRAMP Moderate Authorized or equivalent, they're planning to figure it out after you've signed the contract.
Don't ask partners what they can do in general. Ask them to walk through what they would do specifically for you. Partners with real experience will sketch architectures, ask clarifying questions about your current infrastructure, and identify challenges specific to your setup. Partners without it will speak in abstractions and assure you they handle this all the time.
Phillips emphasized asking how many companies they've helped achieve Level 2 certification as a baseline filter. But go deeper: Have they certified companies in your industry? Your size? With your technical stack?
Devine was explicit about this: "We don't want your assessor to learn your tech on your dime." He recommends asking whether assessors have already assessed systems with your specific technology. An assessor who's only worked with MS365 GCC High shouldn't be learning Google Workspace during your assessment.
Garcia shared a story that illustrates professional integrity: A C3PAO came back days after an interview and declined an engagement because they didn't understand the client's tech stack. The client wasn't happy about it, but Garcia understood why—they didn't want that C3PAO learning on the job or misinterpreting requirements.
The market punishes this honesty, rewarding partners who confidently claim universal expertise over those who acknowledge limitations. Don't be one of the contractors reinforcing that incentive. Value partners who clearly articulate what they know, what they don't, and how they'd approach what's necessary.
Devine pointed out that many organizations working toward CMMC don't have dedicated IT or cyber staff, so consultants are talking to business leaders who don't understand technical jargon. "If people come at you with acronyms and they're really just talking techy, that might not be the best choice for you," he noted. You need support, but you also want to understand what they're putting in place.
This isn't about dumbing concepts down; it's about ensuring business leaders can evaluate whether proposed solutions are operationally viable for their specific workflows before implementation begins. If your team can't collaborate the way they need to, they'll find workarounds. Those workarounds will either break your compliance posture or break your operations.
For contractors working with Managed Service Providers, Garcia identified what he considers the most critical document: the Shared Responsibility Matrix. This document tells you exactly what the MSP's responsibility is, what the organization needs to focus on, and what they need to share in terms of implementation.
The absence of this document creates assessment failures. Garcia reported seeing many situations during the JSVA program where MSPs weren't required to have such documents, which extended assessment timelines by days and sometimes weeks.
But the deeper problem is what the missing document reveals. An MSP without a Shared Responsibility Matrix hasn't thought systematically about where their services end and client obligations begin. Controls fall through gaps between what the MSP thinks you're handling and what you think they're handling. You discover these gaps during assessment, when it's too late to remediate without delays.
Before signing any MSP agreement, demand to see their standard Shared Responsibility Matrix. If they don't have one, they're not ready to support your CMMC journey—regardless of their technical capabilities or pricing.
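To make Garcia's point concrete, a responsibility matrix can be expressed as data so coverage gaps surface before assessment rather than during it. A minimal sketch: the control IDs below are real CMMC Level 2 practice identifiers, but the owner assignments are invented for illustration and say nothing about how a real MSP would split them:

```python
# Minimal Shared Responsibility Matrix sketch, expressed as data so gaps
# can be checked mechanically. Owner assignments are illustrative only.
matrix = {
    "AC.L2-3.1.1": "msp",       # access control enforcement
    "AU.L2-3.3.1": "shared",    # audit log generation and review
    "IR.L2-3.6.1": "customer",  # incident-handling capability
}

VALID_OWNERS = {"msp", "customer", "shared"}

def find_gaps(required_controls, matrix):
    """Return controls with no assigned owner or an invalid owner value.
    These are the controls that fall through the MSP/client gap."""
    return sorted(
        ctrl for ctrl in required_controls
        if matrix.get(ctrl) not in VALID_OWNERS
    )
```

An MSP that can hand you something equivalent to this, with every applicable control assigned, has thought through where their services end; one that can't is leaving you to discover the gaps during assessment.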
Recommended Reading: Virtru Shared Responsibility Matrix for CMMC 2.0
Technical competence is necessary but insufficient. You also need partners whose approach aligns with how CMMC actually works.
Phillips distinguished between two philosophies. Some assessors, particularly those from ISO 9001 quality audit backgrounds, operate under the principle that they haven't done their job unless they've found something wrong—they're not leaving until they find the needle in your haystack.
But CMMC is explicitly structured as a maturity model. You're expected to improve over time, not be perfect at assessment. Devine emphasized that organizations should look for C3PAOs who understand that this is a daunting task and that it's a maturity model matrix for a reason; you're expected to be getting better over time. If someone's coming across like they're looking to fail you, go to somebody else who's willing to work with you.
This doesn't mean seeking easy assessors who overlook deficiencies. It means finding partners who approach assessment as verification of good-faith implementation rather than treasure hunts for disqualifying flaws.
You can't determine this from marketing materials. Have actual conversations with lead assessors to understand their philosophy. Do they communicate collaboratively or defensively? Do they explain requirements in plain English or hide behind acronyms? How they talk about other companies' assessments reveals how they'll treat yours.
Phillips highlighted a vulnerability most contractors miss: cybersecurity is sticky. Once a provider has worked with you for months and helped with implementation, Phillips has seen organizations become beholden to their IT or cybersecurity providers. They feel like they want to fire them but can't, because they don't know what kind of damage that person might do on the way out or what skeletons they have buried.
Your implementation partner should be working themselves out of a job, not creating dependency. Documentation should be specific enough that you could transition to a different provider without operational disruption. Phillips emphasized that everything should be documented so it's not just in the IT person's head, where someone brand new can come in and understand that system and how to maintain it.
If partners resist this level of documentation detail, they're building job security, not organizational security.
Use contract negotiation as extended due diligence, not just legal formalization. The terms partners resist reveal operational maturity and confidence.
Include mutual separation clauses without severe penalties. Phillips recommends having some type of clause where both parties can go their separate ways without severe penalties. This tests partner confidence—those who insist on long-term lock-in are signaling they expect clients to want to leave.
Require detailed statements of work. Devine emphasized that SOW documents should show competence around what the plan of action is, documenting the projects that will take you from wherever you are to a perfect SPRS score of 110, ready for assessment. Partners who resist detailed SOWs either don't have repeatable processes or don't want to commit to timelines they're uncertain about.
Demand accreditation maintenance. Garcia noted that C3PAOs need to go through an accreditation process with Cyber AB to meet ISO 17020 requirements. Contracts should incorporate terms and conditions requiring that the C3PAO maintains its accreditation and authorization by Cyber AB to perform those assessments.
Define clear decision points. Garcia emphasized that for C3PAO contracts, phases should be well-defined with clear decision points. If during Phase 1 readiness assessment you're not prepared, you want the ability to stop at that point, get some time, and delay your assessment without contractual penalties.
Phillips offered the simplest and most overlooked evaluation method: "Same concept as when most of y'all go to a new restaurant: what do you check? The reviews." Google your potential C3PAOs, assessors, and MSPs to see what customer reviews and testimonials they have.
Ask for references. Check LinkedIn for other contractors who've worked with them. Look for patterns across multiple sources. Not because reviews are perfect signals, but because patterns across reviews reveal truths that individual sales conversations obscure.
If a partner has conducted dozens of assessments but has no public testimonials, no case studies, and no contractors willing to be references, that absence is information.
Garcia described seeing cases where companies get certified using a specific tool or environment or solution, and then they have to go back and implement something else because it doesn't serve their purpose—they can't collaborate or work together effectively. The solution becomes operationally incompatible with how the business actually functions.
This is the real cost of treating partner selection as a traditional procurement decision. You optimize for audit performance rather than operational sustainability. You get compliant systems that can't support actual business operations.
The organizations that select partners using traditional procurement logic—lowest price, biggest brand, most confident pitch—are the ones discovering six months after certification that their team can't collaborate, their documentation bears no resemblance to actual operations, and they're dependent on an IT provider they can't fire without operational risk.
Meanwhile, the organizations that invert their evaluation criteria—prioritizing operational specificity over comprehensive claims, valuing honesty about limitations, demanding documentation that enables independence—are both compliant and operationally sustainable.
Devine summed up what contractors need to demand: partners who understand not just documentation requirements, but your business and your tech stack, and how you're handling CUI.
With fewer than 800 companies having achieved Level 2 certification when these practitioners convened, we're still in the early phases of CMMC ecosystem maturity. The partner landscape is immature, quality is wildly inconsistent, and the traditional procurement heuristics that work for everything else will fail you here.
The question is whether you recognize this early enough to choose differently, or whether you learn it the expensive way, six months into an implementation that checked all the traditional procurement boxes but missed the actual point.
Your CMMC partner selection isn't just another vendor decision. It's the decision that determines whether your compliance investment builds security capability or just creates another dependency with a certificate attached.
Start evaluating CMMC partners like you're selecting people who need to understand your business, operate sustainably within your workflows, and build systems you can actually maintain after they leave.
Because that's exactly what they need to do.
DCMMC is an annual practitioner-led, vendor-neutral CMMC community event focused on delivering real security outcomes for the DIB, and a chance to connect with defense peers in DC. Learn more and watch the other panels here.