A PE-backed industrial company hired me to sharpen their go-to-market targeting. The executive sponsor had a blunt assessment: "Our ICP targeting and campaign structure is pretty weak." He wasn't wrong. But the problem was deeper than targeting. They didn't have an ICP at all. They had scattered assumptions living in sales decks, engineering presentations, and one executive's head.
Gartner reports that only 42% of companies have a formally documented ICP. Most ICP development guides assume clean CRM data, product usage metrics, and digital buying signals. What happens when your customers buy through 12-18 month evaluations, your data lives in PowerPoints and an ERP system, and your buying committee includes engineers who will never fill out a form?
I'm sharing the five-step process I ran, including the deep research prompts, because the methodology matters more than the tools. The dynamics here -- PE growth targets, fragmented data, a buying committee where the technical evaluator holds the veto -- show up everywhere outside pure-play inbound SaaS. If any of that sounds familiar, keep reading.
Step 1: Audit Existing Customer Data (When "Data" Means PowerPoints and Tribal Knowledge)
The company had a CRM, an ERP, and a strategy folder with roughly 2GB of presentations. No single source of truth for customer attributes.
The CRM captured "strategic" sales -- contacts, companies, a multi-stage deal pipeline built to satisfy PE reporting. But transactional orders where customers sent POs directly, plus web store purchases, bypassed the CRM entirely. The ERP captured the transactional side but didn't reconcile with CRM in a way that covered the gap. The executive team had a manual workaround to bridge the two systems. It was better than nothing and worse than adequate.
The strategy folder contained sales decks, market analyses, and customer presentations dating back years. Valuable patterns buried inside binary files that no search engine could reach.
Here's the audit framework I used:
- Map every data source. CRM, ERP, file repositories, email, shared drives. List what each one captures and what it misses. The gaps between systems are usually where the most useful ICP signals are hiding.
- Quantify the coverage gap. Which customers are in the CRM? Which bypass it entirely? For this company, a significant share of repeat orders never touched the CRM. If you've only been analyzing CRM data, your ICP is built on an incomplete picture.
- Convert tribal knowledge to structured data. We converted roughly 50 presentations from PowerPoint and PDF to searchable text. That extraction alone surfaced patterns the team had been operating on intuitively but never documented: which industries kept showing up in closed-won deals, which use cases generated repeat orders, which customer types churned after the first engagement.
- Set the baseline honestly. Don't pretend the data is cleaner than it is. Acknowledging the gaps is step one to building around them.
Your version of this might be Gong recordings, Salesforce notes, or Slack threads with your best AEs. The format differs. The buried knowledge is the same. The audit isn't about finding clean data -- it's about mapping where the knowledge actually lives and deciding which source is canonical.
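The coverage-gap step above is simple enough to script. Here's a minimal Python sketch; the account names are hypothetical stand-ins for real CRM and ERP exports, and in practice you'd load the two sets from CSV dumps of each system.

```python
# Quantify how many paying customers never appear in the CRM.
# The sample account IDs below are hypothetical, not real customer data.

def coverage_gap(crm_accounts: set[str], erp_customers: set[str]) -> dict:
    """Compare CRM account IDs against customers with revenue in the ERP."""
    covered = erp_customers & crm_accounts
    missing = erp_customers - crm_accounts   # bought, but invisible to CRM
    return {
        "erp_customers": len(erp_customers),
        "in_crm": len(covered),
        "missing_from_crm": sorted(missing),
        "coverage_pct": round(100 * len(covered) / len(erp_customers), 1),
    }

crm = {"acme", "globex", "initech"}
erp = {"acme", "globex", "hooli", "vandelay"}   # includes direct-PO buyers

report = coverage_gap(crm, erp)
print(report)   # coverage_pct: 50.0 -- half the buying base invisible to targeting
```

The number that matters is `coverage_pct`: it tells you how much of your actual revenue base any CRM-only analysis is silently ignoring.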
Step 2: Interview Internal Stakeholders (The Step Most Teams Skip)
I interviewed the executive sponsor, sales leadership, and technical leads. Each group held a different piece of the picture.
The executive sponsor saw the business from the PE board's perspective: which segments drove growth, which acquisitions created adjacencies, where the revenue targets pointed.
Sales leadership knew which deals closed and which stalled. They described the "good" customer in behavioral terms: "They come to us when their in-house team can't get from concept to delivery." That one sentence was more useful than any firmographic database.
Technical leads knew the product fit: which applications mapped to the company's strengths and which were stretch plays that consumed resources without generating margin. One told me, "We win when the customer has tried to solve the problem internally and realized they need someone who's done it before. We lose when they think they just need extra hands."
That distinction -- expertise-seeking versus capacity-seeking buyers -- shaped the entire disqualification framework. It's one of the most portable things I took from this engagement. If you sell anything complex, your best customers are probably seeking expertise, not capacity. And the ones that look promising but stall? Usually capacity-seekers who wanted a cheaper pair of hands, not a partner.
When the ICP Owner Can't Query a Database
The executive sponsor wanted frameworks before tools. Conceptual understanding before execution details. This matters because the person who owns the ICP often isn't the person who can configure an enrichment waterfall or write a CRM filter.
I made a mistake early: I showed him a detailed capabilities matrix when he needed a simple visual showing how the pieces fit together. Once I switched to frameworks first, tool details second, the feedback loop accelerated. If the executive who's accountable for your ICP can't give feedback without needing a technical translator, the format is wrong.
Interview the humans first. Query the systems second. Build the ICP in whatever format lets the decision-maker iterate on it.
Step 3: Map the Buying Committee (Who Has the Veto?)
In this engagement, the buying committee looked different from the standard Champion/Economic Buyer/Technical Evaluator model:
- Senior Technical Leader -- The technical evaluator AND often the champion. If they say no, the deal is dead regardless of executive support.
- Procurement/Supply Chain -- Focused on vendor qualification, pricing, and delivery terms. Can kill deals on compliance or cost basis alone.
- VP/SVP or CTO -- Budget authority. Often rubber-stamps the technical leader's recommendation but needs a strategic narrative about why this vendor, why now.
- Program Manager -- Controls budget for a specific program. Timeline-driven. Cares about delivery milestones, not platform architecture.
The person who signs the check wasn't the person who controlled vendor selection. The senior technical leader held the effective veto. Without their buy-in, no amount of executive sponsorship moved the deal forward.
Choosing the Primary Persona
We chose the senior technical leader as the entry point. Three validated pains drove that choice:
- Talent gap. Persistent inability to hire specialized engineers fast enough. There's a sweet spot here: when the in-house team is stretched but not absent. Too small a team means too risky to outsource. Full team means no need. Stretched-but-present is where the partner conversation starts.
- Schedule pressure. Complex programs with hard deadlines and financial penalties for late delivery. The technical leader feels this directly. Their reputation is on the line.
- Compliance risk. Growing regulatory requirements. Fear of failures, costly recalls, personal professional liability.
These pains were specific enough to drive messaging and signal detection, but broad enough to apply across the company's core segments. The principle generalizes: find the person with the kill switch, and build the ICP around them. Not around the person who signs the check.
Step 4: Build Hypothesis-Driven Profiles (Pains, Triggers, Decision Criteria)
Most ICP guides tell you to build a profile. I think that's backwards. I structured it as a set of hypotheses to validate.
- Company characteristics (hypothesis): Mid-market companies with engineering gaps in a specialty domain. Transitioning from legacy approaches to next-generation technology. In-house teams that are stretched, not absent.
- Disqualification signals (hypothesis): Companies with mature in-house specialized teams (they don't need a partner). Companies too small to afford the engagement model. Companies outside the relevant technical domain (wrong fit).
- Trigger events (hypothesis): New government funding plus open specialist roles (budget confirmed, capacity gap confirmed). Recent acquisition creating integration pressure. Regulatory action in their product category.
The difference matters. "We think mid-market companies with talent gaps are our best fit" is a hypothesis. "We've validated that 7 of our top 10 accounts match this profile and renew at 2x the rate" is a finding. Most ICP projects stop at the hypothesis and call it done.
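One way to keep the profile hypothesis-shaped rather than final is to encode each claim with its validation status, so "hypothesis" and "finding" stay visibly distinct. This is an illustrative Python sketch; the field names and example claims are assumptions for demonstration, not the client's actual criteria.

```python
from dataclasses import dataclass, field

@dataclass
class ICPHypothesis:
    """A single testable ICP claim plus its current evidence status."""
    claim: str
    kind: str                      # "characteristic" | "disqualifier" | "trigger"
    status: str = "hypothesis"     # upgraded to "validated" or "rejected"
    evidence: list[str] = field(default_factory=list)

    def validate(self, finding: str) -> None:
        self.evidence.append(finding)
        self.status = "validated"

# Hypothetical profile entries mirroring the three hypothesis types above.
profile = [
    ICPHypothesis("Mid-market, stretched in-house engineering team", "characteristic"),
    ICPHypothesis("Mature in-house specialist team", "disqualifier"),
    ICPHypothesis("New government funding + open specialist roles", "trigger"),
]

# A finding from the validation step upgrades a hypothesis to a claim.
profile[0].validate("7 of top 10 accounts match; renew at 2x the base rate")

print([(h.kind, h.status) for h in profile])
```

Anything still marked `"hypothesis"` at the end of the engagement is explicitly unvalidated, which is exactly the distinction most ICP projects lose.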
How PE Ownership Changes ICP Work
PE ownership changed the dynamics in ways that don't show up in standard guides.
The board expected aggressive growth from recent acquisitions. The ICP wasn't an academic exercise -- it was the targeting logic behind a growth plan with a timeline and a board presentation attached. The executive needed an ICP that could feed CRM segments, not a PDF on a shared drive. When the board asks "who are we targeting and why?", the answer has to come from pipeline data.
And here's what most ICP guides miss about PE-backed companies: acquisitions mean the ICP keeps changing. The "what we can sell" answer keeps evolving. The executive wanted what he called "a closed loop methodology" -- something he could run repeatedly with active feedback, not a one-time deliverable that starts decaying the moment it ships.
Any company with aggressive growth targets and a board to report to needs an ICP that converts directly into pipeline operations: segments, scoring, campaign triggers. If your ICP can't be loaded into a CRM filter, it's a strategy doc, not an operational tool.
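To make "loadable into a CRM filter" concrete, here's a sketch using SQLite as a stand-in for a CRM's account table. The column names, thresholds, and companies are all hypothetical; the point is that every ICP criterion becomes a machine-checkable condition.

```python
import sqlite3

# SQLite stands in for the CRM here; schema and thresholds are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE accounts (
    name TEXT, employees INTEGER, domain TEXT, open_specialist_roles INTEGER)""")
con.executemany(
    "INSERT INTO accounts VALUES (?, ?, ?, ?)",
    [
        ("Acme Robotics", 450, "industrial-automation", 4),
        ("TinyShop", 12, "industrial-automation", 0),   # too small: disqualify
        ("BigSoft", 900, "consumer-apps", 5),           # wrong domain: disqualify
    ],
)

# The ICP expressed as a filter, not a strategy doc:
icp_filter = """
    SELECT name FROM accounts
    WHERE employees BETWEEN 100 AND 2000      -- mid-market
      AND domain = 'industrial-automation'    -- relevant technical domain
      AND open_specialist_roles >= 3          -- talent-gap signal active
"""
print([row[0] for row in con.execute(icp_filter)])   # ['Acme Robotics']
```

If a criterion can't be written as a WHERE clause (or the equivalent CRM list filter), it either needs a data source that makes it checkable or it belongs in the messaging doc, not the targeting logic.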
Step 5: Validate With Data Signals
With hypotheses defined, we built a signal detection framework to replace assumptions with evidence.
| Signal Category | Example Data Sources | Why It Matters |
|---|---|---|
| Specialist hiring surge (3+ open roles) | LinkedIn Jobs, Indeed | Talent gap confirmed. They need external capacity. |
| New government funding in their domain | SBIR.gov, USAspending.gov, grant databases | Hard deadline, likely understaffed, budget confirmed. |
| Recent contract with delivery penalties | Procurement portals, public filings | Schedule pressure quantified in dollar terms. |
| Regulatory actions in their product category | Relevant regulatory databases | Safety/compliance pressure activated. |
| First-ever hire for a specialty leadership role | LinkedIn Jobs, company announcements | Company scaling into new capability. Greenfield. |
The signal categories came directly from the pains we validated in Steps 2 and 3. These aren't random firmographic filters. They're evidence that a specific pain is active right now for a specific company.
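A simple way to operationalize a table like this is a weighted score per account, so the sales team gets a ranked list rather than a raw filter dump. The weights and accounts below are illustrative assumptions; in practice you'd tune the weights against closed-won data.

```python
# Illustrative weights per signal category -- assumptions, tune against real deals.
SIGNAL_WEIGHTS = {
    "specialist_hiring_surge": 3,   # 3+ open specialist roles
    "new_government_funding": 4,    # budget confirmed, hard deadline
    "contract_with_penalties": 3,   # schedule pressure in dollar terms
    "regulatory_action": 2,         # compliance pressure activated
    "first_specialty_hire": 2,      # greenfield capability build-out
}

def score(account_signals: dict[str, list[str]]) -> list[tuple[str, int]]:
    """Rank accounts by the summed weight of their active signals."""
    ranked = [
        (account, sum(SIGNAL_WEIGHTS.get(s, 0) for s in signals))
        for account, signals in account_signals.items()
    ]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# Hypothetical detection results for three accounts.
detected = {
    "acme": ["specialist_hiring_surge", "new_government_funding"],
    "globex": ["regulatory_action"],
    "hooli": ["contract_with_penalties", "first_specialty_hire"],
}
print(score(detected))   # acme ranks first: two stacked pains, score 7
```

Stacked signals answering "who should I call next?" is the whole point: two active pains on one account beat one pain on three.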
How AI Tools Accelerated the Research
We used Clay for company enrichment and signal detection at scale. The real value was waterfall enrichment: try one source, fall back to another, build a complete picture from partial data across multiple providers.
We connected the CRM via API for existing customer analysis -- pulled company attributes, deal history, and contact roles to validate the persona hypothesis against real data. Do our best customers actually match the profile we built from interviews?
We used AI research agents to scan government databases, regulatory sites, and job boards for intent signals matching our hypotheses. The goal wasn't to replace the executive's judgment. It was to give him evidence for decisions he was already making on instinct.
What didn't work: The premium enrichment API we wanted wasn't available at the company's pricing tier. We fell back to a more affordable tool for contact enrichment -- less comprehensive but fast enough for validation. Also, our first pass at regulatory signal detection pulled too broadly. We had to narrow the criteria twice before the signal-to-noise ratio was useful for outbound prioritization. Budget time for iteration. Not every signal works on the first attempt.
Deep Research Prompts
When you need to build an ICP from scratch and the company doesn't have clean data, AI deep research tools (ChatGPT Deep Research, Gemini Deep Research) can compress weeks of manual research into hours. Here's how I structure the prompts.
Company Deep Research Prompt:
7-section structure. Adapt the variables.
```
You are a company researcher conducting exhaustive analysis of {COMPANY}
operating in {DOMAIN}. Produce a comprehensive intelligence document.

RESEARCH FRAMEWORK:

1. Company Foundation & Strategic Position
   - Founding narrative, evolution, key milestones
   - Financial standing (funding, metrics, unit economics)
   - Leadership deep dive (background, philosophy, public statements)
2. Market Position & Competitive Landscape
   - Industry dynamics, regulatory shifts, technology trends
   - Direct competitors with positioning, strengths, weaknesses
   - Feature comparison and differentiation analysis
3. Ideal Customer Profile & Buyer Personas
   - Customer segments with pain points and trigger events
   - Buyer personas (role, goals, frustrations, decision criteria)
   - Buying process and committee mapping
4. Technology & Operations
   - Tech stack, development approach, integration points
   - Team structure, hiring signals, culture indicators
5. Customer Voice & Market Sentiment
   - Review analysis (G2, Capterra, industry sites)
   - Case studies and proof points
6. GTM Readiness Assessment
   - Current acquisition motion and gaps
   - Segment prioritization by traction and revenue potential
7. Strategic Summary
   - Top 5 GTM priorities, segment ranking, key risks

QUALITY RULES:
- Tag findings: [FACT] (2+ sources), [SIGNAL] (1 source), [INFERENCE]
- If unavailable, state it -- never fabricate
- Prioritize sources from last 12-18 months
- Quantify: numbers over adjectives
```
Competitive Landscape Prompt (Category Warfare mode):
Per-competitor deep dives work for 1-3 competitors. For 4+, I switch to what I call "Category Warfare" -- cluster by structural business model, not alphabetically. This surfaces cluster-level vulnerabilities that individual profiles miss.
```
Analyze the competitive landscape for {CATEGORY}.

STRUCTURAL PRINCIPLES:
1. Cluster by business model (e.g., "full-service platforms" vs
   "point solutions" vs "services-first") -- NOT alphabetically
2. Force economic modeling: every claim connects to cost structure,
   margin sensitivity, or revenue exposure
3. Evidence tagging mandatory: [FACT], [SIGNAL], [INFERENCE]

FOR EACH CLUSTER:
- Shared business model and customer profile
- Revenue mix and margin structure (if observable)
- Technology approach and dependencies
- Cluster-level vulnerability patterns

CROSS-CLUSTER ANALYSIS:
- Structural Decoupling Matrix: compare all competitors on 10
  dimensions (time-to-insight, marginal cost, scalability ceiling,
  automation depth, etc.)
- AI Disruption Impact: which revenue streams are automatable?
- Reaction Forecasting: how will each cluster respond to disruption?

OUTPUT: Cluster map, vulnerability matrix, sales doctrine per cluster
```
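The clustering principle the prompt enforces is mechanical enough to sketch: group competitors by a structural key (business model) rather than by name. The companies and model labels below are hypothetical.

```python
from collections import defaultdict

# Hypothetical competitor list -- the business-model key drives the clusters.
competitors = [
    ("AlphaCo", "full-service platform"),
    ("BetaWorks", "point solution"),
    ("GammaLabs", "services-first"),
    ("DeltaSoft", "point solution"),
]

clusters: dict[str, list[str]] = defaultdict(list)
for name, model in competitors:
    clusters[model].append(name)

# Each cluster, not each company, gets the vulnerability analysis.
for model, names in clusters.items():
    print(f"{model}: {names}")
```

With four or more competitors, analyzing three clusters instead of seven companies is what makes the cross-cluster vulnerability comparison tractable.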
The prompt is maybe 20% of the value. The other 80% is knowing what to do with the output -- how to validate findings against real customer data, how to turn raw research into testable ICP hypotheses, how to translate it into CRM segments and campaign triggers. The prompts compress the research phase. The methodology is what makes the research useful.
What Changed: Before vs. After
Before: "We sell to anyone who needs our capabilities." No segments, no prioritization, no disqualification criteria. The executive couldn't tell his board who the company was targeting or why. Pipeline reviews were anecdotal.
After: A living document in the company's knowledge repository containing:
- Validated vertical segments with specific company characteristics and disqualification criteria. The team could look at a prospect and say "this is segment 2" or "this is a disqualify" without asking the executive.
- A primary persona with documented pains, verbatim language from interviews, and a buying committee map showing how the technical evaluator, procurement, executive sponsor, and program manager interact on a typical deal.
- Validated messaging concepts tied to specific intent signals. Not "we should talk about talent gaps." Instead: "When we see 3+ specialist roles open on LinkedIn, send this sequence with this value proposition."
- A signal detection table the sales team could use to prioritize accounts without asking "who should I call next?"
- Disqualification criteria that saved the team from chasing accounts that looked good on paper but would never close.
Within six weeks, the team had three segmented outbound campaigns running where they'd previously had one generic motion. The executive could tell his board: "Here's who we're going after, here's why, and here's how we'll know when they're ready to buy." Not closed revenue yet -- but operational clarity the whole organization could execute against.
What I'd Do Differently
Ask "what do you have?" before "what should we build?" I started building the ICP framework before asking what customer data the executive already had on hand. He had data he could have shared earlier. Always start with what exists.
Lead with frameworks, not tools. I showed the executive a long capabilities list when he needed a simple visual of how the pieces fit. Non-technical executives need the mental model before the mechanics. Once the framework clicked, the tool conversations got productive fast.
Budget more time for data reconciliation. The gap between CRM and ERP was bigger than expected. Two systems that "talk to each other" can still leave a significant share of your customer interactions invisible to targeting logic. The ICP is only as good as the data underneath it.
Run deep research on day one. The prompts I shared above should have run in week one. They compress so much context that everything downstream -- interviews, hypothesis building, signal design -- gets sharper when you start with that foundation. I ran them mid-engagement. Next time, day one.
The ICP development process for an industrial company isn't fundamentally different from building one for SaaS. The inputs are messier, the buying cycles are longer, the terrain is more technical. But the core discipline is the same: start with what you know, structure it as hypotheses, validate with signals, and build for iteration. The 58% of companies that skip this work are targeting everyone -- which means they're targeting no one.
The 5 Steps in 30 Seconds
- Audit your data. Map every source of customer knowledge. Assume it's messier than you think.
- Interview stakeholders. Executive for strategy, sales for behavior, technical leads for product fit.
- Map the buying committee. Find the person with the veto. Build the ICP around them.
- Structure as hypotheses. Company characteristics, trigger events, disqualification criteria. Testable, not final.
- Validate with signals. Intent data, job postings, public records, AI-accelerated research. Build for iteration.
By Victor Sowers, STEEPWORKS
