An anti-smoke scorecard is a technical template designed to evaluate SEO and Google Ads proposals in just 30 minutes using measurable criteria and verifiable deliverables. The system assigns a score to each strategic block so you can identify unsubstantiated ranking promises, missing methodology, and transparency risks around account ownership. In this article, you'll find a step-by-step guide to conducting the audit, the key questions that force providers to be specific, and a summary table to help you make an informed decision based on real evidence.
Why you need an anti-smoke scorecard
In procurement processes, smoke and mirrors appear as ranking promises, vague language, and intangible deliverables. An anti-smoke scorecard transforms feelings into measurable criteria: what is audited, what is delivered, how it is measured, and who controls access. If you can't verify or replicate a claim, you're on risky ground. A structured system reduces bias, accelerates decisions, and protects the budget.
Signs of vague proposals
Before you open Excel, learn to spot five recurring red flags: ranking promises, lack of verifiable deliverables, absence of methodology, superficial reporting, and optimization without evidence. Each flag has specific manifestations, which I explain below so you can demand immediate proof.
Ranking promises
When a provider says "you'll reach the top" or "we'll position you in X weeks", ask for the evidence: Which keywords, and measured how? What authority benchmarks? What analysis of competitors who have already tried? If the answer is vague, mark it as a penalty on the anti-smoke scorecard.
Lack of verifiable deliverables
A verifiable deliverable is a document, access point, or screenshot with replicable steps: a technical audit with a checklist and screenshots, a prioritized list of URLs, CMS change logs, writing templates with briefs and examples, and evidence of A/B testing. If the proposal only lists 'on-page improvements' without listing specific files, that's a red flag.
Lack of methodology
Ask for a clear workflow: diagnosis, hypothesis, prioritization, implementation, measurement. If they don't explain how they generate hypotheses (e.g., from Search Console data and intent analysis), there's no scientific traction, only intuition.
Superficial reporting
Useful reports connect metrics to decisions: impressions/CTR by intent, converting pages, tests, and results with time-based granularity and assigned responsibilities. Reports that simply show "organic traffic is up" without breakdowns are for internal marketing, not for decision-making.
Optimization without evidence
If they promise to "optimize for AI" or "improve relevance" without showing how they will measure entity signals, links, or content changes, ask for concrete examples: before/after rich results, entity maps, and documented evidence of the changes made.
Anti-smoke Scorecard Blocks
The anti-smoke scorecard is divided into six sections: initial diagnosis, strategy, implementation, measurement, transparency, and risks. Each section has criteria, a weighting, and a color-coded system (green/yellow/red). Below, I describe what to evaluate in each section and how to score it.
Initial diagnosis
What they audit and with what tools: server logs, Search Console, Google Analytics/GA4, Screaming Frog, backlink tools, and intent analysis with keyword data. A useful diagnosis includes quantified findings, a list of critical URLs, and priorities with estimated impact. If the proposal doesn't specify the tools or deliverables, deduct points on the anti-smoke scorecard. To see how SEO is structured at the website level, review process examples from effective web positioning that clarify technical deliverables.
Strategy
The strategy should translate findings into improvement hypotheses: for example, "improving titles on 30 pages with transactional intent will result in a +X% CTR". It should include prioritization by impact and effort (I/E matrix). If the strategy is generic or lacks quantifiable hypotheses, apply a yellow or red rating, depending on severity.
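The I/E prioritization above can be sketched in a few lines. This is an illustrative example, not a vendor's actual method: the hypotheses and 1–5 impact/effort ratings are invented for the demonstration.

```python
# Impact/effort (I/E) prioritization sketch. Hypotheses and ratings
# are hypothetical examples, not real audit findings.
hypotheses = [
    {"change": "Rewrite titles on 30 transactional pages", "impact": 5, "effort": 2},
    {"change": "Consolidate thin category pages", "impact": 3, "effort": 4},
    {"change": "Fix broken internal links", "impact": 2, "effort": 1},
]

# Rank by impact-to-effort ratio: high-impact, low-effort items come first.
ranked = sorted(hypotheses, key=lambda h: h["impact"] / h["effort"], reverse=True)
for i, h in enumerate(ranked, 1):
    print(f"{i}. {h['change']} (I/E = {h['impact'] / h['effort']:.1f})")
```

A provider who works this way can hand you the ranked list itself as a deliverable, which makes the prioritization reproducible and auditable.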
Execution
The proposal should describe what will be implemented, who will do it, and in what format it will be delivered (Jira tickets, staging changes, commits, draft list). Implementation without ownership or verifiable steps is just smoke and mirrors. A good proposal lists technical, on-page, and content changes, along with estimated timelines.
Measurement
Detail what is tracked and how: conversion goals, GA4 events, funnels, and tests. Measurement should include monitoring windows, intent-based segmentation, and quality metrics (bounce rate, time on page, relevant events). Periodic reports should show actions taken and future decisions, not just trend graphs.
Transparency
Access and ownership: who maintains accounts, what is ultimately handed over, and how ownership is transferred. Request read-only or administrator access as needed and define clauses regarding content and data ownership. A lack of clarity here is a strong indicator of dangerous dependency.
Risks and limits
A good vendor identifies dependencies (internal resources, Ads budget, IT deployment time) and sets limits: what the contract doesn't cover. If they don't mention risks or assumptions, give them a low score on the anti-smoke scorecard.
Scoring and traffic light template
Assign weights to each block (example: Diagnosis 15%, Strategy 20%, Execution 25%, Measurement 20%, Transparency 10%, Risks 10%). Score 0–5 per criterion. Convert the total to a traffic light: 80–100% green, 55–79% yellow, below 55% red. The anti-smoke scorecard prevents decisions based on intuition and turns evidence into a single, comparable figure.
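The weighting and traffic-light conversion described above fit in a short script. The block names and example scores are assumptions for illustration; the weights and thresholds are the ones proposed in this article.

```python
# Scorecard math sketch: weighted 0-5 scores converted to a percentage
# and a traffic light. Example scores below are hypothetical.
WEIGHTS = {
    "diagnosis": 0.15, "strategy": 0.20, "execution": 0.25,
    "measurement": 0.20, "transparency": 0.10, "risks": 0.10,
}

def traffic_light(scores: dict[str, float]) -> tuple[float, str]:
    """Convert per-block 0-5 scores into a weighted percentage and a light."""
    pct = 100 * sum(WEIGHTS[block] * score / 5 for block, score in scores.items())
    if pct >= 80:
        light = "green"
    elif pct >= 55:
        light = "yellow"
    else:
        light = "red"
    return round(pct, 1), light

print(traffic_light({"diagnosis": 4, "strategy": 3, "execution": 4,
                     "measurement": 3, "transparency": 5, "risks": 2}))
# → (70.0, 'yellow')
```

Keeping the weights in one place makes it easy to replicate the evaluation across several reviewers and compare providers on the same scale.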
Key questions that require specificity
When you receive a proposal, use questions that elicit details. Here's a prioritized list to use in 30-minute interviews. Each question should be accompanied by a specific request for evidence or a deliverable.
- What will you audit first, and in what format will you deliver the report? Request a sample file or screenshots of the diagnostic process.
- What tools will you use, and why those specific tools? Request exports or dashboard examples.
- What will you change first, and why that priority? Request a prioritized list with estimated impact.
- How will you measure success? Request concrete events, timeframes, and success criteria.
- What won't you do? A clear answer about exclusions prevents inflated expectations.
If a response doesn't include a deliverable or example, deduct points from the anti-smoke scorecard. In practice, this means the prioritization must rest on reproducible methods (e.g., an impact/effort matrix or traffic-by-intent analysis).
Scorecard Summary Table
| Block | Key criterion | Minimal evidence |
|---|---|---|
| Diagnosis | Tools and quantified findings | Report with URL lists and screenshots |
| Strategy | Hypotheses + prioritization | Impact/effort matrix and roadmap |
| Execution | Ownership and verifiable steps | Tickets, staging changes, or change logs |
| Measurement | Events, objectives, and control window | Dashboard and test plan |
| Transparency | Access and account ownership | Access list and transfer clauses |
| Risks | Dependencies and exclusions | Written assumptions and scope limits |
Boost your business with Agencia Roco
Receive a free consultation to identify opportunities in your positioning, campaigns, and sales funnel. We'll provide you with a prioritized plan to attract leads and convert them into customers.
Quick checklist for a 30-minute review
Use this practical workflow to evaluate proposals in half an hour: 1) request diagnostic deliverables; 2) ask for the initial hypothesis and its prioritization; 3) confirm what they will implement and how tasks will be delegated; 4) verify the measurement plan; 5) review access and ownership; 6) assess risks. If any point lacks evidence, apply a penalty in the anti-smoke scorecard and request written clarification.
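The six-step review above can be modeled as a simple checklist evaluation. This is a hedged sketch: the checkpoint labels mirror the workflow in this article, and the example evidence map is invented.

```python
# Sketch of the 30-minute review: mark each checkpoint as evidenced
# or not, and collect the penalties to raise in writing.
CHECKPOINTS = [
    "diagnostic deliverables",
    "initial hypothesis and prioritization",
    "implementation plan and task ownership",
    "measurement plan",
    "access and ownership",
    "risks and assumptions",
]

def review(evidence: dict[str, bool]) -> list[str]:
    """Return the checkpoints that lack evidence; each one is a penalty."""
    return [c for c in CHECKPOINTS if not evidence.get(c, False)]

# Hypothetical proposal that only documented two of the six points.
penalties = review({
    "diagnostic deliverables": True,
    "measurement plan": True,
})
print(penalties)  # the four checkpoints still missing evidence
```

Each returned item corresponds to a penalty on the scorecard and a written clarification request to the provider.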
Practical example
Let's say you receive a proposal to improve online sales. You request a diagnostic, and the provider delivers a PDF with 50 observations, but without prioritization or estimated impact. Apply the anti-smoke scorecard: low scores in Strategy and Execution. Ask them to transform those 50 observations into an I/E matrix and deliver the first 10 changes as support tickets. If they can't, change providers. To compare contracting approaches, consider the differences in scope and control between an SEO agency and a freelancer.
How to evaluate Google Ads proposals alongside SEO
An integrated proposal should demonstrate coordination between organic and paid keywords, conversion attribution, and copy testing that is then repurposed for SEO. Request campaign examples and account structures; demand access to keyword-segmented results and landing pages. If the paid proposal separates objectives without an attribution plan, insist on integrating GA4 and shared conversions.
When the proposal includes Ads management, validate that there is a clear plan to measure incremental increases and not just vanity metrics. To see how the agency and Ads fit together in digital marketing processes, review common practices of a Google Ads digital marketing agency.
Decision and transfer
Once you've compared proposals against the anti-smoke scorecard, request an onboarding plan that includes access and ownership transfers. Define final deliverables and quarterly checkpoints. Don't accept clauses of indefinite dependency: the goal is to be able to replace providers without loss of data or control.
Common mistakes when applying the Scorecard
Avoid these pitfalls: using overly technical criteria without business context, assigning scores without evidence, and letting a single stakeholder decide. Keep the evaluation sheet and request written evidence. The anti-smoke scorecard works best when replicated by several people within the company.
Steps to decide
1) Run the assessment in 30 minutes with the key questions; 2) request missing evidence and give 48 hours for delivery; 3) recalculate scores; 4) decide based on the traffic light and your internal capacity to execute. A provider with a green light and replicable evidence deserves a trial contract; a yellow one could move to a pilot program with limited deliverables; a red one should be discarded.
Practical resources and templates
Attach a spreadsheet with the weighted blocks and formulas to your process; always request audit examples and dashboards. Implementing an anti-smoke scorecard does not guarantee results on its own: results depend on domain authority, product/service quality, competition, and consistency in execution.
Quick guide to contract language
Include clear clauses: deliverables with specific formats, SLAs for response times, asset ownership, and termination conditions. Avoid clauses that restrict data access or require additional services without independent evaluation.
Final checklist before signing
Verify that the proposal includes: a diagnosis with supporting evidence, a prioritized roadmap, due dates, a measurement plan, and access information. If any of these are missing, demand corrections or assign a red flag on the anti-smoke scorecard and request a new version.