Overview
PODERM
2026-01-13 to 2026-04-12
64-Point Assessment
41 of 64 evaluated
Total Spend: EUR 85,845 (2026-01-13 to 2026-04-12)
Reported ROAS: 1.19x (7d click + 1d view)
Click-Only ROAS: 0.76x (7d click only)
Active Campaigns: 298 (of 298 total)
Key Finding
34.2% of your reported purchases come from users who saw your ad but never clicked. Your reported ROAS of 1.2x may actually be closer to 0.8x on a click-only basis.
This means approximately EUR 36,766 in potentially over-attributed revenue
What Meta Reports: 1.19x (Revenue: EUR 101,823 | Purchases: 2,414)
Click-Attributed Only: 0.76x (Revenue: EUR 65,057 | Purchases: 1,589)
🧮 How We Calculated This
Step 1: Total revenue (default attribution) = EUR 101,823.01
Step 2: Total revenue (7d click only) = EUR 65,057.48
Step 3: View-only revenue = EUR 101,823.01 - EUR 65,057.48 = EUR 36,765.53
Step 4: View-only % = 825 view-only purchases / 2,414 total purchases = 34.2% (by revenue: EUR 36,765.53 / EUR 101,823.01 = 36.1%)
API fields used: action_values filtered for action_type=purchase
Attribution windows: 7d click + 1d view (reported) vs 7d click only
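The steps above are plain arithmetic; a minimal Python sketch pre-filled with this report's figures (note that the 34.2% headline is the purchase-based share, while the revenue-based share works out to about 36.1%):

```python
# View-only attribution split, using the totals reported above.
revenue_default = 101_823.01   # 7d click + 1d view (reported)
revenue_click   = 65_057.48    # 7d click only
purchases_total = 2_414
purchases_click = 1_589

revenue_view = revenue_default - revenue_click            # Step 3
purchases_view = purchases_total - purchases_click        # 825
share_by_purchases = purchases_view / purchases_total     # headline 34.2%
share_by_revenue = revenue_view / revenue_default         # ~36.1%

print(f"View-only revenue: EUR {revenue_view:,.2f}")
print(f"Post-view share (purchases): {share_by_purchases:.1%}")
print(f"Post-view share (revenue): {share_by_revenue:.1%}")
```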
✅ How to Verify This Yourself
We encourage you to verify every number in this report. Here's exactly how:
1. Open Ads Manager → Columns → Customize Columns
2. Search "Attribution Setting Comparison"
3. Add columns for 7-day click / 1-day view
4. Compare the two revenue columns
Performance Trends
Monthly Spend & ROAS
Spend (bars) and ROAS (line) over the trend period. Click-only ROAS shown as dashed line.
CPM Trend (Monthly)
Cost per 1,000 impressions over the trend period.
CPA Trend (Monthly)
Cost per acquisition over the trend period.
Year-over-Year Comparisons
Q4 2024 vs Q4 2025
Spend: 27.5%
ROAS: n/a
CPA: n/a
Q1 2025 vs Q1 2026
Spend: 36.8%
ROAS: n/a
CPA: n/a
ROAS of 1.19x with 0.76x on click-only basis. Review the trend charts below for seasonal vs structural context.
Campaign Analysis
Spend vs Click-Only ROAS
Bubble size = click-only purchases. Horizontal line = break-even (ROAS 1.0).
Campaign Details
Campaign | Spend | ROAS (reported) | ROAS (click) | Purchases | CPA | CTR | CPM
ROAS Deep Dive
Reported ROAS: 1.19x (7d click + 1d view)
Click-Only ROAS: 0.76x (7d click only)
Post-View %: 34.2% (of purchases from view-only)
Total Revenue: EUR 101,823 (Click: EUR 65,057 | View: EUR 36,766)
Total Purchases: 2,414 (Click: 1,589 | View: 825)
Avg CPA: EUR 35.56 (AOV: EUR 42.18)
With 34.2% post-view attribution, your true Meta-driven revenue is likely between EUR 65,057 (click-only) and EUR 101,823 (reported).
ROAS by Campaign
Grouped bars: reported (lighter) vs click-only (darker). Red line = break-even.
Full ROAS Calculation
Reported ROAS = action_values:purchase / spend
= EUR 101,823.01 / EUR 85,844.79
= 1.1861x
Click-only ROAS = action_values:purchase [7d_click only] / spend
= EUR 65,057.48 / EUR 85,844.79
= 0.7579x
View-only revenue = EUR 101,823.01 - EUR 65,057.48 = EUR 36,765.53
View-only % = 34.2%
How to verify in Ads Manager:
1. Go to Ads Manager > Columns > Customize Columns
2. Add "Purchase ROAS" and "Purchase" metrics
3. Click "Compare" > "Attribution Setting Comparison"
4. Compare "7-day click, 1-day view" vs "7-day click"
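The same two revenue figures can also be pulled via the Marketing API's insights endpoint. The parsing step can be sketched as below; the sample row is hand-written to mirror the shape of the documented `action_values` field when per-window attribution is requested, and the helper name and numbers are illustrative, not from this account:

```python
# Hedged sketch: sum purchase conversion value from an insights row,
# either for the account default attribution or a specific window.
def purchase_revenue(row, window=None):
    """window: e.g. '7d_click'; None uses the default 'value' key."""
    total = 0.0
    for entry in row.get("action_values", []):
        if entry.get("action_type") != "purchase":
            continue  # skip add_to_cart, lead, etc.
        key = window or "value"
        if key in entry:
            total += float(entry[key])
    return total

sample_row = {  # hypothetical insights row, not real account data
    "action_values": [
        {"action_type": "purchase", "value": "150.0",
         "7d_click": "100.0", "1d_view": "50.0"},
        {"action_type": "add_to_cart", "value": "999.0"},
    ]
}

print(purchase_revenue(sample_row))              # 150.0 (default)
print(purchase_revenue(sample_row, "7d_click"))  # 100.0 (click-only)
```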
Structure & Budget
Active Campaigns: 298 (of 298 total)
Total Spend: EUR 85,845 (Jan 2026 — Apr 2026, 3 months)
Spend Distribution by Campaign
Structure Checks (22 checks)

Result | Check | Value | Severity | Confidence
WARNING | Number of active campaigns (consolidation) | 6 active campaigns | high | VERIFIED
PASS | Campaign Budget Optimization (CBO) adoption | 67% CBO | high | VERIFIED
NA | Learning phase health across ad sets | Learning phase data not available | critical | UNVERIFIED
NA | Learning phase reset frequency | Requires edit history | high | UNVERIFIED
WARNING | Advantage+ Shopping Campaigns testing | 0 ASC campaigns | medium | VERIFIED
PASS | Ad set audience overlap / cannibalization | 0 overlapping pairs | high | CALCULATED
PASS | Minimum daily budget per ad set | Avg EUR 350.00/day | high | VERIFIED
FAIL | Campaign objective alignment with sales goal | 3 misaligned | high | VERIFIED
WARNING | Advantage+ Placements usage | 57% automatic | medium | VERIFIED
PASS | Multi-platform placement distribution | 6 platforms active | medium | VERIFIED
WARNING | Attribution window standardization | 67% correct | high | VERIFIED
PASS | Bid strategy optimization | 5x LOWEST_COST_WITHOUT_CAP, 1x NOT_SET | high | VERIFIED
PASS | Overall campaign frequency control | Avg: 1.69 | high | CALCULATED
NA | Breakdown effect monitoring | Requires process review | medium | UNVERIFIED
NA | UTM parameter implementation | Requires ad URL inspection | medium | UNVERIFIED
NA | A/B testing activity | Requires Experiments review | medium | UNVERIFIED
WARNING | Budget adequacy for learning phase exit | Avg CPA: EUR 35.56 | high | CALCULATED
PASS | Budget utilization rate | 148% utilization | medium | CALCULATED
PASS | CPM trend (rising cost risk) | CPM +7% | medium | CALCULATED
PASS | Campaign spend concentration | Top: 2% of spend | medium | VERIFIED
PASS | Short-lived campaign detection | 0 short-lived campaigns | medium | CALCULATED
PASS | Seasonal campaign detection | No stale seasonal campaigns | low | VERIFIED
Audiences & Funnel
Breakdown Explorer
Attribution: Click vs View Purchases
Attribution: Click vs View Revenue
Audience & Targeting Checks (8 checks)

Result | Check | Value | Severity | Confidence
WARNING | Audience overlap between ad sets | 29% overlap | high | ESTIMATED
NA | Custom audience data freshness | No audience data | high | UNVERIFIED
NA | Lookalike audience source quality | No audience data | medium | UNVERIFIED
PASS | Advantage+ Audience automation | 57% with Advantage+ | medium | VERIFIED
WARNING | Purchaser exclusion from prospecting | 70% with exclusions | high | VERIFIED
NA | CRM / first-party data sync freshness | No audience data | high | UNVERIFIED
PASS | Placement-demographic alignment (IG vs FB) | IG 54% / FB 45% | medium | VERIFIED
NA | Demographic reach vs conversion efficiency | Requires age/gender breakdown | high | UNVERIFIED
Funnel & Strategy Checks (8 checks)

Result | Check | Value | Severity | Confidence
WARNING | Post-view attribution inflation risk | 34.2% post-view | critical | CALCULATED
NA | False retargeting campaign detection | No RTG campaigns identified | critical | UNVERIFIED
NA | ASC existing customer budget cap | No ASC campaigns | high | UNVERIFIED
PASS | Prospecting vs retargeting budget split | Prosp: 100% / RTG: 0% / Ret: 0% | high | CALCULATED
PASS | Existing customer purchase concentration | 1% existing customer | high | ESTIMATED
PASS | Prospecting ROAS & new customer cost | nCAC 1.0x blended CPA | high | ESTIMATED
PASS | CPA trend (rising cost risk) | CPA improving (-11%) | medium | CALCULATED
NA | Landing page diversity across funnel stages | No URLs found | low | UNVERIFIED
Creatives
Creative Gallery
No creative assets available. Run python fetch_creatives.py --account <name> to fetch thumbnails.
Live Ad Previews
⚠ Live previews require an active Meta login. If previews appear blank, log into facebook.com in another tab.
Creative Checks Detail

Result | Check | Value | Severity | Confidence
PASS | Ad creative format diversity | 3 formats | critical | VERIFIED
FAIL | Number of creatives per ad set | Avg 2.7 ads/adset | high | VERIFIED
WARNING | Video aspect ratio coverage (9:16 for Reels) | 59 video creatives | high | ESTIMATED
PASS | Creative fatigue detection | CTR change: +3.1% | critical | CALCULATED
NA | Video hook rate (3-second retention) | Not available via API | high | UNVERIFIED
FAIL | Social proof through organic post boosting | 0% boosted | medium | CALCULATED
NA | User-generated content (UGC) proportion | Requires manual review | high | UNVERIFIED
WARNING | Advantage+ Creative enhancements | 0 with enhancements | medium | VERIFIED
PASS | Creative freshness (days since newest ad) | Newest ad: 5 days | high | VERIFIED
PASS | Ad frequency for prospecting audiences | Avg frequency: 1.69 | high | CALCULATED
PASS | Ad frequency for retargeting audiences | Top-quartile freq: 1.93 | medium | CALCULATED
PASS | Click-through rate vs industry benchmark | CTR: 1.87% | high | CALCULATED
PASS | Creative age fatigue risk | 0 fatigued ad(s) | high | CALCULATED
FAIL | Value proposition in ad text | 0% with text | medium | ESTIMATED
WARNING | Creative differentiation across audiences | 11/9870 identical pairs | medium | CALCULATED
NA | Creative spend concentration risk | No ad spend data | medium | UNVERIFIED
Pixel / CAPI Checks (10 checks)

Result | Check | Value | Severity | Confidence
PASS | Pixel installation status | 2 pixel(s) detected | critical | UNVERIFIED
NA | Server-side tracking (CAPI) status | CAPI status unknown | critical | UNVERIFIED
NA | Event deduplication between Pixel and CAPI | Requires manual verification | critical | UNVERIFIED
NA | Event Match Quality score | Not available via API | critical | UNVERIFIED
NA | Domain verification in Business Manager | Requires Business Manager check | high | UNVERIFIED
NA | Aggregated Event Measurement configuration | Requires Events Manager check | high | UNVERIFIED
NA | Standard vs custom event usage | Requires Events Manager check | high | UNVERIFIED
NA | CAPI Gateway deployment | Requires manual verification | medium | UNVERIFIED
WARNING | iOS attribution window configuration | 67% using 7d_click/1d_view | high | VERIFIED
NA | Data freshness and event lag | Requires Events Manager check | medium | UNVERIFIED
Findings & Roadmap
⚡ Quick Wins — Fix These Today (3 actions)
iOS attribution window
2 min
8/12 use 7d/1d, 0 use 1d click only, 4 have other settings.
Attribution setting
2 min
8/12 use recommended 7d/1d setting. Standardize attribution windows across all ad sets.
Exclusion audiences
10 min
Only 7/10 prospecting ad sets (70%) have exclusions. Some prospecting budget may be spent on existing customers.
Total estimated time: 14 minutes
🗓 30-Day Roadmap
Week 1
Fix Foundation
+5 pts
Week 2
Consolidate
+4 pts
Week 3
Strengthen Creatives
+4 pts
Week 4
Monitor & Optimize
+2 pts
Arithmetic check: the weekly deltas sum to 15.0, so 76 + 15.0 = 91.0, which falls 0.3 pts short of the stated 91.3 target (target − current = 15.3).
Competitor Intelligence
No competitor data available. Run python fetch_competitor_ads.py --account <name> --token YOUR_TOKEN to fetch competitor ads from the Ad Library.
Competitor Ads
Your Account vs Competitors
Data Integrity
All sanity checks PASSED
Data Cross-Reference
Metric | Value | How to Verify
Total Spend | EUR 85,844.79 | Ads Manager > Account Overview > Columns: Amount Spent
Total Revenue (reported) | EUR 101,823.01 | Ads Manager > Columns: Purchase ROAS * Spend
Total Revenue (click-only) | EUR 65,057.48 | Ads Manager > Attribution: 7d click > Purchase Conversion Value
Total Purchases | 2,414 | Ads Manager > Columns: Purchases
Reported ROAS | 1.1861x | Ads Manager > Columns: Purchase ROAS
Click-Only ROAS | 0.7579x | Ads Manager > Compare Attribution > 7d click
Implied AOV | EUR 42.18 | Revenue / Purchases
Post-View % | 34.2% | (Reported - Click) / Reported purchases
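The derived rows in the cross-reference table (ROAS, implied AOV, post-view share) can be re-checked mechanically; a small sketch using the report's totals:

```python
# Re-derive each computed row from the raw totals and confirm consistency.
spend = 85_844.79
revenue_reported, revenue_click = 101_823.01, 65_057.48
purchases_total, purchases_click = 2_414, 1_589

assert abs(revenue_reported / spend - 1.1861) < 0.0001       # reported ROAS
assert abs(revenue_click / spend - 0.7579) < 0.0001          # click-only ROAS
assert abs(revenue_reported / purchases_total - 42.18) < 0.01  # implied AOV
post_view = (purchases_total - purchases_click) / purchases_total
assert abs(post_view - 0.342) < 0.001                        # post-view %
print("cross-reference table is internally consistent")
```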
Known Data Gaps
These checks require manual verification or data not available via the Meta Marketing API:
Server-side tracking (CAPI) status: CAPI status unknown
Event deduplication between Pixel and CAPI: Requires manual verification
Event Match Quality score: Not available via API
Domain verification in Business Manager: Requires Business Manager check
Aggregated Event Measurement configuration: Requires Events Manager check
Standard vs custom event usage: Requires Events Manager check
CAPI Gateway deployment: Requires manual verification
Data freshness and event lag: Requires Events Manager check
Video hook rate (3-second retention): Not available via API
User-generated content (UGC) proportion: Requires manual review
Creative spend concentration risk: No ad spend data
Learning phase health across ad sets: Learning phase data not available
Learning phase reset frequency: Requires edit history
Breakdown effect monitoring: Requires process review
UTM parameter implementation: Requires ad URL inspection
A/B testing activity: Requires Experiments review
Custom audience data freshness: No audience data
Lookalike audience source quality: No audience data
CRM / first-party data sync freshness: No audience data
Demographic reach vs conversion efficiency: Requires age/gender breakdown
False retargeting campaign detection: No RTG campaigns identified
ASC existing customer budget cap: No ASC campaigns
Landing page diversity across funnel stages: No URLs found
Data Provenance
Account ID: act_188246506427142
Account Name: PODERM
Date Range: 2026-01-13 to 2026-04-12
Scoring Period: Jan 2026 — Apr 2026 (3 months)
API Version: Meta Marketing API v21.0
Schema Version: v4.2
Checks Total: 64
Checks Evaluated: 41
Generated: 2026-04-13 16:38
📖 Glossary of Terms
ROAS: Return On Ad Spend = Revenue / Spend
CPA: Cost Per Acquisition = Spend / Purchases
CPM: Cost Per Mille = (Spend / Impressions) * 1000
CTR: Click-Through Rate = Clicks / Impressions * 100
AOV: Average Order Value = Revenue / Purchases
Post-View: Conversions from users who saw but never clicked the ad
Post-Click: Conversions from users who clicked the ad before purchasing
Attribution Window: Time period after ad interaction during which conversions are counted
CBO: Campaign Budget Optimization — budget managed at campaign level
ABO: Ad Set Budget Optimization — budget managed at ad set level
ASC: Advantage+ Shopping Campaign — automated campaign type by Meta
CAPI: Conversions API — server-side tracking for more accurate measurement
nCAC: New Customer Acquisition Cost = Prospecting spend / new purchases
Learning Limited: Meta status indicating insufficient conversions for optimization
Frequency: Average number of times each user saw the ad
Creative Fatigue: Declining CTR over time indicating audience has seen the ad too many times
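The glossary's formula entries, expressed as one-line Python functions and exercised with this report's account totals:

```python
# Glossary formulas as functions; CPM/CTR are included for completeness
# but impressions/clicks totals are not published in this report.
def roas(revenue, spend):      return revenue / spend
def cpa(spend, purchases):     return spend / purchases
def cpm(spend, impressions):   return spend / impressions * 1000
def ctr(clicks, impressions):  return clicks / impressions * 100
def aov(revenue, purchases):   return revenue / purchases

spend, revenue, purchases = 85_844.79, 101_823.01, 2_414
print(f"ROAS {roas(revenue, spend):.2f}x  "
      f"CPA EUR {cpa(spend, purchases):.2f}  "
      f"AOV EUR {aov(revenue, purchases):.2f}")
```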