Methodology · v3.2

How we score.

Four weighted pillars, measured the same way for every provider, on a quarterly cadence. No commission-tier weighting. No "editor's pick" overrides. The score is the score.

01 · Real uptime · 35%
02 · Performance · 25%
03 · Renewal pricing · 20%
04 · Support quality · 20%
[01] · The four pillars

What goes into a score.

01 · Real uptime · 35%

90-day continuous monitoring of a thin landing-page test site from six global probe locations (Ashburn, Frankfurt, Mumbai, Tokyo, São Paulo, Sydney), probed at a 30-second interval. We do not use vendor-reported uptime: some vendors round, exclude planned maintenance, or report only for their production tier rather than the shared tier we review.

Inputs
  • Successful HTTP 2xx response ratio
  • Time-to-recovery on failures
  • Frequency of >5 min outages in the trailing quarter
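
As a sketch of how these inputs come out of the raw data, the 2xx ratio and the count of >5 min outages can be recovered from probe samples like this (field layout and function name are illustrative, not our production monitor):

```python
INTERVAL_S = 30        # probe interval from the methodology
OUTAGE_MIN_S = 5 * 60  # only failure runs longer than 5 minutes count

def uptime_metrics(samples: list[tuple[int, int]]) -> dict:
    """samples: (unix_timestamp, http_status) pairs, one per probe.

    Returns the 2xx success ratio and the number of distinct
    failure runs lasting longer than five minutes.
    """
    ok = sum(1 for _, status in samples if 200 <= status < 300)
    long_outages = 0
    run = 0  # consecutive failing probes
    for _, status in samples:
        if 200 <= status < 300:
            if run * INTERVAL_S > OUTAGE_MIN_S:
                long_outages += 1
            run = 0
        else:
            run += 1
    if run * INTERVAL_S > OUTAGE_MIN_S:  # quarter ended mid-outage
        long_outages += 1
    return {"success_ratio": ok / len(samples), "long_outages": long_outages}
```

Time-to-recovery falls out of the same run-length bookkeeping: each failure run's length times the probe interval is that incident's recovery time.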
02 · Performance · 25%

TTFB and full-page-load times for an identical WordPress test-site image, deployed the same way on every host. We measure cold (no cache), warm (third page load), and 95th-percentile performance from the slowest probe region.

Inputs
  • TTFB (Time to First Byte) from cold cache
  • Largest Contentful Paint (LCP) on a 1 Mbps throttled connection
  • Static asset CDN coverage breadth
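
For the 95th-percentile figure, a plain nearest-rank percentile over the quarter's timing samples is enough; a minimal sketch (the function is ours, not a library call):

```python
import math

def percentile(values: list[float], pct: float) -> float:
    """Nearest-rank percentile: the smallest sample such that at
    least pct% of samples are at or below it. pct in (0, 100]."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))  # 1-indexed
    return ordered[rank - 1]
```

Applied per probe region, the p95 of the slowest region is the number that feeds this pillar.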
03 · Renewal pricing · 20%

Average yearly cost across years 1–3, in USD, on the plan tier most readers actually buy. We publish both the intro price and the renewal multiplier, because the gap between them is often where the deception lives.

Inputs
  • Year-1 promotional price (as listed at checkout, with mandatory add-ons)
  • Renewal price (years 2–3), taken from the merchant's published rate card
  • Hidden fees: required domain privacy, SSL upcharges, mandatory backups
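
The pillar's headline numbers reduce to simple arithmetic over the three inputs above; a sketch with an illustrative signature (the dollar amounts below are hypothetical):

```python
def effective_annual_cost(intro_price: float,
                          renewal_price: float,
                          mandatory_fees: float = 0.0) -> dict:
    """Average yearly USD cost across years 1-3.

    intro_price:    year-1 promotional price as listed at checkout
    renewal_price:  published rate-card price for years 2-3
    mandatory_fees: per-year required add-ons (privacy, SSL, backups)
    """
    total = intro_price + 2 * renewal_price + 3 * mandatory_fees
    return {
        "avg_yearly_cost": round(total / 3, 2),
        "renewal_multiplier": round(renewal_price / intro_price, 2),
    }
```

For example, a $35.40 intro that renews at $107.88 with $11.88/yr of forced add-ons averages out to $95.60/yr and a 3.05× renewal multiplier: the headline price tells you almost nothing.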
04 · Support quality · 20%

Quarterly mystery-shop tickets filed with each provider across three categories: pre-sale, billing dispute, and technical (a deliberate misconfiguration we already know how to fix). We measure both response time and accuracy.

Inputs
  • Median first-response time, by channel (chat / email / phone)
  • Resolution time for the technical mystery-shop ticket
  • Whether the first response is a templated deflection or a real answer
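
The first of those inputs is straightforward once tickets are logged; a sketch, with a made-up ticket record shape:

```python
from statistics import median

def first_response_medians(tickets: list[dict]) -> dict[str, float]:
    """tickets: records like {'channel': 'chat'|'email'|'phone',
                              'first_response_min': float}.

    Median rather than mean, so a single abandoned ticket
    cannot dominate the quarter's figure.
    """
    by_channel: dict[str, list[float]] = {}
    for t in tickets:
        by_channel.setdefault(t["channel"], []).append(t["first_response_min"])
    return {channel: median(times) for channel, times in by_channel.items()}
```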
[02] · Guardrails

How we keep it honest.

Ownership disclosure

Every provider's corporate parent is researched and disclosed on its review page. We pay particular attention to Newfold Digital, GoDaddy, IONOS, EIG-era brands, and recent private-equity rollups. If a provider is independent, we say so explicitly with an 'Indie' badge.

Sample size

A score requires at least 30 days of continuous monitoring data and one quarterly mystery-shop cycle. Providers below that threshold are listed as 'preliminary' with a faded score and a note.

Score range

Scores are on a 0–10 scale. In practice, most reputable providers cluster between 6.0 and 8.5. Scores of 9+ are reserved for providers that are genuinely best-in-class on every pillar. A score below 5 means we actively recommend against the provider, not merely that it is mediocre.
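
Combining the pillars is nothing fancier than a weighted sum on the same 0–10 scale; a sketch using the v3.2 weights (the pillar keys are our own labels, not an API):

```python
WEIGHTS = {                 # v3.2 pillar weights
    "real_uptime": 0.35,
    "performance": 0.25,
    "renewal_pricing": 0.20,
    "support_quality": 0.20,
}

def composite_score(pillars: dict[str, float]) -> float:
    """Weighted 0-10 composite; every pillar must be scored."""
    assert set(pillars) == set(WEIGHTS), "missing or unknown pillar"
    assert all(0 <= v <= 10 for v in pillars.values())
    return round(sum(WEIGHTS[name] * pillars[name] for name in WEIGHTS), 1)
```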

Re-benchmark cadence

Every 90 days. The 'Δ 90d' column shows the change since the previous run; an upward arrow means the provider improved, downward means it regressed. We publish a public changelog when a score moves by more than 0.5.
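
The Δ 90d column itself is just the signed difference between consecutive runs, plus the 0.5 changelog trigger; a sketch (the flat-arrow case for an unchanged score is our addition):

```python
CHANGELOG_THRESHOLD = 0.5  # publish a changelog entry past this move

def delta_90d(previous: float, current: float) -> dict:
    """One Δ 90d cell: signed change, arrow, and changelog flag."""
    delta = round(current - previous, 1)
    arrow = "↑" if delta > 0 else "↓" if delta < 0 else "→"
    return {
        "delta": delta,
        "arrow": arrow,
        "needs_changelog": abs(delta) > CHANGELOG_THRESHOLD,
    }
```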

Conflict of interest

A provider's affiliate commission rate, the recency of our last payout, and the volume of clicks we've sent them are all data points we deliberately do not feed into the scoring model. Our affiliate disclosure explains the firewall in detail.

Edge cases & exclusions

We exclude providers that fail basic legal/operational checks: no published terms, no verifiable physical address, scam-pattern pricing, or repeat customer-data incidents in the trailing 24 months.

[03] · Versioning

Methodology changes.

Major version bumps (v3 → v4) re-baseline every score. Minor bumps add or refine inputs without invalidating prior runs.

  1. v3.2 · Q1 2026
    Added support-quality mystery shop; rebalanced renewal-pricing weight from 25% → 20%.
  2. v3.1 · Q4 2025
    Added São Paulo + Mumbai probe locations.
  3. v3.0 · Q3 2025
    First public methodology release. Replaced our internal scoring spreadsheet.

See the methodology in action.

The full ranked index applies these rules to every provider we track. The "Δ 90d" column shows where a score has moved since the last benchmark run.