Disclosure · SafeScan Now earns commissions when readers buy through certain links. We never accept paid rankings — see our methodology.


How We Score Antivirus Software

SafeScan Now scores combine six pillars. Every weight is published, and every input has a public lab source or a documented SafeScan Now test. We do not curve scores, delete failed tests, or bend the methodology to favour partners.

Six pillars, public weights

Detection (30%)

Real-world protection rate against fresh malware and zero-day samples, blended across AV-TEST, AV-Comparatives Real-World Protection, and SE Labs.

  • AV-TEST Protection score (latest two cycles)
  • AV-Comparatives Real-World Protection (latest series)
  • SE Labs Total Accuracy (latest quarter)

Performance (20%)

System impact during scans, copy operations, and application launches. Measured on identical HP EliteBook 840 G10 hardware in our lab.

  • AV-Comparatives Performance Test
  • SafeScan Now in-house benchmark (boot time, copy 10 GB, Chrome cold-start, app install)

Pricing (15%)

Honest cost over three years — first-year price plus the renewal price you actually pay in years 2 and 3. Discounts are applied only if they recur.

  • First-year list / promo price
  • Year-2 renewal price (verified each month)
  • Devices covered, refund window
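The three-year cost above can be sketched in a few lines. This is a minimal illustration, not our scoring code: the function name, the `recurring_discount` parameter, and the example prices are all hypothetical, and it assumes the year-2 renewal price also holds in year 3.

```python
def three_year_cost(first_year: float, renewal: float,
                    recurring_discount: float = 0.0) -> float:
    """Honest three-year cost: year-1 price plus two renewals.

    A discount is subtracted only when it recurs every year;
    one-off intro promos are already baked into `first_year`.
    """
    return round((first_year - recurring_discount)
                 + 2 * (renewal - recurring_discount), 2)

# hypothetical prices: $29.99 intro, $59.99 renewal, no recurring discount
print(three_year_cost(29.99, 59.99))  # -> 149.97
```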

Privacy History (15%)

Documented privacy incidents, ownership changes, and telemetry behaviour over the last five years. Penalty-based rather than reward-based.

  • Privacy policy review (telemetry scope, sale of data)
  • Ownership / parent company changes
  • Documented incidents (Avast Jumpshot, Kape ownership, etc.)

Ease of Use (10%)

Install friction, dashboard clarity, default-on protection, and the absence of upsell pop-ups during normal use.

  • SafeScan Now usability test (install → first scan → settings)
  • Upsell-pop-up count over a 7-day passive-use window

Support (10%)

Live chat / phone availability, average reply time, and quality of support documentation.

  • 3 support tickets per vendor per quarter, submitted anonymously
  • Knowledge base depth & accuracy spot-check

The formula

score = 0.30·detection + 0.20·performance + 0.15·pricing + 0.15·privacy_history + 0.10·ease_of_use + 0.10·support
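Applied directly, the formula is a weighted sum over six pillar scores, each on a 0–100 scale. The sketch below uses the published weights; the function name and the example pillar scores are hypothetical, for illustration only.

```python
WEIGHTS = {
    "detection": 0.30, "performance": 0.20, "pricing": 0.15,
    "privacy_history": 0.15, "ease_of_use": 0.10, "support": 0.10,
}

def composite_score(pillars: dict[str, float]) -> float:
    """Weighted sum of the six pillar scores (each 0-100)."""
    assert set(pillars) == set(WEIGHTS), "all six pillars required"
    return sum(WEIGHTS[p] * pillars[p] for p in WEIGHTS)

# made-up pillar scores for one hypothetical product
example = {"detection": 98, "performance": 90, "pricing": 70,
           "privacy_history": 100, "ease_of_use": 85, "support": 80}
print(round(composite_score(example), 1))  # -> 89.4
```

Because the weights sum to 1.0, the composite stays on the same 0–100 scale as the pillars.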

Why our score differs from other sites

If our top 5 doesn't match Tom's Guide, SafetyDetectives, or Cybernews, here's why — and the differences are deliberate, not accidental.

vs Tom's Guide
Tom's Guide weights brand familiarity heavily. We weight independent lab data and renewal pricing, which often re-orders the top 5.
vs SafetyDetectives
SafetyDetectives is owned by Kape Technologies, which also owns several products it reviews. Our Privacy History pillar penalises that conflict directly.
vs Cybernews
Cybernews publishes 4,050+ hours of testing — we use their methodology as a baseline and add Renewal Pricing and Privacy History as standalone pillars.

Where the lab data comes from

Detection and Performance pull from five independent labs. Each lab tests differently. Our methodology blends the latest two cycles per lab, normalises to a 0–100 scale, and discards single-cycle outliers.

Read: AV-TEST vs AV-Comparatives explained →
  • AV-TEST — Magdeburg, DE
  • AV-Comparatives — Innsbruck, AT
  • SE Labs — London, UK
  • MRG Effitas — London, UK
  • ICSA Labs — Mechanicsburg, US
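The blending step can be sketched as follows. This is a simplified stand-in, not our production pipeline: it assumes scores are already normalised to 0–100, and it models the single-cycle outlier rule as "drop a cycle more than a fixed margin from that lab's own mean". The function name, the `outlier_margin` value, and the example scores are all hypothetical.

```python
from statistics import mean

def blend_lab_scores(cycles_by_lab: dict[str, list[float]],
                     outlier_margin: float = 10.0) -> float:
    """Blend per-lab cycle scores (each already normalised to 0-100).

    Per lab: keep the latest two cycles, drop any kept cycle sitting
    more than `outlier_margin` points from the lab's overall mean
    (a stand-in for the single-cycle outlier rule), then average
    the resulting per-lab means across labs.
    """
    lab_means = []
    for lab, cycles in cycles_by_lab.items():
        latest = cycles[-2:]          # latest two cycles for this lab
        centre = mean(cycles)
        kept = [c for c in latest
                if abs(c - centre) <= outlier_margin] or latest
        lab_means.append(mean(kept))
    return mean(lab_means)

# illustrative, made-up cycle scores
scores = {"AV-TEST": [99.0, 100.0],
          "AV-Comparatives": [98.0, 99.5],
          "SE Labs": [97.0, 100.0]}
print(round(blend_lab_scores(scores), 2))  # -> 98.92
```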

Methodology FAQs

If a question is missing, write to corrections@safescannow.com and we will answer it and add it to this page.