# How We Test VPNs
This page is a practical summary of our VPN testing workflow. It explains the steps we repeat every cycle before a score or ranking is published.
## Score model used in production
The current score model is fixed and documented publicly. Commercial terms are not part of the formula. For implementation details, see /methodology.
The weights below sum to 100%.

| Metric | Weight |
|---|---|
| Speed composite | 24% |
| Latency | 10% |
| Logging policy | 14% |
| Ownership clarity | 8% |
| Jurisdiction risk | 9% |
| Audit status | 10% |
| Streaming unlock | 11% |
| Torrent policy | 7% |
| Kill switch reliability | 7% |
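The weighted composite implied by the table above can be sketched as follows. This is a minimal illustration, not the production implementation; the assumption that each per-metric score is pre-normalized to a 0–100 scale is ours, and the metric key names are invented for the example.

```python
# Weights taken from the published score model table (they sum to 100%).
WEIGHTS = {
    "speed_composite": 0.24,
    "latency": 0.10,
    "logging_policy": 0.14,
    "ownership_clarity": 0.08,
    "jurisdiction_risk": 0.09,
    "audit_status": 0.10,
    "streaming_unlock": 0.11,
    "torrent_policy": 0.07,
    "kill_switch_reliability": 0.07,
}

def composite_score(metrics: dict) -> float:
    """Weighted sum of per-metric scores, each assumed normalized to 0-100."""
    if set(metrics) != set(WEIGHTS):
        raise ValueError("metrics must cover exactly the published weight set")
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# Sanity check: a provider scoring 80 on every metric composites to 80.
example = {k: 80.0 for k in WEIGHTS}
print(round(composite_score(example), 2))
```

Because the weights sum to 1.0, a uniform per-metric score passes through unchanged, which is a quick way to sanity-check any reimplementation of the formula.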
## Re-test cadence
Providers are re-tested monthly, with additional runs triggered by major app updates, policy changes, and significant streaming-block shifts. Freshness is exposed through visible test dates on every scorecard.
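The cadence rule can be expressed as a small predicate: a provider is due for re-testing when the monthly window lapses or any trigger event fires. This is a hedged sketch under our own assumptions; the 30-day interval and the event names are illustrative, not documented values.

```python
from datetime import date, timedelta

RETEST_INTERVAL = timedelta(days=30)  # "monthly" modeled as 30 days (assumption)
# Trigger events named per the cadence description; identifiers are ours.
TRIGGER_EVENTS = {"app_update", "policy_change", "streaming_block_shift"}

def retest_due(last_tested: date, today: date, events: set) -> bool:
    """True when a trigger event occurred or the monthly window has lapsed."""
    if events & TRIGGER_EVENTS:
        return True
    return today - last_tested >= RETEST_INTERVAL

# 19 days since the last run, no trigger events: not yet due.
print(retest_due(date(2024, 1, 1), date(2024, 1, 20), set()))
# Same window, but a major app update fires an immediate re-test.
print(retest_due(date(2024, 1, 1), date(2024, 1, 20), {"app_update"}))
```

Separating the time-based check from the event-based one keeps each trigger independently auditable, which matches the goal of exposing freshness through visible test dates.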
You can see these outputs directly in the VPN Index and the full Transparency Report.
## Related research pages
- Full protocol and governance details.
- Live matrix of tested providers and metrics.
- Interactive scorecards and filters by use case.
- How rankings stay independent from commercial pressure.
- How the business model works and where boundaries sit.
- Report data issues or suggest test additions.