SpeedCurve uses a form of synthetic testing. Tests run on servers in datacenters, using a throttled connection to mimic the conditions an average user might experience. Pages are loaded in a real web browser and performance metrics are collected.
Pros:
- Consistent, stable baseline
- Nothing to install
- Can measure and compare any website
- Detailed analysis of assets
- Video recording and filmstrips of the user experience
- Can measure "Start render", which is based on video analysis
- Performance waterfall charts

Cons:
- Limited sample size
- Datacenter-based, so network conditions are simulated rather than experienced by real users
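A throttled synthetic test like the one described above can be approximated locally with a headless browser. Here is a minimal sketch using Puppeteer and the Chrome DevTools Protocol; the throttling figures are illustrative assumptions, not SpeedCurve's actual test profile, and `runSyntheticTest` is a hypothetical helper name:

```javascript
// Convert megabits per second to the bytes-per-second figure the
// DevTools protocol expects.
function mbps(megabits) {
  return Math.round((megabits * 1024 * 1024) / 8);
}

// Load a page in headless Chrome over a throttled connection and report
// basic navigation timings. Assumes the `puppeteer` package is installed;
// it is required lazily so the helper above stays usable without it.
async function runSyntheticTest(url) {
  const puppeteer = require('puppeteer');
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Emulate a mid-range mobile connection (illustrative values).
  const client = await page.target().createCDPSession();
  await client.send('Network.emulateNetworkConditions', {
    offline: false,
    downloadThroughput: mbps(1.6), // ~1.6 Mbps down
    uploadThroughput: mbps(0.75),  // ~0.75 Mbps up
    latency: 150,                  // 150 ms round-trip time
  });

  await page.goto(url, { waitUntil: 'load' });
  const nav = await page.evaluate(
    () => performance.getEntriesByType('navigation')[0].toJSON()
  );
  await browser.close();
  return { ttfb: nav.responseStart, load: nav.loadEventEnd };
}
```

Running the same profile on every test is what gives synthetic testing its consistent baseline: any change in the timings reflects a change in the page, not in the network.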
Real User Monitoring (RUM)
Real User Monitoring (RUM) captures performance metrics as real people browse your website. The amount of performance data collected depends on the browser's support for timing APIs. While RUM can't measure as much detail as synthetic monitoring, it more than makes up for it by collecting huge amounts of actual performance data from real people, on real devices, on real connections all over the world.
Pros:
- Large sample size
- Real network and browser conditions
- Conversion and KPI correlation with performance metrics

Cons:
- Limited performance metrics
- No detailed analysis of assets
- No performance waterfall charts
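The browser timing APIs mentioned above can be sampled with a few lines of JavaScript running on the page. A minimal sketch using the Navigation Timing API; the `/beacon` endpoint and the `extractMetrics` helper are hypothetical, not a real RUM product's API:

```javascript
// Derive a few headline metrics from a PerformanceNavigationTiming-style
// entry. All timestamps are in milliseconds relative to navigation start.
function extractMetrics(entry) {
  return {
    ttfb: entry.responseStart - entry.startTime, // time to first byte
    domContentLoaded: entry.domContentLoadedEventEnd - entry.startTime,
    load: entry.loadEventEnd - entry.startTime,
  };
}

// In a browser, read the real navigation entry once the page has loaded
// and send it to a (hypothetical) collection endpoint.
if (typeof window !== 'undefined' && 'performance' in window) {
  window.addEventListener('load', () => {
    // Defer one tick so loadEventEnd has been populated.
    setTimeout(() => {
      const [nav] = performance.getEntriesByType('navigation');
      if (nav) {
        navigator.sendBeacon('/beacon', JSON.stringify(extractMetrics(nav)));
      }
    }, 0);
  });
}
```

Because the snippet runs in every visitor's browser, each page view contributes one data point, which is how RUM builds up its large sample size across real networks and devices.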
You need both!
To really understand the performance of your website, and to see how your front-end code and assets affect it, we recommend running both synthetic testing and RUM. The two forms of testing are complementary: RUM shows you exactly how long your pages take to load for real users, while synthetic testing provides the detail required to diagnose and improve performance.
Here's a Velocity conference presentation by Mark from SpeedCurve and Cliff from SOASTA discussing how to combine the analysis of synthetic and RUM data: