If your site gets a high percentage of its traffic from synthetic 'bots', you may want to filter that data out so it doesn't skew the analytics for your real users. We get this question a lot.

For Google Analytics (GA) users, the simplest approach is to enable GA's built-in filtering for known bots in your Admin settings. This currently excludes SpeedCurve (as well as WebPageTest) traffic, which is identified by our user agent.

If you are not using GA, or you want more control, you can set up a segment or filter in your analytics tool to exclude hits from our user agent. By default, the identifying pattern we include in the user agent is "PTST/SpeedCurve".
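If your analytics tool doesn't support server-side filters, you can apply the same check in the browser before sending any hits. This is a minimal sketch, assuming you initialize your analytics from your own script; the `isSyntheticAgent` helper and the exact regular expression are illustrative, based on the "PTST/SpeedCurve" pattern described above.

```javascript
// Return true when the user agent looks like a synthetic testing agent.
// "PTST" and "SpeedCurve" are the identifying tokens mentioned above;
// extend the pattern if you use other synthetic testing tools.
function isSyntheticAgent(userAgent) {
  return /PTST|SpeedCurve/i.test(userAgent);
}

// Only initialize analytics for real visitors.
// (The typeof guard keeps this runnable outside a browser.)
const ua = typeof navigator !== "undefined" ? navigator.userAgent : "";
if (!isSyntheticAgent(ua)) {
  // ...load or initialize your analytics snippet here...
}
```

Client-side filtering like this prevents the hits from ever being recorded; a segment or filter in your analytics tool, by contrast, only hides them from reports after the fact.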
