How to Make Sure My Results are Statistically Significant
THIS ARTICLE WILL HELP YOU:
Change Your Statistical Confidence
In any experiment, the larger your sample size, the more accurate (and valuable) your results will be.
Depending on your particular needs, you can adjust the minimum or maximum length of time an experiment should run.
You can also have a winning variation declared automatically once it reaches a certain statistical significance level, and set the minimum number of visitors and conversions required before that happens.
To do this, go to your Experiment Summary and select Stats Settings:
Find and click the setting that controls the Confidence Level. This sets the experiment-wide significance level at which you'd like Convert to declare significant results (winners and losers) on the Report page.

Keep the trade-off in mind before changing this setting. A higher confidence level is more accurate, but it requires a larger sample size, so Convert needs more time to declare significant results. A lower confidence level shortens the time needed to declare significant results, but it also increases the chance that some of those results are false positives.
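To see why a higher confidence level takes longer, consider the standard sample-size arithmetic for comparing two conversion rates. The sketch below is not Convert's internal statistics engine; it is a plain two-proportion z-test calculation with illustrative assumptions (80% power, a 5% baseline conversion rate, and a 10% relative lift) that shows how the required number of visitors per variation grows as the confidence level rises.

```python
# A minimal sketch, not Convert's internal statistics engine: a standard
# two-proportion z-test sample-size calculation showing how the required
# number of visitors per variation grows with the confidence level.
# The baseline rate, lift, and power below are illustrative assumptions.
import math
from scipy.stats import norm

def visitors_per_variation(confidence, power=0.80,
                           baseline_rate=0.05, relative_lift=0.10):
    """Approximate visitors needed per variation to detect the lift."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = norm.ppf(1 - (1 - confidence) / 2)  # two-sided critical value
    z_beta = norm.ppf(power)                      # power requirement
    pooled = (p1 + p2) / 2
    effect = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
              + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
    return math.ceil(effect ** 2 / (p2 - p1) ** 2)

for confidence in (0.90, 0.95, 0.99):
    print(f"{confidence:.0%} confidence -> "
          f"{visitors_per_variation(confidence):,} visitors per variation")
```

With these illustrative inputs, moving from 90% to 99% confidence nearly doubles the required sample size per variation, which is why raising the confidence level lengthens your experiments.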
You will see several other options on this screen as well. Simply make your desired changes and save.