Attribution
Understand first-touch, last-touch, and position-based attribution
Attribution connects conversions to the clicks that drove them. Configure attribution settings for each conversion type in Events & Conversions settings.
How It Works
- Visitor clicks your link - They receive a unique visitor ID stored in a cookie
- Visitor browses your site - The tracking pixel maintains the visitor ID
- Visitor converts - The conversion is linked to their click(s) based on the attribution model configured for that conversion type
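The three steps above can be sketched in plain JavaScript. Everything here (`getOrCreateVisitorId`, `recordClick`, `recordConversion`, the in-memory `store`) is illustrative only; the actual pixel keeps the visitor ID in a cookie and reports events to the qklnk backend.

```javascript
// Step 1-2: assign a persistent visitor ID on first contact, reuse it afterwards.
function getOrCreateVisitorId(store) {
  if (!store.visitorId) {
    store.visitorId = 'v_' + Math.random().toString(36).slice(2, 10);
  }
  return store.visitorId;
}

// Each click is recorded against the same visitor ID.
function recordClick(store, linkId, timestamp) {
  const visitorId = getOrCreateVisitorId(store);
  store.clicks = store.clicks || [];
  store.clicks.push({ visitorId, linkId, timestamp });
}

// Step 3: a conversion is linked to the visitor's prior clicks; the
// configured attribution model then decides how credit is split among them.
function recordConversion(store, type, timestamp) {
  const visitorId = getOrCreateVisitorId(store);
  return { visitorId, type, timestamp, clicks: store.clicks || [] };
}
```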
Attribution Window
The time period during which a conversion can be attributed to a click.
- Default: 30 days
- Range: 1-90 days
If a visitor clicks your link today and converts within your attribution window, the conversion is attributed to that click. Clicks outside the window are not attributed.
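The window check above amounts to a single comparison. This is a minimal sketch (function name and millisecond arithmetic are illustrative, not the product's implementation):

```javascript
const DAY_MS = 24 * 60 * 60 * 1000;

// A click is attributable only if it happened before the conversion
// and no more than `windowDays` days earlier (default 30, range 1-90).
function isAttributable(clickTime, conversionTime, windowDays = 30) {
  const age = conversionTime - clickTime;
  return age >= 0 && age <= windowDays * DAY_MS;
}
```

So a click 10 days before a conversion is attributed under the 30-day default, while a click 45 days before is not.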
Attribution Models
Each conversion type can have its own attribution model. When a visitor clicks multiple links before converting, the model determines how credit is distributed:
First-Touch
100% credit goes to the first link the visitor clicked.
- Best for: Lead generation, understanding initial discovery
- Use case: Content marketing, brand awareness, newsletter signups
- Recommended for: Register, Subscribe, Download conversions
Last-Touch
100% credit goes to the last link the visitor clicked before converting.
- Best for: Understanding what drove the final decision
- Use case: Direct response campaigns, sales promotions
- Recommended for: Add to Cart, Begin Checkout conversions
Position-Based
Credit is distributed across multiple touchpoints: 40% to first click, 40% to last click, and 20% split among middle clicks.
- Best for: Valuing both discovery and closing
- Use case: E-commerce purchases with longer consideration cycles
- Recommended for: Purchase conversions
Position-Based Example
A visitor clicks your YouTube link on Monday, LinkedIn link on Wednesday, and Twitter link on Friday before purchasing. YouTube gets 40% credit, LinkedIn gets 20% credit, and Twitter gets 40% credit.
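The splits for all three models can be expressed as one small function. This is an illustration of the percentages described above, not the product's actual code; how position-based handles a path with no middle clicks (exactly two touchpoints) is not specified here, so the sketch leaves that case out.

```javascript
// Split credit across an ordered click path under a given model.
function distributeCredit(clicks, model) {
  const credits = {};
  const add = (link, share) => { credits[link] = (credits[link] || 0) + share; };
  if (clicks.length === 1 || model === 'first-touch') {
    add(clicks[0], 1.0);                       // 100% to the first click
  } else if (model === 'last-touch') {
    add(clicks[clicks.length - 1], 1.0);       // 100% to the last click
  } else if (model === 'position-based') {
    add(clicks[0], 0.4);                       // 40% to the first click
    add(clicks[clicks.length - 1], 0.4);       // 40% to the last click
    const middle = clicks.slice(1, -1);        // 20% split among the middle
    middle.forEach((c) => add(c, 0.2 / middle.length));
  }
  return credits;
}
```

Running it on the Monday/Wednesday/Friday path reproduces the example: YouTube 40%, LinkedIn 20%, Twitter 40%.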
Default Attribution Models
Each predefined conversion type comes with a recommended attribution model:
| Conversion Type | Default Model | Rationale |
|---|---|---|
| Purchase | Position-Based | Both discovery and closing matter for sales |
| Register | First-Touch | What initially brought them in matters most |
| Add to Cart | Last-Touch | Intent signal driven by most recent interaction |
| Begin Checkout | Last-Touch | Strong purchase intent from final push |
| Download | First-Touch | Lead magnet attribution tracks discovery |
| Subscribe | First-Touch | Acquisition metric values initial contact |
Note
Changing the attribution model for a conversion type only affects new conversions. Existing conversions retain their original attribution.
Post-Purchase Survey Attribution
Not all traffic sources are visible through behavioral tracking. Channels like word of mouth, podcasts, and organic social often appear as "Direct" traffic because visitors type your URL directly or use untracked links.
The Post-Purchase Survey asks customers "How did you hear about us?" after a conversion, capturing these dark funnel channels that behavioral attribution misses.
How Survey Attribution Works
- Enable the survey in Survey Settings
- A modal appears automatically after conversion events — no extra script needed
- Responses are collected and shown on the Attribution dashboard alongside behavioral data
What Visitors See
After a conversion event fires (e.g. a purchase), a small modal slides in from the bottom-right corner of the page. It displays your configured question (default: "How did you hear about us?") with clickable option buttons. If "Other" is enabled, visitors can type a free-text response. After submitting, a thank-you message appears briefly before the modal fades away.
The modal uses your configured brand color for the submit button and selected option styling. It won't show again to the same visitor after they've responded (tracked via localStorage).
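The show-once behavior can be sketched as two small helpers. The key name follows the `ql_survey_{site_id}` pattern the product uses; the function names and the `storage` parameter (standing in for `window.localStorage` so the sketch runs anywhere) are illustrative.

```javascript
// Illustrative show-once check: the survey appears only while no
// response flag exists for this site in storage.
function shouldShowSurvey(storage, siteId) {
  return storage.getItem('ql_survey_' + siteId) === null;
}

// After a response, store a flag so the modal never reappears
// for this visitor in this browser.
function markResponded(storage, siteId) {
  storage.setItem('ql_survey_' + siteId, String(Date.now()));
}
```

Because the flag lives in localStorage, it is per-browser: the same person on another device would see the survey again.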
Testing the Survey
To verify the survey works on your site:
- Go to Survey Settings and toggle the survey on
- Visit a page on your site that has the qklnk tracking pixel installed
- Open your browser's developer console and run `qklnk.event('register')` - the survey modal should appear in the bottom-right after a short delay
- Select an option and submit — you should see the thank-you message
- Check Survey Submissions to confirm the response was recorded
To test again, clear the `ql_survey_{your_site_id}` key from localStorage in your browser's developer tools — this resets the "already responded" flag.
Survey Results on the Attribution Page
As soon as you receive your first survey response, the Attribution page shows a Survey Attribution section with:
- Response breakdown — what channels customers are reporting, with counts and revenue
- Behavioral vs survey comparison — where tracking data and self-reported data agree or disagree
- Dark funnel discoveries — channels like word of mouth and podcasts that only appear through surveys
Survey-Enhanced Model
Once you collect 10+ survey responses, the Survey-Enhanced attribution model unlocks (an indicator on the Attribution page shows how close you are). To enable it, turn on Attribution Blending in Survey Settings.
What Attribution Blending Does
Normally, attribution credit goes entirely to behavioral tracking data (clicks, UTMs, referrers). When you enable blending, a "Survey-Enhanced" model appears on the Attribution page that adjusts channel credit when survey responses disagree with behavioral tracking.
Example: A visitor arrives with no UTMs and no referrer — behavioral tracking labels this as "Direct." But the customer tells you "I heard about you on a podcast." With blending enabled, credit shifts from Direct toward Podcast.
How much credit shifts depends on two factors:
- Behavioral confidence — if behavioral data is strong (multiple touchpoints with UTMs), the survey has little influence. If behavioral data is weak (single Direct visit), the survey becomes the primary signal.
- Survey weight slider — controls how aggressively to trust surveys when they disagree with behavioral data. 0.3-0.5 is recommended for most teams.
When survey and behavioral data agree (e.g., both say "Google"), blending uses 100% behavioral — the survey simply confirms what tracking already showed.
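A rough sketch of the behavior just described: credit shifts only on disagreement, and the shift grows as behavioral confidence falls and as the survey weight rises. The formula below only mirrors that qualitative behavior; the product's actual blending math is not documented here, and `behavioralConfidence`/`surveyWeight` as plain 0-1 numbers are an assumption.

```javascript
// Illustrative blend of behavioral and survey-reported channels.
function blendCredit(behavioralChannel, surveyChannel, behavioralConfidence, surveyWeight) {
  if (behavioralChannel === surveyChannel) {
    // Agreement: 100% behavioral; the survey just confirms the tracking.
    return { [behavioralChannel]: 1.0 };
  }
  // Disagreement: survey influence grows as behavioral confidence falls.
  const shift = surveyWeight * (1 - behavioralConfidence);
  return {
    [behavioralChannel]: 1 - shift,
    [surveyChannel]: shift,
  };
}
```

Under this sketch, a lone Direct visit (low confidence, say 0.2) with a survey answer of "Podcast" at weight 0.5 moves 40% of the credit to Podcast, while fully confident behavioral data barely moves at all.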
Max Plan Feature
Post-purchase surveys require the Max plan, which includes visit-level attribution data needed for behavioral comparison.
Best Practices
- Match model to goal - Use first-touch for acquisition metrics, last-touch for intent signals, position-based for purchases
- Set appropriate windows - Longer windows for considered purchases, shorter for impulse buys
- Use unique links - Create separate links for each placement to see which performs best
- Review periodically - As your marketing mix changes, revisit attribution settings