How to turn on persistent URLs in WordPress?
-
I'm using an appointment form on my website, and I have the option to add a referral URL to form submissions so that I know which page each submission came from.
I need to be able to distinguish between organically generated form submissions and those that come in via AdWords. If the referral URL shows the AdWords tracking code, I know the submission came from AdWords.
My problem is that when a visitor comes in after clicking an ad and then visits another page on my website, the AdWords tracking code disappears from the URL. I was told there was a way to turn on persistent URLs in WordPress, but I can't figure out how to do it.
I'm assuming that if I turn persistent URLs on, the AdWords tracking code will remain on every subsequent URL they visit on my website. Is this true?
Any help with this will be greatly appreciated.
-
Thanks for your help everyone. I'm working on the GCLID attribution now.
-
Max is definitely right that you need code. The most common attribution method is last non-direct click. The easiest way to distinguish PPC from organic search is to try to grab the GCLID. And if you end up growing your business and/or importing this information back into AdWords via the offline conversion tracking option they offer, you will need the GCLID anyway.
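As a rough illustration (not code from this thread), grabbing the GCLID from a landing URL's query string can be done client-side; `gclid` is the parameter AdWords auto-tagging appends:

```javascript
// Extract a query string parameter (e.g. "gclid") from a URL.
// Returns null if the parameter is absent.
function getQueryParam(name, url) {
  return new URL(url).searchParams.get(name);
}

// In the browser you would pass window.location.href instead
// of a hard-coded example URL.
var landing = "https://example.com/landing-page/?gclid=TeSter-123";
console.log(getQueryParam("gclid", landing)); // "TeSter-123"
console.log(getQueryParam("gclid", "https://example.com/")); // null
```

Whatever value you capture here is what you would later write into your CRM or send back to AdWords with an offline conversion import.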
-
That just disables the Yoast canonical URL; I don't see how it could help pass query string parameters along the user's visit path.
-
You can use either one: a cookie persists across visits (and lasts as long as you decide it should), while a session variable lasts only for the current session. It depends on the attribution window you want to use.
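To make the cookie option concrete, here is a minimal sketch (the cookie name is illustrative) of building a cookie whose lifetime matches a chosen attribution window, in contrast to a session variable that dies when the session ends:

```javascript
// Build a cookie string for an attribution value that expires after
// `days` days, so the cookie lifetime matches your attribution window.
function attributionCookie(name, value, days) {
  var maxAge = days * 24 * 60 * 60; // days -> seconds
  return name + "=" + encodeURIComponent(value) +
         "; max-age=" + maxAge + "; path=/";
}

// In the browser you would then assign it:
//   document.cookie = attributionCookie("lead_source", "ppc", 90);
console.log(attributionCookie("lead_source", "ppc", 90));
// -> "lead_source=ppc; max-age=7776000; path=/"
```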
-
Thanks for your help, guys. I've tried your method, smarttill, but unfortunately it didn't work.
I will try it your way, Max, but how do I log where the visitor is coming from with a cookie or a session variable?
-
Add the Yoast SEO plugin to WordPress, then add this to your functions.php: add_filter( 'wpseo_canonical', '__return_false' );
-
You need code. When the visitor lands on the entry page of your site, take the utm_source or utm_campaign from the URL and log where they came from in a cookie, session variable, etc. Then pass that value through on form submission. You can hook this into the header, footer, or any WordPress template code that runs on every page.
You can't keep the query string across the visitor's path unless you write code for that too, and it's more complex; I don't believe there's a WordPress plugin that does it. It's certainly not something a standard WordPress installation can do.
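A rough sketch of that approach, assuming a script loaded on every page; the cookie name, field name, and classification rule are illustrative, not from any particular plugin:

```javascript
// Classify the visit from the landing URL's query string:
// "ppc" if AdWords tagging is present, otherwise "organic".
function classifySource(queryString) {
  var params = new URLSearchParams(queryString);
  var fromAdWords = params.get("gclid") !== null ||
      (params.get("utm_source") === "google" &&
       params.get("utm_medium") === "cpc");
  return fromAdWords ? "ppc" : "organic";
}

console.log(classifySource("?gclid=TeSter-123"));      // "ppc"
console.log(classifySource("?utm_source=newsletter")); // "organic"

// In the browser, on every page load, you would store the result once
// (only on the entry page, so internal navigation doesn't overwrite it):
//   if (!document.cookie.includes("lead_source=")) {
//     document.cookie = "lead_source=" +
//       classifySource(window.location.search) +
//       "; max-age=2592000; path=/";
//   }
// On the form page, copy the stored value into a hidden input so it
// is submitted along with the lead's contact details.
```

The hidden field then replaces the referral URL as the source-of-truth for whether a given lead was PPC or organic, regardless of how many internal pages the visitor viewed before submitting.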
-
Thanks for your help, Max, but I don't need to know how many leads came in through the different referral sources. I already know that. What I need is to identify each individual form submission as coming from organic traffic or PPC.
As I mentioned earlier, the leads coming in through the form need to be logged in client management software, so I need to take the contact information of the form submitter and enter it in the system as coming from organic or PPC. This is done to track ROI.
-
Maybe I am missing something, but a form submission is either PPC or organic because the visitor arrived via PPC or organic. So if you define a goal in Analytics for the form submission, triggered either by URL match or by JavaScript, you can later check in Analytics how many leads were generated through PPC or organic by looking at goals per channel/referral/campaign.
Keep in mind you can use utm_source, utm_campaign, etc. in the links originating the leads, if you control them.
-
I know Analytics. I can see referral traffic and goal paths and all that. What I need is to be able to attribute individual form submissions to either organic or PPC traffic.
Each form submission is a lead. Each lead needs to be logged in client management software, so in order to properly attribute a lead to either PPC or organic traffic I need persistent URLs, so that the referral URL field in my form reflects the traffic source via the Google tracking code in the URL.
I hope someone here can help shed some light on this. Thanks.
-