How do I turn on persistent URLs in WordPress?
-
I'm using an appointment form on my website, and I have the option to add a referral URL to form submissions so that I know which page each submission came from.
I need to be able to distinguish between organically generated form submissions and those that come in via AdWords. If the referral URL shows the AdWords tracking code, I know the submission came in from AdWords.
My problem is that when a visitor arrives after clicking an ad and then visits another page on my website, the AdWords tracking code disappears from the URL. I was told there was a way to turn on persistent URLs in WordPress, but I can't figure out how to do it.
I'm assuming that if I turn persistent URLs on, the AdWords tracking code will remain on every subsequent URL they visit on my website. Is this true?
Any help with this will be greatly appreciated.
-
Thanks for your help, everyone. I'm working on the GCLID attribution now.
-
Max is definitely right that you need code. The most common attribution method is last non-direct. The easiest way to distinguish PPC from SERPs is to grab the GCLID. And if you end up growing your business and/or feeding this information back into AdWords via the offline conversion tracking option they offer, you will need the GCLID anyway.
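A minimal sketch of grabbing the GCLID from the landing URL in a footer script (the helper name and the 90-day window are illustrative assumptions, not from any plugin):

```javascript
// Pure helper: pull one parameter out of a query string; null if absent/empty.
function extractParam(queryString, name) {
  const value = new URLSearchParams(queryString).get(name);
  return value ? value : null;
}

// Browser-only wiring: persist the GCLID in a cookie on the landing page.
if (typeof document !== 'undefined') {
  const gclid = extractParam(window.location.search, 'gclid');
  if (gclid) {
    // 90 days is an illustrative attribution window; pick your own.
    document.cookie = 'gclid=' + encodeURIComponent(gclid) +
      ';path=/;max-age=' + 60 * 60 * 24 * 90;
  }
}
```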
-
This is just going to disable the Yoast canonical URL; I don't see how it could help pass query string parameters through the user's visit path.
-
You can use either one: a cookie persists across different visits (and lasts as long as you decide it should), while a session variable lasts only for the current user session. It depends on the attribution window you want to use.
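To sketch that difference: a cookie string with a max-age survives across visits for whatever attribution window you pick, while one without it is a session cookie that dies when the browser closes (helper name is illustrative):

```javascript
// Build a cookie string. Passing maxAgeDays makes it a persistent cookie;
// omitting it yields a session cookie that expires when the browser closes.
function buildCookie(name, value, maxAgeDays) {
  let cookie = name + '=' + encodeURIComponent(value) + ';path=/';
  if (maxAgeDays != null) {
    cookie += ';max-age=' + maxAgeDays * 24 * 60 * 60;
  }
  return cookie;
}
```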
-
Thanks for your help, guys. I've tried your method, smarttill, but unfortunately it didn't work.
I will try it your way, Max, but how do I log where the visitor is coming from with a cookie or a session variable?
-
Add the Yoast SEO plugin in WordPress, then add this to your functions.php: add_filter( 'wpseo_canonical', '__return_false' );
-
You need code. When the visitor lands on the entry page of your site, take the utm_source or utm_campaign from the URL and log where they are coming from in a cookie, session variable, etc. Then pass it through on form submission. You can put this in the header, footer, or any WordPress piece of code that runs on every page.
You can't keep the query string through the visitor's path unless you code that too, and it's more complex; I don't believe you can find a WordPress plugin that does it. It's certainly not something you can do with a standard WordPress installation.
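A sketch of that flow as a footer script (the cookie names, the 30-day window, and the `traffic_source` field name are all assumptions; adapt them to your form plugin):

```javascript
// Pure helper: find a cookie value in a "k=v; k2=v2" header string.
function readCookie(cookieHeader, name) {
  for (const pair of cookieHeader.split(';')) {
    const [k, ...rest] = pair.trim().split('=');
    if (k === name) return decodeURIComponent(rest.join('='));
  }
  return null;
}

// Browser-only wiring, run on every page via the theme footer.
if (typeof document !== 'undefined') {
  // 1. On landing, persist any tracking parameters found in the URL.
  const params = new URLSearchParams(window.location.search);
  for (const key of ['utm_source', 'utm_campaign', 'gclid']) {
    const value = params.get(key);
    if (value) {
      document.cookie = key + '=' + encodeURIComponent(value) +
        ';path=/;max-age=' + 60 * 60 * 24 * 30;
    }
  }
  // 2. Copy the stored source into a hidden input so it is passed
  //    through on form submission (field name is hypothetical).
  const field = document.querySelector('input[name="traffic_source"]');
  if (field) {
    field.value = readCookie(document.cookie, 'gclid') ? 'ppc' :
      (readCookie(document.cookie, 'utm_source') || 'organic');
  }
}
```

With this in place, the hidden field tags each lead as PPC or organic even after the query string has dropped off the URL.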
-
Thanks for your help, Max, but I don't need to know how many leads came in through the different referral sources. I already know that. What I need is to identify each form submission as coming from organic traffic or PPC.
As I mentioned earlier, the leads coming in through the form need to be logged in client management software, so I need to take the contact information of the form submitter and enter it into the system as coming from organic or PPC. This is done to track ROI.
-
Maybe I am missing something, but a form submission is either PPC or organic because the visitor is coming from PPC or organic. So if you define a goal in Analytics for the form submission, triggered either by URL match or JavaScript, you can later check in Analytics how many leads were generated through PPC or organic by looking at the goals per channel/referral/campaign.
Keep in mind you can use utm_source, utm_campaign, etc. in the links originating the leads, if you control them.
-
I know Analytics. I can see referral traffic and goal paths and all that. What I need is to attribute individual form submissions to either organic or PPC traffic.
Each form submission is a lead. Each lead needs to be logged in client management software, so in order to properly attribute a lead to either PPC or organic traffic, I need to use persistent URLs so that the referral URL field in my form reflects the traffic source via the Google tracking code in the URL.
I hope someone here can help shed some light on this. Thanks.
-