Follow up from http://www.seomoz.org/qa/discuss/52837/google-analytics
-
Ben,
I have a follow up question from our previous discussion at
http://www.seomoz.org/qa/discuss/52837/google-analytics
To summarize, implementing what we need takes three things:
- add GA code to the Darden page
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
- Change the cross-domain links on the Darden page (e.g. on http://www.darden.virginia.edu/web/MBA-for-Executives/) from
<a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a>
to
<a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a>
- Have Symplicity add this code:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
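Both snippets rely on the standard asynchronous ga.js pattern: before ga.js loads, `_gaq` is just a plain JavaScript array, and each `push()` queues a command as `[methodName, ...args]` to be replayed in order once the library arrives. A minimal sketch of that queuing behavior (account ID and domain taken from the snippets above; runnable without ga.js):

```javascript
// Before ga.js loads, _gaq is just a plain array; each push() queues a
// command as [methodName, ...args] for ga.js to replay later, in order.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);

// Once ga.js finishes loading, it replaces _gaq with an object whose
// push() executes commands immediately, so the same syntax keeps working.
```

Because `_gaq` is only an array at this point, the ordering matters: `_setAccount` must be queued before `_trackPageview`, or the pageview is recorded against no account.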
Our CMS does not allow users to add an onClick attribute to links, so we CANNOT implement part 2. What will the result be if we have only parts 1 and 3 in place? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking without changing the link code?
Nick
-
Hi Nick,
I'm following up on older, unanswered questions here in Q&A. Did you ever get this sorted out?
-
I think that if you leave out part two, traffic from each site to the other will show up as referral traffic. I'm not sure there are any other consequences.
As for the jQuery stuff, I can't fault the logic, but I'm no web developer... sorry not to be more helpful.
-
Sergei,
I have put everything in place that I think I should, but my results are not showing up in Google Analytics. Do you see anything wrong with:
- the CSS class on the "Apply Now" link at
http://www.darden.virginia.edu/web/MBA/Admissions/Home/
-
The included code at http://www.darden.virginia.edu/web/js/gaq.js works.
-
The code at https://darden-admissions.symplicity.com/ works.
Nothing comes up in the Google Analytics Content reports. What am I missing?
-
I will give it a try and get back to you.
-
Yes, this should work.
-
If I understand you correctly, I am hearing this:
Include a JavaScript file called replace_on_click.js which contains
$(document).ready(function() {
    $('.class_with_onclick').click(function() {
        _gaq.push(['_link', 'https://darden-admissions.symplic...']);
        return false;
    });
});
Then in the HTML:
<a class="class_with_onclick" href="https://darden-admissions.symplic...">Apply Now</a>
Is this what you are saying?
-
You do not have to add an onClick event to the link. You may try jQuery and its event binder, click() in your case. Just assign a specific class to these links and do something like this:
$(document).ready(function() {
    $('.class_with_onclick').click(function() {
        // your stuff goes here
    });
});
Sorry if I am being too techy.
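The same idea also works without jQuery, via addEventListener. The sketch below stubs out a minimal element object so the wiring can be demonstrated outside a browser; the FakeLink stand-in and the bindCrossDomainLink helper are hypothetical names for illustration, and in a real page you would bind to document.querySelectorAll('.class_with_onclick') instead:

```javascript
var _gaq = [];

// Minimal stand-in for a DOM element, just enough to show the wiring.
function FakeLink(href) {
  this.href = href;
  this.handlers = [];
}
FakeLink.prototype.addEventListener = function (type, fn) {
  if (type === 'click') this.handlers.push(fn);
};
FakeLink.prototype.click = function () {
  this.handlers.forEach(function (fn) { fn(); });
};

// Instead of an inline onclick attribute, attach the handler from script,
// pushing the cross-domain _link command when the link is clicked.
function bindCrossDomainLink(link) {
  link.addEventListener('click', function () {
    _gaq.push(['_link', link.href]);
    // in a browser, also cancel the default navigation here
  });
}

var applyNow = new FakeLink('https://darden-admissions.symplicity.com/applicant');
bindCrossDomainLink(applyNow);
applyNow.click();
```

The point is the same as the jQuery version: because the handler lives in a script file rather than in the markup, the CMS never needs to emit an onClick attribute.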