Do I need to do anything with masking?
-
Hello, I read up a bit on rel=canonical, where if a page has duplicate content you should put the tag there, linking to the real page.
I have a GoDaddy domain that uses Blogger for its posts currently (forwarding with masking). Do I need this tag?
-
Okay, I finally got some HostGator hosting, but I have no idea what to do. I am up to the part where I am entering the details for the CNAME, but it says the record already exists?
What the ??
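(For context: Blogger's custom-domain setup normally asks for a CNAME pointing "www" at ghs.google.com, and the "record already exists" error usually means a record with that host name is already defined, so you edit it rather than add a new one. A hypothetical zone-file sketch; most DNS panels show this as a form instead:)

```
; Hypothetical zone-file entry for a Blogger custom domain.
; If a "www" CNAME already exists, edit that record instead of adding
; a second one -- DNS allows only one CNAME per host name, which is
; why the panel reports "record already exists".
www   IN   CNAME   ghs.google.com.
```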
-
It can be a bit tough without seeing the domain (since these situations can vary a lot), but I tend to agree with Matthew: the masking can cause a lot of strange side effects, both for search and for visitors. One thing I'd definitely do is see how Google is indexing your content. You can just use the "site:" operator:
site:example.com
(with your domain, and leave off the "www"). If your correct domain is showing up with indexed pages, and there aren't a ton of duplicates, you're probably doing OK. If your posts aren't being indexed and are only showing up under the Blogger domain, then your setup is sub-optimal at best.
-
Hey, wait, hold on. The part where you said "host my blog on a URL that I already own": I have done that already, from a tutorial I saw somewhere, where I update the record or something by putting in my URL.
I also have masking on my account?
Now I'm very lost -_-
Update: Okay, I took off masking on my GoDaddy account and just did the second option you suggested, updating the records (I already had that). GoDaddy says: "That's it! We've updated the DNS records, so your blog will now appear at . Thanks for being a Go Daddy customer!" Will everything work fine now?
-
Hi zmbatt,
URL masking is not an ideal solution for SEO, as it masks the URL for subsequent pages of the site, which can create perceived duplicate content. Here is an article with a little more on that:
http://webenso.com/domain-forwarding-seo/
In your case, the ideal solution is to set up a custom domain for your Blogger.com blog. Here are the instructions on how to do that. Click "Host my blog on a URL that I already own", then click "Yes" for the "hosted at GoDaddy" question:
http://support.google.com/blogger/bin/static.py?hl=en&ts=1233381&page=ts.cs
Doing that will let your individual URLs appear in the address bar. As for the canonical URL you were asking about, it is best practice to use it if you can. It is only essential if you have problems with duplicate content. Here is an article discussing how to set it up in Blogger:
http://www.mybloggertricks.com/2012/02/solve-problems-related-to-blogspot.html
I hope that helps. Thanks,
Matthew
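For anyone landing here later: the canonical tag Matthew mentions is a single line in the page's head. A minimal sketch, assuming your preferred URL is the hypothetical http://www.example.com/my-post.html:

```html
<head>
  <!-- Tells search engines which URL is the "real" copy of this page,
       so duplicates (e.g. the blogspot.com version) consolidate to it. -->
  <link rel="canonical" href="http://www.example.com/my-post.html" />
</head>
```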
Related Questions
-
After I 301 redirect duplicate pages to my rel=canonical page, do I need to add any tags or code to the non canonical pages?
I have many duplicate pages; some pages have 2-3 duplicates, most of which have uppercase and lowercase paths (generated by Microsoft IIS). Does this implementation of 301s and rel=canonical suffice, or is there more I could do to optimize the passing of duplicate-page link juice to the canonical? Thank you!
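Since the duplicates come from mixed-case IIS paths, one complementary fix is a site-wide lowercase redirect in IIS URL Rewrite, so the duplicate URLs stop being generated at all. A sketch for web.config, assuming the URL Rewrite module (and its built-in ToLower function) is available:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301-redirect any URL containing an uppercase letter
           to its all-lowercase equivalent. -->
      <rule name="LowercaseUrls" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```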
Technical SEO | PFTools
-
How long after google crawl do you need 301 redirects
We have just added 301s when we moved our site. Google has done a crawl and spat back a few errors. How long do I need to keep those 301s in place? I may need to change some. Thanks.
Technical SEO | Paul_MC
-
RegEx help needed for robots.txt potential conflict
I've created a robots.txt file for a new Magento install, using an example that was on the Magento help forums, but there's something I can't decipher. It seems that I am both allowing and disallowing access to the same expression for pagination. My robots.txt file (and a lot of other Magento robots.txt files, it seems) includes both: Allow: /*?p= and Disallow: /?p=& I've searched for help on RegEx and I can't see what "&" does, but it seems to me that I'm allowing crawler access to all pagination URLs, and then possibly disallowing access to all pagination URLs that include anything other than just the page number? I've looked at several resources and there is practically no reference to what "&" does... Can anyone shed any light on this, to ensure I am allowing suitable access to a shop? Thanks in advance for any assistance.
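Worth noting: robots.txt is not full RegEx. Only * (any run of characters) and a trailing $ (end of URL) are special, so the "&" is just a literal character. Google then applies the most specific (longest) matching rule, with Allow winning ties. A rough Python sketch of that matching logic; the URLs and the /*?p=*& variant of the Disallow rule are hypothetical examples, not the exact rules from any particular robots.txt:

```python
import re

def rule_matches(pattern, path):
    # robots.txt is not full RegEx: only '*' (any characters) and a
    # trailing '$' (end of URL) are special; everything else,
    # including '&', is a literal character.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

def is_allowed(path, allows, disallows):
    # Google applies the most specific (longest) matching rule;
    # on a tie, the Allow rule wins.
    best_allow = max((len(p) for p in allows if rule_matches(p, path)), default=-1)
    best_disallow = max((len(p) for p in disallows if rule_matches(p, path)), default=-1)
    return best_allow >= best_disallow

# Hypothetical URLs, checked against the Allow rule from the question
# and a common Magento variant of the Disallow rule:
allows = ["/*?p="]
disallows = ["/*?p=*&"]

print(is_allowed("/shoes?p=2", allows, disallows))            # True: plain pagination URL
print(is_allowed("/shoes?p=2&order=asc", allows, disallows))  # False: pagination plus extra parameters
```

Under this reading, plain pagination URLs stay crawlable, while pagination URLs carrying extra query parameters are blocked, which is usually the intent of the Magento template.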
Technical SEO | MSTJames
-
Is there an easy tool to add all the keywords I need to SEOmoz Pro?
My website, www.theprinterdepo.com, has 3,300 products. I have about 200 keywords in the SEOmoz account. Every item has about 5 or more keyword variations, for example: hp 1320, 1320 printer, hp 1320 refurbished, 1320 hp printer, hp laser 1320, and so on. The question is: is there an easy way to add all the keywords related to my products, and all their variations, to the SEOmoz dashboard so I can more easily track them? If not, what do you suggest I do?
Technical SEO | levalencia1
-
Do user metrics really mean anything?
This is a serious question; I'd also like some advice on my experience so far with the Panda. One of my websites, http://goo.gl/tFBA4, was hit on January 19th. It wasn't a massive hit, but it took us from 25,000 to 21,000 uniques per day. It had survived Panda completely prior. The only thing that had changed was an upgrade in the CMS, which caused a lot of duplicate content, i.e. 56 copies of the homepage under various URLs. These were all indexed in Google. I've heard varying views as to whether this could trigger Panda; I believe so, but I'd appreciate your thoughts on it. There was also the above-the-fold update on the 19th, but we have 1 ad MAX on each page, and most pages have none. I hate even having to have 1 ad. I think we can safely assume it was Panda that did the damage. Jan 18th was the first Panda refresh since we upgraded our CMS in mid-to-late December. As it was nothing more than a refresh, I feel it's safe to assume that the website was hit due to something that had changed on the website between the Jan 18th refresh and the one previous. So, aside from fixing the bugs in the CMS, I felt now was a good time to put a massive focus on user metrics, and I worked hard, and continue to spend a lot of time, improving them:
- Reduced bounce rate from 50% to 30% (extremely low in the niche)
- Average page views up from 7 to 12
- Average time on site from 5 to almost 8 minutes
- Plus created a mobile-optimised version of the site
- Page loading speeds slashed
Not only did the above improvements have no positive effect, traffic continued to slide and we're now close to a massive 40% loss. (Btw, I realise neither the mobile site nor page loading speeds are user metrics.) I fully appreciate that my website is image-heavy and thin on text, but that is an industry-wide 'issue'. It's not an issue for my users, so it shouldn't be an issue for Google. Unlike our competitors, we actively encourage our users to add descriptions to their content and provide guidelines to assist them in doing so.
We have a strong relationship with our artists, as we listen to their needs and develop the website accordingly. Most of the results in the SERPs contain content taken from my website, without my permission or the permission of the artist. Rarely do they give any credit. If user metrics are so important, why on earth has my traffic continued to slide? Do you have any advice for me on how I can further improve my chances of recovering from this? Fortunately, despite my artists' download numbers being slashed in half, they've stuck by me and the website, which speaks volumes.
Technical SEO | seo-wanna-bs
-
Do sites really need a 404 page?
We have people posting broken links to our site. Is this losing us link juice, as they link to 404 pages? We could redirect to the homepage, or just render the homepage content; in both cases we can still display a clear "page not found" message. Is this legal (white hat)?
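What matters here is the HTTP status code, not the page itself: you can show a friendly "not found" page while still returning a true 404 status, whereas redirecting every broken link to the homepage tends to be treated as a soft 404. A minimal sketch as a stdlib WSGI app; the routes and copy are hypothetical:

```python
def app(environ, start_response):
    """Tiny WSGI app: real pages get 200, everything else a genuine 404."""
    if environ.get("PATH_INFO", "/") == "/":
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<h1>Home</h1>"]
    # Friendly error page, but with a true 404 status so search engines
    # drop the broken URL instead of indexing it as a soft 404.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Page not found</h1><p>Try the <a href='/'>homepage</a>.</p>"]
```

The same principle applies whatever the stack: serve helpful content on the error page, but keep the 404 status code.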
Technical SEO | ed123456
-
Duplicate exact-match domains flagged by Google - need help with reinclusion
Okay, I admit, I've been naughty... I have 270+ domains that are all exact match for city + keyword, and I have built tons of backlinks to all of them. I reaped the benefits... and now Google has found my duplicate templates and flagged them all. The question is, how do I get them reincluded quickly? Do you guys think converting the sites to a basic WordPress template, then simply using 275 different templates and applying for reinclusion for each site manually, would do it? Or do you recommend: 1. create a unique site template for each site 2. create unique content Any other advice for getting reincluded? Aside from owning up and saying, "hey, I used the same template for all the sites, and I have created new templates and unique content, so please let me back".
Technical SEO | ilyaelbert
-
Duplicate titles OK if pages don't need to rank well?
I know it is not a good idea to have duplicate titles across a website, as Google does not like this. Is it OK to have duplicate titles on pages that aren't being optimised with SERPs in mind, or could this have a negative effect on the pages that are being optimised?
Technical SEO | iSenseWebSolutions