Do duplicate titles cause a Google sandbox?
-
I used the SEOmoz tools and discovered that my website has 1,000 duplicate titles. The keyword that matches my domain name is at position 25, but I don't have any other keywords in Google's top 100.
Do duplicate titles affect keyword positions and SEO in general?
-
I'm not sure what the "id=" parameter is doing, functionally, but there are many ways to control it, and Robots.txt is probably not your best bet. You could META NOINDEX the duplicates, or you could rel=canonical to the "master" page. It's a complex subject, and I've got a comprehensive post about it here:
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
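For illustration, here is roughly what those two options look like in the duplicate page's <head> (the URLs below are placeholders, not your real ones):

```html
<!-- Option 1: keep the duplicate out of the index, but let its links still pass value -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: point search engines at the "master" version of the page -->
<link rel="canonical" href="http://www.example.com/master-page/">
```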
Those duplicates can definitely dilute your ranking ability, and even trigger a Panda problem (that could harm your entire site).
I don't believe there's really a sandbox in 2012, in the way we used to mean it. Many sites go out the gate ranking pretty quickly. Sometimes, after an initial grace period (30 days or so?), Google lowers a site's ranking, but that's usually because they re-evaluated the link profile. Large-scale duplication wouldn't cause that. The good news is that, once you clean it up, rankings should gradually improve.
-
Dear friend!
Thank you for the support ^^!
Most of my backlinks point from another website of mine via a banner in the left-hand sidebar (a followed link; that site is also a travel website). A small number come from forums and blogs, but the biggest share is from my other site.
So should I remove the links from my other website, or the ones from the blogs and forums?
My other website has 2,000+ indexed pages, so those sitewide links add up to a lot.
And for backlinks, which factors matter most: PA, DA, or linking root domains?
Thank you ^^!
-
OK, so now we know it is due to the Penguin update.
The first step will be to analyze every single one of your backlinks. Use Webmaster Tools and export all the backlinks that are pointing to your website. Next you will have to check each individual backlink to judge its quality. I like to use the MozBar add-on to quickly check the MozRank and MozTrust of the page where the backlink to your site is located.
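If the export is large, a small script can make the review manageable by grouping the links by root domain first. This is only a rough sketch; it assumes the export is a CSV with one backlink URL per row, and the file name is made up, so adjust it to match your actual export:

```python
import csv
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical file name: use whatever your Webmaster Tools export is called.
links_by_domain = defaultdict(list)

with open("latest_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        # Skip header rows and blanks; keep only rows that start with a URL.
        if not row or not row[0].startswith("http"):
            continue
        url = row[0].strip()
        domain = urlparse(url).netloc.lower()
        links_by_domain[domain].append(url)

# Review one linking domain at a time, biggest first, instead of
# working through thousands of individual URLs.
for domain, urls in sorted(links_by_domain.items(), key=lambda kv: -len(kv[1])):
    print(f"{domain}: {len(urls)} links")
```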
Once you have the full list of links you want gone, you will first have to try to remove them manually. This can be relatively easy if, for example, the link is from a directory where you had to create a profile in order to make a listing. In that case, you can simply delete the listing. Otherwise you will have to email the owner of the website and ask them to remove your backlink.
Now if, let's say, 2-3 weeks have passed and you still haven't heard back from some of the webmasters (even though you contacted them), you can go ahead and use the link disavow tool. This tool, however, has to be used with great care.
Have a read of http://searchengineland.com/google-launches-disavow-links-tool-136826 for an understanding of the tool.
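For reference, the disavow file itself is just a plain text file with one entry per line; the domains below are made-up examples, not real recommendations:

```
# Hypothetical disavow file - plain text, one entry per line
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow a single page:
http://low-quality-forum.example/profiles/12345
```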
If you don't fully understand the tool, please come back to the SEOmoz forums and ask for help; you need to fully understand how it works before using it.
Good luck and feel free to ask any other questions!
-
You're right!
I think so! My website lost its top rankings around that time. So what should I do, my friend?
Should I remove the backlinks or try to change the anchor text? And to remove them, do I do it on the web pages where the keyword is placed, or can I use Google Webmaster Tools to remove them?
Thanks!
-
By the sound of it, you might have gotten a penalty from Google.
I believe it might have been the Penguin update.
I had a look at your backlink profile, and out of a total of 2.9k links you have 968 links with the anchor text "vietnam visa of arrival". That is a clear signal to Google that you are trying to manipulate rankings for that keyword.
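If you want to check that distribution yourself, a quick sketch like the one below can tally anchor texts from a backlink export. It assumes a CSV where the anchor text sits in the second column, which may not match your export exactly:

```python
import csv
from collections import Counter

# Assumption: the export is a CSV with the linking URL in the first column
# and the anchor text in the second. Adjust the index if yours differs.
anchors = Counter()

with open("backlinks_with_anchors.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    next(reader, None)  # skip the header row
    for row in reader:
        if len(row) >= 2 and row[1].strip():
            anchors[row[1].strip().lower()] += 1

total = sum(anchors.values()) or 1
for anchor, count in anchors.most_common(10):
    print(f"{anchor!r}: {count} links ({count / total:.0%} of the profile)")
```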
Did your rankings decrease around April 24th?
-
Having duplicate titles will almost certainly be affecting your SERP rankings, along with many other factors, of course.
Page titles should be unique to each page and contain keywords that describe, or are relevant to, the page.
Check out: http://www.seomoz.org/learn-seo/title-tag
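As a purely hypothetical example, a unique, descriptive title for one of your pages might look like this:

```html
<!-- Hypothetical example: one unique, descriptive title per page -->
<title>Vietnam Visa on Arrival: Fees and Processing Times | YourTravelSite</title>
```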