Moving content to a clean URL
-
Greetings,
My site was seriously punished in the recent Penguin update. I foolishly got some bad outsourced, spammy links built and I am now paying for it.
I am now thinking it best to start fresh on a new URL, but I am wondering if I can use the content from the flagged site on the new URL.
Would this be flagged as duplicate content, even if I took the old site down?
Your help is greatly appreciated.
Silas
-
Thanks Nathan.
Yes, I kind of doubt the disavow tool is of much use at this point, which is why I am now looking at moving to a new URL.
Lessons learned.
-
Well, after reading some other Q&A posts I made the discovery that the disavow tool doesn't permanently remove the link.
Others have suggested going through the effort of physically removing the links.
I found this article on backlink
-
Hi Nathan,
Thanks for your reply.
Unfortunately, I already used the disavow tool a few months back when I realized what had happened to my site. I was starting to rank up again when the new Penguin update came along and buried me.
So you don't think I would be punished for duplicate content if I used my content on a new URL after taking down the old site?
Many thanks!
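If the old site really does come down, one way to make sure Google drops the old copies quickly (so the new domain isn't seen as duplicating them) is to return "410 Gone" for every old URL. A minimal sketch, assuming the old site runs on Apache with mod_rewrite enabled:

    # .htaccess on the OLD domain: answer every request with 410 Gone
    # so crawlers learn the content has been removed permanently.
    RewriteEngine On
    RewriteRule ^ - [G,L]

(The [G] flag is mod_rewrite shorthand for a 410 response; on nginx or another server the equivalent directive would differ.)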
-
Hi Silas,
While moving content would probably solve your problem, a cheaper option would be to use Google's "disavow links" tool. It is meant to be used when you know which links are causing the problem. I have not had to deal with this problem myself yet, but from my reading I believe it may solve your problem, as well as spare you the potential problems of creating a new URL: the hassle of re-branding that URL, and having to figure out a good new one!
Simply Google "disavow tool" and it should be the first result. It will take you to your Webmaster Tools page, although I'm not sure how to find it there without doing a search.
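If you do go the disavow route, the tool expects a plain-text (.txt) file listing the links. A minimal sketch of the documented format (the domains below are invented placeholders):

    # Spammy links from the outsourced campaign.
    # Lines starting with "#" are comments and are ignored.

    # Disavow a single URL:
    http://spammy-directory.example.com/links/page1.html

    # "domain:" disavows every link from that domain:
    domain:spammy-blog-network.example.net
    domain:cheap-links.example.org

Disavowing an entire domain with the domain: prefix is usually safer than chasing individual URLs, since spammy sites tend to link from many pages.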
Related Questions
-
Question RE: Links in Headers, Footers, Content, and Navigation
This question is regarding the Whiteboard Friday from October 2017 (https://moz.com/blog/links-headers-footers-navigation-impact-seo). Sorry that I am a little late to the party, but I wanted to see if someone could help out. So, in theory, if header links matter less than in-content links, and links lower on the page have their anchor text value stripped from them, is there any point in linking to an asset in the content that is also in the header, other than for user experience (which I understand should be paramount)? Just want to be clear. Also, if in-content links are better than header links, then hypothetically an industry would want to find ways to organically link to landing pages rather than including that landing page in the header, no? Again, this is just from a Google link equity perspective, not a user experience perspective; I'm just trying to wrap my head around the lesson.
White Hat / Black Hat SEO | 3VE
-
Without prerender.io, is Google able to render & index geographical dynamic content?
One section of our website is built as a single-page application and serves dynamic content based on geographical location. Before I got here, we had used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render & index geographical dynamic content? I'm assuming not. If the answer is no, what are some solutions other than converting everything to HTML (which would be a huge overhaul)?
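One approach short of a full HTML rebuild is dynamic rendering: detect crawler user agents on the server and hand them a prerendered HTML snapshot, while regular visitors still get the JavaScript app. A minimal sketch in Python/Flask, where the route, the bot list, and render_snapshot() are all illustrative placeholders rather than anything from the original question:

    from flask import Flask, request

    app = Flask(__name__)

    # Crawlers that can't reliably execute the JS app (list is illustrative)
    BOT_AGENTS = ("googlebot", "bingbot", "yandexbot", "baiduspider")

    def is_bot(user_agent):
        ua = (user_agent or "").lower()
        return any(bot in ua for bot in BOT_AGENTS)

    def render_snapshot(region):
        # Placeholder: in practice this would return cached headless-browser
        # output (what prerender.io used to provide), generated per region.
        return "<html><body><h1>Offers for %s</h1></body></html>" % region

    @app.route("/locations/<region>")
    def region_page(region):
        if is_bot(request.headers.get("User-Agent")):
            # Crawlers get fully rendered HTML they can index.
            return render_snapshot(region)
        # Everyone else gets the SPA shell; JS fills in the geo content.
        return '<div id="app"></div><script src="/app.js"></script>'

    if __name__ == "__main__":
        app.run()

Google has described this kind of user-agent-based prerendering as acceptable so long as bots and users ultimately see equivalent content, which is what distinguishes it from cloaking.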
White Hat / Black Hat SEO | imjonny123
-
Restructuring URLs - unsure if this falls on the spammy side of paths.
Hi all, I'm restructuring a site that has been built with no real structure. It's moving over to HTTPS and having a full new development, so it's a good time to tackle it all together. It's a snowboard site, and at the moment the courses, camps etc. all exist as flat pages like: examplesnowboarding.com/off-piste-backcountry/
I want to tighten the structure so it gives more meaning to the pages, lets me style them selectively, and makes it easier for the client to manage, but I'm worried that repeating the word "snowboard" too often will look spammy. I'm wanting to do the following:
URL - examplesnowboarding.com/snowboard-courses/splitboard-backcountry-intro/
URL - examplesnowboarding.com/snowboard-camps/technical-performance/
URL - examplesnowboarding.com/snowboard-camps/girls-only/
URL - examplesnowboarding.com/snowboard-lessons/private/
URL - examplesnowboarding.com/snowboard-lessons/group/
The URLs are clean and humanly descriptive, but it does mean the "snowboard" keyword is used a lot! The other two options I thought of were like so (including "snowboard" in the page name, not the path):
URL - examplesnowboarding.com/courses/snowboard-splitboard-backcountry-intro/
URL - examplesnowboarding.com/camps/snowboard-technical-performance/
URL - examplesnowboarding.com/camps/snowboard-girls-only/
URL - examplesnowboarding.com/lessons/private-snowboard/
URL - examplesnowboarding.com/lessons/group-snowboard/
Or simply removing "snowboard", as "snowboarding" is already in the main URL:
URL - examplesnowboarding.com/courses/splitboard-backcountry-intro/
URL - examplesnowboarding.com/camps/technical-performance/
URL - examplesnowboarding.com/camps/girls-only/
URL - examplesnowboarding.com/lessons/private/
URL - examplesnowboarding.com/lessons/group/
Any thoughts appreciated!
White Hat / Black Hat SEO | snowflake74
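Whichever naming option wins, the HTTPS move plus restructure means every old flat URL should 301-redirect to its new home so existing link equity follows the content. A minimal Apache .htaccess sketch, assuming the site runs on Apache and the first option above is chosen (the old slug is the one example path given in the question):

    # .htaccess sketch for the restructured HTTPS site
    RewriteEngine On

    # Send any remaining HTTP traffic to HTTPS first
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://examplesnowboarding.com/$1 [R=301,L]

    # One permanent redirect per moved page (old flat slug -> new structured path)
    RewriteRule ^off-piste-backcountry/?$ /snowboard-courses/splitboard-backcountry-intro/ [R=301,L]

-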
Site consisting of ~80% canonicalised URLs - what is the impact on visibility?
Hey everyone, I represent an international wall decorations store where customers can freely choose a pattern to be printed on a given material from among a few million patterns. Due to the extremely large number of potential URL combinations, we have struggled with too many URL addresses for months now (Search Console notifications). So we finally decided to reduce the number of products with the canonical tag. Based on user behaviour, our business needs, and monthly search volume data, we selected the 8 most representative of our 40 product categories and made them canonical targets for the rest. For example: if we chose "Canvas prints" as our main product category, then every "Framed canvas" product URL points a rel=canonical tag toward its equivalent URL within the "Canvas prints" category. We applied the same logic to the other categories (so a "Vinyl wall mural - Wild horses running" URL points its rel=canonical tag to the "Wall mural - Wild horses running" URL, etc.). In terms of Googlebot's interpretation, there are really only tiny differences between those product URLs, so merging them with rel=canonical seems like a valid use. But we need to keep those canonicalised URLs for users' needs, so we can't remove them from the store, and noindex does not seem like a good option either. However, we're concerned about our SEO visibility: if we make those changes, our site will consist of ~80% canonicalised URLs (47.5 of 60 million). In your experience, do you have advice on how we should handle this issue?
Regards, JMB
White Hat / Black Hat SEO | _JediMindBender
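For reference, the setup described is just a link element in the head of each canonicalised variant page. A sketch with invented store paths:

    <!-- On the "Framed canvas - Wild horses running" variant page: -->
    <link rel="canonical" href="https://example-store.com/canvas-prints/wild-horses-running/" />

A canonicalised page can still be crawled and shown to users; the tag only consolidates indexing and ranking signals onto the target URL, which is why it fits the "keep the pages for users" requirement better than noindex.

-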
Is Syndicated (Duplicate) Content considered Fresh Content?
Hi all, I've been asking quite a few questions lately and sincerely appreciate your feedback. My co-workers and I have been discussing content as an avenue outside of SEO. There are a lot of syndicated content programs/plugins out there (in a lot of cases duplicate) - would this be considered fresh content on an individual domain? An example may clearly show what I'm after: domain1.com is a lawyer in Seattle; domain2.com is a lawyer in New York. Both need content on their websites relating to being a lawyer for Google to understand what each domain is about. Fresh content is also a factor within Google's algorithm (source: http://moz.com/blog/google-fresh-factor), therefore fresh content is needed on their domains. But if that content is duplicate, does it still hold the same value? Question: Is fresh content (adding new / updating existing content) still considered "fresh" even if it's duplicated across multiple domains? Purpose: domain1.com may benefit from a resource for his/her local clientele, as would domain2.com, and both sets of customers would be reading the "duplicate content" for the first time; therefore both lawyers would be seen as an authority and improve how their websites rank. We weren't interested in ranking the individual article and are aware of canonical URLs. We aren't implementing this as a strategy - just as a means to really understand content marketing outside of SEO. Conclusion: if duplicate content is still considered fresh content on an individual domain, then couldn't duplicate content (that obviously won't rank) still help SEO across a domain? This may sound controversial and I desire an open-ended discussion with linked sources / case studies. This conversation may tie into another Q&A I posted: http://moz.com/community/q/does-duplicate-content-actually-penalize-a-domain. TL;DR version: Is duplicate content (the same article across multiple domains) considered fresh content on an individual domain?
Thanks so much, Cole
White Hat / Black Hat SEO | ColeLusby
-
20-30% of our ecommerce categories contain no extra content - could this be a problem?
Hello, about 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda? Thanks!
White Hat / Black Hat SEO | BobGW
-
Indexing content behind a login
Hi, I manage a website within the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason most of the content sits behind a login. My challenge is that we have a massive amount of interesting and unique content available on the site, and I want healthcare professionals to find it via Google! At the moment, if a user tries to access this content they are prompted to register / log in. My question is: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming that it will. If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can have on them! I look forward to all of your suggestions, as I'm struggling for ideas now! Thanks
Steve
White Hat / Black Hat SEO | stever999
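One documented alternative to plain user-agent sniffing is Google's paywalled/gated-content structured data (introduced with the "flexible sampling" guidelines that replaced First Click Free): the gated section is declared in markup, so letting Googlebot past the login is no longer silent cloaking. A hedged JSON-LD sketch, where the headline and CSS selector are invented placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Clinical update for healthcare professionals",
      "isAccessibleForFree": "False",
      "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": "False",
        "cssSelector": ".gated-content"
      }
    }
    </script>

This tells Google that the content is intentionally access-restricted and which element is gated, rather than serving different HTML per user agent and hoping nobody notices.

-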
Schema.org trickery and duplicate content across domains
I've found the following abuse, and I'm curious what I could do about it. Basically the scheme is:
- own some content only once (pictures, descriptions, reviews, etc.)
- use different domain names (no problem if you use the same IP or IP C-class address)
- have a different layout (this is basically the key)
- use schema.org trickery, meaning show (the very same) reviews on a different rating scale, and show slightly fewer reviews on one site than on another
Quick example: http://bit.ly/18rKd2Q
#2: budapesthotelstart.com/budapest-hotels/hotel-erkel/szalloda-attekintes.hu.html (217.113.62.21), 328 reviews, 8.6 / 10
#6: szallasvadasz.hu/hotel-erkel/ (217.113.62.201), 323 reviews, 4.29 / 5
#7: xn--szlls-gyula-l7ac.hu/szallodak/erkel-hotel/ (217.113.62.201), no reviews shown
It turns out that this tactic, even without the 4th step, can be quite beneficial for ranking with several domains. Here is a little investigation I've done (not really extensive, it took around an hour and a half, but it is quite shocking nonetheless):
https://docs.google.com/spreadsheet/ccc?key=0Aqbt1cVFlhXbdENGenFsME5vSldldTl3WWh4cVVHQXc#gid=0
Kaspar Szymanski from the Google webspam team said that they have looked into it and will do something, but honestly I don't know whether I can believe that or not. What do you suggest?
- Should I leave it, and try to copy this tactic to rank with the very same content multiple times?
- Should I deliberately cheat with markup?
- Should I play nice and hope that these guys will sooner or later be dealt with? (Honestly, I can't see this one working out.)
- Should I write a case study on this, so that if the tactic gets bigger attention, Google will deal with it?
Could anybody push this towards Matt Cutts, or anybody else who is responsible for these things?
White Hat / Black Hat SEO | Sved
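For contrast, legitimate review markup keeps a single, consistent rating scale wherever the same reviews appear. A sketch of what consistent schema.org AggregateRating markup would look like, using the numbers from the #2 result above:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Hotel",
      "name": "Hotel Erkel",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "8.6",
        "bestRating": "10",
        "ratingCount": "328"
      }
    }
    </script>

Showing the same pool of reviews as 8.6 / 10 on one domain and 4.29 / 5 on another is exactly the kind of inconsistency review-markup guidelines prohibit, even though the two figures are arithmetically near-identical (4.29 x 2 = 8.58), which suggests deliberate rescaling rather than different review sets.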