Can we use our existing site content on a new site?
-
We added thousands of pages of unique content to our site. Soon after, Google released Penguin and we lost our rankings for our major keywords, and after months of effort we decided to start a new site.
If we use all of the existing site's content on the new domain, will Google penalize the site for duplicate content, or will it be treated as unique?
Thanks
-
You are right; I think my SEO guy purchased some links. Do you think changing the anchor text of those incoming links could help us?
If not, and we are unable to contact the webmasters of those sites to get the links removed, what are the other options?
Thanks
-
Looking at your external links in the SEOmoz Open Site Explorer tool, it's clear that someone paid for links to your site. Many of your links use the words "Dubai Hotel" as anchor text, and these are obviously unnatural links. You need to get rid of the paid links.
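If you truly cannot reach the webmasters to get the paid links removed, Google's disavow links tool in Webmaster Tools is the usual fallback: you upload a plain-text file telling Google which URLs or whole domains to ignore when assessing your link profile. A minimal sketch of the file format, with hypothetical placeholder domains:

    # Paid links; removal requests sent, no response received
    domain:example-link-network.com
    http://spammy-directory.example.com/dubai-hotels.html

Lines starting with # are comments, "domain:" lines disavow an entire domain, and bare URLs disavow a single page. Treat it as a last resort after genuine removal attempts.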
-
Waiting for your feedback!
-
Oh, also, regarding duplicate content: if you use the content from your old site on the new site and remove/deindex the old site, there is no duplicate content issue, because the content is not living on two sites. It's unique.
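If you do move everything, a common way to take the old pages out of the index while passing visitors (and at least some link equity) to the new domain is a site-wide 301 redirect. A minimal sketch for Apache's .htaccess, assuming the URL paths stay the same on the new domain (olddomain.com and newdomain.com are hypothetical placeholders):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]

One caveat: if the old domain is carrying a link-based penalty, redirecting it wholesale can pass the problem along with the equity, so clean up the bad links first.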
-
Don't just abandon the old/existing site without looking into the reasons you got penalized, or you'll just create a new site with no PageRank and no Google trust that is doomed to suffer the same fate.
Send me the URL and, if you like, I can take a quick look at your site and give you some insight.
-
If both sites have the content up, then yes, it will be duplicate content, and Google doesn't like that.
Moving the content to another site might not work too well, because the new site will have no backlinks to those articles; they will not rank as well as they did on the old site.
It makes sense in a way: someone can't just create a new site, copy content from another site, and have it rank equally with (or even on the same page as) the original content.
-
Related Questions
-
Hi! I first wrote an article on my Medium blog but am now launching my site. a) How can I get a canonical tag on Medium without importing, and b) is there any issue with claiming my blog post is the original when the Medium post came first?
Hi! As above, I wrote this article on my Medium blog but am now launching my site, UnderstandingJiuJitsu.com. I have the post saved as a draft because I don't want to get pinged by Google. a) How can I get a canonical tag on Medium without importing, and b) is there any issue with claiming the UJJ.com post is the original when the Medium post was published first? Thanks and health, Elliott
Technical SEO | OpenMat
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran Fetch and Render in Search Console, it said the map was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it): does Google automatically block its own maps from Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:

    User-agent: *
    Allow: /maps/api/js?
    Allow: /maps/api/js/DirectionsService.Route
    Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
    Allow: /maps/api/js/ElevationService.GetElevationForLine
    Allow: /maps/api/js/GeocodeService.Search
    Allow: /maps/api/js/KmlOverlayService.GetFeature
    Allow: /maps/api/js/KmlOverlayService.GetOverlays
    Allow: /maps/api/js/LayersService.GetFeature
    Disallow: /

Any assistance would be greatly appreciated. Thanks, Ruben
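One thing worth noting about that file: in robots.txt, the blanket Disallow: / blocks every path on the host except those matched by a more specific Allow: line, and Googlebot honors the most specific matching rule. So any map URL not on that Allow list stays blocked, as in this sketch (the embed path here is a hypothetical example):

    User-agent: *
    Allow: /maps/api/js?
    # a request like /maps/d/embed?mid=... matches no Allow line,
    # so the site-wide rule below blocks it
    Disallow: /

And because that robots.txt lives on Google's own host (the one serving the embedded map), only Google can change it; your own site's robots.txt isn't the file in play.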
Technical SEO | KempRugeLawGroup
-
How to handle new pages/posts with the sitemap
Hi, I've created and submitted to Google (through Webmaster Tools) a sitemap using the Google XML Sitemaps WordPress plugin. Now I've created new pages and posts. My question is: do I have to recreate and resubmit the sitemap to Google, or can I just submit the new pages and posts with the "Fetch as Google" option? Thanks so much in advance.
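For context, a sitemap is just an XML file listing your URLs; plugins like Google XML Sitemaps typically regenerate it automatically when you publish, so in most setups the file updates itself and there is nothing to resubmit by hand. A minimal sketch of what a single entry looks like, with a hypothetical placeholder URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/new-post/</loc>
        <lastmod>2013-05-01</lastmod>
      </url>
    </urlset>

Fetch as Google is more of a one-off crawl request; the sitemap is the standing inventory Google keeps checking.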
Technical SEO | tourtravel
-
Can Silos and Exact Anchor Text In Links Hurt a Site Post Penguin?
I just got a client whose site dropped from a PR of 3 to zero. This happened shortly after the Penguin release in June 2012. Examining the site, I couldn't find any significant duplicate content, and where I did find duplicate content (9%), a closer look revealed that the duplication was coincidental (common expressions). Looking deeper, I found no sign of purchased links or linking patterns that would hint at link schemes, no changes to site structure, and no change of hosting environment or IP address. I also looked at other factors, too many to mention here, and found no evidence of black-hat tactics or techniques.

The site is structured in silos: "services", "about" and "blog". All pages that fall under services are categorized (siloed) under "services", all blog entries under "blog", and all pages with company-related information under "about". When exploring the site's links in Site Explorer (SE), I noticed that SE identifies the "silo" segment of the URL (i.e. services, about, blog, etc.) and labels it as anchor text. For example, in domain.com/(services)/page-title, the page-title prefix (silo), "/services/", is labeled as anchor text. The same is true for "blog" and "about". By the way, each silo has its own navigational menu appearing specifically for the content type it represents. Overall, though there's plenty of room for improvement, the site is structured logically.

My question is: if Site Explorer is picking up the silo (services) and identifying it as anchor text, is Google doing the same? That would mean that, out of the 15 types of service offerings, all 15 links would show the exact same anchor text ("services"). Can this type of site structure (silo) hurt a website post-Penguin?
Technical SEO | UplinkSpyder
-
Linking Domains in Open Site Explorer Report No Longer Exist. Help.
Hello all, I have a number of linking domains in our Open Site Explorer report that no longer exist. I've run URL checks on just a sample of the list and found that approximately 35% of that sample come from now-dead linking domains. Can someone help? If these linking domains are defunct, how can I remove them from the report? Does Google view these dead linking domains negatively in our SERPs? Has anyone experienced this before, and what action did you take?
Technical SEO | -Al-
-
Should this site start again on a new domain?
Hi, We have not done SEO on this site; the owners used another company that appears to have outsourced the work, and the links were built by a third party using blog networks. That company has said they cannot get the links removed. Google flagged artificial links on this website in February, and in April the site lost over 10,000 visitors in a month; it has been in free fall ever since. The categories have been recreated with no redirects, due to the number of backlinks from the blog sites to the original category pages, but the site is not recovering: it is down to 1,500 visitors a month and used to get 14,000 a month. So should my customer ditch the domain and move this site to a fresh domain? http://www.kids-beds-online.com Any answers would really be appreciated. Thanks, Tracy
Technical SEO | dashesndots
-
Should we introduce subfolders into the URLs on a new site?
A site we are working on currently gives no indication of subfolders in the URL. E.g. the site uses: www.examplesite.com/brand-name rather than: www.examplesite.com/popular-products/brand-name. There are breadcrumbs on the site to show users what part of the site they are in and how they navigated there. We are building a new site and have to decide which route to take: since the site is already performing relatively well in the SERPs and the URLs are nice and short this way, is it a good idea to keep them like this, or is it better for usability to include the subfolders? This post suggests that we would be best off keeping the URLs as they are, particularly since less would change: http://www.seomoz.org/blog/should-i-change-my-urls-for-seo Thanks in advance for your opinions! Liz @lizstraws
Technical SEO | oneresult
-
A site I am working with has multiple duplicate content issues.
A reasonably large e-commerce site I am working with has multiple duplicate content issues. On four or five keyword domains related to the site's content, the owners simply duplicated the home page, with category links pushing visitors to the category pages of the main site. There was no canonical URL instruction, so I have set the preferred URL via Webmaster Tools, but I now need to code this into the website itself. For a reasonably large e-commerce site, how would you approach that particular nest of troubles? That's even before we get to grips with the on-page duplication and wrong keywords!
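On coding the preference into the site itself: the standard mechanism is a cross-domain rel="canonical" link element in the head of each duplicated page, pointing at the version on the main domain that you want indexed. A minimal sketch, with hypothetical placeholder URLs:

    <head>
      <link rel="canonical" href="http://www.main-site.com/category-page/" />
    </head>

A site-wide 301 redirect from the keyword domains to the main site is the stronger signal if those domains don't need to resolve as separate pages.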
Technical SEO | SkiBum