Switching site content
-
I have been advised to take a particular path with my domain. To me it seems "black hat", but I'll ask the experts:
Is it acceptable, when one owns an exact-match location domain (e.g. london.com), to run it as a tourist information site, gathering links from Wikipedia, the BBC, local paper/radio/sports websites etc., and then after 6-12 months switch the content to a business site?
What could the penalties be?
Please advise...
-
Wow, some great answers.
Thank you.
I'm in such a predicament now. I intended to keep the content and have a local section on the site with local events, a local forum etc., thus also retaining a community around it.
The anchor text I was going to build during the non-business phase was placename.
And as I want to build the new business as a brand, its new anchor text would also be placename.
I eventually want to pursue seoplacename, but I still have a lot of learning and practice to do (1-2 years), so I wanted to build up "site age" to go with the domain's 20-year registration age (is this relevant to rankings?).
Or would I be left with the wrong type of community?
-
The problem with "bait and switch" is that it doesn't really work with content and relevancy signals, including anchor text.
Google has filed several patents over the years showing how they can devalue anchor text when they see changes in context. A broad example would be building links to a site about "red pickups" and then changing everything to "tom hanks". In this situation, you're likely not going to rank nearly as well for "Tom Hanks" after the switch as you did for "red pickups", so you've likely wasted a lot of link equity.
That said, this technique does work "a little". That's why you see black-hatters buying old websites and throwing up spam. The new pages can sometimes get a little traffic based on authority and link juice, but in my opinion the ROI is so small, and the risks so large, that it's not worth the effort.
A better strategy, in my opinion, is to build out your business website in full view of the world, and attract links that are related to your industry in creative and surprising ways. Instead of bait and switch, surprise and delight people with the unexpected. This way you retain 100% of any link equity you build and the rewards pay off greater down the road.
-
You could possibly get away with it if the topic of the new site were in some way related to the content on the old site (the closer the better). However, the further the new site's topic gets from the context surrounding the backlinks that were created for the old site, the less value it's going to get from them.
Regardless of your domain name, it still takes effort to build solid links. Going through the exercise of earning good links makes the company a better company, a better competitor, and a better search result, and it will most likely give a better ROI than a bait-and-switch tactic.
-
I wonder the same thing. It's far easier to build links for a non-commercial site. If you were to bait and switch, I'd think you'd want to keep the basic content that was linked to, to be fair to those who linked to you.