How to avoid Sandbox?
-
What is the Sandbox? To avoid something like the Sandbox, you should first understand what it is; but since nobody knows whether the Sandbox even exists, let's focus on the main problem here: how do I get my pages indexed? Over the years I have tried a lot of techniques, but I found only one that seems to work:
1. If your site is not dynamic, make it so.
2. Create a sitemap and a feed (I recommend RSS 2.0).
3. Reference the sitemap in your robots.txt (last line, like this: Sitemap: http://www.yourdomainname.com/sitemap.xml).
4. Submit the sitemap in the Sitemaps section of your Webmaster Tools account.
5. Submit your RSS feed to the main RSS directories (just google the words and you'll find plenty of them). Start with FeedBurner, to please Google.
Wait a week or so and you'll see your pages start appearing in the index. Good luck!
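The sitemap referenced from robots.txt can be a minimal XML file following the sitemaps.org protocol. A sketch, with yourdomainname.com as a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled. -->
  <url>
    <loc>http://www.yourdomainname.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.yourdomainname.com/about.html</loc>
  </url>
</urlset>
```

Only `<loc>` is required per entry; `<changefreq>` and `<lastmod>` are optional hints.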
-
Google Sandbox is a debated topic from 2004 and 2005 that has never been confirmed. You shouldn't concern yourself with it too much. Also, the concept of the Sandbox would only temporarily penalize new domains for the first few months. If you are worried about being penalized either temporarily or permanently, there are a couple of things you can always do:
1. Create great content
2. Use aged domains

If you concern yourself with making the best site possible and don't worry about making a quick buck, you shouldn't have a problem.
-
We need a bit more info.
I don't believe there is a sandbox as such.
Related Questions
-
How to avoid a redirect chain?
Hi there, I am aware that it is not good practice to have a redirect chain, but I am not really sure how to fix it (on Apache). I have multiple redirects in a chain because, on the one hand, the content of the site got a new URL and, on the other hand, I changed from http to https. Thus I have a chain like:
http://example.com via 301 to http://the-best-example.com via 301 to https://the-best-example.com via 301 to https://greatest-example.com
Obviously I want to clean this up without losing any link juice or visitors who had bookmarked my site. So, I could make three separate redirects:
http://example.com via 301 to https://greatest-example.com
http://the-best-example.com via 301 to https://greatest-example.com
https://the-best-example.com via 301 to https://greatest-example.com
But is there a way to combine them? Can I use an "OR" operator to link the 3 conditions to this one rule? Any other suggestions? Thanks a lot!
Technical SEO | | netzkern_AG0 -
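To combine the three redirects in the question above into one rule: mod_rewrite does support an [OR] flag on conditions. A minimal .htaccess sketch, assuming mod_rewrite is enabled and using the hostnames from the question:

```apache
RewriteEngine On
# Any request that is on an old host, or still on plain http,
# gets a single 301 straight to the final https host.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^the-best-example\.com$ [NC,OR]
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://greatest-example.com/$1 [R=301,L]
```

Each request then takes at most one hop instead of three, so bookmarks and inbound links land on https://greatest-example.com directly.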
Avoid Keyword Self-Cannibalization
I'm working on the SEO for a page on my website, targeting a specific keyword. When I run the on-page grader, it leaves 'Avoid Keyword Self-Cannibalization' unticked. What is the best way to sort this issue? I've noticed that the page is accessible at 2 URLs. Does this play a role in the warning and in my ranking in Google?
Technical SEO | | Jaybeamer0 -
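If the same page really is reachable at two URLs, as described above, one common fix (a sketch with placeholder URLs, not a Moz-specific recommendation) is to declare a single canonical URL on both copies:

```html
<!-- Served on both duplicate URLs, e.g. /my-page and /my-page?ref=nav -->
<head>
  <link rel="canonical" href="http://www.example.com/my-page" />
</head>
```

Two indexable copies of the same page can split ranking signals between them, which is the kind of self-competition the cannibalization check flags.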
One URL To All Sites, How Can I Avoid It?
I am using an EMD and have only 1 page, which is the main URL. Now my question is: how can I avoid a penalty for submitting the same URL to different platforms like Web 2.0 sites, article directories, etc.? Please help.
Technical SEO | | seodadoo5670 -
My .htaccess has changed, what do I do to avoid it again?
Hello. Today I noticed that our site no longer auto-redirected from without www to with www. When I checked the .htaccess file I noticed a # in front of each line, and I know we did not insert them; after I removed them it worked fine. The only change we made recently was adding a mobile version of the site, but the call to auto-redirect is in JS and not in the .htaccess. Could it be the server? Is there any way that anything else might cause this? The site is HTML and WP; could it be because of that? Thanks, Simo
Technical SEO | | Yonnir0 -
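For reference, a typical non-www to www block of the kind described above (a sketch with yourdomain.com as a placeholder; the poster's actual rules may differ):

```apache
RewriteEngine On
# Send bare-domain requests to the www host with a permanent redirect.
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
```

If every line turns up prefixed with #, the whole block is inert, which matches the symptom. WordPress rewrites the section of .htaccess between its own # BEGIN/END WordPress markers when permalink settings are saved, and hosting control panels can also modify the file, so checking the file's change timestamp and tightening its permissions is a reasonable next step.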
Avoiding Duplicate Content in E-Commerce Product Search/Sorting Results
How do you handle sorting on e-commerce sites? Does it look something like this? For example:
example.com/inventory.php
example.com/inventory.php?category=used
example.com/inventory.php?category=used&price=high
example.com/inventory.php?category=used&location=seattle
If not, how would you handle this? If so, would you just include a noindex tag on all sorted pages to avoid duplicate content issues? Also, how does pagination play into this? Would it be something like this? For example:
example.com/inventory.php?category=used&price=high
example.com/inventory.php?category=used&price=high&page=2
example.com/inventory.php?category=used&price=high&page=3
If not, how would you handle this? If so, would you still include a noindex tag? Would you include a rel=next/prev tag on these pages in addition to, or instead of, the noindex tag? I hope this makes sense. Let me know if you need me to clarify any of this. Thanks in advance for your help!
Technical SEO | | AlexanderAvery1 -
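One common combination for the parameterized URLs in the question above (a sketch, not the only valid setup): noindex,follow on sorted/filtered variants, plus rel prev/next markup across the paginated series:

```html
<!-- On example.com/inventory.php?category=used&price=high&page=2 -->
<head>
  <!-- Keep sorted/filtered variants out of the index, but let crawlers follow links. -->
  <meta name="robots" content="noindex, follow" />
  <!-- Declare this page's position in the paginated series. -->
  <link rel="prev" href="http://example.com/inventory.php?category=used&amp;price=high" />
  <link rel="next" href="http://example.com/inventory.php?category=used&amp;price=high&amp;page=3" />
</head>
```

Note the &amp; entities: raw & characters inside HTML attribute values should be escaped.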
Same product in Multiple categories ecommerce store, best way to avoid duplicate content?
Hello All, I'm building a Magento store with around 500 products. One thing is that I am going to have some products in multiple categories. Do you think the best solution is to remove the category name from the URL structure, or would this devalue SEO? Also, would the use of canonical links remove any duplicate content issues if the category name were left in? So overall, what would get better results: no category name in the URL (e.g. phonename-model1.html) vs. category name in the URL (e.g. phones/phonename-model1.html / videophones/phonename-model1.html) plus canonical links? Any feedback or views would be great.
Technical SEO | | voipme0 -
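If the category name stays in the URL as described above, the usual approach (a sketch; the domain and paths are placeholders) is a canonical link emitted from every category path back to one preferred URL:

```html
<!-- Emitted on both /phones/phonename-model1.html
     and /videophones/phonename-model1.html -->
<link rel="canonical" href="http://www.example.com/phonename-model1.html" />
```

Magento's catalog SEO configuration includes a "Use Canonical Link Meta Tag For Products" option that emits this kind of tag automatically, which is worth checking before hand-rolling it.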
Avoiding duplicate content with national e-commerce products and localized vendors
Hello 'mozzers! For our example purposes, let's say we have a national cog reseller, www.cogexample.com, focusing on B2C cog sales. The website's SEO efforts revolve around keywords with high search volumes; no long-tail keywords here! CogExample.com sells over 35,000 different varieties of cogs online, broken into search-engine-friendly categories and using both HTML and meta pagination techniques to ensure adequate deep linking and indexing of their individual product pages. With their recent fiscal success, CogExample.com has signed 2,500 retailers across the United States to resell their cogs. CogExample.com's primary objective is B2C online sales for their highly sought search terms, i.e. "green cogs". However, CogExample.com also wants their retailers to show up for local/geo search, i.e. "seattle green cogs". The geo/location-based retailers' web content will be delivered from the same database as the primary online store, and is thus very likely to cause duplicate content issues.
Questions:
1. If the canonical meta tag is used to point the geo-based product to the online primary product, the geo-based product will likely be placed in the supplemental index. Is this correct?
2. Given the massive product database (35,000 products) and retailer count (2,500), it is not feasible to rewrite 87,500,000 pages of content to satisfy unique content needs. Is there any way to prevent the duplicate content penalty?
3. Google product feeds will be used to localize content and feed Google's product search. Is this "enough" to garner sizable amounts of traffic and/or retain SERP ranks?
Technical SEO | | CatalystSEM0 -
Sandboxed
Hi all, any help with the following. We built a new site for a customer in June of last year. We then cracked on with the on-page and off-page SEO. All white hat... good quality. 3 months in, the site was still not ranking with Google and indeed appeared to be sandboxed. All working fine with Bing and Yahoo. We followed all the steps to get recognition from Google, but to no avail. In December we took the drastic step of providing the customer with a completely new site: new content, design, structure, etc. In Jan we went back and fixed all the external linking sources to link to the new pages on the new site. Now 7 months in... the site is STILL sandboxed. All still fine with Bing and Yahoo. Thoughts, anyone?
Technical SEO | | SEOwins0