Should /dev folder be blocked?
-
I have been experiencing a ranking drop every two months, so I came upon a new theory this morning...
Does Google do a deep crawl of your site every 60-90 days, and would it penalize a site if it crawled into your /dev area? That area contains pretty much the exact same URLs and content as the production environment, so would you be penalized for duplicate content?
The only issue I see with this theory is that I have been penalized only for specific keywords on specific pages, not necessarily across the board.
Thoughts?
What would be the best way to block out your /dev area?
-
Hey Ryan,
It isn't really a competition thing, because I bounce back to the same or a better spot after about 30 days, so it seems very algorithmic. I just haven't been able to figure out what I'm getting penalized for, or how. However, this morning while looking at some rankings I noticed a dev page was indexed and ranking in Google, so I said "Uh oh..." and discovered that my developer hadn't blocked out our development directory... which means that if Google deep-indexes a site every so often, it would probably crawl it and find an exact copy of the live site...
The 60-90 day deep-indexing cycle would also make sense, as the ranking drop has always occurred around the 27th of the month.
Once I block out /dev, what do you think would be the best way to get Google to re-crawl the site and clear up any duplicate issues - delete my robots.txt and sitemaps from Google and re-submit them?
-
It is a great idea to block the development portion of your site. You can do that with your robots.txt file:
User-Agent: *
Disallow: /dev
But you're right, it doesn't sound like the culprit in your case. It's more likely that your competition is slowly gaining an edge on you. Make sure your on-page SEO is optimized, and then try to get some more inbound links to your pages.
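One caveat, as general practice rather than anything stated in this thread: a robots.txt Disallow stops compliant crawlers from fetching /dev, but it does not remove URLs that are already indexed, and it leaves the dev copy readable by anyone who finds it. A sketch of locking the directory down entirely with HTTP authentication, assuming an Apache host (the realm name and file path are hypothetical placeholders):

```apache
# .htaccess placed inside the /dev directory (assumes Apache with mod_auth_basic).
AuthType Basic
AuthName "Development area"
# Hypothetical path; point this at a real password file created with the htpasswd tool.
AuthUserFile /home/user/.htpasswd
Require valid-user
```

Behind a 401, crawlers can't read the duplicate content at all, so the robots.txt rule becomes a belt-and-suspenders measure rather than the only line of defense.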
Related Questions
-
Do we still have PageRank / link juice / link equity, and the dilution concept?
Hi all, As per traditional SEO rules, we have this link juice and dilution concept. Many websites have changed their linking structure based on the belief that "the more pages you link to, the more PageRank gets diluted." Many websites therefore avoided linking to a large number of pages from the homepage, and we followed the same approach. But I wonder whether Google still handles websites and rankings the same way with respect to links. Many websites even avoid a large number of 2nd-tier pages to avoid link dilution. I have looked at competitors that employ a lot of top-level and 2nd-tier pages and are still doing well in the rankings. Please share your views and suggestions on this. Thanks
Web Design | vtmoz0
-
Is it cloaking/hiding text if textual content is no longer accessible for mobile visitors on responsive webpages?
My company is implementing a responsive design for our website to better serve our mobile customers. However, when I reviewed the wireframes of the work our development company is doing, it became clear to me that, for many of our pages, large parts of the textual content, and most of our sidebar links, would no longer be accessible to a visitor using a mobile device. The content would still be indexable, but hidden from users via media queries. There would be no access point for a user to view much of the content that's making the page rank. This is not my understanding of best practices around responsive design. My interpretation of Google's guidelines is that all of the content is served to both users and search engines, but displayed in a more accessible way depending on the user's device. For example, Wikipedia pages have introductory content but hide most of the detailed info in tabs: all of the information is still there and accessible to a user, but you don't have to scroll through as much to get to what you want. To me, what our development company is proposing fits the definition of cloaking and/or hiding text and links - we'd be making different content available to search engines than to users - and it seems to me that there's considerable risk in their interpretation of responsive design. I'm wondering what other people in the Moz community think about this, and whether anyone out there has any experience to share about inaccessible content on responsive webpages and its SEO impact. Thank you!
Web Design | mmewdell0
-
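The hiding described in the question above can be made concrete with a small sketch; the class names and breakpoint here are hypothetical, purely to illustrate the pattern the development company is proposing:

```css
/* Hypothetical sketch: content kept in the HTML (and therefore indexable)
   but removed from view for any viewport below the chosen breakpoint. */
@media (max-width: 767px) {
  .sidebar-links,
  .detail-copy {
    display: none; /* no tab, toggle, or other access point restores it */
  }
}
```

By contrast, the Wikipedia-style approach mentioned in the question keeps visible toggles in the UI, so every piece of indexed content remains reachable by the user.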
Question About Site Redesign and Nav / Page Structure
Hey guys, I am currently redesigning our company's site and have come across some things I'm not quite sure of. We used to have individual service pages in our main navigation (design, video, marketing) before the redesign. In this new design, I had the idea of making just one "services" or "capabilities" page, where these three services would each be outlined and each service would have a list of links to more specific landing pages - obviously breaking it up correctly with the appropriate HTML5 semantic tags. What I'm wondering is whether I'm going to be penalized for having those three services, which aren't necessarily closely related, on the same page, as opposed to having one page for each service (like we have now). Any help would be greatly appreciated, and let me know if I need to elaborate more. Thanks in advance!
Web Design | RenderPerfect0
-
Multiple Sites, multiple locations similar / duplicate content
I am working with a business that wants to rank in local searches around the country for the same service, so they have websites such as OURSITE-chicago.com and OURSITE-seattle.com. All of these sites sell the same services, with small variations in each state due to different legal standards. The current strategy is to put up similar "local" websites with all the same content, so the bottom line is that we have a few different sites with the same content. The business wants to go national and is planning a different website for each location. In my opinion the duplicate content is a real problem. Unfortunately the nature of the service means there aren't many ways to say the same thing on each site 50 times without duplicate content, and rewriting the content for each state seems like a daunting task when you have 70+ pages per site. So, from an SEO standpoint we have considered: 1) Using the canonicalization tag on all but the central site - I think this would hurt all of the websites' SERPs, because none would have unique content. 2) Having a central site with directories (OURSITE.com/chicago) - but this creates a problem because we need to link back to the relevant content on the main site and ALSO keep the unique "Chicago" content easily accessible to Chicago users while Seattle users access their Seattle data. The best way we could think of to do this was a frame with a universal menu and a unique state-based menu... also not a good option, because frames will hurt SEO. 3) Rewriting all the same content 50 times. You can see why none of these are desirable options. But I know that plenty of websites have "state maps" on their main site. Is there a way to accomplish this that doesn't make our copywriter want to kill us?
Web Design | SysAdmin190
-
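For reference on option 1 in the question above, the canonical tag is a single line in the head of each duplicate page; the URLs here reuse the hypothetical names from the question:

```html
<!-- In the <head> of a page on OURSITE-chicago.com, pointing at the
     central version of the same content (hypothetical URLs). -->
<link rel="canonical" href="http://OURSITE.com/services/" />
```

As the poster suspects, this asks Google to consolidate ranking signals onto the central URL, so the local sites themselves would generally not rank for that content.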
Making a third-party hosted blog look like a folder on the main domain
I have a client with a "completely pristine" Microsoft .NET web environment who is unwilling to put a WordPress installation on their server. Their management team wants a WordPress blog for the marketing department. Is there a way to host the WordPress blog with a regular hosting company yet have it appear as part of the main site (e.g., mainsite.com/blog) instead of having to put it on a subdomain (blog.mainsite.com) and lose all the SEO benefits of the blog content?
Web Design | jtroia0
-
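One commonly used answer to the question above is a reverse proxy on the main site's web server, which fetches the externally hosted blog and serves it under /blog on the main domain. A minimal sketch assuming an Apache front end (the host name is hypothetical):

```apache
# Main site's server config (requires mod_proxy and mod_proxy_http).
# Requests for mainsite.com/blog are fetched from the external WordPress
# host and served under the main domain.
ProxyPass        /blog http://wp-host.example.com/blog
ProxyPassReverse /blog http://wp-host.example.com/blog
```

Since the client's environment is Microsoft .NET, the same idea would be expressed with IIS reverse-proxy rules rather than this Apache syntax; the point of the sketch is the pattern, not the exact directives.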
The SEOmoz crawl report shows duplicate content and a duplicate title for these two URLs: http://freightmonster.com/ and http://freightmonster.com/index.html. How do I fix this?
What page is served at http://freightmonster.com/ if it is not index.html? Should I do a redirect from the index page to something more descriptive?
Web Design | FreightBoy1
-
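A common fix for the duplicate above is a 301 redirect from /index.html to the root URL, so both addresses consolidate onto one. A sketch assuming an Apache host (whether freightmonster.com actually runs Apache is not stated in the question):

```apache
# .htaccess at the site root (assumes Apache with mod_rewrite enabled).
RewriteEngine On
# Match only direct browser/crawler requests for /index.html, not internal
# rewrites, and send a permanent redirect to the canonical root URL.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ / [R=301,L]
```

After the redirect, internal links that point at /index.html should also be updated to point at / so crawlers stop requesting the old URL.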
XML Sitemap that updates daily/weekly?
Hi, I have a sitemap on my site that updates, but it isn't an XML sitemap. See here: http://www.designerboutique-online.com/sitemap/ I have used some free software to crawl the site and create a sitemap of pages, but I think that if I were to upload that sitemap it would be out of date as soon as I listed new products on the site, so I would need to rerun it. Does anyone know how I can get this to refresh daily or weekly? Or any software that can do it? I have a web firm that is willing to do one, but our relationship is at an all-time low and I don't want to hand over £200 for them to do it. Anyone with any ideas or advice? Thanks Will
Web Design | WillBlackburn0
-
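Refreshing a sitemap on a schedule boils down to regenerating the XML from the current URL list and re-running the script daily or weekly. A minimal sketch in Python, assuming you can pull the live product URLs from your own catalog (the URLs and function names here are placeholders, not anything from the site in the question):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml document listing the given URLs."""
    today = date.today().isoformat()
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(u)}</loc>\n"  # XML-escape &, <, > in URLs
        f"    <lastmod>{today}</lastmod>\n"
        "  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Usage (placeholder variable): write the file, then schedule this script
# from cron, e.g. "0 3 * * *", so the sitemap refreshes every night.
# open("sitemap.xml", "w").write(build_sitemap(current_product_urls))
```

Once the file regenerates automatically, you only need to submit the sitemap URL to Google once; the crawler will pick up the refreshed contents on its own.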
Hosting/design company that is both cheap and has a nice partner package. Any ideas?
I need to sign on with a hosting/design company that is both cheap and has a nice partner package. Any ideas?
Web Design | christinarule0