Optimize a Classifieds Site
-
Hi,
I have a classifieds website and would like to optimize it. The issues/questions I have:
-
A classifieds site has, say, 500 cities. Is it better to create a separate subdomain for each city (http://city_name.site.com) or a subdirectory (http://site.com/city_name)?
-
Each city will have, say, 50 categories, and those 50 categories are common across all the cities. As a result, the layout and content will be the same everywhere, with the only differences being the city name, the latest ads from each city, and the URLs pointing to each category in the relevant city.
A classifieds site's architecture is highly prone to producing pages that look like duplicate content even though they aren't really duplicates. What is the best way to deal with this situation?
I was hit by Panda in April 2011, with traffic dropping 50%, and it has stayed around that level since. What is the best way to handle a duplicate-content penalty on a classifieds site like this?
Cheers!
-
Thanks, Dr. Peter :). I have implemented your suggestions, so we'll see if I get better rankings. Meanwhile, I will continue the link-building effort for the site!
-
They shouldn't - a META NOINDEX is easier to undo than a Robots.txt block, 301, or canonical tag, in my experience. The biggest risk is just a delay - it may take Google a little time to re-index the content once you remove the tag.
What I wouldn't do is add/remove the tag rapidly. For example, if you had a product that went out of stock every other day, I'd leave it alone - Google wouldn't respond quickly enough to all those changes. So, once a category has enough results, I'd lift the NOINDEX permanently. It's really just a move to consolidate while you build up the site - both in terms of content and your link profile.
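The "lift it permanently, don't flap it" advice can be sketched as simple hysteresis logic in the template layer. This is a minimal illustration, not anything Moz- or Google-prescribed: the 5-listing threshold and the `was_ever_indexed` flag are assumptions made up for the example - the point is that once a category crosses the threshold, the NOINDEX never comes back, even if listings later dip.

```python
# Sketch: decide whether a category page should carry META NOINDEX.
# MIN_LISTINGS is a hypothetical threshold; was_ever_indexed is an
# assumed per-category flag stored by the site. Once a category has
# crossed the threshold, NOINDEX is lifted permanently so Google
# never sees the tag flip on and off with day-to-day inventory.

MIN_LISTINGS = 5  # hypothetical threshold from the question above

def robots_meta(listing_count: int, was_ever_indexed: bool) -> tuple[str, bool]:
    """Return (meta tag to emit or '', updated was_ever_indexed flag)."""
    if was_ever_indexed or listing_count >= MIN_LISTINGS:
        # Lift NOINDEX permanently once the category has enough content.
        return "", True
    # Thin category: keep it out of the index but let links be followed.
    return '<meta name="robots" content="noindex, follow">', False
```

A category with 2 listings that has never been indexed gets the tag; one that once reached 5+ listings stays indexable even if it drops back to 1.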
-
I really want to clear out thin content, and your response makes it much clearer to me. Now I know what to do next. Thank you so much for replying and clarifying the details.
I have another question. Consider this scenario: I add META NOINDEX to the category pages that have fewer than 5 classified ads. Later down the road, more than 5 ads get posted in one of those categories, and I would like to make it indexable again by removing the tag. Will Google treat this page differently - say, with some penalty for having been NOINDEX at first and indexable later - or refuse to index these categories because they were NOINDEX earlier?
-
Unfortunately, the painful reality, especially if you've been hit by Panda, is that you probably can't support that scale; it likely looks thin to Google. 500 cities X 50 categories = 25,000 "category" pages, so to speak, all of which are basically just search results. For most sites, it's just too much.
I'd definitely keep the cities as sub-folders. If you go the sub-domain route, you could fracture your internal link-juice even more. It depends a bit on the authority and marketing budget of the site. If each city is a separate property with its own sales force, budget, etc., there may be a logic to sub-domains. Unless you're Groupon or someone like that, though, it's probably a bad idea.
You may have to prune down the indexed content, to be frank. I'd look for other Panda factors, too, like aggressive ad density (too many ads to too little content) or very thin pages. If you have tons of cities or categories with no listings, META NOINDEX them. You could even do it dynamically - only let Google index a page if it has 1+ listings, for example.
I'd also take a look at other low-value content, like paginated search. If each city has 100s of pages and you're indexing page 2, page 3, etc., consider consolidating them. It's a tricky topic, but Adam Audette has a great write-up here:
http://searchengineland.com/five-step-strategy-for-solving-seo-pagination-problems-95494
These pages can look very low-value to Google. Add in search sorts and other variants, and your 25K categories could be exploding into hundreds of thousands of pages, before Google even gets to the listings themselves. The ads are the real meat of the site, and that's where you want Google to focus.
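The pagination advice above can be sketched in code as well. This is one common pattern from that era - rel="prev"/"next" hints plus keeping deep result pages out of the index with "noindex, follow" - and the URL pattern, query parameter, and the "noindex past page 1" policy are all illustrative assumptions for the sketch, not a prescription from this thread or from the linked article.

```python
# Sketch: build <head> tags for a paginated city/category listing.
# base_url and the ?page= parameter are hypothetical; the policy shown
# is "page 1 indexable, deeper pages noindex-but-follow" so Googlebot
# can still crawl through to the individual ads (the real meat).

def pagination_head(base_url: str, page: int, total_pages: int) -> list[str]:
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
        # Keep deep result pages out of the index, but let Googlebot
        # follow their links through to the individual listings.
        tags.append('<meta name="robots" content="noindex, follow">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags
```

Page 1 emits only a rel="next" hint and stays indexable; page 2 and beyond carry prev/next hints plus the noindex directive.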