Magento site with many pages
-
Just finished a scan of a Magento site.
Of course I'm getting thousands of pages that are dynamic —
search pages and others.
Checking with the site: command on Google, I see 154,000 results.
Which pages is it recommended to block?
Some people talk about blocking the search pages, while others actually recommend allowing them.
Any answers on this?
Thanks
-
There was no effect on the rankings, no significant ups or downs. But now when I do the site:www.domain.com command in Google, I see just the pages that I want Google to index.
It can only help in the long run I guess.
-
Hi
Sorry for responding late.
What was the effect on your results when you blocked all those pages?
Thanks
-
Yes, my thought is to block them through robots.txt.
-
I've had similar problems with a few Magento sites. This is a standard list I use in my robots.txt files (below). I hope it helps.
You don't have to include all the Magento folders like 'app', 'lib', 'var', 'admin', etc.; they are just there to be thorough.
I think you'll get the idea. I've brought the number of indexed pages down from half a million to just a few thousand using these.
User-agent: *
Disallow: /*?
Disallow: /*.js$
Disallow: /*.css$
Disallow: /404/
Disallow: /admin/
Disallow: /api/
Disallow: /app/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalog/product_compare/
Disallow: /catalogsearch/
Disallow: /catalogsearch/advanced/
Disallow: /catalogsearch/term/
Disallow: /catalogsearch/term/popular/
Disallow: /cgi-bin/
Disallow: /checkout/
Disallow: /checkout/cart/
Disallow: /contacts/
Disallow: /contacts/index/
Disallow: /contacts/index/post/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
Disallow: /downloader/
Disallow: /install/
Disallow: /js/
Disallow: /lib/
Disallow: /magento/
Disallow: /newsletter/
Disallow: /pkginfo/
Disallow: /private/
Disallow: /poll/
Disallow: /report/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /skin/
Disallow: /tag/
Disallow: /var/
Disallow: /wishlist/
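One extra tip: before deploying a list like this, it's worth sanity-checking that it blocks what you expect and nothing more. A rough sketch using Python's standard-library robots.txt parser is below (example.com and the sample URLs are placeholders, not the poster's site). Note that the stdlib parser only matches literal path prefixes, so Google-style wildcard rules like Disallow: /*? won't be evaluated here — verify those with Google Search Console's robots.txt tester instead.

```python
import urllib.robotparser

# Feed a subset of the rules straight into the parser
# (only literal prefix rules; stdlib ignores wildcards).
rp = urllib.robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /customer/
""".splitlines())

# Internal search results and checkout pages should be blocked...
print(rp.can_fetch("*", "https://example.com/catalogsearch/result/?q=shoes"))
print(rp.can_fetch("*", "https://example.com/checkout/cart/"))
# ...while normal category/product URLs stay crawlable.
print(rp.can_fetch("*", "https://example.com/mens-shoes.html"))
```

The first two checks come back False (blocked) and the last True (allowed), which is the behavior you want before pushing the file live.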
-
Hi there! When you say block, do you mean through your robots.txt?