Magento site with many pages
-
Just finished a scan of a Magento site.
Of course, I am getting thousands of pages that are dynamic: search pages and others.
Checking with the site: command on Google, I see 154,000 results.
Which pages is it recommended to block?
Some people talk about blocking the search pages, and some actually talk about allowing them.
Any answer on this?
Thanks
-
There was no effect on the rankings, no significant ups or downs. But now when I do the site:www.domain.com command in Google, I see just the pages that I want Google to index.
It can only help in the long run I guess.
-
Hi
Sorry for responding late.
What was the effect on your results when you blocked all those pages?
Thanks
-
Yes, my thought is blocking through robots.txt.
-
I've had similar problems with a few Magento sites. This is a standard list I use in my robots.txt files (below). I hope it helps.
You don't have to include all the Magento folders like 'app', 'lib', 'var', 'admin', etc.; they are just there to be thorough.
I think you'll get the idea. I've brought the number of indexed pages down from half a million to just a few thousand using these.
Disallow: /*?
Disallow: /*.js$
Disallow: /*.css$
Disallow: /404/
Disallow: /admin/
Disallow: /api/
Disallow: /app/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalog/product_compare/
Disallow: /catalogsearch/
Disallow: /catalogsearch/advanced/
Disallow: /catalogsearch/term/
Disallow: /catalogsearch/term/popular/
Disallow: /cgi-bin/
Disallow: /checkout/
Disallow: /checkout/cart/
Disallow: /contacts/
Disallow: /contacts/index/
Disallow: /contacts/index/post/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
Disallow: /downloader/
Disallow: /install/
Disallow: /js/
Disallow: /lib/
Disallow: /magento/
Disallow: /newsletter/
Disallow: /pkginfo/
Disallow: /private/
Disallow: /poll/
Disallow: /report/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /skin/
Disallow: /tag/
Disallow: /var/
Disallow: /wishlist/
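One note on the list above: Disallow rules only take effect inside a group headed by a User-agent line (e.g. User-agent: *), so make sure they sit under one in the actual file. If you want to sanity-check which URLs a rule set blocks before you deploy it, here is a minimal sketch using Python's standard urllib.robotparser; the example.com URLs are placeholders. One caveat: this parser only understands plain path prefixes, not the * and $ wildcard extensions Google supports, so rules like Disallow: /*? are better tested in Google's own robots.txt tester.

from urllib.robotparser import RobotFileParser

# A trimmed version of the rule set above, limited to plain path
# prefixes, since urllib.robotparser does not implement Google's
# wildcard syntax.
rules = """\
User-agent: *
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /wishlist/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# An internal search results page: blocked by the /catalogsearch/ prefix.
print(parser.can_fetch("Googlebot", "https://example.com/catalogsearch/result/?q=shoes"))  # False

# An ordinary category page: still crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/mens-shoes.html"))  # True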
-
Hi there! When you say block, do you mean through your robots.txt?
Related Questions
-
Multiple pages optimised for the same keywords but pages are functionally different and visually different
Hi MOZ community! We're wondering what the implications for organic ranking would be of having 2 pages, which have quite different functionality, optimised for the same keywords. So, for example, one of the pages in question is
https://www.whichledlight.com/categories/led-spotlights
and the other page is
https://www.whichledlight.com/t/led-spotlights
Both of these pages are basically geared towards the keyword "led spotlights": the first link essentially shows the options for led spotlights, the different kinds of fittings available, and the second link is a product search/results page for all products that are spotlights. We're wondering what the implications of this could be, as we are currently looking to improve the ranking of the site, particularly for this keyword. Is this even safe to do? Especially since we're at the bottom of the hill of climbing the ranking ladder for this keyword. Give us a shout if you want any more detail on this to answer more easily 🙂
Intermediate & Advanced SEO | TrueluxGroup
-
Site Migration of 4 sites into 1?
Hi Guys, I have a massive project involving a migration of 4 sites into 1. The 4 sites are: www.MainSite.com, www.E-commerce.com, www.Membership.com, and www.ResearchStudy.com. The goal of this project is to have sites 1-4 regrouped into the Main Site. I will be following the best practices from this post, https://moz.com/blog/web-site-migration-guide-tips-for-seos, which has an awesome checklist. I am actually about to start Phase 3: URL redirect mapping. Because all of these sites have hundreds of duplicates, I figured I should first resolve the Main Site dup issues before creating the URL redirect mapping, but what about the other domains (2, 3, 4)? Should I first resolve the dup issues on those ones as well, or is it not necessary since they will be pointing to the Main Site's new domain? I want to make sure I don't overwork the programming team and myself. Thanks for sharing your expertise and any tips on how I should move forward with this.
Intermediate & Advanced SEO | Ideas-Money-Art
-
Should We Add the W3.org Language Tag To Every Page Or Just The Home Page?
Greetings! We have five international sites around the world, two of which are in different languages. Currently we have the following line of HTML code on the home page of each of the sites: Clearly, we need to change the "en" portion for the sites that aren't in English, but should we include that meta tag on each of the site's pages, or will the home page suffice? Thanks!
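As an aside, and purely as an illustrative sketch rather than the poster's actual markup (the snippet itself is not shown above), per-page language declarations usually take one of these two forms:

<!-- Illustrative only: declare the page language on the root element... -->
<html lang="en">
<!-- ...or via the older content-language meta tag. -->
<meta http-equiv="content-language" content="en">

The W3C generally recommends the lang attribute form, applied on every page rather than just the home page, since each page declares its own language.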
Intermediate & Advanced SEO | CSawatzky
-
Our quilting site was hit by Panda/Penguin...should we start a second "traffic" site?
I built a website for my wife, who is a quilter, called LearnHowToMakeQuilts.com. However, it has been hit by Panda or Penguin (I'm not quite sure), and I am scared to tell her to go ahead and keep building the site up. She really wants to post on her blog on Learnhowtomakequilts.com, but I'm afraid it will be in vain for Google's search engine. Yahoo and Bing still rank well. I don't want her to produce good content that will never rank well if the whole site is penalized in some way. I've over-optimized by linking strongly to the keywords "how to make a quilt" for our main keyword, mainly to the home page, and I think that is one of the main reasons we are incurring some kind of penalty. First main question: From looking at the attached Google Analytics image, does anyone know if it was Panda or Penguin that we were "hit" by? And what can be done about it? (We originally wanted to build a nice content website, but were lured in by a get-rich-quick personality to instead make a "squeeze page" for the home page and force all visitors through that page to get to the really good content. Thus, our average time on site per person is terrible and Pages per Visit is low, at 1.2. We really want to try to improve it some day.) She has a local business website, Customcarequilts.com, that did not get hit. Second question: Should we start a second site rather than invest the time in trying to repair the damage from my bad link building and article marketing? We do need to keep the site up and running because it has her online quilting course for beginner quilters to learn how to quilt their first quilt. We host the videos through Amazon S3 and were selling at least one course every other day. But now that the Google drop has hit, we are lucky to sell one quilting course per month. So, if we start a second site, we can build it up as a big content site that we can use to introduce people to learnhowtomakequilts.com, which has Martha's quilting course. So, should we go ahead and start a fresh new site rather than repair the damage done by my bad over-optimizing? (We've already picked out a great website name that would work really well with her personal Facebook page.) Or, here's a second option, which is to use her local business website, customcarequilts.com. She created it in 2003 and has had it ever since. It is only PR 1. Would this be an option? Anyway, I'm looking for guidance on whether we should pursue repairing the damage, and whether we should start a second fresh site or use an existing site to create new content (for getting new quilters to eventually purchase her course). Brad & Martha Novacek
Intermediate & Advanced SEO | BradNovi
-
Our Site's Content on a Third Party Site--Best Practices?
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content. I know standard best practice is to canonicalize their pages to our pages, but then they wouldn't get any benefit, since a canonical tag will effectively de-index the content from their site. Our thoughts so far: (1) add a paragraph of original content to our content, and (2) link to our site as the original source (to help mitigate the risk of our site getting hit by any penalties). What are your thoughts on this? Do you think adding a paragraph of original content will matter much? Do you think our site will be free of penalty since we were the first place to publish the content and there will be a link back to our site? They are really pushing for not using a canonical, so this isn't an option. What would you do?
Intermediate & Advanced SEO | nicole.healthline
-
Is the Penguin algorithmic penalty on a page basis or a site basis?
Just wondering if there has been any clarification of whether the Penguin algorithmic penalty is on a Page basis or a Site basis? In other words, is it all or nothing?
Intermediate & Advanced SEO | darkgreenguy
-
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like such: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... is this possible? Thanks! Leona
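For what it's worth, Google's documentation says each crawler obeys the single most specific user-agent group that matches it, and Googlebot-Image falls back to the Googlebot group only when no Googlebot-Image group exists. Under that rule, a robots.txt along these lines (a sketch based on that documented behavior, not a tested configuration) would keep the pages out of the main web index while leaving the photo folder open to the image crawler:

User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/

One caveat: if the image files are only ever discovered through those blocked pages, Google may struggle to find them at all, so an image sitemap listing the photos is worth considering alongside this.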
Intermediate & Advanced SEO | HD_Leona
-
Can obfuscated Javascript be used for too many links on a page?
Hi mozzers! Just looking for opinions/answers on whether it is ever appropriate to use obfuscated JavaScript on links when a page has many links but they need to be there for usability. It seems grey/black hat to me, as it shows users something different from what Google sees (alarm bells are sounding already!). But if the page has many links, it's losing juice which could be saved... Any thoughts appreciated, thanks.
Intermediate & Advanced SEO | TrevorJones