Moving Pages Up a Folder to Come Off the Root Domain
-
Good Morning
I've been doing some competitor research to see why competitors are ranking higher than us, and I noticed that one who seems to be doing well has changed their URL structure. Rather than using www.domain.com/product-category/product-subcategory/product-info-page/, they've removed levels, so they now have:
www.domain.com/product-subcategory/ and www.domain.com/product-info-page/. Basically, everything comes off the root domain rather than following the traditional nested structure.
Our rankings for the product-subcategory pages, which are probably what most people search for, have been sitting just below the first page in most instances, and have been for a while.
I'm interested in other people's thoughts: is this an approach you've taken, and did it get good results?
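For anyone experimenting with flattening a structure like this, the old nested URLs need permanent (301) redirects to the new locations so existing rankings and links aren't lost. A minimal sketch of deriving the flattened URL from a nested path, using hypothetical paths modelled on the example above:

```python
from urllib.parse import urlparse

def flatten_url(url):
    """Map /category/subcategory/page/ to /page/ by keeping only the
    last non-empty path segment (hypothetical flattening scheme)."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    if not segments:
        return url  # already at the root
    return f"{parts.scheme}://{parts.netloc}/{segments[-1]}/"

old = "https://www.domain.com/product-category/product-subcategory/product-info-page/"
print(flatten_url(old))  # https://www.domain.com/product-info-page/
```

A mapping like this would feed the server's redirect rules; the key point is that every old URL gets exactly one 301 target, and that flattened slugs don't collide.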
Related Questions
-
Having a Size Chart and Personalization Descriptions on each page - Duplicate Content?
Hi everyone, I'm currently coding a Shopify store theme, and we want to show customers size comparisons and personalization options for each product. It will be a great UX addition, since these are the top two things asked about via customer support. My only concern is that Google might flag it as duplicate content, since the same blocks will be visible on every product page. What are your thoughts and/or suggestions? Thank you so much in advance.
White Hat / Black Hat SEO | MadeByBrew
Buy exact match domain and 301 worth it?
So there is this exact match domain that gets about 500 visitors a day. It has a Trust Flow of 17 and a Citation Flow of 23, which is just a little lower than our own website's. The site covers one of our keywords and ranks on the second page of the SERPs. I'm not interested in buying and running that website, but rather in 301-redirecting all of its pages into our existing domain, onto relevant pages. So the 301s would point to relevant pages. The question is: would this strategy be worth it in today's SEO world, given recent Google updates?
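Since the plan is to 301 each old page to a *relevant* page rather than dumping everything on the homepage, the mapping is the real work. A naive sketch (hypothetical slugs, hypothetical heuristic) that pairs each old page with the new page whose slug shares the most words:

```python
def best_match(old_slug, candidate_slugs):
    """Pick the candidate slug sharing the most hyphen-separated
    words with the old slug (naive relevance heuristic)."""
    old_words = set(old_slug.strip("/").split("-"))
    def overlap(slug):
        return len(old_words & set(slug.strip("/").split("-")))
    return max(candidate_slugs, key=overlap)

candidates = ["vape-pens", "vaporizer-reviews", "contact"]
print(best_match("best-vaporizer-guide", candidates))  # vaporizer-reviews
```

In practice you'd review the output by hand; a heuristic like this just gives you a first-pass mapping to correct, rather than a per-page decision from scratch.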
White Hat / Black Hat SEO | TVape
GA question - removing a specific page's data from the total site's results?
I hope I can explain this clearly, hang in there! One of the clients of the law firm I work for does some SEO work for the firm, and one thing he has been doing is googling a certain keyword over and over to trick Google's autocomplete into suggesting it. When he runs his program, he generates around 500 hits to one of our attorneys' bio pages. This happens once or twice a week, and since I don't consider those hits real organic traffic, it has been skewing my GA reports. Is there a way to exclude that landing page from my overall reports, or is there a better way to deal with the skewed data? Any help or advice is appreciated. I'm still new to SEO, so I feel like a lot of my questions are obvious, but please go easy on me!
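Google Analytics can exclude pages via view filters or segments, but the same idea is easy to see offline: filter the skewed landing page out of exported rows before totalling. A minimal sketch with made-up paths and session counts:

```python
# Hypothetical rows exported from an analytics report.
rows = [
    {"landing_page": "/attorneys/jane-doe", "sessions": 512},
    {"landing_page": "/practice-areas/family-law", "sessions": 87},
    {"landing_page": "/attorneys/jane-doe", "sessions": 498},
]

excluded = "/attorneys/jane-doe"  # the bio page being hit by the script
clean = [r for r in rows if r["landing_page"] != excluded]
total = sum(r["sessions"] for r in clean)
print(total)  # 87
```

The same exclusion expressed as a GA view filter would leave the underlying data intact, which is usually preferable to deleting anything: you can always remove the filter later.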
White Hat / Black Hat SEO | MyOwnSEO
Domain authority - Low quality links
I have a question I hope people can help me with. For my next project I intend to focus on domain authority and a small number of high-quality links. I'd appreciate some advice on a few scenarios:
1. Can lower-quality links lower domain authority?
2. Would you avoid links from low-quality sites no matter what? What domain authority levels should you avoid links from?
3. Should I be looking at the link profiles of the sites I get links from? Does it matter if a site linking to me has thousands of spammy links (i.e. something to look out for when guest blogging)?
4. Should I avoid directories no matter what, or are high-PR / high-domain-authority directories okay to use, if I end up on a page of other relevant directory submissions related to my niche?
Essentially, my aim is to build high-quality links, but equally there are some decent sites on the fringes that I will need to consider (based on a competitor's link profile I researched).
White Hat / Black Hat SEO | Jonathan1979
Noindexing Thin Content Pages: Good or Bad?
If you have masses of pages with super-thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and assuming these pages aren't viewable to users and don't get any traffic), is it smart to remove them completely (404), or is there any valid reason to keep them? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? And if you noindex them and then remove them from the sitemap, can Google still recrawl them and recognize the noindex tag on its own?
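Whichever sitemap approach you take, Google can only honour a noindex that is actually served, so it's worth verifying the tag is present in the rendered HTML. A minimal check using only the standard library (the sample HTML is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Detect a <meta name="robots"> tag whose content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

html = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
p = RobotsMetaParser()
p.feed(html)
print(p.noindex)  # True
```

Run against the pagination pages in question, a script like this catches the common failure mode where a template change silently drops the tag.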
White Hat / Black Hat SEO | WebServiceConsulting.com
Link Removal and Disavow - Is PageRank a sign a directory is okay with Google?
Hi, I'm currently cleaning up a client's link profile in preparation for a disavow file, and I have reached the stage where I am undecided on some directories, as I don't want to remove all links. Is PageRank an indication that Google is okay with a particular directory? For example, the following domain is questionable but has a PR of 3. Do I need to consider scrapping all such links in anticipation of future updates? http://www.easyfinddirectory.com/shopping-and-services/clothing http://www.toplocallistings.co.uk/Apparel/West_Midlands/Shropshire/ Thanks in advance, Andy
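For the links you do decide to disavow, Google's file format is plain text: optional `#` comment lines, `domain:example.com` lines for whole domains, and bare URLs for individual pages. A small sketch that assembles such a file from the directories mentioned above (the helper function and comment text are my own, not part of any tool):

```python
def build_disavow(domains, urls=()):
    """Emit Google disavow-file lines: '#' comments, 'domain:...' entries
    for whole domains, and bare URLs for single pages."""
    lines = ["# Disavow file generated during link-profile cleanup"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    lines += list(urls)
    return "\n".join(lines)

text = build_disavow({"easyfinddirectory.com", "toplocallistings.co.uk"})
print(text)
```

Disavowing a whole domain is usually safer than listing individual directory pages, since directories tend to link from many category URLs.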
White Hat / Black Hat SEO | MarzVentures
Schema.org tricking and duplicate content across domains
I've found the following abuse, and I'm curious what I could do about it. The scheme is basically:
1. own the content only once (pictures, descriptions, reviews, etc.)
2. use different domain names (no problem if you use the same IP or IP C-block)
3. use a different layout (this is basically the key)
4. abuse schema.org markup: show the very same reviews on a different scale, and show slightly fewer reviews on one site than on another
A quick example: http://bit.ly/18rKd2Q
#2: budapesthotelstart.com/budapest-hotels/hotel-erkel/szalloda-attekintes.hu.html (217.113.62.21), 328 reviews, 8.6 / 10
#6: szallasvadasz.hu/hotel-erkel/ (217.113.62.201), 323 reviews, 4.29 / 5
#7: xn--szlls-gyula-l7ac.hu/szallodak/erkel-hotel/ (217.113.62.201), no reviews shown
It turns out that this tactic, even without the fourth step, can be quite effective for ranking with several domains. Here is a little investigation I've done (not really extensive, it took around an hour and a half, but it's quite shocking nonetheless):
https://docs.google.com/spreadsheet/ccc?key=0Aqbt1cVFlhXbdENGenFsME5vSldldTl3WWh4cVVHQXc#gid=0
Kaspar Szymanski from the Google webspam team said that they have looked into it and will do something, but honestly I don't know whether to believe that. What do you suggest?
Should I leave it, and try to copy this tactic to rank with the very same content multiple times? Should I deliberately cheat with markup? Should I play nice and hope that these guys will sooner or later be dealt with? (I honestly can't see this one working out.) Should I write a case study, so that if the tactic gets wider attention, Google will deal with it? Could anybody push this towards Matt Cutts, or anyone else responsible for these things?
White Hat / Black Hat SEO | Sved
Duplicate content showing on local pages
I have several pages on my web design site that are showing duplicate content. As it's a very competitive market, I created some local pages so I rank well if someone searches locally, e.g. "web design birmingham", "web design tamworth", etc.: http://www.cocoonfxmedia.co.uk/web-design.html http://www.cocoonfxmedia.co.uk/web-design-tamworth.html http://www.cocoonfxmedia.co.uk/web-design-lichfield.html I am trying to work out the best way to reduce the duplicate content. Which would be better? 1. A 301 redirect (will I lose the existing pages?) to my main web design page, with the geographic areas mentioned there. 2. Rewriting the wording on each page to make it unique? Any assistance is much appreciated.
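Before choosing between a 301 and a rewrite, it can help to measure how similar the local pages actually are; near-identical text with only the town name swapped is the classic doorway-page pattern. A rough similarity check with the standard library, using made-up sentences in place of the real page copy:

```python
from difflib import SequenceMatcher

# Hypothetical copy from two of the local pages.
page_a = "We build fast, responsive websites for businesses in Tamworth."
page_b = "We build fast, responsive websites for businesses in Lichfield."

ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(ratio > 0.8)  # pages differing only by the location name are near-duplicates
```

If real pages score this high against each other, rewriting each with genuinely local detail (option 2) addresses the root cause, whereas a 301 (option 1) simply gives up the local landing pages.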
White Hat / Black Hat SEO | Cocoonfxmedia