Is it good practice to keep all our pages at the second level?
-
While defining the site structure, we thought of having all pages at the second level only, i.e. domain.com/services.
Please let us know the pros and cons of this architecture.
Related Questions
External 404 pages
A client of mine is linking to a third-party vendor from their main site. The page being linked to loads with a Page Not Found error and then replaces some application content once the JavaScript kicks in. This process is not visible to users (the application loads fine for front-end users), but it is being picked up as a 404 error in broken link reports. This link is part of the site skin, so it's on every page. Outside of the annoyance of having lots of 404 errors flagged in a broken link report, does this cause any actual issue? E.g., do search engines see that my client is linking to something that is a 404 error, and does that cause them any harm?
Intermediate & Advanced SEO | mkleamy0
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: a new multilingual site was launched about 2 months ago. It has correct hreflang tags and geotargeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it due to the redirects? Is it possible that Google will sort things out over time, given that the new pages have correct hreflang tags? Is there anything we could do to help them rank in the correct country markets?
Intermediate & Advanced SEO | ParisChildress1
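For anyone double-checking a multilingual setup like this, one common gotcha is an incomplete hreflang set (every version, including x-default, should be listed on every version). A minimal sketch that emits the alternate-link tags from a mapping — the codes and URLs here are illustrative, not the poster's real site:

```python
def hreflang_tags(versions):
    """Build hreflang link tags from a dict of hreflang code -> absolute URL.
    Each language/region version of a page should carry the full set of tags,
    including a self-reference."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in versions.items()
    ]

# Example with made-up URLs:
# for tag in hreflang_tags({"en-gb": "https://example.com/en-gb/",
#                           "en-us": "https://example.com/en-us/"}):
#     print(tag)
```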
Best practice to 301 NON-WWW pages?
Hi guys, we have a site which has 302 redirects installed for pages like: https://domain.com.au/ to https://www.domain.com.au/ (302 redirect). Is it worth changing the redirect to a 301? This is a large site, around 10,000 pages. Also, does anyone know how this can be done via Magento? Cheers
Intermediate & Advanced SEO | bridhard80
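A quick way to audit what a hop like this actually returns is to request the URL without following redirects and read the status code (301/308 are permanent, 302/307 temporary). A stdlib-only Python sketch — the domain in the commented example is hypothetical:

```python
import http.client
from urllib.parse import urlsplit

def redirect_status(url):
    """Return (status_code, Location header) for a single hop, without
    following the redirect chain."""
    parts = urlsplit(url)
    conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    conn.close()
    return resp.status, resp.getheader("Location")

def is_permanent(status_code):
    # 301 and 308 are the permanent redirect codes; 302 and 307 are temporary.
    return status_code in (301, 308)

# Example (hypothetical domain):
# status, target = redirect_status("https://domain.com.au/")
# print(status, target, "permanent" if is_permanent(status) else "temporary")
```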
Competing with doorway pages
Hi all, it's my understanding that 'doorway pages' are bad practice. However, when Googling for the services that our company offers, along the lines of '[service] [location]', businesses turn up in Google SERPs that outrank us purely with doorway pages. Take this as an example: https://www.google.co.uk/search?q=seo+dorking One of the results is this company, who seem to rank for pretty much every town modifier: https://prioritypixels.co.uk/seo-agency-dorking/ If you look at their sitemaps you'll see thousands of these pages: https://prioritypixels.co.uk/page-sitemap16.xml All the content is slightly different, but broadly speaking it is very similar. It seems that, in the short term, we can't compete with this company, but we could if we employed the same tactics. So my question is: is what they are doing really risking a penalty?
Intermediate & Advanced SEO | Bee1590
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know what the best practice for doing that is. For reference, the reason we are looking for methods to speed it up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and Google indexed them much faster than it is taking to deindex them, unfortunately. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years at Google's current crawl rate of our site.
Intermediate & Advanced SEO | teddef0
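Whatever removal method is used, it helps to confirm that every URL actually serves the noindex signal (either a robots meta tag or an X-Robots-Tag response header). A rough regex-based sketch — it assumes the meta tag writes `name` before `content`, so treat it as an illustrative check rather than a robust HTML parser:

```python
import re

def has_noindex(html_text, x_robots_header=None):
    """True if the page signals noindex via the X-Robots-Tag header or a
    robots meta tag. Simplified: only matches name-before-content attribute
    order in the meta tag."""
    if x_robots_header and "noindex" in x_robots_header.lower():
        return True
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html_text,
        re.IGNORECASE,
    )
    return bool(meta and "noindex" in meta.group(1).lower())

# Example:
# has_noindex('<meta name="robots" content="noindex, follow">')  # True
```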
Does it make sense to create new pages with friendlier URLs then redirect old pages to new?
Hi Moz! My client has messy URLs. Does it make sense to write new, clean URLs, then 301 redirect all old URLs to the new ones? Thanks for reading!
Intermediate & Advanced SEO | DA20130
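One practical way to approach a cleanup like this is to generate the old-to-new mapping programmatically and feed it into the server's 301 rules, so every messy URL has exactly one clean target. A sketch with made-up cleanup rules (the actual rules would depend on how the client's URLs are messy):

```python
import re
from urllib.parse import urlsplit

def slugify(path):
    """Turn a messy URL path into a clean, hyphenated one.
    The rules below are illustrative: drop file extensions, replace
    underscores/spaces with hyphens, strip leftover punctuation."""
    slug = path.lower()
    slug = re.sub(r"\.(aspx|asp|php|html?)$", "", slug)
    slug = re.sub(r"[_\s]+", "-", slug)
    slug = re.sub(r"[^a-z0-9/-]", "", slug)
    return re.sub(r"-{2,}", "-", slug)

def build_redirect_map(old_urls):
    """Map each old path to its cleaned equivalent; only paths that actually
    change need a 301 rule."""
    mapping = {}
    for url in old_urls:
        path = urlsplit(url).path
        clean = slugify(path)
        if clean != path:
            mapping[path] = clean
    return mapping

# Example (hypothetical URL):
# build_redirect_map(["https://example.com/Product_Page.asp"])
```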
"Noindex" page still shows in search results, and paginated pages show page 2 in results
I have "noindex, follow" on some pages, which I set 2 weeks ago. Today I see one of these pages showing in Google search results. I am using rel=next/prev on paginated pages, yet page 2 of a string of pages showed up in results before page 1. What could be the issue?
Intermediate & Advanced SEO | khi50
Removing pages from index
Hello, I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example, www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize that it's a flaw in design, and I am working on fixing it (301 nonexistent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal? Also, should I 301, 404, or 410 these pages? Any help would be appreciated. Thanks, Alex
Intermediate & Advanced SEO | AlexGop
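On the 301 vs. 404 vs. 410 point: since these parameter pages have no real destination, answering 410 Gone for out-of-range page values is a common choice — Google has reportedly treated 410 as a slightly stronger removal signal than 404, while a 301 only makes sense when there is a genuine target page. A sketch of the decision logic (the parameter name and page limit are illustrative):

```python
def pagination_status(page_param, max_pages):
    """Decide which HTTP status to serve for a ?page=N request.
    Valid pages get 200; anything non-numeric or out of range gets
    410 Gone, signalling the URL is intentionally dead."""
    try:
        page = int(page_param)
    except (TypeError, ValueError):
        return 410
    if 1 <= page <= max_pages:
        return 200
    return 410

# Example from the question (hypothetical category with 40 real pages):
# pagination_status("3434532", max_pages=40)  # 410
```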