Is it necessary to optimize every page of a site?
-
I recently took over the SEO work for a website that has a limited budget. I'd like to use the resources to get as much as I can for a few pages on the site (keyword research, on-page optimization). Are there pitfalls to not optimizing every page on a site? If so, what are they?
-
100% agree with this. Depending on the size of the site, I would also try to make sure you don't have any duplicate content anywhere and that you have unique meta tags across pages (again, this depends on the size of the site). You never know where traffic may come from or what surprises are in store.
Then super focus on the sections of the site you feel are valuable, conquer and repeat.
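On the unique-meta-tags point above, a quick duplicate check across a small site can be scripted. A minimal sketch (a hypothetical helper, not any particular SEO tool) that flags pages sharing the same `<title>`:

```python
# Minimal sketch: given raw HTML keyed by URL, report any <title> text
# that appears on more than one page. The URLs below are invented examples.
import re
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: dict mapping URL -> raw HTML. Returns {title: [urls]} for repeats."""
    seen = defaultdict(list)
    for url, html in pages.items():
        m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        title = m.group(1).strip() if m else ""
        seen[title].append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}

pages = {
    "/widgets/red":  "<html><head><title>Widgets</title></head></html>",
    "/widgets/blue": "<html><head><title>Widgets</title></head></html>",
    "/about":        "<html><head><title>About Us</title></head></html>",
}
print(find_duplicate_titles(pages))  # {'Widgets': ['/widgets/red', '/widgets/blue']}
```

The same pattern extends to meta descriptions by swapping the regex. For a real site, a proper HTML parser or a crawler export would replace the regex, but the dedupe logic stays the same.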
-
If you have a limited budget I would work on targeting only a limited number of pages and a limited number of keywords.
You can run into problems where clients have limited budgets but a site with, say, 10k pages.
I don't think you would ever optimize every page of a website; every site has pages like Terms & Conditions and a Privacy Policy.
Target the low-hanging fruit and pages where you can gather quick wins; I would go after the top 10.
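Picking that top 10 can be as simple as sorting an analytics export. A sketch under assumed field names ("impressions" and "avg_position" stand in for whatever your Search Console export actually provides):

```python
# Hypothetical sketch: rank pages by a crude opportunity score to surface
# quick wins. Pages already sitting on page 2-3 of results with real
# impressions are the classic low-hanging fruit: small on-page fixes can
# move them onto page 1.
def quick_wins(pages, n=10):
    candidates = [p for p in pages if 11 <= p["avg_position"] <= 30]
    return sorted(candidates, key=lambda p: p["impressions"], reverse=True)[:n]

pages = [
    {"url": "/widgets", "impressions": 5400, "avg_position": 14},
    {"url": "/about",   "impressions": 120,  "avg_position": 45},
    {"url": "/pricing", "impressions": 900,  "avg_position": 22},
]
for p in quick_wins(pages):
    print(p["url"], p["avg_position"])  # /widgets 14, then /pricing 22
```

The exact scoring is a judgment call; the point is to pick your shortlist from data rather than gut feel.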
-
I think it's fine to focus on a few of your most important pages, but I would take the time to make sure the internal linking and navigation are optimized for the entire site (or at least the majority of it). Letting bots crawl all your pages is the first step. Then focus optimization where you get the most return.
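One quick sanity check on the "let bots crawl all your pages" point: Python's standard-library robot parser can verify that key pages aren't accidentally blocked by your robots.txt rules (the rules and paths below are invented examples):

```python
# Sketch: parse a robots.txt ruleset and check which paths a generic
# crawler ("*") is allowed to fetch.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ["/", "/services/", "/admin/settings"]:
    print(path, "crawlable" if rp.can_fetch("*", path) else "blocked")
```

Running this against your live robots.txt (via `rp.set_url(...)` and `rp.read()`) before and after a site change is a cheap way to catch a crawl block before it costs you rankings.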
-
Eric,
Every page is another opportunity to be relevant on a given subject. If you have a limited budget, I would suggest determining the top 25 pages (or 200, depending on your time/budget), building those up, and using their optimization successes as justification for a larger budget.
Nothing wrong with starting small.
Dan