Wrong Page Indexing in SERPs - Suggestions?
-
Hey Moz'ers!
I have a quick question. Our company (Savvy Panda) is working on ranking for the keyword "Milwaukee SEO".
On our website, we have a page for "Milwaukee SEO" in our services section that's optimized for the keyword, and we've been building links to it. However, when you search for "Milwaukee SEO", a different page is being displayed in the SERPs.
The page that's showing up in the SERPs is a category view of our blog: articles with the tag "Milwaukee SEO".
**Is there a way to alert Google that the page showing up in the SERPs is not the most relevant, and request that a different URL be indexed for that spot?**
I saw a webinar a while back that showed something like this using the sitelinks demotion tool in Google Webmaster Tools.
I would hate to demote that URL and then lose any kind of indexing for the keyword.
Ideas, suggestions?
-
I'm not sure how many of your /tag/ pages are ranking, but if you can figure that part out, you can try doing .htaccess 301 redirects for the specific URLs. For example:
`Redirect 301 /tag/Milwaukee-SEO.html http://savvypanda.com/services/milwaukee-seo.html`
If you need further help with .htaccess and Joomla, I'm pretty well-rounded with my skills. We use Joomla for the majority of our clients (followed by WordPress).
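If several /tag/ URLs turn out to be ranking, a pattern-based rule can cover them all in one line instead of one `Redirect` per URL. A minimal .htaccess sketch, assuming Apache's mod_alias; the destination URL here is a placeholder, not a page confirmed to exist on the site:

```apache
# Sweep every /tag/ URL to a single destination with one pattern
# (destination is a placeholder; pick the page you actually want ranking)
RedirectMatch 301 ^/tag/ http://savvypanda.com/blog/
```

Note that a blanket rule like this loses the one-to-one mapping between each tag page and its best replacement, so per-URL `Redirect` lines are usually safer when only a handful of tag pages have rankings or links.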
-
I'm cool with not having them indexed; I'm just worried that if I demote or block the /tag/ pages from being indexed, we'll lose ranking for those keywords.
Right now the /tag/ URL is ranking fairly well.
-
I personally would not bother indexing the /tag/ pages, since all of that content exists at its own permalink somewhere within your site, from what I could tell with a quick look.
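One common way to keep tag pages out of the index without redirecting or blocking them is a meta robots tag in the tag-listing template. A sketch of what that could look like; the exact template file depends on the Joomla tagging extension in use:

```html
<!-- In the <head> of the tag-listing template: keeps the page crawlable
     (links on it still pass value and get followed) but out of Google's index -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow, this lets Google crawl the page and see the directive, so already-indexed tag URLs should drop out over subsequent crawls.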
-
Hey Dan,
You caught on to the big problem we're correcting now. It's the way the tagging system works in our blog... it's causing all kinds of duplicate content errors. We're changing tagging systems to help with this problem. So I plan on doing that first, but do you have any ideas on how to correct the /tag/ URL being indexed instead of our "Milwaukee SEO" services page?
-
I see your /tag/ listing is showing up in the SERPs. I also noticed you have duplicate content issues on your website.
See this for an example:
I'd consider fixing the duplicate content issue first, that is definitely a major problem and is probably affecting a lot of other landing pages. Fixing this might also fix your original problem that you posted about.
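For duplicate-content cleanup, a rel=canonical tag on each duplicate pointing at the preferred URL is the usual first step. A hedged example; the URL shown is the services page mentioned earlier in this thread, and each duplicate would point at its own preferred version:

```html
<!-- On any duplicate or tag version of the page,
     tell Google which URL is the preferred one -->
<link rel="canonical" href="http://savvypanda.com/services/milwaukee-seo.html">
```

This consolidates ranking signals onto the canonical URL without removing the duplicate pages from the site.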
-
I believe you are referring to robots.txt, which is designed to have Google skip a page while indexing. I don't think you want to do this. However, I checked the backlinks (anchor text to your site), and it seems you have not built any incoming links using your keyword "Milwaukee SEO". I would recommend just building some good links using "Milwaukee SEO" as the anchor text.
Your code should look like this: `<a href="http://savvypanda.com/services/milwaukee-seo.html">Milwaukee SEO</a>`
Post this on a few local sites. Since you are a web design company as well, you can include that link in some of your local sites' footers. :) Good luck.
Related Questions
-
Robots.txt, Disallow & Indexed Pages
Hi guys, hope you're well. I have a problem with my new website. I have 3 pages with the same content:

- http://example.examples.com/brand/brand1 (good page)
- http://example.examples.com/brand/brand1?show=false
- http://example.examples.com/brand/brand1?show=true

The good page has rel=canonical and is the only page that should appear in search results, but Google has indexed all 3 pages. I don't know what to do now, but I am thinking of 2 possibilities:

1. Remove the filters (true, false), leave only the good page, and show a 404 page for the other pages.
2. Update robots.txt with a disallow for these parameters and remove those URLs manually.

Thank you so much!
Intermediate & Advanced SEO | thekiller990
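A sketch of the robots.txt option described in the second possibility above. Google honors the `*` wildcard in Disallow rules, though some other crawlers may not; also note that URLs blocked this way can linger in the index if they already have links, so the rel=canonical on the good page is usually the cleaner fix:

```text
# robots.txt - block crawling of the ?show= parameter variants
User-agent: *
Disallow: /*?show=
```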
Client has moved to secure https webpages, but non-secure http pages are still being indexed in Google. Is this an issue?
We are currently working with a client that relaunched their website two months ago to use hypertext transfer protocol secure pages (https) across their entire site architecture. The problem is that their non-secure (http) pages are still accessible and being indexed in Google. Here are our concerns:

1. Are co-existing non-secure and secure webpages (http and https) considered duplicate content?
2. If these pages are duplicate content, should we use 301 redirects or rel canonicals?
3. If we go with rel canonicals, is it okay for a non-secure page to have a rel canonical to the secure version?

Thanks for the advice.
Intermediate & Advanced SEO | VanguardCommunications
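A site-wide 301 from http to https is the generally recommended fix in this situation, rather than canonicals alone. A minimal .htaccess sketch, assuming an Apache server with mod_rewrite enabled:

```apache
RewriteEngine On
# Send any plain-http request to the same host and path over https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With the redirect in place, the http versions drop out of the index on their own as Google recrawls them.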
Wrong page getting ranked
Hi all, we have product category pages on our ecommerce web site, and we also produce blog content (such as buyers' guides, setup guides, etc.) to help with ranking and give our site some good-quality, unique content. However, we sometimes find that the buyers' guide / blog content gets ranked by Google over our product category page. I'm hoping, if I give an example or two, someone smart out there may be able to point me in the right direction as to how we can avoid this and get the product category page ranked instead. You will see from my examples that we are linking internally, using the keywords from the buyers' guides, to the product category pages in order to show Google the most important page for these keywords, and we are trying to structure the product category pages as well as possible to make each the most optimized page for its term. Examples:

- Keyword "twin dvd player"
  - Product category page: http://www.3wisemonkeys.co.uk/dvd/portable-dvd-player-car/twin-dvd-player/
  - Blog page actually getting ranked for this keyword: http://www.3wisemonkeys.co.uk/advice-center/dual-screen-and-twin-dvd-player-explained/
- Keyword "site radio"
  - Product category page: http://www.3wisemonkeys.co.uk/audio/radio/site-radio/
  - Blog buyers' guide page actually getting ranked for this keyword: http://www.3wisemonkeys.co.uk/advice-center/Site-radio-buying-guide/

Any help / pointers appreciated. Thanks.
Intermediate & Advanced SEO | jasef
Urgent site migration help: 301 redirect from legacy to new if legacy pages are NOT indexed but have links and domain/page authority of 50+?
Sorry for the long title, but that's the whole question. Notes:

- New site is on the same domain, but URLs will change because the URL structure was horrible.
- Old site has awful SEO. Like, real bad.
- Canonical tags point to a dev subdomain (which is still accessible and has robots.txt, so the end result is that the old site IS NOT INDEXED by Google).
- Old site has links and domain/page authority north of 50. I suspect some shady links, but there have to be good links as well.

My guess is that since there are likely incoming links that are legitimate, I should still attempt to use 301s to the versions of the pages on the new site (note: the content on the new site will be different, but in general it'll be about the same thing as the old page, just much improved and more relevant). So yeah, I guess that's it. Even though the old site's pages are not indexed, if the new site is set up properly, the 301s won't pass along the 'non-indexed' status, correct? Thanks in advance for any quick answers!
Intermediate & Advanced SEO | JDMcNamara
Disallowed pages still showing up in the Google index. What do we do?
We recently disallowed a wide variety of pages for www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically, we don't want to spread our link juice around to all these pages that are never going to rank. We want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed, when really it should only be showing ~5,000. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" to our robots.txt properly, so this should not be happening; Google should be showing 0 results. Any ideas on how we get Google to pay attention and re-index our site properly?
Intermediate & Advanced SEO | udemy
Should I allow blog tag pages to be indexed?
I have a WordPress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content, or am I hurting the site by taking eligible pages out of the index?
Intermediate & Advanced SEO | JSOC
How to resolve a duplicate page content issue for the root domain & index.html?
SEOMoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the .htaccess file, but for some reason this has not had an impact, and we have since removed it. What's the best way (in an HTML website) to ensure all index.html links are automatically redirected to the root domain and these aren't seen as two separate pages?
Intermediate & Advanced SEO | ContentWriterMicky
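A common .htaccess pattern for collapsing index.html onto the root (or directory) URL, sketched here assuming Apache with mod_rewrite; it matches the originally requested URL so it won't loop when the server internally serves index.html for the bare directory:

```apache
RewriteEngine On
# Externally 301 /index.html (and /sub/index.html) to the bare directory URL,
# matching only the client's original request to avoid a redirect loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\s]*)index\.html[?\s]
RewriteRule ^(.*)index\.html$ /$1 [R=301,L]
```

Pairing the redirect with consistent internal links to the bare URL keeps the duplicate from reappearing.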
Google replacing subpages in index with home page?
Hi! I run a backlink building company. Recently, we had a customer who had us build targeted backlinks to certain subpages on his site. Then something really bizarre happened: all of a sudden, their subpages that were indexed in Google (the ones we were building links to) disappeared from the index, to be replaced with their home page. They haven't lost their rank, per se; it's just now their home page instead of their subpages. At this point, we are tracking literally thousands of keywords for our link building customers, and we've never run into this issue before. Have you ever run into it? If so, what's the best way to handle it from an SEO company's perspective? They have a sitemap.xml, and their GWT account reports no crawl errors, so it doesn't seem to be a site issue.
Intermediate & Advanced SEO | ownlocal