How do I get Google to index the right pages with the right keywords?
-
Hello. I notice that even though I have a sitemap, Google is indexing the wrong pages under the wrong keywords. As a result, my site is less relevant in the results and is not ranking properly.
-
So this will stop Google from ranking my sitemap on the first page for keywords I want to rank for, instead of the actual page listed in the sitemap?
-
Ah, I understand perfectly what you are trying to accomplish. You can put attributes in your XML sitemap that indicate to a search engine spider how important a page is, but there is absolutely no guarantee that bots will pay any attention to them. In fact, they could use them as a negative indicator and discount those pages instead.
It's similar to the attribute in an XML sitemap that tells a search engine how often a page is updated. Because these things are easy to manipulate, I can almost guarantee that Google ignores them.
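For reference, these are the two optional elements being discussed, `<priority>` and `<changefreq>`, as defined by the sitemaps.org protocol (the URL below is a made-up example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical URL; substitute your own page -->
    <loc>https://www.example.com/services/</loc>
    <lastmod>2015-06-01</lastmod>
    <!-- changefreq hints at how often the page is updated; bots may ignore it -->
    <changefreq>weekly</changefreq>
    <!-- priority is relative to other URLs on your own site, 0.0 to 1.0 -->
    <priority>0.8</priority>
  </url>
</urlset>
```

Both elements are hints, not directives, which is exactly why they are so easy to manipulate and so easy for Google to discount.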
There is only one way to achieve what you want: be the best page on the planet for the keywords you want to rank for. That means writing mind-blowingly good content and getting more inbound links of higher quality than anyone else.
I hope that helps and gets you where you want to be.
Dana
-
To be a bit more specific: instead of showing the page I want to show, it's actually ranking the sitemap page instead. The XML sitemap on page 1, lol... I need the actual content to show... haha...
-
Hi ursalesguru. I'm not trying to be flippant, but man, if we all knew the answer to that question, none of us would be here.
It's a struggle. Search engine spiders like Googlebot are really stupid... no, really, they are. They can't see some of the obvious things that human beings can see when looking at a page. That's why SEOs spend so much time writing "alt" attributes, meta tags, and heading tags, transcribing videos, researching anchor text, building links, and adding structured data as prescribed by schema.org. All of this is an attempt to get search engines to see a page the way a human sees it.
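As a sketch of what that looks like in markup, using the "real car games" example from earlier in this thread (every name and value here is invented for illustration):

```html
<!-- A heading tag tells the crawler what the page is about -->
<h1>Real Car Games for Free</h1>

<!-- alt text describes an image to a bot that cannot "see" it -->
<img src="screenshot.png" alt="Screenshot of a racing game running in the browser">

<!-- Structured data per schema.org, embedded as JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoGame",
  "name": "Real Car Games",
  "genre": "Racing"
}
</script>
```

Every one of these is a way of spelling out, in machine-readable form, something a human visitor would grasp at a glance.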
Still, it's possible to get all of that right and still lose. It requires time, patience, and intelligent work. Our mutual frustration is one reason this forum is so gosh-darned popular.
Keep on keeping on. Learn, read, stay passionate. The rest will follow.
Dana
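(On the sitemap-outranking-content symptom above: one configuration-level workaround sometimes suggested is to serve the XML sitemap with a noindex directive. A regular meta tag won't work in an XML file, but Google does honor the `X-Robots-Tag` HTTP header for non-HTML files. A hedged Apache sketch, assuming an `.htaccess`-style setup with mod_headers enabled — adapt to your own server:)

```apache
# Hypothetical .htaccess fragment: ask crawlers not to index the sitemap
# file itself, while still letting them fetch and crawl the URLs it lists.
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```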
Related Questions
-
Inner pages of a directory site won't index
I have a business directory site that's been around a long time, but it has always been split into two parts: a subdomain and the main domain. The subdomain has been used for listings for years, but just recently I've opened up the main domain and started adding listings there. The problem is that none of the listing pages seem to be getting indexed in Google. The main domain is indexed, as is the category page and all its pages below that, e.g. /category/travel, but the actual business listing pages below that will not index. I can, however, get them to index if I request Google to crawl them in Search Console. A few other things: I have nothing blocked in the robots.txt file; the site has a DA over 50 and a decent number of backlinks; there is also a sitemap set up. Any ideas?
Technical SEO | linklander
Removed Subdomain Sites Still in Google Index
Hey guys, I've got kind of a strange situation going on, and I can't seem to find it addressed anywhere. I have a site that at one point had several development sites set up on subdomains. Those sites have since launched on their own domains, but the subdomain sites are still showing up in the Google index. However, if you look at the cached version of pages on these non-existent subdomains, the little blurb that says "This is Google's cached version of www.correcturl.com" lists the NEW url, not the dev one. Clearly Google recognizes that the content resides at the new location, so how come the old pages are still in the index? Attempting to visit one of them gives a "Server Not Found" error, so they are definitely gone. This is happening to a couple of sites, one of which was launched over a year ago, so it doesn't appear to be a "wait and see" situation. Any suggestions would be a huge help. Thanks!!
Technical SEO | SarahLK
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning. So I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 of the 3,511 indexed pages? As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help? I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis
Technical SEO | PeaSoupDigital
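One point worth illustrating for the question above: a robots.txt Disallow rule matches by URL prefix, so a couple of dozen rules can easily block thousands of URL variations. A made-up example:

```text
# Hypothetical robots.txt: each Disallow line matches every URL that
# begins with the given path, so one rule can block thousands of pages.
User-agent: *
Disallow: /search/    # blocks /search/ and everything beneath it
Disallow: /cart       # blocks /cart, /cart?item=42, /cart/checkout, ...
```

That prefix matching is the most likely explanation for 24 rules blocking 3,331 URLs.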
Does Google Parse The Anchor Text while Indexing
Hey Moz fans, I'm here to ask a somewhat technical and open-ended question.
In Google's paper, http://infolab.stanford.edu/~backrub/google.html,
they say they parse the page into "hits," which are basically word occurrences.
But I want to know whether they do the same thing when building the anchor text database.
I mean, do they parse the anchor text, or keep it as it is?
For example, let's say my anchor text is "real car games".
When they index my link with that anchor text, do they parse it into distinct hits, like
"real", "car", and "games",
OR do they just use it as it is, as "real car games"?
Technical SEO | atakala
New Page Showing Up On My Reports w/o Page Title, Words, etc - However, I didn't create it
I have a WordPress site, and while doing a crawl for errors it is now showing, as of today, that this page: https://thinkbiglearnsmart.com/event-registration/?event_id=551&name_of_event=HTML5 CSS3 is new and has no page title, words, etc. I am not even sure where this page or URL came from. I was messing with the robots.txt file to allow some /category/ posts that were being hidden, but I didn't re-allow anything with the above appendages. I just want to make sure that I didn't screw something up that is now going to impact my rankings; this was just a really odd message to come up, as I didn't create this page recently, and that shouldn't even be a page accessible to the public. When I edit the page, it is using an Event Espresso (WordPress plugin) shortcode, and I don't want to noindex this page, as it is all of my events. Sorry this post is confusing; any help or insight would be appreciated! I am also interested in hiring someone for some hourly consulting work on SEO-type issues if anyone has any references. Thank you!
Technical SEO | webbmason
Home page deindexed by Google
When I search for my website on Google using site:www.mydomain.com, I find that my domain with www has been de-indexed by Google. But when I search site:mydomain.com, my home page, mydomain.com, shows up in the search results without www. To put it simply, Google only indexes my domain without www. I wonder how to get my domain with www indexed, and how to prevent this problem from occurring again.
Technical SEO | semer
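The standard fix for a www/non-www split like the one above is to pick one hostname and 301-redirect the other to it, so only one version can be indexed. A hedged Apache sketch (mydomain.com stands in for the real domain; this assumes mod_rewrite in an `.htaccess` setup):

```apache
# Hypothetical .htaccess fragment: 301-redirect bare-domain requests to the
# www hostname so search engines see a single canonical version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.mydomain.com/$1 [R=301,L]
```

Swap the condition and target to prefer the non-www version instead; either choice is fine as long as you redirect consistently to one.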
Pages not Indexed after a successful Google Fetch
I am trying to understand why Google isn't indexing key content on my site. www.BeyondTransition.com is indexed, and new pages show up in a couple of hours. My key content is 6 pages of information for each of 3,000 events (driven by MySQL on a WordPress platform). These pages are reached via a search page, with no direct navigation from the home page. When I link to an event page from an indexed page, it doesn't show up in search results. When I use Fetch in Webmaster Tools, the fetch is successful but the page is then not indexed; or, if it does appear in results, it's directed to the internal search page. E.g., http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links, but when I search for BeyondTransition Ironman Cozumel I get these results.... So what have I done wrong, and how do I go about fixing it? All thoughts and advice appreciated. Thanks, Denis
Technical SEO | beyondtransition
Remove Deleted (but indexed) Pages Through Webmaster Tools?
I run a blog/directory site. Recently, I changed directory software and, as a result, Google is showing 404 Not Found crawl errors for about 750 non-existent pages. Some have suggested that I should implement a 301 redirect, but I can't see the wisdom in this, as the pages are obscure, unlikely to appear in search, and have been deleted. Is the best course to simply enter each 404-error page manually into the Remove Page option in Webmaster Tools? Will entering deleted pages into the Removal area hurt other healthy pages on my site?
Technical SEO | JSOC
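One alternative to manual removal requests for the question above: for pages that are deliberately and permanently gone, returning "410 Gone" instead of "404 Not Found" tells crawlers the removal is intentional, and such URLs tend to drop out of the index without per-URL requests. A hedged Apache sketch (the path is made up for illustration; assumes mod_alias in an `.htaccess` setup):

```apache
# Hypothetical .htaccess fragment: respond "410 Gone" for a deleted
# directory section so crawlers treat the removal as permanent.
Redirect gone /old-directory/
```

A single prefix rule like this can cover all 750 dead URLs if they share a common path, which is far less tedious than entering each one into Webmaster Tools.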