Best Practices
-
Okay, this is probably a piece of cake for most of you out there.
What are the best practices once you add a page or piece of content to your website targeting a new keyword, one you have never used before but plan to use on every relevant new page you add? How do you ensure that Google will crawl that page?
Secondly, if you work the new keyword into pages you have already published by editing the content to suit it, how would you ensure that those pages get re-crawled by Google?
Thanks in advance
-
Sorry I missed this!
If your website architecture is set up well, you can always ask Google to index a page and all the pages it links to. You'll see this option when you click the Submit to index button. That way you won't have to submit a large number of individual pages one at a time.
I personally would keep an eye on the pages of most value: the pages you are optimizing that show up in the search results and generate traffic.
Hope this helps.
-
Andreas,
Thanks for the tip. Will do
Regards,
-
RangeMarketing,
Thank you for your response. I will do that now for sure.
Also, do you think I should make it a regular exercise to check when each page was last crawled? Our website has more than 20k pages. What's the best way to figure that out, and which tool do you recommend?
Thanks
-
RangeMarketing is right, but there is an even easier way too: share the page on Google+.
I've found that it is sometimes faster, but usually I Fetch as Google in both cases, like RangeMarketing said.
-
If you have internal links pointing to the page with the new or updated content, Google will eventually find it. However, the quickest way to make that happen is to request a crawl in Google Webmaster Tools:
Under Crawl > Fetch as Google
Once the status of the page loads, you should see a button labeled Submit to index. Click this to submit the page to be indexed.
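Besides the Fetch as Google route above, if you want new or updated pages picked up in bulk, a lightweight option is to keep your XML sitemap current and ping Google whenever it changes. A minimal sketch in Python, assuming Google's sitemap ping endpoint is available and using a placeholder sitemap URL:

```python
import urllib.parse
import urllib.request

# Placeholder sitemap URL; swap in your site's real sitemap.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint: signals that the sitemap has changed
# and the URLs in it are worth re-fetching.
ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")

with urllib.request.urlopen(ping_url) as response:
    # A 200 here only means the ping was received, not that anything is indexed yet.
    print("Ping status:", response.status)
```

This complements rather than replaces the Submit to index button; the ping just tells Google the sitemap is worth re-reading.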
There are free tools available to find out the last time Google crawled (and indexed) a specific page. I personally use the free SEO Book Toolbar. I believe Moz's free toolbar does this as well, but I could be wrong.
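For a site with 20k+ pages like the one mentioned above, checking last-crawl dates page by page in a toolbar doesn't scale; one common alternative is to mine your own server access logs for Googlebot hits. A rough sketch, assuming an Apache/Nginx combined log format and a hypothetical log file path:

```python
import re
from datetime import datetime

# Hypothetical path; point this at your real access log (combined log format assumed).
LOG_PATH = "access.log"

# Matches: [timestamp] "GET /path HTTP/1.1" status size "referer" "user-agent"
line_re = re.compile(
    r'\[(?P<ts>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+)[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

last_crawl = {}  # URL path -> most recent Googlebot visit
with open(LOG_PATH) as log:
    for line in log:
        match = line_re.search(line)
        if not match or "Googlebot" not in match.group("ua"):
            continue
        # Drop the timezone offset and parse the timestamp.
        visited = datetime.strptime(match.group("ts").split()[0], "%d/%b/%Y:%H:%M:%S")
        path = match.group("path")
        if path not in last_crawl or visited > last_crawl[path]:
            last_crawl[path] = visited

# Show the 20 least recently crawled URLs, the ones most worth resubmitting.
for path, visited in sorted(last_crawl.items(), key=lambda item: item[1])[:20]:
    print(visited.isoformat(), path)
```

Note that anyone can spoof a Googlebot user agent, so treat this as an approximation unless you also verify the requesting IPs.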
Related Questions
-
Best Permalinks for SEO - Custom structure vs Postname
Good morning Moz peeps, I am new to this but intend on starting off right! I have heard a wealth of advice that the "post name" permalink structure is the best one to go with; however, I am wondering about a "custom structure" combining the "post name", following the example structure below: Www.professionalwarrior.com/bodybuilding/%postname/ where "professional" and "bodybuilding" are the focus/theme/keywords of my blog that I want ranked. Thanks a mill, RO
Intermediate & Advanced SEO | | RawkingOut0 -
Best practice for disallowing URLS with Robots.txt
Hi everybody, we are currently trying to tidy up the crawl errors which appear when we crawl the site. On first viewing, we were very worried to say the least: 17,000+. But after looking closer at the report, we found the majority of these errors were being caused by bad URLs featuring:
Currency - for example: "directory/currency/switch/currency/GBP/uenc/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL3dvcmt3ZWFyP3ByaWNlPTUwLSZzdGFuZGFyZHM9NzEx/"
Color - for example: ?color=91
Price - for example: "?price=650-700"
Order - for example: ?dir=desc&order=most_popular
Page - for example: "?p=1&standards=704"
Login - for example: "customer/account/login/referer/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL2NhdGFsb2cvcHJvZHVjdC92aWV3L2lkLzQ1ODczLyNyZXZpZXctZm9ybQ,,/"
My question, as a novice at working with robots.txt: what would be the best practice for disallowing URLs featuring these parameters from being crawled? Any advice would be appreciated!
Intermediate & Advanced SEO | | centurysafety0 -
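For the parameter URLs listed in that question, the usual starting point is wildcard Disallow rules, which Google and Bing honor even though wildcards are not part of the original robots.txt standard. A rough robots.txt sketch only; the exact patterns would need to be checked against the site's real URL structure before deployment:

```
User-agent: *
# Faceted navigation and sorting parameters (wildcards honored by Google/Bing)
Disallow: /*?*color=
Disallow: /*?*price=
Disallow: /*?*dir=
Disallow: /*?*order=
Disallow: /*?p=
# Currency switcher and login-redirect URLs
Disallow: /*/currency/switch/
Disallow: /customer/account/login/
```

Keep in mind that Disallow stops crawling, but already-indexed URLs can linger in the index, so decide whether you want these blocked from crawling or instead crawled and then canonicalized/noindexed; you can't do both at once.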
Best SEO url woocommerce, what to do?
Hi! Today our product categories are indexed (by mistake), and for one of our desired keywords a category holds the number 1 rank. By mistake, we didn't set nofollow/noindex on our categories, just on tags, archives, etc. We are now migrating from iThemes Exchange to WooCommerce, and I'm looking at improving our SEO URLs for the categories. For keyword "Key1" we rank with this URL: http://site/product-category/Key1. The SEO meta title and description were untouched when we launched the site last spring, so it doesn't look so good. The plan is to strip out product-category, add some description to that particular category (I have a newly written text of 95 words, 519 characters without spaces, with the keyword present 5 times in a natural way), and have the URL as follows: http://site/key1, then set up a 301 redirect from the old http://site/product-category/Key1. What do you think of this? What should I consider? Am I on the right track? Grateful for any help! // Jonas
Intermediate & Advanced SEO | | knubbz0 -
Best Sitemap Generator XML
Hello everyone, can anyone suggest the best sitemap generator software?
Intermediate & Advanced SEO | | ieplnupur0 -
What is the best way to find related forums in your industry?
Hi Guys, Just wondering what is the best way to find forums in your industry?
Intermediate & Advanced SEO | | edward-may2 -
Best way to help a city-centric service provider market in new nearby territories?
Our client recently acquired new county territories outside the main city. We could create separate location pages under the primary domain, but we are wondering if microsites with unique content (and a location-based URL) that link back to the location pages would also be a good idea. There is some traction for certain location-based keywords in those areas. Is it better to focus on the one domain, or augment it with separate websites in different parts of the state? I can come up with plausible reasons for and against either, but would love your thoughts. Thank you for any insight!
Intermediate & Advanced SEO | | PerfectPitchConcepts0 -
Best way to consolidate link juice
I've got a conundrum I would appreciate your thoughts on. I have a main container page listing a group of products, linking out to individual product pages. The problem I have is that all the product pages target exactly the same keywords as the main container page listing all the products. Initially all my product pages were ranking much higher than the container page, as there was little individual text on the container page, and I believe it was being hit with a duplicate content penalty. To get round this, I have incorporated a chunk of text from each product listed on the container page. However, that now means most of the content on an individual product page is also on the container page, so I am worried that the product pages will now get a duplicate content penalty, as the same content (or most of it) is on the container page. Effectively I want to consolidate the link juice of the product pages back to the container page, but I am not sure how best to do this. Would it be wise to rel=canonical all the product pages back to the container page? Rel=nofollow all the links to the product pages? Or possibly some other method? Thanks
Intermediate & Advanced SEO | | James770 -
What is the best practice when a client is setting up multiple sites/domains
I have a client that is creating separate websites to be used for different purposes. What is the best practice here with regard to not looking spammy? For example, do the domains need to be registered with different companies, hosted on different servers, etc.? Thanks in advance for your response.
Intermediate & Advanced SEO | | Dan-1718030