Best Practices
-
Okay, this should be a piece of cake for most of you out there.
What are the best practices once you add a page or piece of content to your website targeting a new keyword that you have never used before but plan to use on every relevant new page you add? How do you ensure that Google will crawl that page?
Secondly, if you add the new keyword to older pieces of content/pages you have already published by editing the content to suit that keyword, how would you ensure that they get crawled by Google?
Thanks in advance
-
Sorry I missed this!
If you have your website architecture set up well, you can always request that Google index a page and all pages that it links to. You'll see this option when you click the Submit to index button. This way you won't have to submit a large number of individual pages.
I would personally keep an eye on the pages of most value: the pages you are optimizing that show up in the search results and are generating traffic.
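Another way to invite a recrawl of new and updated pages in one request, alongside submitting individual pages, is pinging Google with your XML sitemap. A minimal sketch (the sitemap URL below is a placeholder; substitute your own):

```python
# Build the Google sitemap ping URL for a site's XML sitemap.
# The sitemap location is a placeholder -- substitute your own.
from urllib.parse import quote

SITEMAP = 'http://www.example.com/sitemap.xml'
ping_url = 'http://www.google.com/ping?sitemap=' + quote(SITEMAP, safe='')
print(ping_url)
# Requesting this URL (e.g. with urllib.request.urlopen) asks Google
# to re-read the sitemap and schedule crawls for the URLs listed in it.
```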
Hope this helps.
-
Andreas,
Thanks for the tip. Will do
Regards,
-
RangeMarketing,
Thank you for your response. I will do that now for sure.
Also, do you think I should make it a regular exercise to check when each page was last crawled? Our website has more than 20,000 pages. What's the best way to figure that out, and which tool do you recommend?
Thanks
-
RangeMarketing is right, but there is an even easier way too: share the page on Google+.
I've found it is sometimes faster. Usually, though, I Fetch as Google in both cases, like RangeMarketing said. -
If you have internal links pointing to the page with the new or updated content, Google will eventually find it. However, the quickest way to make this happen is to request a crawl in Google Webmaster Tools.
Under Crawl > Fetch as Google
Once the status of the page loads, you should see a button labeled Submit to index. Click this to submit the page to be indexed.
There are free tools available to find out the last time Google crawled a specific page. I personally use the free SEO Book Toolbar. I believe Moz's free toolbar does this as well, but I could be wrong.
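For a site with 20,000+ pages, checking URLs one at a time in a toolbar doesn't scale; your own server access logs already record every Googlebot visit. A rough sketch of pulling the most recent Googlebot hit per URL from a log (the regex assumes the common Apache/Nginx combined log format; adjust it to your server's format):

```python
import re

# Matches one combined-format access log line, capturing the
# timestamp, requested path, and user-agent string.
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def last_googlebot_crawl(log_lines):
    """Return {url: timestamp of most recent Googlebot request}."""
    last_seen = {}
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        timestamp, url, user_agent = m.groups()
        if 'Googlebot' in user_agent:
            last_seen[url] = timestamp  # log lines are chronological
    return last_seen

sample = [
    '66.249.66.1 - - [10/Mar/2015:09:15:02 +0000] "GET /page-a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [10/Mar/2015:09:16:00 +0000] "GET /page-a HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [11/Mar/2015:02:30:11 +0000] "GET /page-b HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
print(last_googlebot_crawl(sample))
```

Running this over your real logs (and optionally diffing the result against your sitemap's URL list) surfaces the pages Googlebot hasn't touched in a while.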
Related Questions
-
350 (Out of the 750) Internal Links Listed by Webmaster Tools Dynamically Generated - Best to Remove?
Greetings MOZ Community: When visitors enter real estate search parameters on our commercial real estate web site, the parameters are somehow getting indexed as internal links in Google Webmaster Tools. About half of our 700 internal links are derived from these dynamic URLs. It seems to me that these dynamic alphanumeric URL links would dilute the value of the remaining static links. Are the dynamic URLs a major issue? Are they a high priority to remove? The dynamic URLs look like this: /listings/search?fsrepw-search-neighborhood%5B%5D=m_0&fsrepw-search-sq-ft%5B%5D=1&fsrepw-search-price-range%5B%5D=4&fsrepw-search-type-of-space%5B%5D=0&fsrepw-search-lease-type=1 These URLs do not show up when a site: search is done on Google!
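One common way to keep parameterized search URLs like these out of the crawl is a robots.txt rule on the search path (the path is taken from the URLs in the question; confirm it matches your site before using it):

```text
User-agent: *
# Block crawling of faceted-search result URLs such as
# /listings/search?fsrepw-search-neighborhood%5B%5D=m_0&...
Disallow: /listings/search?
```

Alternatively, a rel=canonical from the parameterized pages to the main listings page preserves any link value those URLs have picked up, and the URL Parameters feature in Webmaster Tools can tell Google which parameters to ignore.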
Intermediate & Advanced SEO | Kingalan10 -
How to best handle expired content?
Similar to the eBay situation with "expired" content, what is the best way to approach this? Here are a few examples. With an e-commerce site, for a seasonal category of "Christmas", what's the best way to handle this category page after it's no longer valid? 404? 301? Leave it as-is and date it by year? Another example: if I have an RSS feed of videos from a big provider, say Vevo, what happens when Vevo tells me to "expire" a video because it's no longer available? Thank you!
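For content that is gone for good (like an expired video), a 410 Gone status tells search engines to drop the URL faster than a 404 would. A hypothetical .htaccess sketch (the path below is an assumption for illustration, not from the question):

```apache
# Return 410 Gone for removed video pages so crawlers drop them quickly.
# Seasonal categories like a Christmas page are usually better left live
# (or 301ed to a parent category) so they keep their links year to year.
RedirectMatch 410 ^/videos/removed/
```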
Intermediate & Advanced SEO | JDatSB0 -
Best way to handle traffic from links brought in from old domain.
I've seen many versions of answers to this question both in the forum and throughout the internet; however, none of them seem to specifically address this particular situation. Here goes: I work for a company that has a website (www.example.com) but has also operated under a few different names in the past. I discovered that a friend of the company was still holding onto one of the domains that belonged to one of the older versions of the company (www.asample.com), and he was kind enough to transfer it into our account. My first reaction was to simply 301 redirect the older to the newer. After I did this, I discovered that there were still quite a few active and very relevant links to that domain. Upon my reporting this to the company owners, they were suddenly concerned that a customer might feel misdirected by clicking www.asample.com and having www.example.com pop up. So I constructed a single page on the old domain that explained that www.asample.com was now called www.example.com and provided a link. We recently did a little housecleaning and moved all of our online holdings "under one roof," so to speak, and when the rep was going over things with the owners, he began to exclaim that this was a horrible idea, and that the domain should instead be linked to its own hosting account, with WordPress (or some other CMS) installed and a few pages of content about the companies/subject posted. So the question: which one of these is the most beneficial to the site and business that are currently operating (www.example.com)? I don't see a real problem with any of these answers, but I do see a potentially unneeded expense in the third solution if a simple 301 will bring about the most value. Anyone else dealt with a situation like this?
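For reference, the simple 301 option described first is only a few lines of .htaccess on the old domain (domain names taken from the question):

```apache
# On www.asample.com: permanently redirect every path to the same
# path on www.example.com, passing along the old links' value.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?asample\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```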
Intermediate & Advanced SEO | modulusman0 -
What is the best way to scrape serps for targeted keyword research?
Wanting to use search operators such as "KEYWORD inurl:blog" to identify potential link targets, then download the target URL, domain, and keyword into an Excel file, then use SEOTools to evaluate the URLs from the list. I see the link acquisition assistant in the Moz lab, but the listed operators are limited. Appreciate any suggestions on doing this at scale, thanks!
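One caveat: fetching Google results programmatically violates Google's terms of service, so at scale the safer half to automate is generating the operator queries themselves and working through them (or feeding them to a sanctioned API). A minimal sketch that builds "KEYWORD + operator" query rows into CSV for Excel; the operator list is an illustrative assumption:

```python
import csv
import io
from urllib.parse import quote_plus

# Example operators -- extend with whatever footprints you target.
OPERATORS = ['inurl:blog', 'intitle:"write for us"']

def build_queries(keywords):
    """Return (keyword, operator, search URL) rows for each combination."""
    rows = []
    for kw in keywords:
        for op in OPERATORS:
            q = f'{kw} {op}'
            rows.append((kw, op, 'https://www.google.com/search?q=' + quote_plus(q)))
    return rows

# Write the rows as CSV, ready to open in Excel alongside SEOTools.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(['keyword', 'operator', 'url'])
writer.writerows(build_queries(['link building', 'guest post']))
print(buf.getvalue())
```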
Intermediate & Advanced SEO | Qualbe-Marketing-Group0 -
Best way to de-index content from Google and not Bing?
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not from Bing. What is the best way to go about doing this?
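One Google-only option is a crawler-specific robots meta tag: `name="googlebot"` is honored by Google but not by Bing, which looks for `name="bingbot"`/`name="msnbot"` or the generic `name="robots"`:

```html
<!-- In the <head> of each page to remove from Google only: -->
<meta name="googlebot" content="noindex">
```

For a large quantity of URLs, the same directive can be sent as an HTTP header (`X-Robots-Tag: googlebot: noindex`) from server config rather than editing each template; check Google's current documentation for the exact header syntax before relying on it.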
Intermediate & Advanced SEO | nicole.healthline0 -
What are the Best Practices for moving a blog from subdomain to domain/subcategory?
Howdy SEOmoz fans! (Couldn't resist.) I'm moving a WordPress blog from blog.domain.com to domain.com/blog. Trying to do it right the first time and cover all my bases. Issues I'm trying to handle correctly, in varying degrees of importance: external links, internal links, Google-friendly traffic routing in a dynamic environment (WordPress, 301s, .htaccess, etc.). Thanks so much for any and all input!
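For the external links and traffic-routing items, the core of such a move is usually a wildcard 301 on the subdomain mapping each post to the same slug under the new subfolder. A sketch using the generic domain from the question:

```apache
# In the virtual host (or .htaccess) serving blog.domain.com:
# 301 each old post URL to the same path under domain.com/blog/.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/blog/$1 [R=301,L]
```

This assumes the post slugs stay the same after the move; if WordPress permalinks change too, the rule needs per-pattern rewrites instead of a single catch-all.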
Intermediate & Advanced SEO | NTM1 -
Best way to stop pages being indexed and keeping PageRank
On a discussion forum, for example, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed without diluting PageRank? If we added them to the Disallow rules in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
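On the robots.txt point: a Disallow stops Google from crawling the page at all, so any PageRank flowing into it is effectively stranded and cannot pass through its links. A robots meta tag keeps the page crawlable but out of the index:

```html
<!-- On the posting page itself: stay out of the index, but let
     crawlers follow the page's links so PageRank keeps flowing. -->
<meta name="robots" content="noindex, follow">
```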
Intermediate & Advanced SEO | Peter2640 -
Best approach to launch a new site with new urls - same domain
www.sierratradingpost.com We have a high volume e-commerce website with over 15K items, an average of 150K visits per day and 12.6 pages per visit. We are launching a new website this spring which is currently on a beta sub domain and we are looking for the best strategy that preserves our current search rankings while throttling traffic (possibly 25% per week) to measure results. The new site will be soft launched as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor performance of the new site while still having the old site as a backup. Only when we are fully comfortable with the new site will we submit the 301 redirects and migrate everyone over to the new site. We will have a month or so of running both sites. Except for the homepage the URL structure for the new site is different than the old site. What is our best strategy so we don’t lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues? Here is what we got back from a Google post which may highlight our concerns better: http://www.google.com/support/forum/p/Webmasters/thread?tid=62d0a16c4702a17d&hl=en&fid=62d0a16c4702a17d00049b67b51500a6 Thank You, sincerely, Stephan Woo Cude SEO Specialist scude@sierratradingpost.com
Intermediate & Advanced SEO | STPseo0