Best Practices
-
Okay, this would be a piece of cake for most of you out there...
What are the best practices once you add a page or piece of content to your website targeting a new keyword, one you have never used before but plan to use on every relevant new page you add? How do you ensure that Google will crawl that page?
Secondly, if you add the new keyword to older pieces of content/pages you have already published, editing the content to suit that keyword, how would you ensure that they get recrawled by Google?
Thanks in advance
-
Sorry I missed this!
If you have your website architecture set up well, you can always request that Google index a page and all pages that it links to. You'll see this option when you click the Submit to index button. You won't have to submit a large number of individual pages this way.
I personally would keep an eye on the pages of most value: the pages you are optimizing that show up in the search results and are generating traffic.
Hope this helps.
-
Andreas,
Thanks for the tip. Will do.
Regards,
-
RangeMarketing,
Thank you for your response. I will do that now for sure.
Also, do you think I should make it a regular exercise to check when each page was last crawled? Our website has more than 20k pages. What's the best way to figure that out? Which tool do you recommend?
Thanks
-
RangeMarketing is right, but there is an even easier way too: share the page on Google+.
I've noticed it is sometimes faster. But usually I use Fetch as Google in both cases, like RangeMarketing said.
-
If you have internal links pointing to the page with the new/updated content, Google will eventually find it; however, the quickest way to make this happen is to request a crawl in Google Webmaster Tools.
Under Crawl > Fetch as Google
Once the status of the page loads, you should see a button labeled Submit to index. Click this to submit the page to be indexed.
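For bulk updates you don't need to fetch every URL one at a time: keeping an XML sitemap up to date and pinging Google when it changes covers the rest. A minimal sketch in Python (the sitemap URL is a hypothetical placeholder):

```python
import urllib.parse
import urllib.request

# Hypothetical sitemap URL -- substitute your own.
sitemap_url = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint takes the sitemap location as a query parameter.
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping_url) as response:
    # A 200 response means the ping was received, not that every URL was crawled.
    print(response.status, response.reason)
```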
There are free tools available to find out the last time Google crawled a specific page. I personally use the free SEO Book Toolbar. I believe Moz's free toolbar does this as well, but I could be wrong.
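At the scale of 20k+ pages a toolbar won't cut it; your own server access logs are the definitive record of when Googlebot last fetched each URL. A rough sketch in Python, assuming a combined-format access log at a hypothetical path:

```python
import re

# Hypothetical path -- point this at your real access log.
LOG_PATH = "access.log"

# Combined log format: host ident user [date] "request" status bytes "referer" "user-agent".
# Only GET requests are matched, which covers the bulk of Googlebot's fetches.
line_re = re.compile(
    r'\S+ \S+ \S+ \[([^\]]+)\] "GET (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

last_crawl = {}
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.match(line)
        if not m:
            continue
        timestamp, url, user_agent = m.groups()
        # Googlebot identifies itself in the user-agent string; verify by
        # reverse DNS if you need certainty, since the string can be spoofed.
        if "Googlebot" in user_agent:
            last_crawl[url] = timestamp  # log lines are chronological, so keep the latest

for url, ts in sorted(last_crawl.items()):
    print(ts, url)
```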
Related Questions
-
Google for Jobs best practice for Job Boards?
I head up SEO for a niche job board. We disallowed our job ad pages (/job/) in robots.txt, as they are user-generated content and were really eating up our crawl budget, causing penalties, etc. Now that Google for Jobs has hit the UK (our strongest region for traffic), I'm torn about what to do next. Our jobs will only show in GfJ if we remove the jobs pages from robots.txt, apply the prescribed structured data to every single jobs page, and monitor this constantly. I will also have to continually rely on our website developers to noindex/canonicalize new job pages and paginations. Is GfJ worth it? I have spoken to one other job board that has seen more brand awareness from appearing in GfJ but almost no increase in traffic/applications. But are we missing a trick here? Any advice would be greatly appreciated.
Intermediate & Advanced SEO | gracekimberley11
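For reference, the structured data Google for Jobs relies on is schema.org JobPosting markup embedded as JSON-LD on each job page. A minimal sketch of generating that block in Python; every field value here is a hypothetical placeholder:

```python
import json

# All values below are hypothetical -- see schema.org/JobPosting and Google's
# job posting guidelines for the required properties.
job = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Marketing Manager",
    "description": "<p>Full job description goes here.</p>",
    "datePosted": "2018-07-01",
    "validThrough": "2018-08-01T00:00",
    "hiringOrganization": {"@type": "Organization", "name": "Example Ltd"},
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "London",
            "addressCountry": "GB",
        },
    },
}

# Emit the script block to embed in the job page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(job, indent=2))
print("</script>")
```
-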
Best Method for Press Releases from an SEO Perspective?
Is it best practice to post your news release on your website and THEN submit it to distribution services/news sources, or to wait until it gets out there and then put an announcement on your website with an excerpt, linking to the PR on the most prominent news site?
Intermediate & Advanced SEO | Wizkids9641
-
Glossary/Terms Page - What is the best way?
We have a glossary section on our website with hundreds of terms. At the moment we have it split into letters, e.g. one page with all the terms starting with A, another for B, etc. I am conscious that this is not the best way to do things, as not all of these pages are being indexed and the traffic we get to them is very low. Any suggestions on what would be the best way to improve this? The two ideas I have at the moment are:
1. Have every term on a separate page, ensuring there is enough copy for each term.
2. Leave it as is, but have the URL change once a user scrolls down the page, e.g. the first page would be www.website.com/glossary/a/term-1, then once the user scrolls past this term and onto the next one the URL would change to www.website.com/glossary/a/term-2.
Intermediate & Advanced SEO | brian-madden
-
SEO best practices for embedding content in a map
My company is working on creating destination guides for families exploring where to go on their next vacation. We've been creating and promoting content on our blog for quite some time in preparation for the map-based discovery. The UX people in my company are pushing for design/functionality similar to http://sf.eater.com/maps/the-38-essential-san-francisco-restaurants-january-2015.
From a user perspective, we all love this, but I'm the SEO guy and I'm having a hard time figuring out the best way to guide my team on getting readers to the actual blog article from the left content area. The way they want to do it is to display the content over the map when someone clicks on a pin. Great, but there's no way for me to optimize the map for every article. After all, if we have an article about the best places to snorkel on Maui, I want Google to direct people to the blog article specific to that search term, because that page is the authority on the subject. Additionally, the map page itself will have no original content, because it will be pulling all the blog content from other URLs, which will get no visitors if people read on the map. We also want people, when they find an article they like, to be able to copy a URL to share. If the article is housed on the map page, the URL will be ugly and long (not SEO-friendly), based on parameters from the filters the visitor used to drill down to that article. So I don't think I can simply optimize the filtered map URL. Can I? The others on my team do not want visitors to ping-pong back and forth between map and article and would prefer people stay on the discovery map. We did have a thought that we'd give people the option to click a link to read the article off the map, but I doubt people will do it, which means that page will never be visited, thus crushing its page rank.
So, questions:
1. How can I pass link juice/SEO love from the map page to the actual blog article while keeping the user on the map? Does Google pass that juice if you use iframes? What about doing AJAX calls? Anyone have experience doing this?
2. Am I making a mountain out of a molehill? Should I trust that if I create good content, good UX, and allow people to explore how they prefer, Google will give me the love? Help me Rand Fishkin, you're my only hope!
Intermediate & Advanced SEO | Vacatia_SEO
-
Changing domains - best process to use?
I am about to move my Thailand-focused travel website into a new, broader Asia-focused travel website. The Thailand site has had a sad history with Google (algorithmic, not penalties) so I don't want that history to carry over into the new site. At the same time though, I want to capture the traffic that Google is sending me right now and I would like my search positions on Bing and Yahoo to carry through if possible. Is there a way to make all that happen?
At the moment I have migrated all the posts over to the new domain but I have it blocked to search engines. I am about to start redirecting post for post using meta-refresh redirects with a nofollow for safety. But at the point where I open the new site up to indexing, should I at the same time block the old site from being indexed to prevent duplicate content penalties?
Also, is there a method I can use to selectively 301 redirect posts only if the referrer is Bing or Yahoo, but not Google, before the meta-refresh fires? Or alternatively, a way to meta-refresh redirect if the referrer is Google but 301 redirect otherwise? Or is there a way to "noindex, nofollow" the redirect only if the referrer is Google? Is there a danger of being penalised for doing any of these things?
Late Edit: It occurs to me that if my penalties are algorithmic (e.g. due to bad backlinks), does 301 redirection even carry that issue through to the new website? Or is it left behind on the old site?
Intermediate & Advanced SEO | Gavin.Atkinson
-
Best way to move a page without 301
I have a page that currently ranks high for its term. That page is going away for the main website users, meaning all internal site links pointing to that page are going away and will point to a new page. Normally you would just do a 301 redirect to the new URL; however, the old URL will still need to remain as a landing page, since we send paid media traffic to that URL. My question is: what is the best way to deal with that? One thought was to set up a canonical tag; however, my understanding is that the pages need to be identical or very close to the same, and the landing page will be light on content and different from the new main page. Not topically different, but not identical copy or design, etc.
Intermediate & Advanced SEO | IrvCo_Interactive
-
Domain Name Change - Best Practices?
Good day guys, We have a restaurant that is changing its name and domain. However, they are keeping the same server location, same content, and same pages (we are just changing the logo on the website). It just has to go to a new domain. We don't want to lose the value of the current site, and we want to avoid any duplicate content penalties. Could you please advise on the best practices for a domain name change? Thank you.
Intermediate & Advanced SEO | Michael-Goode
-
What is the best way to scrape SERPs for targeted keyword research?
I want to use search operators such as "KEYWORD inurl:blog" to identify potential link targets, then download the target URL, domain, and keyword into an Excel file, and then use SEOTools to evaluate the URLs from the list. I see the link acquisition assistant in the Moz lab, but the listed operators are limited. I'd appreciate any suggestions on doing this at scale, thanks!
Intermediate & Advanced SEO | Qualbe-Marketing-Group
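A rough sketch of the collection-and-export step in Python, assuming the SERP results come from an API or manual export (scraping Google's results directly violates its terms of service); the keywords, URLs, and filename here are hypothetical:

```python
import csv
from urllib.parse import urlparse

def build_query(keyword):
    # Combine the keyword with the inurl: operator to target blog URLs.
    return f"{keyword} inurl:blog"

# Hypothetical results -- in practice these would come from your SERP data source.
results = [
    ("crochet patterns", "https://www.example.com/blog/free-crochet-patterns"),
    ("crochet patterns", "https://blog.example.org/crochet-for-beginners"),
]

# Write keyword, query, URL, and domain to a CSV that Excel can open directly.
with open("link_targets.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword", "query", "url", "domain"])
    for keyword, url in results:
        # urlparse().netloc pulls the domain out of the full URL.
        writer.writerow([keyword, build_query(keyword), url, urlparse(url).netloc])
```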