Forcing Google to Crawl a Backlink URL
-
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e., for Penguin recovery and reconsideration requests).
My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after four months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
-
No problem!
-
Appreciate the ideas. I am considering pointing a link at it, but doing that ethically takes a little more thought and effort. Still, at this point it's probably my best option. Thanks!
-
You might try pinging the URL out or just building a link to it.
-
Both are good ideas. Thank you!
-
Ahhhh, that's a bummer.
Well, you could try submitting a URL from the .gov site that isn't as buried but that links to the URL you want crawled.
You could also try emailing someone who manages the website, giving them a helpful reminder that they have quality pages that aren't being indexed regularly by Google.
Good luck!
-
Thanks for the suggestion! I should have mentioned in the original post that I've already submitted the URL twice via the Submit URL form, and it has yet to show up in Latest Links in Webmaster Tools.
-
You could try the URL submit tool: https://www.google.com/webmasters/tools/submit-url
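If you do point a link at the .gov page from a page you control, one extra nudge is to list your linking page in a sitemap and ping Google about it; Googlebot then only has to follow your link to reach the .gov URL. A rough sketch in Python; the sitemap URL is a hypothetical placeholder, and the ping only applies to sitemaps on a site you run yourself:

```python
# Rough sketch: ask Google to re-fetch a sitemap that includes the page
# carrying the new link. SITEMAP_URL is a hypothetical placeholder.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder

ping_url = ("https://www.google.com/ping?sitemap="
            + urllib.parse.quote(SITEMAP_URL, safe=""))
with urllib.request.urlopen(ping_url) as resp:
    print(resp.status, resp.reason)  # 200 means the ping was received
```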
Related Questions
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly Bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others.

The problem is that I want 1) a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) or 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No single solution covers all three of my problems.

So I came up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. The portion, per bot, can be calculated at runtime from the total server load at that moment. If a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some of its requests are answered with a 503 while the others get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? It will delay indexing speed/latency, but slow server response times also hurt rankings, which is even worse than indexing latency. I'm curious to hear the experts' opinions...
White Hat / Black Hat SEO | internetwerkNU
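For what it's worth, one way to implement the throttle described above is a small piece of middleware that checks whole-server load plus the requesting user agent, and answers bots with a 503 and a Retry-After header when the box is busy. A minimal sketch, assuming a Python WSGI stack; the bot list and load threshold are illustrative placeholders, not tuned values:

```python
# Minimal WSGI middleware sketch: load-aware bot throttling.
# BOT_TOKENS and LOAD_THRESHOLD are illustrative, not recommendations.
import os

BOT_TOKENS = ("googlebot", "bingbot", "ahrefsbot")
LOAD_THRESHOLD = 4.0  # 1-minute load average above which bots get 503s

def is_bot(environ):
    ua = environ.get("HTTP_USER_AGENT", "").lower()
    return any(token in ua for token in BOT_TOKENS)

class BotThrottle:
    """Answer bots with 503 + Retry-After while total server load is high,
    so user traffic keeps priority across every site on the machine."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        load_1min = os.getloadavg()[0]  # whole-server load, not per-site
        if is_bot(environ) and load_1min > LOAD_THRESHOLD:
            start_response("503 Service Unavailable",
                           [("Retry-After", "120"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy; please retry later.\n"]
        return self.app(environ, start_response)
```

Because the check reads the machine-wide load average rather than per-site counters, one middleware deployed in front of all sites addresses points 1) through 3) at once, and the Retry-After header tells well-behaved crawlers when to come back.
-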
Google Manual Penalty - Dilemma?
Hi guys, a while back my company received a 'partial match' manual penalty from Google for 'unnatural links' pointing to our site. This glorious feat was accomplished by our previous SEO agency, which quite heavily spammed links (directories and all kinds of low-quality sites).

That said, when the penalty hit we really didn't see any drop in traffic. In fact, not long after the penalty we launched a new website, and our traffic has since grown quite significantly; we've doubled our total visits from before the penalty until now.

The previous agency also submitted a couple of reconsideration requests, both done loosely in an attempt to fool Google: removing only a small number of links the first time, then a bit more the next time when it failed. This was obviously never going to work. Since then, I have submitted a very thorough reconsideration request myself, disavowing 85 domains (every single one at the domain level rather than as individual URLs, as I didn't want to take any chances) and getting a fair few links removed where webmasters responded. I documented all of this and made multiple contacts with the webmasters so I could show the work to Google. The reconsideration request was still not successful; Google magically produced some new backlinks I had not seen previously.

My main question is: am I going to do more damage by removing more and more links in order to lift the penalty? As it stands, we haven't actually noticed any negative effects from it. Perhaps that's because the new, much-improved site launched shortly after the penalty and would naturally get more traffic than the old one, but overall the penalty has not been felt.

What do you guys think: is it worth risking a drop in rankings to remove the penalty so we don't face future issues, or should I go easy on the link removal to preserve current rankings? (I'm really interested in people's views on this, so please leave a comment if you can help!)
White Hat / Black Hat SEO | Sandicliffe
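For reference, the domain-level entries described above use Google's disavow file format: one directive per line, with lines starting with # treated as comments. A minimal illustration (the domains and URL are placeholders):

```
# Contacted webmasters on 2013-05-01 and 2013-05-15; no response
domain:spammy-directory.example
domain:low-quality-links.example
# A single URL can be listed instead of a whole domain
http://spam.example.net/paid-links.html
```

-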
Google is giving one of my competitors a quasi page 1 monopoly, how can I complain?
Hi,
When you search for "business plan software" on google.co.uk, 7 of the first 11 results come from one company selling two products; see below:
#1. Government site (related to "business plan" but not to "business plan software")
#2. Product 1 from Palo Alto Software (LivePlan)
#3. bplan.co.uk: content site of Palo Alto Software (relevant to "business plan", but only relevant to "business plan software" because it features and links to their Product 1 and Product 2 sites)
#4. Same site as #3 but a different URL
#5. Palo Alto Software Product 2 (Business Plan Pro) page on the Palo Alto Software .co.uk corporate site
#6. Same result as #5 but a different URL (the features page)
#7. Palo Alto Software Product 2 (Business Plan Pro) local site
#8, #9 and #10 are OK
#11. Same as #3 but the .com version instead of the .co.uk
This seems wrong to me, as it creates an illusion of choice for the customer (especially because they use different sites), whereas in reality the results showcase only two products. Only one of Palo Alto Software's competitors is present on page 1 of the search results (the rest are on pages 2 and 3). Have any of you experienced a similar issue in a different sector? What would be the best way to point it out to Google? Thanks in advance, Guillaume
White Hat / Black Hat SEO | tbps
-
Fix Bad Links in Google
I have a client who had some grey-hat SEO done in the past, and some of their backlinks aren't from the best neighborhoods. Google didn't seem to mind until 9/28, when the site literally disappeared for all searches except its domain name. Google still has the site indexed; it's just not showing up. There are no messages in Webmaster Tools.

I know Bing has a tool where you can disavow bad links and ask for them to be discounted. Google doesn't have such a tool, so what is the strategy when you don't have control over the link sources, such as blog comments? Could this have been a delayed ranking change from the latest Penguin update on the 18th? http://www.seomoz.org/google-algorithm-change

Any advice would be greatly appreciated. Thanks, Tom
White Hat / Black Hat SEO | TomBristol
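In the absence of a Google equivalent of Bing's disavow tool, a practical first step is to export your links from Webmaster Tools and group them by host, so cleanup and outreach effort goes to the domains that dominate the profile. A rough sketch, assuming Python and a hypothetical one-URL-per-line export file named links.txt:

```python
# Group a backlink export by hostname to prioritize cleanup/outreach.
# "links.txt" is a hypothetical one-URL-per-line export from Webmaster Tools.
from collections import Counter
from urllib.parse import urlparse

counts = Counter()
with open("links.txt") as f:
    for line in f:
        url = line.strip()
        if url:
            counts[urlparse(url).hostname] += 1

# Hosts with the most inbound links first; audit these for spam patterns.
for host, n in counts.most_common(20):
    print(f"{n:5d}  {host}")
```

-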
First of 183 million with 6 backlinks
Hello everyone, I am from Hungary and I'd like to ask about a Hungarian site: www.viagra.info.hu (you do not need to speak Hungarian to be able to answer). If you type "viagra" into google.hu, this site is first out of 183 million results (not bad) with a wonderful total of 6 backlinks, ranking ahead of Wikipedia and sites with 70,000 backlinks. Additional info: the site is one year old. :) I cannot spot the black hat, but surely there is some. Any ideas?

My other question: how do they manage it that, in the SERPs, three links (their menus) appear below their description tag that are not sitelinks? Someone please help me.
White Hat / Black Hat SEO | sesertin
-
From page 3 to page 75 on Google. Is my site really so bad?
So, a couple of weeks ago I started my first CPA website, just as an experiment to see how well I could do with it. My rankings were getting better every day, and I've been producing a constant stream of unique content for the site to improve them further. Two days ago my rankings dropped straight to the last page of Google for the keyword "acne scar treatment", but Google has not banned me or given my domain a minus penalty: I'm still ranking number 1 for my domain name, they have not dropped the PR, and my keyword page is still in the main index. I'm not even sure what has happened. Am I not allowed to have a CPA website in the search results? The best information I could find on this is: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=76465 But I've been adding new pages with unique content. My site is www.acne-scar-treatment.co. Any advice would be appreciated.
White Hat / Black Hat SEO | tommythecat
-
Google Places
My client offers training at many locations within the UK. These locations/venues are not owned by them; however, I see no problem in setting up a separate listing for each location in Google Places. At the end of the day, a user searching for "Training London" is looking for somewhere in their local area where they can book a course. As my client has a "venue" there, I think there is a good argument that such a listing would be valid. What are your thoughts?
White Hat / Black Hat SEO | cottamg