Forcing Google to Crawl a Backlink URL
-
I was surprised that I couldn't find much information on this topic, considering that Googlebot has to recrawl a backlink URL before a disavow of that link takes effect (i.e., for Penguin recovery and reconsideration requests).
My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page still hasn't been crawled after four months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
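Before trying to force a crawl, it's worth ruling out the obvious blockers: a robots.txt rule, an error status, or an X-Robots-Tag directive on the .gov page would keep the link from ever being counted. Below is a minimal Python sketch of that sanity check; the URL is a placeholder, not the real page.

```python
# Minimal sanity check before trying to force a crawl: is the backlink URL
# even crawlable and indexable? The URL below is a placeholder.
from urllib import request, robotparser
from urllib.parse import urljoin

backlink_url = "https://example.gov/deeply/buried/page.html"  # hypothetical

# 1. Is Googlebot disallowed in robots.txt?
rp = robotparser.RobotFileParser()
rp.set_url(urljoin(backlink_url, "/robots.txt"))
rp.read()
print("Allowed for Googlebot:", rp.can_fetch("Googlebot", backlink_url))

# 2. Does the page respond with a crawlable status, and is there a
#    noindex/nofollow directive in the X-Robots-Tag header?
req = request.Request(backlink_url, headers={"User-Agent": "Mozilla/5.0"})
with request.urlopen(req) as resp:
    print("HTTP status:", resp.status)
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))
```

This only checks the HTTP header; a meta robots tag in the page HTML would also need a look.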
-
No problem!
-
Appreciate the ideas. I am considering pointing a link at it, but doing that ethically takes a bit more thought and effort. Still, at this point it's probably my best option. Thanks!
-
You might try pinging the URL out, or just building a link of your own to that page so Googlebot has another path to it.
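For anyone unfamiliar with "pinging", it usually means sending a weblogUpdates.ping XML-RPC call to a ping service so aggregators (and, indirectly, crawlers) hear about the URL. A minimal sketch is below; the endpoint and page details are assumptions for illustration, and there's no guarantee that pinging a page you don't control does much.

```python
# A sketch of what "pinging" a URL usually means: a weblogUpdates.ping
# XML-RPC call to a ping service. The endpoint and page details are
# assumptions for illustration, not a guarantee that any given service
# still accepts or forwards such pings.
import xmlrpc.client

PING_ENDPOINT = "http://rpc.pingomatic.com/"  # assumed ping service endpoint
page_title = "Example .gov resources page"    # placeholder
page_url = "https://example.gov/deeply/buried/page.html"  # placeholder

server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
response = server.weblogUpdates.ping(page_title, page_url)
print(response)  # typically a dict like {'flerror': False, 'message': '...'}
```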
-
Both are good ideas. Thank you!
-
Ahhhh, that's a bummer.
Well, you could try submitting a URL from the .gov site that isn't as buried but that links to the URL you want crawled.
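If you go that route, it's worth confirming that the less-buried candidate page actually links to the one you want crawled before you submit it. A quick sketch, with both URLs as placeholders:

```python
# Minimal sketch: confirm a less-buried candidate page on the same site
# actually links to the buried backlink URL. Both URLs are placeholders.
from html.parser import HTMLParser
from urllib import request
from urllib.parse import urljoin

candidate_url = "https://example.gov/resources/"            # hypothetical
target_url = "https://example.gov/deeply/buried/page.html"  # hypothetical

class LinkCollector(HTMLParser):
    """Collects absolute href values from every <a> tag on the page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

html = request.urlopen(candidate_url).read().decode("utf-8", errors="replace")
collector = LinkCollector(candidate_url)
collector.feed(html)
print("Candidate links to target:", target_url in collector.links)
```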
You could also try emailing someone who manages the website, giving them a helpful reminder that they have quality pages that aren't being indexed regularly by Google.
Good luck!
-
Thanks for the suggestion! I should have mentioned in the original post that I've already submitted it twice via the Submit URL form, and the URL has yet to show up under Latest Links in Webmaster Tools.
-
You could try the URL submit tool: https://www.google.com/webmasters/tools/submit-url