Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Can hidden backlinks ever be ok?
-
Hi all,
I'm very new to SEO and still learning a lot.
Is it considered a black hat tactic to wrap a link in a DIV tag, with display set to none (hidden div), and what can the repercussions be?
From what I've learnt so far, this is a very unethical thing to be doing, and the site hosting these links can end up being removed from the Google/Bing/etc. indexes completely. Is this true?
The site hosting these links is a group/parent site for a brand, and each hidden link points to one of the child sites (similar sites, but different companies in different areas).
Thanks in advance!
-
Hi Ryan,
Thanks for the quick feedback.
This clears things up for me a bit. Thanks,
Stephen
-
The line between black hat and white hat tactics is generally clear. The simple question is: does the code exist for the benefit of your site's visitors, or solely to manipulate search engines?
DIV tags are used to group content so CSS rules can be applied to specific pieces of a page. If you have a link contained in a DIV with its display set to none, that link will clearly never be seen by the site's visitors. It is apparent the link exists solely to manipulate search engine results, and it is therefore a black hat tactic.
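To make that concrete, a hidden link of the kind described in the question would look roughly like this (a minimal sketch with a hypothetical URL and anchor text, not markup taken from the poster's site):

<!-- display:none removes the div from the rendered page, so visitors never see the link;
     only crawlers reading the raw HTML encounter it. -->
<div style="display: none;">
  <a href="https://child-site.example.com/">Child company anchor text</a>
</div>

Because the only audience for that anchor text is a crawler, it fails the "does this exist for visitors?" test above.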
When Google and other search engines discover black hat tactics being used on a site, they will take action. The action can be relatively minor, such as ignoring the link. It can be mid-range, such as removing the page containing the link from the index. At the extreme end, they can remove the entire site from the index.
Each search engine has its own internal guidelines on how to handle these issues. Some issues are handled automatically via algorithms, while others are handled by manual review. There are no published standards on exactly which punishments will be handed out for a given violation. It is simply best to completely avoid anything black hat.
Related Questions
-
Backlinks that go to a redirected URL
Hey guys, just wondering: my client has 3 websites, 2 of which will be closed down and their domains permanently redirected to the 1 primary domain. However, they have some high-quality backlinks pointing to the domains that will be redirected. How does this affect SEO?
Domain One (primary - getting redesigned and rebuilt) - not many backlinks
Domain Two (will redirect to Domain One) - has quality backlinks
Domain Three (will redirect to Domain One) - has quality backlinks
When the new website is launched on Domain One I will contact the backlink providers and request they update their URL - I assume that would be best.
Technical SEO | | thinkLukeSEO
-
How can I stop a tracking link from being indexed while still passing link equity?
I have a marketing campaign landing page and it uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/ The problem is that Google is indexing these links as pages in the SERPs. Of course, when they get indexed and then clicked, they show a 400 error because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com. Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ in the robots.txt file; however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL?
Technical SEO | | UnbounceVan
-
"5XX (Server Error)" - How can I fix this?
Hey Mozers! Moz Crawl tells me I am having an issue with my WordPress category - it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue?
Crawl Issues and Notices for: http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues: 1 5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
Technical SEO | | RocketStats
-
What punctuation can you use in meta tags? Are there any Google does not like?
So I know you can use dashes and | in meta tags, but can anyone tell me what other punctuation you can use? Also, it'd be great to know what punctuation you can't use. Thanks!
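For illustration, the dash and pipe punctuation the question refers to typically appears in title and meta description tags like these (hypothetical values):

<title>Blue Widgets | Acme Co. - Free Delivery</title>
<meta name="description" content="Hand-made blue widgets - tested, guaranteed and shipped free.">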
Technical SEO | | Trevorneo
-
How can I Style Long "List Posts" in Wordpress?
Hi All, I have been working on a list post which spans over 100 items. Each item on the list has a quick blurb to explain it, an image and a few resource links. I am trying to find an attractive way to present this long list post in WordPress. I have seen several sites with long list posts; however, they place their items one on top of the other, which yields a VERY long page where the end user has to do a lot of scrolling. Others turn their lists into slideshows, but I have no data on how slides perform against 10-mile-long lists which load in 1 page. I would like to do something similar to what List25.com does, as they present about 5-10 items per page and they seem to have pagination. The pagination part I understand; however, is there a shortcode plugin to format lists in an attractive way just like List25?
Technical SEO | | IvanC
-
Can too many pages hurt crawling and ranking?
Hi, I work for the local yellow pages in Belgium. Over the last few months we introduced a successful technique to boost SEO traffic: we have created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, and so on. All signs (traffic, indexation of XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content? Please advise.
Technical SEO | | TruvoDirectories
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we increment the version number: ?v=1.1... 1.2... 1.3... etc. And the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the /js/ folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | | AndreVanKets
-
What can I do about missing Meta Descriptions for category pages etc.?
On all my campaigns I'm returning high levels of 'Missing Meta Description Tags'. The problem with fixing this is they're all for category, tag and author pages. Is there a way to add a meta description to these pages (there are hundreds) or will it not really have any ranking effect?
Technical SEO | | SiliconBeachTraining