text-indent: -9999em = bad SEO?
-
Since we weren't the ones who designed the website mentioned above, there is something we really don't understand. They have replaced text with images using CSS, as in the examples below.
CSS:

div#logo {
    background: #fff url(../images/logo.gif) no-repeat 20px 0px;
    margin-bottom: 30px;
}

div#logo a {
    height: 148px;
    text-indent: -1000em;
    display: block;
}
HTML: (the markup was stripped from the original post)
My question is: on a scale of 1 to 100, how much does this affect SEO? And what if we keep the H1 tags blank, without putting any text between them?
-
I understand your situation and am sympathetic. I am simply trying to offer you the information you are seeking.
If you are performing SEO work on a site, then it is your role to educate the owner/developer on the risks associated with the site in its present condition and to make recommendations.
I am in the fortunate position where I can turn down a client. I would turn away a client who was unwilling or unable to remove black hat methods from their site. I would not want my name associated with such a site, nor the headaches that come with it.
If you choose to continue working with the site, I would strongly recommend a written document in which you advise the client of these issues and their consequences, and have them acknowledge (i.e. sign) it.
As far as making other changes, you are still breaking the rules but are fixing some of the violations. The changes may or may not help you avoid a penalty.
-
This site wasn't designed by me, so I cannot change any code at the moment. The site uses a lot of hidden text: the navigation, the company logo, etc. What if I just keep the image without putting any text between those tags? That way, search engines at least won't find any hidden text in the HTML.
-
When a crawler visits your site, it can see all of your code. It can definitely see your CSS, and Google will de-index a site for this form of manipulation.
Blocking your CSS files would yield the same result. You are required to present the same code to search engines that you do for users.
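For context, blocking a stylesheet from crawlers would be done with a robots.txt rule like this sketch (the `/css/` path is hypothetical):

```
# robots.txt sketch — blocking stylesheets from crawlers.
# NOT recommended: Google's guidelines require serving crawlers
# the same resources that users receive.
User-agent: *
Disallow: /css/
```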
This is called black hat SEO. The company you are trying to cheat is the world's #1 search company, with $30+ billion in revenue last year. A million other websites are trying to get away with cheating. This has all been tried before.
Even if you devised a brilliant new method of cheating the system, it is only a matter of time until you are caught and your site is de-indexed. You would lose 100% of traffic along with any revenue from organic search.
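For what it's worth, a commonly recommended alternative that stays within the guidelines is to serve a real image with descriptive alt text instead of hiding text with CSS. A sketch, with hypothetical file names and company name:

```html
<!-- Sketch: logo as a real image with alt text, so no hidden text
     is needed. src, alt, and dimensions are illustrative. -->
<div id="logo">
  <a href="/">
    <img src="images/logo.gif" alt="Acme Widgets" width="300" height="148">
  </a>
</div>
```

The alt text gives search engines the same information the hidden text was meant to convey, without presenting different content to crawlers and users.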
-
Well, how can Google find out that I'm hiding text between tags, unless one of my competitors reports it?
What if I block the stylesheet using robots.txt?
-
If you hide any H1 tags or content on your site, you will surely be caught and penalized. It's just a question of time.
How costly will the penalty be? Log into Google Webmaster Tools and see what percentage of your site's traffic comes from organic search. Now imagine that being 0% for the few weeks it will take Google to review your reconsideration request. Then weigh that against the relatively minor benefit this manipulation offers your site.