Two basic questions re. Crawl Diagnostic results
-
I'm a novice... I've just run my crawl diagnostics and I wonder how important it is to a) have meta descriptions on every page, and b) keep all titles under 70 characters?
Thanks in advance. Dan.
-
Dan,
the meta description is your chance to pitch to your customer and nothing more. It is entirely possible to throw away the click-through benefit a #1 ranking can give you if your meta description lets you down.
Also, for me it is very important to keep within the 70-character limit for the page title. Overly long titles not only display incorrectly in the search results, but you may also be diluting your effectiveness in targeting a page for a certain topic, both for the user and for the search engines.
-
Thanks both of you, that definitely helps me. Dan.
-
Meta descriptions are an opportunity for you to tell the search engines specifically what your page is about. If you leave the field blank, they will extract an excerpt that they think applies to the page. While this is OK, it might not be exactly what you want displayed in the SERPs.
Use the meta description to make the most of the SERPs page. If your page is about blue widget investment, then your description should contain text that is relevant to that term and that includes, or is at least similar to, it. If it is a large page and you leave the field blank, the search engines can pull in a snippet from an entirely different part of the page that isn't relevant at all.
As for titles: if you have old pages with long titles, it can be a bit of a drag redoing them, but if you are continually producing content I would suggest keeping to under 70 characters from here on, from an "adhering to best practices" point of view.
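If you have a lot of pages to check, a script can do the counting for you. This is a minimal sketch, assuming you've already collected your URLs and their title text into a dict (the sample pages and titles below are hypothetical):

```python
# Flag page titles that exceed the 70-character limit discussed above.
TITLE_LIMIT = 70

# Hypothetical sample data: URL -> <title> text.
pages = {
    "/blue-widgets": "Blue Widget Investment Guide",
    "/widgets/history": "A Very Long and Rambling History of Widgets, Their Uses, and Everything Else We Could Think Of",
}

def over_limit(pages, limit=TITLE_LIMIT):
    """Return {url: title_length} for every title longer than `limit`."""
    return {url: len(title) for url, title in pages.items() if len(title) > limit}

print(over_limit(pages))
```

Running this against a real crawl export would give you a quick to-do list of titles to shorten.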
-
In terms of pure SEO, meta descriptions will have no effect on your rankings. But they are potentially your first chance to present your content to users and convince them to come to your site, so in that respect they are important. Search engines will use page content as a meta description if there isn't one, and that can be fine, but it isn't always what you want.
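To audit which of your pages are actually missing a meta description, you can parse the HTML yourself. A minimal sketch using only Python's standard library (the sample HTML strings are made up for illustration):

```python
# Detect whether a page has a meta description, using only the stdlib.
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    """Records the content of <meta name="description"> if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

def get_description(html):
    finder = MetaDescriptionFinder()
    finder.feed(html)
    return finder.description

# Hypothetical sample pages.
page_with = '<head><meta name="description" content="Invest in blue widgets."></head>'
page_without = "<head><title>Blue Widgets</title></head>"

print(get_description(page_with))     # "Invest in blue widgets."
print(get_description(page_without))  # None
```

Pages that come back `None` are the ones where the search engine will pick its own snippet, so those are the ones worth writing descriptions for first.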
Titles are a little trickier: keywords beyond those 70 characters are unlikely to be "read" by the spiders, which is less than ideal, so it's important to try to say what you need within that 70-character limit.