Best META Fields to Include on New Site
-
I am in the process of transitioning sites to a Drupal CMS and am curious to know what META information to provide on each of the new site pages. Currently, this is the set-up I plan on using:
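(The original snippet did not survive, but a typical baseline set of head tags for a site like this might look something like the following; all values are illustrative placeholders, not the actual setup.)

```html
<head>
  <title>Article Headline | Example News Site</title>
  <!-- Shown in search snippets; keep under roughly 160 characters -->
  <meta name="description" content="A one-sentence summary of this page for searchers.">
  <meta name="keywords" content="news, local, transit">
  <meta name="robots" content="index, follow">
  <meta name="revisit-after" content="7 days">
  <meta http-equiv="Pragma" content="no-cache">
  <meta name="google-site-verification" content="AbC123PlaceholderToken">
</head>
```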
My questions to the community are:
- whether or not I've added all pertinent information, and
- if there's anything I'm overlooking
-
Catalyste,
Yes, this is something that I will need to implement. With that, I also wanted to ask: if we forgo implementing 'rich snippets' and 'authorship' fields in the BETA, is that going to create more problems for development down the line?
More specifically, if we decide to hold off implementing authorship/rich snippets until we launch the actual production site for visitors/customers, will this create more problems than just implementing this stuff now?
Likewise, are there any additional suggestions you'd make for a news organization's site?
-
I would say keep the meta description, since Google may show it in search results, which gives you a little control over what users will see when this page comes up in a search.
Limit it to 160 characters. Beyond that, there is not much else worth adding; meta tags were so abused in the past.
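For example (placeholder wording; the 160-character figure is roughly where Google truncates the snippet, not a hard limit in any spec):

```html
<!-- Front-load the most important words and keep it under ~160 characters -->
<meta name="description"
      content="A concise, plain-language summary of this page that searchers will see in the results snippet.">
```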
If you have time, maybe you should also start thinking about implementing rich snippets:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=99170
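A minimal sketch of what rich-snippet markup could look like using schema.org microdata (hypothetical values; see the Google documentation linked above for the supported types and properties):

```html
<!-- itemscope/itemtype declare the entity; itemprop labels each field for crawlers -->
<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Council Approves New Transit Plan</h1>
  <span itemprop="author">Jane Reporter</span>
  <time itemprop="datePublished" datetime="2013-02-14">February 14, 2013</time>
  <p itemprop="description">A short summary of the story that may appear in rich results.</p>
</div>
```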
-
Agree with Shane on meta robots and revisit-after.
SEOmoz recommends that the keywords tag not be used, as it serves no SEO value; all it does is tell competitors the keywords you are trying to rank for.
If you are active on Facebook, you might also want to consider using the Open Graph and fb:admins meta tags, as they give you Facebook Insights analytics for referral traffic.
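The tags being referred to here are presumably Facebook's Open Graph and fb:admins meta tags; a sketch with placeholder values (the fb:admins ID comes from your own Facebook account):

```html
<meta property="og:title" content="Article Headline">
<meta property="og:type" content="article">
<meta property="og:url" content="http://www.example.com/news/article-headline">
<meta property="og:image" content="http://www.example.com/images/article.jpg">
<!-- Ties the site to a Facebook account so you can see Insights analytics -->
<meta property="fb:admins" content="1234567890">
```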
-
Meta robots and revisit-after are useless, and I think they always were "an old wives' tale" (or at least for the past 4-5 years, give or take): index, follow is the default unless you specify noindex, nofollow or block crawling by other means such as robots.txt.
Revisit-after is disregarded by Google, but I believe there are still bots out there that honor it.
Pragma and site verification are for you to decide.
Pragma just tells the individual browser not to cache the page, and site verification is only for Google Webmaster Tools, which can be done many ways (an uploaded .txt file, a DNS zone record, etc.).
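For reference, the meta-tag forms of both (the verification token is a placeholder; Google Webmaster Tools issues the real one):

```html
<!-- Tells an individual browser not to cache this page -->
<meta http-equiv="Pragma" content="no-cache">
<!-- One of several equivalent ways to verify ownership in Google Webmaster Tools -->
<meta name="google-site-verification" content="AbC123PlaceholderToken">
```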
Some others are author, publisher, and canonical, but especially in the case of canonical, if you do not implement it correctly it can really confuse bots.
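For example (placeholder URLs; the key point with canonical is that every URL variant of a page must point at that page's one true URL):

```html
<!-- All variants of this page (tracking parameters, print views, etc.) carry the same canonical -->
<link rel="canonical" href="http://www.example.com/news/article-headline">
<!-- Authorship and publisher markup pointing at the corresponding Google+ profiles -->
<link rel="author" href="https://plus.google.com/112233445566778899000">
<link rel="publisher" href="https://plus.google.com/100998877665544332211">
```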
Hope this helps
Extra info on Author and Canonical
http://www.seomoz.org/blog/authorship-google-plus-link-building