Meta Description: How to Implement It?
-
I'm pretty new to SEO and am starting to apply what I've learned to the websites I'm working on. The websites have already been built, but without meta descriptions.
My problem is that when I try to add a meta description, it shifts the whole site's text and design. I am using WordPress to manage and edit the sites.
Any help would be great.
Thanks!
-
All In One SEO Pack is another good free plugin that lets you edit the meta description tag directly on each page.
-
Hi Stuart,
A meta description shouldn't be visible on the page itself (although a piece of the page's content can be used as the description); it belongs in the head section of the HTML.
I can recommend installing a plugin like Yoast SEO for WordPress. It makes it very easy to add a meta description to all your pages.
Good luck!
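To illustrate the point about the head section: a meta description is an invisible tag, not body content. A minimal hand-coded sketch (the description text and page content here are just placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example Page</title>
  <!-- Lives in <head>; search engines may use it for the result snippet,
       but it never renders in the visible page layout -->
  <meta name="description" content="A short summary of this page, roughly 150-160 characters." />
</head>
<body>
  <p>Visible page content is unaffected by the tag above.</p>
</body>
</html>
```

If adding the tag is moving your site's text and design, it was most likely pasted into the body instead of the head, which is exactly the kind of mistake a plugin prevents.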
-
Hey,
Unless there is a specific reason for not using a plugin, I would install Yoast's SEO plugin. It makes managing all the meta tags much easier.
Here is a link: http://yoast.com/wordpress/seo/
It's also totally free.
Related Questions
-
Change in Meta Description - 320 to 160
Why is Google showing only 160 characters instead of 320? Is there any official announcement from Google? I have noticed since last week that in the Google SERPs the description is showing 160 characters again. Please help me with any reliable information.
Technical SEO | HuptechWebseo
-
When do I change out my meta tags after a full website revamp?
We're creating a new version of our entire website - look and feel is completely different, though core functionality and results are the same. Just cleaner, faster, etc. We're doing a temp redirect to the temporary url for testing and to slow-roll the release to some of our users for a more friendly approach. Eventually, the new look and feel will be under the original url. I've researched best practices for the site transfer, including "make sure the meta tags for title and description are exactly the same". The concern I have is that Moz Analytics is detecting a lot of errors in the existing meta tags. They're too long, have changed and become inconsistent after being passed through different hands, and some have keyword stuffing in there. I have plans to change them out and really clean them up... I'm just wondering, when is the best time to do that? Since the tags are bad, should I just do it now but make sure that the old and new are matching? Or should I wait (and for how long?) after the new site is switched over and everything is on the original URL?
Technical SEO | SFMoz
-
Noindex meta tag
Hi, when following Webmaster Tools/Optimization/HTML Improvements, it says that we have duplicate title tags and duplicate meta descriptions for hundreds of pages. As corrective action we have added a noindex meta tag to those pages and also changed the title tags to make sure they are different, but Webmaster Tools still reports that the duplication exists. Is it possible that Googlebot doesn't see our noindex code while crawling? By the way, our SEOmoz report says there is no duplicate title tag or meta description on our site. Google crawled our site today, and we received our SEOmoz report today. Thanks
Technical SEO | iskq
-
Google truncating or altering meta title - affect rankings?
I have a site where the title tag is too long, and the title Google shows is simply the name of the site (I think they get it from ODP, not sure). Anyway, the rankings for the home page have dropped quite a bit. I'm wondering if the change that Google makes affects rankings (i.e. the name of the site doesn't have all the keywords).
Technical SEO | santiago23
-
Timely use of robots.txt and meta noindex
Hi, I have been checking every possible resource on content removal, but I am still unsure how to remove already-indexed content. When I use robots.txt alone, the urls remain in the index; no crawl budget is wasted on them, but having 100,000+ completely identical login pages among the omitted results can't mean anything good. When I use meta noindex alone, I keep my index clean, but I also keep Googlebot busy crawling these no-value pages. When I use robots.txt and meta noindex together on existing content, I am asking Google to ignore my content while at the same time blocking it from ever crawling the noindex tag. Robots.txt plus url removal is still not a good solution, as I have failed to remove directories this way; it seems only exact urls can be removed like that. I need a clear solution which solves both issues (index and crawling). What I am trying now is the following: I remove these directories (one at a time, to test the theory) from the robots.txt file, and at the same time I add the meta noindex tag to all pages within the directory. The number of indexed pages should start decreasing (while useless page crawling increases), and once the number of indexed pages is low or zero, I would put the directory back into robots.txt and keep the noindex on all pages within the directory. Can this work the way I imagine, or do you have a better way of doing so? Thank you in advance for all your help.
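For anyone weighing the same two mechanisms, here is what the per-page half of the approach looks like, using a hypothetical /login/ directory as the example. The noindex directive goes on each page, and it only takes effect while the directory is not blocked in robots.txt, because Googlebot has to be able to fetch the page to see the tag:

```html
<!-- On each page under the hypothetical /login/ directory,
     while the directory is still crawlable (not in robots.txt): -->
<head>
  <meta name="robots" content="noindex" />
</head>
```

Once the pages have dropped out of the index, a crawl block can be restored in robots.txt with a line like `Disallow: /login/`, which matches the staged sequence the question describes.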
Technical SEO | Dilbak
-
Ajax #! URLs, Linking & Meta Refresh
Hi, we recently underwent a platform change, and unfortunately our updated ecom site was coded using JavaScript. The top navigation is uncrawlable, the pertinent product copy is undetectable and duplicated throughout the code, etc. - it needs a lot of work to make it (even somewhat) SEO-friendly. We're in the process of implementing ajax #! on our site, and I've been tasked with creating a document of items to test to see if this solution will help our rankings, indexing, etc. on Google (I've read about the issues with Bing). I have 2 questions: 1. Do I need to notify our content team, who work on our linking strategy, about the new urls? Would we use the #! url (for seo), or would we continue to use the clean url (without the #!) for inbound links? 2. When our site transferred over, we used meta refresh on all of the pages instead of 301s for some reason. Instead of going to a clean url, our meta refresh sends visitors to a browsererrorview page. Would I update it to have the #! in the url? Should I try to clean up the meta refresh so it goes to an actual www. url and not this browsererrorview page? Or just push for the 301? I have read a ton of articles, including the GWT docs, but I can't seem to find any solid information on these specific questions, so any help would be greatly appreciated. Thanks!
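For reference on question 2: a meta refresh is a client-side tag in the page head, while a 301 is sent by the server before any HTML loads, which is why the 301 is the generally recommended way to pass a page's signals to its new address. The target URL below is just a placeholder:

```html
<!-- Client-side meta refresh (the weaker option):
     loads the old page, then redirects after 0 seconds -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/" />
```

A server-side 301 (for example, a `Redirect 301` rule in an Apache .htaccess file) never renders the old page at all and is treated by search engines as a permanent move.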
Technical SEO | Improvements
-
Hyperlinks under description in organic listings ...
If you search on Google for "yosemite rentals", the first organic result that shows up is www.redwoodsinyosemite.com. How are they able to get the links under the description: "Agent Login - Our Gallery", etc.?
Technical SEO | afranklin
-
Backtracking from verification meta tag to the correct Google account is difficult
A Google verification meta tag was created and implemented on a site that I am now responsible for (I took over an SEO project after a long lapse), but no one seems to know which Google account was used to create the meta tag in the first place. I'm finding it very difficult to backtrack from the verification meta tag to the Google account, and all the online help is for those having trouble moving forward with verification. Any suggestions or advice?
Technical SEO | MaryDoherty
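For anyone in the same situation: the tag in the page source looks like the sketch below (the content value here is a made-up placeholder token, not a real one). Because the token is an opaque string rather than a readable account name, it is usually simpler not to hunt for the original account at all - Google Search Console allows multiple verified owners, so you can verify the site again from an account you control, each owner getting their own tag:

```html
<!-- Hypothetical verification tag as it appears in <head>;
     the content value is an opaque placeholder, not tied to
     any readable account name -->
<meta name="google-site-verification" content="AbC123exampleTokenXYZ" />
```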