Can Page Content & Description Have Same Content?
-
I'm studying my crawl report and there are several warnings regarding missing meta descriptions.
My website is built in WordPress and part of the site is a blog.
Several of these missing-description warnings relate to blog posts, and I was wondering: could I copy the first few lines of each post into its meta description, or would that be considered duplicate content?
Also, a few warnings relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created?
While on the subject of duplicate content: if I had a sidebar with the same information on several pages (the content coming from a WP widget), would that also be considered duplicate content, and would Google penalise me for it?
Would really appreciate some thoughts on this, please.
Thanks,
Iain.
-
Thanks, Tom - I'll have a go at that and make an actual robots.txt file and upload it.
It is odd, though: when I was creating my WP pages there were Yoast options for each page - several of which I set to noindex - yet looking at the virtual robots.txt, that isn't reflected. My file just has:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Thanks again for all your help,
Iain.
-
http://wordpress.org/support/topic/robotstxt-file-4
That's about the only thing I can find on it. Hope you can glean some use out of it. Seems rather complicated for such an easy task.
-
Cheers Tom,
Yeah it is rather strange. There doesn't appear to be another plugin that should be causing this. Yoast is certainly the one relating to SEO.
Iain.
-
I think this is where I run out of useful things to add. That seems very odd to me.
Do you have any other plugins active that might be producing a robots.txt file?
-
Thanks Tom,
When I click Edit Files in Yoast it says:
"If you had a robots.txt file and it was editable, you could edit it from here."
And yet, I do have one (albeit a virtual one, it appears), as it can be viewed here:
http://www.iainmoran.com/robots.txt
If I try to view the site files on the server, via FTP or cPanel, there is no robots.txt file there!
I appear to be using the latest version of Yoast.
Thanks,
Iain.
-
Hey Iain,
There is a way to edit the file with Yoast. There should be a section called Edit Files when you click on the "SEO" part on the left-hand side of your WordPress dashboard. Once in there you should see robots.txt at the top. If you don't see it, you might need to upgrade to the newest version of Yoast.
Thanks,
Tom
-
Thanks so much for your reply, Tom - very useful indeed!
I'm using Yoast SEO for WordPress, which apparently creates a virtual robots.txt, and I can't see any way to edit it as such. Unlike the posts themselves, which I can set to "noindex", the dynamic archive pages I cannot.
I could make my own robots.txt and upload it to my server, but I'm concerned that it would confuse matters and conflict with the virtual one created/managed by Yoast?
Thanks again,
Iain.
-
Hey Iain,
I would do custom meta descriptions if possible. Meta descriptions are generally used to "sell" the content. They don't have any direct effect on ranking, and if you don't feel like writing custom descriptions, Google will just display the first content on the page automatically. It is not considered duplicate content.
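For illustration, a custom meta description is just a tag in the page's head; Yoast fills it in from its per-post snippet field (the description text below is made up):

```html
<meta name="description" content="How I fixed missing meta descriptions on my WordPress blog - a walkthrough of the crawl warnings and what actually matters.">
```

Keep it to roughly 155 characters so Google doesn't truncate it in the results snippet.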
I would also probably remove those blog index pages from your XML sitemap and noindex them if you can, with meta robots or robots.txt. Those pages will produce duplicate content, and you really want to drive people and bots to the posts themselves, not the index pages of the posts.
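As a sketch of what noindexing those archive pages means in practice, the archive template would need to output a standard meta robots tag like the one below (Yoast's archive settings can do this for you). I'd lean towards the meta tag rather than a robots.txt Disallow here, because with date-based permalinks a rule like Disallow: /2013/ could block the posts themselves as well as the archive:

```html
<meta name="robots" content="noindex,follow">
```

The follow part lets Google still crawl through the archive to the individual posts, while keeping the archive page itself out of the index.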
I also wouldn't worry about the sidebar. As long as you are providing a decent amount of unique content on each page, you will be fine.
Hope that helps.
Site looks good!
Tom