Duplicate description error: one for meta description, one for og:description
-
I am getting the duplicate description error from Moz. I use both the og:description tag and the meta description tag. I am not sure if that is going to get me penalized by the search engines, or my pages somehow discounted, if I have both a meta description and an og:description on the same page. What does Moz recommend?
NOTE: For years I have followed this best-practices format put out by other sources.
Short Answer: Use both!
Long Answer:
OG stands for Open Graph, which is part of the Open Graph protocol used by platforms such as Facebook.
The meta description element is for search engines such as Google, Yahoo, and Bing.
These are two separate tags that do roughly the same thing, but they are designed for different types of platforms: one for Facebook and the other for search engines. The reasoning is that the Open Graph protocol is richer in what content can be fed to Facebook without scraping the full page (think rich snippets). So images, a description, and more information are fed to Facebook via Open Graph.
Using both is a good idea.
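A minimal head section using both tags side by side might look like the sketch below. The copy, image URL, and og:type value are placeholders, not recommendations for any specific site:

```html
<head>
  <!-- Read by search engines (Google, Yahoo, Bing) for the result snippet -->
  <meta name="description" content="A concise summary of the page for search results.">

  <!-- Read by Facebook and other Open Graph consumers when the page is shared -->
  <meta property="og:description" content="A summary of the page for social shares.">
  <meta property="og:type" content="website">
  <meta property="og:image" content="https://example.com/share-image.jpg">
</head>
```

Each consumer simply ignores the tag it does not understand, which is why keeping both on the same page causes no conflict.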
-
I don't know if it's common practice, but it's not a new fix. I've never heard of any issues with doing it that way (at least no one has ever told me this suggestion threw other errors for them).
-
Hi Mike,
Is it common practice? I have never heard of that before. You are correct in that it does resolve the Moz error.
I will need to see how Google Analytics responds before making the change on all pages. Do you have first-hand experience with whether GA flags the combined description as "ok" or as an "error"?
NOTE: While Moz threw an error with my former structure, GA did not. If I now can eliminate the Moz error, will it throw a GA error?
-
Have you tried combining them into one? e.g.
<meta name="description" property="og:description" content="My meta description copy."/>