Is there any way to prevent Google from using structured data on specific pages?
-
I've noticed that Google is now serving what looks like host-specific video cards on mobile for our site. Is there any way to control which videos are included in these lists without removing the structured data on those clip pages or user pages?
We don't want to noindex those pages, but we also don't want content from those pages to appear as video cards.
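For context, the clip pages carry video structured data roughly along these lines (a hypothetical sketch; the URLs and values below are placeholders, not our real markup):

```html
<!-- Hypothetical example of the VideoObject markup on a clip page; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example clip title",
  "description": "Short description of the clip.",
  "thumbnailUrl": "https://example.com/clips/123/thumb.jpg",
  "uploadDate": "2016-01-15",
  "contentUrl": "https://example.com/clips/123.mp4"
}
</script>
```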
-
Hi Garrett, I'm not sure I understand the question completely. The screenshot example you shared was a YouTube result.
Craig
Related Questions
-
Page Rank on Moz compared to Ahrefs
There seems to be a big philosophical difference in how Moz and Ahrefs calculate page authority (PA). On Moz, PA is heavily dependent on a site's DA. For instance, any new page, or any page with no backlinks, on a 90 DA site will have around 40 PA. However, on a site with around 40 DA, any new page or page with no backlinks will have around 15 PA. If one were to get tons of backlinks to this 40 DA / 15 PA page, that would raise the page's PA slightly, but it would likely never go beyond 40 PA. This hints that one would rather acquire a backlink from a page on a high DA site, even if that page has zero links pointing to it, than a backlink from a page on a low DA site with many, many backlinks to it.

This is very different from how Ahrefs calculates PA. For Ahrefs, any new page or page with no backlinks will have a PA of around 8-10, no matter what the DA of the site is. When a page on a 40 DA site begins acquiring a few links, it will quickly overtake a page on a 90 DA site with no links to it. The big difference is that for Ahrefs, a page's PA depends far more on how many inbound links the page has, whereas for Moz, a page's PA depends far more on the DA of the site the page is on.

If we trust Moz's PA calculations, SEOs should emphasize getting links from high DA sites; if we trust Ahrefs' PA calculations, SEOs should focus less on that and more on building links to whatever page they want to rank (even if that page is on a low DA site). So what do you think? Do you agree more with Moz's or Ahrefs' valuation of PA? Is a page's PA more dependent on the site's DA or on its total inbound links?
Algorithm Updates | | ButtaC1 -
Link reclamation: What happens when backlinks point to a page other than the most related page? Any risks?
Hi all, We have started the link reclamation process because we failed to redirect our old website links to the newly created pages. Unfortunately, most of the backlinks point to a page which already has lots of backlinks. I'm wondering whether I can redirect the old pages to pages other than the most closely related one, so that a single page doesn't take all the backlinks. And what happens if Google finds that a backlink points to a different page than the most related one? Thanks
Algorithm Updates | | vtmoz0 -
CTR question with home page and product pages
Do you believe that the advantage of targeting a search term on the home page is now smaller than before? As I understand it, CTR is a big factor now, and as far as I can see, if two pages are equal on-page, the better CTR will win out. The issue with the home page is that SERP stars cannot be used, so the CTR on a product page will be higher. I feel that even if you were able to get a home page ranking quicker (one year instead of two), you would still lose out in the end because the product page wins on CTR. Do you think this is correct?
Algorithm Updates | | BobAnderson0 -
Drop in traffic from Google, but no change in the rankings
I have seen a 20% drop in traffic from Google over the last week (after April 29th). However, when I analyze the rankings of the keywords that send me traffic in the Google results, they seem to be the same. Today (6th March) traffic has fallen further, again with little or no visible change in the rankings. Any ideas on what the reason for this could be? I have not made any changes to the website recently.
Algorithm Updates | | raghavkapur0 -
Can a Google data refresh knock your pages out of the rankings?
I see that around mid-November 2013 a handful of my site's pages dropped off Google completely. It was around the data refreshes in November, and while everyone says a refresh doesn't affect that much, I was wondering if anyone knew whether it could knock some of my pages out of the rankings for a specific keyword. Note: we had previously held multiple listings for different pages on our site for this particular keyword. Google kept the highest-ranking one and knocked the lower ones off. See the attached image of our keyword ranking history to see what I mean.
Algorithm Updates | | franchisesolutions0 -
Has Google had problems indexing pages that use <base href=""> in the last few days?
For the past couple of days, Google Webmaster Tools has been showing a lot more 404 errors than normal. If I go through the list I find very strange URLs that look like two paths put together. For example:

http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm

If I check which page Google found that path on, it shows me the following URL:

http://www.domain.de/languages/languageschools/havanna/spanishcourse.htm

If I check the source code of that page, the link leading to the London page looks like the following:

[...](languages/languageschools/london/london.htm)

So to me it looks like Google is ignoring the <base href="..."> and putting the path together as follows:

Part 1) http://www.domain.de/languages/languageschools/havanna/ (the page's own directory, instead of the base href)
Part 2) languages/languageschools/london/london.htm

The result is the wrong path: http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm

I know finding a solution is not difficult; I can use absolute paths instead of relative ones. But:
- Has anyone had the same experience?
- Do you know of other reasons that could cause such a problem?

P.S.: I am quite sure that the CMS (Typo3) is not generating these paths randomly. I would like to be sure before we change the CMS's settings to absolute paths!
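To illustrate what I think is happening, here is a minimal sketch, assuming the base points at the site root (the markup is simplified and hypothetical, not our exact template):

```html
<!-- Assumed setup: base element pointing at the site root -->
<base href="http://www.domain.de/">

<!-- Relative link as it appears on the Havanna page -->
<a href="languages/languageschools/london/london.htm">London</a>

<!--
  With the base honored, the relative href resolves to:
    http://www.domain.de/languages/languageschools/london/london.htm

  If the base is ignored and the href is resolved against the page's own
  directory (.../languageschools/havanna/), you get exactly the doubled
  path reported in Webmaster Tools:
    http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm
-->
```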
Algorithm Updates | | SimCaffe0 -
Does Google have the worst site usability?
Google tells us to make our sites better for our readers, which we are doing, but do you think Google has horrible site usability? For example, in Webmaster Tools I'm constantly confused by their changes and the way they just drop things. In the HTML suggestions area, they don't tell you when the data was last updated, so the only way to tell is to download the files and check. In URL removals, they used to show you the URLs they had removed; now that is gone, and the only way you can check is to try adding one. We don't have any URL parameters, so any parameters are the result of some other site tacking stuff onto the end of our URLs, and there is no way to tell Google that we don't have any parameters and to ignore them all. Also, they add newly found parameters to the end of the list, so the only way to check is to click through to the end of the list.
Algorithm Updates | | loopyal0 -
Best way to find new studies?
I'm wondering how sites break news without dedicating too much time to research. Are there any tools that you use? For example, if you wanted to be one of the first people to know when a celebrity was diagnosed with a health condition, how would you go about doing that?
Algorithm Updates | | nicole.healthline0