Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Is there a way to prevent Google Alerts from picking up old press releases?
-
I have a client that wants a lot of old press releases (pdfs) added to their news page, but they don't want these to show up in Google Alerts. Is there a way for me to prevent this?
-
Thanks for the post Keri.
Yep, with OCR in play, the image option for hiding them would still be moot.
-
Harder, but certainly not impossible. I had Google Alerts come up on scanned PDF copies of newsletters from the 1980s and 1990s that were images.
The files recently moved and aren't showing up for the query, but I did see something else interesting. When I went to view one of the newsletters (https://docs.google.com/file/d/0B2S0WP3ixBdTVWg3RmFadF91ek0/edit?pli=1), it said "extracting text" for a few moments, then showed a search box where I could search the document. Google was doing OCR work on the fly, and it seemed decently accurate in the couple of tests I ran. There's a whole bunch of these newsletters at http://www.modelwarshipcombat.com/howto.shtml#hullbusters if you want to mess around with it at all.
-
Well, that is how to exclude them from an alert that they set up themselves, but I think they are talking about anyone who might set up an alert that would find the PDFs.
One other idea I had that may help: if you set up the PDFs as images rather than text, it would be harder for Google to "read" the PDFs and catalog them properly for an alert. But then this has nearly the same net effect as not having the PDFs in the index at all.
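For what it's worth, the image-only approach can be done from the command line. A hypothetical sketch using Ghostscript's `pdfimage24` device (available in recent Ghostscript releases), which re-renders every page as a raster image so the PDF contains no selectable text; the file names here are placeholders:

```
# Rasterize a text-based PDF into an image-only PDF at 150 dpi.
# "press-release.pdf" and "press-release-image.pdf" are placeholder names.
gs -sDEVICE=pdfimage24 -r150 -o press-release-image.pdf press-release.pdf
```

Keep in mind that, as noted elsewhere in this thread, Google's on-the-fly OCR may still extract the text from the images.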
Danielle, my other question would be: why do they give a crap about Google Alerts specifically? There have been all kinds of issues with the service, and if someone is really interested in finding out info on the company, there are other ways to monitor a website than Google Alerts. I used to use services that simply monitor a page (say, the news release page) and let me know when it is updated. This was often faster than Google Alerts, and I would find stuff on a page before others who only used Google Alerts. I think they are being kind of myopic about the whole approach, and blocking for Google Alerts may not help them as much as they think. Way more people simply search on Google than use Alerts.
-
The easiest thing to do in this situation would be to add negative keywords or advanced operators to your Google Alert so the new pages don't trigger it. You can do this by adding advanced operators that exclude an exact-match phrase, a file type, the client's domain, or just a specific directory. If all the new PDF files will be in the same directory or share a common URL structure, you can exclude them using the `-inurl:` operator.
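To illustrate, here are a few hypothetical alert queries using those operators (the company name, domain, and directory are placeholders):

```
"Acme Widgets" -filetype:pdf            # skip PDF results entirely
"Acme Widgets" -inurl:press-archive     # skip a specific directory
"Acme Widgets" -site:example.com        # skip the whole domain
```

The same operator syntax works in a regular Google search, so you can test a query there before saving it as an alert.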
-
That also presumes Google Alerts is anything near accurate. I've had it come up with things that have been on the web for years and for whatever reason, Google thinks they are new.
-
That was what I was thinking would have to be done... It's a little complicated on why they don't want them showing up in Alerts. They do want them showing up on the web, just not as an Alert. I'll let them know they can't have it both ways!
-
Use robots.txt to exclude those files. Note that this takes them out of the web index in general, so they will not show up in searches either.
You need to ask your client why they are putting things on the web if they do not want them to be found. If they don't want them found, don't put them up on the web.
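A minimal robots.txt sketch, assuming the press releases live in a hypothetical `/news/press-releases/` directory:

```
User-agent: *
Disallow: /news/press-releases/
```

One caveat: robots.txt blocks crawling, but URLs that Google already knows about can still appear in results. If the goal is really to keep the PDFs out of the index, serving them with an `X-Robots-Tag: noindex` HTTP header is the more reliable route — though, as others have said, that defeats the "on the web but not in Alerts" goal.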