Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Is there a way to prevent Google Alerts from picking up old press releases?
-
I have a client that wants a lot of old press releases (pdfs) added to their news page, but they don't want these to show up in Google Alerts. Is there a way for me to prevent this?
-
Thanks for the post, Keri.
Yep, the OCR option would still make the image option for hiding them moot.
-
Harder, but certainly not impossible. I had Google Alerts come up on scanned PDF copies of newsletters from the 1980s and 1990s that were images.
The files recently moved and aren't showing up for the query, but I did see something else interesting. When I went to view one of the newsletters (https://docs.google.com/file/d/0B2S0WP3ixBdTVWg3RmFadF91ek0/edit?pli=1), it said "extracting text" for a few moments, then had a search box where I could search the document. On the fly, Google was doing OCR work and seemed decently accurate in the couple of tests I had done. There's a whole bunch of these newsletters at http://www.modelwarshipcombat.com/howto.shtml#hullbusters if you want to mess around with it at all.
-
Well, that is how to exclude them from an alert that they set up, but I think they are talking about anyone who might set up an alert that would find the PDFs.
One other idea I had that I think may help: if you set up the PDFs as images rather than text, it would be harder for Google to "read" the PDFs and catalog them properly for an alert - but this would have nearly the same net effect as not having the PDFs in the index at all.
Danielle, my other question would be - why do they give a crap about Google Alerts specifically? There have been all kinds of issues with the service, and if someone is really interested in finding out info on the company, there are other ways to monitor a website than Google Alerts. I used to use services that simply monitor a page (say, the news release page) and let me know when it is updated; this was often faster than Google Alerts, and I would find stuff on a page before others who only used Google Alerts. I think they are being kind of myopic about the whole approach, and blocking for Google Alerts may not help them as much as they think. Way more people simply search on Google than use Alerts.
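That kind of page monitor is also easy to roll yourself. A minimal sketch in Python (stdlib only; the fetch step is left to whatever HTTP client you use, and the example HTML is purely illustrative) just compares a content hash between checks:

```python
import hashlib

def page_changed(previous_hash, page_html):
    """Compare a page's current content hash against the last-seen hash.

    Returns (changed, new_hash); store new_hash for the next check.
    """
    new_hash = hashlib.sha256(page_html.encode("utf-8")).hexdigest()
    return previous_hash != new_hash, new_hash

# First check: no previous hash stored, so anything counts as "changed".
changed, h = page_changed(None, "<html>Press releases: v1</html>")

# A later check against identical content reports no change.
changed_again, _ = page_changed(h, "<html>Press releases: v1</html>")
```

Run it on a schedule (cron, for instance) against the news-release page and you get a notification path that doesn't depend on Google Alerts at all.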
-
The easiest thing to do in this situation would be to add negative keywords or advanced operators to your Google Alert that prevent the new pages from triggering it. You can do this by adding advanced operators that exclude an exact-match phrase, a file type, the client's domain, or just a specific directory. If all the new PDF files will be in the same directory or share a common URL structure, you can exclude them using the "-inurl:" operator.
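As a sketch of how those exclusion operators combine into one alert query (the brand name, directory, and file type below are hypothetical placeholders, not details from the thread):

```python
def build_alert_query(brand, excluded_dir=None, excluded_filetype=None):
    """Assemble a Google Alerts query string with exclusion operators."""
    parts = [f'"{brand}"']            # exact-match phrase for the alert
    if excluded_dir:
        parts.append(f"-inurl:{excluded_dir}")       # skip URLs containing this path
    if excluded_filetype:
        parts.append(f"-filetype:{excluded_filetype}")  # skip this file type
    return " ".join(parts)

query = build_alert_query("Acme Corp",
                          excluded_dir="press-releases",
                          excluded_filetype="pdf")
# "Acme Corp" -inurl:press-releases -filetype:pdf
```

Paste the resulting string into the alert's query box; anything under the excluded directory or matching the excluded file type should then stop triggering that alert.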
-
That also presumes Google Alerts is anything near accurate. I've had it come up with things that have been on the web for years and for whatever reason, Google thinks they are new.
-
That was what I was thinking would have to be done... It's a little complicated as to why they don't want them showing up in Alerts. They do want them showing up on the web, just not as an Alert. I'll let them know they can't have it both ways!
-
Use robots.txt and exclude those files. Note that this takes them out of the web index in general, so they will not show up in searches at all.
You need to ask your client why they are putting things on the web if they do not want them to be found. If they do not want them found, don't put them up on the web.
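A minimal sketch of that robots.txt approach, sanity-checked with Python's standard-library parser (the /press-releases/ directory and example.com URLs are assumptions for illustration, not the client's actual paths):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the directory that holds the old PDFs.
robots_txt = """User-agent: *
Disallow: /press-releases/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# Blocked: anything under the excluded directory.
print(rp.can_fetch("*", "https://example.com/press-releases/2009-launch.pdf"))  # False
# Still crawlable: the rest of the site.
print(rp.can_fetch("*", "https://example.com/news/current.html"))  # True
```

Keep in mind the caveat above: a disallow rule keeps compliant crawlers away from those URLs entirely, not just out of Alerts.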