Dealing with PDFs?
-
Hello fellow mozzers!
One of our clients does an excellent job of producing strong content, and we don't even have to nag them about it (imagine that!). This content usually centers on industry reports, financial analyses, and economic forecasts; however, they always post it in the form of PDFs.
How does Google view PDFs, and is there a way to optimize them? Ideally, I'm going to try to get this client set up on a blog-like platform that will use HTML text rather than PDFs, but I wanted to see what info was out there for PDFs.
Thanks!
-
Thank you, Keri, for the helpful resource. I actually ended up doing all of those things for our client. I also found out that the default Drupal 6 robots.txt file blocks the search engines from crawling PDFs, images, and Flash. Therefore, one must remove the Disallow: /sites/ line from the robots.txt file.
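For anyone hitting the same issue, the rule in question looks roughly like this in the stock Drupal 6 robots.txt (exact contents vary by version, so treat this as a sketch):

    User-agent: *
    # Drupal stores uploaded files (PDFs, images, Flash) under /sites/,
    # so this default rule keeps crawlers away from all of them:
    Disallow: /sites/

Deleting that single Disallow line (or narrowing it so the uploaded-files directory stays reachable) lets Google crawl the PDFs again.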
-
This doesn't address ranking, but this YOUmoz post does talk about best practices for optimizing PDF content and may help you: http://www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search
-
To be honest, Dana, outside of the basics mentioned, I tended not to go overboard, and many of them started to rank naturally as Google spidered the site. Just remember to give the link to the PDF strong anchor text and, if possible, add a little content around it to explain what visitors can expect in the document. Also remember to add a link to Adobe so that visitors can download the free reader if they don't have it already.
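As a quick sketch of the anchor-text point (the filename and copy here are hypothetical), the link to the PDF would carry a descriptive label with a sentence of context around it, rather than a bare "click here":

    <!-- Hypothetical example: descriptive anchor text plus surrounding context -->
    <p>Our analysts break down next year's key indicators in this 24-page report.</p>
    <a href="/reports/economic-forecast-2012.pdf">Download the 2012 Economic Forecast (PDF)</a>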
Hope this helps,
Regards,
Andy
-
Thank you, iNet SEO. Excellent resource...
I was also wondering if anyone had any posts or experience to share on how PDF content gets indexed and ranked?
-
Yes, you can optimise PDFs. Have a read of this, as it seems to cover most points:
http://www.seoconsultants.com/pdf/seo
Sorry, I forgot to add that PDFs are useful for those who wish to download something to read at a later stage or whilst offline. Don't rush to advise them that HTML is the way to go unless it actually is. I have printed off many a PDF and taken it into meetings with me.
Regards,
Andy
Related Questions
-
On our site, some wrong links were entered by mistake and Google crawled them. We have fixed those links, but they still show up in the Not Found errors. Should we just mark them as fixed, or what is the best way to deal with them?
A parameter was not sent, so the links were read as null/city and null/country instead of cityname/city.
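One common way to handle a pattern like this (a sketch, assuming an Apache server and that every bad URL starts with null/) is to return 410 Gone, which the engines tend to drop faster than a plain 404:

    # .htaccess: mark every URL under /null/ as permanently gone
    RedirectMatch 410 ^/null/

Marking the errors as fixed in Webmaster Tools only clears the report; if Google recrawls the URLs and still gets an error, they will reappear.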
Technical SEO | Lybrate0606
-
Dealing with closely related pages
I have a book with 8 pages which I offer free on my site: http://www.pottytrainingchart4kids.com/free-potty-training-book/ For technical reasons, each of the 8 pages is on a separate URL. This might cause thin-content/duplicate-content issues, since most of the code is the same apart from the images and there isn't much on each page. How would you suggest I deal with this? I remember once reading about rel="prev" or something like that, but I am not sure if it is applicable. I would like all PageRank to go to the main page. Should I add noindex to the other pages? I am not really sure what I should do to prevent a Panda penalty. Thanks in advance!
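A minimal sketch of the noindex option raised in the question (the page URL is hypothetical): each of the eight book pages stays crawlable and keeps passing link value, but drops out of the index:

    <!-- In the <head> of each individual book page, e.g. /free-potty-training-book/page-2/ -->
    <meta name="robots" content="noindex, follow">

The rel="prev"/rel="next" markup the question half-remembers signals a paginated series rather than funnelling everything to one page, so if the goal is all value on the main page, noindex,follow is the closer fit.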
Technical SEO | JillB2013
-
How do I deal with duplicate content?
Hi, I'm trying SEOmoz and it's saying that I've got loads of duplicate content. We provide phone numbers for cities all over the world, so we have pages like this: https://www.keshercommunications.com/Romaniavoipnumbers.html https://www.keshercommunications.com/Icelandvoipnumbers.html etc., one for every country. The question is, how do I create pages for each one without them showing up as duplicate content? Each page is generated by the server, but it's impossible to write unique text for each one. Also, the competition seems to have done the same, but Google is listing all their pages when you search for 'DID numbers'. Look for DIDWW or MyDivert.
Technical SEO | DanFromUK
-
Dealing with manual penalty...
I'm in the back-and-forth with Google's Quality Search team at the moment. We discovered a manual penalty on our website and have been trying to get it removed as of late. Problem is, tons of spammy incoming links. We did not ask for or purchase any of these links, it just so happens that spammy websites are linking to our site. Regardless, I've done my best to remove quite a few links in the past week or so, responding to the Quality Search team with a spreadsheet of the links in question and the action taken on each link. No luck so far. I've heard that if I send an email to a website asking for a link removal, I should share that with Google as well. I may try that. Some of the links are posted on websites with no contact info. A WhoIs search brings up a hidden registrant. Removing these links is far from easy. My question is, what are some techniques that are proven to be effective when working your way through the removal of a manual penalty? I know Google isn't going to tell me all of the offending links (they've offered a few examples, we've had those removed - still penalized) so what's the best way for me to find them myself? And, when I have a link removed, it may stay in Webmaster Tools as an active link for a while even though it no longer exists. Does the Quality Search team use Webmaster Tools to check or do they use something else? It's an open-ended question, really. Any help dealing with a manual penalty and what you have done to get that penalty removed is of great help to me. Thanks!
Technical SEO | ccorlando
-
What is the proper way to deal with multiple product pages?
If an e-commerce site selling backpacks has 200 items and only shows 30 per page, what is the proper way to deal with page 2, page 3, etc.? Do you let each page rank separately, even though they are competing on the same keywords? Do you use canonical tags on pages after the first page?
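A sketch of the pagination markup that was the standard answer at the time (the URL pattern here is hypothetical): each page in the series declares its neighbours, which tells the engines the pages form a sequence rather than competing with each other:

    <!-- In the <head> of http://www.example.com/backpacks?page=2 -->
    <link rel="prev" href="http://www.example.com/backpacks">
    <link rel="next" href="http://www.example.com/backpacks?page=3">

On the canonical question: pages 2 and beyond should canonical to themselves (or to a view-all page, if one exists), not to page 1, since canonicalizing everything to page 1 effectively hides the products listed on the deeper pages.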
Technical SEO | youngsgarden
-
Large-Scale Ecommerce: How to Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacked-on additions, which have produced duplicate pages and weird dynamic parameters being indexed. The inbound link structure is spread across duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters which need to be blocked; I'm just not entirely sure of the best method. Dynamic parameters: I can see the option in GWT to block these. Is it that simple? (Do I need to ensure they are deindexed and 301'd?) Duplicate pages: Would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled? Thanks for your help.
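On the mass-301 half, the usual approach is a pattern rule rather than thousands of individual redirects. A hedged sketch (Apache; the sessionid parameter name is hypothetical) that 301s any URL carrying a junk parameter to its clean equivalent:

    # .htaccess: if the query string contains sessionid=, 301 to the same
    # path with the entire query string stripped (the trailing "?" drops it)
    RewriteEngine On
    RewriteCond %{QUERY_STRING} (^|&)sessionid= [NC]
    RewriteRule ^(.*)$ /$1? [R=301,L]

One caution on the "301 then noindex" idea: a redirected URL never serves its HTML, so a noindex tag on it is never seen. Per URL, it has to be one or the other.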
Technical SEO | LukeyJamo
-
How do I deal with duplicate content on the same domain?
I'm trying to find out if there's a way we can combat similar content on different pages of the same site without having to rewrite the whole lot. Any ideas?
Technical SEO | indurain
-
What's the best way to deal with an entire existing site moving from http to https?
I have a client that just switched their entire site from standard non-secure (http) to secure (https) because of over-zealous compliance requirements for protecting personal information in the health care realm. They currently have the server set up to 302 redirect from the http version of a URL to the https version. My first inclination was to have them simply update that to a 301 and be done with it, but I'd prefer not to have to 301 every URL on the site. I know that putting a rel="canonical" tag on every page that refers to the http version of the URL is a best practice (http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=139394), but should I leave the 302 redirects or update them to 301s? Something seems off to me about the search engines visiting an http page, getting 301 redirected to an https page, and then being told by the canonical tag that it's actually the URL they were just 301 redirected from.
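For the "simply update that to a 301" route, a sketch (assuming Apache; adjust for the actual server): one pattern rule covers every URL, so there is no need to 301 each one individually:

    # .htaccess: 301 every http request to its https equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

As for the contradiction the question notes: the usual resolution is to point the rel="canonical" on every page at the https URL, so the redirect and the canonical agree rather than pointing in opposite directions.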
Technical SEO | JasonCooper