Dealing with PDFs?
-
Hello fellow mozzers!
One of our clients does an excellent job of providing great content, and we don't even have to nag them about it (imagine that!). This content usually centers around industry reports, financial analyses, and economic forecasts; however, they always post it in the form of PDFs.
How does Google view PDFs, and is there a way to optimize them? Ideally, I am going to try to get this client set up with a blog-like platform that will use HTML text rather than PDFs, but I wanted to see what info was out there for PDFs.
Thanks!
-
Thank you, Keri, for the helpful resource. I actually ended up doing all of those things for our client. I also found out that the default Drupal 6 robots.txt file does not allow the search engines to index PDFs, images, and Flash. Therefore, one must remove the Disallow: /sites/ line from the robots.txt file.
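For anyone else on Drupal 6, this is the change in question (a sketch showing only the relevant lines of the stock robots.txt):

User-agent: *
# The default Drupal 6 robots.txt ships with this line, which blocks the
# directory where uploaded PDFs, images, and Flash files live:
Disallow: /sites/
# Deleting that one line lets crawlers reach files such as
# /sites/default/files/report.pdf while the rest of the file stays intact.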
-
This doesn't address ranking, but this YOUmoz post does talk about best practices for optimizing PDF content and may help you: http://www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search
-
To be honest, Dana, outside of the basics mentioned, I tended not to go overboard, and many of them started to rank naturally as Google spidered the site. Just remember to give the link to the PDF strong anchor text and, if possible, add a little content around it to explain what visitors can expect in the document. Also remember to add a link to Adobe so that visitors can download the free reader if they don't have it already.
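For example, something along these lines (the file name and copy are just placeholders):

<!-- Strong, descriptive anchor text plus a little context around the link -->
<p>Our latest industry research is available as a free download:</p>
<p><a href="/reports/economic-forecast-2012.pdf">Economic Forecast Report 2012 (PDF, 1.2 MB)</a></p>
<p>Inside you'll find sector-by-sector growth projections and the charts behind them.</p>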
Hope this helps,
Regards,
Andy
-
Thank you, iNet SEO. Excellent resource...
I was also wondering if anyone had any posts or experience with understanding the indexing and ranking of PDF content?
-
Yes, you can optimise PDFs - have a read of this, as it seems to cover most points:
http://www.seoconsultants.com/pdf/seo
Sorry, I forgot to add that PDFs are useful for those who wish to download something to read at a later stage or whilst offline. Don't rush to advise them that HTML is the way to go unless it actually is. I have printed off many a PDF and taken it into meetings with me.
Regards,
Andy
Related Questions
-
Safety Data Sheet PDFs are Showing Higher in Search Results than Product Pages
I have a client who just launched an updated website with WooCommerce added to it. The website also has a page of Safety Data Sheets - PDFs containing information about some of the products. When we do a Google search for many of the products, the Safety Data Sheets show up first in the search results instead of the product pages. Has anyone had this happen and figured out how to solve it?
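One option not raised by the asker is to keep the data sheets out of the index entirely so the product pages can rank instead. A minimal sketch, assuming an Apache server with mod_headers enabled and that every PDF on the site should be excluded:

<FilesMatch "\.pdf$">
  # Serve an X-Robots-Tag header telling search engines not to index PDFs
  Header set X-Robots-Tag "noindex"
</FilesMatch>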
Technical SEO | teamodea0
-
How to deal with URLs when changing shopping cart software to ensure SEO
NSFW ALERT (LINK BELOW) We are changing the shopping section of our website. Currently the products sit on our own website, and when a user goes to checkout they are taken to Mals (a shopping cart site). This means our URLs look like this: NSFW https://www.aprilnites.com.au/mascara_vibe.html The new software is Ecwid, and we are using it with a site created in RapidWeaver, so the URLs will not be clean and will have ? and # parameters. I'm wondering if this will hurt the SEO of our whole site or just the product pages. I'm also unsure of how best to deal with the current URLs. Should I use a 301 redirect on all of them to take the user back to the home page of the shop? For us the shop is more of a catalogue; our main website is the most important part, but I want to make sure we are following best practice when making this change. Hope someone can help. Many thanks
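A sketch of the 301 approach the asker mentions, assuming an Apache .htaccess file; redirecting each old URL to its nearest equivalent in the new shop (rather than everything to the shop home) is usually the safer pattern, and the target path here is a placeholder:

# One 301 per old product URL, pointing at its closest new equivalent
Redirect 301 /mascara_vibe.html /shop/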
Technical SEO | AprilN0
-
How to deal with 80 websites and duplicated content
Consider the following: a client of ours has a job boards website. They also have 80 domains, each in a different job sector, which pull in jobs based on the sectors they were tagged with on the back end. Everything is identical across these websites apart from the brand name and some content. What's the best way to deal with this?
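One common approach for a setup like this, not mentioned by the asker, is a cross-domain rel="canonical" on each duplicated job page, pointing at the copy on whichever domain is treated as primary. A sketch with placeholder domains and paths:

<!-- In the <head> of the sector domain's copy of a job listing -->
<link rel="canonical" href="http://www.main-jobs-site.com/jobs/example-job/">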
Technical SEO | jasondexter0
-
Dealing with closely related pages
I have a book with 8 pages which I offer free on my site: http://www.pottytrainingchart4kids.com/free-potty-training-book/ For technical reasons, each of the 8 pages is on a separate page. This might cause thin or duplicate content, since most of the code is the same besides the images and there isn't much on each page. How would you suggest I deal with this? I remember once reading about rel prev or something like that, but I am not sure if it is applicable. I would like all PageRank to go to the main page. Should I add noindex to the other pages? I am not really sure what I should do to prevent a Panda penalty. Thanks in advance!
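The "rel prev" the asker half-remembers is pagination markup. A sketch of what it would look like on page 2 of the book; the page URLs are guesses at the site's structure, not its actual paths:

<!-- In the <head> of page 2 -->
<link rel="prev" href="http://www.pottytrainingchart4kids.com/free-potty-training-book/page-1/">
<link rel="next" href="http://www.pottytrainingchart4kids.com/free-potty-training-book/page-3/">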
Technical SEO | JillB20130
-
What is the best way to deal with pages whose content changes?
My site features businesses that offer activities for kids. Each business has its own page on my site. Business pages contain a listing of the different activities that organization is putting on (such as events, summer camps, and drop-in activities). Some businesses only offer seasonal activities (for example, during Christmas break and summer camps). The rest of the year the business has no activities and the page is empty. This creates two problems: it's a poor user experience (which I can fix, no problem), but it is also thin content and is sometimes treated as duplicate content. What's the best way to deal with pages whose content can be quite extensive at certain points of the year and shallow or empty at others? Should I include a meta ROBOTS tag to noindex when there is no content, and change the tag to index when there is content? Should I just ignore this problem? Should I remove the page completely and do a redirect? Would love to know people's thoughts.
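The meta robots toggle the asker describes would look like this during the empty season (removing the tag, or switching it back, once activities return):

<!-- Emitted only while the business has no activities listed;
     "follow" keeps the page's links crawlable even while it is noindexed -->
<meta name="robots" content="noindex, follow">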
Technical SEO | ChatterBlock0
-
Dealing with 404 pages
I built a blog on my root domain while I worked on another part of the site at .....co.uk/alpha. I was really careful not to have any links go to /alpha, but it seems Google found and indexed it. The problem is that part of /alpha was a copy of the blog, so we will soon have a lot of duplicate content. The /alpha part is now ready to be moved over to the root domain; the initial plan was to then delete /alpha. But now that it's indexed, I'm worried that I'll have all these 404 pages. I'm not sure what to do. I know I can just do a 301 redirect for all those pages to point to the new ones in case a link comes in, but I need to delete those pages as the server is already very slow. Or does a 301 redirect mean that I don't need those pages anymore? Will those pages still get indexed by Google as separate pages? Please assist.
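On the asker's last point: a 301 does mean the old pages can go, since the redirect rule itself answers requests for the deleted URLs. A sketch of a directory-wide rule, assuming Apache with mod_alias:

# Send every /alpha URL to its equivalent path at the root domain
RedirectMatch 301 ^/alpha/(.*)$ /$1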
Technical SEO | borderbound0
-
Dealing with Dead Pages on an Ecommerce Site
Hello everyone! I'm working on a project for a small jewelry store. They have a store in North Carolina and an ecommerce site (on Shopify - which I loathe!). I'm not exactly an SEO expert, but the client likes the way I handle social media, and I know enough to get them much farther down the road than they are now. The big problem is that almost everything sold is handmade and one of a kind, so the site has LOTS of dead links. I'd love everyone's suggestions on how to best avoid this in the first place as new products are added and promoted via Facebook, Twitter, blog posts, and so on, and how to manage the sold items - I don't think it seems wise to leave them up as "SOLD". The site is http://www.laurajamesjewelry.com. I'm grateful for your assistance and look forward to sharpening my SEO skills! ~Robin
Technical SEO | RobinBertelsen0
-
What's the best way to deal with an entire existing site moving from http to https?
I have a client that just switched their entire site from the standard unsecure (http) to secure (https) because of over-zealous compliance issues around protecting personal information in the health care realm. They currently have the server set up to 302 redirect from the http version of a URL to the https version. My first inclination was to have them simply update that to a 301 and be done with it, but I'd prefer not to have to 301 every URL on the site. I know that putting a rel="canonical" tag on every page that refers to the http version of the URL is a best practice (http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=139394), but should I leave the 302 redirects or update them to 301s? Something seems off to me about the search engines visiting an http page, getting 301 redirected to an https page, and then being told by the canonical tag that it's actually the URL they were just 301 redirected from.
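For reference, the 301 version of that server rule is only a few lines of Apache config (a sketch assuming mod_rewrite; the existing 302 behavior is the same rule with R=302):

RewriteEngine On
# Permanently redirect every http request to its https counterpart
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]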
Technical SEO | JasonCooper0