PDFs and indexing
-
Hello and good morning.
I work for a paint manufacturing company in the UK on their SEO campaigns across a couple of websites, and here is my question. By law, paint and chemical products must have data and technical sheets available for download, so should these be included in the sitemap? We auto-generate our sitemaps, which currently include these files with low priorities, and the files never change in terms of name etc.
They basically have a name like 092847.pdf, which cannot be changed, and from an SEO view that doesn't mean a thing. So there's my question: should they be included, and would they carry any value?
-
Thank you. I'm not saying the filenames couldn't be changed, but it would cause a lot of stress for our lab and technical teams, who create these documents and work by the number, whereas imposing a naming structure would make things a mess and leave everything up in the air.
I will look into the back-end keywords, authors, and company name, which may give the PDFs some sort of impact, from what I read in the link above.
-
-
Hi
Sitemaps - yes, include anything in your sitemaps that you want users to be able to find; the more ways you can lead a search engine to a document, the better.
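For reference, here's a minimal sketch of what auto-generated sitemap entries for those datasheets could look like - the domain, path, and product codes are made-up examples, and a low priority fits documents that rarely change:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pdf_codes, base="https://www.example-paints.co.uk/datasheets/"):
    # Root element with the standard sitemap protocol namespace.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for code in pdf_codes:
        url = ET.SubElement(urlset, "url")
        # The numeric code becomes the PDF's URL, e.g. .../092847.pdf
        ET.SubElement(url, "loc").text = f"{base}{code}.pdf"
        # Datasheets rarely change, so a low priority is reasonable.
        ET.SubElement(url, "priority").text = "0.2"
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["092847", "104233"])
print(xml)
```

If you end up with a lot of these, the same approach works for generating a separate PDF-only sitemap referenced from a sitemap index file.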
Filename - it would help if you could change the filenames to include keywords, but if that's not an option then there are other things you can do to optimise each PDF.
There's a good overview of optimising PDFs here - How To Optimize PDF Documents For Search
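Since the filenames are fixed, the document properties (Title, Author, Keywords) are about the only on-page text you fully control. As a rough sketch - the file name and all field values below are invented examples - a utility like ExifTool can set those properties from the command line:

```shell
# Hypothetical example: add searchable document properties to a datasheet
# whose numeric filename can't be changed.
exiftool -Title="Exterior Masonry Paint 092847 - Technical Data Sheet" \
         -Author="Example Paints Ltd" \
         -Keywords="masonry paint, technical data sheet, 092847" \
         092847.pdf
```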
As that post mentions, include links back to your site for maximum value, especially if these documents are shared on other websites. Also, a bit of branding within each PDF (just add a logo) could help you out in some way.
Hope that's helpful
-
Case A:
If the content of the PDFs is valuable - if they also contain some text about the product - I would make them indexable. It will help niche searchers find you.
You might want to make a separate sitemap for these PDFs, just to keep things clean.
Case B:
If it's only numbers and very technical jibber jabber, I wouldn't let them be indexed, since Google won't understand them either.
Update with an interesting story:
A client of mine also had technical PDF sheets online and had put a lot of effort into them. A few (4-5) competitors were linking directly to the PDFs. After a while, we redirected all of that competitor traffic to a special landing page making the case for why my client is the better deal. It's still in place on some of those sites, since some competitors never really checked the PDFs.
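That kind of referrer-based redirect can be sketched in a few lines of Apache config - the competitor domain, PDF path, and landing-page URL below are all hypothetical:

```apache
# Hypothetical .htaccess sketch: visitors who reach a datasheet PDF via a
# link on a competitor's site get a 302 to a comparison landing page instead.
RewriteEngine On
RewriteCond %{HTTP_REFERER} competitor-example\.com [NC]
RewriteRule ^datasheets/.*\.pdf$ /why-choose-us/ [R=302,L]
```

A 302 keeps the redirect reversible, and direct visits and search-engine crawls still reach the PDF normally because they carry no competitor referrer.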
Made my client very happy.
-
Hey there
I can't imagine the filenames themselves having any SEO value, but I can't see the PDFs doing any harm either.
PDFs are crawlable and indexable by the search engines, so I would keep them in your sitemap for the user. I'm quite familiar with your industry (my dad worked in paint and chemical coatings), and I can imagine your target audience being quite specific in their searches, looking for products by code and specification. A PDF is probably the ideal format for this, so having it indexed and sitting on your domain could bring in some organic traffic.
I'd make sure that the PDFs are branded if possible, with clear links back to your site, to funnel any long-tail traffic back to your homepage and sales pages.