Adding Rel Canonical to multiple pages
-
Hi,
Our CMS generates a lot of duplicate content (different versions of every page for 3 different font sizes). There are many other reasons why we should drop this current CMS and go with something else, and we are in the process of doing that. But for now, does anyone know how I would do the following:
I've created a spreadsheet that contains the following:
Column 1: rel="canonical" tag for URL
Column 2: Duplicate Content URL # 1
Column 3: Duplicate Content URL # 2
Column 4: Duplicate Content URL # 3
I want to add the tag from column 1 into the <head> of every page from columns 2, 3, and 4.
What would be a fast way to do this, considering that I have around 1800 rows?
Check the screenshot of the builtwith.com result to see more information about the website if that helps.
Farris
-
Yeah, wish I could give you a simpler answer, but I'm afraid it might end up being a little tricky. Hit the biggest problems first, and at least you can manage time/money a bit. The one bright side is that the rules should be no harder to code in ColdFusion than anything else (PHP, ASP, whatever). It's just the core logic that's tricky.
-
That's what I thought. I need to find someone in the company who knows ColdFusion and go through it with them.
Thanks for your help though. I appreciate it.
Farris
-
Unfortunately, the rules may differ from page to page and will be entirely dependent on how your pages are generated. If it's just a matter of the "index.cfm" version vs. the root ("/") version of a page, those canonicals should be straightforward. For the other parameters, though (like "i", "fs", etc.), it depends entirely on the function of those parameters.
I know ColdFusion reasonably well, and even given that, I couldn't give you a one-size-fits-all rule that would solve the problem. It really has to be guided by your site structure and code/data logic. Personally, I'd start with the pattern that generates the most problems and solve that one first. In other words, if one template (like "/press-releases") generates dozens or hundreds of duplicates, deal with that first. If you solve the top 3-4 problems, you may clean up quite a bit. That could be more effective than trying to fix everything at once.
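To give a feel for the shape such a rule usually takes, here is a purely illustrative sketch (not specific to this site: the parameter split is hypothetical, and on a ColdFusion site the equivalent logic would sit in the templates rather than a standalone script):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical: parameters that only change presentation (e.g. a font-size
# switch) and so should not appear in the canonical URL. Which of the real
# parameters ("fs", "i", ...) belong here depends entirely on what they do.
PRESENTATION_PARAMS = {"fs"}

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    # Normalise the "index.cfm" version of a page to its root ("/") version.
    path = parts.path
    if path.endswith("/index.cfm"):
        path = path[: -len("index.cfm")]
    # Drop presentation-only parameters, keep everything that changes content.
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in PRESENTATION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, path, urlencode(kept), ""))

# Made-up example URL: the font-size parameter is dropped, the content parameter kept.
print(canonical_url("http://www.example.com/press-releases/index.cfm?fs=large&i=42"))
# -> http://www.example.com/press-releases/?i=42
```

The code itself is the easy part; the judgement call is deciding, template by template, which parameters actually change the content.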
-
Here's a spreadsheet sample. I did what Roberto suggested. I have a column with the canonical tag ready for every duplicate content URL.
The site is dynamic. That was the main problem I was facing: I'm not sure how to set the canonicals on each page without going into the HTML and copying the tag from the spreadsheet into the <head> manually.
I added the screenshot of builtwith.com in the main question hoping it would give anyone insight as to how I would code rules to set the canonicals.
-
Could you provide an approximate example that matches your real situation (a fake domain is fine, but with the same basic format)? This is a situation where fake examples that don't match the real situation probably won't help us (or you) much.
Once you have the spreadsheet, how are you going to translate that into tags? If this is a dynamic site, it would be better to be able to code rules to set the canonicals - and potentially much easier.
-
Following the same concept:
- Create a column (Column E) with the opening of the tag, <link rel="canonical" href=", then another column (Column F) with the closing " />.
- In column G enter the following formula: =CONCATENATE(E1, [cell of the duplicate URL], F1).
The end result will have the domain from Column A in it. Follow steps 6 & 7 to complete the process.
Feel free to send me a sample spreadsheet with some info and I can set it up for you.
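For reference, the same concatenation can also be scripted once the spreadsheet is exported to CSV. This is only a rough sketch: the file name and column headings (canonical_url, duplicate_1 to duplicate_3) are hypothetical, and the href here is built from the canonical URL, since that is what the tag needs to point at:

```python
import csv

# Hypothetical CSV export of the spreadsheet: one row per canonical URL,
# with up to three duplicate URLs that should all carry the same tag.
with open("canonicals.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    # Equivalent of the E + URL + F concatenation: build the full tag once per row.
    tag = '<link rel="canonical" href="' + row["canonical_url"] + '" />'
    for key in ("duplicate_1", "duplicate_2", "duplicate_3"):
        if row.get(key):
            # Each duplicate page should get this tag in its <head>.
            print(row[key], "->", tag)
```

Printing is just for checking; the same loop could write the finished tags to a new column or file instead.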
-
Roberto, thank you for your answer. I just realized that I was unclear when I asked the question. I already have the link containing the canonical tag for each of the URLs ready; that is what column A already contains. I need to add that into the <head> section of the pages in columns 2, 3, and 4. I'm just unsure how to do this for 1800 rows, each containing the correct URL in column A and, in columns 2, 3, and 4, the URLs of the duplicate content pages that need the link added to their <head> section. Check the image below to see what I mean. I appreciate the effort though. Farris
-
Farris,
This is the way I would do it.
You have the following columns created:
- Column A: rel="canonical" tag for URL
- Column B: Duplicate Content URL # 1
- Column C: Duplicate Content URL # 2
- Column D: Duplicate Content URL # 3
Follow the next steps:
- Create three more columns (E, F, G) to duplicate columns B, C, D
- Use the following formula in column E: =CONCATENATE(A1,B1)
- Copy the same formula into the columns for C & D (i.e. columns F & G)
- Replace the "B1" in the formula with the respective column reference (i.e. the column for C should use C1, and the column for D should use D1)
- Copy & paste the content of columns E, F, G (the copied columns with formulas) to all the rows.
- Once copied, the information in columns E, F, G should look like the end result that you want.
- If the data is correct, copy columns E, F, G and paste them in the same location, but use Paste Special and paste values only. This will remove your formulas.
I hope this helps.
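As a side note, filling those formulas down 1,800 rows by hand is tedious, so steps 4-5 can also be scripted. A rough sketch with openpyxl, where the file name is hypothetical and the data is assumed to start on row 1 with no header row:

```python
from openpyxl import load_workbook

# Hypothetical file name; if the real sheet has a header row,
# start the loop at row 2 instead of row 1.
wb = load_workbook("duplicates.xlsx")
ws = wb.active

# Write the CONCATENATE formulas into columns E, F and G for every row,
# mirroring steps 2-4 above (E references B, F references C, G references D).
for row in range(1, ws.max_row + 1):
    ws[f"E{row}"] = f"=CONCATENATE(A{row},B{row})"
    ws[f"F{row}"] = f"=CONCATENATE(A{row},C{row})"
    ws[f"G{row}"] = f"=CONCATENATE(A{row},D{row})"

wb.save("duplicates-with-formulas.xlsx")
```

Opening the saved copy and doing the Paste Special step from point 7 (values only) would then finish the job.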
Related Questions
-
Rel=Canonical For Landing Pages
We have PPC landing pages that are also ranking in organic search. We've decided to create new landing pages that have been improved to rank better in natural search. The PPC team however wants to use their original landing pages so we are unable to 301 these pages to the new pages being created. We need to block the old PPC pages from search. Any idea if we can use rel=canonical? The difference between old PPC page and new landing page is much more content to support keyword targeting and provide value to users. Google says it's OK to use rel=canonical if pages are similar but not sure if this applies to us. The old PPC pages have 1 paragraph of content followed by featured products for sale. The new pages have 4-5 paragraphs of content and many more products for sale. The other option would be to add meta noindex to the old PPC landing pages. Curious as to what you guys think. Thanks.
Technical SEO | SoulSurfer80
-
Getting high priority issue for our xxx.com and xxx.com/home as duplicate pages and duplicate page titles can't seem to find anything that needs to be corrected, what might I be missing?
I am getting a high priority issue for our xxx.com and xxx.com/home, reporting both duplicate pages and duplicate page titles in the crawl results. I can't seem to find anything that needs to be corrected; what am I missing? Has anyone else had a similar issue, and how was it corrected?
Technical SEO | tgwebmaster0
-
Is it good to redirect millions of pages to a single page?
My site has approximately 10 lakh (1 million) genuine URLs. But due to some unidentified bugs, the site has created approximately 10 million irrelevant URLs. Since we don't know the origin of these non-relevant links, we want to redirect or remove all these URLs. Please suggest: is it good to redirect such a high number of URLs to the home page, or to throw a 404 for these pages? Or any other suggestions to solve this issue.
Technical SEO | vivekrathore0
-
Rel = prev next AND canonical?
I have product category pages that correctly have rel=prev/next, but the Moz crawl is giving me duplicate content errors. I would not think I also need a canonical, but do I?
Technical SEO | JohnBerger0
-
Two different canonical tags on one page
Due to an error, some of my pages now have two canonical tags on them. One is correct and the other goes to a nonsense URL (404 page). I know I should ideally remove the incorrect ones, but it's a big manual job. Are they doing any harm? Can I just leave them there and let Google figure it out? The correct ones are higher up in the code. Will this make a difference? Any help appreciated.
Technical SEO | ShearingsGroup0
-
Internal search : rel=canonical vs noindex vs robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking if they should be indexed or not, I know they should not according to Google's guidelines. And they make a bunch of duplicated pages, so I want to solve this problem. The thing is, if I noindex them, the site is gonna lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics!!! I thought of blocking them in robots.txt. This solution would not keep them out of the index, but the pages appearing in Google SERPs would then look empty (no title, no description), thus their CTR would plummet and I would lose a bit of traffic too... The last idea I had was to use a rel=canonical tag pointing to the original search page (that is empty, without results), but it would probably have the same effect as noindexing them, wouldn't it? (Never tried, so I'm not sure of this.) Of course I did some research on the subject, but each of my findings recommended only one of the 3 methods! One even recommended noindex + robots.txt block, which is stupid because the noindex would then be useless... Is there somebody who can tell me which option is the best to keep this traffic? Thanks a million
Technical SEO | JohannCR0
-
Adding 'NoIndex Meta' to Prestashop Module & Search pages.
Hi, Looking for a fix for the PrestaShop platform. Looking for the definitive answer on how best to stop the indexing of PrestaShop modules such as "send to a friend", "Best Sellers" and site search pages. We want to be able to add a meta noindex to pages ending in: /search?tag=ball&p=15 or /modules/sendtoafriend/sendtoafriend-form.php. We already have in the robots.txt: Disallow: /search.php and Disallow: /modules/ (Google seems to ignore these). But as a further tool we would like to include the noindex on all these pages too, to stop duplicated pages. I assume this needs to be in either the head.tpl or the .php file of each PrestaShop module? Or is there a general site-wide code fix to put in the metadata to apply a 'Noindex Meta' to certain files. Current meta code here: Please reply with where to add code and what the code should be. Thanks in advance.
Technical SEO | reallyitsme