How to add canonical tags on .aspx pages?
-
What is the proper way (or is it possible) to add canonical tags to website pages that end in .aspx?
If you add a canonical tag to the Master Page, it will put that exact canonical tag on every page, which is bad.
Is there a different version of the tag to put on individual pages?
And one to put on the home page without the Master Page error?
-
Put an asp:PlaceHolder or asp:Literal in the master page. When an inheriting page needs a canonical tag, just set the placeholder/literal value from that page, as in the sketch below.
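A minimal sketch of that approach (the control ID, file names, and URL here are illustrative, not from the original answer):

```aspx
<%-- Site.master: reserve a slot inside <head> for a per-page canonical tag --%>
<head runat="server">
    <asp:Literal ID="CanonicalLiteral" runat="server" />
</head>
```

```csharp
// Products.aspx.cs: fill the master page's literal with this page's canonical URL
protected void Page_Load(object sender, EventArgs e)
{
    var canonical = (System.Web.UI.WebControls.Literal)Master.FindControl("CanonicalLiteral");
    if (canonical != null)
    {
        canonical.Text = "<link rel=\"canonical\" href=\"https://www.example.com/products.aspx\" />";
    }
}
```

Pages that never set the literal render nothing in that slot, so the master page no longer stamps one identical canonical tag across the whole site.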
-
The Master Page is the main template page that all of the .aspx pages on the site are based on. If I put a standard canonical URL tag in the Master Page, that canonical URL will then appear on all of the pages.
-
Yes, that is the correct code for Apache sites, but ASP sites don't have a <head> section.
-
The extensions of the pages won't matter, provided you're able to actually put the canonical tag itself within the <head> of the page. If you put it in the <body>, it'll be ignored.
You only need to put the canonical tag on pages that are duplicates of other pages, and you'll need to be able to specify the correct href for each page's canonical tag, which is the full URL of the page it duplicates. If you can only place the tag on every page rather than just on the duplicate pages, you are still OK, because a page can rel=canonical to itself (according to Matt Cutts). So if all the duplicate URLs and the original URL rel=canonical to the original page, it should work. If you don't even have that level of control, you might not be able to use the canonical tag. I hope that's what you mean by "Master Page"... if you have each page rel=canonical to itself via the Master Page, it sounds like it could solve this for you.
FYI, if you can 301 redirect these duplicate pages to the original page, that's the preferred method of resolving duplicate content issues.
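Since this thread is about .aspx pages, here is a hedged sketch of such a 301 in ASP.NET (assuming .NET 4 or later, where Response.RedirectPermanent is available; the URLs are placeholders):

```csharp
// Duplicate.aspx.cs: permanently redirect this duplicate page to the original URL
protected void Page_Load(object sender, EventArgs e)
{
    // Issues a 301 Moved Permanently response and ends processing of this page
    Response.RedirectPermanent("https://www.example.com/original.aspx");
}
```

On earlier framework versions, the same effect can be had by setting Response.StatusCode to 301, adding a Location header, and ending the response manually.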
-
Correct me if I'm wrong, but isn't it just a matter of adding:
<link rel="canonical" href="URL" />
in the header?
Related Questions
-
How is a Single Page Application (SPA) bad for SEO?
Hi guys. I am quite inspired by the SPA technique. It's really amazing when all your interaction with the site happens on the fly and you don't see any page reloads. I've started implementing the site with this instruction and have already found nice guys to do the design. The only downside of using an SPA that I can see is the SEO part, because the URL does not really change and different pages don't have their own unique URL addresses. Actually they do, but they look like: yoursite.com/#/products, yoursite.com/#/prices, yoursite.com/#/contact. All of them come after the # and are just anchors, so for Google all of these pages are just yoursite.com/. My question is: what is a proven method to implement the URL structure in a Single Page Application so that all the pages are indexed correctly by Google (sorry, I don't mention the other search engines because of market share)? The other question, of course, is examples. It would be great to see real-life site examples, ideally authority sites, which use the SPA technique and are well indexed by search engines.
Web Design | Billy_gym
-
Do we need both an XML sitemap and an .aspx sitemap?
Hi Mozers, we recently switched servers and it came to my attention that we have two sitemaps: an XML version and an .aspx version. This came to light because the .aspx version is causing the site to come to a screeching halt, as it has some complex code and lists over 80,000 products. My question is: do we need both versions of the sitemap? My understanding is that the XML version is for search engine bots and the .aspx version is for customers. I can't imagine that anyone is using our .aspx version, as it is basically a page with 80,000 links and it's buried away on the site, so we were hoping to kill off the .aspx version and keep the .xml version for search engine bots. I wanted to check here first to make sure there would be no negative search engine implications. Any help would be most appreciated. Thanks so much! Patrick
Web Design | gatorpool
-
Should I be using shortcodes for my page content?
Hello, I have a question. Sorry if this has been answered before. Recently I decided to do a little facelift to my main website pages. I wanted to make my testimonials prettier, and found this great plugin for testimonials which creates shortcodes. I love how it looks, but I just realised that when I use images in shortcodes, these are not picked up by search engines 😞 only text is. Image search visibility is pretty important for me, and I'm not sure if I should stick with my plain design and upload images manually with all the alt tags and title tags, or if there is a way to adjust the shortcode so it shows images to search engines. You can see an example here: https://a-fotografy.co.uk/maternity-photographer-edinburgh/ Let me know your thoughts guys. Regards, Armands
Web Design | A_Fotografy
-
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice. I contacted the host provider, and his initial thought was that WordPress was causing the https problem, e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL; it's just waiting as part of the hosting package just in case. But I found that the SSL certificate is still showing up during a crawl.
It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version. I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts regarding that?
As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this. One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create an https to http rewrite rule; https shouldn't even be a crawlable function of the site at all:
RewriteEngine On
RewriteCond %{HTTPS} off
or to disable the SSL completely for now until it becomes a necessity for the website. I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind Regards
Web Design | SEOguy1
-
Bing indexation and handling of the X-ROBOTS tag or AngularJS
Hi MozCommunity, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version. This new site is a fully AngularJS-driven site. There are currently over 2 million pages, and as we develop the new site in the backend, we would like to test out the tech with Google and Bing. We're looking at a pre-render option to create static HTML snapshots of the pages that we care about the most, which will be available in the sitemap.xml.gz. However, we set up 3 completely static HTML control pages: one with no robots meta tag on the page, one with the robots NOINDEX meta tag in the head section, and one with a dynamic header (X-ROBOTS) carrying the NOINDEX directive as well. We expected the one without the meta tag to at least get indexed along with the homepage of the test site. In addition to those 3 control pages, we had 3 more: an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header. With Google, the correct indexation occurred, with only 3 pages being indexed: the homepage, the listing page and the control page without the meta tag. With Bing, however, there's nothing. No page indexed at all. Not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt directive open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool and the Preview Page tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking Bing not to index them. I'm stumped. I don't know what to do next to understand whether Bing can accurately process dynamic headers or AngularJS content. Upon checking Bing Webmaster Tools, there has definitely been crawl activity, since it marked the XML sitemap as successful and put a 4 next to the number of crawled pages. Still no result when running a site: command, though. Google responded perfectly and understood exactly which pages to index and crawl. Has anyone else used dynamic headers or AngularJS and could chime in, perhaps after running similar tests? Thanks in advance for your assistance.
Web Design | AU-SEO
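For context, the "dynamic header" described above is an X-Robots-Tag HTTP header sent with the response rather than a meta tag in the HTML. A minimal sketch of emitting one, shown in ASP.NET purely for illustration (the poster's AngularJS setup may well set it at the web server instead):

```csharp
// Page_Load of a page that should stay out of the index:
// send the robots directive as an HTTP response header instead of an HTML meta tag
protected void Page_Load(object sender, EventArgs e)
{
    Response.AddHeader("X-Robots-Tag", "noindex, nofollow");
}
```
-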
Multiple Local Schemas Per Page
I am working on a mid-size restaurant group's site. The new site (in development) has a drop-down of each of the locations. When you hover over a location in the drop-down, it shows that business's info (NAP). Each of the locations in the nav list uses schema.org markup. I think this would be confusing for search robots: every page ends up with 15 address schemas, and the individual restaurant page's NAP sits below all of the locations' schema/NAP in the DOM. Have any of you dealt with multiple schemas per page or a similar structure?
Web Design | JoshAM
-
What is the best tool to view your page as Googlebot?
Our site was done with asp.net and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that duplicates Googlebot? I have found several but they seem old or inaccurate.
Web Design | EcommerceSite
-
Landing pages vs internal pages.
Hey everyone, I have run into a problem and would greatly appreciate anyone who could weigh in on it. I have a web client that went to an outside vendor for marketing. The client asked me to help them target some keywords, and since I am new to the SEO world, I have proceeded by researching the best keywords for the client. I found 6 that see excellent monthly searches. I then registered the .com and/or .net domain names that match these words, and started building landing pages that reference the keyword and link to his site for more info. My customer sent the first of these sites to the marketer, and he says I am doing things all wrong. He says that rather than having landing pages like this, I should just point the domain names at internal pages of the website. He also says that I should not have different looks for the landing pages from the main site, and that I should have the full site menu on each landing page. I wanted to hear what everyone here has to say about the pros and cons of the ways to do this, because the guy giving the advice has a lower-ranking site than I do, and I have only started working on getting my site ranked this year. He has, at least according to him, been doing this forever. Thanks, Ron
Web Design | bsofttech