How to Implement AMP for a Single Blog Post?
-
Hello Moz Team,
I would like to implement AMP for a single blog post, not my whole blog. Is it possible? If yes, then how?
Note - I am already using GTM for my website abcd.com, but I would like to use it for my blog posts only, and my blog is at abcd.com/blog. To clarify, by blog post I mean a URL like abcd.com/blog/my-favorite-dress
Thanks!
-
Hello Moz Team,
Can anyone tell me if it is possible to implement AMP for a single page?
Thanks!
-
I'm curious about AMP also. I currently have a Wix site and it looks like they are implementing some AMP features. Anyone have any thoughts on Wix and AMP?
-
Hi Tim,
Sorry, the given link doesn't show anything related to Tag Manager. Can anyone share an implementation with Tag Manager?
Thanks!
-
Hey Johny, your best bet is to take a look at the AMP Project guide. It should take you through it step by step, from markup and validation through to publishing your posts.
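For context, AMP can be enabled per-URL: you create a standalone AMP copy of just that one post, and it canonicals back to the regular post, so the rest of the blog is untouched. A minimal sketch, assuming the placeholder URLs from the question and a hypothetical GTM container ID (GTM-XXXXXX):

```html
<!doctype html>
<html amp lang="en">
<head>
  <meta charset="utf-8">
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- amp-analytics component, needed for the Tag Manager AMP container -->
  <script async custom-element="amp-analytics"
          src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>
  <title>My Favorite Dress</title>
  <!-- The AMP page canonicals back to the regular post -->
  <link rel="canonical" href="https://abcd.com/blog/my-favorite-dress">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- The required <style amp-boilerplate> block is omitted here for brevity;
       copy it verbatim from the AMP Project boilerplate docs -->
</head>
<body>
  <h1>My favorite dress</h1>
  <!-- GTM offers dedicated AMP containers; GTM-XXXXXX is a placeholder ID -->
  <amp-analytics
      config="https://www.googletagmanager.com/amp.json?id=GTM-XXXXXX&gtm.url=SOURCE_URL"
      data-credentials="include"></amp-analytics>
</body>
</html>
```

On the regular post itself you would add `<link rel="amphtml" href="https://abcd.com/blog/my-favorite-dress/amp">` (the /amp path is an assumption - any URL works), and no other page on the site needs to change. The page can then be checked with the AMP validator before publishing.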
Related Questions
-
Is a page with links to all posts okay?
Hi folks. Instead of an archive page template in my theme (I have my reasons), I am thinking of simply typing the post title as and when I publish a post, and linking to the post from there. Any SEO issues that you can think of? Thanks in advance!
Intermediate & Advanced SEO | Nobody16165422281340 -
URL Parameters, Forms & SEO
Hi, I have some pages on the site which have a quote form. In my site crawl I see these showing as duplicate content; my webmaster says this isn't the case, but I'm not sure. Landing page - https://www.key.co.uk/en/key/high-esd-chairs Page with form - https://www.key.co.uk/en/key/high-esd-chairs?quote-form - this also somehow has a canonical on it pointing to https://www.key.co.uk/en/key/high-esd-chairs?quote-form, which neither of us added. I'm thinking the canonical needs to be updated to https://www.key.co.uk/en/key/high-esd-chairs Is it worth doing this for all these pages, or am I worrying about nothing? Becky
Intermediate & Advanced SEO | BeckyKey0 -
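For what it's worth, the fix Becky describes is a one-line change in the head of the parameterised page, using the URLs from the question:

```html
<!-- In the <head> of https://www.key.co.uk/en/key/high-esd-chairs?quote-form -->
<link rel="canonical" href="https://www.key.co.uk/en/key/high-esd-chairs">
```

A self-referencing canonical on a parameterised URL tells Google the form variant is the preferred version, which is usually the opposite of what's wanted; pointing it at the clean landing page consolidates the duplicates.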
Disavow post Penguin update
As the recent Penguin update acts on backlinks quickly, with immediate impact, does the Disavow tool also show changes within a few days rather than weeks, as it did earlier? How long does it take now to see the impact of a disavow? And should we still disavow some links, even though Google claims it will take care of bad backlinks without passing value from them?
Intermediate & Advanced SEO | vtmoz0 -
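As a reference point, a disavow file is just a plain-text list uploaded through Google's Disavow Tool; the domains below are made-up examples:

```
# disavow.txt - lines starting with # are comments
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow a single linking page:
http://link-farm.example/page-linking-to-us.html
```

The file replaces any previously uploaded one, so it should always contain the complete list of links to disavow, not just new additions.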
Should I delete 100s of weak posts from my website?
I run this website: http://knowledgeweighsnothing.com/ It was initially built to get traffic from Facebook. The vast majority of the 1300+ posts are shorter curation-style posts. Basically I would find excellent sources of information, do a short post highlighting the information, and then link to the original source (and then post to FB and hey presto, 1000s of visitors going through my website). Traffic from FB was so amazing at the time that, 'really stupidly', these posts were written with no regard for search engine rankings. When Facebook reach etc. dropped right off, I started writing full original content posts to gain more traffic from search engines. I am starting to get more and more traffic now from Google etc., but there's still lots to improve. I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality/backlinks and PA. This will probably run into 100s of posts. Is it detrimental to delete so many weak posts from a website? Any and all advice on how to proceed would be greatly received.
Intermediate & Advanced SEO | xpers1 -
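If the weakest posts do get deleted, it's worth deciding what each old URL should return rather than leaving 404s; a sketch in Apache .htaccess terms (the paths are hypothetical, and this assumes an Apache host with mod_alias):

```apache
# 301 a thin post with backlinks to the closest in-depth replacement
Redirect 301 /old-thin-post/ /in-depth-replacement-post/
# Tell crawlers a post is gone for good (returns 410 instead of 404)
Redirect gone /another-thin-post/
```

A rough rule of thumb: 301 URLs that have backlinks or a relevant replacement, and let the rest return 410/404.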
Canonical & rel=NOINDEX used on the same page?
I have a real estate company: www.company.com with approximately 400 agents. When an agent gets hired, we allow them to pick a URL, which we then register and manage. For example: www.AGENT1.com. We then take this agent domain and 301 redirect it to a subdomain of our main site; for example, Agent1.com 301's to agent1.company.com. Each page on the agent subdomain has a canonical back to the corresponding page on www.company.com; for example, agent1.company.com canonicals to www.company.com.
What happened is that Google indexed many URLs on the subdomains, and it seemed like Google ignored the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results. My theory is that Google crawled the subdomain first, indexed the page, and then later crawled the main URL. At that point the two pages actually looked quite different from one another, so Google did not recognize/honor the canonical. For example: Agent1.company.com/category1 gets crawled on day 1; Company.com/category1 gets crawled 5 days later. The content (recently listed properties for sale) on these category pages changes every day. If Google crawled both the subdomain and the main domain pages on the same day, the content would look identical; if the URLs are crawled on different days, the content will not match.
We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block crawling of the subdomains in the robots.txt file until we got the main site "fixed". We have seen a small decrease in organic traffic from Google to our main site since blocking the crawling of the subdomains, whereas with Bing our traffic has dropped almost 80%. After a couple of months, we now have our main site mostly "fixed" and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is basically being wasted with the robots.txt block in place.
Here is my question: if we put a ROBOTS rel=NOINDEX on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version? Basically I want the link juice from the subdomains to pass to our main site, but I do not want those pages competing with our main site for a spot in the search results. Another thought I had was to place the NOINDEX tag only on the category pages (the ones that change every day) and leave it off the product pages (property detail pages, which rarely change). Thank you in advance for any insight.
Intermediate & Advanced SEO | EasyStreet -
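A sketch of the combination being asked about, on one subdomain page (URLs follow the question's placeholders). Note that the robots.txt block would have to be lifted first - Google can't see a noindex or canonical on a page it isn't allowed to crawl:

```html
<!-- In the <head> of agent1.company.com/category1 -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://www.company.com/category1">
```

Whether link equity keeps flowing through a long-noindexed page is debated - Googlers have suggested a long-standing noindex may eventually be treated like noindex,nofollow - so treat this as a sketch of the setup, not a guarantee of the outcome.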
Robots.txt & Duplicate Content
In reviewing my crawl results I have 5666 pages of duplicate content. I believe this is because many of the indexed pages are just different ways to get to the same content. There is one primary culprit: a series of URLs related to CatalogSearch, for example http://www.careerbags.com/catalogsearch/result/index/?q=Mobile I have 10074 of those links indexed according to my Moz crawl. Of those, 5349 are tagged as duplicate content; another 4725 are not. Here are some additional sample links:
http://www.careerbags.com/catalogsearch/result/index/?dir=desc&order=relevance&p=2&q=Amy
http://www.careerbags.com/catalogsearch/result/index/?color=28&q=bellemonde
http://www.careerbags.com/catalogsearch/result/index/?cat=9&color=241&dir=asc&order=relevance&q=baggallini
All of these links are just different ways of searching through our product catalog. My question is: should we disallow catalogsearch via the robots file? Are these links doing more harm than good?
Intermediate & Advanced SEO | Careerbags -
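If the decision is to block those internal search URLs, the robots.txt rule is short; this blocks every URL under /catalogsearch/ for all crawlers, based on the path shown in the sample links:

```
User-agent: *
Disallow: /catalogsearch/
```

One caveat: robots.txt stops crawling, not indexing, so the 10074 already-indexed URLs can linger in the index for a while; a noindex tag on the search result template is the alternative when you need them actively dropped.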
New mydomain.com/blog option vs. blog.mydomain.com option
Our e-commerce site has been on BigCommerce for about a year now. One thing many SEO folks told us is that having a blog located at /blog would help more than a subdomain blog option. BC never had the option to host a blog on their platform (/blog) until now. Since we have lost traffic in the past and are trying everything we can to regain it, I am now wondering if we should purchase the WordPress Site Redirect upgrade and move the subdomain blog (blog.) to the new /blog option. Any help or feedback is very much appreciated. I have attached a screenshot of our main website vs. our blog from Open Site Explorer in case it helps.
Intermediate & Advanced SEO | josh3300 -
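For a hosted blog the WordPress.com Site Redirect upgrade handles the forwarding; on a self-hosted setup the equivalent subdomain-to-subdirectory move is typically a blanket 301. A sketch assuming Apache with mod_rewrite and the question's placeholder hostnames:

```apache
RewriteEngine On
# Send every blog.mydomain.com URL to the matching path under mydomain.com/blog/
RewriteCond %{HTTP_HOST} ^blog\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ https://mydomain.com/blog/$1 [R=301,L]
```

Mapping each old post URL to its exact new path (rather than redirecting everything to the blog home) is what preserves the link equity the Open Site Explorer screenshot is measuring.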
Where Does Blogging Fit Into SEO
I read an article yesterday that said blogging comes under the heading of social media, which is at the top of the so-called SEO pyramid. I have taken this to mean less time should be spent on social media compared to other areas of SEO. Yet content creation was at the bottom of the pyramid (more time allocated there). Isn't blogging part of content creation? I would have thought there is a limit to what can be done for service/product and landing pages, whereas blogs are a great way to produce more unique content for a website. Any clarification would be appreciated. Thanks - Christina
Intermediate & Advanced SEO | ChristinaRadisic0