Will a sitemap generated by Yoast for a combined WordPress/Magento site map the entire site?
-
Hi
For an ecommerce site that's been developed as a combination of WordPress and Magento and has Yoast installed, will the sitemap (and other Yoast features) map (and apply to) the entire site, or just the WordPress parts?
In other words, does one need to do anything else to get a full sitemap for a combined Magento/WordPress site, or will Yoast cover it all? This link suggests it should be fine, but I'm asking in case anyone else has encountered this and had problems, or found it straightforward:
http://fishpig.co.uk/wordpress-integration/docs/plugins.html
cheers
dan
-
thanks Mark !!
-
thanks Paddy
-
I don't believe Yoast's WordPress SEO plugin will cover Magento files. The plugin runs inside WordPress and pulls the page, post, category, taxonomy, etc. data from WordPress in order to build the sitemap. It creates a separate sitemap for each post type, taxonomy, and so on, is highly configurable, and uses a sitemap index file to tie everything together.
But since WordPress and Magento are separate CMSs with separate databases, the Yoast plugin has no access to (or knowledge of) the Magento data, so it won't build a sitemap file for content that lives in the other CMS.
You can use other tools to create sitemaps for that part of the site, either through manual crawls or software installed on your server, but the Yoast WordPress plugin won't do it.
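One common workaround is to stitch the two systems' sitemaps together yourself with a sitemap index file. Here's a minimal sketch in Python; the two sitemap URLs are assumptions (Yoast typically writes a sitemap index, and Magento has its own sitemap generator), so substitute the actual locations from your install:

```python
# Sketch: build one sitemap index that references both the WordPress (Yoast)
# sitemap and the Magento sitemap, so search engines can be pointed at a
# single file. The example URLs below are placeholders -- check where each
# CMS actually writes its sitemap on your server.
import datetime
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Return a sitemap index XML string referencing each child sitemap."""
    ET.register_namespace("", NS)  # serialize without a namespace prefix
    index = ET.Element("{%s}sitemapindex" % NS)
    today = datetime.date.today().isoformat()
    for url in sitemap_urls:
        sm = ET.SubElement(index, "{%s}sitemap" % NS)
        ET.SubElement(sm, "{%s}loc" % NS).text = url
        ET.SubElement(sm, "{%s}lastmod" % NS).text = today
    return ET.tostring(index, encoding="unicode")

xml = build_sitemap_index([
    "https://www.example.com/sitemap_index.xml",    # Yoast's index (WordPress)
    "https://www.example.com/magento-sitemap.xml",  # Magento's sitemap
])
print(xml)
```

You'd write the result to a file at the site root and submit that one URL to the search engines instead of the two separate sitemaps.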
Mark
-
I don't know about Yoast specifically, but if WordPress and Magento each generate their own sitemaps, both can be submitted under Webmaster Tools.
You can also check the sitemap that Yoast makes to see whether it has both WordPress and Magento links in the XML file (pick two random URLs and search for them).
I would just make sure that the XML is being updated when you add new pages to your site.
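That spot check can be scripted instead of eyeballed. A small sketch: fetch the sitemap, pull out every `<loc>` entry, and confirm one sample URL from each CMS appears (the sample URLs below are made up; use real ones from your site):

```python
# Sketch: list every <loc> in a sitemap and report which sample URLs
# (one WordPress, one Magento) are missing from it.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return all <loc> values from a sitemap or sitemap index document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

def missing_samples(xml_text, samples):
    """Return the sample URLs that are NOT listed in the sitemap."""
    listed = set(sitemap_urls(xml_text))
    return [u for u in samples if u not in listed]

# In practice you'd fetch the live sitemap (e.g. with urllib); an inline
# example document is used here so the sketch is self-contained.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/blog/hello-world/</loc></url>
  <url><loc>https://www.example.com/some-product.html</loc></url>
</urlset>"""

print(missing_samples(sitemap, [
    "https://www.example.com/blog/hello-world/",   # WordPress post
    "https://www.example.com/some-product.html",   # Magento product page
]))  # an empty list means both were found
```

Running it against each new sitemap build is an easy way to catch the "XML stopped updating" problem Paddy mentions.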