SEO-Friendly Method to Load XML Content onto Page
-
I have a client who has about 100 portfolio entries, each with its own HTML page.
Those pages aren't getting indexed because of the way the main portfolio menu page works: it uses JavaScript to load the list of portfolio entries, along with metadata about each entry, from an XML file. Because the content is injected with JavaScript, crawlers see nothing on the portfolio menu page.
Here's a sample of the JavaScript used (an excerpt; the real file runs many more lines):

    // load project xml (excerpt; success handler trimmed)
    try {
        var req = new Request({
            method: 'get',
            url: '/data/projects.xml',
            // ...handlers that parse the XML and build the menu...
        });
        req.send();
    } catch (e) { /* ... */ }
Normally I'd have them manually add entries to the portfolio menu page, but part of the metadata being loaded is a set of project characteristics used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (based on the service provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic.
I'm trying to summarize the alternative methods they could use to load that content onto the page instead of JavaScript (I assume server-side solutions are the only ones I'd want, unless there's another option I'm unaware of). I know PHP could load all of the portfolio entries from the XML file on the server side. I'd like recommendations on other possible solutions. Please feel free to ask clarifying questions.
Thanks!
-
As a response to my own question, I received some other good suggestions to this issue via Twitter:
- @jasonmulligan suggested XSLT
- @KevinMSpence suggested "...easiest solution would be to use simplexml --it's a PHP parser for lightweight XML" & "Just keep in mind that simplexml loads the doc into memory, so there can be performance issues with large docs."
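For anyone weighing the simplexml route, the server-side approach is simple in outline: parse the XML on each request and emit plain HTML links, carrying the filter metadata along as data attributes so the existing JavaScript filtering can still run on top. Here's a minimal sketch using Python's stdlib ElementTree just to show the shape of it (the element and attribute names in the sample XML are invented, not the client's real schema; a PHP simplexml version would follow the same pattern):

```python
import xml.etree.ElementTree as ET

# Hypothetical structure for /data/projects.xml -- the real element
# names will differ, but the approach is identical.
SAMPLE = """
<projects>
  <project clientType="government" projectType="renovation">
    <title>City Hall Annex</title>
    <url>/portfolio/city-hall-annex.html</url>
  </project>
  <project clientType="education" projectType="new-build">
    <title>Lincoln Elementary</title>
    <url>/portfolio/lincoln-elementary.html</url>
  </project>
</projects>
"""

def render_menu(xml_text):
    """Emit crawlable HTML links; filter metadata rides along as data attributes."""
    root = ET.fromstring(xml_text)
    items = []
    for project in root.findall("project"):
        items.append(
            '<li data-client-type="{0}" data-project-type="{1}">'
            '<a href="{2}">{3}</a></li>'.format(
                project.get("clientType"),
                project.get("projectType"),
                project.findtext("url"),
                project.findtext("title"),
            )
        )
    return '<ul id="portfolio">\n' + "\n".join(items) + "\n</ul>"

print(render_menu(SAMPLE))
```

Because the links now arrive in the initial HTML, crawlers can follow them; the filter UI then only shows and hides list items by their data attributes instead of fetching and parsing the XML in the browser.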
- Someone suggested creating a feed from the XML, but I don't think that adds much benefit; it's just another step, since you'd still need a way to pull that content onto the page.
- There were also a few suggestions for converting the XML to another format, such as JSON, on the page, but those were outside the scope of what we were looking to do.
The final recommendation to the client was to add text links manually beneath the JavaScript content, since they were only adding a few portfolio entries per year and it would look fine in the theme. A hack, perhaps, but much faster and more cost-effective. Otherwise, I would have recommended PHP plus the simplexml approach above.
-
I think you need to find a developer who understands progressive enhancement, so that the page degrades gracefully. You'll need to deliver the page using something server-side (PHP?) and then add the bells and whistles on top.
I'm guessing the budget won't cover moving the entire site/content onto a database/cms platform.
How does the page look in Google Webmaster Tools (Labs > Instant Preview)? That might give you a nice visual way to explain the problem to the client.
-
The site was done a year or two ago by a branding agency. To their credit, they produced clean and reasonably well-documented code, and they do excellent design work. However, they relied too heavily on Flash and JavaScript to load content throughout the site, and the site has suffered as a result.
The site is entirely HTML, CSS, and JavaScript, and uses Dreamweaver template files to produce the portfolio entry pages, which then propagate into the XML files that the rest of the site loads.
I wouldn't call it AJAX: I believe it loads the entire XML file up front and then uses the filters to decide what to display, so there are no subsequent calls to the server for more data.
User interface is great, and makes it easy to filter and sort by relevant portfolio items. It's just not indexable.
-
What's the reason it was implemented this way in the first place? Is the data being exported from another system in a particular way?
What's the site running on - is there a CMS platform?
Is it JavaScript because it's doing some funky AJAX-driven "experience", or are they just using JavaScript and the XML file to let visitors filter and sort by different facets?
Final silly question - how is the visitor expected to interact with the portfolio entries?
-
Try creating an XML sitemap with all the entries, spin that into an HTML sitemap version, and also build a portfolio page that lists entries by type. It's a bit of work, but it will probably work best.
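Since the portfolio data already lives in a single XML file, the sitemap doesn't have to be maintained by hand: a small script can regenerate it whenever the XML changes. A rough sketch in Python (stdlib only; the projects.xml structure and the domain here are assumptions for illustration, not the client's real ones):

```python
import xml.etree.ElementTree as ET

DOMAIN = "http://www.example.com"  # placeholder for the client's domain

def build_sitemap(projects_xml):
    """Turn the portfolio XML into a minimal sitemaps.org urlset."""
    root = ET.fromstring(projects_xml)
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for project in root.findall("project"):
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = DOMAIN + project.findtext("url")
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical sample of the portfolio XML
sample = """<projects>
  <project><url>/portfolio/city-hall-annex.html</url></project>
  <project><url>/portfolio/lincoln-elementary.html</url></project>
</projects>"""

print(build_sitemap(sample))
```

The same loop could just as easily emit the HTML sitemap page and the by-type portfolio listing.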
-
Thanks Doug,
I forgot to mention it above, but I am definitely recommending other workaround methods of getting the content indexed, specifically:
- XML Sitemap
- Cross-linking - there are plenty of other opportunities to link throughout the site that haven't been taken yet, so that's high on the list.
- Off-site deep-link opportunities are also significant and will be addressed.
- The projects aren't strictly linear, so next/previous links won't work in this case, but that's a good idea as well.
Those aside, there is a fundamental issue with the way the data is working now and I want to address the ideal solution, since it's within the client's budget to have that content redesigned properly.
-
While this doesn't directly answer the question: could you generate an XML sitemap (I take it the portfolio data is being generated from something?) to help Google find and index the pages?
Is there any cross linking between the individual portfolio pages or at least a next/previous?
(My first thought would have been the PHP route.)