Need to handle duplicated listings for Google in a way that's correct
-
Usually duplicated content is straightforward to fix, but I find myself in a bit of a predicament:
I have a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all ads, working as a portal.
The smaller niche sites carry some of the same info as the "master" sites, since it is relevant for those sites too. The "master" sites have naturally won the indexation for the majority of these ads.
So the main issue is how to keep the ads on the master sites and still get the niche sites' content indexed in a way that doesn't break Google's guidelines.
I can of course fix this in various ways, ranging from iframes (not indexed, though) to bullet lists and small adjustments to the headers and titles of the content on the niche sites, but it feels like I'm cheating if I go down that path.
So the question is: has anyone else stumbled upon a similar problem?
If so, how did you fix it?
-
Indexation of ad-server code may prove tricky, as it often has a generic and dynamic make-up (i.e. JavaScript). It depends on how you set up the code and ultimately how your engine serves it to end users.
What you want to achieve is definitely possible through automation (whether via an ad server or a custom script). The variable is just how the ad server you use serves the listings. (You might like to try DoubleClick or OpenX first; both are free.)
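To make the custom-script route concrete: the reason JavaScript ad tags are tricky to index is that the listing text is injected client-side, so the crawler may only see an empty container. A minimal sketch of the alternative, rendering the listings into plain HTML on the server so a crawler can read them directly. The field names (title, url, location) are assumptions, not your actual data model:

```python
# Minimal sketch: render job listings server-side so crawlers see
# plain HTML instead of an empty <div> filled in by JavaScript.
# The listing fields (title, url, location) are hypothetical.
from html import escape

def render_listings(listings):
    """Return an HTML fragment a crawler can index directly."""
    items = []
    for job in listings:
        items.append(
            '<li><a href="{url}">{title}</a> - {location}</li>'.format(
                url=escape(job["url"], quote=True),
                title=escape(job["title"]),
                location=escape(job["location"]),
            )
        )
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

if __name__ == "__main__":
    sample = [{
        "title": "SEO Manager",
        "url": "https://example.com/jobs/1",
        "location": "Stockholm",
    }]
    print(render_listings(sample))
```

However you generate it, the key point is that the listing markup is part of the served page, not fetched after load.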
Best of luck!
Anthony -
Thanks Anthony. I was going to try that approach from the start, with a small script handling these listings through an XML feed that's easier to control when it comes to layout. It seems like handling these listings as ads is the way forward.
Thanks!
Ps.
Do you think this will help the listings become indexed on the smaller sites as well?
.Ds
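For reference, a rough sketch of what that small XML-feed script could look like: filter the shared feed down to the listings relevant to one niche site, and vary the page title per site so the niche pages aren't byte-identical to the master site's. The feed schema (`<job>` with `<title>`, `<url>`, `<category>`) and the site name are hypothetical:

```python
# Rough sketch: pull listings from a shared XML feed, keep only
# those relevant to one niche site, and differentiate the titles.
# The feed schema and site names here are made up for illustration.
import xml.etree.ElementTree as ET

FEED = """
<jobs>
  <job><title>Nurse, ICU</title><url>https://example.com/jobs/1</url><category>healthcare</category></job>
  <job><title>Java Developer</title><url>https://example.com/jobs/2</url><category>it</category></job>
</jobs>
"""

def listings_for_site(xml_text, category):
    """Keep only the listings relevant to one niche site."""
    root = ET.fromstring(xml_text)
    return [
        {"title": j.findtext("title"), "url": j.findtext("url")}
        for j in root.findall("job")
        if j.findtext("category") == category
    ]

def page_title(job_title, site_name):
    # Vary the niche site's <title> so it isn't identical to the master's.
    return "{} - {} jobs".format(job_title, site_name)

if __name__ == "__main__":
    for job in listings_for_site(FEED, "it"):
        print(page_title(job["title"], "ITCareers"))
```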
-
You say it's identical, so by definition I assume it's the same piece of code calling the job listings, replicated across all the areas where you want the listings to show?
You can use an ad server for this purpose. It will give you a greater sense of control over impressions, and offer you greater insight into how effective they are in terms of clicks and engagement. The code should be a little lighter, too.
-
The problem is that these aren't ordinary "ads" but job listings that run on several sites with an identical format and URL structure. The listings all live in a database of our own that we pull them from. So the thing is: I want them to be indexed, since they would drive a lot of long-tail traffic to the niche sites.
I've never used DoubleClick, but I know how it works. If this is a problem that can be solved with an ad server, please let me know.
-
Can you not just put your ads in an ad server such as DoubleClick and run an include across the entire network? Or am I missing something here?