Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies. More details here.
I show different versions of the same page to crawlers and users, but no longer want to
-
Hello,
While Google could not read JavaScript, I created two versions of the same page, one of them is for human and another is for Google. Now I do not want to serve different content to the search engine. But, I am worry if I will lose my traffic value. What is the best way to succeed it without loss? Can you help me?
-
Hi there
Ideally, create one page that serves both search engines and users: you want users to find your page via search engines, and you want search engines to be able to crawl your content. Google is thought to be getting better at crawling JavaScript, but you need to make sure your text content is readable in a text-based browser, or is visible to Google with JavaScript off. Here's a resource for you.
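To illustrate the point about content being visible with JavaScript off, here is a minimal sketch of "one page for both users and crawlers": the primary content lives in the server-rendered HTML, and scripts only enhance it. This markup is hypothetical, not taken from the question.

```html
<!-- Hypothetical example: the content is in the initial HTML itself -->
<article>
  <h1>Blue Widget 3000</h1>
  <p>The full product description is in the HTML, so it is readable
     in a text-based browser and with JavaScript disabled.</p>
</article>
<script>
  // JavaScript only enhances the page (tabs, image galleries, etc.);
  // it does not inject the primary content.
</script>
```

With this structure there is nothing crawler-specific to serve, so the cloaked version can simply be retired.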
That being said, focus on having one page for the content you're trying to create, so you can concentrate your SEO effort on building equity in that page. You can also build other pages around variations of that topic that link back to it, and link to those new pages from the main topic page as well. This will help build your site from a topical standpoint while passing link equity throughout your site.
Let me know if this makes sense or helps. Best of luck!
Patrick
Related Questions
-
Is this campaign of spammy links to non-existent pages damaging my site?
My site is built in WordPress. Somebody has built spammy pharma links to hundreds of non-existent pages. I don't know whether this was inspired by malice or an attempt to inject spammy content. Many of the non-existent pages have the suffix .pptx; these now all return 403s. Example: https://www.101holidays.co.uk/tazalis-10mg.pptx

A smaller number of spammy links point to regular non-existent URLs (not ending in .pptx). These are given 302s by WordPress to my homepage. I've disavowed all domains linking to these URLs. I have not had a manual action or seen a dramatic fall in Google rankings or traffic. The campaign of spammy links appears to be historical and not ongoing.

Questions:
1. Do you think these links could be damaging search performance? If so, what can be done? Disavowing each linking domain would be a huge task.
2. Is 403 the best response? Would 404 be better?
3. Any other thoughts or suggestions?

Thank you for taking the time to read and consider this question. Mark
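On question 2, one common alternative (a sketch only, assuming Apache with mod_rewrite is available on the host) is to answer the spammy .pptx URLs with 410 Gone rather than 403, which signals the resources are permanently gone:

```apache
# Hypothetical .htaccess rule: respond 410 Gone to the spammy .pptx URLs
RewriteEngine On
RewriteRule \.pptx$ - [G,L]
```

The `[G]` flag makes Apache return 410. A 404 is also acceptable to search engines, but 410 is the stronger "gone for good" signal; either is usually considered better than 302-redirecting unknown URLs to the homepage.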
White Hat / Black Hat SEO | MarkHodson
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing automatically (as new coupons are added and expired deals are removed). I wish to create a sitemap, but I realised there is not much point including pages that will be removed sooner or later and/or are canonicalised. I have about 8-9 pages which are static, so I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap for getting expanded sitelinks. http://couponeasy.com
White Hat / Black Hat SEO | shopperlocal_DM
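For context, a sitemap listing only the static pages is a small XML file like the sketch below; the paths are placeholders, not the site's real URLs. A sitemap is a hint, not an exclusion list, so submitting it does not stop Google from crawling or indexing pages that are absent from it.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per static page; these paths are hypothetical -->
  <url>
    <loc>http://couponeasy.com/</loc>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>http://couponeasy.com/about</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```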
Internal Links to Ecommerce Category Pages
Hello, I read a while back (and I can't find it now) that you want to add internal links to your main category pages. Does that still apply? If so, for a small site (100 products), what is recommended? Thanks
White Hat / Black Hat SEO | BobGW
How do I 301 redirect an old domain and its pages to a new domain and its pages?
Hi, I am a real newbie to this and I am hoping for a guide on how to do it. I've seen a few Moz posts, but they are quite confusing; hopefully somebody can explain it in layman's terms. I would like to 301 redirect this way (both websites cover the same niche): oldwebsite.com > newwebsite.com, and also its pages: oldwebsite.com/test > newwebsite.com/test. My questions: do I need to host my old domain and its pages on my new website's hosting in order to redirect to my new domain and its pages? Would my previous pages' links overwrite my new pages' links, or would the link juice be added on? Do I need to host the whole old domain's website on my new hosting in order to redirect the old pages? Really confused here, thanks!
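For reference, the old domain only needs to point at a server you control so it can answer requests and issue the redirects; you do not need to copy the old site's content there. Assuming Apache with mod_rewrite, a short .htaccess on whatever host answers for the old domain (a sketch using the question's placeholder domains) would be:

```apache
# Hypothetical .htaccess served for oldwebsite.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldwebsite\.com$ [NC]
# $1 preserves the path, so /test redirects to newwebsite.com/test
RewriteRule ^(.*)$ https://newwebsite.com/$1 [R=301,L]
```

With page-to-page 301s like this, the old pages' link equity is generally passed to the corresponding new pages rather than overwriting anything.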
White Hat / Black Hat SEO | andzon
Noindexing Thin Content Pages: Good or Bad?
If you have massive pages with super thin content (such as pagination pages) and you noindex them, once they are removed from googles index (and if these pages aren't viewable to the user and/or don't get any traffic) is it smart to completely remove them (404?) or is there any valid reason that they should be kept? If you noindex them, should you keep all URLs in the sitemap so that google will recrawl and notice the noindex tag? If you noindex them, and then remove the sitemap, can Google still recrawl and recognize the noindex tag on their own?
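For reference, the noindex discussed in the question is a one-line tag in each thin page's head; this sketch assumes the meta-tag form rather than an X-Robots-Tag HTTP header:

```html
<!-- In the <head> of each thin pagination page -->
<meta name="robots" content="noindex, follow">
```

The `follow` directive lets crawlers keep following links (and passing equity) through the page even while it is excluded from the index. Note that Google must recrawl a page to see the tag, which is why removing noindexed URLs from the sitemap too early can slow down their deindexing.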
White Hat / Black Hat SEO | WebServiceConsulting.com
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended that the same URL be used for serving desktop and mobile websites, portals like Airbnb use different URLs to serve mobile and web users. Does anyone know why this is done even though it is not good for SEO?
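For background, Google's documented way to make separate mobile URLs safe for search is a pair of bidirectional annotations linking the two versions. This is a generic sketch with placeholder URLs, not Airbnb's actual markup:

```html
<!-- On the desktop page, e.g. https://example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page, e.g. https://m.example.com/page -->
<link rel="canonical" href="https://example.com/page">
```

With these annotations in place, search engines treat the two URLs as one page, which is largely why some large sites accept the extra maintenance of separate mobile URLs.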
White Hat / Black Hat SEO | razasaeed
Off-page SEO and link building
Hi everyone! I work for a marketing company; for one of our clients' sites, we are working with an independent SEO consultant for on-page help (it's a large site) as well as off-page SEO. Following a meeting with the consultant, I had a few red flags about his off-page practices; however, I'm not sure if I'm just inexperienced and this is just "how it works", or if we should shy away from these methods. He plans to: guest blog, do press release marketing, and comment on blogs. He does not plan to consult with us in advance about the content that is produced, or where it is posted. In addition, he doesn't plan on producing a report of what was posted where. When I asked about these things, he told me they haven't encountered any problems before. I'm not saying it was spammy, but I'm not sure whether these methods are leaning in the direction of "growing out of date" or the direction of "black-hat, run away, dude." Any thoughts on this would be crazy appreciated! Thanks, Casey
White Hat / Black Hat SEO | CaseyDaline
Why would links that were deleted by me 3 months ago still show up in reports?
I inadvertently created a mini link farm some time back by linking all of my parked domains (2,000 plus) to some of my live websites (I was green and didn't think linking between sites and domains under the same owner was an issue). These websites were doing well until Penguin, and although I did not get any "bad link" notices from Google, I figure I was hit by Penguin. So about 3 or 4 months ago I painstakingly deleted ALL links from all of those domains that I still own (only 500 or so; the others were allowed to lapse). None of those domains have any outbound links at all, but old links from those domains are still showing up in WMT, in SEOmoz, and in every other link-tracking report I have run. So why would these links still be reported? How long do old links stay in the internet archives? This may sound like a strange question, but do links "remain with a domain for a given period of time" regardless? Are links archived before being "thrown out" of the web? I know Google keeps archives of expired data (deleted content, closed websites, etc.) for about 3 years or so (?). In an effort to correct the situation I have spent countless hours manually deleting thousands of links, but they won't go away. Looking for some insight here please. Cheers, Mike
White Hat / Black Hat SEO | shags380