I show different versions of the same page to crawlers and users, but I don't want to do this anymore
-
Hello,
Back when Google could not read JavaScript, I created two versions of the same page: one for human visitors and one for Google. Now I no longer want to serve different content to the search engine, but I am worried that I will lose my traffic. What is the best way to make this change without a loss? Can you help me?
-
Hi there
Ideally, create one page that serves both search engines and users: you want users to find your page via search engines, and you want search engines to be able to crawl your content. Google is thought to be getting better at crawling JavaScript, but you need to make sure your text and content are readable in a text-based browser, or visible to Google with JavaScript turned off. Here's a resource for you.
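For illustration only (and assuming a Python/Flask setup, which may not match your stack at all), here's a rough sketch of the difference: instead of branching on the crawler's user agent, the server returns the same fully rendered HTML to every visitor, and JavaScript is only layered on top as an enhancement.

```python
# Illustrative sketch only -- assumes a Flask app; adapt to whatever stack you use.
from flask import Flask, render_template_string

app = Flask(__name__)

PAGE = """
<html>
  <body>
    <!-- All of the important text lives in the HTML itself, so it is
         readable in a text-based browser or with JavaScript turned off. -->
    <h1>{{ title }}</h1>
    <p>{{ body }}</p>
    <!-- JavaScript only enhances the page (widgets, interactivity);
         it is not required in order to see the content. -->
    <script src="/static/enhancements.js" defer></script>
  </body>
</html>
"""

@app.route("/my-page")
def my_page():
    # Cloaking (what you want to stop doing) would branch on the visitor,
    # e.g. checking flask.request.headers.get("User-Agent") for "Googlebot"
    # and returning a bot-only version of the page.
    #
    # Instead, serve exactly the same content to every visitor:
    return render_template_string(PAGE, title="My Page", body="The real content.")
```

The exact code doesn't matter - the point is that the content itself never depends on who is asking for the page or on JavaScript running.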
That being said, focus on having one page for the content you're trying to create, so you can put more SEO effort into building the equity in that page. You can also build other pages around variations of that topic that link back to that page, and link to these new pages from the main topic page as well. This will help build your site from a topic standpoint and pass link equity throughout your site.
Let me know if this makes sense or helps. Best of luck!
Patrick
Related Questions
-
Does Google want contact numbers in the meta description?!
Reading up, it seems like there's complete free rein to enter what you want in the meta description, and meta descriptions are not considered a direct ranking signal. However, I have added contact numbers to the meta descriptions for around 20 reasonably high-ranking pages for my company, and it seems to have had a negative effect (I have taken screen grabs and recorded previous rankings). More strangely, when you 'inspect' the page the meta description features the desired number, yet when you find the page in the SERPs the meta description just does not feature the number (the page has been cached and the description does not carry over). I'm wondering whether such direct changes are seen as spam and are therefore negative for the page?
White Hat / Black Hat SEO | Mar 15, 2017, 11:58 PM | Jacksons_Fencing1 -
Client Wants To Use A .io Domain Name - How Bad For Organic?
Hi, I have a U.S. client who is stuck on a name that he wants to get as a .io (British Indian Ocean Territory) domain name for a new site. Aside from the user confusion/weirdness, how much harder do you think a .io domain name makes this site's organic performance in the U.S. in the future? FYI, the other part of the domain name he wants to use is short, meaningless, and implies nothing in and of itself. Thanks!
White Hat / Black Hat SEO | Dec 13, 2016, 3:21 AM | 945012 -
Website not listed in Google - Screaming Frog shows 500 error? What could the issue be?
Hey, http://www.interconnect.org.uk/ - the site seems to load fine, but for some reason it is not getting indexed. I tried running the site through Screaming Frog, and it gives a 500 error code, which suggests it can't access the site. I'm guessing this is the same problem Google is having. Do you have any ideas as to why this may be and how I can rectify it? Thanks, Andrew
White Hat / Black Hat SEO | Nov 7, 2013, 9:47 AM | Heehaw0 -
Pages with spam links have been 301 redirected to a 404. Is it OK?
Please advise: some pages that have spam links pointing to them have been redirected to a 404 error page (through a 301 redirect). Removing the pages manually was not possible because they are part of a core component of the CMS, along with various other coding issues, so the only way forward, as advised by the developer, was a 301 redirect to the 404 page. Will redirecting these pages to a 404 page using a 301 redirect nullify all the negative or spam links pointing to them, and eventually remove the resulting spam impact on the site too? Many thanks
White Hat / Black Hat SEO | Nov 11, 2014, 8:55 PM | Modi0 -
Starting every page title with the keyword
I've read everywhere that it's vital to get your target keyword to the front of the title you're writing. Taking into account that Google likes things to look natural, I wanted to check whether writing titles like this, for example: "Photographers Miami - Find the right Equipment and Accessories", repeated for every page (maybe a page on photography in Miami, one on videography in Orlando, etc.) is a smart way to write titles, or whether clearly stacking keywords at the front of every title won't be as beneficial as other approaches?
White Hat / Black Hat SEO | Jun 20, 2013, 7:10 AM | xcyte0 -
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended that the same URL be used for serving desktop and mobile websites, portals like Airbnb are using different URLs to serve mobile and web users. Does anyone know why this is being done even though it is not good for SEO?
White Hat / Black Hat SEO | Apr 30, 2013, 6:42 AM | razasaeed0 -
Why would links that were deleted by me 3 months ago still show up in reports?
I inadvertently created a mini link farm some time back by linking all of my parked domains (2,000 plus) to some of my live websites (I was green and didn't think linking between sites/domains with the same owner was an issue). These websites were doing well until Penguin, and although I did not get any 'bad link' notices from Google, I figure I was hit by Penguin. So about 3 or 4 months ago I painstakingly deleted ALL links from all of those domains that I still own (only 500 or so - the others were allowed to lapse). None of those domains have any outbound links at all, but old links from those domains are still showing up in WMT, in SEOmoz, and in every other link tracking report I have run. So why would these links still be reported? How long do old links stay in the internet archives? This may sound like a strange question, but do links 'remain with a domain for a given period of time' regardless? Are links archived before being 'thrown out' of the web? I know Google keeps archives of data that has expired, been deleted, had its website closed, etc., for about 3 years or so (?). In an effort to correct the situation I have spent countless hours manually deleting thousands of links, but they won't go away. Looking for some insight here please. cheers, Mike
White Hat / Black Hat SEO | Jan 19, 2013, 9:59 AM | shags380