Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
What are the potential SEO downsides of using a service like unbounce for content pages?
-
I'm thinking of using unbounce.com to create some content-driven pages. Unbounce is simple, easy to use, and makes it very easy for non-devs at my company to create variations on pages.
I know it allows adding meta descriptions, title tags, and so on, and lets the pages be indexed by Google, but I was wondering whether there are any potential downsides to using Unbounce as opposed to hosting the pages myself.
Any help would be appreciated!
-
Hi,
I'm the person behind SEO at Unbounce.
There are no technical SEO drawbacks. Unbounce gives you direct control over all of your on-page SEO elements. You can even add rel="canonical" if you are so inclined, to indicate which variation Google should treat as the primary version.
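If you want to sanity-check that yourself, here is a minimal sketch (the URLs are hypothetical placeholders, and this is not an Unbounce feature) that fetches each test variation and prints the rel="canonical" target it declares, so you can confirm they all point at the version you want Google to focus on.

```python
# A minimal sketch for checking that every test variation declares the same
# rel="canonical" target. The URLs below are placeholders -- swap in your own.
from html.parser import HTMLParser
from urllib.request import urlopen


class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")


variant_urls = [
    "https://example.com/landing-page-a",  # hypothetical variation A
    "https://example.com/landing-page-b",  # hypothetical variation B
]

for url in variant_urls:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    print(f"{url} -> canonical: {parser.canonical}")
```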
If you have any questions feel free to contact me: Carlos@unbounce.com
-
Hi Ben,
Thanks for the answer! Sorry if I wasn't clear in my original question, but we are actually using Unbounce for PPC testing already.
The pages we are planning to create are not necessarily landing pages. It's just much faster for us to create pages and content in Unbounce at the moment than to build actual pages on our site. (That way non-devs can create new pages as well.)
In your opinion, would there be any major downsides to creating some pages on Unbounce? Obviously it's not ideal, but if there are no major issues we might use their service, albeit temporarily.
Thanks!
Seiya
-
That's some solid advice right there.
-
This process may lend itself to PPC more than to SEO. When split testing you need to be aware of duplicate content, and since your ultimate goal is to figure out which landing pages are most effective, you will end up removing some of the pages anyway. At a larger scale this isn't going to be as effective.
I would consider running these pages through a PPC account for testing and keeping them out of the index. Then, once you have a landing page that performs well, build it on your own site and promote it with SEO.
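If you go the noindex route, a quick sanity check is to confirm that each test page actually carries a noindex signal, either in a meta robots tag or in an X-Robots-Tag response header. A minimal sketch of that check (the URLs are hypothetical placeholders, and the meta-tag pattern is deliberately simple - it assumes the name attribute appears before content):

```python
# A minimal sketch for confirming test pages carry a noindex signal,
# either in a <meta name="robots"> tag or an X-Robots-Tag response header.
# The URLs are hypothetical placeholders.
import re
from urllib.request import urlopen

test_urls = [
    "https://try.example.com/ppc-test-a",  # hypothetical test page A
    "https://try.example.com/ppc-test-b",  # hypothetical test page B
]

# Simple pattern: assumes name="robots" appears before the content attribute.
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in test_urls:
    response = urlopen(url)
    header = response.headers.get("X-Robots-Tag", "") or ""
    body = response.read().decode("utf-8", errors="replace")
    match = meta_robots.search(body)
    meta = match.group(1) if match else ""
    noindexed = "noindex" in (header + " " + meta).lower()
    print(f"{url}: noindex={'yes' if noindexed else 'NO'}")
```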
-
Seiyav -
If you are looking for a simple way to start taking advantage of A/B testing and to get landing pages created quickly without waiting for a developer, it is a very cost-effective model. You can build landing pages quickly and easily, set up tests without much effort, and the system will give you metrics to measure the results without a lot of in-depth analysis.
There really are no downsides other than the cost.
We have found with larger clients that, as they build expertise and get clearer about which landing pages are working better, they start to bring the work in-house to gain more control, save money, and become more knowledgeable about A/B testing and landing page development. But Unbounce is a great step in that process.
Good luck. Hope this helps.
Mark
Related Questions
-
CSS user-select and any potential effect on SEO
Hi everyone, and thank you in advance for your helpful comments. We have a client who is concerned about copying of content from their site because it has happened a few times in the last few years. We have explained that the content is essentially publicly available and that using the CSS property user-select to prevent selection of text will really only stop the least technical users from working out how to get the text. He is happy that it will at least stop some people. So the question is: would there be any way that this could have an effect on SEO? Our assumption is that it doesn't, but I'm putting it out there for some feedback. Cheers, Eddie
Technical SEO | | vital_hike0 -
Home Page Ranking Instead of Service Pages
Hi everyone! I've noticed that many of our clients have pages addressing specific queries related to specific services on their websites, but that the Home Page is increasingly showing as the "ranking" page. For example, a plastic surgeon we work with has a page specifically talking about his breast augmentation procedure for Miami, FL, but instead of THAT page showing in the search results, Google is using his home page. We're noticing this across the board. Any insights? Should we still be optimizing these specific service pages? Should I be spending time trying to make sure Google ranks the page specifically addressing that query because it SHOULD perform better? Thanks for the help. Confused SEO :/, Ricky Shockley
Technical SEO | | RickyShockley0 -
JavaScript page loader - SEO impact
Hello all, I am working on a site that has a bizarre page-load system. All pages get loaded through the same JavaScript snippet; for example, changing the values in the form changes the page that is loaded. The most incredible thing is that, against my expectations, the pages do get indexed by Google. My question is: "Does loading pages dynamically using JavaScript affect the overall SEO performance?" Why are the pages getting indexed? Thank you for shedding light on this. Cheers, Luca
Technical SEO | | Lvet -
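One quick diagnostic for a setup like this is to fetch a page the way a non-rendering crawler would - plain HTTP, no JavaScript executed - and check whether the important content is already present in the raw HTML. If it is, indexing does not depend on Google rendering the script; if it is not, the pages are getting indexed because Google executes the JavaScript when it renders them. A minimal sketch of that check (the URL and phrase are hypothetical placeholders):

```python
# A minimal sketch: fetch a page without executing any JavaScript and check
# whether a phrase from the visible content is already present in the raw
# HTML. The URL and phrase are hypothetical placeholders.
from urllib.request import urlopen

url = "https://example.com/some-dynamically-loaded-page"  # hypothetical
phrase = "text you can see on the rendered page"          # hypothetical

raw_html = urlopen(url).read().decode("utf-8", errors="replace")

if phrase.lower() in raw_html.lower():
    print("Content is in the initial HTML; indexing does not rely on JS rendering.")
else:
    print("Content only appears after JavaScript runs; indexing relies on rendering.")
```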
Car Dealership website - Duplicate Page Content Issues
Hi, I am currently working on a large car dealership website. I have just had a Moz crawl come through, and it's flagging a lot of duplicate page content issues, mostly on the used car pages. How can I get around this, as the site stocks many of the same car - same model, colour, age, mileage, etc.? The only unique thing about them is the reg plate. How do I get past this duplicate issue if all the info is essentially the same? Has anyone experienced this when working on a car dealership website? Thank you.
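One way to gauge the scale of a problem like this is to group the inventory by the attributes that make listings read as identical; URLs that fall into the same group are the likely duplicate-content clusters and good candidates for consolidation or a canonical. A minimal sketch of that idea (the field names and sample records are made up):

```python
# A minimal sketch for spotting near-duplicate listing pages by grouping the
# inventory on the attributes that make them look identical. The field names
# and sample records are made up for illustration.
from collections import defaultdict

inventory = [
    {"reg": "AB12 CDE", "make": "Ford", "model": "Fiesta", "colour": "blue", "year": 2012, "mileage": 41000},
    {"reg": "FG34 HIJ", "make": "Ford", "model": "Fiesta", "colour": "blue", "year": 2012, "mileage": 40500},
    {"reg": "KL56 MNO", "make": "Audi", "model": "A3", "colour": "black", "year": 2014, "mileage": 28000},
]

groups = defaultdict(list)
for car in inventory:
    # Listings sharing make/model/colour/year read as near-duplicates;
    # the reg plate and exact mileage are usually the only differences.
    key = (car["make"], car["model"], car["colour"], car["year"])
    groups[key].append(car["reg"])

for key, regs in groups.items():
    if len(regs) > 1:
        print(f"Likely duplicate cluster {key}: {len(regs)} listings -> {regs}")
```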
Technical SEO | | karl621 -
How to prevent duplicate content at a calendar page
Hi, I have a calendar page which changes every day. The main URL is /calendar. For every day there is also a dated URL: /calendar/2012/09/12, /calendar/2012/09/13, /calendar/2012/09/14, and so on. So, when the 13th of September arrives, the content of the page /calendar/2012/09/13 will be shown at /calendar - so it's duplicate content. What to do in this situation?
a) Redirect from /calendar to /calendar/2012/09/13 with a 301? (But the redirect changes the day after, to /calendar/2012/09/14.)
b) Redirect from /calendar to /calendar/2012/09/13 with a 302? (But will I lose the link juice of /calendar?)
c) Add a canonical tag at /calendar (pointing to /calendar/2012/09/13)? But then I lose the power of /calendar (?), and it will change every day...
Any ideas or other suggestions? Best wishes, Georg.
Technical SEO | | GeorgFranz -
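For what it's worth, option (c) is typically implemented by having /calendar regenerate its canonical target each day. A minimal sketch of that idea (the domain and URL pattern simply mirror the question and are hypothetical; this is not a recommendation on which of the three options is best):

```python
# A minimal sketch of option (c): /calendar emits a canonical tag that is
# recomputed for the current day. The URL pattern mirrors the question
# (/calendar/YYYY/MM/DD) and the domain is a placeholder.
from datetime import date


def canonical_for_today(base="https://example.com"):
    """Build the dated URL that /calendar would declare as canonical today."""
    today = date.today()
    return f"{base}/calendar/{today.year}/{today.month:02d}/{today.day:02d}"


def canonical_link_tag():
    """Return the <link> element to place in the <head> of /calendar."""
    return f'<link rel="canonical" href="{canonical_for_today()}" />'


if __name__ == "__main__":
    print(canonical_link_tag())
```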
Can you mark up a page using Schema.org and Facebook Open Graph?
Is it possible to use both Schema.org and Facebook Open Graph for structured data markup? On the Google Webmaster Central blog, they say, "you should avoid mixing the formats together on the same web page, as this can confuse our parsers." Source - http://googlewebmastercentral.blogspot.com/2011/06/introducing-schemaorg-search-engines.html
Technical SEO | | SAMarketing1 -
Does Google pass link juice a page receives if the URL parameter specifies content and has the Crawl setting in Webmaster Tools set to NO?
The page in question receives a lot of quality traffic but is only relevant to a small percent of my users. I want to keep the link juice received from this page but I do not want it to appear in the SERPs.
Technical SEO | | surveygizmo0 -
How to handle sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There aren't, in fact, this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems inefficient to me to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
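If it helps, here is a minimal sketch for quantifying the problem: it parses a local copy of the sitemap and lists every entry whose URL carries a query string, i.e. the sort/filter variations, so you can decide whether to prune them, canonicalise them, or leave them alone. The sitemap filename is a placeholder:

```python
# A minimal sketch for listing sitemap entries whose URLs carry query strings
# (the sort/filter variations described above). The sitemap path is a placeholder.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

tree = ET.parse("sitemap.xml")  # hypothetical local copy of the sitemap
root = tree.getroot()

query_string_urls = []
for url_el in root.findall("sm:url", SITEMAP_NS):
    loc = url_el.find("sm:loc", SITEMAP_NS).text.strip()
    if urlparse(loc).query:  # anything after '?' marks a filter/sort variant
        query_string_urls.append(loc)

print(f"{len(query_string_urls)} sitemap entries carry query strings:")
for loc in query_string_urls[:20]:
    print(" ", loc)
```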
Technical SEO | | 5225Marketing0