Pages For Products That Don't Exist Yet?
-
Hi,
I have a client that makes products that are accessories for other companies' popular consumer products. The products on my client's website rank for the other companies' product names, like (made-up example) "2011 Super Widget," plus my client's product, "Charger." So "Super Widget 2011 Charger" might be the type of term my client would rank for.
Everybody knows the 2012 Super Widget will be out in a few months, and then my client's company will offer the 2012 Super Widget Charger.
What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 is available, these pages would have greater authority, age, and rankings, instead of being a little late to the party.
The pages would be like "coming soon" pages, but still optimized for the main product search term.
About the only negative I see is that they'll have a higher bounce rate / lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking.
What do you think? Thanks!
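To make the idea concrete: the coming-soon page could carry the same on-page targeting the real product page eventually will. A minimal sketch of the key elements (names and store branding are hypothetical; Python is used only to show the string templating):

```python
# Hypothetical template for a "coming soon" page's main on-page elements,
# targeting the product search term before the product actually exists.

def coming_soon_tags(product: str, accessory: str) -> dict:
    term = f"{product} {accessory}"  # e.g. "2012 Super Widget Charger"
    return {
        "title": f"{term} - Coming Soon | Example Store",
        "meta_description": (
            f"The {term} is coming soon. Sign up to be notified "
            f"the moment it's available."
        ),
        "h1": f"{term} (Coming Soon)",
    }

tags = coming_soon_tags("2012 Super Widget", "Charger")
```

When the product launches, only the body copy and the "(Coming Soon)" wording need to change, so the URL and its accumulated age/authority stay intact.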
-
Hi Fellows,
Thanks for the thoughts, Mike, Charles, and Kieran. All good ideas!
Best...Mike
-
I would second / third this. And don't just put "coming soon": write a short spiel saying the 2012 Super Widget is coming soon, that it should be this color and this size, and that you'll have the charger when it comes out. Then ask what other accessories readers think the widget should have.
-
Ditto, I think it's a very good idea. One additional thing I would do is create some sort of signup form so you can collect the details of people interested in purchasing once it's released; you could then email them once the full page is up and running.
Kind of the opposite of what you said, which I've also had great success with: when a product isn't made any more, don't delete its page. Keep it and list alternative versions of the widget on it. People keep searching for years-old products.
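A bare-bones way to handle that signup list (a stdlib-only Python sketch with a hypothetical CSV store; a real site would wire this to its own form backend and mailer):

```python
import csv
import re
from pathlib import Path

# Very loose email shape check; a production form would validate more carefully.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def record_signup(path: Path, email: str, product: str) -> bool:
    """Append an interested shopper's email for a not-yet-released product.

    Returns False (and stores nothing) if the email looks invalid.
    """
    if not EMAIL_RE.match(email):
        return False
    with path.open("a", newline="") as f:
        csv.writer(f).writerow([email, product])
    return True

# Later, when the product page goes live, read the file back and
# email everyone on the list that the product is now available.
```

The point is just that each coming-soon page captures demand while it ranks, so launch day starts with a ready-made audience.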
-
I think this is a very common thing to do, and I'd recommend you do the same. I wouldn't do it crazy far ahead, but if a product is coming soon, why not give yourself an advantage?