Pages For Products That Don't Exist Yet?
-
Hi,
I have a client that makes accessories for other companies' popular consumer products. The product pages on their site rank for the other company's product name plus my client's product type. To use a made-up example: "2011 Super Widget" plus my client's product, "Charger," so "Super Widget 2011 Charger" might be the type of term my client would rank for.
Everybody knows the 2012 Super Widget will be out in a few months, and then my client's company will offer the 2012 Super Widget Charger.
What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are only half as competitive? By the time the 2012 model is available, these pages would have greater authority, age, and rank, instead of being a little late to the party.
The pages would be like "coming soon" pages, but still optimized for the main product search term.
About the only negative I see is that they'll have a higher bounce rate and lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking.
What do you think? Thanks!
-
Hi Fellows,
Thanks for the thoughts, Mike, Charles & Kieran. All good ideas!
Best...Mike
-
I would second/third this. And don't just put "coming soon": write a short spiel that says the 2012 Super Widget is coming soon, that it should be this color and this size, and that you will have the charger when it comes out. Then ask readers: what other accessories do you think the widget should have?
-
Ditto, I think it is a very good idea. One additional thing I would do is create some sort of "signup" form so you can collect the details of people interested in purchasing once the product is released; you could then send them an email once you get the full page up and running.
Kind of the opposite of what you said, which I have also had great success with: when a product isn't made any more, don't delete its page. Keep the page and list alternative versions of the widget on it. People still search for years-old products.
-
I think this is a very common thing to do, and I'd recommend you do the same. I wouldn't do it crazily far in advance, but if a product is coming soon, why not give yourself an advantage?
Related Questions
-
Should we remove our "index" pages (alphabetical link list to all of the products on the site)?
We run an e-commerce site with a large number of product families, each family containing a number of products. We have a set of 26 pages (one for each letter, A-Z) that are lists of links to the product family pages. We originally created these pages thinking they would aid the discoverability of the product pages by search engines; of course, as time has gone on, techniques like this have fallen out of favor with Google, since they provide negligible value to the user. Should we consider removing these pages from the site altogether? Is it possible that Panda could view them as resembling a link farm? Thanks in advance!
White Hat / Black Hat SEO | ChrisRoberts-MTI1 -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others. The problem is that I want a solution which 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a certain portion of the bot traffic. Which portion, and for which bots, can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I code), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact hurt rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU1 -
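The load-based 503 shedding described in that question can be sketched in a few lines. The thread doesn't specify a stack, so this is a minimal illustration only, assuming a Python WSGI app; the bot signatures, load threshold, and rejection fraction are made-up parameters, not a drop-in solution:

```python
# Hedged sketch: shed a fraction of bot requests with 503 when the whole
# server is under load. Threshold and fraction values are illustrative.
import os
import random

BOT_SIGNATURES = ("bingbot", "ahrefsbot", "googlebot")  # example bots only

def current_load():
    """1-minute load average; on non-Unix systems plug in another metric."""
    try:
        return os.getloadavg()[0]
    except (AttributeError, OSError):
        return 0.0

def throttle_bots(app, load_threshold=4.0, reject_fraction=0.5):
    """Wrap a WSGI app; answer part of the bot traffic with 503 under load."""
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        is_bot = any(sig in ua for sig in BOT_SIGNATURES)
        if (is_bot and current_load() > load_threshold
                and random.random() < reject_fraction):
            # 503 with Retry-After asks well-behaved crawlers to back off.
            start_response("503 Service Unavailable",
                           [("Retry-After", "600"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy, please retry later."]
        return app(environ, start_response)  # normal traffic passes through
    return middleware
```

Because the check runs per request against whole-server load, it matches requirements 2) and 3) from the question: one rule for all bots, driven by total load rather than any single site or single bot.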
I've purchased a PR 6 domain; what's the best use of it?
I've purchased a PR 6 domain; what will be the best use of it? Should I make a new site with it, or redirect it to my low-PR sites? Or did I waste my $100?
White Hat / Black Hat SEO | IndiaFPS0 -
Do industry partner links violate Google's policies?
We're in the midst of The Great _Inquisition_: piecing together a reconsideration request. In doing so, we reached out to an agency to filter and flag our backlinks as safe, should-be-nofollowed, or should-be-removed. The problem is, they flagged several of our earned industry partner links (like those pointing to us, HireAHelper, from 1-800-Pack-Rat and PODS, for example) as either should-be-nofollowed or should-be-removed. I have a hard time believing Google would penalize such a natural source of earned links, but then again, this is our second attempt at a reconsideration request, and I want to cover all my bases. What say you, Moz community? Nofollow? Remove? Leave alone?
White Hat / Black Hat SEO | DanielH0 -
Pages with spam links being 301 redirected to a 404 page. Is it OK?
Please suggest: some pages with spam links pointing to them have been redirected to a 404 error page (through a 301 redirect). Removing the pages manually was not possible, since they are part of a core component of the CMS, among other coding issues, so the only way, as advised by the developer, was a 301 redirect to a 404 page. By redirecting these pages to a 404 page using a 301 redirect, will all the negative or spam links pointing to them be nullified, and will that eventually remove the resulting spam impact on the site too? Many thanks.
White Hat / Black Hat SEO | Modi0 -
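The mechanics that question describes (a 301 hop that lands on a page answering 404) can be sandboxed in a few lines of Python; the path names and the tiny local server below are illustrative only. The point it demonstrates is that a redirect-following client, which is how a crawler behaves, ends up seeing a plain 404 for the spam-linked URL:

```python
# Hedged sketch: a hypothetical spam-linked path 301s to a catch-all
# page that answers 404, so the redirect chain resolves to "not found".
import http.server
import threading
import urllib.error
import urllib.request

SPAM_PATHS = {"/old-spam-target"}  # made-up spam-linked URL

class GoneHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in SPAM_PATHS:
            # 301 the spam-linked URL to a single catch-all page...
            self.send_response(301)
            self.send_header("Location", "/gone")
            self.end_headers()
        elif self.path == "/gone":
            # ...which answers 404, ending the chain in "not found".
            self.send_error(404)
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request logging

def final_status(url):
    """Follow redirects the way a crawler would; return the final code."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

server = http.server.HTTPServer(("127.0.0.1", 0), GoneHandler)
PORT = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()
```

So from the crawler's point of view the spam-targeted URL effectively returns 404; whether that fully neutralizes the spam links is the open question the thread is asking.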
Starting every page title with the keyword
I've read everywhere that it's vital to get your target keyword at the front of the title you're writing. Taking into account that Google likes things looking natural, I wanted to check on my writing titles like this, for example: "Photographers Miami - Find the Right Equipment and Accessories," repeated for every page (maybe a page on photography in Miami, one on videography in Orlando, etc.). Is that a smart way to write titles, or will clearly stacking keywords at the front of every title be less beneficial than other approaches?
White Hat / Black Hat SEO | xcyte0 -
Blog commenting - dos and don'ts
Dear Community, I'm getting heavily into blog commenting now for the relationships I'm building with other bloggers. I think the relationships I build with these influencers will be helpful. But I'm concerned that Google may penalize my site if I have a lot of links coming from blog comments. If I sense that a blog is spammy, obviously I stay away. I've also noticed that a number of CommentLuv sites include a link to my latest blog post, and that has helped me greatly in promoting my posts and building readership. I am also interested in the followed links I get from it, but concerned in that regard that (1) Google won't count those followed links (won't pass PageRank) and (2) Google will penalize me for some reason or in some way. What does everyone think about this approach to blog commenting, and in particular about posting some comments on CommentLuv blogs? Thanks! Mike
White Hat / Black Hat SEO | Harbor_Compliance0 -
I think I've been hit by Penguin - Strategy Discussion
Hi, I have a network of 50 to 60 domain names which have duplicated content and whose domains are basically a geographical location + the industry I am in. All of these websites have links to my main site. Over the weekend I saw my traffic fall. I attribute our drop in rankings to what people are calling Penguin 1.1. I want to keep my other domains, as we are slowly creating unique content for each of those sites. However, in the meantime, clearly I need to deal with the inbound linking and anchor text problem. Would adding a nofollow tag to all links that point to my main site resolve my issue with Google's Penguin update? Thanks for the help.
White Hat / Black Hat SEO | MangoMan160