You're given 10,000 recipes and told to build a site. What would you do?
-
Say you were given a list of 10,000 recipes and asked to build an SEO-friendly site. Would you build a recipe search engine and index the search results (making sure, of course, that IA and user engagement metrics are great)?
Or, would you try to build static pages?
-
I also have a site in the kitchen niche, https://besttoasterovenguides.com/. Can someone check it? It doesn't seem to be showing any links correctly.
-
I would use a tool like Copyscape (or another plagiarism checker) to see whether the recipes (the exact text) are already available online, so you don't run into any sort of duplication issues. With 10,000 recipes it will take some time, for sure.
If it's not copied material, then by all means go for a website.
Building static pages will take much more time than using a CMS, I'd guess.
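Before paying for a Copyscape run, you can triage which recipes need a closer look with a quick script. A rough sketch using Python's difflib (the recipe strings and the 0.7 threshold are made up for illustration; this compares word sequences, not search results):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough 0-1 similarity between two recipe texts, compared word by word."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

# Hypothetical texts -- in practice you'd compare each of the 10,000
# recipes against copy found on other sites (or against each other).
ours = "Cream the butter and sugar, then fold in the flour and bake at 180C."
found = "Cream the butter and sugar, fold in the flour, and bake at 180 C."

if similarity(ours, found) > 0.7:  # threshold is an arbitrary starting point
    print("possible duplicate -- review before publishing")
```

A ratio near 1.0 means a near-verbatim copy. This won't replace a proper plagiarism checker, but it can narrow 10,000 recipes down to the subset worth manual review.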
-
There are some great responses on the SEO aspect of your project already from Keri and Matt. As far as building the site, I would build it off of WordPress and use a custom post type for "Recipes", and custom taxonomies for "ingredients" and "type" etc... Then you can use the default WP search function and taxonomy lists for users to easily search for the right recipe.
-
I'd ask if the recipes were already on the web and see if I'm going to be fighting a huge duplicate content problem against an established site.
-
I don't know that they're mutually exclusive. I think you need to create a page for each of the recipes. Then I think you need a great search engine for it (search by ingredient, by course, by main protein, by what's in pantry, etc.) You'll want to definitely get your ideas from successful sites like AllRecipes.com, Taste.com.au, FoodNetwork.com and such.
Also - from an SEO/on-site perspective, you need to figure out how to get integrated hRecipe (schema/rich snippet) data into Google. Search "banana bread" click "more" then "Recipes" - at the top you'll see Search Tools > Ingredients. You need your ingredients for every one of those 10k recipes to show up in this part of Google. This is how food bloggers search and they're going to be a HUGE part of your audience.
Make sure each can be rated, and if I were doing this from scratch right now, I'd make sure everyone who submits UGC in the future has a place to put their rel=author on each recipe. If you can integrate ingredients, rel=author and ratings, you'll be on the way to great food SEO.
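To illustrate the structured-data point: the ingredient and rating data described above can be expressed with schema.org's Recipe markup (the format Google now documents in place of hRecipe). A minimal Python sketch, with a made-up recipe, that builds the JSON-LD you'd embed in each page's `<script type="application/ld+json">` tag:

```python
import json

def recipe_jsonld(name, author, ingredients, rating, votes):
    """Build minimal schema.org/Recipe JSON-LD for one recipe page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Recipe",
        "name": name,
        "author": {"@type": "Person", "name": author},
        # recipeIngredient is what Google's ingredient filtering reads
        "recipeIngredient": ingredients,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": rating,
            "ratingCount": votes,
        },
    }, indent=2)

# Hypothetical recipe -- one of the 10,000.
markup = recipe_jsonld(
    "Banana Bread", "Jane Doe",
    ["3 ripe bananas", "2 cups flour", "1 cup sugar"],
    4.7, 132,
)
print(markup)
```

Generate one of these per recipe page from your database and you cover ingredients and ratings in a single template.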
-
I would look at similar sites. Not sure where you are located, but here in Australia we have a great site called Taste (Taste.com.au).
There's got to be tonnes of great recipe sites like that - I'd just copy elements of what they are doing. Or at least use it as a starting point to do some significant research.
-
Related Questions
-
Adult Toys Sites
Does anyone know of any changes, SEO-wise, when running an adult-toy site versus a normal eCommerce site? Are there any tips or suggestions worth knowing to achieve rankings faster? Thanks,
Intermediate & Advanced SEO
-
Google can't access/crawl my site!
Hi, I've been dealing with this problem for a few days. In fact, I didn't realize it was this serious until today, when I saw most of my site de-indexed and losing most of its rankings. [URL Errors: 1st photo] On 8/21/14 there were only 42 errors, but on 8/22/14 that number went up to 272, and it just keeps climbing. The site I'm talking about is gazetaexpress.com (media news, custom CMS) with lots of pages. After some research, I came to the conclusion that the problem is the firewall, which might have blocked Google's bots from accessing the site. But the server administrator says this isn't true and that no Google bots have been blocked. Also, when I go to WMT and try to Fetch as Google, this is what I get: [Fetch as Google: 2nd photo] Out of more than 60 tries, it showed Complete only 2-3 times (and only for the homepage, never for articles). What could the problem be? Can I get Google to crawl my site properly, and is there a chance I will regain my previous rankings? Thanks a lot
Intermediate & Advanced SEO
Granit
-
Troubled QA Platform - Site Map vs Site Structure
I'm running a Q&A forum that was built prioritizing UX over SEO. This decision has caused a bit of a headache: we're 6 months into the project with 2,278 Q&A pages and extremely minimal traffic coming from search engines. The structure has the following hiccups:
A. The category navigation from the main Q&A page is entirely JavaScript and only navigable by users.
B. We identify Google bots and send them to another version of the Q&A platform without JavaScript. Category links don't exist in this Google-bot version of the main Q&A page. On this version, the Pinterest-like tiles displaying individual Q&As are capped at 10. This means the only way Google bot can pass link juice down to individual Q&As (after we've directed it to this page) is through 10 random Q&As.
C. All 2,278 of the Q&As are currently indexed in search; they are just ranked very, very poorly in SERPs.
My personal assumption is that Google can't pass link juice to any of the Q&As (hence the poor SERPs) but registers them from the sitemap, so they get included in Google's index. My dilemma has me struggling between two decisions:
1. Update the navigation in the header to remove the JavaScript and fundamentally change the look and feel of the Q&A platform. This would allow Google bot to navigate through the category links and pass link juice to all Q&As.
2. Update the redirected main Q&A page to include hard-coded category links with hundreds of hard-coded Q&As under each category page. Make it similar, ugly, flat, and efficient for the crawling bots.
Any suggestions would be greatly appreciated. I need to find a solution as soon as possible.
Intermediate & Advanced SEO
-
Why is my site not ranked?
Hey, does anybody have an idea why my site www.detox.si is not ranked for the keyword "detox" in www.google.si (Slovenia)? It is being indexed, but it does not rank, and I have no idea why. Best, M.
Intermediate & Advanced SEO
-
What do I do about sites that copy my content?
I've noticed that a number of websites are copying my content. They are putting the full article on their site, mentioning that it was reposted from my site, but including no links back to me. How should I approach this? What are my rights, and should I ask them to remove it or add a link? Will the duplicate content affect me?
Intermediate & Advanced SEO
-
Strange situation - Started over with a new site. WMT showing the links that previously pointed to old site.
I have a client whose site was severely affected by Penguin. A former SEO company had built thousands of horrible anchor-texted links on bookmark pages, forums, cheap articles, etc. We decided to start over with a new site rather than try to recover this one. Here is what we did:
- We noindexed the old site and blocked search engines via robots.txt.
- We used the Google URL removal tool to tell it to remove the entire old site from the index.
- Once the site was completely gone from the index, we launched the new site. The new site had the same content as the old, other than the home page. We changed most of the info on the home page because it was duplicated in many directory listings. (It's a good site... the content is not over-optimized, but the links pointing to it were bad.)
- We removed all of the pages from the old site and put up an index page saying, essentially, "We've moved," with a nofollowed link to the new site.
We've slowly been getting new, good links to the new site. According to Ahrefs and Majestic SEO we have a handful of new links; OSE has not picked up any as of yet. But if we go into WMT, there are thousands of links pointing to the new site. WMT has picked up the new links, and it looks like it also shows all of the old ones that used to point at the old site, despite the fact that there is no redirect. There are no redirects from any pages of the old site to the new one at all.
The new site has a similar name. If the old one was examplekeyword.com, the new one is examplekeywordcity.com. There are redirects from the other TLDs of the same name (i.e., examplekeywordcity.org, examplekeywordcity.info), but no other redirects exist. The chance that a site previously existed on any of these TLDs is almost none, as it is a unique brand name.
Can anyone tell me why Google is seeing the links that previously pointed to the old site as now pointing to the new one?
ADDED: Before I hit the send button I found something interesting. In this article from Dejan SEO, where someone stole Rand Fishkin's content and ranked for it, they have the following line: "When there are two identical documents on the web, Google will pick the one with higher PageRank and use it in results. It will also forward any links from any perceived 'duplicate' towards the selected 'main' document." This may be what is happening here.
And just to complicate things further, it looks like when I set up the new site in GA, the site owner took the GA tracking code and put it on the old page (the noindexed one that is set up with a nofollowed link to the new one). I can't see how this could affect things, but we're removing it. Confused yet? I'd love to hear your thoughts.
Intermediate & Advanced SEO
-
Splitting a Site into Two Sites for SEO Purposes
I have a client who owns a business that could easily be divided into two separate businesses in terms of SEO. Right now his website covers both divisions of his business. He gets about 5,500 visitors a month. The majority go to one part of his business, and around 600 each month go to the other, so about 11%.
I'm considering breaking off this 11% and putting it on an entirely different domain name. I think I could rank better for this 11%. The new site would be SEO'd only for this particular division of the company, and the keywords would not be in competition with each other. I would of course link the two websites and watch that I don't run into any duplicate-content issues.
I worry about placing the redirects from the pages that I remove to the new pages; I know Google is not a fan of redirects. I also worry about the eventual drop in traffic to the main site. How big a factor is traffic in rankings? Another challenge is that the business serves 4 major metropolitan areas.
Would you do this? Have you done this? How did it work? Any suggestions?
Intermediate & Advanced SEO
-
Building Widgets...
Hi all, for a project I'm working on there will be an opportunity to have a number of websites link back to our main site. Rather than giving out straightforward link text, I'm more interested in building and handing out some kind of widget that is topical both to us and to the websites giving us links. Although I do a fair bit of web development with various technologies, I have never played around with building widgets that can pull in data feeds from our database, etc. Does anyone have any good recommendations for tutorials covering this area, or alternatively any companies offering this kind of widget-building service? Thanks in advance, Darren
Intermediate & Advanced SEO