You're given 10,000 recipes and told to build a site: what would you do?
-
Say you were given a list of 10,000 recipes and asked to build an SEO-friendly site. Would you build a recipe search engine and index the search results (making sure, of course, that the IA and user-engagement metrics are great)?
Or, would you try to build static pages?
-
I also have a site in the kitchen niche, https://besttoasterovenguides.com/. Can someone check it? It doesn't seem to be showing any of its links correctly.
-
I would use a tool like Copyscape (or any other plagiarism checker) to see whether the recipes' exact text is already available online, so you don't run into any duplicate-content issues.
With 10,000 recipes, that will certainly take some time.
If it's not copied material, then by all means go for a website.
Building static pages would take much more time than using a CMS, I'd guess.
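Copyscape compares your text against what's already on the web, so it's the right tool for the job above. As a rough local illustration of the underlying idea (this is not Copyscape's API, just a sketch of how near-duplicate text can be scored), a shingle-overlap check between two recipe texts might look like this in Python:

```python
import re

def shingles(text, k=5):
    """Lowercase word k-grams ('shingles') of a text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Near-identical recipes score close to 1.0; unrelated text scores near 0.0.
```

Running each of the 10,000 recipes through a check like this (or through Copyscape itself) before publishing tells you which ones are already on the web verbatim.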
-
There are some great responses on the SEO aspect of your project already from Keri and Matt. As far as building the site goes, I would build it on WordPress, using a custom post type for "Recipes" and custom taxonomies for "ingredients," "type," and so on. Then you can use the default WP search function and taxonomy archives to let users easily find the right recipe.
-
I'd ask if the recipes were already on the web and see if I'm going to be fighting a huge duplicate content problem against an established site.
-
I don't know that they're mutually exclusive. I think you need to create a page for each of the recipes, and then you need a great search engine on top of them (search by ingredient, by course, by main protein, by what's in the pantry, etc.). You'll definitely want to borrow ideas from successful sites like AllRecipes.com, Taste.com.au, FoodNetwork.com, and such.
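The search-by-ingredient feature described above boils down to an inverted index from ingredient to recipes. A minimal sketch in Python (the recipe data and function names here are hypothetical, just to show the shape of the lookup):

```python
from collections import defaultdict

# Toy recipe data; a real site would load this from its database.
recipes = {
    "banana-bread": ["banana", "flour", "sugar", "egg", "butter"],
    "omelette": ["egg", "butter", "cheese"],
    "pancakes": ["flour", "egg", "milk", "sugar"],
}

def build_index(recipes):
    """Map each ingredient to the set of recipe slugs that use it."""
    index = defaultdict(set)
    for slug, ingredients in recipes.items():
        for ingredient in ingredients:
            index[ingredient].add(slug)
    return index

def search(index, *ingredients):
    """Recipes containing every requested ingredient ('what's in my pantry')."""
    sets = [index.get(i, set()) for i in ingredients]
    return set.intersection(*sets) if sets else set()

index = build_index(recipes)
# e.g. search(index, "egg", "flour") returns the recipes using both
```

The same pattern extends to course, protein, or any other taxonomy: one index per facet, intersected at query time.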
Also, from an SEO/on-site perspective, you need to figure out how to get integrated hRecipe (schema/rich snippet) data into Google. Search "banana bread," click "More," then "Recipes": at the top you'll see Search Tools > Ingredients. You need the ingredients for every one of those 10k recipes to show up in this part of Google. This is how food bloggers search, and they're going to be a HUGE part of your audience.
Make sure each recipe can be rated, and if I were doing this from scratch right now, I'd make sure everyone who submits UGC in the future has a place to put their rel=author on each recipe. If you can integrate ingredients, rel=author, and ratings, you'll be well on your way to great food SEO.
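As a rough illustration of the markup this answer refers to, a minimal hRecipe snippet marks recipe properties with class names (this is a simplified, made-up example; check the current microformat and schema.org specs before implementing):

```html
<!-- Illustrative hRecipe markup; the recipe itself is invented. -->
<div class="hrecipe">
  <h1 class="fn">Banana Bread</h1>
  <span class="author">Jane Baker</span>
  <ul>
    <li class="ingredient">3 ripe bananas</li>
    <li class="ingredient">2 cups flour</li>
    <li class="ingredient">1 cup sugar</li>
  </ul>
  <div class="instructions">Mash the bananas, mix, and bake for 60 minutes.</div>
  <span class="yield">1 loaf</span>
</div>
```

It's the `ingredient` entries that feed the Search Tools > Ingredients filter described above, so they need to be marked up on all 10k pages, not just a few.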
-
I would look at similar sites. Not sure where you are located, but here in Australia we have a great site called Taste (Taste.com.au).
There's got to be tonnes of great recipe sites like that - I'd just copy elements of what they are doing. Or at least use it as a starting point to do some significant research.