Devaluing certain content to push better content forward
-
Hi all, I'm new to Moz, but hoping to learn a lot from it as I grow my business. I have a pretty specific question and would appreciate some feedback on how to proceed with some changes to my website.

First off, I'm a landscape and travel photographer. My website is at http://www.mickeyshannon.com - you can see that the navigation quickly branches out into different photo galleries based on location. So if a user were looking for photos from California, they would find galleries for Lake Tahoe, Big Sur, the Redwoods and San Francisco. At this point, there are probably 600-800 photos on my website. At least half of these are either older or just not quite up to par with the quality I now feel I should produce.

I've been contemplating simplifying the galleries so they don't break down so far. Instead of four sub-galleries for California, there would just be one California gallery. In some cases, where there are lots of good images in a location, I would probably keep the sub-galleries, but only if there were dozens of images to work with. Since the exact location is already mentioned in each photo's description, I'm not sure there's a huge need for these sub-galleries except where there are still tons of good photos to work with.
I've also been contemplating building a sort of search archive. The best of my photos would live in the main galleries, and if a user didn't find what they were looking for, they could search the archive for older photos. That way the older images are still around for licensing purposes, while the best of the best are pushed to the front for those buying fine art prints. The search archive pages would probably need to be de-valued somehow, so that the main galleries would carry more weight SEO-wise. So for California, four sub-galleries of perhaps 10 images each would become one main California gallery with perhaps 15 images. The other 25 images would go into the search archive and could be found by keyword.
My question: does this sound like a good plan, or will I really be killing my site's SEO by making such a large change? My end goal is to push my better content to the front while scaling back a lot of the excess. Hopefully I explained this well - if not, I can elaborate further!
Thanks,
Mickey
-
Thanks Rob! That helps a lot. I've been considering beefing up copy content on some of the pages (especially the gallery pages). This kind of gives me a direction to move towards. Devaluing was probably a bad word to use. I mainly just wanted the better content pushed forward without having to actually delete anything. Thanks for the help and also the kind comments on the photography!
Mickey
-
Hi Mickey,
I took a look at your site - first let me say you have some pretty great shots there!
But on to SEO. I think I understand what you want to do with the site. Really, this comes down to site architecture. The issue you're going to face is that a lot of your pages will be very photo-heavy, meaning you'll have very little text content to work with, while simultaneously struggling against duplicate content penalties because you'll have to rely on alt attributes to label your content.
For me, your best bet would be to institute several category pages leading off your main page and have them showcase some text-based content - for example, a small write-up (300-500 words) on each location, targeting geo-specific keywords (e.g. "California photography"). Create these pages for the terms you want to rank for. If you don't want too many words showing up on a page, you can use an accordion design so users have to click certain blocks for the content to appear; search engines can still crawl the content.
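To picture the accordion idea: the key is that the location copy ships in the initial HTML whether or not the panel is expanded, so crawlers can read it even though users must click. A minimal sketch using the native `<details>`/`<summary>` pattern (the helper name and markup here are illustrative, not something from this thread):

```python
def accordion_section(title: str, body_html: str, expanded: bool = False) -> str:
    """Render one accordion panel using the native <details> element.

    The body text is always present in the page source, so search
    engines can crawl it even when the panel starts collapsed.
    """
    open_attr = " open" if expanded else ""
    return (
        f"<details{open_attr}>"
        f"<summary>{title}</summary>"
        f'<div class="accordion-body">{body_html}</div>'
        "</details>"
    )
```

The point of this pattern over a pure JavaScript toggle that fetches text on click: content fetched only after a click may never be seen by a crawler, while content hidden with markup/CSS is at least present in the source.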
For the pages you don't want ranking as strongly, you can link from your category pages to follow-up pages featuring the images you're less fond of. You can do the same with your search archive. The end result would be something like:
Home Page
-> Category Page -> Search Archive/Secondary Photos
-> Category Page -> Search Archive/Secondary Photos
-> Category Page -> Search Archive/Secondary Photos

This will place your premium content directly in front of your visitors and will help with ranking once you place textual content on the pages. It also does not involve any "de-valuing" of content on your site - it just moves things in a positive direction.
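One hedged way to reinforce this hierarchy is in an XML sitemap, where the category pages can carry a higher `<priority>` hint than the archive pages. Search engines treat priority as a hint at best, and the URLs below are placeholders, but as a sketch:

```python
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Build a minimal sitemap from (loc, priority) pairs.

    priority is a 0.0-1.0 hint; engines may ignore it, but it is a
    simple way to signal which pages you consider most important.
    """
    entries = "".join(
        f"<url><loc>{escape(loc)}</loc><priority>{priority:.1f}</priority></url>"
        for loc, priority in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        + entries
        + "</urlset>"
    )
```

Internal link placement (category pages linked from the home page, archive pages one level deeper) matters far more than the priority value, but the sitemap keeps the archive discoverable without featuring it.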
Hope this helps with your question and let me know if you need any further clarification.
Best of luck!
Rob
Related Questions
-
Moving content to a new domain
I need to move a lot of content - podcasts and show notes - to a new domain. Instead of doing redirects, we want to keep some content on the current domain to retain the link value. There are business reasons to keep content on both websites, but the new website will primarily be used for SEO moving forward. If we keep the audio portion of the podcast on the old website and move the show notes and the audio portion to the new website, are there any issues with duplicate content? Long-term, I presume Google will re-index the old and the new pages, thus no duplicate content, but I want to make sure I'm not missing anything. I was planning to fetch pages in Search Console as we migrate content. Thanks for your help!
Technical SEO | JimmyFritz
-
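When the same show notes deliberately live on both domains for a while, one common approach (a sketch, not necessarily what this site should do) is a cross-domain `rel="canonical"` on the old domain's copy, telling engines which URL should get the credit. The domain and paths below are made up for illustration:

```python
# Hypothetical mapping from old-domain paths to their new-domain URLs.
OLD_TO_NEW = {
    "/podcast/episode-12-show-notes": "https://new-domain.example/episode-12",
}


def canonical_tag(old_path: str) -> str:
    """Emit a cross-domain canonical for pages duplicated on the new site.

    Pages without a duplicate get no tag, so they keep ranking on the
    old domain as before.
    """
    new_url = OLD_TO_NEW.get(old_path)
    return f'<link rel="canonical" href="{new_url}"/>' if new_url else ""
```

The trade-off: the canonicalised copy generally stops ranking on the old domain, which may or may not fit the business reasons for keeping content there.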
Premium Content
Hey guys, I'm working on a site that publishes hundreds of new pieces of content a day, and part of that content is only available to all users for 30 days. After 30 days, the content is only accessible to premium users: the page removes the content and replaces it with a log in/sign up option. The same URL and article title are kept for each page.

I have two concerns about this method:

1. Is it healthy for the site to be removing tons of content from live pages and replacing it with a login option?
2. Should I worry about Panda for creating tons of pages with unique URLs but very similar source/content - the login module and the text explaining that it is only available to premium users?

The site is pretty big, so Google has some tolerance for things we can get away with. Should I add a noindex attribute to those pages after 30 days, even though it can take months until Google actually removes them from the index? Is there a proper way of implementing this type of feature on sites with a login option after a period of time (first click free is not an option)? Thanks guys, and I appreciate any help!
Technical SEO | Mr.bfz
-
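On the noindex question, one hedged way to sketch it: keep the URL live for subscribers but start serving a `noindex` directive (via the `X-Robots-Tag` response header or an equivalent meta tag) once the free window closes. The helper below is illustrative only; the window length comes from the question:

```python
from datetime import date, timedelta

FREE_WINDOW = timedelta(days=30)  # free-access period from the question


def robots_header(published: date, today: date) -> dict:
    """Extra response headers for an article page.

    After the free window, serve X-Robots-Tag: noindex so engines
    eventually drop the paywalled page, while the URL stays live
    for premium users. Before that, no directive is needed.
    """
    if today - published >= FREE_WINDOW:
        return {"X-Robots-Tag": "noindex"}
    return {}
```

As the question notes, de-indexing is not instant; the directive only takes effect when the page is next crawled.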
H1 Headers and Unique Content
Should my H1 header contain the same keywords, in the same order, as my SEO title - verbatim - or some variation of them? Or does it not matter?
Technical SEO | keeot
-
Javascript tabbed navigation and duplicate content
I'm working on a site that has four primary navigation links and, under each, a tabbed navigation system for second-tier items. The primary link page loads content for all tabs, which are JavaScript-controlled. Users will click the primary navigation item "Our Difference" (http://www.holidaytreefarm.com/content.cfm/Our-Difference) and have several options, with each tab's content in separate sections. Each second-tier tab is also available via sitemap/direct link (i.e. http://www.holidaytreefarm.com/content.cfm/Our-Difference/Tree-Logistics) without the JS navigation, so the content on that page is specific to the tab, not all tabs. In this scenario, will there be duplicate content issues? And what is the best way to remedy this? Thanks for your help!
Technical SEO | Total-Design-Shop
-
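Since each tab's text also exists on the combined "Our Difference" page, one common option (a sketch, not the only remedy) is a `rel="canonical"` from every tab-specific URL back to the primary page, consolidating the duplicated copy onto one URL. The primary URL is from the question; the helper is hypothetical:

```python
# The all-tabs page from the question; tab URLs canonicalise to it.
PRIMARY = "http://www.holidaytreefarm.com/content.cfm/Our-Difference"


def tab_head(tab_title: str) -> str:
    """Head fragment for a tab-specific page.

    The canonical points at the primary page that contains all tabs,
    so engines treat the tab URL as a duplicate rather than thin,
    competing content.
    """
    return (
        "<head>"
        f"<title>{tab_title}</title>"
        f'<link rel="canonical" href="{PRIMARY}"/>'
        "</head>"
    )
```

The trade-off: canonicalised tab pages generally won't rank on their own, so this fits only if the combined page is the one you want in search results.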
RSS Feed - Dupe Content?
OK, so yesterday a website agreed to publish my RSS feed and I just wanted to check something. The site in question is far more established than mine, and I'm worried that with my content appearing on their website pretty much at the same time as mine, Google will index theirs first and therefore consider mine to be duplicate. They are linking back to each of my articles with the text "original post" and I'm not sure whether this will help. Thanks in advance for any responses!
Technical SEO | marcoose81
-
Does this content get indexed?
A lot of content on this site is displayed in pop-up pages. E.g., visit the Title page: http://www.landgate.wa.gov.au/corporate.nsf/web/Certificate+of+Title To access the sample report or fee details, the info is shown in a pop-up page with a strange URL. Example: http://www.landgate.wa.gov.au/corporate.nsf/web/Certificate+of+Title+-+Fee+Details I can't see any of these pages being indexed in Google or other search engines when I do a site search. Is there a way to get this content indexed besides telling the client to restructure it?
Technical SEO | Bigheadigital
-
Internal linking with Old Content
Hello, I have a sports website where users write their opinions about the sporting events that take place every day throughout the year. Each of these sporting events generates a new page/URL indicating the match and date. For example: www.domain.com/baseball/boston-v-yankees-04-24-2012-1234.html The teams face each other several times a year, and each match creates a different URL. I would like to link old pages to new pages and vice versa. How would you recommend these pages be linked - to each other, old pages to the newly generated pages, or otherwise? I would appreciate your orientation and help in this case. Thank you.
Technical SEO | NorbertoMM
-
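One hedged way to wire old and new match pages together: keep the pages for each fixture in date order and give every page "previous meeting" / "next meeting" links, so each new match page automatically links back to the older ones and vice versa. The function and URL pattern below are illustrative, modelled on the example URL in the question:

```python
def neighbor_links(slugs):
    """Map each match page slug to its previous/next meeting links.

    slugs must be sorted oldest-first; each page links to the adjacent
    meetings of the same fixture, so old and new pages cross-reference
    each other without hand-editing old pages every season.
    """
    links = {}
    for i, slug in enumerate(slugs):
        entry = {}
        if i > 0:
            entry["previous"] = f"/baseball/{slugs[i - 1]}.html"
        if i + 1 < len(slugs):
            entry["next"] = f"/baseball/{slugs[i + 1]}.html"
        links[slug] = entry
    return links
```

A design note on the choice: chained previous/next links keep every old page reachable within a few clicks of the newest one, which tends to be better for crawling than linking only the latest match from the team page.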
Blocking AJAX Content from being crawled
Our website has some pages with content shared from a third-party provider, and we use AJAX as our implementation. We don't want Google to crawl the third party's content, but we do want them to crawl and index the rest of the page. However, in light of Google's recent announcement about indexing AJAX content more effectively, I have some concern that we are at risk of that content being indexed. I have thought about X-Robots-Tag, but I'm concerned about implementing it on these pages because of the potential risk of Google not indexing the whole page. These pages get significant traffic for the website, and I can't risk that. Thanks, Phil
Technical SEO | AU-SEO
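One alternative to putting X-Robots-Tag on the host pages (a sketch under the assumption that the third-party content is fetched from its own endpoint): block only the AJAX fragment URLs in robots.txt, so the fragment responses are never crawled while the host pages remain fully indexable. The `/ajax/third-party/` path below is hypothetical:

```python
def robots_txt(disallowed_paths):
    """Build a robots.txt that blocks only fragment endpoints.

    Blocking just the AJAX response URLs (not the host pages) keeps
    the rest of the page crawlable and indexable.
    """
    lines = ["User-agent: *"]
    lines.extend(f"Disallow: {path}" for path in disallowed_paths)
    return "\n".join(lines) + "\n"
```

This only works if the shared content lives at a distinct URL path; if it is inlined into the host page's HTML, a different approach (such as serving the fragment from a blocked subdomain) would be needed.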