Having Content Be the First Thing the Bots See
-
If you have all of your homepage content in a tab set at the bottom of the page, but you really want that to be the first thing Google reads when it crawls your site, is there something you can implement so that Google reads your content before it reads the rest of the page? Does this cause any violations, or are there any red flags raised by doing this? The goal here is just to get Google to read the content first, not to hide any content.
-
It should only be the first line as an h1, not the whole content block. We styled it all the same so it didn't look silly. We did make the local cities h2s... not sure if that's good or bad, but it stinks to serve so many cities and only rank at your physical location, especially when there are 20 cities within 20 miles here in the DC metro area.
Not sure if local "city pages" will work, or how that changes the landing page experience versus a very interactive home page... Google didn't think about all of that!
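For illustration only (the headings and city names below are placeholders, not the actual markup), that kind of structure would look roughly like:

    <h1>Web Design and Branding in the DC Metro Area</h1>
    <p>Short value-prop copy for the home page...</p>
    <h2>Serving Arlington, VA</h2>
    <p>Brief blurb for that service area...</p>
    <h2>Serving Bethesda, MD</h2>
    <p>Brief blurb for that service area...</p>

The idea being a single h1 for the main heading, with the individual service-area cities demoted to h2s.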
-
Just checked how you have done it and I see what you mean - it's a bit tricky. One thing I noticed is that all of that text is wrapped in an h1. I would take it out of the h1 and mark it up as standard content.
Also, if you could take the text that is in your slideshow images and convert it to readable text, that would give you a bit more relevant content on the site, which may help.
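One common way to do that (the class names here are just placeholders) is to keep the slide image purely decorative and layer the headline over it as real, crawlable HTML positioned with CSS:

    <div class="slide">
      <img src="slide-1.jpg" alt="">
      <div class="slide-caption">
        <p class="slide-headline">Branding focused on customer pain points</p>
        <p>Short, clear value-prop copy lives here as real text.</p>
      </div>
    </div>

    /* minimal positioning sketch */
    .slide { position: relative; }
    .slide-caption { position: absolute; left: 2em; bottom: 2em; color: #fff; }

The slideshow can look exactly the same, but the messaging becomes indexable text rather than pixels baked into an image.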
Best of luck with it!
-
Well... darn... it's in the footer, pretty much. Check out imageworksstudio.com
(About tab, lower left)
Thing is... you don't really want to spam up your home page with content. As a branding firm, we prefer short, clear messaging focused on customer pain points, value props, etc. Of course, those are images and not really SEO-relevant anyway. Grrr - double-edged sword.
Thanks again. I appreciate your comments.
-
It can be done using CSS, but it needs to be clarified whether the content sits far down because of other content on the page, or just low in the HTML source because of markup (perhaps from a navigation). The former might make a difference, but I think Google can detect that trick anyway. The latter is irrelevant in my opinion, as those tags will be discounted.
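For what it's worth, a rough sketch of what "done using CSS" can look like (the markup and class names are illustrative only): the content sits first in the HTML source, and CSS decides where it actually renders on the page, e.g. with flexbox ordering:

    <body class="page">
      <main class="content">The tabbed homepage copy, first in the source...</main>
      <nav class="main-nav">...</nav>
      <section class="slideshow">...</section>
    </body>

    .page      { display: flex; flex-direction: column; }
    .main-nav  { order: 1; }
    .slideshow { order: 2; }
    .content   { order: 3; } /* still renders near the bottom, but comes first in the source */

This keeps the content high in the source without changing the visual layout.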
-
There's been a bit of discussion about this before, and I seem to remember that using CSS to push content up the page actually had a slightly beneficial effect on rankings.
It's mainly going to be an issue if your content sits really low down on the page because of things like intrusive banner ads or lots of adverts.
-
That's what I thought too... but I'm old-school SEO and have no idea if this has changed! Thanks.
-
This can be done via CSS, but I'm not sure doing so has value anymore. It used to be common practice a couple of years back, but I don't think it is necessary anymore.