Unique content but exactly the same graphical layout - a problem?
-
Hello,
I have a coaching website at www.bobweikel.com. I want to make a second coaching website with the exact same WordPress theme, totally identical except for a slightly different logo image. Everything else will be the same; the only difference is that all of the content on the new site will be unique.
Does it matter to Google that the graphics are absolutely identical?
I assume it's fine; I'm just making sure.
-
Google cares about duplicate content, not duplicate CSS.
-
Hi Vadim,
That's what I've always thought. But the CSS will be identical too. You don't think Google cares?
-
Does it matter to Google that the graphics are absolutely identical?
Google only sees text and code; it does not see graphics the way a user does. The Googlebot crawler scans the content and code, and the only things it reads from an image are its alt text and file name.
For users, the logo is usually the defining element that helps them distinguish between the two sites.
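To illustrate the point, here is roughly what a crawler actually reads from a logo image: the file name and the alt text, not the pixels. The file names and alt text below are hypothetical, just to show that the two sites could differ in what the crawler sees even if the images look almost identical:

```html
<!-- Site one: the crawler reads the file name and alt text -->
<img src="/images/bobweikel-coaching-logo.png" alt="Bob Weikel Coaching logo">

<!-- Site two: same visual style, but distinct file name and alt text -->
<img src="/images/second-coaching-site-logo.png" alt="Second Coaching Site logo">
```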
Hope this helps
-
You have to copy and paste it; for some reason it looks like it's a relative link.
If your content is significantly different, you will be fine. There are a lot of websites using the same basic WordPress themes with minimal customization, so I don't see how this would be any different. Although Google is getting better at analyzing the layout and makeup of a page, I'm not sure that theme and design elements are that distinguishable to it.
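Since "significantly different content" is the thing that matters, one rough way to sanity-check two pages yourself is to compare their visible text. This is just a sketch using Python's standard-library `difflib`, not anything Google actually uses, and the sample texts are made up:

```python
# Rough sketch: estimate how similar the visible text of two pages is.
# This is NOT a Google metric, just a quick self-check using the stdlib.
from difflib import SequenceMatcher

def text_similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 ratio of how similar two blocks of page text are."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical snippets from the two coaching sites.
page_one = "One-on-one coaching for executives who want to lead with confidence."
page_two = "Group workshops that help new managers build practical leadership habits."

score = text_similarity(page_one, page_two)
print(f"similarity: {score:.2f}")
```

If the ratio came back close to 1.0 for most page pairs, that would be a sign the content isn't as unique as intended.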
-
Bob, that link is producing a 404.