How could I create this? Would it be a Chrome extension?
-
I do a lot of checking for duplicate content on sites. I use Chrome, and generally I highlight a phrase, right-click, and choose "Search Google for...". However, I would like a quick shortcut to search Google for a phrase that is enclosed in quotes.
Is there a chrome extension for this? If not, can I build one?
Thanks.
-
Huge congratulations on the new status, Dana!
The gag is, we've all been turned back into Aspirants. Glitch in the Matrix, I guess, but check yourself and you'll see. (Points are still there, just the title has changed.)
Paul
-
Bwahahaha! Well played, Marie.
From another Aspirant,
Paul
-
I think it's pretty safe to say you are all gurus. Marie was being facetious, I think.
Dana
(newbie guru)
-
Nakul,
That's kinda cool, I will check it out.
Hey, sorry to hear about Alan getting kicked out of the Guru club (who knows what he said to offend the Moz team!).
What?!?!
Me too!?!?!
I could understand with him, I mean hell, compare our photos! I am pretty!!!
-
LOL... this is what I get when a non-guru answers my question.
-
Hire an intern. "Go look for duplicate content." Done!
-
Thanks. I've got a few of those. But I'm just looking for a way to shorten the process of taking a chunk of text and searching Google for it in quotes.
-
Marie
I just found this Chrome extension, which has a Copyscape-style check built in. You can click a button and it checks the entire page, not just a small snippet of text or a phrase.
https://chrome.google.com/webstore/detail/meta-seo-inspector/ibkclpciafdglkjkcibmohobjkcfkaef
I hope this helps.
-
That would be handy, wouldn't it? Even more so if you could highlight a few snippets and it searched for each phrase occurring on a single page.
A browser extension would be the way to go, but I have no idea if there is one out there already. Doesn't sound difficult though.
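It really isn't difficult. Here's a minimal sketch of what the background script of such an extension could look like under Manifest V3 (the menu id and naming are my own invention, not from any existing extension):

```javascript
// background.js (Manifest V3 service worker)
// Adds a right-click menu item that searches Google for the selected
// text wrapped in double quotes, i.e. an exact-match search.

// Pure helper, kept separate from the Chrome APIs so it can run anywhere.
function quotedSearchUrl(selection) {
  const phrase = '"' + selection.trim() + '"';
  return "https://www.google.com/search?q=" + encodeURIComponent(phrase);
}

// Only touch the extension APIs when actually running inside Chrome.
if (typeof chrome !== "undefined" && chrome.contextMenus) {
  chrome.runtime.onInstalled.addListener(() => {
    chrome.contextMenus.create({
      id: "quoted-google-search",      // hypothetical menu id
      title: 'Search Google for "%s"', // %s is replaced with the selection
      contexts: ["selection"],         // only shown when text is selected
    });
  });

  chrome.contextMenus.onClicked.addListener((info) => {
    if (info.menuItemId === "quoted-google-search" && info.selectionText) {
      chrome.tabs.create({ url: quotedSearchUrl(info.selectionText) });
    }
  });
}
```

You'd pair it with a manifest.json that sets `"manifest_version": 3`, declares `"permissions": ["contextMenus"]`, and registers the script under `"background": {"service_worker": "background.js"}`.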
Related Questions
-
Where to Place Quality Content in Order to Create Links?
Assume we have retained an award-winning journalist to write articles/blog posts about our business, and assume the content is useful and engaging. Where would be the best place to publish it to create high-quality backlinks?
1. Our website blog.
2. Social media sites like our LinkedIn or Facebook pages.
3. Sending completed articles to websites that might potentially have an interest in publishing them.
4. Publishing the articles on our website and then promoting them with AdWords and Facebook to demographics that would find them interesting and link to them.
5. A combination of publishing an article on our website and posting a related article on social media, linked back to the original article on our website.
6. Placing a custom-written article of extremely high quality on an affiliate website run by the HOTH or a competitor, but before publishing, checking the affiliate website on Ahrefs and Link Research Tools to ensure that the metrics are not at all spammy (decent domain rating).
Which of the above options (or combination of them) would most likely result in backlinks of good quality? Assume the quality of the writing is excellent. If pitching the content to other websites (#3) would work, how would we identify these websites? Thanks,
Alan
Intermediate & Advanced SEO | Kingalan1
-
Category Pages For Distributing Authority But Not Creating Duplicate Content
I read this interesting Moz guide: http://moz.com/learn/seo/robotstxt, which I think answered my question, but I just want to make sure. I take it to mean that if I have category pages with nothing but duplicate content (lists of other pages: h1 title, on-page description, and links to the same), and I still want the category pages to distribute their link authority to the individual pages, then I should leave the category pages in the sitemap and meta-noindex them, rather than block them in robots.txt. Is that correct? Again, I don't want the category pages to index or create a duplicate content issue, but I do want them crawled enough to distribute their link authority to individual pages. Given the scope of the site (thousands of pages and hundreds of categories), I just want to make sure I have that right. Up until my recent efforts on this, some of the category pages have been robots.txt'd out and still in the sitemap, while others (with a different URL structure) have been in the sitemap but not robots.txt'd out. Thanks! Best, Mike
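For what it's worth, the noindex-but-crawlable setup described there is just a robots meta tag on each category page, with no matching Disallow line for those URLs in robots.txt (a sketch):

```html
<!-- On each category page: keep it out of the index, but let crawlers
     follow the links through to the individual pages -->
<meta name="robots" content="noindex, follow">
```

If the pages were blocked in robots.txt instead, they couldn't be crawled at all, so nothing would flow through their links.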
Intermediate & Advanced SEO | 94501
-
I've seen and heard a lot about city-specific landing pages for businesses with multiple locations, but what about city-specific landing pages for nearby cities that you aren't actually located in? Is it OK to create landing pages for nearby cities?
I asked here https://www.google.com/moderator/#7/e=adbf4 but figured I'd ask the Moz community also! Is it actually best practice to create landing pages for nearby cities if you don't have an actual address there, even if your target customers are there? For example, if I am in Miami but have a lot of customers who come from nearby cities like Fort Lauderdale, is it OK to create those LPs? I've heard this described as best practice, but I'm beginning to question whether Google sees it that way.
Intermediate & Advanced SEO | RickyShockley
-
Algorithmic penalty due to pharma hack that created drug links to home page. What to do?
Our site, Starcitylimo.com, got destroyed by a pharma-style hack on its WordPress install. After rebuilding the site from scratch to remove the infection, we found that hundreds or even thousands of pharma links from overseas sites point at the home page (so we can't just 404 the pages). Contacting the sites for removal does nothing. Adding them to the disavow file 4 months ago did nothing. He's on page 5 for every keyword he was position 5 or better for. Is this one of those situations where it's time to move on to a new domain?
Intermediate & Advanced SEO | iAnalyst.com
-
Update content or create a new page for a year-related blog post?
I have a page called 'video statistics 2013' which ranks really well for video stat searches and drives a lot of traffic to the site. Am I best to just change the title etc. to 2014 and update the content, or create a totally new page? The page has 2013 in the URL as well, which may be a problem for just updating?
Intermediate & Advanced SEO | JonWhiting
-
How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
We have about 30,000 pages that are variations of "&lt;product-type&gt; prices/&lt;type-of-thing&gt;/&lt;city&gt;-&lt;state&gt;". These pages are bringing us lots of free conversions, because when somebody searches for this exact phrase for their city/state, they are pretty low-funnel. The problem we are running into is that the pages are showing up as duplicate content. One solution we were discussing is to 301-redirect or canonical all the city-state pages back to just the "&lt;type-of-thing&gt;" level, and then create really solid unique content for the few hundred pages we would have at that point. My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms, they nearly always have the city-state in them, so the search is some variation of "&lt;product-type&gt; &lt;type-of-thing&gt; &lt;city&gt; &lt;state&gt;". One thing we thought about doing is dynamically changing the meta data and headers to add the city-state info there. Are there other potential solutions to this?
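For the canonical variant of that plan, each city-state page would simply point at its type-of-thing page with a link tag in the head (a sketch; the URLs here are invented):

```html
<!-- On the city-state page, e.g. /prices/widgets/miami-fl -->
<link rel="canonical" href="https://www.example.com/prices/widgets/">
```

The trade-off versus a 301: the canonical keeps the city-state URLs live for visitors while consolidating indexing signals on the type-of-thing page, whereas a 301 removes the city-state pages entirely.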
Intermediate & Advanced SEO | editabletext
-
Dynamically creating unique page titles on an enterprise site
Hi, I want to dynamically create unique page titles (possibly meta descriptions too) on a 10k-page site. Many of the page titles are either duplicates or missing. I heard about the option of grabbing the page titles from a database, or possibly using the h1 as the page title. solmelia.com (the website consists of mostly static pages). Any suggestions would be much appreciated. Best Regards,
Intermediate & Advanced SEO | Melia
-
Create a new XML Sitemap for a blog subdomain?
What would be the best way to go about this? A site just put a blog on http://blog.domain.com/. Should there be a separate XML sitemap for that particular subdomain, or is the original XML sitemap for the main domain sufficient? Looking forward to your responses. Thanks
Intermediate & Advanced SEO | iAnalyst.com