Goods offered in our email newsletters have disappeared from the first Google search results page!
-
We noticed that goods offered in our email newsletters disappeared from the first Google search results page. The goods were in the top 5 positions or even higher, but after the email newsletters went out we couldn't find them even in the top 100. We suspect the service provider we use for email sending is on a blacklist. Could that be the reason? If so, how could we check?
-
No, that should not be an issue. A single email sent privately to customers should not affect your rankings in Google. Do your due diligence on the provider company you worked with: use Open Site Explorer and Webmaster Tools to see whether you have new links, and if so, where they came from. It sounds like an issue unrelated to the email, but at any rate, try to find out why the drop happened.
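On the original "how could we check that?" question: a common way to check whether a sending IP is on a blacklist is to query a DNS blacklist (DNSBL) such as Spamhaus — you reverse the IP's octets and look the resulting name up under the blacklist zone; any answer means the IP is listed. A minimal sketch (the IP here is a documentation-only placeholder, and `zen.spamhaus.org` is one example zone among many):

```python
import socket

def dnsbl_query_name(ip, zone="zen.spamhaus.org"):
    """Build the DNSBL lookup name: reversed IP octets + blacklist zone."""
    reversed_ip = ".".join(reversed(ip.split(".")))
    return f"{reversed_ip}.{zone}"

def is_blacklisted(ip, zone="zen.spamhaus.org"):
    """Return True if the DNSBL has a record for this IP (i.e. it is listed)."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True          # any A record back means the IP is listed
    except socket.gaierror:
        return False         # NXDOMAIN: not listed (or the lookup failed)

# 192.0.2.1 is a documentation-only address used as a placeholder here.
print(dnsbl_query_name("192.0.2.1"))  # 1.2.0.192.zen.spamhaus.org
```

You would run `is_blacklisted()` against the actual sending IPs your email provider uses, which the provider should be able to give you.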
Related Questions
-
Google Custom Search vendors and options
Hi everyone, We're in the process of finding someone who will be able to help set up Google Custom Search on our site and are having some trouble - most agencies we were hoping could help solely focus on Google Search Appliance, a hardware-specific approach that doesn't suit our needs. Specifically, we'd like to replace our current site search engine with Google Custom Search, as well as configure it as deeply as possible for the best search experience. I'm hoping people could give me some ideas on who might be able to help, or the best places to look. Thanks in advance!
Industry News | digitalcrc -
International pages - SEO - which meta tags to use?
I'm trying to get my international pages set up correctly for SEO. Can you tell me which of the following meta tags are the ones to use on the pages? I've heard that some might be obsolete, so will it hurt if I throw on all 3, or should I just choose 1? Example: Italian language page
Industry News | MikeSEOTruven -
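On the international-pages question above: one mechanism commonly used for language versions is `rel="alternate" hreflang` link elements rather than language meta tags. A minimal sketch that generates those elements for a set of translations (the URLs and language codes below are placeholders, not from the original question):

```python
def hreflang_links(versions):
    """Build rel=alternate hreflang <link> elements from a {lang_code: url} map."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(versions.items())
    )

# Placeholder URLs for an English page and its Italian translation.
pages = {
    "en": "https://example.com/en/page.html",
    "it": "https://example.com/it/page.html",
}
print(hreflang_links(pages))
```

Each language version of the page would carry the full set of links, including one pointing to itself.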
Need recommendations for a good SEO company
Hello, I'm looking for SEO in the gaming industry - European languages. Thanks!
Industry News | Rogeroz -
How can I discover how many of my pages have been indexed by Google?
I am currently in the process of trying to produce a report for my corporation, and this is a metric that I cannot seem to find in Open Site Explorer. Could anyone help?
Industry News | CF2015 -
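For the indexed-pages question above: the usual checks are a `site:yourdomain.com` search in Google or the index status report in Webmaster Tools, then comparing that figure against the number of URLs you actually publish in your sitemap. A hedged sketch that counts the `<loc>` entries in a sitemap (the XML is inlined here as a stand-in for fetching your real sitemap file):

```python
import xml.etree.ElementTree as ET

# Stand-in for the contents of https://example.com/sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/products</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

def count_sitemap_urls(xml_text):
    """Count <loc> entries in a sitemap, using the sitemaps.org namespace."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url/sm:loc", ns))

print(count_sitemap_urls(SITEMAP_XML))  # 3
```

A large gap between the sitemap count and the `site:` count is a quick signal that pages are not being indexed.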
Google Trusted Stores
Hello, So we sell millions of dollars a month in merchandise - most of that comes from eBay transactions. We do have a script that posts to eBay, and we do download our transactions from eBay and process the orders from our admin. Now I feel we would do a lot better in the SERPs if we had the Trusted Stores quality signal. However, it comes down to this: the conversion pixel. Since they don't pay through the site, do you think we can get away with sending an email pointing to a second conversion page for eBay transactions? Have any of you noticed a boost in the SERPs once you were approved with Trusted Stores? Any advice?
Industry News | joseph.chambers -
Google Webspam Algo Update 24/4/12
Having just checked our clients' rankings, 95% have not been affected; in fact, many have moved up in the rankings. One or two have had big drops 😞 Who has been affected by this? The forums are full of people talking about sites being floored from the SERPs. It will be interesting to follow the aftermath of this and get some insight into what exactly has changed!
Industry News | ifluidmedia -
Chrome blocked sites used by Google's Panda update
Google said its Panda update used Chrome users' blocked-site lists as a benchmark for what it now terms poor-quality content, and that the update effectively took about 85% of those sites out of the search results. This got me thinking: it would be very nice to discover exactly which sites they don't like. Does anyone know if there is an archive of what these sites might be? Or, if none exists, maybe if people could share their Chrome blocked sites on here we might get an idea?
Industry News | SpecialCase -
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results.

In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
Industry News | webbroi -
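The pretty-to-ugly URL mapping described above can be sketched as a small helper. This is an illustrative sketch of the `_escaped_fragment_` convention from Google's AJAX crawling scheme, not its exact escaping rules (the spec defines its own character-escaping table; percent-encoding the whole fragment, as below, is a conservative stand-in), and `example.com` is a placeholder:

```python
from urllib.parse import quote

def pretty_to_ugly(url):
    """Map an AJAX 'pretty URL' (with a #! hash fragment) to the 'ugly URL'
    a crawler would request from the server to get the HTML snapshot."""
    if "#!" not in url:
        return url  # not an AJAX-crawlable URL under the scheme
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # Conservatively percent-encode the fragment into _escaped_fragment_.
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# http://www.example.com/index.html?_escaped_fragment_=key%3Dvalue
```

The server, on seeing `_escaped_fragment_` in the query string, returns the HTML snapshot instead of the regular AJAX page.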
See more in the Getting Started Guide. Make sure you avoid this (cloaking):
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see:
view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
This is the best resource I have found regarding Google and JavaScript, with step-by-step instructions:
http://code.google.com/web/ajaxcrawling/
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional resources:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=357690