Kilobytes downloaded per day dropped, why?
-
Hi,
Our "Kilobytes downloaded per day" in Search Console dropped drastically. Do you know why?
Thanks
Roy
-
Roy,
Did you optimize your images (make them smaller in file size)? Is the traffic real? Spam hits will not use data the way a real person on your site does. Also, Google could be serving cached pages, in which case it is not downloading your actual assets from your site.
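One way to sanity-check the Search Console figure against your own data is to total the bytes Googlebot actually downloaded, straight from your access logs. A minimal sketch, assuming Apache/Nginx combined log format (the field position and the "Googlebot" user-agent check are assumptions, not something from this thread):

```python
def googlebot_kilobytes(log_lines):
    """Sum the response sizes (bytes) of Googlebot requests in
    combined-format access log lines and return kilobytes.
    Assumes the size is the 10th whitespace-separated field."""
    total = 0
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # only count crawler requests
        fields = line.split()
        if len(fields) > 9 and fields[9].isdigit():
            total += int(fields[9])
    return total / 1024
```

Running this per day over your logs would show whether the drop is real reduced crawling or just a reporting artifact.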
Thanks,
Don
-
Hi Don,
I don't see less traffic; everything is OK.
Thanks
Roy
-
Kadut,
Are you seeing less traffic, or do the pages being visited have less content to download? We need more information to give you specific answers.
Thanks,
Don
Related Questions
-
Is there a limit on back linking per week
I know the follow/nofollow ratio matters when it comes to backlinks and inbound links, but also that a high volume of them per week can be flagged as spammy. Is there a limit on how many backlinks can be built in a week for a domain? I am talking about backlinks from high domain authority, low-spam sources: media and blog sites.
Intermediate & Advanced SEO | SeobyKP
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/
Hi guys, about us: we are running an AngularJS SPA for property search. Being an SPA, an entirely JavaScript application, it has proven to be an SEO nightmare, as you can imagine. We are currently implementing the AJAX crawling approach and serving an "_escaped_fragment_" version using PhantomJS. Unfortunately, pre-rendering the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty.
The problem: when I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index. Not lowered in the rankings, but totally dropped. Even the Google cache returns a 404.
The questions:
1.) Could this be because of serving an "_escaped_fragment_" version to the bots (bearing in mind it is identical to the one users see)?
2.) Could using an API to fetch our results lead to the pages being considered "duplicate content"? And shouldn't that just lower the SERP position instead of causing a drop?
3.) Could this be a technical problem with how we serve the content, or does Google simply not trust sites served this way?
Thank you very much! Pavel Velinov, SEO at Plentific.com
Intermediate & Advanced SEO | emre.kazan
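For context, under Google's AJAX crawling scheme (now deprecated), a crawler that sees a `#!` (hashbang) URL requests an `_escaped_fragment_` variant of it instead, which is what the PhantomJS pre-render above is serving. A minimal sketch of that URL mapping, with the function name being illustrative rather than anything from the thread:

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a #! (hashbang) URL to the _escaped_fragment_ URL a crawler
    would request under Google's deprecated AJAX crawling scheme."""
    if "#!" not in url:
        return url  # nothing to rewrite
    base, _, fragment = url.partition("#!")
    # Special characters in the fragment are percent-encoded.
    encoded = quote(fragment, safe="=&/")
    separator = "&" if "?" in base else "?"
    return f"{base}{separator}_escaped_fragment_={encoded}"
```

If the pre-rendered response for that `_escaped_fragment_` URL intermittently comes back empty, Google may well be indexing the page on a good fetch and dropping it after a bad one, which matches the symptoms described.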
SEO is changing - how has your day to day changed?
I'm sure we all read, in our Google Reader alternatives, that SEO is changing: "here's what we must do to be relevant in 2014". I find these articles boring and uninformative, and I suspect I'm not alone. The reason I'm not their biggest fan is that I feel I've invested 10 minutes into an article that gives me no actual guidance. So I thought I'd ask the real SEOs, you guys: what has actually changed for you? Are you no longer creating content with the aim of getting links? If you run a commercial website, what are you doing differently to rank your product pages, directly or indirectly? Please share with the group. I'm sure many, like me, are still brainstorming and creating content they think will grab people's attention and gain them links, while also pushing their Facebook, Twitter, YouTube profiles, etc. What has changed about this?
Intermediate & Advanced SEO | purpleindigo
Content per page?
We used to have an article's worth of content in a scroll box, created by our previous SEO; the problem was that it was keyword-stuffed, link-stuffed, and complete crap. We removed it and added more content above the fold, but the problem I have is that we can only fit 150-250 words above the fold, and some of that is repeated across the pages. Would we benefit from putting an article at the bottom of each of our product pages? And when I say article, I mean high-quality, in-depth content that goes into much more detail about the product, its history, and more. Would this help our SEO (giving the page more uniqueness and authority than 200-250-word pages)? The one problem I can see: would an article's worth of content be OK at the bottom of the page, and in a div tab or scroll box at that?
Intermediate & Advanced SEO | BobAnderson
How much content on PDF download page
Hello, this is about content for an ecommerce site. We have an article page that we also turned into a PDF. We have an HTML page, with nothing commercial on it, that serves as the download page for the PDF. How much of the article do you recommend we put on that non-commercial HTML download page? Should we put most of the article on there? We're trying to get people to link to the HTML download page, not the PDF.
Intermediate & Advanced SEO | BobGW
Minimum word count per page?
I'm seeding a new site with hundreds of (high-quality) posts, but since I am paying per word written, I'm wondering if anybody in the community has any anecdotal evidence as to how many words of content a page now needs to be counted just the same as, for example, a 700+ word post. I know there are always examples of pages ranking well with 50 words or less, but does anyone have strong evidence on what the minimum count should be, or has anyone read anything very informative on this issue? Thanks a lot in advance!
Intermediate & Advanced SEO | corp0803
Pagination and links per page issue.
Hi all, I have a listings-based website that just doesn't seem to want to pass rank to its inner pages. See here for an example: http://www.business4sale.co.uk/Buy/Hotels-For-Sale-in-the-UK. I know that there are far too many links on this page, and I am working on reducing the number by altering my grid classes to output fewer links. The page also displays a number of links to other page numbers for these results. My script appends the string " - Page2" to the title, description, and URL when the user clicks on page two of these results. My questions: Would an excessive number (200+) of links on a page result in less PR being passed to it (by looking spammy)? And would using rel canonical on page numbers greater than 1 result in better trust/ranking?
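On the second question, one common alternative to canonicalising page 2+ back to page 1 is a self-referencing canonical on each page plus rel="prev"/"next" hints between neighbouring pages. A minimal sketch of generating those tags; the `?page=N` URL pattern is an assumption for illustration, not taken from the site above:

```python
def pagination_link_tags(base_url, page, last_page):
    """Build <link> tags for one page of a paginated listing:
    a self-referencing canonical plus rel=prev/next hints.
    Page 1 is the bare base URL; later pages use ?page=N."""
    def page_url(n):
        return base_url if n == 1 else f"{base_url}?page={n}"

    tags = [f'<link rel="canonical" href="{page_url(page)}">']
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return tags
```

The design point is that each paginated URL canonicalises to itself, so every page stays eligible for indexing, while prev/next tells the crawler the pages form a series.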
Intermediate & Advanced SEO | Mulith
Are there certain times of the day that it is better to update content or blogs? How do I find out what time is best for a particular site?
Trying to figure out how best to optimize the timing of new content, including blogs and other on-page content.
Intermediate & Advanced SEO | AaronSchinke