Friday, April 09, 2010 at 11:00 AM
Webmaster Level: All
You may have heard that here at Google we're obsessed with speed, in our products and on the web. As part of that effort, today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.
Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don't just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.
If you are a site owner, webmaster, or web author, here are some free tools that you can use to evaluate the speed of your site:
- Page Speed, an open source Firefox/Firebug add-on that evaluates the performance of web pages and gives suggestions for improvement.
- YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
- WebPagetest shows a waterfall view of your pages' load performance plus an optimization checklist.
- In Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world. We've also blogged about site performance.
- Many other tools on code.google.com/speed.
We encourage you to start looking at your site's speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone's experience on the Internet.
217 comments:
Zoompf also provides a free performance scan, similar to Google's Page Speed or YSlow, to find over 300 front-end performance issues.
http://zoompf.com/free
How much of a speed difference affects the rankings? Should I be concerned about running GWO experiments?
Exciting stuff. Glad speed is becoming a target for search ranking. Strangeloop Networks conducted research on performance impact: how web speed affects online business KPIs. Check out the webinar here: http://www.strangeloopnetworks.com/news/events/past_webinars.aspx
I do not think that this is a solid idea. What about sites that post lots of photos on their pages or use complex services that take longer to load? What about all the sites that use advertisement? They obviously load slower than a plain HTML site.
It would be nice if Google would add more transparency to the new signal, including if a website's rankings are affected by its loading time (in webmaster tools for instance).
You guys hopefully look at the connection speeds and origins of visitors as well. A website with lots of Indian users for instance will likely have slower speeds reported than a website with Japanese or Swedish users. Are those factors included in the calculation?
How can a webmaster check to see if a recent ranking drop (say on April 1) is related to that new factor?
You might be interested in checking out Compuware and their Gomez web performance product. It's targeted at larger sites and pretty powerful. http://www.compuware.com/solutions/web-performance-management.asp
So if Google Analytics' code snippet is slow, would that lower a website's rankings? That would be the ultimate irony.
How is page speed measured by Google?
If the Google toolbar measurements are used to determine page speed, this can give a wrong performance figure. If, say, the home page of a site has the same URL for logged-in and logged-out users, a complex web application can bring the average page load time up quite a lot. A site with reports and charts (e.g. Google Analytics) loads very slowly when logged in; this shouldn't affect the ranking, IMO.
Any comments on this?
Great stuff, but can having only a little content on a site make the web faster and push the website's rankings up?
Now I have to move my websites from shared hosting to a VPS :(
Looks like I need to advise all my readers to remove Google Friend Connect due to speed problems.
I wrote about that problem before when site speed was first introduced to webmaster tools.
I haven't seen any data to suggest it runs faster.
I do wonder how this will affect websites that are extremely educational -- particularly in the science and math fields -- that have applets and scripts to explain and show difficult concepts and allow the user to manipulate data. By using page load speed, we could see these extremely informational and useful websites drop their page rank while more superficial, simple websites take their place. I'm not wholly convinced this was a great idea.
I don't often comment here, but I spend a lot of time and effort with Google trying to improve my sites' ranking, and I just spent weeks gzipping and adding Last-Modified tags to everything. I am very computer literate and this is not an easy task.
I have some major problems with this. Your current tool in Webmaster cites many pages with >1 DNS lookup, and guess what - that other lookup is for Google Analytics! And as I move to more integrated services I become more dependent on other sites and their ability to perform (e.g. embedding a video from Vimeo or YouTube, content from Creative Commons, and many others)!
My site (johnbishopimages.com) also shares resources on my webhost (bluehost.com) with other sites, because this is what I can afford. Are you going to penalize sites because they can't afford dedicated hardware resources? That kinda flies in the face of many of Google's other initiatives!
How big a role will this signal play in page ranking? If it plays a major role, you are penalizing the little guy who is trying to get a start and rewarding the larger corporate sites, because they can afford lots of iron and content servers spread across the countryside - not exactly a level playing field, and one that, on the surface, this signal seems to perpetuate.
Please rethink this strategy, or at least give us more insight into how it will work.
John
;-j
I'm removing Google Analytics code from all my sites - it's very slow and WMT always shows it as one area of improvement. Google AdSense code often renders excruciatingly slow. I guess, it's gotta go, too.
Additionally, I don't see any practical value in "site speed". "Page Speed", maybe - but what is "site speed"? Depending on the feature of the site you are using, you can get a very fast page (static HTML), a very slow page (text search in a database), or any variation of speed in between. There is no average! It's like the average temperature of all the people in a room. Makes no sense.
Also, I don't hold my breath for Google to explain how they measure the speed, but it would be fair to at least acknowledge whether they are talking about the time the page(s) take to render in the user's browser - which depends greatly on the browser and the PC - or the time it takes the pages' components to load. They keep using both interchangeably, yet these are two very different things.
Talk about confusion: how 'bout pages that load from different URLs as in iframe or frames?
How about slowing down competitors' sites via botnets? Has anyone thought about implications?
This is the worst move Google has taken, hope they reverse their decision soon.
Google, I’m getting messages that some pages are missing a title tag, but they’re not—I think GoogleBot may be borking if the HEAD element is missing (remember, optional in HTML5!) I couldn’t verify via meta tag because of this too. Hope you can fix this, thanks.
It seems it's the finest hour for WEBO Site SpeedUp - http://www.webogroup.com/ - also an open source solution.
The least they can do in my opinion is to tell a webmaster if a page ranking or website ranking is affected by its page loading times.
So now even creatives, developers, hosting services, information architecture (technical architecture), and information design all matter to SEO - in short: build your website properly and help users and the growth of the internet!
Shame.
Officially, Google set a ranking factor not to increase relevancy but to reduce crawling cost.
This is something I've been hoping would happen for a long time.
It's in Google's best interest to rank sites higher that make the user's experience a priority. A site with a lot of images and advertisements does not enhance the user experience. All things being equal, I would prefer to visit a fast-loading site versus a slow one when I search for something.
...so all the spam bloggers are going to buy even MORE of the ISPs and then use their botnets to DDoS the legit sites.
BRILLIANT!!! not.
This will end badly.
thanks.
Glad to see this finally happen, though I wish there was more transparency into what "slow" means. I did a quick analysis of the analysis, including unanswered questions that I hope we can get answered:
http://www.transparentuptime.com/2010/04/your-sites-performance-now-affects-your.html
How will this be affected by preferred access issues? (i.e., following the recent court case setting back net neutrality causes) Will we be able to track what pay-for-play preferential treatment looks like using this tool?
"All things being equal I would prefer to visit a fast loading site versus a slow one when I search for something."
Well, I would prefer to find a site that has the information I'm looking for. I would not mind waiting a bit longer to retrieve that information. What good is a fast loading time if the site is offering useless information?
This will only lead to attacks on popular sites to slow them to a crawl and to spammers using fast static sites to serve their contents.
Up above, T.B.H. Ames said...
"I do wonder how this will affect websites that are extremely educational...
I run just such a site, with large, highly illustrated pages (a bit like Wikipedia page length and with as many photos) and obviously this is a huge concern, not least because I've seen a huge traffic and income drop-off recently. Is this why, I wonder?
I think it's most unlikely, but unfortunately, I have no way of knowing because the information given in the post is a little bit too vague for me to tell.
I've been following the Google Page Speed initiative since December, when it was first mooted, and I've spent a huge amount of time improving my page speed since then, to the extent that the Webmaster tools graph suggests I have improved the average speed (of 400 or so pages) by about 50%. Superb! Thanks guys! I appreciate the steer and, all told, I applaud Google for trying this bold initiative. In the long run I think it's broadly a good thing unless it encourages people to go for speed over quality, rich content, which is not at all a good thing.
Like other posters here, I would appreciate more (and more precise) information so I know where I am. At the moment, like other posters, I am now erring on the side of caution: a) I no longer dare use any kind of urchin-type analytics because of potential speed impact; b) I will now, most likely, be removing some of the other widgets and stuff I use; c) I am scared about running experiments too and using anything remotely server-intensive, except pure HTML (e.g. a wiki running on PHP) that could slow pages down. These things are all negative impacts of a speed drive, as far as I can see, and a backward step. No?
So I do fear a negative impact on page quality if we relentlessly pursue page speed, unless you can reassure people about what factors will and won't make a difference and how. I appreciate you can't always reveal what you do, and I fully understand that, but at the moment I feel we have too little information.
In sum: thanks for pushing us into action. Broadly supportive, but concerned about negative impacts, including Analytics/ads/widgets/PHP etc, and very concerned about gradual, creeping impacts on page quality.
I guess once again I will have to be the black sheep of the bunch.
This is not good at all. While I am all for improving speed on websites, sometimes it is beyond our control. Sometimes website hosts are at fault. So if this does in fact affect ranking, then that is a bit problematic.
And what about sites with Flash? Does this mean once again we have to revamp our sites just to fit the needs of a company? (And yes, that is a bash at Apple too.)
Speed should NOT be a factor in rankings. I don't get why people can't think this through before praising Google.
And I love how Google now "praises" speed when their own Adwords has a loading screen on it now.
BRAVO google.
If you consider the broader impact: This is an important step for the future of the internet - putting a standard in place for a critical part of the user experience. Together with other best practice guidelines, this ultimately will improve the experience for everyone using the internet.
Ed Robinson
http://www.aptimize.com
I like how Google is doing this. The interweb is about communication and sharing of information. It makes sense that the highly relevant site I want to find might rank lower than one that's just a blank page with nothing on it. After all, the blank page may not have anything useful, but it LOADS FAST and that's what's really important.
I cannot be spending time out of my busy day waiting for relevant web sites to load.
It also really makes a ton of sense, because if your site gets indexed at a time of day when it gets a lot of traffic, then it will have lower rankings all the time. I think this is great, because your site should be FAST FOR EVERYONE ALL THE TIME or it's a crap site with shitty information and the webmaster should be castrated and have their eyes plucked out by vultures from hell.
Another great innovation from Google.
As an artist I must say: bad idea. Creative webmasters and bloggers use images to illustrate their texts - or to show their pictures.
Will the SERPs in future just show results without images?
My WMT tells me that all of my sites are slow, but I will not remove images - they are a benefit for each site.
I am a bit concerned about this. My site provides relevant and useful information to my readers; however, because I regularly provide proof of payments in the form of images in my blog posts, I'm concerned that the images slow down the speed my site takes to load, and hence affect my rankings.
What can I do about this?
On several of my sites, by far the slowest-loading elements are the JavaScript for Google AdWords and Analytics!
Why so many knee-jerk reactions? Did you actually read the post?
Loading speed is becoming a factor in the rankings. It is not the only factor.
Relevant pages will still top the rankings. If your site has good content, it will continue to rank well.
Users value both speed and quality. By including speed in the ranking algorithms, Google is just trying to give users what they want.
By making this factor public knowledge, Google are hoping to spur website owners to make their sites faster. This can only be a good thing.
Wow, there are a lot of crybabies on here. I support site speed and proper coding. Remember, content is still king, so stop whining.
If you'd like to see your site's Page Speed and YSlow results together and track them over time, you can do this for free at
http://gtmetrix.com/
We'll be adding some more to it over the next few weeks to get a handle on how long it takes your site to load from a user's perspective.
thanks
Thank you Google!
Superficially, encouraging site speed is a worthy goal, but that strikes me as just one pro outweighed by many cons.
There are a number of drawbacks, most listed in earlier comments, but I'll add one more: geography. If I'm building an NZ-hosted site targeted at NZ users, competing with US-hosted sites, how will load speed influence its ranking?
Unless site speed is measured from many points around the globe and averaged (unlikely), I would have to assume that it will be measured from the US. Not only would this penalise an NZ-hosted site, it would actually work directly against the feature's original intention, because all else being equal NZ users should expect to get faster performance from the NZ-hosted site in the first place.
The first thing that struck me on reading this, though, was that it is not a feature I want as a Google user. If I'm looking for something, I want answers ranked by relevance. If a site has the best answer to my question, I don't want it second in the list because it loads a little slower than a less relevant site.
I use Google as a search engine, not a site reviewer. This feature strikes me as a step backward: it makes Google less useful to me.
How can we find out the webpage loading speed?
Oh no!
That opens the way for exploitation by hosting companies.
I suggest removing the GoDaddy site seal, as it is very slow to load.
Hi, my site was reported as spam!
Now when I try to open my website, a security warning is shown!
Google sent me a message about the post that was reported!
I deleted that post!
How can I get my website back to the way it was before, without the security warning showing?
With this, Google is costing webmasters a lot of money.
Also, I can't find any solutions for how to increase your web speed - only tools to check it.
:(
@MarshallsBlog: a content delivery network will speed up your site's content, and CDNs are not that expensive anymore. MaxCDN has a great introductory offer: $10 for 1 terabyte.
At the moment, like other posters, I am now erring on the side of caution: a) I no longer dare use any kind of urchin-type analytics because of potential speed impact; b) I will now, most likely, be removing some of the other widgets and stuff I use; c) I am scared about running experiments too and using anything remotely server-intensive, except pure HTML (e.g. a wiki running on PHP) that could slow pages down.
I think you're over-reacting here. It's good that you care about speed, but you don't need to be quite so strict.
When optimising for speed, you need to weigh the potential speed gains against any loss of features. Otherwise, you'd just remove everything and be left with a blank page. ;) You also need to consider the difficulty of implementing and maintaining particular optimisations.
For example, minifying your entire HTML document could be difficult, depending on your setup. Sure, it will give a performance gain -- but a small one.
Google Analytics has a new asynchronous loading method. Loaded in this way, it has almost no effect on the page load time. It still gets reported by WMT Page Speed, but this is an anomaly.
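(For anyone who hasn't seen it, the asynchronous snippet looks roughly like this - I'm quoting from memory, so check the Analytics help pages for the current version; the account ID is a placeholder:)

    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXXXX-1']); // placeholder account ID
    _gaq.push(['_trackPageview']);
    (function() {
      // Inject ga.js with the async attribute, so it doesn't block rendering
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ?
                'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();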
Widgets from external sites (e.g. social sharing widgets) do often slow sites down badly. Use them with discretion. Do they really improve the user experience, or are they just cruft? To my mind, most of them are designer-centric cruft.
Server-side processing is the last thing you should be worrying about. 80 -- 90% of end-user response time is spent on the front-end, so your server performance only accounts for 10 -- 20% of your loading time. But if you're concerned, then take some measurements! There's nothing like real numbers to put things in perspective.
As a user, I'm delighted with this. If I'm searching for something, I'll often have to open a couple of the Google search results to find the answer; I'm much less inconvenienced by a handful of seconds to identify a page as not having the information I need than waiting 30 seconds for one to crawl down the pipe. Using page speed as a tie-breaker will certainly improve my browsing experience, just like blocking Flash does.
As a web developer, I'm pleased by this as well: I've advocated efficient site design for years, but others crank out sites which only perform well if your URL starts with "localhost". Yesterday, I saw a big commercial site (ebuyer) which loaded extremely slowly - it was immediately obvious why. The front page was loading three separate CSS files - all from their own server - the first of which consisted of six @import directives. NINE - perhaps more - separate HTTP requests just to get the CSS data? Insane. The other aspects were as bad, but the CSS seemed a nice obvious example.
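(The fix is conceptually simple: combine the stylesheets server-side and serve a single file. A sketch, with file names invented purely for illustration:)

    <!-- Before: several stylesheets, the first pulling in six more via @import -->
    <link rel="stylesheet" type="text/css" href="/css/base.css">
    <link rel="stylesheet" type="text/css" href="/css/layout.css">
    <link rel="stylesheet" type="text/css" href="/css/theme.css">

    <!-- After: one combined, minified stylesheet = one HTTP request -->
    <link rel="stylesheet" type="text/css" href="/css/combined.min.css">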
Users have been complaining about slow page loads for years now, but largely been ignored by a certain subset of developers who don't see the problem with ramming megabytes of extra cruft into their pages. Maybe Google adding weight to that will finally end that bad habit, or at least reduce it.
I have implemented some of the advice google has given about site speed.
Most of my sites were already relatively fast, as they sit on dedicated servers.
Contrary to what some of the whingers on here are saying, implementing most of the suggestions is a no-cost exercise and fairly easy; therefore it should be done.
The only significant thing I haven't done yet is to gzip my style sheets, which I am working on.
The performance improvement is quite noticeable, and the Webmaster Tools site speed page confirms this.
here is one of my sites afors
I can only say, I am very pleased with the results.
So thank you Google for bringing site speed to the fore and to my attention in particular.
With regard to some of the posts saying they can't, won't, or are removing Analytics etc.: you are cutting off your nose to spite your face. So look out - your sites will definitely suffer in the longer term.
Very interesting article and helpful, too.
What's your approach regarding the use of Urchin software? Since it's strongly recommended to place the tracking code in the header section of your pages, this might be a slowdown as well. Or would you recommend hosting both (urchin.js and __utm.gif) on an external source?
Thanks,
Holger
I'm all for rewarding sites for being able to serve pages quickly, but before this was rolled out to the public Google really should have sorted out their own issues.
Analytics, AdSense... both are the most common issues reported by Site Performance in Webmaster Tools.
WE implemented in DECEMBER ONLY
All you people that are whining about slow speed have slow sites.
I will no longer be putting Google Analytics on my page. I will also be removing all AdSense advertisements. My website will no longer be embedding YouTube videos.
All three make websites slow.
I hope this addition would help to make the web a better (and faster) place. :)
It's a great tool and it worked exceptionally on my sites http://dxnlanka.blogs.lk and http://dxnlanka.webs.com
Anyone here using Host1free? Are they providing a good hosting service?
How good is their free plan compared to other hosting providers?
Any help would be greatly appreciated.
http://www.host1free.com
In general this sounds good. However, one thing that is still a problem is that you haven't addressed the algorithm clearly enough. Are you just comparing the exact values from the "Labs/Site Performance" from GWT? Or something else? For instance I have a dating site that has 30% of our users in the 1st world and 70% of the users in the 3rd world. Does the slow load time from the slow connections from the 3rd world negatively affect my "page speed" score for my SERPS in the 1st world?
Will try my best to speed up my site!
I knew this would happen, and I have kept the blog to a minimum; I hadn't upgraded my Blogger template until last week. The only extra scripts are Google AdSense and Analytics.
I think Google checks not only speed but also usability factors like font size, etc.
Thank you, Google. Now everyone will think about improving the speed of their sites.
This could be good or bad. But how are we supposed to know, if Google isn't transparent about the process and how significant this factor is? I would appreciate it if Google would elaborate.
@webalytics: they recommend not using urchin.js at all - it's obsolete, we're all supposed to have moved to loading ga.js asynchronously, because of exactly this speed issue. It's quite possible to do things like this AFTER the page is fully loaded from the user's perspective, and if you do this it does NOT show up as a problem in Google's site performance tool - and in the Analytics case, they literally provide the code ready to paste into your own HTML!
I don't know about AdWords, but you can certainly do something similar with YouTube videos as well; I've seen sites already using a simple placeholder for each video. It isn't until you click the placeholder that the Flash or HTML5 video object gets rendered.
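(A rough sketch of that placeholder trick - the element ID, dimensions, and video ID here are all made up for illustration:)

    // Swap a lightweight placeholder image for the real embed only on click
    document.getElementById('video-placeholder').onclick = function () {
      var iframe = document.createElement('iframe');
      iframe.src = 'http://www.youtube.com/embed/VIDEO_ID'; // placeholder video ID
      iframe.width = 640;
      iframe.height = 360;
      this.parentNode.replaceChild(iframe, this); // placeholder out, player in
    };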
Use the tools Google already provides us with and none of these things would be problems, for developers or for users!
My website speed in Webmaster Tools hasn't been updated since March 14, but Googlebot crawls my site every 30 minutes. How can I get it updated?
This is great news, it's about time that Google takes site speed into consideration. Don't worry too much about your hosting, optimize your site first and you can start by validating the code you wrote. There are also many shared hosting providers out there who provide decent speed, so you should be just fine.
Slow websites are already penalized by users.
I can't figure out why speed should affect the ranking.
Probably JS code will be out of the equation, but what about the network round-trip?
Sites hosted in the US (close to Googlebot) will have better rankings due to less network delay? This is wrong.
The thing is to try to see this with an open mind. I was initially hostile back in December, but decided to run the tools and tests, and immediately saw how much benefit I could gain for free. Anyone who's doubtful, suspicious, or hostile, do try the tools and surprise yourselves.
Thanks to MikeHopley for the very helpful answers to my earlier questions. :)
I have a product that could be used in many countries so I want people around the world to find my website.
So I am concerned about how this will affect one of the key benefits of the internet over other traditional marketing channels: that is, the possibility to reach out to new international markets.
If you want to make search results more relevant for people in other countries besides the country where my website is hosted, then please don't penalise them for having a slow internet connection to my server.
Having a copy of my website in each country would not be economical or manageable for me, and having multiple copies would not be good for Google.
Also, won't the user's computer's specifications and Local Area Network affect download speed? If so, and you are using data from Google toolbar in the browser, how will you take this into account?
It would be better if you used a metric that is not affected by user variables such as the number of hops from the server to the user or by factors that are beyond the control of the website owner such as the vagaries of intermittent hosting services or ISPs.
Instead of trying to measure "speed" (whose speed?), I propose a combination of page size and conformance with W3C standards.
Is it possible, during the calculation of the "site speed signal", to consider only the SPEED OF THE SITE ITSELF, excluding external scripts (like Google Analytics) and graphical visitor counters (like StatCounter, ClustrMaps, etc.)?
I'm asking because Webmaster Tools shows these as the only possible reasons for site slowness.
Or maybe you can create a list of the most popular services whose graphical logos will be excluded from consideration?
So will App Engine-hosted sites improve their speed when the server is idle (or spinning up a new server to scale)? It seems that if Google is obsessed with speed, this is one place they need to make some improvements.
Mistake, if you ask me. "War and Peace was taking too long to read, so have a go at these Cliff Notes instead."
I think it's a great idea, especially when sites are bogged down with popups, ads and site analytics.
I have a fairly slow connection, and sites that require 8 HTTP requests from 5 different domains don't work well with my connection.
I think this is ridiculous. You have a good sense of humor. Have you tried the results of Page Speed on your own websites? Do you follow your own recommendations?
What about Blogger? You make Blogger slow by adding unnecessary code. You don't offer a decent file hosting service for Blogger users to let them optimize their blogs' load times. Look at the Blogger gadgets embedded with iframes, the comment form (it's slow and doesn't work well), and a long list of other things.
And what about the Google API? Have you tested the load times?
Please look at your own services first and start working to improve them. Do you need any employees at Blogger?
I think there are good intentions behind this, but it'll be very detrimental in some circumstances.
My business and clients are in New Zealand. I tried a US-based host for a start, but load times to NZ were pathetic. Now that they're all hosted in NZ, so they load nice and fast for their intended audience, WMT tells me that my sites take about 16 seconds to load. Is that going to make my rankings plummet?
What is the benefit if a fast but content- and image-less page is preferred?
In order to speed up my start page, FriendConnect, Analytics & AdManager have to go.
The content will be split into four pages.
-> Usability worse, speed & position better.
To see the same content, the user has to open four pages.
Sorry to my FriendConnect community with 800+ members, too.
@Google: Is that what you want?
@Rick:
WMT SiteSpeed gets its load-time data from a subset of your actual visitors (those who have the Google Toolbar with Pagerank checking activated).
If you're seeing average readings of 16 seconds, then that means 16 seconds for your actual users, not for some arbitrary Google server pinging you from the US.
Just how local is your host? If you've chosen a host in your own city, then you may be seeing extremely fast loading times that are not reflected over the rest of New Zealand.
Pay attention also to the accuracy level of these data. Is GWT saying, "These estimates are of low accuracy (less than 100 data points)"? If so, then you could find that the data are not reliable (not enough samples to be statistically significant).
With regard to server location: have you considered using a CDN for your static assets (images, javascript, css, flash...)? You can now get a zero-commitment CDN such as Amazon Cloudfront, where you just pay for the bandwidth used. The cost is trivial.
At the other end of the scale, if you're running a huge business with a high-traffic website, you can get better performance and volume pricing from the big boys such as Akamai. The CDN marketplace now caters for every scale of business.
The location of the HTML document itself is much less important, because this item accounts for only about 10--20% of loading time.
@James: Thanks for the information. We are aware of the existence of the asynchronous GATC. But we also know that there are compatibility issues when using the GATC via the "setlocalremote" parameter and then parsing log files into Urchin software.
What's your approach to this?
Cheers,
Holger
This is great to see. We provide a free page analysis tool at:
WebPageAnalzyer.com
to test website speed and offer recommendations. As well as reducing operating costs, faster site speed has been shown to improve conversion rates and lower bailout rates.
You forgot Chrome's profile tab in its developer tools for speed measurement. It provides a full breakdown. I was able to use it to work out why Ajax search was adding fifteen seconds to page load times.
On another note, speed is good, but less content = faster but less relevant content.
Web rendering speed is more about the hardware you are running to spider the sites and the speed and quality of the rendering engine in Googlebot. I think the idea has its merits, in that a lot of web developers build sites that pull in heaps of adverts and write bad, bloated code; but whether it's a useful idea depends on what context "speed" is measured in, what is considered "slow" for x amount of bytes in total, and whether the average speed over several crawls is used instead of just one.
Seeing how the relevance of search results provided by Google has been degrading in my observations over the years, I think methods to improve result relevance would be more welcome. But standards-compliant pages are always the fastest to load in web browsers, and I'm all for that.
@MikeHopley:
Cool, I didn't know the speed data was taken from actual visitors, but now I'm quite worried. Though this does mean it only records people using Firefox (and I guess IE) who love stacking themselves up with toolbars.
My VPS is hosted in Auckland while I'm at the other end of the country in Invercargill. I get load times around 5 secs uncached and 1-3 secs cached.
You're right, it says it has less than 100 data points.
In regards to using a CDN for assets, that's quite tricky with the CMS I use (scripts and CSS are auto combined, minified, gzipped and cached and images are auto cropped, scaled and cached) but I'm definitely going to look into it if it'll make a difference.
Thanks for your feedback.
And once again google promotes MFAs.
Can someone explain the exact relation between speed and quality?
Not sure this is valid logic. It sounds too simplistic. I agree with the commentators who ask why not better filter spam sites and bogus forums. Quality sometimes takes a bit longer to load.
@Rui - See Andy King's books on website performance optimization.
http://www.websiteoptimization.com/speed/1/1-3.html
Seems like I will have to invest in a VPS or dedi now.
How much page speed is enough to rank a website?
Hi,
I think this is a good move for the future. Big corporate sites never compromise on their rich content, so they go for dedicated servers, and small sites will go for search guidance.
Google should show some transparency: when ranking pages, they should include the download speed of the page, so it will help users decide whether it is worth clicking, with respect to their internet speed. And site webmasters will also take it seriously.
My site is not so rich with images: http://www.seo-speaking.com
Thank you for your efforts, Google.
You are wonderful; thank you.
Administration of the Salfit forums
http://www.salfeet.ws/vb
This sounds like a foolish move on Google's part if quality content is the goal of searches. If I'm wrong, let Google explain how this will not undermine good content.
Some of the best information pages I've seen have a lot of photos and a lot of text. Naturally, they load slower.
It seems that this effort by Google could encourage folks with websites to reduce useful content to keep higher rankings.
Now... if Google is able to make allowance for images, and gauge speed more by data transfer speed rather than just page size, that might be fine.
@Ginox
Having your page speed performing better than average is a good start. Compare it in your Webmaster Tools -> Google Labs -> Site Performance and check for yourself.
@Rick:
My VPS is hosted in Auckland while I'm at the other end of the country in Invercargill. I get load times around 5 secs uncached and 1-3 secs cached.
You're right, it says it has less than 100 data points.
I'd probably not take it too seriously then -- unless you know that you're on a faster-than-typical connection for NZ.
In regards to using a CDN for assets, that's quite tricky with the CMS I use (scripts and CSS are auto combined, minified, gzipped and cached and images are auto cropped, scaled and cached) but I'm definitely going to look into it if it'll make a difference.
I think using origin pull could help here.
Something I didn't think of before: you would need to be quite selective choosing a CDN. Most budget/mid-level CDNs have networks concentrated in Europe and the US.
If a CDN has no servers in New Zealand, then I'd guess it would actually make performance worse for your customers.
You'd be okay with Akamai, of course, but they require you to buy at least $150 US per month of bandwidth (a very rough estimate). That would be from a reseller/partner, not dealing with them directly.
Site speed is a very important thing for the end-user experience.
The slowest-loading things on my page are Google ads and Analytics...
Which time is measured in your Webmaster Tools site performance chart? The time until the browser has loaded all local resources, or ALL resources, even things like Google ads or AdSense?
I designed my site http://www.ispreview.co.uk to be lean and fast on a shared, VPS, or dedicated server, so I don't expect to suffer any problems as a result of this. Still, it would be nice to have more details on how this might affect PR; the blog post is not clear.
Hi, does it affect all the Google search engines, like Google.fr?
I saw this coming late last year and got prepared. Site speed should be a factor, because it forces you to optimize your data for everyone's benefit. I hate slow sites, and there are still people out there surfing on dialup rather than broadband.
Even sites heavy with pictures and files can benefit from some optimization. My experience has been that even those sites can be optimized and cleaned up to improve speed without sacrificing design/layout. Big clunky sites are easy to optimize too. Follow the advice and techniques YSlow and Page Speed recommend, and it helps. They are not asking you to remove these items, just to clean them up so they perform optimally.
Seems like the people whining are the ones with the clunky sites that do not want to adapt. It's not easy work, but to stay on top of your internet rankings, you must stay on top of developing ranking standards and implement accordingly.
@Google: Note that some sites are very much focused on a specific geographic region, e.g. www.pensioenpage.com and www.bol.com are both focused on The Netherlands, primarily. The content is all in Dutch.
So when you measure the site performance from the US, the figures may not be relevant to the real target audience.
Do you have this potential issue covered?
It's a good thing that Google is highlighting the importance of site speed and its role in web-search ranking. Not only will the assessment of site speed help improve rankings; it will also help the overall Internet user experience. I've been looking for more ways to deliver more traffic to our sites, originally by utilizing resources specifically for SEO Wales.
This addition will be a big benefit. Sure, taking the time and spending resources to make your site faster isn't always easy, but the bottom line is improving the experience for your readers/visitors/customers. Anything you do/add/change on your site should have improving the visitor/customer experience as a goal.
I would appreciate it if W3C validation were a factor in search rankings, so that sites with hundreds or thousands of errors are ranked significantly lower. (Or does that happen already?)
That would be one more motivation factor for web masters to create their sites correctly.
Funny thing... this page was very slow to load for me :)
The slowest things about my sites are the includes from Google Analytics.
"today we're including a new signal in our search ranking algorithms"..."We launched this change a few weeks back." So, which is it? A few weeks back or "today?"
I think it is amusing how worried people are about this new feature. Regardless of what Google measures or not, the fact is that if your site is slow, visitors will not stay - end of story. So... it makes sense for Google to give slow sites a low priority in the page-rankings, because users want fast content.
Maybe this will be the final nail in the coffin for all those self-proclaimed internet marketing "gurus" whose pages feature everywhere and take ages to load.
It's nice that the news is finally official! :D
It makes complete sense. How many times have you left a site because it's loading that fraction too slowly? The fact of life is that people are more and more impatient and webmasters need to make sure they pay careful attention to the speed of their site!
This does mean that it is more important than ever for companies to have a good hosting provider on a dedicated server!
UKFast are the UK's fastest hosting provider and allow you to test the speed of your site online... http://bit.ly/c6bljT
Cheers for the article guys!
www.Dotcom-Monitor.com has been especially focused on a proactive approach to Google's use of the site speed factor for SEO since we attended the Pubcon SEO conference in November 2009. We set up 9 free online tools to measure website speed at http://www.dotcom-monitor.com/task_instant_test.aspx
I noticed the slowness of Google Friend Connect and it is gone.
This page scored 78/100. Interesting.
How will we know a good rating from a bad?
I have been working on speeding up my site, and most of the issues I'm seeing using webpagetest.org (recommended by Google) are that the JS in the AdSense and Analytics code is causing the problems:
Enable browser caching of static assets:
WARNING (24.0 hours) - http://pagead2.googlesyndication.com/pagead/abglogo/abg-en-100c-000000.png
WARNING (24.0 hours) - http://pagead2.googlesyndication.com/pagead/expansion_embed.js
WARNING (24.0 hours) - http://pagead2.googlesyndication.com/pagead/js/graphics.js
WARNING (24.0 hours) - http://pagead2.googlesyndication.com/pagead/show_ads.js
WARNING (24.0 hours) - http://pagead2.googlesyndication.com/pagead/sma8.js
WARNING (7.0 days) - http://googleads.g.doubleclick.net/pagead/imgad?id=CJja74qE-ZO9_AEQyAEYvQEyCNaRSHIcHYmF
WARNING (14.0 days) - http://googleads.g.doubleclick.net/pagead/test_domain.js
WARNING (14.0 days) - http://pagead2.googlesyndication.com/pagead/render_ads.js
WARNING (14.0 days) - http://www.google-analytics.com/urchin.js
So when will the AdSense and Analytics code be "optimized"?
I don't want penalties for your code...
So sites that contain photos and videos will have problems with this? Especially Flash-made websites.
One of the most senseless Google "features" ever!
Pages with videos, pictures, or logged-in features will be punished in page ranking :D
Unbelievable...
And for the first time ever in TWELVE years, Google "announces" a "signal" that may affect ranking, and the clamouring masses think the sky is falling.
Well, it will make a change from them (t)wittering on about the SGB "value"!!
This nonsense shows us what the world is waiting for:
Is it better to get bad information fast than to get good information slower?
Wow, what progress...
@amazon blurted:
"Its better to get bad information fast than to get good information slower?"
The Google page-rank is only an indication of how easy it is for people to find your page when searching on Google. If you have fast pages that are full of garbage, and you make the effort to streamline them for a good page-rank with Google, then good for you.
Most people who have rubbish on their pages won't make the effort (no matter whether their pages are slow or fast).
Having said that, regardless of whether a page has good information or not, I expect it to load fast.
All Google is doing is catering to the needs of the users of the web.
If you are looking for a really good tool to Measure End User Experience there is an awesome free tool to check out http://www.real-user-monitoring.com
Calling people whiners is not just inconsiderate but also inappropriate.
There is a legitimate concern amongst people who design sites for a living, who are worried that they will have to rebuild sites (meaning remove images, embedded videos, and Flash) to improve search rankings.
It is also a concern for people trying to reach an international audience. Clearly a US-hosted site will be slower than a site hosted in the UK.
So instead of calling people whiners, try to think of it from our point of view. Yes, I do understand the importance of speed for visitors. What I don't understand is trying to use it for ranking.
Helpful, Matt. Now, about those other 199 search ranking items...
Matt N noted that Calling people whiners is not just inconsiderate but also inappropriate.
Yes, but it's much better than calling them Weiners!
A really good move. As a user, I can understand what Google feels. Nobody wants to waste their time loading a slow website.
Google would do better getting rid of more Made-for-AdSense pages and sites, improving search results, and reducing the dilution of AdSense ads. I already know speed is an issue, which is why I went to a dedicated server on SoftLayer several years ago. I have monitored speed and DNS lookups. The problems have been with slow-loading AdSense code, not my pages. I think this is much ado about nothing. Content is king. Speed is nice, but fast Made-for-AdSense sites (and Made-for-AdSense sites continue to grow in search results) are of little value.
This move will cause a surge in demand for DDoS attacks from unscrupulous competitors. After all, what could be easier than knocking a competitor out of the results for a highly competitive query using a DDoS attack? But any protection against DDoS attacks in some way reduces the site's response speed, which may lead to a deterioration in ranking. Therefore, Google itself is triggering the growth of cybercrime. You need to think first, gentlemen, before making such statements.
Any competitor willing to resort to a DDoS attack in an attempt to reduce the target's PageRank would have far better reasons to do so anyway: mounting a DDoS attack will directly interfere with customer access in the first place, regardless of any impact on PageRank and search ratings. Like blockading a restaurant entrance so the health inspector can't renew their inspection certificate and closes the place down: the blockade itself achieves the same aim without the health inspector's involvement anyway!
"There is a legitimate concern amongst people who design sites for a living that are worried that they will have to rebuild sites (meaning remove images, embedded videos and flash) to improve search rankings."
Not particularly legitimate, redesigning your site to obsess over a single one of the 200 factors PageRank considers would be almost absurd. Thinking twice before choking up your site with Flash objects in particular would be wise, not just because of this change but because a lot of users like me dislike and block Flash and/or use devices which don't support Flash in the first place (iPod Touch/iPhone/iPad), meaning your page renders with big holes in where you tried to put Flash.
Making your pages load quickly has ALWAYS been something you should consider - it's not as if Google finally making it a very small part of your overall PageRank is an enormous change in any case. Page speed is still much more important to users than it is to Googlebot - and if you care more about Googlebot's impression of your site than your visitors', you really shouldn't be designing websites.
Nice post!!! Site speed has a great impact on website performance and higher ranking in search engines. Your site should be user-friendly and search-engine-friendly so that you get better results in search engines. Page speed is really more important to users than it is to Googlebot - and if you care more about users than about Googlebot's impression, you drive more traffic to your Web Development Services website.
What about countries that deliberately use low bandwidth? How will that be calculated? People may have bandwidth or download speeds anywhere from 100 Kbps to 100 Mbps. In that case, how will it be calculated? 100 Kbps will inevitably take more time than 100 Mbps. Should I take it that Google is urging its users to go for higher bandwidth?
I've dropped from a #2 ranking (held for years) to a #6 on my primary search term due to this. This, after I've applied as much of Google's PageSpeed recommendations as I could a couple of months ago after I saw this coming.
My index page was previously issuing over 120 http requests and I brought that down to about 20 and made many other optimizations too, thanks to Google's recommendations.
I didn't bother to optimise pages that are rarely used. Are rarely used pages weighted equally, and will that bring my average down? Or maybe I should suddenly create 100 fast-loading fluffy pages to bring my average up? Surely Google would not want to encourage that behavior!
My primary content loads extremely fast and it loads first. I have other things (Facebook Become a fan, ShareThis, and relevant ads) loading at a time when the user is not impacted by any of this stuff unless that is the stuff they want to use.
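(For anyone wanting to do something similar, this kind of deferred loading generally looks like the sketch below - the widget URL is just a placeholder:)

    // Load non-essential third-party widgets only after the page's own
    // content has finished, so users are never blocked waiting for them.
    function loadWidgets() {
      var s = document.createElement('script');
      s.src = 'http://widgets.example.com/share.js'; // placeholder widget URL
      s.async = true;
      document.getElementsByTagName('head')[0].appendChild(s);
    }
    if (window.addEventListener) {
      window.addEventListener('load', loadWidgets, false);
    } else {
      window.attachEvent('onload', loadWidgets); // older IE
    }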
All of these things have provided a much BETTER experience for my users than they had a few months ago. How do I get rewarded? By losing my search positioning.
My only recourse seems to be to either live with my #6 ranking or remove useful links because they "appear" to take time by whatever algorithm Google is using to measure it.
My very strong opinion on this whole thing is that although Google's intentions are sincere, they are flawed in this case and should be immediately rescinded.
Guys, what about sites hosted overseas? My sites are for Australia and hosted in Australia, and this obviously makes them "slow". How does that work?
Speed - I think it does work as a new signal, or tracking????? http://alvgom.blogspot.com/
Good news for users, and it could be challenging for website managers. I think you can still be optimistic even if your site is a bit slow. Check this out:
http://the-useful.blogspot.com/2010/03/how-to-be-optimistic-when-your-site-is.html
@ Warren Schwader:
I've dropped from a #2 ranking (held for years) to a #6 on my primary search term due to this.
You can't possibly know that.
It's more likely that your ranking change is due to other factors. The rankings change all the time; maybe one of your competitors has improved his site, or publicised it better!
This means that established companies using faster dedicated servers will always rank higher on Google compared to shared hosting, even if my website has more relevant content than my competitors' and I work harder than them. This really is a joke.
@dave:
established companies using faster dedicated servers will always rank higher on Google compared to shared hosting, even if my website has more relevant content than my competitors'
No, that's not true. You have made two false assumptions:
1. Sites are ranked only according to speed.
2. To optimise for speed, it's important to have a dedicated server.
(1) Site speed is only a small factor in determining the rankings. Google is not stupid: they know that users are looking for good content.
If you have much better content than your competitors, then you will (probably) rank much better. However, if you're about equal in content, then the faster site will likely rank higher.
(2) The speed of the server has little effect on the speed of the site. Most websites will get little benefit from a dedicated server.
Provided your web host does not oversell its shared servers, they should perform almost as well as a dedicated one. A responsible web host will also let you know when they feel your site is outgrowing shared hosting.
Finally, remember that only 10--20% of the response time is spent on the back-end (the server). 80--90% is spent on the front-end (the client downloading stuff). If you doubled the performance of your server, you would only improve your response times by 5--10%!
If you double the front-end performance, then your response times would improve by 40--45%. This is 4 to 9 times better than the back-end improvement, and it's also much easier and cheaper to achieve.
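(To put rough, purely illustrative numbers on that, take a 5-second load split 20% back-end / 80% front-end:)

    var total = 5.0;               // seconds; illustrative figure only
    var backend = 0.2 * total;     // 1.0 s spent on the server
    var frontend = 0.8 * total;    // 4.0 s spent downloading and rendering

    // Doubling server speed halves only the back-end share:
    var afterServerUpgrade = backend / 2 + frontend;  // 4.5 s, just 10% faster

    // Doubling front-end performance halves the front-end share:
    var afterFrontendWork = backend + frontend / 2;   // 3.0 s, 40% faster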
As a bonus, front-end optimisation drastically reduces the work your server needs to do (it has fewer and smaller objects to serve for a given page). This means you can stay on shared hosting for longer.
To everyone moaning about this and how it's going to kill off your rankings: he did mention it's only going to be a very small factor in your rankings.
Having a faster site will obviously be beneficial, but not as much as having a well-ranking, ethically optimised site...
Thanks so much for the information that you have gathered here. One thing that I think is also very important, so that you can get more information about your site, is a free website analysis. Getting that from a company that offers SEO, or from free software you can download online, is great. It gives you so much information and helps you find out what is happening with your site and what needs to be done. So worth it if you have time to have it done.
This is great news indeed, and a good step towards fighting search engine spam. Most of the sites on the spammy side are made without any regard for customers, and therefore for speed.
My biggest concern is that it will be a lot simpler for search engine spammers to shift from their usual methods to getting great speeds (setting a few things server-side, plus using a CDN like CloudFront, can be done in half an hour), while complex portals and web applications would need quite an effort to catch up. But this is only a short-term problem; in the long run the web will be faster and a bit more spam-free. Thanks a lot, Google.
It may be a good decision from a user point of view. But certain sites with complex coding and photos/videos will get knocked out by this decision, for sure.
@ Warren Schwader:
I've dropped from a #2 ranking (held for years) to a #6 on my primary search term due to this.
MikeHopley said...
"You can't possibly know that.
It's more likely that your ranking change is due to other factors. The rankings change all the time; maybe one of your competitors has improved his site, or publicised it better!"
Well, I am certain that I know my niche and neighborhood infinitely better than you do and am well qualified to make this assertion. It is slow moving territory that I keep up on constantly. Interestingly, my main page just improved to a PR4 in the last month too.
While I can't say for certain that it is caused only by this one new factor, I can pinpoint that the change in ranking came at the same time as the change in Google's algorithm.
By the way, it's almost impossible to reach 100/100 on both Google Page Speed and Yahoo YSlow. I made a site to show what a perfectly fast site looks like, but then again, it's only for fun. In the real world no one would even go near that number. On a production site my best is 98/100.
@Warren Schwader:
While I can't say for certain that it is caused only by this one new factor, I can pinpoint that the change in ranking came at the same time as the change in Google's algorithm.
Correlation is not causation. Too many variables + not enough data = indeterminable causation.
I still doubt whether this is a good idea. There are many sites, especially news sites or blogs, that provide great news and information which the user would like to read, and that need immediate attention from the search engine to get crawled. I am sure the ideal page size in the Google algo is 200K (approximately a 4-second download), but the exception is 350K (approximately 7-8 seconds).
Ryan @ Techethos
http://www.makefreeinternationalcalls.com
What a useless signal. I, for one, don't care at all how fast or slow a site is if it has the information I'm looking for. I would hate to see the relevance of search results further harmed by the inclusion of speed as a factor in ranking search results.
One other thing that you can do to help you get better rankings on Google is to find a company that does a free website analysis, so you can see what things you can do to improve your SEO. Do your research and try to find a good company that wants to work for you.
www.boostability.com
Have a nice day.
If you are looking to speed up your website for cheap go to www.lightspeednow.com. They have three different packages to choose from! Amazing service! I just used them for one of my companies and my website is loading extremely fast! So worth the money!
IT support services serve as a boon for small businesses, as even a minor technical problem can badly affect the smooth running of your computer. Technical support service providers keep your business running smoothly by providing IT Services Los Angeles services.
Do I understand right in thinking that if there are a number of websites dedicated to the same topic, and the majority of them discuss the subject using text only while a few also provide pictures, videos, and slide shows (i.e. give more visual information), then those few have less chance of ranking better than the less informative text-only web pages, due to the loading speed issue?
Nice information; I really appreciate the way you presented it. Thanks for sharing.
http://www.w3cvalidation.net/
Google: there should be a one-click unsubscribe in your emails following blog comments. That is email best practice, and I think a legal requirement in some jurisdictions. Instead, I am being prompted to sign in and sign up for Blogger.
Hi, my homepage is new. Visit my page:
www.radarcezasi.com
Interesting concept... but lots of room for error. For example, Alexa tells me that 69% of sites are slower than my site, but Google says 70% are faster. They can't both be right!
GWT is also reporting my average load time at 4.5 secs. It could be that the Google Maps widget continues to generate HTTP requests as it animates, but this enhances the user experience; the actual content loads very quickly. Here's an example:
http://tickets.gruvr.com/band/taylor-swift/
Another factor: I optimize my site for users. If I know it's a human, the DB queries are given *high_priority*, so spiders will tend to see slower times.
In other words, whatever you're doing to measure performance has a long way to go.
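For what it's worth, spider detection like that usually comes down to a User-Agent test; a minimal sketch (the helper and the regex are illustrative, not anything Google publishes):

    // Rough check for common crawler user-agent fragments; extend as needed.
    function isSpider(userAgent) {
      return /googlebot|msnbot|slurp|spider|crawl/i.test(userAgent || '');
    }

    // Hypothetical usage: a server would pass the request's User-Agent header
    // and could then decide whether to give that visitor's queries high priority.
    isSpider('Mozilla/5.0 (compatible; Googlebot/2.1)'); // true
    isSpider('Mozilla/5.0 (Windows; U; MSIE 8.0)');      // false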
This is a confusing initiative:
Load speed seems to already be part of the equation, since Google includes "bounce rate" in page ranking. The bounce rate in itself (despite some possible shortcomings) is the most multi-faceted indicator of the interest users express in a page.
If a page is "ugly" or offensive or a scam or too slow or not what I want or whatever, I bounce off the site, don't you?
Who cares how slow a page is if it contains the information you want. It could be a large reference document you need, a big training video or a small site run on a PC in a university lab. It might be slow but you'll wait, right?
Actually, how could 'bounce rate' possibly be a factor? If it were, it'd be trivial to manipulate: just hire folks to do searches and bounce off (or stay on) sites.
In my experience, the best sites give me exactly what I seek, instantly, in one small bite. No need to click; I'm outa there. It's the confusing sites that force you to click around, or are too dense with info to quickly get what you want... I tire of 'information dense' pages, as do many folks.
Like it or not, bounce rate IS a ranking signal. See http://seoblackhat.com/2008/11/21/bounce-rate-seo/
And a confusing one at that; think of the implications for a web site catering to people with ADHD...
That's just a correlation; there's no indication of cause and effect. For example, it could be that people are much less patient when reviewing the 20th item than the first 5 items, i.e., lower rankings cause higher bounce, not vice versa. On my own site, bounce rate has been steadily increasing over the past few months (with no change in content), but so has search traffic.
The USER needs an option in this matter; corporate interests are not the only influence on web traffic. Something like a checkbox: "Now-now-now" vs. "Give me WHAT I ASK for" (the purpose of querying Google in the first place, perhaps?).
Improving the relevance heuristics would be a worthwhile investment, and I believe it would cut out a lot of bloated sites anyway, allowing the appropriate ones to rise to the top.
There are too many variables for site speed to really be effective.
Not to mention, sites are often programmed in such a way that the user experience seems fast and snappy while their 'off-screen' stuff is still loading (or even pre-fetching). Now those sites will be penalized, and the result will be sites designed for a spider's version of 'site speed' rather than a user's actual experience of 'site speed'.
Also, 'link farms' will benefit from this (text loads fast), while actual content providers (say, photographers, news articles with video, etc.) will be hurt.
It is a dilemma: the optimum size for a site is said to be no more than 16 KB, and for an image no more than 50 KB, yet the best optimum size for an image is circa 70 KB.
Willi SEO Company
www.resdays.com
Please, everyone!
Do you really believe that Google will use some crappy "came up with it in 10 seconds" method to analyze speed?
My sites get an average of 80-85 in Page Speed. Some of the suggestions it makes are true, and I am working on fixing those issues.
Even so, for some of my pages it takes 1.7 seconds just to generate the content. I think this new approach will make developers rethink their database structure and optimize some code, and make designers go back to the roots of their profession.
A lot of people forget that more and more website traffic comes from mobile devices, and good practices matter even more there, as the platform is much more restricted.
So all together, I like the idea.
Yeah, speeding up your site has to be the new in thing in the SERPs, you know... One of my new sites, The Mad News, is very slow to load, and that is something I was worried about. Now I will use this to make it load faster.
My site has got a penalty or something: it is not showing up for my newer posts, while my older posts are ranked super high in the SERPs... Please check my site; what is wrong?
I am glad that Google considers site speed as one of the parameters in calculating the ranking of a particular website. Excellent thinking by the Google people. If you do not know how to check the speed, you can go to Webmaster Tools and check it there.
How do you test the speed difference, and the difference speed makes to SEO ranking? Maybe by Alexa rank and Google ranking or PageRank; Yahoo and Microsoft Bing do not offer this.
Resdays
It's funny how in Brazil all the sites are already optimized for speed. Otherwise most users wouldn't have access, since high-speed internet is a privilege of the few.
Thanks, Google, for considering page speed as a ranking factor. How can I decrease the loading time of my blog Mobinepal?
What is going to happen when the telecom corporations manage to sleaze/bribe through their anti-neutrality laws? When only the wealthy and corporations can afford high-speed content distribution, does that mean they will additionally be the only ones appearing in the search results?
The search speed metric sounds like a generally good idea to me *in the current political climate*, but if net neutrality gets trashed, I think this metric should go with it.
Very interesting post, along with the comments, although I did not read them all.
But here's a link you can use which shows YSlow and Page Speed results on one page for comparison.
I think it's a great idea. I've been working my butt off to make our ecommerce site as fast as I can (it measures a lot faster than even amazon.com), so I am glad to see that it will pay off in SEO as well as making happier customers. This is also a nice touch, as it's one factor blackhat SEOers can't game, unless they're doing so to also benefit their visitors.
And to all of those complaining: keep in mind that if you have YouTube videos, ads, Google Analytics, etc., it shouldn't matter. A site can be optimized and run fast even with those. JavaScript is most likely the culprit for slow sites, so you may want to check out appending scripts to the DOM using a function that runs on the body.onload or, even better, the window.onload event. And to others, who I'm sure thought they were making a clever point: Google does an excellent job of keeping ga.js optimized (gzipped, for one), and the extra connection it opens up allows it to download in parallel with the rest of your site, making ga.js load faster than if you hosted it. I may be wrong, but I also think it's served from a Google CDN as well, so it is definitely faster than if you hosted it (assuming you yourself are not using a CDN). However, if you still don't agree, you can host ga.js yourself, which is most beneficial when you can combine it with other .js includes you may have (though if you go this route, I'd recommend making sure your copy stays up to date via some sort of automated check).
Point being: go do some learning and stop complaining, as speeding up your site should have been a priority to begin with.
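As a rough illustration of the window.onload approach mentioned above (a sketch only; the function name and script URL are placeholders):

    // Append a script element to the DOM so it loads without blocking
    // the initial page render.
    function loadScriptDeferred(src) {
      var s = document.createElement('script');
      s.type = 'text/javascript';
      s.src = src;
      document.body.appendChild(s);
    }

    // Wait until the page (including images) has finished loading.
    window.onload = function () {
      loadScriptDeferred('/js/extras.js'); // placeholder for any non-critical script
    };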
Frankly, I'm not going to let Google hold me hostage over this. There are too many uncontrollable factors, such as poorly performing hosting hardware, an over-populated hosting platform, ISP traffic congestion, POOR PERFORMING GOOGLE TOOLS, etc. This is just another POOR JUDGEMENT call by GOOGLE and an excuse to justify knocking down a site's page rank. A site should not be penalized for the failures of others and of GOOGLE.
This is arguably the *best* news I've heard all year (at least regarding the internet). Over the past few years there has been an appalling decline in the efficiency of viewing websites, with the glut of JavaScripting and AJAXing and a growing disregard for compliance with coding standards. Not only do pages take *ages* to load, but in many cases the browser *continues* downloading bytes even after the page has appeared, probably scripting libraries and god-knows-what-else that ties up the connection. This is especially troublesome for persons using dial-up connections, which are inherently sluggish to begin with.
I often wonder if these webheads ever actually check out their DL speeds at all, or just blithely assume that everyone is on a T1.
I seem to recall that early on, the recommendation was that a webpage not exceed 64K. I recently clocked "Constant Contact" at pouring out over 6 megabits (!) during an email-editing session. Plainly ridiculous!
I applaud Google for taking a step in reining in some of this bloat, and offer a toast: Here's to faster, more elegant internet coding!
EAR
For Professional Photographers this is turning into a nightmare.
In fact, we are seeing our pages disappear from Google altogether, especially the images. I, along with many other professional photographers, make my living on having my images seen. Since the update, most if not all of our images have been dropped from Google.
Check out this thread in the Zenfolio forum, a site dedicated to hosting photographers' pages. It is a very SEO-aware site that many professionals use. Of the thousands of members and over a million images, at last count Google had only 460 images indexed. And it's dropping. ( http://forums.zenfolio.com/forums/t/5703.aspx?PageIndex=1 )
This is a nightmare for those of us who take pictures but aren't experts in SEO.
Gene Inman
http://www.GeneInman.com
BigEasy: A quick look at geneinman.com, mentioned in that thread as having "disappeared", shows me it's using cloaking to hide the images so that anyone trying to "spider" the site gets 'null.gif' instead of the actual photos, which get shown through a mix of CSS trickery and what looks like Flash.
I don't think the Zenfolio problem has anything to do with site speed: it looks to me as if the image crawler simply isn't seeing IMG tags referencing the photos in the first place! As noted in your link, Smugmug and Flickr have millions of images indexed, but Zenfolio's site has essentially dropped off the face of the net; I think their "clever" HTML is concealing the images from Google as a side-effect. One of their team's responses suggests they are serving up different content to known spiders; my suspicion is that either this isn't working properly, or that Googlebot is getting suspicious about being fed bot-specific content, which is usually a bad sign about a site.
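To make the distinction concrete, here is a sketch of the difference described (the file names are made up); an image crawler can only index a photo it actually sees referenced in the markup:

    <!-- Indexable: the crawler sees a real IMG reference with alt text. -->
    <img src="/photos/sunset.jpg" alt="Sunset over the bay">

    <!-- Effectively invisible to an image crawler: the IMG tag points at a
         blank placeholder and the real photo is painted in via CSS. -->
    <img src="/null.gif" style="background-image: url('/photos/sunset.jpg');">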
In Webmaster Tools I see that Google is using cached pages with scripts/ads that are no longer there, and my site is rated as very slow. But when I checked with a 3rd-party online tool that measured YSlow and Page Speed, I got a B and an A grade respectively.
It's a bit frustrating to be rated on non-existent content, even though Google's bots go through my site daily.
YSlow is often wrong, or else it is an advertising service; with imprecise tests, good sites will drop out of the top. Do you build a site for users, with photos, design, and style, or do you make a dummy index.html that loads in 0.2s? Perhaps this should only affect trust; otherwise webmasters will stop including pictures and graphics, and where will Google take its images from then? My website http://dominicantoday.info loaded in 10.7s in the YSlow test, and all the others in no more than 4-6s.
Yes. You can check Page Speed and Zoompf. For page speed, you can use the Google Toolbar measurements as well.
I really can't stand your redirection script, tool, or whatever it is. I am a fluent English speaker, and if I type www.google.com I expect to go to the English version, not the Spanish version from Mexico. You shouldn't redirect our requests; we know we can go to www.google.com.mx if we want (not the case here) to see the page in Spanish. I think you do this to reach a bigger number of people, but it is annoying. I hate that trend, because it is like telling us what to see, instead of letting us choose what we want to see on the internet.
Thanks a pant-load, Google! Now I am constantly battling clients to keep them from crippling the quality and depth of their content in order to follow Google's new dictum: speed wins.
I think it is very irresponsible of such a large company to put people into a frenzy over the perception that speed is king. These poor souls are the ones who are not technical enough to understand what you are trying to tell them: that speed is one of over 200 signals used to gauge a page's relevancy.
All they hear is: Speed wins. If you are not faster than everyone, you lose.
You have just made making a well thought-out and designed website needlessly more difficult.
Here's a thought: instead of rewarding the speed demons, why not simply punish the heinous offenders? That way, there will be a nice, wide-open playing field where I would be happy to have my websites...
Thank you for informing the community about this.
It's another good reason to use an extremely fast compiled programming language (such as C# in ASP.NET MVC), and to avoid scripting languages like PHP, Python, Ruby, VBScript, etc., when developing new web applications.
If I choose to have my homepage available only via HTTPS, will I now be penalized because of the increase in page load time (due to the key exchange)?
Is page speed going to affect the SERPs as well?
Well, that is a good question. But it all boils down to building or rebuilding a site to be search-engine friendly. In time, I think we will see the SERPs leave out sites that take too long to load.
Going back to this issue: as others have said, it would be great if there were a bit more transparency, i.e. a signal/flag to indicate whether speed is having an effect (or not).
E.g., one of my system's customers has a page showing their 50 newest sets of photos (they're an image library). They could reduce this to, say, 25 to increase load speed, but then they would be losing some functionality: presenting new products to paying customers. You need to find a balance.
What is the cutoff for speed issues?
I dislike it! Analytics hangs and is very annoying; it's the dumbest thing that Google ever developed, built without thinking that users who don't have fast broadband, CPU, RAM, etc. have to struggle with ridiculously slow load times on their end. Same problem with image search!
I think Google Analytics' code plays the most important role in site speed for web search ranking.
Anybody have any idea when this will be rolled out to all sites and countries (especially the UK)? Is it going to be based on page load speed in KB per second, or on total load time (which would be silly)? And will it also be based on the page score from the Firefox Page Speed tool Google recommends so much?
I hope it's soon. I just got a new, faster server and have a page score of 99/100, so I'm hoping for an ROI now that I've had to fork out a small fortune.
Nice post.
I think my rankings are falling, perhaps due to the site speed factor as well.
Thanks...
I disagree that the user gets a better experience from faster load times.
This is not The Need for Speed!
There should be no penalty for sites that take longer to load. (How do you know, in the first place, that it is not a site with quality information?)
That algorithm does not make sense.
What about services that are genuinely quality services, that do business from A to Z, where you would want clear keyword SEO for thousands of terms when a general term is adequate? For instance, when I call a plumber for a service I still need to ask questions anyway (so much for being keyword-specific). A plumber is a plumber. Just show the listings of all of them in the local area as an honest service for your customers, Google!
In the carpet field, maybe I am looking for a carpet repairman, but any quality installer can make the repair. You are cutting me, the consumer, off from my local businesses, because most labourers are excellent at their service but not good at web design and quality writing.
Google has carried this too far!
If it is only information that consumers seek and not services, then that may be a different story.
But semantics is semantics.
The race makes the competitive market. It's all about supply and demand and unfortunately money comes to mind in the equation........
So a crappy plumber who is devious and very knowledgeable about SEO, and the other areas Google uses its algorithms for, can use his black-hat or grey-hat techniques to fool his customers into thinking he might be number 1 (or at least Google might indicate that in its rankings).
I personally am in the services arena and rank very well, being on the first page, but it's a constant battle, and it always takes me away from my focus when all this technology gets in the way.
I also find that SEO companies for hire are spamming on the telephone and online to sell their services.
There are so many of these fly-by-night companies that it's easy pickings for them.
Who created this demand?
I think this brings out the wolves, the hyenas, and the scavengers.
It brings the worst out in people.
It would be nice if we blended old with new, because traditional methods can still help people who are not able to adopt technology as "fast" as Google would like.
How often are PageRanks updated? I have a 0 (zero) PageRank, and that doesn't make sense. I've followed all the guidelines and utilized all available tools to help me optimize. I am wondering when my site will be reconsidered: http://pulpnews.com. Is there some type of process I must go through? I would simply like to know where I rank. Thank you.
Thanks for this valuable information. I think page loading speed will become a REALLY IMPORTANT factor. Think about the impending explosion of mobile web requests from smartphones. If the speed is not acceptable, people won't visit the site again => rankings drop sooner or later.
At JoomlArt.com we are testing the new Google Page Speed module for Apache and have found that our site is already very well optimized. You can learn more about our findings here:
http://www.joomlart.com/blog/news-updates/google-page-speed-for-joomla-drupal-or-magento-will-it-work
Matt Cutts says SEO = user experience, which for me is in large part load speed.
We have a website that was adversely affected by (3rd-party) technical issues (now resolved) and a botnet attack (ongoing, but under control, with 1,750 IP addresses blacklisted), which caused it to run slowly intermittently for a couple of weeks.
Now the Google ranking has dropped, and I don't know if the slow speed seen by Google is the cause (it seems likely, as little else has changed while we've been busy fighting the above issues, and the crawl stats are not good). I note also that one of the pages it crawls regularly was adversely affected by the botnet attack.
It would be good to know how long the penalty is applied once speed issues are resolved, so I can tell the boss.
The botnet attack is clearly beyond our control, so our Google rank is potentially adversely affected by a rogue operator, something I know Google are keen to avoid, since you don't want to create an incentive to DDoS websites.
Is it possible, or is it already done(?), for this penalty to be indicated in Webmaster Tools? It seems to me there is no great competitive secret in revealing whether someone's website is too slow by Google's own criteria.
I suspect the botnet attack might also have earned us some spam penalties, as we are busy deleting the rubbish they left behind and changing, then re-changing, the system to discourage them (captchas don't always work, it seems).
It has been a rough couple of weeks; don't punish me for it!
What is a good speed? I have been playing about with caching and such to reduce load time and have made some improvements, but the bigger pages still take a long time to open.
One problem seems to be offsite resources. These are loaded async, so do they pose no problem? Is the key the time to start render (webpagespeedtest.org) or the total page load time?
Cheers
Jon.
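For what it's worth, a crude way to measure total page load time yourself is to timestamp the top of the page and compare at window.onload (a sketch; toolbar-based measurements presumably work along broadly similar lines, though Google hasn't published the details):

    <script type="text/javascript">
      // Record a timestamp as early in the page as possible.
      var pageStart = new Date().getTime();
      window.onload = function () {
        var loadMs = new Date().getTime() - pageStart;
        // Here we just log it; a real setup might beacon it back to the server.
        if (window.console) console.log('Page loaded in ' + loadMs + ' ms');
      };
    </script>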
Does Google care about the speed of the whole/complete page (content + ads), or is it OK to have the content load faster than the ads? For example, if the page loads the content in under 1 second, but it takes around 2-3 seconds for the ads to finish loading the complete page, which would Google consider?
How much of a speed difference affects the rankings?
That is not disclosed by Google, but it is one of a handful of important ranking factors. Resdays: look at this site for speed. What is a good load time percentage?
I am not exactly sure if this is the right place to ask, but I'll give it a try anyway:
On our page we offer a download. Similar to SourceForge, we redirect the user to the download URL via a meta refresh, so the download starts and the user still remains on our page without having to make an extra click.
Webmaster Tools now says that this particular page takes ~14s to load, which might affect our ranking...
Any ideas on how to avoid this?
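For context, the pattern described is roughly this (a sketch; /files/setup.zip and the 2-second delay are placeholders), along with a possible alternative that lets the page finish loading first (untested against Google's measurements):

    <!-- The meta refresh approach: the download starts after the delay. -->
    <meta http-equiv="refresh" content="2; url=/files/setup.zip">

    <!-- A possible alternative: kick off the download only after the
         page itself has fully loaded. -->
    <script type="text/javascript">
      window.onload = function () {
        setTimeout(function () {
          window.location.href = '/files/setup.zip';
        }, 2000);
      };
    </script>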
If the redirect to the download takes a long time, it is not your problem but a problem with the server hosting the links. But don't make many links from your home site; make links only from small sites, and find other link providers.
Regards
How can we make the business case to get resources approved to speed up our sites?
And just how much should site owners invest in speeding up their sites?
Generic stats, however convincing, don't speak in dollars to decision makers.
Use Google Analytics to do that. We have developed a plugin which we are sharing; see "If your Site's slow it's costing you Insight & Revenue: Google Analytics Solution, Reports, Code".
Brian Katz - Analytics - VKI
Hello. You can copy the Analytics code and put it in an external JS file; that will speed up your page.
Thanks
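For reference, Google's asynchronous Analytics snippet (introduced in late 2009) already loads ga.js without blocking the page; a sketch with a placeholder account ID:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder account ID
      _gaq.push(['_trackPageview']);
      (function () {
        // Create the ga.js script element with async=true so it does not
        // block rendering, and insert it before the first script on the page.
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol
                  ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>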
Want to know your site speed relative to the percentage of slower websites? Take a look at this graph of site performance.
With 3G and true broadband becoming more and more common, the speed at which a site loads should not be much of a concern to most users.
Also, the fact remains that it is only one of the 200-odd factors affecting the ranking of a website.
Really? Out of the 200 SEO factors, you need only a few, like site speed, site size, links, ranking, density, conversion rates, and...?
Willi