Comparing blog visit statistics

By Ken Perrott 26/01/2011

Every month I post a ranking of New Zealand blog sites which make their visit statistics available. Being basically lazy I do this automatically using a Google spreadsheet to import data from the appropriate site meter. The 7 day average visits are automatically updated in the NZ Blog Ranking web page. And the monthly ranking here.

Incidentally, I notice that these ranking tables are often being used by people wishing to browse New Zealand blogs. So it is worth bloggers including their blogs in the rankings just from the point of view of linking and traffic. Of course, it also gives bloggers some idea of how their visit numbers compare with other similar blogs.

There are a range of methods for obtaining blog visit stats, and they don’t necessarily measure the same thing. There is often confusion between visits by individuals and the total number of page views. (An individual may view several pages during a single visit.)

Reported blog rankings are reliable

Fifteen months ago I compared the data reported by commonly used site meters: specifically, Statcounter, Sitemeter, Shiny Stat and Go Stats. In the graphs below I report the resulting data normalised as a ratio of the data for one meter, Statcounter. (The data points are the visit numbers (or page views) for the specific meter divided by the visit numbers (or page views) for Statcounter.)
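That normalisation can be sketched in a few lines of Python. The visit counts below are made up for illustration only; they are not the actual data behind the graphs.

```python
# Hypothetical counts for one month (NOT the post's real data).
statcounter = {"visits": 1200, "page_views": 1900}
sitemeter = {"visits": 1150, "page_views": 1940}

def ratio(meter, baseline, key):
    """Normalise one meter's count as a ratio of the baseline meter's count."""
    return meter[key] / baseline[key]

# Sitemeter visits relative to Statcounter visits.
print(round(ratio(sitemeter, statcounter, "visits"), 2))      # 0.96
# Sitemeter page views relative to Statcounter page views.
print(round(ratio(sitemeter, statcounter, "page_views"), 2))  # 1.02
```

A ratio near 1.0 means the two meters are reporting essentially the same traffic.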

Obviously the ratios vary over time and between meters. The Sitemeter and Statcounter data are the most similar (average visit ratio of 0.96 and page view ratio of 1.02). Fortunately most bloggers use one of these two meters. Go Stats visit numbers and Shiny Stat page views are a little low. I have yet to find a NZ blog using Go Stats, and only a few use Shiny Stat.

This gives me some confidence that the reported rankings are realistic.

However, one problem I have identified is bloggers who use an entirely different method for obtaining their stats and then draw false conclusions when they compare their data with the reported rankings. The old problem of comparing apples and oranges.

A common problem arises where a blog is part of a wider platform. An example is the individual blogs on the NZ SciBlogs platform. SciBlogs itself ranks very high (often number 5), but the individual bloggers (about 30 currently) rank much lower, at least for those which include site meters on their separate blogs. For example, SciBlogs is currently ranked at No. 5, whereas my syndicated blog on the platform (Open Parachute @ Sciblogs) is ranked at No. 36.

It’s worth checking which site meter is being used.

Comparing apples and oranges can be delusional

Another problem arose recently when a local blogger claimed very high traffic, which would have ranked the blog at number 4. Despite this, the blog was not even showing on Alexa or Technorati, indicating much lower traffic. My enquiries revealed that he was basing his claim on data extracted from his ISP’s server log files. There are programmes, such as Webalizer, which can do this. They may be useful for analysing trends for a single blog, but they should never be used to compare data between blogs, especially when those blogs use other site meters.

[Example Webalizer chart (image via Wikipedia)]

Server log file data is not easy to interpret. It requires intelligent use to avoid over-counting because of the way a page or visit may be interpreted. For example, if one is not careful it can count each separate image file on a page as a visit. (See the Simpleton’s Guide to Web Server Analysis.)
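To see why naive log counting inflates the numbers, here is a rough Python sketch. The log lines and filtering rules are hypothetical (Apache-style combined format, made up for this example); real log analysers use far more sophisticated heuristics.

```python
import re

# Made-up Apache-style log lines: one real page view, one image request
# belonging to that same page view, and one crawler hit.
log_lines = [
    '1.2.3.4 - - [26/Jan/2011:10:00:00 +1300] "GET /post/stats HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [26/Jan/2011:10:00:01 +1300] "GET /images/chart.png HTTP/1.1" 200 2048 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [26/Jan/2011:10:00:05 +1300] "GET /post/stats HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
]

ASSET = re.compile(r"\.(png|jpe?g|gif|css|js|ico)(\?|$)", re.IGNORECASE)
BOT = re.compile(r"bot|crawler|spider", re.IGNORECASE)

def count_page_views(lines):
    """Count only page requests from apparent humans, not assets or robots."""
    views = 0
    for line in lines:
        m = re.search(r'"GET ([^ ]+) HTTP', line)
        if not m:
            continue
        path = m.group(1)
        if ASSET.search(path):
            continue  # skip images, stylesheets and scripts
        if BOT.search(line):
            continue  # skip known crawlers
        views += 1
    return views

print(len(log_lines), "raw hits")           # 3 raw hits
print(count_page_views(log_lines), "view")  # 1 view
```

Counting raw hits gives 3; filtering assets and robots leaves 1 genuine page view, which is the kind of gap Wikipedia describes below.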

Wikipedia also describes problems with the results from Webalizer (and this should apply to similar programmes):

“Generated statistics do not differentiate between human visitors and robots. As a result all reported metrics are higher than those due to people alone. Many webmasters claim that webalizer produces highly unrealistic figures of visits, which are sometimes 200 to 900% higher than the data produced by javascript based web statistics such as Google Analytics or StatCounter.”

So, this blogger could analyse his data for trends, page popularity, etc., but he was wrong to compare it with the figures obtained by other bloggers using site meters.

I advised him to install a site meter. But he may possibly be happier to keep the warm glow his server log files give him.
