Nov 15 2007
I’m always considering new ways of ranking the Top 100 Australian blogs. I’ve been updating the list for seven months now, and a fair amount of time goes into maintaining it each fortnight. I’d like the list to have as much integrity and credibility, and to be as inclusive, as possible.
Index = (3 x AU + X + T) / 5
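A minimal sketch of this index in Python. The symbol meanings are my reading of the formula (AU = Alexa Australian rank, X = Alexa global rank, T = Technorati rank, with lower ranks being better), so treat the variable names as assumptions:

```python
def blog_index(alexa_au, alexa_global, technorati):
    """Weighted index per the formula above: (3 x AU + X + T) / 5.
    Symbol meanings (AU = Alexa AU rank, X = Alexa global rank,
    T = Technorati rank) are my assumption; lower is better."""
    return (3 * alexa_au + alexa_global + technorati) / 5

# Hypothetical ranks for illustration
print(blog_index(1000, 5000, 2000))  # → 2000.0
```

The triple weighting on the Alexa AU rank reflects the list's aim of finding blogs that are popular amongst Australians specifically.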
Estimating Traffic for the Top 100 Australian Blogs List
Following on from my comparison of Alexa and Compete, these are some of my concerns:
- The favouritism shown by Alexa to blogs of a technical nature
- Blogs on shared domains (excluding WordPress.com and Blogger blogs, which do provide an Alexa rank) or part of a larger domain – perhaps not being accurately ranked
- Is Alexa even credible/reliable enough to factor into the formula? (Bearing in mind that the aim of the list is to find blogs which are popular amongst Australians – which is why the Alexa AU rank is weighted)
- Blogs that have many subscribers, but do not fare well on Alexa, are often ranked poorly
- Technorati Rank for blogs which release plugins or themes being (unfairly) lowered
How is blog success, or popularity measured?
As Shai points out:
But, I really, really, really believe that a blog’s success is not just measured by Technorati authorities, Alexa rankings, Google PageRanks, pageviews, unique visitors, and comments. And yes, not even Awards. 😉
I certainly agree, but in formulating a list you HAVE to use quantitative measures. And obviously, the more automated it is, the easier my job is each fortnight.
Ratified use a combination of Technorati (authority), Google PageRank, and number of subscribers in formulating their top list. The problems that I have with this methodology are:
- that it’s not inclusive. If you don’t use FeedBurner, or activate your Awareness API, then your ranking is severely affected,
- that it weights Google PageRank quite heavily, which, given recent controversy, is probably not the accurate measure of “authority” that it once was – particularly amongst blogs.
I also read a very interesting presentation by Avinash Kaushik (a Web Analytics guru) that he gave recently at BlogWorld (presentation available at EightBlack). Avinash talks about blog measurement and showcases the growth of his blog Occam’s Razor over its history. He also covers measuring blog success on his blog.
While he talks about measures such as visitors, subscriber growth and citations (i.e. Technorati authority), he also talks about “conversion” on blogs. Through a plugin called GeneralStats, it’s possible to see how many posts have been written, the number of words in the posts, the number of comments, and the number of words in the comments (if you scroll down in my sidebar, you’ll see I’ve added these at the very bottom).
From these statistics you can determine your rate of conversion by this formula:
# visitor comments / # posts
(Note this is a little hard to compute accurately, because you have to subtract your own comments from the total.)
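The conversion formula above can be sketched as a short function. The inputs (total comments, the author's own comments, and post count) would come from something like the GeneralStats figures mentioned earlier; the function name and signature are my own illustration:

```python
def conversion_rate(total_comments, own_comments, num_posts):
    """Visitor comments per post, per the formula above.
    The author's own comments are subtracted from the total first."""
    visitor_comments = total_comments - own_comments
    return visitor_comments / num_posts

# Hypothetical figures: 500 comments, 120 by the author, 100 posts
print(conversion_rate(500, 120, 100))  # → 3.8
```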
This is all good and well for measuring your own “success”, but conversion is not something that can easily be measured externally.
It’s interesting to note that in July, Avinash formulated a list of the top 10 Web Analytics blogs using feed subscribers (E4) and Technorati rank (F4) as the variables. The formula applied (devised by Kevin Hillstrom and adapted by Avinash) is somewhat more complicated than mine 😉
Kevin’s initial formula uses Alexa and Technorati. But I’m digressing.
So, excluding variables that are too hard to ascertain (number of posts, comments etc.), some variables that have been suggested for inclusion in the formula are:
- Google PageRank
- Age of a domain
- Back links from Google and/or Yahoo
- Technorati ranking
- Alexa ranking
- Feed subscribers
As mentioned, feed subscriber counts are not available for every blog. If this figure were included but wasn’t publicly available, it would have to be offered by contenders for the list (on the honour system, or via some proof?).
Alternatively, I could use Bloglines and/or Google subscriber counts. Obviously this has dangers: some blogs report that Google only counts a fraction of their subscribers, with estimates ranging from 10% up to 40% of the true count.
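If Google's count really is only a fraction of the true figure, one could back out a rough estimate by dividing by an assumed counted fraction. The fraction itself is a guess per blog (the 10–40% range above), so this is only illustrative:

```python
def estimated_subscribers(google_count, counted_fraction):
    """Back out a rough true subscriber count, assuming Google
    only reports counted_fraction (e.g. 0.1 to 0.4) of subscribers.
    The fraction is an assumption, not something Google publishes."""
    return google_count / counted_fraction

# If Google shows 50 subscribers and we assume it counts ~25% of them:
print(estimated_subscribers(50, 0.25))  # → 200.0
```

The wide range of reported fractions is exactly why this figure is risky to use in a ranking formula.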
Over to You
Given the concerns that I’ve outlined, and keeping the updates as automated as possible (it’s a labour of love, remember 😉 ), here’s your chance to have a say.
- What do you think makes a blog “successful”?
- What variables do you think should be considered in measuring a “top blog”?
- What formula would you apply?
- What are your thoughts about the credibility of such lists, and how can they be more credible?
I look forward to hearing your feedback.