Transparency in reporting of major social network performance recently received a shot in the arm courtesy of WatchMouse. Using their Public Status Pages, WatchMouse now tracks 20 giant social networks’ uptime and other performance metrics at Social.DownorNot.com.
Specifically, users can view the performance speed and uptime of home pages, login pages, and APIs from Classmates, Del.icio.us, Digg, Facebook, Flickr, Foursquare, Friendster, Gowalla, Hi5, Hyves, LinkedIn, MySpace, Netlog, Orkut, Stumbleupon, Twitter, Xanga, Xing, Yelp, and YouTube. If, during an automated check, a site returns an error or takes longer than 8 seconds to respond, it is marked as an error and counted as unavailable. The uptime percentage is calculated from the number of errors reported by these checks. read more
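The check described above can be sketched in a few lines of Python. This is a minimal illustration based only on the description (an 8-second cutoff, errors counted against uptime); the URLs and the exact success criteria are assumptions, not WatchMouse's actual implementation.

```python
import urllib.request

TIMEOUT = 8  # seconds; anything slower counts as an error per the article

def check(url, timeout=TIMEOUT):
    """Return True if the site responds successfully within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except Exception:
        # Timeouts, connection failures, and HTTP errors all count as "down"
        return False

def uptime_percentage(results):
    """Uptime = share of checks that did not return an error."""
    return 100.0 * sum(results) / len(results)

# e.g. 96 successful checks out of 100 gives 96.0% uptime
print(uptime_percentage([True] * 96 + [False] * 4))
```

Run repeatedly on a schedule, the accumulated True/False results yield the uptime figure the status pages display.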
When I set up my first Web site in 1995, Web counters were the big thing. Virtually every site had one of those (rather pointless) rolling counters at the bottom that tracked how many “hits” the page got. We were, at that point, obsessed with the idea that our pages were being read and couldn’t have cared less by whom. The whole idea of international publishing was still new and exciting.
Later, counters evolved, the term “hits” became meaningless, and we focused on “visitors” or “users”. A variety of new trackers, most with their own buttons, began to pop up and slowly replaced the hit counter as the new metric to watch.
However, as the millennium rolled over and the first tech bubble burst, we saw even more advanced metrics rise out of the ashes. Attention became the most valuable thing to track, especially in an AJAX Web where page views and visitors would be almost meaningless. It was no longer a matter of just how many people visited, but how long they stayed and what they did.
Now we’ve moved forward again, and this time it’s “engagement” that we’re looking at. Services such as PostRank allow you to track comments, tweets, and links to your site as part of your “Engagement Score”, combining that info with your other, more traditional data.
But with so many metrics to track, there’s a legitimate question about which stats are the most important for a blogger to follow. The answer is simple: all of them and none of them. read more
Three weeks ago I signed up my blog for a beta service by Tynt called Tracer in an attempt to both test the service and get a better understanding of how people are using my content.
The information provided by Tracer is only aggregate in nature; there is no information about what an individual user did with your content. Tracer also does nothing to prevent copying, so it is not a DRM solution. All Tracer does is analyze how users interact with your content and which pages are the most “active”.
To do that, Tracer follows four metrics: page views, selections (when someone highlights text or other objects), copies (actually copying the work), and generated traffic (clicks on links generated by Tracer).
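Aggregating those four metrics per page can be sketched as a simple event tally. The event names and data layout below are hypothetical; Tracer's internal data model isn't public, so this only illustrates how aggregate "activity" could be derived from the four counts.

```python
from collections import Counter

# Hypothetical event log: (page, metric) pairs for the four Tracer metrics
events = [
    ("page/a", "view"), ("page/a", "selection"), ("page/a", "copy"),
    ("page/b", "view"), ("page/b", "view"), ("page/a", "traffic"),
]

# Tally each metric per page
totals = {}
for page, metric in events:
    totals.setdefault(page, Counter())[metric] += 1

# Rank pages by overall activity (total events of any kind)
most_active = sorted(totals, key=lambda p: sum(totals[p].values()),
                     reverse=True)
print(most_active[0])  # the page with the most combined activity
```

Only the aggregate counters survive; no per-user record is kept, which matches the aggregate-only nature of the reports described above.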
After over three weeks of running the service, I’ve gotten some pretty good data on my site and the results more than surprised me. Here is what I learned. read more
Gregory Go of the About.com Guide to Online Business made it clear to the crowded room how the numbers drive payment and success when it comes to paying a blogger. “If you are looking to make money blogging for a company or blog network, you have to understand the metrics.”
Gregory listed three key Web analytics metrics that should be used to set a price for paying a blogger.
Consistency – Word Count Metric: Number of posts per week or month published with a minimum word count per post.
Internal Metrics: Numbers based upon direct interaction and actions such as comment count, feed or newsletter subscribers, and direct sales generated.
External Metrics: Performance compared to the general Internet/blogosphere metrics. This includes page view counts and referrer or inbound links.
While few pay solely based upon one of these three metrics, most blogs and blog networks compensate bloggers based upon a combination of these numbers. read more
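One way to picture such a combination is a simple weighted blend of the three categories. The weights and dollar figures below are entirely hypothetical, invented for illustration; every blog network sets its own mix.

```python
def monthly_rate(posts, comments, subscribers, page_views,
                 per_post=10.0, per_comment=0.05,
                 per_subscriber=0.02, per_thousand_views=0.50):
    """Blend consistency (posts), internal (comments, subscribers),
    and external (page views) metrics into one monthly figure."""
    return (posts * per_post                       # consistency
            + comments * per_comment               # internal
            + subscribers * per_subscriber         # internal
            + page_views / 1000 * per_thousand_views)  # external

# A blogger with 12 posts, 400 comments, 2,000 subscribers,
# and 50,000 page views in a month:
print(round(monthly_rate(posts=12, comments=400,
                         subscribers=2000, page_views=50000), 2))
```

The point is not the specific numbers but the structure: each category contributes, so a blogger strong in only one area still earns something, while no single metric dominates.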
Last Friday (August 15th, 2008), I took a snapshot of the Internet’s top blogs. This freeze frame identifies the blogs that have developed the skills necessary to compete. Unlike traditional top blog lists, I did not seek to place blogs in order of perceived importance. Instead, I combined public lists of top blogs ordered by the number of inbound links (Technorati), the number of community subscriptions (Bloglines), the ability to start and follow trends (BlogPulse), and the ability to thrive in foreign markets (Wikio). I then weighed each individual blog against its all-encompassing Internet performance using SEOmoz’s Trifecta Tool. The result is a list of blogs that have proven to be powerful in all aspects of Internet success.
If you’re a tech blogger, there’s a good chance your ranking has gone up. By the way, in the Alexa world, that is NOT a good thing.
The traffic-data site is no longer relying solely on its toolbar to rank sites. Instead, it will be aggregating data from “multiple sources.” Pretty cryptic, eh?
The reality is, whether you put stock in these numbers or not, they do affect the way your blog or Website does business. Since Alexa has been on the scene for over 10 years, many people take what it says seriously.
You will also notice that historical data is now available only for the previous nine months. This window will be expanded over the next few months as the site recalculates. They also promise new features will be rolled out in the near future.
I noticed about a 30k drop (that’s good!) on my blog.
How has the recalculation affected your corner of the Web?