Category Archives: Metrics

I encourage skeptics on the Internet to measure what they do, to discover what is effective and what is not. I sometimes give examples of how to do this, or where I’ve measured my own efforts.

My Year in Review: 2014

It’s time once again for my year-in-review wrap-up post. I encourage skeptics to measure what they do, so this is part of my effort to practice what I preach.

Something different for this year: I’m going to include some non-skeptical, personal items because – what the heck. I’ve spent a bunch of effort losing some weight and making other improvements in my life. In the spirit of the quantified self, why not look back on that too?

So here we go…

Continue reading

My Skeptical 2013 in Review

I suddenly realized I hadn’t done a year-end wrap-up post here at Skeptools. I’m always talking about how skeptics need to measure what we are doing, so I should practice what I preach.

So below you will find the top viewed content in my various channels this past year and other year-end stats.  As always there’s something to be learned here.

For instance, it is often said that controversy is a good way to court online hits. The biggest controversy I jumped into during 2013 was reviewing the infamous “block bot” – and yes, that did get some hits for both this blog and Virtual Skeptics. But the top post on this blog in 2013 – leading by a factor of four – was not that post; it was a how-to right at the core of this blog’s mission of helping skeptic activists online.

There are definitely other interesting surprises, so be sure to scroll down to the end. For instance, by far the most-viewed online item of mine in 2013 was a long-available news video that I merely tweaked, captioned, and posted to my YouTube channel. And I learned that joining Storify in 2013 was a good idea, as one story I created there got more views than most of my blog posts do.

Continue reading

Measuring the growth in local skeptic groups

I’ve written a number of times about finding ways to measure skepticism and skeptic activity. How can we know whether skepticism is having an effect if we don’t measure what we do?

But polls and surveys can be tedious and expensive! So grass-roots skeptics often need to look for more ad-hoc ways to measure things. Fortunately many such opportunities present themselves online, some of which I wrote about at the JREF blog back in 2011.

Simply knowing how many skeptics there are is one useful metric, but there is no single skeptic membership organization that could conduct such a count. We’re spread across hundreds of local groups and affiliated with many different national organizations.

I could attempt a survey of local groups, but that would be a time-consuming process – even more so than my ongoing census of skeptic podcasts. But what if there were a place online that kept track of local skeptic groups? That started me thinking about…

Continue reading

Please help update my skeptic podcast census

Back in May 2011 I attempted to measure the amount of skeptic podcasting being produced and generated some interesting statistics. In April 2012 I ran an update post attempting an overall census of what podcasts were out there, and found I had missed a bunch the first time through. This post is an attempt at updating the data once again.

It seems there was some untapped demand for a good catalog of skeptic podcasts – that 2012 post is the number 3 most popular single post on this blog for the last year! Podcasters have told me they see new listeners coming to their sites regularly, referred from there. Noticing that phenomenon, Shane P. Brady took the data I generated there and turned it into an interactive catalog called Skeptunes. I encourage you to give it a look.

One of the things I noticed in the 2012 survey was that the number of different skeptic podcast titles seemed to be leveling off around 100, and it looked like there was a slight trend downward. It’s always dangerous to extrapolate from real-world measurements like this, so I didn’t draw much of a conclusion. So now that more than a year has gone by, I think it’s time to measure again and see what is actually happening.
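For those who want to check a trend like this in their own numbers, a least-squares line fit is the standard quick test. Here’s a minimal sketch – the yearly counts below are invented placeholders, not my actual census data:

```python
# Fit a least-squares line to yearly podcast counts to estimate the trend.
# These totals are made up for illustration only.
years = [2011, 2012, 2013]
counts = [95, 102, 98]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(counts) / n

# slope = covariance(x, y) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, counts)) / \
        sum((x - mean_x) ** 2 for x in years)

print(f"trend: {slope:+.1f} titles per year")
```

With only a handful of data points the slope is very noisy, which is exactly why I hesitate to extrapolate from it.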

I’ve started with my data from last time, and looked in Skeptunes, SkepticsOnThe.Net and of course iTunes for any new ones I missed. But I’m sure I still must have missed something. Check out the list in the rest of this post and let me know which ones I’m missing.

Continue reading

Data-Driven Skepticism

Since the beginning of this blog, we’ve been talking about ways to re-use and mash up data that already exists online. This is the core of what the programmable web is about, and there are many potential data sources to use. Figuring out ways to use them that advance skepticism and critical thinking is the key.

Among the others who noticed the utility of re-using existing data this way were journalists. This is because, at the same time these fantastic web APIs and tools became available, governments and other public institutions moved to open up many of their massive public-domain databases for use by the public. When these datasets contain information that might bear on policy issues and decisions, they are potential gold mines for journalists.

This has kicked off a trend called data-driven journalism. Simply put, it is journalists using data mining and other data analysis techniques in order to find the basis for stories. I think skeptics could learn from the techniques of data-driven journalism, and use them for our purposes too. Indeed, I’ve done some very small experiments in that direction in my metrics articles.
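The core move in data-driven work is simple: load a public dataset, aggregate it, and look for concentrations that might point to a story. Here’s a minimal sketch of that idea in Python – the complaint records below are entirely invented for illustration, not from any real database:

```python
import csv
import io
from collections import Counter

# A hypothetical excerpt from a public consumer-complaints dataset
# (all rows invented for illustration).
raw = """date,category,product
2012-01-03,health fraud,Miracle Detox Tea
2012-01-09,health fraud,Power Bracelet
2012-02-14,health fraud,Miracle Detox Tea
2012-03-01,advertising,Power Bracelet
2012-03-20,health fraud,Miracle Detox Tea
"""

# The basic data-driven move: aggregate the records and see which
# products attract the most complaints.
counts = Counter(row["product"] for row in csv.DictReader(io.StringIO(raw)))

for product, n in counts.most_common():
    print(f"{product}: {n} complaints")
```

Real datasets need far more cleaning and caution than this, but the aggregate-then-investigate pattern is the same at any scale.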

Beware: it’s not the easiest thing in the world to get right. There are definitely many ways you can be tripped up if you aren’t careful. But I think if you are careful there are some interesting techniques here that will be helpful to skeptics.

So let’s explore what it would mean to do data-driven skepticism.

Continue reading

Finding targets for skeptical analysis via RBUTR

One of the interesting side-effects of the anti-misinformation tools I wrote about on Sunday may be better availability of metrics about what misinformation is actually making the rounds.  That could be very useful for skeptics.

I often wonder whether skeptics are staying focused on the right topics. Skeptics are reactive. We often find ourselves responding to news articles, social media trends and other ephemera needing critical analysis. While this is necessary, there is always the danger that we might be distracted from other topics that need us. Those neglected topics could affect just as many people but are not getting media attention. This is why I often talk about the long tail and focusing on a niche, because the more skeptics who do that, the better overall topic coverage we can get.

I was reminded of this while listening to this week’s Skeptics’ Guide to the Universe podcast, in which host Steven Novella pointed out that although the pseudoscience of neuromuscular dentistry has existed for half a century, there is “very little written about it, skeptically.” I’ve also seen evidence of this when responding to earnest requests for information on the James Randi Educational Foundation’s forum.  Requests occasionally arrive there for a skeptical analysis on some product that has been around for quite some time, and yet nothing appropriately critical about it can be found online.

Let me give you a quick example of how information generated by one of those new tools might help us see whether the focus problem exists and solve it at the same time.

Continue reading

How many skeptic podcasts are there? Please help me find them all

UPDATE in October 2013: Work is underway to update this census, please go to this newer post.

Last May I attempted to measure the amount of skeptic podcasting being produced. We learned a number of interesting things in that post, one of which was that there are over two hours of new skeptic podcast material being produced every day.

It is almost a year later, and we are coming up on the end of the seventh year of skeptical podcasting. It seems appropriate to revisit the census and see where the current numbers are. Unfortunately, there’s no one place to go to find all the skeptical podcasts. Last year I had to create my list by hand, and I definitely missed several.

This year I have that list to start with, plus some known announcements of new podcasts. But I’m sure I must have missed a few. Check out the list in the rest of this post and let me know which ones I’m missing.

Continue reading

My Top Posts of 2011: A Lesson Learned Again

This is the time of year you see many “year in review” posts. It’s good to take a moment and look back at what you’ve done.

In the spirit of this blog, I’ve got a year in review post coming later that will be a how-to about measuring your own skeptical contribution for the year. But for this one I just thought I’d look at what my top posts in each venue (blog, social media and so on) were and how much traffic or attention they got.

As I’ve pointed out before, raw traffic levels are often a misused indicator. Traffic can surge for a variety of reasons that have nothing to do with the quality or importance of the post itself, so it can be dangerous to let yourself be guided entirely by “ratings” as it were.

That having been said, the traffic levels may have a lesson to teach us (albeit one that we have seen before on this blog). Read on.

Continue reading

How is The WOT Project doing after six weeks?

Back in March I blogged about Web of Trust and how it could be a powerful tool for skepticism. Web of Trust is a crowdsourced web site safety rating system that can warn unsuspecting internet users (and now, everyone on Facebook) when they are about to visit a site that contains scams, malware or other potential dangers. I suggested that by rating sites selling products based on superstition or pseudoscience, skeptics could turn WOT into a tool for skeptical outreach. Indeed, many skeptic targets such as PowerBalance already sport negative WOT ratings.

In June Canadian skeptic Erik Davis launched a site called The WOT Project. His focus is the opposite side of the equation: protecting the WOT ratings of skeptic sites. Each week the WOT Project posts a list of skeptic sites and encourages participating skeptics to give them a good rating in WOT. The sixth such set was published on Monday.

Since WOT has an API, the ratings can be measured over time. Since my two most recent blog posts were about measuring skeptic outreach on Wikipedia, I thought it would be appropriate to do this for WOT as well. So let’s see how well the WOT Project has done in its efforts to protect skeptic web sites on WOT.
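To give a flavor of what polling the API looks like, here is a rough sketch in Python. The endpoint, key parameter, and response shape are my assumptions based on WOT’s public API documentation, and the sample response values are invented:

```python
import json
from urllib.parse import urlencode

# WOT's public API endpoint (as documented at the time; an API key is required).
API = "http://api.mywot.com/0.4/public_link_json2"

def build_request(host, api_key):
    # WOT expects the target host followed by a trailing slash.
    return API + "?" + urlencode({"hosts": host + "/", "key": api_key})

def parse_reputation(body):
    """Extract the trustworthiness score (component "0") for each host."""
    data = json.loads(body)
    return {host: comps.get("0", [None])[0] for host, comps in data.items()}

print(build_request("skeptools.com", "demo-key"))

# A response in the documented shape (scores invented for illustration):
# component "0" holds [reputation, confidence].
sample = '{"skeptools.com/": {"0": [92, 12], "4": [90, 10]}}'
print(parse_reputation(sample))
```

Run something like this on a schedule and store the results, and you have exactly the kind of time series needed to see whether the WOT Project’s ratings are moving.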

Continue reading

How much traffic does a Wikipedia “Did You Know” attract?

If you follow me on social media (Twitter, Facebook and so on) you may have seen me congratulate Dr. Karen Stollznow over the weekend. It was because her new Wikipedia biography (launched just before TAM) appeared on the main page of Wikipedia as part of the “Did You Know?” feature. This is a box on the left side of the page that pulls interesting trivia from articles recently added to Wikipedia.

The main page of the English Wikipedia is apparently used by many as an entry point. It currently receives between 4 and 5 million page views every day. That’s a tremendous amount of traffic, and it guarantees that anything linked from that page is going to attract a lot of readers.

Like the rest of Wikipedia, the Did You Know? feature is collaboratively edited. Anyone can nominate a page to appear there, as long as they follow certain rules. When I create a new article that is relevant to skepticism, I nominate it in the hopes that it will be displayed there one day. The goal is to get the attention of those 4 million people, and expose them to skepticism.

But how effective is this? Fortunately, Wikipedia’s transparency allows us to examine the traffic numbers and answer that question. In this article I hope to show it is a very good way to get new people exposed to skeptic concepts.
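If you want to pull these numbers yourself, Wikimedia now exposes per-article page view counts through its Pageviews REST API. A minimal sketch follows; the URL format is taken from Wikimedia’s API documentation, while the article title, date range, and response numbers are placeholders:

```python
import json
from urllib.parse import quote

# Wikimedia's per-article pageviews endpoint (format per the REST API docs).
BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(article, start, end):
    # Spaces in article titles become underscores; dates are YYYYMMDD.
    title = quote(article.replace(" ", "_"), safe="")
    return f"{BASE}/en.wikipedia/all-access/user/{title}/daily/{start}/{end}"

def total_views(body):
    """Sum the daily view counts from an API response."""
    return sum(item["views"] for item in json.loads(body)["items"])

url = pageviews_url("Karen Stollznow", "20230101", "20230107")
print(url)

# Response shape per the API docs; the daily counts are invented.
sample = '{"items": [{"views": 120}, {"views": 95}, {"views": 140}]}'
print(total_views(sample))  # → 355
```

Comparing the total for the days an article sat on the main page against its baseline gives a direct measure of what a Did You Know? appearance is worth.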

Continue reading