A comparison of antispam vendors

InfoWorld recently released a report comparing the effectiveness of various spam filters.  It's mostly about on-premises anti-spam appliances; they touch on hosted solutions but don't go into much detail.  At the end, they do a filter-by-filter comparison, and you can view the results of their study by looking at the pretty image here.

The table contains a very nice-looking comparison: total valid mail, spam percentage (catch rate), false positive rate, and the like, twelve categories in all.  But even at the very end, we still have trouble answering which filter performed the best.  We can see that Ironport and Barracuda have the lowest catch rates, but Ironport has a pretty good FP rate.  There are a lot of numbers; how can we summarize them?

To do that, let's go back and look at my Relative Performance Index.  Recall that this is a metric I created that combines the catch rate and false positive rate and normalizes the results.  Also recall our definition of Spam in the Inbox (SITI), a measurement that combines the amount of spam and non-spam the end user sees in their mailbox.  The results are below:

Vendor        RPI   SITI
Barracuda       5     7%
Borderware      1    10%
Ironport       91    12%
Mirapoint       7     8%
Proofpoint      7     7%
IronMail        5     4%
Sendio          3     0%
Symantec       51    10%
Tumbleweed     22    17%
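
As an aside, since the exact formulas behind RPI and SITI aren't spelled out in this post, here is a minimal sketch of how metrics like these could be computed from a catch rate and a false positive rate.  The normalization, the FP weighting, and the assumed 80% spam share are placeholders of mine, not the real definitions.

    # Illustrative sketch only -- the actual RPI and SITI formulas are not
    # given in this post, so the weighting and normalization below are assumed.

    def relative_performance_index(catch_rate, fp_rate, fp_weight=10.0):
        # Reward a high catch rate, penalize false positives heavily,
        # then scale so a near-perfect filter scores close to 100.
        return max(0.0, (catch_rate - fp_weight * fp_rate) * 100.0)

    def spam_in_the_inbox(catch_rate, fp_rate, spam_share=0.80):
        # Fraction of the mail the user actually sees that is spam.
        # spam_share is the assumed portion of incoming mail that is spam.
        spam_delivered = spam_share * (1.0 - catch_rate)        # spam that slipped through
        ham_delivered = (1.0 - spam_share) * (1.0 - fp_rate)    # legit mail not blocked
        return spam_delivered / (spam_delivered + ham_delivered)

    # Example: a filter catching 97% of spam with a 0.1% false positive rate.
    print(relative_performance_index(0.97, 0.001))   # 96.0
    print(spam_in_the_inbox(0.97, 0.001))            # ~0.107, i.e. about 11% spam in the inbox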

From this table, we can clearly see that Ironport has the best RPI (higher is better).  In fact, it crushes the competition on this metric thanks to its low FP rate.  So, while Borderware's catch rate was higher, the low FP rate boosts Ironport's Relative Performance.

The numbers change a bit when we look at Spam in the Inbox.  Here, Ironport's lower catch rate hurts the user experience, since more spam reaches the mailbox, even though its better FP avoidance helps.
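
To make that tradeoff concrete with hypothetical rates (these are not InfoWorld's measurements), the same assumed SITI formula from the sketch above shows that missed spam swamps false positives pretty quickly:

    # Hypothetical filters, using the assumed formula from the sketch above.
    print(spam_in_the_inbox(0.99, 0.010))   # ~0.04 -- a high catch rate keeps the inbox clean
    print(spam_in_the_inbox(0.96, 0.001))   # ~0.14 -- a lower FP rate can't offset the missed spam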

It would be better still if we further combined RPI and SITI (or SITI, catch rate and false positive rate).  I will leave that for another post.