“Lies, Damned Lies, and Statistics”
This topic has 0 replies, 1 voice, and was last updated 6 months, 3 weeks ago by Adam Smith.
05/24/2024 at 11:03 am, post #4830, by Adam Smith (Participant)
Mark Twain famously popularized this saying, which captures the idea that statistics can be persuasive even when they are used inappropriately.
In a recent e-blast, Truth in the Ranch author “JW” used statistics as proof points to support their arguments. Regardless of the points they were attempting to make, there were some glaring overstatements and mistakes in how the results were presented. Let’s set the record straight on some of their assertions.
The first argument put forth concerns the 50% email Open Rate, which they claim is “above industry average”, a claim certainly meant to bolster the assertion that “the survey garnered an impressive response”. Well, I’m here to tell you, Virginia, that Open Rate does not equal the number of members who opened the email, and certainly not the number who read it.
Emails sent from e-marketing platforms (like JW’s Mailchimp) contain a tracking pixel: a tiny embedded image, referenced by an HTML code snippet, that loads when the email is opened. (Those of you who run with your security settings on high and block images may see a box with a red “X” in it; that’s the tracking pixel, and it won’t load until you agree to load images.) When the pixel loads, it tells Mailchimp “the email to this contact has been opened”. However, it doesn’t know who – or what – opened the email. Today’s modern email clients and services have aggressive spam filtering and security features, and that software opens emails to check them first, before they even land in your inbox. Mailchimp records an “open” by the security and filtering software exactly as it records an “open” by a person; it makes no distinction.
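As a rough illustration of the mechanism described above (the URL, parameter name, and user-agent strings are all hypothetical), here is a minimal sketch of how a platform’s open tracking can be modeled: every fetch of the pixel URL counts as one “open”, no matter who or what made the request.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical open log: contact id -> user agents that fetched the pixel.
opens: dict[str, list[str]] = {}

def record_open(pixel_url: str, user_agent: str) -> str:
    """Count one 'open' for the contact id embedded in the tracking-pixel URL."""
    contact = parse_qs(urlparse(pixel_url).query).get("cid", ["unknown"])[0]
    opens.setdefault(contact, []).append(user_agent)
    return contact

# A member's mail client and a security scanner look identical to the platform;
# both of these register as one "open" each:
record_open("https://track.example.com/o.gif?cid=member42", "Mozilla/5.0")
record_open("https://track.example.com/o.gif?cid=member43", "LinkScanner/1.0")
```

Note that nothing in the request tells the platform whether a human was involved, which is exactly why the 50% figure proves so little.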
So, how many members opened the email? Nobody knows. In addition, email open rates are comparative, not absolute. If you’re a car dealer and you send out 100,000 emails in April and 20% are opened, and in May you send out another 100,000 and 25% are opened, you can say the campaign improved by 5 percentage points, perhaps thanks to the title, timing, content, offer, or other aspects of the email. But that’s about all you can say; there’s no industry benchmark whatsoever.
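To make the “percentage points” distinction concrete, a quick sketch using the hypothetical car-dealer numbers above; the point change and the relative change are different figures and are often confused:

```python
april_rate = 0.20   # April: 20% of 100,000 emails recorded as opened
may_rate = 0.25     # May: 25% recorded as opened

# Change in percentage points (the correct way to report it):
point_change = (may_rate - april_rate) * 100

# Relative change in percent (a different, often-confused number):
relative_change = (may_rate - april_rate) / april_rate * 100

print(f"{point_change:.0f} points, {relative_change:.0f}% relative")
```

A 5-point gain here is a 25% relative improvement, and neither number says anything about an “industry average”.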
And these stats don’t tell you whether the member actually read the email, which is the most salient point; until Google starts tracking our eyes as they move down the screen, we’ll never know.
But wait, we do know that 169 people took the survey. Or, do we?
JW asserts that “169 respondents who opened the email clicked to the survey, yielding an 11.6% click-through rate”. Well, once again that pesky filtering and security software is at work: the software will click every link in an email to make sure the links are reputable. Did a member click the link? Again, we’ll never know.
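Taking JW’s figures at face value, simple arithmetic backs out the implied totals (a sketch; the 50% open rate and 11.6% click-through rate are JW’s numbers, and every “open” and “click” counted may be software rather than a person):

```python
clicks = 169
click_through_rate = 0.116   # 11.6% of recorded "opens", per JW
open_rate = 0.50             # 50% of emails sent, per JW

implied_opens = clicks / click_through_rate   # recorded "opens" implied by the CTR
implied_sent = implied_opens / open_rate      # emails sent implied by the open rate

print(round(implied_opens), round(implied_sent))
```

Roughly 1,457 recorded “opens” out of about 2,914 emails sent, and both figures are ceilings on actual human activity, not counts of it.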
What we really want to know is: how many members took the survey? Critically, JW fails to tell us. The inference is that it’s 169, but the email doesn’t say that; it just says there were 169 clicks to the survey page. However, the Margin of Error (4.4%) that is revealed lets us back into the number. Assuming a 95% confidence level (the standard in market research) and applying the finite-population correction, a pool of 169 people would need 127 respondents to yield a 4.4% Margin of Error.
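That back-calculation can be reproduced with the standard margin-of-error formula plus the finite-population correction (assuming p = 0.5 and z = 1.96 for 95% confidence, as is conventional):

```python
import math

def margin_of_error(n: int, population: int, z: float = 1.96, p: float = 0.5) -> float:
    """z * sqrt(p(1-p)/n) * sqrt((N-n)/(N-1)), expressed as a percent."""
    fpc = math.sqrt((population - n) / (population - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc * 100

# Smallest number of respondents, out of a pool of 169, whose margin of
# error comes in at or under the quoted 4.4%:
needed = next(n for n in range(1, 170) if margin_of_error(n, 169) <= 4.4)
print(needed)  # 127
```

With 127 respondents the formula gives about 4.35%, which rounds to the quoted 4.4%; 126 respondents would exceed it.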
In the 2023 board election we had 1,169 votes tallied; the prior year it was 1,441. Should 127 people who responded to a (clearly) biased survey inform our direction on anything? We don’t think so. Sounds like just a small, but vocal, group of agitators.