Using Email Swipes – can it work?

I recently got back to emailing daily and began by conducting some tests to see how well vendor-provided email swipes would work with my list.

Over Easter, there were two product launches that I decided to put some more effort into, and I tried two different approaches with them. The products I promoted were Arbitrage Underdog and Instant Funnel Machine.

I won’t go into much detail about the products themselves here; you can check them out via the links above.

The provided emails were used as written and as advised

Anyway, both vendors provided email swipes that one can use for launch promotion.

Arbitrage Underdog had three emails.

Instant Funnel Machine had seven emails, together with the advised timing for sending them out over the launch period.

One was to be sent prior to the launch, then one on day one, two on the second day, and three on the final day.

For Arbitrage Underdog I made a simple squeeze page, which led to a mini autoresponder series built from the provided emails. Instant Funnel Machine I promoted to my existing list.

I bought a 300-click package from a solo ad vendor on Udimi. I have used this seller before, so I knew what kind of opt-in rate I could expect from her.

The results surprised me

The actual opt-in rate was over 50%, which is something I had never seen before on my squeeze pages. This made me excited about how the test would perform.

To keep things really simple, after signing up, visitors were redirected to the Arbitrage Underdog sales page.

But. Not. A. Single. Sale.

I could understand this if the offer were really bad, but it seemed to be doing really well otherwise. Arbitrage Underdog was still on the WarriorPlus leaderboard earlier this week, even after several weeks and a dozen or so new launches.

It has had a conversion rate of almost 10% and has sold over 2,000 copies of the software, with 50% of the buyers also going for the first upsell.

Everyone who gave their email was shown the sales page. They also received the first email message immediately, and then two additional ones, 24 hours apart.

Open and click-through rates for the emails are shown in the table below.

email swipe open rates

There might be a multitude of reasons

Of course, there might be several completely valid reasons for my lack of results here.

The most likely ones are using only one traffic source and sending only 300 clicks, which is not really a statistically significant amount.
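Just as a rough sanity check on those 300 clicks, here is a minimal binomial sketch using the figures from this post (roughly 150 people reaching the sales page) and the vendor's headline conversion rate of almost 10%. Assuming that rate would apply to cold solo ad traffic is of course a big assumption, which is rather the point.

```python
# Rough binomial sanity check: how surprising is zero sales?
# Figures taken from this post; assuming the vendor's ~10% sales-page
# conversion rate applies to cold solo ad traffic is a big assumption.
visitors = 150            # roughly 50% opt-in on 300 clicks
claimed_rate = 0.10       # vendor-reported conversion rate

expected_sales = visitors * claimed_rate
prob_zero_sales = (1 - claimed_rate) ** visitors

print(f"Expected sales at the claimed rate: {expected_sales:.0f}")
print(f"Probability of zero sales at that rate: {prob_zero_sales:.1e}")
```

So the small sample mostly limits how precisely a conversion rate can be estimated; even so, it suggests this traffic converted nowhere near the vendor's average, whatever the reason.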

But the major learning point that I take from this test is that it is not a good idea to use email swipes word for word during the launch period.

Especially when there are potentially dozens of other marketers who are sending the same or similar emails to their lists at the same time.

The test was not a complete failure, though. I did gain 150+ subscribers for my list.

If you want to check the squeeze page I used, please see the screenshot below.

email swipe squeeze page

For reference, until this test, my best performing squeeze page with the same kind of traffic has been this:

squeeze page example

This page has had a sign-up rate of around 21% and a conversion rate to sales of 4%.
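For a quick back-of-the-envelope comparison between the two pages, the sketch below runs 300 clicks through both sets of numbers. It assumes the 4% sales conversion is measured against sign-ups rather than raw clicks, which the post does not spell out, so treat it as illustrative.

```python
# Back-of-the-envelope funnel comparison for 300 clicks.
# Assumes the baseline 4% sales conversion is measured against sign-ups;
# that isn't stated explicitly, so the numbers are illustrative only.
clicks = 300

# Baseline page: ~21% opt-in, ~4% of subscribers buying.
baseline_signups = clicks * 0.21
baseline_sales = baseline_signups * 0.04

# Swipe-email squeeze page: >50% opt-in, zero sales observed.
swipe_signups = clicks * 0.50
swipe_sales = 0

print(f"Baseline page: ~{baseline_signups:.0f} sign-ups, ~{baseline_sales:.1f} expected sales")
print(f"Swipe page:    ~{swipe_signups:.0f} sign-ups, {swipe_sales} sales observed")
```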

Sending the swipes to my existing list

The other test I did was scheduling a set of emails from the vendor's swipes to be sent to my existing list. I did not send any other broadcasts during this period.

Emails were provided for four days: one day prior to the launch and the three days after it. The plan was to send one email on each of the first two days of the sequence, two on the following day, and three on the final day.

Open rates for these emails varied from 4% to roughly 10%, and 10% has pretty much been the norm for my emails in the recent past. Statistics for the individual messages can be seen in the table below.

email swipe results

For clarity, I have normalized the figures to per hundred subscribers. In the end, the whole sequence delivered over 7 clicks per hundred subscribers.
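For anyone wanting to reproduce the normalization, it is nothing fancier than dividing each email's clicks by the list size and multiplying by 100. The list size and click counts in the sketch below are made up purely for illustration; only the method matches what I did.

```python
# Per-hundred-subscribers normalization used for the table above.
# List size and click counts are hypothetical, for illustration only.
list_size = 1000
clicks_per_email = [12, 9, 15, 8, 11, 10, 7]   # one entry per swipe email

clicks_per_100 = [c / list_size * 100 for c in clicks_per_email]

for i, c in enumerate(clicks_per_100, start=1):
    print(f"Email {i}: {c:.1f} clicks per 100 subscribers")
print(f"Whole sequence: {sum(clicks_per_100):.1f} clicks per 100 subscribers")
```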

Still no sales

But sadly, as was the case with Arbitrage Underdog, not a single sale. And the overall conversion rate for this promotion has been 9%, with over 250 sales since the launch.

From the results of this test, I draw the same conclusion as from the first one: it is hard to make someone else’s emails work on your own list.

That being said, there might be other reasons as well, like the list not being made up of buyers, and so on.

Or it might be that, statistically, my list was just too small, and if it were larger, the numbers would swing towards the average and the sales would come.

Also, as I pointed out above, promoting during a launch period when everyone and their dog is promoting at the same time might not be such a good idea. That is heavily dependent on the product, though, I think.

Could I do better?

After writing all of the above, I began to wonder: could I do better?

Could I write a better series of emails, for example for Arbitrage Underdog, and actually make some sales?

If you would like to read about that kind of test, please let me know in the comments. Also, what conclusions would you draw from these results?

Split testing is really worth it – who would’ve believed it?

I have always known that you should do split tests. But knowing and actually doing are two different things.

I am trying to learn list building alongside blogging, and that is why the subject came up. But at the moment my traffic levels are so low that building the list just from my blog visitors (the signup form is in the sidebar, by the way) would be too slow. For that reason, I am also driving traffic with solo ads.

It is clear that the quality of solo ad traffic is lower than that of organic visitors coming to my site via search engines and so on. But at the moment solo ads have a clear convenience advantage, as I am able to get a set number of clicks for a predetermined price.

This makes it easy to test my squeeze page. I just got started with this, and over the weekend I ran a really simple test of switching the order of the image and the form. Variant A had the form fields on the right and variant B had the form on the left side of the page.

As I am just learning this stuff, I of course had to try to be too advanced at first. I tried creating experiments in Google Analytics, but for some reason it broke my GetResponse form code, and the page either loaded incompletely or not at all.

Luckily, I realized quite soon that it is possible to do split tests directly in the form editor in GetResponse. After that, it was smooth sailing.

I do need to figure out what went wrong in Google Analytics, though. I have a feeling that it is a good skill to have for the future as well.

Results

What were the results, you ask? No less than a 44% difference between the forms. Okay, it was only around 300 clicks, so the statistical validity might be a bit sketchy, but that will have to do for now.

split testing

Had I used only variant A versus only variant B, the difference would have been almost 40 clicks, or over 150%, in favor of variant B.
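Since roughly 300 clicks is a small sample, a quick two-proportion z-test is one way to check whether a gap like this is likely to be real rather than noise. The counts in the sketch below are hypothetical, just roughly in line with an even split and a difference of about 40 opt-ins; plug in the real figures from the form report to get the actual answer.

```python
import math

# Two-proportion z-test for a squeeze page split test (normal approximation).
# The counts below are hypothetical, roughly matching an even 150/150 split
# with a ~40 opt-in gap; substitute the real numbers from the form report.
visitors_a, optins_a = 150, 25   # variant A: form on the right
visitors_b, optins_b = 150, 65   # variant B: form on the left

p_a = optins_a / visitors_a
p_b = optins_b / visitors_b
p_pool = (optins_a + optins_b) / (visitors_a + visitors_b)

se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value

print(f"Variant A opt-in rate: {p_a:.1%}")
print(f"Variant B opt-in rate: {p_b:.1%}")
print(f"z = {z:.2f}, two-sided p-value ≈ {p_value:.3g}")
```

With a gap that wide the test comes back clearly significant even at this sample size; with smaller gaps, 300 clicks can easily leave the result inconclusive.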

The Shoemoney blog also has a good example of how a really small change can make a big difference to the results. Just by making a small change to the add-to-cart button, conversions were increased by almost 80%.

Humans are strange. But I think this stuff can get addictive.

What are your experiences with split testing? How and what do you test? Please let me know in the comments.