Last week summary
What did I do last week
– created a plan for the market research phase
– had a meeting and got advice from a professional
What am I going to do this week
– figure out what to put on landing page A and landing page B
The last 16 months of my life have been quite exciting. My partner and I have experienced a global movement on the rise: being a digital nomad. We have traveled through 3 continents, experienced amazing things and made great new friends. Since there are plenty of sources about this topic online, I'm not going to cover it any deeper. I'd just like to say we've been on a few trips with this jolly bunch, Hacker Paradise, and we both found them a really nice and inspiring environment for remote workers and open-minded people in general. One of the people I met there was Alex Johnson, a great guy from Cardiff, Wales, doing landing pages and online conversion for a living.
A different view
As a result of my previous week's revelation, I contacted Alex last week, roughly four months after we met. He was happy to help, so we scheduled a Skype call for the very next day. I introduced Alex to my software idea and to the survey questions I had assembled for the customers and for the salons. My biggest problem was how to funnel both surveys through one single landing page. The suggestion was quite simple: you don't. You split them into separate pages and do split testing afterwards.
Cold emails are probably not the way to go
As Alex nicely suggested, reaching out to the ~300 salons we had with a cold email and serving them a survey may not be the best approach, for several reasons.
1) First of all, the reply rate is most likely to be really low. For the 300 emails we had, it would probably be <10%, which, hypothetically, would already put us below 30 respondents.
2) The second issue was the survey length. Having 10 questions in the survey would probably mean ~50% of those ~30ish people would not finish it. This would leave us with, hypothetically, ~15 completed surveys.
3) The third, and foremost, reason/question was: what could we do with the gathered data? The way the current questions are assembled, most likely nothing.
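To make the hypothetical numbers above concrete, here is a tiny sketch of the cold-email funnel arithmetic. The 10% reply rate and 50% drop-off are the rough assumptions from the points above, not measured values:

```python
# Rough cold-email funnel estimate using the hypothetical rates above.
contacts = 300        # salons we have email addresses for
reply_rate = 0.10     # optimistic ceiling for cold-email replies
finish_rate = 0.50    # share of repliers who finish a 10-question survey

replies = contacts * reply_rate    # ~30 people even open the survey
finished = replies * finish_rate   # ~15 actually complete it

print(f"expected replies: {replies:.0f}")    # expected replies: 30
print(f"completed surveys: {finished:.0f}")  # completed surveys: 15
```

Fifteen finished surveys is a thin basis for any product decision, which is why the advice below points elsewhere.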
Consequently, the first part of the advice was: instead of taking the same approach with salons and customers, treat them differently, because they are two different types of customers. Salons deserve a more personalized approach, and maybe the best option would be one-on-one interviews with the salon owners. This would mean a far smaller number of surveyees, but the answers would provide much more value and visibility, and hopefully some new insights into the domain/problem.
A landing page is a good approach
As we already established, the landing page route would probably be the best one for the customer survey. The steps would look something like:
1) Create a landing page
2) Funnel all of the channels through it
3) Gather emails
4) Send the survey
5) Process the gathered data
The issue was that my customer survey had even more questions than the salon one. Since I thought online payments in Croatia might be the biggest issue, even a critical blocker (see the end of this post), most of the survey questions were within the online payments domain. This was making my survey bigger than it should be.
In order to reduce the survey size, and at the same time get an answer to the payment question, Alex suggested I could do simple split testing.
Split testing is, according to Instapage:
Split testing, commonly referred to as A/B testing, allows marketers to compare two different versions of a web page, a control (the original) and a variation, to determine which performs better, with the goal of boosting conversions.
According to this, my goal would be to create two variants of the same landing page:
1) Landing page A: first variant of the landing page, stating explicitly that the application is using online payments
2) Landing page B: second variant of the landing page, not saying a single thing about online payments
Furthermore, after creating the landing pages, I should spend exactly the same marketing budget on both, and monitor the gap between the Landing page A and Landing page B conversion rates. If the gap is huge, people are most likely not fond of the idea of using online payments, and I should think about an alternative solution. If the gap is not as big, people are not as afraid as I thought, which may be a positive sign.
Finally, if the conversion rate for both pages is miserable, the landing pages suck or my idea sucks 🙂
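The comparison above can be sketched in a few lines. This is a minimal illustration, not a statistics lesson: the visitor and signup numbers are made up, and for a real decision I'd want a proper significance test on top of the raw gap:

```python
def conversion_rate(signups: int, visitors: int) -> float:
    """Share of visitors who left their email."""
    return signups / visitors if visitors else 0.0

# Hypothetical results after spending the same budget on both variants.
rate_a = conversion_rate(signups=18, visitors=600)  # A: mentions online payments
rate_b = conversion_rate(signups=45, visitors=600)  # B: silent about payments

gap = rate_b - rate_a
print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, gap: {gap:.1%}")
# A: 3.0%, B: 7.5%, gap: 4.5%
# A large gap suggests people shy away from online payments;
# a small gap (or two miserable rates) tells its own story.
```

With numbers like these, variant B converting well over twice as often as A would be a strong hint that mentioning online payments scares people off.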
Open ended questions
Another good point Alex made during our conversation was the use of open-ended questions. My initial survey was constructed in a way that provided all the answers upfront. That seems fine if you're knowledgeable about the domain you're tackling. Because my haircut knowledge is as profound as 18 mm on the shaving machine, it turns out there is a high possibility that might be the wrong approach. Hence, the data gathered this way could be misleading, guiding me to wrong decisions in the end.
In conclusion, having some open-ended questions in the survey might leave enough room for the surveyees to provide personal answers, possibly giving me more insights and their own perspective on the problem.