Oct 19, 2018

Help us find out what to build next

With more than a million minutes of baby monitoring, the app has made quite an impact by now. I do have ideas for improvements and enhancements - but I am also looking for your thoughts. Please let me know how the app can be improved and which features you need added.

I created a page for feature voting on Productific.

Jul 21, 2016

App Icon (ASO part 4)

There is only one chance to make a first impression: the app icon is what users see first. So I figured my app icon should be good. But what makes a good icon?

While there are many ways to design an app icon and you can sink an infinite number of hours into design exercises, it's fairly easy to test different colors for an existing icon design. The impact of color can be significant, and I wanted to know what the bottom-line impact is. So I went ahead and tested a set of different icon colors.

Setting up an A/B test for icons in the Google Play store provided the following results.

Three alternative colors tested


Best option would give 20% more downloads


Daily stats rather erratic


Result stabilizes over time


Applying the result


Result: Testing different app icon colors may increase downloads by 20%.

Mar 11, 2016

Messing it all up with typos (ASO part 3)

I guess I shouldn't have maintained my app store listing late at night, while everyone in my family was asleep. Late at night people are sleepy, and when they're sleepy they make mistakes.

So somehow I ended up with a typo in my app's title in the store listing. I listed a "Baby" app instead of a "Baby Monitor" app - I just missed the second word. Probably a copy-&-paste issue. The impact is pretty obvious:

(Download chart: the week with the typo in the title shows a clear dip.)

While downloads did not completely vanish, a decrease is clearly visible. Lesson learned: do your homework and avoid typos.

Jan 6, 2016

Device Discovery in Wifi Networks

To use a baby monitor, two smartphones have to connect. They have to find and see each other in the network: one device needs to discover the other automatically, without much user interaction.

There was an IT world before the smartphone entered our lives, so all the nuts and bolts are already available: to discover devices in a network you simply send a UDP broadcast message into the Wi-Fi network and listen for such messages on each device. This is the textbook approach, and it only took a few minutes of coding to build. Then the details of Android reality bit me.
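The textbook approach can be sketched in plain Java without any Android APIs; note that the probe payload below is a made-up placeholder, not the app's actual protocol:

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

/** Textbook UDP discovery: send a probe into the network, peers reveal themselves by answering. */
class UdpDiscovery {
    static final String PROBE = "BABYMON_DISCOVER";   // hypothetical probe payload

    /** Send one discovery probe; in production the target is the subnet's broadcast address. */
    static void sendProbe(InetAddress target, int port) throws IOException {
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setBroadcast(true);                // required when targeting 255.255.255.255
            byte[] data = PROBE.getBytes(StandardCharsets.UTF_8);
            socket.send(new DatagramPacket(data, data.length, target, port));
        }
    }

    /** Wait on the given socket until a probe arrives; return the sender's address. */
    static InetAddress awaitProbe(DatagramSocket socket) throws IOException {
        byte[] buf = new byte[64];
        while (true) {
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);                   // blocks, honoring the socket's SO_TIMEOUT
            String msg = new String(packet.getData(), 0, packet.getLength(), StandardCharsets.UTF_8);
            if (PROBE.equals(msg)) {
                return packet.getAddress();           // a device asking to be discovered
            }
        }
    }
}
```

The listening side replies with its own address, and the two devices can then open their regular connection.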

Does your babysitter talk "IP MULTICAST"?
In order to work with UDP broadcast messages an Android application needs the permission CHANGE_WIFI_MULTICAST_STATE. Apparently, that is a safety measure in Google's thinking. However, I wonder which typical user actually understands whether an app should be allowed to multicast. Do your babysitter and your grandma understand what this is? Mine don't - but well, who cares, this permission is easily added to the application.

    Permission CHANGE_WIFI_MULTICAST_STATE
    Allows applications to enter Wi-Fi Multicast mode.

From now on, when grandma does the babysitting with the app, I'll just tell her IP-multicasting is a good thing... sure she will understand.
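For reference, declaring the permission is a one-line addition to the manifest; at runtime the app additionally has to acquire a WifiManager.MulticastLock so the Wi-Fi driver actually delivers the traffic:

```xml
<!-- AndroidManifest.xml: allow the app to receive multicast/broadcast traffic -->
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE" />
```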

Some devices just kill broadcasts but won't tell you
Android provides functions to manage IP multicasting, ultimately to save battery life, because excessive multicasting is a battery burner. You can switch multicasting on and off so that it covers only the specific moments when you really need it. However, some manufacturers/devices simply ignore this without notice. Reason unknown, they just won't do UDP discovery. Go figure... The worst part is that there is no way for a developer to get feedback from the device, nothing like a reliable active/inactive status for multicasting.

To make device discovery work for such non-multicast devices as well, another discovery technique must be in place. The obvious choice is a TCP scan: check the whole subnet and ask each possible IP address whether a baby monitor is available there. The major drawback is obvious - it can be a long-running process in large Wi-Fi networks in hotels and company buildings.
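A minimal sketch of such a scan, again in plain Java; the helper names and the idea of probing one fixed TCP port are my own illustration, not the app's actual code:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

/** Fallback discovery: probe every host address in the subnet on the app's TCP port. */
class TcpScan {
    /** Enumerate all usable host addresses of an IPv4 subnet (ip and netmask as 32-bit ints). */
    static List<String> hostsInSubnet(int ip, int netmask) {
        List<String> hosts = new ArrayList<>();
        int network = ip & netmask;                  // lowest address of the subnet
        int broadcast = network | ~netmask;          // highest address of the subnet
        for (int addr = network + 1; addr < broadcast; addr++) {  // skip network & broadcast addrs
            hosts.add(String.format("%d.%d.%d.%d",
                    (addr >>> 24) & 0xff, (addr >>> 16) & 0xff, (addr >>> 8) & 0xff, addr & 0xff));
        }
        return hosts;
    }

    /** True if something accepts a TCP connection on host:port within timeoutMs. */
    static boolean probe(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;   // closed port, no host, or timeout: no baby monitor here
        }
    }
}
```

With 254 hosts in a typical /24 home network this is bearable, especially when the probes run in parallel on a thread pool; in a /16 hotel network it is exactly the long-running process described above.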

No netmask? An Android bug.
When building the TCP scan I noticed that on one of my devices, a new Moto G, the scan ran for a very, very long time. Other devices did well, though. After lengthy tracing and log analysis it turned out that Android 5 has a significant bug: it won't give your app the current Wi-Fi netmask - it simply returns zero, which can lead to unpredictable behaviour. To Google this is a known issue (issue 82477), and by now Google's bug database contains a good workaround. Still, it is difficult to understand how such a fundamental flaw can make it into a consumer-grade operating system.
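In code this boils down to a small guard around the value the OS reports. The /24 fallback below is my own assumption covering typical home networks; the workaround in Google's tracker instead derives the mask from the interface's prefix length (available via java.net.NetworkInterface), which the second helper sketches:

```java
/** Android 5 may report a Wi-Fi netmask of 0 (issue 82477); guard against it. */
class NetmaskFix {
    static final int DEFAULT_MASK = 0xFFFFFF00;  // assumed /24 fallback, typical for home Wi-Fi

    /** Use the reported mask unless it is the buggy zero value. */
    static int sanitize(int reportedMask) {
        return reportedMask == 0 ? DEFAULT_MASK : reportedMask;
    }

    /** Alternative: build the mask from an interface's prefix length (e.g. 24 -> 255.255.255.0). */
    static int maskFromPrefixLength(short prefixLength) {
        return prefixLength == 0 ? DEFAULT_MASK : (int) (0xFFFFFFFFL << (32 - prefixLength));
    }
}
```

Without a guard like this, a zero netmask makes the "subnet" span the entire address space, which is exactly why the scan never finished on the Moto G.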

Summary: while device discovery is easy in the textbooks, the reality of the fragmented Android device and version space makes it a challenging task, requiring a combination of UDP broadcast, a TCP scan, and workarounds for Android bugs.

Jan 3, 2016

App Store Optimization - A Language Trap (ASO part 2)

Looking at Google Play download statistics I noticed the following in the early days:

For some reason English downloads, with the US being the biggest market, are low compared to the peer group of similar apps. Typically, US English should represent 20-25% of app downloads according to Google Play; I am seeing 2%. This is not good.

Lost in Translation
I am German. In Germany we use the term ‘babyphone’ - everyone knows what a babyphone is, even though ‘phone’ is an English word. Pah! A quick check on the web reveals that ‘babyphone’ is not actually an English term, at least not a commonly used one. It should be ‘baby monitor’; that is what I really meant. I stepped right into a language trap ... how could this happen? To me?

A/B Test
Before I change texts in the app store listing I'll verify this idea. Another A/B test is quickly set up in Google Play; this time it is a localized test to compare different texts in the English listing.

 
However, this A/B test is not showing any meaningful results. Either actual download numbers in the US are too small to produce meaningful statistics, or the text doesn't make a difference at all. Maybe parents in the US are not checking on their babies? Hm...
 
Looking at this for a while I noticed that in the A/B tests only the description texts can be changed, while the app title remains fixed:
 
 
Probably, the app's title is what users actually look at most, but the title cannot be A/B-tested. So I decided to try something different. Fortunately, Google Play provisions a separate language for 'English - United Kingdom', so there is a second English listing to play around with while keeping the US listing unchanged. The app's English-UK downloads should account for 7% of overall downloads according to public Google Play statistics, whereas I am seeing 3% for my app.
 
Testing with English-UK
I added a new language entry in the Google Play listing for English-UK and also updated the apk file to contain English-UK texts, changing 'Babyphone Wifi' to 'Baby Monitor Wifi'. After a few days UK downloads have increased to match the expected overall share of 7%:
 

Interestingly, US downloads have also increased a little bit. It seems that some users in the US have set their phones to English-UK.


Doing the Change
With all the testing there was now enough confidence to change the name of the app from 'Babyphone' to 'Baby Monitor' for English users. With this change, the cumulative downloads of English-US and English-UK users increased significantly - by approximately 200%.

 
 
Result: App store optimization is not always straightforward and may require a second thought.
 
Update: Not only is 'Baby Monitor' a good description of the app; meanwhile I also use 'Baby Alarm'.

Nov 22, 2015

Google Play App Store Optimization (ASO)

The Google Play app store uses various pictures (icons, photos) to detail an app offering. Some pictures may appear more attractive to users, while others tend to draw users away. But which are the right ones to use?

Google offers a feature to try various pictures and pick the best: you simply add different pictures and see what catches on best with users. Google has been using such optimization throughout their business since their early days; check this CNET article for more info.

With such A/B testing embedded in Google Play it is fairly straightforward to give different pictures a try. What can possibly be the impact of changing a few pictures, or is there an impact at all? I am not expecting much...


After a few days of testing the following impact is projected by Google Play: 
  • Using picture #3 instead of the current picture #1 would give a 20% download advantage (228 vs. 190)
  • Using picture #2 instead of the current picture #1 would give a 36% download advantage (260 vs. 190)
That's not too bad; improving downloads by >30% is a pretty clear result. Furthermore, a notice is shown that more data is required:
Probably the overall downloads of my app are not constant enough for Google to calculate a statistically robust result, so Google shows this notice. However, during the few days of testing the percentage values appeared fairly stable. I'll finish the test here nevertheless, as I want to look at another change to test.

Result: 30% increase in app downloads by using a better background image.

Oct 20, 2015

Room Temperature

Last weekend we stayed at a friend's place. He has one of those automatic, timer-controlled heating systems. It was set to energy-saving mode, which means it turns down the heating at night, when people usually go to sleep. Of course we stayed up a little longer, didn't notice the energy-saving mode, and our little guy, who went to bed as usual, got pretty cold. Long story short: the Baby Monitor app needs to check the room temperature. Some devices have a temperature sensor that can easily be accessed; that's a quick one to add to the app.