Three tactics top app advertisers use to improve download efficiency

By: Jessie Liu, Sr. Analytics and Media Manager

When evaluating the success of ad campaigns in the Streaming Applications industry (SVOD, AVOD, vMVPD), it’s useful to look not only at the number of app downloads but also at app download efficiency. A 2020 Amazon Ads study highlights the importance of this.

Story highlights:

The Streaming Applications (SA) industry, which includes Subscription Video-on-Demand (SVOD), Ad-supported Video-on-Demand (AVOD) and Virtual Multichannel Video Programming Distributor (vMVPD) services, often uses the number of app downloads to compare performance across advertisers. At Amazon Ads, we believe it’s important to consider not only the total number of downloads but also download efficiency: how often impressions lead to downloads.

To calculate app download efficiency, we analysed the downloads per thousand impressions (DPM) of 38 brands in the SA category on Amazon in 2020. We found that top-performing advertisers had 22X higher app download efficiency than other advertisers. To help advertisers improve their download efficiency, we examine the differentiating tactics used by top performers and provide recommendations for adopting them.
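For reference, DPM is a straightforward ratio. A minimal sketch (the figures below are hypothetical, not taken from the study):

```python
def downloads_per_mille(downloads: int, impressions: int) -> float:
    """Downloads per thousand impressions (DPM)."""
    if impressions == 0:
        return 0.0  # avoid division by zero for campaigns with no delivery
    return downloads / impressions * 1000

# Hypothetical example: 450 downloads from 1,200,000 impressions
dpm = downloads_per_mille(450, 1_200_000)  # 0.375 downloads per thousand impressions
```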

For more on how we collected our data, see the Methodology section at the end of this article.

1. Top-performing app advertisers combine Streaming TV ads, mobile ads and Fire TV Sponsored Tiles

This study shows that brands that combined Streaming TV ads, Fire TV Sponsored Tiles and mobile ads saw 22X higher app download efficiency (and delivered 2X more impressions) than advertisers who used Streaming TV ads alone.


When planning campaigns, we recommend that advertisers:

  • Consider running ads across Fire TV, Fire tablet and mobile.
  • Tailor ad creatives to devices to ensure that customers have a positive experience on all devices.

2. Top-performing app advertisers vary their ad creatives

Campaigns with more creative variations of specific messaging may feel more relevant to audiences, and so may drive higher engagement. In fact, this analysis shows that top-performing advertisers implemented 1.8X more unique creatives than other advertisers.


Advertisers should consider continually refreshing creatives and conducting A/B testing. A/B testing is an effective, cost-efficient way to determine what resonates with viewers and helps prevent unnecessary spending. We recommend testing elements such as different calls to action and content types to understand what drives more downloads. Lastly, we remind advertisers to scrutinise the ad design, call to action, claims, pricing and landing page in their creatives to ensure advertising content and creative setups are appropriate for a general audience and comply with Amazon’s policies.
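For advertisers who want to judge whether one creative variant's download rate is genuinely better than another's, a standard two-proportion z-test is one common approach. The sketch below uses hypothetical numbers and is a generic statistical illustration, not a tool the study itself describes:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing the download rates of creatives A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: creative A drove 120 downloads per 10,000 impressions,
# creative B drove 90 downloads per 10,000 impressions
z, p = two_proportion_z_test(120, 10_000, 90, 10_000)
```

A p-value below the chosen significance level (commonly 0.05) suggests the difference is unlikely to be noise; otherwise, keeping the existing or cheaper creative avoids unnecessary spending.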

3. Top-performing app advertisers leverage negative keywords

Top-performing advertisers were 6-10% more likely to use negative keyword tactics than other advertisers, and also showed higher app download efficiency.


Consider leveraging Amazon Ads tools to create custom audience segments based on genre, streaming, lifestyle and in-market behavioural signals that align with campaign objectives. Leverage the standard audience performance report to understand which audiences are not responding to campaigns, and consider excluding them in the future.


Methodology

In this study, we analysed 38 brands in the Streaming Applications category in the US over 12 months of advertising (January to December 2020). The Streaming Applications category includes advertisers that offer services such as Subscription Video-on-Demand, Ad-supported Video-on-Demand and Virtual Multichannel Video Programming Distributor (vMVPD).

We used downloads per thousand impressions (DPM) as an app download efficiency metric to measure success. We then used machine learning algorithms to identify the top advertising strategies for increasing DPM. Pearson correlation, linear regression, XGBoost and subject-matter-expert input were used to assign feature weights. This analysis highlights the greatest differences between advertisers with the highest and lowest DPM; it does not predict performance or claim causality.
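To illustrate one of these weighting techniques, the sketch below normalises the absolute Pearson correlation of each feature with DPM into weights that sum to one. This is a simplified stand-in for the study's actual pipeline; the function names and data are hypothetical:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient (assumes both series have nonzero variance)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_weights(features, target):
    """Weight each feature by |Pearson r| with the target, normalised to sum to 1."""
    raw = {name: abs(pearson_r(vals, target)) for name, vals in features.items()}
    total = sum(raw.values())
    return {name: r / total for name, r in raw.items()}

# Hypothetical advertiser-level data: feature values and observed DPM
features = {
    "unique_creatives": [3, 8, 5, 12],
    "ad_products_used": [1, 3, 2, 3],
}
weights = correlation_weights(features, target=[0.1, 0.5, 0.3, 0.9])
```

In practice, a blended approach like the one described above would combine such correlation-based weights with model-based importances and expert judgement.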

How does clustering work?

We created a binary composite score based on DPM, and then applied an XGBoost classifier to identify which features, and with which weights, best predict these labels. In doing so, we considered advertising actions as features, such as ad product usage intensity and mix, timing of advertising support, targeting tactics, creatives and placements, customer review counts and ratings, the percentage of products with quality product pages, and the types of products promoted in ads.

Using the features and weights identified above, we next applied a k-medoids clustering algorithm to classify advertisers into clusters. Note that we classified advertisers by their actions rather than by the components of their composite score. Finally, we ranked the resulting clusters by their composite scores from high to low: Cluster 1 is the most successful cluster with the highest composite score, and Cluster 5 is the least successful.
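To make the clustering step concrete, here is a minimal k-medoids sketch in pure Python, alternating an assignment step with a medoid-update step. It illustrates the general algorithm, not the study's implementation, and the data are hypothetical:

```python
import random

def k_medoids(points, k, n_iter=100, seed=0):
    """Minimal k-medoids: assign each point to its nearest medoid, then move each
    medoid to the cluster member minimising total in-cluster distance."""
    rng = random.Random(seed)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    medoids = rng.sample(points, k)
    for _ in range(n_iter):
        # assignment step
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist(p, medoids[i]))
            clusters[idx].append(p)
        # medoid-update step
        new_medoids = []
        for cluster, old in zip(clusters, medoids):
            if not cluster:
                new_medoids.append(old)  # keep the old medoid for an empty cluster
                continue
            new_medoids.append(min(cluster, key=lambda c: sum(dist(c, q) for q in cluster)))
        if new_medoids == medoids:  # converged
            break
        medoids = new_medoids
    return medoids, clusters

# Hypothetical example: six advertisers described by two feature values
medoids, clusters = k_medoids([(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)], k=2)
```

Unlike k-means, each cluster centre is an actual data point (a medoid), which keeps the centres interpretable as real advertisers.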