A deep commitment to data analysis is key

Companies have more and more data at their disposal, which has opened up a new battlefront for competition, and as I wrote back in November it is increasingly true that The best companies are analytical.

I’m returning to that theme today having read a post on the affiliate blog Shoemoney about Scaling Facebook campaigns.  The point of the post (you guessed it) is that more than ever before the winners will be those who are the most analytical and data-obsessed – specifically through testing more combinations of campaigns and targeting than ever before.

Two quotes from the post:

Making more ads with little changes can have a big impact. Normally in PPC Engines, I was so used to just split testing 2-3 ads at a time and letting those run that I had to change my mindset. With Facebook, you can run literally hundreds of ads at the same time, going to the same offer.

and

In my experience, most newbies target very little. So, target more specific things. Remember, people in one place in life might convert differently than another. People at one workplace might convert differently than another. Test them all… ages, geographical location, sexual preference, etc. People at x might convert different than Y. Most people group them all into one large lump of demographic targeting. If the ad is not profitable, you kill entire sub sections which might have been profitable. This might take time, but it’s worth the effort. You can put a small budget on each test to maximize your testing efforts.
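To make that last point concrete, here is a minimal Python sketch with invented spend and revenue numbers: judged as one demographic lump the campaign looks like a loser, but split into sub-segments one of them is clearly worth keeping.

```python
# Hypothetical spend/revenue for one demographic "lump" split into
# sub-segments -- all numbers are invented for illustration.
segments = {
    "age_18_24": {"spend": 50.0, "revenue": 20.0},
    "age_25_34": {"spend": 50.0, "revenue": 95.0},
    "age_35_44": {"spend": 50.0, "revenue": 25.0},
}

# Judged as one lump, the campaign looks unprofitable and would get killed.
total_spend = sum(s["spend"] for s in segments.values())
total_revenue = sum(s["revenue"] for s in segments.values())
print(f"lump ROI: {total_revenue / total_spend:.2f}")  # 0.93 -> kill?

# Segment-level analysis shows where the profit actually is.
for name, s in segments.items():
    roi = s["revenue"] / s["spend"]
    print(f"{name}: ROI {roi:.2f} -> {'keep' if roi > 1.0 else 'kill'}")
```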

Advertising arbitrage businesses have, perhaps unsurprisingly, led the way in mining data to maximise results, but the same approach can pay dividends in any online business.  For a SaaS or consumer internet company a little up-front work is required to establish the goals of the business, identify the site metrics that support those goals, and perhaps write the code to track the data, but once that is done the monitoring and iteration approach described above can be applied.
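As a rough illustration of that goal-to-metric plumbing, here is a toy sketch in Python – the goal name, event names, and traffic numbers are all hypothetical placeholders, and a real system would persist events to an analytics store rather than count them in memory.

```python
from collections import Counter
from datetime import date

# Business goal -> supporting site metrics (a hypothetical mapping).
GOAL_METRICS = {
    "grow_paid_subscriptions": [
        "signup_started", "signup_completed", "upgraded_to_paid",
    ],
}

events = Counter()

def track(event_name: str) -> None:
    """Record one occurrence of a site event."""
    events[event_name] += 1

# Simulated traffic for one day.
for _ in range(200):
    track("signup_started")
for _ in range(80):
    track("signup_completed")
for _ in range(12):
    track("upgraded_to_paid")

# Daily report: the funnel behind the business goal.
for goal, metrics in GOAL_METRICS.items():
    print(f"{date.today()} -- goal: {goal}")
    previous = None
    for m in metrics:
        count = events[m]
        rate = f" ({count / previous:.0%} of previous step)" if previous else ""
        print(f"  {m}: {count}{rate}")
        previous = count
```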

What I’m describing is making the iterate-measure-analyse-iterate loop a (maybe the) core process.  That is very different from collecting data, even reams of data, and then performing ad hoc analyses and series of one-off A/B tests.
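To sketch the difference, here is the loop written as a standing process rather than a one-off test: each round derives a challenger from the current champion, measures both, and promotes the winner.  The variants, conversion rates, and "mutate" step are simulated stand-ins for real traffic and real creative changes.

```python
import random

def measure(variant: dict, visitors: int = 1000) -> float:
    """Simulate measuring a variant's conversion rate on live traffic."""
    # In reality the true rate is unknown; here it drives the simulation.
    conversions = sum(random.random() < variant["true_rate"] for _ in range(visitors))
    return conversions / visitors

def mutate(variant: dict) -> dict:
    """Derive a challenger by tweaking the champion (placeholder logic)."""
    new_rate = max(0.001, variant["true_rate"] + random.uniform(-0.01, 0.02))
    return {"name": variant["name"] + "'", "true_rate": new_rate}

champion = {"name": "A", "true_rate": 0.05}
for round_no in range(5):
    challenger = mutate(champion)      # iterate: create a variation
    c_rate = measure(champion)         # measure both on live traffic
    ch_rate = measure(challenger)
    if ch_rate > c_rate:               # analyse: promote the winner...
        champion = challenger          # ...and iterate again from it
    print(f"round {round_no}: champion {champion['name']} "
          f"(measured {max(c_rate, ch_rate):.3f})")
```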
