Wednesday, January 10, 2018

Overview of a Data Management Platform

I'm a numbers guy, so I'm going to share some statistics. The Data Management Platform (DMP) market is expected to reach ~USD 3 billion by the end of 2023, growing at approximately 15% over the forecast period of 2017–2023, according to this report. Another statistic, from a report published in August 2015 (a lot has changed since then), is that only 50% of companies were using a DMP at the time. This tells me that there is still a lot of juice to be squeezed when it comes to companies investing in a DMP. So, what is a Data Management Platform?

As per Wikipedia, a data management platform (DMP) is a centralized computing system for collecting, integrating and managing large sets of structured and unstructured data from disparate sources. My definition is slightly different, and this is how I define it.

A data management platform (DMP) allows marketers to integrate multiple data sources such as 1st party (online, email, media, offline/CRM), 2nd party (partner) and 3rd party (provider) data into a centralized system typically leveraged for digital advertising media activation and optimization.


Source: https://upload.wikimedia.org/wikipedia/commons/9/9b/DMP-diagram3.jpg

Now that we have the definition out of the way, let's look at a few advantages and features that a DMP offers:
  • Integrate and centralize data: A DMP leverages a common identifier which allows us to integrate multiple data sources. First party data, such as a client's own onsite data, is captured using a device/cookie ID, which can be integrated with campaign media data typically tracked by adding tracking pixels to digital advertising banners. This data can be further integrated with offline CRM data, where a user ID captured on sign-in can be extended by bringing additional user metadata into the DMP via an offline file (see the identity sketch after this list). A DMP also allows marketers to purchase additional 2nd party and 3rd party data to bring new qualified users (cookies/devices) into the DMP.
  • Maximize marketing spend: A DMP allows digital marketers, advertisers (most common) and publishers to leverage integrated data across multiple sources and cross-device channels. Marketers are able to take data from multiple sources and share it with outbound platforms called Demand Side Platforms (DSPs), such as Adobe Media Optimizer, DoubleClick Bid Manager etc. A DSP allows advertisers to buy media to run retargeting or other campaigns based on data which may be a combination of online first party data and a provider's 3rd party data shared out via the DMP. A common use case is to retarget users who've added an item to their shopping cart but left without purchasing. With this, marketers aim to drive conversions which probably wouldn't have happened otherwise.
  • Deliver a personalized experience based on integrated data: Testing and personalization tools such as Adobe Target have the ability to take first party onsite data and offline CRM data from the DMP to run personalization tests. An example use case is a personalization campaign combining demographic data such as gender, location or income with onsite behavior tracked in the DMP, serving an experience customized per visitor.
  • Display consistent advertising to users across devices: A DMP is also able to stitch together users traversing across devices using a hashed email address (most accurate; see the sketch after this list), IP address and location. As users log in, a DMP is able to identify the user across devices as well as on the same device even if the user is not logged in, as a profile of the visitor is created. This feature allows marketers to deliver a consistent message or retarget the same users across devices, which can be one of the objectives of a campaign.
  • Audience extension via data providers: A DMP has partnerships with different 3rd party data providers such as Acxiom, Dun & Bradstreet etc. who sell user demographic, psychographic and other offline behavioral data. These data providers either charge a flat fee or bill based on a CPM. Advertisers can build test segments in the DMP to see how many new prospects a 3rd party data set adds before targeting more users for their campaigns. As an example, marketers planning a campaign to sell high-end laptops might want to target technical professionals between the ages of 25 and 35. This data wouldn't be available in their first party data set, so they can purchase it from a data provider and combine it with onsite data in the DMP for a prospecting campaign.
  • Leverage lookalike models: Lookalike models or algorithms are used to identify similar audiences from a benchmarked audience segment. An example use case might be to create a lookalike from a base/benchmark segment of users who’ve converted and run the model against a bunch of 3rd party audiences to find new users who exhibit similar conversion behavior.
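
To make the identity and cross-device pieces above more concrete, below is a minimal sketch of how a site might hash a logged-in user's email on the page and pass it to a DMP collection endpoint as a declared ID. Note that the endpoint URL, the d_cid parameter and the dmpSync function name are all made up for illustration; real platforms such as Adobe Audience Manager expose their own collection APIs.

// Minimal sketch: hash an email and send it to a hypothetical DMP
// collection endpoint as a declared ID for cross-device stitching.
async function dmpSync(email) {
  // SHA-256 hash the normalized email so no raw PII leaves the page
  const bytes = new TextEncoder().encode(email.trim().toLowerCase());
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  const hashedId = Array.from(new Uint8Array(digest))
    .map(b => b.toString(16).padStart(2, "0"))
    .join("");
  // Fire a pixel-style request carrying the hashed ID
  new Image().src = "https://dmp.example.com/event?d_cid=" + hashedId;
}
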
There are other features that a DMP offers such as testing audiences, overlap reporting, frequency capping reporting etc. but they are supplemental to the main features already covered in this article. I will discuss these in some capacity in the future.  

Based on all the advantages which a DMP offers, it’s imperative for organizations to invest not only in a DMP but also in resources who know how to leverage it properly so that they get the best return on investment.

Sunday, January 7, 2018

Back Again!

After a (really) long break, I've decided to get back to blogging. This time, I'll be writing more about data management platforms and their role in the digital advertising and analytics space, as well as other skills I've picked up during this time. I picked up data management here at Adobe, where I'm focused on supporting our clients on Adobe Audience Manager among other Adobe products. I will try to get a blog post out every other week, time permitting, and I'm looking forward to getting out there and sharing what I know with everyone.

Tuesday, June 16, 2009

Web Analytics Implementation Process

Web Analytics in an organization should follow a development cycle, starting from requirement gathering and ending with validation. Below is a visualization of an ideal Web Analytics process. This process is best suited for tech organizations which already have defined KPIs and regular weekly/monthly releases of new features on their website.


1) Requirement Gathering: This is the start of the Web Analytics process, and it deals with an Analyst collecting tracking requirements from stakeholders. This step also involves reviewing the feature specifications of new items that are part of a release cycle. An example of a new feature can be a new page being added to the website, a new outgoing/external link, or even an A/B Test.
2) Creating a Tracking Plan: Once all the requirements have been gathered, the Analyst will create a Tracking Plan/Analytics Plan/Solution Design document to define the variables for Web Analytics vendor tools (custom variables, pagename variables etc.) like Omniture SiteCatalyst, WebTrends, ClickTracks or Google Analytics. This is usually an Excel document containing a matrix of all the variables and their corresponding values.
3) Development: In this step, the Analyst will usually work alongside a developer to get the features implemented on the website (a minimal example snippet follows this list). This step also requires the Analyst to assist the developer with any questions she has regarding the Web Analytics code or the Tracking Plan. This applies especially to new developers who do not understand the Web Analytics snippet.
4) Data Validation: This step deals with the QA/testing of the Web Analytics data that lands in the Web Analytics tool. I have written a comprehensive article detailing the importance of this step, as it is a separate process in itself.
5) Reporting/Analysis/Recommendations/Next Steps: After the data is found to be clean, it is the responsibility of the Analyst to report the numbers resulting from the feature which went live during the previous release cycle. The Analyst will also provide analysis (explaining the data, conversion etc.) and possible recommendations/next steps to further improve the website.
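
As an illustration of steps 2 and 3, a single row of the tracking plan might translate into an on-page snippet like the one below. This is a minimal sketch using Omniture SiteCatalyst conventions and assumes the standard s_code.js file is already included on the page; the variable values are made up for the example.

<script type="text/javascript">
  s.pageName = "products:laptops:detail"; // pagename variable from the tracking plan
  s.channel  = "products";                // site section
  s.prop1    = "en-us";                   // custom variable, e.g. site language
  s.events   = "prodView";                // success event for a product view
  var s_code = s.t();                     // send the image request to the vendor
  if (s_code) document.write(s_code);
</script>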

This, in my opinion, is an ideal end-to-end process which organizations should follow to manage Web Analytics. It is vital for a big organization to incorporate these steps in its overall plan for Web Analytics to ensure smooth functioning.

Monday, June 1, 2009

My take on 404 Error Page Naming and Analytics

‘404 Error Pages’ are the pages displayed when someone is not able to find a link/URL on a website. There are usually two ways by which one can land on the 404 page:

1) Typing in the wrong URL: If a visitor has typed a wrong URL, by default he will see a ‘The page cannot be found’ page if there is no custom 404 page on the website. Below is a screenshot of such a page.

In order to fix this, the best practice is to create a custom 404 page which will be shown to visitors who try to access a page that has either been removed or doesn’t exist. This 404 page should contain links to the most important pages of your website and will play an important role in re-engaging visitors with your website. You can also create 404 pages with a funny message. Some examples of such pages can be found here.


2) Deleted or moved links: The same default page mentioned above will appear in case a visitor clicks on a link/page that has either been deleted or moved to a new location.

To fix this, implement 301 redirects which send visitors to the page’s new location.

As far as Web Analytics tracking is concerned, it is vital to accurately track how many people are looking at the 404 page and which URLs they were looking for. The method explained below will help you track 404 pages efficiently (tracking impressions on the 404 page and the incorrect URL) through Omniture and Google Analytics.

1) Adobe Analytics: Capture the incorrect URL (via the JavaScript property document.location.href) in the s.pageName variable and prefix it with ‘404’ as shown below.
s.pageName="404:"+document.location (E.g. If the incorrect URL is http://www.undp.org/ss, then the pagename variable will capture it as ‘404:http://www.undp.org/ss’. This naming structure helps in gauging the amount of traffic going to incorrect pages as well as fixing broken links. Similarly pathing can be performed on the error page to find the flow of traffic to and from this page.
• Another mandatory variable which should be populated on error pages is s.pageType, which should be set as s.pageType="errorPage".
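
Putting both variables together, a minimal sketch of the 404 page code could look like the following (assuming the standard s_code.js file is already included on the page):

<script type="text/javascript">
  s.pageName = "404:" + document.location.href; // incorrect URL prefixed with '404:'
  s.pageType = "errorPage";                     // flags this page as an error page
  var s_code = s.t();                           // fire the tracking request
  if (s_code) document.write(s_code);
</script>
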
Below is a screenshot of the UNDP 404 page using a similar Omniture snippet.



2) Google Analytics: Capture the incorrect URL in the _trackPageview function as shown below:
pageTracker._trackPageview("404:" + document.location.href); (E.g. if the incorrect URL is http://seattleindian.com/seattle/xyz.asp, then the value captured in the ‘utmp’ parameter will be ‘404:http://seattleindian.com/seattle/xyz.asp’.)
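
For completeness, a minimal sketch of the full snippet on the 404 page could look like the following (the ‘UA-XXXXXX-1’ account ID is a placeholder, and the standard ga.js file is assumed to be loaded):

<script type="text/javascript">
  var pageTracker = _gat._getTracker("UA-XXXXXX-1");           // placeholder account ID
  pageTracker._trackPageview("404:" + document.location.href); // log the bad URL
</script>
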
Below is a screenshot of the SeattleIndian 404 page using a similar Google Analytics snippet.


Below are some advantages of implementing a custom 404 page on your website:

1) Engaging visitors with pivotal pages of your website: If your 404 error page has links to the important pages of your website, users can be routed to them, thereby increasing user engagement. You should also add a link to the sitemap page and a search box.
2) Leveraging Web Analytics to optimize your website: You can utilize Web Analytics tools to analyze the 404 URLs which users type and fix broken links on your website.
3) Reduces user frustration: Creating a custom 404 page eases the frustration users feel when they are not able to find what they were looking for.

Monday, May 25, 2009

Small change with a huge impact

Recently I was involved in changing the layout of a website and measuring the impact of that change. We modified the Top navigation on this website, changing the color of one link to red/bold. It was a very minor change with respect to the whole website, as the Top navigation menu contributed less than 5% of user engagement on the website. We wanted to make this change to enhance the Top navigation and entice more clicks on the edited link (Coupons). P.S. We leveraged Google Analytics to measure this change. Below is a screenshot of the previous Top navigation menu:


After a week, I pulled the ‘Top Content’ report and filtered on the Top menu Coupons link. I was pleasantly surprised by the results. There was a 65% increase in user engagement (clicks) on the Top navigation Coupons link, clearly due to changing the link color to red. In the context of the website, this page accounts for only a small proportion of traffic, but this change has paved the way for similar changes which can be replicated on other pages in the future. Below is a screenshot of the change we made to the Coupons link:


Immediately after noticing this change I sent a tweet in excitement: ‘Wow! Top menu navigation link text change resulted in over 60% increase in user clicks. Changed the font of the link to Red/Bold #ga #wa’. Surprisingly, I got a response from a Twitter user: ‘So funny. We also changed the nationalgeographic.com top nav to red/bold in Dec 07 for commerce promo. It stuck.’ The user mentioned that National Geographic made a similar change back in 2007 and they too noticed an increase in clicks. Isn’t it coincidental?

Going forward we plan to replicate the same exercise on the Side menu. We will also be performing A/B Tests on the Top navigation menu, comparing it with newer menus. P.S. It is always a good practice to add a query string parameter to the URL, e.g. ‘menu=top’ (http://www.seattleindian.com/seattle/indian-restaurant-coupons.asp?menu=top), to distinguish clicks coming from the Top navigation link.

Hope you like this article. Please comment and let me know if you’ve done similar exercises and noticed a considerable impact.

Sunday, May 17, 2009

My take on Social Media Analytics

Social Media is one topic which I haven’t written about in the past. I am still coming to grips with Social Media, as there’s so much to learn, but I am convinced that it is something which just cannot be ignored. I say this because websites like Facebook with 200 million active users (Source: Facebook), LinkedIn with 39 million users as of May 2009 (Source: Wikipedia) and Twitter with 7 million users as of February 2009 (Source: Nielsen) are at the peak of their popularity, with more users being added by the hour. P.S. The population of the United States is 304 million as of May 2009. In this article I will write about my experience with Social Media Analytics and the tools used to track it. Twitter will be covered in detail.

Social Media measurement:

1) Twitter Campaign Analytics: It is very easy to measure traffic from Twitter with campaign tracking parameters. There is an article which I wrote on Email Marketing which touched on the concept of campaign tracking. Here is an example of a Twitter campaign URL which can be measured in Google Analytics: http://rkapoor.blogspot.com/?utm_source=twitter&utm_medium=googleurlbuilder&utm_campaign=socialmediaarticletweet. This URL was created with the Google Analytics URL Builder. Once we start getting traffic from Twitter, these campaigns can be analyzed against Web Analytics metrics like Bounces, Time Spent on Site, Visitor Type and Conversion.

2) Short URL Analytics: There are also various short URL websites which can shorten the campaign URL to be placed on websites like Twitter (the maximum tweet length is 140 characters, so short URLs are very efficient). One of these websites is Clop.in, which offers an interface where you can easily create campaigns for Google Analytics, Omniture and WebTrends. This tool will let you create a custom campaign URL string and shorten it. I leveraged it to shorten my Twitter campaign URL to http://clop.in/PWGCto. Other short URL websites are TinyURL, Cli.gs etc. Some of these websites will also report the total clicks, referrers and location of the users who clicked on the short URL, along with measuring metrics on the destination website.

3) Reputation tracking tools: A new variety of tools has appeared which allow companies to analyze what consumers are saying about their brand (customer sentiment). I tried my hand at Radian6 and SM2, which offer a great interface built out of data captured from blogs, forums, review websites, microblogging and news websites. These results are captured based on the keywords which users or companies search for. SM2 has a great graphical interface, whereas Radian6 shows keyword search results via widgets. For example, Microsoft can search for ‘Zune’ and analyze consumer feedback/mood for this product. The possible data points available for this search keyword include a segmentation of media type by location, sentiment and engagement. Based on this data, companies could even contact sources/people that have a negative sentiment about the brand to improve their reputation. Other reputation tracking tools are BuzzLogic and BuzzMetrics.

4) Twitter specific tools: As Twitter is the hottest form of Social Media today, companies are putting in more effort to learn about their competition and the most popular trends on Twitter in real time. Some tools which I have used and which report data from Twitter are TweetVolume (a competition comparison tool) and Twitscoop (the current buzz on Twitter). There are some websites like TweetValue which even assign a price to your Twitter profile and offer to buy your Twitter account (I haven’t really tried selling it yet).

5) Widget Analytics: Another form of Analytics which comes under the Social Media umbrella is Widget Analytics. A widget is a snippet of code which can be embedded in web pages and used to display videos, ads, news, weather etc. Widget integration has recently exploded across social networking sites, blogs and ecommerce websites. Some metrics which can be measured on widgets are clicks, impressions, install conversion, widget stickiness, installs by country and time spent on the widget. Some tools which provide widgets and an interface to measure them are Gigya and Clearspring.

In this article I have shared my experience with Social Media measurement, but there are lots of other tools which I haven’t used or mentioned. Please let me know if I should add any more tools, as many might have been missed. I will be writing more about Social Media in the future.

Sunday, April 26, 2009

Should a Web Analyst have development skills? - Part 2

I wrote an article in 2007 about Web Analysts having development skills, and my conclusion was that having basic skills would be an add-on. I got a comment from a reader who thought that analytics skills alone suffice, and I somewhat still agree with him. I have seen a lot of Web Analysts who are perfect at analyzing data but don’t have development skills. They tend to do very well in their usual job but lack context pertaining to the implementation of code. This article will cover what I’ve learnt about the Web Analytics job market since then and what makes an ideal Web Analyst.

I have been analyzing the Web Analytics job market and have noticed that almost 90% of the listed profiles mention Web languages like HTML, JavaScript or Flash. (P.S. I had predicted such a trend.) This wasn’t so common a couple of years back, when companies mostly looked for people who were simply ‘Analysts’. In my opinion, it is very important to know how the Web Analytics code works and the technology behind capturing data. My take on Web Analytics data capture is explained here. In one of my assignments, I was involved in configuring the Web Analytics tracking code to track features which were not possible through the generic snippet. A JavaScript wrapper had to be created and added to the code (a minimal sketch follows below). This asset helps a Web Analyst stand out and is often the path to rising in the organization as a multi-talented contributor. Apart from knowing programming, it is also helpful for a Web Analyst to know basic SQL, as most companies have an in-house reporting system from which data might need to be extracted for analysis.
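
As an illustration, a minimal sketch of such a wrapper around the SiteCatalyst snippet might look like the following. The trackPage function name and the default channel value are hypothetical; it assumes the standard s_code.js file (and hence the global s object) is already loaded on the page.

// Hypothetical wrapper around the standard SiteCatalyst 's' object
function trackPage(pageName, channel) {
  s.pageName = pageName;          // friendly page name from the tracking plan
  s.channel = channel || "main";  // default the site section if none is passed
  var s_code = s.t();             // fire the page view image request
  if (s_code) document.write(s_code);
}

// Usage: trackPage("home", "landing pages");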

We can have hours of discussion on whether the above skills are really necessary for a Web Analyst, but based on the current market situation, extra skills beyond analysis will be more than useful. Below are a few skills which I think make a very good Web Analyst, in order of priority:

1) Analytical skills, drawing conclusion from data and offering recommendations to improve the business (Presentation skills and Excel knowledge included)
2) Client interaction and excellent interpersonal skills (Requirement gathering and building relationships)
3) Statistics knowledge (Ensuring whether data is ready for analysis and concepts like Confidence level)
4) Basic Programming skills (Understanding Web Analytics code and ability to enhance it)
5) Basic SQL skills (Ability to pull data from the backend databases if necessary)

I would appreciate it if you could share your views, whether you agree or disagree with this article.