When you open your analytics package of choice and pull a report, the topline metrics will no doubt include the number of visits and unique visitors. These basic concepts seem obvious: when a person lands on your page, it counts as a visit, and if it’s their first time, that’s a unique visitor. But with complex analytics implementations, especially those involving multiple platforms that handle, for example, web metrics, testing, and segmentation, the simplicity of these definitions begins to blur.
This is particularly confounding for users of Adobe Target (formerly Test&Target) and Adobe Analytics (formerly SiteCatalyst), which are, presumably, part of the same platform. There are, however, several important reasons for these discrepancies and, fortunately, a few ways to handle them.
Different Tools, Different Methodologies
A fundamental reason for differences in reported data is the way in which Target and Analytics function at the page level. Target uses the Mbox system: an HTML element with a special class, activated by a JavaScript call, that requests content from Target’s servers. Analytics uses on-page and linked JavaScript that reports data through a tracking pixel (also called a “web bug” or tag), an image request sent to Adobe’s data collection servers. The Mbox method makes it easy for Target to serve offers to specific parts of a webpage, while the tracking pixel method allows Analytics to pass data to Adobe’s data collection layer. The key takeaway, however, is that Target and Analytics use two concurrent but separate actions to record visitors and other metrics.
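For illustration, here is roughly what the two mechanisms look like on a page running the legacy mbox.js and s_code.js libraries; the Mbox name and page name are placeholders, and real implementations vary:

```html
<!-- Target: an Mbox wrapping default content, activated by a JavaScript call -->
<div class="mboxDefault">
  <h2>Default headline, shown if Target does not respond in time</h2>
</div>
<script type="text/javascript">
  // Legacy mbox.js call; "homepage-hero" is an illustrative Mbox name
  mboxCreate('homepage-hero');
</script>

<!-- Analytics: on-page JavaScript that builds and fires the tracking-pixel request -->
<script type="text/javascript">
  s.pageName = 'home';                 // illustrative page name
  var s_code = s.t();                  // s.t() returns an <img> tag pointing at Adobe's collection servers
  if (s_code) document.write(s_code);  // writing the tag fires the request
</script>
```

Each snippet makes its own server call, which is why the two tools can disagree when one call succeeds and the other fails.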
Ideally, these actions would fire flawlessly and almost simultaneously. In practice, however, slow-loading pages, non-standard implementations, and conflicts with other tools can create situations in which one action fires and the other does not, or does not fire as consistently.
Inconsistent Implementation
Another common problem arises when the various tools are not implemented evenly across a site. Typically, the implementation of Analytics is consistent and complete, but this is not necessarily the case for Target. Without Mboxes on the page, Target simply cannot count visitors, an issue that becomes more common on complex websites or when testing managers disable Mboxes to reduce unnecessary server calls.
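If you suspect uneven coverage, one quick spot-check (again assuming the legacy mbox.js library) is to open the browser console on a suspect page and look for the Target library and its Mbox containers:

```js
// Browser-console spot-check, illustrative only:
// is the Target library loaded, and how many Mbox containers does this page have?
console.log('mbox.js loaded:', typeof mboxCreate === 'function');
console.log('Mbox containers:', document.querySelectorAll('.mboxDefault').length);
```

A page where Analytics fires but no Mboxes exist will show up in Analytics traffic yet remain invisible to Target.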
Different Tools, Different Definitions
For the purposes of Target, the visitors count records the number of users who have entered a test. Sometimes, as when a test only includes a certain fraction of total traffic, this number differs from the one recorded by Analytics, which measures overall site traffic. If a test targets 50 percent of visitors, for example, Target may report roughly half the visitor count that Analytics does for the same pages.
In addition, session durations can differ across the two tools, altering what counts as a “unique visitor.” In typical Target implementations, the “new” or unique segment includes visitors who are visiting the site for the first time, for the first time since clearing cookies, for the first time since an Mbox was installed on the page, or after two weeks have passed since their last visit. In Analytics, however, the new segment includes visitors who are visiting the site for the first time, or for the first time since clearing cookies, and that status persists for up to one year.
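To make the difference concrete, here is a minimal sketch of the two expiration windows described above; the function and window lengths are illustrative, not actual product code:

```js
const DAY_MS = 24 * 60 * 60 * 1000;

// Returns true when a visit would fall into the "new" segment, given the
// timestamp of the last recorded visit (null if no cookie exists) and a
// tool's expiration window in days.
function isNewVisitor(lastVisitMs, windowDays, nowMs = Date.now()) {
  if (lastVisitMs === null) return true;             // first visit, or cookies cleared
  return nowMs - lastVisitMs > windowDays * DAY_MS;  // window has lapsed for this tool
}

const lastVisit = Date.now() - 30 * DAY_MS;          // visitor returning after 30 days

console.log(isNewVisitor(lastVisit, 14));   // true:  outside the ~two-week Target window
console.log(isNewVisitor(lastVisit, 365));  // false: inside the one-year Analytics window
```

The same returning visitor is “new” to one tool and a repeat visitor to the other, so the unique visitor counts drift apart.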
Unique visitors, too, can differ when conversion markers are set in Target. In this scenario, a visitor who converts and then returns later is counted again as a unique visitor. Analytics, by contrast, will not consider the return visit unique.
Dealing with Discrepancies
Knowing that some discrepancies are inevitable, it’s important to have a strategy for dealing with them. Three essential practices will help ensure you’re generating the best, most accurate analysis.
1. Trust in Trends
Though the exact numbers each tool reports may differ, the trends should be very close, if not identical. If Target records a 10 percent increase in page views, for example, that 10 percent trend should be reflected in Analytics. When these trends fall out of sync, it’s likely a sign of a larger issue in data capture or reporting.
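One simple way to check is to compare period-over-period percent changes rather than raw counts. A minimal sketch, with made-up numbers purely for illustration:

```js
// Percent change between two periods for a single metric
function percentChange(previous, current) {
  return ((current - previous) / previous) * 100;
}

// Illustrative weekly page-view counts from each tool
const target    = { lastWeek: 40000, thisWeek: 44000 };
const analytics = { lastWeek: 90000, thisWeek: 99000 };

const targetTrend    = percentChange(target.lastWeek, target.thisWeek);       // 10
const analyticsTrend = percentChange(analytics.lastWeek, analytics.thisWeek); // 10

// The raw counts disagree, but the trends match; a large gap between the
// two trends is the signal worth investigating.
console.log(Math.abs(targetTrend - analyticsTrend) < 2 ? 'in sync' : 'investigate');
```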
2. Compare Like Data
The second practice is to avoid mixing data from different sources during analysis. This is especially important when you start calculating derived metrics like revenue per visitor, particularly if your tools are reporting different visitor counts and average order values.
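A quick sketch of why this matters, again with made-up numbers: dividing revenue from one tool by visitors from another bakes both tools’ discrepancies into a single misleading metric.

```js
// Illustrative figures only; each tool reports its own revenue and visitors
const analytics = { revenue: 100000, visitors: 50000 };
const target    = { revenue: 90000,  visitors: 40000 };

// Like with like: numerator and denominator from the same source
const rpvAnalytics = analytics.revenue / analytics.visitors; // 2.00
const rpvTarget    = target.revenue / target.visitors;       // 2.25

// Mixed sources: Analytics revenue over Target visitors
const rpvMixed = analytics.revenue / target.visitors;        // 2.50, matching neither tool

console.log({ rpvAnalytics, rpvTarget, rpvMixed });
```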
3. Calibrate, Audit, and Move On
It’s important to understand data discrepancies, but ultimately you cannot allow them to cripple your analysis. Choose a primary analytics platform, calibrate it carefully, verify it against other tools or data as necessary, and continually audit it for accuracy. Once these efforts have been made, it’s time to accept the results and move on.
Dealing with data discrepancies is a confounding, sometimes frustrating experience. By understanding the causes of the differences, however, we can arrive at a point at which accurate, trustworthy data drives better decisions.
—
Brooks Bell helps top brands profit from A/B testing through end-to-end testing, personalization, and optimization services. We work with clients to effectively leverage data, creating a better understanding of customer segments and leading to more relevant digital customer experiences while maximizing ROI for optimization programs. Find out more about our services.