Campaign Tracking Without Going Crazy: Keeping Order in AdWords Optimization

Anthony Coraggio

The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Pay-per-click advertising generates vast amounts of data, which presents us with tremendous potential for optimization and success. However, this formidable sword cuts both ways—even skilled managers can quickly find themselves adrift if tests and changes are not carefully tracked. Here’s a quick, actionable guide to keeping order in your AdWords account with a simple and professional activity log.

The philosophy of orderly management

Good AdWords management is an exacting science—every tweak and change should be made for a specific reason, with a particular goal in mind. Think in terms of the scientific method: we’re always moving forward from hypothesis, to test, to result, and back again.

When it comes time to evaluate the results of these changes and iterate to the next step, it’s very important to know exactly what changes were made (and when). Likewise, when the numbers break unexpectedly, it’s vital to be able to eliminate as many variables as possible, as quickly as possible, in our analysis.

To do that, we need a system that defines when and where these changes happened, and clearly explains the nature of each change. Beyond that, we also need to keep it user-friendly, for two very important reasons. First, many of us operate in collaborative environments, so this information needs to be readily accessible to the teammates, supervisors, and clients who may need it. Second, it’s vital to remember that the most elaborate, brilliantly detailed tracking plan is useless if you don’t actually use it consistently. To get started building a good system, let’s take a look at the tools we have at hand.

Tools of the trade

AdWords changelog

The first and most obvious tool that comes to mind is the AdWords native changelog, but in most cases it should be viewed as a tool of last resort. Anyone who has had to dig through that information line by line trying to diagnose an issue will tell you it’s less than optimal, even with the improved filtering options Google has provided. The crux of the issue is that there is no indicator of intent—why was the change made? Was it a deliberate part of a test? What other changes were made as part of the same move?

That said, the changelog can be a handy feature when it comes to quick refreshers on a former budget cap or tracing a trend in bids—especially when downloaded to Excel. Just don’t rely on it for everything!
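
If you do pull the changelog into a spreadsheet, a little scripting can make the digging less painful. Below is a minimal sketch using pandas to slice a downloaded export by user, campaign, and date range; the file name and column names ("Date", "User", "Campaign") are assumptions, so match them to whatever your export actually contains.

```python
# A rough sketch of filtering a downloaded change history export with pandas.
# The file name and column names below are assumptions; adjust them to match
# the actual export from your account.
import pandas as pd

log = pd.read_excel("change_history_export.xlsx")
log["Date"] = pd.to_datetime(log["Date"])

# Pull every change one user made to one campaign in a given window,
# e.g. to trace a bid trend or confirm a former budget cap.
subset = log[
    (log["User"] == "user@example.com")
    & (log["Campaign"].str.contains("Brand", na=False))
    & (log["Date"].between("2024-09-01", "2024-09-30"))
]
print(subset.sort_values("Date"))
```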

Google Analytics annotations

This is our second UI option, and a key one. Obviously it isn’t in AdWords itself (though that would be a lovely feature), but if you spend even half your time in online marketing, chances are you’ve already got GA open in a second tab or window! If you commit the effort to nothing else, do it for this. Placing annotations for major changes or tests doesn’t only help you—it provides a touchpoint for anyone else who might need to look into traffic ups and downs, and it can save hours of time in the future. Note that I said "major"—remember that this is a shared system, and you can easily swamp it if you get too granular.

Spreadsheets

This is where most of my logs go, as proper coding and some simple filtering make it a breeze to find the information you need quickly. I’ll get into more detail on practical usage below, but basically this is where the when/where/why goes for future reference. My preference here is usually Google Sheets for the simple collaboration features, but you can do just as well with a shared Excel file on OneDrive.
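
If you go the Google Sheets route and want to script entries rather than type them by hand, here’s a minimal sketch using the third-party gspread library. The credentials file, spreadsheet name, and row contents are placeholders, not a prescribed format.

```python
# A minimal sketch of appending an activity log row to a shared Google Sheet
# via gspread (pip install gspread). The credentials file and sheet name are
# placeholders for your own setup.
from datetime import datetime

import gspread

gc = gspread.service_account(filename="creds.json")  # service account credentials
log = gc.open("PPC Activity Log").sheet1

# One row per change: ID, timestamp, who, what, why.
log.append_row([
    "0142",
    datetime.now().isoformat(timespec="minutes"),
    "Anthony",
    "Paused Campaign Y; budget shifted to Campaign Z",
    "Per client call",
])
```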

Project management tools

Keeping your test tracking connected to and aligned with your project management tools is always wise. There are myriad project management tools out there, but I favor agile PM for SEM applications—Trello, Jira, Mingle, Basecamp, and more are all useful. The key is that your activity and test logs are easily available wherever you keep project resources, and linked from whatever cards or items are associated with a particular test. For example, if you have a task card titled “Client-128: A/B Ad Test For {Campaign>Ad Group}”, note “per task Client-128” in your activity log and link directly to that card if your tool permits it. You can also link to the activity log from the card or a project resource file if you’re using a cloud sheet such as Google Sheets.

Creating a system & putting it all together

Now you know all the tools—here’s how to put them together. To get you started, there are two primary areas you’ll want to address with your activity log: ongoing changes/optimizations, and major planned tests.

Tracking ongoing changes: the standard activity log

The standard activity log is your rock. It's the one place where the hundreds of changes and thoughts the human brain could never hope to perfectly recall will always be recorded, ready to answer any question you (or your client, or your boss) might come up with down the line. An activity log should, at minimum, tell us the following:

  • What happened?
  • When did it happen?
  • Who was involved?
  • Why did it happen?

If I notice an inflection point on a particular graph starting on 9/28 and need more information, I should be able to go back and see that User X paused out Campaign Y that morning, because they had spoken with the client and learned that budget was to be shifted out to Campaign Z. Instant context, and major time saved! If I want to know more, I know who to ask and how to ask the right question for a quick and productive conversation.

Ongoing optimizations and relatively small changes can stack up very quickly over time, so we also want to be sure that it’s an easy system to sort through. This is part of why I prefer to use a spreadsheet, and recommend including a couple columns for simple filtering and searching. Placing a unique sequential ID on every item gives you a reliable point of return if you muddle up the order or dates, and a note indicating the type and magnitude of the change makes searching for the highlights far easier.

Anything you can do with your chosen tool to simplify and speed up the process is fair game, as long as you can reasonably expect others to understand what you’ve put in there. Timestamp hotkeys and coded categories (e.g. “nkw” denoting a negative keyword expansion) in particular can save headaches and encourage compliance. Finally, always keep your logs open. It’s easy to forget early on, and having to click through just a few extra steps to open them back up when you’re in the zone can be a bigger obstacle than you might expect!
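
To make this concrete, here’s a minimal sketch of a local CSV log that auto-numbers and timestamps each entry and uses short category codes like the “nkw” example above. The file name, field names, and codes are illustrations only; shape them to fit your own workflow.

```python
# A rough sketch of a local CSV activity log: sequential IDs, timestamps, and
# coded categories (e.g. "nkw" = negative keyword expansion). File and field
# names are illustrative only.
import csv
import os
from datetime import datetime

LOG_FILE = "activity_log.csv"
FIELDS = ["id", "timestamp", "user", "category", "change", "reason"]

def log_change(user: str, category: str, change: str, reason: str) -> None:
    """Append one change to the log with an auto-incremented ID and timestamp."""
    new_file = not os.path.exists(LOG_FILE)
    if new_file:
        next_id = 1
    else:
        with open(LOG_FILE, newline="") as f:
            next_id = sum(1 for _ in csv.reader(f))  # header + N rows -> next ID is N + 1
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(FIELDS)
        writer.writerow([
            next_id,
            datetime.now().isoformat(timespec="minutes"),
            user,
            category,
            change,
            reason,
        ])

# Example: logging a negative keyword expansion
log_change("Anthony", "nkw", "Added negatives to Campaign Y", "Search query report review")
```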

Formal test tracking

When you’re conducting formal A/B or multivariate tests in your account, a higher standard of documentation is a good idea. Even if you’re not presenting it to a client formally, put together a quick record detailing the following for every major test you plan and execute (a rough sketch of logging a result follows the list):

  • Purpose. Every test should have a reason behind it. Documenting this is a good exercise in holding yourself to account on smart testing in general, but this is most important for future analysis and test iterations—it’s what sets up the "why."
  • Hypothesis. Marketers have a reputation for playing fast and loose with statistical methods, but remember that for results you can trust, you should have a falsifiable hypothesis. Again, get this down so you can say what exactly your results do and do not prove.
  • Procedure. Exactly what it sounds like—what did you do in implementing this test? You need to record what the controlled and experimental variables were, so you can appropriately account for what might have influenced your results and what might be worth trying again differently in the future.
  • Results. Again, easy—what was the outcome? Don’t be stingy with the details here; confidence level, effect size, and the actual ad copy or landing page that was tested should be recorded for posterity and later reference.
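
Here’s a rough sketch of what recording one result might look like, assuming you pull clicks and impressions per variant from AdWords and keep a combined test results CSV. The numbers, task ID, and file name are placeholders, and the two-proportion z-test (via the third-party statsmodels package) is just one reasonable way to arrive at the confidence level mentioned above.

```python
# A rough sketch of checking and recording an ad copy A/B test result. The
# click/impression counts, task ID, and file name are placeholders; the
# two-proportion z-test comes from the third-party statsmodels package.
import csv
from datetime import date

from statsmodels.stats.proportion import proportions_ztest

control = (412, 18250)  # (clicks, impressions) for the control ad
variant = (468, 18105)  # (clicks, impressions) for the challenger ad

# Falsifiable hypothesis: "the new ad's CTR equals the control's CTR."
# A small p-value lets us reject it at our chosen confidence level.
z_stat, p_value = proportions_ztest(
    count=[control[0], variant[0]],
    nobs=[control[1], variant[1]],
)
effect = variant[0] / variant[1] - control[0] / control[1]  # absolute CTR difference

with open("test_results_log.csv", "a", newline="") as f:
    csv.writer(f).writerow([
        date.today().isoformat(),
        "Client-128",                                   # the PM task card, if you use one
        "H0: new headline does not change CTR",
        f"effect {effect:+.4%}",
        f"p = {p_value:.3f}",
    ])
```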

I like putting at least the hypothesis and results in a combined test results spreadsheet for quick future reference. Over time, as people shift through roles, what was tested a year ago can quickly fade from organizational memory. When planning your next test, you need to be able to quickly go back and see whether it’s been done before, and whether it’s worth trying again. I’ve seen a lot of effort wasted on duplicate tests for exactly this reason at companies I’ve consulted for—don’t let that be you!

I also recommend plugging a quick line into your standard activity log for each action on a test (i.e. launched, finalized, paused), since these are often pretty high-impact changes and it’s helpful to have this information in your go-to spot.

Make it work

I’ll close with a brief reiteration of what I believe is the most important part of activity logging and test tracking: actually doing it. Internal adoption of any new tool or process is almost always the toughest hurdle (ask anyone who’s ever overseen a CRM implementation). As with any habit, there are a few simple behaviors that can help you make good tracking practices a reliable part of your routine:

  • Start small. It won’t hurt to start by logging just the biggest, most important activities. You’ll have an easier time remembering to do it, and you’ll soon start doing it for more and more tweaks automatically.
  • Be accountable. Even if you’re the only one touching the account, tell someone else what you’re doing and ask them to check in on you. There’s nothing like social accountability to reinforce a behavior!
  • Have a goal in mind. If you don’t feel a sense of purpose in what you’re doing, you’re probably just not going to do it. Make a pact with yourself or your team that you’ll review your activity logging one week from when you start and share thoughts and ideas on improving it. You’ve then got a clear and present point of reference for success and moving forward.

Do you have any favorite tricks or tactics for keeping good track of your SEM campaigns? Share them with us in the comments!

Anthony Coraggio
Anthony is Technical Program Manager at Wheelhouse Search and a Distilled alumnus, with an extensive background in PPC, SEO, and general online marketing strategy. He lives in Seattle with his partner Nicole and is perpetually delighted by guitars, history, and the science of everything.
