In the world of ads, apparently it’s good to not just run ads but also see how effective they are. Who woulda thought? Google’s ad platform is obviously a big one but they’re certainly not the only one. In this case, we’re getting data from Facebook’s ad system, putting it into a CSV, and then uploading it to Google Analytics (GA) so we can run queries against it (or something. See disclaimer).
I started here and it really wasn’t the most helpful page. Hopefully this walkthrough alleviates some of the pains I had.
To get started, you’ll want to know what kind of data you want to import. You must have
ga:medium (ex: “organic”, “email”, etc.) and
ga:source (ex. “facebook”)
Aside from those two, you have to have at least one of the cost metrics:
ga:adClicks,
ga:adCost, and/or
ga:impressions
Otherwise, what are you even tracking?
So think about your data in these terms. You can export it now from your data source, or you can do it later. The world is yours.
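For what it’s worth, the CSV you end up with looks something like this. The columns and numbers below are just an assumed example; the header row uses the ga: dimension and metric names from above:

```csv
ga:medium,ga:source,ga:adClicks,ga:adCost,ga:impressions
cpc,facebook,12,34.56,7890
cpc,facebook,8,21.00,4321
```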
You’ll need an account that can access the GA API. Google has approximately a billion different APIs so hit up their Developer Console. Create a new project then select it to create users within it. I called my project “Analytics”. You can get similarly creative if you want. We’re going to hang out in the APIs & auth section. Go to the item called Analytics API and turn it on. Easy.
Since we’re working on the server, we’ll want to create a new Client ID and make sure it’s a service account. Once you create it, a .p12 file will download to your machine. Don’t close this tab.
Google gives you a .p12 file but when authenticating, you need a .pem file. If you’re on a Mac, you can use the handy openssl tool on the command line to create a .pem.
$ cd into your Downloads directory and run this:
openssl pkcs12 -in downloaded-file.p12 -out key.pem -nodes
When it prompts for the import password, enter notasecret. Google uses that password for every .p12 it hands out.
I moved the .pem into the project directory just so it’d be easier to access later.
Take a quick look at the “Permissions” section and make sure that this new user you created has “Can Edit” permissions.
Firstly, and this really frustrated me for a few hours: take the email address you got from creating the Client ID and add it as a user to the GA profile you’re going to be uploading data to.
Let’s go ahead and get GA set up. Log in to your GA profile and use the Admin tab on top. Pick the property you want to upload data to, then use the Data Import section to create a new data set for Cost Data. Give it a name, select your views, and then set up your schema. Grab the Custom Data Source ID because you’ll need it later. Save it and let’s move on.
Google has a handy Node lib that makes connecting to their APIs a bit easier. I’m going to cross my fingers that you understand a bit of Node.js, so just get the file started with your requires.
Google uses a JSON Web Token to connect to their APIs. You don’t have to make it yourself, exactly, you just have to pass parameters to it. Creating the auth client is kind of easy assuming you blindly follow examples and don’t try to read any docs. Here’s what I ended up with and it worked:
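In sketch form, it’s the requires plus the JWT client. The service-account email and key path below are placeholders; swap in the long ...@developer.gserviceaccount.com address from your Client ID and the .pem you converted:

```javascript
// Guarded require so this sketch still loads where googleapis isn't
// installed. In your actual app.js, just: var google = require('googleapis');
var google = null;
try { google = require('googleapis'); } catch (e) {}

// Placeholders: the long email from your Client ID and your converted key.
var SERVICE_EMAIL = '1234567890@developer.gserviceaccount.com';
var KEY_PATH = 'key.pem';
var SCOPES = ['https://www.googleapis.com/auth/analytics'];

// The JWT auth client: email, path to the .pem, no inline key string, and
// the Analytics scope. googleapis handles signing the token for you.
var authClient = google && new google.auth.JWT(SERVICE_EMAIL, KEY_PATH, null, SCOPES);
```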
The googleapis module makes the whole process fairly easy; my biggest difficulty was figuring out what I had to pass to the upload function before it would actually upload.
The docs for dailyUploads are fairly helpful but don’t do the best job of describing what the options are. Luckily, I figured them out so you don’t have to.
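Roughly, they break down like this. Every ID here is a placeholder, so swap in your own account, property, and Custom Data Source values from GA:

```javascript
// The parameters dailyUploads.upload wants, as I understand the v3
// Management API. All IDs below are placeholders.
var uploadOptions = {
  accountId: '12345678',           // GA account ID, from the Admin section
  webPropertyId: 'UA-12345678-1',  // the property your data set lives under
  customDataSourceId: 'abc123XYZ', // the Custom Data Source ID you grabbed earlier
  date: '2014-07-15',              // day the cost data applies to (YYYY-MM-DD)
  appendNumber: 1,                 // which daily append this is (up to 20 per day)
  type: 'cost',                    // dailyUploads only accepts cost data
  reset: false                     // true wipes the day's earlier appends first
};
```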
Real quick, we want to get a string representation of the CSV, so go ahead and grab that using Node’s fs module.
So now we can create the client, kick some options to it, authorize, and upload our data. Make sure the upload call is wrapped in an authClient.authorize() callback so we’re producing an access token.
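Sketched out, it looks like this. I’ve wrapped it in a function so it only fires when you call it with real credentials: google is the googleapis module, authClient is your JWT client, opts holds the upload parameters, and csvString is your CSV contents. The parameter names come from the v3 Management API:

```javascript
// Authorize, then upload. The upload has to live inside the authorize()
// callback so the JWT client is holding a fresh access token when it runs.
function uploadCostData(google, authClient, opts, csvString, done) {
  authClient.authorize(function (err) {
    if (err) { return done(err); }

    var analytics = google.analytics('v3');
    analytics.management.dailyUploads.upload({
      auth: authClient,                 // the JWT client we just authorized
      accountId: opts.accountId,
      webPropertyId: opts.webPropertyId,
      customDataSourceId: opts.customDataSourceId,
      date: opts.date,
      appendNumber: opts.appendNumber,
      type: 'cost',                     // dailyUploads only handles cost data
      media: {
        mimeType: 'application/octet-stream', // GA takes the CSV as a raw blob
        body: csvString
      }
    }, done);
  });
}
```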
If you’ve done everything right, you should be able to run $ node app.js and get a response that contains a nextAppendLink.
All together now!
So there you go: you can upload cost data to GA. Now you can query it just like anything else in GA.