Monday, December 15, 2014

Swift & Google Analytics

I like Swift pretty well.

Sometimes I'm amazed at how simple and elegant it is, and sometimes I'm confused and hate it - but that's how it is when learning a new language.

I have a pretty simple app I'm working on in Swift, and now that it's ready to be user tested I want to throw some Google Analytics into it.  Google only ships the regular Objective-C iOS libraries, which should be no problem (according to the Swift documentation).  Let's see how to set this up:

First step:  Review the documentation at:

Based on this quote:
To import a set of Objective-C files in the same app target as your Swift code, you rely on an Objective-C bridging header to expose those files to Swift. Xcode offers to create this header file when you add a Swift file to an existing Objective-C app, or an Objective-C file to an existing Swift app.

It looks like we can just add the files to our Swift project.  So let's download the latest iOS SDK for Google Analytics:

I like to keep my 3rd party libraries in different "groups" in Xcode, so I created an "Analytics" group.  In the Google Analytics download there is a nice, neat little folder called "Library"; in this version it has all of the header files you will need.  Take that folder, drop it into your Analytics group, and tell Xcode to copy the files.

Problem 1
Xcode was supposed to ask if I wanted to create a bridging header; unfortunately it didn't.  This will be a problem later for sure.

Problem 2
The Google Analytics documentation doesn't say anything about adding the libGoogleAnalyticsServices.a file to the project either.  Odd that they overlooked it this time - those poor newbies...

We do need to add some frameworks to our project if they're not there:
  • CoreData.framework
  • SystemConfiguration.framework
  • libz.dylib
After all that I did a build and it seems ok.  Doesn't mean much, but it's nice to know nothing is broken.

Ok, back to the Apple documentation...

Since we didn't get our bridge file created auto-magically, we have to create it ourselves.  Luckily there is a little blurb that tells us how to do it ourselves (thanks Apple!).
Alternatively, you can create a bridging header yourself by choosing File > New > File > (iOS or OS X) > Source > Header File.
So we create that file and add our Google Analytics headers into it; mine looks like this:

//  -Bridging-Header.h
//  DentalNetwork
//  Created by Aaron OBrien on 12/15/14.
//  Copyright (c) 2014 spct. All rights reserved.

#ifndef DentalNetwork__Bridging_Header_h
#define DentalNetwork__Bridging_Header_h

#import "GAI.h"
#import "GAIDictionaryBuilder.h"
#import "GAIEcommerceFields.h"
#import "GAIEcommerceProduct.h"
#import "GAIEcommerceProductAction.h"
#import "GAIEcommercePromotion.h"
#import "GAIFields.h"
#import "GAILogger.h"
#import "GAITrackedViewController.h"
#import "GAITracker.h"

#endif

When I did a build nothing happened, though I expected some kind of error.  The documentation says this:
Under Build Settings, make sure the Objective-C Bridging Header build setting under Swift Compiler - Code Generation has a path to the header.
The path should be relative to your project, similar to the way your Info.plist path is specified in Build Settings. In most cases, you should not need to modify this setting.
So I opened up the build settings in Xcode, searched for "Bridging", and found the setting was blank.  I added the path I thought it should be and got a great error telling me the value was wrong.  Fixed that and everything compiled.  This is looking promising!  Step 1 of the Google Analytics setup seems to be done.
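For reference, the setting lives under Swift Compiler - Code Generation and its value is a project-relative path.  Assuming the header sits in the project folder next to your other source files (the project and file names here are just from my project - yours will differ), the value looks something like this:

```
Objective-C Bridging Header = DentalNetwork/DentalNetwork-Bridging-Header.h
```

If the path is wrong you get a compile error right away, which at least makes it easy to tell when you've got it right.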

Let's see if we can initialize a tracker!
To initialize the tracker, import the GAI.h header in your application delegate .m file and add this code to your application delegate's application:didFinishLaunchingWithOptions: method:
Step 1: import the GAI.h file... hmm, Apple says this:
Any public Objective-C headers listed in this bridging header file will be visible to Swift. The Objective-C functionality will be available in any Swift file within that target automatically, without any import statements. Use your custom Objective-C code with the same Swift syntax you use with system classes.
So we should be able to skip importing the GAI.h header because Swift does it for us.  That is so convenient!

Next we need to write some code to initialize the tracker.  Google's Objective-C example looks like this:
  [GAI sharedInstance].trackUncaughtExceptions = YES;
And the Swift equivalent:
  GAI.sharedInstance().trackUncaughtExceptions = true
Simple enough - code completion even worked!  Follow Google's example and set up the rest of your initialization and you get something like this:
        GAI.sharedInstance().trackUncaughtExceptions = true
        GAI.sharedInstance().dispatchInterval = 20
        GAI.sharedInstance().logger.logLevel = GAILogLevel.Verbose

I hit the build button and got an error that said this:
Undefined symbols for architecture arm64:
  "_OBJC_CLASS_$_GAI", referenced from:
      __TMaCSo3GAI in AppDelegate.o
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
That is telling us we don't have the implementation behind the GAI definition.  So this is a good time to add that libGoogleAnalyticsServices.a file I mentioned before.  It's good we've seen this before with Objective-C, or that would have been a tough error to work through!

OK!  I added the .a file, verified that it showed up under the linked binaries in the build phases, and got a clean compile!

Now we need to do something with this new setup.

Automatic screen tracking sounds great, but it can cause a lot of problems when Google upgrades their SDK and changes things around (renaming parameters or whatnot).  Also, you have to extend GAITrackedViewController instead of plain UIViewController, and that is kind of bothersome to me, so let's do some manual tracking instead.
It's a bit of extra effort, but it turns out that insulating your code from Google's API thrashing is a very good idea (no offense to Google, but they do change stuff around in this particular API an awful lot).  So off we go to

To Swift-ify their example code we get something like this:
        var tracker:GAITracker = GAI.sharedInstance().defaultTracker as GAITracker
        tracker.set(kGAIScreenName, value:"Home Screen")
Let's compile and run our code and Ka-Bam we are measuring screens!

That wasn't too bad at all was it?

One more caveat before you get all crazy measuring screens and other things: take the time to create a utility class and keep the rest of your code from calling any Google APIs directly.  That way, when a new version changes things, you can update your code in one place instead of touching every spot where you call Google Analytics.  This was a painful lesson to learn...

Cheers and Happy Coding!


Wednesday, September 17, 2014

Swift Pains

Learning a new programming language can be frustrating - some more so than others.  Python, for example, is pretty easy; there are a few oddities, but nothing major.  In contrast, Objective-C was difficult for me, but now that I "get it" I prefer it over most other languages.  Swift seems to be following the Objective-C pattern - apparently I'm still not quite "getting it".

In case you weren't paying attention - or just don't follow iOS development - this year at WWDC (2014) Apple released a new language called Swift.  It's supposed to perform faster and handle memory much better; that alone seems compelling enough to test it out, but I have to say it has not been a great experience.  To be honest, it's possible I'm just grumpy about learning something new, but it seems to me that there are some oddities about the language that are a bit limiting.

The first oddity I ran into was the lack of multi-dimensional arrays.  If you aren't familiar with the term, I'm sure you're familiar with the concept: an array is a collection of objects - you could think of it as a single line of soldiers.  To make it multi-dimensional (a troop of soldiers instead of a single line) you have rows and columns of soldiers.  Often the example used to illustrate a multi-dimensional array is a spreadsheet with rows and columns, but you get the idea.

To create a multi-dimensional array, most languages use some type of syntax like this:

  • int grid[][] =... 
  • or int[][] grid = ... 
Hmm... not so with Swift.  From what I can tell there is no native support for multi-dimensional arrays.  You have to create a structure like this (feel free to copy and paste - it's pretty much straight out of the Swift manual):

import Foundation

struct Matrix {
    let rows: Int, columns: Int
    var grid: [String]
    init(rows: Int, columns: Int) {
        self.rows = rows
        self.columns = columns
        grid = Array(count: rows * columns, repeatedValue: "")
    }
    func indexIsValidForRow(row: Int, column: Int) -> Bool {
        return row >= 0 && row < rows && column >= 0 && column < columns
    }
    subscript(row: Int, column: Int) -> String {
        get {
            assert(indexIsValidForRow(row, column: column), "Index out of range")
            return grid[(row * columns) + column]
        }
        set {
            assert(indexIsValidForRow(row, column: column), "Index out of range")
            grid[(row * columns) + column] = newValue
        }
    }
}

So now that you have this structure set up, here is an example of how to use it:
        var letters: [String] = ["A", "B", "C", "D", "E", "F", "G", "H", "I", "J", "K", "L", "M", "N", "O", "P", "Q", "R", "S", "T", "U", "V", "W", "X", "Y", "Z"]
        var rows = 6
        var columns = 8

        // create our Matrix struct
        var matrix = Matrix(rows: rows, columns: columns)
        for (var row = 0; row < matrix.rows; row++) {
            for (var column = 0; column < matrix.columns; column++) {
                var rnd = Int(arc4random_uniform(26))
                var randomLetter = letters[rnd]
                // sets the value in a row/column to a random letter
                matrix[row, column] = randomLetter
            }
        }
So you get a taste of the oddity that is Swift - we access the values in our matrix like this:
        println("First Element: \(matrix[0, 0])")
        println("Last Element:  \(matrix[5, 7])")
(With 6 rows and 8 columns, the valid indices run from 0 to 5 and 0 to 7 - matrix[6, 8] would trip the assert.)
Typically in most languages you would access an element like this: matrix[5][7], but since we're not really working with a proper multi-dimensional array we use a comma to separate the subscript arguments.
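The (row * columns) + column arithmetic the struct does isn't Swift-specific; it's plain row-major indexing.  Here is a tiny self-contained sketch of the same idea in Java, picked only because it has the familiar bracket syntax to compare against:

```java
public class MatrixDemo {
    public static void main(String[] args) {
        int rows = 2, cols = 3;

        // A real two-dimensional array, bracket syntax and all:
        String[][] grid = new String[rows][cols];
        grid[1][2] = "X";

        // The flattened bookkeeping Swift's Matrix struct does by hand:
        String[] flat = new String[rows * cols];
        flat[1 * cols + 2] = "X";

        // Both layouts address the same logical cell (row 1, column 2).
        System.out.println(grid[1][2].equals(flat[1 * cols + 2])); // prints: true
    }
}
```

Same cell, two spellings - Swift's Matrix struct just makes you write the flattened version yourself.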

So that is different, and pretty weird if you don't know about it beforehand.  Even after knowing about it, I still have to scratch my head and wonder what Apple's engineers were thinking on that one.

Next in our list of frustrating things about Swift is the core data integration...

CoreData is a very strange beasty when it comes to Swift.  No one has said this officially, but from what I have gathered, Swift is a separate language that the compiler processes into code that is binary-compatible with Objective-C - basically two languages that do the same thing.  I believe Apple has been able to take their existing frameworks, generate Swift interfaces from them, and regenerate the mountains of documentation they have written over the years.

You can have a project that uses both languages; you have to create a bridge file for it to work, but it's possible and will probably be very common in the future.  The tricky part is that certain frameworks (and CoreData is one of them) require you to make your Swift code available to the land of Objective-C, even if you aren't writing any Objective-C.

I've gone a couple of rounds with CoreData and Swift and have decided the best thing to do (if you're in a pure Swift code base) is to edit the generated Swift NSManagedObject classes.  Editing generated code is always a bit of a bad idea, but we only need to tweak them a tiny bit and add the "@objc" attribute, so for now it's not a huge problem.  Thanks to Christer from Stack Overflow, who answered a similar question I was working on and led me to the @objc solution.

Here is what the Swift documentation says about @objc:
“Apply this attribute to any declaration that can be represented in Objective-C—for example, non-nested classes, protocols, properties and methods (including getters and setters) of classes and protocols, initializers, deinitializers, and subscripts. The objc attribute tells the compiler that a declaration is available to use in Objective-C code.”
“The objc attribute optionally accepts a single attribute argument, which consists of an identifier. Use this attribute when you want to expose a different name to Objective-C for the entity the objc attribute applies to. You can use this argument to name classes, protocols, methods, getters, setters, and initializers. ...”

Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks.
What this all means is that when you generate your Swift managed object classes, open them up and change this:

import Foundation
import CoreData

class Patient: NSManagedObject {
    @NSManaged var firstName: String
    @NSManaged var lastName: String
    @NSManaged var games: NSSet
}

to this:

import Foundation
import CoreData

@objc(Patient) //add this line to make everything work
class Patient: NSManagedObject {
    @NSManaged var firstName: String
    @NSManaged var lastName: String
    @NSManaged var games: NSSet
}

So with that bit of knowledge in hand we can start getting productive again (just to be clear: without the @objc attribute in our generated class files, the following code will not work).
var newPatient: Patient = NSEntityDescription.insertNewObjectForEntityForName("Patient", inManagedObjectContext: context) as Patient
newPatient.firstName = firstNameTextField.text
newPatient.lastName = lastNameTextField.text
Querying the data is simple as well:
        var req = NSFetchRequest(entityName: "Patient")
        var sortDescriptor = NSSortDescriptor(key: "lastName", ascending: true)
        req.sortDescriptors = [sortDescriptor]
        self.patients = context.executeFetchRequest(req, error: nil)

Maybe it will get better as I get more comfortable, but so far Swift is a poorly named language chock-full of gotchas and facepalms.

Good luck if you're diving into iOS development with Swift.


Monday, July 28, 2014

Data Persistence With AppEngine

When you are using AppEngine, there are a handful of choices to store your data.  Once Google introduced Cloud SQL the choices just increased.  How can you figure out which framework to use?

I'm a pretty big fan of Cloud Storage, so that limited my choices just a bit - but I still had a choice between JDO and JPA.  How and why should you pick one of these frameworks over the other?  I'm sure a true J2EE aficionado can tell you which is the better choice under which circumstances, depending on which app server and database you're using.  I do know that I've tried to shoehorn both of these technologies into AppEngine and found it very cumbersome.  To be honest, I wasn't looking forward to fighting through it once again.  As I was researching which one to choose, I ran across a very amusing quote on the Objectify website:
"After you've banged your head against JDO and screamed "Why, Google, why??" enough times..."

I decided to play around with Objectify to see how hard or easy it was, and it turns out it was super easy.  I started with the latest version, Objectify 5.  It's pretty new, so most of the info you'll find on the internet is for the older versions.  The latest is a tad more straightforward and arguably easier to understand and use.

Here are the basics:
When you create a new domain object you must annotate two things (plus one optional):

  1. The @Entity annotation on the class itself
  2. The @Id annotation on the field to be used as the id
  3. Optionally, the @Index annotation on any field you want to search by - this one actually took me a little while to figure out
Once you have your domain object properly annotated (the annotations are all in the com.googlecode.objectify.annotation package) you need to access them somehow.  Objectify supports my favorite pattern really well - I'm not sure if it has an official name, but I like to have all of my database access in a single class.  It may sound a tad horrible, but if you keep the business logic out of it and you are just creating, reading, updating and deleting stuff it doesn't usually get too unmanageable.  It makes maintenance and testing very easy - and God forbid you ever have to change your persistence framework, you can keep your changes all in one place.
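The shape of that single data-access class can be sketched with nothing but a HashMap standing in for the datastore (class and method names here are purely illustrative, not Objectify's API):

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of the "all data access in one class" pattern.  A HashMap
// stands in for the datastore; with Objectify, the bodies of these
// methods would call service().save(), service().load(), and so on -
// but callers never need to know that.
public class DataHelperSketch {
    static class User {
        Long id;
        String email;
        User(Long id, String email) { this.id = id; this.email = email; }
    }

    private static final Map<Long, User> store = new HashMap<Long, User>();

    public static void saveUser(User user) {
        store.put(user.id, user);            // save-or-update, like Objectify
    }

    public static User findUserByEmail(String email) {
        for (User u : store.values()) {
            if (u.email.equals(email)) return u;
        }
        return null;
    }

    public static void deleteUser(User user) {
        store.remove(user.id);
    }

    public static void main(String[] args) {
        saveUser(new User(1L, "frog@example.com"));
        System.out.println(findUserByEmail("frog@example.com").id); // prints: 1
        deleteUser(new User(1L, "frog@example.com"));
        System.out.println(findUserByEmail("frog@example.com"));    // prints: null
    }
}
```

Swapping the map for real Objectify calls later only touches this one class - which is the whole point of the pattern.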

To use Objectify there are two things you need to have: a Factory and a Service.  This is how it's recommended to get access to them:

private static Objectify service() {
   return ObjectifyService.ofy();
}

private static ObjectifyFactory factory() {
   return ObjectifyService.factory();
}

Once you have those in place, the next thing you need to do is register your annotated domain objects:

ObjectifyService.register(User.class);
Saving is as easy as this:
public static void saveUser(User user) {
   Result<Key<User>> result = service().save().entity(user);
   result.now();
}

If the Object you are saving has an Id populated the save becomes an update.

Calling the now method on the result tells the system we want to wait for the save to finish - if you leave that off it will still save, but whenever the system gets around to it.
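If that seems odd, it's just the standard future/promise idea.  You don't need Objectify's Result to see it; plain Java's CompletableFuture behaves the same way (this is only an analogy, not the Objectify API):

```java
import java.util.concurrent.CompletableFuture;

public class NowAnalogy {
    public static void main(String[] args) {
        // Kick off the "save" asynchronously; we get a handle back immediately.
        CompletableFuture<String> result =
                CompletableFuture.supplyAsync(() -> "saved");

        // join() plays the role of Objectify's now(): block until the work
        // is actually finished.  Skip it, and the work still happens -
        // just whenever the system gets around to it.
        System.out.println(result.join()); // prints: saved
    }
}
```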

Searching looks like this (remember: if you're filtering on a field it has to have the @Index annotation):
public static User findUserByEmail(String email) {
     List<User> users = service().load().type(User.class).filter("email", email).list();
     User user = null;
     if (users.size() > 0) {
          user = users.get(0);
     }
     return user;
}

And lastly, deleting looks like this:
public static void deleteUser(User user) {
     service().delete().entity(user).now();
}

These are the imports for my DataHelper class:
import com.googlecode.objectify.Key;
import com.googlecode.objectify.Objectify;
import com.googlecode.objectify.ObjectifyFactory;
import com.googlecode.objectify.ObjectifyService;
import com.googlecode.objectify.Result;

Pretty simple stuff.  Objectify does support transactions, but for my little test project I wasn't worried about that too much.  Once I had this all coded I ran into a couple of issues, and to be honest I had a hard time figuring them out.  I really started wishing I had a test case that could just hit my data-access class - but how can you test a class whose persistence is pretty much buried behind REST calls to cloud storage?  Guess what: Google has a spectacular test class called LocalServiceTestHelper that does just that.

Basically you create your standard JUnit class and create an instance of LocalServiceTestHelper using a LocalDatastoreServiceTestConfig, like this:
private final LocalServiceTestHelper helper = new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

Once that guy is created you can access the datastore in your test case normally (and so can the Objectify framework).
So I could write a test that looked like this:
public void testRetrieve() {
     User user = DataService.findUserByEmail("");
}


and it just magically works.

If you're writing a web service for AppEngine I would highly recommend using Google Cloud Endpoints and Objectify.  The level of effort is small but the capability and value returned is very high.

Happy Coding!


Thursday, July 17, 2014

AppEngine With Patience

My previous blog post was about using Google Cloud Endpoints as a REST framework.  I was a bit skeptical when I finished that post, but now that I've played with it for a couple of days I have to say it's truly amazing (and very easy).

One of the first services I usually write when trying out a new REST framework is a service to retrieve the server time.  This is useful in a couple of ways: it lets you exercise the REST framework without getting caught up in a lot of business logic, and it lets you know quickly and easily whether the server is running.  Here is the code for that service:


import java.util.Calendar;

import com.google.api.server.spi.config.Api;
import com.google.api.server.spi.config.ApiMethod;
import com.google.api.server.spi.config.ApiNamespace;

@Api(name = "statusAPI",
     version = "v1",
     namespace = @ApiNamespace(ownerDomain = "",
        ownerName = ""))
public class ServerStatus {

  @ApiMethod(name = "serverTime", httpMethod = "get")
  public Status serverTime() {
    Status status = new Status();
    status.setCurrentTime(Calendar.getInstance().getTime().toString());
    return status;
  }
}


A little heavy on annotations, I suppose, but they're pretty easy to understand and mostly copy-and-paste between classes you want to use as services.  The name attribute in the @Api annotation is important, and the @ApiMethod annotation is entirely optional - I believe if you don't specify an httpMethod the default is POST.  However, this is pretty much a moot point, as you will see if you keep reading.

So that is pretty cool stuff.  Arguably as easy as Jersey or easier - and since we're using AppEngine it's absolutely easier, because there are ZERO configuration changes or jars to import.  Did you hear that?  No configuration or extra jars - that is AMAZING!

But wait, that's not all!  Once you have your service working the way you like it, there is a little Maven command that will generate a discovery document for you:

 mvn appengine:endpoints_get_discovery_doc

So, what is a discovery document and why is it important?  It is a description of the services your app exposes, and it can be used to generate client code to hit those services (or endpoints).

Once you generate that file you can run a command using Google's library generator to auto-generate a whole bunch of code.  You download the source and compile it yourself using Xcode.  One important note: there was one line of code that wouldn't compile.  It's on line 34 of the FHUtils.m file; here is the naughty line:

    NSMutableCharacterSet *setBuilder = [NSCharacterSet characterSetWithRange:NSMakeRange('a', 26)];
and this is the very simple fix:
    NSMutableCharacterSet *setBuilder = [NSMutableCharacterSet characterSetWithRange:NSMakeRange('a', 26)];

Hardly worth noting, but if you are in a panic, that is how you fix it.  Follow the instructions on the link and you should be generating all kinds of code.  There are some pretty specific instructions you need to follow; there is a section labeled "Adding Required files to your iOS project", and step 1g has you doing something I didn't even know was possible - you can disable ARC on a file-by-file basis!  Holy smokes, that is a useful trick.

Once you get it all up and running you have probably added over 20 files to your project - it's a lot but keep in mind you didn't have to write it - and since it's code you can jump in and take a look to see how Google thinks things should be done!

Here is a snippet of code that uses the auto-generated code we created above:
    static GTLServiceStatusAPI *service = nil;
    if (!service) {
        service = [[GTLServiceStatusAPI alloc] init];
        service.retryEnabled = YES;
    }
    GTLQueryStatusAPI *query = [GTLQueryStatusAPI queryForServerTime];

    [service executeQuery:query completionHandler:^(GTLServiceTicket *ticket, GTLStatusAPIStatus *object, NSError *error) {
        if (error != nil) {
            NSLog(@"Error getting time: %@", [error localizedDescription]);
        } else {
            NSString *serverTime = [object currentTime];
            NSLog(@"Server time is %@", serverTime);
        }
    }];


Here is the same code in pseudo code so you can get the theory down:

ServiceObject service = new ServiceObject
ServiceQuery query = new ServiceQuery

[service executeQuery:query handleCallback:^(ServiceTicket ticket, ServiceResponse response, Error error) {
  handle response
}]

That pattern is followed whenever you want to call the server - posting and getting are abstracted away from you; even the URL you're communicating with is hidden away.  All in all, an amazing framework.

There is so much magic in serializing the response and deserializing it on the backend; I can't believe how easy it was to get working.  It is sort of like writing EJBs in the old days - except this actually works.

There were a couple of gotchas I feel obligated to mention.  
  • Your AppEngine app seems to need to be deployed out on AppEngine.  This means you can't hit the dev server while you're developing the client code.  There is probably a very clever workaround, but it wasn't worth the effort for me.
  • Also since the project we created in the previous blog is all maven based I ended up writing a couple of scripts to:
    1. deploy to appengine (if you have 2Factor authentication enabled this is a bit of a hassle) 
    2. generate the code once the discovery file was generated.  This process is well documented I just wrote a script to be lazy.
My next post will hopefully cover using Objectify as a persistence layer and how you can use JUnit tests to test it.  Exciting stuff!



Tuesday, July 15, 2014

New ways to do old things

Web Services are a way of life for mobile developers.  I can't really think of a single mobile app I've written that didn't have a backend of some type, even if it's just Google Analytics.

A very common pattern for web services is to set up a basic set of REST-based services.  These let the phone update info on the backend very easily.  One of the fastest and easiest web servers to use is Google's AppEngine; unfortunately, REST frameworks are oddly heavy - meaning they use lots of jars and extra libraries to do something that seems like it should be simple - and the heavier the framework, the more effort it takes to get it working.

Here is an example of a restful url to list all the frogs on a backend:
Another way to create that same functionality would be to create a basic url like this:

The backend code for both techniques would be very similar.
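To make that concrete, here is a tiny self-contained sketch of the "basic url" technique using only the JDK's built-in HTTP server - no Jersey, no AppEngine.  The /frogs path and the frog names are made up for the example:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.*;
import java.net.*;

// A plain HTTP handler answering GET /frogs with a JSON list -
// no REST framework involved at all.
public class FrogServer {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/frogs", exchange -> {
            byte[] body = "[\"Kermit\",\"Hypno\"]".getBytes("UTF-8");
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();

        // Call our own endpoint to show the round trip.
        URL url = new URL("http://localhost:" + server.getAddress().getPort() + "/frogs");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), "UTF-8"))) {
            System.out.println(in.readLine()); // prints: ["Kermit","Hypno"]
        }
        server.stop(0);
    }
}
```

A real framework adds URL pattern matching, content negotiation, and serialization on top of this - which is exactly the weight the rest of this post is complaining about.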

Jersey is a framework I've used a lot.  It's a great framework for REST and there are lots of benefits to using a REST framework other than URL pattern matching.  The problem is that Google's AppEngine doesn't seem to like Jersey very much.  Oh sure, you can get it working (eventually).  Some people would probably say it's even easy - but it's actually a huge pain in the butt to configure and run the latest version of Jersey on the latest version of AppEngine.

So that brings us to our blog title: is there a new way to do the same thing I've been doing for several years now?  Hopefully one that is less painful than trying to get the Jersey framework to function?

Well, good news!  Apparently Google has a new framework, or API, called "Cloud Endpoints".  I started playing with them today, and so far it's a little frustrating - so I thought I'd share some of it with you.  :)

First of all, after reading up on the documentation it sounds pretty cool; it looks to lean pretty heavily on Eclipse and the Google Web plugin, which is cool - we all love Eclipse, right?

As you go through the tutorial you see several mentions of the plugin - and then you get to the part where you build a demo application entirely from Maven.  That's cool, we like Maven too, right?  So the tutorial page tells you to issue this Maven command:
  1. mvn archetype:generate -Dappengine-version=1.9.6
and that works just fine too.  Well, at least until you import the new project into Eclipse so you can code the app.  After you import the project into Eclipse, you can expect to get several errors about your pom.xml file and your persistence.xml file.

Really?!  What the hell?  This is so unlike Google - things typically just magically work with their frameworks and libraries.

There are about 6 errors in the POM file that all have to do with "Plugin execution not covered by lifecycle configuration".  This turns out to be an Eclipse error - or, more specifically, an error from the m2e (Maven integration for Eclipse) plugin.  It has a pretty easy fix; here is the documentation on why this error occurs, and here is the snippet you need to add to your pom.xml file (it goes inside the tag):

<plugin>
  <groupId>org.eclipse.m2e</groupId>
  <artifactId>lifecycle-mapping</artifactId>
  <version>1.0.0</version>
  <configuration>
    <lifecycleMappingMetadata>
      <pluginExecutions>
        <pluginExecution>
          <pluginExecutionFilter>
            <!-- filter for the plugin/goal Eclipse is complaining about -->
          </pluginExecutionFilter>
          <action>
            <ignore />
          </action>
        </pluginExecution>
      </pluginExecutions>
    </lifecycleMappingMetadata>
  </configuration>
</plugin>


Keep in mind this fix is just so that Eclipse won't complain about your pom file.  It does not affect your build; if you're using Jenkins or building on the command line it is ignored.  Also, to be clear, the errors only occur in Eclipse - so it's not really a Google error.

OK, so now our build is working.  Next we have to figure out what's going on with the persistence.xml file - when you look at that file, all that is in there is one line:

<?xml version="1.0" encoding="UTF-8"?>

Well, that is clearly a problem - no XML validator would be happy with that - so add just enough XML to make it happy:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0"
    xmlns="http://java.sun.com/xml/ns/persistence">
    <persistence-unit name="dataStore">
    </persistence-unit>
</persistence>


Keep in mind this is only valid for the XML validation.  Once we get ready to use the data store we will need to fix it the right way.

I'll keep you posted on how the endpoints work out.  It's a tad irritating so far but learning a new way to do something is often frustrating.