Monday, July 28, 2014

Data Persistence With AppEngine

When you are using AppEngine, there are a handful of choices for storing your data.  Once Google introduced Cloud SQL, the choices increased again.  How do you figure out which framework to use?

I'm a pretty big fan of the Cloud Storage, so that limited my choices just a bit - but I still had a choice between JDO and JPA.  How and why should you pick one of these frameworks over the other?  I'm sure a true J2EE aficionado can tell you which is the better choice under which circumstances, depending on which app server and database you're using.  I do know that I've tried to shoehorn both of these technologies into AppEngine and found it very cumbersome.  To be honest, I wasn't looking forward to fighting through it once again.  As I was researching which one to choose I ran across a very amusing quote on the Objectify website:

"After you've banged your head against JDO and screamed "Why, Google, why??" enough times..."

I decided to play around with Objectify to see how hard or easy it was, and it turns out it was super easy.  I started with the latest version, Objectify 5.  It's pretty new, so most of the info you'll find on the internet is for the older versions.  The latest is a tad more straightforward and arguably easier to understand and use.

Here are the basics:
When you create a new domain object you must annotate two things (plus an optional third):

  1. The @Entity annotation on the class itself
  2. The @Id annotation on the field to be used as the id
  3. Optionally, the @Index annotation on any field you want to search by - this actually took me a little while to figure out
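Putting those three annotations together, a minimal annotated domain object looks something like this (the User class and its fields are just my example):

```java
import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;
import com.googlecode.objectify.annotation.Index;

@Entity
public class User {

    @Id
    private Long id;        // auto-generated by the datastore when left null

    @Index
    private String email;   // @Index makes this field usable with filter()

    private String name;    // un-indexed fields are stored but not queryable

    public Long getId() { return id; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```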
Once you have your domain object properly annotated (the annotations are all in the com.googlecode.objectify.annotation package) you need to access it somehow.  Objectify supports my favorite pattern really well - I'm not sure if it has an official name, but I really like to have all of my database access in a single class.  It may sound a tad horrible, but if you keep the business logic out of it and you are just creating, reading, updating and deleting, it doesn't usually get too unmanageable.  It makes maintenance and testing very easy - and God forbid you ever have to change your persistence framework, you can keep your changes all in one place.

To use Objectify there are two things you need: a Factory and a Service.  This is the recommended way to get access to them:

private static Objectify service() {
   return ObjectifyService.ofy();
}

private static ObjectifyFactory factory() {
   return ObjectifyService.factory();
}

Once you have those in place the next thing you need to do is register your annotated domain objects:
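The registration call is a one-liner on ObjectifyService; I put it in a static initializer of my data helper class so it runs before any datastore access (the User class is my example entity):

```java
import com.googlecode.objectify.ObjectifyService;

public class DataHelper {
    static {
        // Every annotated entity must be registered exactly once,
        // before the first save or load.
        ObjectifyService.register(User.class);
    }
}
```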

Saving is as easy as this:

public static void saveUser(User user) {
   Result<Key<User>> result = service().save().entity(user);
   result.now();
}

If the object you are saving already has its id populated, the save becomes an update.

The now() method on the result tells the system we want to wait for the save to finish - if you leave it off, it will still save, but whenever the system gets around to it.
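In other words, the two flavors of save look like this (service() is the accessor defined above; the method name is just for illustration):

```java
import com.googlecode.objectify.Key;
import com.googlecode.objectify.Result;

public static void saveExamples(User user) {
    // Asynchronous: starts the save and returns immediately;
    // the Result is a handle you can materialize later.
    Result<Key<User>> pending = service().save().entity(user);

    // Synchronous: now() blocks until the save has completed.
    Key<User> key = service().save().entity(user).now();
}
```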

Searching looks like this (remember, if you're searching on a field it has to have the @Index annotation):

public static User findUserByEmail(String email) {
     List<User> users = service().load().type(User.class).filter("email", email).list();
     User user = null;
     if (users.size() > 0) {
          user = users.get(0);
     }
     return user;
}

And lastly, deleting looks like this:

public static void deleteUser(User user) {
     service().delete().entity(user).now();
}

These are the imports for my DataHelper class:
import com.googlecode.objectify.Key;
import com.googlecode.objectify.Objectify;
import com.googlecode.objectify.ObjectifyFactory;
import com.googlecode.objectify.ObjectifyService;
import com.googlecode.objectify.Result;

Pretty simple stuff.  Objectify does support transactions, but for my little test project I wasn't too worried about that.  Once I had this all coded I ran into a couple of issues, and to be honest I had a hard time figuring them out.  I really started wishing I had a test case to just hit my data access class - but how can you test a class whose persistence is buried behind cloud REST calls?  Guess what: Google has a spectacular test class called LocalServiceTestHelper that does just that.

Basically you create your standard JUnit class and create an instance of LocalServiceTestHelper configured with LocalDatastoreServiceTestConfig, like this:
private final LocalServiceTestHelper helper = new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

Once that guy is created you can access the datastore in your test case normally (and so can the Objectify framework).
So I could write a test that looked like this:

@Test
public void testRetrieve() {
     User user = DataHelper.findUserByEmail("");
}


and it just magically works.
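One detail the snippet above glosses over: the helper has setUp() and tearDown() methods that need to be called around each test, or the local datastore won't be available.  A full test class skeleton (class, entity, and helper method names are my own examples) looks roughly like this:

```java
import com.google.appengine.tools.development.testing.LocalDatastoreServiceTestConfig;
import com.google.appengine.tools.development.testing.LocalServiceTestHelper;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class DataHelperTest {

    private final LocalServiceTestHelper helper =
        new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

    @Before
    public void setUp() {
        helper.setUp();      // spins up the in-memory datastore
    }

    @After
    public void tearDown() {
        helper.tearDown();   // wipes it clean between tests
    }

    @Test
    public void testSaveAndRetrieve() {
        User user = new User();
        user.setEmail("test@example.com");
        DataHelper.saveUser(user);

        User found = DataHelper.findUserByEmail("test@example.com");
        assertEquals("test@example.com", found.getEmail());
    }
}
```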

If you're writing a web service for AppEngine I would highly recommend using Google Cloud Endpoints and Objectify.  The level of effort is small but the capability and value returned is very high.

Happy Coding!


Thursday, July 17, 2014

AppEngine With Patience

My previous blog post was about using Google Cloud Endpoints as a REST framework. I was a bit skeptical when I finished that post, but now that I've played with it for a couple of days I have to say it's truly amazing (and very easy).

One of the first services I usually write when trying out a new REST framework is a service to retrieve the server time.  This is useful in a couple of ways: it lets you exercise the REST framework without getting caught up in a lot of business logic, and it lets you know quickly and easily whether the server is running.  Here is the code for that service:


import java.util.Calendar;

@Api(name = "statusAPI",
     version = "v1",
     namespace = @ApiNamespace(ownerDomain = "",
        ownerName = ""))
public class ServerStatus {

  @ApiMethod(name = "serverTime", httpMethod = "get")
  public Status serverTime() {
    // Status is a simple bean with a currentTime property
    Status status = new Status();
    status.setCurrentTime(Calendar.getInstance().getTime().toString());
    return status;
  }
}

A little heavy on annotations, I suppose, but they're pretty easy to understand and mostly copy-and-paste between classes you want to expose as services.  The name attribute in the @Api annotation is important, and @ApiMethod is entirely optional; I believe if you don't specify an httpMethod the default is post, though this is pretty much a moot point as you will see if you keep reading.

So that is pretty cool stuff.  Arguably as easy as or easier than Jersey - however, since we're using AppEngine it's absolutely easier because there are ZERO configuration changes or jars to import.  Did you hear that?  No configuration or extra jars - that is AMAZING!

But wait, that's not all!  Once you have your service working the way you like it, there is a little Maven command that will generate a discovery document for you:

 mvn appengine:endpoints_get_discovery_doc

So, what is a discovery document and why is that important?  It is a description of the services contained in that file and it can be used to generate client code to hit those services (or endpoints).

Once you generate that file you can run a command using Google's library generator to auto-generate a whole bunch of code.  You download the source and compile it yourself using Xcode.  An important note: there was one line of code that wouldn't compile.  It's on line 34 of the FHUtils.m file; here is the naughty line:

    NSMutableCharacterSet *setBuilder = [NSCharacterSet characterSetWithRange:NSMakeRange('a', 26)];
and this is the very simple fix:
    NSMutableCharacterSet *setBuilder = [NSMutableCharacterSet characterSetWithRange:NSMakeRange('a', 26)];

Hardly worth noting, but if you are in a panic, that is how you fix it.  Follow the instructions on the link and you should be generating all kinds of code.  There are some pretty specific instructions you need to follow.  In the section labeled "Adding Required files to your iOS project", step 1g has you doing something I didn't even know was possible: you can disable ARC management on a file-by-file basis!  Holy smokes - that is a useful trick.

Once you get it all up and running you have probably added over 20 files to your project - it's a lot but keep in mind you didn't have to write it - and since it's code you can jump in and take a look to see how Google thinks things should be done!

Here is a snippet of code that uses the auto-generated code we created above:

    static GTLServiceStatusAPI *service = nil;
    if (!service) {
        service = [[GTLServiceStatusAPI alloc] init];
        service.retryEnabled = YES;
    }
    GTLQueryStatusAPI *query = [GTLQueryStatusAPI queryForServerTime];

    [service executeQuery:query completionHandler:^(GTLServiceTicket *ticket, GTLStatusAPIStatus *object, NSError *error){
        if (error != nil) {
            NSLog(@"Error getting time: %@", [error localizedDescription]);
        } else {
            NSString *serverTime = [object currentTime];
            NSLog(@"Server time is %@", serverTime);
        }
    }];

Here is the same code in pseudo code so you can get the theory down:

ServiceObject service = new ServiceObject
ServiceQuery query = new ServiceQuery

[service executeQuery: query handleCallback:^(ServiceTicket ticket, ServiceResponse response, Error error){
  handle response
}]
That pattern is followed whenever you want to call the server - posting and getting are abstracted away from you; even the URL you're communicating with is hidden away.  All in all, an amazing framework.

There is so much magic in serializing the call on the client and deserializing it on the backend; I can't believe how easy it was to get working.  It is sort of like writing EJBs in the old days, except this actually works.

There were a couple of gotchas I feel obligated to mention.  
  • Your AppEngine app seems to need to be deployed out on AppEngine.  This means you can't hit the dev server while you're developing the client code.  There is probably a very clever workaround, but it wasn't worth the effort for me.
  • Also since the project we created in the previous blog is all maven based I ended up writing a couple of scripts to:
    1. deploy to appengine (if you have 2Factor authentication enabled this is a bit of a hassle) 
    2. generate the code once the discovery file was generated.  This process is well documented I just wrote a script to be lazy.
My next post will hopefully cover using Objectify as a persistence layer and how you can use JUnit tests to test it.  Exciting stuff!



Tuesday, July 15, 2014

New ways to do old things

Web Services are a way of life for mobile developers.  I can't really think of a single mobile app I've written that didn't have a backend of some type, even if it's just Google Analytics.

A very common pattern for web services is to set up a basic set of REST-based services.  These let the phone update info on the backend very easily.  One of the fastest and easiest web servers to use is Google's AppEngine; unfortunately, REST frameworks are oddly heavy - meaning they use lots of jars and extra libraries to do something that seems like it should be simple - and the heavier the framework, the more effort it takes to get it working.

Here is an example of a RESTful URL to list all the frogs on a backend:

     http://myserver.com/frogs

Another way to create that same functionality would be to create a basic URL like this:

     http://myserver.com/listFrogs

The backend code for both techniques would be very similar.

Jersey is a framework I've used a lot.  It's a great framework for REST, and there are lots of benefits to using a REST framework beyond URL pattern matching.  The problem is that Google's AppEngine doesn't seem to like Jersey very much.  Oh sure, you can get it working (eventually).  Some people would probably even say it's easy - but it's actually a huge pain in the butt to configure and run the latest version of Jersey on the latest version of AppEngine.
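For comparison, a Jersey resource for the frog-listing example would look something like this (Frog and the FrogStore lookup are placeholders, not real classes):

```java
import java.util.List;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/frogs")
public class FrogResource {

    // GET /frogs returns the whole list as JSON
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<Frog> listFrogs() {
        return FrogStore.findAll();   // hypothetical storage lookup
    }
}
```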

So that brings us to our blog title: is there a new way to do the same thing I've been doing for several years now?  Hopefully one that is less painful than trying to get the Jersey framework to function?

Well, good news!  Apparently Google has a new framework, or API, called "Cloud Endpoints".  I started playing with them today and so far it's a little frustrating - so I thought I'd share some of it with you.  :)

First of all, after reading up on the documentation it sounds pretty cool.  It looks to lean pretty heavily on Eclipse and the Google Web plugin, which is cool - we all love Eclipse, right?

As you go through the tutorial you see several mentions of the plugin - and then you get to the part where you build a demo application entirely from Maven.  That's cool, we like Maven too, right?  So the tutorial page tells you to issue this Maven command:

  mvn archetype:generate -Dappengine-version=1.9.6

and that works just fine.  Well, at least until you import the new project into Eclipse so you can code the app.  After you import the project you can expect to get several errors about your pom.xml file and your persistence.xml file.

Really?!  What the hell?  This is so unlike Google - things typically just magically work with their frameworks and libraries.

There are about 6 errors in the POM file that all have to do with "Plugin execution not covered by lifecycle configuration".  This turns out to be an Eclipse error, or more specifically an M2E (Maven-to-Eclipse) plugin error.  It has a pretty easy fix, and the M2E documentation explains why the error occurs.  The snippet you need to add to your pom.xml is a lifecycle-mapping entry whose <action> elements tell M2E to ignore the offending plugin executions:

          <action>
            <ignore />
          </action>

Keep in mind this fix just stops Eclipse from complaining about your pom file.  It does not affect your build; if you're using Jenkins or building on the command line it is ignored.  Also, to be clear, the errors only occur in Eclipse - so it's not really a Google error.

OK, so now our build is working.  Now we have to figure out what's going on with the persistence.xml file - when you look at that file, all that is in there is one line:

<?xml version="1.0" encoding="UTF-8"?>

Well, that is clearly a problem - no XML validator would be happy with that - so add just enough XML to make it happy:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence">
   <persistence-unit name="dataStore">
   </persistence-unit>
</persistence>

Keep in mind this is only valid for the XML validation.  Once we get ready to use the data store we will need to fix it the right way.

I'll keep you posted on how the endpoints work out.  It's a tad irritating so far but learning a new way to do something is often frustrating.