The Making of TrackMyMacca’s

The good people at theFWA have been kind enough to recognize the TrackMyMacca’s mobile app we created with Tribal DDB Sydney and DinahMoe Stockholm by awarding it Mobile of the Day. They also asked us to write an article about how the project came to be, so here you go.

The article was originally posted on theFWA on Friday, February 22, but I thought I’d give some love to my blog by also posting it here.

Please go to the original article to see all the nice illustrations.

In May 2012, Tribal DDB Sydney approached us to partner with them on an AR-based experience: a sophisticated animated 3D universe built around McDonald’s products.

In this universe users could explore the ingredients that go into making five iconic McDonald’s products. In doing so, they could get real-time information and stories about how the ingredients they were consuming right then had made it from the farm to their bellies.

Tribal DDB designed the complete solution for the TrackMyMacca’s app: turning McDonald’s supply chain data into an API, the split between the app and that API, and the user experience. Our role was to design and build the 3D universe and develop the app for iOS devices.

For us, TrackMyMacca’s was a real passion project; both teams at ACNE and Tribal DDB went above and beyond to make it perfect. When we saw the final 3D world fold out on the table in front of us, we were incredibly pleased. In our eyes, TrackMyMacca’s is the perfect example of how technology can enable great design.

The TrackMyMacca’s app presented a series of challenges as we were not only dealing with 3D animations on a mobile device, but we were also showing these animations augmented onto the real world through the use of Augmented Reality – a technology with a notorious past.

Augmented Reality’s Notorious Past

When AR first caught our attention in 2009, everyone wanted to get a piece of the magic future technology that seemingly bridged the gap between the real and digital world.

Countless sites and experiences followed, but in the end we were still tied to the desktop computer and the limitations of having to create an experience where the user was asked to hold a printed symbol in front of a webcam. One might say that the AR bubble burst that same year.

Fast-forward three years, and the world of AR had evolved significantly, with mobile technology making huge leaps forward and unleashing the power necessary not only to run real-time image-recognition algorithms but also to render complex 3D animations at the same time.

We started building prototypes to explore the different AR libraries available, and both parties quickly agreed on Qualcomm’s Vuforia AR library. As well as having a large and dedicated online community, Vuforia also complemented our development platform for the project: Unity3D.

Building a 3D Universe on Mobile

Unity was the perfect choice for a project of this scale: it has a great IDE, is a very capable and flexible scripting platform, and, much like Vuforia, has a great online community.

When modelling for real-time rendering on a handheld device, we need to be constantly aware of the elements that impact performance, such as polygon count, texture size and draw calls. All of this makes it very different from non-real-time 3D, where we only need to worry about how many GHz hours to rent at a render farm.
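The article doesn’t give the project’s actual budgets, but the texture-size concern is easy to quantify. A minimal, hypothetical sketch (the function name and figures are ours, not from the project): an uncompressed 32-bit 1024×1024 texture costs 4 MiB of GPU memory, while the same texture in iOS’s PVRTC format at 4 bits per pixel costs 0.5 MiB, and a full mip chain adds roughly a third on top.

```python
def texture_memory_bytes(width, height, bits_per_pixel, mipmaps=False):
    """Approximate GPU memory for one texture.

    A full mip chain adds roughly one third on top of the base level
    (1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = width * height * bits_per_pixel // 8
    return base * 4 // 3 if mipmaps else base

# Uncompressed 32-bit RGBA vs PVRTC at 4 bits per pixel, 1024x1024:
rgba = texture_memory_bytes(1024, 1024, 32)   # 4_194_304 bytes (4 MiB)
pvrtc = texture_memory_bytes(1024, 1024, 4)   #   524_288 bytes (0.5 MiB)
```

An eightfold difference per texture is why compressed formats and careful atlas sizes matter so much on a 2012-era iPhone.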

Our 3D artists created the world in close collaboration with our Unity developers, and we went through many iterations to make sure we stretched the hardware as far as we possibly could without actually breaking it.

Working under these conditions proved to be a great catalyst for creativity, especially for our developers. For instance, we would realize that the animated textures for the toaster skid marks were too taxing for the target device, which forced us to come up with a scripted solution that created the same effect dynamically, on the fly.
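The article doesn’t show the scripted solution itself, but the general trick of swapping pre-baked animation frames for a time-driven script can be sketched like this (a hypothetical illustration, not the project’s code; in Unity the computed offset would be written to the material’s texture offset each frame):

```python
def scroll_uv_offset(elapsed_seconds, speed_u=0.25, speed_v=0.0):
    """One small tiling texture plus a per-frame UV offset can stand in
    for a whole sequence of pre-baked animation frames.

    Offsets wrap at 1.0 because UV space repeats on a tiling texture.
    """
    return (elapsed_seconds * speed_u) % 1.0, (elapsed_seconds * speed_v) % 1.0
```

The memory cost drops from N frames’ worth of textures to a single tile, at the price of a few instructions per frame.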

As it turned out, even seemingly impossible change requests to the 3D world could be achieved!

Putting It All Together

One of the biggest challenges of this project was the test and development workflow. Vuforia 2.0 was released towards the very end of the project, and one of the big advantages of this version of the SDK, in combination with Unity 4, is that you can use your webcam to test your work during the development process.

But for most of the project this great functionality wasn’t available to us, and that meant deploying to a device every time we had to test even the smallest change. Fortunately that won’t be the case for future AR projects.

Audio integration was also an interesting challenge for this project. While the iPhone is capable of delivering really good sound quality if you connect a pair of decent headphones to it, the reality is that most users will experience the audio through the device’s speaker.

Audio for a mobile experience means catering to both scenarios. Our partner DinahMoe created a soundscape that both added a wonderful dimension to the ambient noises of a restaurant experience, and delivered a truly immersive experience to those users who enjoyed it through their headphones.

We also encountered challenges with the shakiness of the scene. AR works well when the 3D scene is roughly the same size as the AR target, but in our case the scene was many times bigger than that, which hurt its stability: the small movements a user inevitably makes when pointing a device at an AR target are magnified in proportion to the size relationship between the target and the final 3D scene.

This particular issue was solved with the help of the Vuforia community by dampening the input from the device’s gyroscope, and by easing the scaling and orientation changes in the 3D scene.
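The dampening idea can be sketched with simple exponential smoothing (a hypothetical illustration, not the project’s code; the real fix also eased orientation, which would use quaternion interpolation rather than per-axis smoothing):

```python
class PoseSmoother:
    """Exponential smoothing of a tracked position.

    Each frame the stored pose moves only a fraction (alpha) of the way
    toward the raw sample, filtering out high-frequency hand jitter at
    the cost of a little lag. A low alpha means heavy damping.
    """

    def __init__(self, alpha=0.15):
        self.alpha = alpha
        self.position = None

    def update(self, raw_position):
        if self.position is None:
            self.position = list(raw_position)        # first sample: snap
        else:
            self.position = [p + self.alpha * (r - p)
                             for p, r in zip(self.position, raw_position)]
        return self.position
```

Tuning alpha is a trade-off: too low and the scene visibly drags behind the camera; too high and the jitter comes back.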

Working with AR means combining real-world objects with 3D, and while the technology has improved tremendously in recent years, we’re still very much at the mercy of the user.

In the case of TrackMyMacca’s, the ideal scenario was a user in a well-lit McDonald’s restaurant, with a good Wi-Fi connection and a sturdy, unbroken box as a target. This was of course completely out of our control, and all we could do was build the greatest app we possibly could. The rest was up to the user.

About the Author

I work as an Interactive Creative Director at ACNE Production in Los Angeles, specializing in interactive direction on experience-based campaigns across multiple digital platforms. I’ve directed and led campaigns for brands such as Nike, Coca-Cola, Nokia, GE, McDonald’s and Toyota, to name a few.

I graduated from the IT University of Copenhagen in 2005 and shortly after started working as a developer at Framfab/LBi in Copenhagen. I stayed with LBi for five years, eventually becoming Director of Technology.

I’ve always had a strong focus on how to bring an idea to life, so the step across the Atlantic Ocean to ACNE Production in 2010 was a natural one to make. I was hired as a Technical and Interactive Director and was promoted to Interactive Creative Director in early 2013.

In my academic education I’ve combined a bachelor’s degree in arts and aesthetics with a master’s degree in information technology. This unusual combination gives me an understanding of the creative, technical and aesthetic aspects of a concept, as well as the tools and knowledge to bring that concept to life.

ACNE Production

Venice, CA

Agency: Tribal DDB Sydney

Production Company: ACNE Production

Sound: DinahMoe
