Category: Technology

I have seen the future of live sports, and it is in VR

I just finished watching the regular season opener between the defending NBA champion Golden State Warriors and the New Orleans Pelicans. The Warriors had home court and won by a comfortable margin, taking every quarter except the fourth.

What made this game different is that I watched it through a VR live stream, made possible by NextVR. For this game I had a court-side seat – in fact I was sitting even closer than the people in the regular, outrageously expensive court-side seats, dead center in front of the scorer’s table, about two feet off the ground, right next to an official broadcast camera operator. To me, this was a milestone experience in live sports broadcasting.

I’m a sports fan. I love going to live sports games; I love the atmosphere, I love the noise, I love the excitement and I love the game – although my friends will know that the love of the game is limited to certain sports. Live TV broadcasts have always been a poor substitute for the real thing, a substitute I’ve had to settle for more often over the last couple of years, as my work/life/family balance has changed and my favourite teams have been playing a continent and an ocean away. I do it, I like it, but I don’t love it – I’ve watched so many games with my favourite football teams in Europe where I’ve really wanted to be at the stadium instead of at home in my living room.

Watching one single live basketball game in VR has changed all that. Suddenly I am there, court-side, watching the action unfold in front of me. From my seat I can’t see the scoreboard, so I have to keep track of the score in my head. There are no instant replays, since I’m watching the events as they unfold from one locked-off position. I often look up at the shot clock above the hoop to see how much time is left. From time to time an official stands right in front of me, blocking my view of the action. There is no commentator helping me identify the good plays and analyzing the game in real time. These are exactly the things I experience when I go to a live game, and it’s thrilling as hell. I even applaud a particularly well-played assist by Steph Curry as if I were in the arena, realizing seconds later that I’m the only one in the room – and that I’ve woken up my daughter at the same time. But damn, that was well played.

When I watch a traditional live sports broadcast, my attention wanders. I’ll look away from the screen, go to the kitchen, maybe even jump to a different tab on my computer, returning to the broadcast only if I hear something exciting. In VR sports there is no such thing – I’m there the whole time, immersed in the experience. And I don’t want to leave. Curiously enough, I can see the people sitting next to me court-side taking out their phones to check Facebook, and I can’t believe they would want to miss the action right in front of them.

This is one of the first times I’ve felt genuinely excited by something I’ve watched in VR – beyond the pseudo-excitement of watching a cool tech demo. I’m a casual basketball fan, and not a Warriors fan (#LobCity!), and yet I stayed on through the whole thing, because the experience was amazing.

This will change everything, I kid you not!

So let’s get the bad things out of the way:

  • The resolution could have been better.
  • The audio levels went up and down a lot, but it was cool to hear the chatter around me.
  • The Samsung Gear VR “needed to cool down” several times. A known issue.
  • The camera sat too low; I felt tiny.
  • The two alternate camera positions, one behind the backboard and one in the nosebleeds, were less than optimal. Fortunately the court-side position was used most of the time.
  • It’s incredibly annoying how long it takes to get back into the app if you have to take the headset off for a minute to put your daughter back to sleep.
  • The POV wasn’t 360 degrees, only about 180. Annoying at first, but it made sense after a while.

But those things can and will be solved. I had to run to 7/11 to get beer before the final quarter, and drinking a cold beer while watching the game felt just as good as drinking one at the live game. Except it was a lot cheaper and I didn’t have to wait in line.

Let me also mention that I felt no physical discomfort from being in VR for a full quarter at a time. I’m not particularly prone to motion sickness in VR, but usually 10 minutes is enough to throw me off. I credit this to the locked-off position, the limited field of view and the great head-tracking of the Gear VR.

And look how convenient it is for the family – this is me watching a basketball game in VR:

I can’t wait for the next chapter in this – how I wish someone would set up a live VR rig when Denmark faces Sweden in the playoffs for Euro 2016. I want to be there to witness our glorious victory.

How to watch YouTube 360 videos in Google Cardboard

By popular demand, here is a quick guide on how to get YouTube’s new 360 videos to play inside Google Cardboard.

YouTube has quickly become the best source of 360 videos, and playback quality is pretty high, provided the original video is good enough. The guide is simple enough to follow, but it’s a bunch of manual steps, so there is some work involved.

Disclaimer: Downloading videos from YouTube is a legal grey area at best, so proceed at your own risk.

I’ve tried with this video:

Step 1: Download the source files, using your favourite YouTube downloader
I use Keepvid, so with my example, use this URL:

http://keepvid.com/?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Ft%3D250%26v%3Dj81DDY4nvos

It splits the content into separate video and audio streams – I’m guessing that’s unique to 360 video. Download the highest-quality video version (in this case 2160p) in MP4 format and the separate audio track in M4A format.

Step 2: Combine the video and audio files, using your favourite video tool.
I used ffmpeg, since it’s awesome. Here is the simple (and quick) command line I used to merge the two:

ffmpeg -i video.mp4 -i audio.m4a -c:v copy -c:a aac -strict experimental output.mp4

The merged file is called “output.mp4” in this case. The -c:v copy flag copies the video stream untouched, while the audio is re-encoded to AAC (the -strict experimental flag is needed for ffmpeg’s built-in AAC encoder).

Step 3: View the video in your favourite VR video player for Android.
I used KolorEyes. It’s not pretty, but it works. It’s quite particular about where to put the video files, but follow its conventions and you’ll do fine. I’m sure you know how to transfer files from your computer to your phone.

Vine finally released on Android

Vine, a project I’ve been really excited about since the beginning, has finally been released on Android, which means I can finally start using it. This also means that it’s no longer an iPhone exclusive. In the opinion of a lot of very sad people, this will completely dilute the concept, just as it did for Instagram.

Fortunately, I’m not one of those people. Also, I own a Samsung Galaxy Note II, so Android is my thing after a disappointing two-year flirt with an iPhone 4. But I still like to crack a good joke in the “poor/bad taste/smelly => Android, rich/sophisticated/delicious scent => iPhone” category.

The release on Android also means that we can start using Vine as a platform for creating user-generated content in the campaigns we do, and that’s very exciting news.

The Making of TrackMyMacca’s

The good people at theFWA.com have been nice enough to recognize the TrackMyMacca’s mobile app we created with Tribal DDB Sydney and Dinahmoe Stockholm by awarding it the Mobile of the Day. They also asked us to write an article about how the project came to be, so here you go.

The article was originally posted on theFWA on Friday, February 22, but I thought I’d give some love to my blog by also posting it here.

Please go to the original article to see all the nice illustrations.

In May 2012, Tribal DDB Sydney approached us to partner with them to create an AR-based experience with a sophisticated animated 3D universe built around McDonald’s products.

In this universe users could explore the ingredients that go into making five iconic McDonald’s products. In doing so, the user could get real-time information and stories about how the ingredients they were consuming right now had made it from the farm to their bellies.

Tribal DDB designed the complete solution for the TrackMyMacca’s app, which included turning McDonald’s supply chain data into an API, the split between the app and the API, and the user experience. Our role was to design and build the 3D universe and develop the app for iOS devices.

For us, TrackMyMacca’s was a real passion project. Both teams at ACNE and Tribal DDB went above and beyond to make it perfect. When we saw the final 3D world fold out on the table in front of us, we were incredibly pleased. In our eyes, TrackMyMacca’s is the perfect example of how technology can enable great design.

The TrackMyMacca’s app presented a series of challenges as we were not only dealing with 3D animations on a mobile device, but we were also showing these animations augmented onto the real world through the use of Augmented Reality – a technology with a notorious past.

Augmented Reality’s Notorious Past

When AR first caught our attention in 2009, everyone wanted to get a piece of the magic future technology that seemingly bridged the gap between the real and digital world.

Countless sites and experiences followed, but in the end we were still tied to the desktop computer and the limitations of having to create an experience where the user was asked to hold a printed symbol in front of a webcam. One might say that the AR bubble burst that same year.

Fast-forward three years, and the world of AR has evolved significantly, with mobile technology making huge leaps forward and unleashing the power necessary not only to run real-time image recognition algorithms but also to render complex 3D animations at the same time.

We started building prototypes to explore the different AR libraries available, and both parties quickly agreed on using Qualcomm’s Vuforia AR library. As well as having a large and dedicated online community, Vuforia also complemented our development platform for the project – Unity3D.

Building a 3D Universe on Mobile

Unity was the perfect choice for a project of this scale; it has a great IDE, is a very capable and flexible scripting platform, and has a great online community – much like Vuforia.

When modelling for real-time rendering on a handheld device, we need to be constantly aware of elements that impact performance, such as polygon count, texture size and draw calls – all of which makes this very different from non-real-time 3D, where we only need to worry about how many GHz-hours to rent at a render farm.

Our 3D artists created the world in close collaboration with our Unity developers, and we went through many iterations to make sure we stretched the hardware as far as we possibly could without actually breaking it.

Working under these conditions proved to be a great catalyst for creativity, especially for our developers. For instance, we realized that the animated textures for the toaster skid marks were too taxing for the target device, which forced us to come up with a scripted solution for creating the same effect dynamically on the fly.

As it turned out, even seemingly impossible change requests to the 3D world could be achieved!

Putting It All Together

One of the biggest challenges of this project was the test and development workflow. Vuforia 2.0 was released towards the very end of the project, and one of the big advantages of this version of the SDK, in combination with Unity 4, is that you can use your webcam to test your work during the development process.

But for most of the project this great functionality wasn’t available to us, and that meant deploying to a device every time we had to test even the smallest change. Fortunately that won’t be the case for future AR projects.

Audio integration was also an interesting challenge for this project. While the iPhone is capable of delivering really good sound quality if you connect a pair of decent headphones to it, the reality is that most users will experience the audio through the device’s speaker.

Audio for a mobile experience means catering to both scenarios. Our partner DinahMoe created a soundscape that both added a wonderful dimension to the ambient noise of a restaurant and delivered a truly immersive experience to those users who enjoyed it through headphones.

We also encountered challenges with the shakiness of the scene. AR works well when the 3D scene is approximately the same size as the AR target, but in our case the scene was many times bigger than that, which caused some issues with stability.

The small movements inevitably made when pointing a device at an AR target are magnified in proportion to the size relationship between the target and the final 3D scene.

This particular issue was solved with the help of the Vuforia community by dampening the input that comes from the gyroscope in the device, and by easing the scaling and orientation changes in the 3D scene.
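
To make the idea concrete, here is a rough sketch of the kind of damping involved: a simple exponential low-pass filter applied to the tracked values. The actual fix lived in our Unity scripts, so this ActionScript snippet and its names are purely illustrative.

    // Purely illustrative: an exponential low-pass filter that damps the small
    // hand movements in the tracked position. All names are made up.
    var smoothedX:Number = 0;
    var smoothedY:Number = 0;
    var smoothedZ:Number = 0;
    const SMOOTHING:Number = 0.15; // lower = more damping, but more lag

    function damp(rawX:Number, rawY:Number, rawZ:Number):void {
        smoothedX += SMOOTHING * (rawX - smoothedX);
        smoothedY += SMOOTHING * (rawY - smoothedY);
        smoothedZ += SMOOTHING * (rawZ - smoothedZ);
        // position the 3D scene with the smoothed values instead of the raw ones
    }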

Working with AR means combining real world objects with 3D, and while the technology has improved tremendously over the previous years, we’re still very much at the mercy of the user.

In the case of TrackMyMacca’s, the ideal scenario was a user in a well-lit McDonald’s restaurant, with a good wifi connection and a sturdy, unbroken box as a target. This was of course completely out of our control, and all we could do was build the greatest app we possibly could. The rest was up to the user.

About the Author

I work as an Interactive Creative Director at ACNE Production in Los Angeles. I specialize in interactive direction on experience-based campaigns across multiple digital platforms. I’ve directed and led campaigns for brands such as Nike, Coca-Cola, Nokia, GE, McDonald’s and Toyota, to name a few.

I graduated from the IT University of Copenhagen in 2005 and started working at Framfab/LBi in Copenhagen shortly after as a developer. I stayed with LBi for five years, eventually making Director of Technology.

I’ve always had a strong focus on how to bring an idea to life, so the step across the Atlantic Ocean to ACNE Production in 2010 was a natural one to make. I was hired as a Technical and Interactive director and was promoted to Interactive Creative Director in early 2013.

In my academic education I’ve combined a bachelor’s degree in arts and aesthetics with a master’s degree in information technology. This unusual combination gives me a unique understanding of the creative, technical and aesthetic aspects of a concept, as well as the tools and knowledge to bring that concept to life.

ACNE Production

Venice, CA

Agency: Tribal DDB Sydney

Production Company: ACNE Production

Sound: DinahMoe

A walk down memory lane with Eyeblaster and Mediamind

Let me warn you, this post is certainly written for a niche audience. But I just made a discovery that I hope can help other people who have to deal with the same issue I was having.

So here is the situation: at ACNE Production we’re currently developing a series of Rich Media banners for one of our clients. The challenge for us is that we have a lot of different clients that use different media partners to publish and host their banners, and all of these media partners have different ways of doing things. Figuring out how things work with a particular media partner often takes up a significant part of my time as technical director, since I’d rather have the developers focus on writing code. We just did a couple of banners using Google’s DoubleClick Studio, and this time the media partner is MediaMind, which merged with Eyeblaster last year (I think). Eyeblaster has been around for some time, and it seems they’ve been developing their framework for a LONG time. It’s certainly very big and complicated.

The problem is that to develop an Eyeblaster banner you have to build it in the Eyeblaster Workshop, a proprietary plugin for Flash. Once you’ve installed that plugin you have access to a very sophisticated toolbox with templates, a sandbox for previewing and publishing capabilities. Unfortunately everything happens behind the scenes; you don’t know where any of the code comes from, you just know that weird classes are imported and that static objects are referenced from the timeline in the template files.

No serious Flash developer would ever use the Flash timeline to write code, other than maybe a stop(); action inside a graphical asset. Any code beyond that should be written in an IDE, such as FDT or Flash Builder. Our tool of choice at ACNE Production is FDT, currently version 5.5. Now, since all the mysterious Eyeblaster magic happens behind the scenes, trying to develop an Eyeblaster banner using an ActionScript IDE results in a ton of reference errors inside your project, since all the Eyeblaster code is nowhere to be found.

With DoubleClick Studio it was fairly simple and worked the way you would expect: there is a plugin that you install using the Adobe Extension Manager, but there are also several SWC files that contain compiled versions of the code for use in your ActionScript IDE. With Eyeblaster there was no such thing, at least not on the surface. So I thought I would try to break open the .mxp file that contained the Eyeblaster Workshop, hoping to find a bunch of SWC files in there.

Unfortunately, opening an MXP file was no easy task. I assumed it was just a zip archive, but I couldn’t get it to unzip. Enter Google, which told me about an ancient piece of software called MXP Lister. And this is where it gets really old school: MXP Lister is a plugin for Total Commander(!) I actually couldn’t believe my eyes. One of the developers at my old factory used Total Commander on Windows back in 2007, and even back THEN it was totally old school, although for a semi-old geek like myself it brought back fond memories of the MS-DOS days and Norton Commander, or even Directory Opus on the Commodore Amiga.

So I booted up the trusty old office Dell and installed Total Commander and the MXP Lister plugin, hoping to reveal the dark secrets of the Eyeblaster MXP package. And I certainly did – the MXP (Macromedia eXtension Package) contained an MXI file, an XML metadata file that describes what to do with the contents of the package. And this revealed that the package copies actual ActionScript classes into Flash’s internal Classes directory. Because Flash Pro implicitly (and secretly) understands and relies on code in that directory, there is absolutely no reference to it whatsoever in the publish settings for the .fla files created by the Eyeblaster Workshop.

Anyway, I was able to find those classes and copy the code into the project we’re working on, so that we can use a decent IDE to write the code instead of having to write it in the timeline. In case you’ve forgotten where Flash stores its internal classes (and I had), here it is (on OS X with Flash CS6 installed):

/Users/{USER NAME}/Library/Application\ Support/Adobe/Flash\ CS6/en_US/Configuration/Classes/Eyeblaster\ ActionScript\ 3.0/eyeblaster

So there you have it – a walk down memory lane to the days of writing code in the timeline, and even further back to the days of file managers. I’m sure the Eyeblaster Workshop works out just fine for designers who don’t write any code, rely on simple actions and don’t care HOW it works, just that it works, but it isn’t a good solution for a developer. I hope Eyeblaster realises this and provides us with SWCs instead of the “idiot”-proof template files. I also hope someone can use this information in the future, which is the reason I wrote this post.

Geek Out,
Martin

GE Performance Machines: Process and technology

GE Performance Machines

The GE Performance Machines interactive experience combines a large library of video recordings of three pro football players with interactivity in the form of nine mini-games pitting man against different machines built by GE. ACNE Production created this experience in collaboration with BBDO New York and Dinahmoe, and I was the technical and interactive director on the project. The site was made Site of the Day on theFWA.com on March 13, 2012.

Functional Requirements
The project launched around the Super Bowl and was expected to receive a lot of traffic around the time of the launch. Furthermore, it was made very clear to us that the client wasn’t a big fan of loaders on websites. Since we were creating an interactive experience, it was very important to find out exactly what that meant. It turns out she didn’t like having to wait a long time to get into the main part of a website, which to us is a perfectly understandable objection – our philosophy on loading is that a site should only load what it needs to get going. Many interactive websites tend to load everything up front, even though it isn’t needed right away; I believe this is done because it’s easier from a development perspective than dealing with dynamic loading later on in the interactive narrative. Well, sites aren’t built for developers, they’re built for users, so we of course needed to figure out a way to make the site feel light and the experience seamless to the user.
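
As a generic illustration of that philosophy (this is a sketch, not the actual project code; the file name and functions are made up), on-demand loading in ActionScript 3 can be as simple as fetching a section’s assets only when the user navigates to it:

    import flash.display.DisplayObject;
    import flash.display.Loader;
    import flash.events.Event;
    import flash.net.URLRequest;

    // Fetch a section's assets only when the user actually opens that section,
    // instead of hiding everything behind one big up-front preloader.
    function loadSection(url:String, onReady:Function):void {
        var loader:Loader = new Loader();
        loader.contentLoaderInfo.addEventListener(Event.COMPLETE, function(e:Event):void {
            onReady(loader.content);
        });
        loader.load(new URLRequest(url));
    }

    function showSection(content:DisplayObject):void {
        addChild(content); // assumes this runs in a display object context, e.g. the main timeline
    }

    // Called when the user clicks into a game, not at startup
    loadSection("assets/game_intro.swf", showSection);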

We were also asked to figure out how we wanted to deal with users on mobile platforms. It wasn’t a requirement that the experience should work on mobile, but in creating a brand experience it’s of course important to take all platforms into account, since lots of traffic comes from various handheld devices.

Interactive Experiences in the Cloud
These last couple of years have seen a lot of internet infrastructure move from dedicated hosting solutions to some kind of cloud hosting and cloud applications. The typical interactive experience has a fairly short life expectancy, but during that time it will receive a lot of traffic, making it a perfect candidate for cloud hosting. This project was no exception. We were dealing with a lot of video files – no fewer than 197 videos in three bandwidth versions, for a total of 591 files. The core functionality of the site is switching rapidly between these many different videos, depending on what the gameplay dictates. Furthermore, several of the videos have to start playing not from the beginning but from a specific point, again depending on the gameplay. One way of doing this is to preload all videos for a given game and store them in memory for when they’re needed, but that would require us to load all videos up front, resulting in a potentially very long load time. There is also a limit to how much video we can hold in memory, and that limit is quite low.

Fortunately, Amazon Web Services has a very interesting product that we can use for just this kind of experience. Amazon CloudFront is a Content Delivery Network (CDN) that, among other things, has Flash Media Server capabilities in its portfolio. The idea behind that is to allow content providers to serve video in the more traditional sense, where a user watches videos in a video player. But it works perfectly for an interactive video experience like ours – it sits in the cloud and scales gracefully to meet our traffic needs, and the Flash Media Server lets us play any video on demand with very low latency. CloudFront also works perfectly with the industry-standard video framework, the Open Source Media Framework (OSMF), which is the technology behind the frontend video in this project.
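
To give a rough idea of what the playback side looks like, here is a minimal OSMF sketch for connecting to an RTMP stream and jumping to a point in a clip. The stream URL, clip name and timing are made-up placeholders, not our actual setup.

    import org.osmf.media.MediaPlayerSprite;
    import org.osmf.media.URLResource;

    var video:MediaPlayerSprite = new MediaPlayerSprite();
    addChild(video);

    // Placeholder RTMP endpoint – a CloudFront streaming distribution exposes
    // an RTMP URL that OSMF can connect to directly.
    video.resource = new URLResource("rtmp://example.cloudfront.net/cfx/st/mp4:plays/catch_01");
    video.mediaPlayer.autoPlay = true;

    // Called later, when the gameplay dictates a jump to a specific point
    function jumpTo(seconds:Number):void {
        if (video.mediaPlayer.canSeek) {
            video.mediaPlayer.seek(seconds);
        }
    }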

Creating a Seamless Experience
While the latency is very low, it’s still there – the result is that videos will occasionally take 0.2 to 0.5 seconds to start playing, which is something the human eye notices. To make this less apparent to the user, we worked with our sound partner Dinahmoe on letting the music be the element that stitches the videos together. The music is not embedded in the videos but plays as a separate element in the experience. This means that when a video stops playing for a moment, the music keeps going, and that creates a very good illusion of a seamless experience.
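
Dinahmoe’s adaptive score is of course far more sophisticated than this, but the basic decoupling can be sketched very simply: the music is its own looping Sound instance, started once and never tied to a video timeline (the file name is made up).

    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.net.URLRequest;

    // The music bed runs independently of the video players, so a 0.2–0.5 second
    // gap between video clips never interrupts the audio.
    var music:Sound = new Sound(new URLRequest("audio/music_bed.mp3"));
    var musicChannel:SoundChannel = music.play(0, int.MAX_VALUE); // loop indefinitely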

The mobile site
One of the things we’re particularly proud of is the mobile site that supports this experience. The desktop site is the main focus of the campaign and is where most of the effort went, but we came up with a simple and fast solution for a mobile site that doesn’t feel watered down. As with any interactive experience, we have buttons for social sharing on the desktop site that allow users to Like the site on Facebook, tweet it on Twitter or +1 it on Google Plus. If I go through my social feed on my mobile device and decide to check out the link to the GE Performance Machines site posted by a friend of mine, I will be taken to a site tailored specifically for the mobile platform, but with the same deeplink that my friend shared. The mobile site features videos capturing the interactive action from the desktop features, but without the interactivity. In fact, the user always wins in the mobile version of the interactive features. If I later visit the same link on my desktop computer, I will be taken to the full site.

Sprinting through the Waterfall
This project had a very short timeline and was completed in only six weeks. To accomplish this we had to work in an iterative process where all stakeholders were involved from the beginning. Traditionally, interactive campaigns have taken a waterfall approach, where the developers are handed a bunch of completed Photoshop files and assets and then locked into a basement for a month or two to build the experience. While most software development companies probably buried the waterfall approach years ago, it still makes some sense in advertising, as it allows the creatives to fully visualize and finalize their design and concept before handing it over to the developers. Later, when the developers are released from the basement, that design can be used as a very accurate guideline to review the finished product. Unfortunately this approach is very time-consuming and expensive, and furthermore it creates a gap between the stakeholders in the project. That’s probably the reason why production and development people hate it.

Modern software development typically uses an iterative approach, Scrum being one of the most popular and well-known. This approach breaks the project up into clearly defined phases with clearly defined roles, responsibilities and goals for each phase. Developers and other production people love this, because they love clearly defined goals. It’s also a lot more efficient and flexible than the waterfall approach, since later phases can be redefined based on the outcomes of the phases before them. But in our experience an approach like this is too demanding for a lot of people, in particular creatives and the client, as it requires a very high level of abstraction to fully understand how the pieces will come together in the end. Another thing that makes an approach like this challenging is the fact that we have a shoot very early in the project timeline. Everybody is locked into the result of this shoot, and we can’t be flexible in later phases to make up for the stuff that went wrong during the shoot or the stuff we didn’t think about at the time.

The timeline on this project demanded an iterative approach, or we would still be working on it now. Development happened alongside design, and as soon as the edits had been selected from the shoot, we started to mock up working prototypes where we could fit all the pieces together using raw edits and experiment with the gameplay. This allowed our creative people to start playing the games very early on, which in turn allowed them to create and tweak the design elements to support the gameplay perfectly. Furthermore, our client could start playing the games very early on, which was a huge benefit to the project, as it allowed us plenty of time to tune the gameplay parameters and the game difficulty to a level that everybody liked. As the CGI was added to the videos and the post work was done, we replaced the raw edits with final videos. As the game gauges were designed and rendered, they slowly replaced the green placeholder boxes in the game, and slowly everything came together and started looking like a real game.

Consuming Music in 2011 as a Music and Technology Aficionado

It’s no secret that I’m a big fan of music as a consumer and have been for a long time. I heard an interesting podcast the other day from the Danish radio station P1. The program was “Harddisken” and the topic was the consumption of music in a modern world, with a panel discussion between different lobbyists: one from the streaming service WiMP, one from the public library streaming service Bibzoom and one from the semi-public Danish music rights organization KODA.

The discussion wasn’t really a discussion – all of the panel members seemed to agree that music streaming services are the best thing since sliced bread and that they will revitalize the music industry. But it got me thinking about the way my own consumption of music has changed over the years. The way I consume music is really a combination of my two greatest interests: new music and technology.

The Napster Years

I’m very picky about music. I don’t want other people to choose for me; I believe in my own taste, although I am heavily inspired by some music resources, in particular Soundvenue Magazine and Pitchfork Media. But because I don’t want anybody choosing my music for me, some of the classic online music services like online radio, and more modern genre-based services like Pandora, don’t really work for me. I was one of the heavy users of Napster back in its prime for that very reason. To me, Napster was basically a gigantic music library where everything seemed available. My music taste wasn’t very evolved back then, and that might be why it felt like I could find everything I was looking for. After the slow death of Napster I had a kind of dead period in my own music consumption.

The CD years

When I eventually picked music up again I became a bit of a HIFI geek, buying a decent amplifier, a CD deck and a set of very good (and very BIG) Dali speakers. I still had quite a large collection of illegally downloaded music, but my collection started growing through CDs that I bought and then ripped to my computer. One thing I’ve always really hated about illegally downloaded music is that the quality is usually quite bad and that it’s very hard to keep organised, because the various pirates around the world apply their own organisational schemes instead of relying on proper ID3 tags. Because of that, and because I wanted to go in a more legit direction, I eventually deleted the entire collection of crappy illegal music and started maintaining my own collection, ripped in high quality and with decent ID3 tags. That collection has kept growing until today, and I now have around 90GB of music – nowhere near the 2TB the typical music pirate “owns”, but quite a considerable collection considering it’s ripped mainly from physical media.

But finding new music has always been a problem. I’ve never really enjoyed going to record stores and listening to music, because their selection (at least in Denmark) is always so limited and the prices (also in Denmark) are ridiculously high. I prefer exploring in front of my computer by reading the reviews and recommendations from my peers, but the problem is that it’s very difficult to actually HEAR the music you’re reading about. As mentioned, I’m quite concerned about quality, so the crappy samples offered by services like MySpace (please die soon, it’s a pain to watch you suffering like that), allmusic or the crappiest of all, YouTube (ptui!), weren’t really a solution. So I admit that I still had to resort to the P2P flavour of the moment, like DC++ and various torrent clients, to find the music before eventually buying it on CD online at UK prices (roughly half of Danish prices). I also subscribed to a number of music providers over the years, including the monthly Soundvenue Sampler and the monthly Fabric! and FabricLive! CDs.

Enter Spotify! (and Rdio)

Spotify had existed for some time before I eventually picked it up. You couldn’t get it in Denmark without going through some trouble, and I still had my large CD collection and huge, expensive HIFI system, so I was quite happy without it. But moving to the States left me without my huge stereo and CDs, and the only quality equipment for consuming music I had left was my UltraSone headphones and my (second set of) Etymotic earbuds. I still had my collection of music, some of which I could store on my new 16GB iPhone and listen to, but I had to find a way to find new music. The answer was pretty obvious: Spotify. So I went through the trouble of getting a Spotify account, which was just as much of a hassle in the US as it had been in Denmark, but it was definitely worth the effort. I pretty much completely stopped downloading illegal music as soon as the service was available to me. Being fortunate enough to have unlimited data on my phone, I could connect and find all the music I wanted. I have since lost the debit card I used for my Spotify trick and had to close my account, but fortunately the very similar Rdio service had launched shortly before that, and I switched to that. Being an early adopter was a bit painful, but they’ve definitely caught up and now offer a very solid service.

Cloud music libraries

Unfortunately, services like Rdio and Spotify have one big problem: while they have a HUGE collection of music, they don’t have EVERYTHING. In particular I’m missing some of the music from my old collection, which I now rarely get to listen to, because it’s so bloody inconvenient to have to sync the music to my iPhone. It just feels so old-fashioned having to connect to a computer and decide what music you want to put on a device with limited storage. But I believe I’ve found the answer in cloud-based music libraries. I’ve started uploading my collection to my Amazon Cloud Drive, and that music is now available to me in the quality that I like (because I’m the one who ripped it) while I’m on the road. The big drawback is obviously that I can’t play the music from my iPhone, since Big Brother has decided against it, but at least it really rocks from my computers and my Android devices. My Amazon Cloud Drive is free for now, but it only has 5GB of storage. Another player I’m eagerly waiting for is Google Music, which is unfortunately in closed beta. And I guess Apple will launch a similar service with iCloud this Monday, which is a bloody shame, because that’s probably the real reason why Apple won’t allow the Cloud Player in their App Store – we all have to use the Apple-approved service instead. This all puts me in a very awkward position, since iCloud will probably only work on iDevices, and as the iPhone is my primary device, it leaves me with little choice.

My future as a music consumer will consist of a combination of streaming services, Rdio currently being my weapon of choice, and one of the cloud-based music services to serve my own collection to me wherever I might be. The next month or so will tell which service I choose, but I really hope Apple will let me make that choice myself, although I seriously doubt it.

Building Huge Games in Adobe AIR

Most of the fall in our office was spent working on the Yahoo! Bus Stop Derby, a fantastic project involving multiplayer games on huge touch screens. The campaign ended January 28th, but I thought I’d share my experience of being the Tech Lead at ACNE on this project.

Here is the background: ClearChannel is putting up a new line of interactive bus shelters in the streets of San Francisco. These bus shelters feature a 72″ touch screen in portrait mode and are connected to the internet over a 3G modem built into the units. The screens are supposed to be used for interactive advertising and games, and we were given the opportunity to be the first to build an interactive experience for these screens through a partnership with Goodby, Silverstein & Partners who are the advertising agency for Yahoo!

There was a lot of buzz surrounding the campaign and I encourage you to go elsewhere if you want to read what the game is all about. Instead I will write a little bit about the challenges we faced from a technical and user experience perspective.

The user experience challenge:
The form factor in this project presented a whole new set of HCI challenges for us. We had to step away from the conventions we normally rely on when creating digital experiences for the desktop and web, as there is no mouse or keyboard, and people have a very short attention span with a unit like this. It’s more relevant to look to smartphones and tablet computers, but obviously there are major differences between a 4″ multi-touch screen that you can hold in your hand and a screen that’s taller than most people, which you stand in front of and interact with using much larger gestures.

Take a look at the image below and you will understand some of the challenges we had:


So the person on the left is an average-sized adult male. The person on the right is a 10-year-old boy. How do we make sure both can play these games? There is a limit to how high the child can reach, and while the adult can reach the lower parts of the screen by kneeling or bending over, that isn’t a comfortable position to be in when playing a game. Another thing is the sheer size of the visual display. If you were watching a movie on a 72″ screen, you would probably want to be standing at least 8 feet away from it, but in our case our users literally can’t stand more than an arm’s length away from the screen.

We also had to be very careful about relying too much on some of the newer touch screen conventions coming from smartphones and tablet computers. Part of this is because not all bus passengers are necessarily that touch-screen savvy, but also because you don’t necessarily think “touch screen” when you walk up to a giant display like this. From previous observations I’ve made of large touch screen installations, I’ve found that a lot of people aren’t comfortable walking up to one of these screens and starting to interact with it.

So we had to make the user experience very simple compared to what we usually do, and we couldn’t leverage more than roughly 50% of the screen area. The goal was that everybody should be encouraged to and able to play with these things without being hardcore gamers or super tech savvy.

The technical and practical challenges:
We started development of this project before the hardware was ready, so we began by building the project based on assumptions about how it would work and on early prototypes of the actual 72″ units. We decided to build the project in Adobe AIR 2.0, since that technology gave us the opportunity to develop the project very rapidly and to share the work between a large team of developers with experience on the Flash platform. I was technical director on the project and had no fewer than six Flash developers working with me: one developer responsible for each game, one responsible for the overall UI and one developer to help out wherever help was needed.

All the developers had experience with building games and most had experience with touch screen devices, but I was the only one who had built AIR applications for touchscreen devices before, so we had to set up a working environment where the individual developers could work and test on their own computers without having to learn too many new technologies. I was basically responsible for setting up an architecture for them to work in, where they wouldn’t have to worry about connecting to the Flash Media Server for multiplayer communication, how to run and launch the application, or how to switch between the games and the UI. Half of the team were working in Flash Builder 4, and the other half were working in FDT 3 or 4. Two were on Windows and the rest on Mac. So the solution was to create an environment where we used Ant to build and test each individual element of the application, allowing rapid testing and deployment without dependencies on the development platform. Most of the team didn’t have experience with such a cross-platform approach, but everybody picked up the flow just fine without having to spend too much time learning it.

One of the big challenges of this project was dealing with the 3G connection on the units. We don’t want to bother casual bus passengers with error messages about latency issues or loss of connection, so the focus in the error handling was on making the impact on the user as small as possible. While the internet connection might be crashing in the background, the goal was to make sure the user could go on playing his or her game without noticing something was wrong. Adobe AIR offers some really good solutions for dealing with this situation, as the technology lets you do offline storage. Basically, every time we get an XML response back from the backend keeping track of the score, we store the response locally. That way, the next time the unit needs to check the high score and fails to connect to the backend server, it simply falls back to the local version of the high score XML stored on the unit. Pretty neat, and the user will never notice anything is wrong.
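
Here is a minimal sketch of that fallback pattern, using a local SharedObject as the offline cache. The function names are made up, and the actual project may well have used a different AIR storage API.

    import flash.net.SharedObject;

    var cache:SharedObject = SharedObject.getLocal("highscores");

    // On every successful backend response, keep a local copy of the XML
    function onScoresLoaded(xml:XML):void {
        cache.data.scoresXML = xml.toXMLString();
        cache.flush();
        showScores(xml);
    }

    // If the 3G connection is down, silently fall back to the cached copy
    function onScoresFailed():void {
        if (cache.data.scoresXML != null) {
            showScores(new XML(cache.data.scoresXML));
        }
    }

    function showScores(xml:XML):void {
        // render the leaderboard – omitted here
    }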

The high score might not be completely up to date, but there is no way the user will know, and when the internet connection comes back up, the scores are updated with the latest data. Since the connection was never down for very long, I doubt any users would have had the time to travel to one of the other bus stops to spot the inconsistency.

There were a lot of solutions like that built into the application, and all of them were built on assumptions on how we expected the units to behave in the field, so naturally we were a little anxious once the units actually hit the field, but after a little bit of going back and forth and a very solid team effort between all the stakeholders everything worked out just fine.

On the practical side, we were developing for a resolution of 1080×1920, which meant getting monitors that could rotate to portrait mode and connecting them to our laptops. A couple of weeks into the project we were shipped a 70″ screen from Korea. One of the more unusual challenges we had to deal with was how to get this 250kg beast through our office door and how to mount it against the wall, but fortunately a couple of the developers were pretty big, so we eventually managed.

This was definitely the best looking image I’ve ever seen on a monitor. This unit could light up the room by itself, and seeing our layout on it for the first time was really cool. To my regret we couldn’t connect our PS3 to it, so we didn’t really get to test the performance of the monitor. Also it made a lot of noise when it was turned on.

The 70″ device wasn’t touch enabled, and we had to wait another couple of weeks until we got a unit that was.

Unfortunately that unit was only 47″, and while that is a pretty big screen, there is still a long way up to 72″. Building the games therefore put a lot of demand on us, as we basically had to take each game back and forth between our 24″ development monitors, where we did the development, the 47″ touch screen, where we tested the gameplay, and the 70″ screen, where we checked the layout. At times the developers were standing in line to get time on the big screens, but most of the time we were able to share them. For a while I even wrote code with the 70″ display as my primary monitor just to try it out, but I had to stop when my eyes felt like they were about to start bleeding. That display is very bright indeed.

All in all I’m very proud of this project, and I absolutely loved getting away from the usual challenges of developing for the web and mobile and really going big. The project ended January 28, and the screens are now part of the ClearChannel portfolio of outdoor displays for interactive advertising.