The World Cup offers a good barometer for how tech is being used by a critical mass of people around the world. Four years ago, we had the first World Cup of the social media era. This summer, the big story was how fans used smartphones and tablets to engage with the matches and with each other. And just as audience habits are changing, advertising tactics are evolving to keep up. Nike's Phenomenal Shot is a perfect example: a real-time campaign that played off the action in ways that wouldn't have been possible four years ago.
Remixing unforgettable moments in real time
It happens every four years. Fans around the world simultaneously cheer as amazing World Cup moments unfold.
As part of the latest Google Art, Copy & Code collaboration, Nike partnered with Grow, Wieden+Kennedy, Mindshare and Goo Technologies to create a new way of tapping into the real-time energy of live sports all across the web.
Nike Phenomenal Shot let fans all over the world view, remix and share phenomenal moments from Nike athletes just seconds after the plays happened. Display ads that celebrated these moments were delivered across the Google Display Network to a global audience with near-instantaneous speed.
Athletes came to life via a rich WebGL experience that tapped the capabilities of mobile devices and browsers. Fans could pan and tilt their phones to rotate 360 degrees around a player in 3D, compose the perfect shot, and then share it.
Doing anything for the first time is a learning experience. Here we'll walk you through our experience working with Nike, unpack the lessons we learned, and show you what we did.
Challenge #1: Prepping to work in real time
Many world-class footballers celebrate their big moments in signature ways—running with arms wide open, sliding on their knees, or pointing to the sky. These gestures were just some of the many visuals we needed to build for Nike athletes and plug into the pipeline. The challenge was serving these visuals to a global audience before the exciting in-game moment had passed.
While we couldn't predict when goals would be scored, or by whom, we could prep and perfect the workflow pipeline before the campaign began so we could react nimbly as these moments popped up during the tournament.
How we did it: Dynamic creative was built with DoubleClick Studio. Using an HTML5 template, we created and managed one flexible ad—instead of hundreds of custom-built ads—that could run across screens. This strategy helped simplify the scaling process when a big moment occurred.
We then served the dynamic creatives through DoubleClick Campaign Manager onto the Google Display Network, assembling each ad worldwide in real time based on what was happening live in the matches.
To accommodate dozens of ad alternatives localized into 15 different languages, we created a Google Spreadsheet template that served as our dynamic feed for generating ads. Once the feed was set up, we could produce new ads simply by updating the spreadsheet, which cut both the effort and the turnaround time of releasing ads in real time. The spreadsheet was robust yet simple enough that ads could be updated without help from technical or creative staff.
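To give a sense of how a feed-driven template works, here's a minimal sketch: a single ad looks up the row for the current moment and pulls localized copy from per-language columns. The column names, element IDs, and the renderAd helper are illustrative, not the campaign's actual schema.

```javascript
// Hypothetical sketch of a feed-driven ad template. Each object below stands
// in for one row of the spreadsheet-backed dynamic feed.
function renderAd(feedRows, language, momentId) {
  // Find the feed row for the moment the trafficking team just published.
  var row = feedRows.filter(function (r) {
    return r.momentId === momentId;
  })[0];
  if (!row) return; // Fall back to the default creative if no row matches.

  // Localized copy lives in per-language columns of the same row.
  document.getElementById('headline').textContent =
      row['headline_' + language] || row.headline_en;
  document.getElementById('athlete-shot').src = row.imageUrl;
}

// Example feed rows, as they might come out of the spreadsheet:
var feed = [
  { momentId: 'moment-001', headline_en: 'Phenomenal.',
    headline_es: 'Fenomenal.', imageUrl: 'img/moment-001.jpg' }
];
renderAd(feed, 'es', 'moment-001'); // render the Spanish variant
```

Because every size and language variant reads from the same feed, publishing a new moment means adding one spreadsheet row rather than rebuilding hundreds of ads.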
A single codebase was used to generate the ad units. Grunt, a JavaScript task runner, helped expedite the production and previewing of ads.
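As an illustration, a stripped-down Gruntfile for this kind of pipeline might look like the sketch below; the tasks, plugins, and paths are illustrative, since the campaign's actual build setup isn't shown here.

```javascript
// Gruntfile.js — a minimal sketch of a build-and-preview pipeline for ad
// units. Paths and task choices are assumptions, not the production config.
module.exports = function (grunt) {
  grunt.initConfig({
    // Stamp the shared template out into per-size ad shells.
    copy: {
      ads: { expand: true, cwd: 'src/', src: ['**/*.html'], dest: 'build/' }
    },
    // Rebuild whenever the template or assets change.
    watch: {
      ads: { files: ['src/**/*'], tasks: ['copy'] }
    },
    // Serve the built ads locally for quick previews across screen sizes.
    connect: {
      preview: { options: { base: 'build/', port: 8000 } }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-copy');
  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.loadNpmTasks('grunt-contrib-connect');

  grunt.registerTask('default', ['copy', 'connect', 'watch']);
};
```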
What we learned: Preparation for real-time moments was vital for success. Establishing the most efficient work pipeline across all partners early in production was key to being able to plan for both the expected and the unexpected. And while that preparation was key, execution could be simplified by controlling everything through a single spreadsheet and a single HTML5 ad template.
We knew we wanted the creative to respond to real-time moments in the game. We just didn't know what those moments might be, when they'd occur, or which athletes would be involved, so we had to be prepared for everything.
Challenge #2: Developing a visually stunning mobile experience
With device-performance and file-size limitations, 3D and other rich visual experiences can be difficult to create in a mobile environment and frequently require opening a separate app. To address these issues, we tapped into the power of the mobile Chrome browser, which supports rich graphics capabilities, as well as the expertise of some pretty talented agency partners.
How we did it: We started with the animated World Cup short "The Last Game," created by Wieden+Kennedy and Passion Pictures for Nike. Goo Technologies helped us convert the assets into WebGL using the Goo Create Engine so users could then interact with the players.
We knew WebGL would give us the rich look and feel we wanted, but the technology is still relatively new on smartphones. We needed to convert the 3D assets from the short film into something we could work with. While there are plenty of engines available for that, we chose Goo because we were impressed by the performance and visual quality of its Pearl Boy demo, and because the Goo Create Engine is known for its fine-tuned memory control and mobile-specific optimizations.
The Goo team also optimized textures and lighting to get as close as possible to the original 3D renders, making the characters look like they were straight out of the short. Baking these elements in ahead of time kept load times fast and eliminated the need to render dynamic lighting. The Grow team then took that content and built the web experience around it.
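To illustrate what baking buys you: rather than computing lights every frame, the fragment shader simply multiplies the character's diffuse texture by a lightmap that was rendered offline. This is a generic WebGL-style sketch, not Goo Create's actual shader; the uniform names are illustrative.

```javascript
// Baked lighting in a WebGL fragment shader: all the expensive lighting work
// happened offline, so at runtime it's just two texture reads and a multiply.
var bakedLightingFragmentShader = [
  'precision mediump float;',
  'varying vec2 vUv;',
  'uniform sampler2D uDiffuse;   // albedo painted by the artists',
  'uniform sampler2D uLightmap;  // lighting pre-rendered ("baked") offline',
  'void main() {',
  '  vec3 albedo = texture2D(uDiffuse, vUv).rgb;',
  '  vec3 light  = texture2D(uLightmap, vUv).rgb;',
  '  gl_FragColor = vec4(albedo * light, 1.0);',
  '}'
].join('\n');
```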
What we learned: Optimizing 3D assets for WebGL prior to the tournament meant we could produce a visually stunning mobile experience quickly so when an iconic World Cup moment occurred, we only had to pose and apply textures to the models instead of building an entire scene.
Nike's "The Last Game" athletes were rendered in Goo Create.
Challenge #3: Standardizing the mobile experience
Once the characters were converted to WebGL, we had to ensure the experience would work uniformly across devices.
How we did it: To ensure a consistent experience, we aggregated data from various hardware manufacturers and built a custom tool. Device sensors such as the gyroscope, accelerometer, and compass had to be normalized so the 3D camera would react the same way when fans panned any device 360 degrees around a character.
Out-of-the-box support for these inertial sensors is not yet consistent across web browsers, however, so we created a JavaScript library that detects device variations and provides normalized values. We then implemented an exponential smoothing algorithm to keep the experience fluid as users panned and tilted their phones. Built on the HTML5 device-motion and device-orientation APIs, this gave us a seamless experience across all devices.
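Here's a simplified sketch of that approach using the standard deviceorientation event: normalize the compass heading and blend each new reading into a smoothed value. The smoothing factor and the updateCameraYaw hook are illustrative; the production library also corrected for per-device sensor quirks.

```javascript
var SMOOTHING = 0.15;   // in (0, 1]; lower is smoother but laggier
var smoothedYaw = null; // running estimate of the device's heading

function updateCameraYaw(yaw) {
  // Stand-in for the real camera code: rotate the 3D view to `yaw` degrees.
  console.log('camera yaw:', yaw.toFixed(1));
}

window.addEventListener('deviceorientation', function (event) {
  if (event.alpha === null) return; // No orientation sensor available.

  // Normalize the compass heading (alpha) into [0, 360).
  var yaw = ((event.alpha % 360) + 360) % 360;

  if (smoothedYaw === null) {
    smoothedYaw = yaw; // First reading: no history to smooth against.
  } else {
    // Exponential smoothing along the shortest path around the circle,
    // so the camera doesn't spin the wrong way across the 0/360 seam.
    var delta = ((yaw - smoothedYaw + 540) % 360) - 180;
    smoothedYaw = (smoothedYaw + SMOOTHING * delta + 360) % 360;
  }

  updateCameraYaw(smoothedYaw);
});
```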
We wanted to create a single experience across devices that responded consistently to user gestures.
What we learned: We chose not to build a mobile experience geared toward the lowest common denominator of device performance and capability. Instead, we found that custom libraries targeting specific devices let us present a rich experience on whatever device a user chose, without compromising our original vision.
A custom JavaScript library of device variations allowed Phenomenal Shot to offer the same experience on any device.
Challenge #4: Creating a 3D experience for all
Because WebGL is a new technology that isn't yet available on all mobile operating systems, we had to find a creative workaround to simulate the 3D experience using 2D assets.
How we did it: We used image sequences to make 2D look like 3D on non-WebGL devices. But rather than creating a simple animated flipbook, we tied the animation's directional controls to device-sensor data so users could control the animation by moving their phones.
Using the WebGL renderer built for the full 3D experience, we built a tool that extracted imagery from the actual WebGL scene. We took snapshots of the WebGL environment from every possible angle, which allowed us to recreate the experience in a 2D world. The final export was 100 images per player, which together simulated a 3D experience when fans panned their phones around a character.
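As a rough sketch of that export tool: step the camera a full circle around the character and snapshot the canvas at each step. The renderSceneAtYaw hook is a stand-in for the real renderer call, which isn't shown here.

```javascript
var FRAME_COUNT = 100; // one frame per 3.6 degrees, matching the final export

// `canvas` is the WebGL canvas (its context needs preserveDrawingBuffer: true
// so toDataURL can read pixels back); `renderSceneAtYaw` is a stand-in for
// the renderer hook that redraws the scene with the camera at a given angle.
function exportTurntable(canvas, renderSceneAtYaw) {
  var frames = [];
  for (var i = 0; i < FRAME_COUNT; i++) {
    renderSceneAtYaw((i / FRAME_COUNT) * 360);  // draw this angle
    frames.push(canvas.toDataURL('image/png')); // snapshot the framebuffer
  }
  return frames; // the 2D image sequence, ready to save
}
```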
We maintained one codebase with very few variants between the 2D and 3D experiences. Sensor controls, sharing to social platforms, and other features all worked with little or no modification; the experience felt very similar on non-WebGL and WebGL-enabled devices.
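On non-WebGL devices, the same smoothed yaw value that drives the 3D camera can instead pick a frame from the exported sequence, along these lines (the element ID and file paths are illustrative):

```javascript
var FRAME_COUNT = 100;
var frameImage = document.getElementById('player-frame');

// Map a heading in [0, 360) degrees onto the pre-rendered frames, so panning
// the phone appears to rotate the player just like the WebGL version.
function showFrameForYaw(yaw) {
  var index = Math.floor((yaw / 360) * FRAME_COUNT) % FRAME_COUNT;
  frameImage.src = 'frames/player_' + index + '.png';
}
```

Swapping the renderer behind a single control path is what kept the two experiences feeling interchangeable.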
What we learned: By creating a tool to capture 2D assets from the 3D experience, we could simplify our production pipeline and gain additional device support without compromising the overall user experience.
A side-by-side comparison of the 2D versus the 3D experience.
Conclusion
We pushed out real-time ads to celebrate phenomenal World Cup plays, such as when Brazil's Neymar Jr. scored two goals in the first half of the Cameroon match. In just a few weeks, about 2.2 million people experienced Phenomenal Shot, exploring and customizing Nike content. While the ads themselves ran in 15 countries, the social, shareable nature of the experience drew in fans from more than 200 countries. As WebGL and other technologies improve and become available on more devices, we can continue to push these boundaries. A beautiful, real-time campaign like Phenomenal Shot is just the beginning.