Dec 19, 2021
We’ve been developing AR products for social media platforms since 2019 and have completed projects for Sony, L’Oréal, Lego, Red Bull (DE), realme (Global), and others. Read below to find out how we managed to fit a full-fledged race into 4 MB.
It’s part of a general trend: over the last two years, big brands and companies have increasingly turned to gaming as a tool for communicating with a younger audience. The agency that manages Ford Europe was also looking for new interaction formats involving Spark AR for a new ad campaign and became interested in our Follow the Dream project.
In 2020 Ford launched a new urban SUV, the Ford Puma ST — a car that rivals a race car: fast to respond, sharp in cornering, truly built to thrill, just like the tagline says. To promote the car, the marketing team wanted to combine the gamification capabilities of Spark AR with traditional advertising channels, and so decided to create an Instagram race.
It was a simulation of a race on the Brands Hatch Circuit, located southeast of London. The main challenge of the project was using an actual map. We had to recreate realistic cornering on a real track, and we couldn’t afford for it to look like a flat 2D simulation; otherwise, it wouldn’t convey the car’s advantages.
So that the client and the team could picture the final product, we started by preparing a detailed game scenario with mechanics, game logic, and some of the storyboards — this is our process for every Spark AR project. Whether it is a simple AR effect or a game, all participants must have a clear, understandable, and — most importantly — approved reference point. This product was no exception, and we immediately drew:
The project team:
The creative producer played a crucial role, as he wasn’t just responsible for the idea but could look at the completed gameplay critically through the eyes of the client and the player.
The technical director helped the team meet the deadline by prototyping mechanics in parallel with concept work. The finished product had to be ready in a month, as the advertising campaign had a strict timeframe.
We couldn’t just create a fictional environment around the track for this project. We had to reflect the actual Brands Hatch Circuit, which hosts many British and international competitions. The idea was to immerse players in the atmosphere of the race from the very first second, so we couldn’t allow ourselves to copy and redraw it by hand — we had to use the original track.
We found that the standard Google Maps view is flat and doesn’t reflect surface elevation and irregularities. That’s why we started looking for a way to extract the desired landscape fragment in 3D.
As a result, we got a 3D screenshot of the terrain with the required elevation differences. When we compared this model to YouTube videos of the track, we saw that our idea had worked out nicely.
Technical details on how to extract the terrain from the map
1. Launch Google Chrome in a special mode that pauses the GPU process and reports its PID.
2. Run RenderDoc, choose Inject into Process, and find the Chrome process by its PID.
3. Open the area of interest in Google Maps in the Chrome window you launched previously, then use the Capture Frame button.
4. RenderDoc captures the frame — effectively a 3D scan of the area.
5. Open Blender, enable the Maps Models Importer add-on, and load the captured file.
6. Compare the result with YouTube footage of the track to check whether recreating the landscape this way was successful.
7. Clean up the landscape.
8. Add the road.
9. Hand the model off to the artists.
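Step 1 above can be reproduced with a command along these lines (a sketch for Windows; the Chrome path depends on your install, and RenderDoc’s capture requirements may differ between versions):

```shell
REM Launch Chrome with the GPU sandbox disabled so RenderDoc can inject into it.
REM --gpu-startup-dialog pauses the GPU process and shows its PID in a dialog;
REM that is the PID to select in RenderDoc's "Inject into Process" view.
"C:\Program Files\Google\Chrome\Application\chrome.exe" ^
  --disable-gpu-sandbox ^
  --gpu-startup-dialog
```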
We had less than 2 MB of space for the entire map, so we used a texture atlas and placed all the textures needed to paint the track on it.
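The atlas approach can be sketched as follows — a minimal, hypothetical example of remapping a material’s local UVs into its region of a shared atlas (the region names and layout are illustrative, not the project’s actual assets):

```javascript
// Each atlas region is described by its offset and size in normalized [0, 1]
// texture coordinates. A 2x2 atlas layout is assumed here for illustration.
const atlasRegions = {
  asphalt: { u: 0.0, v: 0.0, w: 0.5, h: 0.5 },
  grass:   { u: 0.5, v: 0.0, w: 0.5, h: 0.5 },
  curb:    { u: 0.0, v: 0.5, w: 0.5, h: 0.5 },
};

// Remap a local (u, v) on a single material into the shared atlas.
function toAtlasUV(region, u, v) {
  return { u: region.u + u * region.w, v: region.v + v * region.h };
}
```

Packing every surface into one texture means the whole track can be drawn with a single material, which keeps both file size and draw calls down.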
It looked realistic: as if we had just arrived at the track early in the morning to train before the upcoming race.
We couldn’t use a random car model: the player had to drive the Ford Puma ST and nothing else. The client provided us with a 3D model of the car, but it was more than 100 MB in size instead of the 1.5 MB we needed, and consisted of 1.5 million triangles instead of ~10k.
We optimized it using the traditional game development method: model retopology, baking normals and base materials, and creating textures in Substance Painter. We essentially rebuilt the model, reducing its size many times over while maintaining sufficient realism. The final version of the car model consisted of 6,600 triangles — a size we could work with.
If you look closely enough, you’ll notice several visual elements, such as switched-on headlights, tire marks on the road, and smoke coming from under the tires. In combination with illuminated, interactive objects, all of this responds directly to the user’s actions, showing what’s happening on the track.
We also added an acceleration effect that distorts the camera image, adding even more dynamics to the game.
A mini-map is a common element of full-scale computer games, but one that had never been implemented in an Instagram filter. Essentially, it’s a duplicate of the road that mirrors the car’s coordinates. The map is rendered separately and overlaid as a texture on top of the final game frame. We had to allocate memory for it as well.
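The core of a mini-map like this is projecting the car’s world position onto the map texture. A minimal sketch, with illustrative track bounds and map dimensions (not the project’s real data):

```javascript
// Assumed bounding box of the track in world units, and the mini-map
// texture size in pixels. Both are hypothetical values for illustration.
const trackBounds = { minX: -200, maxX: 200, minZ: -150, maxZ: 150 };
const mapSize = { width: 128, height: 96 };

// Normalize the car's world (x, z) into [0, 1], then scale to map pixels.
// Z is flipped because texture Y typically grows downward.
function worldToMiniMap(x, z) {
  const nx = (x - trackBounds.minX) / (trackBounds.maxX - trackBounds.minX);
  const nz = (z - trackBounds.minZ) / (trackBounds.maxZ - trackBounds.minZ);
  return { px: nx * mapSize.width, py: (1 - nz) * mapSize.height };
}
```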
To build the trajectory and preserve the shape of the landscape, we used an .obj file containing the track’s extreme points and calculated the car’s position along five axes: X, Y, Z, rotationX, and rotationZ. We derived rotationY from the difference between the current vehicle position and the position 50 ms earlier. As a result, the car follows the bends of the road along the entire track.
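The rotationY calculation described above boils down to taking the heading of the displacement vector between two samples. A sketch under assumed axis conventions (0 = facing +Z; the 50 ms sampling interval comes from the text):

```javascript
// Derive the car's yaw (rotationY) from the difference between its current
// position and the position sampled ~50 ms earlier.
function headingY(current, previous) {
  const dx = current.x - previous.x;
  const dz = current.z - previous.z;
  return Math.atan2(dx, dz); // yaw in radians, 0 = facing +Z
}
```

Because the heading comes from actual displacement along the track spline, the car naturally rotates into every bend without any hand-authored rotation keyframes.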
And that’s not all! If you take a closer look at the car while playing, you can see that it turns its wheels, leaves marks on the road, and even tilts to the side during turns. To get this effect, we made a simple rig of the car (each wheel has an axis of rotation and position) and derived a turn-curvature parameter. The same goes for the car body tilt: when the car turns to the left, the body tilts to the right. It seems that the car is skidding, but this is just an optical illusion.
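The rig logic can be illustrated with two small functions — wheel spin from distance travelled, and body roll opposite to the steering direction. All constants and the steering convention here are assumptions for the sketch, not the project’s actual values:

```javascript
const WHEEL_RADIUS = 0.3; // wheel radius in meters (assumed)
const MAX_TILT = 0.06;    // body roll in radians at full steering lock (assumed)

// Angle (radians) a wheel has rotated after travelling a given distance.
function wheelSpin(distanceTravelled) {
  return distanceTravelled / WHEEL_RADIUS;
}

// steering in [-1, 1]: a left turn (negative) tilts the body right (positive),
// producing the skid-like optical illusion described above.
function bodyTilt(steering) {
  return -steering * MAX_TILT;
}
```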
We enhanced these two effects by animating the picture around the car, since the camera is fixed in Spark AR Studio and can’t be moved. We approached it from the opposite direction: the car stays still while we animate the entire world around it. It’s like the illusion of movement at a train station, when it seems that our train is moving while it’s actually the train on the other track that is.
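The "move the world, not the car" trick above can be sketched as a per-frame update that shifts the environment by the opposite of the car’s displacement. Names and the integration scheme are illustrative:

```javascript
// Each frame, offset the world root by the negative of the car's velocity
// times the frame time, so the stationary car appears to drive forward.
function updateWorldOffset(worldOffset, carVelocity, dt) {
  return {
    x: worldOffset.x - carVelocity.x * dt,
    z: worldOffset.z - carVelocity.z * dt,
  };
}
```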
We also used our own SDK to speed up the development of the game mechanics. The SDK is our internal product, and on top of it we’ve built many ready-made templates of the most popular game mechanics for Instagram.
In the end, the game took 5800 lines of code.
One of the biggest joys of working in Spark AR Studio is its node-based scene hierarchy, one of the best around. Note that the node hierarchy is used mostly for setting up the visuals, less so for the game mechanics.
The screenshot below partially shows the hierarchy of the project and the space that the final source takes.
After a month of well-coordinated work, we got a dynamic AR game in which a person doesn’t just look at the product but actively interacts with it for an entire minute — something no traditional advertising channel can achieve. Organic promotion of the effect alone drove over 775K impressions, 75K game sessions / AR effect uses, and 2K shares within just 2.5 weeks of release.