Data is playing an increasing role in all aspects of life, and one of those areas is the sports world, where fans follow their favorite professionals, track their stats, and cheer them on at events or online. As the sport of cycling rises in popularity, real-time data was critical in making Tour de France fans feel they were part of the race, no matter how or where they viewed the event.

In partnership with the Amaury Sport Organisation (A.S.O.), the organizer of the Tour de France, Dimension Data, a global IT services provider, delivered a unique, engaging fan experience for this event using real-time data and stats collected throughout the 21-day race. This past July was the second year in which the company worked with the riders and teams to collect, analyze, and share rider data with fans and the media for a broader, richer experience.

The information is provided through a cloud, data analytics, and digital delivery platform originally built in 2015, which allowed real-time information on individual riders, including position, speed, and the distance between riders, to be shared. In 2016, based on the results of the prior year’s live website, the A.S.O. and Dimension Data developed the Race Center, a cloud-hosted web application that combined live race data with video, social media feeds, and other content into the digital hub of the Tour de France. Additionally, the team brought in new data on wind speed and direction as well as prevailing weather conditions.
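
To make the rider data concrete, here is a minimal Python sketch of how a gap between two riders might be derived from their reported positions. The record fields and the haversine-based distance calculation are assumptions for illustration only, not Dimension Data’s actual data model or code.

```python
import math

# Hypothetical telemetry records for two riders; field names are illustrative.
rider_a = {"rider_id": 1, "lat": 44.1740, "lon": 5.2786, "speed_kmh": 21.5}
rider_b = {"rider_id": 2, "lat": 44.1712, "lon": 5.2751, "speed_kmh": 20.9}

def gap_meters(r1, r2):
    """Great-circle (haversine) distance between two rider positions, in meters."""
    radius = 6371000  # mean Earth radius in meters
    phi1, phi2 = math.radians(r1["lat"]), math.radians(r2["lat"])
    dphi = math.radians(r2["lat"] - r1["lat"])
    dlmb = math.radians(r2["lon"] - r1["lon"])
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

print(f"Gap between riders: {gap_meters(rider_a, rider_b):.0f} m")
```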

Tim Wade, senior director of architecture in the Dimension Data Sports Practice, said, “The solution enhancements in 2016 meant we told richer stories as they happened, giving viewers, the media, cycling fans, and race commentators deeper insights into the many aspects of the sport. This year, we worked with a much broader palette, with access to more meaningful race data, race routes, riders, and current weather conditions. And the exciting part was the way we delivered all this information to A.S.O. through a unified digital platform. We were able to speak directly to a generation of younger viewers who rely on technologies such as social media and live video.” 

The data is collected in real time via telemetry sensors installed under each rider’s seat, captured by the race cars and motorbikes following the riders, then beamed to helicopters and sent via the cloud to a big data truck located at the daily finish line. There, the information is processed in a data analytics platform with the assistance of analysts on the ground and then shared with a social media team and TV broadcasters, who distribute the data across the live tracking and commentator sites.
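
The flow described above is, in effect, a streaming pipeline: raw sensor messages arrive from the course, an analytics step enriches them, and the results fan out to downstream consumers. Below is a minimal sketch of that shape, assuming a Python implementation with illustrative message fields; it does not reflect Dimension Data’s actual system.

```python
import json
import queue
import threading
import time

incoming = queue.Queue()   # raw sensor messages relayed from the course
outgoing = queue.Queue()   # processed records for tracking/commentator feeds

def analytics_worker():
    """Consume raw telemetry, attach derived fields, and publish downstream."""
    while True:
        raw = incoming.get()
        record = json.loads(raw)
        record["speed_mph"] = round(record["speed_kmh"] * 0.621371, 1)
        record["processed_at"] = time.time()
        outgoing.put(record)
        incoming.task_done()

threading.Thread(target=analytics_worker, daemon=True).start()

# Simulate a sensor message beamed in from the course.
incoming.put(json.dumps({"rider_id": 42, "speed_kmh": 48.3, "lat": 44.17, "lon": 5.27}))
incoming.join()
print(outgoing.get())  # enriched record ready for the live tracking site
```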

Cloud and data analytics are integral to the success of this platform. As the 2016 race progressed through each of the 21 stages, the big data truck followed along, covering more than 3,040 miles through winding and steep mountains, valleys, fields, and all types of weather. Cloud technology, with its inherently flexible, scalable architecture, handled the compute and processing, ensuring that the data was securely and continuously collected, aggregated, and shared with broadcast commentators and fans through the web, social media, and TV broadcasting.

“Cloud was the cornerstone of the solution. All the fan interaction was through the cloud. And it ran the digital platform from which we presented information to the live tracking and commentator sites. We also used the cloud for data redundancy. We sent the race data to the truck at the finish line and shared it with London and Amsterdam data centers, so if there was an issue, we were fully covered,” Wade said.
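
The redundancy pattern Wade describes can be sketched roughly as each record being written to the primary site and mirrored to two cloud regions, so a failure at any one location does not interrupt the feed. The following Python sketch uses placeholder endpoint URLs and is an illustration of the pattern, not the actual implementation.

```python
import requests

ENDPOINTS = [
    "https://truck.example/ingest",  # primary: big data truck at the finish line
    "https://lon.example/ingest",    # replica: London data center
    "https://ams.example/ingest",    # replica: Amsterdam data center
]

def replicate(record: dict) -> int:
    """Send the record to every endpoint; return how many writes succeeded."""
    successes = 0
    for url in ENDPOINTS:
        try:
            resp = requests.post(url, json=record, timeout=2)
            resp.raise_for_status()
            successes += 1
        except requests.RequestException:
            # A single failed replica is tolerated; the remaining copies keep
            # the data available for the live tracking and commentator sites.
            continue
    return successes
```

In a production setup the writes would be asynchronous and encrypted in transit, as the article notes, but the basic fan-out shape is the same.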

The cloud platform was put to the test in Stage 12 of the 2016 Tour de France, when riders faced the punishing climb from Montpellier to Mont Ventoux, with its 6,272-ft summit. In the early morning hours of July 14, the stage’s route changed as the weather forecast predicted winds of 93 miles per hour, forcing the finish line to be relocated. Although this helped the riders, it presented new challenges for gathering and delivering race information. The technical zone at the race finish line was now split in two, with race fans and some central providers at one location 3.7 miles further up the mountain while the larger data and media trucks remained at the original finish line.

With no interconnectivity between the two locations, the technology team quickly assessed the situation, switched to new internet-based services, and made the necessary adjustments to the application and networking configurations to ensure that data delivery continued without interruption and that the team could meet the 100% service availability outlined in the A.S.O. service level agreement.

“Our technical team immediately split into groups to set up the new integration requirements successfully. We also ensured that the disaster recovery solution in the cloud was operational within that short time period while working around challenges like lack of mobile coverage for the technical zone. In spite of the inclement weather, we were still able to deliver race data to fans, professionals and the media around the world,” Wade commented.

Overall, 22 analysts and technologists in France and in regions around the world were involved in testing, developing, and managing the cloud solution that supported the 21-day race. The team rolled out 41,338 ft of cable, allowing more than 127.8 million data records to be processed in the cloud.

“This year, we delivered valuable race information to A.S.O. through a unified digital platform hosted in the cloud, in parallel with a big data truck and cloud-based, disaster recovery solution. The quality of data and real-time availability of that data were critical components of the technology solution. The cloud ensured we kept the data secure, and encrypted connectivity allowed the race information to be transmitted to broadcasters, the media, the teams, and viewers securely,” Wade said. “We are looking forward to further cloud innovation and data analytics advancements in 2017.”

This article was originally published as “Racing through the clouds” in Cloud Strategy Magazine.