BeIN Case Study: Creating a Tailor-made Test Infrastructure


Who is BeIN?

BeIN, a global sports and entertainment media group, broadcasts 60 channels in 43 countries across 5 continents, in 7 different languages. BeIN's services include streaming and VoD (Video-on-Demand), as well as value-added services built on complex business logic developed over the last two decades.

Overview

This case study covers the work that kloia did with Digiturk in four phases. Each phase is self-contained, but viewed as a whole, every phase builds on its predecessors.

 

Phase 1 – Building the API test automation infrastructure.

Phase 2 – Unifying Web, Android and iOS projects under the same roof.

Phase 3 – Integrating the automation into CI/CD.

Phase 4 – Integrating the device farm for mobile test automation.


Phase 1 - New Architecture, New Test Automation

Problem

Digiturk’s streaming platform had a monolithic design, which hindered the development process since every code change affected the whole system and its applications. It also increased the complexity of testing and development. So Digiturk decided to migrate to a microservice architecture. The newly developed microservices and the existing APIs needed to be tested, and at the client's request, an API test automation project needed to be introduced.

Phase 2 - A Common Ground for Client Projects

Problem

The client-side test projects were written in different languages and with different frameworks, which made it hard for everyone to contribute to all of the projects. The need was to find a common ground for all projects, making it easy for any team member to take part in every project.

Phase 3 - CI/CD Integration

Problem

Executing tests at regular intervals, triggering them on events, and generating reports are prerequisites for an automation environment. So, we implemented a CI/CD integration.

Phase 4 - Device Farm Integration

Problem

Mobile applications must be tested on a variety of devices and app versions because of the key differences listed below:

- OS Type

- OS Version

- Screen Size

- Hardware

- Device Model / Brand

Digiturk’s customers use the application on a wide variety of devices, which means it needs to be tested on most of them. Additionally, implementing CI/CD requires several dedicated mobile devices, and the purchase and maintenance costs of these devices are high. Therefore, we needed to research and implement a device farm service for the mobile test automation projects.

Client: Digiturk, BeIN Media Company

Project type: Test Automation Transformation Project

Website: www.beinconnect.com.tr

Phase 1 - New Architecture, New Test Automation

Solution

We started by analyzing the team's project management structure. We identified areas for quick wins and strengthened the foundations by re-introducing project management fundamentals and implementing agile processes for the automation projects. Furthermore, we scheduled meet-ups that included retrospectives, sprint reviews, and grooming.

Modernization processes often involve significant code changes, and it is crucial to test every feature to ensure everything works as intended throughout the process. Therefore, we began designing a test infrastructure to address potential problems related to the API services, ranging from functionality to performance issues. Our aim was to develop API tests that follow the microservice development plan, and to have a regression test suite that shows the impact of any change.

We needed to decide on an API testing tool and framework. There were various options available, including paid and open-source frameworks, and as a consultancy company, we evaluated them and chose the most appropriate framework based on several factors.

Some of these factors are:

  • Price
  • Usability
  • Reporting
  • Learning Curve
  • Development Cost

After considering our options, we chose the Karate framework to test the APIs. We prepared and presented Proofs of Concept (PoCs) to demonstrate the feasibility of our proposed API testing process, and we created a tailor-made infrastructure for our customer's architecture.
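To illustrate, Karate expresses API tests as plain-text feature files that read close to Gherkin. The snippet below is a minimal sketch with a hypothetical endpoint and fields, not BeIN's actual API:

```gherkin
Feature: subscriber API smoke test

Background:
  # Hypothetical base URL, for illustration only
  * url 'https://api.example.com'

Scenario: fetch an active subscriber
  Given path 'subscribers', 1001
  When method get
  Then status 200
  And match response contains { status: 'active' }
```

Because feature files like this are readable without programming knowledge, they also make it easier for non-developers to follow what each test covers.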

When BeIN approved the change, we created a roadmap, divided into different phases.

As a part of Phase 1, we configured the project architecture to run load tests along with the API tests. We prepared the documentation in accordance with project requirements and rules. We trained the QA team through tutorials and technical sessions. We updated the existing test cases and created new ones for the microservices. Furthermore, we automated the Smoke and Regression test suites and integrated them with the Azure DevOps pipeline for scheduled testing.

Phase 2 - A Common Ground for Client Projects

Solution

The existing client-side (Web, Android, and iOS) test automation projects were written in Java/Selenium, Java/Appium, JavaScript/Cypress, and Objective-C/XCUITest pairs.

We analyzed the structure of these projects and our client’s needs, and decided that the best course of action was to gather all client-side projects under a common roof: one language, framework, and architecture. We chose Ruby and a Behavior-Driven Development (BDD) approach as our base, for the following reasons:

  1. Easier to Learn

Ruby is a high-level language that is far easier to learn from scratch than Java or Objective-C, which allowed us to quickly onboard manual test engineers onto the automation projects.

  2. Faster Development

Writing test scripts in Ruby is faster than writing them in Java or Objective-C. On the mobile automation side, we chose the Appium framework instead of XCUITest because Appium is cross-platform, which makes it possible to use the same functions in both the iOS and Android projects and thereby cuts the development effort roughly in half (see the sketch after this list).

  3. Low Maintenance Cost

Developing mobile test automation projects is costly. Keeping both projects in the same structure decreased the maintenance effort.

  4. Efficiency Through Standardization

The decision to use the same language and structure across all three automation projects enabled each team member to participate in the projects more efficiently. The iOS automation project was initially written in Objective-C using the XCUITest Framework, which was challenging for beginners to learn and write. By using Appium, a common ground was established for both mobile projects.
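As a rough sketch of that common ground, the snippet below shows a single Ruby driver factory serving both mobile projects via the appium_lib gem; the build paths and server URL are placeholders, not BeIN's actual configuration:

```ruby
# Minimal sketch: one appium_lib driver factory for both mobile platforms,
# so step definitions and helper functions can be shared.
require 'appium_lib'

def build_mobile_driver(platform)
  caps =
    if platform == :android
      { platformName: 'Android',
        'appium:automationName' => 'UiAutomator2',
        'appium:app' => './builds/app.apk' }   # placeholder build path
    else
      { platformName: 'iOS',
        'appium:automationName' => 'XCUITest',
        'appium:app' => './builds/app.ipa' }   # placeholder build path
    end

  Appium::Driver.new(
    { caps: caps, appium_lib: { server_url: 'http://localhost:4723/wd/hub' } },
    true
  )
end

# Example: PLATFORM=ios bundle exec cucumber
driver = build_mobile_driver(ENV.fetch('PLATFORM', 'android').to_sym)
driver.start_driver
```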

Adopting a BDD approach facilitated a common ground for Web, API, and Mobile projects, reducing the challenges of context switching between them. As a result, each team member was able to participate in two or three different projects simultaneously.

The first step was to migrate the Web project, which we then used as a model for the other two projects. We automated the smoke and regression test scenarios and, once the web automation project reached a certain level of maturity (which took approximately six months), we began migrating the mobile projects. Because the Web project served as the pilot and all team members were onboarded through it, it took a while to mature; the subsequent projects reached their maturity thresholds much faster, since everyone was already familiar with the project structure they had seen in the Web project.

We standardized methods and step names by making small modifications to the scenarios used in the web project, creating a structure that makes maintenance faster and makes the code easier to read and contribute to.
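For instance, a standardized step can read identically in every project's feature files while each project supplies its own page implementation behind it; the step text and class names below are hypothetical:

```ruby
# Minimal sketch (hypothetical step text and page class): the same
# Cucumber step definition shape is reused across Web, Android, and iOS,
# so a scenario reads identically in all three projects:
#
#   When the user logs in with a valid subscription
#
When('the user logs in with a valid subscription') do
  # @driver is a Selenium driver in the Web project and an Appium
  # driver in the mobile projects; the page object hides the difference.
  LoginPage.new(@driver).login(ENV.fetch('TEST_USER'), ENV.fetch('TEST_PASSWORD'))
end
```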

Phase 3 - CI/CD Integration

Solution

The existing infrastructure was built on Azure DevOps. To execute automated web tests in parallel, a structure was built with Selenium Grid 4. A pipeline was built, documentation was prepared, and team members were onboarded to configure and use the CI/CD integration.
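As a minimal sketch of that structure, a test session is routed through the Grid with Ruby's selenium-webdriver gem; the hub address below is a placeholder:

```ruby
# Minimal sketch: routing a browser session through a Selenium Grid 4 hub
# so scenarios can run in parallel across the grid's nodes.
require 'selenium-webdriver'

driver = Selenium::WebDriver.for(
  :remote,
  url: 'http://grid-hub.internal:4444/wd/hub',   # placeholder hub address
  options: Selenium::WebDriver::Chrome::Options.new
)
driver.get('https://www.beinconnect.com.tr')
driver.quit
```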


To satisfy different workflow needs, a set of jobs was created in the pipeline:

  • Smoke: This job runs every morning to ensure the essential features of the product work as intended.
  • Regression: This job runs after every deployment to ensure that new developments do not break existing features.
  • Single & Multiple Tag: This job executes scenarios carrying a certain tag, to test specific parts of the application.

Creating an execution schedule for the smoke and regression jobs helped us identify problematic features and act on them daily. Environment variables such as the driver instance (Chrome, Firefox, Safari), the tags of the tests to be run, the retry count, and the thread count were also made configurable, so that they can be passed as arguments.
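For example, a small runner can read those variables and hand them to the test command; the variable names below are illustrative, and the parallel run uses the parallel_tests gem:

```ruby
# Minimal sketch (illustrative variable names): the pipeline passes tags,
# retry count, and thread count as environment variables. BROWSER is read
# separately by the driver factory when each scenario starts.
tags    = ENV.fetch('TAGS', '@smoke')
retries = ENV.fetch('RETRY_COUNT', '1')
threads = ENV.fetch('THREADS', '4')

# Runs the feature files in parallel via the parallel_tests gem, e.g.:
#   BROWSER=firefox TAGS=@regression THREADS=6 ruby run_tests.rb
system("parallel_cucumber features -n #{threads} " \
       "-o '--tags #{tags} --retry #{retries}'")
```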

Along with these, reporting the results of the test runs was crucial for detecting failed tests and problematic features. We used Cucumber Reports for the reporting system. A bash script was also written to create custom HTML reports from the execution results, since Azure's mail notification tool produced confusing test summaries by showing rerun tests as distinct tests.
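The team's report script was written in bash; as a minimal Ruby sketch of the same idea, the snippet below collapses reruns in Cucumber's JSON output so each scenario is counted once (the file path is a placeholder):

```ruby
# Minimal sketch (Ruby stand-in for the team's bash script): collapse
# rerun scenarios in Cucumber's JSON output so a rerun test is not
# reported as a distinct test; the final attempt wins.
require 'json'

features = JSON.parse(File.read('results/cucumber.json'))   # placeholder path
latest = {}

features.each do |feature|
  (feature['elements'] || []).each do |scenario|
    key = "#{feature['uri']}##{scenario['name']}"
    passed = scenario.fetch('steps', []).all? { |s| s.dig('result', 'status') == 'passed' }
    latest[key] = passed   # later attempts overwrite earlier ones
  end
end

puts "#{latest.values.count(true)}/#{latest.size} scenarios passed (reruns collapsed)"
```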

Phase 4 - Device Farm Integration

Solution

In a world overflowing with a large spectrum of mobile devices, purchasing and maintaining every device in order to test the application on all of them is not impossible, but it is impractical: it is certain to cost you dearly. This is precisely where a cloud device farm comes to the rescue. Its primary advantages can be listed as follows:

  • Device Diversity
  • No Maintenance Costs
  • Effortless Scalability
  • Increased Accessibility
  • Exhaustive Test Logging
  • Easy Integration of New Devices
  • CI/CD Integration
  • Hybrid & On-premise Options
  • Execution in Different Regions

When choosing a device farm service, we considered these criteria:

  • Execution Speed
  • Supported Operating Systems
  • Supported Tool Stack
  • Reporting / Logging
  • CI/CD Integration
  • Debugging
  • Parallel Execution
  • Community & Support
  • Real Device Diversity
  • Pricing
  • Video Recording / Screenshot
  • Beta Version Support

We then created PoCs with several services. We finally decided on the digital.ai Continuous Testing platform, formerly known as Experitest.

Digital.ai Continuous Testing provides both cloud and on-premise device farm options. Our customer decided on the cloud with a hybrid option, having both private and public devices. After the installation, we integrated these devices into the mobile projects and the CI/CD processes.
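Switching a mobile project from local devices to the farm largely comes down to the driver's endpoint and credentials. In the sketch below, the endpoint, device query, and access-key capability are placeholders standing in for the values from the vendor's documentation:

```ruby
# Minimal sketch: pointing the appium_lib driver at the cloud device farm
# instead of a local Appium server. Endpoint and capability names are
# placeholders; the exact names come from the vendor documentation.
require 'appium_lib'

opts = {
  caps: {
    platformName: 'Android',
    'appium:automationName' => 'UiAutomator2',
    'appium:deviceQuery' => "@os='android' and @version='13'",  # placeholder query
    'appium:accessKey' => ENV.fetch('CLOUD_ACCESS_KEY')          # placeholder credential
  },
  appium_lib: {
    server_url: 'https://devicefarm.example.com/wd/hub'          # placeholder endpoint
  }
}

driver = Appium::Driver.new(opts, true)
driver.start_driver
# ... scenarios now execute on a farm device selected by the query ...
driver.driver_quit
```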

With the help of the digital.ai Continuous Testing and App Center APIs, application version management was integrated into the automation processes.
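As an illustration of that integration, the latest build can be looked up through App Center's REST API before a run; the owner and app names below are placeholders:

```ruby
# Minimal sketch (placeholder owner/app names): query App Center's REST
# API for the latest release, so the automation installs the newest
# build before a test run.
require 'net/http'
require 'json'

uri = URI('https://api.appcenter.ms/v0.1/apps/example-org/example-app/releases/latest')
req = Net::HTTP::Get.new(uri)
req['X-API-Token'] = ENV.fetch('APPCENTER_TOKEN')   # placeholder credential

res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
release = JSON.parse(res.body)

puts "Latest version: #{release['short_version']} (build #{release['version']})"
puts "Download URL:   #{release['download_url']}"
```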



Results

Mobile

  • 110 smoke scenarios (run time 3.5h)

  • 70 prod scenarios (run time 2h)

  • 340 regression scenarios (run time 9h)

  • 12 pipelines were built across 4 different projects.

  • Approximately 6 hours of daily test execution

Web

  • 250+ smoke scenarios

  • 100+ prod scenarios

  • 1000+ regression scenarios

  • 16 pipelines were built across 3 different projects.

API

  • 100+ smoke scenarios

  • 600+ regression scenarios

  • 2 pipelines were built for a single project.

At the beginning of our project, only 2 of 14 people were able to write test automation scripts. After the project was over, almost the whole team was involved in the automation processes.

We performed the automation transformation not only on the API side but also on the client side. API services in particular are hard to understand for people who are not accustomed to the intricacies of programming languages, such as business departments. By implementing a BDD approach, we built communication bridges that help the software and business departments understand each other.

The logging processes were handled more efficiently, and the team started to take action on logged bugs more swiftly and effectively.
