by: VMware Senior DevOps Engineer Anand Kumar and VMware Lead DevOps Engineer Pervinder Sudan
Every IT organization wants fast, efficient software application development without compromising code quality. Given the complexity, criticality, and scale of its projects, every management team is willing to invest the time and money needed to reach the desired code quality. That's why quality is one of the major pillars of VMware IT DevOps. See Figure 1.
Figure 1: Pillars of VMware IT DevOps process
The VMware IT DevOps process addresses many aspects of quality across the following stages (a simplified sketch follows the list):
- Prebuild check: run the validation actions that are necessary before compiling source code
- Prepare: build metadata from the dependent user story and application requirements
- Code quality: scan to identify critical vulnerabilities, bugs, code duplicates, and deviations from industry best practices
- Build: compile and package source code
- Deploy: deploy the build artifact to the target applications
- Build validation: run automated tests to validate build sanity
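The sketch below chains these stages in sequence and stops at the first failure. It is illustrative only, assuming hypothetical placeholder scripts for each stage; it is not VMware's actual pipeline definition.

```python
# Illustrative only: run the pipeline stages listed above in order and stop at
# the first failure. All script paths are hypothetical placeholders.
import subprocess
import sys

STAGES = [
    ("prebuild-check", ["bash", "scripts/prebuild_check.sh"]),      # validations before compiling
    ("prepare", ["bash", "scripts/prepare_metadata.sh"]),           # build metadata from the user story
    ("code-quality", ["bash", "scripts/code_quality_scan.sh"]),     # vulnerability/bug/duplication scan
    ("build", ["bash", "scripts/build_and_package.sh"]),            # compile and package source code
    ("deploy", ["bash", "scripts/deploy_artifact.sh"]),             # push the artifact to the target application
    ("build-validation", ["bash", "scripts/run_sanity_tests.sh"]),  # automated build sanity tests
]

def run_pipeline() -> int:
    for name, command in STAGES:
        print(f"== stage: {name} ==", flush=True)
        result = subprocess.run(command)
        if result.returncode != 0:
            print(f"Stage '{name}' failed; aborting the pipeline.", file=sys.stderr)
            return result.returncode
    return 0

if __name__ == "__main__":
    sys.exit(run_pipeline())
```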
All project development teams need to be on board with the DevOps platform so that code quality checks run, and code quality metrics are reported back quickly, before code is deployed into continuous integration and continuous deployment (CI/CD) environments. Even though source code management (SCM), CI/CD, and automated smoke test validation fall under the scope of the DevOps process, one significant challenge was how to improve code quality without impacting the feature velocity and agility of the development teams.
Why code quality matters
The quality of any code is important because it strongly influences the overall lifespan and endurance of the software solution. Taking it one step further, quality determines whether the product or application is safe, secure, reliable, and free from bugs at runtime.
The Systems Sciences Institute at IBM has reported that "the cost to fix an error found after product release was four to five times higher than one uncovered during design, and up to 100 times higher than one identified during the maintenance phase."
Testing alone is not enough
Extensive testing, whether manual or automated, does not uncover every single error in the code.
Measuring code quality
There are various ways to measure the code quality:
Reliability—this measures how well the system runs without downtime over a specific period while in operation. The number of defects and the uptime of the software are the key factors in this category. Running a static analysis tool will provide the number of defects.
Maintainability—this measures how easily the software can be maintained and cared for. Maintainability takes the size of the codebase, its complexity, consistency, and structure into account. Developing a maintainable codebase relies on both automation and human reviewers.
Coverage—code coverage is a software testing metric that determines the number of lines of code that are successfully validated under a test procedure, which in turn helps analyze how comprehensively the software is verified.
Without capturing code coverage results, it's not easy to know how extensively the code has been unit tested. Automated unit test executions through continuous integration include the collection of code coverage metrics and statistics, which provide insight into the required quality characteristics, as sketched below.
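As one example, assuming a Python codebase, a CI step could collect coverage with the coverage.py tool and fail the build when the percentage drops below an agreed threshold. The 80 percent figure and the test directory name here are arbitrary examples, not VMware standards.

```python
# Illustrative coverage gate for a Python codebase: run the unit tests under
# coverage.py and fail the CI step if total line coverage is below a threshold.
# The 80% threshold and "tests" directory are example values only.
import sys

import coverage
import pytest

COVERAGE_THRESHOLD = 80.0  # example threshold agreed with the team

def main() -> int:
    cov = coverage.Coverage(source=["."])  # measure the project's own code
    cov.start()
    test_result = pytest.main(["tests"])   # run the unit test suite
    cov.stop()
    cov.save()

    total_percent = cov.report()           # prints the report and returns total %
    if test_result != 0:
        print("Unit tests failed.", file=sys.stderr)
        return 1
    if total_percent < COVERAGE_THRESHOLD:
        print(f"Coverage {total_percent:.1f}% is below {COVERAGE_THRESHOLD}%.", file=sys.stderr)
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```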
Duplication—as the name suggests, this is the repetition of a line or a block of code in the same file or in another. People might consider code duplication acceptable; however, it poses challenges for the software. Even code with merely similar functionality is considered duplicated.
The main cause of duplicate code is copy-and-paste programming. Duplicate code makes the program lengthy and bulky, and hurts the maintainability and modularity of the codebase.
Duplicate code can slow processing, increases the risk of failure, and adds to the technical debt associated with the application. It also leaves openings for attackers to exploit or actually get into the code, making it vulnerable, and the time spent repaying that technical debt decreases developer productivity. The sketch below gives a rough illustration of how duplicate blocks can be detected.
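As a rough illustration of how a duplication check works, the sketch below hashes whitespace-normalized sliding windows of lines and reports any window that appears in more than one place. Real analyzers use more sophisticated token- and AST-based clone detection; the window size here is an arbitrary example.

```python
# Illustrative duplicate-block detector: flags any window of N consecutive
# whitespace-normalized lines that occurs more than once across the given
# source files. Real tools use token/AST-level clone detection.
import sys
from collections import defaultdict
from pathlib import Path

WINDOW = 6  # example minimum block size (in lines) to count as a duplicate

def normalized_lines(path: Path) -> list[str]:
    lines = path.read_text(encoding="utf-8", errors="ignore").splitlines()
    return [" ".join(line.split()) for line in lines]

def find_duplicates(files: list[Path]) -> dict[tuple[str, ...], list[tuple[str, int]]]:
    seen = defaultdict(list)
    for path in files:
        lines = normalized_lines(path)
        for i in range(len(lines) - WINDOW + 1):
            block = tuple(lines[i:i + WINDOW])
            if any(block):  # skip windows that are entirely blank
                seen[block].append((str(path), i + 1))
    return {block: spots for block, spots in seen.items() if len(spots) > 1}

if __name__ == "__main__":
    sources = [Path(p) for p in sys.argv[1:]]
    for block, spots in find_duplicates(sources).items():
        locations = ", ".join(f"{f}:{line}" for f, line in spots)
        print(f"Duplicated {WINDOW}-line block at {locations}")
```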
Reusability—code reusability is another key metric for measuring code quality. Code reuse is the practice of writing code in a modular fashion so that it can be reused in other modules or functions, thereby reducing duplication of code or functions. But to be reused, the code needs to be of high quality. That means it should be safe, secure, and reliable. Systematically developing high-quality reusable software components and frameworks is even harder.
The number of interdependencies is an indicator of the code's reusability. Running a static analyzer can help identify excessive interdependencies.
Improving code quality effectively
Coding standards—coding standards keep everyone using the same style, which maintains consistency and readability. The goal is lower complexity with higher quality.
This is best achieved by providing developers with the required training and helping them comply. A static code analyzer is also beneficial in supporting this, as the sketch below shows.
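One way to back a coding standard with automation, again assuming a Python codebase, is to run a linter such as flake8 as a CI step and fail the check on any violation. The flags, values, and source directory shown are examples, not mandated settings.

```python
# Illustrative CI step, assuming a Python codebase: enforce the coding standard
# with flake8 and fail the build on any violation. Flag values and the "src"
# directory are examples only.
import subprocess
import sys

LINT_COMMAND = [
    "flake8",
    "--max-line-length=120",  # example style rule
    "--max-complexity=10",    # example cyclomatic-complexity ceiling
    "src",                    # hypothetical source directory
]

if __name__ == "__main__":
    result = subprocess.run(LINT_COMMAND)
    if result.returncode != 0:
        print("Coding-standard check failed; see flake8 output above.", file=sys.stderr)
    sys.exit(result.returncode)
```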
Analyze the code before review
From a development perspective, code quality needs to be treated as a primary concern from the moment a project is initiated. This is the core reason why analyzing the code before the review begins is critical.
The sooner errors are identified, the quicker they are resolved. This improves overall code quality.
Refactor legacy code
Refactoring can improve the quality of an existing codebase. Refactoring legacy code and cleaning up the codebase lowers its complexity and allows the application to run faster overall.
In the initial stages, adopting an agile development methodology for project development and setting up the DevOps process did not resolve quality issues completely. One of the key KPIs tracked was improving code quality and reducing code bugs in CI/CD and higher environments.
This required an approach that is easily accessible to developers, so they can run code quality checks as often as required, track and analyze the code issues reported by the platform, and work within a framework of guardrails that keeps things flowing in the right direction. We also wanted good visibility into the progress made and the benefits it brought to the development teams in terms of improved code quality, such as fewer issues in CI/CD and production environments, thereby saving precious time otherwise spent fixing issues found late in the development lifecycle or even in production.
The journey
It was a difficult task. The team had multiple discussions with developers and operations leads to understand the challenges across legacy codebases and cloud-native applications. Because developers and projects were spread across VMware IT, it was a big challenge, from the DevOps platform perspective, to provide a uniform solution for different codebases, programming languages, and application platforms while supporting individual teams and their multilanguage codebases.
The DevOps team set up an execution plan around a common, unified platform for managing and maintaining code quality issues, coverage, duplication, reliability, and maintainability for heterogeneous applications simultaneously. It took three to four quarters of working with developers to gradually arrive at the desired code stability. Within the given timeline, project developers achieved the agreed threshold benchmarks and qualified on all measures of code quality.
To meet the required quality of code, the DevOps team declared code quality thresholds, rules, and policies as the standard for IT applications. The team automated the complete process of assessing code quality using a variety of tools and integrated it into the DevOps process. Sharing the code quality metrics early in the software development lifecycle helped developers fix code quality issues as they were reported during the continuous integration phase. We have caught several leaks in application code through continuous inspection, thereby maintaining the overall health of IT applications.
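As an illustration of how such thresholds can be enforced automatically, the sketch below reads a metrics report produced earlier in the pipeline and fails the continuous integration stage when any measure misses its target. The report file name, its format, and the threshold values are all hypothetical examples, not VMware's declared standards.

```python
# Illustrative quality gate: compare metrics collected earlier in the pipeline
# against agreed thresholds and fail the CI stage if any target is missed.
# The report format, file name, and threshold values are hypothetical examples.
import json
import sys

THRESHOLDS = {
    "coverage_percent": ("min", 80.0),         # at least 80% line coverage
    "duplication_percent": ("max", 3.0),       # at most 3% duplicated lines
    "critical_vulnerabilities": ("max", 0),    # no critical vulnerabilities
    "bugs": ("max", 0),                        # no open blocker bugs
}

def check_quality_gate(report_path: str = "quality-report.json") -> int:
    with open(report_path, encoding="utf-8") as fh:
        metrics = json.load(fh)

    failures = []
    for name, (kind, limit) in THRESHOLDS.items():
        value = metrics.get(name)
        if value is None:
            failures.append(f"{name}: missing from report")
        elif kind == "min" and value < limit:
            failures.append(f"{name}: {value} is below the minimum of {limit}")
        elif kind == "max" and value > limit:
            failures.append(f"{name}: {value} exceeds the maximum of {limit}")

    for failure in failures:
        print(f"Quality gate failed: {failure}", file=sys.stderr)
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(check_quality_gate())
```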
By defining an automated DevOps process with strict code quality checks enabled early in the development cycle, VMware IT improved code quality for IT application services by 80 percent.
In essence, the DevOps team has designed processes and conventions around moving defect detection as early as possible in the CI/CD workflow. This way, the same compounding effects that inflate the negative impact of late defect detection now work in favor of increasing software quality and resilience.
VMware on VMware blogs are written by IT subject matter experts sharing stories about our digital transformation using VMware products and services in a global production environment. Contact your sales rep or [email protected] to schedule a briefing on this topic. Visit the VMware on VMware microsite and follow us on Twitter.