List of Most popular DevOps Tools:
Our last tutorial focused on Continuous Delivery in DevOps; now let's look at DevOps tools.
In our Software Testing forum, we have seen several excellent tutorials on areas like Project Management, ALM, Defect Tracking, and Testing, along with the individual tools that are best in class in a particular segment or area of the SDLC.
Further Reading => DevOps Tutorial list
And I have written some tutorials on IBM and Microsoft ALM tools. But now my focus is on the general trend of today’s automation market.
Overview of DevOps
DevOps plays a vital role in providing automation in the area of Build, Testing and Release to project teams which are normally termed today as Continuous Integration, Continuous Testing, and Continuous Delivery.
Hence, teams today are looking for faster delivery, quick feedback from customers, quality software, shorter recovery time from any crashes, and fewer defects through more and more automation. To achieve this, one needs to ensure that all the tools used integrate well, so that the Development and Operations teams can collaborate and communicate better.
In this tutorial, I will provide some guidelines on the DevOps tools and scenarios that, in my view, you could use for Java/J2EE projects for On-Premise and Cloud deployments and, most importantly, how they can integrate and operate efficiently.
Needless to say, there are many other tools; here I will be discussing the free-to-use ones.
What You Will Learn:
- Best DevOps Tools in 2019
- DevOps Tools for Cloud Build and Deployment
Best DevOps Tools in 2019
I have always believed that process also plays a very important role in achieving the goals mentioned in the previous section. So it is not only tools that enable DevOps; a process like Agile also contributes significantly to faster delivery.
Let's start by looking at the DevOps tools which can be used for On-Premise Build and Deployment.
1. Atlassian JIRA
JIRA is commercial software, and licenses need to be procured for installing it On-Premise based on the number of users. The tool can be used for Agile Project Management, Defect and Issue Tracking.
As mentioned before, a process is a pre-requisite for DevOps implementation, so Project Managers can use JIRA to create the Product Backlog and Sprint Backlogs and maintain end-to-end traceability, starting from Epic and User Story down to test artifacts like Test Cases.
Click here to refer to the series of tutorials on how to effectively use JIRA for Project Management, Tasks, Issues, Reporting etc.
Thus, typically in defining any DevOps pipeline, planning should be the first component with the product backlog and sprint backlog defined in JIRA.
I have used JIRA as a planning/defect-tracking tool as it integrates well across the DevOps pipeline, starting with Jenkins, which is used as a Continuous Integration tool and is a very important component of DevOps.
Visit the Atlassian JIRA website to know more about JIRA.
Click here to get the different pricing options for on-premise and cloud hosting of JIRA.
2. Eclipse – IDE for Java/J2EE Development
Developers typically write code in an IDE like Eclipse and commit it to a version control repository such as Git/GitHub, which supports team development. Every developer downloads the code from the version control repository, makes changes, and commits the code back to Git/GitHub.
Any Git commits will be integrated with JIRA tasks/defects so you can see which files have changed, who changed them, and the commit comments. This ensures traceability of every code change a developer makes against the tasks or defects assigned to them.
You can download specific Eclipse versions here for your projects.
3. Git – Version Control Tool
One of the fundamental building blocks of any CI setup is a strong version control system. Even though there are different version control tools in the market today, like SVN, ClearCase, RTC, and TFS, Git fits in very well as a popular, distributed version control system for teams located in different geographical locations.
It is a free and open-source tool and supports most version control features: check-in, commit, branching, merging, labels, push and pull to/from GitHub, etc.
It is pretty easy to learn and maintain for teams initially looking at a tool to version control their artifacts. There are many websites which show how to learn and master Git. You can click here for such a website to read and gain knowledge.
For a distributed setup where your source code and other files are shared with your teams, you will need an account with an online hosting service such as GitHub.
Though I have suggested Git it is up to the teams and organizations to look at different version control tools which fit in very well in their setup or based on customer recommendation in a DevOps pipeline.
Git can be downloaded for Windows, MacOS, and Linux from the git-scm website.
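The everyday Git workflow described above can be sketched with a handful of commands. The demo below is fully self-contained in a throwaway directory; the user identity and the JIRA-style issue key in the commit message are made-up placeholders, not from any real project.

```shell
# Minimal Git workflow sketch: initialize a repo, stage a file, commit it.
# The identity and the issue key "DEMO-101" are illustrative placeholders.
set -e
repo=$(mktemp -d)                        # throwaway local repository for the demo
cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Demo Developer"
echo "hello devops" > README.md
git add README.md                        # stage the change
git commit -q -m "DEMO-101: add README"  # commit referencing a JIRA-style task
git log --oneline                        # prints the single commit just created
```

Referencing the issue key in the commit message is what allows JIRA to link commits back to tasks, as discussed in the JIRA section above.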
4. Kiuwan
Kiuwan adds security into DevOps, understanding how SAST and SCA testing should be implemented. With a unique distributable engine, pipelines are never at risk of creating bottlenecks, and time to market (TTM) is improved while ensuring the most stringent security checks are in place.
With a DevSecOps approach, Kiuwan achieves outstanding benchmark scores (OWASP, NIST, CWE, etc.) and offers a wealth of features that go beyond static analysis, catering to every stakeholder in the SDLC.
Complete integration with your favorite tools (JIRA, Jenkins, CircleCI, Bamboo, Slack, Visual Studio, IntelliJ, etc.)
Visit Official Website Link: Kiuwan Code Security
5. Jenkins
Having seen certain pre-requisites for any DevOps implementation, we will now focus on the main tool for continuous integration, which performs the build (ANT or Maven), code analysis, and storage of the binary artifacts (e.g. WAR/JAR/EAR files) in a repository manager tool like JFrog Artifactory.
Jenkins is a free, open-source Continuous Integration tool or server which helps to automate the aforementioned activities of build, code analysis, and storing the artifacts. These activities are triggered once a developer or the team commits code to the version control repository.
A schedule for initiating the build is defined in Jenkins. For example, it could be once every 2 to 3 days, or every Friday at 10 PM. This again depends on the completion of tasks assigned to individual developers and the due date for completion. All of this is planned in a Sprint Backlog in JIRA, as discussed initially.
Jenkins today has many plugins and works as a CI tool for various technologies like C/C++, Java/J2EE, .NET, AngularJS, etc.
It also provides plugins to integrate with SonarQube for code review, JFrog Artifactory for storing binary artifacts, testing tools like Selenium etc. as a part of the automation process and reduces manual intervention as far as possible.
Typically, running all builds on one single build machine is not a good option. Jenkins provides a Master-Slave feature where builds are distributed, run in different environments, and take load off the master server.
One of the notable features of Jenkins is to visualize the entire build/delivery process steps as a pipeline.
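One common way to express such a pipeline is a declarative Jenkinsfile. The sketch below is illustrative only; the stage names and the Maven goals are assumptions on my part, not from the original setup:

```groovy
// Illustrative declarative Jenkinsfile; stage names and commands are assumptions.
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps { checkout scm }            // pull the latest code from Git
        }
        stage('Build') {
            steps { sh 'mvn clean install' }  // compile, test, and package the WAR
        }
        stage('Code Analysis') {
            steps { sh 'mvn sonar:sonar' }    // publish results to SonarQube
        }
        stage('Archive') {
            steps { archiveArtifacts artifacts: 'target/*.war' } // keep the binary
        }
    }
}
```

Each `stage` appears as one step in the Jenkins pipeline visualization, which is what gives the end-to-end view of the build/delivery process mentioned above.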
Jenkins can also help to automate the deployments to app servers like Tomcat, JBoss, Weblogic through plugins and also to container platforms like Docker.
Personally, I prefer the deployments being done with a proper deployment tool like IBM Urbancode or CA RA with its integration to Jenkins. We will see this later in the tutorial.
From my experience, I have seen many organizations using Jenkins as a separate central server to automate the entire build process and facilitate the deployments as well.
Click here to download and install Jenkins.
Example – Jenkins configuration with Git
Example – Jenkins Delivery pipeline
Example – Jenkins Master Slave
In the next section on DevOps tools for Cloud especially AWS, we will see how Jenkins integrates with AWS developer tools like CodePipeline, CodeCommit, CodeBuild, and CodeDeploy
6. SonarQube
SonarQube is an open-source tool which is mainly used for analyzing code quality.
SonarQube supports analyzing the code of many popular programming languages. The code quality analysis is invoked during the Jenkins build process, and the results of any violations can be seen in the SonarQube dashboard. This also helps to review the code quickly.
Every time a developer makes a change to the source code, the Jenkins build process triggers SonarQube, and the updated results are instantly available as part of the automation.
Organizations need quality, reliable code, thereby reducing bugs which could be costly later in the lifecycle. This is where SonarQube plays a very important role in code quality.
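For a standalone scan, the SonarQube scanner reads a small properties file. The keys below are the standard ones; the project key, name, and paths are placeholder values for illustration:

```properties
# sonar-project.properties – placeholder project key/name/paths, standard keys
sonar.projectKey=my:sample-app
sonar.projectName=Sample App
sonar.projectVersion=1.0
sonar.sources=src/main/java
sonar.java.binaries=target/classes
sonar.host.url=http://localhost:9000
```

In a Jenkins-driven setup, the same parameters are usually supplied through the SonarQube plugin or the Maven `sonar:sonar` goal instead of a standalone file.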
Click here to download and install SonarQube.
7. JFrog Artifactory
We have seen that source code is stored in a version control tool like Git. The binary artifacts produced by any build process are normally stored in a repository manager like JFrog Artifactory.
Once the build is successfully completed, the WAR/JAR/EAR files are copied and stored in Artifactory as part of the post-build action defined in Jenkins. Different versions of the artifacts are also maintained in Artifactory. This is very useful if there is ever a need to manually roll back to a previous version.
These artifacts are then picked up for deployment to different app servers like Tomcat, JBoss, Weblogic, etc. using tools like IBM Urbancode Deploy or CA RA. Nexus is another popular repository manager tool that can be used. Click here to read more on Nexus.
Different pricing options are available here for JFrog Artifactory (Pro, Pro Plus, Pro X, and Enterprise) which may be suitable for your organization.
Example – Jenkins – Artifactory Integration
Click here to download a free trial copy of JFrog Artifactory.
8. IBM Urbancode Deploy
IBM Urbancode Deploy is a commercial tool from IBM which helps to automate the deployment of artifacts for different environments from development until production.
Post the build from Jenkins you can install the IBM Urbancode Deploy plugin and automatically trigger the deployment of the binary artifacts to different environments.
As mentioned before, I personally prefer a proper deployment tool wherein all aspects of deployment are taken care of.
IBM Urbancode Deploy provides the following features to help ease deployments:
- Automated deployment and rollback
- Changes propagated to all environments including databases.
- Appropriate configuration for different environments.
- Approval process
- A visual depiction of the complete deployment process.
- Complete inventory maintained, so we know what is deployed and who deployed it.
- Integration with multiple J2EE app servers like Tomcat, JBoss, Oracle Weblogic, IBM Websphere Application Server etc. and IIS web server for .NET deployments.
- Integration with container platform like Docker.
Example – Integration of Jenkins with IBM Urbancode Deploy
Example – Graphical view of application deployment to Tomcat
Click here to get a Free trial copy for evaluation purpose of IBM Urbancode Deploy.
You will need to register for an IBM id to download.
9. CA-Release Automation (RA)
CA Release Automation is another similar commercial tool from Computer Associates which provides the above-mentioned features for automated application deployment.
So once the application is built successfully and a WAR file is generated, it is picked up as part of the Jenkins post-build action and deployed to the target environment/app server as per the flow defined in CA RA.
The typical steps followed for any deployment, either in IBM UCD or CA RA, post the build from Jenkins are as follows, and can change as per the needs:
- Download the application WAR/JAR/EAR file to the target environment.
- Stop the current application running.
- Un-install the application.
- Install the new version of the application by downloading from Artifactory or Nexus.
- Start the application.
- Check the application status.
- In case the application did not deploy successfully, perhaps due to environment incompatibility, a rollback action can be performed as well.
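The steps above can be sketched as a shell script. Everything here is a stand-in: the artifact URL is made up, and each step is stubbed with an `echo` where a real flow would call the app server's own control scripts and download from Artifactory or Nexus.

```shell
# Generic deployment-flow sketch; every step is a stub (echo) standing in
# for real commands such as curl/wget downloads and app-server scripts.
set -e
ARTIFACT_URL="http://artifactory.example.com/libs-release/app-1.0.war"  # hypothetical

download_artifact() { echo "downloaded $1"; }      # stand-in for curl/wget
stop_app()          { echo "application stopped"; }
uninstall_app()     { echo "old version removed"; }
install_app()       { echo "new version installed"; }
start_app()         { echo "application started"; }
check_status()      { echo "status: UP"; }         # stand-in for a health check

download_artifact "$ARTIFACT_URL"
stop_app
uninstall_app
install_app
start_app
check_status | grep -q "UP" || { echo "deployment failed, rolling back"; exit 1; }
```

Tools like IBM UCD and CA RA essentially model this same sequence graphically, adding approvals, inventory, and rollback on top.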
Example: Post-build Action Integration of Jenkins with CA RA
One will need to contact the local IBM or CA team for pricing on IBM Urbancode Deploy or CA RA.
10. Docker
For quite some time we have all been talking about virtualization, where one physical server hosts multiple virtual machines. These VMs can run either Windows or Linux operating systems and contain all the libraries, binaries, and applications which run on them.
Typically, every VM is very large in size. From a DevOps point of view, you can use each VM for a particular environment like Dev, QA, or PROD. With VMs, the entire hardware is virtualized.
Docker, on the other hand, uses the concept of containers, which virtualize the operating system. A container can be used to package the application (e.g. a WAR file) along with its dependencies, to be deployed in different environments.
Docker uses the workflow of BUILD-SHIP-RUN, which means you build images (based on a Dockerfile), publish the images (to DockerHub), and run an image, by which a container is created in any environment.
DockerHub is a registry of images built by communities; you can store or upload the images which you build as well.
You can click here to log in and upload images.
Image repository with Tags
Sample Dockerfile to automate deploy of WAR file to Tomcat
```dockerfile
FROM tomcat
COPY sample.war /usr/local/tomcat/webapps/
CMD ["catalina.sh", "run"]
```
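To exercise the BUILD-SHIP-RUN workflow with a Dockerfile like the one above, the usual commands look like this. The image name and DockerHub account are placeholders, so treat this as a sketch rather than a copy-paste recipe:

```shell
# BUILD: create an image from the Dockerfile in the current directory
docker build -t mydockerid/sample-tomcat:1.0 .

# SHIP: publish the image to DockerHub (requires "docker login" first)
docker push mydockerid/sample-tomcat:1.0

# RUN: start a container, mapping Tomcat's port 8080 to the host
docker run -d -p 8080:8080 --name sample-app mydockerid/sample-tomcat:1.0
```

After the `run` step, the sample.war application would be reachable on port 8080 of the Docker host.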
Jenkins integration with Docker
11. Selenium
Selenium is a free, open-source automated functional testing tool for web applications. Selenium IDE is normally installed as a Firefox browser plugin and helps to record and play back test scenarios. In a DevOps cycle, once the application is deployed to the Test or QA environment, the automated Selenium tests are invoked.
Click here to read more on Selenium features and download for installation.
Selenium as a tool is very easy to learn. I would suggest reading the Selenium tutorial series to understand the installation process for the Firefox browser, gain knowledge, and master automated testing for web applications.
Putting it all Together
Let's now look at the larger picture of how all the tools discussed above integrate to give us the desired DevOps pipeline that teams are looking for from an end-to-end automation point of view.
Illustrative Devops Pipeline
DevOps Tools for Cloud Build and Deployment
When I started my software career 20+ years back, the infrastructure (software and hardware) for any kind of development and deployment had to be procured.
This included placing an order with the vendor for servers and waiting a certain amount of time to receive them; once delivered, server space had to be reserved, followed by installation of the server, operating system, storage configuration, etc. We also had to worry about performance, availability (24*7), maintenance, networking, and so on.
This was too much effort just to bring up a server for development and deployment of applications.
Things changed with the evolution of Cloud Computing, which means you access all your applications and databases over the internet. Cloud Computing providers maintain all of the hardware needed to run your web application.
All the resources with an appropriate configuration that you need to host your application are available within a click. Time is drastically reduced for making available the resources for developers. Most importantly you pay only for what you need to use.
The focus for developers using cloud-based services is only on what they need for their projects, without worrying about infrastructure availability. I am not getting into the types of cloud computing (IaaS, PaaS, SaaS); there is plenty of information available on the internet describing them.
There are many cloud providers. But the 3 most popular ones that I have worked with are:
- Amazon Web Services
- Microsoft Azure
- Google Cloud
In this section, I will focus on tools for a pipeline, source code repository, build, and deployment with Amazon Web Services. Not to forget that teams still use DevOps tools like Jenkins, Git, Maven, and others. So it is imperative that, while teams may want to move their assets and artifacts to cloud infrastructure, we also maximize their existing investments in tools and data through integrations/migrations as far as possible.
Click here to learn about AWS and the various services for Architects, Developers, and SysOps. We will use the free account for the tools mentioned, but of course, in a production environment, you will need to procure the services for use.
From a Build and Deployment point of view, we will look at the following AWS services:
- AWS CodePipeline
- AWS CodeCommit
- AWS CodeBuild
- AWS CodeDeploy
1. AWS CodePipeline
AWS CodePipeline is similar to the Jenkins Pipeline which helps to have a visual view of the end to end delivery process.
So in a CodePipeline, you will typically configure the following:
- Source Code Repository – So your source code would need to be either in AWS CodeCommit or GitHub repository.
- Build Service – AWS CodeBuild details will be configured as part of the pipeline.
- Deploy – AWS CodeDeploy will be configured into the pipeline.
- Approvals – if any approvals are needed during the deploy process to different environments, they can be configured as well.
So when there is a code change by a developer, the Build and Deploy stages can be seen running automatically in the visual representation.
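Under the hood, a pipeline is a JSON structure of stages and actions (this is the shape returned by `aws codepipeline get-pipeline`). The skeleton below is heavily abbreviated, and all names and the ARN are placeholders:

```json
{
  "pipeline": {
    "name": "sample-pipeline",
    "roleArn": "arn:aws:iam::123456789012:role/CodePipelineServiceRole",
    "stages": [
      { "name": "Source",
        "actions": [ { "name": "Source",
          "actionTypeId": { "category": "Source", "owner": "AWS",
                            "provider": "CodeCommit", "version": "1" },
          "configuration": { "RepositoryName": "sample-repo",
                             "BranchName": "master" } } ] },
      { "name": "Build",
        "actions": [ { "name": "Build",
          "actionTypeId": { "category": "Build", "owner": "AWS",
                            "provider": "CodeBuild", "version": "1" },
          "configuration": { "ProjectName": "sample-build" } } ] },
      { "name": "Deploy",
        "actions": [ { "name": "Deploy",
          "actionTypeId": { "category": "Deploy", "owner": "AWS",
                            "provider": "CodeDeploy", "version": "1" },
          "configuration": { "ApplicationName": "sample-app",
                             "DeploymentGroupName": "sample-group" } } ] }
    ]
  }
}
```

A real pipeline definition also carries input/output artifact wiring and an artifact store (S3 bucket), which are omitted here for brevity.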
Source code repository configuration in AWS CodePipeline
Build configuration in AWS CodePipeline which uses Maven build
Deployment configuration in AWS CodePipeline
Complete Execution is seen in AWS CodePipeline
2. AWS CodeCommit
AWS CodeCommit is a secure, online version control service which hosts private Git repositories. A team need not maintain its own version control repository; instead, it can use AWS CodeCommit to store source code or even binaries like the WAR/JAR/EAR files generated by the build.
With AWS CodeCommit, you create a repository, and every developer clones it to their local machine, adds files to it, and pushes them back to the AWS CodeCommit repository. One uses the standard Git commands with an AWS CodeCommit repository.
For example, once the AWS CodeCommit repository is cloned to the local machine, you would use commands like 'git pull', 'git add', 'git commit', 'git push', etc.
Illustrative AWS CodeCommit empty repository created
Clone the repository to the local machine
Files added to AWS CodeCommit repository
3. AWS CodeBuild
As we have seen the source code and other project artifacts are stored in AWS CodeCommit repository.
To implement Continuous Integration, AWS CodeBuild, like Jenkins, fetches the latest source code changes from the AWS CodeCommit or GitHub repository as configured. Based on the build specification YAML file (created as buildspec.yml), commands are run in four phases: Install, Pre-build, Build, and Post-build.
Once the build is completed, the artifacts (WAR/ZIP/JAR/EAR) are stored in AWS storage, i.e. an S3 bucket.
Sample buildspec.yml file
```yaml
version: 0.2
phases:
  install:
    commands:
      - echo Nothing in the install phase...
  pre_build:
    commands:
      - echo Nothing in the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean install
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - target/HelloWorld-Maven.war
```
Sample AWS Codebuild project
Artifact (WAR file) copied to S3 bucket
4. AWS CodeDeploy
As the name suggests, AWS CodeDeploy is the deployment service which automates the deployment of the application (in this case a WAR file) to Amazon EC2 Linux or Windows instances.
Since the artifacts are now stored in the S3 bucket by AWS CodeBuild, they are picked up from the S3 bucket and deployed appropriately to the app server (Tomcat, JBoss, etc.) provisioned on the AWS EC2 instance.
AWS CodeDeploy depends on a YAML file called appspec.yml which has instructions on the deployment to the EC2 instances.
Sample appspec.yml file, where the index.html file is copied and deployed to the Apache server:
```yaml
version: 0.0
os: linux
files:
  - source: /opt/deploy/index.html
    destination: /var/www/html/
hooks:
  BeforeInstall:
    - location: scripts/before_install
      runas: niranjan
  AfterInstall:
    - location: scripts/restart_server
      runas: niranjan
```
GitHub repo of all files needed to run AWS CodeDeploy
Deployment execution in AWS CodeDeploy
Jenkins Integration with AWS Services
As mentioned earlier, teams nowadays use Jenkins as the de facto CI tool, and in most cases they would not really like to move away from it, but rather integrate it with the AWS services which we discussed. There are certain procedures involved, and I have shown screenshots of the integration below.
1. Jenkins integration with AWS CodeCommit
2. Jenkins integration with AWS CodeBuild
3. Jenkins integration with AWS CodeDeploy
Putting it All Together for AWS DevOps Stack
The stack for the AWS services discussed above looks as below.
The purpose of this tutorial was to introduce the main DevOps tools and services used for On-Premise and Cloud deployment, especially with Amazon Web Services.
It was to show DevOps enthusiasts the popular tools that are available and how they integrate into one single view of automation with minimal manual intervention.
I also wanted to mention a few other DevOps tools which are equally popular: BitBucket (a web-based version control repository similar to GitHub, owned by Atlassian), Bamboo (a Continuous Integration and Continuous Deployment tool similar to Jenkins, developed by Atlassian), and Chef/Puppet/Ansible (for managing infrastructure and application deployment).
Our upcoming tutorial will explain all about the installation and configuration of commonly used open-source DevOps tools.