Mainframe agility is paramount for any mainframe-centric enterprise that wants to achieve true business agility and get to market faster. But how can it be done? Let’s take a look at some of the key DevOps mainframe tooling elements.
The challenges of mainframe agility
Global banks and large corporations are heavily mainframe-based. They execute billions of transactions and see their mainframe workloads increasing as their businesses grow. It is evident that mainframes are still a critical piece of any enterprise system – and they are not going away.
DevOps-enabled mainframes will not only help IT organisations realise an economic advantage but also let them benefit from the superior computing power of the mainframe, which is scalable, reliable, secure and fast.
There are many challenges in achieving DevOps mainframe agility. There are limitations around release cycles, automation, automated testing capabilities, and build and deployment – not to mention the limits around integration with cross-platform elements. However, with the right tooling combined with a cultural shift, these challenges can be overcome. So – what are the tools available?
Setting up a CI/CD pipeline for mainframe artifacts
There are quite a number of tools from multiple vendors available in the market. Featured below is a vendor-neutral selection of toolsets needed to bring DevOps to mainframes.
- Modernised development environments: Mainframe development is mostly done in TSO/ISPF editors. A modernised development environment that allows developers at all levels to work on both mainframe and non-mainframe applications is the preferred starting point for the DevOps toolchain.
- Agile-enabled SCM tools: Most mainframe version control tools are designed for the traditional waterfall methodology and don’t support the parallel development that Agile requires. This means that shifting to an Agile-enabled SCM tool is essential for mainframe transformation. A single SCM tool for both mainframe and distributed code would be the ideal choice, as maintaining multiple SCM tools is complex. For this, the system should be independent of any of the target platforms in the environment, which makes it necessary to manage version control off the mainframe in order to uphold both the feedback cycle times and the continuous integration that contemporary developers expect. One of the key challenges of keeping mainframe code off the platform is that mainframe developers don’t feel comfortable with it: they are used to having everything centralised on the platform. But there is no harm in keeping the source code outside the mainframe, as the executables will still run only there. The version control tool should hold not only the program source code but also all application- and system-related configuration.
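Moving source off the platform usually means converting members between the mainframe’s EBCDIC encoding and the text encoding the SCM tool expects. As a minimal sketch – assuming Python’s built-in cp037 codec matches your system’s code page, which varies by region (cp1047 and cp500 are also common) – the round trip looks like this:

```python
# Sketch: round-trip a COBOL source member between EBCDIC (code page 037)
# and ordinary text when syncing it into an off-mainframe repository.
# The code page is an assumption; check what your system actually uses.

def ebcdic_to_text(raw: bytes, codepage: str = "cp037") -> str:
    """Decode an EBCDIC source member into text for off-platform storage."""
    return raw.decode(codepage)

def text_to_ebcdic(text: str, codepage: str = "cp037") -> bytes:
    """Re-encode text for upload back to the mainframe."""
    return text.encode(codepage)

member = "       IDENTIFICATION DIVISION.\n       PROGRAM-ID. HELLO.\n"
raw = text_to_ebcdic(member)
assert ebcdic_to_text(raw) == member  # lossless round trip for this member
```

A real synchronisation tool would also handle record formats (fixed-length 80-byte records rather than newline-delimited lines), but the encoding boundary is the part that most often surprises teams new to off-platform SCM.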
- Automated testing tools: Today there are no mainframe-based tools that support COBOL unit test cases at a fine level of granularity. There are automated testing products that support testing COBOL code off-platform, using a suitable COBOL compiler in a shared development environment; however, none of these products are very mature, and this remains an area for further tool development. Beyond unit testing, other levels of testing are typically required as well, including functional testing, interface testing, and integration testing. To achieve true continuous delivery for mainframe artifacts, you will need tooling capabilities for all of these testing areas. To reduce your dependence on test region availability – which is at a premium in mainframe systems – test virtualisation can be a great help. It will also reduce MIPS consumption and cost.
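One pattern that works today, even without mature COBOL unit-test tooling, is baseline (“golden file”) regression testing of batch output: run the job, mask volatile fields such as run dates, and compare against a stored baseline. A minimal sketch, in which `run_batch_job` is an illustrative stand-in for invoking the real compiled program:

```python
# Sketch: a baseline ("golden file") regression test for batch output.
# run_batch_job() is a placeholder; in practice you would execute the
# compiled program off-platform and capture its output records.
import re

def run_batch_job(input_records):
    # Stand-in for the real batch step: totals the amount field.
    out = ["RUN DATE 2024-01-15"]  # volatile header line
    total = sum(int(r.split(",")[1]) for r in input_records)
    out.append(f"TOTAL {total:09d}")
    return out

def normalize(lines):
    """Mask volatile fields (dates here) before comparison."""
    return [re.sub(r"\d{4}-\d{2}-\d{2}", "YYYY-MM-DD", ln) for ln in lines]

baseline = ["RUN DATE YYYY-MM-DD", "TOTAL 000000300"]
actual = normalize(run_batch_job(["CUST,100", "CUST,200"]))
assert actual == baseline
```

This is coarser than true unit testing, but it is automatable today and catches unintended behavioural change between releases.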
- Environment provisioning: Typical mainframe applications require access to files, databases, subroutine libraries, CICS regions and external APIs. This access is usually outside of the developer’s control. In many cases, a developer doesn’t even have the necessary privileges for many of the key tasks in the development region; they must raise a ticket and wait for an administrator to grant the required access, which can take days or even weeks. There are no CD tools that provide robust support here, but service virtualisation products can be of some help.
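The essence of service virtualisation is replacing an unreachable dependency with a stand-in that returns canned responses. As a deliberately tiny, hedged sketch – the routes and payloads are illustrative, not any product’s API – an in-process stub for an external account service might look like:

```python
# Sketch: service virtualisation in miniature - an in-process stub that
# answers for an external service the developer cannot reach from the
# development region. All names and payloads here are illustrative.

CANNED_RESPONSES = {
    ("GET", "/accounts/12345"): {"status": 200, "body": {"balance": "150.00"}},
    ("GET", "/accounts/99999"): {"status": 404, "body": {"error": "not found"}},
}

def virtual_service(method, path):
    """Stand-in for the real endpoint; unknown routes get a 501."""
    return CANNED_RESPONSES.get((method, path), {"status": 501, "body": {}})

resp = virtual_service("GET", "/accounts/12345")
assert resp["status"] == 200
```

Commercial service virtualisation tools add recording, protocol support (MQ, CICS, SOAP) and stateful behaviour on top of this basic idea, which is what makes them useful when the real regions are unavailable.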
- Test data management: This is one of the biggest challenges in implementing a CD pipeline for a DevOps mainframe application. Mainframe applications access multiple databases, and the development team may not have access to all of the data stores that need to be populated as part of automated tests. There may not be suitable APIs to access externally hosted services. This means that test data creation is still a manual task in most organisations. Missing or limited tooling support remains a roadblock to a fully automated CD pipeline.
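Where production-like data can be extracted, deterministic masking lets automated tests use realistic records without exposing real customer data, while preserving referential integrity across files. A minimal sketch – the field layout and masking rules are assumptions for illustration:

```python
# Sketch: deterministic masking of production-like records so automated
# tests can run against realistic but de-identified data. The record
# layout and masking rules are illustrative assumptions.
import hashlib

def mask_account(account_no: str) -> str:
    """Replace an account number with a stable pseudonym of equal length."""
    digest = hashlib.sha256(account_no.encode()).hexdigest()
    digits = "".join(c for c in digest if c.isdigit())
    return (digits * 2)[: len(account_no)]

record = {"account": "4111222233334444", "name": "J SMITH", "balance": "150.00"}
masked = {**record, "account": mask_account(record["account"]), "name": "TEST USER"}

assert masked["account"] != record["account"]
assert len(masked["account"]) == len(record["account"])
# Deterministic: the same input always maps to the same pseudonym, so
# the masked account matches across every file it appears in.
assert mask_account(record["account"]) == masked["account"]
```

Determinism is the important property: the same account number masks to the same pseudonym in every table and flat file, so joins and batch matching logic still work in the test region.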
- Automated deployment: Few tools available in the market today can automatically deploy all development artifacts across every target environment.
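Whatever tool fills this gap, the core logic is the same: promote a versioned artifact through an ordered set of environments without skipping a stage. A hedged sketch of that promotion rule – environment names and the deploy step itself are placeholders:

```python
# Sketch: promoting a versioned artifact through an ordered pipeline,
# refusing to skip a stage. Environment names and the deploy action
# are illustrative placeholders for what a real tool would do.

PIPELINE = ["dev", "qa", "preprod", "prod"]

def promote(artifact, deployed_in, target):
    """Allow deployment only to the next stage in the pipeline."""
    next_idx = 0 if deployed_in is None else PIPELINE.index(deployed_in) + 1
    if PIPELINE.index(target) != next_idx:
        raise ValueError(f"cannot skip to {target}; next stage is {PIPELINE[next_idx]}")
    # A real tool would copy load modules, bind DB2 plans, define CICS
    # resources, and record the deployment for audit here.
    return target

stage = promote("PAYROLL v1.4", None, "dev")
stage = promote("PAYROLL v1.4", stage, "qa")
```

The hard part on the mainframe is not this ordering logic but the deploy action itself – load libraries, plan binds and region definitions differ per environment, which is why so few tools cover every artifact type end to end.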
In conclusion, the tools available today can only take us part of the way to a continuous delivery pipeline. Achieving true continuous delivery for mainframe artifacts will require a whole host of additional tooling capabilities.
We’re happy to help guide you through every step of your DevOps mainframe journey. Join our DevOps Mainframe LinkedIn group and take part in the conversation!