Pipelines as code have evolved over the past decade from the early days of Groovy scripts to various forms of declarative constructs, with many open source and commercial offerings. In general, declarative pipelines use structured syntax with predefined keywords to offer better readability and to abstract away complexity. Still, a declarative pipeline is often far from truly declarative in spirit when it includes domain-specific semantics.
With the increasing adoption of declarative pipelines over conventional CI jobs, standardization and pipeline code quality are of utmost importance in large-scale cloud native projects that involve multiple pipelines across one or more version control repositories. Inheriting pipelines from one another improves productivity by sharing and reusing pipelines across projects.
Similarly, the workflows in various OSS or commercial offerings require conditional constructs or domain-specific language semantics to determine the flow of complex pipelines. This approach contradicts declarative pipeline conventions and is unsuitable for dynamically generating pipelines from declarative workflows. There is an alternative implementation, however: reusable and extensible templates that address these limitations.
What about declarative pipeline technology as part of your overall enterprise technology stack? In large organizations dealing with multiple projects, or in open source development environments, there is a need to share these templates within the organization or globally, across private or public Git-based version control repositories such as GitHub, Bitbucket, or Gerrit. This enables a truly distributed pipeline development ecosystem, much like Go modules in the Go programming language.
This blog discusses how effectively workflows can be automated with templates on Accordion's cloud native CI/CD platform. Accordion pipelines with dynamic workflows are truly declarative and comparable to Kubernetes YAML manifests: free from domain-specific semantics. The result is powerful dynamic pipelines driven by a set of manifest files, with the process embedded entirely in YAML.
In CI/CD (Continuous Integration/Continuous Deployment) pipelines, templates refer to predefined, reusable sets of instructions or configurations that help streamline setting up and managing pipelines for different projects or components within a software development environment. Templates are designed to promote consistency, reduce duplication of effort, and improve the overall efficiency of building, testing, and deploying software.
The Accordion pipeline template's scope is granular and pervasive, not specific to a stage or the full pipeline. Inheritance works at a block level or even for a single step within a stage. One can create a template for an agent definition, environment, pipeline options, stages, or steps, or for all of them in the most abstract form. These templates can also be nested to abstract out the least relevant details. Here is a sample template, together with a pipeline that imports it:
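The exact Accordion field names are not reproduced here; the sketch below assumes hypothetical keys (imports, revision, credentials) and stage commands purely for illustration.

```yaml
# template.yaml -- the primary template (field names and values assumed for illustration)
name: master_template
environment:
  BUILD_IMAGE: golang:1.21        # hypothetical value
options:
  timeout: 30m                    # hypothetical value
stages:
  - name: lint
    steps:
      - run: make lint
  - name: build
    steps:
      - run: make build
  - name: test
    steps:
      - run: make test
```

```yaml
# pipeline.yaml -- abstracted pipeline that imports and inherits from the template
imports:
  - as: template                  # the imported template is tagged as "template"
    url: https://aci-github.cisco.com/accordion/pipeline_templates/templates/master_template.yaml
    revision: 8f1463c             # pinned Git revision for pipeline reproducibility
    credentials: github-token     # optional; from the CI credential store or a Kubernetes secret
pipeline:
  stages: ${template.stages}      # inherit only the "stages" block from template.yaml
```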
In the illustrations above, template.yaml is the primary template: it contains the template name, "master_template," and the pipeline blocks environment, options, and stages. pipeline.yaml is a purely abstracted pipeline that imports the template from a private GitHub repository via the URL "https://aci-github.cisco.com/accordion/pipeline_templates/templates/master_template.yaml" and its Git revision "8f1463c" (pinned for pipeline reproducibility). Credentials are optional, depending on whether the repository is public or private, and can be pulled from the CI system's native credential store or from Kubernetes secrets. The imported template is tagged as "template" in pipeline.yaml. Once a template is imported, one can inherit either the entire template or a particular block from template.yaml. In this example, pipeline.yaml inherits only the "stages" block.
If we were to inherit the entire pipeline from the template, the pipeline.yaml would just be:
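Again assuming the same hypothetical field names, the fully inherited pipeline reduces to little more than the import itself:

```yaml
# pipeline.yaml -- inheriting the entire template (field names assumed)
imports:
  - as: template
    url: https://aci-github.cisco.com/accordion/pipeline_templates/templates/master_template.yaml
    revision: 8f1463c
    credentials: github-token
pipeline: ${template}             # inherit environment, options, and stages wholesale
```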
The above illustrations demonstrate how pipelines are generated dynamically from templates. When template inheritance is combined with policy-based workflows, the software enables easily self-serviced pipelines with dynamic workflows. Most of the work is a one-time effort, declaring the templates and workflows, which are then reused across varied pipeline requirements.
Workflows define the flow of a declarative pipeline based on constructs such as parameters, environment, or even webhook trigger information. Usually, these workflows are developed in the underlying CI system's DSL and custom syntaxes, so there is a learning curve for pipeline developers to understand these semantics. The solution shared here provides a way of defining semantics-free workflows with a YAML specification.
Accordion's dynamic workflows take a purely declarative approach: one declares the names of the stages (or tasks) applicable to a workflow and assigns the workflow a name. When that workflow name is attached to a Gerrit code change or a GitHub/Bitbucket pull request as a tag or a comment, the Accordion CI/CD software applies the name to the given workflow and template files, renders all stages with their definitions, and generates the resultant pipeline dynamically. The actual pipeline code remains constant, as boilerplate for any workflow; only the workflow name attached to the code commit or job parameters (external to the scope of the pipeline code) defines the pipeline's flow.
Dynamic workflows offer:
Here are some examples to illustrate the behavior of policy-based dynamic workflows in Accordion CI/CD.
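As a sketch (again with assumed field names), a workflow file could map each policy name to the stages it should render, while pipeline.yaml stays untouched:

```yaml
# workflows.yaml -- policy-based workflow declarations (structure assumed for illustration)
workflows:
  - name: "a7n:lint"              # attached to a PR as a label or comment
    stages:
      - lint
  - name: "a7n:build"
    stages:
      - build
      - lint
      - test
```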
In the above example, pipeline.yaml remains the same for any workflow. To run the review workflow with the policy name "a7n:lint", for example, the policy label "a7n:lint" is attached to a GitHub or Bitbucket PR. The Accordion CI software then dynamically generates a pipeline with the blocks defined in the workflow file and the lint stage, rendering them from the templates.
If a developer wants to execute the pipeline with a different workflow, only the workflow name attached to the GitHub pull request changes. For example, attaching "a7n:build" dynamically generates the resultant pipeline with the build, lint, and test stages, all with no changes to the underlying pipeline code. This feature needs no conditional constructs or custom DSL syntax in the pipeline or workflow files, and it does not require importing multiple workflow files per workflow for a complex pipeline.
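Conceptually, attaching "a7n:build" to the pull request makes Accordion resolve the workflow's stage names against the imported template, yielding a rendered pipeline roughly equivalent to the following (stage commands carried over from the earlier assumed sketch):

```yaml
# Rendered pipeline for the "a7n:build" workflow (illustrative only)
stages:
  - name: build
    steps:
      - run: make build
  - name: lint
    steps:
      - run: make lint
  - name: test
    steps:
      - run: make test
```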
Highlighting important aspects of dynamic workflows:
Accordion dynamic workflows and templates bring the power of distributed pipeline development to large-scale projects: they avoid repetitive pipeline code development, lower maintenance, abstract away secrets, and keep logic secure. Workflows are generated on the fly from a selected GitHub label, a comment on a pull request, or any input from any VCS software. Pure declarative manifests describe the flow of the process, from which the pipeline is built dynamically based on VCS triggers and selections.
If you want to explore further, we encourage you to read more about Accordion and what it can do for you.
We sincerely thank Kalyan Ghosh, Debasish Chakraborty and the Data Center and Cloud Networking DevOps team for their valuable contributions to this project.