
Shift Left Development

Development Environments, Static Analysis Tools, and CI/CD Pipelines

Author: Roland Knight, Chief Technology Officer – Product


In today’s rapidly evolving business landscape, timely software delivery is paramount. A faster developer cycle time ensures high-quality, secure, and on-time releases. An effective strategy to achieve this is by shifting tools left, empowering developers to take charge in the early stages of the software development cycle.

Streamlining the development cycle can yield numerous benefits across the software development life cycle (SDLC). By optimizing development environment creation, static analysis tools, and CI/CD pipelines, developers can iterate faster and become more productive. This expedited process encourages smaller commits, resulting in fewer merge conflicts and in defects being addressed before check-in rather than after. Additionally, shorter cycle times can foster a more positive work environment, as developers enjoy the satisfaction of seeing their work come to fruition sooner, which can boost motivation and job satisfaction. This article explores some of the top tools and selection strategies to help reduce development cycle time.

Instant Development Environments with Development Containers

Before developers can start coding on a project, they must go through a detailed onboarding process that typically involves installing and setting up an integrated development environment (IDE) and its associated dependencies. These dependencies can range from operating system drivers (such as CUDA for AI/ML and game applications) and OS dependencies and configurations to binary tool dependencies, dependency management tools, database/service dependencies, and IDE plugins. While there are several ways to install and configure development environments, the most common configurations are:

  • Entirely Local: IDE, source, and build environment are all local on the developer’s machine (e.g., VS Code, Eclipse, IntelliJ IDEA).
  • Remote Desktop: Same as entirely local except in a remote machine (or virtual machine) with access via remote desktop tools (e.g., VNC, Citrix).
  • Remote Web Application: IDE, source, and build environments are remote and can be accessed through a web browser (e.g., VS Code Web, Eclipse Theia, AWS Cloud9).
  • Local and VM: IDE is installed locally on the developer’s machine, but source files and build environment are in a remote (or local) VM (e.g., VS Code Remote Development, JetBrains Gateway, Eclipse RSE).
  • Development Containers: Local IDE and local (or remote) container(s) hosting source files and build environment (e.g., VS Code Remote Development, JetBrains Dev Containers Plugin).

Development Containers are an exciting and modern technological advancement that allows you to seamlessly use your local IDE with a containerized build and execution environment, whether local or remote. This setup offers the flexibility to personalize your local workspace while maintaining a consistent and isolated environment for build and execution. You can set up a fully configured working development environment using VS Code with just one click in minutes. Your developers will appreciate this feature!

But what exactly is a development container? Microsoft introduced an open specification and reference implementation that defines the metadata to configure your development container’s Operating System (OS), tool, and runtime stack. By utilizing this metadata to configure your development container, you can choose from over 40 ready-made templates (e.g., Alpine, Ubuntu, Conda), install additional tools from a selection of over 25 features (e.g., AWS CLI, Python, Rust), and then define volume mounts, ports to expose, additional containers to start, and more. Furthermore, this configuration can be found in your repository, which VS Code will automatically detect and prompt to open in a container. Even better, you can include a specially formatted URL in your project file, and developers can open your project with just one click when browsing the repository!
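For example, a minimal `.devcontainer/devcontainer.json` using one of the ready-made images and features might look like the following sketch (the image tag, feature version, port, mount, and post-create command are illustrative assumptions; adapt them to your project):

```json
{
  "name": "my-project",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  "features": {
    "ghcr.io/devcontainers/features/aws-cli:1": {}
  },
  "forwardPorts": [8000],
  "mounts": [
    "source=${localWorkspaceFolder}/.cache,target=/home/vscode/.cache,type=bind"
  ],
  "postCreateCommand": "pip install -r requirements.txt"
}
```

With this file committed to the repository, VS Code detects it on open and offers to reopen the project inside the container.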

Efficient IDE performance is essential for software development, but large projects whose source files live outside the local file system can suffer significant performance problems. Luckily, there is a solution: the VS Code Development Containers extension installs the components that require low-latency access to source files, such as the language server, inside the container, ensuring that resource-intensive features like auto-complete, error-checking, and jump-to-definition do not degrade. The result is a high-performance development environment with an isolated, consistently configured container environment.

The Development Containers CLI tool is the key to fully portable build and execution environments and can be utilized in CI/CD pipelines to guarantee builds that are 100% consistent with the development environment. Additionally, all deployment environments, including production, can use the same base configuration to ensure consistent tooling and configuration for runtime execution.
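As a hedged sketch of what this looks like in practice, a CI job can build and run inside the same dev container definition using the devcontainers/ci GitHub Action (the action version and the `make test` command are assumptions; substitute your own build commands):

```yaml
# Hypothetical GitHub Actions job that reuses .devcontainer/devcontainer.json
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build and test inside the dev container
        uses: devcontainers/ci@v0.3
        with:
          runCmd: make test
```

Because the pipeline builds from the same devcontainer configuration the developer uses locally, "works on my machine" failures largely disappear.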

By utilizing Development Containers in VS Code, developers get a local IDE with containerized development, build, and execution – a fully featured environment with the same performance as a purely local setup. Consistent, isolated environments that are identical to CI/CD pipeline environments help eliminate pipeline failures and deployment inconsistencies. Plus, the one-click setup, which takes less than a minute once images are cached, makes this tool even more appealing to developers.

Fast, Integrated, and Consistent Static Analysis Tooling

Let us first define what we mean by static analysis tooling. Essentially, this refers to a collection of tools used to analyze source code and binary build artifacts, including static application security testing (SAST) scanners, code quality scanners, secrets detectors, dependency analyzers, and container image scanners.

However, SAST and code quality scanning can be time-consuming and often require developers to navigate multiple tools to access scanning results. Developers want a simple, integrated interface that lets them quickly identify and repair issues. Additionally, local results must be identical to pipeline job results to avoid pipeline failures and extended cycle times. Therefore, we need to streamline static analysis tooling and shift it left.

But with so many static analysis tools available, how do we choose the best one for our project? Unfortunately, there is no easy answer to this question. However, we can break it down into two main categories: the type of project and the resources available.

  • Open-source projects: You are in luck! The tools you need are available for free from both GitHub and GitLab, and they can also run locally before you commit.
  • Well-funded commercial projects: Spend the money and get GitHub Enterprise or GitLab Ultimate. You will get a robust, integrated suite of static analysis tools running locally. However, truly shifting all tooling left will require configuration and integration work.
  • Budget-conscious commercial projects: This is where it gets challenging. Your specific toolset will depend on many factors, including your security needs and your capacity to curate and configure a toolset. We recommend selecting open-source tools with significant vendor backing, such as those behind GitHub/GitLab. These are often the same tools available in the higher-tier licenses but without the slick pipeline user interface integration – a gap we can work around.

Regardless of your choice above, the result is the same: unified tools that run locally and in the pipeline with consistent results. No swiveling between tools or logging in to additional user interfaces is required to see scanning results. Note that some integration on the client side may still be needed to streamline your workflow or to augment it with specialized scanning tools.
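One common way to get identical local and pipeline results is to drive open-source scanners from a single shared configuration. As a hedged sketch, a `.pre-commit-config.yaml` like the following could run a secrets detector and a SAST scanner on every local commit, with the pipeline invoking the same hooks (repository revisions and arguments are illustrative; check each project's documentation):

```yaml
# .pre-commit-config.yaml -- the pipeline runs `pre-commit run --all-files`
# against this same file, so local and CI scan results match.
repos:
  - repo: https://github.com/gitleaks/gitleaks
    rev: v8.18.0
    hooks:
      - id: gitleaks            # secrets detection
  - repo: https://github.com/returntocorp/semgrep
    rev: v1.50.0
    hooks:
      - id: semgrep             # SAST / code quality
        args: ["--config", "auto", "--error"]
```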

Local, Portable, and Optimized CI/CD Pipeline Execution

Dealing with intricate pipelines often requires multiple attempts before achieving seamless execution. When a pipeline runs for 30 minutes or more, it can take days to perfect. Even after a successful implementation, complicated modifications can still consume a significant amount of time. The optimal approach is to move pipeline execution to local developer environments and shorten execution time.

Meet Dagger – a cutting-edge tool created by the brilliant minds behind Docker. With Dagger, you can seamlessly create pipelines using code and expertly parallelize jobs according to their dependencies while executing them within containers.

Among its standout features are:

  • Local pipeline execution: Command line tool to run your pipeline in local containers.
  • Pipeline as code: Define your pipeline in various languages, including TypeScript, Python, Java, Go, and GraphQL.
  • Automatic parallelization: Dependencies in the pipeline are used to build a directed acyclic graph (DAG) to execute the pipeline optimally.
  • Automatic caching: The pipeline definition includes a definition of reusable layers that are automatically cached. Each job is essentially a Dockerfile-as-code!
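To illustrate how declared dependencies yield automatic parallelization (this is a conceptual sketch of DAG scheduling, not Dagger's actual implementation), the following groups a hypothetical pipeline's jobs into stages whose members can all run concurrently:

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical pipeline: job -> set of jobs it depends on
deps = {
    "lint": set(),
    "build": set(),
    "unit-tests": {"build"},
    "image": {"build"},
    "deploy": {"unit-tests", "image"},
}

def parallel_stages(deps):
    """Group jobs into stages; all jobs within a stage can run in parallel."""
    ts = TopologicalSorter(deps)
    ts.prepare()
    stages = []
    while ts.is_active():
        ready = list(ts.get_ready())   # every job whose dependencies are done
        stages.append(sorted(ready))
        ts.done(*ready)
    return stages

print(parallel_stages(deps))
# -> [['build', 'lint'], ['image', 'unit-tests'], ['deploy']]
```

Here `lint` and `build` have no dependencies and run together; `unit-tests` and `image` both wait only on `build`; `deploy` waits for the previous stage to finish.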

Ensuring that your pipeline runs seamlessly on your local machine and generates identical outcomes during post-commit execution is a notable advancement in itself. Combine this capability with pipeline-as-code, container-image-as-code, and automatic DAG concurrency, and you elevate your CI/CD pipeline to a new level. We strongly urge you to explore Dagger for your pipeline requirements.


In this article, we provide recommendations to accelerate development by shifting many tools left in the development cycle.

  • Development Environments: Development Containers allow you to set up consistent development environments in seconds. The technology is stable and ready to use in VS Code now!
  • Static Analysis Tools: SAST and other static analysis tooling offer many choices – unify your toolset and move to open source as much as possible.
  • CI/CD Pipelines: Go beyond just running your pipeline jobs locally – run your optimized pipeline-as-code locally with Dagger. Dagger is rapidly approaching a 1.0 release, and we recommend starting to use it now!

In part two, we will explore tools and strategies for shift left deployment to support the secret-free deployment of development environments with complex service and dataset dependencies.


