Dependency Scanning in DevSecOps
When I first started building cloud native applications, I was laser-focused on functionality and speed. "Move fast and break things" was my mantra, until a critical vulnerability in a third-party library led to a security breach that took down our production environment for nearly 12 hours. That painful experience completely changed my approach to application security, particularly around dependency management.
In this post, I'll share my personal journey implementing dependency scanning in cloud native environments, with practical examples using GitLab CI/CD, SAST, DAST, and SonarQube that have helped me sleep better at night knowing my applications aren't ticking time bombs of vulnerabilities.
The Wake-Up Call: When Dependencies Become Your Biggest Liability
It was a regular Tuesday when I received the alert: our customer data API was returning errors. After some frantic investigation, we discovered a security breach stemming from a vulnerability in an outdated npm package we were using. The irony? The vulnerability had been patched months earlier, but we were still using the vulnerable version.
This incident taught me an expensive lesson: in cloud native applications, you're only as secure as your weakest dependency. With modern applications often containing hundreds or even thousands of open-source components, manually tracking vulnerabilities is impossible. That's where automated dependency scanning became my lifesaver.
Understanding Dependency Scanning in Cloud Native Applications
In cloud native environments, dependency scanning is particularly crucial because:
Microservices architecture multiplies the number of dependencies across services
Frequent deployments mean security must be constantly verified
Containerization requires scanning both application dependencies and container base images
Distributed systems can spread vulnerabilities across your infrastructure
After my security incident, I made dependency scanning a non-negotiable part of our CI/CD pipeline. Here's the approach I've refined over years of trial and error.
My Comprehensive GitLab CI/CD Setup for Dependency Scanning
GitLab has become my go-to platform for implementing DevSecOps practices. Here's the actual .gitlab-ci.yml configuration I use for a typical Node.js microservice:
stages:
  - dependency-scan
  - sast
  - build
  - test
  - dast
  - deploy

# Global variables used across jobs
variables:
  SECURE_LOG_LEVEL: "debug"
  SCAN_KUBERNETES_MANIFESTS: "true"

# Dependency scanning with custom rules
dependency_scanning:
  stage: dependency-scan
  image: registry.gitlab.com/gitlab-org/security-products/dependency-scanning:latest
  variables:
    DS_DEFAULT_ANALYZERS: "gemnasium-nodejs,gemnasium"
    DS_MAJOR_SEVERITY_THRESHOLD: "high"
    DS_REMEDIATE: "true"
    DS_EXCLUDED_PATHS: "test/, spec/"
  artifacts:
    reports:
      dependency_scanning: gl-dependency-scanning-report.json
    expire_in: 1 week
  rules:
    - if: $CI_COMMIT_BRANCH

# SAST scanning with custom configuration
sast:
  stage: sast
  image: registry.gitlab.com/gitlab-org/security-products/sast:latest
  variables:
    SAST_EXCLUDED_PATHS: "node_modules, dist, build, vendor"
    SEARCH_MAX_DEPTH: 20
  artifacts:
    reports:
      sast: gl-sast-report.json
    expire_in: 1 week
  rules:
    - if: $CI_COMMIT_BRANCH

# Build and push Docker image
build:
  stage: build
  image: docker:20.10.16
  services:
    - docker:20.10.16-dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA
  rules:
    - if: $CI_COMMIT_BRANCH

# SonarQube analysis
sonarqube-check:
  stage: test
  image:
    name: sonarsource/sonar-scanner-cli:latest
    entrypoint: [""]
  variables:
    SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"
    GIT_DEPTH: "0"
  cache:
    key: "${CI_JOB_NAME}"
    paths:
      - .sonar/cache
  script:
    - sonar-scanner
      -Dsonar.projectKey=${SONAR_PROJECT_KEY}
      -Dsonar.projectName=${CI_PROJECT_NAME}
      -Dsonar.qualitygate.wait=true
      -Dsonar.sources=.
      -Dsonar.exclusions=node_modules/**,coverage/**,dist/**
  rules:
    - if: $CI_COMMIT_BRANCH

# DAST against deployed staging environment
dast:
  stage: dast
  image:
    name: registry.gitlab.com/gitlab-org/security-products/dast:latest
  variables:
    DAST_WEBSITE: https://staging-${CI_PROJECT_NAME}.example.com
    DAST_FULL_SCAN_ENABLED: "true"
  artifacts:
    reports:
      dast: gl-dast-report.json
    expire_in: 1 week
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
      when: manual

# Deploy to production only if all checks pass
deploy:
  stage: deploy
  script:
    - echo "Deploying application..."
    - apt-get update -qq && apt-get install -y -qq curl
    - curl -sL https://deb.nodesource.com/setup_16.x | bash -
    - apt-get install -y nodejs
    - npm install -g kubernetes-deploy
    - kubernetes-deploy $KUBE_NAMESPACE $KUBE_CONTEXT
  environment:
    name: production
    url: https://api.example.com
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
      when: manual
This pipeline has saved me countless times by catching vulnerable dependencies before they reach production. Let me walk through some key elements.
Dependency Scanning: The First Line of Defense
My dependency scanning job runs first in the pipeline for a reason: if there's a critical vulnerability, why waste time on further testing?
For Node.js applications, I configure the scanner to use both gemnasium-nodejs (for npm packages) and gemnasium (for other dependencies). The DS_MAJOR_SEVERITY_THRESHOLD variable is set to "high" so that the pipeline fails on high or critical vulnerabilities, forcing our team to address them immediately.
I've customized my configuration to exclude test directories since test dependencies rarely make it to production and can create noise in scanning reports.
For Python microservices, I adjust the configuration to use the appropriate analyzer (gemnasium-python is GitLab's analyzer for pip/Pipenv projects):

dependency_scanning:
  variables:
    DS_DEFAULT_ANALYZERS: "gemnasium-python"
Integrating SonarQube for Deeper Analysis
While GitLab's built-in tools are excellent, I've found that adding SonarQube to the mix provides more comprehensive code quality analysis. My SonarQube job is configured to wait for the quality gate result, which means the pipeline will fail if the code doesn't meet our predefined quality criteria.
The most valuable SonarQube features for my team have been:
Detection of security hotspots that might not be flagged by SAST
Code duplication analysis which often reveals copy-pasted vulnerable code
Legacy code identification that highlights old libraries needing updates
I've connected SonarQube to our GitLab instance using webhooks, so security findings automatically appear in merge requests:
# sonar-project.properties (repository root)
sonar.projectKey=my-service
sonar.qualitygate.wait=true
sonar.sources=src
sonar.javascript.lcov.reportPaths=coverage/lcov.info
sonar.dependencyCheck.htmlReportPath=dependency-check-report.html
# Map CVSS scores to SonarQube severities (dependency-check plugin)
sonar.dependencyCheck.severity.blocker=9.0
sonar.dependencyCheck.severity.critical=7.0
sonar.dependencyCheck.severity.major=4.0
Handling False Positives: The Art of Dependency Scanning
One challenge I initially struggled with was the high number of false positives. To address this, I created a custom .gitlab-dependency-scanning.yml file in my repository:
scan_policies:
  - name: Critical vulnerabilities only
    description: Only fail on critical vulnerabilities
    enabled: true
    rules:
      - type: dependency_scanning
        vulnerabilities_allowed: 0
        severity_levels: [critical]
    vulnerabilities_allowed_for_severity_levels:
      unknown: 100
      low: 100
      medium: 100
      high: 0
      critical: 0
    ignored:
      - id: "gemnasium:minimist:1.2.5:CVE-2020-7598"
        reason: "This dependency is not used in production code"
This configuration allows some flexibility while maintaining strict security standards for production code.
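To keep that allowlist auditable outside GitLab's UI, I sometimes post-process the scanning report in a small script. Below is a minimal Python sketch of the idea; the report structure is simplified to just the fields I use, and the helper name `actionable_vulnerabilities` is my own, not part of any GitLab tooling:

```python
import json

# Vulnerability IDs we have reviewed and accepted, with a reason for the audit trail.
ALLOWLIST = {
    "gemnasium:minimist:1.2.5:CVE-2020-7598": "not used in production code",
}

def actionable_vulnerabilities(report: dict) -> list:
    """Return findings whose IDs are not in the reviewed allowlist."""
    return [
        v for v in report.get("vulnerabilities", [])
        if v.get("id") not in ALLOWLIST
    ]

# In the pipeline this would be json.load() over gl-dependency-scanning-report.json;
# here an inline sample with the same shape keeps the sketch self-contained.
sample = json.loads("""
{"vulnerabilities": [
  {"id": "gemnasium:minimist:1.2.5:CVE-2020-7598", "severity": "Medium"},
  {"id": "gemnasium:lodash:4.17.15:CVE-2020-8203", "severity": "High"}
]}
""")

for finding in actionable_vulnerabilities(sample):
    print(finding["id"])  # only the non-allowlisted lodash finding is printed
```

A script like this can run as an extra pipeline job and exit non-zero when the filtered list is non-empty.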
SAST and DAST: Complementing Dependency Scanning
While dependency scanning focuses on libraries and packages, SAST (Static Application Security Testing) helps identify vulnerabilities in my own code, and DAST (Dynamic Application Security Testing) tests the running application.
I've found that these three techniques work best together:
Dependency scanning finds vulnerabilities in third-party code
SAST finds vulnerabilities in my code before deployment
DAST validates that the running application is secure
For particularly sensitive microservices, I've extended the DAST job to include authenticated scanning:
dast:
  variables:
    DAST_AUTH_URL: ${DAST_WEBSITE}/login
    DAST_USERNAME: ${DAST_USERNAME}
    DAST_PASSWORD: ${DAST_PASSWORD}
    DAST_USERNAME_FIELD: "username"
    DAST_PASSWORD_FIELD: "password"
    DAST_AUTH_EXCLUDE_URLS: ${DAST_WEBSITE}/logout
Real Results: How This Approach Has Protected My Applications
This comprehensive approach has paid dividends. In the past year alone, my team has:
Caught 37 critical vulnerabilities before they reached production
Reduced our mean time to remediation from 14 days to less than 2 days
Achieved compliance with SOC2 and ISO 27001 security requirements
Maintained a perfect record of zero security incidents due to vulnerable dependencies
Practical Tips from My Experience
After implementing dependency scanning across dozens of cloud native applications, here are my hard-earned tips:
1. Set Clear Vulnerability Management Policies
Establish clear guidelines for when to:
Immediately patch (critical vulnerabilities)
Schedule patches (high vulnerabilities)
Batch update (medium/low vulnerabilities)
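Those three buckets are easy to encode in a small triage script. The sketch below groups findings into remediation queues; the bucket names and the `remediation_bucket` helper are my own convention, while the severity strings follow the labels GitLab's reports use:

```python
def remediation_bucket(severity: str) -> str:
    """Map a scanner severity to one of our three remediation queues."""
    severity = severity.lower()
    if severity == "critical":
        return "patch-immediately"   # fix now, block the release
    if severity == "high":
        return "schedule-patch"      # fix within the current sprint
    return "batch-update"            # roll into the next routine update

findings = [
    {"id": "CVE-2021-23337", "severity": "Critical"},
    {"id": "CVE-2020-8203", "severity": "High"},
    {"id": "CVE-2020-28500", "severity": "Low"},
]

# Group finding IDs by queue so each queue can feed a separate ticket or MR.
queues = {}
for f in findings:
    queues.setdefault(remediation_bucket(f["severity"]), []).append(f["id"])

print(queues)
```

Writing the policy down as code like this keeps triage decisions consistent across the team instead of being re-litigated per incident.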
2. Use Dependency Lockfiles
For Node.js applications, I always commit package-lock.json to ensure dependency versions are consistent across environments.
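In CI I pair the lockfile with npm ci, which installs exactly what the lockfile records and fails fast if package.json and package-lock.json disagree. A minimal job sketch (the job name and cache key layout are illustrative, not from my pipeline above):

```yaml
install_dependencies:
  stage: build
  image: node:16
  cache:
    key:
      files:
        - package-lock.json   # cache is invalidated whenever the lockfile changes
    paths:
      - .npm/
  script:
    # npm ci installs strictly from package-lock.json, never "close enough" versions
    - npm ci --cache .npm --prefer-offline
```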
3. Regularly Update Dependencies
I've automated dependency updates using GitLab's Renovate integration, which creates merge requests for outdated packages:
# renovate.json (repository root)
{
  "extends": ["config:base"],
  "packageRules": [
    {
      "matchUpdateTypes": ["minor", "patch"],
      "matchCurrentVersion": "!/^0/",
      "automerge": true
    }
  ],
  "vulnerabilityAlerts": {
    "enabled": true,
    "labels": ["security"]
  }
}
4. Consider Using Private Package Repositories
For critical applications, I use GitLab's Package Registry to host vetted versions of dependencies, reducing the risk of supply chain attacks.
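For npm, pointing a scoped package at the GitLab Package Registry is a couple of lines of .npmrc. A sketch, where the @myorg scope, the gitlab.example.com host, and the project ID 1234 are all placeholders for your own values:

```ini
# .npmrc — resolve @myorg packages from our vetted GitLab registry
@myorg:registry=https://gitlab.example.com/api/v4/projects/1234/packages/npm/
//gitlab.example.com/api/v4/projects/1234/packages/npm/:_authToken=${CI_JOB_TOKEN}
```

Unscoped packages still resolve from the public registry, so I scope everything I publish internally.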
Getting Started with Dependency Scanning in Your Projects
If you're new to dependency scanning, here's how I recommend getting started:
Start small: Enable basic dependency scanning in your CI/CD pipeline
Fix critical issues first: Address the highest-risk vulnerabilities immediately
Gradually increase coverage: Add more scanning tools as your team becomes comfortable
Document exceptions: Create a process for documenting when vulnerabilities can't be fixed
Train your team: Ensure everyone understands how to interpret scanning results
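For the "start small" step, the lowest-friction entry point in GitLab is the built-in template: a single include block enables the default dependency scanning job with sensible defaults, and you can layer on custom variables later.

```yaml
# Minimal .gitlab-ci.yml — enables GitLab's managed dependency scanning job
include:
  - template: Security/Dependency-Scanning.gitlab-ci.yml
```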
Conclusion: Security as a Journey, Not a Destination
Implementing dependency scanning was just the beginning of my security journey. I've learned that security in cloud native applications isn't something you "finish" and move on from; it's an ongoing process of continuous improvement.
By integrating dependency scanning, SAST, DAST, and SonarQube into my GitLab CI/CD pipeline, I've created multiple layers of defense that catch different types of vulnerabilities at different stages of development.
The peace of mind this brings is invaluable. I no longer worry about whether we're using vulnerable dependencies; I know we're catching them early and addressing them promptly.
In my next post, I'll share how I've extended this security approach to container scanning and infrastructure-as-code validation. Stay tuned!