Compare commits: 7587c285e7..main (5 commits)

| SHA1 |
|---|
| 7f366204a9 |
| 1036fce55e |
| 38ff2f4fde |
| f0b937deb7 |
| 251070dd77 |
@@ -1,3 +1,68 @@
# agentguard-ci

A DevSecOps Argo Workflows pipeline to protect against AI coding agent hallucinations and supply chain attacks.
A DevSecOps Argo Workflows pipeline specifically designed to protect against AI coding agent hallucinations, supply chain attacks, and security misconfigurations in a homelab or solo-developer environment.

## 📖 The Problem

AI coding agents are highly productive "junior developers," but they lack intrinsic context. They frequently hallucinate dummy credentials, introduce insecure application logic, or pull in new, potentially typosquatted dependencies.

This pipeline acts as a strict, automated gatekeeper that prioritizes zero-noise alerting, allowing you to maintain high development velocity without compromising the security of your exposed homelab.

## 🏗️ Architecture & Features

This project deploys an **Argo ClusterWorkflowTemplate** that orchestrates a parallel security scanning matrix whenever code is pushed:

* **TruffleHog**: Dynamically verifies leaked API keys to prevent false positives from AI hallucinations.
* **Semgrep**: Scans first-party application logic for vulnerabilities (e.g., SQLi, XSS).
* **Socket.dev**: Analyzes dependencies for supply chain attacks, malware, and typosquatting.
* **Pulumi CrossGuard**: Validates Infrastructure as Code against policy packs.
* **Syft + Grype**: Generates SBOMs and scans for container vulnerabilities scored via EPSS.
* **KICS**: Scans infrastructure for misconfigurations.
* **DefectDojo & MinIO**: Uploads findings to a centralized ASPM dashboard and raw SARIF/JSON reports to S3-compatible storage.
* **Policy Enforcement**: Custom TypeScript logic automatically fails the build if any findings exceed your defined CVSS severity threshold.
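The CVSS gate described in the last bullet is, at its core, a filter over normalized findings. A minimal sketch of the idea (the types and names here are illustrative, not the shipped implementation — the real logic lives in the tools image):

```typescript
// Illustrative sketch of the CVSS policy gate; not the shipped implementation.
type Finding = { name: string; score: number };

// Return every finding whose CVSS score meets or exceeds the threshold.
function exceedsThreshold(findings: Finding[], threshold: number): Finding[] {
  return findings.filter(f => f.score >= threshold);
}

const blocked = exceedsThreshold(
  [{ name: "semgrep.sarif", score: 8.1 }, { name: "kics.json", score: 4.0 }],
  7.0,
);
console.log(blocked.length > 0 ? "FAIL build" : "PASS");
```

With a threshold of 7.0, only the 8.1 finding survives the filter, so the build would fail.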
For deep-dive architecture decisions, see the [Pipeline Overview ADR](docs/pipeline-overview.md) and [Secret Strategy ADR](docs/secret-strategy.md).

## 🚀 Prerequisites

Before installing the pipeline, ensure your Kubernetes cluster has the following installed:

* **Argo Workflows**
* **Infisical Kubernetes Operator** (for secret injection)
* **DefectDojo** (for vulnerability dashboards)
* **MinIO / S3** (for raw report storage)

You will also need API keys or tokens for: Socket.dev, Pulumi, AWS/MinIO, and DefectDojo.

## 🛠️ Installation

### 1. Build the Pipeline Tools Image

The pipeline relies on custom TypeScript logic (e.g., CVSS enforcement and API uploads). Build and push this image to your registry:

```bash
cd tools
docker build -t your-registry/agentguard-tools:latest .
docker push your-registry/agentguard-tools:latest
```

*(Make sure to update `clusterworkflowtemplate.yaml` with your custom image if you do not use `agentguard-tools:latest`.)*

### 2. Configure Helm Values

Update `helm/values.yaml` (if applicable) and configure your Infisical integration:

```yaml
pipeline:
  enabled: true
  infisical:
    workspaceSlug: "your-workspace-id"
    projectSlug: "your-project-id"
```

### 3. Deploy via Helm

Install the pipeline and its associated resources to your cluster:

```bash
helm upgrade --install agentguard-ci ./helm -n argo
```

## 🔐 Secret Management Integration

To prevent hardcoded secrets in the pipeline, this project uses the **Infisical Kubernetes Operator**.

When you deploy the Helm chart, it creates an `InfisicalSecret` Custom Resource (`helm/templates/infisical-secret.yaml`). The Infisical Operator securely fetches your vault secrets (like `SOCKET_DEV_API_KEY` and `DEFECTDOJO_API_TOKEN`) and synchronizes them into a standard Kubernetes `Secret` named `amp-security-pipeline-secrets`.

The Argo Workflow then mounts this standard secret as environment variables inside the scanning containers, ensuring zero secret leakage in the Git repository.
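For orientation, an `InfisicalSecret` resource along these lines is what the chart templates out (this is an illustrative sketch, not the chart's actual file — field names follow the Infisical operator's published schema, and the slugs, namespaces, and auth method are placeholders; see `helm/templates/infisical-secret.yaml` for the real resource):

```yaml
# Illustrative only — see helm/templates/infisical-secret.yaml for the real resource.
apiVersion: secrets.infisical.com/v1alpha1
kind: InfisicalSecret
metadata:
  name: amp-security-pipeline
  namespace: argo
spec:
  authentication:
    universalAuth:
      credentialsRef:
        secretName: infisical-universal-auth   # machine-identity credentials (assumed name)
        secretNamespace: argo
      secretsScope:
        projectSlug: your-project-id
        envSlug: prod
        secretsPath: "/"
  managedSecretReference:
    secretName: amp-security-pipeline-secrets  # the Secret the Argo Workflow consumes
    secretNamespace: argo
```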
@@ -0,0 +1,16 @@
import glob, re, os

files = glob.glob("helm/templates/scan-*.yaml") + glob.glob("helm/templates/upload-*.yaml") + ["helm/templates/enforce-policy.yaml"]
for f in files:
    with open(f) as file:
        content = file.read()
    match = re.search(r'spec:\n  templates:\n(.*)(?:{{- end }})', content, re.DOTALL)
    if match:
        template_content = match.group(1).strip()
        # Extract the base name, e.g. scan-kics
        base_name = os.path.basename(f).replace('.yaml', '')
        new_content = f'{{{{- define "template.{base_name}" }}}}\n{template_content}\n{{{{- end }}}}\n'
        new_filename = os.path.join(os.path.dirname(f), f"_{base_name}.yaml")
        with open(new_filename, "w") as out:
            out.write(new_content)
        os.remove(f)
@@ -31,7 +31,7 @@ For solo personal projects, a complex CI/CD security pipeline is usually overkil

---

### The Chosen Solution: Dual-Layer Approach
### The Chosen Solution: Dual-Layer Approach + Infisical Runtime Injection

#### Layer 1: Gitleaks (The Local Guard)
* **Where:** Local developer machine (Pre-commit Hook).
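For reference, wiring Gitleaks into the pre-commit framework is typically just a few lines in `.pre-commit-config.yaml` (the repo URL and hook id come from the upstream Gitleaks project; the pinned `rev` below is only an example and should be whatever release you have vetted):

```yaml
# .pre-commit-config.yaml — example wiring for Layer 1
repos:
  - repo: https://github.com/gitleaks/gitleaks
    rev: v8.18.4        # example pin; use a release you trust
    hooks:
      - id: gitleaks    # scans staged changes for secrets before each commit
```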
@@ -41,6 +41,10 @@ For solo personal projects, a complex CI/CD security pipeline is usually overkil
* **Where:** GitHub Actions / CI Pipeline (Post-commit).
* **Why:** Uses active verification. If a secret slips past (via an AI agent pushing directly or a bypassed local hook), TruffleHog actively calls out to external APIs to verify whether the key is live. By using the `--only-verified` flag, it guarantees zero false positives and only fails the pipeline if it proves a key is an active threat.

#### Layer 3: Infisical Operator (Pipeline Runtime Injection)
* **Where:** Inside the Kubernetes Cluster (via `InfisicalSecret` CRD).
* **Why:** The security pipeline itself requires numerous highly privileged secrets (DefectDojo API tokens, AWS S3 keys, Pulumi access tokens, Socket.dev keys) to execute the scans and upload reports. We do not store these in GitOps. Instead, the Helm chart deploys an `InfisicalSecret` resource. The Infisical Kubernetes Operator authenticates with the central vault, pulls the secrets dynamically, and syncs them into a native Kubernetes `Secret` (`amp-security-pipeline-secrets`). The Argo Workflow containers then consume these safely at runtime as environment variables.

---

### Tradeoffs & Accepted Risks
@@ -5,7 +5,7 @@ metadata:
data:
  renovate.json: |
    {
      "extends": ["github>my-org/my-repo//renovate-preset"],
      "extends": [{{ .Values.preset | quote }}],
      "onboarding": false,
      "platform": "github",
      "repositories": {{ toJson .Values.repositories }}
@@ -4,4 +4,5 @@ image:
  pullPolicy: IfNotPresent

schedule: "0 * * * *"
preset: "github>my-org/my-repo//renovate-preset"
repositories: []
@@ -0,0 +1,17 @@
{{- define "template.enforce-policy" }}
- name: enforce-policy
  inputs:
    parameters:
      - name: fail-on-cvss
  container:
    image: agentguard-tools:latest
    command:
      - node
      - /app/dist/enforce-policy.js
    env:
      - name: FAIL_ON_CVSS
        value: "{{inputs.parameters.fail-on-cvss}}"
    volumeMounts:
      - name: workspace
        mountPath: /workspace
{{- end }}
@@ -1,11 +1,5 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
    - name: scan-crossguard
{{- define "template.scan-defectdojo" }}
- name: scan-defectdojo
  container:
    image: pulumi/pulumi:3.154.0
    env:
@@ -1,10 +1,4 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
{{- define "template.scan-kics" }}
- name: scan-kics
  container:
    image: checkmarx/kics:1.7.14
@@ -1,10 +1,4 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
{{- define "template.scan-semgrep" }}
- name: scan-semgrep
  container:
    image: returntocorp/semgrep:1.85.0
@@ -1,10 +1,4 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
{{- define "template.scan-socketdev" }}
- name: scan-socketdev
  container:
    image: socketdev/socketcli:latest
@@ -1,10 +1,4 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
{{- define "template.scan-syft-grype" }}
- name: scan-syft-grype
  container:
    image: anchore/syft:latest
@@ -0,0 +1,16 @@
{{- define "template.scan-trufflehog" }}
- name: scan-trufflehog
  container:
    image: trufflesecurity/trufflehog:latest
    command:
      - sh
      - -c
    args:
      - |
        set -eu
        mkdir -p /workspace/reports
        trufflehog filesystem /workspace --json > /workspace/reports/trufflehog.json || true
    volumeMounts:
      - name: workspace
        mountPath: /workspace
{{- end }}
@@ -0,0 +1,22 @@
{{- define "template.upload-defectdojo" }}
- name: upload-defectdojo
  container:
    image: agentguard-tools:latest
    env:
      - name: DEFECTDOJO_URL
        valueFrom:
          secretKeyRef:
            name: amp-security-pipeline-secrets
            key: DEFECTDOJO_URL
      - name: DEFECTDOJO_API_TOKEN
        valueFrom:
          secretKeyRef:
            name: amp-security-pipeline-secrets
            key: DEFECTDOJO_API_TOKEN
    command:
      - node
      - /app/dist/upload-defectdojo.js
    volumeMounts:
      - name: workspace
        mountPath: /workspace
{{- end }}
@@ -1,10 +1,4 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
{{- define "template.upload-storage" }}
- name: upload-storage
  container:
    image: amazon/aws-cli:2.15.40
@@ -1,3 +1,4 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
@@ -47,21 +48,11 @@ spec:
            value: "{{workflow.parameters.fail-on-cvss}}"
      - name: upload-storage
        dependencies:
          - scan-trufflehog
          - scan-semgrep
          - scan-kics
          - scan-socketdev
          - scan-syft-grype
          - scan-crossguard
          - scanners
        template: upload-storage
      - name: upload-defectdojo
        dependencies:
          - scan-trufflehog
          - scan-semgrep
          - scan-kics
          - scan-socketdev
          - scan-syft-grype
          - scan-crossguard
          - scanners
        template: upload-defectdojo
      - name: enforce-policy
        dependencies:
@@ -76,54 +67,6 @@ spec:
        dependencies:
          - scanners
        template: sinks-and-enforcement
      - name: scan-trufflehog
        dependencies:
          - clone
        template: scan-trufflehog
        arguments:
          parameters:
            - name: working-dir
              value: "{{workflow.parameters.working-dir}}"
      - name: scan-semgrep
        dependencies:
          - clone
        template: scan-semgrep
        arguments:
          parameters:
            - name: working-dir
              value: "{{workflow.parameters.working-dir}}"
      - name: scan-kics
        dependencies:
          - clone
        template: scan-kics
        arguments:
          parameters:
            - name: working-dir
              value: "{{workflow.parameters.working-dir}}"
      - name: scan-socketdev
        dependencies:
          - clone
        template: scan-socketdev
        arguments:
          parameters:
            - name: working-dir
              value: "{{workflow.parameters.working-dir}}"
      - name: scan-syft-grype
        dependencies:
          - clone
        template: scan-syft-grype
        arguments:
          parameters:
            - name: working-dir
              value: "{{workflow.parameters.working-dir}}"
      - name: scan-crossguard
        dependencies:
          - clone
        template: scan-crossguard
        arguments:
          parameters:
            - name: working-dir
              value: "{{workflow.parameters.working-dir}}"
    - name: clone-repo
      inputs:
        parameters:
@@ -146,41 +89,34 @@ spec:
          - name: fail-on-cvss
      dag:
        tasks:
          - name: trufflehog
            template: scan-trufflehog
          - name: semgrep
            template: scan-semgrep
          - name: kics
            template: scan-kics
          - name: socketdev
            template: scan-socketdev
          - name: syft-grype
            template: scan-syft-grype
          - name: defectdojo
            template: scan-crossguard
          {{- range $scanner := list "trufflehog" "semgrep" "kics" "socketdev" "syft-grype" "defectdojo" }}
          - name: {{ $scanner }}
            template: scan-{{ $scanner }}
            arguments:
              parameters:
                - name: working-dir
                  value: "{{inputs.parameters.working-dir}}"
          {{- end }}
    - name: sinks-and-enforcement
      container:
        image: alpine:3.20
        image: curlimages/curl:latest
        command:
          - sh
          - -c
        args:
          - echo "stub: sinks and enforcement"
          - name: scan-trufflehog
            template: scan-trufflehog
          - name: scan-semgrep
            template: scan-semgrep
          - name: scan-kics
            template: scan-kics
          - name: scan-socketdev
            template: scan-socketdev
          - name: scan-syft-grype
            template: scan-syft-grype
          - name: scan-crossguard
            template: scan-crossguard
          - name: upload-storage
            template: upload-storage
          - name: upload-defectdojo
            template: upload-defectdojo
          - name: enforce-policy
            template: enforce-policy
          - |
            set -eu
            echo "Pipeline complete. You can configure a webhook notification here."
            if [ -n "${SLACK_WEBHOOK_URL:-}" ]; then
              curl -X POST -H 'Content-type: application/json' --data '{"text":"Security Pipeline Finished"}' "${SLACK_WEBHOOK_URL}" || true
            fi
{{ include "template.scan-syft-grype" . | indent 4 }}
{{ include "template.scan-socketdev" . | indent 4 }}
{{ include "template.scan-defectdojo" . | indent 4 }}
{{ include "template.scan-semgrep" . | indent 4 }}
{{ include "template.scan-trufflehog" . | indent 4 }}
{{ include "template.scan-kics" . | indent 4 }}
{{ include "template.upload-defectdojo" . | indent 4 }}
{{ include "template.upload-storage" . | indent 4 }}
{{ include "template.enforce-policy" . | indent 4 }}
{{- end }}
@@ -1,88 +0,0 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
    - name: enforce-policy
      inputs:
        parameters:
          - name: fail-on-cvss
      container:
        image: python:3.12-alpine
        command:
          - sh
          - -c
        args:
          - |
            set -eu
            python - <<'PY'
            import json
            import os
            import pathlib
            import sys

            threshold = float(os.environ["FAIL_ON_CVSS"])
            reports_dir = pathlib.Path("/workspace/reports")
            findings = []

            for report in sorted(reports_dir.iterdir()):
                if not report.is_file():
                    continue
                text = report.read_text(errors="ignore")
                if report.suffix == ".sarif":
                    try:
                        data = json.loads(text)
                    except json.JSONDecodeError:
                        continue
                    for run in data.get("runs", []):
                        for result in run.get("results", []):
                            sev = result.get("properties", {}).get("security-severity")
                            if sev is None:
                                continue
                            try:
                                score = float(sev)
                            except (TypeError, ValueError):
                                continue
                            if score >= threshold:
                                findings.append((report.name, score))
                elif report.suffix == ".json":
                    try:
                        data = json.loads(text)
                    except json.JSONDecodeError:
                        continue
                    if isinstance(data, dict):
                        for item in data.get("findings", data.get("vulnerabilities", [])):
                            score = item.get("cvss") or item.get("score")
                            if score is None:
                                continue
                            try:
                                score = float(score)
                            except (TypeError, ValueError):
                                continue
                            if score >= threshold:
                                findings.append((report.name, score))

            if findings:
                for name, score in findings:
                    print(f"{name}: CVSS {score} >= {threshold}", file=sys.stderr)
                raise SystemExit(1)

            print(f"No findings met or exceeded CVSS {threshold}")
            PY
        env:
          - name: FAIL_ON_CVSS
            value: "{{inputs.parameters.fail-on-cvss}}"
        volumeMounts:
          - name: workspace
            mountPath: /workspace
{{- end }}
@@ -1,19 +0,0 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
    - name: scan-trufflehog
      container:
        image: alpine:3.20
        command:
          - sh
          - -c
        args:
          - mkdir -p /workspace/reports && echo "stub: trufflehog" > /workspace/reports/trufflehog.json
        volumeMounts:
          - name: workspace
            mountPath: /workspace
{{- end }}
@@ -1,66 +0,0 @@
{{- if .Values.pipeline.enabled }}
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: amp-security-pipeline-v1.0.0
spec:
  templates:
    - name: upload-defectdojo
      container:
        image: python:3.12-alpine
        env:
          - name: DEFECTDOJO_URL
            valueFrom:
              secretKeyRef:
                name: amp-security-pipeline-secrets
                key: DEFECTDOJO_URL
          - name: DEFECTDOJO_API_TOKEN
            valueFrom:
              secretKeyRef:
                name: amp-security-pipeline-secrets
                key: DEFECTDOJO_API_TOKEN
        command:
          - sh
          - -c
        args:
          - |
            set -eu
            python - <<'PY'
            import json
            import os
            import pathlib
            import urllib.request

            base_url = os.environ["DEFECTDOJO_URL"].rstrip("/")
            api_token = os.environ["DEFECTDOJO_API_TOKEN"]
            product_name = os.environ.get("DEFECTDOJO_PRODUCT_NAME", "agentguard-ci")
            scan_map = {
                ".sarif": "SARIF",
                ".json": "Generic Findings Import",
            }
            reports_dir = pathlib.Path("/workspace/reports")
            for report in sorted(reports_dir.iterdir()):
                if not report.is_file():
                    continue
                scan_type = scan_map.get(report.suffix)
                if not scan_type:
                    continue
                req = urllib.request.Request(
                    f"{base_url}/api/v2/import-scan/",
                    data=json.dumps({
                        "scan_type": scan_type,
                        "product_name": product_name,
                        "file_name": report.name,
                    }).encode(),
                    headers={
                        "Authorization": f"Token {api_token}",
                        "Content-Type": "application/json",
                    },
                    method="POST",
                )
                urllib.request.urlopen(req)
            PY
        volumeMounts:
          - name: workspace
            mountPath: /workspace
{{- end }}
@@ -0,0 +1,14 @@
FROM node:20-alpine

WORKDIR /app

COPY package.json package-lock.json ./
RUN npm ci

COPY tsconfig.json ./
COPY src ./src

RUN npm run build

# The default command isn't strictly necessary as Argo will override it
CMD ["node", "/app/dist/enforce-policy.js"]
(Generated file, +1853 lines: diff suppressed because it is too large.)
@@ -0,0 +1,21 @@
{
  "name": "tools",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "type": "module",
  "scripts": {
    "test": "vitest run",
    "build": "tsc"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "@types/node": "^25.6.0",
    "tsx": "^4.21.0",
    "typescript": "^6.0.3",
    "vitest": "^4.1.4"
  }
}
@@ -0,0 +1,58 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import * as fs from 'node:fs';
import * as path from 'node:path';
import * as os from 'node:os';
import { checkReports } from './enforce-policy.js';

describe('enforce-policy', () => {
  let tempDir: string;

  beforeEach(() => {
    tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'reports-'));
  });

  afterEach(() => {
    fs.rmSync(tempDir, { recursive: true, force: true });
  });

  it('should find vulnerabilities above threshold in SARIF', () => {
    const sarifData = {
      runs: [{
        results: [
          { properties: { 'security-severity': '8.5' } },
          { properties: { 'security-severity': '5.0' } }
        ]
      }]
    };
    fs.writeFileSync(path.join(tempDir, 'test.sarif'), JSON.stringify(sarifData));

    const findings = checkReports(tempDir, 7.0);
    expect(findings).toHaveLength(1);
    expect(findings[0].name).toBe('test.sarif');
    expect(findings[0].score).toBe(8.5);
  });

  it('should find vulnerabilities above threshold in JSON', () => {
    const jsonData = {
      findings: [
        { cvss: 9.0 },
        { score: 6.5 }
      ]
    };
    fs.writeFileSync(path.join(tempDir, 'test.json'), JSON.stringify(jsonData));

    const findings = checkReports(tempDir, 7.0);
    expect(findings).toHaveLength(1);
    expect(findings[0].name).toBe('test.json');
    expect(findings[0].score).toBe(9.0);
  });

  it('should set process.exitCode = 1 for invalid JSON', () => {
    fs.writeFileSync(path.join(tempDir, 'invalid.json'), '{ "bad": json');

    const findings = checkReports(tempDir, 7.0);
    expect(findings).toHaveLength(0);
    expect(process.exitCode).toBe(1);
    process.exitCode = 0; // reset for other tests
  });
});
@@ -0,0 +1,85 @@
import * as fs from 'node:fs';
import * as path from 'node:path';

export function checkReports(reportsDir: string, threshold: number): { name: string; score: number }[] {
  const findings: { name: string; score: number }[] = [];
  if (!fs.existsSync(reportsDir)) return findings;

  const files = fs.readdirSync(reportsDir).sort();

  for (const file of files) {
    const fullPath = path.join(reportsDir, file);
    if (!fs.statSync(fullPath).isFile()) continue;

    const text = fs.readFileSync(fullPath, 'utf-8');
    let data: any;
    try {
      data = JSON.parse(text);
    } catch (e) {
      console.error(`Error parsing ${file}: Invalid JSON`);
      process.exitCode = 1;
      continue;
    }

    if (file.endsWith('.sarif')) {
      const runs = data.runs || [];
      for (const run of runs) {
        const results = run.results || [];
        for (const result of results) {
          const sev = result.properties?.['security-severity'];
          if (sev === undefined) continue;

          const score = parseFloat(sev);
          if (isNaN(score)) continue;

          if (score >= threshold) {
            findings.push({ name: file, score });
          }
        }
      }
    } else if (file.endsWith('.json')) {
      const items = data.findings || data.vulnerabilities || [];
      for (const item of items) {
        const rawScore = item.cvss || item.score;
        if (rawScore === undefined) continue;

        const score = parseFloat(rawScore);
        if (isNaN(score)) continue;

        if (score >= threshold) {
          findings.push({ name: file, score });
        }
      }
    }
  }

  return findings;
}

// Ensure the code runs when executed directly
import { fileURLToPath } from 'node:url';

if (process.argv[1] && fileURLToPath(import.meta.url) === process.argv[1]) {
  const thresholdStr = process.env.FAIL_ON_CVSS;
  if (!thresholdStr) {
    console.error("FAIL_ON_CVSS environment variable is required.");
    process.exit(1);
  }
  const threshold = parseFloat(thresholdStr);
  if (isNaN(threshold)) {
    console.error("FAIL_ON_CVSS must be a number.");
    process.exit(1);
  }

  const reportsDir = "/workspace/reports";
  const findings = checkReports(reportsDir, threshold);

  if (findings.length > 0) {
    for (const finding of findings) {
      console.error(`${finding.name}: CVSS ${finding.score} >= ${threshold}`);
    }
    process.exit(1);
  } else {
    console.log(`No findings met or exceeded CVSS ${threshold}`);
  }
}
@@ -0,0 +1,68 @@
import * as fs from 'node:fs';
import * as path from 'node:path';
import { fileURLToPath } from 'node:url';

export async function uploadReports() {
  const baseUrl = (process.env.DEFECTDOJO_URL || "").replace(/\/$/, "");
  const apiToken = process.env.DEFECTDOJO_API_TOKEN;
  const productName = process.env.DEFECTDOJO_PRODUCT_NAME || "agentguard-ci";

  if (!baseUrl || !apiToken) {
    console.error("DEFECTDOJO_URL and DEFECTDOJO_API_TOKEN must be set.");
    process.exit(1);
  }

  const scanMap: Record<string, string> = {
    ".sarif": "SARIF",
    ".json": "Generic Findings Import",
  };

  const reportsDir = "/workspace/reports";
  if (!fs.existsSync(reportsDir)) {
    console.log("No reports directory found.");
    return;
  }

  const files = fs.readdirSync(reportsDir).sort();

  for (const file of files) {
    const fullPath = path.join(reportsDir, file);
    if (!fs.statSync(fullPath).isFile()) continue;

    const ext = path.extname(file);
    const scanType = scanMap[ext];
    if (!scanType) continue;

    console.log(`Uploading ${file} as ${scanType}...`);

    try {
      const response = await fetch(`${baseUrl}/api/v2/import-scan/`, {
        method: "POST",
        headers: {
          "Authorization": `Token ${apiToken}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          scan_type: scanType,
          product_name: productName,
          file_name: file,
        })
      });

      if (!response.ok) {
        const text = await response.text();
        console.error(`Failed to upload ${file}: ${response.status} ${response.statusText} - ${text}`);
        process.exitCode = 1;
      } else {
        console.log(`Successfully uploaded ${file}`);
      }
    } catch (e) {
      console.error(`Network error uploading ${file}:`, e);
      process.exitCode = 1;
    }
  }
}

if (process.argv[1] && fileURLToPath(import.meta.url) === process.argv[1]) {
  uploadReports();
}
@@ -0,0 +1,14 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"]
}