- Create root directory structure under `c:\Java Developer\DAST\`
- Create `.gitignore` (node_modules, reports/, *.tar, .env, /tmp/)
- Create `LICENSE` (Apache-2.0)
- Initialize git repo (`git init`)
- Initial commit: "chore: initialize ZeroDAST project scaffold"
- Create `demo-app/package.json`
  - Dependencies: express, pg, jsonwebtoken, bcryptjs, swagger-jsdoc, swagger-ui-express, helmet, cors
  - Dev dependencies: eslint, jest (for CI lint/test steps)
  - Scripts: start, lint, test
- Create `demo-app/.dockerignore` (node_modules, npm-debug.log, .git)
- Create `demo-app/src/index.js`
  - Express app setup with JSON body parsing
  - Mount all route files
  - Swagger UI at `/api-docs`
  - OpenAPI JSON spec at `/v3/api-docs`
  - Port from `PORT` env var (default 8080)
  - Global error handler middleware
- Create `demo-app/src/db.js`
  - PostgreSQL Pool from `DATABASE_URL`
  - Connection retry logic (up to 10 retries, 2s interval)
- Create `demo-app/src/swagger.js`
  - swagger-jsdoc config with OpenAPI 3.0
  - Security scheme: Bearer JWT
  - Server URL: `http://localhost:8080`
- Create `demo-app/src/middleware/auth.js`
  - JWT verification from Authorization header
  - Extract userId and role to `req.user`
  - Return 401 on missing/invalid token
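The middleware itself will use the jsonwebtoken library; for intuition, here is a stdlib-only Python sketch of what HS256 JWT signing and verification amount to. Function names are illustrative and not part of the repo.

```python
import base64
import hashlib
import hmac
import json

def b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url without padding; restore padding before decoding
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def sign_hs256(payload: dict, secret: str) -> str:
    """Produce a compact HS256 JWT (illustrative helper)."""
    def enc(obj) -> str:
        raw = json.dumps(obj, separators=(",", ":")).encode()
        return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()
    signing_input = f"{enc({'alg': 'HS256', 'typ': 'JWT'})}.{enc(payload)}"
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{base64.urlsafe_b64encode(sig).rstrip(b'=').decode()}"

def verify_hs256(token: str, secret: str):
    """Return the payload dict if the signature checks out, else None."""
    try:
        header_b64, payload_b64, sig_b64 = token.split(".")
    except ValueError:
        return None  # malformed token -> the middleware would respond 401
    expected = hmac.new(
        secret.encode(),
        f"{header_b64}.{payload_b64}".encode(),
        hashlib.sha256,
    ).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        return None  # bad signature -> 401
    return json.loads(b64url_decode(payload_b64))
```

A failed verification maps to the 401 path; a successful one yields the claims that the middleware would copy onto `req.user`.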
- Create `demo-app/src/middleware/errorHandler.js`
  - INTENTIONAL VULN: returns stack trace in error responses (info disclosure)
  - Add `// codeql[js/stack-trace-exposure]` suppression comment
- Create `demo-app/src/routes/health.js`
  - `GET /health` returns `{ status: "ok", timestamp: ... }`
- Create `demo-app/src/routes/auth.js`
  - `POST /api/auth/register` - bcrypt hash, insert user, return JWT
  - `POST /api/auth/login` - verify credentials, return JWT
  - INTENTIONAL VULN: verbose error on login failure (user enumeration)
  - JSDoc OpenAPI annotations on each endpoint
- Create `demo-app/src/routes/users.js`
  - `GET /api/users` - list all users (admin only)
  - `GET /api/users/:id` - get user profile
  - `PUT /api/users/:id` - update profile
  - INTENTIONAL VULN: IDOR - no ownership check on `:id`
  - JSDoc OpenAPI annotations with example IDs
- Create `demo-app/src/routes/documents.js`
  - `GET /api/documents` - list user's documents
  - `GET /api/documents/:id` - get document by ID
  - `POST /api/documents` - create document
  - `DELETE /api/documents/:id` - delete document
  - INTENTIONAL VULN: IDOR - no ownership check on `:id`
  - JSDoc OpenAPI annotations with example IDs
- Create `demo-app/src/routes/search.js`
  - `GET /api/search?q=<query>` - search documents
  - INTENTIONAL VULN: SQL Injection - query concatenated into SQL string (not parameterized)
  - Add `// codeql[js/sql-injection]` suppression comment
  - `GET /api/search/preview?q=<query>` - returns HTML with unsanitized query
  - INTENTIONAL VULN: Reflected XSS - query echoed in HTML response
  - JSDoc OpenAPI annotations
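For intuition on why the search canary is exploitable, here is the same bug class sketched in Python with sqlite3 (the demo app itself uses pg; table contents and names here are illustrative): the vulnerable form concatenates the query into the SQL string, the safe form passes it as a parameter.

```python
import sqlite3

def setup_db() -> sqlite3.Connection:
    # illustrative in-memory table standing in for the documents table
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, title TEXT, visibility TEXT)")
    conn.executemany(
        "INSERT INTO documents (title, visibility) VALUES (?, ?)",
        [("public notes", "public"), ("secret plans", "private")],
    )
    return conn

def search_vulnerable(conn, q: str):
    # INTENTIONAL VULN pattern: user input concatenated into the SQL string
    sql = "SELECT title FROM documents WHERE visibility = 'public' AND title LIKE '%" + q + "%'"
    return [row[0] for row in conn.execute(sql)]

def search_safe(conn, q: str):
    # Parameterized form: the driver treats q strictly as data
    sql = "SELECT title FROM documents WHERE visibility = 'public' AND title LIKE ?"
    return [row[0] for row in conn.execute(sql, ("%" + q + "%",))]
```

With a payload that breaks out of the LIKE literal, the vulnerable form leaks private rows while the parameterized form does not; this is exactly the signal the ZAP SQLi rule and the canary check look for.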
- Create `demo-app/Dockerfile`
  - Multi-stage build (builder -> production)
  - `FROM node:20-alpine` for both stages
  - `npm install --omit=dev` in builder
  - Non-root user in production stage (`USER node`)
  - `HEALTHCHECK` using `wget -qO-` (Alpine lacks curl)
  - `EXPOSE 8080`
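A sketch of what that multi-stage Dockerfile might look like (paths and the healthcheck URL are assumptions based on the items above):

```dockerfile
# builder stage: install production dependencies only
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm install --omit=dev

# production stage: copy app + deps, run as non-root
FROM node:20-alpine
WORKDIR /app
COPY --from=builder /app/node_modules ./node_modules
COPY src ./src
USER node
EXPOSE 8080
# wget rather than curl, since Alpine base lacks curl
HEALTHCHECK --interval=30s --timeout=3s \
  CMD wget -qO- http://localhost:8080/health || exit 1
CMD ["node", "src/index.js"]
```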
- Git commit: "feat(demo-app): add Express app with intentional vulnerability surfaces for DAST validation"
- Create `db/seed/schema.sql`
  - `CREATE TABLE users` (id SERIAL, email, name, password_hash, role, created_at)
  - `CREATE TABLE documents` (id SERIAL, user_id FK, title, content, visibility, created_at)
  - `CREATE TABLE organizations` (id SERIAL, name, owner_id FK)
  - `CREATE TABLE api_tokens` (id SERIAL, user_id FK, token, scope, expires_at)
  - Indices on foreign keys
- Create `db/seed/mock_data.sql`
  - 3 users: alice (user), bob (user), admin (admin)
  - bcrypt hashes for password `Test123!`
  - 6 documents across users (mixed public/private visibility)
  - Alice's private docs (IDs 1-2), Bob's private docs (IDs 4-5)
  - 2 organizations
  - API tokens: valid, expired, admin-scoped
  - All data obviously fake (`@test.local` emails)
- Create `db/seed/overlay.sql.example`
  - Template with comments: what's allowed vs forbidden
  - Example INSERT for adding test data for a new feature
  - Clear explanation of validation rules
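A minimal sketch of what the overlay template could contain (table and column names follow the schema items above; the forbidden examples mirror the validator's test cases):

```sql
-- Allowed: plain INSERT with literal values only
INSERT INTO documents (user_id, title, content, visibility)
VALUES (1, 'Feature X fixture', 'placeholder body', 'private');

-- Forbidden: INSERT driven by a subquery (validator rejects SELECT inside INSERT)
-- INSERT INTO users SELECT * FROM pg_shadow;

-- Forbidden: psql meta-commands (rejected before parsing)
-- \copy users FROM '/etc/passwd'
```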
- Create `db/seed/validate_overlay.py`
  - `import pglast` for AST parsing
  - File size check (reject > 100KB)
  - psql meta-command detection (`\copy`, `\!`, etc.)
  - URL/IP pattern detection in raw content
  - Parse SQL into AST via `pglast.parse_sql()`
  - Statement whitelist: InsertStmt, CreateStmt, IndexStmt, AlterTableStmt (ADD only)
  - Deep INSERT inspection:
    - Reject subqueries (SELECT inside INSERT values)
    - Reject RETURNING clause
    - Reject ON CONFLICT DO UPDATE
    - Reject CTEs (WITH clauses)
  - Dangerous function blacklist (pg_read_file, dblink, CHR, lo_import, etc.)
  - Dollar-quoting obfuscation detection
  - Clear error messages with offending statement excerpt
  - Exit code 0 = valid, 1 = rejected
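The AST checks need pglast, but the pre-parse gates (size limit, psql meta-commands, URL/IP literals, dollar-quoting) can be sketched with the stdlib alone. This is a simplified stand-in for illustration, not the real validator:

```python
import re

MAX_BYTES = 100 * 1024  # reject overlays larger than 100KB

# psql meta-commands like \copy or \! never belong in a data overlay
META_COMMAND = re.compile(r"^\s*\\[a-zA-Z!?]", re.MULTILINE)
# crude URL / dotted-quad detection in raw content
URL_OR_IP = re.compile(r"https?://|\b\d{1,3}(?:\.\d{1,3}){3}\b")
# dollar-quoted strings ($$...$$ or $tag$...$tag$) can hide forbidden keywords
DOLLAR_QUOTE = re.compile(r"\$[A-Za-z_]*\$")

def precheck_overlay(sql_text: str):
    """Return a list of rejection reasons; an empty list means the cheap gates pass."""
    problems = []
    if len(sql_text.encode()) > MAX_BYTES:
        problems.append("file larger than 100KB")
    if META_COMMAND.search(sql_text):
        problems.append("psql meta-command detected")
    if URL_OR_IP.search(sql_text):
        problems.append("URL or IP literal detected")
    if DOLLAR_QUOTE.search(sql_text):
        problems.append("dollar-quoting detected (possible obfuscation)")
    return problems
```

Only files that pass these gates would proceed to `pglast.parse_sql()` and the statement-whitelist walk.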
- Create `tests/test_validate_overlay.py`
  - Test: valid INSERT passes
  - Test: valid CREATE TABLE passes
  - Test: valid CREATE INDEX passes
  - Test: INSERT with SELECT FROM pg_shadow -> REJECTED
  - Test: INSERT with CTE -> REJECTED
  - Test: CREATE FUNCTION -> REJECTED
  - Test: dollar-quoting obfuscation -> REJECTED
  - Test: CHR() concatenation -> REJECTED
  - Test: comment-obfuscated keywords -> REJECTED
  - Test: INSERT RETURNING -> REJECTED
  - Test: ON CONFLICT DO UPDATE -> REJECTED
  - Test: psql meta-commands -> REJECTED
  - Test: URLs in data -> REJECTED
  - Test: file > 100KB -> REJECTED
  - Test: DROP TABLE -> REJECTED
- Git commit: "feat(db): add schema, mock data, overlay system, and AST-based SQL validator with 15+ bypass test cases"
- Create `security/run-dast-env.sh`
  - `docker network create --internal dast-net`
  - Start PostgreSQL on `dast-net`
  - Start hardened app container on `dast-net`
  - Start ZAP on `dast-net`
  - Handle ZAP exit codes (0, 2, 3 = expected; >3 = crash)
  - Cleanup via trap on exit
  - `--cap-drop=ALL`
  - `--security-opt=no-new-privileges:true`
  - `--read-only`
  - `--tmpfs /tmp:rw,noexec,nosuid,size=100m`
  - `--user 1000:1000`
  - `--memory=1g --memory-swap=1g`
  - `--pids-limit=512`
  - `--rm` for auto-cleanup
  - Pass `DATABASE_URL` and `JWT_SECRET` as env vars
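Combined, the hardening flags amount to a `docker run` invocation along these lines (the image tag and container name are illustrative):

```shell
docker run --rm \
  --network dast-net \
  --cap-drop=ALL \
  --security-opt=no-new-privileges:true \
  --read-only \
  --tmpfs /tmp:rw,noexec,nosuid,size=100m \
  --user 1000:1000 \
  --memory=1g --memory-swap=1g \
  --pids-limit=512 \
  -e DATABASE_URL="$DATABASE_URL" \
  -e JWT_SECRET="$JWT_SECRET" \
  --name untrusted-app \
  zerodast/demo-app:pr  # illustrative image tag
```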
- Create `scripts/bootstrap-auth.sh`
  - Accept APP_URL parameter (default: `http://untrusted-app:8080`)
  - Login as alice@test.local
  - Parse JSON response with `jq` or Node.js fallback
  - v2 FIX: validate token extraction (exit 1 if login fails)
  - Save token to `/tmp/zap-auth-token.txt`
  - Also bootstrap Bob's token for authz tests
  - Save Bob's token to `/tmp/zap-auth-token-bob.txt`
- Create `scripts/delta-detect.sh`
  - Read changed files via `git diff --name-only origin/main...HEAD`
  - Route file pattern matching (`*/routes/*`, `*/controllers/*`)
  - v2 FIX: regex `\.(get|post|put|delete|patch)\s*\(` (leading dot!)
  - Core file detection triggers FULL scan (middleware, db, index, Dockerfile)
  - Fail-safe: if no routes found AND no core changes -> FULL
  - Deduplicate and output endpoint paths
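The leading-dot fix is easiest to see in code. A Python sketch of the matching step (the real script is bash; the quoting details here are assumptions):

```python
import re

# v2 FIX: the leading \. ensures we match method calls like router.get(...) or
# app.post(...), not bare words such as "get(" in unrelated code
ROUTE_CALL = re.compile(r"\.(get|post|put|delete|patch)\s*\(\s*['\"]([^'\"]+)['\"]")

def extract_endpoints(source: str):
    """Return deduplicated endpoint paths found in an Express route file."""
    seen, paths = set(), []
    for _method, path in ROUTE_CALL.findall(source):
        if path not in seen:
            seen.add(path)
            paths.append(path)
    return paths
```

Without the leading `\.`, a helper named `get(...)` in any changed file would be misread as a route and silently shrink the scan surface.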
- Create `scripts/generate-delta-scan.sh`
  - Read delta endpoint list from file
  - Generate ZAP Automation Framework YAML with `includePaths` regexes
  - If input is "FULL", copy full `automation.yaml` instead
  - Output to `/tmp/zap-config.yaml`
- Create `scripts/authz-tests.sh`
  - Login as Alice, login as Bob
  - Alice tries Bob's private document -> 200/204 means IDOR detected
  - Bob tries DELETE on Alice's document -> 200/204 means IDOR detected
  - Bob tries to update Alice's user profile
  - Output: list of IDOR findings
  - Exit 0 always (demo/default mode)
  - Support `EXPECT_IDOR=true|false` for demo vs hardened apps
- Create `scripts/verify-canaries.sh`
  - Read `reports/zap-report.json`
  - Check for expected findings: "SQL Injection", "Cross Site Scripting", "Application Error Disclosure"
  - Found -> pass; missing -> fail pipeline with coverage gap message
- Create `scripts/parse-zap-report.js`
  - Read ZAP JSON report
  - Count by risk level (Critical/High/Medium/Low/Informational)
  - v2 FIX: configurable fail level via `ZAP_FAIL_LEVEL` env var (default: High)
  - Generate markdown summary table for PR comment
  - Return exit code 1 if findings exceed fail level
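The thresholding logic can be sketched as follows, in Python for brevity (the real script is Node, and the alert dicts here are a simplified stand-in for the entries in ZAP's report JSON):

```python
RISK_ORDER = ["Informational", "Low", "Medium", "High", "Critical"]

def summarize(alerts, fail_level="High"):
    """Count findings per risk level and decide pass/fail.

    `alerts` is a list of {"risk": ...} dicts -- a simplified stand-in for
    the alert entries in ZAP's JSON report.  Returns (counts, should_fail),
    where should_fail=True corresponds to exit code 1.
    """
    counts = {level: 0 for level in RISK_ORDER}
    for alert in alerts:
        risk = alert.get("risk", "Informational")
        if risk in counts:
            counts[risk] += 1
    # fail when any finding is at or above the configured level (ZAP_FAIL_LEVEL)
    threshold = RISK_ORDER.index(fail_level)
    failing = sum(counts[level] for level in RISK_ORDER[threshold:])
    return counts, failing > 0
```

Raising `fail_level` to "Critical" is how the intentionally vulnerable demo repo can report without hard-failing, while a hardened consumer keeps the default "High".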
- Create `tests/test_delta_detect.sh`
  - Test: `router.get('/api/users')` matches -> extracts `/api/users`
  - Test: `app.post('/api/auth/login')` matches -> extracts `/api/auth/login`
  - Test: middleware change triggers FULL
  - Test: Dockerfile change triggers FULL
  - Test: non-route JS file -> no match -> FULL (fail-safe)
- Git commit: "feat(security): add Docker network isolation, container hardening, delta detection, authz tests, and canary verification"
- Create `security/zap/.zap-version`
  - Contains pinned version number (e.g., `2.17.0`)
- Create `security/zap/automation.yaml`
  - v2 FIX: `env.vars.AUTH_TOKEN` section for OS env passthrough
  - Context: `zerodast-target` pointing to app URL
  - Job: `openapi` - import from `/v3/api-docs` with `targetUrl` override to `http://untrusted-app:8080`
  - Job: `replacer` - runtime-baked Bearer token injection via `REQ_HEADER`
  - Job: `requestor` - seed `/api/debug/error` and `/api/search/preview`
  - Job: `spider` - discover additional reachable URLs
  - Job: `passiveScan-wait` - v2 NEW: passive scan before active (maxDuration: 2 min)
  - Job: `activeScan` - 8 threads, v2 FIX: delayInMs: 50 (not 0), maxScanDuration: 30 min, tuned SQLi/XSS rules
  - Job: `report` - JSON format to `/zap/wrk/zap-report.json`
  - Job: `report` - HTML format to `/zap/wrk/zap-report.html`
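A skeleton of what that Automation Framework file might look like; treat the parameter names as assumptions to verify against the pinned ZAP version's Automation Framework docs:

```yaml
env:
  contexts:
    - name: zerodast-target
      urls:
        - http://untrusted-app:8080
jobs:
  - type: passiveScan-wait
    parameters:
      maxDuration: 2            # minutes
  - type: activeScan
    parameters:
      context: zerodast-target
      threadPerHost: 8
      delayInMs: 50             # v2 FIX: not 0
      maxScanDurationInMins: 30
  - type: report
    parameters:
      template: traditional-json
      reportDir: /zap/wrk
      reportFile: zap-report.json
```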
- Create `security/zap/scan-policy.yaml`
  - KEEP rules: XSS, SQLi, CORS, traversal-style checks
  - Filter the policy toward the demo tech stack and away from irrelevant stacks
- Create `security/zap/.zap-baseline.json`
  - Suppress known informational alerts from the demo app
  - Document each suppression with rationale
- Git commit: "feat(zap): add pinned ZAP config with passive+active scan, auth injection, and tech-stack filtering"
- Create `.github/workflows/ci.yml`
  - Name: `CI Tests`
  - Trigger: `pull_request` on `main`
  - Permissions: `contents: read` only
  - Concurrency group with cancel-in-progress
  - Checkout with `fetch-depth: 0`
  - Use `working-directory: demo-app` for npm steps
  - Install dependencies with `npm ci`
  - Lint with `npm run lint`
  - Test with `npm test`
  - Run Semgrep with pinned SHA
  - Run Gitleaks with pinned SHA
  - Detect delta endpoints and save artifact file
  - Build Docker image for the PR SHA
  - `docker save` the image tarball
  - Upload artifact bundle with `retention-days: 1`
  - All actions pinned by SHA
- Create `.github/workflows/dast-pr.yml`
  - Name: `DAST PR Scan`
  - Trigger: `workflow_run` of `CI Tests`
  - Concurrency group by head SHA with cancel-in-progress
  - `dast-scan` job on `ubuntu-22.04`
  - `timeout-minutes: 15`
  - Permissions: `actions: read`, `contents: read`
  - Condition: only run for successful PR-triggered CI runs
  - Checkout trusted `main`
  - Download cross-workflow PR artifact bundle using `github-token` and `run-id`
  - Install `pglast` and validate `overlay.sql` if present in artifacts
  - Pre-pull Postgres and ZAP images
  - `docker load` the PR image tarball
  - Generate delta or full ZAP config from artifact input
  - Seed DB with schema, mock data, and optional overlay
  - Bootstrap auth token before ZAP
  - Run ZAP in the isolated runtime environment
  - Run authz tests post-scan
  - Run canary verification only for `FULL` scans
  - Upload DAST reports and summary artifacts
  - `report-results` job runs on a separate runner
  - Comment on the PR via `actions/github-script`
  - Report the findings summary in comments without hard-failing the intentionally vulnerable demo repo by default
- Create `.github/workflows/dast-nightly.yml`
  - Trigger on `push` to `main`
  - Trigger on nightly `schedule`
  - `timeout-minutes: 30`
  - Build the demo image from the current repo state
  - Run full scan with the same isolation/hardening layers
  - Bootstrap auth before ZAP
  - Run `verify-canaries.sh`
  - Run `authz-tests.sh`
  - Upload nightly report artifact
  - Create a GitHub issue when findings exceed threshold
- Git commit: "feat(ci): add 3-workflow DAST pipeline with privilege isolation, Docker network isolation, and configurable fail levels"
- Create `ai-prompts/INSPECT_REPO.md`
  - Step 1: identify tech stack from manifest files
  - Step 2: find API surface (route definitions)
  - Step 3: understand auth mechanism (JWT, session, OAuth2, API key)
  - Step 4: understand data model (migrations, schemas, ORM models)
  - Step 5: identify DAST config (Docker, env vars, health endpoints)
  - Output: structured YAML profile for other prompts to consume
- Create `ai-prompts/GENERATE_CONFIG.md`
  - Takes INSPECT_REPO output
  - Generates: ZAP automation YAML, seed data SQL, auth bootstrap script, docker-compose, scan policy
- Create `ai-prompts/ADAPT_AUTH.md`
  - Framework-specific auth bootstrap generators
  - Templates for: Express+JWT, FastAPI+OAuth2, Spring+Sessions, Go+APIKey
- Create `ai-prompts/ADAPT_SEED.md`
  - Schema-aware mock data generators
  - Reads migration files, generates INSERT statements covering all tables
- Create `ai-prompts/AI_TRIAGE.md` (v2 NEW)
  - Post-scan triage prompt
  - Input: zap-report.json + source code of the vulnerable endpoint
  - Output: root cause analysis + exact fix suggestion
- Git commit: "feat(ai): add 5 structured AI prompts for universal repo adaptation and post-scan triage"
- Create `README.md`
  - Project name, tagline, architecture diagram (ASCII art)
  - v2 FIX: "Self-benchmarked via AlphaSudo/sbtr-benchmark" (not "certified")
  - Quick Start (5 steps)
  - Comparison table: T1/T2/T3/T4
  - ⚠️ WARNING: demo app is intentionally vulnerable - never deploy to production
  - License badge, status badges
- Create `docs/ARCHITECTURE.md`
  - 3-layer defense model with diagrams
  - v2 FIX: "Privilege Isolation", not "Temporal Isolation", in the security layers
  - Docker `--internal` network explanation
  - Data flow between workflows
  - Speed lever explanations
- Create `docs/QUICK_START.md`
  - Prerequisites: Docker, Node.js 20+, Python 3.8+
  - Step-by-step local setup
  - Copy-paste YAML for different frameworks
  - Common pitfalls (CRLF line endings, workflow name mismatch)
- Create `docs/CONTRIBUTING_SECURITY.md`
  - How to write `overlay.sql` for new features
  - Allowed/forbidden statement whitelist
  - Why ON CONFLICT DO UPDATE is forbidden
  - What happens when validation fails
  - Sparse checkout path requirements
- Create `docs/SUPPLY_CHAIN_RULES.md`
  - 6-Rule Framework from AlphaSudo/sbtr-benchmark
  - How each rule is implemented in the workflows
  - Rule 4b exception for DAST (sandboxed binary crossing), documented
  - Mapping to SBTR benchmark tiers
- Create `docs/THREAT_MODEL.md`
  - All attack vectors (poisoned seed, poisoned code, container escape, token hijacking)
  - Mitigation for each
  - v2 NEW: fork PR behavior documented ("fork PRs get DAST only after merge - intentional")
  - Residual risk: hypervisor escape (GitHub's problem)
  - ZAP on `--internal` network nuance (trusted image, but documented)
- Create `docs/TIER_COMPARISON.md`
  - T1 vs T2 vs T3 (us) vs T4
  - v2 FIX: scores labeled as self-benchmarked
  - Scan time, security score, cost, complexity
  - When to upgrade from T3 to T4
- Create `docs/AI_GUIDED_SETUP.md`
  - How to use the AI prompts
  - Workflow: Inspect -> Generate -> Adapt -> Validate
  - Dry-run mode: verify auth bootstrap before the full scan
  - Examples for Node.js, Python, Java, Go
- Git commit: "docs: add comprehensive documentation covering architecture, security, setup, and AI-guided adaptation"
- Create `docker-compose.yml`
  - DB service: postgres:16-alpine with explicit healthcheck
  - App service: builds from `./demo-app`, depends on healthy DB
  - ZAP service: pinned version, `profiles: ["dast"]`, network mode
  - All env vars (DATABASE_URL, JWT_SECRET = throwaway values)
  - Port mappings: 5432 (DB), 8080 (App)
- Create `Makefile`
  - `make build` - build demo-app Docker image
  - `make up` - start DB + app, wait for healthy
  - `make seed` - run schema.sql + mock_data.sql
  - `make dast` - full local DAST (up + seed + auth + ZAP)
  - `make validate FILE=overlay.sql` - run AST validator
  - `make test` - run pytest for validator + bash tests for delta
  - `make authz` - run authz tests
  - `make clean` - docker compose down -v --remove-orphans
- Create `scripts/run-dast-local.sh`
  - Local wrapper runs build -> isolated runtime -> auth bootstrap -> ZAP scan -> verify canaries -> report
  - Trap handler for cleanup on exit/error
  - Print summary at end
- Git commit: "feat(dev): add docker-compose, Makefile, and local DAST runner for developer testing"
- Run `pip install pglast==6.* pytest`
- Run `pytest tests/test_validate_overlay.py -v` - all 15+ tests pass
- Run `bash tests/test_delta_detect.sh` - all regex tests pass
- Run integrated local DAST path (`make dast` / `scripts/run-dast-local.sh`) - build, app startup, DB seed, and full ZAP scan complete
- Verify ZAP finds: SQL Injection
- Verify ZAP finds: Cross Site Scripting
- Verify ZAP finds: Application Error Disclosure
- Run `bash scripts/verify-canaries.sh` - all canaries pass
- Run `bash scripts/authz-tests.sh` - IDOR surfaces confirmed
- Start app on `--internal` network
- From app container: `wget -qO- https://httpbin.org/get` fails
- From app container: TCP connect to `dast-db:5432` succeeds
- Verify `--read-only` works with the Node.js demo app (no crash)
- Verify `--pids-limit=512` is sufficient under ZAP load
- Verify `--memory=1g` is sufficient (no OOM kills)
- Submit `INSERT INTO users SELECT * FROM pg_shadow` - validator rejects
- Submit `CREATE FUNCTION evil()` - validator rejects
- Submit file with `\copy` command - validator rejects
- Submit file > 100KB - validator rejects
- Submit valid INSERT - validator accepts
- Verify all `.sh` files have Unix line endings (LF, not CRLF)
- Add `.gitattributes` if needed: `*.sh text eol=lf`
- Git commit: "test: verify all security layers, canary detection, and overlay bypass rejection"
- Review all `@<SHA>` placeholders - replace with real pinned SHAs
- Review that `// codeql[...]` suppression comments are present on all intentional vulns
- Review that the `workflow_run` workflow name matches exactly
- Review `.gitattributes` for line endings
- Run full `make dast` one final time - clean run
- Tag: `v0.1.0` - first local-validated release
- 3 consecutive successful E2E runs locally
- Test fork PR path (only ci.yml runs - no DAST, no secrets)
- Test merge to main (dast-nightly triggers full scan)
- Test malicious overlay PR (validator blocks)
- Measure actual CI time: target 5-9 min delta, 15-25 min full
- Create `SECURITY.md` for the public repo
- Push to AlphaSudo/zerodast
| Phase | Items | Est. Time |
|---|---|---|
| 0. Scaffolding | 5 | 15 min |
| 1. Demo App | 23 | ~4 hours |
| 2. DB Seed | 12 | ~2 hours |
| 3. Security Scripts | 15 | ~3 hours |
| 4. ZAP Config | 5 | ~3 hours |
| 5. Workflows | 14 | ~4 hours |
| 6. AI Prompts | 6 | ~2 hours |
| 7. Documentation | 10 | ~2 hours |
| 8. Local Tools | 4 | ~1 hour |
| 9. Verification | 12 | ~2 hours |
| 10. Polish | 5 | ~30 min |
| Total | 111 items | ~21 hours |