Getting started

Clarity matters, not the stack.

Create organization

Azure DevOps Organization = company boundary

Azure DevOps Project = product / platform boundary

A Project represents:

  • A product

  • Or a platform

Detailed

Inside a Project you get:

  • Repos

  • Pipelines

  • Boards

  • Artifacts

  • Test Plans

Each repo:

  • Owns one microservice

  • Has its own pipeline

  • Deploys independently

Create a Pipeline

Setup up to pipeline creation

Select the VCS

I chose GitHub

A popup window asks you to log in to GitHub (OAuth or username/password)

Select the repo

Select a template

This will prompt you to sign in to the Azure portal

We need to select an ACR, but it is not created yet; create it first, then attach it here

In AWS ECR we only create a repository; the registry already exists under your account ID

Now create the ACR

Save and run

The run failed; we need to add a self-hosted agent

ACR Details

Correct — registry name should be at project / org level, NOT microservice level.

You’re thinking in the right DevOps way 👍


🧱 How to think about Azure Container Registry naming

🔹 Azure Container Registry (ACR)

  • One ACR per project / product / org

  • Acts like a container image warehouse

Example: one registry for the whole platform, e.g. logisticsacr

Do NOT create: one registry per microservice (shippingratecalculatoracr, orderserviceacr, ...)

That becomes unmanageable.


🧩 Where microservice names go

Microservice names are used as image repositories inside ACR, not as the registry itself.

Example structure:

Image tags:
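A concrete illustration, using the logistics project and the shipping-rate-calculator service from this setup (the exact registry name and tag are assumptions):

```
Registry:    logisticsacr.azurecr.io                                      <- one per project/org
Repository:  logisticsacr.azurecr.io/shipping-rate-calculator             <- one per microservice
Image tag:   logisticsacr.azurecr.io/shipping-rate-calculator:1.0.3-dev   <- version / env / commit
```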


🧠 Best practice (industry standard)

| Level | Name example |
| --- | --- |
| Registry | project / company / environment |
| Repository | microservice name |
| Tag | version / env / commit |


Single project: one registry, e.g. logisticsacr

Multiple environments: either one registry per environment (e.g. logisticsdevacr, logisticsprodacr)

or a single registry with the environment carried in the image tag


✅ Final answer

✔️ Registry name = Project / Org / Platform, NOT microservice

You’re aligning with how AKS + CI/CD + enterprise DevOps setups work.

If you want, tell me:

  • Single project or multiple projects?

  • Dev / QA / Prod separation?

  • Using AKS or App Service?

I’ll suggest the exact naming convention you should follow.

Got it 👍 — this screenshot is showing the new “Domain name label scope” option in Azure Container Registry, and this is where the confusion usually comes from.

Let me explain exactly what this means, which option to choose, and why.


🔍 What is Domain name label scope in ACR?

This setting controls how Azure generates the registry DNS name and how reusable the name is across:

  • Tenants

  • Subscriptions

  • Security boundaries

It does NOT change how you use images day-to-day.


🧩 The options (simple explanation)

1️⃣ Unsecure (default – most common)

What it does

  • Registry DNS is created directly from the registry name

  • Format: <registry-name>.azurecr.io

  • Example: logisticsacr.azurecr.io

Characteristics

  • Name must be globally unique

  • If someone else already used the name → you must change it

  • No hashing or tenant binding

Best for

  • Labs

  • Dev/Test

  • Most normal projects

  • Learning, demos, small teams

➡️ This is what you should select in 90% of cases


2️⃣ Tenant Reuse

What it does

  • Azure appends a hash tied to your Azure AD tenant

  • Same registry name can be reused inside the same tenant

Example: the DNS becomes something like logisticsacr-<hash>.azurecr.io

Best for

  • Enterprises with multiple subscriptions

  • Central IT governance

  • Name collision avoidance inside tenant


3️⃣ Subscription Reuse

What it does

  • Hash is tied to subscription

  • Same name reusable within subscription only

Example: again a hashed DNS name, but with the hash derived from the subscription

Best for

  • Very large orgs

  • Multiple environments per subscription


🏆 What YOU should choose (based on your setup)

From your screenshot and context:

  • Single project (logistics)

  • Standard ACR

  • Likely DevOps / AKS usage

  • No enterprise name collision problem

Your resulting registry: a plain <registry-name>.azurecr.io DNS name, e.g. logisticsacr.azurecr.io

This is clean, standard, and industry-common.
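The registry above was created through the portal; for reference, a rough CLI equivalent (the resource group name logistics-rg is an assumption):

```bash
# Create the registry once per project/platform
# (the name must be globally unique with the "Unsecure" domain name label scope)
az acr create \
  --resource-group logistics-rg \
  --name logisticsacr \
  --sku Standard

# Log in from a machine that has Docker + Azure CLI, e.g. the build agent
az acr login --name logisticsacr
```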


🚫 Common misconception

❌ This has nothing to do with security of images

❌ This does not expose your registry publicly

❌ Access is still controlled by:

  • Azure AD

  • RBAC

  • Network rules / Private Endpoint

“Unsecure” is just a naming term, not a security weakness.


✅ Final answer (short)

  • ✔️ Use Unsecure

  • ✔️ Registry name = project / platform

  • ✔️ Microservices go inside the registry, not as registry names


If you want, next I can help you with:

  • ACR → AKS pull permissions

  • Azure DevOps pipeline push to ACR

  • Repo + tag naming standard (service:env-version)

Just tell me what’s next 👌

Self-hosted agents

Go to Project settings and click on Agent pools

Click on New agent

It must not run with sudo

Create a user first

Error: the agent doesn't have Docker permission; add the agent user to the docker group,

and a restart may solve the problem
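A minimal sketch of the Docker-permission fix (the agent user name azagent is an assumption):

```bash
# Add the agent's user to the docker group so pipeline steps can talk to the Docker daemon
sudo usermod -aG docker azagent

# Group membership only applies to new sessions, so restart the agent
# (or log the user out and back in) before rerunning the pipeline
```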

patil123

Enter the server URL: https://dev.azure.com/techwithpatil

Enter the PAT

Test in interactive mode first:

Add it as a daemon service to make it persistent

Run the agent as a daemon service
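Roughly, the commands on the agent machine look like this (run as the non-root agent user from the extracted agent directory; the user name azagent is an assumption):

```bash
# Interactive configuration: prompts for the server URL (https://dev.azure.com/techwithpatil),
# the PAT, the agent pool and the agent name
./config.sh

# Interactive mode: fine for a first test, but it stops when the shell closes
./run.sh

# Persistent mode: install and start the agent as a systemd service
sudo ./svc.sh install azagent
sudo ./svc.sh start
sudo ./svc.sh status
```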

Grant the pipeline permission to use the agent

Error: "Pipeline does not have permissions to use the referenced pool"

Go to the pool > Security > add this pipeline

Adding the pipeline will auto-save

Now rerun the pipeline

Use the agent

There is no extra setup on the agent server

I already had Docker installed and had run the commands above

Now it's working

We are done with the CI stage

Edit Pipeline

When you edit the pipeline for the first time, it asks you to authenticate to GitHub via OAuth,

and we get the pipeline YAML file (sketch below).
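The generated file depends on the template you picked; a minimal sketch of a Docker build-and-push pipeline (pool and service connection names are assumptions):

```yaml
trigger:
  - main

pool:
  name: self-hosted-pool          # the agent pool created under Project settings

steps:
  - task: Docker@2
    displayName: Build and push image to ACR
    inputs:
      containerRegistry: 'logisticsacr-connection'   # ACR service connection (assumed name)
      repository: 'shipping-rate-calculator'
      command: buildAndPush
      Dockerfile: '**/Dockerfile'
      tags: |
        $(Build.BuildId)
```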

SonarQube setup
SonarQube integration

  • Code Quality Gate – SonarQube

  • Security Scanning

    • Dependency & code vulnerabilities (Snyk)

    • Container image scan (Snyk)

How enterprises do it

  • 1 Git repo = 1 SonarQube project

  • Each microservice has:

    • Its own repo

    • Its own pipeline

    • Its own Sonar project

Install the SonarQube extension from the Azure Marketplace

Go to Organization settings > Extensions

Select the organization and install

Now add the SonarQube server to use with the extension

Add the Sonar server URL, i.e. qa.techwithpatil.com

Generate a token and add it

Account > Security

Select the service connection that we created

Select "Use standalone SonarScanner CLI" for Python and other languages

Leave the scanner version empty

Add the analysis scan task

Now add the quality gate task

Yes, you must install Java (Java 17 recommended)

Why:

  • SonarScanner is a Java application

  • Azure DevOps downloads SonarScanner automatically

  • But it cannot run without Java

Your current error is 100% because Java is missing, not SonarScanner.
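On an Ubuntu self-hosted agent, installing Java 17 is typically enough (a JRE suffices for the scanner; assuming an apt-based system):

```bash
sudo apt-get update
sudo apt-get install -y openjdk-17-jre-headless
java -version   # verify the agent can now run SonarScanner
```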

Sonar scan: all options / scan types

Good screenshot 👍 You’ve installed the SonarQube extension correctly. Now the key question:

Which SonarQube task(s) should I select for a production pipeline?

✅ Correct enterprise answer (Azure DevOps)

You must use ALL THREE — in a specific order.


✅ The ONLY correct order (mandatory)

From the right-side Tasks panel, select these three tasks, in this order:

1️⃣ Prepare Analysis Configuration

📌 This is mandatory

  • Task name in UI: Prepare Analysis Configuration

  • What it does:

    • Connects pipeline → SonarQube server

    • Injects auth + project metadata

    • Sets up scanner environment

👉 Without this, nothing works


2️⃣ Run Code Analysis

📌 This actually runs the scanner

  • Task name in UI: Run Code Analysis

  • What it does:

    • Executes SonarScanner

    • Scans your Python code

    • Uploads results to SonarQube

👉 This is where analysis happens


3️⃣ Publish Quality Gate Result

📌 This makes it production-grade

  • Task name in UI: Publish Quality Gate Result

  • What it does:

    • Waits for SonarQube Quality Gate

    • Fails pipeline if gate fails

👉 This is what blocks bad code from prod


🔁 Visual flow (what Azure DevOps does internally)

Prepare Analysis Configuration → (build / test steps) → Run Code Analysis → Publish Quality Gate Result
❌ What NOT to do (common mistakes)

❌ Selecting only Run Code Analysis

❌ Skipping Publish Quality Gate Result

❌ Putting SonarQube steps after Docker build

❌ Running Sonar after image push


✅ Where to place these tasks in YOUR pipeline

Based on your current pipeline, the SonarQube steps should be placed before the Docker build and push stages.


When configuring Prepare Analysis Configuration, select:

  • SonarQube server → your service connection

  • Scanner mode → CLI

  • Config mode → Manual

  • Project key → shipping-rate-calculator

  • Project name → shipping-rate-calculator

  • Sources → .

This matches enterprise CLI usage, even though it’s a UI task.


⚠️ Important clarification (very important)

Even though the UI says “tasks”, internally Azure DevOps still uses:

SonarScanner CLI

So:

  • You are NOT doing anything wrong

  • This is enterprise-approved

  • Most Azure DevOps orgs use these 3 tasks, not raw sonar-scanner commands


✅ Final recommendation (follow this)

✔ Select Prepare Analysis Configuration

✔ Then Run Code Analysis

✔ Then Publish Quality Gate Result

✔ Place them before Docker build

✔ Enforce Quality Gate failure
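As a YAML sketch of those three UI tasks (task major versions vary with the installed extension, so check yours; project key/name are the ones used in this setup):

```yaml
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'kvm-sonarqube'              # SonarQube service connection
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'shipping-rate-calculator'
    cliProjectName: 'shipping-rate-calculator'
    cliSources: '.'

# ... build / test steps go here; Docker build comes AFTER the Sonar steps ...

- task: SonarQubeAnalyze@5

- task: SonarQubePublish@5
  inputs:
    pollingTimeoutSec: '300'                # wait for the Quality Gate result
```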


Next step (I strongly recommend)

If you want, I can:

  • Map these UI tasks → exact YAML

  • Tune SonarQube for Python (rules + exclusions)

  • Add PR analysis

  • Combine Sonar + Snyk cleanly

Just tell me what you want next 👍

Sonar scan: language types and options

This screen is the most important decision point — and for a Python project, the choice is 100% clear.


✅ What to select for a Python project

👉 Select:

✅ “Use standalone SonarScanner CLI”

Do NOT select

  • Integrate with .NET

  • Integrate with Maven or Gradle

Those are only for:

  • .NET → C#, ASP.NET

  • Maven/Gradle → Java


✅ Correct configuration (fill exactly like this)

1️⃣ SonarQube Server Endpoint

✔ Already correct


2️⃣ Choose the way to run the analysis

Select: "Use standalone SonarScanner CLI"

This is the enterprise-standard option for:

  • Python

  • Node.js

  • Go

  • Terraform

  • Mixed-language repos


3️⃣ Scanner Mode (after selecting CLI)

You will now see CLI options instead of .NET fields.

Fill them like this 👇

🔹 Config Mode: Manual

🔹 Project Key: shipping-rate-calculator

⚠️ Must exactly match the project key in SonarQube UI

🔹 Project Name: shipping-rate-calculator

🔹 Sources: .


If you see Additional Properties, add:

(You can later move this into sonar-project.properties)


✅ Final correct setup summary (Python)

| Field | Value |
| --- | --- |
| Server | kvm-sonarqube |
| Analysis mode | ✅ Standalone SonarScanner CLI |
| Config mode | Manual |
| Project key | shipping-rate-calculator |
| Sources | . |


🔁 Correct task order (don’t miss this)

After this task, you must also add:

  1. Run Code Analysis

  2. Publish Quality Gate Result

Pipeline order must be:

Prepare Analysis Configuration → Run Code Analysis → Publish Quality Gate Result → Docker build / push


❌ Common mistakes to avoid (very important)

❌ Choosing Integrate with .NET for Python

❌ Skipping Publish Quality Gate Result

❌ Running Sonar after Docker build

❌ Using Maven/Gradle for non-Java projects


Why this is the enterprise choice

  • SonarScanner CLI works for any language

  • Same approach across all microservices

  • Matches what big orgs use with Azure DevOps

  • Easy to standardize & template


If you want, I can now:

  • Convert this UI config → clean YAML

  • Add pytest + coverage properly

  • Validate your SonarQube project setup

  • Add PR-based analysis

Just tell me 👍

3️⃣ Project Key – what EXACT value?

✅ Use: shipping-rate-calculator

Rules (very important):

  • Must be globally unique in SonarQube

  • Must exactly match the project you created in SonarQube UI

  • Cannot contain spaces

📌 Good enterprise pattern: <project>-<service>

Example: logistics-shipping-rate-calculator

(You can rename later if needed)


4️⃣ Project Name – what to add?

✅ Use: shipping-rate-calculator

Why:

  • Human-readable name in dashboard

  • Can contain spaces (but keep same as repo)


5️⃣ Project Version – what to use?

✅ Use the pipeline build number, e.g. $(Build.BuildNumber), or the Git commit SHA

Enterprise reason:

  • Every scan is traceable

  • Easy to correlate with pipeline runs


6️⃣ Sources – what to put?

✅ Use: .

Meaning: Scan entire repo (excluding default exclusions)

SonarQube Cloud setup

create an sonarqube account with azure devops acount

select organization and create PAT from user setting and add pat with code real/write access

select trial

select repository (project)

select 30 days

select with azure devops pipeline

  • Analyze a project with Azure Pipelines

  • Install our extension

    Navigate to the SonarQube Cloud extension in the Visual Studio Marketplace and click Get it free to install it.

  • Add a new SonarQube Cloud Service Endpoint

    In your Azure project:

  • Click on Verify to check that everything is linked correctly.

  • Use this token: 8569fc640d4fb2737ab625a7a5cad1aa063dc1f6

  • Add a new service connection of the type SonarQube Cloud

  • Go to Project settings > Service connections

  • Configure Azure Pipeline

    What option best describes your project?

    JS/TS & Web · Maven · Gradle · .NET · Python · C, C++ or ObjC · Flutter or Dart · Other (for Go, PHP, ...)

Create a new pipeline or edit existing one

Follow the steps on Azure to initialize your pipeline and link it to your repository.

Add the following four steps to your pipeline

1. Set Fetch Depth

Azure Pipelines has shallow fetch enabled by default; we must set fetchDepth to 0 to override this and enable fetching of blame information.
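In YAML this is just an explicit checkout step:

```yaml
steps:
  - checkout: self
    fetchDepth: 0    # full history so SonarQube can compute blame information
```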

2. Prepare Analysis Configuration

Please ensure this task runs before your build step.

3. Run Code Analysis

This task needs to run after your build step.

Note: I have installed OpenJDK 17 on the agent; you can find the steps in the earlier section.

4. Publish Quality Gate Result

This task is not mandatory but will allow you to decorate your Pull Request.

If you plan not to use such a feature, you can omit it. Be aware that this task may increase your build time.

Enable the pipeline on push

Enable Continuous Integration for your pipeline to make sure your code will be analysed after every update.

Enable the PR Decoration

Enable the PR trigger to make sure your code will be analysed after every update.

Save and run your pipeline

You can now trigger your pipeline execution. When it is done, this page should refresh itself with your analysis results if everything runs successfully.

If you need help, we have an extended edition of the tutorial: Integrate Azure DevOps with SonarQube Cloud

  • In the Project Name field, enter logistics

  • In the Project Key field, enter techwithpatil_logistics

  • In Choose the way to run the analysis, select Python.

  • Select the SonarQube Cloud organization techwithpatil

  • Select the SonarQube Cloud endpoint.

  • Create a sonar-project.properties file

    Create a configuration file in the root directory of the project and name it sonar-project.properties
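A minimal example, using the key, name and organization given on this page (which values live here versus in the Prepare task depends on the config mode you chose):

```properties
# sonar-project.properties (repo root)
sonar.projectKey=techwithpatil_logistics
sonar.projectName=logistics
sonar.organization=techwithpatil
sonar.sources=.
```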

And you are done!

  • If everything is running successfully, once the analysis is complete you'll be redirected to the Overview page of your project where the new analysis results will be displayed. This can take a few minutes.

  • You'll get an analysis of your default branch

  • Each new push you make on a branch or Pull Request will automatically trigger a new analysis

working pipeline file

Snyk cloud integration

sign in with snyk cloud with pavanpatil0744 github account

get a snyk token

Store token securely

Azure DevOps → Library → Variable Group

change varible type to secret

goto org sertting > extension and install synk security scan extension

setup service connection

Snyk scan types

application dependencies

Good — this screen is the SnykSecurityScan task configuration in Azure DevOps. I’ll tell you exactly what to select here for a production-safe setup, why, and what NOT to touch.

I’ll keep it very precise.


🔐 1️⃣ Snyk API token

What to select

✔ This is correct

✔ It should be the Service Connection name, not a variable

✔ Do NOT paste tokens here manually

👉 You already did this correctly


🧪 2️⃣ “What do you want to test?”

Select: Application

Why

  • This runs Snyk Open Source (dependency scan)

  • Scans:

    • requirements.txt

    • pipenv

    • poetry

  • This is the first and most important Snyk scan

Correct choice for now


📁 3️⃣ Custom path to manifest file to test

What to put

👉 Leave this EMPTY

Why

  • Snyk auto-detects Python manifests

  • Enterprises only fill this when:

    • Monorepo

    • Multiple services in subfolders

❌ Do NOT put .

❌ Do NOT put a repo path unless needed


🚨 4️⃣ Testing severity threshold

Meaning

| Severity | Pipeline behavior |
| --- | --- |
| Low / Medium | Allowed |
| High / Critical | ❌ Fail pipeline |

This is enterprise standard for main branch.


⏱ 5️⃣ “When to run Snyk monitor”

Select: always

Why

  • Sends results to Snyk UI

  • Enables dashboards & reports

  • Required for vulnerability tracking

✔ This is correct


✅ FINAL VALUES (copy mentally)

| Field | Value |
| --- | --- |
| Snyk API token | snyk-cloud |
| What to test | Application |
| Custom path | (leave empty) |
| Severity threshold | High |
| When to run monitor | always |


🔒 Enterprise rule (important)

Dependency scan runs BEFORE Docker build

This is correct in your pipeline.


🔁 What comes next (DON’T mix yet)

Later, you will add another Snyk task for container scanning:

But not in this step.


⚠️ Very common mistakes (avoid)

❌ Setting severity to Low (too noisy)

❌ Filling the custom path unnecessarily

❌ Running the dependency scan after Docker

❌ Hardcoding the token


✅ After you click “Add”

Your pipeline flow becomes: checkout → SonarQube analysis → Snyk dependency scan → Docker build → push to ACR

This is exactly how enterprises do it.
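A YAML sketch of this task with the values above (input names follow the Snyk Security Scan extension as I know it; verify against the installed task version):

```yaml
- task: SnykSecurityScan@1
  displayName: Snyk Open Source (dependency scan)
  inputs:
    serviceConnectionEndpoint: 'snyk-cloud'   # the Snyk service connection created earlier
    testType: 'app'                           # "Application" in the UI
    severityThreshold: 'high'                 # fail on High/Critical findings
    monitorWhen: 'always'                     # push results to the Snyk UI
    failOnIssues: true
```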


🧠 DevOps interview-ready explanation

“We run Snyk Open Source scans on the main branch with a High severity threshold to block vulnerable dependencies before building artifacts.”

That’s a senior-level answer.


Next step (tell me when ready)

After you add this task, I’ll help you:

  • Add Snyk Container scan

  • Tune PR vs main behavior

  • Ensure Docker only runs if Snyk passes

Just say 👍

Snyk scan types: Application vs Code vs Container

Great question — this is exactly the level where DevOps turns into DevSecOps. I’ll explain Snyk Application, Code, and Container scans in a way that is:

  • Enterprise-real

  • Pipeline-oriented

  • Interview-ready

  • Crystal clear (no marketing fluff)


🛡️ Snyk’s 3 Core Scan Types (What, Why, When)

Snyk covers 3 different attack surfaces. They do NOT replace each other — each catches different classes of risk.


1️⃣ Snyk Application Scan (Open Source / Dependencies)

🔍 What it scans

  • Third-party dependencies your app uses

  • Examples (Python):

    • requirements.txt

    • Pipfile

    • poetry.lock

🚨 What it detects

  • Known CVEs in libraries

  • Vulnerable transitive dependencies

  • License issues (GPL, AGPL, etc.)

🧠 Example

You write clean code, but use an outdated third-party dependency (for example, an old Django release).

Snyk finds the known CVEs published against that version.

You did nothing wrong — dependency is unsafe.


🏭 Enterprise usage

| Stage | Behavior |
| --- | --- |
| PR | Warn or comment |
| Main | ❌ Block on High+ |
| Prod | Always monitored |

✅ Why enterprises love it

  • Catches 80% of real-world attacks

  • Fast (seconds)

  • No code change required


📌 In your pipeline: the dependency scan runs before the Docker build.


2️⃣ Snyk Code Scan (SAST – Static Application Security Testing)

🔍 What it scans

  • Your own source code

  • Business logic flaws

  • Security anti-patterns

🚨 What it detects

  • SQL Injection

  • Command Injection

  • Hardcoded secrets

  • Insecure crypto usage

  • Path traversal

🧠 Example

Snyk flags things like an SQL query built by concatenating user input, or a hardcoded password in the source.


🏭 Enterprise usage

| Stage | Behavior |
| --- | --- |
| PR | 🔔 Feedback |
| Main | Usually non-blocking |
| Prod | Reporting only |

⚠️ Why it’s NOT always blocking

  • More false positives than dependency scans

  • Needs developer context

  • Slower than app scan


📌 In your pipeline

Many enterprises run this only on PRs


3️⃣ Snyk Container Scan (Runtime Security)

🔍 What it scans

  • Docker image layers

  • OS packages

  • Runtime libraries

  • Base image vulnerabilities

🚨 What it detects

  • Vulnerable OS packages (glibc, openssl)

  • Misconfigured base images

  • End-of-life OS versions

🧠 Example

Your code is fine, dependencies are safe, BUT:

The base image contains vulnerable OS packages (say, an outdated openssl or glibc).

Snyk blocks it.


🏭 Enterprise usage

| Stage | Behavior |
| --- | --- |
| Build | ❌ Block on Critical |
| Registry | Prevent push |
| Runtime | Continuous monitoring |

🔥 Why it’s critical

Most production attacks exploit:

OS-level vulnerabilities

Not app code.


📌 In your pipeline: the container scan runs after the Docker build and before the image push (sketch below).
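A sketch of the container-scan variant of the same task (image name, input names and thresholds are assumptions to verify against your setup):

```yaml
- task: SnykSecurityScan@1
  displayName: Snyk Container scan
  inputs:
    serviceConnectionEndpoint: 'snyk-cloud'
    testType: 'container'
    dockerImageName: 'logisticsacr.azurecr.io/shipping-rate-calculator:$(Build.BuildId)'
    dockerfilePath: 'Dockerfile'
    severityThreshold: 'critical'   # block only on Critical for images
    monitorWhen: 'always'
    failOnIssues: true
```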


🧩 How the 3 scans work together

| Layer | Scan | Catches |
| --- | --- | --- |
| Code | Snyk Code | Logic flaws |
| Dependencies | Snyk App | CVEs in libs |
| Runtime | Snyk Container | OS & image CVEs |

🔐 Defense in depth


🧠 Enterprise pipeline strategy (BEST PRACTICE)

| Scan | PR | Main |
| --- | --- | --- |
| App | Warn | ❌ Block High |
| Code | Info | Info |
| Container | Skip | ❌ Block Critical |


🎯 How to explain this in interviews

“We use Snyk Open Source to block vulnerable dependencies, Snyk Container to prevent unsafe images from reaching our registry, and Snyk Code mainly as a feedback mechanism for developers during PRs.”

That answer = senior DevSecOps.


🏁 Final takeaway

  • Application scan → fastest, highest ROI

  • Code scan → developer hygiene

  • Container scan → production safety net

Each one protects a different attack vector.


If you want next, I can:

  • Draw a visual pipeline flow

  • Help tune PR vs main logic

  • Show how companies reduce false positives

  • Add Snyk ignore policies correctly

Just tell me 👍

Snyk CLI

Install on Ubuntu:
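One common way to install it on Ubuntu is via npm (a standalone binary download also exists):

```bash
sudo apt-get install -y nodejs npm
sudo npm install -g snyk

# Authenticate with the token from the Snyk account (or export SNYK_TOKEN in the pipeline)
snyk auth <YOUR_SNYK_TOKEN>

# Quick check from a repo checkout: dependency scan
snyk test
```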

Trivy scan: easy setup

This output is exactly what you wanted for testing 👍 Your Trivy gate is working perfectly.

I’ll explain what happened, how to read this, and what to do next.


What this output means (plain English)

1️⃣ Trivy scanned your image successfully

You see:

So Trivy scanned:

  • OS packages

  • Python dependencies inside the image

No scanner or pipeline issue here.


2️⃣ Django is intentionally vulnerable (good test)

The key line is the vulnerability count reported for Django.

That means:

  • You added Django 2.2

  • Trivy correctly detected:

    • 6 CRITICAL CVEs

    • 15 HIGH CVEs

This proves: ✅ Dependency scanning works ✅ CVE database is working ✅ Severity filtering is working


3️⃣ Pipeline FAILED by design (security gate)

At the end:

Why? Because your scan step uses:

So:

  • HIGH/CRITICAL found → exit code = 1

  • Azure DevOps marks step as FAILED

  • Docker Push is blocked

This is correct DevSecOps behavior.
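The gate step looks roughly like this (image name and exact flags are assumptions based on the behaviour described above):

```yaml
- script: |
    trivy image \
      --severity HIGH,CRITICAL \
      --exit-code 1 \
      logisticsacr.azurecr.io/shipping-rate-calculator:$(Build.BuildId)
  displayName: Trivy image scan (fail on HIGH/CRITICAL)
```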


How to read the vulnerability table (important)

Example row: CVE-2019-14234 | CRITICAL | django 2.2 | fixed in 2.2.4

Meaning:

  • Vulnerability ID: CVE-2019-14234

  • Severity: CRITICAL

  • Affected package: Django 2.2

  • Fix available in: 2.2.4+

So Trivy is not just flagging — it’s telling you how to fix it.


Why some say “fixed” but pipeline still fails

You may notice that a fixed version is listed for these CVEs.

This does NOT mean: ❌ “Your image is fixed”

It means: ✅ “A fixed version exists”

Since you’re still running 2.2, the vulnerability is valid.


What you should do next (real workflow)

✅ Option 1: Fix the vulnerability (normal flow)

Update requirements.txt to a patched Django release (at least 2.2.4 per the report, ideally a currently supported version).

Rebuild → scan passes → image pushes.


✅ Option 2: Temporary ignore (testing only)

Create a .trivyignore file listing the CVE IDs to skip, one per line (e.g. CVE-2019-14234).

⚠️ Use only for false positives or approved exceptions.


Alternatively, relax the gate to fail only on CRITICAL (e.g. --severity CRITICAL with --exit-code 1): this allows HIGH but blocks CRITICAL only.


What this proves (important takeaway)

You have now successfully implemented:

✔ Dependency vulnerability scanning ✔ Severity-based security gate ✔ CI pipeline enforcement ✔ Real CVE detection (not fake demo)

This is enterprise-grade DevSecOps.


One-line explanation (interview / review ready)

We intentionally introduced a vulnerable dependency, and Trivy correctly blocked the pipeline by detecting high and critical CVEs before the image was pushed.


Final reassurance

Nothing is broken. Nothing is misconfigured. This failure is a success.

If you want next, I can:

  • Show a passing scan after upgrade

  • Add PR vs main behavior

  • Add approved vulnerability allowlist

  • Optimize scan speed

Just say 👍

New pipeline once everything is set up

We already have OAuth connected for GitHub

Select the repo

Select the pipeline file or create it

Go to the agent pool and give this pipeline permission to run

SonarQube issue

https://github.com/mc1arke/sonarqube-community-branch-plugin/issues/49

16:13:16.737 INFO  EXECUTION FAILURE
16:13:16.738 INFO  Total time: 13.119s
##[error]16:13:16.738 ERROR Error during SonarScanner CLI execution
16:13:16.738 ERROR Validation of project failed:
  o To use the property "sonar.branch.name" and analyze branches, Developer Edition or above is required. See https://docs.sonarqube.org/latest/analyzing-source-code/branches/branch-analysis/ for more information.
16:13:16.738 ERROR Re-run SonarScanner CLI using the -X switch to enable full debug logging.
##[error][ERROR] SonarQube Server: Error while executing task Analyze: The process '/home/azagent/myagent/_work/_tasks/SonarQubeAnalyze_6d01813a-9589-4b15-8491-8164aeb38055/8.0.1/sonar-scanner/bin/sonar-scanner' failed with exit code 2
##[error]The process '/home/azagent/myagent/_work/_tasks/SonarQubeAnalyze_6d01813a-9589-4b15-8491-8164aeb38055/8.0.1/sonar-scanner/bin/sonar-scanner' failed with exit code 2
Finishing: SonarQubeAnalyze

Fix: added a script block (see the sketch below)
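One commonly shared workaround for this error on Community Edition is a script step placed after Prepare and before Analyze that strips the injected sonar.branch.name from the scanner parameters (a sketch only; the exact variable handling and sed expression may need adjusting for your agent):

```yaml
- script: |
    # Remove the branch property so Community Edition does not reject the analysis
    PARAMS=$(echo "$SONARQUBE_SCANNER_PARAMS" | sed 's/"sonar.branch.name":"[^"]*",//')
    echo "##vso[task.setvariable variable=SONARQUBE_SCANNER_PARAMS]$PARAMS"
  displayName: Remove sonar.branch.name (Community Edition workaround)
```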

custom quality gate
