DevOps Engineer
Full Time
24 Oct 2025
Verified by Turrior
Content + Source + Freshness • 17 Dec 2025 • 95% confidence
85 / 100
Offer value
High value, driven by the rapid growth of DevOps, strong demand for these skills, and opportunities for advancement in data infrastructure roles.
- Work in a high-demand area of cloud and data engineering
- Emphasis on CI/CD methodologies and automation
- Involvement in impactful projects
Pros
- Strong job market for DevOps engineers skilled in Azure
- Comprehensive involvement in CI/CD pipeline development
- Potential for professional growth in cloud technologies
Cons
- High competition due to the popularity of DevOps roles
- The job may require on-call availability
- Significant learning curve for those new to Azure DevOps tools
Who it's for
Senior/Lead • Office/Hybrid possible
Good fit
- Experienced engineers in cloud and DevOps
- Individuals wanting to work on innovative data tools
- Tech-forward professionals interested in pipeline systems
Not recommended for
- Beginners in software engineering or cloud technologies
- Those unwilling to adjust to new tools and methodologies
- Candidates preferring a strictly defined role without variety
Motivation fit
- Desire to lead in innovative data solutions
- Interest in implementing best practices in cloud deployment
- Eagerness to advance in the tech industry with an emphasis on collaboration
Key skills
- DevOps methodologies
- CI/CD pipeline development
- Cloud architecture and security
- Automation scripting with Python or PowerShell
About the job
Project Role : DevOps Engineer
Project Role Description : Responsible for building and setting up new development tools and infrastructure, drawing on knowledge of continuous integration, delivery, and deployment (CI/CD), cloud technologies, container orchestration, and security. Build and test end-to-end CI/CD pipelines, ensuring that systems are protected against security threats.
Must have skills : Microsoft Azure DevOps
Good to have skills : NA
Minimum 5 years of experience required
Educational Qualification : 15 years of full-time education
Summary: You’ll be the engineer who makes data move reliably. You’ll build Azure DevOps CI/CD that orchestrates ADF and Databricks Workflows to deliver data into a Medallion (Bronze/Silver/Gold) architecture, and you’ll own Terraform-based IaC that provisions secure, scalable Azure and Databricks resources across environments.

Roles & Responsibilities:
- Build CI/CD for data ingestion
  - Author and maintain Azure DevOps (ADO) YAML pipelines to version, validate, and deploy ADF pipelines and Databricks Workflows/Jobs. (A template-validation sketch follows this description.)
  - Implement parameterized, reusable pipeline templates (multi-stage, approvals, gates) for dev/test/prod with artifact promotion.
  - Orchestrate batch and streaming patterns (e.g., ADF triggers + Databricks Jobs/Auto Loader) into Bronze → Silver → Gold layers.
  - Integrate code quality, unit tests, and data checks (pytest, dbx/tests, Great Expectations or equivalent) in CI. (A pytest data-check sketch follows this description.)
- Provision cloud & lakehouse with IaC
  - Design Terraform modules and workspaces to provision Azure (resource groups, VNets/Private Endpoints, Key Vault, Storage, Event Hubs, Azure SQL, Monitor/Log Analytics) and Databricks (workspaces, Unity Catalog metastore/catalogs/schemas, grants, clusters/cluster policies, Jobs, Repos/Asset Bundles).
  - Manage Terraform state (remote backends), environment promotion, and drift detection; enforce Policy-as-Code (Azure Policy, Sentinel/OPA). (A drift-detection sketch follows this description.)
  - Implement secure networking (VNet injection, Private Link, firewall rules), identity (AAD groups, service principals, workload identities), and secrets management (Key Vault → Databricks secret scopes; a secret-scope sketch follows this description).
- Operate and improve the platform
  - Set up observability for pipelines (Azure Monitor, Log Analytics, Databricks metrics & audit logs), SLOs, alerting, and incident runbooks. (A failed-run monitor sketch follows this description.)
  - Optimize spend (autoscaling, spot/on-demand mix, job cluster vs. warehouse usage, caching, cluster policies).
  - Champion Git strategies (trunk-based development with PRs), code reviews, and release tagging; document standards and how-to guides.

Professional & Technical Skills:
- 5+ years in DevOps/Platform Engineering supporting data platforms on Azure.
- Strong hands-on experience with:
  - Azure DevOps: Repos, multi-stage YAML pipelines, variable groups, service connections.
  - Azure Data Factory (ADF): authoring, parameterization, linked services, triggers, integration runtimes.
  - Databricks: Workflows/Jobs, Repos/Asset Bundles, Delta Lake (Bronze/Silver/Gold), notebooks, job clusters & cluster policies; familiarity with Unity Catalog (catalogs/schemas/grants).
  - Terraform (AzAPI/AzureRM + Databricks providers): modules, workspaces, remote state, pipelines.
  - Azure networking & security: VNet/Private Endpoints, RBAC, Key Vault, managed identities/service principals.
- Proficiency in Python and Bash/PowerShell for automation and CI tasks.
- Solid understanding of data ingestion patterns (batch/streaming, CDC/Auto Loader, file- and API-based sources).
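The template work above lends itself to a small CI guard. Here is a minimal sketch in Python (PyYAML) that checks each reusable ADO template for a `parameters` block and at least one of `stages`/`jobs`/`steps`; the `pipelines/templates/` layout and the required-key convention are assumptions for illustration, not something the posting specifies.

```python
"""CI guard: validate Azure DevOps YAML pipeline templates.

A minimal sketch. The directory layout (pipelines/templates/) and the
convention that every template declares a `parameters` block are
assumptions, not part of the job description.
"""
from pathlib import Path
import sys

import yaml  # PyYAML: pip install pyyaml

TEMPLATE_DIR = Path("pipelines/templates")  # hypothetical layout


def validate_template(path: Path) -> list[str]:
    """Return a list of problems found in one template file."""
    try:
        doc = yaml.safe_load(path.read_text())
    except yaml.YAMLError as exc:
        return [f"{path}: not valid YAML ({exc})"]
    if not isinstance(doc, dict):
        return [f"{path}: expected a mapping at top level"]
    problems = []
    if "parameters" not in doc:
        problems.append(f"{path}: missing 'parameters' block")
    # Reusable templates should define stages, jobs, or steps.
    if not any(k in doc for k in ("stages", "jobs", "steps")):
        problems.append(f"{path}: defines no stages/jobs/steps")
    return problems


def main() -> int:
    all_problems = []
    for tmpl in sorted(TEMPLATE_DIR.glob("*.yml")):
        all_problems.extend(validate_template(tmpl))
    for p in all_problems:
        print(p)
    return 1 if all_problems else 0


if __name__ == "__main__":
    sys.exit(main())
```

A check like this would typically run as an early PR-validation step so broken templates fail fast, before any deployment stage is reached.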
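For the CI data checks, a plain-pytest sketch against a Silver-layer extract might look like the following. The parquet path and column names (`customer_id`, `event_ts`) are hypothetical; a real pipeline would more likely run equivalent checks via Spark, dbx tests, or Great Expectations, as the posting suggests.

```python
"""CI data checks for a Silver-layer extract, in plain pytest.

A minimal sketch: the parquet path and column names are placeholders.
Requires pandas with a parquet engine (pip install pandas pyarrow).
"""
import pandas as pd
import pytest

SILVER_EXTRACT = "data/silver/customers_sample.parquet"  # hypothetical path


@pytest.fixture(scope="module")
def df() -> pd.DataFrame:
    # Load the extract once for all tests in this module.
    return pd.read_parquet(SILVER_EXTRACT)


def test_not_empty(df):
    assert len(df) > 0, "Silver extract is empty"


def test_key_is_unique_and_non_null(df):
    # Hypothetical business key for the Silver table.
    assert df["customer_id"].notna().all(), "null customer_id values"
    assert df["customer_id"].is_unique, "duplicate customer_id values"


def test_event_ts_parses(df):
    # Coerce to datetime; any unparseable value becomes NaT and fails.
    ts = pd.to_datetime(df["event_ts"], errors="coerce")
    assert ts.notna().all(), "unparseable event_ts values"
```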
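Drift detection can be as simple as a scheduled `terraform plan` whose exit code gates an alert: with `-detailed-exitcode`, terraform exits 0 for no changes, 1 for errors, and 2 when the plan contains changes (i.e., drift between config, state, and real infrastructure). A minimal wrapper sketch, assuming terraform is on the agent's PATH and the root module is already initialized against its remote backend:

```python
"""Nightly Terraform drift check, a minimal sketch.

Assumes an initialized root module in `workdir` with a remote backend.
`terraform plan -detailed-exitcode` exits 0 (no drift), 1 (error),
or 2 (plan contains changes).
"""
import subprocess
import sys


def check_drift(workdir: str) -> int:
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", "-input=false", "-no-color"],
        cwd=workdir,
        capture_output=True,
        text=True,
    )
    if result.returncode == 0:
        print(f"{workdir}: no drift")
    elif result.returncode == 2:
        # Exit code 2 means the plan is non-empty: drift detected.
        print(f"{workdir}: DRIFT DETECTED\n{result.stdout}")
    else:
        print(f"{workdir}: terraform error\n{result.stderr}")
    return result.returncode


if __name__ == "__main__":
    sys.exit(check_drift(sys.argv[1] if len(sys.argv) > 1 else "."))
```

Run from a scheduled ADO pipeline, the non-zero exit code is enough to mark the run failed and trigger the usual alerting path.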
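Wiring Key Vault into Databricks secret scopes typically goes through the workspace REST API. Below is a minimal sketch using `POST /api/2.0/secrets/scopes/create`; the host, scope name, and Key Vault identifiers are placeholders, and note that Key Vault-backed scopes generally must be created with an Azure AD token rather than a workspace PAT.

```python
"""Create a Key Vault-backed Databricks secret scope, minimal sketch.

Uses the Databricks REST API (POST /api/2.0/secrets/scopes/create).
All names and IDs below are placeholders.
"""
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]        # e.g. https://adb-<id>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_AAD_TOKEN"]  # AAD token, not a workspace PAT

payload = {
    "scope": "kv-backed-scope",  # placeholder scope name
    "scope_backend_type": "AZURE_KEYVAULT",
    "backend_azure_keyvault": {
        # Placeholders: full ARM resource ID and vault URI of your Key Vault.
        "resource_id": "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>",
        "dns_name": "https://<vault>.vault.azure.net/",
    },
}

resp = requests.post(
    f"{HOST}/api/2.0/secrets/scopes/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("secret scope created")
```

In practice this kind of call would live in Terraform (via the Databricks provider) rather than an ad hoc script, so the scope is versioned and promoted with the rest of the environment.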
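For alerting on pipeline health, a small poller over the Databricks Jobs API is one option. The sketch below uses `GET /api/2.1/jobs/runs/list` to surface failed runs from the last 24 hours; in production the output would feed Azure Monitor or a webhook rather than stdout, and the environment variables are placeholders.

```python
"""Report failed Databricks job runs from the last 24 hours.

A minimal sketch over GET /api/2.1/jobs/runs/list; host/token env
vars are placeholders.
"""
import os
import time

import requests

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]

# Only look at runs that started in the last 24 hours (epoch millis).
since_ms = int((time.time() - 24 * 3600) * 1000)

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"completed_only": "true", "start_time_from": since_ms, "limit": 25},
    timeout=30,
)
resp.raise_for_status()

failed = [
    run for run in resp.json().get("runs", [])
    if run.get("state", {}).get("result_state") not in ("SUCCESS", None)
]
for run in failed:
    print(f"FAILED: {run.get('run_name')} -> {run.get('run_page_url')}")
print(f"{len(failed)} failed run(s) in the last 24 h")
```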
