Turrior - Let work find you
Recruiters get AI-ranked shortlists and automated outreach, filling roles up to 5× faster.
Careers at Nexus Systems Group
All open opportunities, right here. Explore, apply, grow.

Data Engineer

Contractor
9 Dec 2024
Toronto
Verified by Turrior

Content + Source + Freshness • 12 Dec 2025 • 95% confidence

85 / 100

Offer value

High value role due to extensive experience requirements and involvement with advanced ETL technologies.

  • Involvement in prestigious government data projects
  • Use of cutting-edge technologies like Databricks and Delta Lake
  • Stable, long-term contract with potential for growth
  • Requires substantial expertise and experience
Pros
  • Involvement in government projects offers stability and prestige
  • Opportunities to work with highly advanced ETL and data management tools
  • Long-term contract providing professional growth in data engineering
Cons
  • High experience requirement (7+ years) may limit candidate pool
  • Onsite work three days a week could be inconvenient for some
  • Government contracts can involve bureaucratic challenges

Who it's for

Senior / Lead • Hybrid / Onsite three days a week

Good fit
  • Experienced data engineers with strong backgrounds in ETL
  • Candidates looking for stability in government projects
  • Individuals seeking to advance in data analytics and engineering
Not recommended for
  • Less experienced engineers unfamiliar with ETL processes
  • Candidates who prefer fully remote work
  • Candidates resistant to hierarchical structures

Motivation fit

  • Desire to work on impactful government projects
  • Interest in mastering advanced data management techniques
  • Eagerness to lead within a technical team environment

Key skills

  • Advanced ETL processes
  • Delta Lake and Databricks
  • Python and PySpark programming
  • Team leadership and collaboration
Score: 85/100 • AI-verified analysis

About the job



Data Engineer – Senior
Ministry - Government Client
Toronto • 3 days onsite
CRJMC
14-month contract


Must Have Skills
·        7+ years using ETL tools such as Microsoft SSIS, stored procedures, T-SQL 
·        2+ years Delta Lake, Databricks and Azure Databricks pipelines
o   Strong knowledge of Delta Lake for data management and optimization.
o   Familiarity with Databricks Workflows for scheduling and orchestrating tasks.
·        2+ years Python and PySpark 
·        Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments. 
·        Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data.
·        SQL Server, Oracle
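The Medallion Architecture the posting asks for layers data as Bronze (raw landed records), Silver (cleaned and validated), and Gold (business-level aggregates). As a rough illustration of the pattern only, here is a stdlib-Python sketch with hypothetical stand-in data; a real pipeline would use PySpark DataFrames and Delta tables, not lists of dicts:

```python
# Hypothetical input data standing in for raw source records.
raw_events = [
    {"id": 1, "amount": "100", "region": "east"},
    {"id": 2, "amount": "bad", "region": "west"},   # malformed row
    {"id": 3, "amount": "250", "region": "east"},
]

def bronze(rows):
    """Bronze: land raw records as-is, tagging each with an ingest marker."""
    return [{**r, "_ingested": True} for r in rows]

def silver(rows):
    """Silver: clean and validate - cast amounts, drop rows that fail."""
    out = []
    for r in rows:
        try:
            out.append({"id": r["id"], "amount": int(r["amount"]), "region": r["region"]})
        except ValueError:
            continue  # a real pipeline would quarantine malformed records
    return out

def gold(rows):
    """Gold: aggregate into a business-level view (total amount per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]
    return totals

print(gold(silver(bronze(raw_events))))  # {'east': 350}
```

The layering matters because each stage is independently reprocessable: a fix to Silver cleaning logic can be replayed from Bronze without re-ingesting from source.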
Experience:
·        7+ years' experience working with SQL Server, T-SQL, Oracle, PL/SQL development or similar relational databases
·        2+ years' experience working with Azure Data Factory, Databricks and Python development
·        Experience building data ingestion and change data capture using Oracle Golden Gate 
·        Experience in designing, developing, and implementing ETL pipelines using Databricks and related tools to ingest, transform, and store large-scale datasets
·        Experience in leveraging Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
·        Experience building databases and data warehouses, and working with delta and full loads
·        Experience with data modeling and related tools, e.g. SAP PowerDesigner, Visio, or similar
·        Experience working with SQL Server SSIS or other ETL tools, solid knowledge and experience with SQL scripting
·        Experience developing in an Agile environment
·        Understanding of data warehouse architecture built on a delta lake
·        Ability to analyze, design, develop, test and document ETL pipelines from detailed and high-level specifications, and assist in troubleshooting.
·        Ability to utilize SQL to perform DDL tasks and complex queries
·        Good knowledge of database performance optimization techniques
·        Ability to assist in the requirements analysis and subsequent developments
·        Ability to conduct unit testing and assist in test preparations to ensure data integrity
·        Work closely with Designers, Business Analysts and other Developers
·        Liaise with Project Managers, Quality Assurance Analysts and Business Intelligence Consultants
·        Design and implement technical enhancements of Data Warehouse as required.
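Several requirements above mention Databricks Workflows for scheduling and orchestrating task dependencies. At its core that is topological ordering of a job graph; a minimal stdlib-Python sketch of the idea (the job names are invented for illustration, not taken from the posting or the Databricks API):

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each job maps to the jobs it depends on,
# mirroring how an orchestrator resolves run order from dependencies.
jobs = {
    "ingest_bronze": [],
    "clean_silver": ["ingest_bronze"],
    "aggregate_gold": ["clean_silver"],
    "refresh_dashboard": ["aggregate_gold"],
}

# static_order yields a valid execution order respecting all dependencies.
order = list(TopologicalSorter(jobs).static_order())
print(order)
# ['ingest_bronze', 'clean_silver', 'aggregate_gold', 'refresh_dashboard']
```

Real orchestrators add retries, triggers, and parallel branches on top of this ordering, but the dependency-resolution step is the same.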
Technical Skills (70 points)
 
·        Experience in developing and managing ETL pipelines, jobs, and workflows in Databricks.
·        Deep understanding of Delta Lake for building data lakes and managing ACID transactions, schema evolution, and data versioning.
·        Experience automating ETL pipelines using Delta Live Tables, including handling Change Data Capture (CDC) for incremental data loads.
·        Proficient in structuring data pipelines with the Medallion Architecture to scale data pipelines and ensure data quality.
·        Hands-on experience developing streaming tables in Databricks using Structured Streaming and readStream to handle real-time data.
·        Expertise in integrating CDC tools like GoldenGate or Debezium for processing incremental updates and managing real-time data ingestion.
·        Experience using Unity Catalog to manage data governance, access control, and ensure compliance.
·        Skilled in managing clusters, jobs, autoscaling, monitoring, and performance optimization in Databricks environments.
·        Knowledge of using Databricks Autoloader for efficient batch and real-time data ingestion.
·        Experience with data governance best practices, including implementing security policies, access control, and auditing with Unity Catalog.
·        Proficient in creating and managing Databricks Workflows to orchestrate job dependencies and schedule tasks.
·        Strong knowledge of Python, PySpark, and SQL for data manipulation and transformation.
·        Experience integrating Databricks with cloud storage solutions such as Azure Blob Storage, AWS S3, or Google Cloud Storage.
·        Familiarity with external orchestration tools like Azure Data Factory
·        Implementing logical and physical data models
·        Knowledge of FHIR is an asset
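Change Data Capture recurs throughout these requirements (GoldenGate, Debezium, Delta Live Tables). Stripped of engine specifics, applying a CDC batch is an upsert/delete merge keyed on a primary key, which is what a Delta Lake MERGE performs for incremental loads. A hedged stdlib-Python sketch of that logic (the event shape here is an assumption for illustration, not any tool's actual format):

```python
def apply_cdc(table, changes):
    """Apply a batch of CDC events (insert/update/delete) keyed by 'id'."""
    merged = {row["id"]: row for row in table}  # index current state by key
    for change in changes:
        op, row = change["op"], change["row"]
        if op in ("insert", "update"):
            merged[row["id"]] = row          # upsert: update if matched, else insert
        elif op == "delete":
            merged.pop(row["id"], None)      # remove the row if present
    return sorted(merged.values(), key=lambda r: r["id"])

current = [{"id": 1, "status": "new"}, {"id": 2, "status": "open"}]
events = [
    {"op": "update", "row": {"id": 1, "status": "closed"}},
    {"op": "delete", "row": {"id": 2}},
    {"op": "insert", "row": {"id": 3, "status": "new"}},
]
print(apply_cdc(current, events))
# [{'id': 1, 'status': 'closed'}, {'id': 3, 'status': 'new'}]
```

Production CDC adds ordering guarantees, late-arriving-event handling, and schema evolution, which is where tools like Delta Live Tables earn their keep.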
Design Documentation and Analysis Skills (20 points)
·        Demonstrated experience in creating design documentation such as:
o   Schema definitions
o   Error handling and logging
o   ETL Process Documentation
o   Job Scheduling and Dependency Management
o   Data Quality and Validation Checks
o   Performance Optimization and Scalability Plans
o   Troubleshooting Guides
o   Data Lineage
o   Security and Access Control Policies applied within ETL
·        Experience in Fit-Gap analysis, system use case reviews, requirements reviews, coding exercises and reviews.
·        Participate in defect fixing, testing support and development activities for ETL
·        Analyze and document solution complexity and interdependencies including providing support for data validation.
·        Strong analytical skills for troubleshooting, problem-solving, and ensuring data quality.
Communication and Leadership Skills (10 points)
 
·        Ability to collaborate effectively with cross-functional teams and communicate complex technical concepts to non-technical stakeholders.
·        Strong problem-solving skills and experience working in an Agile or Scrum environment.
·        Ability to provide technical guidance and support to other team members on Databricks best practices.
·        Must have previous work experience conducting Knowledge Transfer sessions, ensuring that team members receive the knowledge required to support the system.
·        Must develop documentation and materials as part of a review and knowledge transfer to other members.

 

Similar Jobs

Data Engineer • Nexus Systems Group • Contractor • posted long ago
Data Engineer • Nexus Systems Group • Contractor • posted long ago
Data Engineer • Accenture • Full Time • posted 5 months ago

End-to-end AI hiring for modern HR teams

Turrior uses artificial intelligence to create job listings, automate candidate screening, conduct video interviews, and apply comprehensive AI scoring — helping companies hire faster, more accurately, and with lower operational costs.

Key benefits:

  • AI-powered job creation and structured job data
  • Intelligent candidate screening and automated shortlisting
  • Video interviews with AI-based answer analysis
  • Comprehensive AI scoring of skills, experience, and role fit
  • Recruitment process automation and reduced time-to-hire
