Data Engineer
Full Time
6 Oct 2025
About the job
• Build and maintain **Kafka pipelines** for claims data ingestion and routing.
• Develop ETL/ELT processes for integrating Amisys, Facets, ABS, and Excelys into Pisces.
• Implement schema validation and ensure data quality across multiple sources.
• Collaborate with BSAs and QA to deliver accurate edits and exclusions.
Requirements
- Location: Coppell, TX and NY (hybrid, 2-3 days work-from-office per week)
- Proficiency in **Kafka, Python, SQL** (for ETL and data validation).
- Experience with **cloud-native data platforms** (AWS Glue, Azure Data Factory, GCP Dataflow).
- Familiarity with **MongoDB, Talend, or other integration tools**.
- Strong data modeling, schema design, and performance optimization knowledge.
- Ability to debug data pipeline issues in large-scale environments.
🔍 ATS Optimization Keywords
The skills and terms below were extracted directly from this job posting to improve Applicant Tracking System (ATS) visibility, helping candidates tailor their applications more effectively. This feature is exclusive to JobTailor job listings.
Hard Skills
- Kafka
- Python
- SQL
- ETL
- ELT
- data validation
- data modeling
- schema design
- performance optimization
- debugging

