We are looking for a Senior Data Engineer to build and scale a modern, AWS-based data platform powered by serverless pipelines, DuckDB, and Delta Lake architecture. You will work on integrating multiple SaaS and operational data sources into a robust data warehouse used for analytics and Tableau reporting.
This role requires a hands-on engineer comfortable with Python-based data processing, AWS services, and non-traditional data stack components (DuckDB, Polars, etc.).
You will play a critical role in building a reliable, production-grade data platform used for analytics, reporting, and future AI use cases.
Key Responsibilities:
- Design, build, and maintain ETL/ELT pipelines from multiple data sources:
  - APIs (e.g., RingCentral, Google Ads, GA4);
  - Third-party SaaS platforms;
  - Financial/accounting systems;
  - Non-standard data sources (custom ingestion);
- Develop scalable ingestion frameworks (batch and incremental loads);
- Handle complex data transformations, including:
  - Schema normalization across inconsistent sources;
  - Reconciliation of historical vs. incremental data;
- Build and optimize data warehouse structures (fact/dimension models);
- Ensure data reliability through logging, monitoring, and error handling;
- Optimize query and storage performance (both cost and speed);
- Collaborate with the Data Architect and BI team to ensure data usability;
- Support QA and validation processes;
- Document pipelines, data flows, and system architecture.
Requirements:
- 7+ years of experience in data engineering;
- Experience with the AWS ecosystem:
  - Lambda, EC2, SQS, EventBridge, ECR, Glue, Athena;
- Advanced SQL expertise;
- Experience with Python (or similar) for data processing;
- Hands-on experience with modern data warehouses:
  - Snowflake / Redshift / BigQuery;
- Experience integrating REST APIs and third-party systems;
- Strong understanding of data modeling (star schema, normalization);
- Experience with orchestration tools (Airflow, Prefect, etc.);
- Familiarity with data quality and validation frameworks;
- Ability to work with messy, inconsistent, real-world data;
- English: Upper Intermediate or higher.
Nice to Have:
- Experience with Delta Tables;
- Experience with dbt or similar transformation tools;
- Experience with marketing data (GA4, Google Ads);
- Experience with financial/accounting data;
- Exposure to real-time or near-real-time data pipelines;
- Experience supporting BI tools (Tableau, Looker, etc.).
We Offer:
- Maximum flexibility;
- Professional training, conferences, and certifications;
- Corporate events and benefits;
- Professional literature;
- English courses.
If you are interested, please contact us at job@zfort.com.