
Remote Big Data Engineer
- Ciudad de México
- Permanent
- Full-time
- Design and implement data integration pipelines using Azure Data Factory.
- Manage data storage and optimization in Snowflake.
- Develop reconciliation reports and dashboards using Tableau.
- Collaborate with teams to ensure data accuracy and alignment with business needs.
- Maintain documentation for data workflows and architecture.
- 5+ years of experience in big data engineering or related roles.
- Primary skills: Python, Databricks, SQL. Secondary skills: ADLS, Kubernetes/Docker. This is an L3 support position.
- Required skill set: SQL, ADLS/Delta tables, Kubernetes/Docker, Python, Databricks, GitHub Actions, DuckDB.
- Strong skills: Databricks, Python, L3 support development.
- English not required.
- Native Spanish.
- Fully remote.
Skills: Azure Data Factory, Snowflake, Tableau, Data Integration, ETL.