Description:
POSITION OVERVIEW: Associate Data Engineer
POSITION GENERAL DUTIES AND TASKS:
Key Skills: ETL/ELT; Databricks using PySpark, SQL, and Delta Lake; Snowflake, Redshift, BigQuery, Synapse; AWS, Azure, GCP
Role and Responsibilities:
Design, develop, and maintain data pipelines on Databricks using PySpark, SQL, and Delta Lake.
Build scalable ETL/ELT processes for ingestion, transformation, and delivery across structured and unstructured datasets.
Collaborate with data scientists and analysts to enable machine learning and AI workflows on Databricks.
Optimize data lakehouse performance using Delta Lake, caching, partitioning, and indexing strategies.
Integrate Databricks with enterprise systems, cloud data platforms (AWS, Azure, GCP), and BI tools.
Implement data governance, quality, and security standards within the Databricks environment.
Monitor, troubleshoot, and improve Databricks jobs, clusters, and workflows.
Support CI/CD automation, DevOps practices, and Infrastructure-as-Code (IaC) for Databricks deployments.
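As a rough illustration of the ETL/ELT pipeline work described above (not part of the posting): a minimal extract-transform-load sketch in plain Python. On Databricks this logic would typically be expressed with PySpark DataFrames and Delta Lake writes (e.g. `spark.read` and `df.write.format("delta")`); the function names and in-memory data here are hypothetical stand-ins.

```python
# Illustrative ETL sketch only; a real Databricks pipeline would use
# PySpark DataFrames and Delta tables rather than Python lists.

def extract(rows):
    """Ingest raw records, dropping empty source rows."""
    return [r for r in rows if r is not None]

def transform(rows):
    """Normalize fields and discard records missing the required key."""
    cleaned = []
    for r in rows:
        if "id" not in r:
            continue  # data-quality rule: every record needs an id
        cleaned.append({"id": r["id"], "amount": float(r.get("amount", 0))})
    return cleaned

def load(rows, sink):
    """Append transformed rows to a destination (list standing in for a table)."""
    sink.extend(rows)
    return len(rows)

raw = [{"id": 1, "amount": "10.5"}, {"amount": "3"}, None, {"id": 2}]
table = []
count = load(transform(extract(raw)), table)
print(count)  # number of rows that passed transformation and were loaded
```

The same extract/transform/load separation carries over directly to PySpark jobs, where each stage becomes a DataFrame operation and the load step writes to a Delta table.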
Requirements / Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or related field.
3-5 years of experience in data engineering, ETL/ELT development, or big data platforms.
Hands-on expertise with Databricks, PySpark, and Delta Lake.
Strong proficiency in SQL and working with large datasets in cloud data warehouses (Snowflake, Redshift, BigQuery, Synapse).
Experience with cloud platforms (AWS, Azure, GCP) and their native data services.
Knowledge of data governance, data quality, and security principles.
Familiarity with CI/CD, Git, and Infrastructure-as-Code (Terraform, ARM, CloudFormation).
Date Posted: 25/09/2025
Job ID: 127126329