
Requirements / Qualifications
. Bachelor's or master's degree in computer science, data engineering, or a related field.
. Minimum 8 years of experience in data engineering, with expertise in AWS services, Databricks, and/or Informatica IDMC.
. Proficiency in programming languages such as Python, Java, or Scala for building data pipelines.
. Ability to evaluate potential technical solutions and recommend resolutions for data issues, especially performance assessment of complex data transformations and long-running data processes.
. Strong knowledge of SQL and NoSQL databases.
. Familiarity with data modeling and schema design.
. Excellent problem-solving and analytical skills.
. Strong communication and collaboration skills.
. AWS certifications (e.g., AWS Certified Data Analytics - Specialty), Databricks certifications, and Informatica certifications are a plus.
Preferred Skills:
. Experience with PySpark on Databricks.
. Knowledge of data governance and data cataloguing tools, especially Informatica IDMC.
. Familiarity with data visualization tools like Tableau.
. Knowledge of containerization and orchestration tools like Docker and Kubernetes.
. Understanding of DevOps principles for managing and deploying data pipelines.
. Experience with version control systems (e.g., Git) and CI/CD pipelines.
Job ID: 135254409