Data Architect

5-10 Years
  • Posted 15 hours ago

Job Description

Working Hours: Monday - Thursday (8.30am - 6pm), Friday (8.30am - 5.30pm) (Hybrid working arrangement)

Working Location: Central

Salary Package: Up to $14,000

Employment Type: 6 months contract (renewable)

Responsibilities:
  • Lead and implement data engineering strategy and architecture blueprints in alignment with business requirements.
  • Contribute to the evaluation of data platforms and architecture solutions to support evolving data needs, e.g. data storage and usage for AI purposes.
  • Translate business data requirements into technical specifications and scalable solutions.
  • Architect and build ingestion pipelines to collect, clean, merge, and harmonize data from diverse sources.
  • Design and implement secure, cloud-based data infrastructure and access mechanisms.
  • Monitor and optimize ETL systems and databases for performance, reliability, and scalability.
  • Construct reusable data models and maintain data catalogues with metadata and lineage using tools such as ER/Studio.
  • Collaborate with data stewards to enforce data governance, quality, and security policies.
  • Guide agencies through greenfield and brownfield implementations, from problem definition to solution design.
  • Develop standardized approaches for assessments, discovery, and solutioning.
  • Champion engineering excellence and influence adoption of modern data and infrastructure practices.

Requirements:
  • Bachelor's degree in Computer Science, Software Engineering, Information Technology, or related disciplines.
  • 5-10 years of experience in data engineering, cloud infrastructure, or platform engineering.
  • Deep understanding of data system design, data structures, algorithms, and data architecture modelling.
  • Hands-on experience with cloud platforms (AWS, Azure, GCP) and distributed data technologies (Spark, Hadoop).
  • Proficiency in Python and SQL.
  • Experience with orchestration frameworks (Airflow, Azure Data Factory) and DevOps tools (Docker, Git, Terraform).
  • Familiarity with CI/CD pipelines and infrastructure-as-code practices.
  • Experience with Databricks / Snowflake / Denodo and implementing batch and real-time data pipelines.

By submitting your resume, you consent to the collection, use, and disclosure of your personal information in accordance with ScienTec's Privacy Policy (scientecconsulting.com/privacy-policy).
This authorizes us to:
  • Contact you about potential opportunities.
  • Delete personal data that is not required at this application stage.

All applications will be processed in strict confidence. Only shortlisted candidates will be contacted.

Wong Siew Ting (Maeve) - R25127375
ScienTec Consulting Pte Ltd - 11C5781

More Info
Job ID: 146095575