

Senior Data Engineer - X Delivery

6-9 Years
SGD 2,800 - 5,000 per month

Job Description

Who We Are

Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact.

To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do

As part of BCG's X Delivery team, you will work closely with consulting teams on a diverse range of advanced topics. You will have the opportunity to leverage data engineering techniques to deliver value to BCG's Consulting & BCG X (case) teams, BCG X Product teams and Practice Areas (domain). You will collaborate with teams to gather requirements and to specify, design, develop, deliver and support industrialized solutions serving client needs. You will provide technical support through a deep understanding of relevant tools and processes to build high-quality, efficient technology solutions.

Must have strong experience with (an illustrative pipeline sketch follows this list):

  • Python
  • Cloud computing platforms (AWS, Azure, Google Cloud, etc.)
  • Containerization (Docker, Kubernetes, etc.)
  • Relational databases (PostgreSQL, MariaDB, MySQL, etc.)
  • NoSQL databases (MongoDB, Neo4j, Redis, etc.)
  • Spark or other distributed big data systems (Hadoop, Pig, Hive, etc.)
  • Stream-processing frameworks (e.g. Kafka)
  • Data pipeline orchestration tools (Airflow, Prefect, Dagster, etc.)
  • Unix-based command line & development tools
  • Version control (e.g. Git)
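
To give a flavor of how several of the tools above typically fit together, here is a minimal orchestration sketch, assuming Apache Airflow 2.x and Python; the DAG id, task names, and sample data are hypothetical and purely illustrative.

```python
# A minimal sketch, assuming Apache Airflow 2.x is installed; the DAG id,
# task names, and data below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Stand-in for pulling raw records from a source system (database, API, etc.).
    return [{"order_id": 1, "amount": 42.0}]


def transform_orders(ti):
    # Read the upstream task's output from XCom and apply a simple transform.
    rows = ti.xcom_pull(task_ids="extract_orders")
    return [{**row, "amount_cents": int(row["amount"] * 100)} for row in rows]


def load_orders(ti):
    # Stand-in for writing to a warehouse, relational store, or NoSQL target.
    print(ti.xcom_pull(task_ids="transform_orders"))


with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> transform >> load
```

In practice the extract and load steps would talk to the relational, NoSQL, or streaming systems listed above rather than returning in-memory stubs.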

Functional Skills:

  • Data modeling for analytics and decisioning
  • Selecting and integrating Big Data tools
  • Implementing ETL processes across on-premises and cloud architectures, as sketched after this list
  • Monitoring performance and advising on any necessary infrastructure changes
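
As a rough sketch of the ETL work described above, the example below assumes pandas and SQLAlchemy with a PostgreSQL target; the file path, connection string, table, and column names are placeholders, not part of any actual BCG pipeline.

```python
# A minimal ETL sketch, assuming pandas and SQLAlchemy (with a PostgreSQL
# driver) are installed; the path, connection string, table, and column
# names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_CSV = "exports/daily_sales.csv"  # e.g. a file landed from an on-premises system
TARGET_URL = "postgresql+psycopg2://user:password@analytics-db:5432/warehouse"  # cloud warehouse


def extract(path: str) -> pd.DataFrame:
    """Read the raw export into a DataFrame."""
    return pd.read_csv(path, parse_dates=["sold_at"])


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleanup plus a derived column for downstream analytics."""
    df = df.dropna(subset=["sku", "amount"])
    df["sold_date"] = df["sold_at"].dt.date
    return df


def load(df: pd.DataFrame, url: str, table: str = "daily_sales") -> None:
    """Append the transformed rows to the target relational database."""
    engine = create_engine(url)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_URL)
```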

Nice to have:

  • Java, Scala
  • Backend frameworks (Flask, FastAPI, Django, Node.js, etc.)
  • CI/CD tools (CircleCI, Octopus Deploy, Jenkins, etc.)
  • Infrastructure as code (Terraform, Chef, Puppet, Ansible, etc.)
  • Deployment (Helm charts, Octopus Deploy, etc.)
  • Monitoring tools (Datadog, New Relic, App Dynamics, etc.)
  • Security tools (SonarQube, Veracode, etc.)
  • Unit testing frameworks (Pytest, Mocha, Jest, etc.), as sketched after this list
  • Automated UI testing tools (Selenium, Cypress, Playwright, etc.)
  • Postman or other API testing tools
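
As a small illustration of the unit-testing style implied by the list above, the sketch below uses pytest; normalize_sku is a hypothetical helper written only for this example.

```python
# A minimal pytest sketch; normalize_sku is a hypothetical helper written
# only to illustrate the unit-testing style.
import pytest


def normalize_sku(raw: str) -> str:
    """Trim whitespace and upper-case a SKU; reject empty values."""
    cleaned = raw.strip().upper()
    if not cleaned:
        raise ValueError("SKU must not be empty")
    return cleaned


def test_normalize_sku_trims_and_uppercases():
    assert normalize_sku("  ab-123 ") == "AB-123"


def test_normalize_sku_rejects_empty_values():
    with pytest.raises(ValueError):
        normalize_sku("   ")
```

Running pytest from the project root would discover and execute both tests.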

More Info

Job ID: 141373151