Portfolio Company Careers

Discover opportunities across our network of values-driven companies!
Sovereign’s Capital

DevOps Engineer - Vietnam

Software Engineering
Posted on Thursday, September 28, 2023

Life at Grab

At Grab, every Grabber is guided by The Grab Way, which spells out our mission, how we believe we can achieve it, and our operating principles - the 4Hs: Heart, Hunger, Honour and Humility. These principles guide and help us make decisions as we work to create economic empowerment for the people of Southeast Asia.

Get to Know the Team

The Coban team, part of the Data Tech Family, is the superstar data streaming team at Grab. We architect and operate large-scale, mission-critical data streaming platforms that serve engineers across all of Grab’s verticals, processing low-latency, multi-TB/hour workloads at 99.99% uptime.

Our multicultural team is made up of brilliant and passionate Infrastructure Engineers and Data Engineers. If you are looking for an opportunity that allows you to design and manage a large-scale data infrastructure in a highly stimulating environment, look no further!

Get to Know the Role

We are looking for a Junior Infrastructure Engineer who would be thrilled to architect and operate one of the biggest data platforms in Southeast Asia, with a passion for automation and a strong sense of customer service. An engineer in this role works with in-flight data at terabyte scale and distributed systems that routinely serve tens of thousands of queries per second under tight SLAs.

In this role, you would get the opportunity to be part of a team that designs and operates the mission-critical infrastructure that processes every piece of data generated or consumed by Grab’s systems.

The Day-to-Day Activities:

  • Designing, deploying, and operating large-scale data streaming infrastructure in the public cloud (AWS). Consistently finding opportunities to improve and optimize the workloads for reliability, performance, and cost.

  • Creating intelligent GitOps solutions and GitLab CI pipelines for next-level cloud infrastructure and resource provisioning (Kafka topics, streaming pipelines).

  • Investigating, mitigating, and resolving incidents such as production outages, performance degradation, or data loss, particularly around Kafka and Kubernetes. Authoring post-mortems, and identifying and implementing longer-term corrective actions. Taking turns providing autonomous on-call support for Coban.

  • Running upgrades (EKS, Operating System…) as well as routine maintenance operations at night. Identifying opportunities to reduce toil by building automation.

  • Finding opportunities to collaborate with our main stakeholders such as the Cloud Infrastructure, Cybersecurity, and Compliance teams, as well as the other data teams of Data Tech (data lake, machine learning, segmentation…). Ensuring that their requirements are built into automated guardrails in our platforms and products.

  • Supporting our users’ needs for more real-time data, always with a view to self-service and automation where possible.

  • Experimenting with and rolling out new cross-cutting technologies - container clusters and service meshes, Kubernetes operators, modern stream processing platforms, secrets management, monitoring and logging solutions, and more.

The Must-Haves:

  • Excellent communication in written and spoken English.

  • Real passion for cloud infrastructure, CI/CD pipelines, and automation: you enjoy exploring new products and trying them out.

  • Ability to easily pick up new technologies: you are keen to learn and to expand your knowledge.

  • Customer centricity: you always go above and beyond to deliver a first-class service to your users.

  • Proficient in Linux systems administration and network fundamentals (IPv4, DNS, HTTP, TLS…).

  • Some experience with AWS (ideally VPC, EC2, IAM, S3, EKS).

  • Some experience with Docker/containers and Kubernetes.

  • Some experience with Terraform.

The Nice-to-Haves:

  • Some experience as a DevOps or Infrastructure Engineer.

  • Proficient with Helm, bash scripting, Python, and Golang.

  • Some experience with GitLab CI, Microsoft Azure, Apache Kafka, Strimzi, Ansible, HashiCorp Vault, HashiCorp Consul, or Datadog (UI, agent).

  • Experience with Apache Airflow and Apache Spark.

  • Open-source contributions or public blogs.

Our Commitment

We are committed to building diverse teams and creating an inclusive workplace that enables all Grabbers to perform at their best, regardless of nationality, ethnicity, religion, age, gender identity or sexual orientation, and other attributes that make each Grabber unique.

Forward together

Follow us and stay updated!

Equal opportunity

Grab is an equal opportunity employer. We owe our success to the talents of our globally diverse team and the varying perspectives they add to our thriving community.

Recruitment agencies

Grab does not accept unsolicited resumes sent by recruiting agencies. Please do not forward resumes to our job postings, Grab employees or other parts of the business. Grab will not be liable to pay any fees to agencies for candidates hired as a result of unrequested resumes.