Job Description:
- Build Big Data environments in cloud and on-premises settings
- Design and build distributed systems using container orchestration technology such as Kubernetes
- Develop tools and technologies to improve the analytics platforms
- Develop and maintain distributed Big Data services, components, and tools
- Resolve platform issues related to data processing and analytics
- Analyze complex distributed production deployments and make recommendations to optimize performance
Requirements:
- 3–5 years of experience
- Bachelor's degree in an information technology-related field of study, with a network engineering focus
- Proficient with code versioning tools such as Git and hosting platforms such as GitLab/GitHub
- Experience managing a high-uptime SaaS platform, with a focus on Kubernetes management
- Experience using container platforms such as Kubernetes for large-scale deployment of microservices
- Hands-on experience with Linux/Unix operating systems and DevOps tools such as Jenkins, Docker, Puppet, and Ansible, plus an understanding of CI/CD concepts
- Experience with at least one of OpenShift, TK8, Rancher, AKS, or EKS
- Good programming/scripting skills in at least one language such as Python, Java, C/C++, Scala, Bash, Korn shell, or Go (Golang)
Optional Requirements:
- Very strong in Kubernetes management, Docker, and Linux
- Very strong in observability (Prometheus/Grafana stack)
- Strong in Python and Go (Golang)
- Strong in Terraform and Helm
- Strong in networking and infrastructure
- Experience with service meshes such as Istio
- Track record in large-scale system software development
- Strong in public cloud architecture (AWS/Azure)
- Awareness of microservice architecture
- Experience with Docker Swarm, DC/OS, Cloud Foundry, OpenStack, CloudStack, AWS, GCP, or Azure
- Ability to solve complex networking, data, and software issues
- Experience designing and building data processing architectures
- Experience building automated, scalable data pipelines
- Experience designing and building scalable infrastructure and platforms to collect and process very large volumes of data, including real-time streaming data
- Experience setting up and administering applications in cloud environments
- Experience setting up and administering one or more major Hadoop distributions and various ecosystem components (e.g. HDFS, Sqoop, Impala, Spark, Flume, Kafka, NiFi)
- Experience with NoSQL and RDBMS databases, including Redis and MongoDB
Summary
| Salary Range | RM8,000 |
| Industry | IT |
| Location | Bangsar South, KL |
| Task ID | #1mkt272 |