Data and innovation are at the heart of how Airtel has become the leader in the telecom space in India.
The DevOps Engineer will join a team that builds Big Data solutions to support our customers (internal and external) with new visibility applications, dashboards, and portals. The candidate should have practical, hands-on experience in automation, configuration management, and debugging/troubleshooting of modern cloud/container-based systems and Big Data applications. You will work with development engineers to get more done faster with less effort, building continuous integration pipelines with continuous delivery tools such as Jenkins.
What is in store for you:
- Automation is at the core of this practice.
- Own Continuous Integration (CI), Continuous Testing, and Continuous Delivery (CD).
- Deploy, manage, and maintain Big Data pipelines built on technologies such as Kafka, Elasticsearch, Druid, Envoy, Spark, HDFS, Zeppelin, Kubernetes, and Docker containers.
- Keep the Big Data pipeline and infrastructure up 24x7.
- Monitor the pipeline and take proactive measures to keep it up and healthy.
- Quickly debug and resolve issues.
- Develop automation scripts for repetitive tasks such as rebalancing Kafka partitions, reclaiming disk space, and automated code deployment.
- Write scripts to automate tests and validate the data pipeline.
- Automate and streamline software development and infrastructure management processes.
- Application Performance Monitoring and logging.
- Communication and collaboration.
- Understand the organisation's needs and apply solutions, tools, and standard methodologies, addressing both immediate needs and a long-term vision.
- Define standard methodologies, and execute and excel as a DataOps center of excellence.
- Help build, set up, and manage Agile Data Infrastructure and Agile Data Operations.
What we are looking for:
- Bachelor's or Master's degree in engineering, or equivalent.
- 1-4 years of experience on fast-paced, high-volume, scalable platforms.
- Experience operating large-scale data infrastructure: Kafka, Druid, Elasticsearch, Spark, HDFS, etc.
- Knowledge of application monitoring tools to troubleshoot and diagnose environment issues.
- Experience with Jenkins and CI/CD.
- Proficiency in Shell, Python, and Go programming.
- Solid understanding of operating systems and networking principles.
- Good understanding of Linux.
- Plus: familiarity with automation tools such as Helm charts, Ansible, etc.
- Plus: experience with microservices and containerized technologies (Docker, Kubernetes, etc.).
- Plus: experience with modern web architectures and cloud platforms (AWS, GCP, Azure, etc.).