
Hadoop Developer

Airtel X Labs
Gurgaon
1 - 6 years
Big data
Spark
Hadoop
Java
Python
Scala
HIRING EVENT
Address
Airtel center, Phase IV, Sector 18, Gurugram, Haryana, India

About opportunity

Job Responsibilities:

We expect our data engineers to build durable data pipelines that scale elegantly as data volumes grow. The code you write will enable our users to get data in a timely manner. You’ll work on a variety of tools and systems, most of which are cloud-based applications or data-platform components (e.g., data pipelines, processing, visualization, and reporting). In this role, you will work in an Agile development method, delivering fast, incremental wins while maintaining quality in the design, development, and operation of the BI components you build.
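
As a rough illustration of the pipeline structure described above (plain Python with illustrative stage and field names; a production pipeline would run on Spark or a similar engine), a durable pipeline is typically composed of small, lazy, restartable stages:

```python
from typing import Iterable, Iterator


def extract(rows: Iterable[str]) -> Iterator[dict]:
    """Parse raw CSV-style lines into records, skipping malformed input."""
    for line in rows:
        parts = line.strip().split(",")
        if len(parts) == 2:
            yield {"user": parts[0], "amount": float(parts[1])}


def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Keep only positive amounts and normalise user names."""
    for r in records:
        if r["amount"] > 0:
            yield {"user": r["user"].lower(), "amount": r["amount"]}


def load(records: Iterable[dict]) -> list:
    """Materialise the final dataset (a stand-in for a warehouse write)."""
    return list(records)


raw = ["Alice,10.5", "BOB,-3", "Carol,7", "broken line"]
result = load(transform(extract(raw)))
```

Because each stage is a generator, records stream through one at a time, so the pipeline's memory footprint stays flat as input volume grows.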

  • Design, architect, implement, and support key datasets that provide structured and timely access to actionable business information, always keeping the needs of the end customer in view.
  • Gather and process raw data at scale (including writing scripts, web scraping, calling APIs, and writing SQL queries).
  • Retrieve and analyze data using SQL, Excel, and other data management systems.
  • Create ETLs/ELTs that take data from various operational systems and craft a unified dimensional or star schema data model for analytics and reporting.
  • Develop a deep understanding of vast data sources (existing on the cloud) and know exactly how, when, and which data to use to solve particular business problems.
  • Load data from disparate, complex data sets.
  • Design, build, install, configure, and support Hadoop.
  • Translate complex functional and technical requirements into detailed designs.
  • Analyze vast data stores and uncover insights.
  • Maintain security and data privacy.
  • Take part in POC efforts to help build new Hadoop clusters.
  • Test prototypes and oversee handover to operational teams.
  • Propose best practices and standards.
  • Write high-performance, reliable, and maintainable code.
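
The star-schema bullet above can be made concrete with a toy example (using `sqlite3` from the Python standard library; the table and column names are illustrative, not an actual production model). Raw operational rows are split into a dimension table and a fact table, which an analytics query then joins back together:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer.
cur.execute(
    "CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT UNIQUE)"
)
# Fact table: one row per sale, referencing the dimension by surrogate key.
cur.execute(
    """CREATE TABLE fact_sales (
           sale_id INTEGER PRIMARY KEY,
           customer_id INTEGER REFERENCES dim_customer(customer_id),
           amount REAL)"""
)

raw_sales = [("alice", 10.0), ("bob", 5.0), ("alice", 2.5)]

for name, amount in raw_sales:
    # Upsert the dimension row, then record the fact against its key.
    cur.execute("INSERT OR IGNORE INTO dim_customer (name) VALUES (?)", (name,))
    cur.execute("SELECT customer_id FROM dim_customer WHERE name = ?", (name,))
    (cid,) = cur.fetchone()
    cur.execute(
        "INSERT INTO fact_sales (customer_id, amount) VALUES (?, ?)", (cid, amount)
    )

# A reporting query joins fact to dimension, as a BI tool would.
cur.execute(
    """SELECT c.name, SUM(f.amount)
       FROM fact_sales f JOIN dim_customer c USING (customer_id)
       GROUP BY c.name ORDER BY c.name"""
)
totals = cur.fetchall()
```

The same separation of descriptive attributes (dimensions) from measurable events (facts) is what the role's ETL/ELT work would apply at warehouse scale.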

Skills Required:

  • Expertise with tools in the Hadoop ecosystem, including Pig, Hive, HDFS, MapReduce, Sqoop, Storm, Spark, Kafka, YARN, Oozie, and ZooKeeper.
  • Solid experience building REST APIs, Java services, or Docker microservices.
  • Experience with data pipelines.
  • Good knowledge of database structures, theories, principles, and practices.
  • Familiarity with data loading tools such as Flume and Sqoop.
  • Knowledge of workflow schedulers such as Oozie.
  • Analytical and problem-solving skills applied to the big data domain.
  • Proven understanding of Hadoop, Hive, Pig, and HBase.
  • Good aptitude in multi-threading and concurrency concepts.
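
As a conceptual sketch of the MapReduce model named in the skills list (pure Python; a real job would run distributed on Hadoop/YARN), word count decomposes into map, shuffle, and reduce phases:

```python
from collections import defaultdict
from itertools import chain


def map_phase(doc: str) -> list:
    # Map: emit a (word, 1) pair for every word in the input split.
    return [(word.lower(), 1) for word in doc.split()]


def shuffle(pairs) -> dict:
    # Shuffle: group intermediate pairs by key, as the framework does
    # between the map and reduce tasks.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups


def reduce_phase(groups: dict) -> dict:
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}


docs = ["big data big pipelines", "data at scale"]
mapped = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle(mapped))
```

On a cluster, the map and reduce functions above would run in parallel across many nodes, with the framework handling the shuffle and fault tolerance.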