Chief Architect

Bengaluru

18 - 22 years
TOGAF
Cloud
Apache Spark
Apache Hadoop
DevOps
Machine Learning
HBase
Solution Architecture
NLP
Data Science

  About the opportunity

Role Responsibilities

  • Consult with business and technical representatives to analyze and document requirements
  • Provide architectural options and guidance to customers and internal teams on information management, big data architectures, and technologies
  • Lead and assist in the evolving design and implementation of solutions for multiple large data warehouses, with a solid understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS and/or NoSQL platforms
  • Build and troubleshoot big data and data management applications
  • Lead a team of developers
  • Keep up to date on data and analytics trends; present points of view on big data technologies and the ecosystem, and build consensus with stakeholders
  • Translate information management and analytics business use cases into industry-standard solutions leveraging the Hadoop ecosystem
  • Support administration activities on big data technologies
  • Provide and review project documentation
  • Demonstrate strong knowledge of data management and data warehousing concepts
  • Set up and configure Hadoop clusters and their ecosystem technologies
  • Define and drive best practices for the big data stack

Requirements

  • 12+ years of data warehousing experience, of which around 5 years is in big data technologies
  • Should have architected at least one large big data implementation
  • Ability to troubleshoot issues, problem-solve, and multi-task
  • Expertise in HDFS, YARN, MapReduce, Hive, HBase, Pig, Spark, Flume, and Sqoop
  • Expertise in programming languages such as Python, Scala, or Java
  • Knowledge of workflow schedulers such as Oozie
  • Experience with REST APIs, the HBase API, Kafka, and NiFi processors will be advantageous
  • Experience implementing Kerberos security on Hadoop clusters
  • Experience with Azure and/or AWS
  • Experience working for healthcare payer or provider organizations will be advantageous; knowledge of healthcare data sets is preferred
  • Strong experience with at least one data visualization and reporting tool such as Tableau, Power BI, or Cognos is a must
  • Strong background in the Linux operating system
  • Knowledge of big data administration methodologies and tools
  • Graduate or postgraduate degree in engineering
Chief Architect

ITC Infotech   •   Bengaluru