Roles and Responsibilities
1. In-depth understanding of Hadoop and Spark architecture and their components, such as HDFS, Job Tracker, Task Tracker, executor cores, and memory parameters.
2. Experience in Hadoop development; working experience with Spark and Scala is mandatory, and database exposure is a must.
3. Experience in code optimization to fine-tune applications.
4. Expertise in writing Hadoop/Spark jobs for analyzing data using Spark, Scala, Hive, Kafka, and Python.
5. Experience with streaming workflow operations.
6. Experience developing large-scale distributed applications and solutions to analyze large data sets efficiently.
7. Experience integrating with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions; experience in data warehousing and ETL processes.
8. Strong database, SQL, ETL, and data analysis skills.
9. Experience analyzing large data sets to determine the optimal way to aggregate and report on them.
10. Experience scheduling jobs using Airflow/ESP.
Desired Candidate Profile
1. Strong communication skills to work effectively with the business.
2. Experience working with remote teams.
3. Become a trusted partner to the architects through regular engagement and proposals on best development practices that can be implemented to improve the process.
4. Excellent performance and technical skills; attend required training and support peers through knowledge sharing and mentoring.
5. Take up training programs within your team to ensure understanding of BDF.
Salary: Not Disclosed by Recruiter
Industry: IT Services & Consulting
Role Category: Programming & Design
Employment Type: Full Time, Permanent
UG: Any Graduate in Any Specialization
PG: Any Postgraduate in Any Specialization
Doctorate: Doctorate Not Required
CHANGE LEADERS CONSULTING