skillindiajobs

Big Data with ETL

Experience: 7.00 to 10.00 Years
Job Location: Hyderabad
Posted: 15 Mar, 2021
Education: Not Mentioned
Salary: Not Disclosed
Industry: Banking / Financial Services
Functional Area: General / Other Software
Employment Type: Full-time

Job Description

Responsibilities:
- Ownership, maintenance and support of CCB Data Ecosystem Hadoop implementations.
- Coordination of platform maintenance, including change and release management (planned and emergency deployments, configuration and patching).
- Participation in resiliency testing: regular and systematic testing of application redundancy and failover.
- Supporting application teams on data ingestion into the big data platform using HDF and other HDP components (Kafka, Spark, Hive, etc.).
- Supporting application deployment, data storage and access authorization (Hive, HBase and Ranger).
- Supporting application teams on deployments, failed jobs, log analysis and access-related issues.
- Global end-user support on big data platforms; ensuring availability and reliability of data and analytics platforms.
- Real-time system monitoring (custom and off-the-shelf tools); engineering and implementing custom monitoring solutions where needed.
- Working with big data teams from other divisions supporting different layers of the platform stack.
- Monitoring the Hadoop environment/cluster, jobs, builds and deployments, with the ability to debug issues from Hadoop logs; working with development teams and contributing to the strategic roadmap and its execution.
- Supporting process improvements to continuously improve the stability and performance of the platform.
- Knowledge transfer to the India and US teams; participation in runbook development.
- Off-hours support on a rotational basis.
- End-to-end incident and problem resolution for the responsible LOBs.
- Identifying system bottlenecks and opportunities for process improvement.
- Effectively performing root cause analysis of issues and reporting the outcome to the business community and management.
- Handling user onboarding and automation.
- General operational expertise: strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage and networks.
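The log-analysis and monitoring duties above can be illustrated with a minimal sketch in Python (one of the posting's required skills). The log layout and component names below are assumptions for illustration, not details from the posting:

```python
import re
from collections import Counter

# Assumed log4j-style layout: "<date> <time> <LEVEL> <component>: <message>".
# Real Hadoop deployments vary, so the pattern would need adjusting.
LOG_LINE = re.compile(r"^\S+ \S+ (?P<level>[A-Z]+) (?P<component>[\w.]+): (?P<msg>.*)$")

def count_errors(lines):
    """Count ERROR entries per logging component for quick triage."""
    errors = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            errors[m.group("component")] += 1
    return errors

# Hypothetical sample lines for demonstration.
sample = [
    "2021-03-15 10:00:01 INFO org.apache.hadoop.yarn.RM: app accepted",
    "2021-03-15 10:00:05 ERROR org.apache.hadoop.hdfs.DataNode: disk failure",
    "2021-03-15 10:00:09 ERROR org.apache.hadoop.hdfs.DataNode: disk failure",
]
print(count_errors(sample))  # DataNode errors dominate in this sample
```

A triage pass like this is typically the first step before deeper debugging of failed jobs on the cluster.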
Preferably a candidate with a Site Reliability Engineer (SRE) mindset who has made key contributions to, or has working experience on, the Hadoop stack and Java (preferred). Able to build a meaningful engineering discipline, combining software and systems to develop creative engineering solutions to operations problems. Key areas of focus include automation, application/platform uptime and application deployment using CI/CD pipelines. The position will wear many hats, from owning day-to-day health and performance to identifying incidents and developing remediation plans.

Experience required:
- B.Tech degree in Computer Science.
- At least 7 years of experience as a participant in data engineering efforts.
- Exposure to the banking businesses, functions, systems, data environments and processes necessary for production.

Must-have skills:
- Hadoop, Spark, Hive, Impala, Hue, SQL, Java/Scala, Python, Unix shell scripting.
- Comfortable querying and analyzing large amounts of data using Hive.
- Monitoring of the Hadoop environment/cluster, jobs, builds and deployments, with the ability to debug issues from Hadoop logs.
- Expertise in administering Hadoop infrastructure, troubleshooting job failures, and providing performance tuning and recommendations.
- Experience with ETL tools such as Informatica and Ab Initio.
- Experience troubleshooting Kerberos authentication problems.
- Familiarity with Hadoop security and permission schemes.
- Knowledge of LDAP-related tasks/activities and Linux administration.

Good to have:
- Knowledge of AWS and data migration to cloud platforms, Python scripting, Ansible, and CDH-to-CDP migration projects.
- Knowledge of Angular/React is an added advantage.
- Design, coding, debugging and analytical skills, especially within the big data ecosystem.
- Understanding of risk and governance functions for the Hadoop platform.
- Strong communication skills, with prior experience in requirements gathering and working with SMEs and stakeholders at various leadership levels.
- Excellent customer service attitude; strong written and verbal communication skills; experience with cross-functional, multi-location teams; strong interpersonal skills.
- Strong technical documentation skills.
- Knowledge of or experience with dashboards, KPIs, scorecards, and operational and reporting tools.
- Experience performing root cause analysis and problem management across multiple layers of the application stack.
- Strong command of the SQL language and understanding of relational databases.

Keyskills:
end user support, root cause analysis, unix shell scripting, strong communication skills, big data, etl tools, root cause, user support, log analysis


© 2020 Skillindia All Rights Reserved