skillindiajobs

VP-Software Engineering Team Lead

1.00 to 5.00 Years   Hyderabad   06 Dec, 2022
Job Location: Hyderabad
Education: Not Mentioned
Salary: Not Disclosed
Industry: Banking / Financial Services
Functional Area: SBU Head / CEO / Director
Employment Type: Full-time

Job Description

    Core Banking provides technology support for applications across all lines of business (Auto/Education, Business Banking, Card Services, Centralized Transactions Operations, Commercial Banking, Consumer Bank, Consumer Internet Group, Credit Risk, Finance, Home Lending, Investment Bank, Treasury & Securities Services, Wealth Management). ESDD is the centralized data hub for Core Banking, providing high-volume, high-performance enterprise data integration (ETL) capabilities: extracting, transforming and distributing data from and across multiple databases, applications and platforms, both on-premises and in the cloud.

    A data integration developer is responsible for the following key areas:
    • Creation of ETL Data Pipeline processes to validate, transform, enrich, and integrate data
    • Adopt cutting-edge technologies such as cloud and containers as part of application evolution and modernization
    • Understand business requirements and collaborate with the architecture team to translate them into technical design
    • Participate in end-to-end development lifecycle activities of the application, including design, coding, testing and deployment activities.
    • Produce comprehensive tests for all developed code. Support and participate in system and integrated testing across sub-systems as the need arises.
    • Provide technical support for the application on a rotational basis, including meeting service level and performance requirements; and diagnosing and evaluating inefficient processes/code.
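The validate/transform/enrich/integrate responsibilities above can be sketched as a minimal pipeline. This is an illustrative Python sketch only — the record fields, validation rules, and function names are hypothetical, not part of any ESDD codebase:

```python
# Minimal illustrative ETL step: validate, transform, enrich, integrate.
# All record fields and rules here are hypothetical examples.

def validate(record):
    """Reject records missing an account id or with a non-numeric amount."""
    return "account_id" in record and isinstance(record.get("amount"), (int, float))

def transform(record):
    """Normalize currency amounts to cents (integer arithmetic avoids float drift)."""
    return {**record, "amount_cents": round(record["amount"] * 100)}

def enrich(record, reference):
    """Join in reference data, e.g. a line-of-business lookup by account id."""
    return {**record, "line_of_business": reference.get(record["account_id"], "UNKNOWN")}

def run_pipeline(records, reference):
    valid = (r for r in records if validate(r))
    return [enrich(transform(r), reference) for r in valid]

if __name__ == "__main__":
    raw = [
        {"account_id": "A1", "amount": 12.5},
        {"amount": 3.0},                        # dropped: no account id
        {"account_id": "A2", "amount": "bad"},  # dropped: non-numeric amount
    ]
    lob = {"A1": "Consumer Bank"}
    print(run_pipeline(raw, lob))
```

In a production ETL tool the same three stages would be graph components rather than functions, but the validate-before-transform ordering shown here is the common shape.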
    Required Skills:
    • Bachelor's degree or equivalent in Computer Science, Engineering (any), or a related field.
    • 12+ years of experience in the design and delivery of ETL solutions using Java (preferred) and Spark (basic)
    • 1+ years of experience with cloud deployment (AWS, Kubernetes), Kafka, CI/CD, and automation
    • Knowledge of application, design-patterns, data and infrastructure architecture disciplines
    • Experience in UNIX and/or AIX operating environment and UNIX shell scripting is required
    • Ability to understand, modify and write SQL queries
    • Working experience as an Agile developer and a good understanding of SDLC methodologies/guidelines
    • Knowledge of big data technologies like Hadoop/HIVE/Spark
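The "understand, modify and write SQL queries" requirement above is routine work in this role. A small self-contained illustration using Python's built-in sqlite3 module (the table and column names are invented for the example; production work would target Oracle or SQL Server):

```python
# Illustrative SQL exercise using Python's built-in sqlite3 module.
# Table and column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (id INTEGER PRIMARY KEY, account TEXT, amount REAL);
    INSERT INTO transactions (account, amount) VALUES
        ('A1', 100.0), ('A1', 250.0), ('A2', 75.0);
""")

# A typical query a data-integration developer writes: per-account totals,
# filtered with HAVING and ordered for deterministic output.
rows = conn.execute("""
    SELECT account, SUM(amount) AS total
    FROM transactions
    GROUP BY account
    HAVING SUM(amount) > 100
    ORDER BY account
""").fetchall()
print(rows)  # only accounts whose total exceeds 100
```

The GROUP BY / HAVING distinction (filter after aggregation, not before) is exactly the kind of query-modification skill the requirement refers to.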
    Ideal Candidate Attributes
    • Self-starter with a strong work ethic
    • Exhibits leadership
    • Possesses analytical and problem-solving skills
    • Strong organizational and prioritizing skills
    • Ability to multi-task and handle multiple priorities
    • Embraces team-based approach
    • Proactive, with strong, clear communication skills
    • Strives for Continuous Improvement
    Ideal candidates should possess:
    • Basic concepts of data integration architecture
    • Basic understanding of data modeling concepts
    • A good grasp of various components of standard ETL Data Pipeline tools (preferably Ab Initio)
    • Design and development experience with data integration solutions using an ETL tool (preferably Ab Initio)
    • Prior experience in working with data pipelines in a cloud or Kubernetes environment
    • Working knowledge of Oracle, especially relating to PL/SQL
    • Understanding of Big data technologies including but not limited to Spark, Hadoop, Hive, Impala and HBase
    • Hands-on experience with Spark processing, including debugging graphs and logs
    • Demonstrable knowledge of UNIX operating systems and shell scripting
    • Working knowledge of scheduling tools such as Control-M, Autosys
    • Strong grasp on SDLC concepts including development best practices, code migration procedures and test automation
    • Ability to work in an Agile development environment against challenging project timelines
    • BS/BA degree or equivalent experience
    Additionally, an understanding of at least some of these technologies and practices is beneficial:
    • Knowledge of test automation frameworks such as Cucumber
    • Working knowledge of Kafka as a distributed publish-subscribe system
    • Experience with modern big data consumption technologies such as Dremio
    • Working knowledge of Python, Java and Scala
    • Modern web technologies such as NodeJS, AngularJS, React, Redux, or Typescript
    • Understanding of modern cyber security practices such as Kerberos authentication
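The "Kafka as a distributed publish-subscribe system" item above refers to its core model: an append-only log per topic, with each consumer group tracking its own read offset. Real Kafka requires a broker and a client library; the toy in-memory Python sketch below only illustrates that model and is not Kafka's actual API:

```python
# Toy in-memory sketch of Kafka's publish-subscribe model: an append-only log
# per topic, with consumer groups tracking their own offsets. This is an
# illustration of the concept, not Kafka's real API.
from collections import defaultdict

class MiniLog:
    def __init__(self):
        self.topics = defaultdict(list)   # topic -> append-only message list
        self.offsets = defaultdict(int)   # (group, topic) -> next offset to read

    def publish(self, topic, message):
        self.topics[topic].append(message)

    def poll(self, group, topic):
        """Return this group's unread messages and advance its offset."""
        log = self.topics[topic]
        start = self.offsets[(group, topic)]
        self.offsets[(group, topic)] = len(log)
        return log[start:]

if __name__ == "__main__":
    bus = MiniLog()
    bus.publish("payments", {"id": 1})
    bus.publish("payments", {"id": 2})
    print(bus.poll("etl-group", "payments"))  # both messages on first read
    print(bus.poll("etl-group", "payments"))  # empty: offset already advanced
```

The key property mirrored here is that messages are retained in the log after delivery, so independent consumer groups can each read the full stream at their own pace.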

Keyskills :
agile delivery framework, reporting, SQL Server, UNIX shell scripting, big data, ETL tool, credit risk, data modeling, service level, cyber security, shell scripting


© 2020 Skillindia All Rights Reserved