Kafka Architect

5.00 to 8.00 Years   Hyderabad   21 Dec, 2020
Job Location: Hyderabad
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT - Software
Functional Area: General / Other Software
Employment Type: Full-time

Job Description

Role Summary

As a Big Data Architect, you will implement several projects for our clients and become their go-to person. You should be a specialist in data and cloud technologies with excellent consultative skills, and you will work with a range of data technologies such as Azure, AWS, Kafka, Spark, and Hadoop platforms.

Primary Responsibility

  • Full lifecycle implementation from requirements analysis, platform selection and setup, technical architecture design, application design and development, testing, and deployment.
  • Provide consultative recommendations to clients to solve their Big Data challenges.
  • Work as an individual contributor and/or a team player, depending on the project.

Desired Skills, Attributes & Experience

  • A minimum of 10 years of experience in systems and solution consulting.
  • A minimum of 5 years of experience in cloud, data management platforms, and streaming-related technologies.
  • A minimum of 2 years of experience as a Data Architect.
  • The ideal candidate should have both platform consulting & implementation experience and solution consulting & implementation experience.

Skills Required

Must Have

  • Experience in Kafka installation and administration, and deep knowledge of Kafka internals
  • Experience with Kubernetes and the Confluent Operator
  • Experience with physical data modelling and serialization formats such as Apache Avro
  • Strong experience in implementing software solutions in enterprise Linux or Unix environments
  • Experience with data ingestion tools, streaming, Spark, Hadoop, etc.
  • Hands-on experience in setting up and running Azure and AWS data platforms, Hadoop, streaming clusters, etc.
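Deep knowledge of Kafka internals includes, for example, understanding how the default producer partitioner maps a record key to a partition. Below is a minimal sketch in Python of that logic (a 32-bit murmur2 hash of the key bytes, masked to a positive value, mod the partition count), mirroring the algorithm used by the Apache Kafka Java client's default partitioner; it is an illustration, not the production client.

```python
def murmur2(data: bytes) -> int:
    """Pure-Python port of the 32-bit murmur2 hash used by the Kafka Java client."""
    length = len(data)
    seed = 0x9747B28C
    m = 0x5BD1E995
    h = (seed ^ length) & 0xFFFFFFFF

    # Mix the key four bytes (one little-endian 32-bit word) at a time.
    for i in range(0, length - length % 4, 4):
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * m) & 0xFFFFFFFF
        k ^= k >> 24
        k = (k * m) & 0xFFFFFFFF
        h = ((h * m) & 0xFFFFFFFF) ^ k

    # Fold in the remaining 1-3 tail bytes.
    tail = length - length % 4
    rem = length % 4
    if rem >= 3:
        h ^= data[tail + 2] << 16
    if rem >= 2:
        h ^= data[tail + 1] << 8
    if rem >= 1:
        h ^= data[tail]
        h = (h * m) & 0xFFFFFFFF

    # Final avalanche mixing.
    h ^= h >> 13
    h = (h * m) & 0xFFFFFFFF
    h ^= h >> 15
    return h


def partition_for_key(key: bytes, num_partitions: int) -> int:
    """Default keyed partitioning: positive murmur2 value mod partition count."""
    return (murmur2(key) & 0x7FFFFFFF) % num_partitions
```

Because the mapping is deterministic, every record with the same key lands on the same partition, which is what preserves per-key ordering; note that it also means changing the partition count of a topic reshuffles which partition a key maps to.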

Should Have

  • The ideal candidate should have both platform consulting & implementation experience and solution consulting & implementation experience, as detailed below.

Platform Consulting & Implementation Experience

  • Extensive background in systems architecture, clustering & distributed systems, programming, security, networking & load balancing, monitoring, scripting, and automation, gained in large mission-critical infrastructures.
  • Experience with cloud platforms such as AWS and Azure, including hands-on working knowledge of cloud-native data components.
  • Experience integrating security solutions such as LDAP, Kerberos, or SPNEGO, or system/installation management tools, into the overall solution.
  • Strong understanding of operating systems, network configuration, devices, protocols, and performance optimization.
  • Understanding of configuration management systems, DevOps, and automation (e.g. Ansible, Puppet, Chef).

Solution Consulting & Implementation Experience

  • Demonstrable experience in gathering and understanding customers' business requirements, then solutioning and implementing them.
  • Strong programming experience in any programming language, covering design patterns, algorithms, ETL/ELT pipeline design, and data analytics.
  • In-depth work experience with messaging, data ingestion, and stream and batch processing.
  • Experience with one or more visualisation tools such as QlikView or Tableau.
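The ETL/ELT pipeline design skills above can be illustrated with a minimal batch ETL sketch in plain Python. The function names, the toy CSV input, and the in-memory "store" are illustrative assumptions, not a specific framework: a real pipeline would swap in actual sources and sinks behind the same extract/transform/load stages.

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(records: list[dict]) -> list[dict]:
    """Transform: cast types, drop malformed rows, normalize a field."""
    out = []
    for rec in records:
        try:
            amount = float(rec["amount"])
        except (KeyError, ValueError):
            continue  # skip rows with a missing or non-numeric amount
        out.append({"city": rec["city"].strip().title(), "amount": amount})
    return out


def load(records: list[dict], target: dict) -> dict:
    """Load: aggregate amounts per city into the target store."""
    for rec in records:
        target[rec["city"]] = target.get(rec["city"], 0.0) + rec["amount"]
    return target


raw = "city,amount\nhyderabad,120.5\npune,80\nhyderabad,oops\npune,20\n"
store = load(transform(extract(raw)), {})
print(store)  # {'Hyderabad': 120.5, 'Pune': 100.0}
```

Keeping the three stages as separate pure functions is the design point: each stage can be unit-tested in isolation, and the malformed-row policy lives in one place (here, silently dropping; a production pipeline would typically route rejects to a dead-letter store instead).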

Nice to Have

  • Azure, AWS, Hortonworks, MapR or Cloudera certifications.
  • Experience in installation, upgrade, and management of multi-node Hadoop clusters using distributions such as Hortonworks, Cloudera, or MapR.
  • Experience in securing Hadoop clusters using technologies such as Ranger, Sentry, Kerberos, and data encryption in transit and at rest.
  • Experience in managing Hadoop clusters and performing daily operations to keep a cluster running with minimal downtime.
  • Experience in Scala, Spark implementation projects.

Position Requirements

  • Ability to travel up to 50% of the time
  • Bachelor's or Master's degree in computer technology

Keyskills: music making, programming languages, load balancing, distributed systems, big data, mission critical, pipeline design, architectural design, design patterns, data management, management systems, application design, business requirements, computer technology, software s

© 2020 Skillindia All Rights Reserved