
Executive

1 to 3 Years | Hyderabad | 21 Oct, 2021
Job Location: Hyderabad
Education: Not Mentioned
Salary: Not Disclosed
Industry: NBFC (Non-Banking Financial Services)
Functional Area: Operations Management / Process Analysis
Employment Type: Full-time

Job Description

Who Do We Need

  • Experience with data lake and data warehouse ETL build and design
  • Minimum of 3 years designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam/Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub
  • Minimum of 1 year designing and building data pipelines from data ingestion to consumption within a hybrid big data architecture, using cloud-native GCP services, Java, Python, Scala, SQL, etc. (a minimal pipeline sketch follows this list)
  • Bachelor's degree or equivalent work experience (minimum 2 years)
Bonus Points If You Have
  • An active Google Cloud Data Engineer certification or Google Professional Cloud Architect certification
  • Data migration experience from legacy systems, including Hadoop, Exadata, Oracle, Teradata, or Netezza
  • Minimum of 1 year of hands-on GCP experience, with at least one end-to-end solution designed and implemented at production scale
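For illustration, here is a minimal sketch of the kind of pipeline the role describes: a streaming Apache Beam job, runnable on Cloud Dataflow, that reads events from Cloud Pub/Sub and writes them to BigQuery. The project, topic, table, and schema names are hypothetical placeholders, not part of this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # Streaming mode so the pipeline consumes Pub/Sub continuously.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (
                p
                # Hypothetical topic; replace with a real Pub/Sub topic path.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")
                # Each message payload is assumed to be a JSON object.
                | "ParseJson" >> beam.Map(json.loads)
                # Hypothetical destination table and schema.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",
                    schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()

A production version would add error handling for malformed messages (e.g., a dead-letter output) and run with the DataflowRunner; this sketch only shows the ingestion-to-consumption shape the requirement refers to.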

Keyskills:
source system analysis, big data, cloud storage, data solutions, legacy systems, enterprise data, data architecture, SQL, ETL, GCP, Java, Hive, lake, Spark, cloud, Scala, Hadoop, Oracle, Python, design
