Job Location | Bangalore, Chennai, Hyderabad |
Education | Not Mentioned |
Salary | Rs 16 - 24 Lakh/Yr |
Industry | IT - Software |
Functional Area | Application Programming / Maintenance |
Employment Type | Full-time |
JOB DESCRIPTION - BIG DATA ARCHITECT

Be a part of a large enterprise-wide transformation initiative to streamline and set up a data platform on Google Cloud, covering data ingestion, data transformation, data storage, data management, data governance, etc.

Responsibilities:
- Define and collaborate on the strategy and implementation of master data management, data taxonomies, data stewardship processes, and data self-service.
- Collaborate with data and analytics principal architects to identify architectural gaps within the data space.
- Review requirements and data models, and translate them into responsible, scalable applications and architecture for a large-scale data processing system using Google Cloud Platform.

Requirements:
- Strong interpersonal and communication skills; proven ability to engage and influence technology and business stakeholders at most levels within an organization on complex technical and commercial concepts.
- Experience working on core development Big Data projects, with hands-on experience in HDFS, Hive, Spark, Kafka, etc.
- Experience architecting Big Data solutions at enterprise scale, with at least one end-to-end implementation.
- Strong understanding of, and experience with, the Hadoop ecosystem (HDFS, MapReduce, YARN, Spark, Hive, HBase, etc.).
- Knowledge of, and experience with, data warehouse ETL design and development methodologies.
- Strong knowledge of database platforms, database design patterns, and common data integration patterns.
- Experience with NoSQL databases and with real-time messaging and ingestion, including Kafka.
- Hands-on experience using Databricks Delta Lake.
- Experience handling data in any cloud; GCP will be a bonus.

BIG DATA ENGINEER - JOB DESCRIPTION

- Be a part of a team to design, create, and maintain an enterprise data lake / Delta Lake.
- Hands-on development of data ingestion, transformation, curation, and storage using tools such as Spark, HDFS, Hive, Databricks, Kafka, NoSQL databases, etc.
- Own, develop, test, deliver, and provide status on.

Job Requirements: Apache Spark, Databricks, Hive, Google Cloud Dataproc, Data Architecture, Migration Strategy and Planning, Big Data Architecture Document, Data Ingestion Design, Data Migration and Cutover Plan
Keyskills: Apache Spark, Data Architecture