Key Responsibilities:
GCP Solution Design and Scoping:
Lead the design and implementation of GCP solutions, ensuring alignment with business objectives and technical requirements.
Scope project requirements and deliverables, and develop detailed technical plans.
Technical Architecture and Design:
Create detailed target-state technical architecture and design blueprints.
Ensure that architectural designs meet scalability, security, and performance requirements.
Technical Discovery and Analysis:
Conduct full technical discovery, identifying pain points, business and technical requirements, and “as is” and “to be” scenarios.
Analyze and document existing data infrastructure, identifying opportunities for improvement.
Data Migration and Pipeline Implementation:
Lead data migration efforts from legacy systems to GCP, ensuring minimal disruption and data integrity.
Design, develop, and implement data pipelines, establishing best practices, guidelines, and appropriate reference architectures.
Ensure efficient and reliable data ingestion, processing, and consumption.
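To make the pipeline expectations concrete, below is a minimal PySpark batch sketch: ingest raw CSV files from Cloud Storage, apply basic cleansing, and publish the result to BigQuery. It assumes a Dataproc cluster with the spark-bigquery connector available; all bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal batch pipeline sketch: ingest raw CSV from Cloud Storage,
# apply a simple transformation, and publish to BigQuery.
# Assumes a Dataproc cluster with the spark-bigquery connector available;
# bucket, dataset, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .csv("gs://example-raw-zone/orders/2024-01-01/*.csv")  # hypothetical path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount").cast("double") > 0)
)

(
    cleaned.write.format("bigquery")
    .option("table", "example_project.analytics.orders")  # hypothetical table
    .option("temporaryGcsBucket", "example-temp-bucket")  # staging bucket for indirect writes
    .mode("overwrite")
    .save()
)
```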
Minimum Qualifications:
Experience:
Minimum of 5 years of experience in designing and building production data pipelines.
Proven experience with hybrid big data architectures using GCP, Hadoop, Hive, HDFS, HBase, Spark, etc.
Programming Languages:
Expertise in at least one of the following programming languages: Scala, Java, or Python.
Strong skills in PySpark, including structured streaming and batch patterns (an illustrative streaming sketch follows this list).
Basic working knowledge of Java.
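As referenced above, here is a brief structured-streaming sketch in PySpark: pick up new JSON files landing under a Cloud Storage prefix, aggregate per five-minute event-time window, and append results as Parquet. The paths and schema are hypothetical placeholders.

```python
# Illustrative structured-streaming sketch: tail new JSON files landing in a
# Cloud Storage prefix, aggregate per 5-minute event-time window, and append
# results to Parquet. Paths and schema are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("events-streaming").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.schema(schema)
    .json("gs://example-landing-zone/events/")  # hypothetical landing prefix
)

# Watermark bounds state so append mode can emit finalized windows.
windowed = (
    events.withWatermark("event_ts", "10 minutes")
    .groupBy(F.window("event_ts", "5 minutes"))
    .agg(F.sum("amount").alias("total_amount"))
)

query = (
    windowed.writeStream.outputMode("append")
    .format("parquet")
    .option("path", "gs://example-curated-zone/event_totals/")
    .option("checkpointLocation", "gs://example-checkpoints/event_totals/")
    .start()
)
query.awaitTermination()
```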
Technical Skills:
In-depth understanding of GCP services such as Dataproc and Cloud Composer (managed Airflow).
Experience with GCP data transfer tools and strong hands-on execution knowledge of BigQuery (a brief load sketch follows this list).
Familiarity with GCP CI/CD tooling and the Cloud SDK.
Proficient in handling data lakes, designing and building data warehouse ETL, and migrating data from legacy systems such as Hadoop, Exadata, Oracle, Teradata, or Netezza.
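As a concrete example of the BigQuery execution knowledge expected, here is a minimal load sketch using the google-cloud-bigquery client: Parquet files exported from a legacy warehouse to Cloud Storage are loaded into a BigQuery table. Project, bucket, and table names are hypothetical.

```python
# Minimal sketch: load legacy-warehouse exports (Parquet in Cloud Storage)
# into BigQuery. Project, bucket, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-staging/teradata_export/orders/*.parquet",  # hypothetical export path
    "example-project.analytics.orders",  # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows.")
```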
Must-Have Skills:
GCP Services:
Proficiency with BigQuery, Cloud Storage, Bigtable, Dataflow, Dataproc, Cloud Composer, Cloud Pub/Sub, and Data Fusion.
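To illustrate how these services typically fit together, below is a minimal Cloud Composer (Airflow 2.x) DAG sketch that submits a PySpark job to an existing Dataproc cluster on a daily schedule. The project, region, cluster name, and file URIs are hypothetical placeholders.

```python
# Minimal Cloud Composer (Airflow 2.x) DAG sketch: submit a PySpark job to an
# existing Dataproc cluster once a day. IDs and URIs are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-code/jobs/orders_batch.py"},
}

with DAG(
    dag_id="orders_daily_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    DataprocSubmitJobOperator(
        task_id="submit_orders_batch",
        job=PYSPARK_JOB,
        region="us-central1",       # hypothetical region
        project_id="example-project",
    )
```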
Programming Languages:
Expertise in Scala, Java, or Python.
Big Data Technologies:
Extensive experience with Hadoop, Hive, HDFS, HBase, and Spark.
Preferred Qualifications:
Experience with data migration tools and best practices.
Strong problem-solving skills and the ability to troubleshoot complex data issues.
Excellent communication skills and the ability to work collaboratively in a team environment.
Senior Data Engineer (GCP)
Experience: 5+ Years
Type: Full Time
Location: Bhubaneswar / Hyderabad / Pune
Notice period: Immediate to 15 days
Budget: 12-15 LPA
Technology: IT