GCP Data Engineer

Experience Required: 4+ Years

Locations: Hyderabad / Pune

Open Positions: 5

Role Responsibilities:

  • Collaborate with technical stakeholders to deliver impactful business solutions
  • Work as part of an agile, multidisciplinary DevOps team
  • Migrate and re-engineer existing services from on-premises infrastructure to the cloud (GCP/AWS)
  • Understand business needs and provide scalable, real-time solutions
  • Use project tools such as Jira, Confluence, and Git throughout the development lifecycle
  • Develop Python and shell scripts to automate server operations
  • Build and maintain tools for system monitoring, notifications, and performance analysis
  • Create and manage operational procedures for deployments and maintenance
  • Document current and proposed configurations, procedures, and policies

Required Qualifications:

  • 4+ years of experience in Data Engineering with cloud platforms (preferably GCP)
  • Hands-on experience in agile DevOps environments
  • Proficiency with tools and technologies such as:
      • Hadoop, NiFi/Kafka
      • Python, Dataflow, Pub/Sub, BigQuery
      • GCP components: GCS, BigQuery, Airflow, Cloud SQL, Pub/Sub/Kafka, Google Cloud SDK
  • Working knowledge of Terraform and shell scripting
  • Experience with relational databases (RDBMS)
  • GCP Data Engineer Certification is a strong plus

Apply for this position
