H-1B Job Board

Finding companies that sponsor visas is a lot of work.
We've made your life easier by compiling top companies and startups that hire foreign nationals.

Senior GCP Network and Data Engineer

EPAM Systems

Data Science
Remote
Posted on Dec 20, 2024

Senior GCP Network and Data Engineer Description

We are seeking a highly capable Senior GCP Network and Data Engineer to join our dynamic team.

This role requires a seasoned professional who will focus on designing, implementing, and optimizing our network and data infrastructure on Google Cloud Platform (GCP), with some supporting work in Microsoft Azure. The successful candidate should be adept at full-scale data engineering and possess robust network-management skills in cloud environments.


Responsibilities

  • Design and implement secure, well-governed cloud infrastructures, primarily on GCP with supporting work in Azure
  • Develop and maintain scalable and robust cloud network architectures using GCP Networking
  • Design and execute data pipelines using PySpark and ensure data quality and governance
  • Utilize Google Cloud Dataflow and Google Cloud Pub/Sub for data processing and event-driven architectures
  • Implement infrastructure as code using Terraform for consistent and reproducible infrastructure deployment
  • Manage continuous integration and continuous deployment (CI/CD) processes for data pipelines
  • Troubleshoot and optimize SQL and Python applications
  • Collaborate with the team to enhance our Kubernetes environment, focusing on scalability and security
  • Advance the organization's capabilities in data modeling and improve existing data architectures
  • Maintain compliance with security best practices and company policies

Requirements

  • 3+ years of experience in network and data engineering
  • Deep understanding and skills in GCP cloud computing, networking, and infrastructure
  • Basic skills in Azure Networking and Azure Identity/Principal Management
  • Proficiency in Python, PySpark, and SQL
  • Familiarity with Databricks and advanced expertise in Kubernetes
  • Understanding of Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage
  • Capability to design and maintain data pipelines in PySpark
  • Competency in managing data quality and governance
  • Expertise in Infrastructure as Code using Terraform
  • Proficiency in CI/CD processes for data pipelines
  • Fluent English communication skills at a B2+ level

Nice to have

  • Background in data modeling

We offer

  • Competitive compensation depending on experience and skills
  • Variety of projects within one company
  • Participation in projects that follow engineering excellence standards
  • Individual career path and professional growth opportunities
  • Internal events and communities
  • Flexible work hours