Data Engineer (Python & GCP)
Job Description
Minimum Experience: 10+ Years
Location: Phoenix, Arizona (Hybrid)
Position: Contract
Note: Please do not submit AWS-focused Data Engineer candidates.
Key responsibilities
Design and build scalable ETL/ELT data pipelines for batch and real-time use cases using Python, Dataflow, Dataproc, Pub/Sub, Kafka, and related services.
Implement and optimize data warehousing and analytics solutions on BigQuery and Cloud Storage, following best practices in performance, cost, and reliability.
Architect, document, and troubleshoot cloud-native data platforms, ensuring high availability, monitoring, and CI/CD-based deployments.
Collaborate with business stakeholders to analyze requirements, translate them into technical designs, and deliver robust, production-grade data solutions.
Required experience
10+ years total IT experience with 6+ years in Data Engineering focused on data warehousing and analytics.
4+ years strong Python (including notebooks) and 4+ years building scalable data pipelines (extraction, transformation, loading).
4+ years on a major public cloud and 2+ years hands-on GCP (Dataflow, Dataproc, Cloud Composer/Airflow, BigQuery, Cloud Storage, GKE, Pub/Sub).
2+ years with Kafka, Pub/Sub, Docker, Kubernetes, and experience in architecture design and documentation.
Strong understanding of relational and dimensional data modeling, plus DevOps/CI/CD experience.
Soft skills
Ability to work independently in a hybrid setup, proactively solve problems, and keep stakeholders updated.
Excellent written and verbal communication skills, including technical documentation and interaction with senior business leaders.