Data Architect/Engineer
Job Description
Role: Data Architect/Engineer (Google Cloud Platform)
Location: Issaquah, Washington (Hybrid; PST Time Zone)
Visa: GC and USC
Key Responsibilities:
Data Pipeline Development: Design, build, test, and maintain scalable data pipelines and ETL processes using Python and GCP services (e.g., Dataflow, BigQuery, Pub/Sub); a representative pipeline is sketched after this list.
Data Integration & Modeling: Implement batch and real-time data integration workflows, and optimize data models and architecture for performance and storage efficiency.
Collaboration & Support: Work with cross-functional teams to gather data requirements and support data analysts with curated datasets and tools.
System Reliability: Monitor, troubleshoot, and tune data systems for high availability, scalability, and disaster recovery.
DevOps Enablement: Build and manage CI/CD pipelines using GitHub and Terraform; ensure security compliance and operational readiness.
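For illustration, a minimal Apache Beam (Python) streaming sketch of the kind of pipeline work described above: it reads JSON events from Pub/Sub and appends them to BigQuery, and would run on Dataflow when launched with the Dataflow runner. The project, topic, table, and field names are hypothetical placeholders, not details from this posting.

# Illustrative sketch only: hypothetical project, topic, and table names.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_message(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    record = json.loads(message.decode("utf-8"))
    return {"event_id": record["id"], "payload": json.dumps(record)}


def run() -> None:
    # Pass --runner=DataflowRunner, --project, --region, etc. at launch time.
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events"
            )
            | "ParseJson" >> beam.Map(parse_message)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()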
Mandatory Skills & Qualifications
Technical Expertise:
Strong Python programming and Spark experience for data analytics.
Proficient in GCP services: GCS, Dataflow, Cloud Functions, Composer, Scheduler, Datastream, Pub/Sub, BigQuery, Dataproc.
Skilled in Apache Beam for batch and stream processing.
Experience with REST API ingestion, JSON messaging, and scripting (Shell, Perl); a minimal ingestion example follows this list.
Deep understanding of SQL, cloud-native databases, and data warehousing concepts.
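As a small illustration of the REST API ingestion mentioned above, a Python sketch that pulls JSON from an HTTP endpoint and lands it in GCS for downstream loading into BigQuery; the endpoint URL, bucket, and object path are hypothetical placeholders.

# Illustrative sketch only: hypothetical endpoint, bucket, and object names.
import json

import requests
from google.cloud import storage


def ingest_to_gcs(api_url: str, bucket_name: str, object_name: str) -> None:
    """Fetch JSON from a REST endpoint and write it to a GCS object."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    records = response.json()

    client = storage.Client()  # uses Application Default Credentials
    blob = client.bucket(bucket_name).blob(object_name)
    blob.upload_from_string(json.dumps(records), content_type="application/json")


if __name__ == "__main__":
    ingest_to_gcs(
        "https://api.example.com/v1/orders",  # hypothetical endpoint
        "example-raw-zone",                   # hypothetical bucket
        "orders/2024-01-01/orders.json",      # hypothetical object path
    )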
Engineering & Migration:
Proven experience in migrating legacy systems to modern cloud-based architectures.
Familiarity with distributed computing frameworks and large-scale data handling.
DevOps & Security:
CI/CD pipeline development with GitHub and Terraform.
Security integration in deployment workflows.
Soft Skills:
Strong problem-solving and analytical abilities.
Excellent communication and teamwork skills.