Pacific Consultancy Services

Data Engineer

Contract
Remote
8 - 12 Years
Apr 30th, 2026
Required Skillset:
Python, Git, Snowflake, Apache Airflow, Databricks, SQL, PySpark

Job Description

Title: Data Engineer
Location: 100% Remote
Duration: 12+ Months

Role Summary:
We are seeking an experienced Data Engineer to design, build, and optimize scalable, high‑performance data pipelines using Databricks, Apache Airflow, Snowflake, Python, and SQL.
The role involves end‑to‑end ownership of data ingestion, transformation, orchestration, and optimization across cloud‑based data platforms, enabling analytics, reporting, and downstream data science use cases.

Key Responsibilities:
Data Engineering & Pipeline Development:
· Design, develop, and maintain batch and streaming data pipelines using Databricks (PySpark) and Snowflake.
· Build ETL / ELT frameworks to ingest data from multiple sources (RDBMS, APIs, flat files, cloud storage); a minimal sketch follows this list.
· Implement data transformation logic using Python and SQL for scalable, high-volume datasets.
· Develop metadata-driven and reusable pipelines following enterprise data engineering best practices.
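
As a rough illustration of the pipeline work above, here is a minimal PySpark batch-ingestion sketch; the storage path, column names, and target table are hypothetical assumptions, not details from this role.

# Minimal PySpark ingestion/transformation sketch (illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest raw CSV files from cloud storage (hypothetical bucket).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3://example-bucket/raw/orders/"))

# Transformation logic in Python: type casting and deduplication.
clean = (raw
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .dropDuplicates(["order_id"]))

# Persist as a Delta table for downstream consumption.
clean.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")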

Workflow Orchestration:
· Create and manage complex workflows using Apache Airflow.
· Implement scheduling, dependency management, retries, alerts, and failure handling (see the DAG sketch after this list).
· Integrate Airflow with Databricks jobs, Snowflake tasks, and cloud services.
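
A minimal Airflow DAG sketch for the scheduling, retry, and alerting items above; the task bodies and alert address are hypothetical placeholders.

# Illustrative Airflow DAG: scheduling, retries, alerts, dependencies.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract step (placeholder)")

def load():
    print("load step (placeholder)")

default_args = {
    "retries": 2,                          # retry each failed task twice
    "retry_delay": timedelta(minutes=5),   # back off between retries
    "email_on_failure": True,              # alert on failure
    "email": ["data-oncall@example.com"],  # hypothetical address
}

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",                     # daily scheduling
    default_args=default_args,
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load                    # dependency: load waits for extract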

Databricks & Lakehouse Architecture:
· Work on Databricks Lakehouse architecture, including Bronze / Silver / Gold (Medallion) layers.
· Optimize Spark jobs using partitioning, caching, broadcast joins, and performance tuning (see the sketch after this list).
· Manage Databricks jobs, clusters, notebooks, and workspace configurations.
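
A short sketch of the tuning patterns named above (broadcast join, caching, output partitioning); the table names are assumed for illustration.

# Illustrative Spark tuning patterns on a Lakehouse-style layout.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
orders = spark.table("silver.orders")
countries = spark.table("silver.countries")   # small dimension table

# Broadcast join: replicate the small table instead of shuffling the big one.
enriched = orders.join(F.broadcast(countries), "country_code")

# Cache a DataFrame that several downstream actions will reuse.
enriched.cache()

# Partition output by a commonly filtered column so later reads can prune.
(enriched.write.format("delta")
    .partitionBy("order_date")
    .mode("overwrite")
    .saveAsTable("gold.orders_enriched"))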

Snowflake Development:
· Design and optimize Snowflake schemas, tables, views, and warehouses.
· Implement Snowflake SQL transformations, performance tuning, and cost optimization.
· Work with Snowflake features such as Time Travel, Cloning, Tasks, and Streams where applicable (see the sketch after this list).
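
A sketch of the Snowflake features listed above, issued through the snowflake-connector-python library; the connection details and object names are hypothetical.

# Illustrative use of Time Travel, Cloning, Streams, and Tasks.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)")

# Zero-copy Cloning: instant dev copy without duplicating storage.
cur.execute("CREATE TABLE IF NOT EXISTS orders_dev CLONE orders")

# Stream: track inserts/updates/deletes on the source table.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE orders")

# Task: consume the stream on a schedule (new tasks start suspended;
# ALTER TASK merge_orders RESUME enables this one).
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_orders
      WAREHOUSE = ANALYTICS_WH
      SCHEDULE = '60 MINUTE'
    AS
      INSERT INTO orders_history SELECT * FROM orders_stream
""")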

Data Quality, Governance & Security:
· Implement data quality checks, validation frameworks, and reconciliation logic (see the sketch after this list).
· Ensure adherence to data governance, security, and compliance requirements.
· Collaborate with governance teams on metadata, lineage, and access controls.
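
A minimal example of the validation and reconciliation checks above, written in PySpark; the rules and table names are assumptions for illustration.

# Illustrative data quality checks: nulls, duplicates, reconciliation.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("silver.orders")

# Rule 1: the primary key must be non-null and unique.
assert df.filter(F.col("order_id").isNull()).count() == 0, "null order_id"
assert df.count() == df.select("order_id").distinct().count(), "duplicate keys"

# Rule 2: reconcile row counts against the upstream layer.
source_count = spark.table("bronze.orders").count()
assert df.count() <= source_count, "target exceeds source row count"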

CI/CD & Operations:
· Implement CI/CD pipelines for data code using Git-based version control systems (see the test sketch after this list).
· Support production deployments, monitoring, and incident resolution.
· Work closely with DevOps, Architecture, and Analytics teams.
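
As a small example of CI for data code, a pytest-style unit test of the kind a Git-based pipeline would run on every commit; the transformation under test is hypothetical.

# Illustrative unit test executed by a CI pipeline (e.g., on each push).
from pyspark.sql import SparkSession

def dedupe_orders(df):
    """Transformation under test: keep one row per order_id (hypothetical)."""
    return df.dropDuplicates(["order_id"])

def test_dedupe_orders():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [(1, "a"), (1, "a"), (2, "b")], ["order_id", "value"])
    assert dedupe_orders(df).count() == 2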

