Ideate Technologies

AWS Data Engineer

Ideate Technologies | Contract
Remote | US Citizen
10 - 20 Years | Apr 3rd, 2026
Required Skillset:
AWS Glue, SNS/SQS, Python/PySpark, Data Lake, CloudWatch, CloudTrail, DB Design, SQL

Job Description

Title: AWS Data Engineer
Location: Remote

 

Shashikar V​

+xxxxxxxxxxxxxxx EXT: 1024

Direct(WhatsApp): +xxxxxxxxxxxxxxx

xxxxxxxxxxxxxxx

linkedin.com/in/shashikar-v-0536161a6

 

US Citizens only
 

Primary Skills: AWS Glue, SNS/SQS, Python/PySpark, Data Lake, CloudWatch, CloudTrail, DB Design, SQL



Responsibilities:
•     Lead the team technically to complete milestones on time.
•     Understand the complete requirement, create the architecture, and keep all stakeholders updated.
•     Create POCs.
•     Manage delivery/releases to the customer.
•     Develop services to enable data ingestion from, and synchronization with, systems that expose the required data access mechanisms, ensuring near-real-time updates.
•     Ingest data from multiple sources using Python and other ETL tools.
•     Design and implement an event-driven architecture using AWS EventBridge, Kafka, or SNS/SQS for real-time data streaming.
•     Design, implement, and maintain scalable data pipelines that integrate both on-prem and AWS cloud environments.
•     Develop efficient Python scripts and applications using libraries like pandas and NumPy to handle and process large datasets.
•     Work with various NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB) to support high-performance data storage and retrieval.
•     Develop and deploy applications in a cloud-native architecture, leveraging modern cloud technologies for scalability and resilience.
•     Continuously monitor data workflows and systems, troubleshoot issues, and optimize performance for reliability and scalability.
•     Transition the existing pipeline to MSSQL Server.
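To illustrate the kind of Python/pandas data-processing work the responsibilities describe, here is a minimal, self-contained ingestion sketch. All names (`ingest_orders`, the column names, the sample records) are illustrative assumptions, not part of the role's actual systems:

```python
import io
import pandas as pd

def ingest_orders(raw_csv: str) -> pd.DataFrame:
    """Parse raw CSV records, coerce types, and derive a total column."""
    df = pd.read_csv(io.StringIO(raw_csv))
    # Normalize the timestamp column to timezone-aware UTC datetimes.
    df["order_ts"] = pd.to_datetime(df["order_ts"], utc=True)
    # Enforce numeric types before computing the derived column.
    df["quantity"] = df["quantity"].astype(int)
    df["total"] = df["quantity"] * df["unit_price"]
    return df

# Hypothetical sample input standing in for one of the "multiple sources".
raw = """order_id,order_ts,quantity,unit_price
A-1,2026-04-03T12:00:00Z,2,9.50
A-2,2026-04-03T12:05:00Z,1,19.00
"""
frame = ingest_orders(raw)
print(frame["total"].tolist())  # [19.0, 19.0]
```

In a real pipeline this transform would sit behind an event trigger (e.g., an SQS message pointing at a new S3 object) rather than an in-memory string, but the type-coercion and derived-column pattern is the same.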
