Data Engineer

Remote, DC

Posted: 03/30/2026 | Industry: IT/Software Development | Job Number: 26-44614 | Pay Rate: 18-19 USD/Hour

Job Description

Role Summary
The Level 7 Data Engineer is responsible for designing, developing, and supporting business-critical real-time streaming data pipelines for event processing. Operating with a high level of independence, this role delivers scalable, resilient, and secure cloud-based data solutions that directly impact digital revenue and customer experience. The position collaborates closely with platform, analytics, and engineering teams in a fully remote environment.

Key Responsibilities
• Design, build, and maintain real-time streaming pipelines that process high-volume event data using Python, PySpark, Snowflake, and AWS.
• Develop scalable ingestion and transformation workflows leveraging Airflow (Astronomer), Informatica, and dbt (Core/Cloud).
• Optimize data models and warehouse structures in Snowflake to support low-latency analytics and operational reporting.
• Ensure reliability, scalability, and fault tolerance of business-critical streaming workflows.
• Implement CI/CD best practices using GitLab, and automate testing, deployment, and monitoring processes.
• Partner with cross-functional stakeholders to translate real-time digital commerce requirements into robust data engineering solutions.
• Proactively monitor production pipelines, troubleshoot incidents, and resolve performance bottlenecks with minimal supervision.
• Participate in on-call rotations to provide 24/7 support for critical (P1/P2) incidents affecting production systems.
• Enforce data governance, security controls, and data quality standards across ingestion and transformation layers.
• Leverage AI-enabled development tools and stay current on emerging AI trends to improve automation, documentation, code efficiency, and operational productivity.

Required Qualifications
• 4+ years of data engineering experience with strong hands-on expertise in Python, SQL, PySpark, Snowflake, and AWS services such as SQS, Kinesis, EventBridge, S3, and Lambda.
• Proven experience building and supporting real-time or near-real-time streaming data pipelines in production environments.
• Solid understanding of data modeling, ETL/ELT design patterns, CI/CD practices, and cloud-native architecture.
• Experience with Airflow (Astronomer), dbt, Informatica, and GitLab-based deployment workflows.
• Demonstrated ability to independently manage moderately complex initiatives supporting business-critical systems in a remote environment.

Preferred Qualifications
• Experience with eCommerce event-driven architectures and digital transaction ecosystems.
• Experience implementing data observability, monitoring, and automated data quality frameworks.
• Demonstrated application of AI tools to enhance engineering efficiency, workflow automation, and solution delivery.
