Advanced Data Engineer at NNE
NNE is seeking an experienced Advanced Data Engineer to join its growing data team. In this role, you will design, develop, and maintain robust data infrastructure and pipelines that support the company's analytics and business intelligence operations. You will work with cutting-edge technologies and collaborate with data scientists, analysts, and business stakeholders to drive data-driven decision-making across the company.
Key Responsibilities
Data Pipeline Development & Optimization
- Design and build scalable data pipelines using modern ETL/ELT frameworks.
- Develop efficient data transformation logic to process large volumes of structured and unstructured data.
- Monitor and optimize pipeline performance, reducing latency and improving throughput.
- Implement error handling, logging, and alerting systems to ensure reliability.
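To give a flavor of the pipeline work described above, here is a minimal, illustrative sketch of a daily ELT job in Apache Airflow (one of the frameworks named in the qualifications) with retries, logging, and a failure-alert hook. The DAG name and task callables are hypothetical placeholders, not part of the role description.

```python
# Illustrative Airflow DAG: a daily ELT job with retries, logging,
# and an alerting hook on failure. All callables are placeholders.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def extract(**_):
    # Pull raw records from a source system (placeholder).
    log.info("Extracting source data")


def transform(**_):
    # Apply transformation logic to the extracted data (placeholder).
    log.info("Transforming data")


def load(**_):
    # Load transformed data into the warehouse (placeholder).
    log.info("Loading data into the warehouse")


def notify_failure(context):
    # Alerting hook: in practice this might page on-call or post to chat.
    log.error("Task %s failed", context["task_instance"].task_id)


default_args = {
    "retries": 2,                        # basic error handling via retries
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,
}

with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```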
Data Architecture & Infrastructure
- Architect robust data warehouse and data lake solutions.
- Evaluate and implement appropriate storage solutions (cloud data warehouses, object storage, databases).
- Design schemas and data models that support analytics and reporting requirements.
- Establish data governance practices and enforce data quality standards.
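Likewise, the schema design and data quality work above might look something like the following sketch of a simple star schema with a referential-integrity check. It uses the standard-library sqlite3 driver purely to keep the example self-contained; in practice the DDL would target a warehouse such as Snowflake, BigQuery, or Redshift, and all table and column names here are hypothetical.

```python
# Illustrative star schema (one fact table, two dimensions) plus a basic
# data quality check. Names are hypothetical; sqlite3 is used only so the
# sketch runs anywhere without a warehouse connection.
import sqlite3

DDL = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region TEXT
);

CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- e.g. 20240131
    calendar_date TEXT NOT NULL,
    month INTEGER,
    year INTEGER
);

CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    quantity INTEGER NOT NULL,
    amount REAL NOT NULL
);
"""

# Data quality check: fact rows must not reference missing dimension rows.
QUALITY_CHECK = """
SELECT COUNT(*) AS orphan_rows
FROM fact_sales f
LEFT JOIN dim_customer c ON f.customer_key = c.customer_key
WHERE c.customer_key IS NULL;
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
orphans = conn.execute(QUALITY_CHECK).fetchone()[0]
print(f"Orphaned fact rows: {orphans}")  # expect 0 on an empty schema
```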
Required Qualifications
- Experience: 5+ years of professional experience as a Data Engineer or in a similar role.
- Programming Languages: Advanced proficiency in at least one of: Python, Scala, Java, or Go.
- SQL Expertise: Advanced SQL knowledge with ability to write complex queries and optimize performance.
- Big Data Technologies: Hands-on experience with Apache Spark, Apache Airflow, or similar frameworks.
- Cloud Platforms: Demonstrated experience with AWS, Google Cloud Platform, or Azure (data services).
- Data Warehousing: Experience with modern data warehouse solutions (Snowflake, BigQuery, Redshift, etc.).
- Version Control: Proficiency with Git and collaborative development workflows.
- Problem-Solving: Strong analytical and troubleshooting skills with attention to detail.
Preferred Qualifications
- Experience with Kubernetes and containerization (Docker).
- Familiarity with real-time streaming technologies (Kafka, Kinesis, Pub/Sub).
- Knowledge of data quality frameworks and testing methodologies.
- Experience with Infrastructure-as-Code (Terraform, CloudFormation).
- Background in software engineering or computer science.
- Contributions to open-source data projects.
- Experience with machine learning operations (MLOps) pipelines.
- Familiarity with dbt (data build tool) or similar transformation tools.
Technical Skills
- Languages: Python, SQL, Scala, Java
- Frameworks & Tools: Apache Spark, Apache Airflow, dbt, Great Expectations
- Cloud Services: AWS (EC2, S3, Redshift, Lambda), GCP (BigQuery, Dataflow), Azure (Synapse, Data Factory)
- Data Technologies: Kafka, Snowflake, PostgreSQL, MongoDB
- DevOps: Docker, Git, CI/CD pipelines, Linux
- Data Concepts: OLAP/OLTP, dimensional modeling, data quality, data governance
To apply for this job, please visit nne.csod.com.

