Senior Data Engineer
United States
Employer: Grocery Delivery E-Services USA Inc. d/b/a HelloFresh

Responsibilities:
- Design, develop, and maintain efficient ETL/ELT pipelines to extract, transform, and load data from diverse sources into Snowflake, using tools such as AWS Glue, Apache Spark, Apache Airflow, or Lambda functions for orchestration and optimization of data workflows. Ensure pipelines are secure, reliable, and performant.
- Design, implement, and manage scalable, high-performance data architectures leveraging Snowflake and cloud-based services. Collaborate closely with stakeholders to align data architecture with business objectives, ensuring it supports both current and future requirements.
- Assist with data migrations from on-premises systems or other cloud platforms, developing integration strategies and ensuring seamless data integration from multiple sources. Implement continuous data ingestion processes, ensuring smooth data streaming.
- Fine-tune Snowflake queries, optimize warehouse usage, and configure cloud services to maximize cost efficiency in data processing.
- Monitor and enhance data performance, leveraging both built-in cloud monitoring tools and third-party solutions. Adjust and scale data architectures in response to changing workload demands and query performance metrics.
- Implement security best practices across Snowflake and AWS environments, including managing IAM roles, encryption, and data masking. Ensure compliance with data privacy and security standards (e.g., GDPR, HIPAA). Set up automated monitoring and alerts to detect anomalous data access patterns.
- Collaborate with Data Analysts and Business Intelligence teams to ensure access to well-structured, accurate data for analysis. Partner with IT and infrastructure teams to ensure proper resource provisioning and allocation.
- Provide technical leadership to junior data engineers, conducting code reviews and guiding best practices. Document all architecture decisions, data flows, and pipeline processes to facilitate easy maintenance and knowledge sharing across the team.
- Conduct training sessions and workshops on Snowflake and AWS best practices to foster continuous learning and skill development. Establish and promote best practices for data management, data modeling, and data governance across the organization.

Requirements: Bachelor’s degree (US or foreign equivalent) in Computer Science, Computer Engineering, or a related field and five (5) years of experience in the position offered or in a related role. All of the required experience must have included experience with:
- building, testing, and maintaining production-ready data pipelines;
- Python, Java, or PL/SQL;
- designing, developing, and maintaining ETL processes and data pipelines;
- ingesting, consuming, and processing data using both batch and real-time streaming platforms;
- cloud-based data warehousing, database management solutions, and data storage solutions;
- transforming data from various formats to ensure scalability and compatibility with modern data processing frameworks;
- Business Intelligence and visualization tools; and
- Continuous Integration and Continuous Deployment (CI/CD).

Telecommuting is permitted from any U.S. location.

Salary: $175,947/year.

E-mail résumé to: meggan.bird@hellofresh.com. Ref. #162.