Sr. Data Engineer/Analytics Engineer
Location: Remote (must work EST hours)
Duration: 6-month contract-to-hire
Pay: $65+ per hour
JOB DESCRIPTION
Our global Fortune 500 client, with U.S. headquarters in Charlotte, NC, is a world-class food service provider with a strong presence across the nation. Celebrating almost 30 years in North America, this employee-focused company has received honors for diversity and inclusion, innovation, health and wellness, and company culture. CRG has successfully placed over 220 employees with this organization within the last 7 years; it is known for its continuous growth opportunities, fantastic benefits package, innovative technology, flexible work environment, and collaborative culture.
We are looking for a hands-on Senior Data Engineer/Analytics Engineer with expertise in developing data pipelines and transforming data for downstream consumption. This role is crucial in designing, building, and maintaining our data infrastructure, focusing on creating scalable pipelines, ensuring data integrity, and optimizing performance. Key skills include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using AWS Airflow is essential, and experience with Fivetran and dbt (or similar tools) is also a must-have.
RESPONSIBILITIES
Design, build, test, and implement scalable data pipelines using Python and SQL.
Maintain and optimize our Snowflake data warehouse’s performance, including data ingestion and query optimization.
Design and implement analytical data models using SQL in dbt and Snowflake, focusing on accuracy, performance, and scalability.
Own and maintain the semantic layer of our data models, defining and managing metrics, dimensions, and joins to ensure consistent and accurate reporting across the organization.
Collaborate with internal stakeholders to understand their data needs and translate them into effective data models and metrics.
Perform analysis and critical thinking to troubleshoot data-related issues and implement checks/scripts to enhance data quality.
Collaborate with other data engineers and architects to develop new pipelines and/or optimize existing ones.
Maintain code via CI/CD processes as defined in our Azure DevOps platform.
QUALIFICATIONS
Highly self-motivated and detail-oriented with strong communication skills.
5+ years of experience in Data Engineering roles, with a focus on building and implementing scalable data pipelines for data ingestion and data transformation.
Expertise in Snowflake, including data ingestion and performance optimization.
Strong experience using ETL software (Fivetran, dbt, Airflow, etc.).
Strong SQL skills for writing efficient queries and optimizing existing ones.
Proficiency in Python for data extraction from APIs using AWS Lambda, Glue, etc.
Experience with AWS services such as Lambda, Airflow, Glue, S3, SNS, etc.