Job Title: Lead Snowflake Data Engineer
Location: Rosemont, IL (Remote OK)
Duration: 6+ months

Position overview:
We are looking for a hands-on Lead Data Engineer to join our team! In this role, you will lead a team of data engineers (onsite/offshore) to build new data pipelines, loading millions of transformed, accurate records into our cloud data warehouse in a timely manner to support evolving analytical needs for critical business decisions. You will also manage existing data pipelines, keeping data relevant, accurate, and ready for analytical consumption.

Key responsibilities:
• Lead data warehouse development activities for a team.
• Lead more than one project in parallel.
• Assign data warehouse development tasks to a team of 1-6 developers (onsite/offshore).
• Serve as a subject-matter expert (SME) in the data warehouse domain and the relevant business function.
• Work with minimal supervision, providing status updates and escalations to leadership as appropriate.
• Provide application support as part of an on-call rotation, helping resolve outages and user questions.
• Review project objectives and determine the best technology to apply.
• Design and build large-scale datasets for a wide range of consumer needs.
• Build, test, and implement highly efficient data pipelines using a variety of technologies.
• Develop best-practice standards for solution design and data structures.
• Create and maintain project-specific documentation such as architecture diagrams and flow charts.
• Analyze new data sources to understand their quality, availability, and content.
• Write technical design documentation.
• Conduct peer code reviews.
• Create and execute unit tests.
• Support the QA team during testing.
• Mentor less experienced members of the development team.
Technical skills:
• 6+ years of experience with Snowflake SQL, with advanced SQL expertise.
• 6+ years of data warehouse experience: hands-on knowledge of methods to identify, collect, manipulate, transform, normalize, clean, and validate data; star schemas; normalization/denormalization; dimensions; aggregations; etc.
• 6+ years working in reporting and analytics environments: development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning, Atlassian Bitbucket and Bamboo, etc.
• 4+ years of Python, with advanced Python expertise.
• 4+ years on any cloud platform, AWS preferred; hands-on experience with AWS Lambda, S3, SNS/SQS, and EC2 is the bare minimum.
• 4+ years with any ETL/ELT tool: Informatica, Pentaho, Fivetran, dbt, etc.
• 4+ years developing functional metrics in a specific business vertical (finance, retail, telecom, etc.).

Must-have soft skills:
• Enterprise experience: understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation, etc.

Nice to have:
• Technical certifications from AWS, Microsoft Azure, GCP, or any other recognized software vendor.
• 4+ years of team lead experience.
• 3+ years in a large-scale support organization supporting thousands of users.
You received this message because you are subscribed to the Google Groups "Latest C2C Requirements2" group.