Monday, 3 March 2025

C2C-GCP Data Engineer (CGEMJP00289478 & CGEMJP00289479)- Middletown PA - Hybrid

Hi,

I hope this email finds you well.

My name is Chandan, and I am a Recruitment Manager at Empower
Professionals Inc. I came across your profile and wanted to reach out
regarding a "GCP Data Engineer" role with one of our clients. Please let
me know if you are available in the job market and interested in this
role (see the job description below) - if so, we can connect and speak
further.



If you have any suitable profiles, please share their updated
resumes along with each candidate's location, work authorization, and
expected rate so that we can proceed.


Experience: 12+ years
Title: GCP Data Engineer (CGEMJP00289478 & CGEMJP00289479)

Duration: 12 Months

Location: Middletown PA (Hybrid)


Job Description: Data Engineer/Automation Engineer

• Contribute to the migration of a legacy data warehouse
to a Google Cloud-based data warehouse for a major telecom company.

• Collaborate with Data Product Managers and Data
Architects to design, implement, and deliver successful data solutions

• Help architect data pipelines for the underlying data
warehouse and data marts

• Design and develop highly complex ETL pipelines in
Google Cloud data environments.

• Our legacy tech stack includes Teradata; the new tech
stack includes GCP cloud data technologies such as BigQuery and Airflow,
with SQL and Python as the primary languages

• Maintain detailed documentation of your work and
changes to support data quality and data governance

• Support QA and UAT data testing activities

• Support Deployment activities to higher environments

• Ensure high operational efficiency and quality of
your solutions to meet SLAs and support commitment to our customers
(Data Science, Data Analytics teams)

• Be an active participant and advocate of agile/scrum
practice to ensure health and process improvements for your team



Basic Qualifications

• 8+ years of data engineering experience developing
large data pipelines in highly complex environments

• Very strong SQL skills and the ability to build highly
complex transformation data pipelines using a custom ETL framework in a
Google BigQuery environment

• Exposure to Teradata and the ability to understand
complex Teradata BTEQ scripts

• Strong Python programming skills

• Strong skills in building Airflow jobs and debugging issues

• Ability to optimize queries in BigQuery

• Hands-on experience with Google Cloud data technologies
(GCS, BigQuery, Dataflow, Pub/Sub, Data Fusion, Cloud Functions)

• Experience with the cloud data warehouse technology
BigQuery

• Nice to have: experience with GCP cloud technologies
(GCS, Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions)

• Nice to have: exposure to Teradata

• Solid experience with job orchestration tools such as
Airflow and the ability to build complex jobs

• Experience writing and maintaining large data pipelines
using a custom ETL framework

• Ability to automate jobs using Python

• Familiarity with data modeling techniques and data
warehousing standard methodologies and practices

• Solid experience with version control repositories such
as GitHub

• Good scripting skills, including Bash and Python

• Familiarity with Scrum and Agile methodologies

• Problem solver with strong attention to detail and
excellent analytical and communication skills

• Ability to work in an onsite/offshore model and to lead
a team






--
Thanks

Chandan

Manager | Empower Professionals


chandan@empowerprofessionals.com |

100 Franklin Square Drive – Suite 104 | Somerset, NJ 08873

www.empowerprofessionals.com

Certified NJ and NY Minority Business Enterprise (NMSDC)
To subscribe or unsubscribe: https://send.empowerprofessionals.com/newsletter/subscribe/647186e8-bcb0-4f73-8f80-cb3daff9ad90
