Job Title/Role: GCP Data Engineer (12+ years of experience required)
Contract
Remote work
Mandatory Skills
GCP, Dataflow, Dataproc, BigQuery, Python
Cloud Data Engineer with a minimum of 5 years of experience designing, building, and optimizing data pipelines (an illustrative pipeline sketch follows the requirements below).
- Experience with Google Cloud Composer, Dataflow, Dataproc clusters, Apache Beam, Hadoop, and BigQuery.
- Experience in process automation and application development using Java, Python, Unix, and PL/SQL.
- Hands-on experience with database CDC ingestion and streaming ingestion tools such as Striim or NiFi is a plus.
- Experience with Agile software development lifecycle disciplines, including analysis, design, coding, and testing.
- Experience in Big Data, building high-performance, low-latency pipelines using Hadoop, PySpark, and Apache Beam.
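For context, the sketch below shows the kind of batch pipeline this role involves: an Apache Beam job that reads rows from BigQuery, aggregates per key, and writes the results back, runnable on Dataflow. All project, dataset, table, and bucket names are hypothetical placeholders, not details from this posting.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical project, dataset, and bucket names -- replace with real resources.
QUERY = """
    SELECT user_id
    FROM `my-project.analytics.events`
    WHERE event_date = '2024-01-01'
"""

def run():
    options = PipelineOptions(
        runner="DataflowRunner",             # execute on Google Cloud Dataflow
        project="my-project",                # hypothetical GCP project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromBigQuery(query=QUERY, use_standard_sql=True)
            | "KeyByUser" >> beam.Map(lambda row: (row["user_id"], 1))
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "event_count": kv[1]})
            | "WriteCounts" >> beam.io.WriteToBigQuery(
                "my-project:analytics.daily_event_counts",
                schema="user_id:STRING,event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()

Swapping runner="DataflowRunner" for "DirectRunner" lets the same pipeline run locally for testing before it is deployed to Dataflow.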
Thanks & Regards,
Ashok Huria, Account Manager (Client Management)