Greetings,
Please review the job descriptions below and let me know if you are interested in submitting your resume for either position.
Requirement #1
Position: Google Cloud Engineer
Location: Charlotte, NC
Duration: 12+ Months
Job Description:
Tools
· Cloud Composer, Airflow (job orchestration)
· Engineering skill set in Terraform to automate projects in the cloud
· Experience with GCP monitoring tools, security, and cost optimization.
The candidate should be strong in the following expected skills:
· GCP (Google Cloud Platform)
· Data lake
· Foundation, Terraform & deployment management
· BigQuery
· CI/CD
· Cloud Migration
· Hadoop
Responsibilities:
· 10+ years of experience in data areas such as cloud (GCP), data warehouses, big data lakes, ETL, and data quality.
· 3+ years of strong GCP data background (Bigtable, BigQuery, and Cloud SQL).
· Work with data architects and SMEs of the current Hadoop Data Lake to understand the current and to-be Google Cloud Platform (GCP) architecture and the overall migration approach.
· Analyze the existing Hadoop Data Lake and data-flow jobs developed in Spark and MapReduce.
· Migrate on-premises Hadoop data lake workloads to GCP using BigQuery, Dataproc, Cloud SQL, etc.
· Migrate historical data from the Hadoop Data Lake to GCP using tools such as DistCp (Distributed Copy).
· Tune existing jobs for better performance in Dataproc on the cloud platform.
· Support testing of new and existing workloads migrated to GCP.
· Support data validation efforts.
· Collaborate with the stakeholders to create the Go-live plan and participate in production release.
· Support and participate in product deployment planning and execution.
· Create runbooks and other support documentation.
Requirement #2
Position: AB INITIO/ETL DEVELOPER
Location: Charlotte, NC
Duration: 12+ Months
Job Description:
Hands-on ETL development using any combination of Ab Initio, Ab Initio Express IT, and Ab Initio Continuous Flows in an agile development environment.
· Support ongoing ETL efforts for Development, QA and Production Environments.
· Research new solutions, tools and methodologies that may improve the overall operation model, process and/or the result of the project or projects at hand.
· Partner with source system Subject Matter Experts (SMEs), Technical Leads and Architects to build integrations and transformations against complex data sources.
· Work very closely with business analysts, development teams and project managers for requirements and business rules.
· Employ extreme attention to detail and flexibility to adapt to dynamic environments and changing business, operations and technology priorities.
· Provide coaching and oversight for software engineers and other members of the development team.
· Perform unit testing and act as a subject matter expert to QA and Business Testers through the testing lifecycle.
· Estimate user stories/size of efforts with accuracy.
· Actively participate in user story refinement and design sessions to influence the sustainability and extensibility of the technical solution.
Education & Experience
· In depth experience with Ab Initio and ETL technologies such as parallelism, plans, generic graphs, dynamic DML, etc.
· Full life cycle Data Warehouse experience including star schema and dimensional data.
· Experience with Linux and Autosys.
· Strong understanding of relational databases such as SQL Server, Teradata, Postgres.
· Extensive experience writing complex SQL for data analysis and data profiling.
· Strong written, verbal and interpersonal communication skills.
· Strong analytical skills and ability to solve complex technical problems.
· Bachelor's degree.
Best Regards,
Srinivas Sripada
Kaizen Technologies Inc.
Direct: 541 502 0055
E-mail: srinivass@kaizentek.com