Friday 8 January 2021

URGENT REQUIREMENTS: Multiple Positions

Hi,
 
I am Bharat Lohmore, a Technical Recruiter with Next Level Business Services, Inc., one of the fastest growing IT staffing and services firms in the nation, specializing in staff augmentation for end-to-end Enterprise IT solutions. I viewed your resume on one of the job boards and understand that you may be actively looking for new opportunities. I am trying to fill consulting positions for one of our major clients on an important project. Details about the positions are given below:

Position 1

Role:  Snowflake SME
Location: Dallas, TX
Duration: Long Term

JD:
Provide expert guidance and collaborate with the data outputs architect on the migration from Vertica to Snowflake
 
· Evaluate the current data model in Vertica; analyze and plan for building the new data model on Snowflake
· Experience in multiple aspects of Azure, Snowflake Data Warehouse, ETL, and legacy modernization projects
· Experience loading data into Snowflake tables (the project is a Vertica to Snowflake migration; see the sketch after this list); strong communication skills needed
· Good knowledge of SQL, with performance tuning and hands-on development using Spark or other dev tools, is a must
· Be responsible for designing, developing, and maintaining the data architecture, data models, and standards for various Data Integration & Data Warehousing projects in Snowflake, combined with other technologies
· Knowledge of Databricks/Spark is an added skill
· Knowledge of writing data into Snowflake using Spark and Java/Scala is an additional skill
· Cloud migration: Snowflake, Azure
· Snowflake, Hadoop, PostgreSQL databases
· SQL and shell scripting knowledge required
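
For illustration only, here is a minimal PySpark sketch of loading a DataFrame into Snowflake through the Spark-Snowflake connector (shown in Python for brevity, though the role also calls for Java/Scala). All connection values, paths, and table names below are placeholders, not details from this project.

    # Minimal sketch: write a Spark DataFrame into Snowflake using the
    # Spark-Snowflake connector. All connection values are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("vertica-to-snowflake").getOrCreate()

    # Source data staged from the Vertica export (path is a placeholder).
    df = spark.read.parquet("/staging/vertica_export/orders")

    sf_options = {
        "sfURL": "myaccount.snowflakecomputing.com",  # placeholder account URL
        "sfUser": "ETL_USER",
        "sfPassword": "********",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "LOAD_WH",
    }

    (df.write
       .format("net.snowflake.spark.snowflake")
       .options(**sf_options)
       .option("dbtable", "ORDERS")   # target Snowflake table (placeholder)
       .mode("overwrite")
       .save())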

position:2

Role:  Azure Databricks SME  
Location: Dallas, TX   
Duration: Long Term

JD:
 
Provide expert guidance and collaborate with the data outputs architect on the migration from Hadoop to Delta Lake (Databricks)
· Experience in multiple aspects of Azure, Spark, Databricks (specifically Delta Lake), a programming language (Java/Scala), ETL, and legacy modernization projects
· Experience loading data into Azure/Snowflake using Delta Lake (see the sketch after this list); strong communication skills needed
· Good knowledge of SQL, with performance tuning and hands-on development using Spark or other dev tools, is a must
· Be responsible for designing, developing, and maintaining the data processing architecture, data pipelines, and standards for various Data Integration & Data Warehousing projects in Databricks, combined with other technologies
· Knowledge of Snowflake is an added skill
· Knowledge of writing data into Azure/Snowflake using Spark and Java/Scala
· Cloud migration: Hadoop, Azure
· Snowflake, Hadoop, PostgreSQL databases
· SQL and shell scripting knowledge required
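
As a rough illustration of the Hadoop-to-Delta-Lake work, here is a minimal PySpark sketch that reads a legacy Hive table and lands it as a Delta table on Azure storage. The database, table, and storage paths are assumed placeholders, not details from this engagement.

    # Minimal sketch: migrate a Hive/Hadoop table into a Delta Lake table
    # on Databricks. Names and paths below are illustrative placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("hadoop-to-deltalake").getOrCreate()

    # Read the legacy table from the existing Hive metastore.
    legacy_df = spark.table("legacy_db.transactions")

    # Write it out in Delta format on ADLS (path is a placeholder).
    (legacy_df.write
        .format("delta")
        .mode("overwrite")
        .save("abfss://lake@myaccount.dfs.core.windows.net/delta/transactions"))

    # Optionally register the Delta location as a table for SQL access.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS curated.transactions
        USING DELTA
        LOCATION 'abfss://lake@myaccount.dfs.core.windows.net/delta/transactions'
    """)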

Position 3

Role:  Hadoop/ETL Consultant with PostgreSQL Experience
Location: Houston, TX
Duration: Long Term

JD:
 
> Good experience in AWS Glue design and development
> Good design and development experience in Oracle, AWS Aurora DB, and PostgreSQL
> Exposure to Big Data, Spark, Spark SQL, Hive (good to have)
> Previous working experience in AWS and migration from Oracle to Aurora DB (PostgreSQL)
> Experience with other ETL tools such as Talend or Informatica
> Must have Aurora DB and PostgreSQL database design experience
> Must have data analysis and data mapping experience
> Work with Functional/Business Analysts and Data Modelers on source-to-target data mapping queries
> Low-level design of AWS Glue and Spark jobs (see the sketch after this list)
> Development and unit testing of AWS Glue and Spark jobs
> Support QA and UAT
> Ensure best practices and coding standards are followed in addition to delivering exceptional-quality products
> Good knowledge of ETL job scheduling
> Good knowledge of data warehousing (DW) concepts
> Good communication skills and customer-facing experience
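
To illustrate the Glue/Aurora work, here is a minimal sketch of an AWS Glue ETL job (PySpark) that reads a source table from the Glue Data Catalog and writes it to Aurora PostgreSQL through a pre-defined JDBC connection. The catalog database, table, connection name, and target schema are assumed placeholders.

    # Minimal sketch of an AWS Glue ETL job: catalog source -> Aurora PostgreSQL.
    # All names below are placeholders, not details from this project.
    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Source: table crawled from the legacy Oracle schema (placeholder names).
    source_dyf = glue_context.create_dynamic_frame.from_catalog(
        database="legacy_oracle",
        table_name="customers",
    )

    # Target: Aurora PostgreSQL via a Glue connection named "aurora-pg".
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=source_dyf,
        catalog_connection="aurora-pg",
        connection_options={"dbtable": "public.customers", "database": "appdb"},
    )

    job.commit()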


Position 4

Role:  Google Cloud Developer
Location: Remote
Duration: Long Term

JD:
· GCP certification
· Minimum 5 years in systems engineering or another development background
· Hands-on experience across a broad range of Google services, with deep-dive knowledge and skills in specific services
· Experience with PySpark, Kafka, and Python is a must
· Familiar with the software development lifecycle and agile methodologies
· Experience with application architecture patterns and best practices
· Experience working with IT and business stakeholders
· Deep understanding of using data and analytics services to solve enterprise data challenges
· Adept at SQL and Python
· Experienced in the GCP platform and other Google offerings centered around data and analytics (BigQuery, Dataproc, Dataflow, Dataprep, Composer, Looker, Data Studio, etc.); see the sketch after this list
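
As a small illustration of the data-and-analytics side, here is a minimal sketch of querying BigQuery with the google-cloud-bigquery Python client. The project, dataset, and table names are assumed placeholders; credentials are taken from the environment.

    # Minimal sketch: run a SQL query against BigQuery with the
    # google-cloud-bigquery client. Names below are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials

    query = """
        SELECT event_date, COUNT(*) AS events
        FROM `my-project.analytics.web_events`
        GROUP BY event_date
        ORDER BY event_date DESC
        LIMIT 10
    """

    for row in client.query(query).result():
        print(row.event_date, row.events)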



Position 5

Role:  Azure Data Engineer
Location: Pleasanton, CA
Duration: Long Term

JD:
• 2-3 years of experience working on Databricks, Azure Data Factory, and other Azure data solution ecosystems (Mandatory)
• 2-3 years of experience working on Spark SQL, Hive SQL, and U-SQL (Mandatory)
• 2-3 years of experience working on Spark, Scala, and Python (Mandatory)
• 2-3 years of experience creating frameworks for building data pipelines (Mandatory)
• Must have experience configuring data streams between Event Hubs and Azure Service Bus with other integration systems such as Databricks (Mandatory); see the sketch after this list
• Must have experience working with an onshore/offshore model (Mandatory)
• Azure Fundamentals (AZ-900) and Azure Data Solution (DP-200 & DP-201) certifications (Preferred)
• 2-3 years of experience working on Java or other object-oriented programming (Preferred)
• Extensive experience working on Big Data technologies such as Hive, Pig, and MapReduce is preferred
• Experience working with structured and unstructured data is a must
• Good understanding of data-oriented projects for integration and analytics is a must
• Provide Business Intelligence and Data Warehousing solutions and support by leveraging project standards and leading analytics platforms
• Evaluate and define functional requirements for BI and DW solutions
• Build conceptual and logical models based on the functional flow of the business in a scalable mode
• Work directly with business leadership and application SMEs to understand requirements and analyze the sources that fulfill them
• Propose and develop data solutions to enable effective decision-making, driving business objectives or addressing enterprise integration requirements
• Analyze data quality, data governance, compliance, and other legal requirements on data storage; address all required non-business operational requirements during the design and build of data pipelines
• Identify avenues for cost savings, either by using in-house Cognizant accelerators or by building re-usable frameworks across projects
• Interpret data by building models, charts, and tables on the reporting platform toward business intelligence requirements
• Act as the expert and key point of contact between data analysts, data scientists, and the business/application teams
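
For illustration, here is a minimal sketch of one common way to stream Event Hubs data into Databricks: reading through the Event Hubs Kafka-compatible endpoint with Spark Structured Streaming and landing the payload in a Delta table. The namespace, event hub name, connection string, and paths are assumed placeholders.

    # Minimal sketch: Azure Event Hubs -> Databricks via the Kafka-compatible
    # endpoint and Structured Streaming. All values below are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("eventhub-to-delta").getOrCreate()

    EH_NAMESPACE = "my-namespace"        # placeholder Event Hubs namespace
    EH_NAME = "telemetry"                # placeholder event hub name
    EH_CONN_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=..."

    raw = (spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers",
                f"{EH_NAMESPACE}.servicebus.windows.net:9093")
        .option("subscribe", EH_NAME)
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.sasl.jaas.config",
                'org.apache.kafka.common.security.plain.PlainLoginModule required '
                f'username="$ConnectionString" password="{EH_CONN_STR}";')
        .load())

    # Persist the decoded payload to a Delta table (paths are placeholders).
    (raw.select(col("value").cast("string").alias("body"))
        .writeStream
        .format("delta")
        .option("checkpointLocation", "/delta/_checkpoints/telemetry")
        .start("/delta/telemetry"))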


Position 6

Role:  Scrum Master with Splunk Experience
Location: St Paul, MN
Duration: Long Term

JD:
• Manage scrum teams through all phases of implementation while maintaining high-quality, timely deliveries
o Foster a collaborative and safe environment for all team members
o Actively and enthusiastically advocate agile methods, values, principles, and mindset
o Encourage the scrum team to become self-organizing and accountable
o Exhibit strong leadership in planning, leading, and facilitating scrum ceremonies: daily standups, backlog grooming, sprint planning, sprint retrospective, and sprint review and showcase
o Work with Product Managers to define and groom the backlog to facilitate sprint planning
• Experience with architecting, implementing, and operating Splunk or other big data platforms
• Experience in Splunk development, administration, Splunk queries, dashboards, and alerts (see the sketch after this list)
• Experience with Splunk for creating end-user experience dashboards and system metrics
• Experience with monitoring use case implementation methodologies
• Agile methodology, Amazon Web Services, and Python knowledge is good to have
• Excellent communication skills and problem-solving ability
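
Purely as an illustration of the Splunk query/alerting side, here is a minimal sketch of running a search through the Splunk SDK for Python (splunk-sdk). The host, credentials, index, and query are assumed placeholders, not details of the client's environment.

    # Minimal sketch: run a Splunk search with the Splunk SDK for Python.
    # Host, credentials, and the query below are placeholders.
    import splunklib.client as client
    import splunklib.results as results

    service = client.connect(
        host="splunk.example.com",   # placeholder Splunk host
        port=8089,
        username="admin",
        password="********",
    )

    # Blocking search: count recent errors by host, e.g. to feed a dashboard.
    job = service.jobs.create(
        "search index=app_logs level=ERROR earliest=-24h | stats count by host",
        exec_mode="blocking",
    )

    for result in results.ResultsReader(job.results()):
        print(result)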




Thanks,
Next Level Business Services, Inc.
Talent Solutions | Digital Transformation | Data Analytics

Bharat Lohmore
11340 Lakefield Drive Suite #200,
Johns Creek, GA, 30097
(904) 371-2977
bharat.lohmore@nlbservices.com
Web: www.nlbservices.com
If you would prefer to no longer receive any emails from this Company, you may opt out at any time by clicking here.
