Monday 6 May 2024

Sure Shot Closure: Databricks Engineer, Jackson, Michigan (onsite) (12+ years' experience required)

Databricks Engineer

Location: Jackson, Michigan (onsite)

Contract

Note: The primary skill is Databricks, which covers the related skill sets (Python, PySpark, SQL, ADF). Refer to the technical areas/responsibilities defined for this role. The skill sets are also listed explicitly (ADF, SQL, Python, PySpark) so it is easy to check the high-level skills in a candidate's profile.
Skills matrix — for each skill: technical areas, must-have/good-to-have status, and proficiency rating (1 to 5):
1. Databricks — Must Have (primary skill); proficiency: 4-5
• Strong hands-on experience with PySpark and Apache Spark
• Strong hands-on experience with the medallion architecture
• Experience migrating native Spark workloads to Databricks
• Experience building data governance solutions such as Unity Catalog and Azure Purview
• Highly experienced in usability optimization (auto compaction, Z-ordering, vacuuming), cost optimization, and performance optimization
• Build a strong orchestration layer in Databricks/ADF workflows
• Build CI/CD for Databricks in Azure DevOps
• Process near-real-time data through Auto Loader and DLT pipelines
• Implement a security layer in Delta Lake
• Implement massively parallel processing layers in Spark SQL and PySpark
• Implement cost-effective infrastructure in Databricks
• Experience extracting logic and data from on-prem layers, SAP, and ADLS into PySpark/ADLS using ADF/Databricks
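As context for the medallion-architecture requirement above, here is a minimal pure-Python sketch of the bronze -> silver -> gold layering idea. In an actual Databricks job these layers would be Delta tables manipulated with PySpark; the plain dicts and the record values below are made up purely for illustration.

```python
# Hypothetical medallion layering: bronze (raw) -> silver (cleaned) ->
# gold (aggregated). Plain dicts stand in for Delta tables.

bronze = [  # raw ingested records, possibly dirty
    {"order_id": "1", "amount": "100.0", "region": "MI"},
    {"order_id": "2", "amount": "bad",   "region": "MI"},
    {"order_id": "3", "amount": "50.5",  "region": "OH"},
]

def to_silver(rows):
    """Clean and validate: drop rows whose amount is not numeric."""
    silver = []
    for r in rows:
        try:
            silver.append({**r, "amount": float(r["amount"])})
        except ValueError:
            pass  # a real pipeline would quarantine rejected records
    return silver

def to_gold(rows):
    """Aggregate: total amount per region, ready for consumption."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
```

The same shape (ingest raw, validate into silver, aggregate into gold) is what the Delta-table version of this pipeline would follow.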
2. Azure Synapse Analytics / Azure Data Factory (ADF) — Must Have; proficiency: 4-5
- Hands-on experience with Azure Synapse Analytics, Azure Data Factory, Databricks, Azure Storage, Azure Key Vault, SQL pools, CI/CD pipeline design, and other Azure services such as Functions and Logic Apps
- Linked services, the various runtimes, datasets, pipelines, and activities
- Strong hands-on experience with activities such as control-flow logic and conditions (ForEach, If, Switch, Until), Lookup, Stored Procedure, scripts, validations, Copy Data, Data Flow, Azure Functions, notebooks, and SQL pool stored procedures
- Strong hands-on experience deploying code across the landscape (Dev -> QA -> Prod) with GitHub and CI/CD pipelines
3. SQL Server stored procedures — Must Have; proficiency: 4-5
- Strong hands-on experience creating SQL stored procedures
- Functions and stored procedures, including calling one stored procedure from another and processing data record by record
- Dynamic SQL
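To illustrate the dynamic-SQL requirement above: on SQL Server this is typically done with sp_executesql, but the same idea — assembling the identifier parts of a query at runtime while still binding values as parameters — can be sketched in a runnable, self-contained way with Python's stdlib sqlite3. The table and column names here are invented for the example.

```python
import sqlite3

# Minimal dynamic-SQL sketch: the column name is chosen at runtime,
# while values are passed as bound parameters (the same discipline
# sp_executesql enforces on SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "MI"), (2, "OH"), (3, "MI")])

def count_by(column, value):
    # Whitelist identifiers before interpolating them: identifiers
    # cannot be bound as parameters, so this guard prevents injection.
    if column not in {"id", "region"}:
        raise ValueError("unexpected column")
    sql = f"SELECT COUNT(*) FROM orders WHERE {column} = ?"
    return conn.execute(sql, (value,)).fetchone()[0]

mi_count = count_by("region", "MI")
```

The whitelist-then-interpolate pattern is the key point: dynamic SQL is only safe when the dynamic pieces are restricted to known identifiers.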
4. Python — Must Have; proficiency: 4-5
- Strong background in Python libraries such as PySpark, Pandas, NumPy, pymysql, and Oracle client libraries
- Strong hands-on experience retrieving data through APIs
- Able to install libraries and help users troubleshoot issues
- Knowledge of retrieving data through stored procedures from Python
- Able to debug Python code
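The "retrieving data through APIs" item above usually means walking a paginated REST endpoint until it is exhausted. A minimal sketch of that loop follows; fetch_page is a stub standing in for a real HTTP call (e.g. with urllib or requests), and its data is made up so the example is self-contained.

```python
# Hypothetical paginated API: page number -> (rows, next page or None).
FAKE_API = {
    1: ([{"id": 1}, {"id": 2}], 2),
    2: ([{"id": 3}], None),
}

def fetch_page(page):
    """Stub for an HTTP GET returning one page of JSON results."""
    return FAKE_API[page]

def fetch_all(start_page=1):
    """Follow pagination until the API reports no next page."""
    rows, page = [], start_page
    while page is not None:
        batch, page = fetch_page(page)
        rows.extend(batch)
    return rows

records = fetch_all()
```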
5. Spark — Must Have; proficiency: 4-5
- Hands-on experience with Spark pools and PySpark
- Able to merge data/delta loads through notebooks
- Strong background in Python libraries and PySpark
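The "merge data/delta loads" item above refers to upsert semantics: in a Databricks notebook this would be a Delta Lake MERGE INTO statement. The logic can be sketched in plain Python with dicts keyed by id (the records here are invented for illustration).

```python
# Sketch of MERGE/upsert semantics: WHEN MATCHED THEN UPDATE,
# WHEN NOT MATCHED THEN INSERT. Dicts keyed by id stand in for
# the Delta target table and the incremental batch.

target = {1: {"id": 1, "qty": 10}, 2: {"id": 2, "qty": 5}}

delta_load = [          # incremental batch: one update, one insert
    {"id": 2, "qty": 7},
    {"id": 3, "qty": 1},
]

def merge(target, updates):
    """Apply an incremental batch: matching keys update, new keys insert."""
    for row in updates:
        target[row["id"]] = row  # same key overwrites, new key inserts
    return target

merged = merge(target, delta_load)
```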

Thanks & Regards,

Ashok Huria, Account Manager (Client Management)

Phone: 732-289-3139 | Email: ashok.huria@resource-logistics.com

Resource Logistics, Inc.

Certified MBE Company | NMSDC Certified | ISO Certified 9001:2015; ISO/IEC 27001:2013| Direct Hire | Contingent Staffing | SOW Services | IT Consulting  

39 Milltown Road, East Brunswick, NJ 08816 | Office: (732) 553-0566 Ext 170 | Fax: 732-553-0568 | Website: www.resource-logistics.com

 

