Thursday, 30 May 2024

Data Warehouse/Cloud Architect position in Harrisburg, Pennsylvania, USA


Hello,

Andrew DEW has found a job that you might like to see.

Here are some of the details of the job; the rest can be found through the link below. Please go through it and share your feedback.

 
Job Opportunity

Data Warehouse/Cloud Architect
Contract | Pennsylvania
Posted on 05/29/2024

Job ID: DEW-2085
Job Title: Data Warehouse/Cloud Architect
Location: Harrisburg, PA, USA
Job Description

Data Warehouse/Cloud Architect

Location: Harrisburg, PA (Remote)

Contract Duration: 12+ months

 

Job Description:

In this position, you will manage the existing cloud data platform to make it more scalable, reliable, and cost-efficient. You will also work on additional projects that leverage the existing architecture where possible and adopt newer technologies where needed.

 

Primary Responsibilities:

  • Define and align on strategic initiatives pertaining to data and analytics architecture.
  • Design and develop data lakes; manage data flows that integrate information from various sources into a common data lake platform through an ETL tool, supporting near-real-time use cases as well.
  • Design repeatable, reusable solution architectures and data ingestion pipelines for bringing in data from ERP source systems such as SAP (a minimal sketch follows this list).
  • Manage data integration with tools such as Databricks and Snowflake, or equivalent data lake and data warehouse platforms.
  • Design and develop data warehouses for scale.
  • Design and evaluate data models (star, snowflake, and flattened).
  • Design data access patterns for OLTP- and OLAP-based transactions.
  • Triage, debug, and fix technical issues related to data lakes and data warehouses.
  • Serve and share data through modern data warehousing tools and practices.
  • Coordinate with business and technical teams through all phases of the software development life cycle.
  • Participate in making major technical and architectural decisions.
  • Prototype new technology solutions hands-on by working with cross-functional teams.
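
For illustration only, here is a minimal PySpark sketch of the kind of ingestion pipeline described above, landing raw ERP extracts from S3 in a common data lake table. The bucket names, paths, and column names are hypothetical, and the Delta output format assumes the delta-spark package is available (it is bundled with Databricks; elsewhere it must be installed and configured):

from pyspark.sql import SparkSession, functions as F

# Hypothetical locations -- replace with real landing-zone and lake paths.
RAW_PATH = "s3://example-landing-zone/sap/orders/"
LAKE_PATH = "s3://example-data-lake/bronze/orders/"

spark = SparkSession.builder.appName("erp-ingestion-sketch").getOrCreate()

# Read the raw CSV extracts dropped by the ERP export job.
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv(RAW_PATH))

# Light standardization before landing in the lake: a typed load
# timestamp plus a date column used for partition pruning downstream.
cleaned = (raw
           .withColumn("load_ts", F.current_timestamp())
           .withColumn("load_date", F.to_date(F.current_timestamp())))

# Append into the bronze layer as Delta, partitioned by load date.
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("load_date")
        .save(LAKE_PATH))

The same skeleton extends to the near-real-time use cases mentioned above by swapping spark.read for spark.readStream and writing with a streaming sink.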

 

You Must Have:

  • 5+ years of experience operating on AWS Cloud, building data lake and data warehousing architectures.
  • 5+ years of experience building data warehouses on Snowflake, Redshift, HANA, Teradata, Exasol, etc.
  • 3+ years of experience with AWS data services such as S3, Glue, Lake Formation, EMR, Kinesis, RDS, and DMS.
  • 3+ years of data modeling experience.
  • 3+ years of working knowledge of Spark or equivalent big data technologies.
  • 3+ years of experience building Delta Lakes using technologies like Databricks (see the upsert sketch after this list).
  • 3+ years of working experience with ETL tools such as Talend, Informatica, SAP Data Services, etc.
  • 3+ years of experience in any programming language (Python, R, Scala, Java).
  • 3+ years of experience working with ERP systems such as SAP, focusing on data integration.
  • Good understanding of, and implementation experience with, GenAI models available from cloud hyperscalers.
  • AWS Bedrock experience is a plus but not required.
  • Bachelor's degree in computer science, information technology, data science, data analytics, or a related field.
  • Experience working on Agile projects and with Agile methodology in general.
  • Excellent problem-solving, communication, and teamwork skills.
  • Exceptional presentation, visualization, and analysis skills.
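
As a rough illustration of the Delta Lake work listed above, the sketch below upserts change records into an existing Delta table using the delta-spark DeltaTable API (available on Databricks). The table paths and the order_id join key are assumptions for the example:

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("delta-upsert-sketch").getOrCreate()

# Hypothetical paths -- replace with real lake locations.
TARGET_PATH = "s3://example-data-lake/silver/orders/"
UPDATES_PATH = "s3://example-data-lake/bronze/orders/"

# Latest change records produced by the ingestion pipeline.
updates = spark.read.format("delta").load(UPDATES_PATH)
target = DeltaTable.forPath(spark, TARGET_PATH)

# Merge (upsert): update rows whose key already exists, insert the rest.
(target.alias("t")
       .merge(updates.alias("s"), "t.order_id = s.order_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())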

 

 



