
Integration Data Engineer

Iselin, NJ · Information Technology
Our client, a major bank in central NJ, is looking for an Integration Data Engineer.
Hybrid commute: 3 days on-site at central NJ locations and 2 days remote per week.

This is a permanent full-time career opportunity, with a base salary range of $110-135K DOE, plus an approximately 20% bonus and a great benefits package.

We are looking for an Integration Data Engineer with a background in SQL and data warehousing for enterprise-level systems. The ideal candidate is comfortable working with business users and brings business-analyst expertise.

Major Responsibilities:
  • Design, develop, and deploy Databricks jobs to process and analyze large volumes of data.
  • Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.
  • Optimize Databricks jobs for performance and scalability to handle big data workloads.
  • Monitor and troubleshoot Databricks jobs; identify and resolve issues or bottlenecks.
  • Implement best practices for data management, security, and governance within the Databricks environment.
  • Design and develop Enterprise Data Warehouse solutions.
  • Apply proficiency with data analytics and data insights.
  • Write SQL queries and programs, including stored procedures, and reverse-engineer existing processes.
  • Leverage SQL, a programming language (Python or similar), and/or ETL tools (Azure Data Factory, Databricks, Talend, and SnowSQL) to develop data pipeline solutions that ingest and exploit new and existing data sources.
  • Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
Skills:
• 5+ years - Enterprise Data Management
• 5+ years - SQL Server-based development of large datasets
• 5+ years - Data Warehouse architecture, with hands-on experience on the Databricks platform and extensive PySpark coding experience; Snowflake experience is a plus
• 3+ years - Python (NumPy, Pandas) coding experience
• 3+ years' experience in the finance/banking industry, with some understanding of securities and banking products and their data footprints
• Experience with Snowflake utilities such as SnowSQL and Snowpipe (nice to have)
• Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling
• Previous experience leading an enterprise-wide cloud data platform migration, with strong architectural and design skills
• Capable of discussing enterprise-level services independent of technology stack
• Experience with cloud-based data architectures, messaging, and analytics
• Superior communication skills
• Cloud certification(s) preferred
• Any experience with regulatory reporting is a plus

Education
• Minimum of a bachelor's degree in an engineering or computer science discipline
• Master’s degree strongly preferred


Please email your resume or use this link to apply directly:
https://brainsworkgroup.catsone.com/careers/index.php?m=portal&a=details&jobOrderID=16675638
Or email: igork@brainsworkgroup.com
Check ALL our Jobs: http://brainsworkgroup.catsone.com/careers

Keywords: sql oracle python databricks etl data warehousing ssis snow numpy pandas pyspark oltp olap

 
