Job Description

The role of a Data Engineering Applications Associate (ETL) is to contribute to the overall success of the organization by designing and implementing end-to-end data engineering solutions, from data analysis and data modeling to ETL.

Key Responsibilities

- Champion a customer-focused culture to deepen relationships with clients and leverage broader bank systems and knowledge.
- Build best-in-class ETL solutions, working closely with Product Owners, Developers, Scrum Masters, Project Management, and Stakeholders.
- Analyze highly complex business rules and logic, translating them into technical implementations.
- Perform data wrangling to merge disjointed data sources and implement complex data transformations.
- Develop ETL processes using the MS SQL Server BI stack, SQL Server Integration Services, and MS Azure Data Factory to ingest, transform, cleanse, and load data into on-prem SQL Server 2017, SQL Server 2019, and/or Azure SQL DB.
- Conduct source data analysis and profiling to complete source-to-target mappings.
- Perform logical and physical data modeling to develop highly optimized data infrastructure for fast querying of large volumes of data.
- Ensure data quality, security, and compliance requirements are met.
- Provide proactive support and maintenance for existing ETL solutions to keep them performing at their best.
- Take end-to-end ownership of bugs, defects, enhancements, and other assignments.
- Understand how the bank's risk appetite and risk culture should be incorporated into day-to-day activities and decisions.
- Actively pursue effective and efficient operations of their respective areas in accordance with the organization's values, its code of conduct, and the global sales principles, while ensuring the adequacy of, adherence to, and effectiveness of day-to-day business controls to meet obligations with respect to operational, compliance, AML/ATF/sanctions, and conduct risk.
- Champion a high-performance environment and contribute to an inclusive work environment.

Required Skills and Qualifications

Education / Experience

- Bachelor's degree in a technical field such as computer science, computer engineering, or a related discipline (required).
- 3 years of experience designing, developing, and deploying large-scale data warehouse projects end-to-end.

Skills

- Expert knowledge of and hands-on experience with SQL scripting – 3 years.
- Expert knowledge of SQL Server Integration Services (SSIS), the Integration Services Catalog, project deployment, and the Azure suite – 3 years.
- Experience working with Agile methodology – 3 years.
- Working knowledge of dimensional data modeling methodology.

Nice to Have Skills

- Experience working with data warehouses within Google Cloud Platform.
- Relevant degrees or certifications.
- Experience with Splunk (SPL).
- Power BI.

Benefits

Work in a standard office-based environment; non-standard hours are a common occurrence.