A top-25 U.S. digital financial services company committed to developing award-winning technology and services.
Named one of the top three fastest-growing banking brands in the U.S. in 2020.
Offers a full suite of products, including mortgage lending, personal lending, a variety of deposit and other banking products (savings, money-market, and checking accounts, certificates of deposit (CDs), and individual retirement accounts (IRAs)), self-directed and investment-advisory services, and capital for equity sponsors and middle-market companies.
Where permitted by applicable law, must have received or be willing to receive the COVID-19 vaccine by date of hire to be considered.
WHAT THEY OFFER YOU:
Fast-paced, highly collaborative, teamwork-oriented environment
Make an immediate impact in this high-visibility role
Ability to drive change within the organization with a focus on advancement in technology and programs
Top-notch leadership committed to developing people
THE BACKGROUND THAT FITS
Design and build enhancements to an ever-expanding data platform supporting business process needs for internal and external integration via APIs, data models, self-serve reporting solutions, and interactive querying
Define, build, test, document, and audit data platform artifacts, including data models, data flow processes, and integrations
Develop standard methodologies and frameworks for unit, functional, and integration tests around data pipelines, and drive the team toward increased overall test coverage
Design continuous integration and deployment (CI/CD) processes and best practices for the production data pipelines
Create data platform artifacts for data flow processes and integrations
Work with the latest technologies in the Microsoft cloud stack, including Azure SQL, Synapse, Data Lake, Data Factory, Databricks, Azure Functions, and Service Bus
Collaborate with and influence user, engineering, and product partners to ensure the data infrastructure meets constantly evolving requirements
Find opportunities to adopt innovative technologies that fuel the company vision
MANDATORY SKILLS
Bachelor's degree in Computer Science, Information Systems, a similar technical field of study, or equivalent practical experience
5+ years of relevant work experience
Experience working with cloud platforms such as Azure or AWS
Solid understanding of real-time data processing, data pipelines, transformation, and modeling using traditional and distributed systems
Experience with development of test automation solutions
Detailed knowledge of relational and multi-dimensional databases and NoSQL solutions
Strong programming skills, especially in SQL, data modeling, and related data processing concepts
Experience with developing and architecting data ingestion models, ETL jobs, and alerting to maintain high availability and data integrity
Experience working in Python and PySpark; Databricks experience is preferred
Experience working with real-time data processing frameworks such as Kafka or Azure Event Hubs
Experience building modern data lake architectures such as Lakehouse or Lambda