Top 25 U.S. digital financial services company committed to developing award-winning technology and services.
Named one of the top three fastest-growing banking brands in the U.S. in 2020.
Offers a full suite of products, including mortgage lending and personal lending; a variety of deposit and other banking products (savings, money-market, and checking accounts, certificates of deposit (CDs), and individual retirement accounts (IRAs)); self-directed and investment-advisory services; and capital for equity sponsors and middle-market companies.
WHAT THEY OFFER YOU:
Fast-paced, highly collaborative, teamwork-oriented environment
Make an immediate impact in this high-visibility role
Competitive base salary of $135-145k plus bonus and excellent benefits package
Top-notch leadership committed to developing people
100% remote for now; the role will move on-site in Charlotte, NC when staff transition back into the office after October
WHAT YOU WILL DO:
The Senior Data Engineer will join the Compliance Technology team in developing next-generation data solutions. The Compliance data warehouse is migrating to a cloud platform that includes a modern customer data platform, a cloud data lake and cloud data warehouse, modern data governance capabilities, and advanced analytics on the cloud.
Design, build, and enhance cloud data workflows/pipelines that process billions of records in large-scale data environments, covering end-to-end design and development of near-real-time and batch data pipelines
Design and build data lake and data warehouse solutions on Snowflake and AWS using streaming and batch processes
Develop test strategies, software testing frameworks, and test automation
Champion a modern data engineering culture and software development grounded in best practices
Leverage DevSecOps techniques and modern tools such as GitLab, Jira, Jenkins, and build automation
Engage in application design and data modeling discussions; participate in developing and enforcing data security policies
Drive delivery efficiency with automation and reusable components/solutions
HOW YOU ARE QUALIFIED:
Bachelor’s degree in information technology or equivalent related experience
5+ years’ experience in data engineering, building data integration solutions or data warehouse environments on platforms such as AWS, Snowflake, or Oracle
2+ years’ working experience in AWS, utilizing services such as S3 and the AWS CLI
Experience developing data lakes, data warehouses, and data ingestion processes, including de-duplicating and cleansing customer data
Deep working knowledge of RDBMSs, with strong SQL and ETL skills
Extensive experience in data transformations, cleansing, and de-duplication
Advanced knowledge of SQL (PL/SQL or T-SQL)
Experience developing data pipelines for both Cloud and Hybrid Cloud infrastructures
Knowledge of Python and other scripting languages is highly desirable
Experience working in an Agile delivery environment
Hands-on experience building data flows using DevOps pipelines and automation
Ability to work independently and drive solutions end to end leveraging various technologies to solve data problems and develop solutions
Promote and enforce design and development standards and best practices
Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms
Ability to research and assess open-source technologies and components, recommend them, and integrate them into the design and implementation
Proven track record of customer satisfaction and delivery success, with the ability to establish and maintain strong relationships with business and IT stakeholders
Ability to work in an advisory capacity to identify key technical business problems, develop and evaluate alternative solutions, and make recommendations
Extensive experience in all aspects of the software development life cycle