Principal Data Engineer in Charlotte, North Carolina
Posted 09/22/21

THE TEAM YOU WILL BE JOINING:

  • Top 25 U.S. digital financial services company committed to developing award-winning technology and services.
  • Named one of the top three fastest-growing banking brands in the U.S. in 2020.
  • Offers a full suite of products including mortgage lending, personal lending, and a variety of deposit and other banking products (savings, money-market, and checking accounts, certificates of deposit (CDs), and individual retirement accounts (IRAs)), self-directed and investment-advisory services, and capital for equity sponsors and middle-market companies.

WHAT THEY OFFER YOU:

  • Fast-paced, highly collaborative, teamwork-oriented environment
  • Make an immediate impact in this high-visibility role
  • Base salary of $130-140k + 11% bonus and excellent benefits package
  • Top-notch leadership committed to developing people

LOCATION

  • Charlotte, NC or Detroit, MI. 100% remote for now; the role will sit on-site in Charlotte, NC when staff transitions back into the office after October.
  • 100% remote is possible for the right candidate.

WHAT YOU WILL DO

The Principal Data Engineer position is part of a team that is responsible for multiple applications that support the investment activities of the banking platform. Those responsibilities include, but are not limited to, the following:

  • Design and build a data warehouse based on a Data Vault 2.0 (DV-2) style data model
  • Design and build a data lake and big data analytic solutions on AWS using streaming and batch processes (a brief sketch follows this list)
  • Develop test strategies, software testing frameworks, and test automation
  • Champion a modern engineering culture and best practices-based software development
  • Apply DevSecOps techniques using modern tools such as GitHub, Jira, Jenkins, Crucible, and build automation
  • Engage in application design and data modeling discussions; participate in developing and enforcing data security policies
  • Drive delivery efficiency with automation and reusable components/solutions
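
For a flavor of the DV-2 and AWS work above, here is a minimal Python sketch of a batch job that stamps a record with Data Vault style metadata (a hash key over the business key, a load timestamp, and a record source) and lands it in a raw S3 zone via boto3, the AWS SDK for Python. The bucket name, record layout, and source label are illustrative assumptions, not details from this posting.

    import hashlib
    import json
    from datetime import datetime, timezone

    import boto3  # AWS SDK for Python

    RAW_BUCKET = "example-datalake-raw"  # hypothetical raw-zone bucket

    def hub_hash_key(business_key: str) -> str:
        # DV-2 convention: hash of the trimmed, upper-cased business key.
        return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

    def land_record(record: dict, business_key_field: str) -> None:
        # Stamp the record with Data Vault metadata, then write it to S3.
        record["hub_hash_key"] = hub_hash_key(record[business_key_field])
        record["load_dts"] = datetime.now(timezone.utc).isoformat()
        record["record_source"] = "investments.batch"  # assumed source label
        key = f"raw/investments/{record['hub_hash_key']}.json"
        boto3.client("s3").put_object(
            Bucket=RAW_BUCKET, Key=key, Body=json.dumps(record).encode("utf-8")
        )

    land_record({"account_id": "ACCT-1001", "balance": 2500.00}, "account_id")

The same hash-key and load-timestamp stamping generalizes to hub, link, and satellite loads; a streaming variant would consume from Kinesis or Kafka rather than a batch file.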

HOW YOU ARE QUALIFIED:

  • Minimum 2 years of working experience in AWS, utilizing services such as S3, the AWS CLI, and DynamoDB (see the sketch after this list)
  • Deep working knowledge of NoSQL, RDBMS, SQL, JSON, and XML, plus strong ETL skills
  • Extensive experience in data transformations, cleansing, and de-duplication
  • Advanced knowledge of SQL (PL/SQL or T-SQL)
  • Experience developing data pipelines for both Cloud and Hybrid Cloud infrastructures
  • Knowledge of Python and other scripting languages is highly desirable
  • Experience using modern ETL tools such as InfoSphere DataStage (Cloud Pak), Apache NiFi, etc.
  • Experience working in an Agile delivery environment
  • Hands on experience building and using DevOps pipelines and automation
  • Ability to work independently and drive solutions end to end, leveraging various technologies to solve data problems
  • Promote and enforce design and development standards and best practices
  • Passionate about continuous learning and about experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms
  • Ability to research and assess open-source technologies and components to recommend and integrate into the design and implementation
  • Proven track record of customer satisfaction and delivery success, with the ability to establish and maintain appropriate relationships with business and IT stakeholders
  • Ability to work in an advisory capacity to identify key technical business problems, develop and evaluate alternative solutions, and make recommendations
  • Extensive experience in all aspects of the software development life cycle
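
As a concrete (and purely illustrative) example of the S3/DynamoDB experience called out above, the following Python sketch writes and reads a single item with boto3; the table name and key schema are assumptions for the example, not details from this posting.

    import boto3  # AWS SDK for Python

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("positions")  # hypothetical table keyed on account_id

    # Write one item, then read it back by its partition key.
    table.put_item(Item={"account_id": "ACCT-1001", "symbol": "VTI", "shares": 12})
    item = table.get_item(Key={"account_id": "ACCT-1001"})["Item"]
    print(item["symbol"], item["shares"])

The equivalent AWS CLI calls (aws dynamodb put-item / get-item) exercise the same API surface.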

JOB DETAILS

  • Employee Type: Direct Hire
  • Category: Information Technology
Apply Today!
