iPipeline Inc

Lead Data Architect

Job ID
2024-1107
Category
Information Systems
Location
UK-Bromley
Additional Job Location
iPipeline - Pennsylvania
Additional Job Location
iPipeline UK-Cheltenham

Overview

As a global market leader, iPipeline combines technology, innovation, and expertise to deliver ground-breaking, award-winning software solutions that transform the life insurance, financial services, and protection industries. With one of the industry’s largest data sets, we help advisors/advisers and agents transform paper and manual operations into a secure, seamless digital experience – from proposal to commission – so they can help better secure the financial futures of their clients.

 

At iPipeline, you’ll play a major role in helping us to provide best-in-class, transformative solutions. We’re passionate, creative, and innovative, and together as a team, we continually strive to advance, accelerate, and expand the reach of our technology. We value different perspectives and are committed to creating an environment that embraces diverse backgrounds and fosters inclusion.

 

We’re proud to have been recognized as a repeat winner of various industry awards, demonstrating our excellence and highlighting us as a top workplace in both the US and the UK. We believe that the culture we’ve built for our nearly 900 employees around the world is exceptional – and we’ve created a place where our employees love to come to work, every single day.

 

Come join our team!

 

About iPipeline

Founded in 1995, iPipeline operates as a business unit of Roper Technologies (Nasdaq: ROP), a constituent of the Nasdaq 100, S&P 500®, and Fortune 1000® indices. iPipeline is a leading global provider of comprehensive and integrated digital solutions for the life insurance and financial services industries in North America, and the life insurance and pensions industries in the UK. We couple one of the most expansive digital and automated platforms with one of the industry’s largest data libraries to accelerate, automate, and simplify various applications, processes, and workflows – from quote to commission – with seamless integration. Our vision is to help everyone achieve lasting financial security by delivering innovative solutions that connect, simplify, and transform the industry.

 

iPipeline is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to gender, race, color, religious creed, national origin, age, sexual orientation, gender identity, physical or mental disability, and/or protected veteran status. We are committed to building a supportive and inclusive environment for all employees.

Responsibilities

This position will work with other data team members to enhance our sophisticated data platform built on AWS, Snowflake, Sigma, and Power BI. The role is responsible for designing and implementing data warehouse and operational data store (ODS) features and improvements in Snowflake, and for managing ELT from our AWS data lake. This includes tooling to support other developers as well as the creation of the base data structures that power all downstream analytics and data services. The position works with experts from our other departments who design SaaS applications in the financial industry to architect, design, and build data models and semantic layers based on business rules.

 

Responsibilities:

  • Migrate and modernize data and tools from on-premises systems to cloud technologies
  • Design and create data models that support near real-time analytics
  • Design, develop, automate, monitor, and maintain ELT/ETL applications on Snowflake and AWS using preferred tools and techniques (a rough sketch follows this list)
  • Design, develop, and support internal tooling for data lake deployment and management
  • Tune recurring SQL code on Snowflake for optimal performance and cost
  • Work with our analytics team to provide backend support for our data offerings
  • Enable data observability and resource governance in Snowflake
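
As a rough illustration of the ELT work described above, the sketch below shows one common near real-time pattern in Snowflake SQL: an external stage over an S3 data lake, a Snowpipe that lands raw files, and a stream/task pair that merges new records into an ODS table. Every object name here (lake_stage, raw_events, ods_events, elt_wh, the S3 URL) is hypothetical; the actual platform, schemas, and tooling will differ.

    -- Hypothetical sketch only: names, paths, and schedule are illustrative.
    -- External stage over the data lake bucket (placeholder S3 path).
    CREATE STAGE IF NOT EXISTS lake_stage
      URL = 's3://example-data-lake/events/'
      FILE_FORMAT = (TYPE = PARQUET);

    -- Landing table for raw semi-structured records.
    CREATE TABLE IF NOT EXISTS raw_events (
      payload   VARIANT,
      loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );

    -- Snowpipe auto-ingests files as they arrive in the lake
    -- (assumes bucket event notifications are configured).
    CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events (payload)
      FROM (SELECT $1 FROM @lake_stage);

    -- Target ODS table, plus a stream to track unprocessed landing rows.
    CREATE TABLE IF NOT EXISTS ods_events (
      event_id    STRING,
      event_type  STRING,
      occurred_at TIMESTAMP_NTZ
    );
    CREATE STREAM IF NOT EXISTS raw_events_stream ON TABLE raw_events;

    -- Task wakes every 5 minutes but only runs when the stream has data.
    CREATE TASK IF NOT EXISTS merge_events_task
      WAREHOUSE = elt_wh
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
    AS
      MERGE INTO ods_events t
      USING (
        SELECT payload:event_id::STRING           AS event_id,
               payload:event_type::STRING         AS event_type,
               payload:occurred_at::TIMESTAMP_NTZ AS occurred_at
        FROM raw_events_stream
      ) s
      ON t.event_id = s.event_id
      WHEN MATCHED THEN UPDATE SET
        event_type = s.event_type, occurred_at = s.occurred_at
      WHEN NOT MATCHED THEN INSERT (event_id, event_type, occurred_at)
        VALUES (s.event_id, s.event_type, s.occurred_at);

A pattern like this keeps the data lake as the system of record while Snowflake tasks keep the ODS within minutes of fresh, which is what powers the near real-time analytics mentioned above.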

Qualifications

Required Qualifications:

  • 5+ years of experience with AWS technologies
  • Data warehouse experience, preferably in Snowflake
  • Experience with data delivery and data lake concepts
  • Experience leveraging AWS services in production solutions to support real-time data pipelines
  • Experience architecting real-time ETL/ELT data pipelines at scale using a variety of database management systems
  • Experience with high-volume data lakes, relational data warehouses, and data lakehouse architectures
  • Experience leading technical direction for a team of data analytics developers, data engineers, and data scientists
  • Experience building star schema models (a minimal example follows this list)
  • Strong communication skills and the ability to work with both technical and business teams
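
For the star schema item above, here is a minimal sketch of the pattern: a central fact table at a declared grain, keyed to descriptive dimension tables. The quote-centric domain and all names are hypothetical, chosen only to keep the example concrete.

    -- Hypothetical star schema sketch (Snowflake-flavored SQL; names illustrative).
    -- Dimensions hold descriptive attributes used for slicing and filtering.
    CREATE TABLE dim_date (
      date_key      INTEGER PRIMARY KEY,  -- e.g. 20240131
      calendar_date DATE,
      month_name    VARCHAR,
      year_number   INTEGER
    );

    CREATE TABLE dim_product (
      product_key  INTEGER PRIMARY KEY,
      product_name VARCHAR,
      product_line VARCHAR
    );

    -- Fact table: one row per quote event; measures are additive across dimensions.
    CREATE TABLE fact_quote (
      quote_key    INTEGER PRIMARY KEY,
      date_key     INTEGER REFERENCES dim_date (date_key),
      product_key  INTEGER REFERENCES dim_product (product_key),
      quote_amount NUMBER(12, 2),
      quote_count  INTEGER DEFAULT 1
    );

Analytics queries then join the fact table to whichever dimensions a report needs, which is the shape that BI tools such as Power BI and Sigma consume most naturally.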

 

Desired Qualifications:

  • Cross-disciplinary experience with coding languages and data (for example, combining SQL, Python, and JavaScript, or functional programming with an awareness of data and sets)
  • Experience with Infrastructure as Code (IaC) methodology and tools such as Terraform or Chef
  • Experience with BI reporting tools such as Power BI, Tableau, QLIK, or Sigma
  • Experience as a Snowflake or cloud data warehouse administrator
  • Experience with dbt Core or dbt Cloud (a small model sketch follows this list)
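
As a brief illustration of the dbt item above: dbt models are templated SELECT statements that dbt compiles and materializes in the warehouse. The sketch below shows a hypothetical incremental model; the file path, model names, and columns are all invented for illustration.

    -- models/marts/fct_quotes.sql (hypothetical dbt model)
    -- dbt resolves the Jinja, builds stg_quotes first, and materializes this
    -- query as an incrementally loaded table keyed on quote_id.
    {{ config(materialized = 'incremental', unique_key = 'quote_id') }}

    SELECT
        quote_id,
        product_code,
        quote_amount,
        quoted_at
    FROM {{ ref('stg_quotes') }}

    {% if is_incremental() %}
      -- On incremental runs, process only rows newer than what is already loaded.
      WHERE quoted_at > (SELECT MAX(quoted_at) FROM {{ this }})
    {% endif %}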

 
