Principal Data Engineer - Boise, Idaho
Company: Boeing Employees Credit Union
Location: Boise, Idaho
Posted On: 01/27/2025
Is it surprising to hear that a financial institution of 1.5 million members and over $30 billion in managed assets says that success comes from focusing on people, not profits? Our "people helping people" philosophy has guided us since 1935, driving our deep commitment to serving our members, communities, and each other. When you join our team, you become part of a purpose-driven organization where your work makes a real difference.

While we're proud of our history, we're even more excited about our future. With business and technology transformation on the horizon, there's never been a better time to be part of BECU.

PAY RANGE

The Target Pay Range for this position is $184,300.00-$225,000.00 annually. The full Pay Range is $138,800.00-$258,500.00 annually. At BECU, compensation decisions are determined using factors such as relevant job-related skills, experience, and education or training. Should an offer for employment be made, we will consider individual qualifications. In addition to your salary, compensation incentives are available for the hired applicant. Incentives are performance-based, and targets vary by role.

BENEFITS

Employees and their eligible family members have access to a wide array of employee benefits, such as medical, dental, vision, and life insurance coverage. Employees have access to disability and AD&D insurance. We also offer health care and dependent care flexible spending accounts, as well as health savings accounts, to eligible employees. Employees are able to enroll in our company's 401(k) plan and employer-funded retirement plan. Newly hired employees accrue 6.16 hours of paid time off (PTO) per pay period based on hours worked (up to a maximum of 160 PTO hours per year) and receive ten paid holidays throughout the calendar year. Additional details regarding BECU Benefits can be found here.

IMPACT YOU'LL MAKE:

The Principal Data Engineer will lead data teams in implementing enterprise data strategies, leveraging expertise in complex data management technologies, and enabling machine learning (ML) and artificial intelligence (AI) solutions. This role is pivotal in ensuring data quality through governance, maintaining data integrity, and optimizing data processes. The Principal Data Engineer will partner with cybersecurity and architecture teams to adopt best practices, deliver scalable data solutions, and stay aligned with evolving technologies. Responsibilities include designing robust data pipelines, ensuring data protection, and presenting future-state roadmaps to technical leadership. The role also fosters collaboration with data architects and security teams to design and implement enterprise-class data workflows.

This is a single role being advertised across our approved locations (WA, OR, ID, AZ, TX, GA, or SC). To join our dynamic team, candidates must be residents of WA, OR, ID, AZ, TX, GA, or SC. If you're located in Washington state and within a reasonable driving distance of Tukwila, we ask that you come into our HQ on Tuesdays and Wednesdays. For candidates who live outside the commute distance of TFC and in any of our approved remote work locations, this role will be remote. Remote or onsite, we are committed to ensuring you are fully engaged and included in our collaborative environment.

WHAT YOU'LL DO:

This isn't just about ticking off tasks on a list. It's about making a significant, positive change in BECU's journey, where your contributions are valued and your growth is continually fostered.
- Strategic Leadership: Guide and structure complex, enterprise-wide data solutions, ensuring alignment with organizational goals and regulatory standards.
- Process Improvement: Assess and refine data engineering processes to enhance performance, scalability, and efficiency.
- Technical Communication: Present and explain technical topics clearly to engineering teams, vendors, and business stakeholders.
- Collaboration: Partner with leadership and technical teams to create high-performing, best-practice-driven data solutions.
- Prototyping and Design: Develop and implement prototypes, proofs of concept, and scalable, production-ready data workflows.
- Data Governance: Lead efforts to improve metadata and master data management policies in partnership with Data Governance and Architecture teams.
- Policy Development: Create and manage data curation policies to optimize workflows and pipelines.
- Team Guidance: Provide mentorship and frameworks to support data engineering teams in development, testing, and reviews.
- Roadmap Development: Collaborate with Architects and Product Owners to define and document the team's technology vision and roadmap.
- Training and Development: Work with leadership to establish training plans that advance data engineering competencies aligned with industry trends.
- Continuous Learning: Represent BECU at conferences and seminars, staying ahead of data management innovations and sharing insights.

WHAT YOU'LL GAIN:
- Cutting-Edge Technology Exposure: Work with modern data platforms, cloud technologies, and AI/ML integrations.
- Professional Growth: Opportunities for training, mentorship, and industry engagement to advance your expertise.
- Collaboration and Impact: Join a supportive team environment where your ideas and skills drive meaningful contributions.
- Recognition and Value: Be part of an organization that prioritizes innovation, inclusivity, and member satisfaction.
- Flexible Work Environment: Enjoy a balanced approach to work-life integration while tackling challenging and rewarding projects.

QUALIFICATIONS:

Minimum Qualifications:
- Bachelor's degree in computer science or a business discipline, or equivalent work or education-related experience.
- Minimum 10 years of progressive experience in data engineering.
- Minimum 6 years of experience with coding languages such as Python, Scala, Java, and SQL.
- Knowledge of cloud technologies (platforms like Azure, AWS, GCP).
- Knowledge of one or more modern data platforms (Azure Synapse, Databricks, Redshift, BigQuery, Snowflake).
- Experience in distributed parallel processing.
- Advanced experience with SQL, data modeling, large datasets, data warehousing, data analysis, and analytics engines.
- Knowledge of cloud-hosted SQL-based datastores and NoSQL systems.
- Knowledge of the Software Development Life Cycle (SDLC) and Agile methodologies.
- Experience with Continuous Integration and Continuous Delivery systems and tools such as Azure DevOps, GitHub.
- Experience managing source control models and organization development practices.
- Experience in git, performing code reviews, pull requests, and following branching standards such as Git Flow or Trunk-Based Development.
- Experience in delivering highly scalable data solutions using programming languages compatible with current technical environments.
- Implementation experience with cloud technologies (platforms like Azure, AWS, GCP).
- Implementation experience with two or more modern data platforms (Azure Synapse, Databricks, Redshift, BigQuery, Snowflake).

Desired Qualifications: