We are seeking a highly skilled and proactive Data Warehouse Engineer to join our team. In this role, you will transform complex, unstructured data from a variety of sources into consistent, machine-readable formats that support predictive and prescriptive analytics. You will design, develop, and test scalable data pipelines and ETL architectures, playing a key role in driving data-informed decision-making across the organization. This position requires strong analytical abilities, programming expertise, and a collaborative mindset. If you’re passionate about building data infrastructure that powers business intelligence, we want to hear from you.
Remote, but candidates must reside in one of the following states: Arkansas, California, Colorado, Georgia, Idaho, Illinois, Nevada, New Hampshire, Oklahoma, Oregon, Pennsylvania, Texas, or Washington.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
- Design, develop, and maintain scalable and efficient data pipelines that ensure high data quality and integrity.
- Integrate and harmonize data from multiple sources to support analytics and reporting initiatives.
- Build and test robust ETL architectures to streamline data extraction, transformation, and loading processes.
- Partner with the Business Intelligence team to support data modeling and analytical frameworks.
- Monitor, troubleshoot, and optimize data systems to maintain performance and reliability.
- Ensure thorough documentation of data pipelines, transformations, and governance processes.
- Work with analysts and stakeholders to understand data needs and provide technical support for analytical projects.
- Implement and maintain security and compliance protocols in line with relevant standards and regulations.
- Resolve data-related issues in a timely and effective manner.
QUALIFICATIONS AND REQUIREMENTS:
- Proven ability to build positive, collaborative working relationships across diverse teams.
- Proficiency in Python and SQL for data manipulation and automation.
- Hands-on experience with ETL tools and data warehousing platforms, particularly Microsoft SQL Server and Azure.
- Epic Clarity data model and Caboodle Development certifications required; additional Epic certifications are a plus.
- Experience developing extracts from OCHIN Epic preferred.
- Strong analytical thinking with attention to detail and problem-solving capabilities.
- Excellent verbal and written communication skills.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
- Creative thinker who brings new ideas and solutions to the table.
- Must be able to work during Pacific Time (PT) business hours.
- Experience with healthcare industry data and standards is highly desirable.
EDUCATION AND EXPERIENCE:
- Bachelor’s degree or equivalent combination of education and related experience.
- Minimum of 5 years of advanced data engineering experience.