OU Health

Data Engineer III

Virtual, Full Time

Position Title:

Data Engineer III

Department:

ETS Analytics and AI Data

Job Description:

The Data Engineer III leads and serves as the subject matter expert in the design, development, and delivery of data pipelines and value-added data assets across the OU Health data ecosystem, leveraging a variety of data warehousing methodologies and disciplines. The Data Engineer III primarily designs, builds, and maintains these data assets. Emphasizing agility, partnership, and cross-functional teamwork, the Data Engineer III executes initiatives throughout the product lifecycle and works closely with Data Scientists, Business Intelligence Developers, and other colleagues to build or enhance robust data systems. Initiatives will often be of significant complexity and risk. The Data Engineer III also serves as a technical guide and mentor to the ETS department.

Essential Responsibilities

Responsibilities listed in this section are core to the position. Inability to perform these responsibilities with or without an accommodation may result in disqualification from the position.

  • Leverage subject matter expertise with a variety of data engineering, DataOps, and data warehousing methodologies, techniques, tools, and platforms to transform large quantities of data from multiple sources.

  • Design, create, test, deploy and maintain data pipelines that deliver curated, value-added data assets such as data marts and other purpose-built data stores. Ensure data pipelines are optimized, highly reliable, and contain low technical debt.

  • Design, build, and maintain the tools and infrastructure needed to handle large datasets.

  • Data Governance: Enforce data governance policies, including data quality, validation, lineage, metadata management, and adherence to healthcare regulations.

  • Quality Assurance: Develop and implement comprehensive data quality frameworks, addressing issues such as data accuracy, completeness, and consistency.

  • Work closely with different application and operational teams to understand business needs and align data engineering initiatives accordingly.

  • Guide, mentor, quality-review, and train the Data Engineering team and ETS department on technical skills and best practices.

General Responsibilities

  • Performs other duties as assigned.

Minimum Qualifications

Education Requirements: Bachelor's Degree required.

Experience Requirements: 5 or more years in analytics (Business Intelligence, Data Engineering, Data Science, etc.) required.

License/Certification/Registration Requirements: Epic certification/accreditation required within 6 months of hire or within 3 months of class completion, including:

  • Cogito Fundamentals

  • Clarity Data Model

  • Caboodle Data Model

  • Access Data Model

  • Revenue Data Model

  • Clinical Data Model

  • Caboodle Development

  • Additional Epic classes as requested to support evolving business needs

Knowledge/Skills/Abilities Required:

  • Expert-level analytic skills related to working with structured and unstructured datasets.

  • Guide, mentor, and train the Data Engineering team, Data Scientists, and Business Intelligence Developers on technical skills and best practices.

  • Must possess critical thinking and creative problem-solving skills along with the ability to communicate well with stakeholders throughout the organization.

  • Effective communication, project management and organizational skills.

  • Experience supporting and working with cross-functional teams in a dynamic environment.

  • Working knowledge of stream processing and highly scalable data stores.

  • Previous experience manipulating, processing, and extracting value from large, disconnected datasets.

  • Expert-level SQL and data manipulation skills.

  • Exposure to big data tools: dbt, Snowpark, Spark, Kafka, etc.

  • Experience with relational SQL and NoSQL databases, including Snowflake, MS SQL Server, and PostgreSQL.

  • Experience with integration tools: Fivetran, Matillion, SSIS, dbt, SnowSQL.

  • Exposure to stream-processing systems: IBM Streams, Flume, Storm, Spark Streaming, etc.

  • Exposure to consuming and building APIs.

  • Exposure to object-oriented/functional programming languages: Python, Java, C++, Scala, etc.

  • Experience with statistical data analysis tools: R, SAS, SPSS, etc.

  • Experience with visual analytics tools: QlikView, Tableau, Power BI, etc.

  • Familiarity with Agile development methodology.

  • Familiarity with electronic health record and financial systems, e.g., Epic, Workday, Strata.

  • Ability to work independently and within teams.

  • Ability to develop and advise on data asset use to provide solutions to organizational needs.


OU Health is an equal opportunity employer. We offer a comprehensive benefits package, including PTO, 401(k), medical and dental plans, and more. We know that a total benefits and compensation package, designed to meet your specific needs both inside and outside of the work environment, creates peace of mind for you and your family.