Position Summary:
Cigna, a leading Health Services company, is seeking a Data Modeler within our Data & Analytics organization. This role is responsible for designing and maintaining scalable, high-performance data models that power transactional, operational, and analytical systems. The Data Modeler will work across on-premises databases, AWS cloud services, and Databricks Lakehouse platforms to ensure data structures support both large-scale ingestion and high-performance analytics. This role requires fluency in some of the critical technologies, proficiency in others, and a hunger to learn on the job and add value to the business.
This role requires a balance of strong data modeling expertise, hands-on experience across multiple data platforms (RDBMS, NoSQL, DocumentDB, Lakehouse), and the ability to collaborate with engineers, architects, and business stakeholders.
Behaviors of a Data Modeler:
Data Modelers are expected to articulate clear business objectives aligned to technical specifications and to work in an iterative, agile pattern daily. They take ownership of their work, embrace interacting with all levels of the team, and raise challenges when necessary.
Job Description & Responsibilities:
The Data Modeler will work on a team of truly talented individuals and be responsible for soliciting requirements, performing deep data analysis, and building a thorough understanding of a complex, multi-domain data ecosystem. This requires the ability to understand Individual & Family Retail Health Insurance applications and their components, which interface with complex enterprise systems and external vendor systems. The Data Modeler will have solid experience working with complex distributed applications and a passion for solving data integration challenges. The individual will have the zeal and drive to stay current with the latest developments in technology.
The Data Modeler must be a highly motivated, well-rounded team player and self-starter who works best in a collaborative, dynamic, agile environment. Excellent written and oral communication skills are also essential, as this position will interface with remote scrum teams, business owners, enterprise architects, security, infrastructure, and end users via email, phone, IM, desktop sharing, and wiki. They should be able to demonstrate the qualities listed below.
Qualities:
Analytical Skills: Candidate must have strong analytical skills and be able to recognize customers' needs and create simple solutions that meet them.
Communication: Candidate must be able to clearly communicate their ideas to peers, stakeholders, and management.
Creativity: Creativity is needed to help invent new ways of approaching problems and developing innovative applications as well as bringing experience from other industries.
Customer-Service: If dealing directly with clients and customers, the candidate needs good customer service skills and a consultant mentality to answer questions and fix issues.
Attention to Detail: Applications have many parts, and all must work together for the application to function.
Problem-Solving: As issues come up, candidate must be able to make decisions that move the project forward.
Teamwork: Candidate must work well with others as part of a distributed agile (SAFe) team of engineers, analysts, product owner, and scrum master.
Technical Skills: Candidate must be adept in computer languages and have strong technical aptitude. Must have solid knowledge of common design patterns and should seek out opportunities to implement them.
Leadership Skills: Candidate is expected to lead by example, exhibiting technical excellence and development of expert level business domain knowledge. Influences technical direction within the ART and across ARTs. Advocates for a shared technical vision, enables others to act to fulfill the vision. Challenges existing processes through relentless improvement.
Responsibilities:
Design, develop, and optimize conceptual, logical, and physical data models across diverse platforms (RDBMS, NoSQL, and cloud-native databases).
Model transactional data structures in Oracle, Aurora, DynamoDB, and DocumentDB.
Translate legacy/on-prem relational models into cloud-based models (Aurora/DynamoDB/DocumentDB).
Design and implement dimensional models (star/snowflake) and Data Vault 2.0 for analytical workloads in the conformed zone.
Create Unified Star Schema (USS) and other optimized dimensional models for reporting/analytics in the published zone.
Ensure data models can scale to handle 500M+ records with high-speed write throughput (raw & conformed layers) and read-optimized designs for published layers.
Leverage open table formats in Databricks Lakehouse (Delta Lake/Iceberg) for modeling across raw, conformed, and published layers.
Maintain data dictionaries, metadata repositories, and model documentation using industry-standard tools such as Erwin, PowerDesigner, or equivalent.
Collaborate with architects and business stakeholders to ensure models align with business requirements and data governance standards.
Data model design and development: Develop and maintain conceptual, logical, and physical data models to support business needs for analytics, operational applications, and reporting.
Requirements gathering: Work with stakeholders, product owners and system analysts to understand business processes, data needs, and reporting requirements.
Data integration: Design data mapping specifications and work with data engineers and system analysts to integrate data from diverse sources into data vault implementations, data warehouses or data lakes.
Data quality and integrity: Ensure the accuracy, consistency, and completeness of data by establishing validation rules and constraints.
Performance optimization: Identify and address performance bottlenecks in data models and optimize database queries for faster retrieval and processing.
Documentation and governance: Create and maintain detailed documentation, including data dictionaries, data flow diagrams, and metadata repositories. Uphold data governance standards and best practices.
System evaluation: Review and evaluate existing data systems for efficiency, discrepancies, and scalability.
A key aspect of this responsibility is ensuring that data is accurately cleaned, transformed, and loaded to enable consistent and reliable analytics and reporting. High-quality data is essential to our business foundation.
Enable a 360-degree view of customer-centric information through integration of a multitude of internal/external systems, mobile apps, devices, and data marts.
Support and enhance existing Individual & Family Retail Health Insurance applications used by consumers as well as operations staff.
Participate in all agile ceremonies effectively.
Mentor and coach a team of junior developers.
Skillset:
The Data Modeling Advisor should have 11 to 13 years of experience, be very familiar with advanced concepts, and have relevant, hands-on experience in many of the following areas to be a successful contributor on the team:
Expertise in data modeling methodologies, such as relational, dimensional, and entity-relationship (ER) modeling.
Knowledge of various structured and unstructured databases.
Understanding of OLTP and OLAP concepts, normalization and denormalization, and how business use cases drive data model design and implementation.
Proficiency in data modeling tools like ERwin, ER/Studio, or PowerDesigner.
Advanced knowledge of SQL for complex querying, data manipulation, and analysis.
Understanding of Databricks DLT (Delta Live Tables) and workflows and associated data model patterns.
Proven expertise in RDBMS modeling (Oracle, SQL Server, Aurora).
Hands-on experience with NoSQL (DynamoDB) and document databases (DocumentDB, MongoDB).
Strong background in dimensional modeling, Data Vault 2.0, and Unified Star Schema (USS).
Familiarity with Databricks Lakehouse and open table formats (Delta Lake, Iceberg).
Experience handling large-scale datasets (500M+ records) with performance optimization for write and read workloads.
Excellent communication skills and ability to work closely with business and technical stakeholders.
Qualifications:
Bachelor’s degree in computer science or a related discipline is strongly preferred, along with typically eleven or more years of solid, diverse work experience in IT, including a minimum of eight years’ experience as a data modeler, or the equivalent combination of education and work experience. The ideal candidate will have relevant experience either as a consultant or working for a start-up company.
About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.