TrueTandem

Senior Database Engineer

Remote Full Time
Company Description
TrueTandem's mission is to be a trusted information technology solutions provider, committed to the success of our customers, communities, and employees. To enable this mission, we listen to our customers' needs, empower our dedicated and talented employees, envision success together, and deliver innovative, cost-effective solutions. For our customers, we aim to deliver more power to meet their business outcomes through technology implementation, integration, optimization, and customization. We enable some of the most well-known companies, nonprofits, and federal agencies in the United States to intelligently plan and develop their applications, modernize their infrastructure, and manage their data.

Project Overview:

This role supports the digital transformation of mission-critical federal systems through the development and delivery of secure, scalable solutions built on Microsoft Power Platform, Dynamics 365, and Azure Data Services. The Database Engineer will play a key role in the data modernization initiative, leading data architecture, integration, and solution development across platform components.

 
As a Database Engineer on our delivery team, you'll join a culture focused on collaboration, iteration, and continuous improvement. Our agile DevSecOps teams are dedicated to exploring new technologies, automating where possible, and solving meaningful problems in support of public service missions.

 
Technical members of our solutions teams require little guidance, but love to learn, collaborate, and solve problems. This position requires experience with and a passion for coding, and a strong desire to solve our customers' unique technology challenges.


Role Summary:

The Database Engineer is a technical contributor responsible for designing, integrating, and optimizing the data architectures that power modern business applications. This hybrid role combines advanced Power Platform development capabilities with strong data engineering, integration, and modeling expertise. The ideal candidate brings a developer mindset to data modernization, building scalable, automated solutions that drive value and impact.


Core Responsibilities:

Data Processing, Coding, and File Preparation

  • Create SAS, SQL, Python, and R programs to prepare raw NCHS, BRFSS, SENSOR Pesticide, and Occupational Hearing Loss data for NIOCCS.
  • Generate files, split datasets, create data quality reports, and upload files for coding and human review.
  • Run repeated cycles of processing (e.g., inconsistent pair identification, coder-QC reconciliation, merging, updating KBs).
  • Develop programs to perform misspelling correction, identify company names, translate Spanish inputs, and improve literal quality before coding.
  • Produce datasets for large-scale testing and track review outcomes.

Automation, Tools, and System Development

  • Build .NET tools (e.g., bulk upload apps, new Admin System API lookup forms, auto-downloader apps).
  • Create Python programs interacting with the NIO API to test coding, generate outputs, and support external partners.
  • Develop machine learning workflows (LightGBM, SGDClassifier) to modernize coding logic.
  • Implement OpenAI-based spell checking, text normalization, company-name detection, and AI-augmented reviewer emulation.
  • Create R equivalents of key SAS processes to diversify system coverage.

API (NIO API) Development, Testing, and Deployment

  • Test, debug, and refine the new NIO API parameters, outputs, and integration with external systems.
  • Generate API metrics, track API hits, and query DGW and Azure Log Analytics for API performance monitoring.
  • Work with Digital Gateway, OCIO, and Software Assurance Teams to move the API toward production readiness.
  • Provide API keys, usage guidance, and troubleshooting support to external vendors and state partners.
  • Communicate known issues and coordinate remediation strategies for existing API users.

System Administration & Infrastructure Support

  • Manage the inbox and create SAMS accounts for new users.
  • Update service account passwords and all relevant web configuration files and scheduled tasks.
  • Test server migrations (SQL Server 2016 → 2019), update database connections, track dependencies, and coordinate with OCIO/DB teams.
  • Review logs, fix system issues (autocoder failures, inactive navigation links, truncated literals, etc.).
  • Maintain distribution lists, update app config paths, resolve server access needs, and oversee staging/production environment access.

Coder Workflow, QC, and Reviewer Support

  • Create coder and QC performance reports (monthly reports, workload estimates, review statistics).
  • Generate lists for reviewer assignments, track coder workloads, and forecast QC needs.
  • Provide specialized datasets to reviewers (e.g., flagged literals, inconsistent pairs, company-name review files).
  • Debug BRFSS, NCHS, SENSOR, and other project-specific coding anomalies.
  • Evaluate improvement metrics after applying AI-enhanced preprocessing to reduce coder review burden.

Cross-Team Coordination and Communication

  • Provide technical guidance to internal and external partners on:
    • preparing files for NIOCCS,
    • interpreting outputs,
    • navigating review processes,
    • utilizing the NIO API.
  • Coordinate data flows, discuss timelines, and determine project scope changes with leadership.
  • Respond to external inquiries (EHR vendors, universities, state health departments, public users).

Metrics, Monitoring, and Reporting

  • Update the weekly Web Service Hits Splunk report.
  • Create SQL queries, dashboards, Visio process flowcharts, and Power BI prototypes for activity and metrics.
  • Track API usage, web hits, and code review trends across multiple projects and timeframes.

Research, Modernization, and Process Improvement

  • Explore alternative system builds, modern coding algorithms, ML modeling, and AI-powered preprocessing strategies.
  • Review and propose modernizations to data flows, SENSOR Pesticide processes, and BRFSS pipelines.
  • Investigate security findings, CIPSEA compliance, data retention policies, and DMAT reporting integration.
  • Prepare outlines, slide decks, and plans for advancing the NIO API ecosystem.

Training, Documentation, and Knowledge Sharing

  • Produce documentation, readme reviews, instructions, and training files for partners and reviewers.
  • Present at EDAV SAS Viya groups and internal meetings on AI-enhanced preprocessing.
 
Required Experience:


Minimum 5 years of total IT experience, with 7+ years of application development experience spanning the design, modeling, visualization, integration, and warehousing of data systems.

Strong experience with backend data design using Dataverse, SQL Server, and SharePoint.

Strong experience integrating Power Platform with other systems using Azure, Data Factory, Data Flows, Power Automate, Logic Apps, Azure Functions, REST APIs, and custom connectors.