
Accelerate your career.

Explore opportunities across TA's portfolio

Senior Software Engineer (Intelligent Data Systems) 

OMNIA Partners


Software Engineering
Franklin, TN, USA
Posted on Jan 15, 2026

Job Title

Senior Software Engineer (Intelligent Data Systems)

Location

Franklin, TN

Open Position Summary – Senior Software Engineer

OMNIA Partners has become the largest and most experienced purchasing organization for public and private sector markets by delivering unmatched scale and solutions. Through continued organic growth and strategic acquisitions, OMNIA Partners will keep driving economies of scale to execute more contracts, in more verticals, with transparent, value-driven pricing for our membership of companies. OMNIA Partners is at the forefront of leveraging AI and data-driven solutions to enhance business operations and customer insights, and we are committed to using cutting-edge AI/ML capabilities to improve data quality, automation, and overall efficiency across the organization.

We are hiring a Senior Software Engineer to take the lead on designing and building the next generation of our data-driven products. In this role, you will bridge the gap between data science concepts and robust software engineering. You will be responsible for taking innovative ideas – often starting as proofs-of-concept – and architecting them into highly scalable, production-grade systems that drive immediate business impact.

Your primary focus will be building intelligent automation engines that ingest vast amounts of unstructured data from the outside world, make autonomous decisions about that data using AI/ML, and route actionable intelligence to our sales teams and partners. You will work closely with senior leadership to define technical strategy and join a talented team of engineers and architects dedicated to harnessing the power of AI for operational efficiency and growth.

Position Responsibilities:

  • System Architecture & Scaling: Lead the architectural design and implementation of complex backend systems, taking early-stage concepts and maturing them into resilient, high-load production environments.
  • Intelligent Data Acquisition: Develop robust strategies and systems for acquiring large volumes of data from diverse, often unstructured external sources, ensuring high data quality and reliability.
  • Applied AI/ML Integration: Design pipelines that integrate practical AI/ML models (such as text classifiers, NLP pipelines, or scoring algorithms) to automate complex decision-making processes and enrich incoming data streams.
  • Data Pipeline Engineering: Build high-throughput data pipelines that ingest, validate, process and route data efficiently to downstream applications, data warehouses and third-party ecosystems.
  • Database Strategy: Optimize data storage and retrieval strategies for large-scale datasets across both relational databases and modern data warehouses.
  • Technology Evaluation: Act as a technical leader by staying current with emerging tools in data engineering and applied AI, recommending adoption where it enhances our capabilities.

Required Education and Skills:

  • Software Engineering Foundations: 5+ years of backend software engineering experience, with a strong track record of building data-intensive applications.
  • Python Proficiency: Expert-level proficiency in Python, with experience using it for both system building and data processing.
  • Handling Unstructured Data: Demonstrated experience building systems that interact with, ingest and structure messy or complex external data sources at scale.
  • Applied Machine Learning: Practical experience integrating machine learning into production software workflows. You don't need to be a research scientist, but you must know how to apply standard ML libraries (e.g., scikit-learn, spaCy, or similar) to solve practical problems like classification, entity extraction, or scoring.
  • Database Expertise: Strong understanding of data modeling and performance tuning in relational databases and cloud data warehouses (preferably Snowflake).
  • API & Integration: Deep experience designing and consuming complex APIs to connect internal services with third-party data providers.
  • Cloud-Native Mindset: Experience building and deploying applications in cloud environments (AWS, GCP or Azure).

Preferred Qualifications:

  • Experience with containerized deployments (Docker/Kubernetes) and modern orchestration tools (e.g., Airflow, Celery).
  • Bachelor’s or Master’s degree in Computer Science or a related technical field.