  • Cloud AI Engineer

    Job Title: Senior Cloud AI Engineer

    Location: Addison, TX
    Job Type: Full-time


    Job Summary

    We are seeking a highly skilled Senior Cloud AI Engineer to join our advanced AI systems team. This role focuses on architecting, deploying, and optimizing hybrid cloud solutions and machine learning pipelines to power large language model (LLM) inference and generative AI applications. The ideal candidate will have deep experience with multi-cloud infrastructure, scalable API development, GPU provisioning, and distributed data workflows.


    Key Responsibilities

    • Architect and implement hybrid cloud solutions across AWS, Azure, and GCP, prioritizing scalability, availability, and cost efficiency.

    • Develop and maintain RESTful APIs using FastAPI and Swagger to support real-time LLM inference and serve scalable model pipelines.

    • Optimize and deploy generative AI models (e.g., LLaMA, Mistral, OpenAI GPT), including retrieval-augmented generation (RAG) pipelines using Ray and VectorAI.

    • Automate infrastructure provisioning using Terraform, Ansible, and Crossplane to support multi-cloud deployments.

    • Enable reproducible machine learning workflows using MLflow, DVC, and VectorAI, supporting experiment tracking and model versioning.

    • Provision and manage GPU-accelerated infrastructure, improving LLM training throughput by up to 50%.

    • Utilize vector databases such as Milvus and Pinecone with Apache Iceberg to enable efficient semantic search and maintain dataset lineage.

    • Design and orchestrate real-time and batch data workflows using Apache Airflow, Spark, and Flink for scalable data processing.

    • Implement observability and monitoring solutions using Prometheus, Datadog, and Splunk to ensure system reliability and performance.

    • Build insightful dashboards and metrics pipelines to drive operational visibility, performance tuning, and rapid debugging.


    Required Qualifications

    • 5+ years of experience in cloud infrastructure engineering, DevOps, or MLOps roles.

    • Demonstrated expertise with AWS, GCP, and Azure in production-grade environments.

    • Proven experience with FastAPI, Swagger, and scalable API design.

    • Solid background in LLM optimization, model serving, and RAG implementations.

    • Hands-on experience with Terraform, Ansible, Crossplane, and GPU provisioning strategies.

    • Familiarity with ML lifecycle tools (MLflow, DVC) and vector stores (Milvus, Pinecone).

    • Strong understanding of data orchestration frameworks (Airflow, Spark, Flink).

    • Deep knowledge of observability tools and building telemetry pipelines.

    • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.


    Preferred Qualifications

    • Experience with container orchestration (Kubernetes, Ray Serve).

    • Contributions to open-source AI or ML infrastructure projects.

    • Strong communication skills and the ability to lead cross-functional initiatives.

    June 28, 2025
  • Senior ETL Engineer

    Job Title: Senior ETL Engineer
    Location: Addison, TX
    Job Type: [Full-Time / Contract]


    Job Summary:

    We are seeking a Senior ETL Engineer with strong technical expertise in ETL development, Apache Spark, Teradata, and cloud-based technologies to join our dynamic team supporting critical data engineering initiatives. The ideal candidate will have hands-on experience with complex data analysis, test automation, and a deep understanding of metadata and data quality best practices.


    Key Responsibilities:

    • Define and design end-to-end data integration and ETL solutions to meet business and strategic goals

    • Lead development, bug-fixing, and maintenance of ETL pipelines and applications using Ab Initio, Spark (Java Spark), UNIX, and Teradata

    • Build high-performing and scalable ETL/data transformation solutions using Spark and other distributed data processing frameworks

    • Lead and participate in all phases of the software development life cycle, including requirements gathering, design, development, testing, deployment, and support

    • Collaborate with internal and external partners to resolve technical issues and ensure data integrity

    • Serve as subject matter expert on data flows, metadata processing, and quality standards

    • Prepare and review technical documentation including Data Models, Data Flows, Source-to-Target mappings, and high-level designs

    • Lead projects and provide technical guidance to junior team members

    • Ensure quality, security, and compliance standards are met for all supported systems and solutions


    Required Skills & Experience:

    • 4–6 years of hands-on experience in ETL development and Teradata

    • 2–4 years of experience with Apache Spark, test automation and test implementation, and the AutoSys scheduler

    • Strong working knowledge of cloud platforms and technologies (e.g., Google Cloud Platform), including cloud security and compliance

    • 1–2 years of experience with Java-based Spark development and cloud-based technologies

    • Deep experience in complex data analysis and data integration design patterns

    • Familiarity with metadata-driven frameworks and Data Quality Engineering best practices

    • Financial services experience required; Consumer Lending and Home Lending domain expertise is strongly preferred

    • Working knowledge of Ab Initio, Express>IT, and DQE is a plus

    May 29, 2025

© 2025 Strategic Staff. All rights reserved.