---
name: triton-inference-config
description: |
  Triton Inference Config - Auto-activating skill for ML Deployment.
  Triggers on: triton inference config.
  Part of the ML Deployment skill category.
allowed-tools: Read, Write, Edit, Bash, Grep
version: 1.0.0
license: MIT
author: Jeremy Longshore
---

# Triton Inference Config

## Purpose

This skill provides automated assistance with Triton Inference Server model-configuration tasks within the ML Deployment domain.

## When to Use

This skill activates automatically when you:

- Mention "triton inference config" in your request
- Ask about Triton inference configuration patterns or best practices
- Need help with ML deployment tasks covering model serving, MLOps pipelines, monitoring, or production optimization

## Capabilities

- Provides step-by-step guidance for Triton inference configuration
- Follows industry best practices and patterns
- Generates production-ready code and configurations
- Validates outputs against common standards

## Example Triggers

- "Help me with triton inference config"
- "Set up triton inference config"
- "How do I implement triton inference config?"

## Related Skills

Part of the **ML Deployment** skill category.

Tags: mlops, serving, inference, monitoring, production
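
## Example Output

As a concrete illustration of the kind of configuration this skill helps produce, here is a minimal `config.pbtxt` sketch for a hypothetical ONNX classifier served by Triton. The model name, tensor names, and dimensions below are illustrative assumptions, not values mandated by this skill:

```protobuf
# Minimal Triton model configuration (config.pbtxt) for a hypothetical
# ONNX classifier. All names and dimensions are illustrative.
name: "example_classifier"
platform: "onnxruntime_onnx"
max_batch_size: 8

input [
  {
    name: "input__0"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]   # CHW image tensor; batch dim is implicit
  }
]

output [
  {
    name: "output__0"
    data_type: TYPE_FP32
    dims: [ 1000 ]          # class logits
  }
]

# Group requests arriving within a short window to improve throughput.
dynamic_batching {
  max_queue_delay_microseconds: 100
}
```

In a standard Triton model repository layout, this file would sit at `<model_repository>/example_classifier/config.pbtxt`, with the model weights in a numbered version subdirectory (e.g. `example_classifier/1/model.onnx`).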