MLOps

Inference

Guides model-serving and runtime-inference decisions across local, remote, and packaged deployment paths.

On demand · Available when invoked · Built in

Install from Aegis: `aegis skills install inference`
Overview

Bundled with the packaged Aegis CLI as a built-in procedural skill, so it is available without a separate install. Run `aegis skills install inference` only when you want an explicit local materialization record.

Aliases

model inference · serve a model · run inference

Trigger phrases

"help me serve this model" · "set up inference for this model" · "run local inference"

Keywords

inference · serving · latency · throughput · deployment · runtime