Unsafe AI Artifacts
This technique has been observed in real-world attacks on AI systems.
Adversaries may develop unsafe AI artifacts that, when executed, have a deleterious effect. The adversary can use this technique to establish persistent access to systems. These artifacts may be introduced via an [AI Supply Chain Compromise](/techniques/AML.T0010).
Serialization is the common way to store, transfer, and load models. However, many serialization formats run code as a side effect of deserialization; Python's pickle, which underlies PyTorch checkpoints and joblib/scikit-learn model files, is the prominent example, since a pickle stream can instruct the loader to import and call any importable function. Loading such an artifact without proper checking therefore presents an opportunity for arbitrary code execution.
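As an added example that is not part of the original page, the block below builds a constructed pickle-based "checkpoint" whose `__reduce__` hook would call `os.system` on load, then applies the two checks a practitioner would want before trusting such an artifact: a static walk of the pickle opcode stream (via `pickletools.genops`) that lists every global the artifact would import without executing it, and an allow-listed `Unpickler` that refuses any import outside a hypothetical allowlist before the loader can reach the `REDUCE` call. The class names, the allowlist, and the sample state dict are assumptions made for this illustration.

```python
"""Static triage of a pickle-serialized model artifact, plus an allow-listed loader.

Added illustration; not code from the ATLAS page or any framework.
"""
import collections
import io
import os
import pickle
import pickletools


# --- constructed artifacts (assumptions for this example) --------------------

class MaliciousWeights:
    """Stand-in for a poisoned checkpoint: the pickle makes the loader call os.system."""

    def __reduce__(self):
        # Never executed below: both checks reject the artifact before REDUCE runs.
        return (os.system, ("echo pwned",))


def build_malicious_artifact():
    return pickle.dumps(MaliciousWeights(), protocol=4)


def build_benign_artifact():
    # Toy state dict in the shape a framework checkpoint might use (constructed sample).
    state = collections.OrderedDict([("layer1.weight", [0.1, 0.2]), ("layer1.bias", [0.0])])
    return pickle.dumps(state, protocol=4)


# --- check 1: list the globals an artifact imports, without loading it ---------

STRING_OPS = {"SHORT_BINUNICODE", "BINUNICODE", "BINUNICODE8", "UNICODE",
              "STRING", "SHORT_BINSTRING", "BINSTRING"}


def list_globals(data):
    """Return every 'module.name' the pickle would import, by walking its opcodes.

    Only string pushes and the memo are tracked, which is enough to resolve
    STACK_GLOBAL (protocol 4) even when the module name is a memoized repeat;
    every other value pushed on the pickle VM stack is recorded as None.
    This is a heuristic that matches CPython's pickler output.
    """
    memo, pushed, found = {}, [], []
    for op, arg, _pos in pickletools.genops(data):
        name = op.name
        if name in STRING_OPS:
            pushed.append(arg)
        elif name == "MEMOIZE":                     # memo index = current memo length
            memo[len(memo)] = pushed[-1] if pushed else None
        elif name in ("PUT", "BINPUT", "LONG_BINPUT"):
            memo[arg] = pushed[-1] if pushed else None
        elif name in ("GET", "BINGET", "LONG_BINGET"):
            pushed.append(memo.get(arg))
        elif name == "GLOBAL":                      # protocol <= 3: arg is "module name"
            found.append(arg.replace(" ", ".", 1))
            pushed.append(None)
        elif name == "STACK_GLOBAL":                # protocol 4: module and name on the stack
            found.append(f"{pushed[-2]}.{pushed[-1]}")
            pushed.append(None)
        elif name not in ("PROTO", "FRAME", "STOP"):
            pushed.append(None)
    return found


# --- check 2: refuse anything outside an allowlist before it can run ----------

class RestrictedUnpickler(pickle.Unpickler):
    # Hypothetical allowlist for this toy checkpoint format; nothing else may be imported.
    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        # Called at GLOBAL/STACK_GLOBAL time, i.e. before the REDUCE that would invoke it.
        if (module, name) not in self.ALLOWED:
            raise pickle.UnpicklingError(f"artifact references disallowed global {module}.{name}")
        return super().find_class(module, name)


def load_artifact(data):
    """Load a checkpoint through the allow-listed unpickler."""
    return RestrictedUnpickler(io.BytesIO(data)).load()


def artifact_loads_safely(data):
    try:
        load_artifact(data)
        return True
    except pickle.UnpicklingError:
        return False


if __name__ == "__main__":
    for label, blob in (("benign", build_benign_artifact()),
                        ("malicious", build_malicious_artifact())):
        print(label, "imports:", list_globals(blob), "loads:", artifact_loads_safely(blob))
```

The static listing is the cheaper control and suits a pipeline gate or a scanner over a model registry, because it never executes the stream; the allowlist is the control that actually prevents execution at load time, and the two should be used together, since a format that legitimately needs `REDUCE` (most framework checkpoints do) cannot be rejected on that opcode alone.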