Ultra-fast, low latency LLM prompt injection/jailbreak detection ⛓️
jailbreak
security-tools
large-language-models
prompt-engineering
chatgpt-prompts
llm-security
llm-local
llm-guard
llm-guardrails
-
Updated
May 9, 2024 - Python