Installation

pip install raxe[wrappers]

Basic Usage

from raxe import RaxeOpenAI

# Drop-in replacement for OpenAI client
client = RaxeOpenAI(api_key="sk-...")

# Threats automatically scanned before API call
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What is AI?"}]
)

print(response.choices[0].message.content)

How It Works

  1. User sends request through RaxeOpenAI
  2. RAXE scans the prompt before calling OpenAI
  3. If threat detected → RaxeBlockedError raised
  4. If safe → Request forwarded to OpenAI
  5. Response returned normally
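The scan-then-forward flow above is the classic proxy pattern. A toy sketch of that pattern follows; the names (`ScanningClient`, `BlockedError`, the lambda scanner) are illustrative only and are not RAXE internals:

```python
# Toy sketch of the proxy pattern a drop-in wrapper like RaxeOpenAI
# can use -- NOT the actual RAXE implementation.
class BlockedError(Exception):
    """Raised when a message fails the scan (stands in for RaxeBlockedError)."""


class ScanningClient:
    def __init__(self, scanner, upstream):
        self.scanner = scanner    # callable: str -> bool (True means threat)
        self.upstream = upstream  # callable: messages -> response

    def create(self, messages):
        for m in messages:
            if self.scanner(m["content"]):       # step 2: scan before the API call
                raise BlockedError(m["content"]) # step 3: block on threat
        return self.upstream(messages)           # step 4: forward when safe


# Toy scanner and upstream for demonstration.
client = ScanningClient(
    scanner=lambda text: "ignore previous instructions" in text.lower(),
    upstream=lambda msgs: {"content": "ok"},
)
```

Safe messages pass through to the upstream callable; a message matching the toy scanner raises `BlockedError` before any upstream call is made.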

Benefits

Save Money

Threats are blocked before the API call, so no tokens are wasted

Zero Code Changes

Just swap the import and client class

Full Compatibility

All OpenAI features work normally

Automatic Protection

Every request scanned automatically

Error Handling

from raxe import RaxeOpenAI, RaxeBlockedError

client = RaxeOpenAI(api_key="sk-...")

def safe_chat(user_input: str) -> str:
    try:
        response = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": user_input}]
        )
        return response.choices[0].message.content

    except RaxeBlockedError as e:
        # Threat was detected and blocked
        print(f"Blocked: {e.severity}")
        print(f"Rule: {e.rule_id}")
        return "Your request was blocked for security reasons."

Configuration

from raxe import RaxeOpenAI

client = RaxeOpenAI(
    api_key="sk-...",

    # RAXE configuration
    raxe_l1_enabled=True,
    raxe_l2_enabled=True,
    raxe_block_on_threat=True,  # Raise error on threat
)
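Rather than hardcoding the key as in the snippets above, it can be read from the environment with the standard library. This is ordinary Python practice, and `OPENAI_API_KEY` is a conventional variable name here, not something RAXE mandates:

```python
import os

# Read the key from the environment; fall back to a placeholder so the
# sketch runs even when the variable is unset.
api_key = os.environ.get("OPENAI_API_KEY", "sk-placeholder")
```

The resulting `api_key` can then be passed as the `api_key=` argument to `RaxeOpenAI`.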

Streaming Support

from raxe import RaxeOpenAI

client = RaxeOpenAI(api_key="sk-...")

# Streaming works normally
stream = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

All Messages Scanned

The wrapper scans all messages in the conversation:

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are helpful"},  # Scanned
        {"role": "user", "content": "What is AI?"},        # Scanned
        {"role": "assistant", "content": "AI is..."},      # Scanned
        {"role": "user", "content": "Tell me more"}        # Scanned
    ]
)

Migration Guide

# Before
from openai import OpenAI
client = OpenAI(api_key="sk-...")

# After (swap the import and class name)
from raxe import RaxeOpenAI
client = RaxeOpenAI(api_key="sk-...")

# Everything else stays the same!

Async Support

import asyncio

from raxe import AsyncRaxeOpenAI

client = AsyncRaxeOpenAI(api_key="sk-...")

async def main():
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Hello"}]
    )
    print(response.choices[0].message.content)

asyncio.run(main())
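The scan-then-forward flow carries over to asyncio unchanged: the scan happens before the upstream API is awaited. A toy sketch under that assumption (the names `scanned_call`, `fake_upstream`, and `BlockedError` are illustrative, not RAXE internals):

```python
import asyncio


class BlockedError(Exception):
    """Toy stand-in for RaxeBlockedError."""


async def scanned_call(messages, scanner, upstream):
    # Scan every message before awaiting the upstream API call.
    for m in messages:
        if scanner(m["content"]):
            raise BlockedError(m["content"])
    return await upstream(messages)


async def fake_upstream(messages):
    await asyncio.sleep(0)  # stands in for the real network round trip
    return {"content": "ok"}


result = asyncio.run(scanned_call(
    [{"role": "user", "content": "Hello"}],
    scanner=lambda text: "drop table" in text.lower(),
    upstream=fake_upstream,
))
```

Because the scan is synchronous and runs first, a blocked message never reaches the awaited upstream call.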