# Storage Backends
litestar-api-auth supports multiple storage backends for API key persistence. Each backend implements the `APIKeyBackend` protocol and stores `APIKeyInfo` structs.
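As the backend methods below make explicit, every backend is keyed by the SHA-256 hash of the raw API key rather than the raw key itself. The digest is plain standard-library hashing; the helper name here is illustrative, not part of the public API:

```python
import hashlib

def hash_api_key(raw_key: str) -> str:
    """Hex-encoded SHA-256 digest of a raw API key (illustrative helper)."""
    return hashlib.sha256(raw_key.encode("utf-8")).hexdigest()

# A SHA-256 hex digest is always 64 characters long.
digest = hash_api_key("dev_example-key")
```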
## Memory Backend
The memory backend stores keys in a Python dictionary with an asyncio lock for safe concurrent access. It is intended for development and testing only.
```python
from litestar_api_auth import APIAuthConfig
from litestar_api_auth.backends.memory import MemoryBackend, MemoryConfig

config = APIAuthConfig(
    backend=MemoryBackend(config=MemoryConfig(name="dev")),
    key_prefix="dev_",
)
```
You can also instantiate it with no arguments – sensible defaults are applied:
```python
backend = MemoryBackend()  # MemoryConfig(name="memory") is used
```
> **Warning:** Keys are lost when the application restarts. Do not use in production.
## SQLAlchemy Backend
The SQLAlchemy backend persists keys to a relational database. It is powered by Advanced Alchemy and follows its Model → Repository → Service architecture:
- **Model** (`APIKeyModel`) – ORM model with `BigIntBase`
- **Repository** (`APIKeyRepository`) – type-safe async data access
- **Service** (`APIKeyService`) – business logic, automatic session/commit management, dict-to-model conversion
It supports PostgreSQL, MySQL, SQLite, and any other database supported by SQLAlchemy.
Install the optional dependency first:
```shell
pip install litestar-api-auth[sqlalchemy]
```
Then configure the backend with a `SQLAlchemyConfig`:

```python
from sqlalchemy.ext.asyncio import create_async_engine

from litestar_api_auth import APIAuthConfig
from litestar_api_auth.backends.sqlalchemy import SQLAlchemyBackend, SQLAlchemyConfig

engine = create_async_engine("postgresql+asyncpg://user:pass@localhost/myapp")

config = APIAuthConfig(
    backend=SQLAlchemyBackend(
        config=SQLAlchemyConfig(
            engine=engine,
            table_name="api_keys",  # default
            create_tables=True,     # auto-create table on startup
        )
    ),
    key_prefix="prod_",
)
```
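Because any SQLAlchemy-supported database works, the same configuration can point at SQLite for local development. A sketch assuming the `aiosqlite` driver is installed (the connection URL and key prefix are placeholders):

```python
from sqlalchemy.ext.asyncio import create_async_engine

from litestar_api_auth import APIAuthConfig
from litestar_api_auth.backends.sqlalchemy import SQLAlchemyBackend, SQLAlchemyConfig

# File-based SQLite database; suitable for local development only.
engine = create_async_engine("sqlite+aiosqlite:///./dev.db")

config = APIAuthConfig(
    backend=SQLAlchemyBackend(
        config=SQLAlchemyConfig(engine=engine, create_tables=True)
    ),
    key_prefix="dev_",
)
```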
### Configuration Options
`SQLAlchemyConfig` accepts the following fields:

| Field | Type | Default | Description |
|---|---|---|---|
| `engine` | `AsyncEngine` | required | The async SQLAlchemy engine for database access. |
| `table_name` | `str` | `"api_keys"` | Name of the table that stores API keys. |
| `schema` | `str \| None` | `None` | Optional database schema name. |
| `create_tables` | `bool` | `False` | Create the table on startup if it does not exist. |
### Database Schema

The `APIKeyModel` ORM model extends Advanced Alchemy's `BigIntBase`, which provides an auto-increment `id` primary key. The remaining columns map directly to the fields on `APIKeyInfo`:
| Column | Type | Notes |
|---|---|---|
| `id` | `BigInteger` | Auto-increment primary key from `BigIntBase`. |
| `key_id` | `String` | Unique, indexed. UUID identifier for the key. |
| `key_hash` | `String` | Unique, indexed. SHA-256 hash of the raw key. |
| `name` | `String` | Human-readable label. |
| `scopes` | `JsonB` | JSON array of permission scopes. |
| `is_active` | `Boolean` | Defaults to `True`. |
| `created_at` | `DateTimeUTC` | Timezone-aware creation timestamp. |
| `expires_at` | `DateTimeUTC` | Optional expiration timestamp. |
| `last_used_at` | `DateTimeUTC` | Updated on each authenticated request. |
| `key_metadata` | `JsonB` | Arbitrary key-value metadata. Maps to `APIKeyInfo.metadata`. |
DateTimeUTC and JsonB are portable Advanced Alchemy column types that
adapt automatically to each database dialect (e.g. native jsonb on PostgreSQL,
JSON on MySQL/SQLite).
### Advanced Usage: Model → Repository → Service
The SQLAlchemy backend is built on Advanced Alchemy and exposes the full Model → Repository → Service stack for advanced customization:
- `APIKeyModel` – the SQLAlchemy ORM model (extends `BigIntBase`)
- `APIKeyRepository` – the async repository (extends `SQLAlchemyAsyncRepository`)
- `APIKeyService` – the async service (extends `SQLAlchemyAsyncRepositoryService`)
The service layer sits on top of the repository and adds:
- Automatic session and transaction management (commits, rollbacks)
- Dict-to-model conversion (pass a `dict` instead of constructing a model)
- Unit-of-work pattern with `auto_commit` control
- A `match_fields` option for upsert-style operations
#### Subclassing the Model
Add custom columns, relationships, or constraints by subclassing `APIKeyModel`:

```python
from sqlalchemy import ForeignKey
from sqlalchemy.orm import Mapped, mapped_column, relationship

from litestar_api_auth.backends.sqlalchemy import APIKeyModel

class MyAPIKeyModel(APIKeyModel):
    """API key model with an owner relationship."""

    __tablename__ = "my_api_keys"

    owner_id: Mapped[int] = mapped_column(ForeignKey("users.id"))
    owner: Mapped["User"] = relationship(lazy="joined")
```
#### Custom Repository with Additional Queries

Create a custom repository subclass to add domain-specific query methods:
```python
from advanced_alchemy.filters import LimitOffset, OrderBy
from advanced_alchemy.repository import SQLAlchemyAsyncRepository

from litestar_api_auth.backends.sqlalchemy import APIKeyModel

class MyAPIKeyRepository(SQLAlchemyAsyncRepository[APIKeyModel]):
    """Repository with custom query helpers."""

    model_type = APIKeyModel

    async def find_by_scope(
        self,
        scope: str,
        *,
        limit: int = 50,
        offset: int = 0,
    ) -> list[APIKeyModel]:
        """Find active keys that contain a specific scope.

        Args:
            scope: The scope string to search for (e.g. ``"admin:write"``).
            limit: Maximum number of results.
            offset: Number of results to skip.

        Returns:
            List of matching APIKeyModel instances.
        """
        from sqlalchemy import String as SAString
        from sqlalchemy import cast

        return await self.list(
            APIKeyModel.is_active == True,  # noqa: E712
            cast(APIKeyModel.scopes, SAString).contains(scope),
            LimitOffset(limit=limit, offset=offset),
            OrderBy(field_name="created_at", sort_order="desc"),
        )

    async def find_expired(self) -> list[APIKeyModel]:
        """Find all keys that have passed their expiration date."""
        from datetime import datetime, timezone

        return await self.list(
            APIKeyModel.expires_at < datetime.now(timezone.utc),
            APIKeyModel.is_active == True,  # noqa: E712
        )
```
#### Using the Service Directly
`APIKeyService` wraps the repository and adds automatic session management, dict-to-model conversion, and commit handling. This is what the backend itself uses internally:

```python
import asyncio

from advanced_alchemy.filters import LimitOffset, OrderBy
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

from litestar_api_auth.backends.sqlalchemy import APIKeyService

engine = create_async_engine("postgresql+asyncpg://...")
session_factory = async_sessionmaker(engine, expire_on_commit=False)

async def main() -> None:
    async with session_factory() as session:
        service = APIKeyService(session=session)

        # Create from a dict -- the service handles model construction
        new_key = await service.create(
            {"key_id": "abc-123", "key_hash": "sha256...", "name": "My Key", "scopes": ["read"]},
            auto_commit=True,
        )

        # Query with Advanced Alchemy filters
        results = await service.list(
            LimitOffset(limit=20, offset=0),
            OrderBy(field_name="created_at", sort_order="desc"),
        )

        # Update by passing a dict with the item_id
        updated = await service.update(
            {"name": "Renamed Key", "id": new_key.id},
            item_id=new_key.id,
            auto_commit=True,
        )

        # Delete by primary key
        await service.delete(new_key.id, auto_commit=True)

asyncio.run(main())
```
#### Using the Repository Directly

For lower-level access without the service overhead, use `APIKeyRepository` directly:

```python
import asyncio

from advanced_alchemy.filters import LimitOffset, OrderBy
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

from litestar_api_auth.backends.sqlalchemy import APIKeyModel, APIKeyRepository

engine = create_async_engine("postgresql+asyncpg://...")
session_factory = async_sessionmaker(engine, expire_on_commit=False)

async def main() -> None:
    async with session_factory() as session:
        repo = APIKeyRepository(session=session)

        # Use Advanced Alchemy's built-in methods
        key = await repo.get_one_or_none(APIKeyModel.key_id == "some-uuid")

        # Paginate with Advanced Alchemy filters
        keys = await repo.list(
            LimitOffset(limit=20, offset=0),
            OrderBy(field_name="created_at", sort_order="desc"),
        )

asyncio.run(main())
```
## Redis Backend
The Redis backend stores keys in Redis, making it a good fit for distributed systems and high-performance applications that need fast key lookups.
Install the optional dependency first:
```shell
pip install litestar-api-auth[redis]
```
Then configure the backend with a `RedisConfig`:

```python
from redis.asyncio import Redis

from litestar_api_auth import APIAuthConfig
from litestar_api_auth.backends.redis import RedisBackend, RedisConfig

redis_client = Redis.from_url("redis://localhost:6379/0")

config = APIAuthConfig(
    backend=RedisBackend(
        config=RedisConfig(
            client=redis_client,
            key_prefix="myapp:api_keys:",  # namespace keys in Redis
            ttl=None,                      # no automatic expiration
        )
    ),
    key_prefix="api_",
)
```
### Configuration Options
`RedisConfig` accepts the following fields:

| Field | Type | Default | Description |
|---|---|---|---|
| `client` | `Redis` | required | An async Redis client instance. |
| `key_prefix` | `str` | `""` | Prefix for all Redis keys (useful for namespacing). |
| `ttl` | `int \| None` | `None` | Optional TTL in seconds for stored keys. |
## Custom Backends
To build your own storage backend, implement the `APIKeyBackend` protocol. The protocol is decorated with `@runtime_checkable`, so structural (duck) typing works – you do not need to inherit from it explicitly. Here is the full set of methods you must implement:
```python
from __future__ import annotations

from typing import Any

from litestar_api_auth.backends.base import APIKeyBackend, APIKeyInfo

class MyCustomBackend:
    """Custom storage backend implementing the APIKeyBackend protocol."""

    async def create(self, key_hash: str, info: APIKeyInfo) -> APIKeyInfo:
        """Store a new API key.

        Args:
            key_hash: SHA-256 hash of the raw API key.
            info: Metadata about the API key.

        Returns:
            The created APIKeyInfo.
        """
        ...

    async def get(self, key_hash: str) -> APIKeyInfo | None:
        """Retrieve an API key by its hash.

        Args:
            key_hash: SHA-256 hash of the raw API key.

        Returns:
            The APIKeyInfo if found, None otherwise.
        """
        ...

    async def get_by_id(self, key_id: str) -> APIKeyInfo | None:
        """Retrieve an API key by its unique ID.

        Args:
            key_id: UUID identifier of the key.

        Returns:
            The APIKeyInfo if found, None otherwise.
        """
        ...

    async def update(self, key_hash: str, **updates: Any) -> APIKeyInfo | None:
        """Update an API key's metadata.

        Args:
            key_hash: SHA-256 hash of the raw API key.
            **updates: Fields to update (name, scopes, is_active, etc.).

        Returns:
            The updated APIKeyInfo if found, None otherwise.
        """
        ...

    async def delete(self, key_hash: str) -> bool:
        """Delete an API key.

        Args:
            key_hash: SHA-256 hash of the raw API key.

        Returns:
            True if the key was deleted, False if not found.
        """
        ...

    async def list(
        self,
        *,
        limit: int | None = None,
        offset: int = 0,
    ) -> list[APIKeyInfo]:
        """List API keys with pagination.

        Args:
            limit: Maximum number of keys to return (None for all).
            offset: Number of keys to skip.

        Returns:
            List of APIKeyInfo objects.
        """
        ...

    async def revoke(self, key_hash: str) -> bool:
        """Revoke an API key (set is_active to False).

        Args:
            key_hash: SHA-256 hash of the raw API key.

        Returns:
            True if the key was revoked, False if not found.
        """
        ...

    async def update_last_used(self, key_hash: str) -> None:
        """Update the last_used_at timestamp for a key.

        Called automatically on each authenticated request when
        ``APIAuthConfig.track_usage`` is enabled.

        Args:
            key_hash: SHA-256 hash of the raw API key.
        """
        ...

    async def close(self) -> None:
        """Release any resources held by the backend.

        Called automatically when the Litestar application shuts down.
        """
        ...
```
You can verify that your class satisfies the protocol at runtime:

```python
assert isinstance(MyCustomBackend(), APIKeyBackend)
```
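That check works because `@runtime_checkable` protocols match on method names structurally. The mechanism can be demonstrated in isolation with a stand-in protocol (not the real `APIKeyBackend`, which declares more methods):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class BackendLike(Protocol):
    """Stand-in protocol for demonstration purposes."""

    async def get(self, key_hash: str): ...
    async def delete(self, key_hash: str) -> bool: ...

class GoodBackend:
    # No inheritance needed -- matching method names is enough.
    async def get(self, key_hash: str):
        return None

    async def delete(self, key_hash: str) -> bool:
        return False

class BadBackend:
    async def get(self, key_hash: str):
        return None
    # Missing delete() -> fails the structural check.

print(isinstance(GoodBackend(), BackendLike))  # True
print(isinstance(BadBackend(), BackendLike))   # False
```

Note that `isinstance` checks only method *presence*, not signatures or return types; a static type checker is still needed for full conformance.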
## The APIKeyInfo Struct
All backends store and return `APIKeyInfo` instances. This is a `msgspec.Struct` with the following fields:
| Field | Type | Default | Description |
|---|---|---|---|
| `key_id` | `str` | required | Unique identifier (UUID) for the key. |
| `key_hash` | `str` | required | SHA-256 hash of the raw API key. |
| `name` | `str` | required | Human-readable name for the key. |
| `scopes` | `list[str]` | required | Permission scopes (e.g. `["read", "admin:write"]`). |
| `is_active` | `bool` | `True` | Whether the key is currently active. |
| `created_at` | `datetime` | current UTC time | When the key was created. |
| `expires_at` | `datetime \| None` | `None` | When the key expires (None = no expiry). |
| `last_used_at` | `datetime \| None` | `None` | When the key was last used. |
| `metadata` | `dict[str, Any]` | `{}` | Arbitrary key-value metadata. |
`APIKeyInfo` also provides convenience methods:

- `is_expired` – property that checks whether the key has passed its `expires_at`.
- `has_scope(scope)` – returns `True` if the key has a specific scope.
- `has_scopes(scopes, requirement="all")` – checks for multiple scopes. Set `requirement="any"` to require at least one match instead of all.
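The `requirement` semantics can be pictured with plain list operations (a behavioural sketch of the documented semantics, not the library's code):

```python
def has_scopes(key_scopes: list[str], required: list[str], requirement: str = "all") -> bool:
    """Mirror the documented has_scopes behaviour on a plain list."""
    if requirement == "any":
        # At least one required scope must be present.
        return any(scope in key_scopes for scope in required)
    # Default: every required scope must be present.
    return all(scope in key_scopes for scope in required)

scopes = ["read", "admin:write"]
print(has_scopes(scopes, ["read", "admin:write"]))               # True
print(has_scopes(scopes, ["read", "delete"]))                    # False
print(has_scopes(scopes, ["read", "delete"], requirement="any"))  # True
```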