Compare commits


4 Commits

SHA1        Message                            Date
06337283e8  fix migrations correctly           2025-01-16 16:50:46 +01:00
5d28114a4b  remove USER_ATTRIBUTE_DEBUG        2025-01-16 15:49:05 +01:00
b7c154ccd2  dont always try to pull the image  2025-01-16 15:49:05 +01:00
d1fbf2ed65  remove leftover code               2025-01-16 15:49:03 +01:00

All four commits are signed off by Jens Langhammer <jens@goauthentik.io>.
1690 changed files with 57676 additions and 164546 deletions

@@ -1,28 +1,24 @@
 [bumpversion]
-current_version = 2025.6.3
+current_version = 2024.12.2
 tag = True
 commit = True
 parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?
 serialize =
 {major}.{minor}.{patch}-{rc_t}{rc_n}
 {major}.{minor}.{patch}
 message = release: {new_version}
 tag_name = version/{new_version}
 [bumpversion:part:rc_t]
 values =
 rc
 final
 optional_value = final
 [bumpversion:file:pyproject.toml]
-[bumpversion:file:uv.lock]
 [bumpversion:file:package.json]
-[bumpversion:file:package-lock.json]
 [bumpversion:file:docker-compose.yml]
 [bumpversion:file:schema.yml]
@@ -33,4 +29,6 @@ optional_value = final
 [bumpversion:file:internal/constants/constants.go]
-[bumpversion:file:lifecycle/aws/template.yaml]
+[bumpversion:file:web/src/common/constants.ts]
+[bumpversion:file:website/docs/install-config/install/aws/template.yaml]

@@ -5,10 +5,8 @@ dist/**
 build/**
 build_docs/**
 *Dockerfile
-**/*Dockerfile
 blueprints/local
 .git
 !gen-ts-api/node_modules
 !gen-ts-api/dist/**
 !gen-go-api/
-.venv

@@ -7,9 +7,6 @@ charset = utf-8
 trim_trailing_whitespace = true
 insert_final_newline = true
-[*.toml]
-indent_size = 2
 [*.html]
 indent_size = 2

@@ -28,11 +28,7 @@ Output of docker-compose logs or kubectl logs respectively
 **Version and Deployment (please complete the following information):**
-<!--
-Notice: authentik supports installation via Docker, Kubernetes, and AWS CloudFormation only. Support is not available for other methods. For detailed installation and configuration instructions, please refer to the official documentation at https://docs.goauthentik.io/docs/install-config/.
--->
-- authentik version: [e.g. 2025.2.0]
+- authentik version: [e.g. 2021.8.5]
 - Deployment: [e.g. docker-compose, helm]
 **Additional context**


@ -1,22 +0,0 @@
---
name: Documentation issue
about: Suggest an improvement or report a problem
title: ""
labels: documentation
assignees: ""
---
**Do you see an area that can be clarified or expanded, a technical inaccuracy, or a broken link? Please describe.**
A clear and concise description of what the problem is, or where the document can be improved. Ex. I believe we need more details about [...]
**Provide the URL or link to the exact page in the documentation to which you are referring.**
If there are multiple pages, list them all, and be sure to state the header or section where the content is.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Additional context**
Add any other context or screenshots about the documentation issue here.
**Consider opening a PR!**
If the issue is one that you can fix, or even make a good pass at, we'd appreciate a PR. For more information about making a contribution to the docs, and using our Style Guide and our templates, refer to ["Writing documentation"](https://docs.goauthentik.io/docs/developer-docs/docs/writing-documentation).

@@ -20,12 +20,7 @@ Output of docker-compose logs or kubectl logs respectively
 **Version and Deployment (please complete the following information):**
-<!--
-Notice: authentik supports installation via Docker, Kubernetes, and AWS CloudFormation only. Support is not available for other methods. For detailed installation and configuration instructions, please refer to the official documentation at https://docs.goauthentik.io/docs/install-config/.
--->
-- authentik version: [e.g. 2025.2.0]
+- authentik version: [e.g. 2021.8.5]
 - Deployment: [e.g. docker-compose, helm]
 **Additional context**

@@ -35,6 +35,14 @@ runs:
 AUTHENTIK_OUTPOSTS__CONTAINER_IMAGE_BASE=ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
 ```
+For arm64, use these values:
+```shell
+AUTHENTIK_IMAGE=ghcr.io/goauthentik/dev-server
+AUTHENTIK_TAG=${{ inputs.tag }}-arm64
+AUTHENTIK_OUTPOSTS__CONTAINER_IMAGE_BASE=ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
+```
 Afterwards, run the upgrade commands from the latest release notes.
 </details>
 <details>
@@ -52,6 +60,18 @@ runs:
 tag: ${{ inputs.tag }}
 ```
+For arm64, use these values:
+```yaml
+authentik:
+outposts:
+container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
+global:
+image:
+repository: ghcr.io/goauthentik/dev-server
+tag: ${{ inputs.tag }}-arm64
+```
 Afterwards, run the upgrade commands from the latest release notes.
 </details>
 edit-mode: replace

@@ -9,9 +9,6 @@ inputs:
 image-arch:
 required: false
 description: "Docker image arch"
-release:
-required: true
-description: "True if this is a release build, false if this is a dev/PR build"
 outputs:
 shouldPush:
@@ -32,24 +29,15 @@ outputs:
 imageTags:
 description: "Docker image tags"
 value: ${{ steps.ev.outputs.imageTags }}
-imageTagsJSON:
-description: "Docker image tags, as a JSON array"
-value: ${{ steps.ev.outputs.imageTagsJSON }}
 attestImageNames:
 description: "Docker image names used for attestation"
 value: ${{ steps.ev.outputs.attestImageNames }}
-cacheTo:
-description: "cache-to value for the docker build step"
-value: ${{ steps.ev.outputs.cacheTo }}
 imageMainTag:
 description: "Docker image main tag"
 value: ${{ steps.ev.outputs.imageMainTag }}
 imageMainName:
 description: "Docker image main name"
 value: ${{ steps.ev.outputs.imageMainName }}
-imageBuildArgs:
-description: "Docker image build args"
-value: ${{ steps.ev.outputs.imageBuildArgs }}
 runs:
 using: "composite"
@@ -60,8 +48,6 @@ runs:
 env:
 IMAGE_NAME: ${{ inputs.image-name }}
 IMAGE_ARCH: ${{ inputs.image-arch }}
-RELEASE: ${{ inputs.release }}
 PR_HEAD_SHA: ${{ github.event.pull_request.head.sha }}
-REF: ${{ github.ref }}
 run: |
 python3 ${{ github.action_path }}/push_vars.py

@@ -2,7 +2,6 @@
 import configparser
 import os
-from json import dumps
 from time import time
 parser = configparser.ConfigParser()
@@ -44,11 +43,12 @@ if is_release:
 ]
 if not prerelease:
 image_tags += [
+f"{name}:latest",
 f"{name}:{version_family}",
 ]
 else:
 suffix = ""
-if image_arch:
+if image_arch and image_arch != "amd64":
 suffix = f"-{image_arch}"
 for name in image_names:
 image_tags += [
@@ -70,31 +70,12 @@ def get_attest_image_names(image_with_tags: list[str]):
 return ",".join(set(image_tags))
-# Generate `cache-to` param
-cache_to = ""
-if should_push:
-_cache_tag = "buildcache"
-if image_arch:
-_cache_tag += f"-{image_arch}"
-cache_to = f"type=registry,ref={get_attest_image_names(image_tags)}:{_cache_tag},mode=max"
-image_build_args = []
-if os.getenv("RELEASE", "false").lower() == "true":
-image_build_args = [f"VERSION={os.getenv('REF')}"]
-else:
-image_build_args = [f"GIT_BUILD_HASH={sha}"]
-image_build_args = "\n".join(image_build_args)
 with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
 print(f"shouldPush={str(should_push).lower()}", file=_output)
 print(f"sha={sha}", file=_output)
 print(f"version={version}", file=_output)
 print(f"prerelease={prerelease}", file=_output)
 print(f"imageTags={','.join(image_tags)}", file=_output)
-print(f"imageTagsJSON={dumps(image_tags)}", file=_output)
 print(f"attestImageNames={get_attest_image_names(image_tags)}", file=_output)
 print(f"imageMainTag={image_main_tag}", file=_output)
 print(f"imageMainName={image_tags[0]}", file=_output)
-print(f"cacheTo={cache_to}", file=_output)
-print(f"imageBuildArgs={image_build_args}", file=_output)

@@ -1,18 +1,7 @@
 #!/bin/bash -x
 SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
-# Non-pushing PR
 GITHUB_OUTPUT=/dev/stdout \
 GITHUB_REF=ref \
 GITHUB_SHA=sha \
 IMAGE_NAME=ghcr.io/goauthentik/server,beryju/authentik \
-GITHUB_REPOSITORY=goauthentik/authentik \
-python $SCRIPT_DIR/push_vars.py
-# Pushing PR/main
-GITHUB_OUTPUT=/dev/stdout \
-GITHUB_REF=ref \
-GITHUB_SHA=sha \
-IMAGE_NAME=ghcr.io/goauthentik/server,beryju/authentik \
-GITHUB_REPOSITORY=goauthentik/authentik \
-DOCKER_USERNAME=foo \
 python $SCRIPT_DIR/push_vars.py

@@ -9,22 +9,17 @@ inputs:
 runs:
 using: "composite"
 steps:
-- name: Install apt deps
+- name: Install poetry & deps
 shell: bash
 run: |
+pipx install poetry || true
 sudo apt-get update
 sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext libkrb5-dev krb5-kdc krb5-user krb5-admin-server
-- name: Install uv
-uses: astral-sh/setup-uv@v5
-with:
-enable-cache: true
-- name: Setup python
+- name: Setup python and restore poetry
 uses: actions/setup-python@v5
 with:
 python-version-file: "pyproject.toml"
-- name: Install Python deps
-shell: bash
-run: uv sync --all-extras --dev --frozen
+cache: "poetry"
 - name: Setup node
 uses: actions/setup-node@v4
 with:
@@ -35,18 +30,15 @@ runs:
 uses: actions/setup-go@v5
 with:
 go-version-file: "go.mod"
-- name: Setup docker cache
-uses: AndreKurait/docker-cache@0fe76702a40db986d9663c24954fc14c6a6031b7
-with:
-key: docker-images-${{ runner.os }}-${{ hashFiles('.github/actions/setup/docker-compose.yml', 'Makefile') }}-${{ inputs.postgresql_version }}
 - name: Setup dependencies
 shell: bash
 run: |
 export PSQL_TAG=${{ inputs.postgresql_version }}
 docker compose -f .github/actions/setup/docker-compose.yml up -d
+poetry install --sync
 cd web && npm ci
 - name: Generate config
-shell: uv run python {0}
+shell: poetry run python {0}
 run: |
 from authentik.lib.generators import generate_id
 from yaml import safe_dump

@@ -11,7 +11,7 @@ services:
 - 5432:5432
 restart: always
 redis:
-image: docker.io/library/redis:7
+image: docker.io/library/redis
 ports:
 - 6379:6379
 restart: always

@@ -1,32 +1,7 @@
-akadmin
-asgi
-assertIn
-authentik
-authn
-crate
-docstrings
-entra
-goauthentik
-gunicorn
-hass
-jwe
-jwks
 keypair
 keypairs
-kubernetes
-oidc
-ontext
-openid
-passwordless
-plex
-saml
-scim
-singed
-slo
-sso
-totp
-traefik
-# https://github.com/codespell-project/codespell/issues/1224
-upToDate
+hass
 warmup
-webauthn
+ontext
+singed
+assertIn


@ -23,13 +23,7 @@ updates:
- package-ecosystem: npm - package-ecosystem: npm
directories: directories:
- "/web" - "/web"
- "/web/packages/sfe" - "/web/sfe"
- "/web/packages/core"
- "/web/packages/esbuild-plugin-live-reload"
- "/packages/prettier-config"
- "/packages/tsconfig"
- "/packages/docusaurus-config"
- "/packages/eslint-config"
schedule: schedule:
interval: daily interval: daily
time: "04:00" time: "04:00"
@ -74,9 +68,6 @@ updates:
wdio: wdio:
patterns: patterns:
- "@wdio/*" - "@wdio/*"
goauthentik:
patterns:
- "@goauthentik/*"
- package-ecosystem: npm - package-ecosystem: npm
directory: "/website" directory: "/website"
schedule: schedule:
@ -91,33 +82,7 @@ updates:
docusaurus: docusaurus:
patterns: patterns:
- "@docusaurus/*" - "@docusaurus/*"
build: - package-ecosystem: pip
patterns:
- "@swc/*"
- "swc-*"
- "lightningcss*"
- "@rspack/binding*"
goauthentik:
patterns:
- "@goauthentik/*"
eslint:
patterns:
- "@eslint/*"
- "@typescript-eslint/*"
- "eslint-*"
- "eslint"
- "typescript-eslint"
- package-ecosystem: npm
directory: "/lifecycle/aws"
schedule:
interval: daily
time: "04:00"
open-pull-requests-limit: 10
commit-message:
prefix: "lifecycle/aws:"
labels:
- dependencies
- package-ecosystem: uv
directory: "/" directory: "/"
schedule: schedule:
interval: daily interval: daily
@ -137,15 +102,3 @@ updates:
prefix: "core:" prefix: "core:"
labels: labels:
- dependencies - dependencies
- package-ecosystem: docker-compose
directories:
# - /scripts # Maybe
- /tests/e2e
schedule:
interval: daily
time: "04:00"
open-pull-requests-limit: 10
commit-message:
prefix: "core:"
labels:
- dependencies


@ -1,98 +0,0 @@
# Re-usable workflow for a single-architecture build
name: Single-arch Container build
on:
workflow_call:
inputs:
image_name:
required: true
type: string
image_arch:
required: true
type: string
runs-on:
required: true
type: string
registry_dockerhub:
default: false
type: boolean
registry_ghcr:
default: false
type: boolean
release:
default: false
type: boolean
outputs:
image-digest:
value: ${{ jobs.build.outputs.image-digest }}
jobs:
build:
name: Build ${{ inputs.image_arch }}
runs-on: ${{ inputs.runs-on }}
outputs:
image-digest: ${{ steps.push.outputs.digest }}
permissions:
# Needed to upload container images to ghcr.io
packages: write
# Needed for attestation
id-token: write
attestations: write
# Needed for checkout
contents: read
steps:
- uses: actions/checkout@v4
- uses: docker/setup-qemu-action@v3.6.0
- uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ${{ inputs.image_name }}
image-arch: ${{ inputs.image_arch }}
release: ${{ inputs.release }}
- name: Login to Docker Hub
if: ${{ inputs.registry_dockerhub }}
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
if: ${{ inputs.registry_ghcr }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: make empty clients
if: ${{ inputs.release }}
run: |
mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api
- name: generate ts client
if: ${{ !inputs.release }}
run: make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@v6
id: push
with:
context: .
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
build-args: |
${{ steps.ev.outputs.imageBuildArgs }}
tags: ${{ steps.ev.outputs.imageTags }}
platforms: linux/${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames }}:buildcache-${{ inputs.image_arch }}
cache-to: ${{ steps.ev.outputs.cacheTo }}
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true


@ -1,104 +0,0 @@
# Re-usable workflow for a multi-architecture build
name: Multi-arch container build
on:
workflow_call:
inputs:
image_name:
required: true
type: string
registry_dockerhub:
default: false
type: boolean
registry_ghcr:
default: true
type: boolean
release:
default: false
type: boolean
outputs: {}
jobs:
build-server-amd64:
uses: ./.github/workflows/_reusable-docker-build-single.yaml
secrets: inherit
with:
image_name: ${{ inputs.image_name }}
image_arch: amd64
runs-on: ubuntu-latest
registry_dockerhub: ${{ inputs.registry_dockerhub }}
registry_ghcr: ${{ inputs.registry_ghcr }}
release: ${{ inputs.release }}
build-server-arm64:
uses: ./.github/workflows/_reusable-docker-build-single.yaml
secrets: inherit
with:
image_name: ${{ inputs.image_name }}
image_arch: arm64
runs-on: ubuntu-22.04-arm
registry_dockerhub: ${{ inputs.registry_dockerhub }}
registry_ghcr: ${{ inputs.registry_ghcr }}
release: ${{ inputs.release }}
get-tags:
runs-on: ubuntu-latest
needs:
- build-server-amd64
- build-server-arm64
outputs:
tags: ${{ steps.ev.outputs.imageTagsJSON }}
shouldPush: ${{ steps.ev.outputs.shouldPush }}
steps:
- uses: actions/checkout@v4
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ${{ inputs.image_name }}
merge-server:
runs-on: ubuntu-latest
if: ${{ needs.get-tags.outputs.shouldPush == 'true' }}
needs:
- get-tags
- build-server-amd64
- build-server-arm64
strategy:
fail-fast: false
matrix:
tag: ${{ fromJson(needs.get-tags.outputs.tags) }}
steps:
- uses: actions/checkout@v4
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ${{ inputs.image_name }}
- name: Login to Docker Hub
if: ${{ inputs.registry_dockerhub }}
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
if: ${{ inputs.registry_ghcr }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: int128/docker-manifest-create-action@v2
id: build
with:
tags: ${{ matrix.tag }}
sources: |
${{ steps.ev.outputs.attestImageNames }}@${{ needs.build-server-amd64.outputs.image-digest }}
${{ steps.ev.outputs.attestImageNames }}@${{ needs.build-server-arm64.outputs.image-digest }}
- uses: actions/attest-build-provenance@v2
id: attest
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.build.outputs.digest }}
push-to-registry: true

@@ -30,6 +30,7 @@ jobs:
 uses: actions/setup-python@v5
 with:
 python-version-file: "pyproject.toml"
+cache: "poetry"
 - name: Generate API Client
 run: make gen-client-py
 - name: Publish package

@@ -53,7 +53,6 @@ jobs:
 signoff: true
 # ID from https://api.github.com/users/authentik-automation[bot]
 author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
-labels: dependencies
 - uses: peter-evans/enable-pull-request-automerge@v3
 with:
 token: ${{ steps.generate_token.outputs.token }}

@@ -25,15 +25,15 @@ jobs:
 uses: ./.github/actions/setup
 - uses: actions/setup-node@v4
 with:
-node-version-file: lifecycle/aws/package.json
+node-version-file: website/package.json
 cache: "npm"
-cache-dependency-path: lifecycle/aws/package-lock.json
-- working-directory: lifecycle/aws/
+cache-dependency-path: website/package-lock.json
+- working-directory: website/
 run: |
 npm ci
 - name: Check changes have been applied
 run: |
-uv run make aws-cfn
+poetry run make aws-cfn
 git diff --exit-code
 ci-aws-cfn-mark:
 if: always()


@ -1,29 +0,0 @@
---
name: authentik-ci-main-daily
on:
workflow_dispatch:
schedule:
# Every night at 3am
- cron: "0 3 * * *"
jobs:
test-container:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
version:
- docs
- version-2025-4
- version-2025-2
steps:
- uses: actions/checkout@v4
- run: |
current="$(pwd)"
dir="/tmp/authentik/${{ matrix.version }}"
mkdir -p $dir
cd $dir
wget https://${{ matrix.version }}.goauthentik.io/docker-compose.yml
${current}/scripts/test_docker.sh


@ -34,7 +34,7 @@ jobs:
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: run job - name: run job
run: uv run make ci-${{ matrix.job }} run: poetry run make ci-${{ matrix.job }}
test-migrations: test-migrations:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
@ -42,34 +42,24 @@ jobs:
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: run migrations - name: run migrations
run: uv run python -m lifecycle.migrate run: poetry run python -m lifecycle.migrate
test-make-seed:
runs-on: ubuntu-latest
steps:
- id: seed
run: |
echo "seed=$(printf "%d\n" "0x$(openssl rand -hex 4)")" >> "$GITHUB_OUTPUT"
outputs:
seed: ${{ steps.seed.outputs.seed }}
test-migrations-from-stable: test-migrations-from-stable:
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5 name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest runs-on: ubuntu-latest
timeout-minutes: 20
needs: test-make-seed
strategy: strategy:
fail-fast: false fail-fast: false
matrix: matrix:
psql: psql:
- 15-alpine - 15-alpine
- 16-alpine - 16-alpine
- 17-alpine
run_id: [1, 2, 3, 4, 5]
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
with: with:
fetch-depth: 0 fetch-depth: 0
- name: checkout stable - name: checkout stable
run: | run: |
# Delete all poetry envs
rm -rf /home/runner/.cache/pypoetry
# Copy current, latest config to local # Copy current, latest config to local
cp authentik/lib/default.yml local.env.yml cp authentik/lib/default.yml local.env.yml
cp -R .github .. cp -R .github ..
@ -82,7 +72,7 @@ jobs:
with: with:
postgresql_version: ${{ matrix.psql }} postgresql_version: ${{ matrix.psql }}
- name: run migrations to stable - name: run migrations to stable
run: uv run python -m lifecycle.migrate run: poetry run python -m lifecycle.migrate
- name: checkout current code - name: checkout current code
run: | run: |
set -x set -x
@ -90,35 +80,31 @@ jobs:
git reset --hard HEAD git reset --hard HEAD
git clean -d -fx . git clean -d -fx .
git checkout $GITHUB_SHA git checkout $GITHUB_SHA
# Delete previous poetry env
rm -rf /home/runner/.cache/pypoetry/virtualenvs/*
- name: Setup authentik env (ensure latest deps are installed) - name: Setup authentik env (ensure latest deps are installed)
uses: ./.github/actions/setup uses: ./.github/actions/setup
with: with:
postgresql_version: ${{ matrix.psql }} postgresql_version: ${{ matrix.psql }}
- name: migrate to latest - name: migrate to latest
run: | run: |
uv run python -m lifecycle.migrate poetry run python -m lifecycle.migrate
- name: run tests - name: run tests
env: env:
# Test in the main database that we just migrated from the previous stable version # Test in the main database that we just migrated from the previous stable version
AUTHENTIK_POSTGRESQL__TEST__NAME: authentik AUTHENTIK_POSTGRESQL__TEST__NAME: authentik
CI_TEST_SEED: ${{ needs.test-make-seed.outputs.seed }}
CI_RUN_ID: ${{ matrix.run_id }}
CI_TOTAL_RUNS: "5"
run: | run: |
uv run make ci-test poetry run make test
test-unittest: test-unittest:
name: test-unittest - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5 name: test-unittest - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest runs-on: ubuntu-latest
timeout-minutes: 20 timeout-minutes: 30
needs: test-make-seed
strategy: strategy:
fail-fast: false fail-fast: false
matrix: matrix:
psql: psql:
- 15-alpine - 15-alpine
- 16-alpine - 16-alpine
- 17-alpine
run_id: [1, 2, 3, 4, 5]
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- name: Setup authentik env - name: Setup authentik env
@ -126,12 +112,9 @@ jobs:
with: with:
postgresql_version: ${{ matrix.psql }} postgresql_version: ${{ matrix.psql }}
- name: run unittest - name: run unittest
env:
CI_TEST_SEED: ${{ needs.test-make-seed.outputs.seed }}
CI_RUN_ID: ${{ matrix.run_id }}
CI_TOTAL_RUNS: "5"
run: | run: |
uv run make ci-test poetry run make test
poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v5
with: with:
@ -154,8 +137,8 @@ jobs:
uses: helm/kind-action@v1.12.0 uses: helm/kind-action@v1.12.0
- name: run integration - name: run integration
run: | run: |
uv run coverage run manage.py test tests/integration poetry run coverage run manage.py test tests/integration
uv run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v5
with: with:
@ -202,7 +185,7 @@ jobs:
uses: actions/cache@v4 uses: actions/cache@v4
with: with:
path: web/dist path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**') }}
- name: prepare web ui - name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true' if: steps.cache-web.outputs.cache-hit != 'true'
working-directory: web working-directory: web
@ -210,11 +193,10 @@ jobs:
npm ci npm ci
make -C .. gen-client-ts make -C .. gen-client-ts
npm run build npm run build
npm run build:sfe
- name: run e2e - name: run e2e
run: | run: |
uv run coverage run manage.py test ${{ matrix.job.glob }} poetry run coverage run manage.py test ${{ matrix.job.glob }}
uv run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v5
with: with:
@ -241,20 +223,68 @@ jobs:
with: with:
jobs: ${{ toJSON(needs) }} jobs: ${{ toJSON(needs) }}
build: build:
strategy:
fail-fast: false
matrix:
arch:
- amd64
- arm64
needs: ci-core-mark
runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload container images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation # Needed for attestation
id-token: write id-token: write
attestations: write attestations: write
# Needed for checkout timeout-minutes: 120
contents: read steps:
needs: ci-core-mark - uses: actions/checkout@v4
uses: ./.github/workflows/_reusable-docker-build.yaml with:
secrets: inherit ref: ${{ github.event.pull_request.head.sha }}
with: - name: Set up QEMU
image_name: ${{ github.repository == 'goauthentik/authentik-internal' && 'ghcr.io/goauthentik/internal-server' || 'ghcr.io/goauthentik/dev-server' }} uses: docker/setup-qemu-action@v3.3.0
release: false - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-server
image-arch: ${{ matrix.arch }}
- name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: generate ts client
run: make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@v6
id: push
with:
context: .
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
tags: ${{ steps.ev.outputs.imageTags }}
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache,mode=max' || '' }}
platforms: linux/${{ matrix.arch }}
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
pr-comment: pr-comment:
needs: needs:
- build - build

@@ -29,7 +29,7 @@ jobs:
 - name: Generate API
 run: make gen-client-go
 - name: golangci-lint
-uses: golangci/golangci-lint-action@v8
+uses: golangci/golangci-lint-action@v6
 with:
 version: latest
 args: --timeout 5000s --verbose
@@ -59,7 +59,6 @@ jobs:
 with:
 jobs: ${{ toJSON(needs) }}
 build-container:
-if: ${{ github.repository != 'goauthentik/authentik-internal' }}
 timeout-minutes: 120
 needs:
 - ci-outpost-mark
@@ -73,7 +72,7 @@
 - rac
 runs-on: ubuntu-latest
 permissions:
-# Needed to upload container images to ghcr.io
+# Needed to upload contianer images to ghcr.io
 packages: write
 # Needed for attestation
 id-token: write
@@ -83,7 +82,7 @@
 with:
 ref: ${{ github.event.pull_request.head.sha }}
 - name: Set up QEMU
-uses: docker/setup-qemu-action@v3.6.0
+uses: docker/setup-qemu-action@v3.3.0
 - name: Set up Docker Buildx
 uses: docker/setup-buildx-action@v3
 - name: prepare variables


@ -49,7 +49,6 @@ jobs:
matrix: matrix:
job: job:
- build - build
- build:integrations
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-node@v4 - uses: actions/setup-node@v4
@ -62,65 +61,14 @@ jobs:
- name: build - name: build
working-directory: website/ working-directory: website/
run: npm run ${{ matrix.job }} run: npm run ${{ matrix.job }}
build-container:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
permissions:
# Needed to upload container images to ghcr.io
packages: write
# Needed for attestation
id-token: write
attestations: write
steps:
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.6.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-docs
- name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image
id: push
uses: docker/build-push-action@v6
with:
tags: ${{ steps.ev.outputs.imageTags }}
file: website/Dockerfile
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
platforms: linux/amd64,linux/arm64
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache,mode=max' || '' }}
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
ci-website-mark: ci-website-mark:
if: always() if: always()
needs: needs:
- lint - lint
- test - test
- build - build
- build-container
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: re-actors/alls-green@release/v1 - uses: re-actors/alls-green@release/v1
with: with:
jobs: ${{ toJSON(needs) }} jobs: ${{ toJSON(needs) }}
allowed-skips: ${{ github.repository == 'goauthentik/authentik-internal' && 'build-container' || '[]' }}

@@ -2,7 +2,7 @@ name: "CodeQL"
 on:
 push:
-branches: [main, next, version*]
+branches: [main, "*", next, version*]
 pull_request:
 branches: [main]
 schedule:

@@ -2,7 +2,7 @@ name: authentik-gen-update-webauthn-mds
 on:
 workflow_dispatch:
 schedule:
-- cron: "30 1 1,15 * *"
+- cron: '30 1 1,15 * *'
 env:
 POSTGRES_DB: authentik
@@ -24,7 +24,7 @@ jobs:
 token: ${{ steps.generate_token.outputs.token }}
 - name: Setup authentik env
 uses: ./.github/actions/setup
-- run: uv run ak update_webauthn_mds
+- run: poetry run ak update_webauthn_mds
 - uses: peter-evans/create-pull-request@v7
 id: cpr
 with:
@@ -37,7 +37,6 @@ jobs:
 signoff: true
 # ID from https://api.github.com/users/authentik-automation[bot]
 author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
-labels: dependencies
 - uses: peter-evans/enable-pull-request-automerge@v3
 with:
 token: ${{ steps.generate_token.outputs.token }}

@@ -53,7 +53,6 @@ jobs:
 body: ${{ steps.compress.outputs.markdown }}
 delete-branch: true
 signoff: true
-labels: dependencies
 - uses: peter-evans/enable-pull-request-automerge@v3
 if: "${{ github.event_name != 'pull_request' && steps.compress.outputs.markdown != '' }}"
 with:


@ -1,47 +0,0 @@
name: authentik-packages-npm-publish
on:
push:
branches: [main]
paths:
- packages/docusaurus-config/**
- packages/eslint-config/**
- packages/prettier-config/**
- packages/tsconfig/**
- web/packages/esbuild-plugin-live-reload/**
workflow_dispatch:
jobs:
publish:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
package:
- packages/docusaurus-config
- packages/eslint-config
- packages/prettier-config
- packages/tsconfig
- web/packages/esbuild-plugin-live-reload
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 2
- uses: actions/setup-node@v4
with:
node-version-file: ${{ matrix.package }}/package.json
registry-url: "https://registry.npmjs.org"
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c
with:
files: |
${{ matrix.package }}/package.json
- name: Publish package
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ${{ matrix.package }}
run: |
npm ci
npm run build
npm publish
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}

@@ -21,8 +21,8 @@ jobs:
 uses: ./.github/actions/setup
 - name: generate docs
 run: |
-uv run make migrate
-uv run ak build_source_docs
+poetry run make migrate
+poetry run ak build_source_docs
 - name: Publish
 uses: netlify/actions/cli@master
 with:


@ -7,23 +7,9 @@ on:
jobs: jobs:
build-server: build-server:
uses: ./.github/workflows/_reusable-docker-build.yaml
secrets: inherit
permissions:
# Needed to upload container images to ghcr.io
packages: write
# Needed for attestation
id-token: write
attestations: write
with:
image_name: ghcr.io/goauthentik/server,beryju/authentik
release: true
registry_dockerhub: true
registry_ghcr: true
build-docs:
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload container images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation # Needed for attestation
id-token: write id-token: write
@ -31,7 +17,7 @@ jobs:
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.6.0 uses: docker/setup-qemu-action@v3.3.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
@ -40,25 +26,37 @@ jobs:
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with: with:
image-name: ghcr.io/goauthentik/docs image-name: ghcr.io/goauthentik/server,beryju/authentik
- name: Docker Login Registry
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
uses: docker/login-action@v3 uses: docker/login-action@v3
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: make empty clients
run: |
mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api
- name: Build Docker Image - name: Build Docker Image
id: push
uses: docker/build-push-action@v6 uses: docker/build-push-action@v6
id: push
with: with:
tags: ${{ steps.ev.outputs.imageTags }}
file: website/Dockerfile
push: true
platforms: linux/amd64,linux/arm64
context: . context: .
push: true
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
build-args: |
VERSION=${{ github.ref }}
tags: ${{ steps.ev.outputs.imageTags }}
platforms: linux/amd64,linux/arm64
- uses: actions/attest-build-provenance@v2 - uses: actions/attest-build-provenance@v2
id: attest id: attest
if: true
with: with:
subject-name: ${{ steps.ev.outputs.attestImageNames }} subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }} subject-digest: ${{ steps.push.outputs.digest }}
@ -66,7 +64,7 @@ jobs:
build-outpost: build-outpost:
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload container images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation # Needed for attestation
id-token: write id-token: write
@ -85,7 +83,7 @@ jobs:
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.6.0 uses: docker/setup-qemu-action@v3.3.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
@ -190,8 +188,8 @@ jobs:
aws-region: ${{ env.AWS_REGION }} aws-region: ${{ env.AWS_REGION }}
- name: Upload template - name: Upload template
run: | run: |
aws s3 cp --acl=public-read lifecycle/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.${{ github.ref }}.yaml aws s3 cp --acl=public-read website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.${{ github.ref }}.yaml
aws s3 cp --acl=public-read lifecycle/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.latest.yaml aws s3 cp --acl=public-read website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.latest.yaml
test-release: test-release:
needs: needs:
- build-server - build-server
@ -229,13 +227,13 @@ jobs:
container=$(docker container create ${{ steps.ev.outputs.imageMainName }}) container=$(docker container create ${{ steps.ev.outputs.imageMainName }})
docker cp ${container}:web/ . docker cp ${container}:web/ .
- name: Create a Sentry.io release - name: Create a Sentry.io release
uses: getsentry/action-release@v3 uses: getsentry/action-release@v1
continue-on-error: true continue-on-error: true
env: env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }} SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_ORG: authentik-security-inc SENTRY_ORG: authentik-security-inc
SENTRY_PROJECT: authentik SENTRY_PROJECT: authentik
with: with:
release: authentik@${{ steps.ev.outputs.version }} version: authentik@${{ steps.ev.outputs.version }}
sourcemaps: "./web/dist" sourcemaps: "./web/dist"
url_prefix: "~/static/dist" url_prefix: "~/static/dist"

@@ -14,7 +14,16 @@ jobs:
 - uses: actions/checkout@v4
 - name: Pre-release test
 run: |
-make test-docker
+echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env
+echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env
+docker buildx install
+mkdir -p ./gen-ts-api
+docker build -t testing:latest .
+echo "AUTHENTIK_IMAGE=testing" >> .env
+echo "AUTHENTIK_TAG=latest" >> .env
+docker compose up --no-start
+docker compose start postgresql redis
+docker compose run -u root server test-all
 - id: generate_token
 uses: tibdex/github-app-token@v2
 with:


@ -1,21 +0,0 @@
name: "authentik-repo-mirror-cleanup"
on:
workflow_dispatch:
jobs:
to_internal:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- if: ${{ env.MIRROR_KEY != '' }}
uses: BeryJu/repository-mirroring-action@5cf300935bc2e068f73ea69bcc411a8a997208eb
with:
target_repo_url: git@github.com:goauthentik/authentik-internal.git
ssh_private_key: ${{ secrets.GH_MIRROR_KEY }}
args: --tags --force --prune
env:
MIRROR_KEY: ${{ secrets.GH_MIRROR_KEY }}

@@ -11,10 +11,11 @@ jobs:
 with:
 fetch-depth: 0
 - if: ${{ env.MIRROR_KEY != '' }}
-uses: BeryJu/repository-mirroring-action@5cf300935bc2e068f73ea69bcc411a8a997208eb
+uses: pixta-dev/repository-mirroring-action@v1
 with:
-target_repo_url: git@github.com:goauthentik/authentik-internal.git
-ssh_private_key: ${{ secrets.GH_MIRROR_KEY }}
-args: --tags --force
+target_repo_url:
+git@github.com:goauthentik/authentik-internal.git
+ssh_private_key:
+${{ secrets.GH_MIRROR_KEY }}
 env:
 MIRROR_KEY: ${{ secrets.GH_MIRROR_KEY }}

@@ -1,8 +1,8 @@
-name: "authentik-repo-stale"
+name: 'authentik-repo-stale'
 on:
 schedule:
-- cron: "30 1 * * *"
+- cron: '30 1 * * *'
 workflow_dispatch:
 permissions:
@@ -25,7 +25,7 @@ jobs:
 days-before-stale: 60
 days-before-close: 7
 exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question,status/reviewing
-stale-issue-label: status/stale
+stale-issue-label: wontfix
 stale-issue-message: >
 This issue has been automatically marked as stale because it has not had
 recent activity. It will be closed if no further activity occurs. Thank you


@ -1,27 +0,0 @@
name: authentik-semgrep
on:
workflow_dispatch: {}
pull_request: {}
push:
branches:
- main
- master
paths:
- .github/workflows/semgrep.yml
schedule:
# random HH:MM to avoid a load spike on GitHub Actions at 00:00
- cron: '12 15 * * *'
jobs:
semgrep:
name: semgrep/ci
runs-on: ubuntu-latest
permissions:
contents: read
env:
SEMGREP_APP_TOKEN: ${{ secrets.SEMGREP_APP_TOKEN }}
container:
image: semgrep/semgrep
if: (github.actor != 'dependabot[bot]')
steps:
- uses: actions/checkout@v4
- run: semgrep ci


@ -1,13 +1,9 @@
--- ---
name: authentik-translate-extract-compile name: authentik-backend-translate-extract-compile
on: on:
schedule: schedule:
- cron: "0 0 * * *" # every day at midnight - cron: "0 0 * * *" # every day at midnight
workflow_dispatch: workflow_dispatch:
pull_request:
branches:
- main
- version-*
env: env:
POSTGRES_DB: authentik POSTGRES_DB: authentik
@ -16,34 +12,26 @@ env:
jobs: jobs:
compile: compile:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- id: generate_token - id: generate_token
if: ${{ github.event_name != 'pull_request' }}
uses: tibdex/github-app-token@v2 uses: tibdex/github-app-token@v2
with: with:
app_id: ${{ secrets.GH_APP_ID }} app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }} private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4 - uses: actions/checkout@v4
if: ${{ github.event_name != 'pull_request' }}
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
- uses: actions/checkout@v4
if: ${{ github.event_name == 'pull_request' }}
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: Generate API
run: make gen-client-ts
- name: run extract - name: run extract
run: | run: |
uv run make i18n-extract poetry run make i18n-extract
- name: run compile - name: run compile
run: | run: |
uv run ak compilemessages poetry run ak compilemessages
make web-check-compile make web-check-compile
- name: Create Pull Request - name: Create Pull Request
if: ${{ github.event_name != 'pull_request' }}
uses: peter-evans/create-pull-request@v7 uses: peter-evans/create-pull-request@v7
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
@ -53,6 +41,3 @@ jobs:
body: "core, web: update translations" body: "core, web: update translations"
delete-branch: true delete-branch: true
signoff: true signoff: true
labels: dependencies
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>

@@ -15,7 +15,6 @@ jobs:
 runs-on: ubuntu-latest
 if: ${{ github.event.pull_request.user.login == 'transifex-integration[bot]'}}
 steps:
-- uses: actions/checkout@v4
 - id: generate_token
 uses: tibdex/github-app-token@v2
 with:
@@ -26,13 +25,23 @@ jobs:
 env:
 GH_TOKEN: ${{ steps.generate_token.outputs.token }}
 run: |
-title=$(gh pr view ${{ github.event.pull_request.number }} --json "title" -q ".title")
+title=$(curl -q -L \
+-H "Accept: application/vnd.github+json" \
+-H "Authorization: Bearer ${GH_TOKEN}" \
+-H "X-GitHub-Api-Version: 2022-11-28" \
+https://api.github.com/repos/${GITHUB_REPOSITORY}/pulls/${{ github.event.pull_request.number }} | jq -r .title)
 echo "title=${title}" >> "$GITHUB_OUTPUT"
 - name: Rename
 env:
 GH_TOKEN: ${{ steps.generate_token.outputs.token }}
 run: |
-gh pr edit ${{ github.event.pull_request.number }} -t "translate: ${{ steps.title.outputs.title }}" --add-label dependencies
+curl -L \
+-X PATCH \
+-H "Accept: application/vnd.github+json" \
+-H "Authorization: Bearer ${GH_TOKEN}" \
+-H "X-GitHub-Api-Version: 2022-11-28" \
+https://api.github.com/repos/${GITHUB_REPOSITORY}/pulls/${{ github.event.pull_request.number }} \
+-d "{\"title\":\"translate: ${{ steps.title.outputs.title }}\"}"
 - uses: peter-evans/enable-pull-request-automerge@v3
 with:
 token: ${{ steps.generate_token.outputs.token }}

.gitignore

@@ -11,10 +11,6 @@ local_settings.py
 db.sqlite3
 media
-# Node
-node_modules
 # If your build process includes running collectstatic, then you probably don't need or want to include staticfiles/
 # in your Git repository. Update and uncomment the following line accordingly.
 # <django-project-name>/staticfiles/
@@ -37,7 +33,6 @@ eggs/
 lib64/
 parts/
 dist/
-out/
 sdist/
 var/
 wheels/
@@ -214,6 +209,3 @@ source_docs/
 ### Golang ###
 /vendor/
-### Docker ###
-docker-compose.override.yml


@ -1,47 +0,0 @@
# Prettier Ignorefile
## Static Files
**/LICENSE
authentik/stages/**/*
## Build asset directories
coverage
dist
out
.docusaurus
website/docs/developer-docs/api/**/*
## Environment
*.env
## Secrets
*.secrets
## Yarn
.yarn/**/*
## Node
node_modules
coverage
## Configs
*.log
*.yaml
*.yml
# Templates
# TODO: Rename affected files to *.template.* or similar.
*.html
*.mdx
*.md
## Import order matters
poly.ts
src/locale-codes.ts
src/locales/
# Storybook
storybook-static/
.storybook/css-import-maps*

@@ -2,7 +2,6 @@
 "recommendations": [
 "bashmish.es6-string-css",
 "bpruitt-goddard.mermaid-markdown-syntax-highlighting",
-"charliermarsh.ruff",
 "dbaeumer.vscode-eslint",
 "EditorConfig.EditorConfig",
 "esbenp.prettier-vscode",
@@ -11,12 +10,12 @@
 "Gruntfuggly.todo-tree",
 "mechatroner.rainbow-csv",
 "ms-python.black-formatter",
-"ms-python.black-formatter",
-"ms-python.debugpy",
+"charliermarsh.ruff",
 "ms-python.python",
 "ms-python.vscode-pylance",
+"ms-python.black-formatter",
 "redhat.vscode-yaml",
 "Tobermory.es6-string-html",
-"unifiedjs.vscode-mdx",
+"unifiedjs.vscode-mdx"
 ]
 }

.vscode/launch.json

@ -2,76 +2,26 @@
"version": "0.2.0", "version": "0.2.0",
"configurations": [ "configurations": [
{ {
"name": "Debug: Attach Server Core", "name": "Python: PDB attach Server",
"type": "debugpy", "type": "python",
"request": "attach", "request": "attach",
"connect": { "connect": {
"host": "localhost", "host": "localhost",
"port": 9901 "port": 6800
}, },
"pathMappings": [ "justMyCode": true,
{
"localRoot": "${workspaceFolder}",
"remoteRoot": "."
}
],
"django": true "django": true
}, },
{ {
"name": "Debug: Attach Worker", "name": "Python: PDB attach Worker",
"type": "debugpy", "type": "python",
"request": "attach", "request": "attach",
"connect": { "connect": {
"host": "localhost", "host": "localhost",
"port": 9901 "port": 6900
}, },
"pathMappings": [ "justMyCode": true,
{
"localRoot": "${workspaceFolder}",
"remoteRoot": "."
}
],
"django": true "django": true
},
{
"name": "Debug: Start Server Router",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/server",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start LDAP Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/ldap",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start Proxy Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/proxy",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start RAC Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/rac",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start Radius Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/radius",
"cwd": "${workspaceFolder}"
} }
] ]
} }

.vscode/settings.json

@ -1,4 +1,26 @@
{ {
"cSpell.words": [
"akadmin",
"asgi",
"authentik",
"authn",
"entra",
"goauthentik",
"jwe",
"jwks",
"kubernetes",
"oidc",
"openid",
"passwordless",
"plex",
"saml",
"scim",
"slo",
"sso",
"totp",
"traefik",
"webauthn"
],
"todo-tree.tree.showCountsInTree": true, "todo-tree.tree.showCountsInTree": true,
"todo-tree.tree.showBadges": true, "todo-tree.tree.showBadges": true,
"yaml.customTags": [ "yaml.customTags": [
@ -6,19 +28,17 @@
"!Context scalar", "!Context scalar",
"!Enumerate sequence", "!Enumerate sequence",
"!Env scalar", "!Env scalar",
"!Env sequence",
"!Find sequence", "!Find sequence",
"!Format sequence", "!Format sequence",
"!If sequence", "!If sequence",
"!Index scalar", "!Index scalar",
"!KeyOf scalar", "!KeyOf scalar",
"!Value scalar", "!Value scalar",
"!AtIndex scalar", "!AtIndex scalar"
"!ParseJSON scalar"
], ],
"typescript.preferences.importModuleSpecifier": "non-relative", "typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.importModuleSpecifierEnding": "index", "typescript.preferences.importModuleSpecifierEnding": "index",
"typescript.tsdk": "./node_modules/typescript/lib", "typescript.tsdk": "./web/node_modules/typescript/lib",
"typescript.enablePromptUseWorkspaceTsdk": true, "typescript.enablePromptUseWorkspaceTsdk": true,
"yaml.schemas": { "yaml.schemas": {
"./blueprints/schema.json": "blueprints/**/*.yaml" "./blueprints/schema.json": "blueprints/**/*.yaml"
@ -32,5 +52,7 @@
} }
], ],
"go.testFlags": ["-count=1"], "go.testFlags": ["-count=1"],
"github-actions.workflows.pinned.workflows": [".github/workflows/ci-main.yml"] "github-actions.workflows.pinned.workflows": [
".github/workflows/ci-main.yml"
]
} }

.vscode/tasks.json

@ -3,13 +3,8 @@
"tasks": [ "tasks": [
{ {
"label": "authentik/core: make", "label": "authentik/core: make",
"command": "uv", "command": "poetry",
"args": [ "args": ["run", "make", "lint-fix", "lint"],
"run",
"make",
"lint-fix",
"lint"
],
"presentation": { "presentation": {
"panel": "new" "panel": "new"
}, },
@ -17,12 +12,8 @@
}, },
{ {
"label": "authentik/core: run", "label": "authentik/core: run",
"command": "uv", "command": "poetry",
"args": [ "args": ["run", "ak", "server"],
"run",
"ak",
"server"
],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
@ -32,17 +23,13 @@
{ {
"label": "authentik/web: make", "label": "authentik/web: make",
"command": "make", "command": "make",
"args": [ "args": ["web"],
"web"
],
"group": "build" "group": "build"
}, },
{ {
"label": "authentik/web: watch", "label": "authentik/web: watch",
"command": "make", "command": "make",
"args": [ "args": ["web-watch"],
"web-watch"
],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
@ -52,26 +39,19 @@
{ {
"label": "authentik: install", "label": "authentik: install",
"command": "make", "command": "make",
"args": [ "args": ["install", "-j4"],
"install",
"-j4"
],
"group": "build" "group": "build"
}, },
{ {
"label": "authentik/website: make", "label": "authentik/website: make",
"command": "make", "command": "make",
"args": [ "args": ["website"],
"website"
],
"group": "build" "group": "build"
}, },
{ {
"label": "authentik/website: watch", "label": "authentik/website: watch",
"command": "make", "command": "make",
"args": [ "args": ["website-watch"],
"website-watch"
],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
@ -80,12 +60,8 @@
}, },
{ {
"label": "authentik/api: generate", "label": "authentik/api: generate",
"command": "uv", "command": "poetry",
"args": [ "args": ["run", "make", "gen"],
"run",
"make",
"gen"
],
"group": "build" "group": "build"
} }
] ]
@ -10,12 +10,11 @@ schemas/ @goauthentik/backend
scripts/ @goauthentik/backend scripts/ @goauthentik/backend
tests/ @goauthentik/backend tests/ @goauthentik/backend
pyproject.toml @goauthentik/backend pyproject.toml @goauthentik/backend
uv.lock @goauthentik/backend poetry.lock @goauthentik/backend
go.mod @goauthentik/backend go.mod @goauthentik/backend
go.sum @goauthentik/backend go.sum @goauthentik/backend
# Infrastructure # Infrastructure
.github/ @goauthentik/infrastructure .github/ @goauthentik/infrastructure
lifecycle/aws/ @goauthentik/infrastructure
Dockerfile @goauthentik/infrastructure Dockerfile @goauthentik/infrastructure
*Dockerfile @goauthentik/infrastructure *Dockerfile @goauthentik/infrastructure
.dockerignore @goauthentik/infrastructure .dockerignore @goauthentik/infrastructure
@ -23,8 +22,6 @@ docker-compose.yml @goauthentik/infrastructure
Makefile @goauthentik/infrastructure Makefile @goauthentik/infrastructure
.editorconfig @goauthentik/infrastructure .editorconfig @goauthentik/infrastructure
CODEOWNERS @goauthentik/infrastructure CODEOWNERS @goauthentik/infrastructure
# Web packages
packages/ @goauthentik/frontend
# Web # Web
web/ @goauthentik/frontend web/ @goauthentik/frontend
tests/wdio/ @goauthentik/frontend tests/wdio/ @goauthentik/frontend
@ -5,7 +5,7 @@
We as members, contributors, and leaders pledge to make participation in our We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socioeconomic status, identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity nationality, personal appearance, race, religion, or sexual identity
and orientation. and orientation.
@ -1,7 +1,26 @@
# syntax=docker/dockerfile:1 # syntax=docker/dockerfile:1
# Stage 1: Build webui # Stage 1: Build website
FROM --platform=${BUILDPLATFORM} docker.io/library/node:24-slim AS node-builder FROM --platform=${BUILDPLATFORM} docker.io/library/node:22 AS website-builder
ENV NODE_ENV=production
WORKDIR /work/website
RUN --mount=type=bind,target=/work/website/package.json,src=./website/package.json \
--mount=type=bind,target=/work/website/package-lock.json,src=./website/package-lock.json \
--mount=type=cache,id=npm-website,sharing=shared,target=/root/.npm \
npm ci --include=dev
COPY ./website /work/website/
COPY ./blueprints /work/blueprints/
COPY ./schema.yml /work/
COPY ./SECURITY.md /work/
RUN npm run build-bundled
# Stage 2: Build webui
FROM --platform=${BUILDPLATFORM} docker.io/library/node:22 AS web-builder
ARG GIT_BUILD_HASH ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
@ -13,7 +32,7 @@ RUN --mount=type=bind,target=/work/web/package.json,src=./web/package.json \
--mount=type=bind,target=/work/web/package-lock.json,src=./web/package-lock.json \ --mount=type=bind,target=/work/web/package-lock.json,src=./web/package-lock.json \
--mount=type=bind,target=/work/web/packages/sfe/package.json,src=./web/packages/sfe/package.json \ --mount=type=bind,target=/work/web/packages/sfe/package.json,src=./web/packages/sfe/package.json \
--mount=type=bind,target=/work/web/scripts,src=./web/scripts \ --mount=type=bind,target=/work/web/scripts,src=./web/scripts \
--mount=type=cache,id=npm-ak,sharing=shared,target=/root/.npm \ --mount=type=cache,id=npm-web,sharing=shared,target=/root/.npm \
npm ci --include=dev npm ci --include=dev
COPY ./package.json /work COPY ./package.json /work
@ -21,11 +40,10 @@ COPY ./web /work/web/
COPY ./website /work/website/ COPY ./website /work/website/
COPY ./gen-ts-api /work/web/node_modules/@goauthentik/api COPY ./gen-ts-api /work/web/node_modules/@goauthentik/api
RUN npm run build && \ RUN npm run build
npm run build:sfe
# Stage 2: Build go proxy # Stage 3: Build go proxy
FROM --platform=${BUILDPLATFORM} docker.io/library/golang:1.24-bookworm AS go-builder FROM --platform=${BUILDPLATFORM} mcr.microsoft.com/oss/go/microsoft/golang:1.23-fips-bookworm AS go-builder
ARG TARGETOS ARG TARGETOS
ARG TARGETARCH ARG TARGETARCH
@ -49,8 +67,8 @@ RUN --mount=type=bind,target=/go/src/goauthentik.io/go.mod,src=./go.mod \
COPY ./cmd /go/src/goauthentik.io/cmd COPY ./cmd /go/src/goauthentik.io/cmd
COPY ./authentik/lib /go/src/goauthentik.io/authentik/lib COPY ./authentik/lib /go/src/goauthentik.io/authentik/lib
COPY ./web/static.go /go/src/goauthentik.io/web/static.go COPY ./web/static.go /go/src/goauthentik.io/web/static.go
COPY --from=node-builder /work/web/robots.txt /go/src/goauthentik.io/web/robots.txt COPY --from=web-builder /work/web/robots.txt /go/src/goauthentik.io/web/robots.txt
COPY --from=node-builder /work/web/security.txt /go/src/goauthentik.io/web/security.txt COPY --from=web-builder /work/web/security.txt /go/src/goauthentik.io/web/security.txt
COPY ./internal /go/src/goauthentik.io/internal COPY ./internal /go/src/goauthentik.io/internal
COPY ./go.mod /go/src/goauthentik.io/go.mod COPY ./go.mod /go/src/goauthentik.io/go.mod
COPY ./go.sum /go/src/goauthentik.io/go.sum COPY ./go.sum /go/src/goauthentik.io/go.sum
@ -58,75 +76,55 @@ COPY ./go.sum /go/src/goauthentik.io/go.sum
RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \ RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \
--mount=type=cache,id=go-build-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/root/.cache/go-build \ --mount=type=cache,id=go-build-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/root/.cache/go-build \
if [ "$TARGETARCH" = "arm64" ]; then export CC=aarch64-linux-gnu-gcc && export CC_FOR_TARGET=gcc-aarch64-linux-gnu; fi && \ if [ "$TARGETARCH" = "arm64" ]; then export CC=aarch64-linux-gnu-gcc && export CC_FOR_TARGET=gcc-aarch64-linux-gnu; fi && \
CGO_ENABLED=1 GOFIPS140=latest GOARM="${TARGETVARIANT#v}" \ CGO_ENABLED=1 GOEXPERIMENT="systemcrypto" GOFLAGS="-tags=requirefips" GOARM="${TARGETVARIANT#v}" \
go build -o /go/authentik ./cmd/server go build -o /go/authentik ./cmd/server
# Stage 3: MaxMind GeoIP # Stage 4: MaxMind GeoIP
FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v7.1.0 AS geoip FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v7.1.0 AS geoip
ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City GeoLite2-ASN" ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City GeoLite2-ASN"
ENV GEOIPUPDATE_VERBOSE="1" ENV GEOIPUPDATE_VERBOSE="1"
ENV GEOIPUPDATE_ACCOUNT_ID_FILE="/run/secrets/GEOIPUPDATE_ACCOUNT_ID" ENV GEOIPUPDATE_ACCOUNT_ID_FILE="/run/secrets/GEOIPUPDATE_ACCOUNT_ID"
ENV GEOIPUPDATE_LICENSE_KEY_FILE="/run/secrets/GEOIPUPDATE_LICENSE_KEY"
USER root USER root
RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
--mount=type=secret,id=GEOIPUPDATE_LICENSE_KEY \ --mount=type=secret,id=GEOIPUPDATE_LICENSE_KEY \
mkdir -p /usr/share/GeoIP && \ mkdir -p /usr/share/GeoIP && \
/bin/sh -c "GEOIPUPDATE_LICENSE_KEY_FILE=/run/secrets/GEOIPUPDATE_LICENSE_KEY /usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0" /bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 4: Download uv # Stage 5: Python dependencies
FROM ghcr.io/astral-sh/uv:0.7.17 AS uv FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS python-deps
# Stage 5: Base python image
FROM ghcr.io/goauthentik/fips-python:3.13.5-slim-bookworm-fips AS python-base
ENV VENV_PATH="/ak-root/.venv" \
PATH="/lifecycle:/ak-root/.venv/bin:$PATH" \
UV_COMPILE_BYTECODE=1 \
UV_LINK_MODE=copy \
UV_NATIVE_TLS=1 \
UV_PYTHON_DOWNLOADS=0
WORKDIR /ak-root/
COPY --from=uv /uv /uvx /bin/
# Stage 6: Python dependencies
FROM python-base AS python-deps
ARG TARGETARCH ARG TARGETARCH
ARG TARGETVARIANT ARG TARGETVARIANT
RUN rm -f /etc/apt/apt.conf.d/docker-clean; echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache WORKDIR /ak-root/poetry
ENV PATH="/root/.cargo/bin:$PATH" ENV VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false \
PATH="/ak-root/venv/bin:$PATH"
RUN rm -f /etc/apt/apt.conf.d/docker-clean; echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache
RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \ RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \
apt-get update && \ apt-get update && \
# Required for installing pip packages # Required for installing pip packages
apt-get install -y --no-install-recommends \ apt-get install -y --no-install-recommends build-essential pkg-config libpq-dev libkrb5-dev
# Build essentials
build-essential pkg-config libffi-dev git \
# cryptography
curl \
# libxml
libxslt-dev zlib1g-dev \
# postgresql
libpq-dev \
# python-kadmin-rs
clang libkrb5-dev sccache \
# xmlsec
libltdl-dev && \
curl https://sh.rustup.rs -sSf | sh -s -- -y
ENV UV_NO_BINARY_PACKAGE="cryptography lxml python-kadmin-rs xmlsec" RUN --mount=type=bind,target=./pyproject.toml,src=./pyproject.toml \
--mount=type=bind,target=./poetry.lock,src=./poetry.lock \
--mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/pypoetry \
python -m venv /ak-root/venv/ && \
bash -c "source ${VENV_PATH}/bin/activate && \
pip3 install --upgrade pip && \
pip3 install poetry && \
poetry install --only=main --no-ansi --no-interaction --no-root && \
pip install --force-reinstall /wheels/*"
RUN --mount=type=bind,target=pyproject.toml,src=pyproject.toml \ # Stage 6: Run
--mount=type=bind,target=uv.lock,src=uv.lock \ FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS final-image
--mount=type=cache,target=/root/.cache/uv \
uv sync --frozen --no-install-project --no-dev
# Stage 7: Run
FROM python-base AS final-image
ARG VERSION ARG VERSION
ARG GIT_BUILD_HASH ARG GIT_BUILD_HASH
@ -142,12 +140,10 @@ WORKDIR /
# We cannot cache this layer otherwise we'll end up with a bigger image # We cannot cache this layer otherwise we'll end up with a bigger image
RUN apt-get update && \ RUN apt-get update && \
apt-get upgrade -y && \
# Required for runtime # Required for runtime
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 libltdl7 libxslt1.1 && \ apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 && \
# Required for bootstrap & healtcheck # Required for bootstrap & healtcheck
apt-get install -y --no-install-recommends runit && \ apt-get install -y --no-install-recommends runit && \
pip3 install --no-cache-dir --upgrade pip && \
apt-get clean && \ apt-get clean && \
rm -rf /tmp/* /var/lib/apt/lists/* /var/tmp/ && \ rm -rf /tmp/* /var/lib/apt/lists/* /var/tmp/ && \
adduser --system --no-create-home --uid 1000 --group --home /authentik authentik && \ adduser --system --no-create-home --uid 1000 --group --home /authentik authentik && \
@ -158,7 +154,7 @@ RUN apt-get update && \
COPY ./authentik/ /authentik COPY ./authentik/ /authentik
COPY ./pyproject.toml / COPY ./pyproject.toml /
COPY ./uv.lock / COPY ./poetry.lock /
COPY ./schemas /schemas COPY ./schemas /schemas
COPY ./locale /locale COPY ./locale /locale
COPY ./tests /tests COPY ./tests /tests
@ -167,9 +163,10 @@ COPY ./blueprints /blueprints
COPY ./lifecycle/ /lifecycle COPY ./lifecycle/ /lifecycle
COPY ./authentik/sources/kerberos/krb5.conf /etc/krb5.conf COPY ./authentik/sources/kerberos/krb5.conf /etc/krb5.conf
COPY --from=go-builder /go/authentik /bin/authentik COPY --from=go-builder /go/authentik /bin/authentik
COPY --from=python-deps /ak-root/.venv /ak-root/.venv COPY --from=python-deps /ak-root/venv /ak-root/venv
COPY --from=node-builder /work/web/dist/ /web/dist/ COPY --from=web-builder /work/web/dist/ /web/dist/
COPY --from=node-builder /work/web/authentik/ /web/authentik/ COPY --from=web-builder /work/web/authentik/ /web/authentik/
COPY --from=website-builder /work/website/build/ /website/help/
COPY --from=geoip /usr/share/GeoIP /geoip COPY --from=geoip /usr/share/GeoIP /geoip
USER 1000 USER 1000
@ -177,7 +174,11 @@ USER 1000
ENV TMPDIR=/dev/shm/ \ ENV TMPDIR=/dev/shm/ \
PYTHONDONTWRITEBYTECODE=1 \ PYTHONDONTWRITEBYTECODE=1 \
PYTHONUNBUFFERED=1 \ PYTHONUNBUFFERED=1 \
GOFIPS=1 PATH="/ak-root/venv/bin:/lifecycle:$PATH" \
VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false
ENV GOFIPS=1
HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ] HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ]
Makefile
@ -1,21 +1,34 @@
.PHONY: gen dev-reset all clean test web website .PHONY: gen dev-reset all clean test web website
SHELL := /usr/bin/env bash .SHELLFLAGS += ${SHELLFLAGS} -e
.SHELLFLAGS += ${SHELLFLAGS} -e -o pipefail
PWD = $(shell pwd) PWD = $(shell pwd)
UID = $(shell id -u) UID = $(shell id -u)
GID = $(shell id -g) GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.generate_semver) NPM_VERSION = $(shell python -m scripts.npm_version)
PY_SOURCES = authentik tests scripts lifecycle .github PY_SOURCES = authentik tests scripts lifecycle .github website/docs/install-config/install/aws
DOCKER_IMAGE ?= "authentik:test" DOCKER_IMAGE ?= "authentik:test"
GEN_API_TS = gen-ts-api GEN_API_TS = "gen-ts-api"
GEN_API_PY = gen-py-api GEN_API_PY = "gen-py-api"
GEN_API_GO = gen-go-api GEN_API_GO = "gen-go-api"
pg_user := $(shell uv run python -m authentik.lib.config postgresql.user 2>/dev/null) pg_user := $(shell python -m authentik.lib.config postgresql.user 2>/dev/null)
pg_host := $(shell uv run python -m authentik.lib.config postgresql.host 2>/dev/null) pg_host := $(shell python -m authentik.lib.config postgresql.host 2>/dev/null)
pg_name := $(shell uv run python -m authentik.lib.config postgresql.name 2>/dev/null) pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null)
CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \
-I .github/codespell-words.txt \
-S 'web/src/locales/**' \
-S 'website/docs/developer-docs/api/reference/**' \
authentik \
internal \
cmd \
web/src \
website/src \
website/blog \
website/docs \
website/integrations \
website/src
all: lint-fix lint test gen web ## Lint, build, and test everything all: lint-fix lint test gen web ## Lint, build, and test everything
@ -32,38 +45,41 @@ help: ## Show this help
go-test: go-test:
go test -timeout 0 -v -race -cover ./... go test -timeout 0 -v -race -cover ./...
test-docker: ## Run all tests in a docker-compose
echo "PG_PASS=$(shell openssl rand 32 | base64 -w 0)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(shell openssl rand 32 | base64 -w 0)" >> .env
docker compose pull -q
docker compose up --no-start
docker compose start postgresql redis
docker compose run -u root server test-all
rm -f .env
test: ## Run the server tests and produce a coverage report (locally) test: ## Run the server tests and produce a coverage report (locally)
uv run coverage run manage.py test --keepdb authentik coverage run manage.py test --keepdb authentik
uv run coverage html coverage html
uv run coverage report coverage report
lint-fix: lint-codespell ## Lint and automatically fix errors in the python source code. Reports spelling errors. lint-fix: lint-codespell ## Lint and automatically fix errors in the python source code. Reports spelling errors.
uv run black $(PY_SOURCES) black $(PY_SOURCES)
uv run ruff check --fix $(PY_SOURCES) ruff check --fix $(PY_SOURCES)
lint-codespell: ## Reports spelling errors. lint-codespell: ## Reports spelling errors.
uv run codespell -w codespell -w $(CODESPELL_ARGS)
lint: ## Lint the python and golang sources lint: ## Lint the python and golang sources
uv run bandit -c pyproject.toml -r $(PY_SOURCES) bandit -r $(PY_SOURCES) -x web/node_modules -x tests/wdio/node_modules -x website/node_modules
golangci-lint run -v golangci-lint run -v
core-install: core-install:
uv sync --frozen poetry install
migrate: ## Run the Authentik Django server's migrations migrate: ## Run the Authentik Django server's migrations
uv run python -m lifecycle.migrate python -m lifecycle.migrate
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service
aws-cfn:
cd lifecycle/aws && npm run aws-cfn
run: ## Run the main authentik server process
uv run ak server
core-i18n-extract: core-i18n-extract:
uv run ak makemessages \ ak makemessages \
--add-location file \ --add-location file \
--no-obsolete \ --no-obsolete \
--ignore web \ --ignore web \
@ -86,10 +102,6 @@ dev-create-db:
dev-reset: dev-drop-db dev-create-db migrate ## Drop and restore the Authentik PostgreSQL instance to a "fresh install" state. dev-reset: dev-drop-db dev-create-db migrate ## Drop and restore the Authentik PostgreSQL instance to a "fresh install" state.
update-test-mmdb: ## Update test GeoIP and ASN Databases
curl -L https://raw.githubusercontent.com/maxmind/MaxMind-DB/refs/heads/main/test-data/GeoLite2-ASN-Test.mmdb -o ${PWD}/tests/GeoLite2-ASN-Test.mmdb
curl -L https://raw.githubusercontent.com/maxmind/MaxMind-DB/refs/heads/main/test-data/GeoLite2-City-Test.mmdb -o ${PWD}/tests/GeoLite2-City-Test.mmdb
######################### #########################
## API Schema ## API Schema
######################### #########################
@ -98,11 +110,11 @@ gen-build: ## Extract the schema from the database
AUTHENTIK_DEBUG=true \ AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \ AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \ AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
uv run ak make_blueprint_schema --file blueprints/schema.json ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_DEBUG=true \ AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \ AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \ AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
uv run ak spectacular --file schema.yml ak spectacular --file schema.yml
gen-changelog: ## (Release) generate the changelog based from the commits since the last tag gen-changelog: ## (Release) generate the changelog based from the commits since the last tag
git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md
@ -122,19 +134,14 @@ gen-diff: ## (Release) generate the changelog diff between the current schema a
npx prettier --write diff.md npx prettier --write diff.md
gen-clean-ts: ## Remove generated API client for Typescript gen-clean-ts: ## Remove generated API client for Typescript
rm -rf ${PWD}/${GEN_API_TS}/ rm -rf ./${GEN_API_TS}/
rm -rf ${PWD}/web/node_modules/@goauthentik/api/ rm -rf ./web/node_modules/@goauthentik/api/
gen-clean-go: ## Remove generated API client for Go gen-clean-go: ## Remove generated API client for Go
mkdir -p ${PWD}/${GEN_API_GO} rm -rf ./${GEN_API_GO}/
ifneq ($(wildcard ${PWD}/${GEN_API_GO}/.*),)
make -C ${PWD}/${GEN_API_GO} clean
else
rm -rf ${PWD}/${GEN_API_GO}
endif
gen-clean-py: ## Remove generated API client for Python gen-clean-py: ## Remove generated API client for Python
rm -rf ${PWD}/${GEN_API_PY}/ rm -rf ./${GEN_API_PY}/
gen-clean: gen-clean-ts gen-clean-go gen-clean-py ## Remove generated API clients gen-clean: gen-clean-ts gen-clean-go gen-clean-py ## Remove generated API clients
@ -142,7 +149,7 @@ gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescri
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.11.0 generate \ docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \ -i /local/schema.yml \
-g typescript-fetch \ -g typescript-fetch \
-o /local/${GEN_API_TS} \ -o /local/${GEN_API_TS} \
@ -150,15 +157,15 @@ gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescri
--additional-properties=npmVersion=${NPM_VERSION} \ --additional-properties=npmVersion=${NPM_VERSION} \
--git-repo-id authentik \ --git-repo-id authentik \
--git-user-id goauthentik --git-user-id goauthentik
mkdir -p web/node_modules/@goauthentik/api
cd ${PWD}/${GEN_API_TS} && npm link cd ./${GEN_API_TS} && npm i
cd ${PWD}/web && npm link @goauthentik/api \cp -rf ./${GEN_API_TS}/* web/node_modules/@goauthentik/api
gen-client-py: gen-clean-py ## Build and install the authentik API for Python gen-client-py: gen-clean-py ## Build and install the authentik API for Python
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.11.0 generate \ docker.io/openapitools/openapi-generator-cli:v7.4.0 generate \
-i /local/schema.yml \ -i /local/schema.yml \
-g python \ -g python \
-o /local/${GEN_API_PY} \ -o /local/${GEN_API_PY} \
@ -166,20 +173,27 @@ gen-client-py: gen-clean-py ## Build and install the authentik API for Python
--additional-properties=packageVersion=${NPM_VERSION} \ --additional-properties=packageVersion=${NPM_VERSION} \
--git-repo-id authentik \ --git-repo-id authentik \
--git-user-id goauthentik --git-user-id goauthentik
pip install ./${GEN_API_PY}
gen-client-go: gen-clean-go ## Build and install the authentik API for Golang gen-client-go: gen-clean-go ## Build and install the authentik API for Golang
mkdir -p ${PWD}/${GEN_API_GO} mkdir -p ./${GEN_API_GO} ./${GEN_API_GO}/templates
ifeq ($(wildcard ${PWD}/${GEN_API_GO}/.*),) wget https://raw.githubusercontent.com/goauthentik/client-go/main/config.yaml -O ./${GEN_API_GO}/config.yaml
git clone --depth 1 https://github.com/goauthentik/client-go.git ${PWD}/${GEN_API_GO} wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/README.mustache -O ./${GEN_API_GO}/templates/README.mustache
else wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/go.mod.mustache -O ./${GEN_API_GO}/templates/go.mod.mustache
cd ${PWD}/${GEN_API_GO} && git pull cp schema.yml ./${GEN_API_GO}/
endif docker run \
cp ${PWD}/schema.yml ${PWD}/${GEN_API_GO} --rm -v ${PWD}/${GEN_API_GO}:/local \
make -C ${PWD}/${GEN_API_GO} build --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \
-g go \
-o /local/ \
-c /local/config.yaml
go mod edit -replace goauthentik.io/api/v3=./${GEN_API_GO} go mod edit -replace goauthentik.io/api/v3=./${GEN_API_GO}
rm -rf ./${GEN_API_GO}/config.yaml ./${GEN_API_GO}/templates/
gen-dev-config: ## Generate a local development config file gen-dev-config: ## Generate a local development config file
uv run scripts/generate_config.py python -m scripts.generate_config
gen: gen-build gen-client-ts gen: gen-build gen-client-ts
@ -238,6 +252,9 @@ website-build:
website-watch: ## Build and watch the documentation website, updating automatically website-watch: ## Build and watch the documentation website, updating automatically
cd website && npm run watch cd website && npm run watch
aws-cfn:
cd website && npm run aws-cfn
######################### #########################
## Docker ## Docker
######################### #########################
@ -246,9 +263,6 @@ docker: ## Build a docker image of the current source tree
mkdir -p ${GEN_API_TS} mkdir -p ${GEN_API_TS}
DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE} DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE}
test-docker:
BUILD=true ${PWD}/scripts/test_docker.sh
######################### #########################
## CI ## CI
######################### #########################
@ -260,21 +274,16 @@ ci--meta-debug:
node --version node --version
ci-black: ci--meta-debug ci-black: ci--meta-debug
uv run black --check $(PY_SOURCES) black --check $(PY_SOURCES)
ci-ruff: ci--meta-debug ci-ruff: ci--meta-debug
uv run ruff check $(PY_SOURCES) ruff check $(PY_SOURCES)
ci-codespell: ci--meta-debug ci-codespell: ci--meta-debug
uv run codespell -s codespell $(CODESPELL_ARGS) -s
ci-bandit: ci--meta-debug ci-bandit: ci--meta-debug
uv run bandit -r $(PY_SOURCES) bandit -r $(PY_SOURCES)
ci-pending-migrations: ci--meta-debug ci-pending-migrations: ci--meta-debug
uv run ak makemigrations --check ak makemigrations --check
ci-test: ci--meta-debug
uv run coverage run manage.py test --keepdb --randomly-seed ${CI_TEST_SEED} authentik
uv run coverage report
uv run coverage xml
@ -42,4 +42,4 @@ See [SECURITY.md](SECURITY.md)
## Adoption and Contributions ## Adoption and Contributions
Your organization uses authentik? We'd love to add your logo to the readme and our website! Email us @ hello@goauthentik.io or open a GitHub Issue/PR! For more information on how to contribute to authentik, please refer to our [contribution guide](https://docs.goauthentik.io/docs/developer-docs?utm_source=github). Your organization uses authentik? We'd love to add your logo to the readme and our website! Email us @ hello@goauthentik.io or open a GitHub Issue/PR! For more information on how to contribute to authentik, please refer to our [CONTRIBUTING.md file](./CONTRIBUTING.md).
@ -2,7 +2,7 @@ authentik takes security very seriously. We follow the rules of [responsible di
## Independent audits and pentests ## Independent audits and pentests
We are committed to engaging in regular pentesting and security audits of authentik. Defining and adhering to a cadence of external testing ensures a stronger probability that our code base, our features, and our architecture is as secure and non-exploitable as possible. For more details about specific audits and pentests, refer to "Audits and Certificates" in our [Security documentation](https://docs.goauthentik.io/docs/security). We are committed to engaging in regular pentesting and security audits of authentik. Defining and adhering to a cadence of external testing ensures a stronger probability that our code base, our features, and our architecture is as secure and non-exploitable as possible. For more details about specfic audits and pentests, refer to "Audits and Certificates" in our [Security documentation](https://docs.goauthentik.io/docs/security).
## What authentik classifies as a CVE ## What authentik classifies as a CVE
@ -20,8 +20,8 @@ Even if the issue is not a CVE, we still greatly appreciate your help in hardeni
| Version | Supported | | Version | Supported |
| --------- | --------- | | --------- | --------- |
| 2025.4.x | ✅ | | 2024.10.x | ✅ |
| 2025.6.x | ✅ | | 2024.12.x | ✅ |
## Reporting a Vulnerability ## Reporting a Vulnerability
@ -2,7 +2,7 @@
from os import environ from os import environ
__version__ = "2025.6.3" __version__ = "2024.12.2"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH" ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"
@ -0,0 +1,79 @@
"""authentik administration metrics"""
from datetime import timedelta
from django.db.models.functions import ExtractHour
from drf_spectacular.utils import extend_schema, extend_schema_field
from guardian.shortcuts import get_objects_for_user
from rest_framework.fields import IntegerField, SerializerMethodField
from rest_framework.permissions import IsAuthenticated
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.views import APIView
from authentik.core.api.utils import PassiveSerializer
from authentik.events.models import EventAction
class CoordinateSerializer(PassiveSerializer):
"""Coordinates for diagrams"""
x_cord = IntegerField(read_only=True)
y_cord = IntegerField(read_only=True)
class LoginMetricsSerializer(PassiveSerializer):
"""Login Metrics per 1h"""
logins = SerializerMethodField()
logins_failed = SerializerMethodField()
authorizations = SerializerMethodField()
@extend_schema_field(CoordinateSerializer(many=True))
def get_logins(self, _):
"""Get successful logins per 8 hours for the last 7 days"""
user = self.context["user"]
return (
get_objects_for_user(user, "authentik_events.view_event").filter(
action=EventAction.LOGIN
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
@extend_schema_field(CoordinateSerializer(many=True))
def get_logins_failed(self, _):
"""Get failed logins per 8 hours for the last 7 days"""
user = self.context["user"]
return (
get_objects_for_user(user, "authentik_events.view_event").filter(
action=EventAction.LOGIN_FAILED
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
@extend_schema_field(CoordinateSerializer(many=True))
def get_authorizations(self, _):
"""Get successful authorizations per 8 hours for the last 7 days"""
user = self.context["user"]
return (
get_objects_for_user(user, "authentik_events.view_event").filter(
action=EventAction.AUTHORIZE_APPLICATION
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
class AdministrationMetricsViewSet(APIView):
"""Login Metrics per 1h"""
permission_classes = [IsAuthenticated]
@extend_schema(responses={200: LoginMetricsSerializer(many=False)})
def get(self, request: Request) -> Response:
"""Login Metrics per 1h"""
serializer = LoginMetricsSerializer(True)
serializer.context["user"] = request.user
return Response(serializer.data)
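For orientation, a minimal sketch of querying this endpoint, which is registered under "admin/metrics/" in the URL patterns shown later in this comparison. The host, the /api/v3/ prefix, and the token below are placeholders rather than values taken from this diff; the response shape follows the serializers above.

import requests

AUTHENTIK_URL = "https://authentik.example.com"  # placeholder host
TOKEN = "<api-token>"  # placeholder token for a user allowed to view events

resp = requests.get(
    f"{AUTHENTIK_URL}/api/v3/admin/metrics/",  # API prefix assumed, not shown in this diff
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
for point in resp.json()["logins"]:  # also: logins_failed, authorizations
    print(point["x_cord"], point["y_cord"])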
@ -59,7 +59,7 @@ class SystemInfoSerializer(PassiveSerializer):
if not isinstance(value, str): if not isinstance(value, str):
continue continue
actual_value = value actual_value = value
if raw_session is not None and raw_session in actual_value: if raw_session in actual_value:
actual_value = actual_value.replace( actual_value = actual_value.replace(
raw_session, SafeExceptionReporterFilter.cleansed_substitute raw_session, SafeExceptionReporterFilter.cleansed_substitute
) )
@ -1,7 +1,6 @@
"""authentik administration overview""" """authentik administration overview"""
from django.core.cache import cache from django.core.cache import cache
from django_tenants.utils import get_public_schema_name
from drf_spectacular.utils import extend_schema from drf_spectacular.utils import extend_schema
from packaging.version import parse from packaging.version import parse
from rest_framework.fields import SerializerMethodField from rest_framework.fields import SerializerMethodField
@ -14,7 +13,6 @@ from authentik import __version__, get_build_hash
from authentik.admin.tasks import VERSION_CACHE_KEY, VERSION_NULL, update_latest_version from authentik.admin.tasks import VERSION_CACHE_KEY, VERSION_NULL, update_latest_version
from authentik.core.api.utils import PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.outposts.models import Outpost from authentik.outposts.models import Outpost
from authentik.tenants.utils import get_current_tenant
class VersionSerializer(PassiveSerializer): class VersionSerializer(PassiveSerializer):
@ -37,8 +35,6 @@ class VersionSerializer(PassiveSerializer):
def get_version_latest(self, _) -> str: def get_version_latest(self, _) -> str:
"""Get latest version from cache""" """Get latest version from cache"""
if get_current_tenant().schema_name == get_public_schema_name():
return __version__
version_in_cache = cache.get(VERSION_CACHE_KEY) version_in_cache = cache.get(VERSION_CACHE_KEY)
if not version_in_cache: # pragma: no cover if not version_in_cache: # pragma: no cover
update_latest_version.delay() update_latest_version.delay()
@ -14,19 +14,3 @@ class AuthentikAdminConfig(ManagedAppConfig):
label = "authentik_admin" label = "authentik_admin"
verbose_name = "authentik Admin" verbose_name = "authentik Admin"
default = True default = True
@ManagedAppConfig.reconcile_global
def clear_update_notifications(self):
"""Clear update notifications on startup if the notification was for the version
we're running now."""
from packaging.version import parse
from authentik.admin.tasks import LOCAL_VERSION
from authentik.events.models import EventAction, Notification
for notification in Notification.objects.filter(event__action=EventAction.UPDATE_AVAILABLE):
if "new_version" not in notification.event.context:
continue
notification_version = notification.event.context["new_version"]
if LOCAL_VERSION >= parse(notification_version):
notification.delete()
@ -1,7 +1,6 @@
"""authentik admin settings""" """authentik admin settings"""
from celery.schedules import crontab from celery.schedules import crontab
from django_tenants.utils import get_public_schema_name
from authentik.lib.utils.time import fqdn_rand from authentik.lib.utils.time import fqdn_rand
@ -9,7 +8,6 @@ CELERY_BEAT_SCHEDULE = {
"admin_latest_version": { "admin_latest_version": {
"task": "authentik.admin.tasks.update_latest_version", "task": "authentik.admin.tasks.update_latest_version",
"schedule": crontab(minute=fqdn_rand("admin_latest_version"), hour="*"), "schedule": crontab(minute=fqdn_rand("admin_latest_version"), hour="*"),
"tenant_schemas": [get_public_schema_name()],
"options": {"queue": "authentik_scheduled"}, "options": {"queue": "authentik_scheduled"},
} }
} }
@ -1,6 +1,7 @@
"""authentik admin tasks""" """authentik admin tasks"""
from django.core.cache import cache from django.core.cache import cache
from django.db import DatabaseError, InternalError, ProgrammingError
from django.utils.translation import gettext_lazy as _ from django.utils.translation import gettext_lazy as _
from packaging.version import parse from packaging.version import parse
from requests import RequestException from requests import RequestException
@ -8,7 +9,7 @@ from structlog.stdlib import get_logger
from authentik import __version__, get_build_hash from authentik import __version__, get_build_hash
from authentik.admin.apps import PROM_INFO from authentik.admin.apps import PROM_INFO
from authentik.events.models import Event, EventAction from authentik.events.models import Event, EventAction, Notification
from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task
from authentik.lib.config import CONFIG from authentik.lib.config import CONFIG
from authentik.lib.utils.http import get_http_session from authentik.lib.utils.http import get_http_session
@ -32,6 +33,20 @@ def _set_prom_info():
) )
@CELERY_APP.task(
throws=(DatabaseError, ProgrammingError, InternalError),
)
def clear_update_notifications():
"""Clear update notifications on startup if the notification was for the version
we're running now."""
for notification in Notification.objects.filter(event__action=EventAction.UPDATE_AVAILABLE):
if "new_version" not in notification.event.context:
continue
notification_version = notification.event.context["new_version"]
if LOCAL_VERSION >= parse(notification_version):
notification.delete()
@CELERY_APP.task(bind=True, base=SystemTask) @CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task @prefill_task
def update_latest_version(self: SystemTask): def update_latest_version(self: SystemTask):
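The notification-clearing logic above compares versions with packaging.version.parse rather than with plain strings. A small standalone illustration, using example version numbers only:

from packaging.version import parse

LOCAL_VERSION = parse("2024.12.2")  # example value for this sketch

for new_version in ("2024.12.1", "2024.12.2", "2025.2.0"):
    outdated_notification = LOCAL_VERSION >= parse(new_version)
    print(new_version, "cleared" if outdated_notification else "kept")

# A plain string comparison would treat "2025.2.0" as newer than "2025.12.0";
# parse() compares the numeric components and avoids that mistake.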
@ -36,6 +36,11 @@ class TestAdminAPI(TestCase):
body = loads(response.content) body = loads(response.content)
self.assertEqual(len(body), 0) self.assertEqual(len(body), 0)
def test_metrics(self):
"""Test metrics API"""
response = self.client.get(reverse("authentik_api:admin_metrics"))
self.assertEqual(response.status_code, 200)
def test_apps(self): def test_apps(self):
"""Test apps API""" """Test apps API"""
response = self.client.get(reverse("authentik_api:apps-list")) response = self.client.get(reverse("authentik_api:apps-list"))
@ -1,12 +1,12 @@
"""test admin tasks""" """test admin tasks"""
from django.apps import apps
from django.core.cache import cache from django.core.cache import cache
from django.test import TestCase from django.test import TestCase
from requests_mock import Mocker from requests_mock import Mocker
from authentik.admin.tasks import ( from authentik.admin.tasks import (
VERSION_CACHE_KEY, VERSION_CACHE_KEY,
clear_update_notifications,
update_latest_version, update_latest_version,
) )
from authentik.events.models import Event, EventAction from authentik.events.models import Event, EventAction
@ -72,13 +72,12 @@ class TestAdminTasks(TestCase):
def test_clear_update_notifications(self): def test_clear_update_notifications(self):
"""Test clear of previous notification""" """Test clear of previous notification"""
admin_config = apps.get_app_config("authentik_admin")
Event.objects.create( Event.objects.create(
action=EventAction.UPDATE_AVAILABLE, context={"new_version": "99999999.9999999.9999999"} action=EventAction.UPDATE_AVAILABLE, context={"new_version": "99999999.9999999.9999999"}
) )
Event.objects.create(action=EventAction.UPDATE_AVAILABLE, context={"new_version": "1.1.1"}) Event.objects.create(action=EventAction.UPDATE_AVAILABLE, context={"new_version": "1.1.1"})
Event.objects.create(action=EventAction.UPDATE_AVAILABLE, context={}) Event.objects.create(action=EventAction.UPDATE_AVAILABLE, context={})
admin_config.clear_update_notifications() clear_update_notifications()
self.assertFalse( self.assertFalse(
Event.objects.filter( Event.objects.filter(
action=EventAction.UPDATE_AVAILABLE, context__new_version="1.1" action=EventAction.UPDATE_AVAILABLE, context__new_version="1.1"


@ -3,6 +3,7 @@
from django.urls import path from django.urls import path
from authentik.admin.api.meta import AppsViewSet, ModelViewSet from authentik.admin.api.meta import AppsViewSet, ModelViewSet
from authentik.admin.api.metrics import AdministrationMetricsViewSet
from authentik.admin.api.system import SystemView from authentik.admin.api.system import SystemView
from authentik.admin.api.version import VersionView from authentik.admin.api.version import VersionView
from authentik.admin.api.version_history import VersionHistoryViewSet from authentik.admin.api.version_history import VersionHistoryViewSet
@ -11,6 +12,11 @@ from authentik.admin.api.workers import WorkerView
api_urlpatterns = [ api_urlpatterns = [
("admin/apps", AppsViewSet, "apps"), ("admin/apps", AppsViewSet, "apps"),
("admin/models", ModelViewSet, "models"), ("admin/models", ModelViewSet, "models"),
path(
"admin/metrics/",
AdministrationMetricsViewSet.as_view(),
name="admin_metrics",
),
path("admin/version/", VersionView.as_view(), name="admin_version"), path("admin/version/", VersionView.as_view(), name="admin_version"),
("admin/version/history", VersionHistoryViewSet, "version_history"), ("admin/version/history", VersionHistoryViewSet, "version_history"),
path("admin/workers/", WorkerView.as_view(), name="admin_workers"), path("admin/workers/", WorkerView.as_view(), name="admin_workers"),
@ -1,13 +1,12 @@
"""authentik API AppConfig""" """authentik API AppConfig"""
from authentik.blueprints.apps import ManagedAppConfig from django.apps import AppConfig
class AuthentikAPIConfig(ManagedAppConfig): class AuthentikAPIConfig(AppConfig):
"""authentik API Config""" """authentik API Config"""
name = "authentik.api" name = "authentik.api"
label = "authentik_api" label = "authentik_api"
mountpoint = "api/" mountpoint = "api/"
verbose_name = "authentik API" verbose_name = "authentik API"
default = True
@ -1,12 +1,9 @@
"""API Authentication""" """API Authentication"""
from hmac import compare_digest from hmac import compare_digest
from pathlib import Path
from tempfile import gettempdir
from typing import Any from typing import Any
from django.conf import settings from django.conf import settings
from django.contrib.auth.models import AnonymousUser
from drf_spectacular.extensions import OpenApiAuthenticationExtension from drf_spectacular.extensions import OpenApiAuthenticationExtension
from rest_framework.authentication import BaseAuthentication, get_authorization_header from rest_framework.authentication import BaseAuthentication, get_authorization_header
from rest_framework.exceptions import AuthenticationFailed from rest_framework.exceptions import AuthenticationFailed
@ -14,17 +11,11 @@ from rest_framework.request import Request
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik.core.middleware import CTX_AUTH_VIA from authentik.core.middleware import CTX_AUTH_VIA
from authentik.core.models import Token, TokenIntents, User, UserTypes from authentik.core.models import Token, TokenIntents, User
from authentik.outposts.models import Outpost from authentik.outposts.models import Outpost
from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
LOGGER = get_logger() LOGGER = get_logger()
_tmp = Path(gettempdir())
try:
with open(_tmp / "authentik-core-ipc.key") as _f:
ipc_key = _f.read()
except OSError:
ipc_key = None
def validate_auth(header: bytes) -> str | None: def validate_auth(header: bytes) -> str | None:
@ -82,11 +73,6 @@ def auth_user_lookup(raw_header: bytes) -> User | None:
if user: if user:
CTX_AUTH_VIA.set("secret_key") CTX_AUTH_VIA.set("secret_key")
return user return user
# then try to auth via secret key (for embedded outpost/etc)
user = token_ipc(auth_credentials)
if user:
CTX_AUTH_VIA.set("ipc")
return user
raise AuthenticationFailed("Token invalid/expired") raise AuthenticationFailed("Token invalid/expired")
@ -104,43 +90,6 @@ def token_secret_key(value: str) -> User | None:
return outpost.user return outpost.user
class IPCUser(AnonymousUser):
"""'Virtual' user for IPC communication between authentik core and the authentik router"""
username = "authentik:system"
is_active = True
is_superuser = True
@property
def type(self):
return UserTypes.INTERNAL_SERVICE_ACCOUNT
def has_perm(self, perm, obj=None):
return True
def has_perms(self, perm_list, obj=None):
return True
def has_module_perms(self, module):
return True
@property
def is_anonymous(self):
return False
@property
def is_authenticated(self):
return True
def token_ipc(value: str) -> User | None:
"""Check if the token is the secret key
and return the service account for the managed outpost"""
if not ipc_key or not compare_digest(value, ipc_key):
return None
return IPCUser()
class TokenAuthentication(BaseAuthentication): class TokenAuthentication(BaseAuthentication):
"""Token-based authentication using HTTP Bearer authentication""" """Token-based authentication using HTTP Bearer authentication"""
@ -54,7 +54,7 @@ def create_component(generator: SchemaGenerator, name, schema, type_=ResolvedCom
return component return component
def postprocess_schema_responses(result, generator: SchemaGenerator, **kwargs): def postprocess_schema_responses(result, generator: SchemaGenerator, **kwargs): # noqa: W0613
"""Workaround to set a default response for endpoints. """Workaround to set a default response for endpoints.
Workaround suggested at Workaround suggested at
<https://github.com/tfranzel/drf-spectacular/issues/119#issuecomment-656970357> <https://github.com/tfranzel/drf-spectacular/issues/119#issuecomment-656970357>
@ -7,7 +7,7 @@ from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField, DateTimeField from rest_framework.fields import CharField, DateTimeField
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.serializers import ListSerializer from rest_framework.serializers import ListSerializer, ModelSerializer
from rest_framework.viewsets import ModelViewSet from rest_framework.viewsets import ModelViewSet
from authentik.blueprints.models import BlueprintInstance from authentik.blueprints.models import BlueprintInstance
@ -15,7 +15,7 @@ from authentik.blueprints.v1.importer import Importer
from authentik.blueprints.v1.oci import OCI_PREFIX from authentik.blueprints.v1.oci import OCI_PREFIX
from authentik.blueprints.v1.tasks import apply_blueprint, blueprints_find_dict from authentik.blueprints.v1.tasks import apply_blueprint, blueprints_find_dict
from authentik.core.api.used_by import UsedByMixin from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import JSONDictField, ModelSerializer, PassiveSerializer from authentik.core.api.utils import JSONDictField, PassiveSerializer
from authentik.rbac.decorators import permission_required from authentik.rbac.decorators import permission_required
@ -72,33 +72,20 @@ class Command(BaseCommand):
"additionalProperties": True, "additionalProperties": True,
}, },
"entries": { "entries": {
"anyOf": [ "type": "array",
{ "items": {
"type": "array", "oneOf": [],
"items": {"$ref": "#/$defs/blueprint_entry"}, },
},
{
"type": "object",
"additionalProperties": {
"type": "array",
"items": {"$ref": "#/$defs/blueprint_entry"},
},
},
],
}, },
}, },
"$defs": {"blueprint_entry": {"oneOf": []}}, "$defs": {},
} }
def add_arguments(self, parser):
parser.add_argument("--file", type=str)
@no_translations @no_translations
def handle(self, *args, file: str, **options): def handle(self, *args, **options):
"""Generate JSON Schema for blueprints""" """Generate JSON Schema for blueprints"""
self.build() self.build()
with open(file, "w") as _schema: self.stdout.write(dumps(self.schema, indent=4, default=Command.json_default))
_schema.write(dumps(self.schema, indent=4, default=Command.json_default))
@staticmethod @staticmethod
def json_default(value: Any) -> Any: def json_default(value: Any) -> Any:
@ -125,7 +112,7 @@ class Command(BaseCommand):
} }
) )
model_path = f"{model._meta.app_label}.{model._meta.model_name}" model_path = f"{model._meta.app_label}.{model._meta.model_name}"
self.schema["$defs"]["blueprint_entry"]["oneOf"].append( self.schema["properties"]["entries"]["items"]["oneOf"].append(
self.template_entry(model_path, model, serializer) self.template_entry(model_path, model, serializer)
) )
@ -147,7 +134,7 @@ class Command(BaseCommand):
"id": {"type": "string"}, "id": {"type": "string"},
"state": { "state": {
"type": "string", "type": "string",
"enum": sorted([s.value for s in BlueprintEntryDesiredState]), "enum": [s.value for s in BlueprintEntryDesiredState],
"default": "present", "default": "present",
}, },
"conditions": {"type": "array", "items": {"type": "boolean"}}, "conditions": {"type": "array", "items": {"type": "boolean"}},
@ -218,7 +205,7 @@ class Command(BaseCommand):
"type": "object", "type": "object",
"required": ["permission"], "required": ["permission"],
"properties": { "properties": {
"permission": {"type": "string", "enum": sorted(perms)}, "permission": {"type": "string", "enum": perms},
"user": {"type": "integer"}, "user": {"type": "integer"},
"role": {"type": "string"}, "role": {"type": "string"},
}, },
@ -1,11 +1,10 @@
version: 1 version: 1
entries: entries:
foo: - identifiers:
- identifiers: name: "%(id)s"
name: "%(id)s" slug: "%(id)s"
slug: "%(id)s" model: authentik_flows.flow
model: authentik_flows.flow state: present
state: present attrs:
attrs: designation: stage_configuration
designation: stage_configuration title: foo
title: foo
@ -37,7 +37,6 @@ entries:
- attrs: - attrs:
attributes: attributes:
env_null: !Env [bar-baz, null] env_null: !Env [bar-baz, null]
json_parse: !ParseJSON '{"foo": "bar"}'
policy_pk1: policy_pk1:
!Format [ !Format [
"%s-%s", "%s-%s",
@ -1,14 +0,0 @@
from django.test import TestCase
from authentik.blueprints.apps import ManagedAppConfig
from authentik.enterprise.apps import EnterpriseConfig
from authentik.lib.utils.reflection import get_apps
class TestManagedAppConfig(TestCase):
def test_apps_use_managed_app_config(self):
for app in get_apps():
if app.name.startswith("authentik.enterprise"):
self.assertIn(EnterpriseConfig, app.__class__.__bases__)
else:
self.assertIn(ManagedAppConfig, app.__class__.__bases__)
@ -35,6 +35,6 @@ def blueprint_tester(file_name: Path) -> Callable:
for blueprint_file in Path("blueprints/").glob("**/*.yaml"): for blueprint_file in Path("blueprints/").glob("**/*.yaml"):
if "local" in str(blueprint_file) or "testing" in str(blueprint_file): if "local" in str(blueprint_file):
continue continue
setattr(TestPackaged, f"test_blueprint_{blueprint_file}", blueprint_tester(blueprint_file)) setattr(TestPackaged, f"test_blueprint_{blueprint_file}", blueprint_tester(blueprint_file))
@ -5,6 +5,7 @@ from collections.abc import Callable
from django.apps import apps from django.apps import apps
from django.test import TestCase from django.test import TestCase
from authentik.blueprints.v1.importer import is_model_allowed
from authentik.lib.models import SerializerModel from authentik.lib.models import SerializerModel
from authentik.providers.oauth2.models import RefreshToken from authentik.providers.oauth2.models import RefreshToken
@ -21,13 +22,10 @@ def serializer_tester_factory(test_model: type[SerializerModel]) -> Callable:
return return
model_class = test_model() model_class = test_model()
self.assertTrue(isinstance(model_class, SerializerModel)) self.assertTrue(isinstance(model_class, SerializerModel))
# Models that have subclasses don't have to have a serializer
if len(test_model.__subclasses__()) > 0:
return
self.assertIsNotNone(model_class.serializer) self.assertIsNotNone(model_class.serializer)
if model_class.serializer.Meta().model == RefreshToken: if model_class.serializer.Meta().model == RefreshToken:
return return
self.assertTrue(issubclass(test_model, model_class.serializer.Meta().model)) self.assertEqual(model_class.serializer.Meta().model, test_model)
return tester return tester
@ -36,6 +34,6 @@ for app in apps.get_app_configs():
if not app.label.startswith("authentik"): if not app.label.startswith("authentik"):
continue continue
for model in app.get_models(): for model in app.get_models():
if not issubclass(model, SerializerModel): if not is_model_allowed(model):
continue continue
setattr(TestModels, f"test_{app.label}_{model.__name__}", serializer_tester_factory(model)) setattr(TestModels, f"test_{app.label}_{model.__name__}", serializer_tester_factory(model))
@ -215,7 +215,6 @@ class TestBlueprintsV1(TransactionTestCase):
}, },
"nested_context": "context-nested-value", "nested_context": "context-nested-value",
"env_null": None, "env_null": None,
"json_parse": {"foo": "bar"},
"at_index_sequence": "foo", "at_index_sequence": "foo",
"at_index_sequence_default": "non existent", "at_index_sequence_default": "non existent",
"at_index_mapping": 2, "at_index_mapping": 2,
@ -6,7 +6,6 @@ from copy import copy
from dataclasses import asdict, dataclass, field, is_dataclass from dataclasses import asdict, dataclass, field, is_dataclass
from enum import Enum from enum import Enum
from functools import reduce from functools import reduce
from json import JSONDecodeError, loads
from operator import ixor from operator import ixor
from os import getenv from os import getenv
from typing import Any, Literal, Union from typing import Any, Literal, Union
@ -165,7 +164,9 @@ class BlueprintEntry:
"""Get the blueprint model, with yaml tags resolved if present""" """Get the blueprint model, with yaml tags resolved if present"""
return str(self.tag_resolver(self.model, blueprint)) return str(self.tag_resolver(self.model, blueprint))
def get_permissions(self, blueprint: "Blueprint") -> Generator[BlueprintEntryPermission]: def get_permissions(
self, blueprint: "Blueprint"
) -> Generator[BlueprintEntryPermission, None, None]:
"""Get permissions of this entry, with all yaml tags resolved""" """Get permissions of this entry, with all yaml tags resolved"""
for perm in self.permissions: for perm in self.permissions:
yield BlueprintEntryPermission( yield BlueprintEntryPermission(
@ -192,18 +193,11 @@ class Blueprint:
"""Dataclass used for a full export""" """Dataclass used for a full export"""
version: int = field(default=1) version: int = field(default=1)
entries: list[BlueprintEntry] | dict[str, list[BlueprintEntry]] = field(default_factory=list) entries: list[BlueprintEntry] = field(default_factory=list)
context: dict = field(default_factory=dict) context: dict = field(default_factory=dict)
metadata: BlueprintMetadata | None = field(default=None) metadata: BlueprintMetadata | None = field(default=None)
def iter_entries(self) -> Iterable[BlueprintEntry]:
if isinstance(self.entries, dict):
for _section, entries in self.entries.items():
yield from entries
else:
yield from self.entries
class YAMLTag: class YAMLTag:
"""Base class for all YAML Tags""" """Base class for all YAML Tags"""
@ -234,7 +228,7 @@ class KeyOf(YAMLTag):
self.id_from = node.value self.id_from = node.value
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any: def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
for _entry in blueprint.iter_entries(): for _entry in blueprint.entries:
if _entry.id == self.id_from and _entry._state.instance: if _entry.id == self.id_from and _entry._state.instance:
# Special handling for PolicyBindingModels, as they'll have a different PK # Special handling for PolicyBindingModels, as they'll have a different PK
# which is used when creating policy bindings # which is used when creating policy bindings
@ -292,22 +286,6 @@ class Context(YAMLTag):
return value return value
class ParseJSON(YAMLTag):
"""Parse JSON from context/env/etc value"""
raw: str
def __init__(self, loader: "BlueprintLoader", node: ScalarNode) -> None:
super().__init__()
self.raw = node.value
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
try:
return loads(self.raw)
except JSONDecodeError as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc
class Format(YAMLTag): class Format(YAMLTag):
"""Format a string""" """Format a string"""
@ -683,7 +661,6 @@ class BlueprintLoader(SafeLoader):
self.add_constructor("!Value", Value) self.add_constructor("!Value", Value)
self.add_constructor("!Index", Index) self.add_constructor("!Index", Index)
self.add_constructor("!AtIndex", AtIndex) self.add_constructor("!AtIndex", AtIndex)
self.add_constructor("!ParseJSON", ParseJSON)
class EntryInvalidError(SentryIgnoredException): class EntryInvalidError(SentryIgnoredException):
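As a standalone illustration (not authentik code) of the flattening performed by the iter_entries helper above when entries is either a flat list or a mapping of named sections to lists:

def iter_entries(entries):
    # Yield entries whether they are stored as a list or as a dict of lists.
    if isinstance(entries, dict):
        for _section, section_entries in entries.items():
            yield from section_entries
    else:
        yield from entries

print(list(iter_entries({"foo": [1, 2], "bar": [3]})))  # [1, 2, 3]
print(list(iter_entries([4, 5])))  # [4, 5]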
@ -36,7 +36,6 @@ from authentik.core.models import (
GroupSourceConnection, GroupSourceConnection,
PropertyMapping, PropertyMapping,
Provider, Provider,
Session,
Source, Source,
User, User,
UserSourceConnection, UserSourceConnection,
@ -51,7 +50,7 @@ from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProviderGroup, MicrosoftEntraProviderGroup,
MicrosoftEntraProviderUser, MicrosoftEntraProviderUser,
) )
-from authentik.enterprise.providers.ssf.models import StreamEvent
+from authentik.enterprise.providers.rac.models import ConnectionToken
from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import ( from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import (
EndpointDevice, EndpointDevice,
EndpointDeviceConnection, EndpointDeviceConnection,
@ -72,7 +71,6 @@ from authentik.providers.oauth2.models import (
DeviceToken, DeviceToken,
RefreshToken, RefreshToken,
) )
from authentik.providers.rac.models import ConnectionToken
from authentik.providers.scim.models import SCIMProviderGroup, SCIMProviderUser from authentik.providers.scim.models import SCIMProviderGroup, SCIMProviderUser
from authentik.rbac.models import Role from authentik.rbac.models import Role
from authentik.sources.scim.models import SCIMSourceGroup, SCIMSourceUser from authentik.sources.scim.models import SCIMSourceGroup, SCIMSourceUser
@ -109,7 +107,6 @@ def excluded_models() -> list[type[Model]]:
Policy, Policy,
PolicyBindingModel, PolicyBindingModel,
# Classes that have other dependencies # Classes that have other dependencies
Session,
AuthenticatedSession, AuthenticatedSession,
# Classes which are only internally managed # Classes which are only internally managed
# FIXME: these shouldn't need to be explicitly listed, but rather based off of a mixin # FIXME: these shouldn't need to be explicitly listed, but rather based off of a mixin
@ -134,7 +131,6 @@ def excluded_models() -> list[type[Model]]:
EndpointDevice, EndpointDevice,
EndpointDeviceConnection, EndpointDeviceConnection,
DeviceToken, DeviceToken,
StreamEvent,
) )
@ -384,7 +380,7 @@ class Importer:
def _apply_models(self, raise_errors=False) -> bool: def _apply_models(self, raise_errors=False) -> bool:
"""Apply (create/update) models yaml""" """Apply (create/update) models yaml"""
self.__pk_map = {} self.__pk_map = {}
for entry in self._import.iter_entries(): for entry in self._import.entries:
model_app_label, model_name = entry.get_model(self._import).split(".") model_app_label, model_name = entry.get_model(self._import).split(".")
try: try:
model: type[SerializerModel] = registry.get_model(model_app_label, model_name) model: type[SerializerModel] = registry.get_model(model_app_label, model_name)

@ -47,7 +47,7 @@ class MetaModelRegistry:
models = apps.get_models() models = apps.get_models()
for _, value in self.models.items(): for _, value in self.models.items():
models.append(value) models.append(value)
-        return sorted(models, key=str)
+        return models
def get_model(self, app_label: str, model_id: str) -> type[Model]: def get_model(self, app_label: str, model_id: str) -> type[Model]:
"""Get model checks if any virtual models are registered, and falls back """Get model checks if any virtual models are registered, and falls back

@ -49,8 +49,6 @@ class BrandSerializer(ModelSerializer):
"branding_title", "branding_title",
"branding_logo", "branding_logo",
"branding_favicon", "branding_favicon",
"branding_custom_css",
"branding_default_flow_background",
"flow_authentication", "flow_authentication",
"flow_invalidation", "flow_invalidation",
"flow_recovery", "flow_recovery",
@ -59,7 +57,6 @@ class BrandSerializer(ModelSerializer):
"flow_device_code", "flow_device_code",
"default_application", "default_application",
"web_certificate", "web_certificate",
"client_certificates",
"attributes", "attributes",
] ]
extra_kwargs = { extra_kwargs = {
@ -89,7 +86,6 @@ class CurrentBrandSerializer(PassiveSerializer):
branding_title = CharField() branding_title = CharField()
branding_logo = CharField(source="branding_logo_url") branding_logo = CharField(source="branding_logo_url")
branding_favicon = CharField(source="branding_favicon_url") branding_favicon = CharField(source="branding_favicon_url")
branding_custom_css = CharField()
ui_footer_links = ListField( ui_footer_links = ListField(
child=FooterLinkSerializer(), child=FooterLinkSerializer(),
read_only=True, read_only=True,
@ -121,7 +117,6 @@ class BrandViewSet(UsedByMixin, ModelViewSet):
"domain", "domain",
"branding_title", "branding_title",
"web_certificate__name", "web_certificate__name",
"client_certificates__name",
] ]
filterset_fields = [ filterset_fields = [
"brand_uuid", "brand_uuid",
@ -130,7 +125,6 @@ class BrandViewSet(UsedByMixin, ModelViewSet):
"branding_title", "branding_title",
"branding_logo", "branding_logo",
"branding_favicon", "branding_favicon",
"branding_default_flow_background",
"flow_authentication", "flow_authentication",
"flow_invalidation", "flow_invalidation",
"flow_recovery", "flow_recovery",
@ -138,7 +132,6 @@ class BrandViewSet(UsedByMixin, ModelViewSet):
"flow_user_settings", "flow_user_settings",
"flow_device_code", "flow_device_code",
"web_certificate", "web_certificate",
"client_certificates",
] ]
ordering = ["domain"] ordering = ["domain"]

@ -1,9 +1,9 @@
"""authentik brands app""" """authentik brands app"""
-from authentik.blueprints.apps import ManagedAppConfig
+from django.apps import AppConfig
-class AuthentikBrandsConfig(ManagedAppConfig):
+class AuthentikBrandsConfig(AppConfig):
"""authentik Brand app""" """authentik Brand app"""
name = "authentik.brands" name = "authentik.brands"
@ -12,4 +12,3 @@ class AuthentikBrandsConfig(ManagedAppConfig):
mountpoints = { mountpoints = {
"authentik.brands.urls_root": "", "authentik.brands.urls_root": "",
} }
default = True

@ -1,35 +0,0 @@
# Generated by Django 5.0.12 on 2025-02-22 01:51
from pathlib import Path
from django.db import migrations, models
from django.apps.registry import Apps
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
def migrate_custom_css(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
Brand = apps.get_model("authentik_brands", "brand")
db_alias = schema_editor.connection.alias
path = Path("/web/dist/custom.css")
if not path.exists():
return
css = path.read_text()
Brand.objects.using(db_alias).all().update(branding_custom_css=css)
class Migration(migrations.Migration):
dependencies = [
("authentik_brands", "0007_brand_default_application"),
]
operations = [
migrations.AddField(
model_name="brand",
name="branding_custom_css",
field=models.TextField(blank=True, default=""),
),
migrations.RunPython(migrate_custom_css),
]

@ -1,18 +0,0 @@
# Generated by Django 5.0.13 on 2025-03-19 22:54
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_brands", "0008_brand_branding_custom_css"),
]
operations = [
migrations.AddField(
model_name="brand",
name="branding_default_flow_background",
field=models.TextField(default="/static/dist/assets/images/flow_background.jpg"),
),
]

@ -1,37 +0,0 @@
# Generated by Django 5.1.9 on 2025-05-19 15:09
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_brands", "0009_brand_branding_default_flow_background"),
("authentik_crypto", "0004_alter_certificatekeypair_name"),
]
operations = [
migrations.AddField(
model_name="brand",
name="client_certificates",
field=models.ManyToManyField(
blank=True,
default=None,
help_text="Certificates used for client authentication.",
to="authentik_crypto.certificatekeypair",
),
),
migrations.AlterField(
model_name="brand",
name="web_certificate",
field=models.ForeignKey(
default=None,
help_text="Web Certificate used by the authentik Core webserver.",
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
related_name="+",
to="authentik_crypto.certificatekeypair",
),
),
]

@ -33,10 +33,6 @@ class Brand(SerializerModel):
branding_logo = models.TextField(default="/static/dist/assets/icons/icon_left_brand.svg") branding_logo = models.TextField(default="/static/dist/assets/icons/icon_left_brand.svg")
branding_favicon = models.TextField(default="/static/dist/assets/icons/icon.png") branding_favicon = models.TextField(default="/static/dist/assets/icons/icon.png")
branding_custom_css = models.TextField(default="", blank=True)
branding_default_flow_background = models.TextField(
default="/static/dist/assets/images/flow_background.jpg"
)
flow_authentication = models.ForeignKey( flow_authentication = models.ForeignKey(
Flow, null=True, on_delete=models.SET_NULL, related_name="brand_authentication" Flow, null=True, on_delete=models.SET_NULL, related_name="brand_authentication"
@ -73,13 +69,6 @@ class Brand(SerializerModel):
default=None, default=None,
on_delete=models.SET_DEFAULT, on_delete=models.SET_DEFAULT,
help_text=_("Web Certificate used by the authentik Core webserver."), help_text=_("Web Certificate used by the authentik Core webserver."),
related_name="+",
)
client_certificates = models.ManyToManyField(
CertificateKeyPair,
default=None,
blank=True,
help_text=_("Certificates used for client authentication."),
) )
attributes = models.JSONField(default=dict, blank=True) attributes = models.JSONField(default=dict, blank=True)
@ -95,12 +84,6 @@ class Brand(SerializerModel):
return CONFIG.get("web.path", "/")[:-1] + self.branding_favicon return CONFIG.get("web.path", "/")[:-1] + self.branding_favicon
return self.branding_favicon return self.branding_favicon
def branding_default_flow_background_url(self) -> str:
"""Get branding_default_flow_background with the correct prefix"""
if self.branding_default_flow_background.startswith("/static"):
return CONFIG.get("web.path", "/")[:-1] + self.branding_default_flow_background
return self.branding_default_flow_background
@property @property
def serializer(self) -> Serializer: def serializer(self) -> Serializer:
from authentik.brands.api import BrandSerializer from authentik.brands.api import BrandSerializer

@ -24,7 +24,6 @@ class TestBrands(APITestCase):
"branding_logo": "/static/dist/assets/icons/icon_left_brand.svg", "branding_logo": "/static/dist/assets/icons/icon_left_brand.svg",
"branding_favicon": "/static/dist/assets/icons/icon.png", "branding_favicon": "/static/dist/assets/icons/icon.png",
"branding_title": "authentik", "branding_title": "authentik",
"branding_custom_css": "",
"matched_domain": brand.domain, "matched_domain": brand.domain,
"ui_footer_links": [], "ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC, "ui_theme": Themes.AUTOMATIC,
@ -44,7 +43,6 @@ class TestBrands(APITestCase):
"branding_logo": "/static/dist/assets/icons/icon_left_brand.svg", "branding_logo": "/static/dist/assets/icons/icon_left_brand.svg",
"branding_favicon": "/static/dist/assets/icons/icon.png", "branding_favicon": "/static/dist/assets/icons/icon.png",
"branding_title": "custom", "branding_title": "custom",
"branding_custom_css": "",
"matched_domain": "bar.baz", "matched_domain": "bar.baz",
"ui_footer_links": [], "ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC, "ui_theme": Themes.AUTOMATIC,
@ -61,7 +59,6 @@ class TestBrands(APITestCase):
"branding_logo": "/static/dist/assets/icons/icon_left_brand.svg", "branding_logo": "/static/dist/assets/icons/icon_left_brand.svg",
"branding_favicon": "/static/dist/assets/icons/icon.png", "branding_favicon": "/static/dist/assets/icons/icon.png",
"branding_title": "authentik", "branding_title": "authentik",
"branding_custom_css": "",
"matched_domain": "fallback", "matched_domain": "fallback",
"ui_footer_links": [], "ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC, "ui_theme": Themes.AUTOMATIC,
@ -124,38 +121,3 @@ class TestBrands(APITestCase):
"subject": None, "subject": None,
}, },
) )
def test_branding_url(self):
"""Test branding attributes return correct values"""
brand = create_test_brand()
brand.branding_default_flow_background = "https://goauthentik.io/img/icon.png"
brand.branding_favicon = "https://goauthentik.io/img/icon.png"
brand.branding_logo = "https://goauthentik.io/img/icon.png"
brand.save()
self.assertEqual(
brand.branding_default_flow_background_url(), "https://goauthentik.io/img/icon.png"
)
self.assertJSONEqual(
self.client.get(reverse("authentik_api:brand-current")).content.decode(),
{
"branding_logo": "https://goauthentik.io/img/icon.png",
"branding_favicon": "https://goauthentik.io/img/icon.png",
"branding_title": "authentik",
"branding_custom_css": "",
"matched_domain": brand.domain,
"ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC,
"default_locale": "",
},
)
def test_custom_css(self):
"""Test custom_css"""
brand = create_test_brand()
brand.branding_custom_css = """* {
font-family: "Foo bar";
}"""
brand.save()
res = self.client.get(reverse("authentik_core:if-user"))
self.assertEqual(res.status_code, 200)
self.assertIn(brand.branding_custom_css, res.content.decode())

@ -5,12 +5,10 @@ from typing import Any
from django.db.models import F, Q from django.db.models import F, Q
from django.db.models import Value as V from django.db.models import Value as V
from django.http.request import HttpRequest from django.http.request import HttpRequest
-from django.utils.html import _json_script_escapes
-from django.utils.safestring import mark_safe
+from sentry_sdk import get_current_span
from authentik import get_full_version from authentik import get_full_version
from authentik.brands.models import Brand from authentik.brands.models import Brand
from authentik.lib.sentry import get_http_meta
from authentik.tenants.models import Tenant from authentik.tenants.models import Tenant
_q_default = Q(default=True) _q_default = Q(default=True)
@ -34,14 +32,13 @@ def context_processor(request: HttpRequest) -> dict[str, Any]:
"""Context Processor that injects brand object into every template""" """Context Processor that injects brand object into every template"""
brand = getattr(request, "brand", DEFAULT_BRAND) brand = getattr(request, "brand", DEFAULT_BRAND)
tenant = getattr(request, "tenant", Tenant()) tenant = getattr(request, "tenant", Tenant())
-    # similarly to `json_script` we escape everything HTML-related, however django
-    # only directly exposes this as a function that also wraps it in a <script> tag
-    # which we dont want for CSS
-    brand_css = mark_safe(str(brand.branding_custom_css).translate(_json_script_escapes))  # nosec
+    trace = ""
+    span = get_current_span()
+    if span:
+        trace = span.to_traceparent()
     return {
         "brand": brand,
-        "brand_css": brand_css,
         "footer_links": tenant.footer_links,
-        "html_meta": {**get_http_meta()},
+        "sentry_trace": trace,
"version": get_full_version(), "version": get_full_version(),
} }
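
The left-hand side of this hunk inlines per-brand CSS only after escaping it with the same character map Django's json_script helper uses, so the value cannot break out of the surrounding style tag. A rough standalone sketch of that escaping idea; the escape table below is illustrative, not authentik's exact code:

_ESCAPES = {
    ord("<"): "\\u003C",
    ord(">"): "\\u003E",
    ord("&"): "\\u0026",
}


def escape_inline_css(css: str) -> str:
    """Escape characters that could break out of an inline <style> block."""
    return str(css).translate(_ESCAPES)


print(escape_inline_css('* { font-family: "Foo"; } </style><script>alert(1)</script>'))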

@ -2,9 +2,11 @@
from collections.abc import Iterator from collections.abc import Iterator
from copy import copy from copy import copy
from datetime import timedelta
from django.core.cache import cache from django.core.cache import cache
from django.db.models import QuerySet from django.db.models import QuerySet
from django.db.models.functions import ExtractHour
from django.shortcuts import get_object_or_404 from django.shortcuts import get_object_or_404
from drf_spectacular.types import OpenApiTypes from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema
@ -18,6 +20,7 @@ from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet from rest_framework.viewsets import ModelViewSet
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik.admin.api.metrics import CoordinateSerializer
from authentik.api.pagination import Pagination from authentik.api.pagination import Pagination
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT
from authentik.core.api.providers import ProviderSerializer from authentik.core.api.providers import ProviderSerializer
@ -25,6 +28,7 @@ from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer from authentik.core.api.utils import ModelSerializer
from authentik.core.models import Application, User from authentik.core.models import Application, User
from authentik.events.logs import LogEventSerializer, capture_logs from authentik.events.logs import LogEventSerializer, capture_logs
from authentik.events.models import EventAction
from authentik.lib.utils.file import ( from authentik.lib.utils.file import (
FilePathSerializer, FilePathSerializer,
FileUploadSerializer, FileUploadSerializer,
@ -42,7 +46,7 @@ LOGGER = get_logger()
def user_app_cache_key(user_pk: str, page_number: int | None = None) -> str: def user_app_cache_key(user_pk: str, page_number: int | None = None) -> str:
"""Cache key where application list for user is saved""" """Cache key where application list for user is saved"""
key = f"{CACHE_PREFIX}app_access/{user_pk}" key = f"{CACHE_PREFIX}/app_access/{user_pk}"
if page_number: if page_number:
key += f"/{page_number}" key += f"/{page_number}"
return key return key
@ -205,7 +209,7 @@ class ApplicationViewSet(UsedByMixin, ModelViewSet):
@extend_schema( @extend_schema(
parameters=[ parameters=[
OpenApiParameter( OpenApiParameter(
name="superuser_full_list", name="list_rbac",
location=OpenApiParameter.QUERY, location=OpenApiParameter.QUERY,
type=OpenApiTypes.BOOL, type=OpenApiTypes.BOOL,
), ),
@ -225,10 +229,8 @@ class ApplicationViewSet(UsedByMixin, ModelViewSet):
"""Custom list method that checks Policy based access instead of guardian""" """Custom list method that checks Policy based access instead of guardian"""
should_cache = request.query_params.get("search", "") == "" should_cache = request.query_params.get("search", "") == ""
-        superuser_full_list = (
-            str(request.query_params.get("superuser_full_list", "false")).lower() == "true"
-        )
-        if superuser_full_list and request.user.is_superuser:
+        list_rbac = str(request.query_params.get("list_rbac", "false")).lower() == "true"
+        if list_rbac:
             return super().list(request)
only_with_launch_url = str( only_with_launch_url = str(
@ -317,3 +319,18 @@ class ApplicationViewSet(UsedByMixin, ModelViewSet):
"""Set application icon (as URL)""" """Set application icon (as URL)"""
app: Application = self.get_object() app: Application = self.get_object()
return set_file_url(request, app, "meta_icon") return set_file_url(request, app, "meta_icon")
@permission_required("authentik_core.view_application", ["authentik_events.view_event"])
@extend_schema(responses={200: CoordinateSerializer(many=True)})
@action(detail=True, pagination_class=None, filter_backends=[])
def metrics(self, request: Request, slug: str):
"""Metrics for application logins"""
app = self.get_object()
return Response(
get_objects_for_user(request.user, "authentik_events.view_event").filter(
action=EventAction.AUTHORIZE_APPLICATION,
context__authorized_application__pk=app.pk.hex,
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
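
Both sides of this hunk build the per-user application-list cache key the same way and differ only in the separator after the prefix. A small sketch of the key construction; the CACHE_PREFIX value below is a placeholder, not the real constant:

CACHE_PREFIX = "goauthentik.io/core/"  # placeholder value for illustration


def user_app_cache_key(user_pk: str, page_number: int | None = None) -> str:
    """Cache key under which a user's visible application list is stored."""
    key = f"{CACHE_PREFIX}app_access/{user_pk}"
    if page_number:
        key += f"/{page_number}"
    return key


print(user_app_cache_key("9f6d", 2))  # goauthentik.io/core/app_access/9f6d/2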

@ -5,7 +5,6 @@ from typing import TypedDict
from rest_framework import mixins from rest_framework import mixins
from rest_framework.fields import SerializerMethodField from rest_framework.fields import SerializerMethodField
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.serializers import CharField, DateTimeField, IPAddressField
from rest_framework.viewsets import GenericViewSet from rest_framework.viewsets import GenericViewSet
from ua_parser import user_agent_parser from ua_parser import user_agent_parser
@ -55,11 +54,6 @@ class UserAgentDict(TypedDict):
class AuthenticatedSessionSerializer(ModelSerializer): class AuthenticatedSessionSerializer(ModelSerializer):
"""AuthenticatedSession Serializer""" """AuthenticatedSession Serializer"""
expires = DateTimeField(source="session.expires", read_only=True)
last_ip = IPAddressField(source="session.last_ip", read_only=True)
last_user_agent = CharField(source="session.last_user_agent", read_only=True)
last_used = DateTimeField(source="session.last_used", read_only=True)
current = SerializerMethodField() current = SerializerMethodField()
user_agent = SerializerMethodField() user_agent = SerializerMethodField()
geo_ip = SerializerMethodField() geo_ip = SerializerMethodField()
@ -68,19 +62,19 @@ class AuthenticatedSessionSerializer(ModelSerializer):
def get_current(self, instance: AuthenticatedSession) -> bool: def get_current(self, instance: AuthenticatedSession) -> bool:
"""Check if session is currently active session""" """Check if session is currently active session"""
request: Request = self.context["request"] request: Request = self.context["request"]
-        return request._request.session.session_key == instance.session.session_key
+        return request._request.session.session_key == instance.session_key
def get_user_agent(self, instance: AuthenticatedSession) -> UserAgentDict: def get_user_agent(self, instance: AuthenticatedSession) -> UserAgentDict:
"""Get parsed user agent""" """Get parsed user agent"""
return user_agent_parser.Parse(instance.session.last_user_agent) return user_agent_parser.Parse(instance.last_user_agent)
def get_geo_ip(self, instance: AuthenticatedSession) -> GeoIPDict | None: # pragma: no cover def get_geo_ip(self, instance: AuthenticatedSession) -> GeoIPDict | None: # pragma: no cover
"""Get GeoIP Data""" """Get GeoIP Data"""
return GEOIP_CONTEXT_PROCESSOR.city_dict(instance.session.last_ip) return GEOIP_CONTEXT_PROCESSOR.city_dict(instance.last_ip)
def get_asn(self, instance: AuthenticatedSession) -> ASNDict | None: # pragma: no cover def get_asn(self, instance: AuthenticatedSession) -> ASNDict | None: # pragma: no cover
"""Get ASN Data""" """Get ASN Data"""
return ASN_CONTEXT_PROCESSOR.asn_dict(instance.session.last_ip) return ASN_CONTEXT_PROCESSOR.asn_dict(instance.last_ip)
class Meta: class Meta:
model = AuthenticatedSession model = AuthenticatedSession
@ -96,7 +90,6 @@ class AuthenticatedSessionSerializer(ModelSerializer):
"last_used", "last_used",
"expires", "expires",
] ]
extra_args = {"uuid": {"read_only": True}}
class AuthenticatedSessionViewSet( class AuthenticatedSessionViewSet(
@ -108,10 +101,9 @@ class AuthenticatedSessionViewSet(
): ):
"""AuthenticatedSession Viewset""" """AuthenticatedSession Viewset"""
lookup_field = "uuid" queryset = AuthenticatedSession.objects.all()
queryset = AuthenticatedSession.objects.select_related("session").all()
serializer_class = AuthenticatedSessionSerializer serializer_class = AuthenticatedSessionSerializer
search_fields = ["user__username", "session__last_ip", "session__last_user_agent"] search_fields = ["user__username", "last_ip", "last_user_agent"]
filterset_fields = ["user__username", "session__last_ip", "session__last_user_agent"] filterset_fields = ["user__username", "last_ip", "last_user_agent"]
ordering = ["user__username"] ordering = ["user__username"]
owner_field = "user" owner_field = "user"

@ -1,7 +1,8 @@
"""Authenticator Devices API Views""" """Authenticator Devices API Views"""
from drf_spectacular.utils import extend_schema from django.utils.translation import gettext_lazy as _
from guardian.shortcuts import get_objects_for_user from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiParameter, extend_schema
from rest_framework.fields import ( from rest_framework.fields import (
BooleanField, BooleanField,
CharField, CharField,
@ -13,16 +14,16 @@ from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.viewsets import ViewSet from rest_framework.viewsets import ViewSet
from authentik.core.api.users import ParamUserSerializer
from authentik.core.api.utils import MetaNameSerializer from authentik.core.api.utils import MetaNameSerializer
from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import EndpointDevice from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import EndpointDevice
from authentik.rbac.decorators import permission_required
from authentik.stages.authenticator import device_classes, devices_for_user from authentik.stages.authenticator import device_classes, devices_for_user
from authentik.stages.authenticator.models import Device from authentik.stages.authenticator.models import Device
from authentik.stages.authenticator_webauthn.models import WebAuthnDevice from authentik.stages.authenticator_webauthn.models import WebAuthnDevice
class DeviceSerializer(MetaNameSerializer): class DeviceSerializer(MetaNameSerializer):
"""Serializer for authenticator devices""" """Serializer for Duo authenticator devices"""
pk = CharField() pk = CharField()
name = CharField() name = CharField()
@ -32,27 +33,22 @@ class DeviceSerializer(MetaNameSerializer):
last_updated = DateTimeField(read_only=True) last_updated = DateTimeField(read_only=True)
last_used = DateTimeField(read_only=True, allow_null=True) last_used = DateTimeField(read_only=True, allow_null=True)
extra_description = SerializerMethodField() extra_description = SerializerMethodField()
external_id = SerializerMethodField()
def get_type(self, instance: Device) -> str: def get_type(self, instance: Device) -> str:
"""Get type of device""" """Get type of device"""
return instance._meta.label return instance._meta.label
def get_extra_description(self, instance: Device) -> str | None: def get_extra_description(self, instance: Device) -> str:
"""Get extra description""" """Get extra description"""
if isinstance(instance, WebAuthnDevice): if isinstance(instance, WebAuthnDevice):
return instance.device_type.description if instance.device_type else None return (
instance.device_type.description
if instance.device_type
else _("Extra description not available")
)
if isinstance(instance, EndpointDevice): if isinstance(instance, EndpointDevice):
return instance.data.get("deviceSignals", {}).get("deviceModel") return instance.data.get("deviceSignals", {}).get("deviceModel")
return None return ""
def get_external_id(self, instance: Device) -> str | None:
"""Get external Device ID"""
if isinstance(instance, WebAuthnDevice):
return instance.device_type.aaguid if instance.device_type else None
if isinstance(instance, EndpointDevice):
return instance.data.get("deviceSignals", {}).get("deviceModel")
return None
class DeviceViewSet(ViewSet): class DeviceViewSet(ViewSet):
@ -61,6 +57,7 @@ class DeviceViewSet(ViewSet):
serializer_class = DeviceSerializer serializer_class = DeviceSerializer
permission_classes = [IsAuthenticated] permission_classes = [IsAuthenticated]
@extend_schema(responses={200: DeviceSerializer(many=True)})
def list(self, request: Request) -> Response: def list(self, request: Request) -> Response:
"""Get all devices for current user""" """Get all devices for current user"""
devices = devices_for_user(request.user) devices = devices_for_user(request.user)
@ -76,17 +73,26 @@ class AdminDeviceViewSet(ViewSet):
def get_devices(self, **kwargs): def get_devices(self, **kwargs):
"""Get all devices in all child classes""" """Get all devices in all child classes"""
for model in device_classes(): for model in device_classes():
-            device_set = get_objects_for_user(
-                self.request.user, f"{model._meta.app_label}.view_{model._meta.model_name}", model
-            ).filter(**kwargs)
+            device_set = model.objects.filter(**kwargs)
yield from device_set yield from device_set
@extend_schema( @extend_schema(
parameters=[ParamUserSerializer], parameters=[
OpenApiParameter(
name="user",
location=OpenApiParameter.QUERY,
type=OpenApiTypes.INT,
)
],
responses={200: DeviceSerializer(many=True)}, responses={200: DeviceSerializer(many=True)},
) )
@permission_required(
None,
[f"{model._meta.app_label}.view_{model._meta.model_name}" for model in device_classes()],
)
def list(self, request: Request) -> Response: def list(self, request: Request) -> Response:
"""Get all devices for current user""" """Get all devices for current user"""
-        args = ParamUserSerializer(data=request.query_params)
-        args.is_valid(raise_exception=True)
-        return Response(DeviceSerializer(self.get_devices(**args.validated_data), many=True).data)
+        kwargs = {}
+        if "user" in request.query_params:
+            kwargs = {"user": request.query_params["user"]}
+        return Response(DeviceSerializer(self.get_devices(**kwargs), many=True).data)
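
The left-hand get_devices() filters every device model through object-level permissions named f"{app_label}.view_{model_name}". A tiny sketch of how those permission strings are assembled; the model list below is hypothetical, authentik derives it from device_classes():

device_models = [
    ("authentik_stages_authenticator_totp", "totpdevice"),
    ("authentik_stages_authenticator_webauthn", "webauthndevice"),
]

perms = [f"{app_label}.view_{model_name}" for app_label, model_name in device_models]
print(perms)
# ['authentik_stages_authenticator_totp.view_totpdevice',
#  'authentik_stages_authenticator_webauthn.view_webauthndevice']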

@ -4,7 +4,6 @@ from json import loads
from django.db.models import Prefetch from django.db.models import Prefetch
from django.http import Http404 from django.http import Http404
from django.utils.translation import gettext as _
from django_filters.filters import CharFilter, ModelMultipleChoiceFilter from django_filters.filters import CharFilter, ModelMultipleChoiceFilter
from django_filters.filterset import FilterSet from django_filters.filterset import FilterSet
from drf_spectacular.utils import ( from drf_spectacular.utils import (
@ -82,36 +81,9 @@ class GroupSerializer(ModelSerializer):
if not self.instance or not parent: if not self.instance or not parent:
return parent return parent
if str(parent.group_uuid) == str(self.instance.group_uuid): if str(parent.group_uuid) == str(self.instance.group_uuid):
raise ValidationError(_("Cannot set group as parent of itself.")) raise ValidationError("Cannot set group as parent of itself.")
return parent return parent
def validate_is_superuser(self, superuser: bool):
"""Ensure that the user creating this group has permissions to set the superuser flag"""
request: Request = self.context.get("request", None)
if not request:
return superuser
# If we're updating an instance, and the state hasn't changed, we don't need to check perms
if self.instance and superuser == self.instance.is_superuser:
return superuser
user: User = request.user
perm = (
"authentik_core.enable_group_superuser"
if superuser
else "authentik_core.disable_group_superuser"
)
if self.instance or superuser:
has_perm = user.has_perm(perm) or user.has_perm(perm, self.instance)
if not has_perm:
raise ValidationError(
_(
(
"User does not have permission to set "
"superuser status to {superuser_status}."
).format_map({"superuser_status": superuser})
)
)
return superuser
class Meta: class Meta:
model = Group model = Group
fields = [ fields = [

@ -5,7 +5,6 @@ from collections.abc import Iterable
from drf_spectacular.utils import OpenApiResponse, extend_schema from drf_spectacular.utils import OpenApiResponse, extend_schema
from rest_framework import mixins from rest_framework import mixins
from rest_framework.decorators import action from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField, ReadOnlyField, SerializerMethodField from rest_framework.fields import CharField, ReadOnlyField, SerializerMethodField
from rest_framework.parsers import MultiPartParser from rest_framework.parsers import MultiPartParser
from rest_framework.request import Request from rest_framework.request import Request
@ -86,7 +85,7 @@ class SourceViewSet(
serializer_class = SourceSerializer serializer_class = SourceSerializer
lookup_field = "slug" lookup_field = "slug"
search_fields = ["slug", "name"] search_fields = ["slug", "name"]
filterset_fields = ["slug", "name", "managed", "pbm_uuid"] filterset_fields = ["slug", "name", "managed"]
def get_queryset(self): # pragma: no cover def get_queryset(self): # pragma: no cover
return Source.objects.select_subclasses() return Source.objects.select_subclasses()
@ -155,17 +154,6 @@ class SourceViewSet(
matching_sources.append(source_settings.validated_data) matching_sources.append(source_settings.validated_data)
return Response(matching_sources) return Response(matching_sources)
def destroy(self, request: Request, *args, **kwargs):
"""Prevent deletion of built-in sources"""
instance: Source = self.get_object()
if instance.managed == Source.MANAGED_INBUILT:
raise ValidationError(
{"detail": "Built-in sources cannot be deleted"}, code="protected"
)
return super().destroy(request, *args, **kwargs)
class UserSourceConnectionSerializer(SourceSerializer): class UserSourceConnectionSerializer(SourceSerializer):
"""User source connection""" """User source connection"""
@ -179,13 +167,10 @@ class UserSourceConnectionSerializer(SourceSerializer):
"user", "user",
"source", "source",
"source_obj", "source_obj",
"identifier",
"created", "created",
"last_updated",
] ]
extra_kwargs = { extra_kwargs = {
"created": {"read_only": True}, "created": {"read_only": True},
"last_updated": {"read_only": True},
} }
@ -202,7 +187,7 @@ class UserSourceConnectionViewSet(
queryset = UserSourceConnection.objects.all() queryset = UserSourceConnection.objects.all()
serializer_class = UserSourceConnectionSerializer serializer_class = UserSourceConnectionSerializer
filterset_fields = ["user", "source__slug"] filterset_fields = ["user", "source__slug"]
search_fields = ["user__username", "source__slug", "identifier"] search_fields = ["source__slug"]
ordering = ["source__slug", "pk"] ordering = ["source__slug", "pk"]
owner_field = "user" owner_field = "user"
@ -221,11 +206,9 @@ class GroupSourceConnectionSerializer(SourceSerializer):
"source_obj", "source_obj",
"identifier", "identifier",
"created", "created",
"last_updated",
] ]
extra_kwargs = { extra_kwargs = {
"created": {"read_only": True}, "created": {"read_only": True},
"last_updated": {"read_only": True},
} }
@ -242,5 +225,6 @@ class GroupSourceConnectionViewSet(
queryset = GroupSourceConnection.objects.all() queryset = GroupSourceConnection.objects.all()
serializer_class = GroupSourceConnectionSerializer serializer_class = GroupSourceConnectionSerializer
filterset_fields = ["group", "source__slug"] filterset_fields = ["group", "source__slug"]
search_fields = ["group__name", "source__slug", "identifier"] search_fields = ["source__slug"]
ordering = ["source__slug", "pk"] ordering = ["source__slug", "pk"]
owner_field = "user"

@ -4,7 +4,7 @@ from typing import Any
from django.utils.timezone import now from django.utils.timezone import now
from drf_spectacular.utils import OpenApiResponse, extend_schema, inline_serializer from drf_spectacular.utils import OpenApiResponse, extend_schema, inline_serializer
from guardian.shortcuts import assign_perm, get_anonymous_user from guardian.shortcuts import assign_perm
from rest_framework.decorators import action from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField from rest_framework.fields import CharField
@ -138,13 +138,8 @@ class TokenViewSet(UsedByMixin, ModelViewSet):
owner_field = "user" owner_field = "user"
rbac_allow_create_without_perm = True rbac_allow_create_without_perm = True
def get_queryset(self):
user = self.request.user if self.request else get_anonymous_user()
if user.is_superuser:
return super().get_queryset()
return super().get_queryset().filter(user=user.pk)
def perform_create(self, serializer: TokenSerializer): def perform_create(self, serializer: TokenSerializer):
# TODO: better permission check
if not self.request.user.is_superuser: if not self.request.user.is_superuser:
instance = serializer.save( instance = serializer.save(
user=self.request.user, user=self.request.user,
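
The left-hand get_queryset() scopes tokens to their owner unless the requester is a superuser. A plain-Python sketch of that rule with stand-in types:

from dataclasses import dataclass


@dataclass
class User:
    pk: int
    is_superuser: bool = False


@dataclass
class Token:
    key: str
    user_pk: int


def visible_tokens(tokens: list[Token], user: User) -> list[Token]:
    """Superusers see every token, everyone else only their own."""
    if user.is_superuser:
        return tokens
    return [t for t in tokens if t.user_pk == user.pk]


tokens = [Token("a", 1), Token("b", 2)]
assert visible_tokens(tokens, User(pk=1)) == [Token("a", 1)]
assert visible_tokens(tokens, User(pk=3, is_superuser=True)) == tokens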

@ -6,6 +6,9 @@ from typing import Any
from django.contrib.auth import update_session_auth_hash from django.contrib.auth import update_session_auth_hash
from django.contrib.auth.models import Permission from django.contrib.auth.models import Permission
from django.contrib.sessions.backends.cache import KEY_PREFIX
from django.core.cache import cache
from django.db.models.functions import ExtractHour
from django.db.transaction import atomic from django.db.transaction import atomic
from django.db.utils import IntegrityError from django.db.utils import IntegrityError
from django.urls import reverse_lazy from django.urls import reverse_lazy
@ -51,6 +54,7 @@ from rest_framework.validators import UniqueValidator
from rest_framework.viewsets import ModelViewSet from rest_framework.viewsets import ModelViewSet
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik.admin.api.metrics import CoordinateSerializer
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT
from authentik.brands.models import Brand from authentik.brands.models import Brand
from authentik.core.api.used_by import UsedByMixin from authentik.core.api.used_by import UsedByMixin
@ -67,8 +71,8 @@ from authentik.core.middleware import (
from authentik.core.models import ( from authentik.core.models import (
USER_ATTRIBUTE_TOKEN_EXPIRING, USER_ATTRIBUTE_TOKEN_EXPIRING,
USER_PATH_SERVICE_ACCOUNT, USER_PATH_SERVICE_ACCOUNT,
AuthenticatedSession,
Group, Group,
Session,
Token, Token,
TokenIntents, TokenIntents,
User, User,
@ -82,7 +86,6 @@ from authentik.flows.views.executor import QS_KEY_TOKEN
from authentik.lib.avatars import get_avatar from authentik.lib.avatars import get_avatar
from authentik.rbac.decorators import permission_required from authentik.rbac.decorators import permission_required
from authentik.rbac.models import get_permission_choices from authentik.rbac.models import get_permission_choices
from authentik.stages.email.flow import pickle_flow_token_for_email
from authentik.stages.email.models import EmailStage from authentik.stages.email.models import EmailStage
from authentik.stages.email.tasks import send_mails from authentik.stages.email.tasks import send_mails
from authentik.stages.email.utils import TemplateEmailMessage from authentik.stages.email.utils import TemplateEmailMessage
@ -90,12 +93,6 @@ from authentik.stages.email.utils import TemplateEmailMessage
LOGGER = get_logger() LOGGER = get_logger()
class ParamUserSerializer(PassiveSerializer):
"""Partial serializer for query parameters to select a user"""
user = PrimaryKeyRelatedField(queryset=User.objects.all().exclude_anonymous(), required=False)
class UserGroupSerializer(ModelSerializer): class UserGroupSerializer(ModelSerializer):
"""Simplified Group Serializer for user's groups""" """Simplified Group Serializer for user's groups"""
@ -229,7 +226,6 @@ class UserSerializer(ModelSerializer):
"name", "name",
"is_active", "is_active",
"last_login", "last_login",
"date_joined",
"is_superuser", "is_superuser",
"groups", "groups",
"groups_obj", "groups_obj",
@ -240,12 +236,9 @@ class UserSerializer(ModelSerializer):
"path", "path",
"type", "type",
"uuid", "uuid",
"password_change_date",
] ]
extra_kwargs = { extra_kwargs = {
"name": {"allow_blank": True}, "name": {"allow_blank": True},
"date_joined": {"read_only": True},
"password_change_date": {"read_only": True},
} }
@ -321,6 +314,53 @@ class SessionUserSerializer(PassiveSerializer):
original = UserSelfSerializer(required=False) original = UserSelfSerializer(required=False)
class UserMetricsSerializer(PassiveSerializer):
"""User Metrics"""
logins = SerializerMethodField()
logins_failed = SerializerMethodField()
authorizations = SerializerMethodField()
@extend_schema_field(CoordinateSerializer(many=True))
def get_logins(self, _):
"""Get successful logins per 8 hours for the last 7 days"""
user = self.context["user"]
request = self.context["request"]
return (
get_objects_for_user(request.user, "authentik_events.view_event").filter(
action=EventAction.LOGIN, user__pk=user.pk
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
@extend_schema_field(CoordinateSerializer(many=True))
def get_logins_failed(self, _):
"""Get failed logins per 8 hours for the last 7 days"""
user = self.context["user"]
request = self.context["request"]
return (
get_objects_for_user(request.user, "authentik_events.view_event").filter(
action=EventAction.LOGIN_FAILED, context__username=user.username
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
@extend_schema_field(CoordinateSerializer(many=True))
def get_authorizations(self, _):
"""Get failed logins per 8 hours for the last 7 days"""
user = self.context["user"]
request = self.context["request"]
return (
get_objects_for_user(request.user, "authentik_events.view_event").filter(
action=EventAction.AUTHORIZE_APPLICATION, user__pk=user.pk
)
# 3 data points per day, so 8 hour spans
.get_events_per(timedelta(days=7), ExtractHour, 7 * 3)
)
class UsersFilter(FilterSet): class UsersFilter(FilterSet):
"""Filter for users""" """Filter for users"""
@ -331,7 +371,7 @@ class UsersFilter(FilterSet):
method="filter_attributes", method="filter_attributes",
) )
is_superuser = BooleanFilter(field_name="ak_groups", method="filter_is_superuser") is_superuser = BooleanFilter(field_name="ak_groups", lookup_expr="is_superuser")
uuid = UUIDFilter(field_name="uuid") uuid = UUIDFilter(field_name="uuid")
path = CharFilter(field_name="path") path = CharFilter(field_name="path")
@ -349,11 +389,6 @@ class UsersFilter(FilterSet):
queryset=Group.objects.all().order_by("name"), queryset=Group.objects.all().order_by("name"),
) )
def filter_is_superuser(self, queryset, name, value):
if value:
return queryset.filter(ak_groups__is_superuser=True).distinct()
return queryset.exclude(ak_groups__is_superuser=True).distinct()
def filter_attributes(self, queryset, name, value): def filter_attributes(self, queryset, name, value):
"""Filter attributes by query args""" """Filter attributes by query args"""
try: try:
@ -392,23 +427,8 @@ class UserViewSet(UsedByMixin, ModelViewSet):
queryset = User.objects.none() queryset = User.objects.none()
ordering = ["username"] ordering = ["username"]
serializer_class = UserSerializer serializer_class = UserSerializer
search_fields = ["username", "name", "is_active", "email", "uuid"]
filterset_class = UsersFilter filterset_class = UsersFilter
search_fields = ["username", "name", "is_active", "email", "uuid", "attributes"]
def get_ql_fields(self):
from djangoql.schema import BoolField, StrField
from authentik.enterprise.search.fields import ChoiceSearchField, JSONSearchField
return [
StrField(User, "username"),
StrField(User, "name"),
StrField(User, "email"),
StrField(User, "path"),
BoolField(User, "is_active", nullable=True),
ChoiceSearchField(User, "type"),
JSONSearchField(User, "attributes", suggest_nested=False),
]
def get_queryset(self): def get_queryset(self):
base_qs = User.objects.all().exclude_anonymous() base_qs = User.objects.all().exclude_anonymous()
@ -424,7 +444,7 @@ class UserViewSet(UsedByMixin, ModelViewSet):
def list(self, request, *args, **kwargs): def list(self, request, *args, **kwargs):
return super().list(request, *args, **kwargs) return super().list(request, *args, **kwargs)
-    def _create_recovery_link(self, for_email=False) -> tuple[str, Token]:
+    def _create_recovery_link(self) -> tuple[str, Token]:
"""Create a recovery link (when the current brand has a recovery flow set), """Create a recovery link (when the current brand has a recovery flow set),
that can either be shown to an admin or sent to the user directly""" that can either be shown to an admin or sent to the user directly"""
brand: Brand = self.request._request.brand brand: Brand = self.request._request.brand
@ -446,16 +466,12 @@ class UserViewSet(UsedByMixin, ModelViewSet):
raise ValidationError( raise ValidationError(
{"non_field_errors": "Recovery flow not applicable to user"} {"non_field_errors": "Recovery flow not applicable to user"}
) from None ) from None
_plan = FlowToken.pickle(plan)
if for_email:
_plan = pickle_flow_token_for_email(plan)
token, __ = FlowToken.objects.update_or_create( token, __ = FlowToken.objects.update_or_create(
identifier=f"{user.uid}-password-reset", identifier=f"{user.uid}-password-reset",
defaults={ defaults={
"user": user, "user": user,
"flow": flow, "flow": flow,
"_plan": _plan, "_plan": FlowToken.pickle(plan),
"revoke_on_execution": not for_email,
}, },
) )
querystring = urlencode({QS_KEY_TOKEN: token.key}) querystring = urlencode({QS_KEY_TOKEN: token.key})
@ -579,6 +595,17 @@ class UserViewSet(UsedByMixin, ModelViewSet):
update_session_auth_hash(self.request, user) update_session_auth_hash(self.request, user)
return Response(status=204) return Response(status=204)
@permission_required("authentik_core.view_user", ["authentik_events.view_event"])
@extend_schema(responses={200: UserMetricsSerializer(many=False)})
@action(detail=True, pagination_class=None, filter_backends=[])
def metrics(self, request: Request, pk: int) -> Response:
"""User metrics per 1h"""
user: User = self.get_object()
serializer = UserMetricsSerializer(instance={})
serializer.context["user"] = user
serializer.context["request"] = request
return Response(serializer.data)
@permission_required("authentik_core.reset_user_password") @permission_required("authentik_core.reset_user_password")
@extend_schema( @extend_schema(
responses={ responses={
@ -614,7 +641,7 @@ class UserViewSet(UsedByMixin, ModelViewSet):
if for_user.email == "": if for_user.email == "":
LOGGER.debug("User doesn't have an email address") LOGGER.debug("User doesn't have an email address")
raise ValidationError({"non_field_errors": "User does not have an email address set."}) raise ValidationError({"non_field_errors": "User does not have an email address set."})
-        link, token = self._create_recovery_link(for_email=True)
+        link, token = self._create_recovery_link()
# Lookup the email stage to assure the current user can access it # Lookup the email stage to assure the current user can access it
stages = get_objects_for_user( stages = get_objects_for_user(
request.user, "authentik_stages_email.view_emailstage" request.user, "authentik_stages_email.view_emailstage"
@ -738,6 +765,9 @@ class UserViewSet(UsedByMixin, ModelViewSet):
response = super().partial_update(request, *args, **kwargs) response = super().partial_update(request, *args, **kwargs)
instance: User = self.get_object() instance: User = self.get_object()
if not instance.is_active: if not instance.is_active:
-            Session.objects.filter(authenticatedsession__user=instance).delete()
+            sessions = AuthenticatedSession.objects.filter(user=instance)
+            session_ids = sessions.values_list("session_key", flat=True)
+            cache.delete_many(f"{KEY_PREFIX}{session}" for session in session_ids)
+            sessions.delete()
LOGGER.debug("Deleted user's sessions", user=instance.username) LOGGER.debug("Deleted user's sessions", user=instance.username)
return response return response
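
The metrics code in this hunk requests 7 * 3 buckets over seven days ("3 data points per day, so 8 hour spans"). A quick check of that arithmetic:

from datetime import timedelta

window = timedelta(days=7)
data_points = 7 * 3
print(window / data_points)  # 8:00:00 -> each bucket covers 8 hours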

@ -2,7 +2,6 @@
from typing import Any from typing import Any
from django.db import models
from django.db.models import Model from django.db.models import Model
from drf_spectacular.extensions import OpenApiSerializerFieldExtension from drf_spectacular.extensions import OpenApiSerializerFieldExtension
from drf_spectacular.plumbing import build_basic_type from drf_spectacular.plumbing import build_basic_type
@ -21,8 +20,6 @@ from rest_framework.serializers import (
raise_errors_on_nested_writes, raise_errors_on_nested_writes,
) )
from authentik.rbac.permissions import assign_initial_permissions
def is_dict(value: Any): def is_dict(value: Any):
"""Ensure a value is a dictionary, useful for JSONFields""" """Ensure a value is a dictionary, useful for JSONFields"""
@ -31,36 +28,8 @@ def is_dict(value: Any):
raise ValidationError("Value must be a dictionary, and not have any duplicate keys.") raise ValidationError("Value must be a dictionary, and not have any duplicate keys.")
class JSONDictField(JSONField):
"""JSON Field which only allows dictionaries"""
default_validators = [is_dict]
class JSONExtension(OpenApiSerializerFieldExtension):
"""Generate API Schema for JSON fields as"""
target_class = "authentik.core.api.utils.JSONDictField"
def map_serializer_field(self, auto_schema, direction):
return build_basic_type(OpenApiTypes.OBJECT)
class ModelSerializer(BaseModelSerializer): class ModelSerializer(BaseModelSerializer):
# By default, JSON fields we have are used to store dictionaries
serializer_field_mapping = BaseModelSerializer.serializer_field_mapping.copy()
serializer_field_mapping[models.JSONField] = JSONDictField
def create(self, validated_data):
instance = super().create(validated_data)
request = self.context.get("request")
if request and hasattr(request, "user") and not request.user.is_anonymous:
assign_initial_permissions(request.user, instance)
return instance
def update(self, instance: Model, validated_data): def update(self, instance: Model, validated_data):
raise_errors_on_nested_writes("update", self, validated_data) raise_errors_on_nested_writes("update", self, validated_data)
info = model_meta.get_field_info(instance) info = model_meta.get_field_info(instance)
@ -92,6 +61,21 @@ class ModelSerializer(BaseModelSerializer):
return instance return instance
class JSONDictField(JSONField):
"""JSON Field which only allows dictionaries"""
default_validators = [is_dict]
class JSONExtension(OpenApiSerializerFieldExtension):
"""Generate API Schema for JSON fields as"""
target_class = "authentik.core.api.utils.JSONDictField"
def map_serializer_field(self, auto_schema, direction):
return build_basic_type(OpenApiTypes.OBJECT)
class PassiveSerializer(Serializer): class PassiveSerializer(Serializer):
"""Base serializer class which doesn't implement create/update methods""" """Base serializer class which doesn't implement create/update methods"""

@ -32,5 +32,5 @@ class AuthentikCoreConfig(ManagedAppConfig):
"name": "authentik Built-in", "name": "authentik Built-in",
"slug": "authentik-built-in", "slug": "authentik-built-in",
}, },
-            managed=Source.MANAGED_INBUILT,
+            managed="goauthentik.io/sources/inbuilt",
) )

@ -24,15 +24,6 @@ class InbuiltBackend(ModelBackend):
self.set_method("password", request) self.set_method("password", request)
return user return user
async def aauthenticate(
self, request: HttpRequest, username: str | None, password: str | None, **kwargs: Any
) -> User | None:
user = await super().aauthenticate(request, username=username, password=password, **kwargs)
if not user:
return None
self.set_method("password", request)
return user
def set_method(self, method: str, request: HttpRequest | None, **kwargs): def set_method(self, method: str, request: HttpRequest | None, **kwargs):
"""Set method data on current flow, if possbiel""" """Set method data on current flow, if possbiel"""
if not request: if not request:

@ -13,6 +13,7 @@ class Command(TenantCommand):
parser.add_argument("usernames", nargs="*", type=str) parser.add_argument("usernames", nargs="*", type=str)
def handle_per_tenant(self, **options): def handle_per_tenant(self, **options):
print(options)
new_type = UserTypes(options["type"]) new_type = UserTypes(options["type"])
qs = ( qs = (
User.objects.exclude_anonymous() User.objects.exclude_anonymous()

@ -1,15 +0,0 @@
"""Change user type"""
from importlib import import_module
from django.conf import settings
from authentik.tenants.management import TenantCommand
class Command(TenantCommand):
"""Delete all sessions"""
def handle_per_tenant(self, **options):
engine = import_module(settings.SESSION_ENGINE)
engine.SessionStore.clear_expired()

@ -5,7 +5,6 @@ from typing import TextIO
from daphne.management.commands.runserver import Command as RunServer from daphne.management.commands.runserver import Command as RunServer
from daphne.server import Server from daphne.server import Server
from authentik.lib.debug import start_debug_server
from authentik.root.signals import post_startup, pre_startup, startup from authentik.root.signals import post_startup, pre_startup, startup
@ -14,7 +13,6 @@ class SignalServer(Server):
def __init__(self, *args, **kwargs): def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs) super().__init__(*args, **kwargs)
start_debug_server()
def ready_callable(): def ready_callable():
pre_startup.send(sender=self) pre_startup.send(sender=self)

View File

@ -2,7 +2,6 @@
from django.apps import apps from django.apps import apps
from django.contrib.auth.management import create_permissions from django.contrib.auth.management import create_permissions
from django.core.management import call_command
from django.core.management.base import BaseCommand, no_translations from django.core.management.base import BaseCommand, no_translations
from guardian.management import create_anonymous_user from guardian.management import create_anonymous_user
@ -17,10 +16,6 @@ class Command(BaseCommand):
"""Check permissions for all apps""" """Check permissions for all apps"""
for tenant in Tenant.objects.filter(ready=True): for tenant in Tenant.objects.filter(ready=True):
with tenant: with tenant:
# See https://code.djangoproject.com/ticket/28417
# Remove potential lingering old permissions
call_command("remove_stale_contenttypes", "--no-input")
for app in apps.get_app_configs(): for app in apps.get_app_configs():
self.stdout.write(f"Checking app {app.name} ({app.label})\n") self.stdout.write(f"Checking app {app.name} ({app.label})\n")
create_permissions(app, verbosity=0) create_permissions(app, verbosity=0)

@ -9,7 +9,6 @@ from django.db import close_old_connections
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik.lib.config import CONFIG from authentik.lib.config import CONFIG
from authentik.lib.debug import start_debug_server
from authentik.root.celery import CELERY_APP from authentik.root.celery import CELERY_APP
LOGGER = get_logger() LOGGER = get_logger()
@ -29,7 +28,10 @@ class Command(BaseCommand):
def handle(self, **options): def handle(self, **options):
LOGGER.debug("Celery options", **options) LOGGER.debug("Celery options", **options)
close_old_connections() close_old_connections()
-        start_debug_server()
+        if CONFIG.get_bool("remote_debug"):
import debugpy
debugpy.listen(("0.0.0.0", 6900)) # nosec
worker: Worker = CELERY_APP.Worker( worker: Worker = CELERY_APP.Worker(
no_color=False, no_color=False,
quiet=True, quiet=True,
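
The right-hand side of this hunk only starts a debugpy listener when remote debugging is enabled. A hedged sketch of that hook; the environment variable used for the flag here is an assumption, authentik reads the setting via CONFIG.get_bool("remote_debug"):

import os


def maybe_start_remote_debugger() -> None:
    """Open a debugpy listener on port 6900 when the debug flag is set."""
    if os.environ.get("AUTHENTIK_REMOTE_DEBUG", "false").lower() == "true":
        import debugpy  # requires the debugpy package

        debugpy.listen(("0.0.0.0", 6900))  # nosec - address/port mirror the diff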

@ -2,14 +2,9 @@
from collections.abc import Callable from collections.abc import Callable
from contextvars import ContextVar from contextvars import ContextVar
from functools import partial
from uuid import uuid4 from uuid import uuid4
from django.contrib.auth.models import AnonymousUser
from django.core.exceptions import ImproperlyConfigured
from django.http import HttpRequest, HttpResponse from django.http import HttpRequest, HttpResponse
from django.utils.deprecation import MiddlewareMixin
from django.utils.functional import SimpleLazyObject
from django.utils.translation import override from django.utils.translation import override
from sentry_sdk.api import set_tag from sentry_sdk.api import set_tag
from structlog.contextvars import STRUCTLOG_KEY_PREFIX from structlog.contextvars import STRUCTLOG_KEY_PREFIX
@ -25,40 +20,6 @@ CTX_HOST = ContextVar[str | None](STRUCTLOG_KEY_PREFIX + "host", default=None)
CTX_AUTH_VIA = ContextVar[str | None](STRUCTLOG_KEY_PREFIX + KEY_AUTH_VIA, default=None) CTX_AUTH_VIA = ContextVar[str | None](STRUCTLOG_KEY_PREFIX + KEY_AUTH_VIA, default=None)
def get_user(request):
if not hasattr(request, "_cached_user"):
user = None
if (authenticated_session := request.session.get("authenticatedsession", None)) is not None:
user = authenticated_session.user
request._cached_user = user or AnonymousUser()
return request._cached_user
async def aget_user(request):
if not hasattr(request, "_cached_user"):
user = None
if (
authenticated_session := await request.session.aget("authenticatedsession", None)
) is not None:
user = authenticated_session.user
request._cached_user = user or AnonymousUser()
return request._cached_user
class AuthenticationMiddleware(MiddlewareMixin):
def process_request(self, request):
if not hasattr(request, "session"):
raise ImproperlyConfigured(
"The Django authentication middleware requires session "
"middleware to be installed. Edit your MIDDLEWARE setting to "
"insert "
"'authentik.root.middleware.SessionMiddleware' before "
"'authentik.core.middleware.AuthenticationMiddleware'."
)
request.user = SimpleLazyObject(lambda: get_user(request))
request.auser = partial(aget_user, request)
class ImpersonateMiddleware: class ImpersonateMiddleware:
"""Middleware to impersonate users""" """Middleware to impersonate users"""

@ -1,26 +0,0 @@
# Generated by Django 5.0.11 on 2025-01-30 23:55
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0042_authenticatedsession_authentik_c_expires_08251d_idx_and_more"),
]
operations = [
migrations.AlterModelOptions(
name="group",
options={
"permissions": [
("add_user_to_group", "Add user to group"),
("remove_user_from_group", "Remove user from group"),
("enable_group_superuser", "Enable superuser status"),
("disable_group_superuser", "Disable superuser status"),
],
"verbose_name": "Group",
"verbose_name_plural": "Groups",
},
),
]

@ -0,0 +1,57 @@
# Generated by Django 5.0.10 on 2025-01-08 17:39
from django.db import migrations
from django.apps.registry import Apps
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
def migrate_user_debug_attribute(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
from django.apps import apps as real_apps
from django.contrib.auth.management import create_permissions
db_alias = schema_editor.connection.alias
User = apps.get_model("authentik_core", "User")
USER_ATTRIBUTE_DEBUG = "goauthentik.io/user/debug"
# Permissions are only created _after_ migrations are run
# - https://github.com/django/django/blob/43cdfa8b20e567a801b7d0a09ec67ddd062d5ea4/django/contrib/auth/apps.py#L19
# - https://stackoverflow.com/a/72029063/1870445
create_permissions(real_apps.get_app_config("authentik_core"), using=db_alias)
Permission = apps.get_model("auth", "Permission")
new_prem = Permission.objects.using(db_alias).get(codename="user_view_debug")
db_alias = schema_editor.connection.alias
for user in User.objects.using(db_alias).filter(
**{f"attributes__{USER_ATTRIBUTE_DEBUG}": True}
):
user.permissions.add(new_prem)
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0042_authenticatedsession_authentik_c_expires_08251d_idx_and_more"),
]
operations = [
migrations.AlterModelOptions(
name="user",
options={
"permissions": [
("reset_user_password", "Reset Password"),
("impersonate", "Can impersonate other users"),
("assign_user_permissions", "Can assign permissions to users"),
("unassign_user_permissions", "Can unassign permissions from users"),
("preview_user", "Can preview user data sent to providers"),
("view_user_applications", "View applications the user has access to"),
("user_view_debug", "User receives additional details for error messages"),
],
"verbose_name": "User",
"verbose_name_plural": "Users",
},
),
migrations.RunPython(migrate_user_debug_attribute),
]

@ -1,19 +0,0 @@
# Generated by Django 5.0.13 on 2025-04-07 14:04
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0043_alter_group_options"),
]
operations = [
migrations.AddField(
model_name="usersourceconnection",
name="new_identifier",
field=models.TextField(default=""),
preserve_default=False,
),
]

@ -1,30 +0,0 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0044_usersourceconnection_new_identifier"),
("authentik_sources_kerberos", "0003_migrate_userkerberossourceconnection_identifier"),
("authentik_sources_oauth", "0009_migrate_useroauthsourceconnection_identifier"),
("authentik_sources_plex", "0005_migrate_userplexsourceconnection_identifier"),
("authentik_sources_saml", "0019_migrate_usersamlsourceconnection_identifier"),
]
operations = [
migrations.RenameField(
model_name="usersourceconnection",
old_name="new_identifier",
new_name="identifier",
),
migrations.AddIndex(
model_name="usersourceconnection",
index=models.Index(fields=["identifier"], name="authentik_c_identif_59226f_idx"),
),
migrations.AddIndex(
model_name="usersourceconnection",
index=models.Index(
fields=["source", "identifier"], name="authentik_c_source__649e04_idx"
),
),
]

Some files were not shown because too many files have changed in this diff.