Compare commits


10 Commits

Author SHA1 Message Date
2a3d2cd262 release: 2023.8.6 2024-01-09 18:44:21 +01:00
d9aab79c62 providers/oauth2: fix CVE-2024-21637 (cherry-pick #8104) (#8106)
* providers/oauth2: fix CVE-2024-21637 (#8104)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* update changelog

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-09 18:43:56 +01:00
1516fe86da release: 2023.8.5 2023-11-21 19:51:16 +01:00
abad6c181f ci: fix permissions for release pipeline to publish binaries (cherry-pick #7512) (#7621)
ci: fix permissions for release pipeline to publish binaries (#7512)

ci: fix permissions

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-21 19:51:11 +01:00
312eb70349 ci: explicitly give write permissions to packages (cherry-pick #7428) (#7430)
ci: explicitly give write permissions to packages (#7428)

* ci: explicitly give write permissions to packages



* run full CI on cherry-picks



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-21 19:51:06 +01:00
3af77ab382 security: fix CVE-2023-48228 (cherry-pick #7666) (#7669)
security: fix CVE-2023-48228 (#7666)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-21 18:13:50 +01:00
72d67f65e5 release: 2023.8.4 2023-10-28 21:44:15 +02:00
ea75741ec2 security: fix oobe-flow reuse when akadmin is deleted (#7361)
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	website/docs/releases/2023/v2023.10.md
2023-10-28 21:26:53 +02:00
aaa9b398f4 sources/ldap: fix inverted interpretation of FreeIPA nsaccountlock (cherry-pick #6877) (#6879)
sources/ldap: fix inverted interpretation of FreeIPA nsaccountlock (#6877)

sources/ldap: fix inverted interpretation of nsaccountlock

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-09-13 19:50:48 +02:00
d54d01b118 providers/saml: set WantAuthnRequestsSigned in metadata (cherry-pick #6851) (#6880)
providers/saml: set WantAuthnRequestsSigned in metadata (#6851)

Co-authored-by: Jens L <jens@goauthentik.io>
2023-09-13 19:50:41 +02:00
2595 changed files with 83982 additions and 288430 deletions

View File

@@ -1,30 +1,18 @@
 [bumpversion]
-current_version = 2024.10.2
+current_version = 2023.8.6
 tag = True
 commit = True
-parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?
-serialize =
-{major}.{minor}.{patch}-{rc_t}{rc_n}
-{major}.{minor}.{patch}
+parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)
+serialize = {major}.{minor}.{patch}
 message = release: {new_version}
 tag_name = version/{new_version}
-[bumpversion:part:rc_t]
-values =
-rc
-final
-optional_value = final
 [bumpversion:file:pyproject.toml]
-[bumpversion:file:package.json]
 [bumpversion:file:docker-compose.yml]
 [bumpversion:file:schema.yml]
-[bumpversion:file:blueprints/schema.json]
 [bumpversion:file:authentik/__init__.py]
 [bumpversion:file:internal/constants/constants.go]

View File

@@ -1,12 +1,10 @@
-env
 htmlcov
 *.env.yml
 **/node_modules
 dist/**
 build/**
 build_docs/**
-*Dockerfile
+Dockerfile
+authentik/enterprise
 blueprints/local
-!gen-ts-api/node_modules
-!gen-ts-api/dist/**
-!gen-go-api/
+.git

.github/FUNDING.yml vendored
View File

@@ -1 +1 @@
-custom: https://goauthentik.io/pricing/
+github: [BeryJu]

View File

@@ -9,7 +9,7 @@ assignees: ""
 **Describe your question/**
 A clear and concise description of what you're trying to do.
-**Relevant info**
+**Relevant infos**
 i.e. Version of other software you're using, specifics of your setup
 **Screenshots**

View File

@@ -9,6 +9,9 @@ inputs:
 runs:
 using: "composite"
 steps:
+- name: Generate config
+id: ev
+uses: ./.github/actions/docker-push-variables
 - name: Find Comment
 uses: peter-evans/find-comment@v2
 id: fc
@@ -54,10 +57,9 @@ runs:
 authentik:
 outposts:
 container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
-global:
-  image:
-    repository: ghcr.io/goauthentik/dev-server
-    tag: ${{ inputs.tag }}
+image:
+  repository: ghcr.io/goauthentik/dev-server
+  tag: ${{ inputs.tag }}
 ```
 For arm64, use these values:
@@ -66,10 +68,9 @@ runs:
 authentik:
 outposts:
 container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
-global:
-  image:
-    repository: ghcr.io/goauthentik/dev-server
-    tag: ${{ inputs.tag }}-arm64
+image:
+  repository: ghcr.io/goauthentik/dev-server
+  tag: ${{ inputs.tag }}-arm64
 ```
 Afterwards, run the upgrade commands from the latest release notes.
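For a docker-compose install, the equivalent of the Helm values above (a hypothetical sketch, not part of this action) is to point the standard AUTHENTIK_IMAGE and AUTHENTIK_TAG variables at the same PR build, then pull and restart as for a normal upgrade:

```
# .env next to docker-compose.yml; values are illustrative
AUTHENTIK_IMAGE=ghcr.io/goauthentik/dev-server
# the gh-<branch>-<timestamp>-<shortHash> tag posted in the PR comment
AUTHENTIK_TAG=gh-my-branch-1700000000-abc1234
```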

View File

@ -1,53 +1,64 @@
---
name: "Prepare docker environment variables" name: "Prepare docker environment variables"
description: "Prepare docker environment variables" description: "Prepare docker environment variables"
inputs:
image-name:
required: true
description: "Docker image prefix"
image-arch:
required: false
description: "Docker image arch"
outputs: outputs:
shouldBuild: shouldBuild:
description: "Whether to build image or not" description: "Whether to build image or not"
value: ${{ steps.ev.outputs.shouldBuild }} value: ${{ steps.ev.outputs.shouldBuild }}
branchName:
description: "Branch name"
value: ${{ steps.ev.outputs.branchName }}
branchNameContainer:
description: "Branch name (for containers)"
value: ${{ steps.ev.outputs.branchNameContainer }}
timestamp:
description: "Timestamp"
value: ${{ steps.ev.outputs.timestamp }}
sha: sha:
description: "sha" description: "sha"
value: ${{ steps.ev.outputs.sha }} value: ${{ steps.ev.outputs.sha }}
shortHash:
description: "shortHash"
value: ${{ steps.ev.outputs.shortHash }}
version: version:
description: "Version" description: "version"
value: ${{ steps.ev.outputs.version }} value: ${{ steps.ev.outputs.version }}
prerelease: versionFamily:
description: "Prerelease" description: "versionFamily"
value: ${{ steps.ev.outputs.prerelease }} value: ${{ steps.ev.outputs.versionFamily }}
imageTags:
description: "Docker image tags"
value: ${{ steps.ev.outputs.imageTags }}
attestImageNames:
description: "Docker image names used for attestation"
value: ${{ steps.ev.outputs.attestImageNames }}
imageMainTag:
description: "Docker image main tag"
value: ${{ steps.ev.outputs.imageMainTag }}
imageMainName:
description: "Docker image main name"
value: ${{ steps.ev.outputs.imageMainName }}
runs: runs:
using: "composite" using: "composite"
steps: steps:
- name: Generate config - name: Generate config
id: ev id: ev
shell: bash shell: python
env:
IMAGE_NAME: ${{ inputs.image-name }}
IMAGE_ARCH: ${{ inputs.image-arch }}
PR_HEAD_SHA: ${{ github.event.pull_request.head.sha }}
run: | run: |
python3 ${{ github.action_path }}/push_vars.py """Helper script to get the actual branch name, docker safe"""
import configparser
import os
from time import time
parser = configparser.ConfigParser()
parser.read(".bumpversion.cfg")
branch_name = os.environ["GITHUB_REF"]
if os.environ.get("GITHUB_HEAD_REF", "") != "":
branch_name = os.environ["GITHUB_HEAD_REF"]
should_build = str(os.environ.get("DOCKER_USERNAME", "") != "").lower()
version = parser.get("bumpversion", "current_version")
version_family = ".".join(version.split(".")[:-1])
safe_branch_name = branch_name.replace("refs/heads/", "").replace("/", "-")
sha = os.environ["GITHUB_SHA"] if not "${{ github.event.pull_request.head.sha }}" else "${{ github.event.pull_request.head.sha }}"
with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print("branchName=%s" % branch_name, file=_output)
print("branchNameContainer=%s" % safe_branch_name, file=_output)
print("timestamp=%s" % int(time()), file=_output)
print("sha=%s" % sha, file=_output)
print("shortHash=%s" % sha[:7], file=_output)
print("shouldBuild=%s" % should_build, file=_output)
print("version=%s" % version, file=_output)
print("versionFamily=%s" % version_family, file=_output)

View File

@ -1,74 +0,0 @@
"""Helper script to get the actual branch name, docker safe"""
import configparser
import os
from time import time
parser = configparser.ConfigParser()
parser.read(".bumpversion.cfg")
should_build = str(len(os.environ.get("DOCKER_USERNAME", "")) > 0).lower()
branch_name = os.environ["GITHUB_REF"]
if os.environ.get("GITHUB_HEAD_REF", "") != "":
branch_name = os.environ["GITHUB_HEAD_REF"]
safe_branch_name = branch_name.replace("refs/heads/", "").replace("/", "-").replace("'", "-")
image_names = os.getenv("IMAGE_NAME").split(",")
image_arch = os.getenv("IMAGE_ARCH") or None
is_pull_request = bool(os.getenv("PR_HEAD_SHA"))
is_release = "dev" not in image_names[0]
sha = os.environ["GITHUB_SHA"] if not is_pull_request else os.getenv("PR_HEAD_SHA")
# 2042.1.0 or 2042.1.0-rc1
version = parser.get("bumpversion", "current_version")
# 2042.1
version_family = ".".join(version.split("-", 1)[0].split(".")[:-1])
prerelease = "-" in version
image_tags = []
if is_release:
for name in image_names:
image_tags += [
f"{name}:{version}",
]
if not prerelease:
image_tags += [
f"{name}:latest",
f"{name}:{version_family}",
]
else:
suffix = ""
if image_arch and image_arch != "amd64":
suffix = f"-{image_arch}"
for name in image_names:
image_tags += [
f"{name}:gh-{sha}{suffix}", # Used for ArgoCD and PR comments
f"{name}:gh-{safe_branch_name}{suffix}", # For convenience
f"{name}:gh-{safe_branch_name}-{int(time())}-{sha[:7]}{suffix}", # Use by FluxCD
]
image_main_tag = image_tags[0].split(":")[-1]
def get_attest_image_names(image_with_tags: list[str]):
"""Attestation only for GHCR"""
image_tags = []
for image_name in set(name.split(":")[0] for name in image_with_tags):
if not image_name.startswith("ghcr.io"):
continue
image_tags.append(image_name)
return ",".join(set(image_tags))
with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print(f"shouldBuild={should_build}", file=_output)
print(f"sha={sha}", file=_output)
print(f"version={version}", file=_output)
print(f"prerelease={prerelease}", file=_output)
print(f"imageTags={','.join(image_tags)}", file=_output)
print(f"attestImageNames={get_attest_image_names(image_tags)}", file=_output)
print(f"imageMainTag={image_main_tag}", file=_output)
print(f"imageMainName={image_tags[0]}", file=_output)

View File

@ -1,7 +0,0 @@
#!/bin/bash -x
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
GITHUB_OUTPUT=/dev/stdout \
GITHUB_REF=ref \
GITHUB_SHA=sha \
IMAGE_NAME=ghcr.io/goauthentik/server,beryju/authentik \
python $SCRIPT_DIR/push_vars.py
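For orientation (not part of the diff): running this harness from a checkout of the newer branch, with no DOCKER_USERNAME set, would append output to stdout roughly like the following; the version and sha values are illustrative and come from .bumpversion.cfg and the stubbed GITHUB_SHA above:

```
shouldBuild=false
sha=sha
version=2024.10.2
prerelease=False
imageTags=ghcr.io/goauthentik/server:2024.10.2,ghcr.io/goauthentik/server:latest,ghcr.io/goauthentik/server:2024.10,beryju/authentik:2024.10.2,beryju/authentik:latest,beryju/authentik:2024.10
attestImageNames=ghcr.io/goauthentik/server
imageMainTag=2024.10.2
imageMainName=ghcr.io/goauthentik/server:2024.10.2
```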

View File

@@ -2,39 +2,36 @@ name: "Setup authentik testing environment"
 description: "Setup authentik testing environment"
 inputs:
-postgresql_version:
+postgresql_tag:
 description: "Optional postgresql image tag"
-default: "16"
+default: "12"
 runs:
 using: "composite"
 steps:
-- name: Install poetry & deps
+- name: Install poetry
 shell: bash
 run: |
 pipx install poetry || true
-sudo apt-get update
-sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext libkrb5-dev krb5-kdc krb5-user krb5-admin-server
+sudo apt update
+sudo apt install -y libpq-dev openssl libxmlsec1-dev pkg-config gettext
 - name: Setup python and restore poetry
-uses: actions/setup-python@v5
+uses: actions/setup-python@v3
 with:
-python-version-file: "pyproject.toml"
+python-version: "3.11"
 cache: "poetry"
 - name: Setup node
-uses: actions/setup-node@v4
+uses: actions/setup-node@v3
 with:
-node-version-file: web/package.json
+node-version: "20.5"
 cache: "npm"
 cache-dependency-path: web/package-lock.json
-- name: Setup go
-uses: actions/setup-go@v5
-with:
-go-version-file: "go.mod"
 - name: Setup dependencies
 shell: bash
 run: |
-export PSQL_TAG=${{ inputs.postgresql_version }}
-docker compose -f .github/actions/setup/docker-compose.yml up -d
+export PSQL_TAG=${{ inputs.postgresql_tag }}
+docker-compose -f .github/actions/setup/docker-compose.yml up -d
+poetry env use python3.11
 poetry install
 cd web && npm ci
 - name: Generate config

View File

@@ -1,6 +1,8 @@
+version: "3.7"
 services:
 postgresql:
-image: docker.io/library/postgres:${PSQL_TAG:-16}
+image: docker.io/library/postgres:${PSQL_TAG:-12}
 volumes:
 - db-data:/var/lib/postgresql/data
 environment:

.github/codecov.yml vendored
View File

@@ -6,5 +6,5 @@ coverage:
 # adjust accordingly based on how flaky your tests are
 # this allows a 1% drop from the previous base commit coverage
 threshold: 1%
-comment:
+notify:
 after_n_builds: 3

View File

@@ -2,6 +2,3 @@ keypair
 keypairs
 hass
 warmup
-ontext
-singed
-assertIn

View File

@@ -21,9 +21,7 @@ updates:
 labels:
 - dependencies
 - package-ecosystem: npm
-directories:
-- "/web"
-- "/web/sfe"
+directory: "/web"
 schedule:
 interval: daily
 time: "04:00"
@@ -36,18 +34,15 @@ updates:
 sentry:
 patterns:
 - "@sentry/*"
-- "@spotlightjs/*"
 babel:
 patterns:
 - "@babel/*"
 - "babel-*"
 eslint:
 patterns:
-- "@eslint/*"
-- "@typescript-eslint/*"
-- "eslint-*"
+- "@typescript-eslint/eslint-*"
 - "eslint"
-- "typescript-eslint"
+- "eslint-*"
 storybook:
 patterns:
 - "@storybook/*"
@@ -55,19 +50,6 @@ updates:
 esbuild:
 patterns:
 - "@esbuild/*"
-- "esbuild*"
-rollup:
-patterns:
-- "@rollup/*"
-- "rollup-*"
-- "rollup*"
-swc:
-patterns:
-- "@swc/*"
-- "swc-*"
-wdio:
-patterns:
-- "@wdio/*"
 - package-ecosystem: npm
 directory: "/website"
 schedule:

View File

@@ -1,7 +1,7 @@
 <!--
 👋 Hi there! Welcome.
-Please check the Contributing guidelines: https://docs.goauthentik.io/docs/developer-docs/#how-can-i-contribute
+Please check the Contributing guidelines: https://goauthentik.io/developer-docs/#how-can-i-contribute
 -->
 ## Details
@@ -27,6 +27,7 @@ If an API change has been made
 If changes to the frontend have been made
 - [ ] The code has been formatted (`make web`)
+- [ ] The translation files have been updated (`make i18n-extract`)
 If applicable

View File

@ -1,65 +0,0 @@
name: authentik-api-py-publish
on:
push:
branches: [main]
paths:
- "schema.yml"
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
permissions:
id-token: write
steps:
- id: generate_token
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Install poetry & deps
shell: bash
run: |
pipx install poetry || true
sudo apt-get update
sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext
- name: Setup python and restore poetry
uses: actions/setup-python@v5
with:
python-version-file: "pyproject.toml"
cache: "poetry"
- name: Generate API Client
run: make gen-client-py
- name: Publish package
working-directory: gen-py-api/
run: |
poetry build
- name: Publish package to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
packages-dir: gen-py-api/dist/
# We can't easily upgrade the API client being used due to poetry being poetry
# so we'll have to rely on dependabot
# - name: Upgrade /
# run: |
# export VERSION=$(cd gen-py-api && poetry version -s)
# poetry add "authentik_client=$VERSION" --allow-prereleases --lock
# - uses: peter-evans/create-pull-request@v6
# id: cpr
# with:
# token: ${{ steps.generate_token.outputs.token }}
# branch: update-root-api-client
# commit-message: "root: bump API Client version"
# title: "root: bump API Client version"
# body: "root: bump API Client version"
# delete-branch: true
# signoff: true
# # ID from https://api.github.com/users/authentik-automation[bot]
# author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
# - uses: peter-evans/enable-pull-request-automerge@v3
# with:
# token: ${{ steps.generate_token.outputs.token }}
# pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
# merge-method: squash

View File

@ -1,4 +1,3 @@
---
name: authentik-ci-main name: authentik-ci-main
on: on:
@ -7,6 +6,8 @@ on:
- main - main
- next - next
- version-* - version-*
paths-ignore:
- website
pull_request: pull_request:
branches: branches:
- main - main
@ -26,11 +27,14 @@ jobs:
- bandit - bandit
- black - black
- codespell - codespell
- isort
- pending-migrations - pending-migrations
- pylint
- pyright
- ruff - ruff
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: run job - name: run job
@ -38,39 +42,31 @@ jobs:
test-migrations: test-migrations:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: run migrations - name: run migrations
run: poetry run python -m lifecycle.migrate run: poetry run python -m lifecycle.migrate
test-migrations-from-stable: test-migrations-from-stable:
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest runs-on: ubuntu-latest
strategy: continue-on-error: true
fail-fast: false
matrix:
psql:
- 15-alpine
- 16-alpine
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
fetch-depth: 0 fetch-depth: 0
- name: Setup authentik env
uses: ./.github/actions/setup
- name: checkout stable - name: checkout stable
run: | run: |
# Delete all poetry envs
rm -rf /home/runner/.cache/pypoetry
# Copy current, latest config to local # Copy current, latest config to local
cp authentik/lib/default.yml local.env.yml cp authentik/lib/default.yml local.env.yml
cp -R .github .. cp -R .github ..
cp -R scripts .. cp -R scripts ..
git checkout $(git tag --sort=version:refname | grep '^version/' | grep -vE -- '-rc[0-9]+$' | tail -n1) git checkout $(git describe --tags $(git rev-list --tags --max-count=1))
rm -rf .github/ scripts/ rm -rf .github/ scripts/
mv ../.github ../scripts . mv ../.github ../scripts .
- name: Setup authentik env (stable) - name: Setup authentik env (ensure stable deps are installed)
uses: ./.github/actions/setup uses: ./.github/actions/setup
with:
postgresql_version: ${{ matrix.psql }}
- name: run migrations to stable - name: run migrations to stable
run: poetry run python -m lifecycle.migrate run: poetry run python -m lifecycle.migrate
- name: checkout current code - name: checkout current code
@ -80,21 +76,11 @@ jobs:
git reset --hard HEAD git reset --hard HEAD
git clean -d -fx . git clean -d -fx .
git checkout $GITHUB_SHA git checkout $GITHUB_SHA
# Delete previous poetry env poetry install
rm -rf /home/runner/.cache/pypoetry/virtualenvs/*
- name: Setup authentik env (ensure latest deps are installed) - name: Setup authentik env (ensure latest deps are installed)
uses: ./.github/actions/setup uses: ./.github/actions/setup
with:
postgresql_version: ${{ matrix.psql }}
- name: migrate to latest - name: migrate to latest
run: | run: poetry run python -m lifecycle.migrate
poetry run python -m lifecycle.migrate
- name: run tests
env:
# Test in the main database that we just migrated from the previous stable version
AUTHENTIK_POSTGRESQL__TEST__NAME: authentik
run: |
poetry run make test
test-unittest: test-unittest:
name: test-unittest - PostgreSQL ${{ matrix.psql }} name: test-unittest - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest runs-on: ubuntu-latest
@ -103,53 +89,39 @@ jobs:
fail-fast: false fail-fast: false
matrix: matrix:
psql: psql:
- 12-alpine
- 15-alpine - 15-alpine
- 16-alpine
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
with: with:
postgresql_version: ${{ matrix.psql }} postgresql_tag: ${{ matrix.psql }}
- name: run unittest - name: run unittest
run: | run: |
poetry run make test poetry run make test
poetry run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v3
with: with:
flags: unit flags: unit
token: ${{ secrets.CODECOV_TOKEN }}
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: unit
file: unittest.xml
token: ${{ secrets.CODECOV_TOKEN }}
test-integration: test-integration:
runs-on: ubuntu-latest runs-on: ubuntu-latest
timeout-minutes: 30 timeout-minutes: 30
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: Create k8s Kind Cluster - name: Create k8s Kind Cluster
uses: helm/kind-action@v1.10.0 uses: helm/kind-action@v1.8.0
- name: run integration - name: run integration
run: | run: |
poetry run coverage run manage.py test tests/integration poetry run coverage run manage.py test tests/integration
poetry run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v3
with: with:
flags: integration flags: integration
token: ${{ secrets.CODECOV_TOKEN }}
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: integration
file: unittest.xml
token: ${{ secrets.CODECOV_TOKEN }}
test-e2e: test-e2e:
name: test-e2e (${{ matrix.job.name }}) name: test-e2e (${{ matrix.job.name }})
runs-on: ubuntu-latest runs-on: ubuntu-latest
@ -170,19 +142,17 @@ jobs:
glob: tests/e2e/test_provider_ldap* tests/e2e/test_source_ldap* glob: tests/e2e/test_provider_ldap* tests/e2e/test_source_ldap*
- name: radius - name: radius
glob: tests/e2e/test_provider_radius* glob: tests/e2e/test_provider_radius*
- name: scim
glob: tests/e2e/test_source_scim*
- name: flows - name: flows
glob: tests/e2e/test_flows* glob: tests/e2e/test_flows*
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: Setup e2e env (chrome, etc) - name: Setup e2e env (chrome, etc)
run: | run: |
docker compose -f tests/e2e/docker-compose.yml up -d --quiet-pull docker-compose -f tests/e2e/docker-compose.yml up -d
- id: cache-web - id: cache-web
uses: actions/cache@v4 uses: actions/cache@v3
with: with:
path: web/dist path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**') }} key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**') }}
@ -198,16 +168,9 @@ jobs:
poetry run coverage run manage.py test ${{ matrix.job.glob }} poetry run coverage run manage.py test ${{ matrix.job.glob }}
poetry run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v3
with: with:
flags: e2e flags: e2e
token: ${{ secrets.CODECOV_TOKEN }}
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: e2e
file: unittest.xml
token: ${{ secrets.CODECOV_TOKEN }}
ci-core-mark: ci-core-mark:
needs: needs:
- lint - lint
@ -220,90 +183,93 @@ jobs:
steps: steps:
- run: echo mark - run: echo mark
build: build:
strategy:
fail-fast: false
matrix:
arch:
- amd64
- arm64
needs: ci-core-mark needs: ci-core-mark
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload contianer images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
timeout-minutes: 120 timeout-minutes: 120
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v2.2.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v2
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-server
image-arch: ${{ matrix.arch }}
- name: Login to Container Registry - name: Login to Container Registry
uses: docker/login-action@v2
if: ${{ steps.ev.outputs.shouldBuild == 'true' }} if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
uses: docker/login-action@v3
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: generate ts client
run: make gen-client-ts
- name: Build Docker Image - name: Build Docker Image
uses: docker/build-push-action@v6 uses: docker/build-push-action@v4
id: push
with: with:
context: .
secrets: | secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }} GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }} GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
tags: ${{ steps.ev.outputs.imageTags }}
push: ${{ steps.ev.outputs.shouldBuild == 'true' }} push: ${{ steps.ev.outputs.shouldBuild == 'true' }}
tags: |
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.sha }}
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}
build-args: | build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }} GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache VERSION=${{ steps.ev.outputs.version }}
cache-to: ${{ steps.ev.outputs.shouldBuild == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache,mode=max' || '' }} VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/${{ matrix.arch }} - name: Comment on PR
- uses: actions/attest-build-provenance@v1 if: github.event_name == 'pull_request'
id: attest continue-on-error: true
if: ${{ steps.ev.outputs.shouldBuild == 'true' }} uses: ./.github/actions/comment-pr-instructions
with: with:
subject-name: ${{ steps.ev.outputs.attestImageNames }} tag: gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}
subject-digest: ${{ steps.push.outputs.digest }} build-arm64:
push-to-registry: true needs: ci-core-mark
pr-comment:
needs:
- build
runs-on: ubuntu-latest runs-on: ubuntu-latest
if: ${{ github.event_name == 'pull_request' }}
permissions: permissions:
# Needed to write comments on PRs # Needed to upload contianer images to ghcr.io
pull-requests: write packages: write
timeout-minutes: 120 timeout-minutes: 120
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v2.2.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with: - name: Login to Container Registry
image-name: ghcr.io/goauthentik/dev-server uses: docker/login-action@v2
- name: Comment on PR
if: ${{ steps.ev.outputs.shouldBuild == 'true' }} if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
uses: ./.github/actions/comment-pr-instructions
with: with:
tag: ${{ steps.ev.outputs.imageMainTag }} registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image
uses: docker/build-push-action@v4
with:
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
push: ${{ steps.ev.outputs.shouldBuild == 'true' }}
tags: |
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-arm64
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.sha }}-arm64
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}-arm64
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/arm64

View File

@ -1,4 +1,3 @@
---
name: authentik-ci-outpost name: authentik-ci-outpost
on: on:
@ -16,8 +15,8 @@ jobs:
lint-golint: lint-golint:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Prepare and generate API - name: Prepare and generate API
@ -29,20 +28,18 @@ jobs:
- name: Generate API - name: Generate API
run: make gen-client-go run: make gen-client-go
- name: golangci-lint - name: golangci-lint
uses: golangci/golangci-lint-action@v6 uses: golangci/golangci-lint-action@v3
with: with:
version: latest version: v1.52.2
args: --timeout 5000s --verbose args: --timeout 5000s --verbose
skip-cache: true skip-pkg-cache: true
test-unittest: test-unittest:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Generate API - name: Generate API
run: make gen-client-go run: make gen-client-go
- name: Go unittests - name: Go unittests
@ -66,32 +63,26 @@ jobs:
- proxy - proxy
- ldap - ldap
- radius - radius
- rac
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload contianer images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v2.2.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v2
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-${{ matrix.type }}
- name: Login to Container Registry - name: Login to Container Registry
uses: docker/login-action@v2
if: ${{ steps.ev.outputs.shouldBuild == 'true' }} if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
uses: docker/login-action@v3
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
@ -99,25 +90,19 @@ jobs:
- name: Generate API - name: Generate API
run: make gen-client-go run: make gen-client-go
- name: Build Docker Image - name: Build Docker Image
id: push uses: docker/build-push-action@v4
uses: docker/build-push-action@v6
with: with:
tags: ${{ steps.ev.outputs.imageTags }}
file: ${{ matrix.type }}.Dockerfile
push: ${{ steps.ev.outputs.shouldBuild == 'true' }} push: ${{ steps.ev.outputs.shouldBuild == 'true' }}
tags: |
ghcr.io/goauthentik/dev-${{ matrix.type }}:gh-${{ steps.ev.outputs.branchNameContainer }}
ghcr.io/goauthentik/dev-${{ matrix.type }}:gh-${{ steps.ev.outputs.sha }}
file: ${{ matrix.type }}.Dockerfile
build-args: | build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }} GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
context: . context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type }}:buildcache
cache-to: ${{ steps.ev.outputs.shouldBuild == 'true' && format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max', matrix.type) || '' }}
- uses: actions/attest-build-provenance@v1
id: attest
if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-binary: build-binary:
timeout-minutes: 120 timeout-minutes: 120
needs: needs:
@ -130,19 +115,18 @@ jobs:
- proxy - proxy
- ldap - ldap
- radius - radius
- rac
goos: [linux] goos: [linux]
goarch: [amd64, arm64] goarch: [amd64, arm64]
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- uses: actions/setup-node@v4 - uses: actions/setup-node@v3
with: with:
node-version-file: web/package.json node-version: "20.5"
cache: "npm" cache: "npm"
cache-dependency-path: web/package-lock.json cache-dependency-path: web/package-lock.json
- name: Generate API - name: Generate API

View File

@ -12,45 +12,93 @@ on:
- version-* - version-*
jobs: jobs:
lint: lint-eslint:
runs-on: ubuntu-latest runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
command:
- lint
- lint:lockfile
- tsc
- prettier-check
project:
- web
include:
- command: tsc
project: web
- command: lit-analyse
project: web
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-node@v4 - uses: actions/setup-node@v3
with: with:
node-version-file: ${{ matrix.project }}/package.json node-version: "20.5"
cache: "npm" cache: "npm"
cache-dependency-path: ${{ matrix.project }}/package-lock.json cache-dependency-path: web/package-lock.json
- working-directory: ${{ matrix.project }}/ - working-directory: web/
run: | run: npm ci
npm ci
- name: Generate API - name: Generate API
run: make gen-client-ts run: make gen-client-ts
- name: Lint - name: Eslint
working-directory: ${{ matrix.project }}/ working-directory: web/
run: npm run ${{ matrix.command }} run: npm run lint
build: lint-build:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-node@v4 - uses: actions/setup-node@v3
with: with:
node-version-file: web/package.json node-version: "20.5"
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
- name: Generate API
run: make gen-client-ts
- name: TSC
working-directory: web/
run: npm run tsc
lint-prettier:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v3
with:
node-version: "20.5"
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
- name: Generate API
run: make gen-client-ts
- name: prettier
working-directory: web/
run: npm run prettier-check
lint-lit-analyse:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v3
with:
node-version: "20.5"
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: |
npm ci
# lit-analyse doesn't understand path rewrites, so make it
# belive it's an actual module
cd node_modules/@goauthentik
ln -s ../../src/ web
- name: Generate API
run: make gen-client-ts
- name: lit-analyse
working-directory: web/
run: npm run lit-analyse
ci-web-mark:
needs:
- lint-eslint
- lint-prettier
- lint-lit-analyse
- lint-build
runs-on: ubuntu-latest
steps:
- run: echo mark
build:
needs:
- ci-web-mark
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v3
with:
node-version: "20.5"
cache: "npm" cache: "npm"
cache-dependency-path: web/package-lock.json cache-dependency-path: web/package-lock.json
- working-directory: web/ - working-directory: web/
@ -60,28 +108,3 @@ jobs:
- name: build - name: build
working-directory: web/ working-directory: web/
run: npm run build run: npm run build
ci-web-mark:
needs:
- build
- lint
runs-on: ubuntu-latest
steps:
- run: echo mark
test:
needs:
- ci-web-mark
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
- name: Generate API
run: make gen-client-ts
- name: test
working-directory: web/
run: npm run test || exit 0

View File

@@ -12,28 +12,27 @@ on:
 - version-*
 jobs:
-lint:
+lint-prettier:
 runs-on: ubuntu-latest
-strategy:
-fail-fast: false
-matrix:
-command:
-- lint:lockfile
-- prettier-check
 steps:
 - uses: actions/checkout@v4
+- uses: actions/setup-node@v3
+with:
+node-version: "20.5"
+cache: "npm"
+cache-dependency-path: website/package-lock.json
 - working-directory: website/
 run: npm ci
-- name: Lint
+- name: prettier
 working-directory: website/
-run: npm run ${{ matrix.command }}
+run: npm run prettier-check
 test:
 runs-on: ubuntu-latest
 steps:
 - uses: actions/checkout@v4
-- uses: actions/setup-node@v4
+- uses: actions/setup-node@v3
 with:
-node-version-file: website/package.json
+node-version: "20.5"
 cache: "npm"
 cache-dependency-path: website/package-lock.json
 - working-directory: website/
@@ -49,11 +48,12 @@ jobs:
 matrix:
 job:
 - build
+- build-docs-only
 steps:
 - uses: actions/checkout@v4
-- uses: actions/setup-node@v4
+- uses: actions/setup-node@v3
 with:
-node-version-file: website/package.json
+node-version: "20.5"
 cache: "npm"
 cache-dependency-path: website/package-lock.json
 - working-directory: website/
@@ -63,7 +63,7 @@ jobs:
 run: npm run ${{ matrix.job }}
 ci-website-mark:
 needs:
-- lint
+- lint-prettier
 - test
 - build
 runs-on: ubuntu-latest

View File

@@ -23,14 +23,14 @@
 language: ["go", "javascript", "python"]
 steps:
 - name: Checkout repository
-uses: actions/checkout@v4
+uses: actions/checkout@v3
 - name: Setup authentik env
 uses: ./.github/actions/setup
 - name: Initialize CodeQL
-uses: github/codeql-action/init@v3
+uses: github/codeql-action/init@v2
 with:
 languages: ${{ matrix.language }}
 - name: Autobuild
-uses: github/codeql-action/autobuild@v3
+uses: github/codeql-action/autobuild@v2
 - name: Perform CodeQL Analysis
-uses: github/codeql-action/analyze@v3
+uses: github/codeql-action/analyze@v2

View File

@ -1,43 +0,0 @@
name: authentik-gen-update-webauthn-mds
on:
workflow_dispatch:
schedule:
- cron: '30 1 1,15 * *'
env:
POSTGRES_DB: authentik
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
jobs:
build:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env
uses: ./.github/actions/setup
- run: poetry run ak update_webauthn_mds
- uses: peter-evans/create-pull-request@v7
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
branch: update-fido-mds-client
commit-message: "stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs"
title: "stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs"
body: "stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs"
delete-branch: true
signoff: true
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
- uses: peter-evans/enable-pull-request-automerge@v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
merge-method: squash

View File

@@ -6,16 +6,12 @@ on:
 types:
 - closed
-permissions:
-# Permission to delete cache
-actions: write
 jobs:
 cleanup:
 runs-on: ubuntu-latest
 steps:
 - name: Check out code
-uses: actions/checkout@v4
+uses: actions/checkout@v3
 - name: Cleanup
 run: |

View File

@@ -1,8 +1,8 @@
 name: ghcr-retention
 on:
-# schedule:
-# - cron: "0 0 * * *" # every day at midnight
+schedule:
+- cron: "0 0 * * *" # every day at midnight
 workflow_dispatch:
 jobs:
@@ -11,7 +11,7 @@ jobs:
 runs-on: ubuntu-latest
 steps:
 - id: generate_token
-uses: tibdex/github-app-token@v2
+uses: tibdex/github-app-token@v1
 with:
 app_id: ${{ secrets.GH_APP_ID }}
 private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}

View File

@@ -29,11 +29,11 @@
 github.event.pull_request.head.repo.full_name == github.repository)
 steps:
 - id: generate_token
-uses: tibdex/github-app-token@v2
+uses: tibdex/github-app-token@v1
 with:
 app_id: ${{ secrets.GH_APP_ID }}
 private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
-- uses: actions/checkout@v4
+- uses: actions/checkout@v3
 with:
 token: ${{ steps.generate_token.outputs.token }}
 - name: Compress images
@@ -42,7 +42,7 @@
 with:
 githubToken: ${{ steps.generate_token.outputs.token }}
 compressOnly: ${{ github.event_name != 'pull_request' }}
-- uses: peter-evans/create-pull-request@v7
+- uses: peter-evans/create-pull-request@v5
 if: "${{ github.event_name != 'pull_request' && steps.compress.outputs.markdown != '' }}"
 id: cpr
 with:

View File

@@ -15,7 +15,7 @@
 runs-on: ubuntu-latest
 timeout-minutes: 120
 steps:
-- uses: actions/checkout@v4
+- uses: actions/checkout@v3
 - name: Setup authentik env
 uses: ./.github/actions/setup
 - name: generate docs

View File

@@ -14,7 +14,7 @@
 runs-on: ubuntu-latest
 environment: internal-production
 steps:
-- uses: actions/checkout@v4
+- uses: actions/checkout@v3
 with:
 ref: main
 - run: |

View File

@ -1,4 +1,3 @@
---
name: authentik-on-release name: authentik-on-release
on: on:
@ -11,64 +10,49 @@ jobs:
permissions: permissions:
# Needed to upload contianer images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v2.2.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v2
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/server,beryju/authentik
- name: Docker Login Registry - name: Docker Login Registry
uses: docker/login-action@v3 uses: docker/login-action@v2
with: with:
username: ${{ secrets.DOCKER_USERNAME }} username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }} password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
uses: docker/login-action@v3 uses: docker/login-action@v2
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: make empty clients
run: |
mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api
- name: Build Docker Image - name: Build Docker Image
uses: docker/build-push-action@v6 uses: docker/build-push-action@v4
id: push
with: with:
context: . push: ${{ github.event_name == 'release' }}
push: true
secrets: | secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }} GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }} GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
build-args: | tags: |
VERSION=${{ github.ref }} beryju/authentik:${{ steps.ev.outputs.version }},
tags: ${{ steps.ev.outputs.imageTags }} beryju/authentik:${{ steps.ev.outputs.versionFamily }},
beryju/authentik:latest,
ghcr.io/goauthentik/server:${{ steps.ev.outputs.version }},
ghcr.io/goauthentik/server:${{ steps.ev.outputs.versionFamily }},
ghcr.io/goauthentik/server:latest
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
- uses: actions/attest-build-provenance@v1 build-args: |
id: attest VERSION=${{ steps.ev.outputs.version }}
with: VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-outpost: build-outpost:
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload contianer images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
strategy: strategy:
fail-fast: false fail-fast: false
matrix: matrix:
@ -76,55 +60,45 @@ jobs:
- proxy - proxy
- ldap - ldap
- radius - radius
- rac
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v2.2.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v2
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/${{ matrix.type }},beryju/authentik-${{ matrix.type }}
- name: make empty clients
run: |
mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api
- name: Docker Login Registry - name: Docker Login Registry
uses: docker/login-action@v3 uses: docker/login-action@v2
with: with:
username: ${{ secrets.DOCKER_USERNAME }} username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }} password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry - name: Login to GitHub Container Registry
uses: docker/login-action@v3 uses: docker/login-action@v2
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image - name: Build Docker Image
uses: docker/build-push-action@v6 uses: docker/build-push-action@v4
id: push
with: with:
push: true push: ${{ github.event_name == 'release' }}
build-args: | tags: |
VERSION=${{ github.ref }} beryju/authentik-${{ matrix.type }}:${{ steps.ev.outputs.version }},
tags: ${{ steps.ev.outputs.imageTags }} beryju/authentik-${{ matrix.type }}:${{ steps.ev.outputs.versionFamily }},
beryju/authentik-${{ matrix.type }}:latest,
ghcr.io/goauthentik/${{ matrix.type }}:${{ steps.ev.outputs.version }},
ghcr.io/goauthentik/${{ matrix.type }}:${{ steps.ev.outputs.versionFamily }},
ghcr.io/goauthentik/${{ matrix.type }}:latest
file: ${{ matrix.type }}.Dockerfile file: ${{ matrix.type }}.Dockerfile
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
context: . build-args: |
- uses: actions/attest-build-provenance@v1 VERSION=${{ steps.ev.outputs.version }}
id: attest VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-outpost-binary: build-outpost-binary:
timeout-minutes: 120 timeout-minutes: 120
runs-on: ubuntu-latest runs-on: ubuntu-latest
@ -141,13 +115,13 @@ jobs:
goos: [linux, darwin] goos: [linux, darwin]
goarch: [amd64, arm64] goarch: [amd64, arm64]
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- uses: actions/setup-node@v4 - uses: actions/setup-node@v3
with: with:
node-version-file: web/package.json node-version: "20.5"
cache: "npm" cache: "npm"
cache-dependency-path: web/package-lock.json cache-dependency-path: web/package-lock.json
- name: Build web - name: Build web
@ -176,15 +150,15 @@ jobs:
- build-outpost-binary - build-outpost-binary
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: Run test suite in final docker images - name: Run test suite in final docker images
run: | run: |
echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env echo "PG_PASS=$(openssl rand -base64 32)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 32)" >> .env
docker compose pull -q docker-compose pull -q
docker compose up --no-start docker-compose up --no-start
docker compose start postgresql redis docker-compose start postgresql redis
docker compose run -u root server test-all docker-compose run -u root server test-all
sentry-release: sentry-release:
needs: needs:
- build-server - build-server
@ -192,22 +166,19 @@ jobs:
- build-outpost-binary - build-outpost-binary
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v3
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/server
- name: Get static files from docker image - name: Get static files from docker image
run: | run: |
docker pull ${{ steps.ev.outputs.imageMainName }} docker pull ghcr.io/goauthentik/server:latest
container=$(docker container create ${{ steps.ev.outputs.imageMainName }}) container=$(docker container create ghcr.io/goauthentik/server:latest)
docker cp ${container}:web/ . docker cp ${container}:web/ .
- name: Create a Sentry.io release - name: Create a Sentry.io release
uses: getsentry/action-release@v1 uses: getsentry/action-release@v1
continue-on-error: true continue-on-error: true
if: ${{ github.event_name == 'release' }}
env: env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }} SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_ORG: authentik-security-inc SENTRY_ORG: authentik-security-inc

View File

@@ -1,4 +1,3 @@
----
 name: authentik-on-tag
 on:
@@ -11,31 +10,30 @@ jobs:
 name: Create Release from Tag
 runs-on: ubuntu-latest
 steps:
-- uses: actions/checkout@v4
+- uses: actions/checkout@v3
 - name: Pre-release test
 run: |
-echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env
-echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env
+echo "PG_PASS=$(openssl rand -base64 32)" >> .env
+echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 32)" >> .env
 docker buildx install
-mkdir -p ./gen-ts-api
 docker build -t testing:latest .
 echo "AUTHENTIK_IMAGE=testing" >> .env
 echo "AUTHENTIK_TAG=latest" >> .env
-docker compose up --no-start
-docker compose start postgresql redis
-docker compose run -u root server test-all
+docker-compose up --no-start
+docker-compose start postgresql redis
+docker-compose run -u root server test-all
 - id: generate_token
-uses: tibdex/github-app-token@v2
+uses: tibdex/github-app-token@v1
 with:
 app_id: ${{ secrets.GH_APP_ID }}
 private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
-- name: prepare variables
-uses: ./.github/actions/docker-push-variables
-id: ev
-env:
-DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
+- name: Extract version number
+id: get_version
+uses: actions/github-script@v6
 with:
-image-name: ghcr.io/goauthentik/server
+github-token: ${{ steps.generate_token.outputs.token }}
+script: |
+return context.payload.ref.replace(/\/refs\/tags\/version\//, '');
 - name: Create Release
 id: create_release
 uses: actions/create-release@v1.1.4
@@ -43,6 +41,6 @@ jobs:
 GITHUB_TOKEN: ${{ steps.generate_token.outputs.token }}
 with:
 tag_name: ${{ github.ref }}
-release_name: Release ${{ steps.ev.outputs.version }}
+release_name: Release ${{ steps.get_version.outputs.result }}
 draft: true
-prerelease: ${{ steps.ev.outputs.prerelease == 'true' }}
+prerelease: false

View File

@@ -14,16 +14,16 @@
 runs-on: ubuntu-latest
 steps:
 - id: generate_token
-uses: tibdex/github-app-token@v2
+uses: tibdex/github-app-token@v1
 with:
 app_id: ${{ secrets.GH_APP_ID }}
 private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
-- uses: actions/stale@v9
+- uses: actions/stale@v8
 with:
 repo-token: ${{ steps.generate_token.outputs.token }}
 days-before-stale: 60
 days-before-close: 7
-exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question,status/reviewing
+exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question
 stale-issue-label: wontfix
 stale-issue-message: >
 This issue has been automatically marked as stale because it has not had


@ -7,26 +7,21 @@ on:
paths: paths:
- "!**" - "!**"
- "locale/**" - "locale/**"
- "!locale/en/**" - "web/src/locales/**"
- "web/xliff/**"
permissions:
# Permission to write comment
pull-requests: write
jobs: jobs:
post-comment: post-comment:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- name: Find Comment - name: Find Comment
uses: peter-evans/find-comment@v3 uses: peter-evans/find-comment@v2
id: fc id: fc
with: with:
issue-number: ${{ github.event.pull_request.number }} issue-number: ${{ github.event.pull_request.number }}
comment-author: "github-actions[bot]" comment-author: "github-actions[bot]"
body-includes: authentik translations instructions body-includes: authentik translations instructions
- name: Create or update comment - name: Create or update comment
uses: peter-evans/create-or-update-comment@v4 uses: peter-evans/create-or-update-comment@v3
with: with:
comment-id: ${{ steps.fc.outputs.comment-id }} comment-id: ${{ steps.fc.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }} issue-number: ${{ github.event.pull_request.number }}


@ -1,8 +1,9 @@
--- name: authentik-backend-translate-compile
name: authentik-backend-translate-extract-compile
on: on:
schedule: push:
- cron: "0 0 * * *" # every day at midnight branches: [main]
paths:
- "locale/**"
workflow_dispatch: workflow_dispatch:
env: env:
@ -15,29 +16,25 @@ jobs:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- id: generate_token - id: generate_token
uses: tibdex/github-app-token@v2 uses: tibdex/github-app-token@v1
with: with:
app_id: ${{ secrets.GH_APP_ID }} app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }} private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: run extract
run: |
poetry run make i18n-extract
- name: run compile - name: run compile
run: | run: poetry run ak compilemessages
poetry run ak compilemessages
make web-check-compile
- name: Create Pull Request - name: Create Pull Request
uses: peter-evans/create-pull-request@v7 uses: peter-evans/create-pull-request@v5
id: cpr
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
branch: extract-compile-backend-translation branch: compile-backend-translation
commit-message: "core, web: update translations" commit-message: "core: compile backend translations"
title: "core, web: update translations" title: "core: compile backend translations"
body: "core, web: update translations" body: "core: compile backend translations"
delete-branch: true delete-branch: true
signoff: true signoff: true


@ -6,17 +6,13 @@ on:
pull_request: pull_request:
types: [opened, reopened] types: [opened, reopened]
permissions:
# Permission to rename PR
pull-requests: write
jobs: jobs:
rename_pr: rename_pr:
runs-on: ubuntu-latest runs-on: ubuntu-latest
if: ${{ github.event.pull_request.user.login == 'transifex-integration[bot]'}} if: ${{ github.event.pull_request.user.login == 'transifex-integration[bot]'}}
steps: steps:
- id: generate_token - id: generate_token
uses: tibdex/github-app-token@v2 uses: tibdex/github-app-token@v1
with: with:
app_id: ${{ secrets.GH_APP_ID }} app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }} private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}


@ -1,4 +1,4 @@
name: authentik-api-ts-publish name: authentik-web-api-publish
on: on:
push: push:
branches: [main] branches: [main]
@ -10,16 +10,16 @@ jobs:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- id: generate_token - id: generate_token
uses: tibdex/github-app-token@v2 uses: tibdex/github-app-token@v1
with: with:
app_id: ${{ secrets.GH_APP_ID }} app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }} private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4 - uses: actions/checkout@v3
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
- uses: actions/setup-node@v4 - uses: actions/setup-node@v3
with: with:
node-version-file: web/package.json node-version: "20.5"
registry-url: "https://registry.npmjs.org" registry-url: "https://registry.npmjs.org"
- name: Generate API Client - name: Generate API Client
run: make gen-client-ts run: make gen-client-ts
@ -31,16 +31,11 @@ jobs:
env: env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }} NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}
- name: Upgrade /web - name: Upgrade /web
working-directory: web working-directory: web/
run: | run: |
export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'` export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'`
npm i @goauthentik/api@$VERSION npm i @goauthentik/api@$VERSION
- name: Upgrade /web/packages/sfe - uses: peter-evans/create-pull-request@v5
working-directory: web/packages/sfe
run: |
export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'`
npm i @goauthentik/api@$VERSION
- uses: peter-evans/create-pull-request@v7
id: cpr id: cpr
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}

.gitignore (3 lines changed)

@ -206,6 +206,3 @@ data/
.netlify .netlify
.ruff_cache .ruff_cache
source_docs/ source_docs/
### Golang ###
/vendor/


@ -10,12 +10,12 @@
"Gruntfuggly.todo-tree", "Gruntfuggly.todo-tree",
"mechatroner.rainbow-csv", "mechatroner.rainbow-csv",
"ms-python.black-formatter", "ms-python.black-formatter",
"charliermarsh.ruff", "ms-python.isort",
"ms-python.pylint",
"ms-python.python", "ms-python.python",
"ms-python.vscode-pylance", "ms-python.vscode-pylance",
"ms-python.black-formatter",
"redhat.vscode-yaml", "redhat.vscode-yaml",
"Tobermory.es6-string-html", "Tobermory.es6-string-html",
"unifiedjs.vscode-mdx" "unifiedjs.vscode-mdx",
] ]
} }

.vscode/launch.json (2 lines changed)

@ -22,6 +22,6 @@
}, },
"justMyCode": true, "justMyCode": true,
"django": true "django": true
} },
] ]
} }

.vscode/settings.json (35 lines changed)

@ -4,36 +4,35 @@
"asgi", "asgi",
"authentik", "authentik",
"authn", "authn",
"entra",
"goauthentik", "goauthentik",
"jwe",
"jwks", "jwks",
"kubernetes",
"oidc", "oidc",
"openid", "openid",
"passwordless",
"plex", "plex",
"saml", "saml",
"scim",
"slo",
"sso",
"totp", "totp",
"webauthn",
"traefik", "traefik",
"webauthn" "passwordless",
"kubernetes",
"sso",
"slo",
"scim",
], ],
"python.linting.pylintEnabled": true,
"todo-tree.tree.showCountsInTree": true, "todo-tree.tree.showCountsInTree": true,
"todo-tree.tree.showBadges": true, "todo-tree.tree.showBadges": true,
"python.formatting.provider": "black",
"yaml.customTags": [ "yaml.customTags": [
"!Condition sequence",
"!Context scalar",
"!Enumerate sequence",
"!Env scalar",
"!Find sequence", "!Find sequence",
"!Format sequence",
"!If sequence",
"!Index scalar",
"!KeyOf scalar", "!KeyOf scalar",
"!Value scalar" "!Context scalar",
"!Context sequence",
"!Format sequence",
"!Condition sequence",
"!Env sequence",
"!Env scalar",
"!If sequence"
], ],
"typescript.preferences.importModuleSpecifier": "non-relative", "typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.importModuleSpecifierEnding": "index", "typescript.preferences.importModuleSpecifierEnding": "index",
@ -50,7 +49,9 @@
"ignoreCase": false "ignoreCase": false
} }
], ],
"go.testFlags": ["-count=1"], "go.testFlags": [
"-count=1"
],
"github-actions.workflows.pinned.workflows": [ "github-actions.workflows.pinned.workflows": [
".github/workflows/ci-main.yml" ".github/workflows/ci-main.yml"
] ]

.vscode/tasks.json (62 lines changed)

@ -2,67 +2,85 @@
"version": "2.0.0", "version": "2.0.0",
"tasks": [ "tasks": [
{ {
"label": "authentik/core: make", "label": "authentik[core]: format & test",
"command": "poetry", "command": "poetry",
"args": ["run", "make", "lint-fix", "lint"], "args": [
"presentation": { "run",
"panel": "new" "make"
}, ],
"group": "test" "group": "build",
}, },
{ {
"label": "authentik/core: run", "label": "authentik[core]: run",
"command": "poetry", "command": "poetry",
"args": ["run", "ak", "server"], "args": [
"run",
"make",
"run",
],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
"group": "running" "group": "running"
} },
}, },
{ {
"label": "authentik/web: make", "label": "authentik[web]: format",
"command": "make", "command": "make",
"args": ["web"], "args": ["web"],
"group": "build" "group": "build",
}, },
{ {
"label": "authentik/web: watch", "label": "authentik[web]: watch",
"command": "make", "command": "make",
"args": ["web-watch"], "args": ["web-watch"],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
"group": "running" "group": "running"
} },
}, },
{ {
"label": "authentik: install", "label": "authentik: install",
"command": "make", "command": "make",
"args": ["install", "-j4"], "args": ["install"],
"group": "build" "group": "build",
}, },
{ {
"label": "authentik/website: make", "label": "authentik: i18n-extract",
"command": "poetry",
"args": [
"run",
"make",
"i18n-extract"
],
"group": "build",
},
{
"label": "authentik[website]: format",
"command": "make", "command": "make",
"args": ["website"], "args": ["website"],
"group": "build" "group": "build",
}, },
{ {
"label": "authentik/website: watch", "label": "authentik[website]: watch",
"command": "make", "command": "make",
"args": ["website-watch"], "args": ["website-watch"],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
"group": "running" "group": "running"
} },
}, },
{ {
"label": "authentik/api: generate", "label": "authentik[api]: generate",
"command": "poetry", "command": "poetry",
"args": ["run", "make", "gen"], "args": [
"run",
"make",
"gen"
],
"group": "build" "group": "build"
} },
] ]
} }


@ -1,28 +1,2 @@
# Fallback * @goauthentik/core
* @goauthentik/backend @goauthentik/frontend website/docs/security/** @goauthentik/security
# Backend
authentik/ @goauthentik/backend
blueprints/ @goauthentik/backend
cmd/ @goauthentik/backend
internal/ @goauthentik/backend
lifecycle/ @goauthentik/backend
schemas/ @goauthentik/backend
scripts/ @goauthentik/backend
tests/ @goauthentik/backend
pyproject.toml @goauthentik/backend
poetry.lock @goauthentik/backend
go.mod @goauthentik/backend
go.sum @goauthentik/backend
# Infrastructure
.github/ @goauthentik/infrastructure
Dockerfile @goauthentik/infrastructure
*Dockerfile @goauthentik/infrastructure
.dockerignore @goauthentik/infrastructure
docker-compose.yml @goauthentik/infrastructure
# Web
web/ @goauthentik/frontend
tests/wdio/ @goauthentik/frontend
# Docs & Website
website/ @goauthentik/docs
# Security
website/docs/security/ @goauthentik/security


@ -1,89 +1,57 @@
# syntax=docker/dockerfile:1
# Stage 1: Build website # Stage 1: Build website
FROM --platform=${BUILDPLATFORM} docker.io/library/node:22 AS website-builder FROM --platform=${BUILDPLATFORM} docker.io/node:20.5 as website-builder
ENV NODE_ENV=production
WORKDIR /work/website
RUN --mount=type=bind,target=/work/website/package.json,src=./website/package.json \
--mount=type=bind,target=/work/website/package-lock.json,src=./website/package-lock.json \
--mount=type=cache,id=npm-website,sharing=shared,target=/root/.npm \
npm ci --include=dev
COPY ./website /work/website/ COPY ./website /work/website/
COPY ./blueprints /work/blueprints/ COPY ./blueprints /work/blueprints/
COPY ./schema.yml /work/
COPY ./SECURITY.md /work/ COPY ./SECURITY.md /work/
RUN npm run build-bundled ENV NODE_ENV=production
WORKDIR /work/website
RUN npm ci --include=dev && npm run build-docs-only
# Stage 2: Build webui # Stage 2: Build webui
FROM --platform=${BUILDPLATFORM} docker.io/library/node:22 AS web-builder FROM --platform=${BUILDPLATFORM} docker.io/node:20.5 as web-builder
ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
ENV NODE_ENV=production
WORKDIR /work/web
RUN --mount=type=bind,target=/work/web/package.json,src=./web/package.json \
--mount=type=bind,target=/work/web/package-lock.json,src=./web/package-lock.json \
--mount=type=bind,target=/work/web/packages/sfe/package.json,src=./web/packages/sfe/package.json \
--mount=type=bind,target=/work/web/scripts,src=./web/scripts \
--mount=type=cache,id=npm-web,sharing=shared,target=/root/.npm \
npm ci --include=dev
COPY ./package.json /work
COPY ./web /work/web/ COPY ./web /work/web/
COPY ./website /work/website/ COPY ./website /work/website/
COPY ./gen-ts-api /work/web/node_modules/@goauthentik/api
RUN npm run build ENV NODE_ENV=production
WORKDIR /work/web
RUN npm ci --include=dev && npm run build
# Stage 3: Build go proxy # Stage 3: Poetry to requirements.txt export
FROM --platform=${BUILDPLATFORM} mcr.microsoft.com/oss/go/microsoft/golang:1.23-fips-bookworm AS go-builder FROM docker.io/python:3.11.5-slim-bookworm AS poetry-locker
ARG TARGETOS WORKDIR /work
ARG TARGETARCH COPY ./pyproject.toml /work
ARG TARGETVARIANT COPY ./poetry.lock /work
ARG GOOS=$TARGETOS RUN pip install --no-cache-dir poetry && \
ARG GOARCH=$TARGETARCH poetry export -f requirements.txt --output requirements.txt && \
poetry export -f requirements.txt --dev --output requirements-dev.txt
WORKDIR /go/src/goauthentik.io # Stage 4: Build go proxy
FROM docker.io/golang:1.21.0-bookworm AS go-builder
RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \ WORKDIR /work
dpkg --add-architecture arm64 && \
apt-get update && \
apt-get install -y --no-install-recommends crossbuild-essential-arm64 gcc-aarch64-linux-gnu
RUN --mount=type=bind,target=/go/src/goauthentik.io/go.mod,src=./go.mod \ COPY --from=web-builder /work/web/robots.txt /work/web/robots.txt
--mount=type=bind,target=/go/src/goauthentik.io/go.sum,src=./go.sum \ COPY --from=web-builder /work/web/security.txt /work/web/security.txt
--mount=type=cache,target=/go/pkg/mod \
go mod download
COPY ./cmd /go/src/goauthentik.io/cmd COPY ./cmd /work/cmd
COPY ./authentik/lib /go/src/goauthentik.io/authentik/lib COPY ./authentik/lib /work/authentik/lib
COPY ./web/static.go /go/src/goauthentik.io/web/static.go COPY ./web/static.go /work/web/static.go
COPY --from=web-builder /work/web/robots.txt /go/src/goauthentik.io/web/robots.txt COPY ./internal /work/internal
COPY --from=web-builder /work/web/security.txt /go/src/goauthentik.io/web/security.txt COPY ./go.mod /work/go.mod
COPY ./internal /go/src/goauthentik.io/internal COPY ./go.sum /work/go.sum
COPY ./go.mod /go/src/goauthentik.io/go.mod
COPY ./go.sum /go/src/goauthentik.io/go.sum
RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \ RUN go build -o /work/bin/authentik ./cmd/server/
--mount=type=cache,id=go-build-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/root/.cache/go-build \
if [ "$TARGETARCH" = "arm64" ]; then export CC=aarch64-linux-gnu-gcc && export CC_FOR_TARGET=gcc-aarch64-linux-gnu; fi && \
CGO_ENABLED=1 GOEXPERIMENT="systemcrypto" GOFLAGS="-tags=requirefips" GOARM="${TARGETVARIANT#v}" \
go build -o /go/authentik ./cmd/server
# Stage 4: MaxMind GeoIP # Stage 5: MaxMind GeoIP
FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v7.1.0 AS geoip FROM ghcr.io/maxmind/geoipupdate:v6.0 as geoip
ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City GeoLite2-ASN" ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City"
ENV GEOIPUPDATE_VERBOSE="1" ENV GEOIPUPDATE_VERBOSE="true"
ENV GEOIPUPDATE_ACCOUNT_ID_FILE="/run/secrets/GEOIPUPDATE_ACCOUNT_ID" ENV GEOIPUPDATE_ACCOUNT_ID_FILE="/run/secrets/GEOIPUPDATE_ACCOUNT_ID"
ENV GEOIPUPDATE_LICENSE_KEY_FILE="/run/secrets/GEOIPUPDATE_LICENSE_KEY" ENV GEOIPUPDATE_LICENSE_KEY_FILE="/run/secrets/GEOIPUPDATE_LICENSE_KEY"
@ -93,93 +61,61 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
mkdir -p /usr/share/GeoIP && \ mkdir -p /usr/share/GeoIP && \
/bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0" /bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 5: Python dependencies
FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS python-deps
ARG TARGETARCH
ARG TARGETVARIANT
WORKDIR /ak-root/poetry
ENV VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false \
PATH="/ak-root/venv/bin:$PATH"
RUN rm -f /etc/apt/apt.conf.d/docker-clean; echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache
RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \
apt-get update && \
# Required for installing pip packages
apt-get install -y --no-install-recommends build-essential pkg-config libpq-dev libkrb5-dev
RUN --mount=type=bind,target=./pyproject.toml,src=./pyproject.toml \
--mount=type=bind,target=./poetry.lock,src=./poetry.lock \
--mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/pypoetry \
python -m venv /ak-root/venv/ && \
bash -c "source ${VENV_PATH}/bin/activate && \
pip3 install --upgrade pip && \
pip3 install poetry && \
poetry install --only=main --no-ansi --no-interaction --no-root && \
pip install --force-reinstall /wheels/*"
# Stage 6: Run # Stage 6: Run
FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS final-image FROM docker.io/python:3.11.5-slim-bookworm AS final-image
ARG VERSION
ARG GIT_BUILD_HASH ARG GIT_BUILD_HASH
ARG VERSION
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
LABEL org.opencontainers.image.url=https://goauthentik.io LABEL org.opencontainers.image.url https://goauthentik.io
LABEL org.opencontainers.image.description="goauthentik.io Main server image, see https://goauthentik.io for more info." LABEL org.opencontainers.image.description goauthentik.io Main server image, see https://goauthentik.io for more info.
LABEL org.opencontainers.image.source=https://github.com/goauthentik/authentik LABEL org.opencontainers.image.source https://github.com/goauthentik/authentik
LABEL org.opencontainers.image.version=${VERSION} LABEL org.opencontainers.image.version ${VERSION}
LABEL org.opencontainers.image.revision=${GIT_BUILD_HASH} LABEL org.opencontainers.image.revision ${GIT_BUILD_HASH}
WORKDIR / WORKDIR /
# We cannot cache this layer otherwise we'll end up with a bigger image COPY --from=poetry-locker /work/requirements.txt /
COPY --from=poetry-locker /work/requirements-dev.txt /
COPY --from=geoip /usr/share/GeoIP /geoip
RUN apt-get update && \ RUN apt-get update && \
# Required for installing pip packages
apt-get install -y --no-install-recommends build-essential pkg-config libxmlsec1-dev zlib1g-dev libpq-dev python3-dev && \
# Required for runtime # Required for runtime
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 && \ apt-get install -y --no-install-recommends libpq5 openssl libxmlsec1-openssl libmaxminddb0 && \
# Required for bootstrap & healtcheck # Required for bootstrap & healtcheck
apt-get install -y --no-install-recommends runit && \ apt-get install -y --no-install-recommends runit && \
pip install --no-cache-dir -r /requirements.txt && \
apt-get remove --purge -y build-essential pkg-config libxmlsec1-dev libpq-dev python3-dev && \
apt-get autoremove --purge -y && \
apt-get clean && \ apt-get clean && \
rm -rf /tmp/* /var/lib/apt/lists/* /var/tmp/ && \ rm -rf /tmp/* /var/lib/apt/lists/* /var/tmp/ && \
adduser --system --no-create-home --uid 1000 --group --home /authentik authentik && \ adduser --system --no-create-home --uid 1000 --group --home /authentik authentik && \
mkdir -p /certs /media /blueprints && \ mkdir -p /certs /media /blueprints && \
mkdir -p /authentik/.ssh && \ mkdir -p /authentik/.ssh && \
mkdir -p /ak-root && \ chown authentik:authentik /certs /media /authentik/.ssh
chown authentik:authentik /certs /media /authentik/.ssh /ak-root
COPY ./authentik/ /authentik COPY ./authentik/ /authentik
COPY ./pyproject.toml / COPY ./pyproject.toml /
COPY ./poetry.lock /
COPY ./schemas /schemas COPY ./schemas /schemas
COPY ./locale /locale COPY ./locale /locale
COPY ./tests /tests COPY ./tests /tests
COPY ./manage.py / COPY ./manage.py /
COPY ./blueprints /blueprints COPY ./blueprints /blueprints
COPY ./lifecycle/ /lifecycle COPY ./lifecycle/ /lifecycle
COPY ./authentik/sources/kerberos/krb5.conf /etc/krb5.conf COPY --from=go-builder /work/bin/authentik /bin/authentik
COPY --from=go-builder /go/authentik /bin/authentik
COPY --from=python-deps /ak-root/venv /ak-root/venv
COPY --from=web-builder /work/web/dist/ /web/dist/ COPY --from=web-builder /work/web/dist/ /web/dist/
COPY --from=web-builder /work/web/authentik/ /web/authentik/ COPY --from=web-builder /work/web/authentik/ /web/authentik/
COPY --from=website-builder /work/website/build/ /website/help/ COPY --from=website-builder /work/website/help/ /website/help/
COPY --from=geoip /usr/share/GeoIP /geoip
USER 1000 USER 1000
ENV TMPDIR=/dev/shm/ \ ENV TMPDIR /dev/shm/
PYTHONDONTWRITEBYTECODE=1 \ ENV PYTHONUNBUFFERED 1
PYTHONUNBUFFERED=1 \ ENV PATH "/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/lifecycle"
PATH="/ak-root/venv/bin:/lifecycle:$PATH" \
VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false
ENV GOFIPS=1 HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "/lifecycle/ak", "healthcheck" ]
HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ] ENTRYPOINT [ "/usr/local/bin/dumb-init", "--", "/lifecycle/ak" ]
ENTRYPOINT [ "dumb-init", "--", "ak" ]

Makefile (222 lines changed)

@ -1,224 +1,146 @@
.PHONY: gen dev-reset all clean test web website .SHELLFLAGS += -x -e
.SHELLFLAGS += ${SHELLFLAGS} -e
PWD = $(shell pwd) PWD = $(shell pwd)
UID = $(shell id -u) UID = $(shell id -u)
GID = $(shell id -g) GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.npm_version) NPM_VERSION = $(shell python -m scripts.npm_version)
PY_SOURCES = authentik tests scripts lifecycle .github PY_SOURCES = authentik tests scripts lifecycle
DOCKER_IMAGE ?= "authentik:test"
GEN_API_TS = "gen-ts-api"
GEN_API_PY = "gen-py-api"
GEN_API_GO = "gen-go-api"
pg_user := $(shell python -m authentik.lib.config postgresql.user 2>/dev/null)
pg_host := $(shell python -m authentik.lib.config postgresql.host 2>/dev/null)
pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null)
CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \ CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \
-I .github/codespell-words.txt \ -I .github/codespell-words.txt \
-S 'web/src/locales/**' \ -S 'web/src/locales/**' \
-S 'website/docs/developer-docs/api/reference/**' \
authentik \ authentik \
internal \ internal \
cmd \ cmd \
web/src \ web/src \
website/src \ website/src \
website/blog \ website/blog \
website/developer-docs \
website/docs \ website/docs \
website/integrations \ website/integrations \
website/src website/src
all: lint-fix lint test gen web ## Lint, build, and test everything all: lint-fix lint test gen web
HELP_WIDTH := $(shell grep -h '^[a-z][^ ]*:.*\#\#' $(MAKEFILE_LIST) 2>/dev/null | \ test-go:
cut -d':' -f1 | awk '{printf "%d\n", length}' | sort -rn | head -1)
help: ## Show this help
@echo "\nSpecify a command. The choices are:\n"
@grep -Eh '^[0-9a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | \
awk 'BEGIN {FS = ":.*?## "}; {printf " \033[0;36m%-$(HELP_WIDTH)s \033[m %s\n", $$1, $$2}' | \
sort
@echo ""
go-test:
go test -timeout 0 -v -race -cover ./... go test -timeout 0 -v -race -cover ./...
test-docker: ## Run all tests in a docker-compose test-docker:
echo "PG_PASS=$(shell openssl rand 32 | base64 -w 0)" >> .env echo "PG_PASS=$(openssl rand -base64 32)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(shell openssl rand 32 | base64 -w 0)" >> .env echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 32)" >> .env
docker compose pull -q docker-compose pull -q
docker compose up --no-start docker-compose up --no-start
docker compose start postgresql redis docker-compose start postgresql redis
docker compose run -u root server test-all docker-compose run -u root server test
rm -f .env rm -f .env
test: ## Run the server tests and produce a coverage report (locally) test:
coverage run manage.py test --keepdb authentik coverage run manage.py test --keepdb authentik
coverage html coverage html
coverage report coverage report
lint-fix: lint-codespell ## Lint and automatically fix errors in the python source code. Reports spelling errors. lint-fix:
black $(PY_SOURCES) isort authentik $(PY_SOURCES)
ruff check --fix $(PY_SOURCES) black authentik $(PY_SOURCES)
ruff authentik $(PY_SOURCES)
lint-codespell: ## Reports spelling errors.
codespell -w $(CODESPELL_ARGS) codespell -w $(CODESPELL_ARGS)
lint: ## Lint the python and golang sources lint:
bandit -r $(PY_SOURCES) -x web/node_modules -x tests/wdio/node_modules -x website/node_modules pylint $(PY_SOURCES)
bandit -r $(PY_SOURCES) -x node_modules
golangci-lint run -v golangci-lint run -v
core-install: migrate:
poetry install
migrate: ## Run the Authentik Django server's migrations
python -m lifecycle.migrate python -m lifecycle.migrate
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service i18n-extract: i18n-extract-core web-i18n-extract
core-i18n-extract: i18n-extract-core:
ak makemessages \ ak makemessages --ignore web --ignore internal --ignore web --ignore web-api --ignore website -l en
--add-location file \
--no-obsolete \
--ignore web \
--ignore internal \
--ignore ${GEN_API_TS} \
--ignore ${GEN_API_GO} \
--ignore website \
-l en
install: web-install website-install core-install ## Install all requires dependencies for `web`, `website` and `core`
dev-drop-db:
dropdb -U ${pg_user} -h ${pg_host} ${pg_name}
# Also remove the test-db if it exists
dropdb -U ${pg_user} -h ${pg_host} test_${pg_name} || true
redis-cli -n 0 flushall
dev-create-db:
createdb -U ${pg_user} -h ${pg_host} ${pg_name}
dev-reset: dev-drop-db dev-create-db migrate ## Drop and restore the Authentik PostgreSQL instance to a "fresh install" state.
######################### #########################
## API Schema ## API Schema
######################### #########################
gen-build: ## Extract the schema from the database gen-build:
AUTHENTIK_DEBUG=true \ AUTHENTIK_DEBUG=true ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_TENANTS__ENABLED=true \ AUTHENTIK_DEBUG=true ak spectacular --file schema.yml
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
ak spectacular --file schema.yml
gen-changelog: ## (Release) generate the changelog based from the commits since the last tag gen-changelog:
git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md
npx prettier --write changelog.md npx prettier --write changelog.md
gen-diff: ## (Release) generate the changelog diff between the current schema and the last tag gen-diff:
git show $(shell git describe --tags $(shell git rev-list --tags --max-count=1)):schema.yml > old_schema.yml git show $(shell git describe --tags $(shell git rev-list --tags --max-count=1)):schema.yml > old_schema.yml
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-diff:2.1.0-beta.8 \ docker.io/openapitools/openapi-diff:2.1.0-beta.6 \
--markdown /local/diff.md \ --markdown /local/diff.md \
/local/old_schema.yml /local/schema.yml /local/old_schema.yml /local/schema.yml
rm old_schema.yml rm old_schema.yml
sed -i 's/{/&#123;/g' diff.md
sed -i 's/}/&#125;/g' diff.md
npx prettier --write diff.md npx prettier --write diff.md
gen-clean-ts: ## Remove generated API client for Typescript gen-clean:
rm -rf ./${GEN_API_TS}/ rm -rf web/api/src/
rm -rf ./web/node_modules/@goauthentik/api/ rm -rf api/
gen-clean-go: ## Remove generated API client for Go gen-client-ts:
rm -rf ./${GEN_API_GO}/
gen-clean-py: ## Remove generated API client for Python
rm -rf ./${GEN_API_PY}/
gen-clean: gen-clean-ts gen-clean-go gen-clean-py ## Remove generated API clients
gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescript into the authentik UI Application
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \ docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \ -i /local/schema.yml \
-g typescript-fetch \ -g typescript-fetch \
-o /local/${GEN_API_TS} \ -o /local/gen-ts-api \
-c /local/scripts/api-ts-config.yaml \ -c /local/scripts/api-ts-config.yaml \
--additional-properties=npmVersion=${NPM_VERSION} \ --additional-properties=npmVersion=${NPM_VERSION} \
--git-repo-id authentik \ --git-repo-id authentik \
--git-user-id goauthentik --git-user-id goauthentik
mkdir -p web/node_modules/@goauthentik/api mkdir -p web/node_modules/@goauthentik/api
cd ./${GEN_API_TS} && npm i cd gen-ts-api && npm i
\cp -rf ./${GEN_API_TS}/* web/node_modules/@goauthentik/api \cp -rfv gen-ts-api/* web/node_modules/@goauthentik/api
gen-client-py: gen-clean-py ## Build and install the authentik API for Python gen-client-go:
mkdir -p ./gen-go-api ./gen-go-api/templates
wget https://raw.githubusercontent.com/goauthentik/client-go/main/config.yaml -O ./gen-go-api/config.yaml
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/README.mustache -O ./gen-go-api/templates/README.mustache
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/go.mod.mustache -O ./gen-go-api/templates/go.mod.mustache
cp schema.yml ./gen-go-api/
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}/gen-go-api:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.4.0 generate \
-i /local/schema.yml \
-g python \
-o /local/${GEN_API_PY} \
-c /local/scripts/api-py-config.yaml \
--additional-properties=packageVersion=${NPM_VERSION} \
--git-repo-id authentik \
--git-user-id goauthentik
pip install ./${GEN_API_PY}
gen-client-go: gen-clean-go ## Build and install the authentik API for Golang
mkdir -p ./${GEN_API_GO} ./${GEN_API_GO}/templates
wget https://raw.githubusercontent.com/goauthentik/client-go/main/config.yaml -O ./${GEN_API_GO}/config.yaml
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/README.mustache -O ./${GEN_API_GO}/templates/README.mustache
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/go.mod.mustache -O ./${GEN_API_GO}/templates/go.mod.mustache
cp schema.yml ./${GEN_API_GO}/
docker run \
--rm -v ${PWD}/${GEN_API_GO}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \ docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \ -i /local/schema.yml \
-g go \ -g go \
-o /local/ \ -o /local/ \
-c /local/config.yaml -c /local/config.yaml
go mod edit -replace goauthentik.io/api/v3=./${GEN_API_GO} go mod edit -replace goauthentik.io/api/v3=./gen-go-api
rm -rf ./${GEN_API_GO}/config.yaml ./${GEN_API_GO}/templates/ rm -rf ./gen-go-api/config.yaml ./gen-go-api/templates/
gen-dev-config: ## Generate a local development config file gen-dev-config:
python -m scripts.generate_config python -m scripts.generate_config
gen: gen-build gen-client-ts gen: gen-build gen-clean gen-client-ts
######################### #########################
## Web ## Web
######################### #########################
web-build: web-install ## Build the Authentik UI web-build: web-install
cd web && npm run build cd web && npm run build
web: web-lint-fix web-lint web-check-compile ## Automatically fix formatting issues in the Authentik UI source code, lint the code, and compile it web: web-lint-fix web-lint web-check-compile
web-install: ## Install the necessary libraries to build the Authentik UI web-install:
cd web && npm ci cd web && npm ci
web-test: ## Run tests for the Authentik UI web-watch:
cd web && npm run test
web-watch: ## Build and watch the Authentik UI for changes, updating automatically
rm -rf web/dist/ rm -rf web/dist/
mkdir web/dist/ mkdir web/dist/
touch web/dist/.gitkeep touch web/dist/.gitkeep
cd web && npm run watch cd web && npm run watch
web-storybook-watch: ## Build and run the storybook documentation server web-storybook-watch:
cd web && npm run storybook cd web && npm run storybook
web-lint-fix: web-lint-fix:
@ -238,38 +160,29 @@ web-i18n-extract:
## Website ## Website
######################### #########################
website: website-lint-fix website-build ## Automatically fix formatting issues in the Authentik website/docs source code, lint the code, and compile it website: website-lint-fix website-build
website-install: website-install:
cd website && npm ci cd website && npm ci
website-lint-fix: lint-codespell website-lint-fix:
cd website && npm run prettier cd website && npm run prettier
website-build: website-build:
cd website && npm run build cd website && npm run build
website-watch: ## Build and watch the documentation website, updating automatically website-watch:
cd website && npm run watch cd website && npm run watch
#########################
## Docker
#########################
docker: ## Build a docker image of the current source tree
mkdir -p ${GEN_API_TS}
DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE}
#########################
## CI
#########################
# These targets are used by GitHub actions to allow usage of matrix # These targets are used by GitHub actions to allow usage of matrix
# which makes the YAML File a lot smaller # which makes the YAML File a lot smaller
ci--meta-debug: ci--meta-debug:
python -V python -V
node --version node --version
ci-pylint: ci--meta-debug
pylint $(PY_SOURCES)
ci-black: ci--meta-debug ci-black: ci--meta-debug
black --check $(PY_SOURCES) black --check $(PY_SOURCES)
@ -279,8 +192,25 @@ ci-ruff: ci--meta-debug
ci-codespell: ci--meta-debug ci-codespell: ci--meta-debug
codespell $(CODESPELL_ARGS) -s codespell $(CODESPELL_ARGS) -s
ci-isort: ci--meta-debug
isort --check $(PY_SOURCES)
ci-bandit: ci--meta-debug ci-bandit: ci--meta-debug
bandit -r $(PY_SOURCES) bandit -r $(PY_SOURCES)
ci-pyright: ci--meta-debug
./web/node_modules/.bin/pyright $(PY_SOURCES)
ci-pending-migrations: ci--meta-debug ci-pending-migrations: ci--meta-debug
ak makemigrations --check ak makemigrations --check
install: web-install website-install
poetry install
dev-reset:
dropdb -U postgres -h localhost authentik
# Also remove the test-db if it exists
dropdb -U postgres -h localhost test_authentik || true
createdb -U postgres -h localhost authentik
redis-cli -n 0 flushall
make migrate


@ -15,9 +15,7 @@
## What is authentik? ## What is authentik?
authentik is an open-source Identity Provider that emphasizes flexibility and versatility, with support for a wide set of protocols. authentik is an open-source Identity Provider that emphasizes flexibility and versatility. It can be seamlessly integrated into existing environments to support new protocols. authentik is also a great solution for implementing sign-up, recovery, and other similar features in your application, saving you the hassle of dealing with them.
Our [enterprise offer](https://goauthentik.io/pricing) can also be used as a self-hosted replacement for large-scale deployments of Okta/Auth0, Entra ID, Ping Identity, or other legacy IdPs for employees and B2B2C use.
## Installation ## Installation
@ -27,14 +25,14 @@ For bigger setups, there is a Helm Chart [here](https://github.com/goauthentik/h
## Screenshots ## Screenshots
| Light | Dark | | Light | Dark |
| ----------------------------------------------------------- | ---------------------------------------------------------- | | ------------------------------------------------------ | ----------------------------------------------------- |
| ![](https://docs.goauthentik.io/img/screen_apps_light.jpg) | ![](https://docs.goauthentik.io/img/screen_apps_dark.jpg) | | ![](https://goauthentik.io/img/screen_apps_light.jpg) | ![](https://goauthentik.io/img/screen_apps_dark.jpg) |
| ![](https://docs.goauthentik.io/img/screen_admin_light.jpg) | ![](https://docs.goauthentik.io/img/screen_admin_dark.jpg) | | ![](https://goauthentik.io/img/screen_admin_light.jpg) | ![](https://goauthentik.io/img/screen_admin_dark.jpg) |
## Development ## Development
See [Developer Documentation](https://docs.goauthentik.io/docs/developer-docs/?utm_source=github) See [Developer Documentation](https://goauthentik.io/developer-docs/?utm_source=github)
## Security ## Security
@ -43,3 +41,15 @@ See [SECURITY.md](SECURITY.md)
## Adoption and Contributions ## Adoption and Contributions
Your organization uses authentik? We'd love to add your logo to the readme and our website! Email us @ hello@goauthentik.io or open a GitHub Issue/PR! For more information on how to contribute to authentik, please refer to our [CONTRIBUTING.md file](./CONTRIBUTING.md). Your organization uses authentik? We'd love to add your logo to the readme and our website! Email us @ hello@goauthentik.io or open a GitHub Issue/PR! For more information on how to contribute to authentik, please refer to our [CONTRIBUTING.md file](./CONTRIBUTING.md).
## Sponsors
This project is proudly sponsored by:
<p>
<a href="https://www.digitalocean.com/?utm_medium=opensource&utm_source=goauthentik.io">
<img src="https://opensource.nyc3.cdn.digitaloceanspaces.com/attribution/assets/SVG/DO_Logo_horizontal_blue.svg" width="201px">
</a>
</p>
DigitalOcean provides development and testing resources for authentik.


@ -1,9 +1,5 @@
authentik takes security very seriously. We follow the rules of [responsible disclosure](https://en.wikipedia.org/wiki/Responsible_disclosure), and we urge our community to do so as well, instead of reporting vulnerabilities publicly. This allows us to patch the issue quickly, announce its existence and release the fixed version. authentik takes security very seriously. We follow the rules of [responsible disclosure](https://en.wikipedia.org/wiki/Responsible_disclosure), and we urge our community to do so as well, instead of reporting vulnerabilities publicly. This allows us to patch the issue quickly, announce its existence and release the fixed version.
## Independent audits and pentests
In May/June of 2023 [Cure53](https://cure53.de) conducted an audit and pentest. The [results](https://cure53.de/pentest-report_authentik.pdf) are published on the [Cure53 website](https://cure53.de/#publications-2023). For more details about authentik's response to the findings of the audit refer to [2023-06 Cure53 Code audit](https://goauthentik.io/docs/security/2023-06-cure53).
## What authentik classifies as a CVE ## What authentik classifies as a CVE
CVE (Common Vulnerability and Exposure) is a system designed to aggregate all vulnerabilities. As such, a CVE will be issued when there is either a vulnerability or an exposure. Per NIST, a vulnerability is: CVE (Common Vulnerability and Exposure) is a system designed to aggregate all vulnerabilities. As such, a CVE will be issued when there is either a vulnerability or an exposure. Per NIST, a vulnerability is:
@ -18,10 +14,10 @@ Even if the issue is not a CVE, we still greatly appreciate your help in hardeni
(.x being the latest patch release for each version) (.x being the latest patch release for each version)
| Version | Supported | | Version | Supported |
| --------- | --------- | | --- | --- |
| 2024.8.x | ✅ | | 2023.6.x | ✅ |
| 2024.10.x | ✅ | | 2023.8.x | ✅ |
## Reporting a Vulnerability ## Reporting a Vulnerability
@ -31,12 +27,12 @@ To report a vulnerability, send an email to [security@goauthentik.io](mailto:se
authentik reserves the right to reclassify CVSS as necessary. To determine severity, we will use the CVSS calculator from NVD (https://nvd.nist.gov/vuln-metrics/cvss/v3-calculator). The calculated CVSS score will then be translated into one of the following categories: authentik reserves the right to reclassify CVSS as necessary. To determine severity, we will use the CVSS calculator from NVD (https://nvd.nist.gov/vuln-metrics/cvss/v3-calculator). The calculated CVSS score will then be translated into one of the following categories:
| Score | Severity | | Score | Severity |
| ---------- | -------- | | --- | --- |
| 0.0 | None | | 0.0 | None |
| 0.1 – 3.9 | Low | | 0.1 – 3.9 | Low |
| 4.0 – 6.9 | Medium | | 4.0 – 6.9 | Medium |
| 7.0 – 8.9 | High | | 7.0 – 8.9 | High |
| 9.0 – 10.0 | Critical | | 9.0 – 10.0 | Critical |
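For illustration, a minimal Python sketch (the helper below is not taken from authentik's source) of how a CVSS v3 base score maps onto the severity buckets in the table above:

```python
def cvss_severity(score: float) -> str:
    """Map a CVSS v3 base score (0.0-10.0) to the categories in the table above."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS v3 base scores range from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"


# CVSS v3 base scores carry one decimal place, so the bucket edges above are exact.
assert cvss_severity(5.3) == "Medium"
assert cvss_severity(9.8) == "Critical"
```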
## Disclosure process ## Disclosure process


@ -1,12 +1,12 @@
"""authentik root module""" """authentik root module"""
from os import environ from os import environ
from typing import Optional
__version__ = "2024.10.2" __version__ = "2023.8.6"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH" ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"
def get_build_hash(fallback: str | None = None) -> str: def get_build_hash(fallback: Optional[str] = None) -> str:
"""Get build hash""" """Get build hash"""
build_hash = environ.get(ENV_GIT_HASH_KEY, fallback if fallback else "") build_hash = environ.get(ENV_GIT_HASH_KEY, fallback if fallback else "")
return fallback if build_hash == "" and fallback else build_hash return fallback if build_hash == "" and fallback else build_hash
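As a usage sketch for the updated get_build_hash above (the hash and fallback values are made up, and this assumes authentik/__init__.py imports cleanly on its own):

```python
import os

from authentik import ENV_GIT_HASH_KEY, get_build_hash

# Without GIT_BUILD_HASH in the environment, the fallback (if any) is returned, otherwise "".
os.environ.pop(ENV_GIT_HASH_KEY, None)
assert get_build_hash() == ""
assert get_build_hash("version/2024.10.2") == "version/2024.10.2"

# With GIT_BUILD_HASH set, the environment value always wins over the fallback.
os.environ[ENV_GIT_HASH_KEY] = "deadbeef"
assert get_build_hash("version/2024.10.2") == "deadbeef"
```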


@ -1,8 +1,7 @@
"""Meta API""" """Meta API"""
from drf_spectacular.utils import extend_schema from drf_spectacular.utils import extend_schema
from rest_framework.fields import CharField from rest_framework.fields import CharField
from rest_framework.permissions import IsAuthenticated from rest_framework.permissions import IsAdminUser
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.viewsets import ViewSet from rest_framework.viewsets import ViewSet
@ -22,7 +21,7 @@ class AppSerializer(PassiveSerializer):
class AppsViewSet(ViewSet): class AppsViewSet(ViewSet):
"""Read-only view list all installed apps""" """Read-only view list all installed apps"""
permission_classes = [IsAuthenticated] permission_classes = [IsAdminUser]
@extend_schema(responses={200: AppSerializer(many=True)}) @extend_schema(responses={200: AppSerializer(many=True)})
def list(self, request: Request) -> Response: def list(self, request: Request) -> Response:
@ -36,7 +35,7 @@ class AppsViewSet(ViewSet):
class ModelViewSet(ViewSet): class ModelViewSet(ViewSet):
"""Read-only view list all installed models""" """Read-only view list all installed models"""
permission_classes = [IsAuthenticated] permission_classes = [IsAdminUser]
@extend_schema(responses={200: AppSerializer(many=True)}) @extend_schema(responses={200: AppSerializer(many=True)})
def list(self, request: Request) -> Response: def list(self, request: Request) -> Response:


@ -1,12 +1,11 @@
"""authentik administration metrics""" """authentik administration metrics"""
from datetime import timedelta from datetime import timedelta
from django.db.models.functions import ExtractHour from django.db.models.functions import ExtractHour
from drf_spectacular.utils import extend_schema, extend_schema_field from drf_spectacular.utils import extend_schema, extend_schema_field
from guardian.shortcuts import get_objects_for_user from guardian.shortcuts import get_objects_for_user
from rest_framework.fields import IntegerField, SerializerMethodField from rest_framework.fields import IntegerField, SerializerMethodField
from rest_framework.permissions import IsAuthenticated from rest_framework.permissions import IsAdminUser
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.views import APIView from rest_framework.views import APIView
@ -69,7 +68,7 @@ class LoginMetricsSerializer(PassiveSerializer):
class AdministrationMetricsViewSet(APIView): class AdministrationMetricsViewSet(APIView):
"""Login Metrics per 1h""" """Login Metrics per 1h"""
permission_classes = [IsAuthenticated] permission_classes = [IsAdminUser]
@extend_schema(responses={200: LoginMetricsSerializer(many=False)}) @extend_schema(responses={200: LoginMetricsSerializer(many=False)})
def get(self, request: Request) -> Response: def get(self, request: Request) -> Response:


@ -1,52 +1,44 @@
"""authentik administration overview""" """authentik administration overview"""
import platform import platform
from datetime import datetime from datetime import datetime
from ssl import OPENSSL_VERSION
from sys import version as python_version from sys import version as python_version
from typing import TypedDict from typing import TypedDict
from cryptography.hazmat.backends.openssl.backend import backend
from django.utils.timezone import now from django.utils.timezone import now
from drf_spectacular.utils import extend_schema from drf_spectacular.utils import extend_schema
from gunicorn import version_info as gunicorn_version
from rest_framework.fields import SerializerMethodField from rest_framework.fields import SerializerMethodField
from rest_framework.permissions import IsAdminUser
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.views import APIView from rest_framework.views import APIView
from authentik import get_full_version
from authentik.core.api.utils import PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.enterprise.license import LicenseKey
from authentik.lib.config import CONFIG
from authentik.lib.utils.reflection import get_env from authentik.lib.utils.reflection import get_env
from authentik.outposts.apps import MANAGED_OUTPOST from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.models import Outpost from authentik.outposts.models import Outpost
from authentik.rbac.permissions import HasPermission
class RuntimeDict(TypedDict): class RuntimeDict(TypedDict):
"""Runtime information""" """Runtime information"""
python_version: str python_version: str
gunicorn_version: str
environment: str environment: str
architecture: str architecture: str
platform: str platform: str
uname: str uname: str
openssl_version: str
openssl_fips_enabled: bool | None
authentik_version: str
class SystemInfoSerializer(PassiveSerializer): class SystemSerializer(PassiveSerializer):
"""Get system information.""" """Get system information."""
http_headers = SerializerMethodField() http_headers = SerializerMethodField()
http_host = SerializerMethodField() http_host = SerializerMethodField()
http_is_secure = SerializerMethodField() http_is_secure = SerializerMethodField()
runtime = SerializerMethodField() runtime = SerializerMethodField()
brand = SerializerMethodField() tenant = SerializerMethodField()
server_time = SerializerMethodField() server_time = SerializerMethodField()
embedded_outpost_disabled = SerializerMethodField()
embedded_outpost_host = SerializerMethodField() embedded_outpost_host = SerializerMethodField()
def get_http_headers(self, request: Request) -> dict[str, str]: def get_http_headers(self, request: Request) -> dict[str, str]:
@ -69,30 +61,22 @@ class SystemInfoSerializer(PassiveSerializer):
def get_runtime(self, request: Request) -> RuntimeDict: def get_runtime(self, request: Request) -> RuntimeDict:
"""Get versions""" """Get versions"""
return { return {
"architecture": platform.machine(),
"authentik_version": get_full_version(),
"environment": get_env(),
"openssl_fips_enabled": (
backend._fips_enabled if LicenseKey.get_total().status().is_valid else None
),
"openssl_version": OPENSSL_VERSION,
"platform": platform.platform(),
"python_version": python_version, "python_version": python_version,
"gunicorn_version": ".".join(str(x) for x in gunicorn_version),
"environment": get_env(),
"architecture": platform.machine(),
"platform": platform.platform(),
"uname": " ".join(platform.uname()), "uname": " ".join(platform.uname()),
} }
def get_brand(self, request: Request) -> str: def get_tenant(self, request: Request) -> str:
"""Currently active brand""" """Currently active tenant"""
return str(request._request.brand) return str(request._request.tenant)
def get_server_time(self, request: Request) -> datetime: def get_server_time(self, request: Request) -> datetime:
"""Current server time""" """Current server time"""
return now() return now()
def get_embedded_outpost_disabled(self, request: Request) -> bool:
"""Whether the embedded outpost is disabled"""
return CONFIG.get_bool("outposts.disable_embedded_outpost", False)
def get_embedded_outpost_host(self, request: Request) -> str: def get_embedded_outpost_host(self, request: Request) -> str:
"""Get the FQDN configured on the embedded outpost""" """Get the FQDN configured on the embedded outpost"""
outposts = Outpost.objects.filter(managed=MANAGED_OUTPOST) outposts = Outpost.objects.filter(managed=MANAGED_OUTPOST)
@ -104,17 +88,17 @@ class SystemInfoSerializer(PassiveSerializer):
class SystemView(APIView): class SystemView(APIView):
"""Get system information.""" """Get system information."""
permission_classes = [HasPermission("authentik_rbac.view_system_info")] permission_classes = [IsAdminUser]
pagination_class = None pagination_class = None
filter_backends = [] filter_backends = []
serializer_class = SystemInfoSerializer serializer_class = SystemSerializer
@extend_schema(responses={200: SystemInfoSerializer(many=False)}) @extend_schema(responses={200: SystemSerializer(many=False)})
def get(self, request: Request) -> Response: def get(self, request: Request) -> Response:
"""Get system information.""" """Get system information."""
return Response(SystemInfoSerializer(request).data) return Response(SystemSerializer(request).data)
@extend_schema(responses={200: SystemInfoSerializer(many=False)}) @extend_schema(responses={200: SystemSerializer(many=False)})
def post(self, request: Request) -> Response: def post(self, request: Request) -> Response:
"""Get system information.""" """Get system information."""
return Response(SystemInfoSerializer(request).data) return Response(SystemSerializer(request).data)


@ -0,0 +1,132 @@
"""Tasks API"""
from importlib import import_module
from django.contrib import messages
from django.http.response import Http404
from django.utils.translation import gettext_lazy as _
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema
from rest_framework.decorators import action
from rest_framework.fields import (
CharField,
ChoiceField,
DateTimeField,
ListField,
SerializerMethodField,
)
from rest_framework.permissions import IsAdminUser
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ViewSet
from structlog.stdlib import get_logger
from authentik.core.api.utils import PassiveSerializer
from authentik.events.monitored_tasks import TaskInfo, TaskResultStatus
LOGGER = get_logger()
class TaskSerializer(PassiveSerializer):
"""Serialize TaskInfo and TaskResult"""
task_name = CharField()
task_description = CharField()
task_finish_timestamp = DateTimeField(source="finish_time")
task_duration = SerializerMethodField()
status = ChoiceField(
source="result.status.name",
choices=[(x.name, x.name) for x in TaskResultStatus],
)
messages = ListField(source="result.messages")
def get_task_duration(self, instance: TaskInfo) -> int:
"""Get the duration a task took to run"""
return max(instance.finish_timestamp - instance.start_timestamp, 0)
def to_representation(self, instance: TaskInfo):
"""When a new version of authentik adds fields to TaskInfo,
the API will fail with an AttributeError, as the classes
are pickled in cache. In that case, just delete the info"""
try:
return super().to_representation(instance)
# pylint: disable=broad-except
except Exception: # pragma: no cover
if isinstance(self.instance, list):
for inst in self.instance:
inst.delete()
else:
self.instance.delete()
return {}
class TaskViewSet(ViewSet):
"""Read-only view set that returns all background tasks"""
permission_classes = [IsAdminUser]
serializer_class = TaskSerializer
@extend_schema(
responses={
200: TaskSerializer(many=False),
404: OpenApiResponse(description="Task not found"),
},
parameters=[
OpenApiParameter(
"id",
type=OpenApiTypes.STR,
location=OpenApiParameter.PATH,
required=True,
),
],
)
def retrieve(self, request: Request, pk=None) -> Response:
"""Get a single system task"""
task = TaskInfo.by_name(pk)
if not task:
raise Http404
return Response(TaskSerializer(task, many=False).data)
@extend_schema(responses={200: TaskSerializer(many=True)})
def list(self, request: Request) -> Response:
"""List system tasks"""
tasks = sorted(TaskInfo.all().values(), key=lambda task: task.task_name)
return Response(TaskSerializer(tasks, many=True).data)
@extend_schema(
request=OpenApiTypes.NONE,
responses={
204: OpenApiResponse(description="Task retried successfully"),
404: OpenApiResponse(description="Task not found"),
500: OpenApiResponse(description="Failed to retry task"),
},
parameters=[
OpenApiParameter(
"id",
type=OpenApiTypes.STR,
location=OpenApiParameter.PATH,
required=True,
),
],
)
@action(detail=True, methods=["post"])
def retry(self, request: Request, pk=None) -> Response:
"""Retry task"""
task = TaskInfo.by_name(pk)
if not task:
raise Http404
try:
task_module = import_module(task.task_call_module)
task_func = getattr(task_module, task.task_call_func)
LOGGER.debug("Running task", task=task_func)
task_func.delay(*task.task_call_args, **task.task_call_kwargs)
messages.success(
self.request,
_("Successfully re-scheduled Task %(name)s!" % {"name": task.task_name}),
)
return Response(status=204)
except (ImportError, AttributeError): # pragma: no cover
LOGGER.warning("Failed to run task, remove state", task=task)
# if we get an import error, the module path has probably changed
task.delete()
return Response(status=500)


@ -1,5 +1,4 @@
"""authentik administration overview""" """authentik administration overview"""
from django.core.cache import cache from django.core.cache import cache
from drf_spectacular.utils import extend_schema from drf_spectacular.utils import extend_schema
from packaging.version import parse from packaging.version import parse
@ -10,9 +9,8 @@ from rest_framework.response import Response
from rest_framework.views import APIView from rest_framework.views import APIView
from authentik import __version__, get_build_hash from authentik import __version__, get_build_hash
from authentik.admin.tasks import VERSION_CACHE_KEY, VERSION_NULL, update_latest_version from authentik.admin.tasks import VERSION_CACHE_KEY, update_latest_version
from authentik.core.api.utils import PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.outposts.models import Outpost
class VersionSerializer(PassiveSerializer): class VersionSerializer(PassiveSerializer):
@ -20,10 +18,8 @@ class VersionSerializer(PassiveSerializer):
version_current = SerializerMethodField() version_current = SerializerMethodField()
version_latest = SerializerMethodField() version_latest = SerializerMethodField()
version_latest_valid = SerializerMethodField()
build_hash = SerializerMethodField() build_hash = SerializerMethodField()
outdated = SerializerMethodField() outdated = SerializerMethodField()
outpost_outdated = SerializerMethodField()
def get_build_hash(self, _) -> str: def get_build_hash(self, _) -> str:
"""Get build hash, if version is not latest or released""" """Get build hash, if version is not latest or released"""
@ -41,23 +37,10 @@ class VersionSerializer(PassiveSerializer):
return __version__ return __version__
return version_in_cache return version_in_cache
def get_version_latest_valid(self, _) -> bool:
"""Check if latest version is valid"""
return cache.get(VERSION_CACHE_KEY) != VERSION_NULL
def get_outdated(self, instance) -> bool: def get_outdated(self, instance) -> bool:
"""Check if we're running the latest version""" """Check if we're running the latest version"""
return parse(self.get_version_current(instance)) < parse(self.get_version_latest(instance)) return parse(self.get_version_current(instance)) < parse(self.get_version_latest(instance))
def get_outpost_outdated(self, _) -> bool:
"""Check if any outpost is outdated/has a version mismatch"""
any_outdated = False
for outpost in Outpost.objects.all():
for state in outpost.state:
if state.version_outdated:
any_outdated = True
return any_outdated
class VersionView(APIView): class VersionView(APIView):
"""Get running and latest version.""" """Get running and latest version."""

View File

@ -1,33 +0,0 @@
from rest_framework.permissions import IsAdminUser
from rest_framework.viewsets import ReadOnlyModelViewSet
from authentik.admin.models import VersionHistory
from authentik.core.api.utils import ModelSerializer
class VersionHistorySerializer(ModelSerializer):
"""VersionHistory Serializer"""
class Meta:
model = VersionHistory
fields = [
"id",
"timestamp",
"version",
"build",
]
class VersionHistoryViewSet(ReadOnlyModelViewSet):
"""VersionHistory Viewset"""
queryset = VersionHistory.objects.all()
serializer_class = VersionHistorySerializer
permission_classes = [IsAdminUser]
filterset_fields = [
"version",
"build",
]
search_fields = ["version", "build"]
ordering = ["-timestamp"]
pagination_class = None

View File

@ -1,20 +1,19 @@
"""authentik administration overview""" """authentik administration overview"""
from django.conf import settings from django.conf import settings
from drf_spectacular.utils import extend_schema, inline_serializer from drf_spectacular.utils import extend_schema, inline_serializer
from rest_framework.fields import IntegerField from rest_framework.fields import IntegerField
from rest_framework.permissions import IsAdminUser
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.views import APIView from rest_framework.views import APIView
from authentik.rbac.permissions import HasPermission
from authentik.root.celery import CELERY_APP from authentik.root.celery import CELERY_APP
class WorkerView(APIView): class WorkerView(APIView):
"""Get currently connected worker count.""" """Get currently connected worker count."""
permission_classes = [HasPermission("authentik_rbac.view_system_info")] permission_classes = [IsAdminUser]
@extend_schema(responses=inline_serializer("Workers", fields={"count": IntegerField()})) @extend_schema(responses=inline_serializer("Workers", fields={"count": IntegerField()}))
def get(self, request: Request) -> Response: def get(self, request: Request) -> Response:

View File

@ -1,5 +1,4 @@
"""authentik admin app config""" """authentik admin app config"""
from prometheus_client import Gauge, Info from prometheus_client import Gauge, Info
from authentik.blueprints.apps import ManagedAppConfig from authentik.blueprints.apps import ManagedAppConfig
@ -15,3 +14,7 @@ class AuthentikAdminConfig(ManagedAppConfig):
label = "authentik_admin" label = "authentik_admin"
verbose_name = "authentik Admin" verbose_name = "authentik Admin"
default = True default = True
def reconcile_load_admin_signals(self):
"""Load admin signals"""
self.import_module("authentik.admin.signals")

View File

@ -1,22 +0,0 @@
"""authentik admin models"""
from django.db import models
from django.utils.translation import gettext_lazy as _
class VersionHistory(models.Model):
id = models.BigAutoField(primary_key=True)
timestamp = models.DateTimeField()
version = models.TextField()
build = models.TextField()
class Meta:
managed = False
db_table = "authentik_version_history"
ordering = ("-timestamp",)
verbose_name = _("Version history")
verbose_name_plural = _("Version history")
default_permissions = []
def __str__(self):
return f"{self.version}.{self.build} ({self.timestamp})"

View File

@ -1,5 +1,4 @@
"""authentik admin settings""" """authentik admin settings"""
from celery.schedules import crontab from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand from authentik.lib.utils.time import fqdn_rand

View File

@ -1,7 +1,7 @@
"""admin signals""" """admin signals"""
from django.dispatch import receiver from django.dispatch import receiver
from authentik.admin.api.tasks import TaskInfo
from authentik.admin.apps import GAUGE_WORKERS from authentik.admin.apps import GAUGE_WORKERS
from authentik.root.celery import CELERY_APP from authentik.root.celery import CELERY_APP
from authentik.root.monitoring import monitoring_set from authentik.root.monitoring import monitoring_set
@ -12,3 +12,10 @@ def monitoring_set_workers(sender, **kwargs):
"""Set worker gauge""" """Set worker gauge"""
count = len(CELERY_APP.control.ping(timeout=0.5)) count = len(CELERY_APP.control.ping(timeout=0.5))
GAUGE_WORKERS.set(count) GAUGE_WORKERS.set(count)
@receiver(monitoring_set)
def monitoring_set_tasks(sender, **kwargs):
"""Set task gauges"""
for task in TaskInfo.all().values():
task.update_metrics()
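
The receivers above update Prometheus gauges whenever the monitoring_set signal fires: one counts responding Celery workers, the other refreshes per-task metrics. A hedged sketch of the worker gauge with prometheus_client (the metric name and the celery_app parameter are illustrative):

# Sketch of a worker-count gauge; assumes prometheus_client is installed and
# a Celery application object is passed in. The metric name is illustrative.
from prometheus_client import Gauge

GAUGE_WORKERS = Gauge("example_admin_workers", "Currently connected workers")


def set_worker_gauge(celery_app) -> None:
    """Ping workers with a short timeout and record how many responded."""
    replies = celery_app.control.ping(timeout=0.5) or []
    GAUGE_WORKERS.set(len(replies))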

View File

@ -1,8 +1,9 @@
"""authentik admin tasks""" """authentik admin tasks"""
import re
from django.core.cache import cache from django.core.cache import cache
from django.core.validators import URLValidator
from django.db import DatabaseError, InternalError, ProgrammingError from django.db import DatabaseError, InternalError, ProgrammingError
from django.utils.translation import gettext_lazy as _
from packaging.version import parse from packaging.version import parse
from requests import RequestException from requests import RequestException
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
@ -10,15 +11,21 @@ from structlog.stdlib import get_logger
from authentik import __version__, get_build_hash from authentik import __version__, get_build_hash
from authentik.admin.apps import PROM_INFO from authentik.admin.apps import PROM_INFO
from authentik.events.models import Event, EventAction, Notification from authentik.events.models import Event, EventAction, Notification
from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task from authentik.events.monitored_tasks import (
MonitoredTask,
TaskResult,
TaskResultStatus,
prefill_task,
)
from authentik.lib.config import CONFIG from authentik.lib.config import CONFIG
from authentik.lib.utils.http import get_http_session from authentik.lib.utils.http import get_http_session
from authentik.root.celery import CELERY_APP from authentik.root.celery import CELERY_APP
LOGGER = get_logger() LOGGER = get_logger()
VERSION_NULL = "0.0.0"
VERSION_CACHE_KEY = "authentik_latest_version" VERSION_CACHE_KEY = "authentik_latest_version"
VERSION_CACHE_TIMEOUT = 8 * 60 * 60 # 8 hours VERSION_CACHE_TIMEOUT = 8 * 60 * 60 # 8 hours
# Chop off the first ^ because we want to search the entire string
URL_FINDER = URLValidator.regex.pattern[1:]
LOCAL_VERSION = parse(__version__) LOCAL_VERSION = parse(__version__)
@ -47,13 +54,13 @@ def clear_update_notifications():
notification.delete() notification.delete()
@CELERY_APP.task(bind=True, base=SystemTask) @CELERY_APP.task(bind=True, base=MonitoredTask)
@prefill_task @prefill_task
def update_latest_version(self: SystemTask): def update_latest_version(self: MonitoredTask):
"""Update latest version info""" """Update latest version info"""
if CONFIG.get_bool("disable_update_check"): if CONFIG.get_bool("disable_update_check"):
cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT) cache.set(VERSION_CACHE_KEY, "0.0.0", VERSION_CACHE_TIMEOUT)
self.set_status(TaskStatus.WARNING, "Version check disabled.") self.set_status(TaskResult(TaskResultStatus.WARNING, messages=["Version check disabled."]))
return return
try: try:
response = get_http_session().get( response = get_http_session().get(
@ -63,7 +70,9 @@ def update_latest_version(self: SystemTask):
data = response.json() data = response.json()
upstream_version = data.get("stable", {}).get("version") upstream_version = data.get("stable", {}).get("version")
cache.set(VERSION_CACHE_KEY, upstream_version, VERSION_CACHE_TIMEOUT) cache.set(VERSION_CACHE_KEY, upstream_version, VERSION_CACHE_TIMEOUT)
self.set_status(TaskStatus.SUCCESSFUL, "Successfully updated latest Version") self.set_status(
TaskResult(TaskResultStatus.SUCCESSFUL, ["Successfully updated latest Version"])
)
_set_prom_info() _set_prom_info()
# Check if upstream version is newer than what we're running, # Check if upstream version is newer than what we're running,
# and if no event exists yet, create one. # and if no event exists yet, create one.
@ -74,19 +83,13 @@ def update_latest_version(self: SystemTask):
context__new_version=upstream_version, context__new_version=upstream_version,
).exists(): ).exists():
return return
Event.new( event_dict = {"new_version": upstream_version}
EventAction.UPDATE_AVAILABLE, if match := re.search(URL_FINDER, data.get("stable", {}).get("changelog", "")):
message=_( event_dict["message"] = f"Changelog: {match.group()}"
"New version {version} available!".format( Event.new(EventAction.UPDATE_AVAILABLE, **event_dict).save()
version=upstream_version,
)
),
new_version=upstream_version,
changelog=data.get("stable", {}).get("changelog_url"),
).save()
except (RequestException, IndexError) as exc: except (RequestException, IndexError) as exc:
cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT) cache.set(VERSION_CACHE_KEY, "0.0.0", VERSION_CACHE_TIMEOUT)
self.set_error(exc) self.set_status(TaskResult(TaskResultStatus.ERROR).with_error(exc))
_set_prom_info() _set_prom_info()
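
Both sides of this diff implement the same update check: fetch release metadata over HTTP, cache the advertised stable version (falling back to "0.0.0" on errors), and compare it against the running version with packaging.version.parse. A reduced sketch of that flow, using a placeholder URL and an in-memory dict instead of the Django cache:

# Reduced sketch of the update check above; assumes "requests" and "packaging"
# are installed. The URL and the cache dict are placeholders, not authentik's.
import requests
from packaging.version import parse

RUNNING_VERSION = "2023.8.6"
VERSION_NULL = "0.0.0"
cache: dict[str, str] = {}


def check_for_update(url: str = "https://example.invalid/version.json") -> bool:
    """Return True if the advertised stable version is newer than ours."""
    try:
        data = requests.get(url, timeout=5).json()
        latest = data.get("stable", {}).get("version", VERSION_NULL)
    except (requests.RequestException, ValueError):
        latest = VERSION_NULL
    cache["authentik_latest_version"] = latest
    return parse(RUNNING_VERSION) < parse(latest)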

View File

@ -1,5 +1,4 @@
"""test admin api""" """test admin api"""
from json import loads from json import loads
from django.test import TestCase from django.test import TestCase
@ -8,6 +7,8 @@ from django.urls import reverse
from authentik import __version__ from authentik import __version__
from authentik.blueprints.tests import reconcile_app from authentik.blueprints.tests import reconcile_app
from authentik.core.models import Group, User from authentik.core.models import Group, User
from authentik.core.tasks import clean_expired_models
from authentik.events.monitored_tasks import TaskResultStatus
from authentik.lib.generators import generate_id from authentik.lib.generators import generate_id
@ -22,6 +23,53 @@ class TestAdminAPI(TestCase):
self.group.save() self.group.save()
self.client.force_login(self.user) self.client.force_login(self.user)
def test_tasks(self):
"""Test Task API"""
clean_expired_models.delay()
response = self.client.get(reverse("authentik_api:admin_system_tasks-list"))
self.assertEqual(response.status_code, 200)
body = loads(response.content)
self.assertTrue(any(task["task_name"] == "clean_expired_models" for task in body))
def test_tasks_single(self):
"""Test Task API (read single)"""
clean_expired_models.delay()
response = self.client.get(
reverse(
"authentik_api:admin_system_tasks-detail",
kwargs={"pk": "clean_expired_models"},
)
)
self.assertEqual(response.status_code, 200)
body = loads(response.content)
self.assertEqual(body["status"], TaskResultStatus.SUCCESSFUL.name)
self.assertEqual(body["task_name"], "clean_expired_models")
response = self.client.get(
reverse("authentik_api:admin_system_tasks-detail", kwargs={"pk": "qwerqwer"})
)
self.assertEqual(response.status_code, 404)
def test_tasks_retry(self):
"""Test Task API (retry)"""
clean_expired_models.delay()
response = self.client.post(
reverse(
"authentik_api:admin_system_tasks-retry",
kwargs={"pk": "clean_expired_models"},
)
)
self.assertEqual(response.status_code, 204)
def test_tasks_retry_404(self):
"""Test Task API (retry, 404)"""
response = self.client.post(
reverse(
"authentik_api:admin_system_tasks-retry",
kwargs={"pk": "qwerqewrqrqewrqewr"},
)
)
self.assertEqual(response.status_code, 404)
def test_version(self): def test_version(self):
"""Test Version API""" """Test Version API"""
response = self.client.get(reverse("authentik_api:admin_version")) response = self.client.get(reverse("authentik_api:admin_version"))

View File

@ -1,5 +1,4 @@
"""test admin tasks""" """test admin tasks"""
from django.core.cache import cache from django.core.cache import cache
from django.test import TestCase from django.test import TestCase
from requests_mock import Mocker from requests_mock import Mocker
@ -17,7 +16,6 @@ RESPONSE_VALID = {
"stable": { "stable": {
"version": "99999999.9999999", "version": "99999999.9999999",
"changelog": "See https://goauthentik.io/test", "changelog": "See https://goauthentik.io/test",
"changelog_url": "https://goauthentik.io/test",
"reason": "bugfix", "reason": "bugfix",
}, },
} }
@ -36,7 +34,7 @@ class TestAdminTasks(TestCase):
Event.objects.filter( Event.objects.filter(
action=EventAction.UPDATE_AVAILABLE, action=EventAction.UPDATE_AVAILABLE,
context__new_version="99999999.9999999", context__new_version="99999999.9999999",
context__message="New version 99999999.9999999 available!", context__message="Changelog: https://goauthentik.io/test",
).exists() ).exists()
) )
# test that a consecutive check doesn't create a duplicate event # test that a consecutive check doesn't create a duplicate event
@ -46,7 +44,7 @@ class TestAdminTasks(TestCase):
Event.objects.filter( Event.objects.filter(
action=EventAction.UPDATE_AVAILABLE, action=EventAction.UPDATE_AVAILABLE,
context__new_version="99999999.9999999", context__new_version="99999999.9999999",
context__message="New version 99999999.9999999 available!", context__message="Changelog: https://goauthentik.io/test",
) )
), ),
1, 1,

View File

@ -1,15 +1,15 @@
"""API URLs""" """API URLs"""
from django.urls import path from django.urls import path
from authentik.admin.api.meta import AppsViewSet, ModelViewSet from authentik.admin.api.meta import AppsViewSet, ModelViewSet
from authentik.admin.api.metrics import AdministrationMetricsViewSet from authentik.admin.api.metrics import AdministrationMetricsViewSet
from authentik.admin.api.system import SystemView from authentik.admin.api.system import SystemView
from authentik.admin.api.tasks import TaskViewSet
from authentik.admin.api.version import VersionView from authentik.admin.api.version import VersionView
from authentik.admin.api.version_history import VersionHistoryViewSet
from authentik.admin.api.workers import WorkerView from authentik.admin.api.workers import WorkerView
api_urlpatterns = [ api_urlpatterns = [
("admin/system_tasks", TaskViewSet, "admin_system_tasks"),
("admin/apps", AppsViewSet, "apps"), ("admin/apps", AppsViewSet, "apps"),
("admin/models", ModelViewSet, "models"), ("admin/models", ModelViewSet, "models"),
path( path(
@ -18,7 +18,6 @@ api_urlpatterns = [
name="admin_metrics", name="admin_metrics",
), ),
path("admin/version/", VersionView.as_view(), name="admin_version"), path("admin/version/", VersionView.as_view(), name="admin_version"),
("admin/version/history", VersionHistoryViewSet, "version_history"),
path("admin/workers/", WorkerView.as_view(), name="admin_workers"), path("admin/workers/", WorkerView.as_view(), name="admin_workers"),
path("admin/system/", SystemView.as_view(), name="admin_system"), path("admin/system/", SystemView.as_view(), name="admin_system"),
] ]

View File

@ -10,3 +10,26 @@ class AuthentikAPIConfig(AppConfig):
label = "authentik_api" label = "authentik_api"
mountpoint = "api/" mountpoint = "api/"
verbose_name = "authentik API" verbose_name = "authentik API"
def ready(self) -> None:
from drf_spectacular.extensions import OpenApiAuthenticationExtension
from authentik.api.authentication import TokenAuthentication
# Class is defined here as it needs to be created early enough that drf-spectacular will
# find it, but also won't cause any import issues
# pylint: disable=unused-variable
class TokenSchema(OpenApiAuthenticationExtension):
"""Auth schema"""
target_class = TokenAuthentication
name = "authentik"
def get_security_definition(self, auto_schema):
"""Auth schema"""
return {
"type": "apiKey",
"in": "header",
"name": "Authorization",
"scheme": "bearer",
}

View File

@ -1,10 +1,8 @@
"""API Authentication""" """API Authentication"""
from hmac import compare_digest from hmac import compare_digest
from typing import Any from typing import Any, Optional
from django.conf import settings from django.conf import settings
from drf_spectacular.extensions import OpenApiAuthenticationExtension
from rest_framework.authentication import BaseAuthentication, get_authorization_header from rest_framework.authentication import BaseAuthentication, get_authorization_header
from rest_framework.exceptions import AuthenticationFailed from rest_framework.exceptions import AuthenticationFailed
from rest_framework.request import Request from rest_framework.request import Request
@ -18,7 +16,7 @@ from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
LOGGER = get_logger() LOGGER = get_logger()
def validate_auth(header: bytes) -> str | None: def validate_auth(header: bytes) -> Optional[str]:
"""Validate that the header is in a correct format, """Validate that the header is in a correct format,
returns type and credentials""" returns type and credentials"""
auth_credentials = header.decode().strip() auth_credentials = header.decode().strip()
@ -33,7 +31,7 @@ def validate_auth(header: bytes) -> str | None:
return auth_credentials return auth_credentials
def bearer_auth(raw_header: bytes) -> User | None: def bearer_auth(raw_header: bytes) -> Optional[User]:
"""raw_header in the Format of `Bearer ....`""" """raw_header in the Format of `Bearer ....`"""
user = auth_user_lookup(raw_header) user = auth_user_lookup(raw_header)
if not user: if not user:
@ -43,7 +41,7 @@ def bearer_auth(raw_header: bytes) -> User | None:
return user return user
def auth_user_lookup(raw_header: bytes) -> User | None: def auth_user_lookup(raw_header: bytes) -> Optional[User]:
"""raw_header in the Format of `Bearer ....`""" """raw_header in the Format of `Bearer ....`"""
from authentik.providers.oauth2.models import AccessToken from authentik.providers.oauth2.models import AccessToken
@ -76,7 +74,7 @@ def auth_user_lookup(raw_header: bytes) -> User | None:
raise AuthenticationFailed("Token invalid/expired") raise AuthenticationFailed("Token invalid/expired")
def token_secret_key(value: str) -> User | None: def token_secret_key(value: str) -> Optional[User]:
"""Check if the token is the secret key """Check if the token is the secret key
and return the service account for the managed outpost""" and return the service account for the managed outpost"""
from authentik.outposts.apps import MANAGED_OUTPOST from authentik.outposts.apps import MANAGED_OUTPOST
@ -103,14 +101,3 @@ class TokenAuthentication(BaseAuthentication):
return None return None
return (user, None) # pragma: no cover return (user, None) # pragma: no cover
class TokenSchema(OpenApiAuthenticationExtension):
"""Auth schema"""
target_class = TokenAuthentication
name = "authentik"
def get_security_definition(self, auto_schema):
"""Auth schema"""
return {"type": "http", "scheme": "bearer"}

View File

@ -1,5 +1,4 @@
"""API Authorization""" """API Authorization"""
from django.conf import settings from django.conf import settings
from django.db.models import Model from django.db.models import Model
from django.db.models.query import QuerySet from django.db.models.query import QuerySet
@ -8,9 +7,9 @@ from rest_framework.authentication import get_authorization_header
from rest_framework.filters import BaseFilterBackend from rest_framework.filters import BaseFilterBackend
from rest_framework.permissions import BasePermission from rest_framework.permissions import BasePermission
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework_guardian.filters import ObjectPermissionsFilter
from authentik.api.authentication import validate_auth from authentik.api.authentication import validate_auth
from authentik.rbac.filters import ObjectFilter
class OwnerFilter(BaseFilterBackend): class OwnerFilter(BaseFilterBackend):
@ -27,14 +26,14 @@ class OwnerFilter(BaseFilterBackend):
class SecretKeyFilter(DjangoFilterBackend): class SecretKeyFilter(DjangoFilterBackend):
"""Allow access to all objects when authenticated with secret key as token. """Allow access to all objects when authenticated with secret key as token.
Replaces both DjangoFilterBackend and ObjectFilter""" Replaces both DjangoFilterBackend and ObjectPermissionsFilter"""
def filter_queryset(self, request: Request, queryset: QuerySet, view) -> QuerySet: def filter_queryset(self, request: Request, queryset: QuerySet, view) -> QuerySet:
auth_header = get_authorization_header(request) auth_header = get_authorization_header(request)
token = validate_auth(auth_header) token = validate_auth(auth_header)
if token and token == settings.SECRET_KEY: if token and token == settings.SECRET_KEY:
return queryset return queryset
queryset = ObjectFilter().filter_queryset(request, queryset, view) queryset = ObjectPermissionsFilter().filter_queryset(request, queryset, view)
return super().filter_queryset(request, queryset, view) return super().filter_queryset(request, queryset, view)

View File

@ -0,0 +1,35 @@
"""API Decorators"""
from functools import wraps
from typing import Callable, Optional
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet
from structlog.stdlib import get_logger
LOGGER = get_logger()
def permission_required(perm: Optional[str] = None, other_perms: Optional[list[str]] = None):
"""Check permissions for a single custom action"""
def wrapper_outter(func: Callable):
"""Check permissions for a single custom action"""
@wraps(func)
def wrapper(self: ModelViewSet, request: Request, *args, **kwargs) -> Response:
if perm:
obj = self.get_object()
if not request.user.has_perm(perm, obj):
LOGGER.debug("denying access for object", user=request.user, perm=perm, obj=obj)
return self.permission_denied(request)
if other_perms:
for other_perm in other_perms:
if not request.user.has_perm(other_perm):
LOGGER.debug("denying access for other", user=request.user, perm=perm)
return self.permission_denied(request)
return func(self, request, *args, **kwargs)
return wrapper
return wrapper_outter
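
The decorator above guards a single custom viewset action: the first argument is checked as an object permission against get_object(), the optional list as plain global permissions. Usage would look roughly like the application metrics endpoint exercised by the tests further down; the viewset body here is illustrative and assumes a configured Django/DRF project:

# Rough usage sketch; assumes a configured Django/DRF project and that
# permission_required from the module above is importable. The object
# permission string mirrors the application-metrics tests below; the extra
# event permission is an assumption for illustration only.
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet


class ApplicationViewSet(ModelViewSet):
    @permission_required(
        "authentik_core.view_application",
        ["authentik_events.view_event"],
    )
    @action(detail=True, methods=["get"])
    def metrics(self, request, slug=None):
        """Only reached when both permission checks above pass."""
        app = self.get_object()
        return Response({"slug": app.slug})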

View File

@ -1,5 +1,4 @@
"""Pagination which includes total pages and current page""" """Pagination which includes total pages and current page"""
from rest_framework import pagination from rest_framework import pagination
from rest_framework.response import Response from rest_framework.response import Response
@ -78,10 +77,3 @@ class Pagination(pagination.PageNumberPagination):
}, },
"required": ["pagination", "results"], "required": ["pagination", "results"],
} }
class SmallerPagination(Pagination):
"""Smaller pagination for objects which might require a lot of queries
to retrieve all data for."""
max_page_size = 10

View File

@ -1,5 +1,4 @@
"""Error Response schema, from https://github.com/axnsan12/drf-yasg/issues/224""" """Error Response schema, from https://github.com/axnsan12/drf-yasg/issues/224"""
from django.utils.translation import gettext_lazy as _ from django.utils.translation import gettext_lazy as _
from drf_spectacular.generators import SchemaGenerator from drf_spectacular.generators import SchemaGenerator
from drf_spectacular.plumbing import ( from drf_spectacular.plumbing import (
@ -12,7 +11,6 @@ from drf_spectacular.settings import spectacular_settings
from drf_spectacular.types import OpenApiTypes from drf_spectacular.types import OpenApiTypes
from rest_framework.settings import api_settings from rest_framework.settings import api_settings
from authentik.api.apps import AuthentikAPIConfig
from authentik.api.pagination import PAGINATION_COMPONENT_NAME, PAGINATION_SCHEMA from authentik.api.pagination import PAGINATION_COMPONENT_NAME, PAGINATION_SCHEMA
@ -102,12 +100,3 @@ def postprocess_schema_responses(result, generator: SchemaGenerator, **kwargs):
comp = result["components"]["schemas"][component] comp = result["components"]["schemas"][component]
comp["additionalProperties"] = {} comp["additionalProperties"] = {}
return result return result
def preprocess_schema_exclude_non_api(endpoints, **kwargs):
"""Filter out all API Views which are not mounted under /api"""
return [
(path, path_regex, method, callback)
for path, path_regex, method, callback in endpoints
if path.startswith("/" + AuthentikAPIConfig.mountpoint)
]

View File

@ -1,13 +1,13 @@
{% extends "base/skeleton.html" %} {% extends "base/skeleton.html" %}
{% load authentik_core %} {% load static %}
{% block title %} {% block title %}
API Browser - {{ brand.branding_title }} API Browser - {{ tenant.branding_title }}
{% endblock %} {% endblock %}
{% block head %} {% block head %}
<script src="{% versioned_script 'dist/standalone/api-browser/index-%v.js' %}" type="module"></script> <script src="{% static 'dist/standalone/api-browser/index.js' %}?version={{ version }}" type="module"></script>
<meta name="theme-color" content="#151515" media="(prefers-color-scheme: light)"> <meta name="theme-color" content="#151515" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#151515" media="(prefers-color-scheme: dark)"> <meta name="theme-color" content="#151515" media="(prefers-color-scheme: dark)">
{% endblock %} {% endblock %}

View File

@ -1,5 +1,4 @@
"""Test API Authentication""" """Test API Authentication"""
import json import json
from base64 import b64encode from base64 import b64encode
@ -13,8 +12,6 @@ from authentik.blueprints.tests import reconcile_app
from authentik.core.models import Token, TokenIntents, User, UserTypes from authentik.core.models import Token, TokenIntents, User, UserTypes
from authentik.core.tests.utils import create_test_admin_user, create_test_flow from authentik.core.tests.utils import create_test_admin_user, create_test_flow
from authentik.lib.generators import generate_id from authentik.lib.generators import generate_id
from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.models import Outpost
from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
from authentik.providers.oauth2.models import AccessToken, OAuth2Provider from authentik.providers.oauth2.models import AccessToken, OAuth2Provider
@ -25,17 +22,17 @@ class TestAPIAuth(TestCase):
def test_invalid_type(self): def test_invalid_type(self):
"""Test invalid type""" """Test invalid type"""
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
bearer_auth(b"foo bar") bearer_auth("foo bar".encode())
def test_invalid_empty(self): def test_invalid_empty(self):
"""Test invalid type""" """Test invalid type"""
self.assertIsNone(bearer_auth(b"Bearer ")) self.assertIsNone(bearer_auth("Bearer ".encode()))
self.assertIsNone(bearer_auth(b"")) self.assertIsNone(bearer_auth("".encode()))
def test_invalid_no_token(self): def test_invalid_no_token(self):
"""Test invalid with no token""" """Test invalid with no token"""
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
auth = b64encode(b":abc").decode() auth = b64encode(":abc".encode()).decode()
self.assertIsNone(bearer_auth(f"Basic :{auth}".encode())) self.assertIsNone(bearer_auth(f"Basic :{auth}".encode()))
def test_bearer_valid(self): def test_bearer_valid(self):
@ -52,12 +49,8 @@ class TestAPIAuth(TestCase):
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
bearer_auth(f"Bearer {token.key}".encode()) bearer_auth(f"Bearer {token.key}".encode())
@reconcile_app("authentik_outposts") def test_managed_outpost(self):
def test_managed_outpost_fail(self):
"""Test managed outpost""" """Test managed outpost"""
outpost = Outpost.objects.filter(managed=MANAGED_OUTPOST).first()
outpost.user.delete()
outpost.delete()
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
bearer_auth(f"Bearer {settings.SECRET_KEY}".encode()) bearer_auth(f"Bearer {settings.SECRET_KEY}".encode())

View File

@ -1,5 +1,4 @@
"""Test config API""" """Test config API"""
from json import loads from json import loads
from django.urls import reverse from django.urls import reverse

View File

@ -0,0 +1,34 @@
"""test decorators api"""
from django.urls import reverse
from guardian.shortcuts import assign_perm
from rest_framework.test import APITestCase
from authentik.core.models import Application, User
from authentik.lib.generators import generate_id
class TestAPIDecorators(APITestCase):
"""test decorators api"""
def setUp(self) -> None:
super().setUp()
self.user = User.objects.create(username="test-user")
def test_obj_perm_denied(self):
"""Test object perm denied"""
self.client.force_login(self.user)
app = Application.objects.create(name=generate_id(), slug=generate_id())
response = self.client.get(
reverse("authentik_api:application-metrics", kwargs={"slug": app.slug})
)
self.assertEqual(response.status_code, 403)
def test_other_perm_denied(self):
"""Test other perm denied"""
self.client.force_login(self.user)
app = Application.objects.create(name=generate_id(), slug=generate_id())
assign_perm("authentik_core.view_application", self.user, app)
response = self.client.get(
reverse("authentik_api:application-metrics", kwargs={"slug": app.slug})
)
self.assertEqual(response.status_code, 403)

View File

@ -1,5 +1,4 @@
"""Schema generation tests""" """Schema generation tests"""
from django.urls import reverse from django.urls import reverse
from rest_framework.test import APITestCase from rest_framework.test import APITestCase
from yaml import safe_load from yaml import safe_load

View File

@ -1,6 +1,5 @@
"""authentik API Modelviewset tests""" """authentik API Modelviewset tests"""
from typing import Callable
from collections.abc import Callable
from django.test import TestCase from django.test import TestCase
from rest_framework.viewsets import ModelViewSet, ReadOnlyModelViewSet from rest_framework.viewsets import ModelViewSet, ReadOnlyModelViewSet
@ -17,7 +16,6 @@ def viewset_tester_factory(test_viewset: type[ModelViewSet]) -> Callable:
def tester(self: TestModelViewSets): def tester(self: TestModelViewSets):
self.assertIsNotNone(getattr(test_viewset, "search_fields", None)) self.assertIsNotNone(getattr(test_viewset, "search_fields", None))
self.assertIsNotNone(getattr(test_viewset, "ordering", None))
filterset_class = getattr(test_viewset, "filterset_class", None) filterset_class = getattr(test_viewset, "filterset_class", None)
if not filterset_class: if not filterset_class:
self.assertIsNotNone(getattr(test_viewset, "filterset_fields", None)) self.assertIsNotNone(getattr(test_viewset, "filterset_fields", None))
@ -26,6 +24,6 @@ def viewset_tester_factory(test_viewset: type[ModelViewSet]) -> Callable:
for _, viewset, _ in router.registry: for _, viewset, _ in router.registry:
if not issubclass(viewset, ModelViewSet | ReadOnlyModelViewSet): if not issubclass(viewset, (ModelViewSet, ReadOnlyModelViewSet)):
continue continue
setattr(TestModelViewSets, f"test_viewset_{viewset.__name__}", viewset_tester_factory(viewset)) setattr(TestModelViewSets, f"test_viewset_{viewset.__name__}", viewset_tester_factory(viewset))

View File

@ -1,5 +1,4 @@
"""authentik api urls""" """authentik api urls"""
from django.urls import include, path from django.urls import include, path
from authentik.api.v3.urls import urlpatterns as v3_urls from authentik.api.v3.urls import urlpatterns as v3_urls

View File

@ -1,5 +1,4 @@
"""core Configs API""" """core Configs API"""
from pathlib import Path from pathlib import Path
from django.conf import settings from django.conf import settings
@ -20,7 +19,7 @@ from rest_framework.response import Response
from rest_framework.views import APIView from rest_framework.views import APIView
from authentik.core.api.utils import PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.events.context_processors.base import get_context_processors from authentik.events.geo import GEOIP_READER
from authentik.lib.config import CONFIG from authentik.lib.config import CONFIG
capabilities = Signal() capabilities = Signal()
@ -31,7 +30,6 @@ class Capabilities(models.TextChoices):
CAN_SAVE_MEDIA = "can_save_media" CAN_SAVE_MEDIA = "can_save_media"
CAN_GEO_IP = "can_geo_ip" CAN_GEO_IP = "can_geo_ip"
CAN_ASN = "can_asn"
CAN_IMPERSONATE = "can_impersonate" CAN_IMPERSONATE = "can_impersonate"
CAN_DEBUG = "can_debug" CAN_DEBUG = "can_debug"
IS_ENTERPRISE = "is_enterprise" IS_ENTERPRISE = "is_enterprise"
@ -68,16 +66,11 @@ class ConfigView(APIView):
"""Get all capabilities this server instance supports""" """Get all capabilities this server instance supports"""
caps = [] caps = []
deb_test = settings.DEBUG or settings.TEST deb_test = settings.DEBUG or settings.TEST
if ( if Path(settings.MEDIA_ROOT).is_mount() or deb_test:
CONFIG.get("storage.media.backend", "file") == "s3"
or Path(settings.STORAGES["default"]["OPTIONS"]["location"]).is_mount()
or deb_test
):
caps.append(Capabilities.CAN_SAVE_MEDIA) caps.append(Capabilities.CAN_SAVE_MEDIA)
for processor in get_context_processors(): if GEOIP_READER.enabled:
if cap := processor.capability(): caps.append(Capabilities.CAN_GEO_IP)
caps.append(cap) if CONFIG.get_bool("impersonation"):
if self.request.tenant.impersonation:
caps.append(Capabilities.CAN_IMPERSONATE) caps.append(Capabilities.CAN_IMPERSONATE)
if settings.DEBUG: # pragma: no cover if settings.DEBUG: # pragma: no cover
caps.append(Capabilities.CAN_DEBUG) caps.append(Capabilities.CAN_DEBUG)
@ -100,10 +93,10 @@ class ConfigView(APIView):
"traces_sample_rate": float(CONFIG.get("error_reporting.sample_rate", 0.4)), "traces_sample_rate": float(CONFIG.get("error_reporting.sample_rate", 0.4)),
}, },
"capabilities": self.get_capabilities(), "capabilities": self.get_capabilities(),
"cache_timeout": CONFIG.get_int("cache.timeout"), "cache_timeout": CONFIG.get_int("redis.cache_timeout"),
"cache_timeout_flows": CONFIG.get_int("cache.timeout_flows"), "cache_timeout_flows": CONFIG.get_int("redis.cache_timeout_flows"),
"cache_timeout_policies": CONFIG.get_int("cache.timeout_policies"), "cache_timeout_policies": CONFIG.get_int("redis.cache_timeout_policies"),
"cache_timeout_reputation": CONFIG.get_int("cache.timeout_reputation"), "cache_timeout_reputation": CONFIG.get_int("redis.cache_timeout_reputation"),
} }
) )

View File

@ -1,5 +1,4 @@
"""api v3 urls""" """api v3 urls"""
from importlib import import_module from importlib import import_module
from django.urls import path from django.urls import path
@ -22,9 +21,7 @@ _other_urls = []
for _authentik_app in get_apps(): for _authentik_app in get_apps():
try: try:
api_urls = import_module(f"{_authentik_app.name}.urls") api_urls = import_module(f"{_authentik_app.name}.urls")
except ModuleNotFoundError: except (ModuleNotFoundError, ImportError) as exc:
continue
except ImportError as exc:
LOGGER.warning("Could not import app's URLs", app_name=_authentik_app.name, exc=exc) LOGGER.warning("Could not import app's URLs", app_name=_authentik_app.name, exc=exc)
continue continue
if not hasattr(api_urls, "api_urlpatterns"): if not hasattr(api_urls, "api_urlpatterns"):
@ -33,7 +30,7 @@ for _authentik_app in get_apps():
app_name=_authentik_app.name, app_name=_authentik_app.name,
) )
continue continue
urls: list = api_urls.api_urlpatterns urls: list = getattr(api_urls, "api_urlpatterns")
for url in urls: for url in urls:
if isinstance(url, URLPattern): if isinstance(url, URLPattern):
_other_urls.append(url) _other_urls.append(url)

View File

@ -1,5 +1,4 @@
"""General API Views""" """General API Views"""
from typing import Any from typing import Any
from django.urls import reverse from django.urls import reverse

View File

@ -1,22 +1,22 @@
"""Serializer mixin for managed models""" """Serializer mixin for managed models"""
from django.utils.translation import gettext_lazy as _ from django.utils.translation import gettext_lazy as _
from drf_spectacular.utils import extend_schema, inline_serializer from drf_spectacular.utils import extend_schema, inline_serializer
from rest_framework.decorators import action from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField, DateTimeField from rest_framework.fields import CharField, DateTimeField, JSONField
from rest_framework.permissions import IsAdminUser
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.serializers import ListSerializer, ModelSerializer from rest_framework.serializers import ListSerializer, ModelSerializer
from rest_framework.viewsets import ModelViewSet from rest_framework.viewsets import ModelViewSet
from authentik.api.decorators import permission_required
from authentik.blueprints.models import BlueprintInstance from authentik.blueprints.models import BlueprintInstance
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
from authentik.blueprints.v1.oci import OCI_PREFIX from authentik.blueprints.v1.oci import OCI_PREFIX
from authentik.blueprints.v1.tasks import apply_blueprint, blueprints_find_dict from authentik.blueprints.v1.tasks import apply_blueprint, blueprints_find_dict
from authentik.core.api.used_by import UsedByMixin from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import JSONDictField, PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.rbac.decorators import permission_required
class ManagedSerializer: class ManagedSerializer:
@ -29,7 +29,7 @@ class MetadataSerializer(PassiveSerializer):
"""Serializer for blueprint metadata""" """Serializer for blueprint metadata"""
name = CharField() name = CharField()
labels = JSONDictField() labels = JSONField()
class BlueprintInstanceSerializer(ModelSerializer): class BlueprintInstanceSerializer(ModelSerializer):
@ -49,14 +49,10 @@ class BlueprintInstanceSerializer(ModelSerializer):
if content == "": if content == "":
return content return content
context = self.instance.context if self.instance else {} context = self.instance.context if self.instance else {}
valid, logs = Importer.from_string(content, context).validate() valid, logs = Importer(content, context).validate()
if not valid: if not valid:
raise ValidationError( text_logs = "\n".join([x["event"] for x in logs])
[ raise ValidationError(_("Failed to validate blueprint: %(logs)s" % {"logs": text_logs}))
_("Failed to validate blueprint"),
*[f"- {x.event}" for x in logs],
]
)
return content return content
def validate(self, attrs: dict) -> dict: def validate(self, attrs: dict) -> dict:
@ -91,11 +87,11 @@ class BlueprintInstanceSerializer(ModelSerializer):
class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet): class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
"""Blueprint instances""" """Blueprint instances"""
permission_classes = [IsAdminUser]
serializer_class = BlueprintInstanceSerializer serializer_class = BlueprintInstanceSerializer
queryset = BlueprintInstance.objects.all() queryset = BlueprintInstance.objects.all()
search_fields = ["name", "path"] search_fields = ["name", "path"]
filterset_fields = ["name", "path"] filterset_fields = ["name", "path"]
ordering = ["name"]
@extend_schema( @extend_schema(
responses={ responses={

View File

@ -1,6 +1,5 @@
"""authentik Blueprints app""" """authentik Blueprints app"""
from collections.abc import Callable
from importlib import import_module from importlib import import_module
from inspect import ismethod from inspect import ismethod
@ -8,100 +7,40 @@ from django.apps import AppConfig
from django.db import DatabaseError, InternalError, ProgrammingError from django.db import DatabaseError, InternalError, ProgrammingError
from structlog.stdlib import BoundLogger, get_logger from structlog.stdlib import BoundLogger, get_logger
from authentik.root.signals import startup
class ManagedAppConfig(AppConfig): class ManagedAppConfig(AppConfig):
"""Basic reconciliation logic for apps""" """Basic reconciliation logic for apps"""
logger: BoundLogger _logger: BoundLogger
RECONCILE_GLOBAL_CATEGORY: str = "global"
RECONCILE_TENANT_CATEGORY: str = "tenant"
def __init__(self, app_name: str, *args, **kwargs) -> None: def __init__(self, app_name: str, *args, **kwargs) -> None:
super().__init__(app_name, *args, **kwargs) super().__init__(app_name, *args, **kwargs)
self.logger = get_logger().bind(app_name=app_name) self._logger = get_logger().bind(app_name=app_name)
def ready(self) -> None: def ready(self) -> None:
self.import_related() self.reconcile()
startup.connect(self._on_startup_callback, dispatch_uid=self.label)
return super().ready() return super().ready()
def _on_startup_callback(self, sender, **_):
self._reconcile_global()
self._reconcile_tenant()
def import_related(self):
"""Automatically import related modules which rely on just being imported
to register themselves (mainly django signals and celery tasks)"""
def import_relative(rel_module: str):
try:
module_name = f"{self.name}.{rel_module}"
import_module(module_name)
self.logger.info("Imported related module", module=module_name)
except ModuleNotFoundError:
pass
import_relative("checks")
import_relative("tasks")
import_relative("signals")
def import_module(self, path: str): def import_module(self, path: str):
"""Load module""" """Load module"""
import_module(path) import_module(path)
def _reconcile(self, prefix: str) -> None: def reconcile(self) -> None:
"""reconcile ourselves"""
prefix = "reconcile_"
for meth_name in dir(self): for meth_name in dir(self):
meth = getattr(self, meth_name) meth = getattr(self, meth_name)
if not ismethod(meth): if not ismethod(meth):
continue continue
category = getattr(meth, "_authentik_managed_reconcile", None) if not meth_name.startswith(prefix):
if category != prefix:
continue continue
name = meth_name.replace(prefix, "") name = meth_name.replace(prefix, "")
try: try:
self.logger.debug("Starting reconciler", name=name) self._logger.debug("Starting reconciler", name=name)
meth() meth()
self.logger.debug("Successfully reconciled", name=name) self._logger.debug("Successfully reconciled", name=name)
except (DatabaseError, ProgrammingError, InternalError) as exc: except (DatabaseError, ProgrammingError, InternalError) as exc:
self.logger.warning("Failed to run reconcile", name=name, exc=exc) self._logger.debug("Failed to run reconcile", name=name, exc=exc)
@staticmethod
def reconcile_tenant(func: Callable):
"""Mark a function to be called on startup (for each tenant)"""
func._authentik_managed_reconcile = ManagedAppConfig.RECONCILE_TENANT_CATEGORY
return func
@staticmethod
def reconcile_global(func: Callable):
"""Mark a function to be called on startup (globally)"""
func._authentik_managed_reconcile = ManagedAppConfig.RECONCILE_GLOBAL_CATEGORY
return func
def _reconcile_tenant(self) -> None:
"""reconcile ourselves for tenanted methods"""
from authentik.tenants.models import Tenant
try:
tenants = list(Tenant.objects.filter(ready=True))
except (DatabaseError, ProgrammingError, InternalError) as exc:
self.logger.debug("Failed to get tenants to run reconcile", exc=exc)
return
for tenant in tenants:
with tenant:
self._reconcile(self.RECONCILE_TENANT_CATEGORY)
def _reconcile_global(self) -> None:
"""
reconcile ourselves for global methods.
Used for signals, tasks, etc. Database queries should not be made in here.
"""
from django_tenants.utils import get_public_schema_name, schema_context
with schema_context(get_public_schema_name()):
self._reconcile(self.RECONCILE_GLOBAL_CATEGORY)
class AuthentikBlueprintsConfig(ManagedAppConfig): class AuthentikBlueprintsConfig(ManagedAppConfig):
@ -112,13 +51,11 @@ class AuthentikBlueprintsConfig(ManagedAppConfig):
verbose_name = "authentik Blueprints" verbose_name = "authentik Blueprints"
default = True default = True
@ManagedAppConfig.reconcile_global def reconcile_load_blueprints_v1_tasks(self):
def load_blueprints_v1_tasks(self):
"""Load v1 tasks""" """Load v1 tasks"""
self.import_module("authentik.blueprints.v1.tasks") self.import_module("authentik.blueprints.v1.tasks")
@ManagedAppConfig.reconcile_tenant def reconcile_blueprints_discovery(self):
def blueprints_discovery(self):
"""Run blueprint discovery""" """Run blueprint discovery"""
from authentik.blueprints.v1.tasks import blueprints_discovery, clear_failed_blueprints from authentik.blueprints.v1.tasks import blueprints_discovery, clear_failed_blueprints
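
Both versions of ManagedAppConfig discover reconciler methods on each app config at startup: the older side by the reconcile_ name prefix, the newer side via reconcile_global/reconcile_tenant markers. A toy, Django-free sketch of the prefix-based discovery:

# Toy sketch of prefix-based reconciler discovery, as on the right-hand side
# of the diff above; no Django involved, class names are illustrative.
from inspect import ismethod


class ManagedConfig:
    def reconcile(self) -> None:
        prefix = "reconcile_"
        for meth_name in dir(self):
            meth = getattr(self, meth_name)
            if not ismethod(meth) or not meth_name.startswith(prefix):
                continue
            try:
                meth()
            except Exception as exc:  # a real app would narrow this
                print(f"reconciler {meth_name} failed: {exc}")


class ExampleConfig(ManagedConfig):
    def reconcile_load_signals(self):
        print("signals loaded")


ExampleConfig().reconcile()  # prints "signals loaded"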

View File

@ -1,5 +1,4 @@
"""Apply blueprint from commandline""" """Apply blueprint from commandline"""
from sys import exit as sys_exit from sys import exit as sys_exit
from django.core.management.base import BaseCommand, no_translations from django.core.management.base import BaseCommand, no_translations
@ -7,7 +6,6 @@ from structlog.stdlib import get_logger
from authentik.blueprints.models import BlueprintInstance from authentik.blueprints.models import BlueprintInstance
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
from authentik.tenants.models import Tenant
LOGGER = get_logger() LOGGER = get_logger()
@ -18,18 +16,14 @@ class Command(BaseCommand):
@no_translations @no_translations
def handle(self, *args, **options): def handle(self, *args, **options):
"""Apply all blueprints in order, abort when one fails to import""" """Apply all blueprints in order, abort when one fails to import"""
for tenant in Tenant.objects.filter(ready=True): for blueprint_path in options.get("blueprints", []):
with tenant: content = BlueprintInstance(path=blueprint_path).retrieve()
for blueprint_path in options.get("blueprints", []): importer = Importer(content)
content = BlueprintInstance(path=blueprint_path).retrieve() valid, _ = importer.validate()
importer = Importer.from_string(content) if not valid:
valid, logs = importer.validate() self.stderr.write("blueprint invalid")
if not valid: sys_exit(1)
self.stderr.write("Blueprint invalid") importer.apply()
for log in logs:
self.stderr.write(f"\t{log.logger}: {log.event}: {log.attributes}")
sys_exit(1)
importer.apply()
def add_arguments(self, parser): def add_arguments(self, parser):
parser.add_argument("blueprints", nargs="+", type=str) parser.add_argument("blueprints", nargs="+", type=str)

View File

@ -1,19 +1,17 @@
"""Export blueprint of current authentik install""" """Export blueprint of current authentik install"""
from django.core.management.base import BaseCommand, no_translations
from django.core.management.base import no_translations
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik.blueprints.v1.exporter import Exporter from authentik.blueprints.v1.exporter import Exporter
from authentik.tenants.management import TenantCommand
LOGGER = get_logger() LOGGER = get_logger()
class Command(TenantCommand): class Command(BaseCommand):
"""Export blueprint of current authentik install""" """Export blueprint of current authentik install"""
@no_translations @no_translations
def handle_per_tenant(self, *args, **options): def handle(self, *args, **options):
"""Export blueprint of current authentik install""" """Export blueprint of current authentik install"""
exporter = Exporter() exporter = Exporter()
self.stdout.write(exporter.export_to_string()) self.stdout.write(exporter.export_to_string())

View File

@ -1,18 +1,14 @@
"""Generate JSON Schema for blueprints""" """Generate JSON Schema for blueprints"""
from json import dumps from json import dumps
from typing import Any from typing import Any
from django.core.management.base import BaseCommand, no_translations from django.core.management.base import BaseCommand, no_translations
from django.db.models import Model, fields from django.db.models import Model
from drf_jsonschema_serializer.convert import converter, field_to_converter from drf_jsonschema_serializer.convert import field_to_converter
from rest_framework.fields import Field, JSONField, UUIDField from rest_framework.fields import Field, JSONField, UUIDField
from rest_framework.relations import PrimaryKeyRelatedField
from rest_framework.serializers import Serializer from rest_framework.serializers import Serializer
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik import __version__
from authentik.blueprints.v1.common import BlueprintEntryDesiredState
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT, is_model_allowed from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT, is_model_allowed
from authentik.blueprints.v1.meta.registry import BaseMetaModel, registry from authentik.blueprints.v1.meta.registry import BaseMetaModel, registry
from authentik.lib.models import SerializerModel from authentik.lib.models import SerializerModel
@ -20,23 +16,6 @@ from authentik.lib.models import SerializerModel
LOGGER = get_logger() LOGGER = get_logger()
@converter
class PrimaryKeyRelatedFieldConverter:
"""Custom primary key field converter which is aware of non-integer based PKs
This is not an exhaustive fix for other non-int PKs, however in authentik we either
use UUIDs or ints"""
field_class = PrimaryKeyRelatedField
def convert(self, field: PrimaryKeyRelatedField):
model: Model = field.queryset.model
pk_field = model._meta.pk
if isinstance(pk_field, fields.UUIDField):
return {"type": "string", "format": "uuid"}
return {"type": "integer"}
class Command(BaseCommand): class Command(BaseCommand):
"""Generate JSON Schema for blueprints""" """Generate JSON Schema for blueprints"""
@ -48,7 +27,7 @@ class Command(BaseCommand):
"$schema": "http://json-schema.org/draft-07/schema", "$schema": "http://json-schema.org/draft-07/schema",
"$id": "https://goauthentik.io/blueprints/schema.json", "$id": "https://goauthentik.io/blueprints/schema.json",
"type": "object", "type": "object",
"title": f"authentik {__version__} Blueprint schema", "title": "authentik Blueprint schema",
"required": ["version", "entries"], "required": ["version", "entries"],
"properties": { "properties": {
"version": { "version": {
@ -113,19 +92,16 @@ class Command(BaseCommand):
) )
model_path = f"{model._meta.app_label}.{model._meta.model_name}" model_path = f"{model._meta.app_label}.{model._meta.model_name}"
self.schema["properties"]["entries"]["items"]["oneOf"].append( self.schema["properties"]["entries"]["items"]["oneOf"].append(
self.template_entry(model_path, model, serializer) self.template_entry(model_path, serializer)
) )
def template_entry(self, model_path: str, model: type[Model], serializer: Serializer) -> dict: def template_entry(self, model_path: str, serializer: Serializer) -> dict:
"""Template entry for a single model""" """Template entry for a single model"""
model_schema = self.to_jsonschema(serializer) model_schema = self.to_jsonschema(serializer)
model_schema["required"] = [] model_schema["required"] = []
def_name = f"model_{model_path}" def_name = f"model_{model_path}"
def_path = f"#/$defs/{def_name}" def_path = f"#/$defs/{def_name}"
self.schema["$defs"][def_name] = model_schema self.schema["$defs"][def_name] = model_schema
def_name_perm = f"model_{model_path}_permissions"
def_path_perm = f"#/$defs/{def_name_perm}"
self.schema["$defs"][def_name_perm] = self.model_permissions(model)
return { return {
"type": "object", "type": "object",
"required": ["model", "identifiers"], "required": ["model", "identifiers"],
@ -134,11 +110,10 @@ class Command(BaseCommand):
"id": {"type": "string"}, "id": {"type": "string"},
"state": { "state": {
"type": "string", "type": "string",
"enum": [s.value for s in BlueprintEntryDesiredState], "enum": ["absent", "present", "created"],
"default": "present", "default": "present",
}, },
"conditions": {"type": "array", "items": {"type": "boolean"}}, "conditions": {"type": "array", "items": {"type": "boolean"}},
"permissions": {"$ref": def_path_perm},
"attrs": {"$ref": def_path}, "attrs": {"$ref": def_path},
"identifiers": {"$ref": def_path}, "identifiers": {"$ref": def_path},
}, },
@ -189,20 +164,3 @@ class Command(BaseCommand):
if required: if required:
result["required"] = required result["required"] = required
return result return result
def model_permissions(self, model: type[Model]) -> dict:
perms = [x[0] for x in model._meta.permissions]
for action in model._meta.default_permissions:
perms.append(f"{action}_{model._meta.model_name}")
return {
"type": "array",
"items": {
"type": "object",
"required": ["permission"],
"properties": {
"permission": {"type": "string", "enum": perms},
"user": {"type": "integer"},
"role": {"type": "string"},
},
},
}
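
For reference, model_permissions above produces a per-model JSON-schema fragment constraining blueprint permission entries; for a hypothetical model named flow with only Django's default permissions, the returned dict would look roughly like this:

# Roughly what model_permissions() above returns for a hypothetical "flow"
# model with only Django's default permissions; values are illustrative.
example_fragment = {
    "type": "array",
    "items": {
        "type": "object",
        "required": ["permission"],
        "properties": {
            "permission": {
                "type": "string",
                "enum": ["add_flow", "change_flow", "delete_flow", "view_flow"],
            },
            "user": {"type": "integer"},
            "role": {"type": "string"},
        },
    },
}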

View File

@ -14,7 +14,7 @@ from authentik.blueprints.v1.labels import LABEL_AUTHENTIK_SYSTEM
from authentik.lib.config import CONFIG from authentik.lib.config import CONFIG
def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path): def check_blueprint_v1_file(BlueprintInstance: type, path: Path):
"""Check if blueprint should be imported""" """Check if blueprint should be imported"""
from authentik.blueprints.models import BlueprintInstanceStatus from authentik.blueprints.models import BlueprintInstanceStatus
from authentik.blueprints.v1.common import BlueprintLoader, BlueprintMetadata from authentik.blueprints.v1.common import BlueprintLoader, BlueprintMetadata
@ -29,7 +29,7 @@ def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
if version != 1: if version != 1:
return return
blueprint_file.seek(0) blueprint_file.seek(0)
instance = BlueprintInstance.objects.using(db_alias).filter(path=path).first() instance: BlueprintInstance = BlueprintInstance.objects.filter(path=path).first()
rel_path = path.relative_to(Path(CONFIG.get("blueprints_dir"))) rel_path = path.relative_to(Path(CONFIG.get("blueprints_dir")))
meta = None meta = None
if metadata: if metadata:
@ -37,7 +37,7 @@ def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
if meta.labels.get(LABEL_AUTHENTIK_INSTANTIATE, "").lower() == "false": if meta.labels.get(LABEL_AUTHENTIK_INSTANTIATE, "").lower() == "false":
return return
if not instance: if not instance:
BlueprintInstance.objects.using(db_alias).create( instance = BlueprintInstance(
name=meta.name if meta else str(rel_path), name=meta.name if meta else str(rel_path),
path=str(rel_path), path=str(rel_path),
context={}, context={},
@ -47,6 +47,7 @@ def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
last_applied_hash="", last_applied_hash="",
metadata=metadata or {}, metadata=metadata or {},
) )
instance.save()
def migration_blueprint_import(apps: Apps, schema_editor: BaseDatabaseSchemaEditor): def migration_blueprint_import(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
@ -55,7 +56,7 @@ def migration_blueprint_import(apps: Apps, schema_editor: BaseDatabaseSchemaEdit
db_alias = schema_editor.connection.alias db_alias = schema_editor.connection.alias
for file in glob(f"{CONFIG.get('blueprints_dir')}/**/*.yaml", recursive=True): for file in glob(f"{CONFIG.get('blueprints_dir')}/**/*.yaml", recursive=True):
check_blueprint_v1_file(BlueprintInstance, db_alias, Path(file)) check_blueprint_v1_file(BlueprintInstance, Path(file))
for blueprint in BlueprintInstance.objects.using(db_alias).all(): for blueprint in BlueprintInstance.objects.using(db_alias).all():
# If we already have flows (and we should always run before flow migrations) # If we already have flows (and we should always run before flow migrations)

View File

@ -1,5 +1,4 @@
"""blueprint models""" """blueprint models"""
from pathlib import Path from pathlib import Path
from uuid import uuid4 from uuid import uuid4
@ -71,19 +70,6 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
enabled = models.BooleanField(default=True) enabled = models.BooleanField(default=True)
managed_models = ArrayField(models.TextField(), default=list) managed_models = ArrayField(models.TextField(), default=list)
class Meta:
verbose_name = _("Blueprint Instance")
verbose_name_plural = _("Blueprint Instances")
unique_together = (
(
"name",
"path",
),
)
def __str__(self) -> str:
return f"Blueprint Instance {self.name}"
def retrieve_oci(self) -> str: def retrieve_oci(self) -> str:
"""Get blueprint from an OCI registry""" """Get blueprint from an OCI registry"""
client = BlueprintOCIClient(self.path.replace(OCI_PREFIX, "https://")) client = BlueprintOCIClient(self.path.replace(OCI_PREFIX, "https://"))
@ -102,7 +88,7 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
raise BlueprintRetrievalFailed("Invalid blueprint path") raise BlueprintRetrievalFailed("Invalid blueprint path")
with full_path.open("r", encoding="utf-8") as _file: with full_path.open("r", encoding="utf-8") as _file:
return _file.read() return _file.read()
except OSError as exc: except (IOError, OSError) as exc:
raise BlueprintRetrievalFailed(exc) from exc raise BlueprintRetrievalFailed(exc) from exc
def retrieve(self) -> str: def retrieve(self) -> str:
@ -118,3 +104,16 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
from authentik.blueprints.api import BlueprintInstanceSerializer from authentik.blueprints.api import BlueprintInstanceSerializer
return BlueprintInstanceSerializer return BlueprintInstanceSerializer
def __str__(self) -> str:
return f"Blueprint Instance {self.name}"
class Meta:
verbose_name = _("Blueprint Instance")
verbose_name_plural = _("Blueprint Instances")
unique_together = (
(
"name",
"path",
),
)

View File

@ -1,5 +1,4 @@
"""blueprint Settings""" """blueprint Settings"""
from celery.schedules import crontab from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand from authentik.lib.utils.time import fqdn_rand

View File

@ -1,7 +1,6 @@
"""Blueprint helpers""" """Blueprint helpers"""
from collections.abc import Callable
from functools import wraps from functools import wraps
from typing import Callable
from django.apps import apps from django.apps import apps
@ -21,7 +20,7 @@ def apply_blueprint(*files: str):
def wrapper(*args, **kwargs): def wrapper(*args, **kwargs):
for file in files: for file in files:
content = BlueprintInstance(path=file).retrieve() content = BlueprintInstance(path=file).retrieve()
Importer.from_string(content).apply() Importer(content).apply()
return func(*args, **kwargs) return func(*args, **kwargs)
return wrapper return wrapper
@ -39,7 +38,7 @@ def reconcile_app(app_name: str):
def wrapper(*args, **kwargs): def wrapper(*args, **kwargs):
config = apps.get_app_config(app_name) config = apps.get_app_config(app_name)
if isinstance(config, ManagedAppConfig): if isinstance(config, ManagedAppConfig):
config._on_startup_callback(None) config.reconcile()
return func(*args, **kwargs) return func(*args, **kwargs)
return wrapper return wrapper
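
The apply_blueprint helper above retrieves each named blueprint file and applies it before the wrapped test runs (through Importer.from_string(...) on one side of this compare, the plain Importer(...) constructor on the other). A minimal sketch of how the decorator is used, modelled on the packaged-blueprint test further down in this diff and reusing the Brand model and blueprint path from that test; the test class name here is illustrative only.

    from django.test import TransactionTestCase

    from authentik.blueprints.tests import apply_blueprint
    from authentik.brands.models import Brand


    class TestDefaultBrandApplied(TransactionTestCase):
        """Sketch: apply a packaged blueprint before the test body runs"""

        @apply_blueprint("default/default-brand.yaml")
        def test_default_brand_exists(self):
            # The decorator retrieves and applies the blueprint, so the objects
            # it declares exist by the time the test body executes.
            self.assertTrue(Brand.objects.filter(domain="authentik-default").exists())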

View File

@ -1,24 +0,0 @@
version: 1
entries:
- model: authentik_core.user
id: user
identifiers:
username: "%(id)s"
attrs:
name: "%(id)s"
- model: authentik_rbac.role
id: role
identifiers:
name: "%(id)s"
- model: authentik_flows.flow
identifiers:
slug: "%(id)s"
attrs:
designation: authentication
name: foo
title: foo
permissions:
- permission: view_flow
user: !KeyOf user
- permission: view_flow
role: !KeyOf role

View File

@ -1,8 +0,0 @@
version: 1
entries:
- model: authentik_rbac.role
identifiers:
name: "%(id)s"
attrs:
permissions:
- authentik_blueprints.view_blueprintinstance

View File

@ -1,9 +0,0 @@
version: 1
entries:
- model: authentik_core.user
identifiers:
username: "%(id)s"
attrs:
name: "%(id)s"
permissions:
- authentik_blueprints.view_blueprintinstance

View File

@ -1,5 +1,4 @@
"""authentik managed models tests""" """authentik managed models tests"""
from django.test import TestCase from django.test import TestCase
from authentik.blueprints.models import BlueprintInstance, BlueprintRetrievalFailed from authentik.blueprints.models import BlueprintInstance, BlueprintRetrievalFailed

View File

@ -1,5 +1,4 @@
"""Test blueprints OCI""" """Test blueprints OCI"""
from django.test import TransactionTestCase from django.test import TransactionTestCase
from requests_mock import Mocker from requests_mock import Mocker

View File

@ -1,23 +1,22 @@
"""test packaged blueprints""" """test packaged blueprints"""
from collections.abc import Callable
from pathlib import Path from pathlib import Path
from typing import Callable
from django.test import TransactionTestCase from django.test import TransactionTestCase
from authentik.blueprints.models import BlueprintInstance from authentik.blueprints.models import BlueprintInstance
from authentik.blueprints.tests import apply_blueprint from authentik.blueprints.tests import apply_blueprint
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
from authentik.brands.models import Brand from authentik.tenants.models import Tenant
class TestPackaged(TransactionTestCase): class TestPackaged(TransactionTestCase):
"""Empty class, test methods are added dynamically""" """Empty class, test methods are added dynamically"""
@apply_blueprint("default/default-brand.yaml") @apply_blueprint("default/default-tenant.yaml")
def test_decorator_static(self): def test_decorator_static(self):
"""Test @apply_blueprint decorator""" """Test @apply_blueprint decorator"""
self.assertTrue(Brand.objects.filter(domain="authentik-default").exists()) self.assertTrue(Tenant.objects.filter(domain="authentik-default").exists())
def blueprint_tester(file_name: Path) -> Callable: def blueprint_tester(file_name: Path) -> Callable:
@ -26,9 +25,8 @@ def blueprint_tester(file_name: Path) -> Callable:
def tester(self: TestPackaged): def tester(self: TestPackaged):
base = Path("blueprints/") base = Path("blueprints/")
rel_path = Path(file_name).relative_to(base) rel_path = Path(file_name).relative_to(base)
importer = Importer.from_string(BlueprintInstance(path=str(rel_path)).retrieve()) importer = Importer(BlueprintInstance(path=str(rel_path)).retrieve())
validation, logs = importer.validate() self.assertTrue(importer.validate()[0])
self.assertTrue(validation, logs)
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
return tester return tester

View File

@ -1,20 +1,18 @@
"""authentik managed models tests""" """authentik managed models tests"""
from typing import Callable, Type
from collections.abc import Callable
from django.apps import apps from django.apps import apps
from django.test import TestCase from django.test import TestCase
from authentik.blueprints.v1.importer import is_model_allowed from authentik.blueprints.v1.importer import is_model_allowed
from authentik.lib.models import SerializerModel from authentik.lib.models import SerializerModel
from authentik.providers.oauth2.models import RefreshToken
class TestModels(TestCase): class TestModels(TestCase):
"""Test Models""" """Test Models"""
def serializer_tester_factory(test_model: type[SerializerModel]) -> Callable: def serializer_tester_factory(test_model: Type[SerializerModel]) -> Callable:
"""Test serializer""" """Test serializer"""
def tester(self: TestModels): def tester(self: TestModels):
@ -23,9 +21,6 @@ def serializer_tester_factory(test_model: type[SerializerModel]) -> Callable:
model_class = test_model() model_class = test_model()
self.assertTrue(isinstance(model_class, SerializerModel)) self.assertTrue(isinstance(model_class, SerializerModel))
self.assertIsNotNone(model_class.serializer) self.assertIsNotNone(model_class.serializer)
if model_class.serializer.Meta().model == RefreshToken:
return
self.assertEqual(model_class.serializer.Meta().model, test_model)
return tester return tester

View File

@ -1,5 +1,4 @@
"""Test blueprints v1""" """Test blueprints v1"""
from os import environ from os import environ
from django.test import TransactionTestCase from django.test import TransactionTestCase
@ -22,14 +21,14 @@ class TestBlueprintsV1(TransactionTestCase):
def test_blueprint_invalid_format(self): def test_blueprint_invalid_format(self):
"""Test blueprint with invalid format""" """Test blueprint with invalid format"""
importer = Importer.from_string('{"version": 3}') importer = Importer('{"version": 3}')
self.assertFalse(importer.validate()[0]) self.assertFalse(importer.validate()[0])
importer = Importer.from_string( importer = Importer(
'{"version": 1,"entries":[{"identifiers":{},"attrs":{},' '{"version": 1,"entries":[{"identifiers":{},"attrs":{},'
'"model": "authentik_core.User"}]}' '"model": "authentik_core.User"}]}'
) )
self.assertFalse(importer.validate()[0]) self.assertFalse(importer.validate()[0])
importer = Importer.from_string( importer = Importer(
'{"version": 1, "entries": [{"attrs": {"name": "test"}, ' '{"version": 1, "entries": [{"attrs": {"name": "test"}, '
'"identifiers": {}, ' '"identifiers": {}, '
'"model": "authentik_core.Group"}]}' '"model": "authentik_core.Group"}]}'
@ -55,7 +54,7 @@ class TestBlueprintsV1(TransactionTestCase):
}, },
) )
importer = Importer.from_string( importer = Importer(
'{"version": 1, "entries": [{"attrs": {"name": "test999", "attributes": ' '{"version": 1, "entries": [{"attrs": {"name": "test999", "attributes": '
'{"key": ["updated_value"]}}, "identifiers": {"attributes": {"other_key": ' '{"key": ["updated_value"]}}, "identifiers": {"attributes": {"other_key": '
'["other_value"]}}, "model": "authentik_core.Group"}]}' '["other_value"]}}, "model": "authentik_core.Group"}]}'
@ -104,7 +103,7 @@ class TestBlueprintsV1(TransactionTestCase):
self.assertEqual(len(export.entries), 3) self.assertEqual(len(export.entries), 3)
export_yaml = exporter.export_to_string() export_yaml = exporter.export_to_string()
importer = Importer.from_string(export_yaml) importer = Importer(export_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
@ -114,14 +113,14 @@ class TestBlueprintsV1(TransactionTestCase):
"""Test export and import it twice""" """Test export and import it twice"""
count_initial = Prompt.objects.filter(field_key="username").count() count_initial = Prompt.objects.filter(field_key="username").count()
importer = Importer.from_string(load_fixture("fixtures/static_prompt_export.yaml")) importer = Importer(load_fixture("fixtures/static_prompt_export.yaml"))
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
count_before = Prompt.objects.filter(field_key="username").count() count_before = Prompt.objects.filter(field_key="username").count()
self.assertEqual(count_initial + 1, count_before) self.assertEqual(count_initial + 1, count_before)
importer = Importer.from_string(load_fixture("fixtures/static_prompt_export.yaml")) importer = Importer(load_fixture("fixtures/static_prompt_export.yaml"))
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
self.assertEqual(Prompt.objects.filter(field_key="username").count(), count_before) self.assertEqual(Prompt.objects.filter(field_key="username").count(), count_before)
@ -131,7 +130,7 @@ class TestBlueprintsV1(TransactionTestCase):
ExpressionPolicy.objects.filter(name="foo-bar-baz-qux").delete() ExpressionPolicy.objects.filter(name="foo-bar-baz-qux").delete()
Group.objects.filter(name="test").delete() Group.objects.filter(name="test").delete()
environ["foo"] = generate_id() environ["foo"] = generate_id()
importer = Importer.from_string(load_fixture("fixtures/tags.yaml"), {"bar": "baz"}) importer = Importer(load_fixture("fixtures/tags.yaml"), {"bar": "baz"})
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
policy = ExpressionPolicy.objects.filter(name="foo-bar-baz-qux").first() policy = ExpressionPolicy.objects.filter(name="foo-bar-baz-qux").first()
@ -249,7 +248,7 @@ class TestBlueprintsV1(TransactionTestCase):
exporter = FlowExporter(flow) exporter = FlowExporter(flow)
export_yaml = exporter.export_to_string() export_yaml = exporter.export_to_string()
importer = Importer.from_string(export_yaml) importer = Importer(export_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
self.assertTrue(UserLoginStage.objects.filter(name=stage_name).exists()) self.assertTrue(UserLoginStage.objects.filter(name=stage_name).exists())
@ -298,7 +297,7 @@ class TestBlueprintsV1(TransactionTestCase):
exporter = FlowExporter(flow) exporter = FlowExporter(flow)
export_yaml = exporter.export_to_string() export_yaml = exporter.export_to_string()
importer = Importer.from_string(export_yaml) importer = Importer(export_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())

View File

@ -1,5 +1,4 @@
"""Test blueprints v1 api""" """Test blueprints v1 api"""
from json import loads from json import loads
from tempfile import NamedTemporaryFile, mkdtemp from tempfile import NamedTemporaryFile, mkdtemp
@ -78,5 +77,5 @@ class TestBlueprintsV1API(APITestCase):
self.assertEqual(res.status_code, 400) self.assertEqual(res.status_code, 400)
self.assertJSONEqual( self.assertJSONEqual(
res.content.decode(), res.content.decode(),
{"content": ["Failed to validate blueprint", "- Invalid blueprint version"]}, {"content": ["Failed to validate blueprint: Invalid blueprint version"]},
) )

View File

@ -1,5 +1,4 @@
"""Test blueprints v1""" """Test blueprints v1"""
from django.test import TransactionTestCase from django.test import TransactionTestCase
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
@ -19,7 +18,7 @@ class TestBlueprintsV1ConditionalFields(TransactionTestCase):
self.uid = generate_id() self.uid = generate_id()
import_yaml = load_fixture("fixtures/conditional_fields.yaml", uid=self.uid, user=user.pk) import_yaml = load_fixture("fixtures/conditional_fields.yaml", uid=self.uid, user=user.pk)
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())

View File

@ -1,5 +1,4 @@
"""Test blueprints v1""" """Test blueprints v1"""
from django.test import TransactionTestCase from django.test import TransactionTestCase
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
@ -19,7 +18,7 @@ class TestBlueprintsV1Conditions(TransactionTestCase):
"fixtures/conditions_fulfilled.yaml", id1=flow_slug1, id2=flow_slug2 "fixtures/conditions_fulfilled.yaml", id1=flow_slug1, id2=flow_slug2
) )
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
# Ensure objects exist # Ensure objects exist
@ -36,7 +35,7 @@ class TestBlueprintsV1Conditions(TransactionTestCase):
"fixtures/conditions_not_fulfilled.yaml", id1=flow_slug1, id2=flow_slug2 "fixtures/conditions_not_fulfilled.yaml", id1=flow_slug1, id2=flow_slug2
) )
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
# Ensure objects do not exist # Ensure objects do not exist

View File

@ -1,57 +0,0 @@
"""Test blueprints v1"""
from django.test import TransactionTestCase
from guardian.shortcuts import get_perms
from authentik.blueprints.v1.importer import Importer
from authentik.core.models import User
from authentik.flows.models import Flow
from authentik.lib.generators import generate_id
from authentik.lib.tests.utils import load_fixture
from authentik.rbac.models import Role
class TestBlueprintsV1RBAC(TransactionTestCase):
"""Test Blueprints rbac attribute"""
def test_user_permission(self):
"""Test permissions"""
uid = generate_id()
import_yaml = load_fixture("fixtures/rbac_user.yaml", id=uid)
importer = Importer.from_string(import_yaml)
self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply())
user = User.objects.filter(username=uid).first()
self.assertIsNotNone(user)
self.assertTrue(user.has_perms(["authentik_blueprints.view_blueprintinstance"]))
def test_role_permission(self):
"""Test permissions"""
uid = generate_id()
import_yaml = load_fixture("fixtures/rbac_role.yaml", id=uid)
importer = Importer.from_string(import_yaml)
self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply())
role = Role.objects.filter(name=uid).first()
self.assertIsNotNone(role)
self.assertEqual(
list(role.group.permissions.all().values_list("codename", flat=True)),
["view_blueprintinstance"],
)
def test_object_permission(self):
"""Test permissions"""
uid = generate_id()
import_yaml = load_fixture("fixtures/rbac_object.yaml", id=uid)
importer = Importer.from_string(import_yaml)
self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply())
flow = Flow.objects.filter(slug=uid).first()
user = User.objects.filter(username=uid).first()
role = Role.objects.filter(name=uid).first()
self.assertIsNotNone(flow)
self.assertEqual(get_perms(user, flow), ["view_flow"])
self.assertEqual(get_perms(role.group, flow), ["view_flow"])

View File

@ -1,5 +1,4 @@
"""Test blueprints v1""" """Test blueprints v1"""
from django.test import TransactionTestCase from django.test import TransactionTestCase
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
@ -16,7 +15,7 @@ class TestBlueprintsV1State(TransactionTestCase):
flow_slug = generate_id() flow_slug = generate_id()
import_yaml = load_fixture("fixtures/state_present.yaml", id=flow_slug) import_yaml = load_fixture("fixtures/state_present.yaml", id=flow_slug)
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
# Ensure object exists # Ensure object exists
@ -31,7 +30,7 @@ class TestBlueprintsV1State(TransactionTestCase):
self.assertEqual(flow.title, "bar") self.assertEqual(flow.title, "bar")
# Ensure importer updates it # Ensure importer updates it
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
flow: Flow = Flow.objects.filter(slug=flow_slug).first() flow: Flow = Flow.objects.filter(slug=flow_slug).first()
@ -42,7 +41,7 @@ class TestBlueprintsV1State(TransactionTestCase):
flow_slug = generate_id() flow_slug = generate_id()
import_yaml = load_fixture("fixtures/state_created.yaml", id=flow_slug) import_yaml = load_fixture("fixtures/state_created.yaml", id=flow_slug)
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
# Ensure object exists # Ensure object exists
@ -57,7 +56,7 @@ class TestBlueprintsV1State(TransactionTestCase):
self.assertEqual(flow.title, "bar") self.assertEqual(flow.title, "bar")
# Ensure importer doesn't update it # Ensure importer doesn't update it
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
flow: Flow = Flow.objects.filter(slug=flow_slug).first() flow: Flow = Flow.objects.filter(slug=flow_slug).first()
@ -68,7 +67,7 @@ class TestBlueprintsV1State(TransactionTestCase):
flow_slug = generate_id() flow_slug = generate_id()
import_yaml = load_fixture("fixtures/state_created.yaml", id=flow_slug) import_yaml = load_fixture("fixtures/state_created.yaml", id=flow_slug)
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
# Ensure object exists # Ensure object exists
@ -76,7 +75,7 @@ class TestBlueprintsV1State(TransactionTestCase):
self.assertEqual(flow.slug, flow_slug) self.assertEqual(flow.slug, flow_slug)
import_yaml = load_fixture("fixtures/state_absent.yaml", id=flow_slug) import_yaml = load_fixture("fixtures/state_absent.yaml", id=flow_slug)
importer = Importer.from_string(import_yaml) importer = Importer(import_yaml)
self.assertTrue(importer.validate()[0]) self.assertTrue(importer.validate()[0])
self.assertTrue(importer.apply()) self.assertTrue(importer.apply())
flow: Flow = Flow.objects.filter(slug=flow_slug).first() flow: Flow = Flow.objects.filter(slug=flow_slug).first()

View File

@ -1,5 +1,4 @@
"""Test blueprints v1 tasks""" """Test blueprints v1 tasks"""
from hashlib import sha512 from hashlib import sha512
from tempfile import NamedTemporaryFile, mkdtemp from tempfile import NamedTemporaryFile, mkdtemp
@ -54,7 +53,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
file.seek(0) file.seek(0)
file_hash = sha512(file.read().encode()).hexdigest() file_hash = sha512(file.read().encode()).hexdigest()
file.flush() file.flush()
blueprints_discovery() blueprints_discovery() # pylint: disable=no-value-for-parameter
instance = BlueprintInstance.objects.filter(name=blueprint_id).first() instance = BlueprintInstance.objects.filter(name=blueprint_id).first()
self.assertEqual(instance.last_applied_hash, file_hash) self.assertEqual(instance.last_applied_hash, file_hash)
self.assertEqual( self.assertEqual(
@ -82,7 +81,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
) )
) )
file.flush() file.flush()
blueprints_discovery() blueprints_discovery() # pylint: disable=no-value-for-parameter
blueprint = BlueprintInstance.objects.filter(name="foo").first() blueprint = BlueprintInstance.objects.filter(name="foo").first()
self.assertEqual( self.assertEqual(
blueprint.last_applied_hash, blueprint.last_applied_hash,
@ -107,7 +106,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
) )
) )
file.flush() file.flush()
blueprints_discovery() blueprints_discovery() # pylint: disable=no-value-for-parameter
blueprint.refresh_from_db() blueprint.refresh_from_db()
self.assertEqual( self.assertEqual(
blueprint.last_applied_hash, blueprint.last_applied_hash,
@ -149,7 +148,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
instance.status, instance.status,
BlueprintInstanceStatus.UNKNOWN, BlueprintInstanceStatus.UNKNOWN,
) )
apply_blueprint(instance.pk) apply_blueprint(instance.pk) # pylint: disable=no-value-for-parameter
instance.refresh_from_db() instance.refresh_from_db()
self.assertEqual(instance.last_applied_hash, "") self.assertEqual(instance.last_applied_hash, "")
self.assertEqual( self.assertEqual(

View File

@ -1,5 +1,4 @@
"""API URLs""" """API URLs"""
from authentik.blueprints.api import BlueprintInstanceViewSet from authentik.blueprints.api import BlueprintInstanceViewSet
api_urlpatterns = [ api_urlpatterns = [

View File

@ -1,20 +1,17 @@
"""transfer common classes""" """transfer common classes"""
from collections import OrderedDict from collections import OrderedDict
from collections.abc import Generator, Iterable, Mapping
from copy import copy from copy import copy
from dataclasses import asdict, dataclass, field, is_dataclass from dataclasses import asdict, dataclass, field, is_dataclass
from enum import Enum from enum import Enum
from functools import reduce from functools import reduce
from operator import ixor from operator import ixor
from os import getenv from os import getenv
from typing import Any, Literal, Union from typing import Any, Iterable, Literal, Mapping, Optional, Union
from uuid import UUID from uuid import UUID
from deepmerge import always_merger from deepmerge import always_merger
from django.apps import apps from django.apps import apps
from django.db.models import Model, Q from django.db.models import Model, Q
from rest_framework.exceptions import ValidationError
from rest_framework.fields import Field from rest_framework.fields import Field
from rest_framework.serializers import Serializer from rest_framework.serializers import Serializer
from yaml import SafeDumper, SafeLoader, ScalarNode, SequenceNode from yaml import SafeDumper, SafeLoader, ScalarNode, SequenceNode
@ -46,7 +43,7 @@ def get_attrs(obj: SerializerModel) -> dict[str, Any]:
class BlueprintEntryState: class BlueprintEntryState:
"""State of a single instance""" """State of a single instance"""
instance: Model | None = None instance: Optional[Model] = None
class BlueprintEntryDesiredState(Enum): class BlueprintEntryDesiredState(Enum):
@ -55,16 +52,6 @@ class BlueprintEntryDesiredState(Enum):
ABSENT = "absent" ABSENT = "absent"
PRESENT = "present" PRESENT = "present"
CREATED = "created" CREATED = "created"
MUST_CREATED = "must_created"
@dataclass
class BlueprintEntryPermission:
"""Describe object-level permissions"""
permission: Union[str, "YAMLTag"]
user: Union[int, "YAMLTag", None] = field(default=None)
role: Union[str, "YAMLTag", None] = field(default=None)
@dataclass @dataclass
@ -77,15 +64,14 @@ class BlueprintEntry:
) )
conditions: list[Any] = field(default_factory=list) conditions: list[Any] = field(default_factory=list)
identifiers: dict[str, Any] = field(default_factory=dict) identifiers: dict[str, Any] = field(default_factory=dict)
attrs: dict[str, Any] | None = field(default_factory=dict) attrs: Optional[dict[str, Any]] = field(default_factory=dict)
permissions: list[BlueprintEntryPermission] = field(default_factory=list)
id: str | None = None id: Optional[str] = None
_state: BlueprintEntryState = field(default_factory=BlueprintEntryState) _state: BlueprintEntryState = field(default_factory=BlueprintEntryState)
def __post_init__(self, *args, **kwargs) -> None: def __post_init__(self, *args, **kwargs) -> None:
self.__tag_contexts: list[YAMLTagContext] = [] self.__tag_contexts: list["YAMLTagContext"] = []
@staticmethod @staticmethod
def from_model(model: SerializerModel, *extra_identifier_names: str) -> "BlueprintEntry": def from_model(model: SerializerModel, *extra_identifier_names: str) -> "BlueprintEntry":
@ -103,10 +89,10 @@ class BlueprintEntry:
attrs=all_attrs, attrs=all_attrs,
) )
def get_tag_context( def _get_tag_context(
self, self,
depth: int = 0, depth: int = 0,
context_tag_type: type["YAMLTagContext"] | tuple["YAMLTagContext", ...] | None = None, context_tag_type: Optional[type["YAMLTagContext"] | tuple["YAMLTagContext", ...]] = None,
) -> "YAMLTagContext": ) -> "YAMLTagContext":
"""Get a YAMLTagContext object located at a certain depth in the tag tree""" """Get a YAMLTagContext object located at a certain depth in the tag tree"""
if depth < 0: if depth < 0:
@ -119,8 +105,8 @@ class BlueprintEntry:
try: try:
return contexts[-(depth + 1)] return contexts[-(depth + 1)]
except IndexError as exc: except IndexError:
raise ValueError(f"invalid depth: {depth}. Max depth: {len(contexts) - 1}") from exc raise ValueError(f"invalid depth: {depth}. Max depth: {len(contexts) - 1}")
def tag_resolver(self, value: Any, blueprint: "Blueprint") -> Any: def tag_resolver(self, value: Any, blueprint: "Blueprint") -> Any:
"""Check if we have any special tags that need handling""" """Check if we have any special tags that need handling"""
@ -160,17 +146,6 @@ class BlueprintEntry:
"""Get the blueprint model, with yaml tags resolved if present""" """Get the blueprint model, with yaml tags resolved if present"""
return str(self.tag_resolver(self.model, blueprint)) return str(self.tag_resolver(self.model, blueprint))
def get_permissions(
self, blueprint: "Blueprint"
) -> Generator[BlueprintEntryPermission, None, None]:
"""Get permissions of this entry, with all yaml tags resolved"""
for perm in self.permissions:
yield BlueprintEntryPermission(
permission=self.tag_resolver(perm.permission, blueprint),
user=self.tag_resolver(perm.user, blueprint),
role=self.tag_resolver(perm.role, blueprint),
)
def check_all_conditions_match(self, blueprint: "Blueprint") -> bool: def check_all_conditions_match(self, blueprint: "Blueprint") -> bool:
"""Check all conditions of this entry match (evaluate to True)""" """Check all conditions of this entry match (evaluate to True)"""
return all(self.tag_resolver(self.conditions, blueprint)) return all(self.tag_resolver(self.conditions, blueprint))
@ -192,7 +167,7 @@ class Blueprint:
entries: list[BlueprintEntry] = field(default_factory=list) entries: list[BlueprintEntry] = field(default_factory=list)
context: dict = field(default_factory=dict) context: dict = field(default_factory=dict)
metadata: BlueprintMetadata | None = field(default=None) metadata: Optional[BlueprintMetadata] = field(default=None)
class YAMLTag: class YAMLTag:
@ -231,8 +206,8 @@ class KeyOf(YAMLTag):
): ):
return _entry._state.instance.pbm_uuid return _entry._state.instance.pbm_uuid
return _entry._state.instance.pk return _entry._state.instance.pk
raise EntryInvalidError.from_entry( raise EntryInvalidError(
f"KeyOf: failed to find entry with `id` of `{self.id_from}` and a model instance", entry f"KeyOf: failed to find entry with `id` of `{self.id_from}` and a model instance"
) )
@ -240,7 +215,7 @@ class Env(YAMLTag):
"""Lookup environment variable with optional default""" """Lookup environment variable with optional default"""
key: str key: str
default: Any | None default: Optional[Any]
def __init__(self, loader: "BlueprintLoader", node: ScalarNode | SequenceNode) -> None: def __init__(self, loader: "BlueprintLoader", node: ScalarNode | SequenceNode) -> None:
super().__init__() super().__init__()
@ -259,7 +234,7 @@ class Context(YAMLTag):
"""Lookup key from instance context""" """Lookup key from instance context"""
key: str key: str
default: Any | None default: Optional[Any]
def __init__(self, loader: "BlueprintLoader", node: ScalarNode | SequenceNode) -> None: def __init__(self, loader: "BlueprintLoader", node: ScalarNode | SequenceNode) -> None:
super().__init__() super().__init__()
@ -303,7 +278,7 @@ class Format(YAMLTag):
try: try:
return self.format_string % tuple(args) return self.format_string % tuple(args)
except TypeError as exc: except TypeError as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc raise EntryInvalidError(exc)
class Find(YAMLTag): class Find(YAMLTag):
@ -328,10 +303,7 @@ class Find(YAMLTag):
else: else:
model_name = self.model_name model_name = self.model_name
try: model_class = apps.get_model(*model_name.split("."))
model_class = apps.get_model(*model_name.split("."))
except LookupError as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc
query = Q() query = Q()
for cond in self.conditions: for cond in self.conditions:
@ -383,15 +355,13 @@ class Condition(YAMLTag):
args.append(arg) args.append(arg)
if not args: if not args:
raise EntryInvalidError.from_entry( raise EntryInvalidError("At least one value is required after mode selection.")
"At least one value is required after mode selection.", entry
)
try: try:
comparator = self._COMPARATORS[self.mode.upper()] comparator = self._COMPARATORS[self.mode.upper()]
return comparator(tuple(bool(x) for x in args)) return comparator(tuple(bool(x) for x in args))
except (TypeError, KeyError) as exc: except (TypeError, KeyError) as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc raise EntryInvalidError(exc)
class If(YAMLTag): class If(YAMLTag):
@ -423,7 +393,7 @@ class If(YAMLTag):
blueprint, blueprint,
) )
except TypeError as exc: except TypeError as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc raise EntryInvalidError(exc)
class Enumerate(YAMLTag, YAMLTagContext): class Enumerate(YAMLTag, YAMLTagContext):
@ -437,7 +407,9 @@ class Enumerate(YAMLTag, YAMLTagContext):
"SEQ": (list, lambda a, b: [*a, b]), "SEQ": (list, lambda a, b: [*a, b]),
"MAP": ( "MAP": (
dict, dict,
lambda a, b: always_merger.merge(a, {b[0]: b[1]} if isinstance(b, tuple | list) else b), lambda a, b: always_merger.merge(
a, {b[0]: b[1]} if isinstance(b, (tuple, list)) else b
),
), ),
} }
@ -453,10 +425,9 @@ class Enumerate(YAMLTag, YAMLTagContext):
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any: def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
if isinstance(self.iterable, EnumeratedItem) and self.iterable.depth == 0: if isinstance(self.iterable, EnumeratedItem) and self.iterable.depth == 0:
raise EntryInvalidError.from_entry( raise EntryInvalidError(
f"{self.__class__.__name__} tag's iterable references this tag's context. " f"{self.__class__.__name__} tag's iterable references this tag's context. "
"This is a noop. Check you are setting depth bigger than 0.", "This is a noop. Check you are setting depth bigger than 0."
entry,
) )
if isinstance(self.iterable, YAMLTag): if isinstance(self.iterable, YAMLTag):
@ -465,10 +436,9 @@ class Enumerate(YAMLTag, YAMLTagContext):
iterable = self.iterable iterable = self.iterable
if not isinstance(iterable, Iterable): if not isinstance(iterable, Iterable):
raise EntryInvalidError.from_entry( raise EntryInvalidError(
f"{self.__class__.__name__}'s iterable must be an iterable " f"{self.__class__.__name__}'s iterable must be an iterable "
"such as a sequence or a mapping", "such as a sequence or a mapping"
entry,
) )
if isinstance(iterable, Mapping): if isinstance(iterable, Mapping):
@ -479,7 +449,7 @@ class Enumerate(YAMLTag, YAMLTagContext):
try: try:
output_class, add_fn = self._OUTPUT_BODIES[self.output_body.upper()] output_class, add_fn = self._OUTPUT_BODIES[self.output_body.upper()]
except KeyError as exc: except KeyError as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc raise EntryInvalidError(exc)
result = output_class() result = output_class()
@ -491,8 +461,8 @@ class Enumerate(YAMLTag, YAMLTagContext):
resolved_body = entry.tag_resolver(self.item_body, blueprint) resolved_body = entry.tag_resolver(self.item_body, blueprint)
result = add_fn(result, resolved_body) result = add_fn(result, resolved_body)
if not isinstance(result, output_class): if not isinstance(result, output_class):
raise EntryInvalidError.from_entry( raise EntryInvalidError(
f"Invalid {self.__class__.__name__} item found: {resolved_body}", entry f"Invalid {self.__class__.__name__} item found: {resolved_body}"
) )
finally: finally:
self.__current_context = tuple() self.__current_context = tuple()
@ -507,27 +477,24 @@ class EnumeratedItem(YAMLTag):
_SUPPORTED_CONTEXT_TAGS = (Enumerate,) _SUPPORTED_CONTEXT_TAGS = (Enumerate,)
def __init__(self, _loader: "BlueprintLoader", node: ScalarNode) -> None: def __init__(self, loader: "BlueprintLoader", node: ScalarNode) -> None:
super().__init__() super().__init__()
self.depth = int(node.value) self.depth = int(node.value)
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any: def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
try: try:
context_tag: Enumerate = entry.get_tag_context( context_tag: Enumerate = entry._get_tag_context(
depth=self.depth, depth=self.depth,
context_tag_type=EnumeratedItem._SUPPORTED_CONTEXT_TAGS, context_tag_type=EnumeratedItem._SUPPORTED_CONTEXT_TAGS,
) )
except ValueError as exc: except ValueError as exc:
if self.depth == 0: if self.depth == 0:
raise EntryInvalidError.from_entry( raise EntryInvalidError(
f"{self.__class__.__name__} tags are only usable " f"{self.__class__.__name__} tags are only usable "
f"inside an {Enumerate.__name__} tag", f"inside an {Enumerate.__name__} tag"
entry, )
) from exc
raise EntryInvalidError.from_entry( raise EntryInvalidError(f"{self.__class__.__name__} tag: {exc}")
f"{self.__class__.__name__} tag: {exc}", entry
) from exc
return context_tag.get_context(entry, blueprint) return context_tag.get_context(entry, blueprint)
@ -540,8 +507,8 @@ class Index(EnumeratedItem):
try: try:
return context[0] return context[0]
except IndexError as exc: # pragma: no cover except IndexError: # pragma: no cover
raise EntryInvalidError.from_entry(f"Empty/invalid context: {context}", entry) from exc raise EntryInvalidError(f"Empty/invalid context: {context}")
class Value(EnumeratedItem): class Value(EnumeratedItem):
@ -552,8 +519,8 @@ class Value(EnumeratedItem):
try: try:
return context[1] return context[1]
except IndexError as exc: # pragma: no cover except IndexError: # pragma: no cover
raise EntryInvalidError.from_entry(f"Empty/invalid context: {context}", entry) from exc raise EntryInvalidError(f"Empty/invalid context: {context}")
class BlueprintDumper(SafeDumper): class BlueprintDumper(SafeDumper):
@ -580,11 +547,7 @@ class BlueprintDumper(SafeDumper):
def factory(items): def factory(items):
final_dict = dict(items) final_dict = dict(items)
# Remove internal state variables
final_dict.pop("_state", None) final_dict.pop("_state", None)
# Future-proof to only remove the ID if we don't set a value
if "id" in final_dict and final_dict.get("id") is None:
final_dict.pop("id")
return final_dict return final_dict
data = asdict(data, dict_factory=factory) data = asdict(data, dict_factory=factory)
@ -611,31 +574,8 @@ class BlueprintLoader(SafeLoader):
class EntryInvalidError(SentryIgnoredException): class EntryInvalidError(SentryIgnoredException):
"""Error raised when an entry is invalid""" """Error raised when an entry is invalid"""
entry_model: str | None serializer_errors: Optional[dict]
entry_id: str | None
validation_error: ValidationError | None
serializer: Serializer | None = None
def __init__( def __init__(self, *args: object, serializer_errors: Optional[dict] = None) -> None:
self, *args: object, validation_error: ValidationError | None = None, **kwargs
) -> None:
super().__init__(*args) super().__init__(*args)
self.entry_model = None self.serializer_errors = serializer_errors
self.entry_id = None
self.validation_error = validation_error
for key, value in kwargs.items():
setattr(self, key, value)
@staticmethod
def from_entry(
msg_or_exc: str | Exception, entry: BlueprintEntry, *args, **kwargs
) -> "EntryInvalidError":
"""Create EntryInvalidError with the context of an entry"""
error = EntryInvalidError(msg_or_exc, *args, **kwargs)
if isinstance(msg_or_exc, ValidationError):
error.validation_error = msg_or_exc
# Make sure the model and id are strings, depending where the error happens
# they might still be YAMLTag instances
error.entry_model = str(entry.model)
error.entry_id = str(entry.id)
return error
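
In the EntryInvalidError variant above that defines from_entry, the error carries the failing entry's model and id plus any DRF ValidationError, instead of a bare serializer_errors dict. A minimal sketch of that call pattern, mirroring how the YAML tags in this file raise it; the entry and the validation message are placeholders, not taken from a real blueprint, and the sketch assumes a configured authentik/Django environment.

    from rest_framework.exceptions import ValidationError

    from authentik.blueprints.v1.common import BlueprintEntry, EntryInvalidError

    # Placeholder entry, for illustration only
    entry = BlueprintEntry(model="authentik_core.group", identifiers={"name": "example"})

    try:
        raise ValidationError({"name": ["This field may not be blank."]})
    except ValidationError as exc:
        # from_entry records the entry's model and id on the error and keeps
        # the original ValidationError for later reporting
        error = EntryInvalidError.from_entry(exc, entry)
        assert error.entry_model == "authentik_core.group"
        assert error.validation_error is exc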

Some files were not shown because too many files have changed in this diff