Compare commits

9 Commits

Author SHA1 Message Date
906c63c16f blueprints: add FindObject tag
Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2024-12-19 15:55:34 +01:00
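For context, a minimal sketch of how the new `!FindObject` tag can be combined with `!AtIndex` in a blueprint entry, mirroring the test fixture added in this commit; the entry's model and group name are placeholders, not part of the changeset:

```yaml
# Hypothetical blueprint entry using the new !FindObject tag.
# !FindObject looks up the first object matching the given conditions and
# returns its serialized data (unlike !Find, which returns only the primary
# key), so a single field can then be read with !AtIndex.
- model: authentik_core.group
  identifiers:
    name: example-group
  attrs:
    attributes:
      openid_scope_managed: !AtIndex [!FindObject [authentik_providers_oauth2.scopemapping, [scope_name, openid]], managed]
```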
7763a3673c core: bump msgraph-sdk from 1.14.0 to 1.15.0 (#12403)
Bumps [msgraph-sdk](https://github.com/microsoftgraph/msgraph-sdk-python) from 1.14.0 to 1.15.0.
- [Release notes](https://github.com/microsoftgraph/msgraph-sdk-python/releases)
- [Changelog](https://github.com/microsoftgraph/msgraph-sdk-python/blob/main/CHANGELOG.md)
- [Commits](https://github.com/microsoftgraph/msgraph-sdk-python/compare/v1.14.0...v1.15.0)

---
updated-dependencies:
- dependency-name: msgraph-sdk
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-19 12:02:42 +01:00
d99005e130 core: bump pydantic from 2.10.3 to 2.10.4 (#12404)
Bumps [pydantic](https://github.com/pydantic/pydantic) from 2.10.3 to 2.10.4.
- [Release notes](https://github.com/pydantic/pydantic/releases)
- [Changelog](https://github.com/pydantic/pydantic/blob/main/HISTORY.md)
- [Commits](https://github.com/pydantic/pydantic/compare/v2.10.3...v2.10.4)

---
updated-dependencies:
- dependency-name: pydantic
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-19 12:02:30 +01:00
c61f96e770 core: bump google-api-python-client from 2.155.0 to 2.156.0 (#12405)
Bumps [google-api-python-client](https://github.com/googleapis/google-api-python-client) from 2.155.0 to 2.156.0.
- [Release notes](https://github.com/googleapis/google-api-python-client/releases)
- [Commits](https://github.com/googleapis/google-api-python-client/compare/v2.155.0...v2.156.0)

---
updated-dependencies:
- dependency-name: google-api-python-client
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-19 12:02:19 +01:00
83622dd934 core: bump goauthentik.io/api/v3 from 3.2024105.3 to 3.2024105.5 (#12406)
Bumps [goauthentik.io/api/v3](https://github.com/goauthentik/client-go) from 3.2024105.3 to 3.2024105.5.
- [Release notes](https://github.com/goauthentik/client-go/releases)
- [Changelog](https://github.com/goauthentik/client-go/blob/main/model_version_history.go)
- [Commits](https://github.com/goauthentik/client-go/compare/v3.2024105.3...v3.2024105.5)

---
updated-dependencies:
- dependency-name: goauthentik.io/api/v3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-19 12:02:09 +01:00
2eebd0eaa1 translate: Updates for file web/xliff/en.xlf in zh_CN (#12402)
Translate web/xliff/en.xlf in zh_CN

100% translated source file: 'web/xliff/en.xlf'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 09:30:54 +01:00
b61d918c5c translate: Updates for file web/xliff/en.xlf in zh-Hans (#12401)
Translate web/xliff/en.xlf in zh-Hans

100% translated source file: 'web/xliff/en.xlf'
on 'zh-Hans'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 09:30:48 +01:00
076a4f4772 translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#12400)
Translate django.po in zh-Hans

100% translated source file: 'django.po'
on 'zh-Hans'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 09:30:35 +01:00
b3872b35f8 translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#12399)
Translate locale/en/LC_MESSAGES/django.po in zh_CN

100% translated source file: 'locale/en/LC_MESSAGES/django.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 09:29:53 +01:00
88 changed files with 515 additions and 1186 deletions

View File

@ -1,5 +1,5 @@
[bumpversion]
current_version = 2024.12.4
current_version = 2024.10.5
tag = True
commit = True
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?

View File

@ -35,6 +35,14 @@ runs:
AUTHENTIK_OUTPOSTS__CONTAINER_IMAGE_BASE=ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
```
For arm64, use these values:
```shell
AUTHENTIK_IMAGE=ghcr.io/goauthentik/dev-server
AUTHENTIK_TAG=${{ inputs.tag }}-arm64
AUTHENTIK_OUTPOSTS__CONTAINER_IMAGE_BASE=ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
```
Afterwards, run the upgrade commands from the latest release notes.
</details>
<details>
@ -52,6 +60,18 @@ runs:
tag: ${{ inputs.tag }}
```
For arm64, use these values:
```yaml
authentik:
outposts:
container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
global:
image:
repository: ghcr.io/goauthentik/dev-server
tag: ${{ inputs.tag }}-arm64
```
Afterwards, run the upgrade commands from the latest release notes.
</details>
edit-mode: replace

View File

@ -9,9 +9,6 @@ inputs:
image-arch:
required: false
description: "Docker image arch"
release:
required: true
description: "True if this is a release build, false if this is a dev/PR build"
outputs:
shouldPush:
@ -32,24 +29,15 @@ outputs:
imageTags:
description: "Docker image tags"
value: ${{ steps.ev.outputs.imageTags }}
imageTagsJSON:
description: "Docker image tags, as a JSON array"
value: ${{ steps.ev.outputs.imageTagsJSON }}
attestImageNames:
description: "Docker image names used for attestation"
value: ${{ steps.ev.outputs.attestImageNames }}
cacheTo:
description: "cache-to value for the docker build step"
value: ${{ steps.ev.outputs.cacheTo }}
imageMainTag:
description: "Docker image main tag"
value: ${{ steps.ev.outputs.imageMainTag }}
imageMainName:
description: "Docker image main name"
value: ${{ steps.ev.outputs.imageMainName }}
imageBuildArgs:
description: "Docker image build args"
value: ${{ steps.ev.outputs.imageBuildArgs }}
runs:
using: "composite"
@ -60,8 +48,6 @@ runs:
env:
IMAGE_NAME: ${{ inputs.image-name }}
IMAGE_ARCH: ${{ inputs.image-arch }}
RELEASE: ${{ inputs.release }}
PR_HEAD_SHA: ${{ github.event.pull_request.head.sha }}
REF: ${{ github.ref }}
run: |
python3 ${{ github.action_path }}/push_vars.py

View File

@ -2,7 +2,6 @@
import configparser
import os
from json import dumps
from time import time
parser = configparser.ConfigParser()
@ -49,7 +48,7 @@ if is_release:
]
else:
suffix = ""
if image_arch:
if image_arch and image_arch != "amd64":
suffix = f"-{image_arch}"
for name in image_names:
image_tags += [
@ -71,31 +70,12 @@ def get_attest_image_names(image_with_tags: list[str]):
return ",".join(set(image_tags))
# Generate `cache-to` param
cache_to = ""
if should_push:
_cache_tag = "buildcache"
if image_arch:
_cache_tag += f"-{image_arch}"
cache_to = f"type=registry,ref={get_attest_image_names(image_tags)}:{_cache_tag},mode=max"
image_build_args = []
if os.getenv("RELEASE", "false").lower() == "true":
image_build_args = [f"VERSION={os.getenv('REF')}"]
else:
image_build_args = [f"GIT_BUILD_HASH={sha}"]
image_build_args = "\n".join(image_build_args)
with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print(f"shouldPush={str(should_push).lower()}", file=_output)
print(f"sha={sha}", file=_output)
print(f"version={version}", file=_output)
print(f"prerelease={prerelease}", file=_output)
print(f"imageTags={','.join(image_tags)}", file=_output)
print(f"imageTagsJSON={dumps(image_tags)}", file=_output)
print(f"attestImageNames={get_attest_image_names(image_tags)}", file=_output)
print(f"imageMainTag={image_main_tag}", file=_output)
print(f"imageMainName={image_tags[0]}", file=_output)
print(f"cacheTo={cache_to}", file=_output)
print(f"imageBuildArgs={image_build_args}", file=_output)

View File

@ -1,18 +1,7 @@
#!/bin/bash -x
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
# Non-pushing PR
GITHUB_OUTPUT=/dev/stdout \
GITHUB_REF=ref \
GITHUB_SHA=sha \
IMAGE_NAME=ghcr.io/goauthentik/server,beryju/authentik \
GITHUB_REPOSITORY=goauthentik/authentik \
python $SCRIPT_DIR/push_vars.py
# Pushing PR/main
GITHUB_OUTPUT=/dev/stdout \
GITHUB_REF=ref \
GITHUB_SHA=sha \
IMAGE_NAME=ghcr.io/goauthentik/server,beryju/authentik \
GITHUB_REPOSITORY=goauthentik/authentik \
DOCKER_USERNAME=foo \
python $SCRIPT_DIR/push_vars.py

View File

@ -1,95 +0,0 @@
# Re-usable workflow for a single-architecture build
name: Single-arch Container build
on:
workflow_call:
inputs:
image_name:
required: true
type: string
image_arch:
required: true
type: string
runs-on:
required: true
type: string
registry_dockerhub:
default: false
type: boolean
registry_ghcr:
default: false
type: boolean
release:
default: false
type: boolean
outputs:
image-digest:
value: ${{ jobs.build.outputs.image-digest }}
jobs:
build:
name: Build ${{ inputs.image_arch }}
runs-on: ${{ inputs.runs-on }}
outputs:
image-digest: ${{ steps.push.outputs.digest }}
permissions:
# Needed to upload container images to ghcr.io
packages: write
# Needed for attestation
id-token: write
attestations: write
steps:
- uses: actions/checkout@v4
- uses: docker/setup-qemu-action@v3.3.0
- uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ${{ inputs.image_name }}
image-arch: ${{ inputs.image_arch }}
release: ${{ inputs.release }}
- name: Login to Docker Hub
if: ${{ inputs.registry_dockerhub }}
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
if: ${{ inputs.registry_ghcr }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: make empty clients
if: ${{ inputs.release }}
run: |
mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api
- name: generate ts client
if: ${{ !inputs.release }}
run: make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@v6
id: push
with:
context: .
push: true
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
build-args: |
${{ steps.ev.outputs.imageBuildArgs }}
tags: ${{ steps.ev.outputs.imageTags }}
platforms: linux/${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames }}:buildcache-${{ inputs.image_arch }}
cache-to: ${{ steps.ev.outputs.cacheTo }}
- uses: actions/attest-build-provenance@v2
id: attest
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true

View File

@ -1,102 +0,0 @@
# Re-usable workflow for a multi-architecture build
name: Multi-arch container build
on:
workflow_call:
inputs:
image_name:
required: true
type: string
registry_dockerhub:
default: false
type: boolean
registry_ghcr:
default: true
type: boolean
release:
default: false
type: boolean
outputs: {}
jobs:
build-server-amd64:
uses: ./.github/workflows/_reusable-docker-build-single.yaml
secrets: inherit
with:
image_name: ${{ inputs.image_name }}
image_arch: amd64
runs-on: ubuntu-latest
registry_dockerhub: ${{ inputs.registry_dockerhub }}
registry_ghcr: ${{ inputs.registry_ghcr }}
release: ${{ inputs.release }}
build-server-arm64:
uses: ./.github/workflows/_reusable-docker-build-single.yaml
secrets: inherit
with:
image_name: ${{ inputs.image_name }}
image_arch: arm64
runs-on: ubuntu-22.04-arm
registry_dockerhub: ${{ inputs.registry_dockerhub }}
registry_ghcr: ${{ inputs.registry_ghcr }}
release: ${{ inputs.release }}
get-tags:
runs-on: ubuntu-latest
needs:
- build-server-amd64
- build-server-arm64
outputs:
tags: ${{ steps.ev.outputs.imageTagsJSON }}
steps:
- uses: actions/checkout@v4
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ${{ inputs.image_name }}
merge-server:
runs-on: ubuntu-latest
needs:
- get-tags
- build-server-amd64
- build-server-arm64
strategy:
fail-fast: false
matrix:
tag: ${{ fromJson(needs.get-tags.outputs.tags) }}
steps:
- uses: actions/checkout@v4
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ${{ inputs.image_name }}
- name: Login to Docker Hub
if: ${{ inputs.registry_dockerhub }}
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
if: ${{ inputs.registry_ghcr }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: int128/docker-manifest-create-action@v2
id: build
with:
tags: ${{ matrix.tag }}
sources: |
${{ steps.ev.outputs.attestImageNames }}@${{ needs.build-server-amd64.outputs.image-digest }}
${{ steps.ev.outputs.attestImageNames }}@${{ needs.build-server-arm64.outputs.image-digest }}
- uses: actions/attest-build-provenance@v2
id: attest
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.build.outputs.digest }}
push-to-registry: true

View File

@ -223,18 +223,68 @@ jobs:
with:
jobs: ${{ toJSON(needs) }}
build:
strategy:
fail-fast: false
matrix:
arch:
- amd64
- arm64
needs: ci-core-mark
runs-on: ubuntu-latest
permissions:
# Needed to upload container images to ghcr.io
# Needed to upload contianer images to ghcr.io
packages: write
# Needed for attestation
id-token: write
attestations: write
needs: ci-core-mark
uses: ./.github/workflows/_reusable-docker-build.yaml
secrets: inherit
with:
image_name: ghcr.io/goauthentik/dev-server
release: false
timeout-minutes: 120
steps:
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-server
image-arch: ${{ matrix.arch }}
- name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: generate ts client
run: make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@v6
id: push
with:
context: .
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
tags: ${{ steps.ev.outputs.imageTags }}
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache,mode=max' || '' }}
platforms: linux/${{ matrix.arch }}
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
pr-comment:
needs:
- build

View File

@ -72,7 +72,7 @@ jobs:
- rac
runs-on: ubuntu-latest
permissions:
# Needed to upload container images to ghcr.io
# Needed to upload contianer images to ghcr.io
packages: write
# Needed for attestation
id-token: write

View File

@ -2,7 +2,7 @@ name: "CodeQL"
on:
push:
branches: [main, next, version*]
branches: [main, "*", next, version*]
pull_request:
branches: [main]
schedule:

View File

@ -7,23 +7,64 @@ on:
jobs:
build-server:
uses: ./.github/workflows/_reusable-docker-build.yaml
secrets: inherit
runs-on: ubuntu-latest
permissions:
# Needed to upload container images to ghcr.io
# Needed to upload contianer images to ghcr.io
packages: write
# Needed for attestation
id-token: write
attestations: write
with:
image_name: ghcr.io/goauthentik/server,beryju/authentik
release: true
registry_dockerhub: true
registry_ghcr: true
steps:
- uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/server,beryju/authentik
- name: Docker Login Registry
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: make empty clients
run: |
mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api
- name: Build Docker Image
uses: docker/build-push-action@v6
id: push
with:
context: .
push: true
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
build-args: |
VERSION=${{ github.ref }}
tags: ${{ steps.ev.outputs.imageTags }}
platforms: linux/amd64,linux/arm64
- uses: actions/attest-build-provenance@v2
id: attest
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-outpost:
runs-on: ubuntu-latest
permissions:
# Needed to upload container images to ghcr.io
# Needed to upload contianer images to ghcr.io
packages: write
# Needed for attestation
id-token: write
@ -147,8 +188,8 @@ jobs:
aws-region: ${{ env.AWS_REGION }}
- name: Upload template
run: |
aws s3 cp --acl=public-read website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.${{ github.ref }}.yaml
aws s3 cp --acl=public-read website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.latest.yaml
aws s3 cp website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.${{ github.ref }}.yaml
aws s3 cp website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.latest.yaml
test-release:
needs:
- build-server

View File

@ -14,7 +14,16 @@ jobs:
- uses: actions/checkout@v4
- name: Pre-release test
run: |
make test-docker
echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env
docker buildx install
mkdir -p ./gen-ts-api
docker build -t testing:latest .
echo "AUTHENTIK_IMAGE=testing" >> .env
echo "AUTHENTIK_TAG=latest" >> .env
docker compose up --no-start
docker compose start postgresql redis
docker compose run -u root server test-all
- id: generate_token
uses: tibdex/github-app-token@v2
with:

View File

@ -29,6 +29,7 @@
"!Enumerate sequence",
"!Env scalar",
"!Find sequence",
"!FindObject sequence",
"!Format sequence",
"!If sequence",
"!Index scalar",
@ -51,7 +52,9 @@
"ignoreCase": false
}
],
"go.testFlags": ["-count=1"],
"go.testFlags": [
"-count=1"
],
"github-actions.workflows.pinned.workflows": [
".github/workflows/ci-main.yml"
]

View File

@ -94,7 +94,7 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
/bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 5: Python dependencies
FROM ghcr.io/goauthentik/fips-python:3.12.8-slim-bookworm-fips AS python-deps
FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS python-deps
ARG TARGETARCH
ARG TARGETVARIANT
@ -116,30 +116,15 @@ RUN --mount=type=bind,target=./pyproject.toml,src=./pyproject.toml \
--mount=type=bind,target=./poetry.lock,src=./poetry.lock \
--mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/pypoetry \
pip install --no-cache cffi && \
apt-get update && \
apt-get install -y --no-install-recommends \
build-essential libffi-dev \
# Required for cryptography
curl pkg-config \
# Required for lxml
libxslt-dev zlib1g-dev \
# Required for xmlsec
libltdl-dev \
# Required for kadmin
sccache clang && \
curl https://sh.rustup.rs -sSf | sh -s -- -y && \
. "$HOME/.cargo/env" && \
python -m venv /ak-root/venv/ && \
bash -c "source ${VENV_PATH}/bin/activate && \
pip3 install --upgrade pip poetry && \
poetry config --local installer.no-binary cryptography,xmlsec,lxml,python-kadmin-rs && \
pip3 install --upgrade pip && \
pip3 install poetry && \
poetry install --only=main --no-ansi --no-interaction --no-root && \
pip uninstall cryptography -y && \
poetry install --only=main --no-ansi --no-interaction --no-root"
pip install --force-reinstall /wheels/*"
# Stage 6: Run
FROM ghcr.io/goauthentik/fips-python:3.12.8-slim-bookworm-fips AS final-image
FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS final-image
ARG VERSION
ARG GIT_BUILD_HASH
@ -156,7 +141,7 @@ WORKDIR /
# We cannot cache this layer otherwise we'll end up with a bigger image
RUN apt-get update && \
# Required for runtime
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 libltdl7 libxslt1.1 && \
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 && \
# Required for bootstrap & healtcheck
apt-get install -y --no-install-recommends runit && \
apt-get clean && \
@ -191,8 +176,9 @@ ENV TMPDIR=/dev/shm/ \
PYTHONUNBUFFERED=1 \
PATH="/ak-root/venv/bin:/lifecycle:$PATH" \
VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false \
GOFIPS=1
POETRY_VIRTUALENVS_CREATE=false
ENV GOFIPS=1
HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ]

View File

@ -45,6 +45,15 @@ help: ## Show this help
go-test:
go test -timeout 0 -v -race -cover ./...
test-docker: ## Run all tests in a docker-compose
echo "PG_PASS=$(shell openssl rand 32 | base64 -w 0)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(shell openssl rand 32 | base64 -w 0)" >> .env
docker compose pull -q
docker compose up --no-start
docker compose start postgresql redis
docker compose run -u root server test-all
rm -f .env
test: ## Run the server tests and produce a coverage report (locally)
coverage run manage.py test --keepdb authentik
coverage html
@ -254,9 +263,6 @@ docker: ## Build a docker image of the current source tree
mkdir -p ${GEN_API_TS}
DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE}
test-docker:
./scripts/test_docker.sh
#########################
## CI
#########################

View File

@ -20,8 +20,8 @@ Even if the issue is not a CVE, we still greatly appreciate your help in hardeni
| Version | Supported |
| --------- | --------- |
| 2024.8.x | ✅ |
| 2024.10.x | ✅ |
| 2024.12.x | ✅ |
## Reporting a Vulnerability

View File

@ -2,7 +2,7 @@
from os import environ
__version__ = "2024.12.4"
__version__ = "2024.10.5"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"

View File

@ -150,6 +150,7 @@ entries:
at_index_sequence_default: !AtIndex [!Context sequence, 100, "non existent"]
at_index_mapping: !AtIndex [!Context mapping, "key2"]
at_index_mapping_default: !AtIndex [!Context mapping, "invalid", "non existent"]
find_object: !AtIndex [!FindObject [authentik_providers_oauth2.scopemapping, [scope_name, openid]], managed]
identifiers:
name: test
conditions:

View File

@ -4,6 +4,7 @@ from os import environ
from django.test import TransactionTestCase
from authentik.blueprints.tests import apply_blueprint
from authentik.blueprints.v1.exporter import FlowExporter
from authentik.blueprints.v1.importer import Importer, transaction_rollback
from authentik.core.models import Group
@ -126,6 +127,7 @@ class TestBlueprintsV1(TransactionTestCase):
self.assertEqual(Prompt.objects.filter(field_key="username").count(), count_before)
@apply_blueprint("system/providers-oauth2.yaml")
def test_import_yaml_tags(self):
"""Test some yaml tags"""
ExpressionPolicy.objects.filter(name="foo-bar-baz-qux").delete()
@ -136,91 +138,93 @@ class TestBlueprintsV1(TransactionTestCase):
self.assertTrue(importer.apply())
policy = ExpressionPolicy.objects.filter(name="foo-bar-baz-qux").first()
self.assertTrue(policy)
self.assertTrue(
Group.objects.filter(
attributes={
"policy_pk1": str(policy.pk) + "-suffix",
"policy_pk2": str(policy.pk) + "-suffix",
"boolAnd": True,
"boolNand": False,
"boolOr": True,
"boolNor": False,
"boolXor": True,
"boolXnor": False,
"boolComplex": True,
"if_true_complex": {
"dictionary": {
"with": {"keys": "and_values"},
"and_nested_custom_tags": "foo-bar",
}
group = Group.objects.filter(name="test").first()
self.assertIsNotNone(group)
self.assertEqual(
group.attributes,
{
"policy_pk1": str(policy.pk) + "-suffix",
"policy_pk2": str(policy.pk) + "-suffix",
"boolAnd": True,
"boolNand": False,
"boolOr": True,
"boolNor": False,
"boolXor": True,
"boolXnor": False,
"boolComplex": True,
"if_true_complex": {
"dictionary": {
"with": {"keys": "and_values"},
"and_nested_custom_tags": "foo-bar",
}
},
"if_false_complex": ["list", "with", "items", "foo-bar"],
"if_true_simple": True,
"if_short": True,
"if_false_simple": 2,
"enumerate_mapping_to_mapping": {
"prefix-key1": "other-prefix-value",
"prefix-key2": "other-prefix-2",
},
"enumerate_mapping_to_sequence": [
"prefixed-pair-key1-value",
"prefixed-pair-key2-2",
],
"enumerate_sequence_to_sequence": [
"prefixed-items-0-foo",
"prefixed-items-1-bar",
],
"enumerate_sequence_to_mapping": {"index: 0": "foo", "index: 1": "bar"},
"nested_complex_enumeration": {
"0": {
"key1": [
["prefixed-f", "prefixed-o", "prefixed-o"],
{
"outer_value": "foo",
"outer_index": 0,
"middle_value": "value",
"middle_index": "key1",
},
],
"key2": [
["prefixed-f", "prefixed-o", "prefixed-o"],
{
"outer_value": "foo",
"outer_index": 0,
"middle_value": 2,
"middle_index": "key2",
},
],
},
"if_false_complex": ["list", "with", "items", "foo-bar"],
"if_true_simple": True,
"if_short": True,
"if_false_simple": 2,
"enumerate_mapping_to_mapping": {
"prefix-key1": "other-prefix-value",
"prefix-key2": "other-prefix-2",
"1": {
"key1": [
["prefixed-b", "prefixed-a", "prefixed-r"],
{
"outer_value": "bar",
"outer_index": 1,
"middle_value": "value",
"middle_index": "key1",
},
],
"key2": [
["prefixed-b", "prefixed-a", "prefixed-r"],
{
"outer_value": "bar",
"outer_index": 1,
"middle_value": 2,
"middle_index": "key2",
},
],
},
"enumerate_mapping_to_sequence": [
"prefixed-pair-key1-value",
"prefixed-pair-key2-2",
],
"enumerate_sequence_to_sequence": [
"prefixed-items-0-foo",
"prefixed-items-1-bar",
],
"enumerate_sequence_to_mapping": {"index: 0": "foo", "index: 1": "bar"},
"nested_complex_enumeration": {
"0": {
"key1": [
["prefixed-f", "prefixed-o", "prefixed-o"],
{
"outer_value": "foo",
"outer_index": 0,
"middle_value": "value",
"middle_index": "key1",
},
],
"key2": [
["prefixed-f", "prefixed-o", "prefixed-o"],
{
"outer_value": "foo",
"outer_index": 0,
"middle_value": 2,
"middle_index": "key2",
},
],
},
"1": {
"key1": [
["prefixed-b", "prefixed-a", "prefixed-r"],
{
"outer_value": "bar",
"outer_index": 1,
"middle_value": "value",
"middle_index": "key1",
},
],
"key2": [
["prefixed-b", "prefixed-a", "prefixed-r"],
{
"outer_value": "bar",
"outer_index": 1,
"middle_value": 2,
"middle_index": "key2",
},
],
},
},
"nested_context": "context-nested-value",
"env_null": None,
"at_index_sequence": "foo",
"at_index_sequence_default": "non existent",
"at_index_mapping": 2,
"at_index_mapping_default": "non existent",
}
).exists()
},
"nested_context": "context-nested-value",
"env_null": None,
"at_index_sequence": "foo",
"at_index_sequence_default": "non existent",
"at_index_mapping": 2,
"at_index_mapping_default": "non existent",
"find_object": "goauthentik.io/providers/oauth2/scope-openid",
},
)
self.assertTrue(
OAuthSource.objects.filter(

View File

@ -311,7 +311,7 @@ class Format(YAMLTag):
class Find(YAMLTag):
"""Find any object"""
"""Find any object primary key"""
model_name: str | YAMLTag
conditions: list[list]
@ -326,7 +326,7 @@ class Find(YAMLTag):
values.append(loader.construct_object(node_values))
self.conditions.append(values)
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
def _get_instance(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
if isinstance(self.model_name, YAMLTag):
model_name = self.model_name.resolve(entry, blueprint)
else:
@ -348,12 +348,29 @@ class Find(YAMLTag):
else:
query_value = cond[1]
query &= Q(**{query_key: query_value})
instance = model_class.objects.filter(query).first()
return model_class.objects.filter(query).first()
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
instance = self._get_instance(entry, blueprint)
if instance:
return instance.pk
return None
class FindObject(Find):
"""Find any object"""
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
instance = self._get_instance(entry, blueprint)
if not instance:
return None
if not isinstance(instance, SerializerModel):
raise EntryInvalidError.from_entry(
f"Model {self.model_name} is not resolvable through FindObject", entry
)
return instance.serializer(instance=instance).data
class Condition(YAMLTag):
"""Convert all values to a single boolean"""
@ -649,6 +666,7 @@ class BlueprintLoader(SafeLoader):
super().__init__(*args, **kwargs)
self.add_constructor("!KeyOf", KeyOf)
self.add_constructor("!Find", Find)
self.add_constructor("!FindObject", FindObject)
self.add_constructor("!Context", Context)
self.add_constructor("!Format", Format)
self.add_constructor("!Condition", Condition)

View File

@ -1,16 +1,15 @@
"""Application Roles API Viewset"""
from django.http import HttpRequest
from django.utils.translation import gettext_lazy as _
from rest_framework.exceptions import ValidationError
from rest_framework.viewsets import ModelViewSet
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer
from authentik.core.models import (
Application,
ApplicationEntitlement,
User,
)
@ -19,10 +18,7 @@ class ApplicationEntitlementSerializer(ModelSerializer):
def validate_app(self, app: Application) -> Application:
"""Ensure user has permission to view"""
request: HttpRequest = self.context.get("request")
if not request and SERIALIZER_CONTEXT_BLUEPRINT in self.context:
return app
user = request.user
user: User = self._context["request"].user
if user.has_perm("view_application", app) or user.has_perm(
"authentik_core.view_application"
):

View File

@ -3,7 +3,6 @@
from django.utils.translation import gettext_lazy as _
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiParameter, extend_schema
from guardian.shortcuts import get_objects_for_user
from rest_framework.fields import (
BooleanField,
CharField,
@ -17,6 +16,7 @@ from rest_framework.viewsets import ViewSet
from authentik.core.api.utils import MetaNameSerializer
from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import EndpointDevice
from authentik.rbac.decorators import permission_required
from authentik.stages.authenticator import device_classes, devices_for_user
from authentik.stages.authenticator.models import Device
from authentik.stages.authenticator_webauthn.models import WebAuthnDevice
@ -73,9 +73,7 @@ class AdminDeviceViewSet(ViewSet):
def get_devices(self, **kwargs):
"""Get all devices in all child classes"""
for model in device_classes():
device_set = get_objects_for_user(
self.request.user, f"{model._meta.app_label}.view_{model._meta.model_name}", model
).filter(**kwargs)
device_set = model.objects.filter(**kwargs)
yield from device_set
@extend_schema(
@ -88,6 +86,10 @@ class AdminDeviceViewSet(ViewSet):
],
responses={200: DeviceSerializer(many=True)},
)
@permission_required(
None,
[f"{model._meta.app_label}.view_{model._meta.model_name}" for model in device_classes()],
)
def list(self, request: Request) -> Response:
"""Get all devices for current user"""
kwargs = {}

View File

@ -21,7 +21,6 @@ from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import MetaNameSerializer, ModelSerializer
from authentik.core.models import GroupSourceConnection, Source, UserSourceConnection
from authentik.core.types import UserSettingSerializer
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.lib.utils.file import (
FilePathSerializer,
FileUploadSerializer,
@ -76,7 +75,6 @@ class SourceSerializer(ModelSerializer, MetaNameSerializer):
class SourceViewSet(
MultipleFieldLookupMixin,
TypesMixin,
mixins.RetrieveModelMixin,
mixins.DestroyModelMixin,
@ -89,7 +87,6 @@ class SourceViewSet(
queryset = Source.objects.none()
serializer_class = SourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
search_fields = ["slug", "name"]
filterset_fields = ["slug", "name", "managed"]

View File

@ -1,14 +1,13 @@
"""User API Views"""
from datetime import timedelta
from importlib import import_module
from json import loads
from typing import Any
from django.conf import settings
from django.contrib.auth import update_session_auth_hash
from django.contrib.auth.models import Permission
from django.contrib.sessions.backends.base import SessionBase
from django.contrib.sessions.backends.cache import KEY_PREFIX
from django.core.cache import cache
from django.db.models.functions import ExtractHour
from django.db.transaction import atomic
from django.db.utils import IntegrityError
@ -92,7 +91,6 @@ from authentik.stages.email.tasks import send_mails
from authentik.stages.email.utils import TemplateEmailMessage
LOGGER = get_logger()
SessionStore: SessionBase = import_module(settings.SESSION_ENGINE).SessionStore
class UserGroupSerializer(ModelSerializer):
@ -769,8 +767,7 @@ class UserViewSet(UsedByMixin, ModelViewSet):
if not instance.is_active:
sessions = AuthenticatedSession.objects.filter(user=instance)
session_ids = sessions.values_list("session_key", flat=True)
for session in session_ids:
SessionStore(session).delete()
cache.delete_many(f"{KEY_PREFIX}{session}" for session in session_ids)
sessions.delete()
LOGGER.debug("Deleted user's sessions", user=instance.username)
return response

View File

@ -1,10 +1,7 @@
"""authentik core signals"""
from importlib import import_module
from django.conf import settings
from django.contrib.auth.signals import user_logged_in, user_logged_out
from django.contrib.sessions.backends.base import SessionBase
from django.contrib.sessions.backends.cache import KEY_PREFIX
from django.core.cache import cache
from django.core.signals import Signal
from django.db.models import Model
@ -28,7 +25,6 @@ password_changed = Signal()
login_failed = Signal()
LOGGER = get_logger()
SessionStore: SessionBase = import_module(settings.SESSION_ENGINE).SessionStore
@receiver(post_save, sender=Application)
@ -64,7 +60,8 @@ def user_logged_out_session(sender, request: HttpRequest, user: User, **_):
@receiver(pre_delete, sender=AuthenticatedSession)
def authenticated_session_delete(sender: type[Model], instance: "AuthenticatedSession", **_):
"""Delete session when authenticated session is deleted"""
SessionStore(instance.session_key).delete()
cache_key = f"{KEY_PREFIX}{instance.session_key}"
cache.delete(cache_key)
@receiver(pre_save)

View File

@ -36,7 +36,7 @@ from authentik.flows.planner import (
)
from authentik.flows.stage import StageView
from authentik.flows.views.executor import NEXT_ARG_NAME, SESSION_KEY_GET, SESSION_KEY_PLAN
from authentik.lib.utils.urls import is_url_absolute, redirect_with_qs
from authentik.lib.utils.urls import redirect_with_qs
from authentik.lib.views import bad_request_message
from authentik.policies.denied import AccessDeniedResponse
from authentik.policies.utils import delete_none_values
@ -208,8 +208,6 @@ class SourceFlowManager:
final_redirect = self.request.session.get(SESSION_KEY_GET, {}).get(
NEXT_ARG_NAME, "authentik_core:if-user"
)
if not is_url_absolute(final_redirect):
final_redirect = "authentik_core:if-user"
flow_context.update(
{
# Since we authenticate the user by their token, they have no backend set

View File

@ -159,9 +159,9 @@ class ConnectionToken(ExpiringModel):
default_settings["port"] = str(port)
else:
default_settings["hostname"] = self.endpoint.host
if self.endpoint.protocol == Protocols.RDP:
default_settings["resize-method"] = "display-update"
default_settings["client-name"] = f"authentik - {self.session.user}"
default_settings["client-name"] = "authentik"
# default_settings["enable-drive"] = "true"
# default_settings["drive-name"] = "authentik"
settings = {}
always_merger.merge(settings, default_settings)
always_merger.merge(settings, self.endpoint.provider.settings)

View File

@ -50,10 +50,9 @@ class TestModels(TransactionTestCase):
{
"hostname": self.endpoint.host.split(":")[0],
"port": "1324",
"client-name": f"authentik - {self.user}",
"client-name": "authentik",
"drive-path": path,
"create-drive-path": "true",
"resize-method": "display-update",
},
)
# Set settings in provider
@ -64,11 +63,10 @@ class TestModels(TransactionTestCase):
{
"hostname": self.endpoint.host.split(":")[0],
"port": "1324",
"client-name": f"authentik - {self.user}",
"client-name": "authentik",
"drive-path": path,
"create-drive-path": "true",
"level": "provider",
"resize-method": "display-update",
},
)
# Set settings in endpoint
@ -81,11 +79,10 @@ class TestModels(TransactionTestCase):
{
"hostname": self.endpoint.host.split(":")[0],
"port": "1324",
"client-name": f"authentik - {self.user}",
"client-name": "authentik",
"drive-path": path,
"create-drive-path": "true",
"level": "endpoint",
"resize-method": "display-update",
},
)
# Set settings in token
@ -98,11 +95,10 @@ class TestModels(TransactionTestCase):
{
"hostname": self.endpoint.host.split(":")[0],
"port": "1324",
"client-name": f"authentik - {self.user}",
"client-name": "authentik",
"drive-path": path,
"create-drive-path": "true",
"level": "token",
"resize-method": "display-update",
},
)
# Set settings in property mapping (provider)
@ -118,11 +114,10 @@ class TestModels(TransactionTestCase):
{
"hostname": self.endpoint.host.split(":")[0],
"port": "1324",
"client-name": f"authentik - {self.user}",
"client-name": "authentik",
"drive-path": path,
"create-drive-path": "true",
"level": "property_mapping_provider",
"resize-method": "display-update",
},
)
# Set settings in property mapping (endpoint)
@ -140,12 +135,11 @@ class TestModels(TransactionTestCase):
{
"hostname": self.endpoint.host.split(":")[0],
"port": "1324",
"client-name": f"authentik - {self.user}",
"client-name": "authentik",
"drive-path": path,
"create-drive-path": "true",
"level": "property_mapping_endpoint",
"foo": "true",
"bar": "6",
"resize-method": "display-update",
},
)

View File

@ -138,6 +138,7 @@ def notification_cleanup(self: SystemTask):
"""Cleanup seen notifications and notifications whose event expired."""
notifications = Notification.objects.filter(Q(event=None) | Q(seen=True))
amount = notifications.count()
notifications.delete()
for notification in notifications:
notification.delete()
LOGGER.debug("Expired notifications", amount=amount)
self.set_status(TaskStatus.SUCCESSFUL, f"Expired {amount} Notifications")

View File

@ -109,8 +109,6 @@ class FlowPlan:
def pop(self):
"""Pop next pending stage from bottom of list"""
if not self.markers and not self.bindings:
return
self.markers.pop(0)
self.bindings.pop(0)
@ -158,13 +156,8 @@ class FlowPlan:
final_stage: type[StageView] = self.bindings[-1].stage.view
temp_exec = FlowExecutorView(flow=flow, request=request, plan=self)
temp_exec.current_stage = self.bindings[-1].stage
temp_exec.current_stage_view = final_stage
temp_exec.setup(request, flow.slug)
stage = final_stage(request=request, executor=temp_exec)
response = stage.dispatch(request)
# Ensure we clean the flow state we have in the session before we redirect away
temp_exec.stage_ok()
return response
return stage.dispatch(request)
return redirect_with_qs(
"authentik_core:if-flow",

View File

@ -103,7 +103,7 @@ class FlowExecutorView(APIView):
permission_classes = [AllowAny]
flow: Flow = None
flow: Flow
plan: FlowPlan | None = None
current_binding: FlowStageBinding | None = None
@ -114,8 +114,7 @@ class FlowExecutorView(APIView):
def setup(self, request: HttpRequest, flow_slug: str):
super().setup(request, flow_slug=flow_slug)
if not self.flow:
self.flow = get_object_or_404(Flow.objects.select_related(), slug=flow_slug)
self.flow = get_object_or_404(Flow.objects.select_related(), slug=flow_slug)
self._logger = get_logger().bind(flow_slug=flow_slug)
set_tag("authentik.flow", self.flow.slug)

View File

@ -78,9 +78,7 @@ class FlowInspectorView(APIView):
self.flow = get_object_or_404(Flow.objects.select_related(), slug=flow_slug)
if settings.DEBUG:
return
if request.user.has_perm("authentik_flows.inspect_flow") or request.user.has_perm(
"authentik_flows.inspect_flow", self.flow
):
if request.user.has_perm("authentik_flow.inspect_flow", self.flow):
return
raise Http404

View File

@ -1,37 +0,0 @@
from collections.abc import Callable, Sequence
from typing import Self
from uuid import UUID
from django.db.models import Model, Q, QuerySet, UUIDField
from django.shortcuts import get_object_or_404
class MultipleFieldLookupMixin:
"""Helper mixin class to add support for multiple lookup_fields.
`lookup_fields` needs to be set which specifies the actual fields to query, `lookup_field`
is only used to generate the URL."""
lookup_field: str
lookup_fields: str | Sequence[str]
get_queryset: Callable[[Self], QuerySet]
filter_queryset: Callable[[Self, QuerySet], QuerySet]
def get_object(self):
queryset: QuerySet = self.get_queryset()
queryset = self.filter_queryset(queryset)
if isinstance(self.lookup_fields, str):
self.lookup_fields = [self.lookup_fields]
query = Q()
model: Model = queryset.model
for field in self.lookup_fields:
field_inst = model._meta.get_field(field)
# Sanity check, if the field we're filtering again, only apply the filter if
# our value looks like a UUID
if isinstance(field_inst, UUIDField):
try:
UUID(self.kwargs[self.lookup_field])
except ValueError:
continue
query |= Q(**{field: self.kwargs[self.lookup_field]})
return get_object_or_404(queryset, query)

View File

@ -280,25 +280,9 @@ class ConfigLoader:
self.log("warning", "Failed to parse config as int", path=path, exc=str(exc))
return default
def get_optional_int(self, path: str, default=None) -> int | None:
"""Wrapper for get that converts value into int or None if set"""
value = self.get(path, default)
if value is UNSET:
return default
try:
return int(value)
except (ValueError, TypeError) as exc:
if value is None or (isinstance(value, str) and value.lower() == "null"):
return None
self.log("warning", "Failed to parse config as int", path=path, exc=str(exc))
return default
def get_bool(self, path: str, default=False) -> bool:
"""Wrapper for get that converts value into boolean"""
value = self.get(path, UNSET)
if value is UNSET:
return default
return str(self.get(path)).lower() == "true"
return str(self.get(path, default)).lower() == "true"
def get_keys(self, path: str, sep=".") -> list[str]:
"""List attribute keys by using yaml path"""
@ -370,33 +354,20 @@ def django_db_config(config: ConfigLoader | None = None) -> dict:
"sslcert": config.get("postgresql.sslcert"),
"sslkey": config.get("postgresql.sslkey"),
},
"CONN_MAX_AGE": CONFIG.get_optional_int("postgresql.conn_max_age", 0),
"CONN_HEALTH_CHECKS": CONFIG.get_bool("postgresql.conn_health_checks", False),
"DISABLE_SERVER_SIDE_CURSORS": CONFIG.get_bool(
"postgresql.disable_server_side_cursors", False
),
"TEST": {
"NAME": config.get("postgresql.test.name"),
},
}
}
conn_max_age = CONFIG.get_optional_int("postgresql.conn_max_age", UNSET)
disable_server_side_cursors = CONFIG.get_bool("postgresql.disable_server_side_cursors", UNSET)
if config.get_bool("postgresql.use_pgpool", False):
db["default"]["DISABLE_SERVER_SIDE_CURSORS"] = True
if disable_server_side_cursors is not UNSET:
db["default"]["DISABLE_SERVER_SIDE_CURSORS"] = disable_server_side_cursors
if config.get_bool("postgresql.use_pgbouncer", False):
# https://docs.djangoproject.com/en/4.0/ref/databases/#transaction-pooling-server-side-cursors
db["default"]["DISABLE_SERVER_SIDE_CURSORS"] = True
# https://docs.djangoproject.com/en/4.0/ref/databases/#persistent-connections
db["default"]["CONN_MAX_AGE"] = None # persistent
if disable_server_side_cursors is not UNSET:
db["default"]["DISABLE_SERVER_SIDE_CURSORS"] = disable_server_side_cursors
if conn_max_age is not UNSET:
db["default"]["CONN_MAX_AGE"] = conn_max_age
for replica in config.get_keys("postgresql.read_replicas"):
_database = deepcopy(db["default"])

View File

@ -6,6 +6,8 @@ postgresql:
user: authentik
port: 5432
password: "env://POSTGRES_PASSWORD"
use_pgbouncer: false
use_pgpool: false
test:
name: test_authentik
read_replicas: {}

View File

@ -214,9 +214,6 @@ class TestConfig(TestCase):
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
"DISABLE_SERVER_SIDE_CURSORS": False,
}
},
)
@ -254,9 +251,6 @@ class TestConfig(TestCase):
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
"DISABLE_SERVER_SIDE_CURSORS": False,
},
"replica_0": {
"ENGINE": "authentik.root.db",
@ -272,72 +266,6 @@ class TestConfig(TestCase):
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
"DISABLE_SERVER_SIDE_CURSORS": False,
},
},
)
def test_db_read_replicas_pgbouncer(self):
"""Test read replicas"""
config = ConfigLoader()
config.set("postgresql.host", "foo")
config.set("postgresql.name", "foo")
config.set("postgresql.user", "foo")
config.set("postgresql.password", "foo")
config.set("postgresql.port", "foo")
config.set("postgresql.sslmode", "foo")
config.set("postgresql.sslrootcert", "foo")
config.set("postgresql.sslcert", "foo")
config.set("postgresql.sslkey", "foo")
config.set("postgresql.test.name", "foo")
config.set("postgresql.use_pgbouncer", True)
# Read replica
config.set("postgresql.read_replicas.0.host", "bar")
# Override conn_max_age
config.set("postgresql.read_replicas.0.conn_max_age", 10)
# This isn't supported
config.set("postgresql.read_replicas.0.use_pgbouncer", False)
conf = django_db_config(config)
self.assertEqual(
conf,
{
"default": {
"DISABLE_SERVER_SIDE_CURSORS": True,
"CONN_MAX_AGE": None,
"CONN_HEALTH_CHECKS": False,
"ENGINE": "authentik.root.db",
"HOST": "foo",
"NAME": "foo",
"OPTIONS": {
"sslcert": "foo",
"sslkey": "foo",
"sslmode": "foo",
"sslrootcert": "foo",
},
"PASSWORD": "foo",
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
},
"replica_0": {
"DISABLE_SERVER_SIDE_CURSORS": True,
"CONN_MAX_AGE": 10,
"CONN_HEALTH_CHECKS": False,
"ENGINE": "authentik.root.db",
"HOST": "bar",
"NAME": "foo",
"OPTIONS": {
"sslcert": "foo",
"sslkey": "foo",
"sslmode": "foo",
"sslrootcert": "foo",
},
"PASSWORD": "foo",
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
},
},
)
@ -366,8 +294,6 @@ class TestConfig(TestCase):
{
"default": {
"DISABLE_SERVER_SIDE_CURSORS": True,
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
"ENGINE": "authentik.root.db",
"HOST": "foo",
"NAME": "foo",
@ -384,8 +310,6 @@ class TestConfig(TestCase):
},
"replica_0": {
"DISABLE_SERVER_SIDE_CURSORS": True,
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
"ENGINE": "authentik.root.db",
"HOST": "bar",
"NAME": "foo",
@ -438,9 +362,6 @@ class TestConfig(TestCase):
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
"DISABLE_SERVER_SIDE_CURSORS": False,
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
},
"replica_0": {
"ENGINE": "authentik.root.db",
@ -456,9 +377,6 @@ class TestConfig(TestCase):
"PORT": "foo",
"TEST": {"NAME": "foo"},
"USER": "foo",
"DISABLE_SERVER_SIDE_CURSORS": False,
"CONN_MAX_AGE": 0,
"CONN_HEALTH_CHECKS": False,
},
},
)

View File

@ -499,11 +499,11 @@ class OAuthFulfillmentStage(StageView):
)
challenge.is_valid()
self.executor.stage_ok()
return HttpChallengeResponse(
challenge=challenge,
)
self.executor.stage_ok()
return HttpResponseRedirectScheme(uri, allowed_schemes=[parsed.scheme])
def post(self, request: HttpRequest, *args, **kwargs) -> HttpResponse:

View File

@ -256,7 +256,7 @@ class AssertionProcessor:
assertion.attrib["IssueInstant"] = self._issue_instant
assertion.append(self.get_issuer())
if self.provider.signing_kp and self.provider.sign_assertion:
if self.provider.signing_kp:
sign_algorithm_transform = SIGN_ALGORITHM_TRANSFORM_MAP.get(
self.provider.signature_algorithm, xmlsec.constants.TransformRsaSha1
)
@ -295,18 +295,6 @@ class AssertionProcessor:
response.append(self.get_issuer())
if self.provider.signing_kp and self.provider.sign_response:
sign_algorithm_transform = SIGN_ALGORITHM_TRANSFORM_MAP.get(
self.provider.signature_algorithm, xmlsec.constants.TransformRsaSha1
)
signature = xmlsec.template.create(
response,
xmlsec.constants.TransformExclC14N,
sign_algorithm_transform,
ns=xmlsec.constants.DSigNs,
)
response.append(signature)
status = SubElement(response, f"{{{NS_SAML_PROTOCOL}}}Status")
status_code = SubElement(status, f"{{{NS_SAML_PROTOCOL}}}StatusCode")
status_code.attrib["Value"] = "urn:oasis:names:tc:SAML:2.0:status:Success"

View File

@ -2,10 +2,8 @@
from base64 import b64encode
from defusedxml.lxml import fromstring
from django.http.request import QueryDict
from django.test import TestCase
from lxml import etree # nosec
from authentik.blueprints.tests import apply_blueprint
from authentik.core.tests.utils import create_test_admin_user, create_test_cert, create_test_flow
@ -13,14 +11,12 @@ from authentik.crypto.models import CertificateKeyPair
from authentik.events.models import Event, EventAction
from authentik.lib.generators import generate_id
from authentik.lib.tests.utils import get_request
from authentik.lib.xml import lxml_from_string
from authentik.providers.saml.models import SAMLPropertyMapping, SAMLProvider
from authentik.providers.saml.processors.assertion import AssertionProcessor
from authentik.providers.saml.processors.authn_request_parser import AuthNRequestParser
from authentik.sources.saml.exceptions import MismatchedRequestID
from authentik.sources.saml.models import SAMLSource
from authentik.sources.saml.processors.constants import (
NS_MAP,
SAML_BINDING_REDIRECT,
SAML_NAME_ID_FORMAT_EMAIL,
SAML_NAME_ID_FORMAT_UNSPECIFIED,
@ -189,19 +185,6 @@ class TestAuthNRequest(TestCase):
self.assertEqual(response.count(response_proc._assertion_id), 2)
self.assertEqual(response.count(response_proc._response_id), 2)
schema = etree.XMLSchema(
etree.parse("schemas/saml-schema-protocol-2.0.xsd", parser=etree.XMLParser()) # nosec
)
self.assertTrue(schema.validate(lxml_from_string(response)))
response_xml = fromstring(response)
self.assertEqual(
len(response_xml.xpath("//saml:Assertion/ds:Signature", namespaces=NS_MAP)), 1
)
self.assertEqual(
len(response_xml.xpath("//samlp:Response/ds:Signature", namespaces=NS_MAP)), 1
)
# Now parse the response (source)
http_request.POST = QueryDict(mutable=True)
http_request.POST["SAMLResponse"] = b64encode(response.encode()).decode()

View File

@ -2,10 +2,9 @@
from django.apps import apps
from django.contrib.auth.models import Permission
from django.db.models import Q, QuerySet
from django.db.models import QuerySet
from django_filters.filters import ModelChoiceFilter
from django_filters.filterset import FilterSet
from django_filters.rest_framework import DjangoFilterBackend
from rest_framework.exceptions import ValidationError
from rest_framework.fields import (
CharField,
@ -14,11 +13,8 @@ from rest_framework.fields import (
ReadOnlyField,
SerializerMethodField,
)
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.permissions import IsAuthenticated
from rest_framework.viewsets import ReadOnlyModelViewSet
from authentik.blueprints.v1.importer import excluded_models
from authentik.core.api.utils import ModelSerializer, PassiveSerializer
from authentik.core.models import User
from authentik.lib.validators import RequiredTogetherValidator
@ -96,9 +92,7 @@ class RBACPermissionViewSet(ReadOnlyModelViewSet):
queryset = Permission.objects.none()
serializer_class = PermissionSerializer
ordering = ["name"]
filter_backends = [DjangoFilterBackend, OrderingFilter, SearchFilter]
filterset_class = PermissionFilter
permission_classes = [IsAuthenticated]
search_fields = [
"codename",
"content_type__model",
@ -106,13 +100,13 @@ class RBACPermissionViewSet(ReadOnlyModelViewSet):
]
def get_queryset(self) -> QuerySet:
query = Q()
for model in excluded_models():
query |= Q(
content_type__app_label=model._meta.app_label,
content_type__model=model._meta.model_name,
return (
Permission.objects.all()
.select_related("content_type")
.filter(
content_type__app_label__startswith="authentik",
)
return Permission.objects.all().select_related("content_type").exclude(query)
)
class PermissionAssignSerializer(PassiveSerializer):

View File

@ -13,7 +13,6 @@ from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import PassiveSerializer
from authentik.events.api.tasks import SystemTaskSerializer
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.sources.kerberos.models import KerberosSource
from authentik.sources.kerberos.tasks import CACHE_KEY_STATUS
@ -60,13 +59,12 @@ class KerberosSyncStatusSerializer(PassiveSerializer):
tasks = SystemTaskSerializer(many=True, read_only=True)
class KerberosSourceViewSet(MultipleFieldLookupMixin, UsedByMixin, ModelViewSet):
class KerberosSourceViewSet(UsedByMixin, ModelViewSet):
"""Kerberos Source Viewset"""
queryset = KerberosSource.objects.all()
serializer_class = KerberosSourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
filterset_fields = [
"name",
"slug",

View File

@ -38,9 +38,7 @@ class KerberosBackend(InbuiltBackend):
self, username: str, realm: str | None, password: str, **filters
) -> tuple[User | None, KerberosSource | None]:
sources = KerberosSource.objects.filter(enabled=True)
user = User.objects.filter(
usersourceconnection__source__in=sources, username=username, **filters
).first()
user = User.objects.filter(usersourceconnection__source__in=sources, **filters).first()
if user is not None:
# User found, let's get its connections for the sources that are available
@ -79,7 +77,7 @@ class KerberosBackend(InbuiltBackend):
password, sender=user_source_connection.source
)
user_source_connection.user.save()
return user_source_connection.user, user_source_connection.source
return user, user_source_connection.source
# Password doesn't match, onto next source
LOGGER.debug(
"failed to kinit, password invalid",

View File

@ -18,7 +18,6 @@ from authentik.core.api.property_mappings import PropertyMappingFilterSet, Prope
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.crypto.models import CertificateKeyPair
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.lib.sync.outgoing.api import SyncStatusSerializer
from authentik.sources.ldap.models import LDAPSource, LDAPSourcePropertyMapping
from authentik.sources.ldap.tasks import CACHE_KEY_STATUS, SYNC_CLASSES
@ -104,13 +103,12 @@ class LDAPSourceSerializer(SourceSerializer):
extra_kwargs = {"bind_password": {"write_only": True}}
class LDAPSourceViewSet(MultipleFieldLookupMixin, UsedByMixin, ModelViewSet):
class LDAPSourceViewSet(UsedByMixin, ModelViewSet):
"""LDAP Source Viewset"""
queryset = LDAPSource.objects.all()
serializer_class = LDAPSourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
filterset_fields = [
"name",
"slug",

View File

@ -16,7 +16,6 @@ from rest_framework.viewsets import ModelViewSet
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import PassiveSerializer
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.lib.utils.http import get_http_session
from authentik.sources.oauth.models import OAuthSource
from authentik.sources.oauth.types.registry import SourceType, registry
@ -171,13 +170,12 @@ class OAuthSourceFilter(FilterSet):
]
class OAuthSourceViewSet(MultipleFieldLookupMixin, UsedByMixin, ModelViewSet):
class OAuthSourceViewSet(UsedByMixin, ModelViewSet):
"""Source Viewset"""
queryset = OAuthSource.objects.all()
serializer_class = OAuthSourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
filterset_class = OAuthSourceFilter
search_fields = ["name", "slug"]
ordering = ["name"]

View File

@ -18,7 +18,6 @@ from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import PassiveSerializer
from authentik.flows.challenge import RedirectChallenge
from authentik.flows.views.executor import to_stage_response
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.rbac.decorators import permission_required
from authentik.sources.plex.models import PlexSource, UserPlexSourceConnection
from authentik.sources.plex.plex import PlexAuth, PlexSourceFlowManager
@ -46,13 +45,12 @@ class PlexTokenRedeemSerializer(PassiveSerializer):
plex_token = CharField()
class PlexSourceViewSet(MultipleFieldLookupMixin, UsedByMixin, ModelViewSet):
class PlexSourceViewSet(UsedByMixin, ModelViewSet):
"""Plex source Viewset"""
queryset = PlexSource.objects.all()
serializer_class = PlexSourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
filterset_fields = [
"name",
"slug",

View File

@ -9,7 +9,6 @@ from rest_framework.viewsets import ModelViewSet
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.providers.saml.api.providers import SAMLMetadataSerializer
from authentik.sources.saml.models import SAMLSource
from authentik.sources.saml.processors.metadata import MetadataProcessor
@ -38,13 +37,12 @@ class SAMLSourceSerializer(SourceSerializer):
]
class SAMLSourceViewSet(MultipleFieldLookupMixin, UsedByMixin, ModelViewSet):
class SAMLSourceViewSet(UsedByMixin, ModelViewSet):
"""SAMLSource Viewset"""
queryset = SAMLSource.objects.all()
serializer_class = SAMLSourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
filterset_fields = [
"name",
"slug",

View File

@ -33,7 +33,6 @@ from authentik.flows.planner import (
)
from authentik.flows.stage import ChallengeStageView
from authentik.flows.views.executor import NEXT_ARG_NAME, SESSION_KEY_GET, SESSION_KEY_PLAN
from authentik.lib.utils.urls import is_url_absolute
from authentik.lib.views import bad_request_message
from authentik.providers.saml.utils.encoding import nice64
from authentik.sources.saml.exceptions import MissingSAMLResponse, UnsupportedNameIDFormat
@ -74,8 +73,6 @@ class InitiateView(View):
final_redirect = self.request.session.get(SESSION_KEY_GET, {}).get(
NEXT_ARG_NAME, "authentik_core:if-user"
)
if not is_url_absolute(final_redirect):
final_redirect = "authentik_core:if-user"
kwargs.update(
{
PLAN_CONTEXT_SSO: True,

View File

@ -7,7 +7,6 @@ from rest_framework.viewsets import ModelViewSet
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.tokens import TokenSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.sources.scim.models import SCIMSource
@ -48,13 +47,12 @@ class SCIMSourceSerializer(SourceSerializer):
]
class SCIMSourceViewSet(MultipleFieldLookupMixin, UsedByMixin, ModelViewSet):
class SCIMSourceViewSet(UsedByMixin, ModelViewSet):
"""SCIMSource Viewset"""
queryset = SCIMSource.objects.all()
serializer_class = SCIMSourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
filterset_fields = ["name", "slug"]
search_fields = ["name", "slug", "token__identifier", "token__user__username"]
ordering = ["name"]

View File

@ -20,7 +20,7 @@ from authentik.flows.planner import (
FlowPlanner,
)
from authentik.flows.stage import ChallengeStageView
from authentik.flows.views.executor import SESSION_KEY_GET, SESSION_KEY_PLAN, InvalidStageError
from authentik.flows.views.executor import SESSION_KEY_PLAN, InvalidStageError
from authentik.lib.utils.urls import reverse_with_qs
from authentik.stages.redirect.models import RedirectMode, RedirectStage
@ -72,9 +72,7 @@ class RedirectStageView(ChallengeStageView):
self.request.session[SESSION_KEY_PLAN] = plan
kwargs = self.executor.kwargs
kwargs.update({"flow_slug": flow.slug})
return reverse_with_qs(
"authentik_core:if-flow", self.request.session[SESSION_KEY_GET], kwargs=kwargs
)
return reverse_with_qs("authentik_core:if-flow", self.request.GET, kwargs=kwargs)
def get_challenge(self, *args, **kwargs) -> Challenge:
"""Get the redirect target. Prioritize `redirect_stage_target` if present."""

View File

@ -1,7 +1,5 @@
"""Test Redirect stage"""
from urllib.parse import urlencode
from django.urls.base import reverse
from rest_framework.exceptions import ValidationError
@ -60,23 +58,6 @@ class TestRedirectStage(FlowTestCase):
response, reverse("authentik_core:if-flow", kwargs={"flow_slug": self.target_flow.slug})
)
def test_flow_query(self):
self.stage.mode = RedirectMode.FLOW
self.stage.save()
response = self.client.get(
reverse("authentik_api:flow-executor", kwargs={"flow_slug": self.flow.slug})
+ "?"
+ urlencode({"query": urlencode({"test": "foo"})})
)
self.assertStageRedirects(
response,
reverse("authentik_core:if-flow", kwargs={"flow_slug": self.target_flow.slug})
+ "?"
+ urlencode({"test": "foo"}),
)
def test_override_static(self):
policy = ExpressionPolicy.objects.create(
name=generate_id(),

View File

@ -2,7 +2,7 @@
"$schema": "http://json-schema.org/draft-07/schema",
"$id": "https://goauthentik.io/blueprints/schema.json",
"type": "object",
"title": "authentik 2024.12.4 Blueprint schema",
"title": "authentik 2024.10.5 Blueprint schema",
"required": [
"version",
"entries"

View File

@ -31,7 +31,7 @@ services:
volumes:
- redis:/data
server:
image: ${AUTHENTIK_IMAGE:-ghcr.io/goauthentik/server}:${AUTHENTIK_TAG:-2024.12.4}
image: ${AUTHENTIK_IMAGE:-ghcr.io/goauthentik/server}:${AUTHENTIK_TAG:-2024.10.5}
restart: unless-stopped
command: server
environment:
@ -54,7 +54,7 @@ services:
redis:
condition: service_healthy
worker:
image: ${AUTHENTIK_IMAGE:-ghcr.io/goauthentik/server}:${AUTHENTIK_TAG:-2024.12.4}
image: ${AUTHENTIK_IMAGE:-ghcr.io/goauthentik/server}:${AUTHENTIK_TAG:-2024.10.5}
restart: unless-stopped
command: worker
environment:

2
go.mod
View File

@ -29,7 +29,7 @@ require (
github.com/spf13/cobra v1.8.1
github.com/stretchr/testify v1.10.0
github.com/wwt/guac v1.3.2
goauthentik.io/api/v3 v3.2024105.3
goauthentik.io/api/v3 v3.2024105.5
golang.org/x/exp v0.0.0-20230210204819-062eb4c674ab
golang.org/x/oauth2 v0.24.0
golang.org/x/sync v0.10.0

4
go.sum
View File

@ -299,8 +299,8 @@ go.opentelemetry.io/otel/trace v1.24.0 h1:CsKnnL4dUAr/0llH9FKuc698G04IrpWV0MQA/Y
go.opentelemetry.io/otel/trace v1.24.0/go.mod h1:HPc3Xr/cOApsBI154IU0OI0HJexz+aw5uPdbs3UCjNU=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
goauthentik.io/api/v3 v3.2024105.3 h1:Vl1vwPkCtA8hChsxwO3NUI8nupFC7r93jUHvqM+kYVw=
goauthentik.io/api/v3 v3.2024105.3/go.mod h1:zz+mEZg8rY/7eEjkMGWJ2DnGqk+zqxuybGCGrR2O4Kw=
goauthentik.io/api/v3 v3.2024105.5 h1:zBDqIjWN5QNuL6iBLL4o9QwBsSkFQdAnyTjASsyE/fw=
goauthentik.io/api/v3 v3.2024105.5/go.mod h1:zz+mEZg8rY/7eEjkMGWJ2DnGqk+zqxuybGCGrR2O4Kw=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20190510104115-cbcb75029529/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/crypto v0.0.0-20190605123033-f99c8df09eb5/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=

View File

@ -29,4 +29,4 @@ func UserAgent() string {
return fmt.Sprintf("authentik@%s", FullVersion())
}
const VERSION = "2024.12.4"
const VERSION = "2024.10.5"

View File

@ -4,7 +4,6 @@ import (
"context"
"crypto/tls"
"fmt"
"maps"
"net/http"
"net/url"
"strconv"
@ -17,22 +16,11 @@ import (
"goauthentik.io/internal/constants"
)
func (ac *APIController) getWebsocketURL(akURL url.URL, outpostUUID string, query url.Values) *url.URL {
wsUrl := &url.URL{}
wsUrl.Scheme = strings.ReplaceAll(akURL.Scheme, "http", "ws")
wsUrl.Host = akURL.Host
_p, _ := url.JoinPath(akURL.Path, "ws/outpost/", outpostUUID, "/")
wsUrl.Path = _p
v := url.Values{}
maps.Insert(v, maps.All(akURL.Query()))
maps.Insert(v, maps.All(query))
wsUrl.RawQuery = v.Encode()
return wsUrl
}
func (ac *APIController) initWS(akURL url.URL, outpostUUID string) error {
pathTemplate := "%s://%s%sws/outpost/%s/?%s"
query := akURL.Query()
query.Set("instance_uuid", ac.instanceUUID.String())
scheme := strings.ReplaceAll(akURL.Scheme, "http", "ws")
authHeader := fmt.Sprintf("Bearer %s", ac.token)
@ -49,9 +37,7 @@ func (ac *APIController) initWS(akURL url.URL, outpostUUID string) error {
},
}
wsu := ac.getWebsocketURL(akURL, outpostUUID, query).String()
ac.logger.WithField("url", wsu).Debug("connecting to websocket")
ws, _, err := dialer.Dial(wsu, header)
ws, _, err := dialer.Dial(fmt.Sprintf(pathTemplate, scheme, akURL.Host, akURL.Path, outpostUUID, akURL.Query().Encode()), header)
if err != nil {
ac.logger.WithError(err).Warning("failed to connect websocket")
return err

View File

@ -1,42 +0,0 @@
package ak
import (
"net/url"
"testing"
"github.com/stretchr/testify/assert"
)
func URLMustParse(u string) *url.URL {
ur, err := url.Parse(u)
if err != nil {
panic(err)
}
return ur
}
func TestWebsocketURL(t *testing.T) {
u := URLMustParse("http://localhost:9000?foo=bar")
uuid := "23470845-7263-4fe3-bd79-ec1d7bf77d77"
ac := &APIController{}
nu := ac.getWebsocketURL(*u, uuid, url.Values{})
assert.Equal(t, "ws://localhost:9000/ws/outpost/23470845-7263-4fe3-bd79-ec1d7bf77d77/?foo=bar", nu.String())
}
func TestWebsocketURL_Query(t *testing.T) {
u := URLMustParse("http://localhost:9000?foo=bar")
uuid := "23470845-7263-4fe3-bd79-ec1d7bf77d77"
ac := &APIController{}
v := url.Values{}
v.Set("bar", "baz")
nu := ac.getWebsocketURL(*u, uuid, v)
assert.Equal(t, "ws://localhost:9000/ws/outpost/23470845-7263-4fe3-bd79-ec1d7bf77d77/?bar=baz&foo=bar", nu.String())
}
func TestWebsocketURL_Subpath(t *testing.T) {
u := URLMustParse("http://localhost:9000/foo/bar/")
uuid := "23470845-7263-4fe3-bd79-ec1d7bf77d77"
ac := &APIController{}
nu := ac.getWebsocketURL(*u, uuid, url.Values{})
assert.Equal(t, "ws://localhost:9000/foo/bar/ws/outpost/23470845-7263-4fe3-bd79-ec1d7bf77d77/", nu.String())
}

View File

@ -1,5 +1,4 @@
#!/usr/bin/env -S bash
set -e -o pipefail
#!/usr/bin/env -S bash -e
MODE_FILE="${TMPDIR}/authentik-mode"
function log {
@ -88,6 +87,7 @@ elif [[ "$1" == "bash" ]]; then
elif [[ "$1" == "test-all" ]]; then
prepare_debug
chmod 777 /root
pip install --force-reinstall /wheels/*
check_if_root "python -m manage test authentik"
elif [[ "$1" == "healthcheck" ]]; then
run_authentik healthcheck $(cat $MODE_FILE)

View File

@ -1,5 +1,5 @@
{
"name": "@goauthentik/authentik",
"version": "2024.12.4",
"version": "2024.10.5",
"private": true
}

222
poetry.lock generated
View File

@ -1922,13 +1922,13 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
version = "2.155.0"
version = "2.156.0"
description = "Google API Client Library for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "google_api_python_client-2.155.0-py2.py3-none-any.whl", hash = "sha256:83fe9b5aa4160899079d7c93a37be306546a17e6686e2549bcc9584f1a229747"},
{file = "google_api_python_client-2.155.0.tar.gz", hash = "sha256:25529f89f0d13abcf3c05c089c423fb2858ac16e0b3727543393468d0d7af67c"},
{file = "google_api_python_client-2.156.0-py2.py3-none-any.whl", hash = "sha256:6352185c505e1f311f11b0b96c1b636dcb0fec82cd04b80ac5a671ac4dcab339"},
{file = "google_api_python_client-2.156.0.tar.gz", hash = "sha256:b809c111ded61716a9c1c7936e6899053f13bae3defcdfda904bd2ca68065b9c"},
]
[package.dependencies]
@ -3107,13 +3107,13 @@ dev = ["bumpver", "isort", "mypy", "pylint", "pytest", "yapf"]
[[package]]
name = "msgraph-sdk"
version = "1.14.0"
version = "1.15.0"
description = "The Microsoft Graph Python SDK"
optional = false
python-versions = ">=3.8"
files = [
{file = "msgraph_sdk-1.14.0-py3-none-any.whl", hash = "sha256:1a2f327dc8fbe5a5e6d0d84cf71d605e7b118b3066b1e16f011ccd8fd927bb03"},
{file = "msgraph_sdk-1.14.0.tar.gz", hash = "sha256:5bbda80941c5d1794682753b8b291bd2ebed719a43d6de949fd0cd613b6dfbbd"},
{file = "msgraph_sdk-1.15.0-py3-none-any.whl", hash = "sha256:85332db7ee19eb3d65a2493de83994ce3f5e4d9a084b3643ff6dea797cda81a7"},
{file = "msgraph_sdk-1.15.0.tar.gz", hash = "sha256:c920e72cc9de2218f9f9f71682db22ea544d9b440a5f088892bfca686c546b91"},
]
[package.dependencies]
@ -3881,19 +3881,19 @@ files = [
[[package]]
name = "pydantic"
version = "2.10.3"
version = "2.10.4"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic-2.10.3-py3-none-any.whl", hash = "sha256:be04d85bbc7b65651c5f8e6b9976ed9c6f41782a55524cef079a34a0bb82144d"},
{file = "pydantic-2.10.3.tar.gz", hash = "sha256:cb5ac360ce894ceacd69c403187900a02c4b20b693a9dd1d643e1effab9eadf9"},
{file = "pydantic-2.10.4-py3-none-any.whl", hash = "sha256:597e135ea68be3a37552fb524bc7d0d66dcf93d395acd93a00682f1efcb8ee3d"},
{file = "pydantic-2.10.4.tar.gz", hash = "sha256:82f12e9723da6de4fe2ba888b5971157b3be7ad914267dea8f05f82b28254f06"},
]
[package.dependencies]
annotated-types = ">=0.6.0"
email-validator = {version = ">=2.0.0", optional = true, markers = "extra == \"email\""}
pydantic-core = "2.27.1"
pydantic-core = "2.27.2"
typing-extensions = ">=4.12.2"
[package.extras]
@ -3902,111 +3902,111 @@ timezone = ["tzdata"]
[[package]]
name = "pydantic-core"
version = "2.27.1"
version = "2.27.2"
description = "Core functionality for Pydantic validation and serialization"
optional = false
python-versions = ">=3.8"
files = [
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:71a5e35c75c021aaf400ac048dacc855f000bdfed91614b4a726f7432f1f3d6a"},
{file = "pydantic_core-2.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f82d068a2d6ecfc6e054726080af69a6764a10015467d7d7b9f66d6ed5afa23b"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:121ceb0e822f79163dd4699e4c54f5ad38b157084d97b34de8b232bcaad70278"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4603137322c18eaf2e06a4495f426aa8d8388940f3c457e7548145011bb68e05"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a33cd6ad9017bbeaa9ed78a2e0752c5e250eafb9534f308e7a5f7849b0b1bfb4"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15cc53a3179ba0fcefe1e3ae50beb2784dede4003ad2dfd24f81bba4b23a454f"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45d9c5eb9273aa50999ad6adc6be5e0ecea7e09dbd0d31bd0c65a55a2592ca08"},
{file = "pydantic_core-2.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8bf7b66ce12a2ac52d16f776b31d16d91033150266eb796967a7e4621707e4f6"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:655d7dd86f26cb15ce8a431036f66ce0318648f8853d709b4167786ec2fa4807"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:5556470f1a2157031e676f776c2bc20acd34c1990ca5f7e56f1ebf938b9ab57c"},
{file = "pydantic_core-2.27.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f69ed81ab24d5a3bd93861c8c4436f54afdf8e8cc421562b0c7504cf3be58206"},
{file = "pydantic_core-2.27.1-cp310-none-win32.whl", hash = "sha256:f5a823165e6d04ccea61a9f0576f345f8ce40ed533013580e087bd4d7442b52c"},
{file = "pydantic_core-2.27.1-cp310-none-win_amd64.whl", hash = "sha256:57866a76e0b3823e0b56692d1a0bf722bffb324839bb5b7226a7dbd6c9a40b17"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ac3b20653bdbe160febbea8aa6c079d3df19310d50ac314911ed8cc4eb7f8cb8"},
{file = "pydantic_core-2.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a5a8e19d7c707c4cadb8c18f5f60c843052ae83c20fa7d44f41594c644a1d330"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f7059ca8d64fea7f238994c97d91f75965216bcbe5f695bb44f354893f11d52"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bed0f8a0eeea9fb72937ba118f9db0cb7e90773462af7962d382445f3005e5a4"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a3cb37038123447cf0f3ea4c74751f6a9d7afef0eb71aa07bf5f652b5e6a132c"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:84286494f6c5d05243456e04223d5a9417d7f443c3b76065e75001beb26f88de"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acc07b2cfc5b835444b44a9956846b578d27beeacd4b52e45489e93276241025"},
{file = "pydantic_core-2.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4fefee876e07a6e9aad7a8c8c9f85b0cdbe7df52b8a9552307b09050f7512c7e"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:258c57abf1188926c774a4c94dd29237e77eda19462e5bb901d88adcab6af919"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:35c14ac45fcfdf7167ca76cc80b2001205a8d5d16d80524e13508371fb8cdd9c"},
{file = "pydantic_core-2.27.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d1b26e1dff225c31897696cab7d4f0a315d4c0d9e8666dbffdb28216f3b17fdc"},
{file = "pydantic_core-2.27.1-cp311-none-win32.whl", hash = "sha256:2cdf7d86886bc6982354862204ae3b2f7f96f21a3eb0ba5ca0ac42c7b38598b9"},
{file = "pydantic_core-2.27.1-cp311-none-win_amd64.whl", hash = "sha256:3af385b0cee8df3746c3f406f38bcbfdc9041b5c2d5ce3e5fc6637256e60bbc5"},
{file = "pydantic_core-2.27.1-cp311-none-win_arm64.whl", hash = "sha256:81f2ec23ddc1b476ff96563f2e8d723830b06dceae348ce02914a37cb4e74b89"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9cbd94fc661d2bab2bc702cddd2d3370bbdcc4cd0f8f57488a81bcce90c7a54f"},
{file = "pydantic_core-2.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5f8c4718cd44ec1580e180cb739713ecda2bdee1341084c1467802a417fe0f02"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15aae984e46de8d376df515f00450d1522077254ef6b7ce189b38ecee7c9677c"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1ba5e3963344ff25fc8c40da90f44b0afca8cfd89d12964feb79ac1411a260ac"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:992cea5f4f3b29d6b4f7f1726ed8ee46c8331c6b4eed6db5b40134c6fe1768bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0325336f348dbee6550d129b1627cb8f5351a9dc91aad141ffb96d4937bd9529"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7597c07fbd11515f654d6ece3d0e4e5093edc30a436c63142d9a4b8e22f19c35"},
{file = "pydantic_core-2.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3bbd5d8cc692616d5ef6fbbbd50dbec142c7e6ad9beb66b78a96e9c16729b089"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:dc61505e73298a84a2f317255fcc72b710b72980f3a1f670447a21efc88f8381"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:e1f735dc43da318cad19b4173dd1ffce1d84aafd6c9b782b3abc04a0d5a6f5bb"},
{file = "pydantic_core-2.27.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f4e5658dbffe8843a0f12366a4c2d1c316dbe09bb4dfbdc9d2d9cd6031de8aae"},
{file = "pydantic_core-2.27.1-cp312-none-win32.whl", hash = "sha256:672ebbe820bb37988c4d136eca2652ee114992d5d41c7e4858cdd90ea94ffe5c"},
{file = "pydantic_core-2.27.1-cp312-none-win_amd64.whl", hash = "sha256:66ff044fd0bb1768688aecbe28b6190f6e799349221fb0de0e6f4048eca14c16"},
{file = "pydantic_core-2.27.1-cp312-none-win_arm64.whl", hash = "sha256:9a3b0793b1bbfd4146304e23d90045f2a9b5fd5823aa682665fbdaf2a6c28f3e"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f216dbce0e60e4d03e0c4353c7023b202d95cbaeff12e5fd2e82ea0a66905073"},
{file = "pydantic_core-2.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a2e02889071850bbfd36b56fd6bc98945e23670773bc7a76657e90e6b6603c08"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42b0e23f119b2b456d07ca91b307ae167cc3f6c846a7b169fca5326e32fdc6cf"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:764be71193f87d460a03f1f7385a82e226639732214b402f9aa61f0d025f0737"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c00666a3bd2f84920a4e94434f5974d7bbc57e461318d6bb34ce9cdbbc1f6b2"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3ccaa88b24eebc0f849ce0a4d09e8a408ec5a94afff395eb69baf868f5183107"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c65af9088ac534313e1963443d0ec360bb2b9cba6c2909478d22c2e363d98a51"},
{file = "pydantic_core-2.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:206b5cf6f0c513baffaeae7bd817717140770c74528f3e4c3e1cec7871ddd61a"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:062f60e512fc7fff8b8a9d680ff0ddaaef0193dba9fa83e679c0c5f5fbd018bc"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:a0697803ed7d4af5e4c1adf1670af078f8fcab7a86350e969f454daf598c4960"},
{file = "pydantic_core-2.27.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:58ca98a950171f3151c603aeea9303ef6c235f692fe555e883591103da709b23"},
{file = "pydantic_core-2.27.1-cp313-none-win32.whl", hash = "sha256:8065914ff79f7eab1599bd80406681f0ad08f8e47c880f17b416c9f8f7a26d05"},
{file = "pydantic_core-2.27.1-cp313-none-win_amd64.whl", hash = "sha256:ba630d5e3db74c79300d9a5bdaaf6200172b107f263c98a0539eeecb857b2337"},
{file = "pydantic_core-2.27.1-cp313-none-win_arm64.whl", hash = "sha256:45cf8588c066860b623cd11c4ba687f8d7175d5f7ef65f7129df8a394c502de5"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5897bec80a09b4084aee23f9b73a9477a46c3304ad1d2d07acca19723fb1de62"},
{file = "pydantic_core-2.27.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d0165ab2914379bd56908c02294ed8405c252250668ebcb438a55494c69f44ab"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b9af86e1d8e4cfc82c2022bfaa6f459381a50b94a29e95dcdda8442d6d83864"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f6c8a66741c5f5447e047ab0ba7a1c61d1e95580d64bce852e3df1f895c4067"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a42d6a8156ff78981f8aa56eb6394114e0dedb217cf8b729f438f643608cbcd"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:64c65f40b4cd8b0e049a8edde07e38b476da7e3aaebe63287c899d2cff253fa5"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdcf339322a3fae5cbd504edcefddd5a50d9ee00d968696846f089b4432cf78"},
{file = "pydantic_core-2.27.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf99c8404f008750c846cb4ac4667b798a9f7de673ff719d705d9b2d6de49c5f"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:8f1edcea27918d748c7e5e4d917297b2a0ab80cad10f86631e488b7cddf76a36"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:159cac0a3d096f79ab6a44d77a961917219707e2a130739c64d4dd46281f5c2a"},
{file = "pydantic_core-2.27.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:029d9757eb621cc6e1848fa0b0310310de7301057f623985698ed7ebb014391b"},
{file = "pydantic_core-2.27.1-cp38-none-win32.whl", hash = "sha256:a28af0695a45f7060e6f9b7092558a928a28553366519f64083c63a44f70e618"},
{file = "pydantic_core-2.27.1-cp38-none-win_amd64.whl", hash = "sha256:2d4567c850905d5eaaed2f7a404e61012a51caf288292e016360aa2b96ff38d4"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:e9386266798d64eeb19dd3677051f5705bf873e98e15897ddb7d76f477131967"},
{file = "pydantic_core-2.27.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4228b5b646caa73f119b1ae756216b59cc6e2267201c27d3912b592c5e323b60"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b3dfe500de26c52abe0477dde16192ac39c98f05bf2d80e76102d394bd13854"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aee66be87825cdf72ac64cb03ad4c15ffef4143dbf5c113f64a5ff4f81477bf9"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b748c44bb9f53031c8cbc99a8a061bc181c1000c60a30f55393b6e9c45cc5bd"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ca038c7f6a0afd0b2448941b6ef9d5e1949e999f9e5517692eb6da58e9d44be"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e0bd57539da59a3e4671b90a502da9a28c72322a4f17866ba3ac63a82c4498e"},
{file = "pydantic_core-2.27.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ac6c2c45c847bbf8f91930d88716a0fb924b51e0c6dad329b793d670ec5db792"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b94d4ba43739bbe8b0ce4262bcc3b7b9f31459ad120fb595627eaeb7f9b9ca01"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:00e6424f4b26fe82d44577b4c842d7df97c20be6439e8e685d0d715feceb9fb9"},
{file = "pydantic_core-2.27.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:38de0a70160dd97540335b7ad3a74571b24f1dc3ed33f815f0880682e6880131"},
{file = "pydantic_core-2.27.1-cp39-none-win32.whl", hash = "sha256:7ccebf51efc61634f6c2344da73e366c75e735960b5654b63d7e6f69a5885fa3"},
{file = "pydantic_core-2.27.1-cp39-none-win_amd64.whl", hash = "sha256:a57847b090d7892f123726202b7daa20df6694cbd583b67a592e856bff603d6c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3fa80ac2bd5856580e242dbc202db873c60a01b20309c8319b5c5986fbe53ce6"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d950caa237bb1954f1b8c9227b5065ba6875ac9771bb8ec790d956a699b78676"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e4216e64d203e39c62df627aa882f02a2438d18a5f21d7f721621f7a5d3611d"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a3d637bd387c41d46b002f0e49c52642281edacd2740e5a42f7017feea3f2c"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:161c27ccce13b6b0c8689418da3885d3220ed2eae2ea5e9b2f7f3d48f1d52c27"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:19910754e4cc9c63bc1c7f6d73aa1cfee82f42007e407c0f413695c2f7ed777f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e173486019cc283dc9778315fa29a363579372fe67045e971e89b6365cc035ed"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:af52d26579b308921b73b956153066481f064875140ccd1dfd4e77db89dbb12f"},
{file = "pydantic_core-2.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:981fb88516bd1ae8b0cbbd2034678a39dedc98752f264ac9bc5839d3923fa04c"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5fde892e6c697ce3e30c61b239330fc5d569a71fefd4eb6512fc6caec9dd9e2f"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:816f5aa087094099fff7edabb5e01cc370eb21aa1a1d44fe2d2aefdfb5599b31"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c10c309e18e443ddb108f0ef64e8729363adbfd92d6d57beec680f6261556f3"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98476c98b02c8e9b2eec76ac4156fd006628b1b2d0ef27e548ffa978393fd154"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c3027001c28434e7ca5a6e1e527487051136aa81803ac812be51802150d880dd"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:7699b1df36a48169cdebda7ab5a2bac265204003f153b4bd17276153d997670a"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1c39b07d90be6b48968ddc8c19e7585052088fd7ec8d568bb31ff64c70ae3c97"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:46ccfe3032b3915586e469d4972973f893c0a2bb65669194a5bdea9bacc088c2"},
{file = "pydantic_core-2.27.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:62ba45e21cf6571d7f716d903b5b7b6d2617e2d5d67c0923dc47b9d41369f840"},
{file = "pydantic_core-2.27.1.tar.gz", hash = "sha256:62a763352879b84aa31058fc931884055fd75089cccbd9d58bb6afd01141b235"},
{file = "pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa"},
{file = "pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a"},
{file = "pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9"},
{file = "pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af"},
{file = "pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4"},
{file = "pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31"},
{file = "pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc"},
{file = "pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048"},
{file = "pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474"},
{file = "pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6"},
{file = "pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c"},
{file = "pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc"},
{file = "pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4"},
{file = "pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0"},
{file = "pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2"},
{file = "pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4"},
{file = "pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57"},
{file = "pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc"},
{file = "pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9"},
{file = "pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b"},
{file = "pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b"},
{file = "pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e"},
{file = "pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee"},
{file = "pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1"},
{file = "pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130"},
{file = "pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee"},
{file = "pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b"},
{file = "pydantic_core-2.27.2-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d3e8d504bdd3f10835468f29008d72fc8359d95c9c415ce6e767203db6127506"},
{file = "pydantic_core-2.27.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:521eb9b7f036c9b6187f0b47318ab0d7ca14bd87f776240b90b21c1f4f149320"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85210c4d99a0114f5a9481b44560d7d1e35e32cc5634c656bc48e590b669b145"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d716e2e30c6f140d7560ef1538953a5cd1a87264c737643d481f2779fc247fe1"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f66d89ba397d92f840f8654756196d93804278457b5fbede59598a1f9f90b228"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:669e193c1c576a58f132e3158f9dfa9662969edb1a250c54d8fa52590045f046"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdbe7629b996647b99c01b37f11170a57ae675375b14b8c13b8518b8320ced5"},
{file = "pydantic_core-2.27.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d262606bf386a5ba0b0af3b97f37c83d7011439e3dc1a9298f21efb292e42f1a"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:cabb9bcb7e0d97f74df8646f34fc76fbf793b7f6dc2438517d7a9e50eee4f14d"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:d2d63f1215638d28221f664596b1ccb3944f6e25dd18cd3b86b0a4c408d5ebb9"},
{file = "pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bca101c00bff0adb45a833f8451b9105d9df18accb8743b08107d7ada14bd7da"},
{file = "pydantic_core-2.27.2-cp38-cp38-win32.whl", hash = "sha256:f6f8e111843bbb0dee4cb6594cdc73e79b3329b526037ec242a3e49012495b3b"},
{file = "pydantic_core-2.27.2-cp38-cp38-win_amd64.whl", hash = "sha256:fd1aea04935a508f62e0d0ef1f5ae968774a32afc306fb8545e06f5ff5cdf3ad"},
{file = "pydantic_core-2.27.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:c10eb4f1659290b523af58fa7cffb452a61ad6ae5613404519aee4bfbf1df993"},
{file = "pydantic_core-2.27.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ef592d4bad47296fb11f96cd7dc898b92e795032b4894dfb4076cfccd43a9308"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61709a844acc6bf0b7dce7daae75195a10aac96a596ea1b776996414791ede4"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c5f762659e47fdb7b16956c71598292f60a03aa92f8b6351504359dbdba6cf"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c9775e339e42e79ec99c441d9730fccf07414af63eac2f0e48e08fd38a64d76"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57762139821c31847cfb2df63c12f725788bd9f04bc2fb392790959b8f70f118"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0d1e85068e818c73e048fe28cfc769040bb1f475524f4745a5dc621f75ac7630"},
{file = "pydantic_core-2.27.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:097830ed52fd9e427942ff3b9bc17fab52913b2f50f2880dc4a5611446606a54"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:044a50963a614ecfae59bb1eaf7ea7efc4bc62f49ed594e18fa1e5d953c40e9f"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:4e0b4220ba5b40d727c7f879eac379b822eee5d8fff418e9d3381ee45b3b0362"},
{file = "pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5e4f4bb20d75e9325cc9696c6802657b58bc1dbbe3022f32cc2b2b632c3fbb96"},
{file = "pydantic_core-2.27.2-cp39-cp39-win32.whl", hash = "sha256:cca63613e90d001b9f2f9a9ceb276c308bfa2a43fafb75c8031c4f66039e8c6e"},
{file = "pydantic_core-2.27.2-cp39-cp39-win_amd64.whl", hash = "sha256:77d1bca19b0f7021b3a982e6f903dcd5b2b06076def36a652e3907f596e29f67"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9"},
{file = "pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:c33939a82924da9ed65dab5a65d427205a73181d8098e79b6b426bdf8ad4e656"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:00bad2484fa6bda1e216e7345a798bd37c68fb2d97558edd584942aa41b7d278"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c817e2b40aba42bac6f457498dacabc568c3b7a986fc9ba7c8d9d260b71485fb"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:251136cdad0cb722e93732cb45ca5299fb56e1344a833640bf93b2803f8d1bfd"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d2088237af596f0a524d3afc39ab3b036e8adb054ee57cbb1dcf8e09da5b29cc"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d4041c0b966a84b4ae7a09832eb691a35aec90910cd2dbe7a208de59be77965b"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:8083d4e875ebe0b864ffef72a4304827015cff328a1be6e22cc850753bfb122b"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f141ee28a0ad2123b6611b6ceff018039df17f32ada8b534e6aa039545a3efb2"},
{file = "pydantic_core-2.27.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7d0c8399fcc1848491f00e0314bd59fb34a9c008761bcb422a057670c3f65e35"},
{file = "pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39"},
]
[package.dependencies]

View File

@ -1,6 +1,6 @@
[tool.poetry]
name = "authentik"
version = "2024.12.4"
version = "2024.10.5"
description = ""
authors = ["authentik Team <hello@goauthentik.io>"]

View File

@ -1,7 +1,7 @@
openapi: 3.0.3
info:
title: authentik
version: 2024.12.4
version: 2024.10.5
description: Making authentication simple.
contact:
email: hello@goauthentik.io

View File

@ -1,29 +0,0 @@
#!/bin/bash
set -e -x -o pipefail
hash="$(git rev-parse HEAD || openssl rand -base64 36)"
AUTHENTIK_IMAGE="xghcr.io/goauthentik/server"
AUTHENTIK_TAG="$(echo "$hash" | cut -c1-15)"
if [ -f .env ]; then
echo "Existing .env file, aborting"
exit 1
fi
echo PG_PASS="$(openssl rand -base64 36 | tr -d '\n')" >.env
echo AUTHENTIK_SECRET_KEY="$(openssl rand -base64 60 | tr -d '\n')" >>.env
echo AUTHENTIK_IMAGE="${AUTHENTIK_IMAGE}" >>.env
echo AUTHENTIK_TAG="${AUTHENTIK_TAG}" >>.env
export COMPOSE_PROJECT_NAME="authentik-test-${AUTHENTIK_TAG}"
# Ensure buildx is installed
docker buildx install
# For release builds we have an empty client here as we use the NPM package
mkdir -p ./gen-ts-api
touch .env
docker build -t "${AUTHENTIK_IMAGE}:${AUTHENTIK_TAG}" .
docker compose up --no-start
docker compose start postgresql redis
docker compose run -u root server test-all
docker compose down -v

View File

@ -46,7 +46,7 @@ export class OutpostServiceConnectionListPage extends TablePage<ServiceConnectio
const connections = await new OutpostsApi(DEFAULT_CONFIG).outpostsServiceConnectionsAllList(
await this.defaultEndpointConfig(),
);
await Promise.all(
Promise.all(
connections.results.map((connection) => {
return new OutpostsApi(DEFAULT_CONFIG)
.outpostsServiceConnectionsAllStateRetrieve({

View File

@ -291,7 +291,7 @@ export function renderForm(
${showHttpBasic ? renderHttpBasic(provider) : nothing}
<ak-form-element-horizontal
label=${msg("Federated OIDC Sources")}
label=${msg("Trusted OIDC Sources")}
name="jwtFederationSources"
>
<ak-dual-select-dynamic-selected

View File

@ -131,10 +131,9 @@ export class UserListPage extends WithBrandConfig(WithCapabilitiesConfig(TablePa
constructor() {
super();
const defaultPath = new DefaultUIConfig().defaults.userPath;
this.activePath = getURLParam<string>("path", defaultPath);
this.activePath = getURLParam<string>("path", "/");
uiConfig().then((c) => {
if (c.defaults.userPath !== defaultPath) {
if (c.defaults.userPath !== new DefaultUIConfig().defaults.userPath) {
this.activePath = c.defaults.userPath;
}
});

View File

@ -3,7 +3,7 @@ export const SUCCESS_CLASS = "pf-m-success";
export const ERROR_CLASS = "pf-m-danger";
export const PROGRESS_CLASS = "pf-m-in-progress";
export const CURRENT_CLASS = "pf-m-current";
export const VERSION = "2024.12.4";
export const VERSION = "2024.10.5";
export const TITLE_DEFAULT = "authentik";
export const ROUTE_SEPARATOR = ";";

View File

@ -18,12 +18,6 @@ export class LoadingOverlay extends AKElement implements ILoadingOverlay {
@property({ type: Boolean, attribute: "topmost" })
topmost = false;
@property({ type: Boolean })
loading = true;
@property({ type: String })
icon = "";
static get styles() {
return [
PFBase,
@ -46,7 +40,7 @@ export class LoadingOverlay extends AKElement implements ILoadingOverlay {
}
render() {
return html`<ak-empty-state ?loading=${this.loading} header="" icon=${this.icon}>
return html`<ak-empty-state loading header="">
<span slot="body"><slot></slot></span>
</ak-empty-state>`;
}

View File

@ -3,7 +3,6 @@ import { AKElement } from "@goauthentik/elements/Base";
import { WithBrandConfig } from "@goauthentik/elements/Interface/brandProvider";
import { themeImage } from "@goauthentik/elements/utils/images";
import { msg } from "@lit/localize";
import { CSSResult, TemplateResult, css, html } from "lit";
import { customElement } from "lit/decorators.js";
@ -87,7 +86,7 @@ export class SidebarBrand extends WithBrandConfig(AKElement) {
<div class="pf-c-brand ak-brand">
<img
src=${themeImage(this.brand?.brandingLogo ?? DefaultBrand.brandingLogo)}
alt="${msg("authentik Logo")}"
alt="authentik Logo"
loading="lazy"
/>
</div>

View File

@ -4,7 +4,7 @@ import "@goauthentik/elements/LoadingOverlay";
import Guacamole from "guacamole-common-js";
import { msg, str } from "@lit/localize";
import { CSSResult, TemplateResult, css, html, nothing } from "lit";
import { CSSResult, TemplateResult, css, html } from "lit";
import { customElement, property, state } from "lit/decorators.js";
import AKGlobal from "@goauthentik/common/styles/authentik.css";
@ -21,23 +21,6 @@ enum GuacClientState {
DISCONNECTED = 5,
}
export function GuacStateToString(state: GuacClientState): string {
switch (state) {
case GuacClientState.IDLE:
return msg("Idle");
case GuacClientState.CONNECTING:
return msg("Connecting");
case GuacClientState.WAITING:
return msg("Waiting");
case GuacClientState.CONNECTED:
return msg("Connected");
case GuacClientState.DISCONNECTING:
return msg("Disconnecting");
case GuacClientState.DISCONNECTED:
return msg("Disconnected");
}
}
const AUDIO_INPUT_MIMETYPE = "audio/L16;rate=44100,channels=2";
const RECONNECT_ATTEMPTS_INITIAL = 5;
const RECONNECT_ATTEMPTS = 5;
@ -81,9 +64,6 @@ export class RacInterface extends Interface {
@state()
clientState?: GuacClientState;
@state()
clientStatus?: Guacamole.Status;
@state()
reconnectingMessage = "";
@ -158,7 +138,6 @@ export class RacInterface extends Interface {
};
this.client = new Guacamole.Client(this.tunnel);
this.client.onerror = (err) => {
this.clientStatus = err;
console.debug("authentik/rac: error: ", err);
this.reconnect();
};
@ -196,10 +175,6 @@ export class RacInterface extends Interface {
const params = new URLSearchParams();
params.set("screen_width", Math.floor(RacInterface.domSize().width).toString());
params.set("screen_height", Math.floor(RacInterface.domSize().height).toString());
// https://github.com/goauthentik/authentik/pull/11757
// there are DPI issues when using SSH on HiDPi screens
// but if we're not setting DPI at all the resolution is not respected at all
params.set("screen_dpi", "96");
this.client.connect(params.toString());
}
@ -246,7 +221,6 @@ export class RacInterface extends Interface {
return;
}
this.hasConnected = true;
this.clientStatus = undefined;
this.container = this.client.getDisplay().getElement();
this.initMouse(this.container);
this.client?.sendSize(
@ -336,35 +310,19 @@ export class RacInterface extends Interface {
console.debug("authentik/rac: Sent clipboard");
}
renderOverlay() {
if (!this.clientState || this.clientState == GuacClientState.CONNECTED) {
return nothing;
}
let message = html`${GuacStateToString(this.clientState)}`;
if (this.clientState == GuacClientState.WAITING) {
message = html`${msg("Connecting...")}`;
}
if (this.hasConnected) {
message = html`${this.reconnectingMessage}`;
}
if (this.clientStatus?.message) {
message = html`${message}<br />${this.clientStatus.message}`;
}
const isLoading = [
GuacClientState.CONNECTING,
GuacClientState.DISCONNECTING,
GuacClientState.WAITING,
].includes(this.clientState);
return html`
<ak-loading-overlay ?loading=${isLoading} icon="fa fa-times">
<span> ${message} </span>
</ak-loading-overlay>
`;
}
render(): TemplateResult {
return html`
${this.renderOverlay()}
${this.clientState !== GuacClientState.CONNECTED
? html`
<ak-loading-overlay>
<span>
${this.hasConnected
? html`${this.reconnectingMessage}`
: html`${msg("Connecting...")}`}
</span>
</ak-loading-overlay>
`
: html``}
<div class="container">${this.container}</div>
`;
}

View File

@ -511,7 +511,7 @@ export class FlowExecutor extends Interface implements StageHost {
DefaultBrand.brandingLogo,
),
)}"
alt="${msg("authentik Logo")}"
alt="authentik Logo"
/>
</div>
${until(this.renderChallenge())}

View File

@ -8,7 +8,6 @@ import { DefaultBrand } from "@goauthentik/elements/sidebar/SidebarBrand";
import { themeImage } from "@goauthentik/elements/utils/images";
import "rapidoc";
import { msg } from "@lit/localize";
import { CSSResult, TemplateResult, css, html } from "lit";
import { customElement, property, state } from "lit/decorators.js";
import { ifDefined } from "lit/directives/if-defined.js";
@ -103,7 +102,7 @@ export class APIBrowser extends Interface {
>
<div slot="nav-logo">
<img
alt="${msg("authentik Logo")}"
alt="authentik Logo"
class="logo"
src="${themeImage(
first(this.brand?.brandingLogo, DefaultBrand.brandingLogo),

View File

@ -20,21 +20,13 @@ To add an application to authentik and have it display on users' **My applicatio
2. Click **Create with Wizard**. (Alternatively, use our legacy process and click **Create**. The legacy process requires that the application and its authentication provider be configured separately.)
3. In the **New application** wizard, define the application details, the provider type, and the bindings for the application.
3. In the **New application** wizard, define the application details, the provider type and configuration, and then click **Submit**.
- **Application**: provide a name, an optional group for the type of application, the policy engine mode, and optional UI settings.
4. To manage the display of the new application on the **My applications** page, you can optionally define the bindings for a specific policy, group, or user. Note that if you do not define bindings, then all users have access to the application. For more information, refer to [authorization](#authorization).
- **Choose a Provider**: select the provider types for this application.
## Authorization
- **Configure a Provider**: provide a name (or accept the auto-provided name), the authorization flow to use for this provider, and any additional required configurations.
- **Configure Bindings**: to manage the listing and access to applications on a user's **My applications** page, you can optionally create a [binding](../flows-stages/bindings/index.md) between the application and a specific policy, group, or user. Note that if you do not define any bindings, then all users have access to the application. For more information about user access, refer to our documentation about [authorization](#policy-driven-authorization) and [hiding an application](#hide-applications).
4. On the **Review and Submit Application** panel, review the configuration for the new application and its provider, and then click **Submit**.
## Policy-driven authorization
To use a [policy](../../customize/policies/index.md) to control which users or groups can access an application, click on an application in the applications list and then select the **Policy/Group/User Bindings** tab. There you can bind users/groups/policies to grant them access. When nothing is bound, everyone has access. Binding a policy restricts access to specific users or groups, or applies other custom policies, such as restricting access to a set time of day or a geographic region.
Application access can be configured using (Policy) bindings. Click on an application in the applications list, and select the _Policy / Group / User Bindings_ tab. There you can bind users/groups/policies to grant them access. When nothing is bound, everyone has access. You can use this to grant access to one or multiple users/groups, or dynamically give access using policies.
By default, all users can access applications when no policies are bound.
@ -43,49 +35,6 @@ When multiple policies/groups/users are attached, you can configure the _Policy
- Require users to pass all bindings/be member of all groups (ALL), or
- Require users to pass either binding/be member of either group (ANY)
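As a concrete illustration of such a restriction, the sketch below shows a minimal expression policy that could be bound to an application. It assumes authentik's documented expression-policy context (the `request` object and the `ak_is_group_member` helper); the group name is purely hypothetical.
```python
# Minimal expression-policy sketch: grant access only to members of a
# hypothetical "app-admins" group. `request` and `ak_is_group_member`
# come from authentik's expression-policy evaluation context.
return ak_is_group_member(request.user, name="app-admins")
```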
## Application Entitlements
<span class="badge badge--preview">Preview</span>
<span class="badge badge--version">authentik 2024.12+</span>
Application entitlements can be used through authentik to manage authorization within an application (what areas of the app users or groups can access). Entitlements are scoped to a single application and can be bound to multiple users and/or groups (binding policies is not currently supported), giving them access to the entitlement. An application can check either the name of the entitlement (via the `entitlements` scope) or the attributes stored in entitlements.
An authentik admin can create an entitlement [in the Admin interface](#create-an-application-entitlement) or using the [authentik API](../../developer-docs/api/api.md).
Because entitlements exist within an application, names of entitlements must be unique within an application. This also means that entitlements are deleted when an application is deleted.
### Using entitlements
Entitlements to which a user has access can be retrieved using the `user.app_entitlements()` function in property mappings/policies. This function needs to be passed the specific application for which to get the entitlements. For example:
```python
entitlements = [entitlement.name for entitlement in request.user.app_entitlements(provider.application)]
return {
"entitlements": entitlements,
}
```
### Attributes
Each entitlement can store attributes similar to user and group attributes. These attributes can be accessed in property mappings and passed to applications via `user.app_entitlements_attributes`. For example:
```python
attrs = request.user.app_entitlements_attributes(provider.application)
return {
"my_attr": attrs.get("my_attr")
}
```
### Create an application entitlement
1. Open the Admin interface and navigate to **Applications -> Applications**.
2. Click the name of the application for which you want to create an entitlement.
3. Click the **Application entitlements** tab at the top of the page, and then click **Create entitlement**. Provide a name for the entitlement, enter any optional **Attributes**, and then click **Create**.
4. In the list locate the entitlement to which you want to bind a user or group, and then **click the caret (>) to expand the entitlement details.**
5. In the expanded area, click **Bind existing Group/User**.
6. In the **Create Binding** modal box, select either the tab for **Group** or **User**, and then in the drop-down list, select the group or user.
7. Optionally, configure additional settings for the binding, and then click **Create** to create the binding and close the modal box.
## Hide applications
To hide an application without modifying its policy settings or removing it, you can simply set the _Launch URL_ to `blank://blank`, which will hide the application from users.

View File

@ -1,33 +0,0 @@
---
title: Bindings
---
A binding is, simply put, a connection between two components (a flow, stage, policy, user, or group). The use of a binding adds functionality to one of those existing components; for example, a policy binding can cause a new stage to be presented within a flow to a specific user or group.
:::info
For information about creating and managing bindings, refer to [Working with bindings](./work_with_bindings.md).
:::
Bindings are an important part of authentik; the majority of configuration options are set in bindings.
Bindings are analyzed by authentik's Flow Plan, which starts with the flow, then assesses all of the bound policies, and then runs them in order to build out the plan.
The most common types of bindings in authentik are:
- stage bindings
- policy bindings
- user and group bindings
A _stage binding_ connects a stage to a flow. The "additional content" (i.e. the content in the stage) is now added to the flow.
A _policy binding_ connects a specific policy to a flow or to a stage. With the binding, the flow (or stage) will now have additional content (i.e. the policy rules).
You can also bind groups and users to another component (a policy, a stage, a flow, etc.). For example, you can create a binding for a specific group, and then [bind that to a stage binding](../stages/index.md#bind-users-and-groups-to-a-flows-stage-binding), with the result that everyone in that group now will see that stage (and any policies bound to that stage) as part of their flow. Or more specifically, and going one step deeper, you can also _bind a binding to a binding_.
Bindings are also used for [Application Entitlements](../../applications/manage_apps.md#application-entitlements), where you can bind specific users or groups to an application as a way to manage who has access to the application.
It's important to remember that bindings are instantiated objects themselves, and conceptually can be considered as a "connector" between two components. This is why you might read about "binding a binding", because technically, a binding is "spliced" into another binding, in order to intercept and enforce the criteria defined in the second binding.
:::info
Be aware that some stages and flows do not allow user or group bindings, because in certain scenarios (authentication or enrollment), the flow plan doesn't yet know who the user or group is.
:::

View File

@ -1,13 +0,0 @@
---
title: Work with bindings
---
As covered in the [overview](./index.md), bindings interact with many other components.
For instructions to create a binding, refer to the documentation for the specific components:
- [Bind a stage to a flow](../stages/index.md#bind-a-stage-to-a-flow)
- [Bind a policy to a flow or stage](../../../customize/policies/working_with_policies#bind-a-policy-to-a-flow-or-stage)
- [Bind users or groups to a specific application with an Application Entitlement](../../applications/manage_apps.md#application-entitlements)
- [Bind a policy to a specific application when you create a new app using the Wizard](../../applications/manage_apps.md#instructions)
- [Bind users and groups to a stage binding, to define whether or not that stage is shown](../stages/index.md#bind-users-and-groups-to-a-flows-stage-binding)

Binary file not shown.


View File

@ -54,24 +54,3 @@ To bind a stage to a flow, follow these steps:
3. In the list of flows, click the name of the flow to which you want to bind one or more stages.
4. On the Flow page, click the **Stage Bindings** tab at the top.
5. Here, you can decide if you want to create a new stage and bind it to the flow (**Create and bind Stage**), or if you want to select an existing stage and bind it to the flow (**Bind existing stage**).
## Bind users and groups to a flow's stage binding
You can use bindings to determine whether or not a stage is presented to a single user or any users within a group. You do this by binding the user or group to a stage binding within a specific flow. For example, if you have a flow that contains a stage that prompts the user for multi-factor authentication, but you only want certain users to see this stage (and fulfill the MFA prompt), then you would bind the appropriate group (or single user) to the stage binding for that flow.
To bind a user or a group to a stage binding for a specific flow, follow these steps:
1. Log in as an admin to authentik, and go to the Admin interface.
2. In the Admin interface, navigate to **Flows and Stages -> Flows**.
3. In the list of flows, click the name of the flow to which you want to bind one or more stages.
4. On the Flow page, click the **Stage Bindings** tab at the top.
5. Locate the stage binding to which you want to bind a user or group, and then **click the caret (>) to expand the stage binding details.**
![](./edit_stage_binding.png)
6. In the expanded area, click **Bind existing policy/group/user**.
7. In the **Create Binding** modal box, select either the tab for **Group** or **User**.
8. In the drop-down list, select the group or user.
9. Optionally, configure additional settings for the binding, and then click **Create** to create the binding and close the modal box.
Learn more about [bindings](../bindings/index.md) and [working with them](../bindings/work_with_bindings.md).

View File

@ -12,6 +12,7 @@ For VS Code, for example, add these entries to your `settings.json`:
"!Enumerate sequence",
"!Env scalar",
"!Find sequence",
"!FindObject sequence",
"!Format sequence",
"!If sequence",
"!Index scalar",
@ -60,7 +61,22 @@ configure_flow:
]
```
Looks up any model and resolves to the the matches' primary key.
Looks up any model and resolves to the matches' primary key.
The first argument is the model to be queried; the remaining arguments are expected to be key=value pairs to query for.
#### `!FindObject` <span class="badge badge--version">authentik 2025.2+</span>
Examples:
```yaml
flow_designation:
!AtIndex [
!FindObject [authentik_flows.flow, [slug, default-password-change]],
designation,
]
```
Looks up any model and resolves to the matches' serialized data.
The first argument is the model to be queried; the remaining arguments are expected to be key=value pairs to query for.
#### `!Context`

View File

@ -9,7 +9,7 @@ The main settings that brands influence are flows and branding.
## Flows
You can explicitly select, in your instance's Brand settings, the _default flows_ to use for the current brand. To do so, log in as an administrator, open the Admin interface, and navigate to **System -> Brands**. There you can optionally configure these default flows:
You can explicitly select, in your instance's Brand settings, the default flow to use for the following configurations:
- Authentication flow: the flow used to authenticate users. If left empty, the first applicable flow sorted by the slug is used.
- Invalidation flow: for typical use cases, select the `default-invalidation-flow` (Logout) flow. This flow logs the user out of authentik when the application session ends (user logs out of the app).

View File

@ -8,8 +8,6 @@ authentik provides several [standard policy types](./index.md#standard-policies)
We also document how to use a policy to [whitelist email domains](./expression/whitelist_email.md) and to [ensure unique email addresses](./expression/unique_email.md).
To learn more, see [bindings](../../add-secure-apps/flows-stages/bindings/index.md) and how to use the [authentik Wizard to bind policy bindings to the new application](../../add-secure-apps/applications/manage_apps.md#add-new-applications) (for example, to configure application-specific access).
## Create a policy
To create a new policy, follow these steps:
@ -24,7 +22,7 @@ To create a new policy, follow these steps:
After creating the policy, you can bind it to either a [flow](../../add-secure-apps/flows-stages/flow/index.md) or to a [stage](../../add-secure-apps/flows-stages/stages/index.md).
:::info
Bindings are instantiated objects themselves, and conceptually can be considered as the "connector" between the policy and the stage or flow. This is why you might read about "binding a binding", because technically, a binding is "spliced" into another binding, in order to intercept and enforce the criteria defined in the policy. To learn more refer to our [Bindings documentation](../../add-secure-apps/flows-stages/bindings/index.md).
Bindings are instantiated objects themselves, and conceptually can be considered as the "connector" between the policy and the stage or flow. This is why you might read about "binding a binding", because technically, a binding is "spliced" into another binding, in order to intercept and enforce the criteria defined in the policy. You can edit bindings on a flow's **Stage Bindings** tab.
:::
### Bind a policy to a flow

View File

@ -27,6 +27,8 @@ AUTHENTIK_TAG=gh-next
AUTHENTIK_OUTPOSTS__CONTAINER_IMAGE_BASE=ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
```
The Beta image is amd64 only. For arm64 platforms, append `-arm64` to the tag name (no spaces).
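For example, building on the `gh-next` tag shown above, an arm64 configuration would look like this (a sketch; substitute whichever tag you are using):

```shell
AUTHENTIK_TAG=gh-next-arm64
```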
Next, run the upgrade commands below.
</TabItem>
@ -45,6 +47,8 @@ image:
pullPolicy: Always
```
The Beta image is amd64 only. For arm64 platforms, append `-arm64` to the tag name (no spaces).
Next, run the upgrade commands below.
</TabItem>

View File

@ -70,17 +70,14 @@ To check if your config has been applied correctly, you can run the following co
- `AUTHENTIK_POSTGRESQL__USER`: Database user
- `AUTHENTIK_POSTGRESQL__PORT`: Database port, defaults to 5432
- `AUTHENTIK_POSTGRESQL__PASSWORD`: Database password, defaults to the environment variable `POSTGRES_PASSWORD`
- `AUTHENTIK_POSTGRESQL__USE_PGBOUNCER`: Adjust configuration to support connection to PgBouncer. Deprecated, see below
- `AUTHENTIK_POSTGRESQL__USE_PGPOOL`: Adjust configuration to support connection to Pgpool. Deprecated, see below
- `AUTHENTIK_POSTGRESQL__USE_PGBOUNCER`: Adjust configuration to support connection to PgBouncer
- `AUTHENTIK_POSTGRESQL__USE_PGPOOL`: Adjust configuration to support connection to Pgpool
- `AUTHENTIK_POSTGRESQL__SSLMODE`: Strictness of ssl verification. Defaults to `"verify-ca"`
- `AUTHENTIK_POSTGRESQL__SSLROOTCERT`: CA root for server ssl verification
- `AUTHENTIK_POSTGRESQL__SSLCERT`: Path to x509 client certificate to authenticate to server
- `AUTHENTIK_POSTGRESQL__SSLKEY`: Path to private key of `SSLCERT` certificate
- `AUTHENTIK_POSTGRESQL__CONN_MAX_AGE`: Database connection lifetime. Defaults to `0` (no persistent connections). Can be set to `null` for unlimited persistent connections. See [Django's documentation](https://docs.djangoproject.com/en/stable/ref/settings/#conn-max-age) for more details.
- `AUTHENTIK_POSTGRESQL__CONN_HEALTH_CHECK`: Existing persistent database connections will be health checked before they are reused if set to `true`. Defaults to `false`. See [Django's documentation](https://docs.djangoproject.com/en/stable/ref/settings/#conn-health-checks) for more details.
- `AUTHENTIK_POSTGRESQL__DISABLE_SERVER_SIDE_CURSORS`: Disable server side cursors when set to `true`. Defaults to `false`. See [Django's documentation](https://docs.djangoproject.com/en/stable/ref/settings/#disable-server-side-cursors) for more details.
The PostgreSQL settings `HOST`, `PORT`, `USER`, and `PASSWORD` support hot-reloading. Adding and removing read replicas doesn't support hot-reloading.
All PostgreSQL settings, apart from `USE_PGBOUNCER` and `USE_PGPOOL`, support hot-reloading. Adding and removing read replicas doesn't support hot-reloading.
### Read replicas
@ -99,25 +96,8 @@ The same PostgreSQL settings as described above are used for each read replica.
- `AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__SSLROOTCERT`
- `AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__SSLCERT`
- `AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__SSLKEY`
- `AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__CONN_MAX_AGE`
- `AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__CONN_HEALTH_CHECK`
- `AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__DISABLE_SERVER_SIDE_CURSORS`
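As an illustrative sketch (the host name and credentials are placeholders), a first read replica could be configured as follows; further replicas use the `__1__`, `__2__`, ... prefixes:

```shell
# Read replica with index 0; the same keys as the main database settings are available
AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__HOST=postgres-replica.example.com
AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__NAME=authentik
AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__USER=authentik
AUTHENTIK_POSTGRESQL__READ_REPLICAS__0__PASSWORD=change-me
```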
### Using a PostgreSQL connection pooler (PgBouncer or PgPool)
When your PostgreSQL database(s) are running behind a connection pooler, like PgBouncer or PgPool, two settings need to be overridden:
- `AUTHENTIK_POSTGRESQL__CONN_MAX_AGE`
A connection pooler running in session pool mode (PgBouncer default) can be incompatible with unlimited persistent connections enabled by setting this to `null`: If the connection from the connection pooler to the database server is dropped, the connection pooler will wait for the client to disconnect before releasing the connection; however this will **never** happen as authentik is configured to keep the connection to the connection pooler forever.
To address this incompatibility, either configure the connection pooler to run in transaction pool mode, or update this setting to a value lower than any timeouts that may cause the connection to the database to be dropped (as low as `0`).
- `AUTHENTIK_POSTGRESQL__DISABLE_SERVER_SIDE_CURSORS`
Using a connection pooler in transaction pool mode (e.g. PgPool, or PgBouncer in transaction or statement pool mode) requires disabling server-side cursors, so this setting must be set to `true`.
Additionally, you can set `AUTHENTIK_POSTGRESQL__CONN_HEALTH_CHECK` to `true` to perform health checks on persistent database connections before they are re-used.
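As a sketch of the settings discussed above (the host name is a placeholder), pointing authentik at a pooler running in transaction pool mode could look like this:

```shell
# Placeholder host name: the connection pooler, not PostgreSQL itself
AUTHENTIK_POSTGRESQL__HOST=pgbouncer.example.com
# Transaction pool mode requires server-side cursors to be disabled
AUTHENTIK_POSTGRESQL__DISABLE_SERVER_SIDE_CURSORS=true
# Keep connections short-lived; any value below the relevant timeouts works, as low as 0
AUTHENTIK_POSTGRESQL__CONN_MAX_AGE=0
# Optionally health-check persistent connections before they are re-used
AUTHENTIK_POSTGRESQL__CONN_HEALTH_CHECK=true
```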
Note that `USE_PGBOUNCER` and `USE_PGPOOL` are inherited from the main database configuration and are _not_ overridable on read replicas.
## Redis Settings

View File

@ -24,7 +24,7 @@ Parameters:
Description: authentik server memory in MiB
Type: Number
AuthentikVersion:
Default: 2024.12.4
Default: 2024.10.5
Description: authentik Docker image tag
Type: String
AuthentikWorkerCPU:

View File

@ -3,6 +3,12 @@ title: Release 2024.12
slug: "/releases/2024.12"
---
:::::note
2024.12 has not been released yet! We're publishing these release notes as a preview of what's to come, and for our awesome beta testers trying out release candidates.
To try out the release candidate, replace your Docker image tag with the latest release candidate number, such as 2024.12.0-rc1. You can find the latest one in [the latest releases on GitHub](https://github.com/goauthentik/authentik/releases). If you don't find any, it means we haven't released one yet.
:::::
## Highlights
- **Redirect stage** Conditionally redirect users to other flows and URLs.
@ -18,16 +24,6 @@ slug: "/releases/2024.12"
You can disable this behavior in the **Admin interface** under **System** > **Settings**.
- **Deprecated PostgreSQL `USE_PGBOUNCER` and `USE_PGPOOL` settings**
With this release, the `AUTHENTIK_POSTGRESQL__USE_PGBOUNCER` and `AUTHENTIK_POSTGRESQL__USE_PGPOOL` settings have been deprecated in favor of exposing the underlying database settings: `AUTHENTIK_POSTGRESQL__CONN_MAX_AGE` and `AUTHENTIK_POSTGRESQL__DISABLE_SERVER_SIDE_CURSORS`.
If you are using PgBouncer or PgPool as connection poolers and wish to maintain the same behavior as previous versions, `AUTHENTIK_POSTGRESQL__DISABLE_SERVER_SIDE_CURSORS` must be set to `true`. Moreover, if you are using PgBouncer, `AUTHENTIK_POSTGRESQL__CONN_MAX_AGE` must be set to `null`.
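In environment-variable form, keeping the previous behavior could look like this (a sketch for a PgBouncer setup):

```shell
# Replaces the deprecated AUTHENTIK_POSTGRESQL__USE_PGBOUNCER=true
AUTHENTIK_POSTGRESQL__DISABLE_SERVER_SIDE_CURSORS=true
AUTHENTIK_POSTGRESQL__CONN_MAX_AGE=null
```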
The newly exposed settings allow supporting a wider set of connection pooler configurations. For details on how these settings interact with different configurations of connection poolers, please refer to the [PostgreSQL documentation](../../install-config/configuration/configuration.mdx#postgresql-settings).
These settings will be removed in a future version.
## New features
- **Redirect stage**
@ -96,7 +92,6 @@ helm upgrade authentik authentik/authentik -f values.yaml --version ^2024.12
- enterprise/rac: fix API Schema for invalidation_flow (#11907)
- enterprise/stages/authenticator_endpoint_gdtc: don't set frame options globally (#12311)
- enterprise: allow deletion/modification of users when in read-only mode (#12289)
- events: notification_cleanup: avoid unnecessary loop (cherry-pick #12417) (#12418)
- flows: better test stage's challenge responses (#12316)
- flows: silent authz flow (#12213)
- internal: add CSP header to files in `/media` (#12092)
@ -117,7 +112,6 @@ helm upgrade authentik authentik/authentik -f values.yaml --version ^2024.12
- providers/scim: accept string and int for SCIM IDs (#12093)
- rbac: fix incorrect object_description for object-level permissions (#12029)
- root: check remote IP for proxy protocol same as HTTP/etc (#12094)
- root: expose CONN_MAX_AGE, CONN_HEALTH_CHECKS and DISABLE_SERVER_SIDE_CURSORS for PostgreSQL config (cherry-pick #10159) (#12419)
- root: fix activation of locale not being scoped (#12091)
- root: fix database ssl options not set correctly (#12180)
- root: fix health status code (#12255)

View File

@ -1,23 +0,0 @@
# CVE-2025-29928
## Deletion of sessions did not revoke sessions when using database session storage
### Summary
When authentik was configured to use the database for session storage (which is a non-default setting), deleting sessions via the Web Interface or the API would not revoke the session and the session holder would continue to have access to authentik.
This also affects automatic session deletion when a user is set to inactive or a user is deleted.
### Patches
authentik 2025.2.3 and 2024.12.4 fix this issue.
### Workarounds
Switching to the cache-based session storage until the authentik instance can be upgraded is recommended.
### For more information
If you have any questions or comments about this advisory:
- Email us at [security@goauthentik.io](mailto:security@goauthentik.io).

View File

@ -57,10 +57,6 @@ When enabled, all the events caused by a user will be deleted upon the user's de
Globally enable/disable impersonation. Defaults to `true`.
### Require reason for impersonation
Require administrators to provide a reason for impersonating a user. Defaults to `true`.
### Default token duration
Default duration for generated tokens. Defaults to `minutes=30`.

View File

@ -105,7 +105,7 @@ If the user does not receive the email, check if the mail server parameters [are
As an Admin, you can simply reset the password for the user.
1. In the Admin interface, navigate to **Directory > Users** to display all users.
2. Either click the name of the user to display the full User details page, or click the chevron beside their name to expand the options.
3. To reset the user's password, click **Reset password**, and then define the new value.
## Deactivate or Delete user
@ -128,18 +128,3 @@ You may instead deactivate the account to preserve identity data.
2. Review the changes and click **Delete**.
The user list refreshes and no longer displays the removed users.
## Impersonate a user
With authentik, an Admin can impersonate a user, meaning that the Admin temporarily assumes the identity of the user.
1. In the Admin interface, navigate to **Directory > Users** to display all users.
2. Click the name of the user to display the full User details page.
3. On the Overview tab, beneath **User Details**, in the **Actions** area, click **Impersonate**.
4. At the prompt, provide a reason why you are impersonating this user, and then click **Impersonate**.
:::info
An Admin can globally enable or disable impersonation in the [System Settings](../../sys-mgmt/settings.md#impersonation). By default, this option is set to true, meaning all users can be impersonated.
An Admin can also configure whether inputting a reason for impersonation is required in the [System Settings](../../sys-mgmt/settings.md#require-reason-for-impersonation).
:::

View File

@ -306,17 +306,6 @@ export default {
"add-secure-apps/flows-stages/stages/user_write",
],
},
{
type: "category",
label: "Bindings",
link: {
type: "doc",
id: "add-secure-apps/flows-stages/bindings/index",
},
items: [
"add-secure-apps/flows-stages/bindings/work_with_bindings",
],
},
],
},
{
@ -674,11 +663,6 @@ export default {
type: "category",
label: "CVEs",
items: [
{
type: "category",
label: "2024",
items: ["security/cves/CVE-2025-29928"],
},
{
type: "category",
label: "2024",