Compare commits


14 Commits

Author SHA1 Message Date
2de11f8a69 release: 2025.2.0-rc2 2025-02-20 23:47:15 +01:00
b2dcf94aba policies/geoip: fix math in impossible travel (cherry-pick #13141) (#13145)
policies/geoip: fix math in impossible travel (#13141)

* policies/geoip: fix math in impossible travel



* fix threshold



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-20 23:46:21 +01:00
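For context on the fix above, a minimal sketch of the kind of calculation an impossible-travel policy performs: compute the apparent speed between two consecutive login locations and compare it against a threshold. The function names, tuple layout, and the 1000 km/h threshold are illustrative assumptions, not the actual implementation from #13141.

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two coordinates, in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def is_impossible_travel(prev: tuple, curr: tuple, threshold_kmh: float = 1000.0) -> bool:
    """prev/curr are (lat, lon, unix_timestamp); True if the implied speed exceeds the threshold."""
    distance_km = haversine_km(prev[0], prev[1], curr[0], curr[1])
    hours = max((curr[2] - prev[2]) / 3600.0, 1e-9)  # guard against a zero time delta
    return distance_km / hours > threshold_kmh
```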
adb532fc5d enterprise/stages/source: fix Source stage not executing authentication/enrollment flow (cherry-pick #12875) (#13146)
enterprise/stages/source: fix Source stage not executing authentication/enrollment flow (#12875)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-20 23:45:43 +01:00
5d3b35d1ba revert: rbac: exclude permissions for internal models (#12803) (cherry-pick #13138) (#13140)
revert: rbac: exclude permissions for internal models (#12803) (#13138)

Revert "rbac: exclude permissions for internal models (#12803)"

This reverts commit e08ccf4ca0.

Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-20 16:07:21 +01:00
433a94d9ee web/flows: fix error on interactive Captcha stage when retrying captcha (cherry-pick #13119) (#13139)
web/flows: fix error on interactive Captcha stage when retrying captcha (#13119)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-20 15:12:03 +01:00
f28d622d10 cmd: set version in outposts (cherry-pick #13116) (#13122)
cmd: set version in outposts (#13116)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-19 18:20:28 +01:00
50a68c22c5 sources/oauth: add group sync for azure_ad (cherry-pick #12894) (#13123)
sources/oauth: add group sync for azure_ad (#12894)

* sources/oauth: add group sync for azure_ad



* make group sync optional



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-19 18:20:16 +01:00
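In broad strokes, a source group sync of this kind maps group names delivered by the identity provider onto local groups and updates the user's membership, with a toggle matching the "make group sync optional" follow-up. This is a hypothetical illustration using Django's stock auth models; the helper name and the `enabled` flag are assumptions, not the code from #12894.

```python
from django.contrib.auth.models import Group, User  # stock Django models, purely for illustration

def sync_source_groups(user: User, source_group_names: list[str], enabled: bool = True) -> None:
    """Ensure a local group exists for each provider-supplied name and replace the user's membership."""
    if not enabled:  # mirrors the optional toggle added in the follow-up commit
        return
    groups = [Group.objects.get_or_create(name=name)[0] for name in source_group_names]
    user.groups.set(groups)
```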
13c99c8546 web/user: fix opening application with Enter not respecting new tab setting (cherry-pick #13115) (#13118)
web/user: fix opening application with Enter not respecting new tab setting (#13115)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-19 17:57:18 +01:00
7243add30f web/admin: update Application Wizard button placement (cherry-pick #12771) (#13121)
web/admin: update Application Wizard button placement (#12771)

* web: Add InvalidationFlow to Radius Provider dialogues

## What

- Bugfix: adds the InvalidationFlow to the Radius Provider dialogues
  - Repairs: `{"invalidation_flow":["This field is required."]}` message, which was *not* propagated
    to the Notification.
- Nitpick: Pretties `?foo=${true}` expressions: `s/\?([^=]+)=\$\{true\}/\1/`

## Note

Yes, I know I'm going to have to do more magic when we harmonize the forms, and no, I didn't add the
Property Mappings to the wizard, and yes, I know I'm going to have pain with the *new* version of
the wizard. But this is a serious bug; you can't make Radius servers with *either* of the current
dialogues at the moment.

* This (temporary) change is needed to prevent the unit tests from failing.

\# What

\# Why

\# How

\# Designs

\# Test Steps

\# Other Notes

* Revert "This (temporary) change is needed to prevent the unit tests from failing."

This reverts commit dddde09be5.

* web: Make using the wizard the default for new applications

# What

1. I removed the "Wizard Hint" bar and migrated the "Create With Wizard" button down to the default
   position as "Create With Provider," moving the "Create" button to a secondary position.
   Primary coloring has been kept for both.

2. Added an alert to the "Create" legacy dialog:

> Using this form will only create an Application. In order to authenticate with the application,
> you will have to manually pair it with a Provider.

3. Updated the subtitle on the Wizard dialog:

``` diff
-    wizardDescription = msg("Create a new application");
+    wizardDescription = msg("Create a new application and configure a provider for it.");
```

4. Updated the User page so that, if the User is-a Administrator and the number of Applications in
   the system is zero, the user will be invited to create a new Application using the Wizard rather
   than the legacy Form:

```diff
     renderNewAppButton() {
         const href = paramURL("/core/applications", {
-            createForm: true,
+            createWizard: true,
         });
```

5. Fixed a bug where, on initial render, if the `this.brand` field was not available, an error would
   appear in the console. The effects were usually harmless, as brand information came quickly and
   filled in before the user could notice, but it looked bad in the debugger.

6. Fixed a bug in testing where the wizard page "Configure Policy Bindings" had been changed to
   "Configure Policy/User/Group Binding".

# Testing

Since the wizard OUID didn't change (`data-ouia-component-id="start-application-wizard"`), the E2E
tests for "Application Wizard" completed without any substantial changes to the routine or to the
tests.

``` sh
npm run test:e2e:watch -- --spec ./tests/specs/new-application-by-wizard.ts
```

# User documentation changes required.

These changes were made at the request of docs, as an initial draft to show how the page looks with
the Application Wizard as the default tool for creating new Applications.

# Developer documentation changes required.

None.

Co-authored-by: Ken Sternberg <133134217+kensternberg-authentik@users.noreply.github.com>
2025-02-19 17:57:03 +01:00
6611a64a62 web: bump API Client version (cherry-pick #13113) (#13114)
web: bump API Client version (#13113)

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2025-02-19 13:16:48 +01:00
5262f61483 providers/rac: move to open source (cherry-pick #13015) (#13112)
providers/rac: move to open source (#13015)

* move RAC to open source

* move web out of enterprise



* remove enterprise license requirements from RAC

* format



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Simonyi Gergő <28359278+gergosimonyi@users.noreply.github.com>
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2025-02-19 13:16:18 +01:00
9dcbb4af9e release: 2025.2.0-rc1 2025-02-19 02:36:48 +01:00
0665bfac58 website/docs: add 2025.2 release notes (cherry-pick #13002) (#13108)
website/docs: add 2025.2 release notes (#13002)

* website/docs: add 2025.2 release notes



* make compile



* ffs



* ffs



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-19 02:12:47 +01:00
790e0c4d80 core: clear expired database sessions (cherry-pick #13105) (#13106)
core: clear expired database sessions (#13105)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-02-18 23:22:21 +01:00
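Clearing expired database-backed sessions in a Django project generally amounts to the snippet below, using the standard `django.contrib.sessions` API (whether authentik schedules this as a periodic task or relies on the stock `clearsessions` management command is not shown in this commit title).

```python
from django.contrib.sessions.models import Session
from django.utils import timezone

def clear_expired_sessions() -> int:
    """Delete sessions whose expiry date has passed; returns the number of rows removed."""
    deleted, _ = Session.objects.filter(expire_date__lt=timezone.now()).delete()
    return deleted
```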
513 changed files with 9154 additions and 22211 deletions

View File

@ -1,5 +1,5 @@
[bumpversion]
current_version = 2025.2.2
current_version = 2025.2.0-rc2
tag = True
commit = True
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?

View File

@ -28,11 +28,7 @@ Output of docker-compose logs or kubectl logs respectively
**Version and Deployment (please complete the following information):**
<!--
Notice: authentik supports installation via Docker, Kubernetes, and AWS CloudFormation only. Support is not available for other methods. For detailed installation and configuration instructions, please refer to the official documentation at https://docs.goauthentik.io/docs/install-config/.
-->
- authentik version: [e.g. 2025.2.0]
- authentik version: [e.g. 2021.8.5]
- Deployment: [e.g. docker-compose, helm]
**Additional context**

View File

@ -20,12 +20,7 @@ Output of docker-compose logs or kubectl logs respectively
**Version and Deployment (please complete the following information):**
<!--
Notice: authentik supports installation via Docker, Kubernetes, and AWS CloudFormation only. Support is not available for other methods. For detailed installation and configuration instructions, please refer to the official documentation at https://docs.goauthentik.io/docs/install-config/.
-->
- authentik version: [e.g. 2025.2.0]
- authentik version: [e.g. 2021.8.5]
- Deployment: [e.g. docker-compose, helm]
**Additional context**

View File

@ -9,22 +9,17 @@ inputs:
runs:
using: "composite"
steps:
- name: Install apt deps
- name: Install poetry & deps
shell: bash
run: |
pipx install poetry || true
sudo apt-get update
sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext libkrb5-dev krb5-kdc krb5-user krb5-admin-server
- name: Install uv
uses: astral-sh/setup-uv@v5
with:
enable-cache: true
- name: Setup python
- name: Setup python and restore poetry
uses: actions/setup-python@v5
with:
python-version-file: "pyproject.toml"
- name: Install Python deps
shell: bash
run: uv sync --all-extras --dev --frozen
cache: "poetry"
- name: Setup node
uses: actions/setup-node@v4
with:
@ -35,18 +30,15 @@ runs:
uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- name: Setup docker cache
uses: ScribeMD/docker-cache@0.5.0
with:
key: docker-images-${{ runner.os }}-${{ hashFiles('.github/actions/setup/docker-compose.yml', 'Makefile') }}-${{ inputs.postgresql_version }}
- name: Setup dependencies
shell: bash
run: |
export PSQL_TAG=${{ inputs.postgresql_version }}
docker compose -f .github/actions/setup/docker-compose.yml up -d
poetry install --sync
cd web && npm ci
- name: Generate config
shell: uv run python {0}
shell: poetry run python {0}
run: |
from authentik.lib.generators import generate_id
from yaml import safe_dump

View File

@ -11,7 +11,7 @@ services:
- 5432:5432
restart: always
redis:
image: docker.io/library/redis:7
image: docker.io/library/redis
ports:
- 6379:6379
restart: always

View File

@ -1,32 +1,7 @@
akadmin
asgi
assertIn
authentik
authn
crate
docstrings
entra
goauthentik
gunicorn
hass
jwe
jwks
keypair
keypairs
kubernetes
oidc
ontext
openid
passwordless
plex
saml
scim
singed
slo
sso
totp
traefik
# https://github.com/codespell-project/codespell/issues/1224
upToDate
hass
warmup
webauthn
ontext
singed
assertIn

View File

@ -82,12 +82,6 @@ updates:
docusaurus:
patterns:
- "@docusaurus/*"
build:
patterns:
- "@swc/*"
- "swc-*"
- "lightningcss*"
- "@rspack/binding*"
- package-ecosystem: npm
directory: "/lifecycle/aws"
schedule:
@ -98,7 +92,7 @@ updates:
prefix: "lifecycle/aws:"
labels:
- dependencies
- package-ecosystem: uv
- package-ecosystem: pip
directory: "/"
schedule:
interval: daily

View File

@ -40,7 +40,7 @@ jobs:
attestations: write
steps:
- uses: actions/checkout@v4
- uses: docker/setup-qemu-action@v3.6.0
- uses: docker/setup-qemu-action@v3.4.0
- uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables

View File

@ -33,7 +33,7 @@ jobs:
npm ci
- name: Check changes have been applied
run: |
uv run make aws-cfn
poetry run make aws-cfn
git diff --exit-code
ci-aws-cfn-mark:
if: always()

View File

@ -15,8 +15,8 @@ jobs:
matrix:
version:
- docs
- version-2025-2
- version-2024-12
- version-2024-10
steps:
- uses: actions/checkout@v4
- run: |

View File

@ -34,7 +34,7 @@ jobs:
- name: Setup authentik env
uses: ./.github/actions/setup
- name: run job
run: uv run make ci-${{ matrix.job }}
run: poetry run make ci-${{ matrix.job }}
test-migrations:
runs-on: ubuntu-latest
steps:
@ -42,7 +42,7 @@ jobs:
- name: Setup authentik env
uses: ./.github/actions/setup
- name: run migrations
run: uv run python -m lifecycle.migrate
run: poetry run python -m lifecycle.migrate
test-make-seed:
runs-on: ubuntu-latest
steps:
@ -69,21 +69,19 @@ jobs:
fetch-depth: 0
- name: checkout stable
run: |
# Delete all poetry envs
rm -rf /home/runner/.cache/pypoetry
# Copy current, latest config to local
# Temporarly comment the .github backup while migrating to uv
cp authentik/lib/default.yml local.env.yml
# cp -R .github ..
cp -R .github ..
cp -R scripts ..
git checkout $(git tag --sort=version:refname | grep '^version/' | grep -vE -- '-rc[0-9]+$' | tail -n1)
# rm -rf .github/ scripts/
# mv ../.github ../scripts .
rm -rf scripts/
mv ../scripts .
rm -rf .github/ scripts/
mv ../.github ../scripts .
- name: Setup authentik env (stable)
uses: ./.github/actions/setup
with:
postgresql_version: ${{ matrix.psql }}
continue-on-error: true
- name: run migrations to stable
run: poetry run python -m lifecycle.migrate
- name: checkout current code
@ -93,13 +91,15 @@ jobs:
git reset --hard HEAD
git clean -d -fx .
git checkout $GITHUB_SHA
# Delete previous poetry env
rm -rf /home/runner/.cache/pypoetry/virtualenvs/*
- name: Setup authentik env (ensure latest deps are installed)
uses: ./.github/actions/setup
with:
postgresql_version: ${{ matrix.psql }}
- name: migrate to latest
run: |
uv run python -m lifecycle.migrate
poetry run python -m lifecycle.migrate
- name: run tests
env:
# Test in the main database that we just migrated from the previous stable version
@ -108,7 +108,7 @@ jobs:
CI_RUN_ID: ${{ matrix.run_id }}
CI_TOTAL_RUNS: "5"
run: |
uv run make ci-test
poetry run make ci-test
test-unittest:
name: test-unittest - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
runs-on: ubuntu-latest
@ -133,7 +133,7 @@ jobs:
CI_RUN_ID: ${{ matrix.run_id }}
CI_TOTAL_RUNS: "5"
run: |
uv run make ci-test
poetry run make ci-test
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:
@ -156,8 +156,8 @@ jobs:
uses: helm/kind-action@v1.12.0
- name: run integration
run: |
uv run coverage run manage.py test tests/integration
uv run coverage xml
poetry run coverage run manage.py test tests/integration
poetry run coverage xml
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:
@ -214,8 +214,8 @@ jobs:
npm run build
- name: run e2e
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
uv run coverage xml
poetry run coverage run manage.py test ${{ matrix.job.glob }}
poetry run coverage xml
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:

View File

@ -82,7 +82,7 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.6.0
uses: docker/setup-qemu-action@v3.4.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables

View File

@ -2,7 +2,7 @@ name: authentik-gen-update-webauthn-mds
on:
workflow_dispatch:
schedule:
- cron: "30 1 1,15 * *"
- cron: '30 1 1,15 * *'
env:
POSTGRES_DB: authentik
@ -24,7 +24,7 @@ jobs:
token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env
uses: ./.github/actions/setup
- run: uv run ak update_webauthn_mds
- run: poetry run ak update_webauthn_mds
- uses: peter-evans/create-pull-request@v7
id: cpr
with:

View File

@ -21,8 +21,8 @@ jobs:
uses: ./.github/actions/setup
- name: generate docs
run: |
uv run make migrate
uv run ak build_source_docs
poetry run make migrate
poetry run ak build_source_docs
- name: Publish
uses: netlify/actions/cli@master
with:

View File

@ -42,7 +42,7 @@ jobs:
with:
go-version-file: "go.mod"
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.6.0
uses: docker/setup-qemu-action@v3.4.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables
@ -186,7 +186,7 @@ jobs:
container=$(docker container create ${{ steps.ev.outputs.imageMainName }})
docker cp ${container}:web/ .
- name: Create a Sentry.io release
uses: getsentry/action-release@v3
uses: getsentry/action-release@v1
continue-on-error: true
env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}

View File

@ -1,27 +0,0 @@
name: authentik-semgrep
on:
workflow_dispatch: {}
pull_request: {}
push:
branches:
- main
- master
paths:
- .github/workflows/semgrep.yml
schedule:
# random HH:MM to avoid a load spike on GitHub Actions at 00:00
- cron: '12 15 * * *'
jobs:
semgrep:
name: semgrep/ci
runs-on: ubuntu-latest
permissions:
contents: read
env:
SEMGREP_APP_TOKEN: ${{ secrets.SEMGREP_APP_TOKEN }}
container:
image: semgrep/semgrep
if: (github.actor != 'dependabot[bot]')
steps:
- uses: actions/checkout@v4
- run: semgrep ci

View File

@ -1,13 +1,9 @@
---
name: authentik-translate-extract-compile
name: authentik-backend-translate-extract-compile
on:
schedule:
- cron: "0 0 * * *" # every day at midnight
workflow_dispatch:
pull_request:
branches:
- main
- version-*
env:
POSTGRES_DB: authentik
@ -19,30 +15,23 @@ jobs:
runs-on: ubuntu-latest
steps:
- id: generate_token
if: ${{ github.event_name != 'pull_request' }}
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4
if: ${{ github.event_name != 'pull_request' }}
with:
token: ${{ steps.generate_token.outputs.token }}
- uses: actions/checkout@v4
if: ${{ github.event_name == 'pull_request' }}
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Generate API
run: make gen-client-ts
- name: run extract
run: |
uv run make i18n-extract
poetry run make i18n-extract
- name: run compile
run: |
uv run ak compilemessages
poetry run ak compilemessages
make web-check-compile
- name: Create Pull Request
if: ${{ github.event_name != 'pull_request' }}
uses: peter-evans/create-pull-request@v7
with:
token: ${{ steps.generate_token.outputs.token }}

.vscode/settings.json (vendored, 22 lines changed)
View File

@ -1,4 +1,26 @@
{
"cSpell.words": [
"akadmin",
"asgi",
"authentik",
"authn",
"entra",
"goauthentik",
"jwe",
"jwks",
"kubernetes",
"oidc",
"openid",
"passwordless",
"plex",
"saml",
"scim",
"slo",
"sso",
"totp",
"traefik",
"webauthn"
],
"todo-tree.tree.showCountsInTree": true,
"todo-tree.tree.showBadges": true,
"yaml.customTags": [

.vscode/tasks.json (vendored, 46 lines changed)
View File

@ -3,13 +3,8 @@
"tasks": [
{
"label": "authentik/core: make",
"command": "uv",
"args": [
"run",
"make",
"lint-fix",
"lint"
],
"command": "poetry",
"args": ["run", "make", "lint-fix", "lint"],
"presentation": {
"panel": "new"
},
@ -17,12 +12,8 @@
},
{
"label": "authentik/core: run",
"command": "uv",
"args": [
"run",
"ak",
"server"
],
"command": "poetry",
"args": ["run", "ak", "server"],
"group": "build",
"presentation": {
"panel": "dedicated",
@ -32,17 +23,13 @@
{
"label": "authentik/web: make",
"command": "make",
"args": [
"web"
],
"args": ["web"],
"group": "build"
},
{
"label": "authentik/web: watch",
"command": "make",
"args": [
"web-watch"
],
"args": ["web-watch"],
"group": "build",
"presentation": {
"panel": "dedicated",
@ -52,26 +39,19 @@
{
"label": "authentik: install",
"command": "make",
"args": [
"install",
"-j4"
],
"args": ["install", "-j4"],
"group": "build"
},
{
"label": "authentik/website: make",
"command": "make",
"args": [
"website"
],
"args": ["website"],
"group": "build"
},
{
"label": "authentik/website: watch",
"command": "make",
"args": [
"website-watch"
],
"args": ["website-watch"],
"group": "build",
"presentation": {
"panel": "dedicated",
@ -80,12 +60,8 @@
},
{
"label": "authentik/api: generate",
"command": "uv",
"args": [
"run",
"make",
"gen"
],
"command": "poetry",
"args": ["run", "make", "gen"],
"group": "build"
}
]

View File

@ -10,7 +10,7 @@ schemas/ @goauthentik/backend
scripts/ @goauthentik/backend
tests/ @goauthentik/backend
pyproject.toml @goauthentik/backend
uv.lock @goauthentik/backend
poetry.lock @goauthentik/backend
go.mod @goauthentik/backend
go.sum @goauthentik/backend
# Infrastructure

View File

@ -5,7 +5,7 @@
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socioeconomic status,
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.

View File

@ -93,59 +93,53 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
mkdir -p /usr/share/GeoIP && \
/bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 5: Download uv
FROM ghcr.io/astral-sh/uv:0.6.9 AS uv
# Stage 6: Base python image
FROM ghcr.io/goauthentik/fips-python:3.12.8-slim-bookworm-fips AS python-base
ENV VENV_PATH="/ak-root/.venv" \
PATH="/lifecycle:/ak-root/.venv/bin:$PATH" \
UV_COMPILE_BYTECODE=1 \
UV_LINK_MODE=copy \
UV_NATIVE_TLS=1 \
UV_PYTHON_DOWNLOADS=0
WORKDIR /ak-root/
COPY --from=uv /uv /uvx /bin/
# Stage 7: Python dependencies
FROM python-base AS python-deps
# Stage 5: Python dependencies
FROM ghcr.io/goauthentik/fips-python:3.12.8-slim-bookworm-fips AS python-deps
ARG TARGETARCH
ARG TARGETVARIANT
RUN rm -f /etc/apt/apt.conf.d/docker-clean; echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache
WORKDIR /ak-root/poetry
ENV PATH="/root/.cargo/bin:$PATH"
ENV VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false \
PATH="/ak-root/venv/bin:$PATH"
RUN rm -f /etc/apt/apt.conf.d/docker-clean; echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache
RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \
apt-get update && \
# Required for installing pip packages
apt-get install -y --no-install-recommends build-essential pkg-config libpq-dev libkrb5-dev
RUN --mount=type=bind,target=./pyproject.toml,src=./pyproject.toml \
--mount=type=bind,target=./poetry.lock,src=./poetry.lock \
--mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/pypoetry \
pip install --no-cache cffi && \
apt-get update && \
apt-get install -y --no-install-recommends \
# Build essentials
build-essential pkg-config libffi-dev git \
# cryptography
curl \
# libxml
libxslt-dev zlib1g-dev \
# postgresql
libpq-dev \
# python-kadmin-rs
clang libkrb5-dev sccache \
# xmlsec
libltdl-dev && \
curl https://sh.rustup.rs -sSf | sh -s -- -y
build-essential libffi-dev \
# Required for cryptography
curl pkg-config \
# Required for lxml
libxslt-dev zlib1g-dev \
# Required for xmlsec
libltdl-dev \
# Required for kadmin
sccache clang && \
curl https://sh.rustup.rs -sSf | sh -s -- -y && \
. "$HOME/.cargo/env" && \
python -m venv /ak-root/venv/ && \
bash -c "source ${VENV_PATH}/bin/activate && \
pip3 install --upgrade pip poetry && \
poetry config --local installer.no-binary cryptography,xmlsec,lxml,python-kadmin-rs && \
poetry install --only=main --no-ansi --no-interaction --no-root && \
pip uninstall cryptography -y && \
poetry install --only=main --no-ansi --no-interaction --no-root"
ENV UV_NO_BINARY_PACKAGE="cryptography lxml python-kadmin-rs xmlsec"
RUN --mount=type=bind,target=pyproject.toml,src=pyproject.toml \
--mount=type=bind,target=uv.lock,src=uv.lock \
--mount=type=cache,target=/root/.cache/uv \
uv sync --frozen --no-install-project --no-dev
# Stage 8: Run
FROM python-base AS final-image
# Stage 6: Run
FROM ghcr.io/goauthentik/fips-python:3.12.8-slim-bookworm-fips AS final-image
ARG VERSION
ARG GIT_BUILD_HASH
@ -177,7 +171,7 @@ RUN apt-get update && \
COPY ./authentik/ /authentik
COPY ./pyproject.toml /
COPY ./uv.lock /
COPY ./poetry.lock /
COPY ./schemas /schemas
COPY ./locale /locale
COPY ./tests /tests
@ -186,7 +180,7 @@ COPY ./blueprints /blueprints
COPY ./lifecycle/ /lifecycle
COPY ./authentik/sources/kerberos/krb5.conf /etc/krb5.conf
COPY --from=go-builder /go/authentik /bin/authentik
COPY --from=python-deps /ak-root/.venv /ak-root/.venv
COPY --from=python-deps /ak-root/venv /ak-root/venv
COPY --from=web-builder /work/web/dist/ /web/dist/
COPY --from=web-builder /work/web/authentik/ /web/authentik/
COPY --from=website-builder /work/website/build/ /website/help/
@ -197,6 +191,9 @@ USER 1000
ENV TMPDIR=/dev/shm/ \
PYTHONDONTWRITEBYTECODE=1 \
PYTHONUNBUFFERED=1 \
PATH="/ak-root/venv/bin:/lifecycle:$PATH" \
VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false \
GOFIPS=1
HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ]

View File

@ -4,17 +4,34 @@
PWD = $(shell pwd)
UID = $(shell id -u)
GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.generate_semver)
NPM_VERSION = $(shell python -m scripts.npm_version)
PY_SOURCES = authentik tests scripts lifecycle .github
GO_SOURCES = cmd internal
WEB_SOURCES = web/src web/packages
DOCKER_IMAGE ?= "authentik:test"
GEN_API_TS = "gen-ts-api"
GEN_API_PY = "gen-py-api"
GEN_API_GO = "gen-go-api"
pg_user := $(shell uv run python -m authentik.lib.config postgresql.user 2>/dev/null)
pg_host := $(shell uv run python -m authentik.lib.config postgresql.host 2>/dev/null)
pg_name := $(shell uv run python -m authentik.lib.config postgresql.name 2>/dev/null)
pg_user := $(shell python -m authentik.lib.config postgresql.user 2>/dev/null)
pg_host := $(shell python -m authentik.lib.config postgresql.host 2>/dev/null)
pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null)
CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \
-I .github/codespell-words.txt \
-S 'web/src/locales/**' \
-S 'website/docs/developer-docs/api/reference/**' \
-S '**/node_modules/**' \
-S '**/dist/**' \
$(PY_SOURCES) \
$(GO_SOURCES) \
$(WEB_SOURCES) \
website/src \
website/blog \
website/docs \
website/integrations \
website/src
all: lint-fix lint test gen web ## Lint, build, and test everything
@ -32,37 +49,34 @@ go-test:
go test -timeout 0 -v -race -cover ./...
test: ## Run the server tests and produce a coverage report (locally)
uv run coverage run manage.py test --keepdb authentik
uv run coverage html
uv run coverage report
coverage run manage.py test --keepdb authentik
coverage html
coverage report
lint-fix: lint-codespell ## Lint and automatically fix errors in the python source code. Reports spelling errors.
uv run black $(PY_SOURCES)
uv run ruff check --fix $(PY_SOURCES)
black $(PY_SOURCES)
ruff check --fix $(PY_SOURCES)
lint-codespell: ## Reports spelling errors.
uv run codespell -w
codespell -w $(CODESPELL_ARGS)
lint: ## Lint the python and golang sources
uv run bandit -c pyproject.toml -r $(PY_SOURCES)
bandit -r $(PY_SOURCES) -x web/node_modules -x tests/wdio/node_modules -x website/node_modules
golangci-lint run -v
core-install:
uv sync --frozen
poetry install
migrate: ## Run the Authentik Django server's migrations
uv run python -m lifecycle.migrate
python -m lifecycle.migrate
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service
aws-cfn:
cd lifecycle/aws && npm run aws-cfn
run: ## Run the main authentik server process
uv run ak server
core-i18n-extract:
uv run ak makemessages \
ak makemessages \
--add-location file \
--no-obsolete \
--ignore web \
@ -93,11 +107,11 @@ gen-build: ## Extract the schema from the database
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
uv run ak make_blueprint_schema > blueprints/schema.json
ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
uv run ak spectacular --file schema.yml
ak spectacular --file schema.yml
gen-changelog: ## (Release) generate the changelog based from the commits since the last tag
git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md
@ -148,7 +162,7 @@ gen-client-py: gen-clean-py ## Build and install the authentik API for Python
docker run \
--rm -v ${PWD}:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.11.0 generate \
docker.io/openapitools/openapi-generator-cli:v7.4.0 generate \
-i /local/schema.yml \
-g python \
-o /local/${GEN_API_PY} \
@ -176,7 +190,7 @@ gen-client-go: gen-clean-go ## Build and install the authentik API for Golang
rm -rf ./${GEN_API_GO}/config.yaml ./${GEN_API_GO}/templates/
gen-dev-config: ## Generate a local development config file
uv run scripts/generate_config.py
python -m scripts.generate_config
gen: gen-build gen-client-ts
@ -257,21 +271,21 @@ ci--meta-debug:
node --version
ci-black: ci--meta-debug
uv run black --check $(PY_SOURCES)
black --check $(PY_SOURCES)
ci-ruff: ci--meta-debug
uv run ruff check $(PY_SOURCES)
ruff check $(PY_SOURCES)
ci-codespell: ci--meta-debug
uv run codespell -s
codespell $(CODESPELL_ARGS) -s
ci-bandit: ci--meta-debug
uv run bandit -r $(PY_SOURCES)
bandit -r $(PY_SOURCES)
ci-pending-migrations: ci--meta-debug
uv run ak makemigrations --check
ak makemigrations --check
ci-test: ci--meta-debug
uv run coverage run manage.py test --keepdb --randomly-seed ${CI_TEST_SEED} authentik
uv run coverage report
uv run coverage xml
coverage run manage.py test --keepdb --randomly-seed ${CI_TEST_SEED} authentik
coverage report
coverage xml

View File

@ -2,7 +2,7 @@ authentik takes security very seriously. We follow the rules of [responsible di
## Independent audits and pentests
We are committed to engaging in regular pentesting and security audits of authentik. Defining and adhering to a cadence of external testing ensures a stronger probability that our code base, our features, and our architecture is as secure and non-exploitable as possible. For more details about specific audits and pentests, refer to "Audits and Certificates" in our [Security documentation](https://docs.goauthentik.io/docs/security).
We are committed to engaging in regular pentesting and security audits of authentik. Defining and adhering to a cadence of external testing ensures a stronger probability that our code base, our features, and our architecture is as secure and non-exploitable as possible. For more details about specfic audits and pentests, refer to "Audits and Certificates" in our [Security documentation](https://docs.goauthentik.io/docs/security).
## What authentik classifies as a CVE
@ -20,8 +20,8 @@ Even if the issue is not a CVE, we still greatly appreciate your help in hardeni
| Version | Supported |
| --------- | --------- |
| 2024.10.x | ✅ |
| 2024.12.x | ✅ |
| 2025.2.x | ✅ |
## Reporting a Vulnerability

View File

@ -2,7 +2,7 @@
from os import environ
__version__ = "2025.2.2"
__version__ = "2025.2.0"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"

View File

@ -59,7 +59,7 @@ class SystemInfoSerializer(PassiveSerializer):
if not isinstance(value, str):
continue
actual_value = value
if raw_session is not None and raw_session in actual_value:
if raw_session in actual_value:
actual_value = actual_value.replace(
raw_session, SafeExceptionReporterFilter.cleansed_substitute
)

View File

@ -49,8 +49,6 @@ class BrandSerializer(ModelSerializer):
"branding_title",
"branding_logo",
"branding_favicon",
"branding_custom_css",
"branding_default_flow_background",
"flow_authentication",
"flow_invalidation",
"flow_recovery",
@ -88,7 +86,6 @@ class CurrentBrandSerializer(PassiveSerializer):
branding_title = CharField()
branding_logo = CharField(source="branding_logo_url")
branding_favicon = CharField(source="branding_favicon_url")
branding_custom_css = CharField()
ui_footer_links = ListField(
child=FooterLinkSerializer(),
read_only=True,
@ -128,7 +125,6 @@ class BrandViewSet(UsedByMixin, ModelViewSet):
"branding_title",
"branding_logo",
"branding_favicon",
"branding_default_flow_background",
"flow_authentication",
"flow_invalidation",
"flow_recovery",

View File

@ -1,35 +0,0 @@
# Generated by Django 5.0.12 on 2025-02-22 01:51
from pathlib import Path
from django.db import migrations, models
from django.apps.registry import Apps
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
def migrate_custom_css(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
Brand = apps.get_model("authentik_brands", "brand")
db_alias = schema_editor.connection.alias
path = Path("/web/dist/custom.css")
if not path.exists():
return
css = path.read_text()
Brand.objects.using(db_alias).update(branding_custom_css=css)
class Migration(migrations.Migration):
dependencies = [
("authentik_brands", "0007_brand_default_application"),
]
operations = [
migrations.AddField(
model_name="brand",
name="branding_custom_css",
field=models.TextField(blank=True, default=""),
),
migrations.RunPython(migrate_custom_css),
]

View File

@ -1,18 +0,0 @@
# Generated by Django 5.0.13 on 2025-03-19 22:54
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_brands", "0008_brand_branding_custom_css"),
]
operations = [
migrations.AddField(
model_name="brand",
name="branding_default_flow_background",
field=models.TextField(default="/static/dist/assets/images/flow_background.jpg"),
),
]

View File

@ -33,10 +33,6 @@ class Brand(SerializerModel):
branding_logo = models.TextField(default="/static/dist/assets/icons/icon_left_brand.svg")
branding_favicon = models.TextField(default="/static/dist/assets/icons/icon.png")
branding_custom_css = models.TextField(default="", blank=True)
branding_default_flow_background = models.TextField(
default="/static/dist/assets/images/flow_background.jpg"
)
flow_authentication = models.ForeignKey(
Flow, null=True, on_delete=models.SET_NULL, related_name="brand_authentication"
@ -88,12 +84,6 @@ class Brand(SerializerModel):
return CONFIG.get("web.path", "/")[:-1] + self.branding_favicon
return self.branding_favicon
def branding_default_flow_background_url(self) -> str:
"""Get branding_default_flow_background with the correct prefix"""
if self.branding_default_flow_background.startswith("/static"):
return CONFIG.get("web.path", "/")[:-1] + self.branding_default_flow_background
return self.branding_default_flow_background
@property
def serializer(self) -> Serializer:
from authentik.brands.api import BrandSerializer

View File

@ -24,7 +24,6 @@ class TestBrands(APITestCase):
"branding_logo": "/static/dist/assets/icons/icon_left_brand.svg",
"branding_favicon": "/static/dist/assets/icons/icon.png",
"branding_title": "authentik",
"branding_custom_css": "",
"matched_domain": brand.domain,
"ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC,
@ -44,7 +43,6 @@ class TestBrands(APITestCase):
"branding_logo": "/static/dist/assets/icons/icon_left_brand.svg",
"branding_favicon": "/static/dist/assets/icons/icon.png",
"branding_title": "custom",
"branding_custom_css": "",
"matched_domain": "bar.baz",
"ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC,
@ -61,7 +59,6 @@ class TestBrands(APITestCase):
"branding_logo": "/static/dist/assets/icons/icon_left_brand.svg",
"branding_favicon": "/static/dist/assets/icons/icon.png",
"branding_title": "authentik",
"branding_custom_css": "",
"matched_domain": "fallback",
"ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC,
@ -124,27 +121,3 @@ class TestBrands(APITestCase):
"subject": None,
},
)
def test_branding_url(self):
"""Test branding attributes return correct values"""
brand = create_test_brand()
brand.branding_default_flow_background = "https://goauthentik.io/img/icon.png"
brand.branding_favicon = "https://goauthentik.io/img/icon.png"
brand.branding_logo = "https://goauthentik.io/img/icon.png"
brand.save()
self.assertEqual(
brand.branding_default_flow_background_url(), "https://goauthentik.io/img/icon.png"
)
self.assertJSONEqual(
self.client.get(reverse("authentik_api:brand-current")).content.decode(),
{
"branding_logo": "https://goauthentik.io/img/icon.png",
"branding_favicon": "https://goauthentik.io/img/icon.png",
"branding_title": "authentik",
"branding_custom_css": "",
"matched_domain": brand.domain,
"ui_footer_links": [],
"ui_theme": Themes.AUTOMATIC,
"default_locale": "",
},
)

View File

@ -5,7 +5,6 @@ from collections.abc import Iterable
from drf_spectacular.utils import OpenApiResponse, extend_schema
from rest_framework import mixins
from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField, ReadOnlyField, SerializerMethodField
from rest_framework.parsers import MultiPartParser
from rest_framework.request import Request
@ -155,17 +154,6 @@ class SourceViewSet(
matching_sources.append(source_settings.validated_data)
return Response(matching_sources)
def destroy(self, request: Request, *args, **kwargs):
"""Prevent deletion of built-in sources"""
instance: Source = self.get_object()
if instance.managed == Source.MANAGED_INBUILT:
raise ValidationError(
{"detail": "Built-in sources cannot be deleted"}, code="protected"
)
return super().destroy(request, *args, **kwargs)
class UserSourceConnectionSerializer(SourceSerializer):
"""User source connection"""

View File

@ -32,5 +32,5 @@ class AuthentikCoreConfig(ManagedAppConfig):
"name": "authentik Built-in",
"slug": "authentik-built-in",
},
managed=Source.MANAGED_INBUILT,
managed="goauthentik.io/sources/inbuilt",
)

View File

@ -678,8 +678,6 @@ class SourceGroupMatchingModes(models.TextChoices):
class Source(ManagedModel, SerializerModel, PolicyBindingModel):
"""Base Authentication source, i.e. an OAuth Provider, SAML Remote or LDAP Server"""
MANAGED_INBUILT = "goauthentik.io/sources/inbuilt"
name = models.TextField(help_text=_("Source's display Name."))
slug = models.SlugField(help_text=_("Internal source name, used in URLs."), unique=True)

View File

@ -11,7 +11,6 @@
build: "{{ build }}",
api: {
base: "{{ base_url }}",
relBase: "{{ base_url_rel }}",
},
};
window.addEventListener("DOMContentLoaded", function () {

View File

@ -8,15 +8,13 @@
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1">
{# Darkreader breaks the site regardless of theme as its not compatible with webcomponents, and we default to a dark theme based on preferred colour-scheme #}
<meta name="darkreader-lock">
<title>{% block title %}{% trans title|default:brand.branding_title %}{% endblock %}</title>
<link rel="icon" href="{{ brand.branding_favicon_url }}">
<link rel="shortcut icon" href="{{ brand.branding_favicon_url }}">
{% block head_before %}
{% endblock %}
<link rel="stylesheet" type="text/css" href="{% static 'dist/authentik.css' %}">
<style>{{ brand.branding_custom_css }}</style>
<link rel="stylesheet" type="text/css" href="{% static 'dist/custom.css' %}" data-inject>
<script src="{% versioned_script 'dist/poly-%v.js' %}" type="module"></script>
<script src="{% versioned_script 'dist/standalone/loading/index-%v.js' %}" type="module"></script>
{% block head %}

View File

@ -4,7 +4,7 @@
{% load i18n %}
{% block head_before %}
<link rel="prefetch" href="{{ request.brand.branding_default_flow_background_url }}" />
<link rel="prefetch" href="{% static 'dist/assets/images/flow_background.jpg' %}" />
<link rel="stylesheet" type="text/css" href="{% static 'dist/patternfly.min.css' %}">
<link rel="stylesheet" type="text/css" href="{% static 'dist/theme-dark.css' %}" media="(prefers-color-scheme: dark)">
{% include "base/header_js.html" %}
@ -13,7 +13,7 @@
{% block head %}
<style>
:root {
--ak-flow-background: url("{{ request.brand.branding_default_flow_background_url }}");
--ak-flow-background: url("{% static 'dist/assets/images/flow_background.jpg' %}");
--pf-c-background-image--BackgroundImage: var(--ak-flow-background);
--pf-c-background-image--BackgroundImage-2x: var(--ak-flow-background);
--pf-c-background-image--BackgroundImage--sm: var(--ak-flow-background);

View File

@ -55,7 +55,7 @@ class RedirectToAppLaunch(View):
)
except FlowNonApplicableException:
raise Http404 from None
plan.append_stage(in_memory_stage(RedirectToAppStage))
plan.insert_stage(in_memory_stage(RedirectToAppStage))
return plan.to_redirect(request, flow)

View File

@ -53,7 +53,6 @@ class InterfaceView(TemplateView):
kwargs["build"] = get_build_hash()
kwargs["url_kwargs"] = self.kwargs
kwargs["base_url"] = self.request.build_absolute_uri(CONFIG.get("web.path", "/"))
kwargs["base_url_rel"] = CONFIG.get("web.path", "/")
return super().get_context_data(**kwargs)

View File

@ -37,7 +37,6 @@ class GoogleWorkspaceProviderSerializer(EnterpriseRequiredMixin, ProviderSeriali
"user_delete_action",
"group_delete_action",
"default_group_email_domain",
"dry_run",
]
extra_kwargs = {}

View File

@ -8,10 +8,9 @@ from httplib2 import HttpLib2Error, HttpLib2ErrorWithResponse
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProvider
from authentik.lib.sync.outgoing import HTTP_CONFLICT
from authentik.lib.sync.outgoing.base import SAFE_METHODS, BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.exceptions import (
BadRequestSyncException,
DryRunRejected,
NotFoundSyncException,
ObjectExistsSyncException,
StopSync,
@ -44,8 +43,6 @@ class GoogleWorkspaceSyncClient[TModel: Model, TConnection: Model, TSchema: dict
self.domains.append(domain_name)
def _request(self, request: HttpRequest):
if self.provider.dry_run and request.method.upper() not in SAFE_METHODS:
raise DryRunRejected(request.uri, request.method, request.body)
try:
response = request.execute()
except GoogleAuthError as exc:

View File

@ -1,24 +0,0 @@
# Generated by Django 5.0.12 on 2025-02-24 19:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
(
"authentik_providers_google_workspace",
"0003_googleworkspaceprovidergroup_attributes_and_more",
),
]
operations = [
migrations.AddField(
model_name="googleworkspaceprovider",
name="dry_run",
field=models.BooleanField(
default=False,
help_text="When enabled, provider will not modify or create objects in the remote system.",
),
),
]

View File

@ -36,7 +36,6 @@ class MicrosoftEntraProviderSerializer(EnterpriseRequiredMixin, ProviderSerializ
"filter_group",
"user_delete_action",
"group_delete_action",
"dry_run",
]
extra_kwargs = {}

View File

@ -3,7 +3,6 @@ from collections.abc import Coroutine
from dataclasses import asdict
from typing import Any
import httpx
from azure.core.exceptions import (
ClientAuthenticationError,
ServiceRequestError,
@ -13,7 +12,6 @@ from azure.identity.aio import ClientSecretCredential
from django.db.models import Model
from django.http import HttpResponseBadRequest, HttpResponseNotFound
from kiota_abstractions.api_error import APIError
from kiota_abstractions.request_information import RequestInformation
from kiota_authentication_azure.azure_identity_authentication_provider import (
AzureIdentityAuthenticationProvider,
)
@ -23,15 +21,13 @@ from msgraph.generated.models.o_data_errors.o_data_error import ODataError
from msgraph.graph_request_adapter import GraphRequestAdapter, options
from msgraph.graph_service_client import GraphServiceClient
from msgraph_core import GraphClientFactory
from opentelemetry import trace
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.events.utils import sanitize_item
from authentik.lib.sync.outgoing import HTTP_CONFLICT
from authentik.lib.sync.outgoing.base import SAFE_METHODS, BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.exceptions import (
BadRequestSyncException,
DryRunRejected,
NotFoundSyncException,
ObjectExistsSyncException,
StopSync,
@ -39,24 +35,20 @@ from authentik.lib.sync.outgoing.exceptions import (
)
class AuthentikRequestAdapter(GraphRequestAdapter):
def __init__(self, auth_provider, provider: MicrosoftEntraProvider, client=None):
super().__init__(auth_provider, client)
self._provider = provider
def get_request_adapter(
credentials: ClientSecretCredential, scopes: list[str] | None = None
) -> GraphRequestAdapter:
if scopes:
auth_provider = AzureIdentityAuthenticationProvider(credentials=credentials, scopes=scopes)
else:
auth_provider = AzureIdentityAuthenticationProvider(credentials=credentials)
async def get_http_response_message(
self,
request_info: RequestInformation,
parent_span: trace.Span,
claims: str = "",
) -> httpx.Response:
if self._provider.dry_run and request_info.http_method.value.upper() not in SAFE_METHODS:
raise DryRunRejected(
url=request_info.url,
method=request_info.http_method.value,
body=request_info.content.decode("utf-8"),
)
return await super().get_http_response_message(request_info, parent_span, claims=claims)
return GraphRequestAdapter(
auth_provider=auth_provider,
client=GraphClientFactory.create_with_default_middleware(
options=options, client=KiotaClientFactory.get_default_client()
),
)
class MicrosoftEntraSyncClient[TModel: Model, TConnection: Model, TSchema: dict](
@ -71,27 +63,9 @@ class MicrosoftEntraSyncClient[TModel: Model, TConnection: Model, TSchema: dict]
self.credentials = provider.microsoft_credentials()
self.__prefetch_domains()
def get_request_adapter(
self, credentials: ClientSecretCredential, scopes: list[str] | None = None
) -> AuthentikRequestAdapter:
if scopes:
auth_provider = AzureIdentityAuthenticationProvider(
credentials=credentials, scopes=scopes
)
else:
auth_provider = AzureIdentityAuthenticationProvider(credentials=credentials)
return AuthentikRequestAdapter(
auth_provider=auth_provider,
provider=self.provider,
client=GraphClientFactory.create_with_default_middleware(
options=options, client=KiotaClientFactory.get_default_client()
),
)
@property
def client(self):
return GraphServiceClient(request_adapter=self.get_request_adapter(**self.credentials))
return GraphServiceClient(request_adapter=get_request_adapter(**self.credentials))
def _request[T](self, request: Coroutine[Any, Any, T]) -> T:
try:

View File

@ -1,24 +0,0 @@
# Generated by Django 5.0.12 on 2025-02-24 19:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
(
"authentik_providers_microsoft_entra",
"0002_microsoftentraprovidergroup_attributes_and_more",
),
]
operations = [
migrations.AddField(
model_name="microsoftentraprovider",
name="dry_run",
field=models.BooleanField(
default=False,
help_text="When enabled, provider will not modify or create objects in the remote system.",
),
),
]

View File

@ -32,6 +32,7 @@ class MicrosoftEntraUserTests(APITestCase):
@apply_blueprint("system/providers-microsoft-entra.yaml")
def setUp(self) -> None:
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple users
Tenant.objects.update(avatars="none")
@ -96,38 +97,6 @@ class MicrosoftEntraUserTests(APITestCase):
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_create.assert_called_once()
def test_user_create_dry_run(self):
"""Test user creation (dry run)"""
self.provider.dry_run = True
self.provider.save()
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
):
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=user
).first()
self.assertIsNone(microsoft_user)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
def test_user_not_created(self):
"""Test without property mappings, no group is created"""
self.provider.property_mappings.clear()

View File

@ -89,9 +89,9 @@ class SourceStageFinal(StageView):
This stage uses the override flow token to resume execution of the initial flow the
source stage is bound to."""
def dispatch(self, *args, **kwargs):
def dispatch(self):
token: FlowToken = self.request.session.get(SESSION_KEY_OVERRIDE_FLOW_TOKEN)
self.logger.info("Replacing source flow with overridden flow", flow=token.flow.slug)
self._logger.info("Replacing source flow with overridden flow", flow=token.flow.slug)
plan = token.plan
plan.context[PLAN_CONTEXT_IS_RESTORED] = token
response = plan.to_redirect(self.request, token.flow)

View File

@ -4,8 +4,7 @@ from django.urls import reverse
from authentik.core.tests.utils import create_test_flow, create_test_user
from authentik.enterprise.stages.source.models import SourceStage
from authentik.enterprise.stages.source.stage import SourceStageFinal
from authentik.flows.models import FlowDesignation, FlowStageBinding, FlowToken, in_memory_stage
from authentik.flows.models import FlowDesignation, FlowStageBinding, FlowToken
from authentik.flows.planner import PLAN_CONTEXT_IS_RESTORED, FlowPlan
from authentik.flows.tests import FlowTestCase
from authentik.flows.views.executor import SESSION_KEY_PLAN
@ -88,7 +87,6 @@ class TestSourceStage(FlowTestCase):
self.assertIsNotNone(flow_token)
session = self.client.session
plan: FlowPlan = session[SESSION_KEY_PLAN]
plan.insert_stage(in_memory_stage(SourceStageFinal), index=0)
plan.context[PLAN_CONTEXT_IS_RESTORED] = flow_token
session[SESSION_KEY_PLAN] = plan
session.save()
@ -98,6 +96,4 @@ class TestSourceStage(FlowTestCase):
reverse("authentik_api:flow-executor", kwargs={"flow_slug": flow.slug}), follow=True
)
self.assertEqual(response.status_code, 200)
self.assertStageRedirects(
response, reverse("authentik_core:if-flow", kwargs={"flow_slug": flow.slug})
)
self.assertStageRedirects(response, reverse("authentik_core:root-redirect"))

View File

@ -50,8 +50,7 @@ class NotificationTransportSerializer(ModelSerializer):
"mode",
"mode_verbose",
"webhook_url",
"webhook_mapping_body",
"webhook_mapping_headers",
"webhook_mapping",
"send_once",
]

View File

@ -1,43 +0,0 @@
# Generated by Django 5.0.13 on 2025-03-20 19:54
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_events", "0008_event_authentik_e_expires_8c73a8_idx_and_more"),
]
operations = [
migrations.RenameField(
model_name="notificationtransport",
old_name="webhook_mapping",
new_name="webhook_mapping_body",
),
migrations.AlterField(
model_name="notificationtransport",
name="webhook_mapping_body",
field=models.ForeignKey(
default=None,
help_text="Customize the body of the request. Mapping should return data that is JSON-serializable.",
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
related_name="+",
to="authentik_events.notificationwebhookmapping",
),
),
migrations.AddField(
model_name="notificationtransport",
name="webhook_mapping_headers",
field=models.ForeignKey(
default=None,
help_text="Configure additional headers to be sent. Mapping should return a dictionary of key-value pairs",
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
related_name="+",
to="authentik_events.notificationwebhookmapping",
),
),
]

View File

@ -336,27 +336,8 @@ class NotificationTransport(SerializerModel):
mode = models.TextField(choices=TransportMode.choices, default=TransportMode.LOCAL)
webhook_url = models.TextField(blank=True, validators=[DomainlessURLValidator()])
webhook_mapping_body = models.ForeignKey(
"NotificationWebhookMapping",
on_delete=models.SET_DEFAULT,
null=True,
default=None,
related_name="+",
help_text=_(
"Customize the body of the request. "
"Mapping should return data that is JSON-serializable."
),
)
webhook_mapping_headers = models.ForeignKey(
"NotificationWebhookMapping",
on_delete=models.SET_DEFAULT,
null=True,
default=None,
related_name="+",
help_text=_(
"Configure additional headers to be sent. "
"Mapping should return a dictionary of key-value pairs"
),
webhook_mapping = models.ForeignKey(
"NotificationWebhookMapping", on_delete=models.SET_DEFAULT, null=True, default=None
)
send_once = models.BooleanField(
default=False,
@ -379,8 +360,8 @@ class NotificationTransport(SerializerModel):
def send_local(self, notification: "Notification") -> list[str]:
"""Local notification delivery"""
if self.webhook_mapping_body:
self.webhook_mapping_body.evaluate(
if self.webhook_mapping:
self.webhook_mapping.evaluate(
user=notification.user,
request=None,
notification=notification,
@ -399,18 +380,9 @@ class NotificationTransport(SerializerModel):
if notification.event and notification.event.user:
default_body["event_user_email"] = notification.event.user.get("email", None)
default_body["event_user_username"] = notification.event.user.get("username", None)
headers = {}
if self.webhook_mapping_body:
if self.webhook_mapping:
default_body = sanitize_item(
self.webhook_mapping_body.evaluate(
user=notification.user,
request=None,
notification=notification,
)
)
if self.webhook_mapping_headers:
headers = sanitize_item(
self.webhook_mapping_headers.evaluate(
self.webhook_mapping.evaluate(
user=notification.user,
request=None,
notification=notification,
@ -420,7 +392,6 @@ class NotificationTransport(SerializerModel):
response = get_http_session().post(
self.webhook_url,
json=default_body,
headers=headers,
)
response.raise_for_status()
except RequestException as exc:

View File

@ -120,7 +120,7 @@ class TestEventsNotifications(APITestCase):
)
transport = NotificationTransport.objects.create(
name=generate_id(), webhook_mapping_body=mapping, mode=TransportMode.LOCAL
name=generate_id(), webhook_mapping=mapping, mode=TransportMode.LOCAL
)
NotificationRule.objects.filter(name__startswith="default").delete()
trigger = NotificationRule.objects.create(name=generate_id(), group=self.group)

View File

@ -60,25 +60,20 @@ class TestEventTransports(TestCase):
def test_transport_webhook_mapping(self):
"""Test webhook transport with custom mapping"""
mapping_body = NotificationWebhookMapping.objects.create(
mapping = NotificationWebhookMapping.objects.create(
name=generate_id(), expression="return request.user"
)
mapping_headers = NotificationWebhookMapping.objects.create(
name=generate_id(), expression="""return {"foo": "bar"}"""
)
transport: NotificationTransport = NotificationTransport.objects.create(
name=generate_id(),
mode=TransportMode.WEBHOOK,
webhook_url="http://localhost:1234/test",
webhook_mapping_body=mapping_body,
webhook_mapping_headers=mapping_headers,
webhook_mapping=mapping,
)
with Mocker() as mocker:
mocker.post("http://localhost:1234/test")
transport.send(self.notification)
self.assertEqual(mocker.call_count, 1)
self.assertEqual(mocker.request_history[0].method, "POST")
self.assertEqual(mocker.request_history[0].headers["foo"], "bar")
self.assertJSONEqual(
mocker.request_history[0].body.decode(),
{"email": self.user.email, "pk": self.user.pk, "username": self.user.username},

View File

@ -6,7 +6,6 @@ from typing import TYPE_CHECKING
from uuid import uuid4
from django.db import models
from django.http import HttpRequest
from django.utils.translation import gettext_lazy as _
from model_utils.managers import InheritanceManager
from rest_framework.serializers import BaseSerializer
@ -179,12 +178,11 @@ class Flow(SerializerModel, PolicyBindingModel):
help_text=_("Required level of authentication and authorization to access a flow."),
)
def background_url(self, request: HttpRequest | None = None) -> str:
@property
def background_url(self) -> str:
"""Get the URL to the background image. If the name is /static or starts with http
it is returned as-is"""
if not self.background:
if request:
return request.brand.branding_default_flow_background_url()
return (
CONFIG.get("web.path", "/")[:-1] + "/static/dist/assets/images/flow_background.jpg"
)

View File

@ -76,10 +76,10 @@ class FlowPlan:
self.bindings.append(binding)
self.markers.append(marker or StageMarker())
def insert_stage(self, stage: Stage, marker: StageMarker | None = None, index=1):
def insert_stage(self, stage: Stage, marker: StageMarker | None = None):
"""Insert stage into plan, as immediate next stage"""
self.bindings.insert(index, FlowStageBinding(stage=stage, order=0))
self.markers.insert(index, marker or StageMarker())
self.bindings.insert(1, FlowStageBinding(stage=stage, order=0))
self.markers.insert(1, marker or StageMarker())
def redirect(self, destination: str):
"""Insert a redirect stage as next stage"""

View File

@ -184,7 +184,7 @@ class ChallengeStageView(StageView):
flow_info = ContextualFlowInfo(
data={
"title": self.format_title(),
"background": self.executor.flow.background_url(self.request),
"background": self.executor.flow.background_url,
"cancel_url": reverse("authentik_flows:cancel"),
"layout": self.executor.flow.layout,
}

View File

@ -27,6 +27,7 @@ class FlowTestCase(APITestCase):
self.assertIsNotNone(raw_response["component"])
if flow:
self.assertIn("flow_info", raw_response)
self.assertEqual(raw_response["flow_info"]["background"], flow.background_url)
self.assertEqual(
raw_response["flow_info"]["cancel_url"], reverse("authentik_flows:cancel")
)

View File

@ -1,11 +1,9 @@
"""API flow tests"""
from json import loads
from django.urls import reverse
from rest_framework.test import APITestCase
from authentik.core.tests.utils import create_test_admin_user, create_test_flow
from authentik.core.tests.utils import create_test_admin_user
from authentik.flows.api.stages import StageSerializer, StageViewSet
from authentik.flows.models import Flow, FlowDesignation, FlowStageBinding, Stage
from authentik.lib.generators import generate_id
@ -79,22 +77,6 @@ class TestFlowsAPI(APITestCase):
self.assertEqual(response.status_code, 200)
self.assertJSONEqual(response.content, {"diagram": DIAGRAM_EXPECTED})
def test_api_background(self):
"""Test custom background"""
user = create_test_admin_user()
self.client.force_login(user)
flow = create_test_flow()
response = self.client.get(reverse("authentik_api:flow-detail", kwargs={"slug": flow.slug}))
body = loads(response.content.decode())
self.assertEqual(body["background"], "/static/dist/assets/images/flow_background.jpg")
flow.background = "https://goauthentik.io/img/icon.png"
flow.save()
response = self.client.get(reverse("authentik_api:flow-detail", kwargs={"slug": flow.slug}))
body = loads(response.content.decode())
self.assertEqual(body["background"], "https://goauthentik.io/img/icon.png")
def test_api_diagram_no_stages(self):
"""Test flow diagram with no stages."""
user = create_test_admin_user()

View File

@ -49,7 +49,7 @@ class TestFlowInspector(APITestCase):
"captcha_stage": None,
"component": "ak-stage-identification",
"flow_info": {
"background": "/static/dist/assets/images/flow_background.jpg",
"background": flow.background_url,
"cancel_url": reverse("authentik_flows:cancel"),
"title": flow.title,
"layout": "stacked",

View File

@ -282,14 +282,16 @@ class ConfigLoader:
def get_optional_int(self, path: str, default=None) -> int | None:
"""Wrapper for get that converts value into int or None if set"""
value = self.get(path, UNSET)
value = self.get(path, default)
if value is UNSET:
return default
try:
return int(value)
except (ValueError, TypeError) as exc:
if value is None or (isinstance(value, str) and value.lower() == "null"):
return None
return default
if value is UNSET:
return default
self.log("warning", "Failed to parse config as int", path=path, exc=str(exc))
return default
@ -370,9 +372,9 @@ def django_db_config(config: ConfigLoader | None = None) -> dict:
"sslcert": config.get("postgresql.sslcert"),
"sslkey": config.get("postgresql.sslkey"),
},
"CONN_MAX_AGE": config.get_optional_int("postgresql.conn_max_age", 0),
"CONN_HEALTH_CHECKS": config.get_bool("postgresql.conn_health_checks", False),
"DISABLE_SERVER_SIDE_CURSORS": config.get_bool(
"CONN_MAX_AGE": CONFIG.get_optional_int("postgresql.conn_max_age", 0),
"CONN_HEALTH_CHECKS": CONFIG.get_bool("postgresql.conn_health_checks", False),
"DISABLE_SERVER_SIDE_CURSORS": CONFIG.get_bool(
"postgresql.disable_server_side_cursors", False
),
"TEST": {
@ -381,8 +383,8 @@ def django_db_config(config: ConfigLoader | None = None) -> dict:
}
}
conn_max_age = config.get_optional_int("postgresql.conn_max_age", UNSET)
disable_server_side_cursors = config.get_bool("postgresql.disable_server_side_cursors", UNSET)
conn_max_age = CONFIG.get_optional_int("postgresql.conn_max_age", UNSET)
disable_server_side_cursors = CONFIG.get_bool("postgresql.disable_server_side_cursors", UNSET)
if config.get_bool("postgresql.use_pgpool", False):
db["default"]["DISABLE_SERVER_SIDE_CURSORS"] = True
if disable_server_side_cursors is not UNSET:
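The get_optional_int variant with the UNSET sentinel distinguishes "key not set" (return the default) from "explicitly null" (return None), which is what lets postgresql.conn_max_age: null turn into CONN_MAX_AGE = None in the django_db_config hunk above. A standalone sketch of that parsing rule; the asserts mirror the test_get_optional_int test that appears later in this diff:

UNSET = object()  # sentinel distinguishing "key not set" from "explicitly null"

def get_optional_int(raw, default=None):
    """Parse a config value into an int, None (explicit null), or the default."""
    if raw is UNSET:
        return default
    try:
        return int(raw)
    except (ValueError, TypeError):
        if raw is None or (isinstance(raw, str) and raw.lower() == "null"):
            return None
        return default


assert get_optional_int(UNSET, 21) == 21     # key missing -> default
assert get_optional_int("21") == 21          # string value parsed as int
assert get_optional_int("null", 21) is None  # explicit null -> None, even with a default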

View File

@ -45,8 +45,6 @@ redis:
# url: ""
# transport_options: ""
http_timeout: 30
cache:
# url: ""
timeout: 300
@ -66,8 +64,6 @@ debugger: false
log_level: info
session_storage: cache
sessions:
unauthenticated_age: days=1
error_reporting:
enabled: false

View File

@ -33,7 +33,6 @@ class SyncObjectSerializer(PassiveSerializer):
)
)
sync_object_id = CharField()
override_dry_run = BooleanField(default=False)
class SyncObjectResultSerializer(PassiveSerializer):
@ -99,7 +98,6 @@ class OutgoingSyncProviderStatusMixin:
page=1,
provider_pk=provider.pk,
pk=params.validated_data["sync_object_id"],
override_dry_run=params.validated_data["override_dry_run"],
).get()
return Response(SyncObjectResultSerializer(instance={"messages": res}).data)

View File

@ -28,14 +28,6 @@ class Direction(StrEnum):
remove = "remove"
SAFE_METHODS = [
"GET",
"HEAD",
"OPTIONS",
"TRACE",
]
class BaseOutgoingSyncClient[
TModel: "Model", TConnection: "Model", TSchema: dict, TProvider: "OutgoingSyncProvider"
]:

View File

@ -21,22 +21,6 @@ class BadRequestSyncException(BaseSyncException):
"""Exception when invalid data was sent to the remote system"""
class DryRunRejected(BaseSyncException):
"""When dry_run is enabled and a provider dropped a mutating request"""
def __init__(self, url: str, method: str, body: dict):
super().__init__()
self.url = url
self.method = method
self.body = body
def __repr__(self):
return self.__str__()
def __str__(self):
return f"Dry-run rejected request: {self.method} {self.url}"
class StopSync(BaseSyncException):
"""Exception raised when a configuration error should stop the sync process"""

View File

@ -1,9 +1,8 @@
from typing import Any, Self
import pglock
from django.db import connection, models
from django.db import connection
from django.db.models import Model, QuerySet, TextChoices
from django.utils.translation import gettext_lazy as _
from authentik.core.models import Group, User
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
@ -19,14 +18,6 @@ class OutgoingSyncDeleteAction(TextChoices):
class OutgoingSyncProvider(Model):
"""Base abstract models for providers implementing outgoing sync"""
dry_run = models.BooleanField(
default=False,
help_text=_(
"When enabled, provider will not modify or create objects in the remote system."
),
)
class Meta:
abstract = True
@ -41,7 +32,7 @@ class OutgoingSyncProvider(Model):
@property
def sync_lock(self) -> pglock.advisory:
"""Postgres lock for syncing to prevent multiple parallel syncs happening"""
"""Postgres lock for syncing SCIM to prevent multiple parallel syncs happening"""
return pglock.advisory(
lock_id=f"goauthentik.io/{connection.schema_name}/providers/outgoing-sync/{str(self.pk)}",
timeout=0,

View File

@ -20,7 +20,6 @@ from authentik.lib.sync.outgoing import PAGE_SIZE, PAGE_TIMEOUT
from authentik.lib.sync.outgoing.base import Direction
from authentik.lib.sync.outgoing.exceptions import (
BadRequestSyncException,
DryRunRejected,
StopSync,
TransientSyncException,
)
@ -106,9 +105,7 @@ class SyncTasks:
return
task.set_status(TaskStatus.SUCCESSFUL, *messages)
def sync_objects(
self, object_type: str, page: int, provider_pk: int, override_dry_run=False, **filter
):
def sync_objects(self, object_type: str, page: int, provider_pk: int, **filter):
_object_type = path_to_class(object_type)
self.logger = get_logger().bind(
provider_type=class_to_path(self._provider_model),
@ -119,10 +116,6 @@ class SyncTasks:
provider = self._provider_model.objects.filter(pk=provider_pk).first()
if not provider:
return messages
# Override dry run mode if requested, however don't save the provider
# so that scheduled sync tasks still run in dry_run mode
if override_dry_run:
provider.dry_run = False
try:
client = provider.client_for_model(_object_type)
except TransientSyncException:
@ -139,22 +132,6 @@ class SyncTasks:
except SkipObjectException:
self.logger.debug("skipping object due to SkipObject", obj=obj)
continue
except DryRunRejected as exc:
messages.append(
asdict(
LogEvent(
_("Dropping mutating request due to dry run"),
log_level="info",
logger=f"{provider._meta.verbose_name}@{object_type}",
attributes={
"obj": sanitize_item(obj),
"method": exc.method,
"url": exc.url,
"body": exc.body,
},
)
)
)
except BadRequestSyncException as exc:
self.logger.warning("failed to sync object", exc=exc, obj=obj)
messages.append(
@ -254,10 +231,8 @@ class SyncTasks:
raise Retry() from exc
except SkipObjectException:
continue
except DryRunRejected as exc:
self.logger.info("Rejected dry-run event", exc=exc)
except StopSync as exc:
self.logger.warning("Stopping sync", exc=exc, provider_pk=provider.pk)
self.logger.warning(exc, provider_pk=provider.pk)
def sync_signal_m2m(self, group_pk: str, action: str, pk_set: list[int]):
self.logger = get_logger().bind(
@ -288,7 +263,5 @@ class SyncTasks:
raise Retry() from exc
except SkipObjectException:
continue
except DryRunRejected as exc:
self.logger.info("Rejected dry-run event", exc=exc)
except StopSync as exc:
self.logger.warning("Stopping sync", exc=exc, provider_pk=provider.pk)
self.logger.warning(exc, provider_pk=provider.pk)

View File

@ -158,18 +158,6 @@ class TestConfig(TestCase):
test_obj = Test()
dumps(test_obj, indent=4, cls=AttrEncoder)
def test_get_optional_int(self):
config = ConfigLoader()
self.assertEqual(config.get_optional_int("foo", 21), 21)
self.assertEqual(config.get_optional_int("foo"), None)
config.set("foo", "21")
self.assertEqual(config.get_optional_int("foo"), 21)
self.assertEqual(config.get_optional_int("foo", 0), 21)
self.assertEqual(config.get_optional_int("foo", "null"), 21)
config.set("foo", "null")
self.assertEqual(config.get_optional_int("foo"), None)
self.assertEqual(config.get_optional_int("foo", 21), None)
@mock.patch.dict(environ, check_deprecations_env_vars)
def test_check_deprecations(self):
"""Test config key re-write for deprecated env vars"""
@ -233,16 +221,6 @@ class TestConfig(TestCase):
},
)
def test_db_conn_max_age(self):
"""Test DB conn_max_age Config"""
config = ConfigLoader()
config.set("postgresql.conn_max_age", "null")
conf = django_db_config(config)
self.assertEqual(
conf["default"]["CONN_MAX_AGE"],
None,
)
def test_db_read_replicas(self):
"""Test read replicas"""
config = ConfigLoader()

View File

@ -16,40 +16,7 @@ def authentik_user_agent() -> str:
return f"authentik@{get_full_version()}"
class TimeoutSession(Session):
"""Always set a default HTTP request timeout"""
def __init__(self, default_timeout=None):
super().__init__()
self.timeout = default_timeout
def send(
self,
request,
*,
stream=...,
verify=...,
proxies=...,
cert=...,
timeout=...,
allow_redirects=...,
**kwargs,
):
if not timeout and self.timeout:
timeout = self.timeout
return super().send(
request,
stream=stream,
verify=verify,
proxies=proxies,
cert=cert,
timeout=timeout,
allow_redirects=allow_redirects,
**kwargs,
)
class DebugSession(TimeoutSession):
class DebugSession(Session):
"""requests session which logs http requests and responses"""
def send(self, req: PreparedRequest, *args, **kwargs):
@ -75,9 +42,8 @@ class DebugSession(TimeoutSession):
def get_http_session() -> Session:
"""Get a requests session with common headers"""
session = TimeoutSession()
session = Session()
if CONFIG.get_bool("debug") or CONFIG.get("log_level") == "trace":
session = DebugSession()
session.headers["User-Agent"] = authentik_user_agent()
session.timeout = CONFIG.get_optional_int("http_timeout")
return session
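One side of this hunk ships a TimeoutSession plus a configurable http_timeout so every outgoing request gets a default timeout unless the caller overrides it; the other returns a plain requests.Session. A minimal sketch of the same default-timeout pattern (overriding request() rather than send(), so it is not authentik's exact class):

import requests

class TimeoutSession(requests.Session):
    """requests.Session that falls back to a default timeout when none is given."""

    def __init__(self, default_timeout: float | None = None):
        super().__init__()
        self.timeout = default_timeout

    def request(self, method, url, **kwargs):
        # Only fill in the timeout if the caller did not specify one.
        kwargs.setdefault("timeout", self.timeout)
        return super().request(method, url, **kwargs)


session = TimeoutSession(default_timeout=30)
# session.get("https://example.com")  # would use the 30s default timeout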

View File

@ -1,6 +1,5 @@
"""Base Kubernetes Reconciler"""
import re
from dataclasses import asdict
from json import dumps
from typing import TYPE_CHECKING, Generic, TypeVar
@ -68,8 +67,7 @@ class KubernetesObjectReconciler(Generic[T]):
@property
def name(self) -> str:
"""Get the name of the object this reconciler manages"""
base_name = (
return (
self.controller.outpost.config.object_naming_template
% {
"name": slugify(self.controller.outpost.name),
@ -77,16 +75,6 @@ class KubernetesObjectReconciler(Generic[T]):
}
).lower()
formatted = slugify(base_name)
formatted = re.sub(r"[^a-z0-9-]", "-", formatted)
formatted = re.sub(r"-+", "-", formatted)
formatted = formatted[:63]
if not formatted:
formatted = f"outpost-{self.controller.outpost.uuid.hex}"[:63]
return formatted
def get_patched_reference_object(self) -> T:
"""Get patched reference object"""
reference = self.get_reference_object()
@ -124,6 +112,7 @@ class KubernetesObjectReconciler(Generic[T]):
try:
current = self.retrieve()
except (OpenApiException, HTTPError) as exc:
if isinstance(exc, ApiException) and exc.status == HttpResponseNotFound.status_code:
self.logger.debug("Failed to get current, triggering recreate")
raise NeedsRecreate from exc
@ -167,6 +156,7 @@ class KubernetesObjectReconciler(Generic[T]):
self.delete(current)
self.logger.debug("Removing")
except (OpenApiException, HTTPError) as exc:
if isinstance(exc, ApiException) and exc.status == HttpResponseNotFound.status_code:
self.logger.debug("Failed to get current, assuming non-existent")
return
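The name-sanitization block shown above (present on only one side of this file's diff) reduces the rendered outpost name to a valid Kubernetes resource name: lowercase, only [a-z0-9-], collapsed dashes, at most 63 characters, with an outpost-<uuid> fallback when nothing survives. A standalone, simplified sketch of that sanitizer; the assert matches the expected name in the Kubernetes controller test removed further down:

import re
from uuid import uuid4

def k8s_safe_name(raw: str, fallback_uuid=None) -> str:
    """Reduce an arbitrary string to a DNS-label-safe Kubernetes name (max 63 chars)."""
    name = raw.lower()
    name = re.sub(r"[^a-z0-9-]", "-", name)  # only lowercase alphanumerics and dashes
    name = re.sub(r"-+", "-", name)          # collapse runs of dashes
    name = name[:63]
    if not name:
        fallback_uuid = fallback_uuid or uuid4()
        name = f"outpost-{fallback_uuid.hex}"[:63]
    return name


assert k8s_safe_name("ak-outpost-Authentik Embedded Outpost") == "ak-outpost-authentik-embedded-outpost"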

View File

@ -61,14 +61,9 @@ class KubernetesController(BaseController):
client: KubernetesClient
connection: KubernetesServiceConnection
def __init__(
self,
outpost: Outpost,
connection: KubernetesServiceConnection,
client: KubernetesClient | None = None,
) -> None:
def __init__(self, outpost: Outpost, connection: KubernetesServiceConnection) -> None:
super().__init__(outpost, connection)
self.client = client if client else KubernetesClient(connection)
self.client = KubernetesClient(connection)
self.reconcilers = {
SecretReconciler.reconciler_name(): SecretReconciler,
DeploymentReconciler.reconciler_name(): DeploymentReconciler,

View File

@ -1,44 +0,0 @@
"""Kubernetes controller tests"""
from django.test import TestCase
from authentik.blueprints.tests import reconcile_app
from authentik.lib.generators import generate_id
from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.controllers.k8s.deployment import DeploymentReconciler
from authentik.outposts.controllers.kubernetes import KubernetesController
from authentik.outposts.models import KubernetesServiceConnection, Outpost, OutpostType
class KubernetesControllerTests(TestCase):
"""Kubernetes controller tests"""
@reconcile_app("authentik_outposts")
def setUp(self) -> None:
self.outpost = Outpost.objects.create(
name="test",
type=OutpostType.PROXY,
)
self.integration = KubernetesServiceConnection(name="test")
def test_gen_name(self):
"""Ensure the generated name is valid"""
controller = KubernetesController(
Outpost.objects.filter(managed=MANAGED_OUTPOST).first(),
self.integration,
# Pass something not-none as client so we don't
# attempt to connect to K8s as that's not needed
client=self,
)
rec = DeploymentReconciler(controller)
self.assertEqual(rec.name, "ak-outpost-authentik-embedded-outpost")
controller.outpost.name = generate_id()
self.assertLess(len(rec.name), 64)
# Test custom naming template
_cfg = controller.outpost.config
_cfg.object_naming_template = ""
controller.outpost.config = _cfg
self.assertEqual(rec.name, f"outpost-{controller.outpost.uuid.hex}")
self.assertLess(len(rec.name), 64)

View File

@ -9,12 +9,7 @@ from hashlib import sha256
from typing import Any
from urllib.parse import urlparse, urlunparse
from cryptography.hazmat.primitives.asymmetric.ec import (
SECP256R1,
SECP384R1,
SECP521R1,
EllipticCurvePrivateKey,
)
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey
from cryptography.hazmat.primitives.asymmetric.types import PrivateKeyTypes
from dacite import Config
@ -119,22 +114,6 @@ class JWTAlgorithms(models.TextChoices):
HS256 = "HS256", _("HS256 (Symmetric Encryption)")
RS256 = "RS256", _("RS256 (Asymmetric Encryption)")
ES256 = "ES256", _("ES256 (Asymmetric Encryption)")
ES384 = "ES384", _("ES384 (Asymmetric Encryption)")
ES512 = "ES512", _("ES512 (Asymmetric Encryption)")
@classmethod
def from_private_key(cls, private_key: PrivateKeyTypes | None) -> str:
if isinstance(private_key, RSAPrivateKey):
return cls.RS256
if isinstance(private_key, EllipticCurvePrivateKey):
curve = private_key.curve
if isinstance(curve, SECP256R1):
return cls.ES256
if isinstance(curve, SECP384R1):
return cls.ES384
if isinstance(curve, SECP521R1):
return cls.ES512
raise ValueError(f"Invalid private key type: {type(private_key)}")
class ScopeMapping(PropertyMapping):
@ -284,7 +263,11 @@ class OAuth2Provider(WebfingerProvider, Provider):
return self.client_secret, JWTAlgorithms.HS256
key: CertificateKeyPair = self.signing_key
private_key = key.private_key
return private_key, JWTAlgorithms.from_private_key(private_key)
if isinstance(private_key, RSAPrivateKey):
return private_key, JWTAlgorithms.RS256
if isinstance(private_key, EllipticCurvePrivateKey):
return private_key, JWTAlgorithms.ES256
raise ValueError(f"Invalid private key type: {type(private_key)}")
def get_issuer(self, request: HttpRequest) -> str | None:
"""Get issuer, based on request"""

View File

@ -254,10 +254,10 @@ class OAuthAuthorizationParams:
raise AuthorizeError(self.redirect_uri, "invalid_scope", self.grant_type, self.state)
if SCOPE_OFFLINE_ACCESS in self.scope:
# https://openid.net/specs/openid-connect-core-1_0.html#OfflineAccess
# Don't explicitly request consent with offline_access, as the spec allows for
# "other conditions for processing the request permitting offline access to the
# requested resources are in place"
# which we interpret as "the admin picks an authorization flow with or without consent"
if PROMPT_CONSENT not in self.prompt:
# Instead of ignoring the `offline_access` scope when `prompt`
# isn't set to `consent`, we set override it ourselves
self.prompt.add(PROMPT_CONSENT)
if self.response_type not in [
ResponseTypes.CODE,
ResponseTypes.CODE_TOKEN,

View File

@ -71,7 +71,7 @@ class CodeValidatorView(PolicyAccessView):
except FlowNonApplicableException:
LOGGER.warning("Flow not applicable to user")
return None
plan.append_stage(in_memory_stage(OAuthDeviceCodeFinishStage))
plan.insert_stage(in_memory_stage(OAuthDeviceCodeFinishStage))
return plan.to_redirect(self.request, self.token.provider.authorization_flow)

View File

@ -34,5 +34,5 @@ class EndSessionView(PolicyAccessView):
PLAN_CONTEXT_APPLICATION: self.application,
},
)
plan.append_stage(in_memory_stage(SessionEndStage))
plan.insert_stage(in_memory_stage(SessionEndStage))
return plan.to_redirect(self.request, self.flow)

View File

@ -75,7 +75,10 @@ class JWKSView(View):
key_data = {}
if use == "sig":
key_data["alg"] = JWTAlgorithms.from_private_key(private_key)
if isinstance(private_key, RSAPrivateKey):
key_data["alg"] = JWTAlgorithms.RS256
elif isinstance(private_key, EllipticCurvePrivateKey):
key_data["alg"] = JWTAlgorithms.ES256
elif use == "enc":
key_data["alg"] = "RSA-OAEP-256"
key_data["enc"] = "A256CBC-HS512"

View File

@ -36,17 +36,17 @@ class IngressReconciler(KubernetesObjectReconciler[V1Ingress]):
def reconciler_name() -> str:
return "ingress"
def _check_annotations(self, current: V1Ingress, reference: V1Ingress):
def _check_annotations(self, reference: V1Ingress):
"""Check that all annotations *we* set are correct"""
for key, value in reference.metadata.annotations.items():
if key not in current.metadata.annotations:
for key, value in self.get_ingress_annotations().items():
if key not in reference.metadata.annotations:
raise NeedsUpdate()
if current.metadata.annotations[key] != value:
if reference.metadata.annotations[key] != value:
raise NeedsUpdate()
def reconcile(self, current: V1Ingress, reference: V1Ingress):
super().reconcile(current, reference)
self._check_annotations(current, reference)
self._check_annotations(reference)
# Create a list of all expected host and tls hosts
expected_hosts = []
expected_hosts_tls = []

View File

@ -1,9 +1,9 @@
"""RAC app config"""
from authentik.blueprints.apps import ManagedAppConfig
from django.apps import AppConfig
class AuthentikProviderRAC(ManagedAppConfig):
class AuthentikProviderRAC(AppConfig):
"""authentik rac app config"""
name = "authentik.providers.rac"

View File

@ -4,7 +4,8 @@ from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.contrib.auth.signals import user_logged_out
from django.core.cache import cache
from django.db.models.signals import post_delete, post_save, pre_delete
from django.db.models import Model
from django.db.models.signals import post_save, pre_delete
from django.dispatch import receiver
from django.http import HttpRequest
@ -45,8 +46,12 @@ def pre_delete_connection_token_disconnect(sender, instance: ConnectionToken, **
)
@receiver([post_save, post_delete], sender=Endpoint)
def post_save_post_delete_endpoint(**_):
"""Clear user's endpoint cache upon endpoint creation or deletion"""
@receiver(post_save, sender=Endpoint)
def post_save_endpoint(sender: type[Model], instance, created: bool, **_):
"""Clear user's endpoint cache upon endpoint creation"""
if not created: # pragma: no cover
return
# Delete user endpoint cache
keys = cache.keys(user_endpoint_cache_key("*"))
cache.delete_many(keys)

View File

@ -46,7 +46,7 @@ class RACStartView(PolicyAccessView):
)
except FlowNonApplicableException:
raise Http404 from None
plan.append_stage(
plan.insert_stage(
in_memory_stage(
RACFinalStage,
application=self.application,

View File

@ -180,7 +180,6 @@ class SAMLProviderSerializer(ProviderSerializer):
"session_valid_not_on_or_after",
"property_mappings",
"name_id_mapping",
"authn_context_class_ref_mapping",
"digest_algorithm",
"signature_algorithm",
"signing_kp",

View File

@ -1,28 +0,0 @@
# Generated by Django 5.0.13 on 2025-03-18 17:41
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_providers_saml", "0016_samlprovider_encryption_kp_and_more"),
]
operations = [
migrations.AddField(
model_name="samlprovider",
name="authn_context_class_ref_mapping",
field=models.ForeignKey(
blank=True,
default=None,
help_text="Configure how the AuthnContextClassRef value will be created. When left empty, the AuthnContextClassRef will be set based on which authentication methods the user used to authenticate.",
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
related_name="+",
to="authentik_providers_saml.samlpropertymapping",
verbose_name="AuthnContextClassRef Property Mapping",
),
),
]

View File

@ -71,20 +71,6 @@ class SAMLProvider(Provider):
"the NameIDPolicy of the incoming request will be considered"
),
)
authn_context_class_ref_mapping = models.ForeignKey(
"SAMLPropertyMapping",
default=None,
blank=True,
null=True,
on_delete=models.SET_DEFAULT,
verbose_name=_("AuthnContextClassRef Property Mapping"),
related_name="+",
help_text=_(
"Configure how the AuthnContextClassRef value will be created. When left empty, "
"the AuthnContextClassRef will be set based on which authentication methods the user "
"used to authenticate."
),
)
assertion_valid_not_before = models.TextField(
default="minutes=-5",
@ -184,6 +170,7 @@ class SAMLProvider(Provider):
def launch_url(self) -> str | None:
"""Use IDP-Initiated SAML flow as launch URL"""
try:
return reverse(
"authentik_providers_saml:sso-init",
kwargs={"application_slug": self.application.slug},

View File

@ -1,6 +1,5 @@
"""SAML Assertion generator"""
from datetime import datetime
from hashlib import sha256
from types import GeneratorType
@ -53,7 +52,6 @@ class AssertionProcessor:
_assertion_id: str
_response_id: str
_auth_instant: str
_valid_not_before: str
_session_not_on_or_after: str
_valid_not_on_or_after: str
@ -67,11 +65,6 @@ class AssertionProcessor:
self._assertion_id = get_random_id()
self._response_id = get_random_id()
_login_event = get_login_event(self.http_request)
_login_time = datetime.now()
if _login_event:
_login_time = _login_event.created
self._auth_instant = get_time_string(_login_time)
self._valid_not_before = get_time_string(
timedelta_from_string(self.provider.assertion_valid_not_before)
)
@ -138,7 +131,7 @@ class AssertionProcessor:
def get_assertion_auth_n_statement(self) -> Element:
"""Generate AuthnStatement with AuthnContext and ContextClassRef Elements."""
auth_n_statement = Element(f"{{{NS_SAML_ASSERTION}}}AuthnStatement")
auth_n_statement.attrib["AuthnInstant"] = self._auth_instant
auth_n_statement.attrib["AuthnInstant"] = self._valid_not_before
auth_n_statement.attrib["SessionIndex"] = sha256(
self.http_request.session.session_key.encode("ascii")
).hexdigest()
@ -165,28 +158,6 @@ class AssertionProcessor:
auth_n_context_class_ref.text = (
"urn:oasis:names:tc:SAML:2.0:ac:classes:MobileOneFactorContract"
)
if self.provider.authn_context_class_ref_mapping:
try:
value = self.provider.authn_context_class_ref_mapping.evaluate(
user=self.http_request.user,
request=self.http_request,
provider=self.provider,
)
if value is not None:
auth_n_context_class_ref.text = str(value)
return auth_n_statement
except PropertyMappingExpressionException as exc:
Event.new(
EventAction.CONFIGURATION_ERROR,
message=(
"Failed to evaluate property-mapping: "
f"'{self.provider.authn_context_class_ref_mapping.name}'"
),
provider=self.provider,
mapping=self.provider.authn_context_class_ref_mapping,
).from_http(self.http_request)
LOGGER.warning("Failed to evaluate property mapping", exc=exc)
return auth_n_statement
return auth_n_statement
def get_assertion_conditions(self) -> Element:

View File

@ -294,61 +294,6 @@ class TestAuthNRequest(TestCase):
self.assertEqual(parsed_request.id, "aws_LDxLGeubpc5lx12gxCgS6uPbix1yd5re")
self.assertEqual(parsed_request.name_id_policy, SAML_NAME_ID_FORMAT_EMAIL)
def test_authn_context_class_ref_mapping(self):
"""Test custom authn_context_class_ref"""
authn_context_class_ref = generate_id()
mapping = SAMLPropertyMapping.objects.create(
name=generate_id(), expression=f"""return '{authn_context_class_ref}'"""
)
self.provider.authn_context_class_ref_mapping = mapping
self.provider.save()
user = create_test_admin_user()
http_request = get_request("/", user=user)
# First create an AuthNRequest
request_proc = RequestProcessor(self.source, http_request, "test_state")
request = request_proc.build_auth_n()
# To get an assertion we need a parsed request (parsed by provider)
parsed_request = AuthNRequestParser(self.provider).parse(
b64encode(request.encode()).decode(), "test_state"
)
# Now create a response and convert it to string (provider)
response_proc = AssertionProcessor(self.provider, http_request, parsed_request)
response = response_proc.build_response()
self.assertIn(user.username, response)
self.assertIn(authn_context_class_ref, response)
def test_authn_context_class_ref_mapping_invalid(self):
"""Test custom authn_context_class_ref (invalid)"""
mapping = SAMLPropertyMapping.objects.create(name=generate_id(), expression="q")
self.provider.authn_context_class_ref_mapping = mapping
self.provider.save()
user = create_test_admin_user()
http_request = get_request("/", user=user)
# First create an AuthNRequest
request_proc = RequestProcessor(self.source, http_request, "test_state")
request = request_proc.build_auth_n()
# To get an assertion we need a parsed request (parsed by provider)
parsed_request = AuthNRequestParser(self.provider).parse(
b64encode(request.encode()).decode(), "test_state"
)
# Now create a response and convert it to string (provider)
response_proc = AssertionProcessor(self.provider, http_request, parsed_request)
response = response_proc.build_response()
self.assertIn(user.username, response)
events = Event.objects.filter(
action=EventAction.CONFIGURATION_ERROR,
)
self.assertTrue(events.exists())
self.assertEqual(
events.first().context["message"],
f"Failed to evaluate property-mapping: '{mapping.name}'",
)
def test_request_attributes(self):
"""Test full SAML Request/Response flow, fully signed"""
user = create_test_admin_user()
@ -376,10 +321,8 @@ class TestAuthNRequest(TestCase):
request = request_proc.build_auth_n()
# Create invalid PropertyMapping
mapping = SAMLPropertyMapping.objects.create(
name=generate_id(), saml_name="test", expression="q"
)
self.provider.property_mappings.add(mapping)
scope = SAMLPropertyMapping.objects.create(name="test", saml_name="test", expression="q")
self.provider.property_mappings.add(scope)
# To get an assertion we need a parsed request (parsed by provider)
parsed_request = AuthNRequestParser(self.provider).parse(
@ -395,7 +338,7 @@ class TestAuthNRequest(TestCase):
self.assertTrue(events.exists())
self.assertEqual(
events.first().context["message"],
f"Failed to evaluate property-mapping: '{mapping.name}'",
"Failed to evaluate property-mapping: 'test'",
)
def test_idp_initiated(self):

View File

@ -1,16 +1,12 @@
"""Time utilities"""
from datetime import datetime, timedelta
from django.utils.timezone import now
import datetime
def get_time_string(delta: timedelta | datetime | None = None) -> str:
def get_time_string(delta: datetime.timedelta | None = None) -> str:
"""Get Data formatted in SAML format"""
if delta is None:
delta = timedelta()
if isinstance(delta, timedelta):
final = now() + delta
else:
final = delta
delta = datetime.timedelta()
now = datetime.datetime.now()
final = now + delta
return final.strftime("%Y-%m-%dT%H:%M:%SZ")
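One variant of get_time_string accepts either a timedelta (an offset from now) or an absolute datetime (so the AuthnInstant can carry the original login event's timestamp), the other accepts only the timedelta form. A standalone sketch of the dual-input behaviour without Django's timezone helpers:

from datetime import datetime, timedelta, timezone


def saml_time_string(delta: timedelta | datetime | None = None) -> str:
    """Format a point in time the way SAML expects (UTC, YYYY-MM-DDTHH:MM:SSZ)."""
    if delta is None:
        delta = timedelta()
    if isinstance(delta, timedelta):
        final = datetime.now(tz=timezone.utc) + delta  # offset relative to "now"
    else:
        final = delta                                  # absolute timestamp passed in
    return final.strftime("%Y-%m-%dT%H:%M:%SZ")


print(saml_time_string(timedelta(minutes=-5)))                        # "not before" style offset
print(saml_time_string(datetime(2025, 2, 20, tzinfo=timezone.utc)))   # absolute login time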

View File

@ -61,7 +61,7 @@ class SAMLSLOView(PolicyAccessView):
PLAN_CONTEXT_APPLICATION: self.application,
},
)
plan.append_stage(in_memory_stage(SessionEndStage))
plan.insert_stage(in_memory_stage(SessionEndStage))
return plan.to_redirect(self.request, self.flow)
def post(self, request: HttpRequest, application_slug: str) -> HttpResponse:

View File

@ -24,9 +24,7 @@ class SCIMProviderGroupSerializer(ModelSerializer):
"group",
"group_obj",
"provider",
"attributes",
]
extra_kwargs = {"attributes": {"read_only": True}}
class SCIMProviderGroupViewSet(

View File

@ -28,10 +28,8 @@ class SCIMProviderSerializer(ProviderSerializer):
"url",
"verify_certificates",
"token",
"compatibility_mode",
"exclude_users_service_account",
"filter_group",
"dry_run",
]
extra_kwargs = {}

View File

@ -24,9 +24,7 @@ class SCIMProviderUserSerializer(ModelSerializer):
"user",
"user_obj",
"provider",
"attributes",
]
extra_kwargs = {"attributes": {"read_only": True}}
class SCIMProviderUserViewSet(

View File

@ -12,9 +12,8 @@ from authentik.lib.sync.outgoing import (
HTTP_SERVICE_UNAVAILABLE,
HTTP_TOO_MANY_REQUESTS,
)
from authentik.lib.sync.outgoing.base import SAFE_METHODS, BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.exceptions import (
DryRunRejected,
NotFoundSyncException,
ObjectExistsSyncException,
TransientSyncException,
@ -22,7 +21,7 @@ from authentik.lib.sync.outgoing.exceptions import (
from authentik.lib.utils.http import get_http_session
from authentik.providers.scim.clients.exceptions import SCIMRequestException
from authentik.providers.scim.clients.schema import ServiceProviderConfiguration
from authentik.providers.scim.models import SCIMCompatibilityMode, SCIMProvider
from authentik.providers.scim.models import SCIMProvider
if TYPE_CHECKING:
from django.db.models import Model
@ -55,8 +54,6 @@ class SCIMClient[TModel: "Model", TConnection: "Model", TSchema: "BaseModel"](
def _request(self, method: str, path: str, **kwargs) -> dict:
"""Wrapper to send a request to the full URL"""
if self.provider.dry_run and method.upper() not in SAFE_METHODS:
raise DryRunRejected(f"{self.base_url}{path}", method, body=kwargs.get("json"))
try:
response = self._session.request(
method,
@ -90,14 +87,9 @@ class SCIMClient[TModel: "Model", TConnection: "Model", TSchema: "BaseModel"](
"""Get Service provider config"""
default_config = ServiceProviderConfiguration.default()
try:
config = ServiceProviderConfiguration.model_validate(
return ServiceProviderConfiguration.model_validate(
self._request("GET", "/ServiceProviderConfig")
)
if self.provider.compatibility_mode == SCIMCompatibilityMode.AWS:
config.patch.supported = False
if self.provider.compatibility_mode == SCIMCompatibilityMode.SLACK:
config.filter.supported = True
return config
except (ValidationError, SCIMRequestException, NotFoundSyncException) as exc:
self.logger.warning("failed to get ServiceProviderConfig", exc=exc)
return default_config
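The branch present on only one side of this hunk post-processes the fetched ServiceProviderConfig according to the provider's compatibility mode: patch support is forced off for the AWS mode and filter support forced on for the Slack mode. A minimal sketch of that override with a trimmed-down stand-in for the real ServiceProviderConfiguration schema:

from dataclasses import dataclass, field


@dataclass
class SupportFlag:
    supported: bool = False


@dataclass
class ProviderConfig:
    """Simplified stand-in for the SCIM ServiceProviderConfiguration schema."""
    patch: SupportFlag = field(default_factory=SupportFlag)
    filter: SupportFlag = field(default_factory=SupportFlag)


def apply_compatibility(config: ProviderConfig, mode: str) -> ProviderConfig:
    # Override what the remote endpoint advertises, per vendor compatibility mode.
    if mode == "aws":
        config.patch.supported = False
    if mode == "slack":
        config.filter.supported = True
    return config


cfg = apply_compatibility(ProviderConfig(patch=SupportFlag(True)), mode="aws")
print(cfg.patch.supported)  # False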

View File

@ -102,7 +102,7 @@ class SCIMGroupClient(SCIMClient[Group, SCIMProviderGroup, SCIMGroupSchema]):
if not scim_id or scim_id == "":
raise StopSync("SCIM Response with missing or invalid `id`")
connection = SCIMProviderGroup.objects.create(
provider=self.provider, group=group, scim_id=scim_id, attributes=response
provider=self.provider, group=group, scim_id=scim_id
)
users = list(group.users.order_by("id").values_list("id", flat=True))
self._patch_add_users(connection, users)
@ -243,10 +243,9 @@ class SCIMGroupClient(SCIMClient[Group, SCIMProviderGroup, SCIMGroupSchema]):
if user.value not in users_should:
users_to_remove.append(user.value)
# Check users that should be in the group and add them
if current_group.members is not None:
for user in users_should:
if len([x for x in current_group.members if x.value == user]) < 1:
users_to_add.append(user)
for user in users_should:
if len([x for x in current_group.members if x.value == user]) < 1:
users_to_add.append(user)
# Only send request if we need to make changes
if len(users_to_add) < 1 and len(users_to_remove) < 1:
return
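One side of this hunk guards against the remote group reporting no members at all (current_group.members is None) before computing which users still need to be added. A set-based standalone sketch of the same add/remove computation:

def member_changes(current_members, should_be_members):
    """Compute which member IDs to add and which to remove, tolerating a missing member list."""
    current = set(current_members or [])   # the remote may omit "members" entirely
    should = set(should_be_members)
    return sorted(should - current), sorted(current - should)  # (users_to_add, users_to_remove)


print(member_changes(None, [1, 2]))    # ([1, 2], [])
print(member_changes([1, 3], [1, 2]))  # ([2], [3])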

View File

@ -1,12 +1,10 @@
"""User client"""
from django.db import transaction
from django.utils.http import urlencode
from pydantic import ValidationError
from authentik.core.models import User
from authentik.lib.sync.mapper import PropertyMappingManager
from authentik.lib.sync.outgoing.exceptions import ObjectExistsSyncException, StopSync
from authentik.lib.sync.outgoing.exceptions import StopSync
from authentik.policies.utils import delete_none_values
from authentik.providers.scim.clients.base import SCIMClient
from authentik.providers.scim.clients.schema import SCIM_USER_SCHEMA
@ -57,44 +55,24 @@ class SCIMUserClient(SCIMClient[User, SCIMProviderUser, SCIMUserSchema]):
def create(self, user: User):
"""Create user from scratch and create a connection object"""
scim_user = self.to_schema(user, None)
with transaction.atomic():
try:
response = self._request(
"POST",
"/Users",
json=scim_user.model_dump(
mode="json",
exclude_unset=True,
),
)
except ObjectExistsSyncException as exc:
if not self._config.filter.supported:
raise exc
users = self._request(
"GET", f"/Users?{urlencode({'filter': f'userName eq {scim_user.userName}'})}"
)
users_res = users.get("Resources", [])
if len(users_res) < 1:
raise exc
return SCIMProviderUser.objects.create(
provider=self.provider,
user=user,
scim_id=users_res[0]["id"],
attributes=users_res[0],
)
else:
scim_id = response.get("id")
if not scim_id or scim_id == "":
raise StopSync("SCIM Response with missing or invalid `id`")
return SCIMProviderUser.objects.create(
provider=self.provider, user=user, scim_id=scim_id, attributes=response
)
response = self._request(
"POST",
"/Users",
json=scim_user.model_dump(
mode="json",
exclude_unset=True,
),
)
scim_id = response.get("id")
if not scim_id or scim_id == "":
raise StopSync("SCIM Response with missing or invalid `id`")
return SCIMProviderUser.objects.create(provider=self.provider, user=user, scim_id=scim_id)
def update(self, user: User, connection: SCIMProviderUser):
"""Update existing user"""
scim_user = self.to_schema(user, connection)
scim_user.id = connection.scim_id
response = self._request(
self._request(
"PUT",
f"/Users/{connection.scim_id}",
json=scim_user.model_dump(
@ -102,5 +80,3 @@ class SCIMUserClient(SCIMClient[User, SCIMProviderUser, SCIMUserSchema]):
exclude_unset=True,
),
)
connection.attributes = response
connection.save()
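The larger create() variant in this file's diff copes with the remote system answering "user already exists": if the endpoint supports filtering, it looks the user up by userName and links the existing resource instead of failing, and it also stores the raw SCIM attributes on the connection object. A standalone sketch of the create-or-adopt part, with post_user and find_users as hypothetical stand-ins for the real SCIM requests:

class ObjectExists(Exception):
    """The remote system reports that the object already exists."""


def create_or_adopt(user_name: str, post_user, find_users, filter_supported: bool) -> dict:
    """Create the remote user, or adopt the existing resource when the POST conflicts."""
    try:
        return post_user(user_name)                      # normal path: POST /Users
    except ObjectExists:
        if not filter_supported:
            raise                                        # cannot look the user up, give up
        matches = find_users(f"userName eq {user_name}")  # GET /Users?filter=...
        if not matches:
            raise
        return matches[0]                                # adopt the existing remote resource


def post_user(name):      # pretend the remote already has this user
    raise ObjectExists()

def find_users(_filter):  # pretend the filter query finds the existing resource
    return [{"id": "abc123", "userName": "demo"}]

print(create_or_adopt("demo", post_user, find_users, filter_supported=True)["id"])  # abc123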

View File

@ -1,21 +0,0 @@
# Generated by Django 5.0.12 on 2025-02-24 19:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_providers_scim", "0010_scimprovider_verify_certificates"),
]
operations = [
migrations.AddField(
model_name="scimprovider",
name="dry_run",
field=models.BooleanField(
default=False,
help_text="When enabled, provider will not modify or create objects in the remote system.",
),
),
]

View File

@ -1,24 +0,0 @@
# Generated by Django 5.0.12 on 2025-03-07 23:35
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_providers_scim", "0011_scimprovider_dry_run"),
]
operations = [
migrations.AddField(
model_name="scimprovider",
name="compatibility_mode",
field=models.CharField(
choices=[("default", "Default"), ("aws", "AWS"), ("slack", "Slack")],
default="default",
help_text="Alter authentik behavior for vendor-specific SCIM implementations.",
max_length=30,
verbose_name="SCIM Compatibility Mode",
),
),
]

View File

@ -1,23 +0,0 @@
# Generated by Django 5.0.13 on 2025-03-18 13:47
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_providers_scim", "0012_scimprovider_compatibility_mode"),
]
operations = [
migrations.AddField(
model_name="scimprovidergroup",
name="attributes",
field=models.JSONField(default=dict),
),
migrations.AddField(
model_name="scimprovideruser",
name="attributes",
field=models.JSONField(default=dict),
),
]

View File

@ -22,7 +22,6 @@ class SCIMProviderUser(SerializerModel):
scim_id = models.TextField()
user = models.ForeignKey(User, on_delete=models.CASCADE)
provider = models.ForeignKey("SCIMProvider", on_delete=models.CASCADE)
attributes = models.JSONField(default=dict)
@property
def serializer(self) -> type[Serializer]:
@ -44,7 +43,6 @@ class SCIMProviderGroup(SerializerModel):
scim_id = models.TextField()
group = models.ForeignKey(Group, on_delete=models.CASCADE)
provider = models.ForeignKey("SCIMProvider", on_delete=models.CASCADE)
attributes = models.JSONField(default=dict)
@property
def serializer(self) -> type[Serializer]:
@ -59,14 +57,6 @@ class SCIMProviderGroup(SerializerModel):
return f"SCIM Provider Group {self.group_id} to {self.provider_id}"
class SCIMCompatibilityMode(models.TextChoices):
"""SCIM compatibility mode"""
DEFAULT = "default", _("Default")
AWS = "aws", _("AWS")
SLACK = "slack", _("Slack")
class SCIMProvider(OutgoingSyncProvider, BackchannelProvider):
"""SCIM 2.0 provider to create users and groups in external applications"""
@ -87,14 +77,6 @@ class SCIMProvider(OutgoingSyncProvider, BackchannelProvider):
help_text=_("Property mappings used for group creation/updating."),
)
compatibility_mode = models.CharField(
max_length=30,
choices=SCIMCompatibilityMode.choices,
default=SCIMCompatibilityMode.DEFAULT,
verbose_name=_("SCIM Compatibility Mode"),
help_text=_("Alter authentik behavior for vendor-specific SCIM implementations."),
)
@property
def icon_url(self) -> str | None:
return static("authentik/sources/scim.png")

View File

@ -3,15 +3,12 @@
from json import loads
from django.test import TestCase
from django.utils.text import slugify
from jsonschema import validate
from requests_mock import Mocker
from authentik.blueprints.tests import apply_blueprint
from authentik.core.models import Application, Group, User
from authentik.events.models import SystemTask
from authentik.lib.generators import generate_id
from authentik.lib.sync.outgoing.base import SAFE_METHODS
from authentik.providers.scim.models import SCIMMapping, SCIMProvider
from authentik.providers.scim.tasks import scim_sync, sync_tasks
from authentik.tenants.models import Tenant
@ -333,59 +330,3 @@ class SCIMUserTests(TestCase):
"userName": uid,
},
)
def test_user_create_dry_run(self):
"""Test user creation (dry_run)"""
# Update the provider before we start mocking as saving the provider triggers a full sync
self.provider.dry_run = True
self.provider.save()
with Mocker() as mock:
scim_id = generate_id()
mock.get(
"https://localhost/ServiceProviderConfig",
json={},
)
mock.post(
"https://localhost/Users",
json={
"id": scim_id,
},
)
uid = generate_id()
User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
self.assertEqual(mock.call_count, 1, mock.request_history)
self.assertEqual(mock.request_history[0].method, "GET")
def test_sync_task_dry_run(self):
"""Test sync tasks"""
# Update the provider before we start mocking as saving the provider triggers a full sync
self.provider.dry_run = True
self.provider.save()
with Mocker() as mock:
uid = generate_id()
mock.get(
"https://localhost/ServiceProviderConfig",
json={},
)
User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
sync_tasks.trigger_single_task(self.provider, scim_sync).get()
self.assertEqual(mock.call_count, 3)
for request in mock.request_history:
self.assertIn(request.method, SAFE_METHODS)
task = SystemTask.objects.filter(uid=slugify(self.provider.name)).first()
self.assertIsNotNone(task)
drop_msg = task.messages[2]
self.assertEqual(drop_msg["event"], "Dropping mutating request due to dry run")
self.assertIsNotNone(drop_msg["attributes"]["url"])
self.assertIsNotNone(drop_msg["attributes"]["body"])
self.assertIsNotNone(drop_msg["attributes"]["method"])

View File

@ -16,7 +16,6 @@ from authentik.lib.config import CONFIG, django_db_config, redis_url
from authentik.lib.logging import get_logger_config, structlog_configure
from authentik.lib.sentry import sentry_init
from authentik.lib.utils.reflection import get_env
from authentik.lib.utils.time import timedelta_from_string
from authentik.stages.password import BACKEND_APP_PASSWORD, BACKEND_INBUILT, BACKEND_LDAP
BASE_DIR = Path(__file__).absolute().parent.parent.parent
@ -243,9 +242,6 @@ SESSION_CACHE_ALIAS = "default"
# Configured via custom SessionMiddleware
# SESSION_COOKIE_SAMESITE = "None"
# SESSION_COOKIE_SECURE = True
SESSION_COOKIE_AGE = timedelta_from_string(
CONFIG.get("sessions.unauthenticated_age", "days=1")
).total_seconds()
SESSION_EXPIRE_AT_BROWSER_CLOSE = True
MESSAGE_STORAGE = "authentik.root.messages.storage.ChannelsStorage"

View File

@ -99,7 +99,6 @@ class LDAPSourceSerializer(SourceSerializer):
"sync_groups",
"sync_parent_group",
"connectivity",
"lookup_groups_from_user",
]
extra_kwargs = {"bind_password": {"write_only": True}}
@ -135,7 +134,6 @@ class LDAPSourceViewSet(UsedByMixin, ModelViewSet):
"sync_parent_group",
"user_property_mappings",
"group_property_mappings",
"lookup_groups_from_user",
]
search_fields = ["name", "slug"]
ordering = ["name"]

Some files were not shown because too many files have changed in this diff.