Compare commits


47 Commits

Author SHA1 Message Date
59401e07ad release: 2024.12.5 2025-04-08 14:54:17 -03:00
4008f0f28f Revert "core: fix non-exploitable open redirect (#13696)" (#13827) 2025-04-08 19:16:35 +02:00
210fbb0067 release: 2024.12.4 2025-03-28 14:48:01 +01:00
9c081d084d security: fix CVE-2025-29928 (cherry-pick #13695) (#13701)
security: fix CVE-2025-29928 (#13695)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-03-28 14:45:42 +01:00
c7c75dc195 core: fix non-exploitable open redirect (#13696)
discovered by @dominic-r

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	authentik/core/sources/flow_manager.py
2025-03-28 14:19:55 +01:00
5448dc93db flows: fix inspector permission check (#12907)
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-02-01 01:54:55 +01:00
6d152bcc60 core: fix generic sources not being fetchable by pk (#12896)
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-02-01 00:40:53 +01:00
f1b7a9f934 release: 2024.12.3 2025-01-29 21:47:30 +01:00
4af75d0979 ci: fix missing dockerhub login
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-01-29 21:47:23 +01:00
af0a314e0b ci: fix permissions for release-publish pipeline
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-01-29 19:24:17 +01:00
6c7f901220 ci: fix test_docker.sh (cherry-pick #12880) (#12881)
ci: fix test_docker.sh (#12880)

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-01-29 18:52:54 +01:00
3570bfa39d ci: fix test_docker.sh (cherry-pick #12878) (#12879)
ci: fix test_docker.sh (#12878)

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-01-29 18:42:20 +01:00
35ab51e4c5 ci: fix test_docker.sh failing due to empty .env (cherry-pick #12876) (#12877)
ci: fix test_docker.sh failing due to empty .env (#12876)

Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-01-29 18:33:04 +01:00
22eaf97d62 ci: fix test_docker.sh failing due to missing .env (cherry-pick #12873) (#12874)
ci: fix test_docker.sh failing due to missing .env (#12873)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-29 17:10:29 +01:00
764b211bd4 lifecycle: better pre release test (cherry-pick #12806) (#12808)
lifecycle: better pre release test (#12806)

* move pre-release docker test to script
* set pipefail in ak
* don't reinstall wheels since they don't exist anymore
* fix image
* fix config error on startup

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-25 02:28:23 +01:00
7afc59d691 rbac: exclude permissions for internal models (cherry-pick #12803) (#12807)
rbac: exclude permissions for internal models (#12803)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-25 02:07:16 +01:00
349572bfe4 flows: clear flow state before redirecting to final URL (cherry-pick #12788) (#12801)
flows: clear flow state before redirecting to final URL (#12788)

* providers/oauth2: clear flow state before redirecting to final URL
* make flow executor invocation correct
* actually we can do this centrally
* make sure the state is really clean

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-24 18:40:02 +01:00
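The bullets above describe moving the cleanup out of the OAuth2 provider and into the flow executor itself, so every flow ends with a clean session. A minimal sketch of the idea, assuming Django sessions and an illustrative "authentik/flows/" key prefix; the real key names and executor hook are not visible in this compare:

# Sketch only (not authentik's actual executor code): drop flow-related session
# state before handing the browser its final redirect, so a stale plan cannot
# leak into a later flow run. The key prefix is an assumption for illustration.
from django.http import HttpRequest, HttpResponseRedirect

FLOW_SESSION_PREFIX = "authentik/flows/"  # assumed, illustrative only

def redirect_with_clean_state(request: HttpRequest, final_url: str) -> HttpResponseRedirect:
    stale_keys = [key for key in request.session.keys() if key.startswith(FLOW_SESSION_PREFIX)]
    for key in stale_keys:
        del request.session[key]
    return HttpResponseRedirect(final_url)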
bef55bc3a5 core: fix permissions for admin device listing (cherry-pick #12787) (#12791)
core: fix permissions for admin device listing (#12787)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-24 15:05:23 +01:00
d86da24c01 lifecycle: update python to 3.12.8 (cherry-pick #12783) (#12786)
lifecycle: update python to 3.12.8 (#12783)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-23 23:37:48 +01:00
25d0ee02a8 core: fix application entitlements not creatable with blueprints (cherry-pick #12673) (#12784)
core: fix application entitlements not creatable with blueprints (#12673)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-23 16:55:16 +01:00
bccfb0b48c sources: allow uuid or slug to be used for retrieving a source (2024.12 fix) (#12772)
sources: allow uuid or slug to be used for retrieving a source

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-01-23 12:26:48 +01:00
4b14eca2da ci: fix missing build args for dev and release (cherry-pick #12760) (#12761)
ci: fix missing build args for dev and release (#12760)

* ci: fix missing build args for dev and release
* fix?

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-22 14:32:15 +01:00
7d5cfb6356 lifecycle: fix cryptography's OpenSSL path (cherry-pick #12753) (#12759)
lifecycle: fix cryptography's OpenSSL path (#12753)

* lifecycle: make it work
* sigh
* I don't know why this works but it works

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-22 02:32:44 +01:00
7ce46ccbe0 stages/redirect: fix query parameter when redirecting to flow (cherry-pick #12750) (#12752)
stages/redirect: fix query parameter when redirecting to flow (#12750)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-21 18:07:59 +01:00
d0217c9135 lifecycle: build binary dependencies which link against SSL directly (#12724)
* lifecycle: install binary dependencies in dockerfile directly

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* install ua-parser-builtins manually as its only distributed as binary

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* build duo_client from scratch, sigh

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* deps for kadmin

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* ok fine

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* run on arm runner?

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix yaml format

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* rewrite release pipeline to use re-usable workflows

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix typo

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* re-usable multi-arch build?

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* also add suffix for amd64

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* parameterise image name

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* re-use workflow for CI images...?

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix missing checkout

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* inherit secrets

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* temp build directly

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* get cache-to from python script

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* better name?

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* matrix for merging images?

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* re-add build dep

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* use multi-image tag

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* include arch in buildcache

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	.github/workflows/ci-main.yml
#	.github/workflows/release-publish.yml
2025-01-21 15:39:32 +01:00
d82ba344d9 ci: release: fix AWS cfn template permissions (cherry-pick #12576) (#12739)
ci: release: fix AWS cfn template permissions (#12576)

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-01-20 17:28:36 +01:00
01959132e8 enterprise/rac: Improve client connection status & bugfixes (cherry-pick #12684) (#12727)
enterprise/rac: Improve client connection status & bugfixes (#12684)

* enterprise/rac: improve status message when connecting/connection failed
* set fixed DPI
* automatically set resize method for RDP

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-17 18:27:31 +01:00
9d81f0598c release: 2024.12.2 2025-01-09 17:43:00 +01:00
cbe429f3fa providers/saml: fix invalid SAML Response when assertion and response are signed (cherry-pick #12611) (#12613)
providers/saml: fix invalid SAML Response when assertion and response are signed (#12611)

* providers/saml: fix invalid SAML Response when assertion and response are signed
* validate against schema too

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-09 16:20:49 +01:00
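The second bullet adds a schema check on top of the signing fix. A small sketch of that kind of validation with lxml, assuming a local copy of the SAML 2.0 protocol XSD (plus the assertion/dsig schemas it imports) is available; the file path and function are placeholders, not authentik's test code:

# Sketch: reject a serialized SAML Response that does not conform to the
# protocol schema. Paths are placeholders; the XSD must be available locally.
from lxml import etree

def validate_saml_response(saml_response_xml: bytes, xsd_path: str = "saml-schema-protocol-2.0.xsd") -> None:
    schema = etree.XMLSchema(etree.parse(xsd_path))
    document = etree.fromstring(saml_response_xml)
    if not schema.validate(document):
        raise ValueError(str(schema.error_log.last_error))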
1cf0f57608 core: fix error when creating new user with default path (cherry-pick #12609) (#12612)
core: fix error when creating new user with default path (#12609)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-01-09 15:32:18 +01:00
052da72acf rbac: permissions endpoint: allow authenticated users (cherry-pick #12608) (#12610)
rbac: permissions endpoint: allow authenticated users (#12608)

Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-01-09 15:01:54 +01:00
9a1c76efe7 sources/kerberos: authenticate with the user's username instead of the first username in authentik (cherry-pick #12497) (#12579)
sources/kerberos: authenticate with the user's username instead of the first username in authentik (#12497)

Co-authored-by: natural-hair <github@natural-hair.net>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-01-06 15:22:15 +01:00
96b5bee912 web: fix source selection and outpost integration health (#12530)
* fix source selector

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix service connection health not updating fully

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix logo alt not translated

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	web/src/admin/AdminInterface/AboutModal.ts
2025-01-03 01:38:29 +01:00
09b3a1d0bd internal: fix missing trailing slash in outpost websocket (cherry-pick #12470) (#12471)
internal: fix missing trailing slash in outpost websocket (#12470)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2024-12-24 00:56:31 +01:00
e87a17fd81 release: 2024.12.1 2024-12-23 14:08:59 +01:00
bb1bcb29cd internal: fix URL generation for websocket connection (cherry-pick #12439) (#12440)
internal: fix URL generation for websocket connection (#12439)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2024-12-20 20:08:25 +01:00
0a5bdad972 website/docs: add content about bindings (cherry-pick #11787) (#12428)
website/docs: add content about bindings (#11787)

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.com>
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2024-12-19 20:36:29 +01:00
d313225956 website/docs: add new section about impersonation (cherry-pick #12328) (#12424)
website/docs: add new section about impersonation (#12328)

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.com>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2024-12-19 19:59:44 +01:00
249dc276d4 release: 2024.12.0 2024-12-19 19:18:31 +01:00
5fb7dc4cb3 website/docs: prepare for 2024.12.0 (cherry-pick #12420) (#12422)
website/docs: prepare for 2024.12.0 (#12420)

Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2024-12-19 19:18:03 +01:00
82930ee807 root: expose CONN_MAX_AGE, CONN_HEALTH_CHECKS and DISABLE_SERVER_SIDE_CURSORS for PostgreSQL config (cherry-pick #10159) (#12419)
root: expose CONN_MAX_AGE, CONN_HEALTH_CHECKS and DISABLE_SERVER_SIDE_CURSORS for PostgreSQL config (#10159)

Co-authored-by: Tomás Farías Santana <tomas@tomasfarias.dev>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Tana M Berry <tana@goauthentik.com>
2024-12-19 19:01:06 +01:00
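The three options named in this commit map directly onto Django's PostgreSQL database settings; authentik only surfaces them through its own configuration. A sketch of what they control at the Django layer, with illustrative values rather than authentik's environment-variable wiring:

# Django settings sketch, illustrative values only. In authentik these are fed
# from its PostgreSQL config keys rather than hard-coded like this.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "authentik",
        "CONN_MAX_AGE": 60,  # keep connections open for 60s; 0 closes per request, None keeps them indefinitely
        "CONN_HEALTH_CHECKS": True,  # ping a persistent connection before reuse (Django 4.1+)
        "DISABLE_SERVER_SIDE_CURSORS": True,  # required behind transaction-pooling PgBouncer
    }
}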
ac25fbab54 events: notification_cleanup: avoid unnecessary loop (cherry-pick #12417) (#12418)
events: notification_cleanup: avoid unnecessary loop (#12417)

Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2024-12-19 18:49:30 +01:00
15cb6b18f6 translate: Updates for file web/xliff/en.xlf in zh_CN (cherry-pick #12402) (#12411)
* translate: Updates for file web/xliff/en.xlf in zh_CN (#12402)

Translate web/xliff/en.xlf in zh_CN

100% translated source file: 'web/xliff/en.xlf'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>

* ci: dont run codeql on cherry-picked prs

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2024-12-19 13:17:46 +01:00
fdd39b4b4c translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (cherry-pick #12399) (#12408)
translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#12399)

Translate locale/en/LC_MESSAGES/django.po in zh_CN

100% translated source file: 'locale/en/LC_MESSAGES/django.po'
on 'zh_CN'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 12:43:26 +01:00
589304df4f translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (cherry-pick #12400) (#12409)
translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#12400)

Translate django.po in zh-Hans

100% translated source file: 'django.po'
on 'zh-Hans'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 12:43:16 +01:00
4d920ff477 translate: Updates for file web/xliff/en.xlf in zh-Hans (cherry-pick #12401) (#12410)
translate: Updates for file web/xliff/en.xlf in zh-Hans (#12401)

Translate web/xliff/en.xlf in zh-Hans

100% translated source file: 'web/xliff/en.xlf'
on 'zh-Hans'.

Co-authored-by: transifex-integration[bot] <43880903+transifex-integration[bot]@users.noreply.github.com>
2024-12-19 12:43:04 +01:00
88dc616c5e release: 2024.12.0-rc1 2024-12-18 19:35:21 +01:00
546 changed files with 17099 additions and 47688 deletions

View File

@ -1,16 +1,16 @@
[bumpversion]
current_version = 2024.12.3
current_version = 2024.12.5
tag = True
commit = True
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?
serialize =
serialize =
{major}.{minor}.{patch}-{rc_t}{rc_n}
{major}.{minor}.{patch}
message = release: {new_version}
tag_name = version/{new_version}
[bumpversion:part:rc_t]
values =
values =
rc
final
optional_value = final
@ -31,4 +31,4 @@ optional_value = final
[bumpversion:file:web/src/common/constants.ts]
[bumpversion:file:lifecycle/aws/template.yaml]
[bumpversion:file:website/docs/install-config/install/aws/template.yaml]
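The parse pattern in this file is an ordinary Python regular expression describing the version grammar that serialize re-emits. Restated as a standalone sketch (written with single backslashes, example versions taken from the commit list above):

# Sketch: the version grammar from the bumpversion parse/serialize pair above.
import re

VERSION_RE = re.compile(
    r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
    r"(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\d*))?"
)

for version in ("2024.12.5", "2024.12.0-rc1"):
    print(version, VERSION_RE.fullmatch(version).groupdict())
# 2024.12.5 {'major': '2024', 'minor': '12', 'patch': '5', 'rc_t': None, 'rc_n': None}
# 2024.12.0-rc1 {'major': '2024', 'minor': '12', 'patch': '0', 'rc_t': 'rc', 'rc_n': '1'}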

View File

@ -82,16 +82,6 @@ updates:
docusaurus:
patterns:
- "@docusaurus/*"
- package-ecosystem: npm
directory: "/lifecycle/aws"
schedule:
interval: daily
time: "04:00"
open-pull-requests-limit: 10
commit-message:
prefix: "lifecycle/aws:"
labels:
- dependencies
- package-ecosystem: pip
directory: "/"
schedule:

View File

@ -40,7 +40,7 @@ jobs:
attestations: write
steps:
- uses: actions/checkout@v4
- uses: docker/setup-qemu-action@v3.4.0
- uses: docker/setup-qemu-action@v3.3.0
- uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
@ -77,7 +77,7 @@ jobs:
id: push
with:
context: .
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
push: true
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
@ -89,7 +89,6 @@ jobs:
cache-to: ${{ steps.ev.outputs.cacheTo }}
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}

View File

@ -46,7 +46,6 @@ jobs:
- build-server-arm64
outputs:
tags: ${{ steps.ev.outputs.imageTagsJSON }}
shouldPush: ${{ steps.ev.outputs.shouldPush }}
steps:
- uses: actions/checkout@v4
- name: prepare variables
@ -58,7 +57,6 @@ jobs:
image-name: ${{ inputs.image_name }}
merge-server:
runs-on: ubuntu-latest
if: ${{ needs.get-tags.outputs.shouldPush == 'true' }}
needs:
- get-tags
- build-server-amd64

View File

@ -25,10 +25,10 @@ jobs:
uses: ./.github/actions/setup
- uses: actions/setup-node@v4
with:
node-version-file: lifecycle/aws/package.json
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: lifecycle/aws/package-lock.json
- working-directory: lifecycle/aws/
cache-dependency-path: website/package-lock.json
- working-directory: website/
run: |
npm ci
- name: Check changes have been applied

View File

@ -1,28 +0,0 @@
---
name: authentik-ci-main-daily
on:
workflow_dispatch:
schedule:
# Every night at 3am
- cron: "0 3 * * *"
jobs:
test-container:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
version:
- docs
- version-2024-12
- version-2024-10
steps:
- uses: actions/checkout@v4
- run: |
current="$(pwd)"
dir="/tmp/authentik/${{ matrix.version }}"
mkdir -p $dir
cd $dir
wget https://${{ matrix.version }}.goauthentik.io/docker-compose.yml
${current}/scripts/test_docker.sh

View File

@ -43,26 +43,15 @@ jobs:
uses: ./.github/actions/setup
- name: run migrations
run: poetry run python -m lifecycle.migrate
test-make-seed:
runs-on: ubuntu-latest
steps:
- id: seed
run: |
echo "seed=$(printf "%d\n" "0x$(openssl rand -hex 4)")" >> "$GITHUB_OUTPUT"
outputs:
seed: ${{ steps.seed.outputs.seed }}
test-migrations-from-stable:
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest
timeout-minutes: 20
needs: test-make-seed
strategy:
fail-fast: false
matrix:
psql:
- 15-alpine
- 16-alpine
run_id: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@v4
with:
@ -104,23 +93,18 @@ jobs:
env:
# Test in the main database that we just migrated from the previous stable version
AUTHENTIK_POSTGRESQL__TEST__NAME: authentik
CI_TEST_SEED: ${{ needs.test-make-seed.outputs.seed }}
CI_RUN_ID: ${{ matrix.run_id }}
CI_TOTAL_RUNS: "5"
run: |
poetry run make ci-test
poetry run make test
test-unittest:
name: test-unittest - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
name: test-unittest - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest
timeout-minutes: 20
needs: test-make-seed
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
psql:
- 15-alpine
- 16-alpine
run_id: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@v4
- name: Setup authentik env
@ -128,12 +112,9 @@ jobs:
with:
postgresql_version: ${{ matrix.psql }}
- name: run unittest
env:
CI_TEST_SEED: ${{ needs.test-make-seed.outputs.seed }}
CI_RUN_ID: ${{ matrix.run_id }}
CI_TOTAL_RUNS: "5"
run: |
poetry run make ci-test
poetry run make test
poetry run coverage xml
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:
@ -153,7 +134,7 @@ jobs:
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Create k8s Kind Cluster
uses: helm/kind-action@v1.12.0
uses: helm/kind-action@v1.11.0
- name: run integration
run: |
poetry run coverage run manage.py test tests/integration
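One side of this workflow diff generates a single random seed in a test-make-seed job and fans the unit tests out across five matrix runs that all receive it as CI_TEST_SEED, so every shard uses the same randomized test order. A small sketch of the same seed derivation in Python, mirroring the openssl rand -hex 4 / printf "%d" pipeline shown above (illustrative, not part of the repository):

# Sketch: derive the shared CI seed as the job does, i.e. four random bytes
# rendered as a decimal integer, then reused by every matrix shard.
import secrets

def make_ci_seed() -> int:
    return int(secrets.token_hex(4), 16)  # equivalent of: printf "%d" "0x$(openssl rand -hex 4)"

print(make_ci_seed())  # e.g. 3203380617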

View File

@ -82,7 +82,7 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.4.0
uses: docker/setup-qemu-action@v3.2.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables

View File

@ -2,7 +2,7 @@ name: "CodeQL"
on:
push:
branches: [main, "*", next, version*]
branches: [main, next, version*]
pull_request:
branches: [main]
schedule:

View File

@ -42,7 +42,7 @@ jobs:
with:
go-version-file: "go.mod"
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.4.0
uses: docker/setup-qemu-action@v3.2.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables
@ -147,8 +147,8 @@ jobs:
aws-region: ${{ env.AWS_REGION }}
- name: Upload template
run: |
aws s3 cp --acl=public-read lifecycle/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.${{ github.ref }}.yaml
aws s3 cp --acl=public-read lifecycle/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.latest.yaml
aws s3 cp --acl=public-read website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.${{ github.ref }}.yaml
aws s3 cp --acl=public-read website/docs/install-config/install/aws/template.yaml s3://authentik-cloudformation-templates/authentik.ecs.latest.yaml
test-release:
needs:
- build-server

View File

@ -1,8 +1,8 @@
name: "authentik-repo-stale"
name: 'authentik-repo-stale'
on:
schedule:
- cron: "30 1 * * *"
- cron: '30 1 * * *'
workflow_dispatch:
permissions:
@ -25,7 +25,7 @@ jobs:
days-before-stale: 60
days-before-close: 7
exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question,status/reviewing
stale-issue-label: status/stale
stale-issue-label: wontfix
stale-issue-message: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you

.gitignore
View File

@ -209,6 +209,3 @@ source_docs/
### Golang ###
/vendor/
### Docker ###
docker-compose.override.yml

View File

@ -2,7 +2,6 @@
"recommendations": [
"bashmish.es6-string-css",
"bpruitt-goddard.mermaid-markdown-syntax-highlighting",
"charliermarsh.ruff",
"dbaeumer.vscode-eslint",
"EditorConfig.EditorConfig",
"esbenp.prettier-vscode",
@ -11,12 +10,12 @@
"Gruntfuggly.todo-tree",
"mechatroner.rainbow-csv",
"ms-python.black-formatter",
"ms-python.black-formatter",
"ms-python.debugpy",
"charliermarsh.ruff",
"ms-python.python",
"ms-python.vscode-pylance",
"ms-python.black-formatter",
"redhat.vscode-yaml",
"Tobermory.es6-string-html",
"unifiedjs.vscode-mdx",
"unifiedjs.vscode-mdx"
]
}

.vscode/launch.json
View File

@ -2,76 +2,26 @@
"version": "0.2.0",
"configurations": [
{
"name": "Debug: Attach Server Core",
"type": "debugpy",
"name": "Python: PDB attach Server",
"type": "python",
"request": "attach",
"connect": {
"host": "localhost",
"port": 9901
"port": 6800
},
"pathMappings": [
{
"localRoot": "${workspaceFolder}",
"remoteRoot": "."
}
],
"justMyCode": true,
"django": true
},
{
"name": "Debug: Attach Worker",
"type": "debugpy",
"name": "Python: PDB attach Worker",
"type": "python",
"request": "attach",
"connect": {
"host": "localhost",
"port": 9901
"port": 6900
},
"pathMappings": [
{
"localRoot": "${workspaceFolder}",
"remoteRoot": "."
}
],
"justMyCode": true,
"django": true
},
{
"name": "Debug: Start Server Router",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/server",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start LDAP Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/ldap",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start Proxy Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/proxy",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start RAC Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/rac",
"cwd": "${workspaceFolder}"
},
{
"name": "Debug: Start Radius Outpost",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/cmd/radius",
"cwd": "${workspaceFolder}"
}
]
}

View File

@ -15,7 +15,6 @@ go.mod @goauthentik/backend
go.sum @goauthentik/backend
# Infrastructure
.github/ @goauthentik/infrastructure
lifecycle/aws/ @goauthentik/infrastructure
Dockerfile @goauthentik/infrastructure
*Dockerfile @goauthentik/infrastructure
.dockerignore @goauthentik/infrastructure

View File

@ -155,12 +155,10 @@ WORKDIR /
# We cannot cache this layer otherwise we'll end up with a bigger image
RUN apt-get update && \
apt-get upgrade -y && \
# Required for runtime
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 libltdl7 libxslt1.1 && \
# Required for bootstrap & healtcheck
apt-get install -y --no-install-recommends runit && \
pip3 install --no-cache-dir --upgrade pip && \
apt-get clean && \
rm -rf /tmp/* /var/lib/apt/lists/* /var/tmp/ && \
adduser --system --no-create-home --uid 1000 --group --home /authentik authentik && \

View File

@ -5,9 +5,7 @@ PWD = $(shell pwd)
UID = $(shell id -u)
GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.npm_version)
PY_SOURCES = authentik tests scripts lifecycle .github
GO_SOURCES = cmd internal
WEB_SOURCES = web/src web/packages
PY_SOURCES = authentik tests scripts lifecycle .github website/docs/install-config/install/aws
DOCKER_IMAGE ?= "authentik:test"
GEN_API_TS = "gen-ts-api"
@ -21,12 +19,11 @@ pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null)
CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \
-I .github/codespell-words.txt \
-S 'web/src/locales/**' \
-S 'website/developer-docs/api/reference/**' \
-S '**/node_modules/**' \
-S '**/dist/**' \
$(PY_SOURCES) \
$(GO_SOURCES) \
$(WEB_SOURCES) \
-S 'website/docs/developer-docs/api/reference/**' \
authentik \
internal \
cmd \
web/src \
website/src \
website/blog \
website/docs \
@ -72,9 +69,6 @@ migrate: ## Run the Authentik Django server's migrations
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service
aws-cfn:
cd lifecycle/aws && npm run aws-cfn
core-i18n-extract:
ak makemessages \
--add-location file \
@ -146,7 +140,7 @@ gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescri
docker run \
--rm -v ${PWD}:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.11.0 generate \
docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \
-g typescript-fetch \
-o /local/${GEN_API_TS} \
@ -249,6 +243,9 @@ website-build:
website-watch: ## Build and watch the documentation website, updating automatically
cd website && npm run watch
aws-cfn:
cd website && npm run aws-cfn
#########################
## Docker
#########################
@ -258,7 +255,7 @@ docker: ## Build a docker image of the current source tree
DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE}
test-docker:
BUILD=true ./scripts/test_docker.sh
./scripts/test_docker.sh
#########################
## CI
@ -284,8 +281,3 @@ ci-bandit: ci--meta-debug
ci-pending-migrations: ci--meta-debug
ak makemigrations --check
ci-test: ci--meta-debug
coverage run manage.py test --keepdb --randomly-seed ${CI_TEST_SEED} authentik
coverage report
coverage xml

View File

@ -2,7 +2,7 @@
from os import environ
__version__ = "2024.12.3"
__version__ = "2024.12.5"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"
@ -16,5 +16,5 @@ def get_full_version() -> str:
"""Get full version, with build hash appended"""
version = __version__
if (build_hash := get_build_hash()) != "":
return f"{version}+{build_hash}"
version += "." + build_hash
return version
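The two sides of this hunk differ only in how the build hash is attached to the reported version: with a + separator or as an extra dot-separated component. A tiny illustration with a made-up hash value:

# Illustration only; the build hash value is made up.
version, build_hash = "2024.12.5", "abc1234"
print(f"{version}+{build_hash}")  # 2024.12.5+abc1234
print(f"{version}.{build_hash}")  # 2024.12.5.abc1234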

View File

@ -7,9 +7,7 @@ from sys import version as python_version
from typing import TypedDict
from cryptography.hazmat.backends.openssl.backend import backend
from django.conf import settings
from django.utils.timezone import now
from django.views.debug import SafeExceptionReporterFilter
from drf_spectacular.utils import extend_schema
from rest_framework.fields import SerializerMethodField
from rest_framework.request import Request
@ -54,16 +52,10 @@ class SystemInfoSerializer(PassiveSerializer):
def get_http_headers(self, request: Request) -> dict[str, str]:
"""Get HTTP Request headers"""
headers = {}
raw_session = request._request.COOKIES.get(settings.SESSION_COOKIE_NAME)
for key, value in request.META.items():
if not isinstance(value, str):
continue
actual_value = value
if raw_session in actual_value:
actual_value = actual_value.replace(
raw_session, SafeExceptionReporterFilter.cleansed_substitute
)
headers[key] = actual_value
headers[key] = value
return headers
def get_http_host(self, request: Request) -> str:

View File

@ -1,16 +1,12 @@
"""authentik administration overview"""
from socket import gethostname
from django.conf import settings
from drf_spectacular.utils import extend_schema, inline_serializer
from packaging.version import parse
from rest_framework.fields import BooleanField, CharField
from rest_framework.fields import IntegerField
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.views import APIView
from authentik import get_full_version
from authentik.rbac.permissions import HasPermission
from authentik.root.celery import CELERY_APP
@ -20,38 +16,11 @@ class WorkerView(APIView):
permission_classes = [HasPermission("authentik_rbac.view_system_info")]
@extend_schema(
responses=inline_serializer(
"Worker",
fields={
"worker_id": CharField(),
"version": CharField(),
"version_matching": BooleanField(),
},
many=True,
)
)
@extend_schema(responses=inline_serializer("Workers", fields={"count": IntegerField()}))
def get(self, request: Request) -> Response:
"""Get currently connected worker count."""
raw: list[dict[str, dict]] = CELERY_APP.control.ping(timeout=0.5)
our_version = parse(get_full_version())
response = []
for worker in raw:
key = list(worker.keys())[0]
version = worker[key].get("version")
version_matching = False
if version:
version_matching = parse(version) == our_version
response.append(
{"worker_id": key, "version": version, "version_matching": version_matching}
)
count = len(CELERY_APP.control.ping(timeout=0.5))
# In debug we run with `task_always_eager`, so tasks are ran on the main process
if settings.DEBUG: # pragma: no cover
response.append(
{
"worker_id": f"authentik-debug@{gethostname()}",
"version": get_full_version(),
"version_matching": True,
}
)
return Response(response)
count += 1
return Response({"count": count})

View File

@ -1,10 +1,11 @@
"""authentik admin app config"""
from prometheus_client import Info
from prometheus_client import Gauge, Info
from authentik.blueprints.apps import ManagedAppConfig
PROM_INFO = Info("authentik_version", "Currently running authentik version")
GAUGE_WORKERS = Gauge("authentik_admin_workers", "Currently connected workers")
class AuthentikAdminConfig(ManagedAppConfig):

View File

@ -1,35 +1,14 @@
"""admin signals"""
from django.dispatch import receiver
from packaging.version import parse
from prometheus_client import Gauge
from authentik import get_full_version
from authentik.admin.apps import GAUGE_WORKERS
from authentik.root.celery import CELERY_APP
from authentik.root.monitoring import monitoring_set
GAUGE_WORKERS = Gauge(
"authentik_admin_workers",
"Currently connected workers, their versions and if they are the same version as authentik",
["version", "version_matched"],
)
_version = parse(get_full_version())
@receiver(monitoring_set)
def monitoring_set_workers(sender, **kwargs):
"""Set worker gauge"""
raw: list[dict[str, dict]] = CELERY_APP.control.ping(timeout=0.5)
worker_version_count = {}
for worker in raw:
key = list(worker.keys())[0]
version = worker[key].get("version")
version_matching = False
if version:
version_matching = parse(version) == _version
worker_version_count.setdefault(version, {"count": 0, "matching": version_matching})
worker_version_count[version]["count"] += 1
for version, stats in worker_version_count.items():
GAUGE_WORKERS.labels(version, stats["matching"]).set(stats["count"])
count = len(CELERY_APP.control.ping(timeout=0.5))
GAUGE_WORKERS.set(count)

View File

@ -34,7 +34,7 @@ class TestAdminAPI(TestCase):
response = self.client.get(reverse("authentik_api:admin_workers"))
self.assertEqual(response.status_code, 200)
body = loads(response.content)
self.assertEqual(len(body), 0)
self.assertEqual(body["count"], 0)
def test_metrics(self):
"""Test metrics API"""

View File

@ -0,0 +1,67 @@
"""API Authorization"""
from django.conf import settings
from django.db.models import Model
from django.db.models.query import QuerySet
from django_filters.rest_framework import DjangoFilterBackend
from rest_framework.authentication import get_authorization_header
from rest_framework.filters import BaseFilterBackend
from rest_framework.permissions import BasePermission
from rest_framework.request import Request
from authentik.api.authentication import validate_auth
from authentik.rbac.filters import ObjectFilter
class OwnerFilter(BaseFilterBackend):
"""Filter objects by their owner"""
owner_key = "user"
def filter_queryset(self, request: Request, queryset: QuerySet, view) -> QuerySet:
if request.user.is_superuser:
return queryset
return queryset.filter(**{self.owner_key: request.user})
class SecretKeyFilter(DjangoFilterBackend):
"""Allow access to all objects when authenticated with secret key as token.
Replaces both DjangoFilterBackend and ObjectFilter"""
def filter_queryset(self, request: Request, queryset: QuerySet, view) -> QuerySet:
auth_header = get_authorization_header(request)
token = validate_auth(auth_header)
if token and token == settings.SECRET_KEY:
return queryset
queryset = ObjectFilter().filter_queryset(request, queryset, view)
return super().filter_queryset(request, queryset, view)
class OwnerPermissions(BasePermission):
"""Authorize requests by an object's owner matching the requesting user"""
owner_key = "user"
def has_permission(self, request: Request, view) -> bool:
"""If the user is authenticated, we allow all requests here. For listing, the
object-level permissions are done by the filter backend"""
return request.user.is_authenticated
def has_object_permission(self, request: Request, view, obj: Model) -> bool:
"""Check if the object's owner matches the currently logged in user"""
if not hasattr(obj, self.owner_key):
return False
owner = getattr(obj, self.owner_key)
if owner != request.user:
return False
return True
class OwnerSuperuserPermissions(OwnerPermissions):
"""Similar to OwnerPermissions, except always allow access for superusers"""
def has_object_permission(self, request: Request, view, obj: Model) -> bool:
if request.user.is_superuser:
return True
return super().has_object_permission(request, view, obj)

View File

@ -1,68 +0,0 @@
"""Test and debug Blueprints"""
import atexit
import readline
from pathlib import Path
from pprint import pformat
from sys import exit as sysexit
from textwrap import indent
from django.core.management.base import BaseCommand, no_translations
from structlog.stdlib import get_logger
from yaml import load
from authentik.blueprints.v1.common import BlueprintLoader, EntryInvalidError
from authentik.core.management.commands.shell import get_banner_text
from authentik.lib.utils.errors import exception_to_string
LOGGER = get_logger()
class Command(BaseCommand):
"""Test and debug Blueprints"""
lines = []
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
histfolder = Path("~").expanduser() / Path(".local/share/authentik")
histfolder.mkdir(parents=True, exist_ok=True)
histfile = histfolder / Path("blueprint_shell_history")
readline.parse_and_bind("tab: complete")
readline.parse_and_bind("set editing-mode vi")
try:
readline.read_history_file(str(histfile))
except FileNotFoundError:
pass
atexit.register(readline.write_history_file, str(histfile))
@no_translations
def handle(self, *args, **options):
"""Interactively debug blueprint files"""
self.stdout.write(get_banner_text("Blueprint shell"))
self.stdout.write("Type '.eval' to evaluate previously entered statement(s).")
def do_eval():
yaml_input = "\n".join([line for line in self.lines if line])
data = load(yaml_input, BlueprintLoader)
self.stdout.write(pformat(data))
self.lines = []
while True:
try:
line = input("> ")
if line == ".eval":
do_eval()
else:
self.lines.append(line)
except EntryInvalidError as exc:
self.stdout.write("Failed to evaluate expression:")
self.stdout.write(indent(exception_to_string(exc), prefix=" "))
except EOFError:
break
except KeyboardInterrupt:
self.stdout.write()
sysexit(0)
self.stdout.write()

View File

@ -126,7 +126,7 @@ class Command(BaseCommand):
def_name_perm = f"model_{model_path}_permissions"
def_path_perm = f"#/$defs/{def_name_perm}"
self.schema["$defs"][def_name_perm] = self.model_permissions(model)
template = {
return {
"type": "object",
"required": ["model", "identifiers"],
"properties": {
@ -143,11 +143,6 @@ class Command(BaseCommand):
"identifiers": {"$ref": def_path},
},
}
# Meta models don't require identifiers, as there's no matching database model to find
if issubclass(model, BaseMetaModel):
del template["properties"]["identifiers"]
template["required"].remove("identifiers")
return template
def field_to_jsonschema(self, field: Field) -> dict:
"""Convert a single field to json schema"""

View File

@ -202,9 +202,6 @@ class Blueprint:
class YAMLTag:
"""Base class for all YAML Tags"""
def __repr__(self) -> str:
return str(self.resolve(BlueprintEntry(""), Blueprint()))
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
"""Implement yaml tag logic"""
raise NotImplementedError

View File

@ -51,7 +51,6 @@ from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProviderUser,
)
from authentik.enterprise.providers.rac.models import ConnectionToken
from authentik.enterprise.providers.ssf.models import StreamEvent
from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import (
EndpointDevice,
EndpointDeviceConnection,
@ -132,7 +131,6 @@ def excluded_models() -> list[type[Model]]:
EndpointDevice,
EndpointDeviceConnection,
DeviceToken,
StreamEvent,
)

View File

@ -14,10 +14,10 @@ from rest_framework.response import Response
from rest_framework.validators import UniqueValidator
from rest_framework.viewsets import ModelViewSet
from authentik.api.authorization import SecretKeyFilter
from authentik.brands.models import Brand
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer, PassiveSerializer
from authentik.rbac.filters import SecretKeyFilter
from authentik.tenants.utils import get_current_tenant

View File

@ -2,12 +2,16 @@
from typing import TypedDict
from django_filters.rest_framework import DjangoFilterBackend
from guardian.utils import get_anonymous_user
from rest_framework import mixins
from rest_framework.fields import SerializerMethodField
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.request import Request
from rest_framework.viewsets import GenericViewSet
from ua_parser import user_agent_parser
from authentik.api.authorization import OwnerSuperuserPermissions
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer
from authentik.core.models import AuthenticatedSession
@ -106,4 +110,11 @@ class AuthenticatedSessionViewSet(
search_fields = ["user__username", "last_ip", "last_user_agent"]
filterset_fields = ["user__username", "last_ip", "last_user_agent"]
ordering = ["user__username"]
owner_field = "user"
permission_classes = [OwnerSuperuserPermissions]
filter_backends = [DjangoFilterBackend, OrderingFilter, SearchFilter]
def get_queryset(self):
user = self.request.user if self.request else get_anonymous_user()
if user.is_superuser:
return super().get_queryset()
return super().get_queryset().filter(user=user.pk)

View File

@ -2,22 +2,26 @@
from collections.abc import Iterable
from django_filters.rest_framework import DjangoFilterBackend
from drf_spectacular.utils import OpenApiResponse, extend_schema
from rest_framework import mixins
from rest_framework.decorators import action
from rest_framework.fields import CharField, ReadOnlyField, SerializerMethodField
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.parsers import MultiPartParser
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import GenericViewSet
from structlog.stdlib import get_logger
from authentik.api.authorization import OwnerFilter, OwnerSuperuserPermissions
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT
from authentik.core.api.object_types import TypesMixin
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import MetaNameSerializer, ModelSerializer
from authentik.core.models import GroupSourceConnection, Source, UserSourceConnection
from authentik.core.types import UserSettingSerializer
from authentik.lib.api import MultipleFieldLookupMixin
from authentik.lib.utils.file import (
FilePathSerializer,
FileUploadSerializer,
@ -72,6 +76,7 @@ class SourceSerializer(ModelSerializer, MetaNameSerializer):
class SourceViewSet(
MultipleFieldLookupMixin,
TypesMixin,
mixins.RetrieveModelMixin,
mixins.DestroyModelMixin,
@ -84,8 +89,9 @@ class SourceViewSet(
queryset = Source.objects.none()
serializer_class = SourceSerializer
lookup_field = "slug"
lookup_fields = ["slug", "pbm_uuid"]
search_fields = ["slug", "name"]
filterset_fields = ["slug", "name", "managed", "pbm_uuid"]
filterset_fields = ["slug", "name", "managed"]
def get_queryset(self): # pragma: no cover
return Source.objects.select_subclasses()
@ -186,10 +192,11 @@ class UserSourceConnectionViewSet(
queryset = UserSourceConnection.objects.all()
serializer_class = UserSourceConnectionSerializer
permission_classes = [OwnerSuperuserPermissions]
filterset_fields = ["user", "source__slug"]
search_fields = ["source__slug"]
filter_backends = [OwnerFilter, DjangoFilterBackend, OrderingFilter, SearchFilter]
ordering = ["source__slug", "pk"]
owner_field = "user"
class GroupSourceConnectionSerializer(SourceSerializer):
@ -224,7 +231,8 @@ class GroupSourceConnectionViewSet(
queryset = GroupSourceConnection.objects.all()
serializer_class = GroupSourceConnectionSerializer
permission_classes = [OwnerSuperuserPermissions]
filterset_fields = ["group", "source__slug"]
search_fields = ["source__slug"]
filter_backends = [OwnerFilter, DjangoFilterBackend, OrderingFilter, SearchFilter]
ordering = ["source__slug", "pk"]
owner_field = "user"

View File

@ -3,15 +3,18 @@
from typing import Any
from django.utils.timezone import now
from django_filters.rest_framework import DjangoFilterBackend
from drf_spectacular.utils import OpenApiResponse, extend_schema, inline_serializer
from guardian.shortcuts import assign_perm, get_anonymous_user
from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet
from authentik.api.authorization import OwnerSuperuserPermissions
from authentik.blueprints.api import ManagedSerializer
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT
from authentik.core.api.used_by import UsedByMixin
@ -135,8 +138,8 @@ class TokenViewSet(UsedByMixin, ModelViewSet):
"managed",
]
ordering = ["identifier", "expires"]
owner_field = "user"
rbac_allow_create_without_perm = True
permission_classes = [OwnerSuperuserPermissions]
filter_backends = [DjangoFilterBackend, OrderingFilter, SearchFilter]
def get_queryset(self):
user = self.request.user if self.request else get_anonymous_user()

View File

@ -1,13 +1,14 @@
"""User API Views"""
from datetime import timedelta
from importlib import import_module
from json import loads
from typing import Any
from django.conf import settings
from django.contrib.auth import update_session_auth_hash
from django.contrib.auth.models import Permission
from django.contrib.sessions.backends.cache import KEY_PREFIX
from django.core.cache import cache
from django.contrib.sessions.backends.base import SessionBase
from django.db.models.functions import ExtractHour
from django.db.transaction import atomic
from django.db.utils import IntegrityError
@ -91,6 +92,7 @@ from authentik.stages.email.tasks import send_mails
from authentik.stages.email.utils import TemplateEmailMessage
LOGGER = get_logger()
SessionStore: SessionBase = import_module(settings.SESSION_ENGINE).SessionStore
class UserGroupSerializer(ModelSerializer):
@ -236,11 +238,9 @@ class UserSerializer(ModelSerializer):
"path",
"type",
"uuid",
"password_change_date",
]
extra_kwargs = {
"name": {"allow_blank": True},
"password_change_date": {"read_only": True},
}
@ -429,7 +429,7 @@ class UserViewSet(UsedByMixin, ModelViewSet):
queryset = User.objects.none()
ordering = ["username"]
serializer_class = UserSerializer
search_fields = ["username", "name", "is_active", "email", "uuid", "attributes"]
search_fields = ["username", "name", "is_active", "email", "uuid"]
filterset_class = UsersFilter
def get_queryset(self):
@ -587,7 +587,7 @@ class UserViewSet(UsedByMixin, ModelViewSet):
"""Set password for user"""
user: User = self.get_object()
try:
user.set_password(request.data.get("password"), request=request)
user.set_password(request.data.get("password"))
user.save()
except (ValidationError, IntegrityError) as exc:
LOGGER.debug("Failed to set password", exc=exc)
@ -769,7 +769,8 @@ class UserViewSet(UsedByMixin, ModelViewSet):
if not instance.is_active:
sessions = AuthenticatedSession.objects.filter(user=instance)
session_ids = sessions.values_list("session_key", flat=True)
cache.delete_many(f"{KEY_PREFIX}{session}" for session in session_ids)
for session in session_ids:
SessionStore(session).delete()
sessions.delete()
LOGGER.debug("Deleted user's sessions", user=instance.username)
return response

View File

@ -44,12 +44,13 @@ class TokenBackend(InbuiltBackend):
self, request: HttpRequest, username: str | None, password: str | None, **kwargs: Any
) -> User | None:
try:
user = User._default_manager.get_by_natural_key(username)
except User.DoesNotExist:
# Run the default password hasher once to reduce the timing
# difference between an existing and a nonexistent user (#20760).
User().set_password(password, request=request)
User().set_password(password)
return None
tokens = Token.filter_not_expired(

View File

@ -58,7 +58,6 @@ class PropertyMappingEvaluator(BaseEvaluator):
self._context["user"] = user
if request:
req.http_request = request
self._context["http_request"] = request
req.context.update(**kwargs)
self._context["request"] = req
self._context.update(**kwargs)

View File

@ -5,7 +5,6 @@ from typing import TextIO
from daphne.management.commands.runserver import Command as RunServer
from daphne.server import Server
from authentik.lib.debug import start_debug_server
from authentik.root.signals import post_startup, pre_startup, startup
@ -14,7 +13,6 @@ class SignalServer(Server):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
start_debug_server()
def ready_callable():
pre_startup.send(sender=self)

View File

@ -17,9 +17,7 @@ from authentik.events.middleware import should_log_model
from authentik.events.models import Event, EventAction
from authentik.events.utils import model_to_dict
def get_banner_text(shell_type="shell") -> str:
return f"""### authentik {shell_type} ({get_full_version()})
BANNER_TEXT = f"""### authentik shell ({get_full_version()})
### Node {platform.node()} | Arch {platform.machine()} | Python {platform.python_version()} """
@ -116,4 +114,4 @@ class Command(BaseCommand):
readline.parse_and_bind("tab: complete")
# Run interactive shell
code.interact(banner=get_banner_text(), local=namespace)
code.interact(banner=BANNER_TEXT, local=namespace)

View File

@ -9,7 +9,6 @@ from django.db import close_old_connections
from structlog.stdlib import get_logger
from authentik.lib.config import CONFIG
from authentik.lib.debug import start_debug_server
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
@ -29,7 +28,10 @@ class Command(BaseCommand):
def handle(self, **options):
LOGGER.debug("Celery options", **options)
close_old_connections()
start_debug_server()
if CONFIG.get_bool("remote_debug"):
import debugpy
debugpy.listen(("0.0.0.0", 6900)) # nosec
worker: Worker = CELERY_APP.Worker(
no_color=False,
quiet=True,

View File

@ -1,45 +0,0 @@
# Generated by Django 5.0.10 on 2025-01-13 18:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0041_applicationentitlement"),
]
operations = [
migrations.AddIndex(
model_name="authenticatedsession",
index=models.Index(fields=["expires"], name="authentik_c_expires_08251d_idx"),
),
migrations.AddIndex(
model_name="authenticatedsession",
index=models.Index(fields=["expiring"], name="authentik_c_expirin_9cd839_idx"),
),
migrations.AddIndex(
model_name="authenticatedsession",
index=models.Index(
fields=["expiring", "expires"], name="authentik_c_expirin_195a84_idx"
),
),
migrations.AddIndex(
model_name="authenticatedsession",
index=models.Index(fields=["session_key"], name="authentik_c_session_d0f005_idx"),
),
migrations.AddIndex(
model_name="token",
index=models.Index(fields=["expires"], name="authentik_c_expires_a62b4b_idx"),
),
migrations.AddIndex(
model_name="token",
index=models.Index(fields=["expiring"], name="authentik_c_expirin_a1b838_idx"),
),
migrations.AddIndex(
model_name="token",
index=models.Index(
fields=["expiring", "expires"], name="authentik_c_expirin_ba04d9_idx"
),
),
]

View File

@ -356,13 +356,13 @@ class User(SerializerModel, GuardianUserMixin, AttributesMixin, AbstractUser):
"""superuser == staff user"""
return self.is_superuser # type: ignore
def set_password(self, raw_password, signal=True, sender=None, request=None):
def set_password(self, raw_password, signal=True, sender=None):
if self.pk and signal:
from authentik.core.signals import password_changed
if not sender:
sender = self
password_changed.send(sender=sender, user=self, password=raw_password, request=request)
password_changed.send(sender=sender, user=self, password=raw_password)
self.password_change_date = now()
return super().set_password(raw_password)
@ -599,14 +599,6 @@ class Application(SerializerModel, PolicyBindingModel):
return None
return candidates[-1]
def backchannel_provider_for[T: Provider](self, provider_type: type[T], **kwargs) -> T | None:
"""Get Backchannel provider for a specific type"""
providers = self.backchannel_providers.filter(
**{f"{provider_type._meta.model_name}__isnull": False},
**kwargs,
)
return getattr(providers.first(), provider_type._meta.model_name)
def __str__(self):
return str(self.name)
@ -854,11 +846,6 @@ class ExpiringModel(models.Model):
class Meta:
abstract = True
indexes = [
models.Index(fields=["expires"]),
models.Index(fields=["expiring"]),
models.Index(fields=["expiring", "expires"]),
]
def expire_action(self, *args, **kwargs):
"""Handler which is called when this object is expired. By
@ -914,7 +901,7 @@ class Token(SerializerModel, ManagedModel, ExpiringModel):
class Meta:
verbose_name = _("Token")
verbose_name_plural = _("Tokens")
indexes = ExpiringModel.Meta.indexes + [
indexes = [
models.Index(fields=["identifier"]),
models.Index(fields=["key"]),
]
@ -1014,9 +1001,6 @@ class AuthenticatedSession(ExpiringModel):
class Meta:
verbose_name = _("Authenticated Session")
verbose_name_plural = _("Authenticated Sessions")
indexes = ExpiringModel.Meta.indexes + [
models.Index(fields=["session_key"]),
]
def __str__(self) -> str:
return f"Authenticated Session {self.session_key[:10]}"

View File

@ -1,7 +1,10 @@
"""authentik core signals"""
from importlib import import_module
from django.conf import settings
from django.contrib.auth.signals import user_logged_in, user_logged_out
from django.contrib.sessions.backends.cache import KEY_PREFIX
from django.contrib.sessions.backends.base import SessionBase
from django.core.cache import cache
from django.core.signals import Signal
from django.db.models import Model
@ -25,6 +28,7 @@ password_changed = Signal()
login_failed = Signal()
LOGGER = get_logger()
SessionStore: SessionBase = import_module(settings.SESSION_ENGINE).SessionStore
@receiver(post_save, sender=Application)
@ -60,8 +64,7 @@ def user_logged_out_session(sender, request: HttpRequest, user: User, **_):
@receiver(pre_delete, sender=AuthenticatedSession)
def authenticated_session_delete(sender: type[Model], instance: "AuthenticatedSession", **_):
"""Delete session when authenticated session is deleted"""
cache_key = f"{KEY_PREFIX}{instance.session_key}"
cache.delete(cache_key)
SessionStore(instance.session_key).delete()
@receiver(pre_save)

View File

@ -28,6 +28,7 @@ from rest_framework.validators import UniqueValidator
from rest_framework.viewsets import ModelViewSet
from structlog.stdlib import get_logger
from authentik.api.authorization import SecretKeyFilter
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer, PassiveSerializer
from authentik.crypto.apps import MANAGED_KEY
@ -35,7 +36,7 @@ from authentik.crypto.builder import CertificateBuilder, PrivateKeyAlg
from authentik.crypto.models import CertificateKeyPair
from authentik.events.models import Event, EventAction
from authentik.rbac.decorators import permission_required
from authentik.rbac.filters import ObjectFilter, SecretKeyFilter
from authentik.rbac.filters import ObjectFilter
LOGGER = get_logger()

View File

@ -1,27 +0,0 @@
# Generated by Django 5.0.10 on 2025-01-13 18:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_enterprise", "0003_remove_licenseusage_within_limits_and_more"),
]
operations = [
migrations.AddIndex(
model_name="licenseusage",
index=models.Index(fields=["expires"], name="authentik_e_expires_3f2956_idx"),
),
migrations.AddIndex(
model_name="licenseusage",
index=models.Index(fields=["expiring"], name="authentik_e_expirin_11d3d7_idx"),
),
migrations.AddIndex(
model_name="licenseusage",
index=models.Index(
fields=["expiring", "expires"], name="authentik_e_expirin_4d558f_idx"
),
),
]

View File

@ -93,4 +93,3 @@ class LicenseUsage(ExpiringModel):
class Meta:
verbose_name = _("License Usage")
verbose_name_plural = _("License Usage Records")
indexes = ExpiringModel.Meta.indexes

View File

@ -1,8 +1,11 @@
"""RAC Provider API Views"""
from django_filters.rest_framework.backends import DjangoFilterBackend
from rest_framework import mixins
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.viewsets import GenericViewSet
from authentik.api.authorization import OwnerFilter, OwnerSuperuserPermissions
from authentik.core.api.groups import GroupMemberSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer
@ -31,6 +34,12 @@ class ConnectionTokenSerializer(EnterpriseRequiredMixin, ModelSerializer):
]
class ConnectionTokenOwnerFilter(OwnerFilter):
"""Owner filter for connection tokens (checks session's user)"""
owner_key = "session__user"
class ConnectionTokenViewSet(
mixins.RetrieveModelMixin,
mixins.UpdateModelMixin,
@ -46,4 +55,10 @@ class ConnectionTokenViewSet(
filterset_fields = ["endpoint", "session__user", "provider"]
search_fields = ["endpoint__name", "provider__name"]
ordering = ["endpoint__name", "provider__name"]
owner_field = "session__user"
permission_classes = [OwnerSuperuserPermissions]
filter_backends = [
ConnectionTokenOwnerFilter,
DjangoFilterBackend,
OrderingFilter,
SearchFilter,
]

View File

@ -1,28 +0,0 @@
# Generated by Django 5.0.10 on 2025-01-13 18:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0042_authenticatedsession_authentik_c_expires_08251d_idx_and_more"),
("authentik_providers_rac", "0005_alter_racpropertymapping_options"),
]
operations = [
migrations.AddIndex(
model_name="connectiontoken",
index=models.Index(fields=["expires"], name="authentik_p_expires_91f148_idx"),
),
migrations.AddIndex(
model_name="connectiontoken",
index=models.Index(fields=["expiring"], name="authentik_p_expirin_59a5a7_idx"),
),
migrations.AddIndex(
model_name="connectiontoken",
index=models.Index(
fields=["expiring", "expires"], name="authentik_p_expirin_aed3ca_idx"
),
),
]

View File

@ -211,4 +211,3 @@ class ConnectionToken(ExpiringModel):
class Meta:
verbose_name = _("RAC Connection token")
verbose_name_plural = _("RAC Connection tokens")
indexes = ExpiringModel.Meta.indexes

View File

@ -1,64 +0,0 @@
"""SSF Provider API Views"""
from django.urls import reverse
from rest_framework.fields import SerializerMethodField
from rest_framework.request import Request
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.providers import ProviderSerializer
from authentik.core.api.tokens import TokenSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.api import EnterpriseRequiredMixin
from authentik.enterprise.providers.ssf.models import SSFProvider
class SSFProviderSerializer(EnterpriseRequiredMixin, ProviderSerializer):
"""SSFProvider Serializer"""
ssf_url = SerializerMethodField()
token_obj = TokenSerializer(source="token", required=False, read_only=True)
def get_ssf_url(self, instance: SSFProvider) -> str | None:
request: Request = self._context.get("request")
if not request:
return None
if not instance.backchannel_application:
return None
return request.build_absolute_uri(
reverse(
"authentik_providers_ssf:configuration",
kwargs={
"application_slug": instance.backchannel_application.slug,
},
)
)
class Meta:
model = SSFProvider
fields = [
"pk",
"name",
"component",
"verbose_name",
"verbose_name_plural",
"meta_model_name",
"signing_key",
"token_obj",
"oidc_auth_providers",
"ssf_url",
"event_retention",
]
extra_kwargs = {}
class SSFProviderViewSet(UsedByMixin, ModelViewSet):
"""SSFProvider Viewset"""
queryset = SSFProvider.objects.all()
serializer_class = SSFProviderSerializer
filterset_fields = {
"application": ["isnull"],
"name": ["iexact"],
}
search_fields = ["name"]
ordering = ["name"]

View File

@ -1,37 +0,0 @@
"""SSF Stream API Views"""
from rest_framework.viewsets import ReadOnlyModelViewSet
from authentik.core.api.utils import ModelSerializer
from authentik.enterprise.providers.ssf.api.providers import SSFProviderSerializer
from authentik.enterprise.providers.ssf.models import Stream
class SSFStreamSerializer(ModelSerializer):
"""SSFStream Serializer"""
provider_obj = SSFProviderSerializer(source="provider", read_only=True)
class Meta:
model = Stream
fields = [
"pk",
"provider",
"provider_obj",
"delivery_method",
"endpoint_url",
"events_requested",
"format",
"aud",
"iss",
]
class SSFStreamViewSet(ReadOnlyModelViewSet):
"""SSFStream Viewset"""
queryset = Stream.objects.all()
serializer_class = SSFStreamSerializer
filterset_fields = ["provider", "endpoint_url", "delivery_method"]
search_fields = ["provider__name", "endpoint_url"]
ordering = ["provider", "uuid"]

View File

@ -1,13 +0,0 @@
"""SSF app config"""
from authentik.enterprise.apps import EnterpriseConfig
class AuthentikEnterpriseProviderSSF(EnterpriseConfig):
"""authentik enterprise ssf app config"""
name = "authentik.enterprise.providers.ssf"
label = "authentik_providers_ssf"
verbose_name = "authentik Enterprise.Providers.SSF"
default = True
mountpoint = ""

View File

@ -1,201 +0,0 @@
# Generated by Django 5.0.11 on 2025-02-05 16:20
import authentik.lib.utils.time
import django.contrib.postgres.fields
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
("authentik_core", "0042_authenticatedsession_authentik_c_expires_08251d_idx_and_more"),
("authentik_crypto", "0004_alter_certificatekeypair_name"),
("authentik_providers_oauth2", "0027_accesstoken_authentik_p_expires_9f24a5_idx_and_more"),
]
operations = [
migrations.CreateModel(
name="SSFProvider",
fields=[
(
"provider_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="authentik_core.provider",
),
),
(
"event_retention",
models.TextField(
default="days=30",
validators=[authentik.lib.utils.time.timedelta_string_validator],
),
),
(
"oidc_auth_providers",
models.ManyToManyField(
blank=True, default=None, to="authentik_providers_oauth2.oauth2provider"
),
),
(
"signing_key",
models.ForeignKey(
help_text="Key used to sign the SSF Events.",
on_delete=django.db.models.deletion.CASCADE,
to="authentik_crypto.certificatekeypair",
verbose_name="Signing Key",
),
),
(
"token",
models.ForeignKey(
default=None,
null=True,
on_delete=django.db.models.deletion.CASCADE,
to="authentik_core.token",
),
),
],
options={
"verbose_name": "Shared Signals Framework Provider",
"verbose_name_plural": "Shared Signals Framework Providers",
"permissions": [("add_stream", "Add stream to SSF provider")],
},
bases=("authentik_core.provider",),
),
migrations.CreateModel(
name="Stream",
fields=[
(
"uuid",
models.UUIDField(
default=uuid.uuid4, editable=False, primary_key=True, serialize=False
),
),
(
"delivery_method",
models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/risc/delivery-method/push",
"Risc Push",
),
(
"https://schemas.openid.net/secevent/risc/delivery-method/poll",
"Risc Poll",
),
]
),
),
("endpoint_url", models.TextField(null=True)),
(
"events_requested",
django.contrib.postgres.fields.ArrayField(
base_field=models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
"Caep Session Revoked",
),
(
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"Caep Credential Change",
),
(
"https://schemas.openid.net/secevent/ssf/event-type/verification",
"Set Verification",
),
]
),
default=list,
size=None,
),
),
("format", models.TextField()),
(
"aud",
django.contrib.postgres.fields.ArrayField(
base_field=models.TextField(), default=list, size=None
),
),
("iss", models.TextField()),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="authentik_providers_ssf.ssfprovider",
),
),
],
options={
"verbose_name": "SSF Stream",
"verbose_name_plural": "SSF Streams",
"default_permissions": ["change", "delete", "view"],
},
),
migrations.CreateModel(
name="StreamEvent",
fields=[
("created", models.DateTimeField(auto_now_add=True)),
("last_updated", models.DateTimeField(auto_now=True)),
("expires", models.DateTimeField(default=None, null=True)),
("expiring", models.BooleanField(default=True)),
(
"uuid",
models.UUIDField(
default=uuid.uuid4, editable=False, primary_key=True, serialize=False
),
),
(
"status",
models.TextField(
choices=[
("pending_new", "Pending New"),
("pending_failed", "Pending Failed"),
("sent", "Sent"),
]
),
),
(
"type",
models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
"Caep Session Revoked",
),
(
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"Caep Credential Change",
),
(
"https://schemas.openid.net/secevent/ssf/event-type/verification",
"Set Verification",
),
]
),
),
("payload", models.JSONField(default=dict)),
(
"stream",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="authentik_providers_ssf.stream",
),
),
],
options={
"verbose_name": "SSF Stream Event",
"verbose_name_plural": "SSF Stream Events",
"ordering": ("-created",),
},
),
]

View File

@ -1,178 +0,0 @@
from datetime import datetime
from functools import cached_property
from uuid import uuid4
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey
from cryptography.hazmat.primitives.asymmetric.types import PrivateKeyTypes
from django.contrib.postgres.fields import ArrayField
from django.db import models
from django.templatetags.static import static
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from jwt import encode
from authentik.core.models import BackchannelProvider, ExpiringModel, Token
from authentik.crypto.models import CertificateKeyPair
from authentik.lib.models import CreatedUpdatedModel
from authentik.lib.utils.time import timedelta_from_string, timedelta_string_validator
from authentik.providers.oauth2.models import JWTAlgorithms, OAuth2Provider
class EventTypes(models.TextChoices):
"""SSF Event types supported by authentik"""
CAEP_SESSION_REVOKED = "https://schemas.openid.net/secevent/caep/event-type/session-revoked"
CAEP_CREDENTIAL_CHANGE = "https://schemas.openid.net/secevent/caep/event-type/credential-change"
SET_VERIFICATION = "https://schemas.openid.net/secevent/ssf/event-type/verification"
class DeliveryMethods(models.TextChoices):
"""SSF Delivery methods"""
RISC_PUSH = "https://schemas.openid.net/secevent/risc/delivery-method/push"
RISC_POLL = "https://schemas.openid.net/secevent/risc/delivery-method/poll"
class SSFEventStatus(models.TextChoices):
"""SSF Event status"""
PENDING_NEW = "pending_new"
PENDING_FAILED = "pending_failed"
SENT = "sent"
class SSFProvider(BackchannelProvider):
"""Shared Signals Framework provider to allow applications to
receive user events from authentik."""
signing_key = models.ForeignKey(
CertificateKeyPair,
verbose_name=_("Signing Key"),
on_delete=models.CASCADE,
help_text=_("Key used to sign the SSF Events."),
)
oidc_auth_providers = models.ManyToManyField(OAuth2Provider, blank=True, default=None)
token = models.ForeignKey(Token, on_delete=models.CASCADE, null=True, default=None)
event_retention = models.TextField(
default="days=30",
validators=[timedelta_string_validator],
)
@cached_property
def jwt_key(self) -> tuple[PrivateKeyTypes, str]:
"""Get either the configured certificate or the client secret"""
key: CertificateKeyPair = self.signing_key
private_key = key.private_key
if isinstance(private_key, RSAPrivateKey):
return private_key, JWTAlgorithms.RS256
if isinstance(private_key, EllipticCurvePrivateKey):
return private_key, JWTAlgorithms.ES256
raise ValueError(f"Invalid private key type: {type(private_key)}")
@property
def service_account_identifier(self) -> str:
return f"ak-providers-ssf-{self.pk}"
@property
def serializer(self):
from authentik.enterprise.providers.ssf.api.providers import SSFProviderSerializer
return SSFProviderSerializer
@property
def icon_url(self) -> str | None:
return static("authentik/sources/ssf.svg")
@property
def component(self) -> str:
return "ak-provider-ssf-form"
class Meta:
verbose_name = _("Shared Signals Framework Provider")
verbose_name_plural = _("Shared Signals Framework Providers")
permissions = [
# This overrides the default "add_stream" permission of the Stream object,
# as the user requesting to add a stream must have the permission on the provider
("add_stream", _("Add stream to SSF provider")),
]
class Stream(models.Model):
"""SSF Stream"""
uuid = models.UUIDField(default=uuid4, primary_key=True, editable=False)
provider = models.ForeignKey(SSFProvider, on_delete=models.CASCADE)
delivery_method = models.TextField(choices=DeliveryMethods.choices)
endpoint_url = models.TextField(null=True)
events_requested = ArrayField(models.TextField(choices=EventTypes.choices), default=list)
format = models.TextField()
aud = ArrayField(models.TextField(), default=list)
iss = models.TextField()
class Meta:
verbose_name = _("SSF Stream")
verbose_name_plural = _("SSF Streams")
default_permissions = ["change", "delete", "view"]
def __str__(self) -> str:
return "SSF Stream"
def prepare_event_payload(self, type: EventTypes, event_data: dict, **kwargs) -> dict:
jti = uuid4()
_now = now()
return {
"uuid": jti,
"stream_id": str(self.pk),
"type": type,
"expiring": True,
"status": SSFEventStatus.PENDING_NEW,
"expires": _now + timedelta_from_string(self.provider.event_retention),
"payload": {
"jti": jti.hex,
"aud": self.aud,
"iat": int(datetime.now().timestamp()),
"iss": self.iss,
"events": {type: event_data},
**kwargs,
},
}
def encode(self, data: dict) -> str:
headers = {}
if self.provider.signing_key:
headers["kid"] = self.provider.signing_key.kid
key, alg = self.provider.jwt_key
return encode(data, key, algorithm=alg, headers=headers)
class StreamEvent(CreatedUpdatedModel, ExpiringModel):
"""Single stream event to be sent"""
uuid = models.UUIDField(default=uuid4, primary_key=True, editable=False)
stream = models.ForeignKey(Stream, on_delete=models.CASCADE)
status = models.TextField(choices=SSFEventStatus.choices)
type = models.TextField(choices=EventTypes.choices)
payload = models.JSONField(default=dict)
def expire_action(self, *args, **kwargs):
"""Only allow automatic cleanup of successfully sent event"""
if self.status != SSFEventStatus.SENT:
return
return super().expire_action(*args, **kwargs)
def __str__(self):
return f"Stream event {self.type}"
class Meta:
verbose_name = _("SSF Stream Event")
verbose_name_plural = _("SSF Stream Events")
ordering = ("-created",)

View File

@ -1,193 +0,0 @@
from hashlib import sha256
from django.contrib.auth.signals import user_logged_out
from django.db.models import Model
from django.db.models.signals import post_delete, post_save, pre_delete
from django.dispatch import receiver
from django.http.request import HttpRequest
from guardian.shortcuts import assign_perm
from authentik.core.models import (
USER_PATH_SYSTEM_PREFIX,
AuthenticatedSession,
Token,
TokenIntents,
User,
UserTypes,
)
from authentik.core.signals import password_changed
from authentik.enterprise.providers.ssf.models import (
EventTypes,
SSFProvider,
)
from authentik.enterprise.providers.ssf.tasks import send_ssf_event
from authentik.events.middleware import audit_ignore
from authentik.stages.authenticator.models import Device
from authentik.stages.authenticator_duo.models import DuoDevice
from authentik.stages.authenticator_static.models import StaticDevice
from authentik.stages.authenticator_totp.models import TOTPDevice
from authentik.stages.authenticator_webauthn.models import (
UNKNOWN_DEVICE_TYPE_AAGUID,
WebAuthnDevice,
)
USER_PATH_PROVIDERS_SSF = USER_PATH_SYSTEM_PREFIX + "/providers/ssf"
@receiver(post_save, sender=SSFProvider)
def ssf_providers_post_save(sender: type[Model], instance: SSFProvider, created: bool, **_):
"""Create service account before provider is saved"""
identifier = instance.service_account_identifier
user, _ = User.objects.update_or_create(
username=identifier,
defaults={
"name": f"SSF Provider {instance.name} Service-Account",
"type": UserTypes.INTERNAL_SERVICE_ACCOUNT,
"path": USER_PATH_PROVIDERS_SSF,
},
)
assign_perm("add_stream", user, instance)
token, token_created = Token.objects.update_or_create(
identifier=identifier,
defaults={
"user": user,
"intent": TokenIntents.INTENT_API,
"expiring": False,
"managed": f"goauthentik.io/providers/ssf/{instance.pk}",
},
)
if created or token_created:
with audit_ignore():
instance.token = token
instance.save()
@receiver(user_logged_out)
def ssf_user_logged_out_session_revoked(sender, request: HttpRequest, user: User, **_):
"""Session revoked trigger (user logged out)"""
if not request.session or not request.session.session_key or not user:
return
send_ssf_event(
EventTypes.CAEP_SESSION_REVOKED,
{
"initiating_entity": "user",
},
sub_id={
"format": "complex",
"session": {
"format": "opaque",
"id": sha256(request.session.session_key.encode("ascii")).hexdigest(),
},
"user": {
"format": "email",
"email": user.email,
},
},
request=request,
)
@receiver(pre_delete, sender=AuthenticatedSession)
def ssf_user_session_delete_session_revoked(sender, instance: AuthenticatedSession, **_):
"""Session revoked trigger (users' session has been deleted)
As this signal is also triggered with a regular logout, we can't be sure
if the session has been deleted by an admin or by the user themselves."""
send_ssf_event(
EventTypes.CAEP_SESSION_REVOKED,
{
"initiating_entity": "user",
},
sub_id={
"format": "complex",
"session": {
"format": "opaque",
"id": sha256(instance.session_key.encode("ascii")).hexdigest(),
},
"user": {
"format": "email",
"email": instance.user.email,
},
},
)
@receiver(password_changed)
def ssf_password_changed_cred_change(sender, user: User, password: str | None, **_):
"""Credential change trigger (password changed)"""
send_ssf_event(
EventTypes.CAEP_CREDENTIAL_CHANGE,
{
"credential_type": "password",
"change_type": "revoke" if password is None else "update",
},
sub_id={
"format": "complex",
"user": {
"format": "email",
"email": user.email,
},
},
)
device_type_map = {
StaticDevice: "pin",
TOTPDevice: "pin",
WebAuthnDevice: "fido-u2f",
DuoDevice: "app",
}
@receiver(post_save)
def ssf_device_post_save(sender: type[Model], instance: Device, created: bool, **_):
if not isinstance(instance, Device):
return
if not instance.confirmed:
return
device_type = device_type_map.get(instance.__class__)
data = {
"credential_type": device_type,
"change_type": "create" if created else "update",
"friendly_name": instance.name,
}
if isinstance(instance, WebAuthnDevice) and instance.aaguid != UNKNOWN_DEVICE_TYPE_AAGUID:
data["fido2_aaguid"] = instance.aaguid
send_ssf_event(
EventTypes.CAEP_CREDENTIAL_CHANGE,
data,
sub_id={
"format": "complex",
"user": {
"format": "email",
"email": instance.user.email,
},
},
)
@receiver(post_delete)
def ssf_device_post_delete(sender: type[Model], instance: Device, **_):
if not isinstance(instance, Device):
return
if not instance.confirmed:
return
device_type = device_type_map.get(instance.__class__)
data = {
"credential_type": device_type,
"change_type": "delete",
"friendly_name": instance.name,
}
if isinstance(instance, WebAuthnDevice) and instance.aaguid != UNKNOWN_DEVICE_TYPE_AAGUID:
data["fido2_aaguid"] = instance.aaguid
send_ssf_event(
EventTypes.CAEP_CREDENTIAL_CHANGE,
data,
sub_id={
"format": "complex",
"user": {
"format": "email",
"email": instance.user.email,
},
},
)

View File

@ -1,136 +0,0 @@
from celery import group
from django.http import HttpRequest
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from requests.exceptions import RequestException
from structlog.stdlib import get_logger
from authentik.core.models import User
from authentik.enterprise.providers.ssf.models import (
DeliveryMethods,
EventTypes,
SSFEventStatus,
Stream,
StreamEvent,
)
from authentik.events.logs import LogEvent
from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask
from authentik.lib.utils.http import get_http_session
from authentik.lib.utils.time import timedelta_from_string
from authentik.policies.engine import PolicyEngine
from authentik.root.celery import CELERY_APP
session = get_http_session()
LOGGER = get_logger()
def send_ssf_event(
event_type: EventTypes,
data: dict,
stream_filter: dict | None = None,
request: HttpRequest | None = None,
**extra_data,
):
"""Wrapper to send an SSF event to multiple streams"""
payload = []
if not stream_filter:
stream_filter = {}
stream_filter["events_requested__contains"] = [event_type]
if request and hasattr(request, "request_id"):
extra_data.setdefault("txn", request.request_id)
for stream in Stream.objects.filter(**stream_filter):
event_data = stream.prepare_event_payload(event_type, data, **extra_data)
payload.append((str(stream.uuid), event_data))
return _send_ssf_event.delay(payload)
def _check_app_access(stream_uuid: str, event_data: dict) -> bool:
"""Check if event is related to user and if so, check
if the user has access to the application"""
stream = Stream.objects.filter(pk=stream_uuid).first()
if not stream:
return False
# `event_data` is a dict version of a StreamEvent
sub_id = event_data.get("payload", {}).get("sub_id", {})
email = sub_id.get("user", {}).get("email", None)
if not email:
return True
user = User.objects.filter(email=email).first()
if not user:
return True
engine = PolicyEngine(stream.provider.backchannel_application, user)
engine.use_cache = False
engine.build()
return engine.passing
@CELERY_APP.task()
def _send_ssf_event(event_data: list[tuple[str, dict]]):
tasks = []
for stream, data in event_data:
if not _check_app_access(stream, data):
continue
event = StreamEvent.objects.create(**data)
tasks.extend(send_single_ssf_event(stream, str(event.uuid)))
main_task = group(*tasks)
main_task()
def send_single_ssf_event(stream_id: str, evt_id: str):
stream = Stream.objects.filter(pk=stream_id).first()
if not stream:
return []
event = StreamEvent.objects.filter(pk=evt_id).first()
if not event:
return []
if event.status == SSFEventStatus.SENT:
return []
if stream.delivery_method == DeliveryMethods.RISC_PUSH:
return [ssf_push_event.si(str(event.pk))]
return []
@CELERY_APP.task(bind=True, base=SystemTask)
def ssf_push_event(self: SystemTask, event_id: str):
self.save_on_success = False
event = StreamEvent.objects.filter(pk=event_id).first()
if not event:
return
self.set_uid(event_id)
if event.status == SSFEventStatus.SENT:
self.set_status(TaskStatus.SUCCESSFUL)
return
try:
response = session.post(
event.stream.endpoint_url,
data=event.stream.encode(event.payload),
headers={"Content-Type": "application/secevent+jwt", "Accept": "application/json"},
)
response.raise_for_status()
event.status = SSFEventStatus.SENT
event.save()
self.set_status(TaskStatus.SUCCESSFUL)
return
except RequestException as exc:
LOGGER.warning("Failed to send SSF event", exc=exc)
self.set_status(TaskStatus.ERROR)
attrs = {}
if exc.response:
attrs["response"] = {
"content": exc.response.text,
"status": exc.response.status_code,
}
self.set_error(
exc,
LogEvent(
_("Failed to send request"),
log_level="warning",
logger=self.__name__,
attributes=attrs,
),
)
# Re-up the expiry of the stream event
event.expires = now() + timedelta_from_string(event.stream.provider.event_retention)
event.status = SSFEventStatus.PENDING_FAILED
event.save()
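
Putting the pieces together, a caller (such as the signal receivers above) fans an event out to every stream that requested that event type. An illustrative call with placeholder values:

# Illustrative call matching the send_ssf_event signature above; EventTypes is the
# enum from the SSF models. The sub_id shape mirrors the "complex" subject format
# used by the signal receivers.
send_ssf_event(
    EventTypes.CAEP_CREDENTIAL_CHANGE,
    {
        "credential_type": "password",
        "change_type": "update",
    },
    sub_id={
        "format": "complex",
        "user": {"format": "email", "email": "jane@example.com"},
    },
)
# Internally this expands the filter to
# Stream.objects.filter(events_requested__contains=[EventTypes.CAEP_CREDENTIAL_CHANGE])
# and creates one StreamEvent per matching stream, delivered by the Celery tasks above.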

View File

@ -1,46 +0,0 @@
import json
from django.urls import reverse
from rest_framework.test import APITestCase
from authentik.core.models import Application
from authentik.core.tests.utils import create_test_cert
from authentik.enterprise.providers.ssf.models import (
SSFProvider,
)
from authentik.lib.generators import generate_id
class TestConfiguration(APITestCase):
def setUp(self):
self.application = Application.objects.create(name=generate_id(), slug=generate_id())
self.provider = SSFProvider.objects.create(
name=generate_id(),
signing_key=create_test_cert(),
backchannel_application=self.application,
)
def test_config_fetch(self):
"""test SSF configuration (unauthenticated)"""
res = self.client.get(
reverse(
"authentik_providers_ssf:configuration",
kwargs={"application_slug": self.application.slug},
),
)
self.assertEqual(res.status_code, 200)
content = json.loads(res.content)
self.assertEqual(content["spec_version"], "1_0-ID2")
def test_config_fetch_authenticated(self):
"""test SSF configuration (authenticated)"""
res = self.client.get(
reverse(
"authentik_providers_ssf:configuration",
kwargs={"application_slug": self.application.slug},
),
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 200)
content = json.loads(res.content)
self.assertEqual(content["spec_version"], "1_0-ID2")

View File

@ -1,51 +0,0 @@
"""JWKS tests"""
import base64
import json
from cryptography.hazmat.backends import default_backend
from cryptography.x509 import load_der_x509_certificate
from django.test import TestCase
from django.urls.base import reverse
from jwt import PyJWKSet
from authentik.core.models import Application
from authentik.core.tests.utils import create_test_cert
from authentik.enterprise.providers.ssf.models import SSFProvider
from authentik.lib.generators import generate_id
class TestJWKS(TestCase):
"""Test JWKS view"""
def test_rs256(self):
"""Test JWKS request with RS256"""
provider = SSFProvider.objects.create(
name=generate_id(),
signing_key=create_test_cert(),
)
app = Application.objects.create(name=generate_id(), slug=generate_id())
app.backchannel_providers.add(provider)
response = self.client.get(
reverse("authentik_providers_ssf:jwks", kwargs={"application_slug": app.slug})
)
body = json.loads(response.content.decode())
self.assertEqual(len(body["keys"]), 1)
PyJWKSet.from_dict(body)
key = body["keys"][0]
load_der_x509_certificate(base64.b64decode(key["x5c"][0]), default_backend()).public_key()
def test_es256(self):
"""Test JWKS request with ES256"""
provider = SSFProvider.objects.create(
name=generate_id(),
signing_key=create_test_cert(),
)
app = Application.objects.create(name=generate_id(), slug=generate_id())
app.backchannel_providers.add(provider)
response = self.client.get(
reverse("authentik_providers_ssf:jwks", kwargs={"application_slug": app.slug})
)
body = json.loads(response.content.decode())
self.assertEqual(len(body["keys"]), 1)
PyJWKSet.from_dict(body)

View File

@ -1,168 +0,0 @@
from uuid import uuid4
from django.urls import reverse
from rest_framework.test import APITestCase
from authentik.core.models import Application, Group
from authentik.core.tests.utils import (
create_test_cert,
create_test_user,
)
from authentik.enterprise.providers.ssf.models import (
EventTypes,
SSFEventStatus,
SSFProvider,
Stream,
StreamEvent,
)
from authentik.lib.generators import generate_id
from authentik.policies.models import PolicyBinding
from authentik.stages.authenticator_webauthn.models import WebAuthnDevice
class TestSignals(APITestCase):
"""Test individual SSF Signals"""
def setUp(self):
self.application = Application.objects.create(name=generate_id(), slug=generate_id())
self.provider = SSFProvider.objects.create(
name=generate_id(),
signing_key=create_test_cert(),
backchannel_application=self.application,
)
res = self.client.post(
reverse(
"authentik_providers_ssf:stream",
kwargs={"application_slug": self.application.slug},
),
data={
"iss": "https://authentik.company/.well-known/ssf-configuration/foo/5",
"aud": ["https://app.authentik.company"],
"delivery": {
"method": "https://schemas.openid.net/secevent/risc/delivery-method/push",
"endpoint_url": "https://app.authentik.company",
},
"events_requested": [
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
],
"format": "iss_sub",
},
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 201, res.content)
def test_signal_logout(self):
"""Test user logout"""
user = create_test_user()
self.client.force_login(user)
self.client.logout()
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(stream=stream).first()
self.assertIsNotNone(event)
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
event_payload = event.payload["events"][
"https://schemas.openid.net/secevent/caep/event-type/session-revoked"
]
self.assertEqual(event_payload["initiating_entity"], "user")
self.assertEqual(event.payload["sub_id"]["format"], "complex")
self.assertEqual(event.payload["sub_id"]["session"]["format"], "opaque")
self.assertEqual(event.payload["sub_id"]["user"]["format"], "email")
self.assertEqual(event.payload["sub_id"]["user"]["email"], user.email)
def test_signal_password_change(self):
"""Test user password change"""
user = create_test_user()
self.client.force_login(user)
user.set_password(generate_id())
user.save()
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(stream=stream).first()
self.assertIsNotNone(event)
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
event_payload = event.payload["events"][
"https://schemas.openid.net/secevent/caep/event-type/credential-change"
]
self.assertEqual(event_payload["change_type"], "update")
self.assertEqual(event_payload["credential_type"], "password")
self.assertEqual(event.payload["sub_id"]["format"], "complex")
self.assertEqual(event.payload["sub_id"]["user"]["format"], "email")
self.assertEqual(event.payload["sub_id"]["user"]["email"], user.email)
def test_signal_authenticator_added(self):
"""Test authenticator creation signal"""
user = create_test_user()
self.client.force_login(user)
dev = WebAuthnDevice.objects.create(
user=user,
name=generate_id(),
credential_id=generate_id(),
public_key=generate_id(),
aaguid=str(uuid4()),
)
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(stream=stream).first()
self.assertIsNotNone(event)
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
event_payload = event.payload["events"][
"https://schemas.openid.net/secevent/caep/event-type/credential-change"
]
self.assertEqual(event_payload["change_type"], "create")
self.assertEqual(event_payload["fido2_aaguid"], dev.aaguid)
self.assertEqual(event_payload["friendly_name"], dev.name)
self.assertEqual(event_payload["credential_type"], "fido-u2f")
self.assertEqual(event.payload["sub_id"]["format"], "complex")
self.assertEqual(event.payload["sub_id"]["user"]["format"], "email")
self.assertEqual(event.payload["sub_id"]["user"]["email"], user.email)
def test_signal_authenticator_deleted(self):
"""Test authenticator deletion signal"""
user = create_test_user()
self.client.force_login(user)
dev = WebAuthnDevice.objects.create(
user=user,
name=generate_id(),
credential_id=generate_id(),
public_key=generate_id(),
aaguid=str(uuid4()),
)
dev.delete()
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(stream=stream).first()
self.assertIsNotNone(event)
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
event_payload = event.payload["events"][
"https://schemas.openid.net/secevent/caep/event-type/credential-change"
]
self.assertEqual(event_payload["change_type"], "delete")
self.assertEqual(event_payload["fido2_aaguid"], dev.aaguid)
self.assertEqual(event_payload["friendly_name"], dev.name)
self.assertEqual(event_payload["credential_type"], "fido-u2f")
self.assertEqual(event.payload["sub_id"]["format"], "complex")
self.assertEqual(event.payload["sub_id"]["user"]["format"], "email")
self.assertEqual(event.payload["sub_id"]["user"]["email"], user.email)
def test_signal_policy_ignore(self):
"""Test event not being created for user that doesn't have access to the application"""
PolicyBinding.objects.create(
target=self.application, group=Group.objects.create(name=generate_id()), order=0
)
user = create_test_user()
self.client.force_login(user)
user.set_password(generate_id())
user.save()
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(
stream=stream, type=EventTypes.CAEP_CREDENTIAL_CHANGE
).first()
self.assertIsNone(event)

View File

@ -1,154 +0,0 @@
import json
from dataclasses import asdict
from django.urls import reverse
from django.utils import timezone
from rest_framework.test import APITestCase
from authentik.core.models import Application
from authentik.core.tests.utils import create_test_admin_user, create_test_cert, create_test_flow
from authentik.enterprise.providers.ssf.models import (
SSFEventStatus,
SSFProvider,
Stream,
StreamEvent,
)
from authentik.lib.generators import generate_id
from authentik.providers.oauth2.id_token import IDToken
from authentik.providers.oauth2.models import AccessToken, OAuth2Provider
class TestStream(APITestCase):
def setUp(self):
self.application = Application.objects.create(name=generate_id(), slug=generate_id())
self.provider = SSFProvider.objects.create(
name=generate_id(),
signing_key=create_test_cert(),
backchannel_application=self.application,
)
def test_stream_add_token(self):
"""test stream add (token auth)"""
res = self.client.post(
reverse(
"authentik_providers_ssf:stream",
kwargs={"application_slug": self.application.slug},
),
data={
"iss": "https://authentik.company/.well-known/ssf-configuration/foo/5",
"aud": ["https://app.authentik.company"],
"delivery": {
"method": "https://schemas.openid.net/secevent/risc/delivery-method/push",
"endpoint_url": "https://app.authentik.company",
},
"events_requested": [
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
],
"format": "iss_sub",
},
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 201)
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(stream=stream).first()
self.assertIsNotNone(event)
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_stream_add_poll(self):
"""test stream add - poll method"""
res = self.client.post(
reverse(
"authentik_providers_ssf:stream",
kwargs={"application_slug": self.application.slug},
),
data={
"iss": "https://authentik.company/.well-known/ssf-configuration/foo/5",
"aud": ["https://app.authentik.company"],
"delivery": {
"method": "https://schemas.openid.net/secevent/risc/delivery-method/poll",
},
"events_requested": [
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
],
"format": "iss_sub",
},
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 400)
self.assertJSONEqual(
res.content,
{"delivery": {"method": ["Polling for SSF events is not currently supported."]}},
)
def test_stream_add_oidc(self):
"""test stream add (oidc auth)"""
provider = OAuth2Provider.objects.create(
name=generate_id(),
authorization_flow=create_test_flow(),
)
self.application.provider = provider
self.application.save()
user = create_test_admin_user()
token = AccessToken.objects.create(
provider=provider,
user=user,
token=generate_id(),
auth_time=timezone.now(),
_scope="openid user profile",
_id_token=json.dumps(
asdict(
IDToken("foo", "bar"),
)
),
)
res = self.client.post(
reverse(
"authentik_providers_ssf:stream",
kwargs={"application_slug": self.application.slug},
),
data={
"iss": "https://authentik.company/.well-known/ssf-configuration/foo/5",
"aud": ["https://app.authentik.company"],
"delivery": {
"method": "https://schemas.openid.net/secevent/risc/delivery-method/push",
"endpoint_url": "https://app.authentik.company",
},
"events_requested": [
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
],
"format": "iss_sub",
},
HTTP_AUTHORIZATION=f"Bearer {token.token}",
)
self.assertEqual(res.status_code, 201)
stream = Stream.objects.filter(provider=self.provider).first()
self.assertIsNotNone(stream)
event = StreamEvent.objects.filter(stream=stream).first()
self.assertIsNotNone(event)
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_stream_delete(self):
"""delete stream"""
stream = Stream.objects.create(provider=self.provider)
res = self.client.delete(
reverse(
"authentik_providers_ssf:stream",
kwargs={"application_slug": self.application.slug},
),
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 204)
self.assertFalse(Stream.objects.filter(pk=stream.pk).exists())

View File

@ -1,32 +0,0 @@
"""SSF provider URLs"""
from django.urls import path
from authentik.enterprise.providers.ssf.api.providers import SSFProviderViewSet
from authentik.enterprise.providers.ssf.api.streams import SSFStreamViewSet
from authentik.enterprise.providers.ssf.views.configuration import ConfigurationView
from authentik.enterprise.providers.ssf.views.jwks import JWKSview
from authentik.enterprise.providers.ssf.views.stream import StreamView
urlpatterns = [
path(
"application/ssf/<slug:application_slug>/ssf-jwks/",
JWKSview.as_view(),
name="jwks",
),
path(
".well-known/ssf-configuration/<slug:application_slug>",
ConfigurationView.as_view(),
name="configuration",
),
path(
"application/ssf/<slug:application_slug>/stream/",
StreamView.as_view(),
name="stream",
),
]
api_urlpatterns = [
("providers/ssf", SSFProviderViewSet),
("ssf/streams", SSFStreamViewSet),
]

View File

@ -1,66 +0,0 @@
"""SSF Token auth"""
from typing import TYPE_CHECKING, Any
from django.db.models import Q
from rest_framework.authentication import BaseAuthentication, get_authorization_header
from rest_framework.request import Request
from authentik.core.models import Token, TokenIntents, User
from authentik.enterprise.providers.ssf.models import SSFProvider
from authentik.providers.oauth2.models import AccessToken
if TYPE_CHECKING:
from authentik.enterprise.providers.ssf.views.base import SSFView
class SSFTokenAuth(BaseAuthentication):
"""SSF Token auth"""
view: "SSFView"
def __init__(self, view: "SSFView") -> None:
super().__init__()
self.view = view
def check_token(self, key: str) -> Token | None:
"""Check that a token exists, is not expired, and is assigned to the correct provider"""
token = Token.filter_not_expired(key=key, intent=TokenIntents.INTENT_API).first()
if not token:
return None
provider: SSFProvider = token.ssfprovider_set.first()
if not provider:
return None
self.view.application = provider.backchannel_application
self.view.provider = provider
return token
def check_jwt(self, jwt: str) -> AccessToken | None:
"""Check JWT-based authentication, this supports tokens issued either by providers
configured directly in the provider, and by providers assigned to the application
that the SSF provider is a backchannel provider of."""
token = AccessToken.filter_not_expired(token=jwt, revoked=False).first()
if not token:
return None
ssf_provider = SSFProvider.objects.filter(
Q(oidc_auth_providers__in=[token.provider])
| Q(backchannel_application__provider__in=[token.provider]),
).first()
if not ssf_provider:
return None
self.view.application = ssf_provider.backchannel_application
self.view.provider = ssf_provider
return token
def authenticate(self, request: Request) -> tuple[User, Any] | None:
auth = get_authorization_header(request).decode()
auth_type, _, key = auth.partition(" ")
if auth_type != "Bearer":
return None
token = self.check_token(key)
if token:
return (token.user, token)
jwt_token = self.check_jwt(key)
if jwt_token:
return (jwt_token.user, jwt_token)
return None

View File

@ -1,23 +0,0 @@
from django.http import HttpRequest
from rest_framework.permissions import IsAuthenticated
from rest_framework.views import APIView
from structlog.stdlib import BoundLogger, get_logger
from authentik.core.models import Application
from authentik.enterprise.providers.ssf.models import SSFProvider
from authentik.enterprise.providers.ssf.views.auth import SSFTokenAuth
class SSFView(APIView):
application: Application
provider: SSFProvider
logger: BoundLogger
permission_classes = [IsAuthenticated]
def setup(self, request: HttpRequest, *args, **kwargs) -> None:
self.logger = get_logger().bind()
super().setup(request, *args, **kwargs)
def get_authenticators(self):
return [SSFTokenAuth(self)]

View File

@ -1,55 +0,0 @@
from django.http import Http404, HttpRequest, HttpResponse, JsonResponse
from django.shortcuts import get_object_or_404
from django.urls import reverse
from rest_framework.permissions import AllowAny
from authentik.core.models import Application
from authentik.enterprise.providers.ssf.models import DeliveryMethods, SSFProvider
from authentik.enterprise.providers.ssf.views.base import SSFView
class ConfigurationView(SSFView):
"""SSF configuration endpoint"""
permission_classes = [AllowAny]
def get_authenticators(self):
return []
def get(self, request: HttpRequest, application_slug: str, *args, **kwargs) -> HttpResponse:
application = get_object_or_404(Application, slug=application_slug)
provider = application.backchannel_provider_for(SSFProvider)
if not provider:
raise Http404
data = {
"spec_version": "1_0-ID2",
"issuer": self.request.build_absolute_uri(
reverse(
"authentik_providers_ssf:configuration",
kwargs={
"application_slug": application.slug,
},
)
),
"jwks_uri": self.request.build_absolute_uri(
reverse(
"authentik_providers_ssf:jwks",
kwargs={
"application_slug": application.slug,
},
)
),
"configuration_endpoint": self.request.build_absolute_uri(
reverse(
"authentik_providers_ssf:stream",
kwargs={
"application_slug": application.slug,
},
)
),
"delivery_methods_supported": [
DeliveryMethods.RISC_PUSH,
],
"authorization_schemes": [{"spec_urn": "urn:ietf:rfc:6749"}],
}
return JsonResponse(data)
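
Client-side, this view behaves like an OIDC-style discovery document. A short fetch sketch; the host and application slug are placeholders, and the path comes from the SSF urls.py earlier in this diff (the app config's empty mountpoint suggests it is served from the root):

import requests

conf = requests.get(
    "https://authentik.company/.well-known/ssf-configuration/my-app"
).json()
assert conf["spec_version"] == "1_0-ID2"
jwks_uri = conf["jwks_uri"]                       # served by the JWKS view below
stream_endpoint = conf["configuration_endpoint"]  # served by the stream view below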

View File

@ -1,31 +0,0 @@
from django.http import Http404, HttpRequest, HttpResponse, JsonResponse
from django.shortcuts import get_object_or_404
from django.views import View
from authentik.core.models import Application
from authentik.crypto.models import CertificateKeyPair
from authentik.enterprise.providers.ssf.models import SSFProvider
from authentik.providers.oauth2.views.jwks import JWKSView as OAuthJWKSView
class JWKSview(View):
"""SSF JWKS endpoint, similar to the OAuth2 provider's endpoint"""
def get(self, request: HttpRequest, application_slug: str) -> HttpResponse:
"""Show JWK Key data for Provider"""
application = get_object_or_404(Application, slug=application_slug)
provider = application.backchannel_provider_for(SSFProvider)
if not provider:
raise Http404
signing_key: CertificateKeyPair = provider.signing_key
response_data = {}
jwk = OAuthJWKSView.get_jwk_for_key(signing_key, "sig")
if jwk:
response_data["keys"] = [jwk]
response = JsonResponse(response_data)
response["Access-Control-Allow-Origin"] = "*"
return response
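
A receiver can use this endpoint to verify pushed SETs. A hedged sketch with PyJWT; jwks_uri and set_token are placeholders carried over from the earlier sketches:

import requests
from jwt import PyJWKSet, decode, get_unverified_header

jwks = PyJWKSet.from_dict(requests.get(jwks_uri).json())
kid = get_unverified_header(set_token).get("kid")
signing_jwk = next(k for k in jwks.keys if k.key_id == kid)
claims = decode(
    set_token,
    signing_jwk.key,
    algorithms=["RS256", "ES256"],
    audience="https://app.authentik.company",
)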

View File

@ -1,130 +0,0 @@
from django.http import HttpRequest
from django.urls import reverse
from rest_framework.exceptions import PermissionDenied, ValidationError
from rest_framework.fields import CharField, ChoiceField, ListField, SerializerMethodField
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.serializers import ModelSerializer
from structlog.stdlib import get_logger
from authentik.core.api.utils import PassiveSerializer
from authentik.enterprise.providers.ssf.models import (
DeliveryMethods,
EventTypes,
SSFProvider,
Stream,
)
from authentik.enterprise.providers.ssf.tasks import send_ssf_event
from authentik.enterprise.providers.ssf.views.base import SSFView
LOGGER = get_logger()
class StreamDeliverySerializer(PassiveSerializer):
method = ChoiceField(choices=[(x.value, x.value) for x in DeliveryMethods])
endpoint_url = CharField(required=False)
def validate_method(self, method: DeliveryMethods):
"""Currently only push is supported"""
if method == DeliveryMethods.RISC_POLL:
raise ValidationError("Polling for SSF events is not currently supported.")
return method
def validate(self, attrs: dict) -> dict:
if attrs["method"] == DeliveryMethods.RISC_PUSH:
if not attrs.get("endpoint_url"):
raise ValidationError("Endpoint URL is required when using push.")
return attrs
class StreamSerializer(ModelSerializer):
delivery = StreamDeliverySerializer()
events_requested = ListField(
child=ChoiceField(choices=[(x.value, x.value) for x in EventTypes])
)
format = CharField()
aud = ListField(child=CharField())
def create(self, validated_data):
provider: SSFProvider = validated_data["provider"]
request: HttpRequest = self.context["request"]
iss = request.build_absolute_uri(
reverse(
"authentik_providers_ssf:configuration",
kwargs={
"application_slug": provider.backchannel_application.slug,
},
)
)
# Ensure that streams always get SET verification events sent to them
validated_data["events_requested"].append(EventTypes.SET_VERIFICATION)
return super().create(
{
"delivery_method": validated_data["delivery"]["method"],
"endpoint_url": validated_data["delivery"].get("endpoint_url"),
"format": validated_data["format"],
"provider": validated_data["provider"],
"events_requested": validated_data["events_requested"],
"aud": validated_data["aud"],
"iss": iss,
}
)
class Meta:
model = Stream
fields = [
"delivery",
"events_requested",
"format",
"aud",
]
class StreamResponseSerializer(PassiveSerializer):
stream_id = CharField(source="pk")
iss = CharField()
aud = ListField(child=CharField())
delivery = SerializerMethodField()
format = CharField()
events_requested = ListField(child=CharField())
events_supported = SerializerMethodField()
events_delivered = ListField(child=CharField(), source="events_requested")
def get_delivery(self, instance: Stream) -> StreamDeliverySerializer:
return {
"method": instance.delivery_method,
"endpoint_url": instance.endpoint_url,
}
def get_events_supported(self, instance: Stream) -> list[str]:
return [x.value for x in EventTypes]
class StreamView(SSFView):
def post(self, request: Request, *args, **kwargs) -> Response:
stream = StreamSerializer(data=request.data, context={"request": request})
stream.is_valid(raise_exception=True)
if not request.user.has_perm("authentik_providers_ssf.add_stream", self.provider):
raise PermissionDenied(
"User does not have permission to create stream for this provider."
)
instance: Stream = stream.save(provider=self.provider)
send_ssf_event(
EventTypes.SET_VERIFICATION,
{
"state": None,
},
stream_filter={"pk": instance.uuid},
sub_id={"format": "opaque", "id": str(instance.uuid)},
)
response = StreamResponseSerializer(instance=instance, context={"request": request}).data
return Response(response, status=201)
def delete(self, request: Request, *args, **kwargs) -> Response:
streams = Stream.objects.filter(provider=self.provider)
# Technically this parameter is required by the spec...
if "stream_id" in request.query_params:
streams = streams.filter(pk=request.query_params["stream_id"])
streams.delete()
return Response(status=204)
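
End to end, creating a stream is a single authenticated POST against the path registered in the SSF urls.py above. The body below mirrors the requests used by the tests in this diff; host, slug and token are placeholders:

import requests

resp = requests.post(
    "https://authentik.company/application/ssf/my-app/stream/",
    json={
        "iss": "https://authentik.company/.well-known/ssf-configuration/my-app",
        "aud": ["https://app.authentik.company"],
        "delivery": {
            "method": "https://schemas.openid.net/secevent/risc/delivery-method/push",
            "endpoint_url": "https://app.authentik.company",
        },
        "events_requested": [
            "https://schemas.openid.net/secevent/caep/event-type/credential-change",
            "https://schemas.openid.net/secevent/caep/event-type/session-revoked",
        ],
        "format": "iss_sub",
    },
    headers={"Authorization": "Bearer <api-or-oauth2-access-token>"},
)
assert resp.status_code == 201  # a SET verification event is queued immediately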

View File

@ -17,7 +17,6 @@ TENANT_APPS = [
"authentik.enterprise.providers.google_workspace",
"authentik.enterprise.providers.microsoft_entra",
"authentik.enterprise.providers.rac",
"authentik.enterprise.providers.ssf",
"authentik.enterprise.stages.authenticator_endpoint_gdtc",
"authentik.enterprise.stages.source",
]

View File

@ -1,11 +1,14 @@
"""AuthenticatorEndpointGDTCStage API Views"""
from django_filters.rest_framework.backends import DjangoFilterBackend
from rest_framework import mixins
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.permissions import IsAdminUser
from rest_framework.serializers import ModelSerializer
from rest_framework.viewsets import GenericViewSet, ModelViewSet
from structlog.stdlib import get_logger
from authentik.api.authorization import OwnerFilter, OwnerPermissions
from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.api import EnterpriseRequiredMixin
from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import (
@ -64,7 +67,8 @@ class EndpointDeviceViewSet(
search_fields = ["name"]
filterset_fields = ["name"]
ordering = ["name"]
owner_field = "user"
permission_classes = [OwnerPermissions]
filter_backends = [OwnerFilter, DjangoFilterBackend, OrderingFilter, SearchFilter]
class EndpointAdminDeviceViewSet(ModelViewSet):

View File

@ -1,15 +1,17 @@
"""Notification API Views"""
from django_filters.rest_framework import DjangoFilterBackend
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiResponse, extend_schema
from rest_framework import mixins
from rest_framework.decorators import action
from rest_framework.fields import ReadOnlyField
from rest_framework.permissions import IsAuthenticated
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import GenericViewSet
from authentik.api.authorization import OwnerFilter, OwnerPermissions
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import ModelSerializer
from authentik.events.api.events import EventSerializer
@ -55,7 +57,8 @@ class NotificationViewSet(
"seen",
"user",
]
owner_field = "user"
permission_classes = [OwnerPermissions]
filter_backends = [OwnerFilter, DjangoFilterBackend, OrderingFilter, SearchFilter]
@extend_schema(
request=OpenApiTypes.NONE,
@ -63,7 +66,7 @@ class NotificationViewSet(
204: OpenApiResponse(description="Marked tasks as read successfully."),
},
)
@action(detail=False, methods=["post"], permission_classes=[IsAuthenticated])
@action(detail=False, methods=["post"])
def mark_all_seen(self, request: Request) -> Response:
"""Mark all the user's notifications as seen"""
Notification.objects.filter(user=request.user, seen=False).update(seen=True)

View File

@ -1,41 +0,0 @@
# Generated by Django 5.0.10 on 2025-01-13 18:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_events", "0007_event_authentik_e_action_9a9dd9_idx_and_more"),
]
operations = [
migrations.AddIndex(
model_name="event",
index=models.Index(fields=["expires"], name="authentik_e_expires_8c73a8_idx"),
),
migrations.AddIndex(
model_name="event",
index=models.Index(fields=["expiring"], name="authentik_e_expirin_b5cb5e_idx"),
),
migrations.AddIndex(
model_name="event",
index=models.Index(
fields=["expiring", "expires"], name="authentik_e_expirin_e37180_idx"
),
),
migrations.AddIndex(
model_name="systemtask",
index=models.Index(fields=["expires"], name="authentik_e_expires_4d3985_idx"),
),
migrations.AddIndex(
model_name="systemtask",
index=models.Index(fields=["expiring"], name="authentik_e_expirin_81d649_idx"),
),
migrations.AddIndex(
model_name="systemtask",
index=models.Index(
fields=["expiring", "expires"], name="authentik_e_expirin_eb3598_idx"
),
),
]

View File

@ -306,7 +306,7 @@ class Event(SerializerModel, ExpiringModel):
class Meta:
verbose_name = _("Event")
verbose_name_plural = _("Events")
indexes = ExpiringModel.Meta.indexes + [
indexes = [
models.Index(fields=["action"]),
models.Index(fields=["user"]),
models.Index(fields=["app"]),
@ -694,4 +694,3 @@ class SystemTask(SerializerModel, ExpiringModel):
permissions = [("run_task", _("Run task"))]
verbose_name = _("System Task")
verbose_name_plural = _("System Tasks")
indexes = ExpiringModel.Meta.indexes
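
The Event, SystemTask, LicenseUsage and ConnectionToken hunks in this compare all revolve around the same Django pattern: index definitions declared once on ExpiringModel.Meta and re-used explicitly by each concrete model, because a subclass that declares its own Meta does not inherit the base's Meta automatically. A condensed sketch, treating ExpiringModel as an abstract base (which this diff implies but does not show):

from django.db import models

class ExpiringModel(models.Model):
    # Field names match the migrations in this compare.
    expires = models.DateTimeField(default=None, null=True)
    expiring = models.BooleanField(default=True)

    class Meta:
        abstract = True
        indexes = [
            models.Index(fields=["expires"]),
            models.Index(fields=["expiring"]),
            models.Index(fields=["expiring", "expires"]),
        ]

class Event(ExpiringModel):
    action = models.TextField()

    class Meta:
        # Re-use the shared indexes and extend them, as the hunk above does.
        indexes = ExpiringModel.Meta.indexes + [
            models.Index(fields=["action"]),
        ]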

View File

@ -106,9 +106,9 @@ def on_invitation_used(sender, request: HttpRequest, invitation: Invitation, **_
@receiver(password_changed)
def on_password_changed(sender, user: User, password: str, request: HttpRequest | None, **_):
def on_password_changed(sender, user: User, password: str, **_):
"""Log password change"""
Event.new(EventAction.PASSWORD_SET).from_http(request, user=user)
Event.new(EventAction.PASSWORD_SET).from_http(None, user=user)
@receiver(post_save, sender=Event)

View File

@ -53,13 +53,12 @@ class SystemTask(TenantTask):
if not isinstance(msg, LogEvent):
self._messages[idx] = LogEvent(msg, logger=self.__name__, log_level="info")
def set_error(self, exception: Exception, *messages: LogEvent):
def set_error(self, exception: Exception):
"""Set result to error and save exception"""
self._status = TaskStatus.ERROR
self._messages = list(messages)
self._messages.extend(
[LogEvent(exception_to_string(exception), logger=self.__name__, log_level="error")]
)
self._messages = [
LogEvent(exception_to_string(exception), logger=self.__name__, log_level="error")
]
def before_start(self, task_id, args, kwargs):
self._start_precise = perf_counter()

View File

@ -1,7 +1,5 @@
"""Flow Stage API Views"""
from uuid import uuid4
from django.urls.base import reverse
from drf_spectacular.utils import extend_schema
from rest_framework import mixins
@ -29,11 +27,6 @@ class StageSerializer(ModelSerializer, MetaNameSerializer):
component = SerializerMethodField()
flow_set = FlowSetSerializer(many=True, required=False)
def to_representation(self, instance: Stage):
if isinstance(instance, Stage) and instance.is_in_memory:
instance.stage_uuid = uuid4()
return super().to_representation(instance)
def get_component(self, obj: Stage) -> str:
"""Get object type so that we know how to edit the object"""
if obj.__class__ == Stage:

View File

@ -3,7 +3,6 @@
from dataclasses import dataclass
from typing import TYPE_CHECKING
from django.contrib.messages import INFO, add_message
from django.http.request import HttpRequest
from structlog.stdlib import get_logger
@ -62,8 +61,6 @@ class ReevaluateMarker(StageMarker):
engine.request.context.update(plan.context)
engine.build()
result = engine.result
for message in result.messages:
add_message(http_request, INFO, message)
if result.passing:
return binding
LOGGER.warning(

View File

@ -88,8 +88,7 @@ class Migration(migrations.Migration):
model_name="flowstagebinding",
name="re_evaluate_policies",
field=models.BooleanField(
default=False,
help_text="Evaluate policies when the Stage is presented to the user.",
default=False, help_text="Evaluate policies when the Stage is present to the user."
),
),
migrations.AddField(

View File

@ -20,7 +20,7 @@ class Migration(migrations.Migration):
model_name="flowstagebinding",
name="re_evaluate_policies",
field=models.BooleanField(
default=True, help_text="Evaluate policies when the Stage is presented to the user."
default=True, help_text="Evaluate policies when the Stage is present to the user."
),
),
]

View File

@ -102,12 +102,8 @@ class Stage(SerializerModel):
user settings are available, or a challenge."""
return None
@property
def is_in_memory(self):
return hasattr(self, "__in_memory_type")
def __str__(self):
if self.is_in_memory:
if hasattr(self, "__in_memory_type"):
return f"In-memory Stage {getattr(self, '__in_memory_type')}"
return f"Stage {self.name}"
@ -231,7 +227,7 @@ class FlowStageBinding(SerializerModel, PolicyBindingModel):
)
re_evaluate_policies = models.BooleanField(
default=True,
help_text=_("Evaluate policies when the Stage is presented to the user."),
help_text=_("Evaluate policies when the Stage is present to the user."),
)
invalid_response_action = models.TextField(

View File

@ -166,17 +166,9 @@ class FlowPlan:
temp_exec.stage_ok()
return response
get_qs = request.GET.copy()
if request.user.is_authenticated and (
# Object-scoped permission or global permission
request.user.has_perm("authentik_flows.inspect_flow", flow)
or request.user.has_perm("authentik_flows.inspect_flow")
):
get_qs["inspector"] = "available"
return redirect_with_qs(
"authentik_core:if-flow",
get_qs,
request.GET,
flow_slug=flow.slug,
)

View File

@ -7,8 +7,8 @@ from django.http import HttpRequest, HttpResponse
from django.test.client import RequestFactory
from django.urls import reverse
from authentik.core.models import Group, User
from authentik.core.tests.utils import create_test_flow, create_test_user
from authentik.core.models import User
from authentik.core.tests.utils import create_test_flow
from authentik.flows.markers import ReevaluateMarker, StageMarker
from authentik.flows.models import (
FlowDeniedAction,
@ -255,11 +255,7 @@ class TestFlowExecutor(FlowTestCase):
)
binding = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=0,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=0
)
binding2 = FlowStageBinding.objects.create(
target=flow,
@ -282,8 +278,8 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[0], binding)
self.assertEqual(plan.bindings[1], binding2)
self.assertEqual(plan.markers[0].__class__, StageMarker)
self.assertEqual(plan.markers[1].__class__, ReevaluateMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], ReevaluateMarker)
# Second request, this passes the first dummy stage
response = self.client.post(exec_url)
@ -305,11 +301,7 @@ class TestFlowExecutor(FlowTestCase):
)
binding = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=0,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=0
)
binding2 = FlowStageBinding.objects.create(
target=flow,
@ -318,11 +310,7 @@ class TestFlowExecutor(FlowTestCase):
re_evaluate_policies=True,
)
binding3 = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=2,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=2
)
PolicyBinding.objects.create(policy=false_policy, target=binding2, order=0)
@ -340,9 +328,9 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[1], binding2)
self.assertEqual(plan.bindings[2], binding3)
self.assertEqual(plan.markers[0].__class__, StageMarker)
self.assertEqual(plan.markers[1].__class__, ReevaluateMarker)
self.assertEqual(plan.markers[2].__class__, StageMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], ReevaluateMarker)
self.assertIsInstance(plan.markers[2], StageMarker)
# Second request, this passes the first dummy stage
response = self.client.post(exec_url)
@ -353,8 +341,8 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[0], binding2)
self.assertEqual(plan.bindings[1], binding3)
self.assertEqual(plan.markers[0].__class__, ReevaluateMarker)
self.assertEqual(plan.markers[1].__class__, StageMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], StageMarker)
# third request, this should trigger the re-evaluate
# We do this request without the patch, so the policy results in false
@ -372,11 +360,7 @@ class TestFlowExecutor(FlowTestCase):
)
binding = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=0,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=0
)
binding2 = FlowStageBinding.objects.create(
target=flow,
@ -385,11 +369,7 @@ class TestFlowExecutor(FlowTestCase):
re_evaluate_policies=True,
)
binding3 = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=2,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=2
)
PolicyBinding.objects.create(policy=true_policy, target=binding2, order=0)
@ -407,9 +387,9 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[1], binding2)
self.assertEqual(plan.bindings[2], binding3)
self.assertEqual(plan.markers[0].__class__, StageMarker)
self.assertEqual(plan.markers[1].__class__, ReevaluateMarker)
self.assertEqual(plan.markers[2].__class__, StageMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], ReevaluateMarker)
self.assertIsInstance(plan.markers[2], StageMarker)
# Second request, this passes the first dummy stage
response = self.client.post(exec_url)
@ -420,8 +400,8 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[0], binding2)
self.assertEqual(plan.bindings[1], binding3)
self.assertEqual(plan.markers[0].__class__, ReevaluateMarker)
self.assertEqual(plan.markers[1].__class__, StageMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], StageMarker)
# Third request, this passes the first dummy stage
response = self.client.post(exec_url)
@ -431,7 +411,7 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[0], binding3)
self.assertEqual(plan.markers[0].__class__, StageMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
# third request, this should trigger the re-evaluate
# We do this request without the patch, so the policy results in false
@ -449,11 +429,7 @@ class TestFlowExecutor(FlowTestCase):
)
binding = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=0,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=0
)
binding2 = FlowStageBinding.objects.create(
target=flow,
@ -468,11 +444,7 @@ class TestFlowExecutor(FlowTestCase):
re_evaluate_policies=True,
)
binding4 = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=2,
evaluate_on_plan=True,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=2
)
PolicyBinding.objects.create(policy=false_policy, target=binding2, order=0)
@ -493,10 +465,10 @@ class TestFlowExecutor(FlowTestCase):
self.assertEqual(plan.bindings[2], binding3)
self.assertEqual(plan.bindings[3], binding4)
self.assertEqual(plan.markers[0].__class__, StageMarker)
self.assertEqual(plan.markers[1].__class__, ReevaluateMarker)
self.assertEqual(plan.markers[2].__class__, ReevaluateMarker)
self.assertEqual(plan.markers[3].__class__, StageMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], ReevaluateMarker)
self.assertIsInstance(plan.markers[2], ReevaluateMarker)
self.assertIsInstance(plan.markers[3], StageMarker)
# Second request, this passes the first dummy stage
response = self.client.post(exec_url)
@ -547,9 +519,9 @@ class TestFlowExecutor(FlowTestCase):
)
# Stage 0 is a deny stage that is added dynamically
# when the reputation policy says so
deny_stage = DenyStage.objects.create(name=generate_id())
deny_stage = DenyStage.objects.create(name="deny")
reputation_policy = ReputationPolicy.objects.create(
name=generate_id(), threshold=-1, check_ip=False
name="reputation", threshold=-1, check_ip=False
)
deny_binding = FlowStageBinding.objects.create(
target=flow,
@ -562,7 +534,7 @@ class TestFlowExecutor(FlowTestCase):
# Stage 1 is an identification stage
ident_stage = IdentificationStage.objects.create(
name=generate_id(),
name="ident",
user_fields=[UserFields.E_MAIL],
pretend_user_exists=False,
)
@ -587,64 +559,3 @@ class TestFlowExecutor(FlowTestCase):
)
response = self.client.post(exec_url, {"uid_field": "invalid-string"}, follow=True)
self.assertStageResponse(response, flow, component="ak-stage-access-denied")
def test_re_evaluate_group_binding(self):
"""Test re-evaluate stage binding that has a policy binding to a group"""
flow = create_test_flow()
user_group_membership = create_test_user()
user_direct_binding = create_test_user()
user_other = create_test_user()
group_a = Group.objects.create(name=generate_id())
user_group_membership.ak_groups.add(group_a)
# Stage 0 is an identification stage
ident_stage = IdentificationStage.objects.create(
name=generate_id(),
user_fields=[UserFields.USERNAME],
pretend_user_exists=False,
)
FlowStageBinding.objects.create(
target=flow,
stage=ident_stage,
order=0,
)
# Stage 1 is a dummy stage that is only shown for users in group_a
dummy_stage = DummyStage.objects.create(name=generate_id())
dummy_binding = FlowStageBinding.objects.create(target=flow, stage=dummy_stage, order=1)
PolicyBinding.objects.create(group=group_a, target=dummy_binding, order=0)
PolicyBinding.objects.create(user=user_direct_binding, target=dummy_binding, order=0)
# Stage 2 is a deny stage that (in this case) only user_b will see
deny_stage = DenyStage.objects.create(name=generate_id())
FlowStageBinding.objects.create(target=flow, stage=deny_stage, order=2)
exec_url = reverse("authentik_api:flow-executor", kwargs={"flow_slug": flow.slug})
with self.subTest(f"Test user access through group: {user_group_membership}"):
self.client.logout()
# First request, run the planner
response = self.client.get(exec_url)
self.assertStageResponse(response, flow, component="ak-stage-identification")
response = self.client.post(
exec_url, {"uid_field": user_group_membership.username}, follow=True
)
self.assertStageResponse(response, flow, component="ak-stage-dummy")
with self.subTest(f"Test user access through user: {user_direct_binding}"):
self.client.logout()
# First request, run the planner
response = self.client.get(exec_url)
self.assertStageResponse(response, flow, component="ak-stage-identification")
response = self.client.post(
exec_url, {"uid_field": user_direct_binding.username}, follow=True
)
self.assertStageResponse(response, flow, component="ak-stage-dummy")
with self.subTest(f"Test user has no access: {user_other}"):
self.client.logout()
# First request, run the planner
response = self.client.get(exec_url)
self.assertStageResponse(response, flow, component="ak-stage-identification")
response = self.client.post(exec_url, {"uid_field": user_other.username}, follow=True)
self.assertStageResponse(response, flow, component="ak-stage-access-denied")

View File

@ -8,7 +8,6 @@ from rest_framework.test import APITestCase
from authentik.core.tests.utils import create_test_admin_user, create_test_flow
from authentik.flows.models import FlowDesignation, FlowStageBinding, InvalidResponseAction
from authentik.lib.generators import generate_id
from authentik.stages.dummy.models import DummyStage
from authentik.stages.identification.models import IdentificationStage, UserFields
@ -27,7 +26,7 @@ class TestFlowInspector(APITestCase):
# Stage 1 is an identification stage
ident_stage = IdentificationStage.objects.create(
name=generate_id(),
name="ident",
user_fields=[UserFields.USERNAME],
)
FlowStageBinding.objects.create(
@ -36,8 +35,9 @@ class TestFlowInspector(APITestCase):
order=1,
invalid_response_action=InvalidResponseAction.RESTART_WITH_CONTEXT,
)
dummy_stage = DummyStage.objects.create(name=generate_id())
FlowStageBinding.objects.create(target=flow, stage=dummy_stage, order=1)
FlowStageBinding.objects.create(
target=flow, stage=DummyStage.objects.create(name="dummy2"), order=1
)
res = self.client.get(
reverse("authentik_api:flow-executor", kwargs={"flow_slug": flow.slug}),
@ -68,11 +68,9 @@ class TestFlowInspector(APITestCase):
)
content = loads(ins.content)
self.assertEqual(content["is_completed"], False)
self.assertEqual(content["current_plan"]["current_stage"]["stage_obj"]["name"], "ident")
self.assertEqual(
content["current_plan"]["current_stage"]["stage_obj"]["name"], ident_stage.name
)
self.assertEqual(
content["current_plan"]["next_planned_stage"]["stage_obj"]["name"], dummy_stage.name
content["current_plan"]["next_planned_stage"]["stage_obj"]["name"], "dummy2"
)
self.client.post(
@ -86,12 +84,8 @@ class TestFlowInspector(APITestCase):
)
content = loads(ins.content)
self.assertEqual(content["is_completed"], False)
self.assertEqual(
content["plans"][0]["current_stage"]["stage_obj"]["name"], ident_stage.name
)
self.assertEqual(
content["current_plan"]["current_stage"]["stage_obj"]["name"], dummy_stage.name
)
self.assertEqual(content["plans"][0]["current_stage"]["stage_obj"]["name"], "ident")
self.assertEqual(content["current_plan"]["current_stage"]["stage_obj"]["name"], "dummy2")
self.assertEqual(
content["current_plan"]["plan_context"]["pending_user"]["username"], self.admin.username
)

View File

@ -29,7 +29,6 @@ from authentik.flows.planner import (
cache_key,
)
from authentik.flows.stage import StageView
from authentik.lib.generators import generate_id
from authentik.lib.tests.utils import dummy_get_response
from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.models import Outpost
@ -154,7 +153,7 @@ class TestFlowPlanner(TestCase):
"""Test planner cache"""
flow = create_test_flow(FlowDesignation.AUTHENTICATION)
FlowStageBinding.objects.create(
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=0
target=flow, stage=DummyStage.objects.create(name="dummy"), order=0
)
request = self.request_factory.get(
reverse("authentik_api:flow-executor", kwargs={"flow_slug": flow.slug}),
@ -173,7 +172,7 @@ class TestFlowPlanner(TestCase):
"""Test planner with default_context"""
flow = create_test_flow()
FlowStageBinding.objects.create(
target=flow, stage=DummyStage.objects.create(name=generate_id()), order=0
target=flow, stage=DummyStage.objects.create(name="dummy"), order=0
)
user = User.objects.create(username="test-user")
@ -192,7 +191,7 @@ class TestFlowPlanner(TestCase):
FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
stage=DummyStage.objects.create(name="dummy1"),
order=0,
re_evaluate_policies=True,
)
@ -205,7 +204,7 @@ class TestFlowPlanner(TestCase):
planner = FlowPlanner(flow)
plan = planner.plan(request)
self.assertEqual(plan.markers[0].__class__, ReevaluateMarker)
self.assertIsInstance(plan.markers[0], ReevaluateMarker)
def test_planner_reevaluate_actual(self):
"""Test planner with re-evaluate"""
@ -213,14 +212,11 @@ class TestFlowPlanner(TestCase):
false_policy = DummyPolicy.objects.create(result=False, wait_min=1, wait_max=2)
binding = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
order=0,
re_evaluate_policies=False,
target=flow, stage=DummyStage.objects.create(name="dummy1"), order=0
)
binding2 = FlowStageBinding.objects.create(
target=flow,
stage=DummyStage.objects.create(name=generate_id()),
stage=DummyStage.objects.create(name="dummy2"),
order=1,
re_evaluate_policies=True,
)
@ -244,8 +240,6 @@ class TestFlowPlanner(TestCase):
self.assertEqual(plan.bindings[0], binding)
self.assertEqual(plan.bindings[1], binding2)
self.assertEqual(plan.markers[0].__class__, StageMarker)
self.assertEqual(plan.markers[1].__class__, ReevaluateMarker)
self.assertIsInstance(plan.markers[0], StageMarker)
self.assertIsInstance(plan.markers[1], ReevaluateMarker)

View File

@ -78,9 +78,9 @@ class FlowInspectorView(APIView):
self.flow = get_object_or_404(Flow.objects.select_related(), slug=flow_slug)
if settings.DEBUG:
return
if request.user.has_perm(
if request.user.has_perm("authentik_flows.inspect_flow") or request.user.has_perm(
"authentik_flows.inspect_flow", self.flow
) or request.user.has_perm("authentik_flows.inspect_flow"):
):
return
raise Http404
@ -96,9 +96,6 @@ class FlowInspectorView(APIView):
"""Get current flow state and record it"""
plans = []
for plan in request.session.get(SESSION_KEY_HISTORY, []):
plan: FlowPlan
if plan.flow_pk != self.flow.pk.hex:
continue
plan_serializer = FlowInspectorPlanSerializer(
instance=plan, context={"request": request}
)

authentik/lib/api.py (new file, 37 lines)
View File

@ -0,0 +1,37 @@
from collections.abc import Callable, Sequence
from typing import Self
from uuid import UUID
from django.db.models import Model, Q, QuerySet, UUIDField
from django.shortcuts import get_object_or_404
class MultipleFieldLookupMixin:
"""Helper mixin class to add support for multiple lookup_fields.
`lookup_fields` must be set to the field(s) that are actually queried; `lookup_field`
is only used to generate the URL."""
lookup_field: str
lookup_fields: str | Sequence[str]
get_queryset: Callable[[Self], QuerySet]
filter_queryset: Callable[[Self, QuerySet], QuerySet]
def get_object(self):
queryset: QuerySet = self.get_queryset()
queryset = self.filter_queryset(queryset)
if isinstance(self.lookup_fields, str):
self.lookup_fields = [self.lookup_fields]
query = Q()
model: Model = queryset.model
for field in self.lookup_fields:
field_inst = model._meta.get_field(field)
# Sanity check: if the field we're filtering against is a UUID field, only
# apply the filter if our value looks like a UUID
if isinstance(field_inst, UUIDField):
try:
UUID(self.kwargs[self.lookup_field])
except ValueError:
continue
query |= Q(**{field: self.kwargs[self.lookup_field]})
return get_object_or_404(queryset, query)
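The mixin above resolves a single URL keyword argument against several model fields, ORing them together and skipping UUID fields when the captured value is not a valid UUID. A minimal usage sketch, assuming a DRF generic view; the view name and the field names in lookup_fields are hypothetical.

# Minimal usage sketch; MySourceView and the field names are hypothetical.
from rest_framework.generics import RetrieveAPIView

from authentik.lib.api import MultipleFieldLookupMixin

class MySourceView(MultipleFieldLookupMixin, RetrieveAPIView):
    # queryset and serializer_class omitted for brevity
    lookup_field = "slug"                 # only names the captured URL kwarg
    lookup_fields = ["pbm_uuid", "slug"]  # fields actually ORed in get_object()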

View File

@ -289,9 +289,7 @@ class ConfigLoader:
return int(value)
except (ValueError, TypeError) as exc:
if value is None or (isinstance(value, str) and value.lower() == "null"):
return default
if value is UNSET:
return default
return None
self.log("warning", "Failed to parse config as int", path=path, exc=str(exc))
return default
@ -424,4 +422,4 @@ if __name__ == "__main__":
if len(argv) < 2: # noqa: PLR2004
print(dumps(CONFIG.raw, indent=4, cls=AttrEncoder))
else:
print(CONFIG.get(argv[-1]))
print(CONFIG.get(argv[1]))
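For orientation, a hedged sketch of how this loader is consumed elsewhere in this diff: CONFIG.get() takes a dotted path with an optional default, and CONFIG.get_bool() coerces to a boolean. The keys below come from the default configuration file shown further down; nothing beyond that is asserted about the API.

# Hedged sketch of reading values through the loader above.
from authentik.lib.config import CONFIG

db_password = CONFIG.get("postgresql.password")   # the default config uses an env:// reference here
debug_enabled = CONFIG.get_bool("debug")
listen_radius = CONFIG.get("listen.listen_radius", "0.0.0.0:1812")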

View File

@ -1,26 +0,0 @@
from structlog.stdlib import get_logger
from authentik.lib.config import CONFIG
LOGGER = get_logger()
def start_debug_server(**kwargs) -> bool:
"""Attempt to start a debugpy server in the current process.
Returns true if the server was started successfully, otherwise false"""
if not CONFIG.get_bool("debug") and not CONFIG.get_bool("debugger"):
return
try:
import debugpy
except ImportError:
LOGGER.warning(
"Failed to import debugpy. debugpy is not included "
"in the default release dependencies and must be installed manually"
)
return False
listen: str = CONFIG.get("listen.listen_debug_py", "127.0.0.1:9901")
host, _, port = listen.rpartition(":")
debugpy.listen((host, int(port)), **kwargs) # nosec
LOGGER.debug("Starting debug server", host=host, port=port)
return True
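A hedged sketch of where such a helper is typically wired in: once, early in a process entrypoint, before request handling starts. The import path below is an assumption, since the file name is not visible in this extract; if a deployment wants to block until an IDE attaches, debugpy's wait_for_client() could follow, but that is not part of the helper shown.

# Hedged sketch; the module path is an assumption.
from authentik.lib.debug import start_debug_server

# Call once at process start-up; it is a no-op unless one of the debug/debugger
# config flags above is enabled, and it returns False if debugpy is missing.
start_debug_server()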

View File

@ -8,7 +8,6 @@ postgresql:
password: "env://POSTGRES_PASSWORD"
test:
name: test_authentik
default_schema: public
read_replicas: {}
# For example
# 0:
@ -22,7 +21,6 @@ listen:
listen_radius: 0.0.0.0:1812
listen_metrics: 0.0.0.0:9300
listen_debug: 0.0.0.0:9900
listen_debug_py: 0.0.0.0:9901
trusted_proxy_cidrs:
- 127.0.0.0/8
- 10.0.0.0/8
@ -59,7 +57,7 @@ cache:
# transport_options: ""
debug: false
debugger: false
remote_debug: false
log_level: info

View File

@ -9,25 +9,20 @@ from typing import Any
from cachetools import TLRUCache, cached
from django.core.exceptions import FieldError
from django.http import HttpRequest
from django.utils.text import slugify
from django.utils.timezone import now
from guardian.shortcuts import get_anonymous_user
from rest_framework.serializers import ValidationError
from sentry_sdk import start_span
from sentry_sdk.tracing import Span
from structlog.stdlib import get_logger
from authentik.core.models import AuthenticatedSession, User
from authentik.core.models import User
from authentik.events.models import Event
from authentik.lib.expression.exceptions import ControlFlowException
from authentik.lib.utils.http import get_http_session
from authentik.lib.utils.time import timedelta_from_string
from authentik.policies.models import Policy, PolicyBinding
from authentik.policies.process import PolicyProcess
from authentik.policies.types import PolicyRequest, PolicyResult
from authentik.providers.oauth2.id_token import IDToken
from authentik.providers.oauth2.models import AccessToken, OAuth2Provider
from authentik.stages.authenticator import devices_for_user
LOGGER = get_logger()
@ -61,7 +56,6 @@ class BaseEvaluator:
"ak_logger": get_logger(self._filename).bind(),
"ak_user_by": BaseEvaluator.expr_user_by,
"ak_user_has_authenticator": BaseEvaluator.expr_func_user_has_authenticator,
"ak_create_jwt": self.expr_create_jwt,
"ip_address": ip_address,
"ip_network": ip_network,
"list_flatten": BaseEvaluator.expr_flatten,
@ -188,36 +182,6 @@ class BaseEvaluator:
proc = PolicyProcess(PolicyBinding(policy=policy), request=req, connection=None)
return proc.profiling_wrapper()
def expr_create_jwt(
self,
user: User,
provider: OAuth2Provider | str,
scopes: list[str],
validity: str = "seconds=60",
) -> str | None:
"""Issue a JWT for a given provider"""
request: HttpRequest = self._context.get("http_request")
if not request:
return None
if not isinstance(provider, OAuth2Provider):
provider = OAuth2Provider.objects.get(name=provider)
session = None
if hasattr(request, "session") and request.session.session_key:
session = AuthenticatedSession.objects.filter(
session_key=request.session.session_key
).first()
access_token = AccessToken(
provider=provider,
user=user,
expires=now() + timedelta_from_string(validity),
scope=scopes,
auth_time=now(),
session=session,
)
access_token.id_token = IDToken.new(provider, access_token, request)
access_token.save()
return access_token.token
def wrap_expression(self, expression: str) -> str:
"""Wrap expression in a function, call it, and save the result as `result`"""
handler_signature = ",".join(sanitize_arg(x) for x in self._context.keys())
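The removed expr_create_jwt helper above is exposed to expressions as ak_create_jwt (see the evaluator globals earlier in this file's diff), and the evaluator test further down in this diff exercises it. For context, an illustrative expression body; the provider name is hypothetical, and the validity string uses the same key=value format as the "seconds=60" default, parsed by timedelta_from_string.

# Illustrative expression body (provider name hypothetical); assumes a context
# where `user` is available, as in the evaluator test later in this diff.
return ak_create_jwt(user, "my-oauth2-provider", ["openid", "email", "profile"], validity="seconds=60")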

View File

@ -22,9 +22,9 @@ class OutgoingSyncProvider(Model):
class Meta:
abstract = True
def client_for_model[T: User | Group](
self, model: type[T]
) -> BaseOutgoingSyncClient[T, Any, Any, Self]:
def client_for_model[
T: User | Group
](self, model: type[T]) -> BaseOutgoingSyncClient[T, Any, Any, Self]:
raise NotImplementedError
def get_object_qs[T: User | Group](self, type: type[T]) -> QuerySet[T]:
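Both spellings above declare the same PEP 695 generic method; only the formatter's line-wrapping differs. As a hedged call sketch, where `provider` stands for any concrete subclass that actually implements the method (the base class only raises NotImplementedError):

# Hedged call sketch; `provider` is a hypothetical concrete OutgoingSyncProvider.
user_client = provider.client_for_model(User)   # BaseOutgoingSyncClient[User, ...]
user_qs = provider.get_object_qs(User)          # QuerySet[User]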

View File

@ -1,15 +1,11 @@
"""Test Evaluator base functions"""
from django.test import RequestFactory, TestCase
from django.urls import reverse
from jwt import decode
from django.test import TestCase
from authentik.blueprints.tests import apply_blueprint
from authentik.core.tests.utils import create_test_admin_user, create_test_flow, create_test_user
from authentik.core.tests.utils import create_test_admin_user
from authentik.events.models import Event
from authentik.lib.expression.evaluator import BaseEvaluator
from authentik.lib.generators import generate_id
from authentik.providers.oauth2.models import OAuth2Provider, ScopeMapping
class TestEvaluator(TestCase):
@ -45,35 +41,3 @@ class TestEvaluator(TestCase):
event = Event.objects.filter(action="custom_foo").first()
self.assertIsNotNone(event)
self.assertEqual(event.context, {"bar": "baz", "foo": "bar"})
@apply_blueprint("system/providers-oauth2.yaml")
def test_expr_create_jwt(self):
"""Test expr_create_jwt"""
rf = RequestFactory()
user = create_test_user()
provider = OAuth2Provider.objects.create(
name=generate_id(),
authorization_flow=create_test_flow(),
)
provider.property_mappings.set(
ScopeMapping.objects.filter(
managed__in=[
"goauthentik.io/providers/oauth2/scope-openid",
"goauthentik.io/providers/oauth2/scope-email",
"goauthentik.io/providers/oauth2/scope-profile",
]
)
)
evaluator = BaseEvaluator(generate_id())
evaluator._context = {
"http_request": rf.get(reverse("authentik_core:root-redirect")),
"user": user,
"provider": provider.name,
}
jwt = evaluator.evaluate(
"return ak_create_jwt(user, provider, ['openid', 'email', 'profile'])"
)
decoded = decode(
jwt, provider.client_secret, algorithms=["HS256"], audience=provider.client_id
)
self.assertEqual(decoded["preferred_username"], user.username)

View File

@ -42,8 +42,6 @@ class DebugSession(Session):
def get_http_session() -> Session:
"""Get a requests session with common headers"""
session = Session()
if CONFIG.get_bool("debug") or CONFIG.get("log_level") == "trace":
session = DebugSession()
session = DebugSession() if CONFIG.get_bool("debug") else Session()
session.headers["User-Agent"] = authentik_user_agent()
return session

View File

@ -207,7 +207,7 @@ class KubernetesObjectReconciler(Generic[T]):
"app.kubernetes.io/instance": slugify(self.controller.outpost.name),
"app.kubernetes.io/managed-by": "goauthentik.io",
"app.kubernetes.io/name": f"authentik-{self.controller.outpost.type.lower()}",
"app.kubernetes.io/version": get_version().replace("+", "-"),
"app.kubernetes.io/version": get_version(),
"goauthentik.io/outpost-name": slugify(self.controller.outpost.name),
"goauthentik.io/outpost-type": str(self.controller.outpost.type),
"goauthentik.io/outpost-uuid": self.controller.outpost.uuid.hex,

View File

@ -94,7 +94,7 @@ class DeploymentReconciler(KubernetesObjectReconciler[V1Deployment]):
meta = self.get_object_meta(name=self.name)
image_name = self.controller.get_container_image()
image_pull_secrets = self.outpost.config.kubernetes_image_pull_secrets
version = get_full_version().replace("+", "-")
version = get_full_version()
return V1Deployment(
metadata=meta,
spec=V1DeploymentSpec(
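The .replace("+", "-") calls in this hunk and the previous one strip the semver build separator before the version string reaches Kubernetes object fields; in the previous file it is used as the app.kubernetes.io/version label value, and Kubernetes label values may only contain alphanumerics, '-', '_' and '.'. A tiny illustration with a made-up version string:

# Illustration only; the version string is made up.
version = "2024.12.4+3f2a1c"              # a build suffix uses '+'
label_value = version.replace("+", "-")   # "2024.12.4-3f2a1c" is a valid label value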

View File

@ -13,7 +13,7 @@ if TYPE_CHECKING:
from authentik.outposts.controllers.kubernetes import KubernetesController
@dataclass(slots=True)
@dataclass
class PrometheusServiceMonitorSpecEndpoint:
"""Prometheus ServiceMonitor endpoint spec"""
@ -21,14 +21,14 @@ class PrometheusServiceMonitorSpecEndpoint:
path: str = field(default="/metrics")
@dataclass(slots=True)
@dataclass
class PrometheusServiceMonitorSpecSelector:
"""Prometheus ServiceMonitor selector spec"""
matchLabels: dict
@dataclass(slots=True)
@dataclass
class PrometheusServiceMonitorSpec:
"""Prometheus ServiceMonitor spec"""
@ -37,7 +37,7 @@ class PrometheusServiceMonitorSpec:
selector: PrometheusServiceMonitorSpecSelector
@dataclass(slots=True)
@dataclass
class PrometheusServiceMonitorMetadata:
"""Prometheus ServiceMonitor metadata"""
@ -46,7 +46,7 @@ class PrometheusServiceMonitorMetadata:
labels: dict = field(default_factory=dict)
@dataclass(slots=True)
@dataclass
class PrometheusServiceMonitor:
"""Prometheus ServiceMonitor"""

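The slots=True variant above requires Python 3.10+ and gives these spec dataclasses a fixed attribute set with lower per-instance memory; the practical difference is that a mistyped attribute fails immediately instead of silently creating a new one. A standalone illustration, not authentik code:

# Standalone illustration of the slots=True behaviour.
from dataclasses import dataclass, field

@dataclass(slots=True)
class Endpoint:
    port: str
    path: str = field(default="/metrics")

e = Endpoint(port="http")
e.path = "/healthz"   # declared field: fine
# e.pathh = "/oops"   # AttributeError with slots=True; a silent new attribute without it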
View File

@ -1,26 +1,11 @@
"""Expression Policy API"""
from drf_spectacular.utils import OpenApiResponse, extend_schema
from guardian.shortcuts import get_objects_for_user
from rest_framework.decorators import action
from rest_framework.fields import CharField
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet
from structlog.stdlib import get_logger
from authentik.core.api.used_by import UsedByMixin
from authentik.events.logs import LogEventSerializer, capture_logs
from authentik.policies.api.exec import PolicyTestResultSerializer, PolicyTestSerializer
from authentik.policies.api.policies import PolicySerializer
from authentik.policies.expression.evaluator import PolicyEvaluator
from authentik.policies.expression.models import ExpressionPolicy
from authentik.policies.models import PolicyBinding
from authentik.policies.process import PolicyProcess
from authentik.policies.types import PolicyRequest
from authentik.rbac.decorators import permission_required
LOGGER = get_logger()
class ExpressionPolicySerializer(PolicySerializer):
@ -45,50 +30,3 @@ class ExpressionPolicyViewSet(UsedByMixin, ModelViewSet):
filterset_fields = "__all__"
ordering = ["name"]
search_fields = ["name"]
class ExpressionPolicyTestSerializer(PolicyTestSerializer):
"""Expression policy test serializer"""
expression = CharField()
@permission_required("authentik_policies.view_policy")
@extend_schema(
request=ExpressionPolicyTestSerializer(),
responses={
200: PolicyTestResultSerializer(),
400: OpenApiResponse(description="Invalid parameters"),
},
)
@action(detail=True, pagination_class=None, filter_backends=[], methods=["POST"])
def test(self, request: Request, pk: str) -> Response:
"""Test policy"""
policy = self.get_object()
test_params = self.ExpressionPolicyTestSerializer(data=request.data)
if not test_params.is_valid():
return Response(test_params.errors, status=400)
# User permission check, only allow policy testing for users that are readable
users = get_objects_for_user(request.user, "authentik_core.view_user").filter(
pk=test_params.validated_data["user"].pk
)
if not users.exists():
return Response(status=400)
policy.expression = test_params.validated_data["expression"]
p_request = PolicyRequest(users.first())
p_request.debug = True
p_request.set_http_request(self.request)
p_request.context = test_params.validated_data.get("context", {})
proc = PolicyProcess(PolicyBinding(policy=policy), p_request, None)
with capture_logs() as logs:
result = proc.execute()
log_messages = []
for log in logs:
if log.attributes.get("process", "") == "PolicyProcess":
continue
log_messages.append(LogEventSerializer(log).data)
result.log_messages = log_messages
response = PolicyTestResultSerializer(result)
return Response(response.data)

View File

@ -1,30 +0,0 @@
# Generated by Django 5.0.10 on 2025-01-13 18:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
(
"authentik_policies_reputation",
"0007_reputation_authentik_p_identif_9434d7_idx_and_more",
),
]
operations = [
migrations.AddIndex(
model_name="reputation",
index=models.Index(fields=["expires"], name="authentik_p_expires_da493f_idx"),
),
migrations.AddIndex(
model_name="reputation",
index=models.Index(fields=["expiring"], name="authentik_p_expirin_2ab34f_idx"),
),
migrations.AddIndex(
model_name="reputation",
index=models.Index(
fields=["expiring", "expires"], name="authentik_p_expirin_2a8ec7_idx"
),
),
]

View File

@ -96,7 +96,7 @@ class Reputation(ExpiringModel, SerializerModel):
verbose_name = _("Reputation Score")
verbose_name_plural = _("Reputation Scores")
unique_together = ("identifier", "ip")
indexes = ExpiringModel.Meta.indexes + [
indexes = [
models.Index(fields=["identifier"]),
models.Index(fields=["ip"]),
models.Index(fields=["ip", "identifier"]),
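The ExpiringModel.Meta.indexes + [...] form above is needed because this model declares its own Meta, so the abstract parent's index list is not merged in automatically; concatenating it keeps the parent's expires/expiring indexes, which the migration above materialises for this model, alongside the reputation-specific ones. A generic sketch of the pattern with made-up model names:

# Generic sketch of the pattern; model names are made up.
from django.db import models

class ExpiringBase(models.Model):
    expires = models.DateTimeField()
    expiring = models.BooleanField(default=True)

    class Meta:
        abstract = True
        indexes = [
            models.Index(fields=["expires"]),
            models.Index(fields=["expiring"]),
        ]

class Score(ExpiringBase):
    identifier = models.TextField()

    class Meta:
        # Declaring Meta here would otherwise drop the parent's indexes,
        # so the lists are concatenated explicitly, mirroring the diff above.
        indexes = ExpiringBase.Meta.indexes + [
            models.Index(fields=["identifier"]),
        ]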

Some files were not shown because too many files have changed in this diff.