Compare commits

13 Commits

Author SHA1 Message Date
00c1e17b52 Merge branch 'main' into web/add-command-palette
Signed-off-by: Jens Langhammer <jens@goauthentik.io>

# Conflicts:
#	web/package-lock.json
#	web/package.json
#	web/src/admin/AdminInterface/AdminInterface.ts
2025-06-14 02:17:24 +02:00
cba8e84bbe fix accent color
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2024-03-17 03:18:50 +01:00
d313fd7fb4 set dark theme
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2024-03-17 03:10:02 +01:00
102811508f fix z-index
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2024-03-17 03:09:53 +01:00
16b3ca3715 web: add command palette
Adds [Ninja Keys](https://github.com/ssleptsov/ninja-keys) (MIT License) to the Admin Interface.
This is a trivial demo. Start the UI and begin by pressing CMD-K or CTRL-K in the Admin Interface.

- Because a lot of the Create and Update operations are accessible as activations-on-modals rather
  than navigable components in their own right, we don't have a way to encode commands like "Start
  the Application Wizard" or "Add a new Policy."
- Our search isn't very quick, so "Edit Application Foo" would have intolerable delays while we
  looked up a list of applications that could be edited.
- The `icon` feature is relatively difficult to exploit because it's a web component and its inner
  styles are inaccessible.
- There seem to be cases where the modal-popup and charts conflict.

If we want to move forward with this, the challenges are finding activation mechanisms for the
various Create and Edit features, and deciding whether Search can be meaningfully integrated into
the results.

It would also be nifty if the User and Admin palettes treated each other as peers; you could go
"CMD-K My User Page" and it would just *take* you there, and "CMD-K Admin Applications" to go back,
creating an illusion of seamlessness. A wiring sketch follows below.
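
For orientation, here is a minimal sketch of how ninja-keys is typically wired into a Lit
component. The element names and route URLs below are hypothetical stand-ins, not the actual
admin-interface routes:

```
import "ninja-keys"; // registers the <ninja-keys> custom element

import { LitElement, html } from "lit";
import { customElement } from "lit/decorators.js";

@customElement("ak-command-palette-demo")
export class CommandPaletteDemo extends LitElement {
    // ninja-keys takes a flat array of actions; once rendered, the element
    // itself listens globally for CMD-K / CTRL-K.
    commands = [
        {
            id: "admin-applications",
            title: "Admin Applications",
            handler: () => window.location.assign("#/core/applications"), // hypothetical route
        },
        {
            id: "my-user-page",
            title: "My User Page",
            handler: () => window.location.assign("/if/user/"), // hypothetical route
        },
    ];

    render() {
        return html`<ninja-keys .data=${this.commands}></ninja-keys>`;
    }
}
```

Cross-interface commands like the two above are just navigation handlers, which is what would make
the peer-palette illusion cheap to build.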
2024-03-15 15:21:51 -07:00
8b4e0361c4 Merge branch 'main' into dev
* main:
  web: clean up and remove redundant alias '@goauthentik/app' (#8889)
  web/admin: fix markdown table rendering (#8908)
2024-03-14 10:35:46 -07:00
22cb5b7379 Merge branch 'main' into dev
* main:
  web: bump chromedriver from 122.0.5 to 122.0.6 in /tests/wdio (#8902)
  web: bump vite-tsconfig-paths from 4.3.1 to 4.3.2 in /web (#8903)
  core: bump google.golang.org/protobuf from 1.32.0 to 1.33.0 (#8901)
  web: provide InstallID on EnterpriseListPage (#8898)
2024-03-14 08:14:43 -07:00
2d0117d096 Merge branch 'main' into dev
* main:
  api: capabilities: properly set can_save_media when s3 is enabled (#8896)
  web: bump the rollup group in /web with 3 updates (#8891)
  core: bump pydantic from 2.6.3 to 2.6.4 (#8892)
  core: bump twilio from 9.0.0 to 9.0.1 (#8893)
2024-03-13 14:05:11 -07:00
035bda4eac Merge branch 'main' into dev
* main:
  Update _envoy_istio.md (#8888)
  website/docs: new landing page for Providers (#8879)
  web: bump the sentry group in /web with 1 update (#8881)
  web: bump chromedriver from 122.0.4 to 122.0.5 in /tests/wdio (#8884)
  web: bump the eslint group in /tests/wdio with 2 updates (#8883)
  web: bump the eslint group in /web with 2 updates (#8885)
  website: bump @types/react from 18.2.64 to 18.2.65 in /website (#8886)
2024-03-12 13:31:35 -07:00
50906214e5 Merge branch 'main' into dev
* main:
  web: upgrade to lit 3 (#8781)
2024-03-11 11:03:04 -07:00
e505f274b6 Merge branch 'main' into dev
* main:
  web: fix esbuild issue with style sheets (#8856)
2024-03-11 10:28:05 -07:00
fe52f44dca Merge branch 'main' into dev
* main:
  tenants: really ensure default tenant cannot be deleted (#8875)
  core: bump github.com/go-openapi/runtime from 0.27.2 to 0.28.0 (#8867)
  core: bump pytest from 8.0.2 to 8.1.1 (#8868)
  core: bump github.com/go-openapi/strfmt from 0.22.2 to 0.23.0 (#8869)
  core: bump bandit from 1.7.7 to 1.7.8 (#8870)
  core: bump packaging from 23.2 to 24.0 (#8871)
  core: bump ruff from 0.3.1 to 0.3.2 (#8873)
  web: bump the wdio group in /tests/wdio with 3 updates (#8865)
  core: bump requests-oauthlib from 1.3.1 to 1.4.0 (#8866)
  core: bump uvicorn from 0.27.1 to 0.28.0 (#8872)
  core: bump django-filter from 23.5 to 24.1 (#8874)
2024-03-11 10:27:43 -07:00
3146e5a50f web: fix esbuild issue with style sheets
Getting ESBuild, Lit, and Storybook to all agree on how to read and parse stylesheets is a serious
pain. This fix better identifies the value types (instances) being passed from various sources in
the repo to the three *different* kinds of style processors we're using (the native one, the
polyfill one, and whatever the heck Storybook does internally).

Falling back to older CSS instantiation techniques one era at a time seems to do the trick.
It's ugly, but in the face of the aggressive styling we use to avoid Flashes of Unstyled Content
(FOUC), it's the logic we're left with.
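
The general shape of that fallback, as a sketch rather than the actual patch: feature-detect
constructable stylesheets, and when the instance check or the polyfill gives out, drop back one era
to a plain `<style>` tag:

```
// Sketch only: era-by-era stylesheet adoption.
function adoptStyles(root: Document | ShadowRoot, css: string) {
    if ("adoptedStyleSheets" in root && typeof CSSStyleSheet !== "undefined") {
        try {
            // Modern era: native (or well-behaved polyfilled) constructable stylesheets.
            const sheet = new CSSStyleSheet();
            sheet.replaceSync(css);
            root.adoptedStyleSheets = [...root.adoptedStyleSheets, sheet];
            return;
        } catch {
            // Polyfilled CSSStyleSheet instances don't always support
            // replaceSync(); fall through to the older era.
        }
    }
    // Older era: a plain <style> element appended to the root.
    const style = document.createElement("style");
    style.textContent = css;
    (root instanceof ShadowRoot ? root : root.head).appendChild(style);
}
```

Deciding whether a given value is already a `CSSStyleSheet` instance, a Lit `CSSResult`, or a raw
string is the identification work the fix does upstream of a helper like this.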

In standard mode, the following warning appears on the console when running a Flow:

```
Autofocus processing was blocked because a document already has a focused element.
```

In compatibility mode, the following **error** appears on the console when running a Flow:

```
crawler-inject.js:1106 Uncaught TypeError: Failed to execute 'observe' on 'MutationObserver': parameter 1 is not of type 'Node'.
    at initDomMutationObservers (crawler-inject.js:1106:18)
    at crawler-inject.js:1114:24
    at Array.forEach (<anonymous>)
    at initDomMutationObservers (crawler-inject.js:1114:10)
    at crawler-inject.js:1549:1
initDomMutationObservers @ crawler-inject.js:1106
(anonymous) @ crawler-inject.js:1114
initDomMutationObservers @ crawler-inject.js:1114
(anonymous) @ crawler-inject.js:1549
```

Despite this error, nothing seems to be broken and flows work as anticipated.
2024-03-08 14:15:55 -08:00
589 changed files with 17122 additions and 18821 deletions

View File

@ -1,5 +1,5 @@
[bumpversion]
current_version = 2025.6.3
current_version = 2025.6.1
tag = True
commit = True
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?
@ -21,8 +21,6 @@ optional_value = final
[bumpversion:file:package.json]
[bumpversion:file:package-lock.json]
[bumpversion:file:docker-compose.yml]
[bumpversion:file:schema.yml]
@ -33,4 +31,6 @@ optional_value = final
[bumpversion:file:internal/constants/constants.go]
[bumpversion:file:web/src/common/constants.ts]
[bumpversion:file:lifecycle/aws/template.yaml]

View File

@ -7,9 +7,6 @@ charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
[*.toml]
indent_size = 2
[*.html]
indent_size = 2

View File

@ -38,8 +38,6 @@ jobs:
# Needed for attestation
id-token: write
attestations: write
# Needed for checkout
contents: read
steps:
- uses: actions/checkout@v4
- uses: docker/setup-qemu-action@v3.6.0

View File

@ -9,15 +9,14 @@ on:
jobs:
test-container:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
version:
- docs
- version-2025-4
- version-2025-2
- version-2024-12
steps:
- uses: actions/checkout@v4
- run: |

View File

@ -202,7 +202,7 @@ jobs:
uses: actions/cache@v4
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
- name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true'
working-directory: web
@ -247,13 +247,11 @@ jobs:
# Needed for attestation
id-token: write
attestations: write
# Needed for checkout
contents: read
needs: ci-core-mark
uses: ./.github/workflows/_reusable-docker-build.yaml
secrets: inherit
with:
image_name: ${{ github.repository == 'goauthentik/authentik-internal' && 'ghcr.io/goauthentik/internal-server' || 'ghcr.io/goauthentik/dev-server' }}
image_name: ghcr.io/goauthentik/dev-server
release: false
pr-comment:
needs:

View File

@ -59,7 +59,6 @@ jobs:
with:
jobs: ${{ toJSON(needs) }}
build-container:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
timeout-minutes: 120
needs:
- ci-outpost-mark

View File

@ -41,29 +41,7 @@ jobs:
- name: test
working-directory: website/
run: npm test
build:
runs-on: ubuntu-latest
name: ${{ matrix.job }}
strategy:
fail-fast: false
matrix:
job:
- build
- build:integrations
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
run: npm ci
- name: build
working-directory: website/
run: npm run ${{ matrix.job }}
build-container:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
permissions:
# Needed to upload container images to ghcr.io
@ -116,11 +94,9 @@ jobs:
needs:
- lint
- test
- build
- build-container
runs-on: ubuntu-latest
steps:
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}
allowed-skips: ${{ github.repository == 'goauthentik/authentik-internal' && 'build-container' || '[]' }}

View File

@ -2,7 +2,7 @@ name: "CodeQL"
on:
push:
branches: [main, next, version*]
branches: [main, "*", next, version*]
pull_request:
branches: [main]
schedule:

View File

@ -1,21 +0,0 @@
name: "authentik-repo-mirror-cleanup"
on:
workflow_dispatch:
jobs:
to_internal:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- if: ${{ env.MIRROR_KEY != '' }}
uses: BeryJu/repository-mirroring-action@5cf300935bc2e068f73ea69bcc411a8a997208eb
with:
target_repo_url: git@github.com:goauthentik/authentik-internal.git
ssh_private_key: ${{ secrets.GH_MIRROR_KEY }}
args: --tags --force --prune
env:
MIRROR_KEY: ${{ secrets.GH_MIRROR_KEY }}

View File

@ -11,10 +11,11 @@ jobs:
with:
fetch-depth: 0
- if: ${{ env.MIRROR_KEY != '' }}
uses: BeryJu/repository-mirroring-action@5cf300935bc2e068f73ea69bcc411a8a997208eb
uses: pixta-dev/repository-mirroring-action@v1
with:
target_repo_url: git@github.com:goauthentik/authentik-internal.git
ssh_private_key: ${{ secrets.GH_MIRROR_KEY }}
args: --tags --force
target_repo_url:
git@github.com:goauthentik/authentik-internal.git
ssh_private_key:
${{ secrets.GH_MIRROR_KEY }}
env:
MIRROR_KEY: ${{ secrets.GH_MIRROR_KEY }}

View File

@ -16,7 +16,6 @@ env:
jobs:
compile:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
steps:
- id: generate_token

.gitignore (vendored)
View File

@ -100,6 +100,9 @@ ipython_config.py
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
@ -163,6 +166,8 @@ dmypy.json
# pyenv
# celery beat schedule file
# SageMath parsed files
# Environments

View File

@ -6,15 +6,13 @@
"!Context scalar",
"!Enumerate sequence",
"!Env scalar",
"!Env sequence",
"!Find sequence",
"!Format sequence",
"!If sequence",
"!Index scalar",
"!KeyOf scalar",
"!Value scalar",
"!AtIndex scalar",
"!ParseJSON scalar"
"!AtIndex scalar"
],
"typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.importModuleSpecifierEnding": "index",

View File

@ -75,9 +75,9 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
/bin/sh -c "GEOIPUPDATE_LICENSE_KEY_FILE=/run/secrets/GEOIPUPDATE_LICENSE_KEY /usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 4: Download uv
FROM ghcr.io/astral-sh/uv:0.7.17 AS uv
FROM ghcr.io/astral-sh/uv:0.7.13 AS uv
# Stage 5: Base python image
FROM ghcr.io/goauthentik/fips-python:3.13.5-slim-bookworm-fips AS python-base
FROM ghcr.io/goauthentik/fips-python:3.13.4-slim-bookworm-fips AS python-base
ENV VENV_PATH="/ak-root/.venv" \
PATH="/lifecycle:/ak-root/.venv/bin:$PATH" \
@ -122,7 +122,6 @@ ENV UV_NO_BINARY_PACKAGE="cryptography lxml python-kadmin-rs xmlsec"
RUN --mount=type=bind,target=pyproject.toml,src=pyproject.toml \
--mount=type=bind,target=uv.lock,src=uv.lock \
--mount=type=bind,target=packages,src=packages \
--mount=type=cache,target=/root/.cache/uv \
uv sync --frozen --no-install-project --no-dev
@ -168,7 +167,6 @@ COPY ./blueprints /blueprints
COPY ./lifecycle/ /lifecycle
COPY ./authentik/sources/kerberos/krb5.conf /etc/krb5.conf
COPY --from=go-builder /go/authentik /bin/authentik
COPY ./packages/ /ak-root/packages
COPY --from=python-deps /ak-root/.venv /ak-root/.venv
COPY --from=node-builder /work/web/dist/ /web/dist/
COPY --from=node-builder /work/web/authentik/ /web/authentik/

View File

@ -6,7 +6,7 @@ PWD = $(shell pwd)
UID = $(shell id -u)
GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.generate_semver)
PY_SOURCES = authentik packages tests scripts lifecycle .github
PY_SOURCES = authentik tests scripts lifecycle .github
DOCKER_IMAGE ?= "authentik:test"
GEN_API_TS = gen-ts-api
@ -86,10 +86,6 @@ dev-create-db:
dev-reset: dev-drop-db dev-create-db migrate ## Drop and restore the Authentik PostgreSQL instance to a "fresh install" state.
update-test-mmdb: ## Update test GeoIP and ASN Databases
curl -L https://raw.githubusercontent.com/maxmind/MaxMind-DB/refs/heads/main/test-data/GeoLite2-ASN-Test.mmdb -o ${PWD}/tests/GeoLite2-ASN-Test.mmdb
curl -L https://raw.githubusercontent.com/maxmind/MaxMind-DB/refs/heads/main/test-data/GeoLite2-City-Test.mmdb -o ${PWD}/tests/GeoLite2-City-Test.mmdb
#########################
## API Schema
#########################
@ -98,7 +94,7 @@ gen-build: ## Extract the schema from the database
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
uv run ak make_blueprint_schema --file blueprints/schema.json
uv run ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
@ -150,9 +146,9 @@ gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescri
--additional-properties=npmVersion=${NPM_VERSION} \
--git-repo-id authentik \
--git-user-id goauthentik
cd ${PWD}/${GEN_API_TS} && npm link
cd ${PWD}/web && npm link @goauthentik/api
mkdir -p web/node_modules/@goauthentik/api
cd ${PWD}/${GEN_API_TS} && npm i
\cp -rf ${PWD}/${GEN_API_TS}/* web/node_modules/@goauthentik/api
gen-client-py: gen-clean-py ## Build and install the authentik API for Python
docker run \

View File

@ -2,7 +2,7 @@
from os import environ
__version__ = "2025.6.3"
__version__ = "2025.6.1"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"

View File

@ -41,7 +41,7 @@ class VersionSerializer(PassiveSerializer):
return __version__
version_in_cache = cache.get(VERSION_CACHE_KEY)
if not version_in_cache: # pragma: no cover
update_latest_version.send()
update_latest_version.delay()
return __version__
return version_in_cache

View File

@ -0,0 +1,57 @@
"""authentik administration overview"""
from socket import gethostname
from django.conf import settings
from drf_spectacular.utils import extend_schema, inline_serializer
from packaging.version import parse
from rest_framework.fields import BooleanField, CharField
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.views import APIView
from authentik import get_full_version
from authentik.rbac.permissions import HasPermission
from authentik.root.celery import CELERY_APP
class WorkerView(APIView):
"""Get currently connected worker count."""
permission_classes = [HasPermission("authentik_rbac.view_system_info")]
@extend_schema(
responses=inline_serializer(
"Worker",
fields={
"worker_id": CharField(),
"version": CharField(),
"version_matching": BooleanField(),
},
many=True,
)
)
def get(self, request: Request) -> Response:
"""Get currently connected worker count."""
raw: list[dict[str, dict]] = CELERY_APP.control.ping(timeout=0.5)
our_version = parse(get_full_version())
response = []
for worker in raw:
key = list(worker.keys())[0]
version = worker[key].get("version")
version_matching = False
if version:
version_matching = parse(version) == our_version
response.append(
{"worker_id": key, "version": version, "version_matching": version_matching}
)
# In debug we run with `task_always_eager`, so tasks are run on the main process
if settings.DEBUG: # pragma: no cover
response.append(
{
"worker_id": f"authentik-debug@{gethostname()}",
"version": get_full_version(),
"version_matching": True,
}
)
return Response(response)

View File

@ -3,9 +3,6 @@
from prometheus_client import Info
from authentik.blueprints.apps import ManagedAppConfig
from authentik.lib.config import CONFIG
from authentik.lib.utils.time import fqdn_rand
from authentik.tasks.schedules.lib import ScheduleSpec
PROM_INFO = Info("authentik_version", "Currently running authentik version")
@ -33,15 +30,3 @@ class AuthentikAdminConfig(ManagedAppConfig):
notification_version = notification.event.context["new_version"]
if LOCAL_VERSION >= parse(notification_version):
notification.delete()
@property
def global_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.admin.tasks import update_latest_version
return [
ScheduleSpec(
actor=update_latest_version,
crontab=f"{fqdn_rand('admin_latest_version')} * * * *",
paused=CONFIG.get_bool("disable_update_check"),
),
]

View File

@ -0,0 +1,15 @@
"""authentik admin settings"""
from celery.schedules import crontab
from django_tenants.utils import get_public_schema_name
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"admin_latest_version": {
"task": "authentik.admin.tasks.update_latest_version",
"schedule": crontab(minute=fqdn_rand("admin_latest_version"), hour="*"),
"tenant_schemas": [get_public_schema_name()],
"options": {"queue": "authentik_scheduled"},
}
}

View File

@ -0,0 +1,35 @@
"""admin signals"""
from django.dispatch import receiver
from packaging.version import parse
from prometheus_client import Gauge
from authentik import get_full_version
from authentik.root.celery import CELERY_APP
from authentik.root.monitoring import monitoring_set
GAUGE_WORKERS = Gauge(
"authentik_admin_workers",
"Currently connected workers, their versions and if they are the same version as authentik",
["version", "version_matched"],
)
_version = parse(get_full_version())
@receiver(monitoring_set)
def monitoring_set_workers(sender, **kwargs):
"""Set worker gauge"""
raw: list[dict[str, dict]] = CELERY_APP.control.ping(timeout=0.5)
worker_version_count = {}
for worker in raw:
key = list(worker.keys())[0]
version = worker[key].get("version")
version_matching = False
if version:
version_matching = parse(version) == _version
worker_version_count.setdefault(version, {"count": 0, "matching": version_matching})
worker_version_count[version]["count"] += 1
for version, stats in worker_version_count.items():
GAUGE_WORKERS.labels(version, stats["matching"]).set(stats["count"])

View File

@ -2,8 +2,6 @@
from django.core.cache import cache
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask
from dramatiq import actor
from packaging.version import parse
from requests import RequestException
from structlog.stdlib import get_logger
@ -11,9 +9,10 @@ from structlog.stdlib import get_logger
from authentik import __version__, get_build_hash
from authentik.admin.apps import PROM_INFO
from authentik.events.models import Event, EventAction
from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task
from authentik.lib.config import CONFIG
from authentik.lib.utils.http import get_http_session
from authentik.tasks.models import Task
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
VERSION_NULL = "0.0.0"
@ -33,12 +32,13 @@ def _set_prom_info():
)
@actor(description=_("Update latest version info."))
def update_latest_version():
self: Task = CurrentTask.get_task()
@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def update_latest_version(self: SystemTask):
"""Update latest version info"""
if CONFIG.get_bool("disable_update_check"):
cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT)
self.info("Version check disabled.")
self.set_status(TaskStatus.WARNING, "Version check disabled.")
return
try:
response = get_http_session().get(
@ -48,7 +48,7 @@ def update_latest_version():
data = response.json()
upstream_version = data.get("stable", {}).get("version")
cache.set(VERSION_CACHE_KEY, upstream_version, VERSION_CACHE_TIMEOUT)
self.info("Successfully updated latest Version")
self.set_status(TaskStatus.SUCCESSFUL, "Successfully updated latest Version")
_set_prom_info()
# Check if upstream version is newer than what we're running,
# and if no event exists yet, create one.
@ -71,7 +71,7 @@ def update_latest_version():
).save()
except (RequestException, IndexError) as exc:
cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT)
raise exc
self.set_error(exc)
_set_prom_info()

View File

@ -29,6 +29,13 @@ class TestAdminAPI(TestCase):
body = loads(response.content)
self.assertEqual(body["version_current"], __version__)
def test_workers(self):
"""Test Workers API"""
response = self.client.get(reverse("authentik_api:admin_workers"))
self.assertEqual(response.status_code, 200)
body = loads(response.content)
self.assertEqual(len(body), 0)
def test_apps(self):
"""Test apps API"""
response = self.client.get(reverse("authentik_api:apps-list"))

View File

@ -30,7 +30,7 @@ class TestAdminTasks(TestCase):
"""Test Update checker with valid response"""
with Mocker() as mocker, CONFIG.patch("disable_update_check", False):
mocker.get("https://version.goauthentik.io/version.json", json=RESPONSE_VALID)
update_latest_version.send()
update_latest_version.delay().get()
self.assertEqual(cache.get(VERSION_CACHE_KEY), "99999999.9999999")
self.assertTrue(
Event.objects.filter(
@ -40,7 +40,7 @@ class TestAdminTasks(TestCase):
).exists()
)
# test that a consecutive check doesn't create a duplicate event
update_latest_version.send()
update_latest_version.delay().get()
self.assertEqual(
len(
Event.objects.filter(
@ -56,7 +56,7 @@ class TestAdminTasks(TestCase):
"""Test Update checker with invalid response"""
with Mocker() as mocker:
mocker.get("https://version.goauthentik.io/version.json", status_code=400)
update_latest_version.send()
update_latest_version.delay().get()
self.assertEqual(cache.get(VERSION_CACHE_KEY), "0.0.0")
self.assertFalse(
Event.objects.filter(
@ -67,15 +67,14 @@ class TestAdminTasks(TestCase):
def test_version_disabled(self):
"""Test Update checker while its disabled"""
with CONFIG.patch("disable_update_check", True):
update_latest_version.send()
update_latest_version.delay().get()
self.assertEqual(cache.get(VERSION_CACHE_KEY), "0.0.0")
def test_clear_update_notifications(self):
"""Test clear of previous notification"""
admin_config = apps.get_app_config("authentik_admin")
Event.objects.create(
action=EventAction.UPDATE_AVAILABLE,
context={"new_version": "99999999.9999999.9999999"},
action=EventAction.UPDATE_AVAILABLE, context={"new_version": "99999999.9999999.9999999"}
)
Event.objects.create(action=EventAction.UPDATE_AVAILABLE, context={"new_version": "1.1.1"})
Event.objects.create(action=EventAction.UPDATE_AVAILABLE, context={})

View File

@ -6,11 +6,13 @@ from authentik.admin.api.meta import AppsViewSet, ModelViewSet
from authentik.admin.api.system import SystemView
from authentik.admin.api.version import VersionView
from authentik.admin.api.version_history import VersionHistoryViewSet
from authentik.admin.api.workers import WorkerView
api_urlpatterns = [
("admin/apps", AppsViewSet, "apps"),
("admin/models", ModelViewSet, "models"),
path("admin/version/", VersionView.as_view(), name="admin_version"),
("admin/version/history", VersionHistoryViewSet, "version_history"),
path("admin/workers/", WorkerView.as_view(), name="admin_workers"),
path("admin/system/", SystemView.as_view(), name="admin_system"),
]

View File

@ -39,7 +39,7 @@ class BlueprintInstanceSerializer(ModelSerializer):
"""Ensure the path (if set) specified is retrievable"""
if path == "" or path.startswith(OCI_PREFIX):
return path
files: list[dict] = blueprints_find_dict.send().get_result(block=True)
files: list[dict] = blueprints_find_dict.delay().get()
if path not in [file["path"] for file in files]:
raise ValidationError(_("Blueprint file does not exist"))
return path
@ -115,7 +115,7 @@ class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
@action(detail=False, pagination_class=None, filter_backends=[])
def available(self, request: Request) -> Response:
"""Get blueprints"""
files: list[dict] = blueprints_find_dict.send().get_result(block=True)
files: list[dict] = blueprints_find_dict.delay().get()
return Response(files)
@permission_required("authentik_blueprints.view_blueprintinstance")
@ -129,5 +129,5 @@ class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
def apply(self, request: Request, *args, **kwargs) -> Response:
"""Apply a blueprint"""
blueprint = self.get_object()
apply_blueprint.send_with_options(args=(blueprint.pk,), rel_obj=blueprint)
apply_blueprint.delay(str(blueprint.pk)).get()
return self.retrieve(request, *args, **kwargs)

View File

@ -6,12 +6,9 @@ from inspect import ismethod
from django.apps import AppConfig
from django.db import DatabaseError, InternalError, ProgrammingError
from dramatiq.broker import get_broker
from structlog.stdlib import BoundLogger, get_logger
from authentik.lib.utils.time import fqdn_rand
from authentik.root.signals import startup
from authentik.tasks.schedules.lib import ScheduleSpec
class ManagedAppConfig(AppConfig):
@ -37,7 +34,7 @@ class ManagedAppConfig(AppConfig):
def import_related(self):
"""Automatically import related modules which rely on just being imported
to register themselves (mainly django signals and tasks)"""
to register themselves (mainly django signals and celery tasks)"""
def import_relative(rel_module: str):
try:
@ -83,16 +80,6 @@ class ManagedAppConfig(AppConfig):
func._authentik_managed_reconcile = ManagedAppConfig.RECONCILE_GLOBAL_CATEGORY
return func
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
"""Get a list of schedule specs that must exist in each tenant"""
return []
@property
def global_schedule_specs(self) -> list[ScheduleSpec]:
"""Get a list of schedule specs that must exist in the default tenant"""
return []
def _reconcile_tenant(self) -> None:
"""reconcile ourselves for tenanted methods"""
from authentik.tenants.models import Tenant
@ -113,12 +100,8 @@ class ManagedAppConfig(AppConfig):
"""
from django_tenants.utils import get_public_schema_name, schema_context
try:
with schema_context(get_public_schema_name()):
self._reconcile(self.RECONCILE_GLOBAL_CATEGORY)
except (DatabaseError, ProgrammingError, InternalError) as exc:
self.logger.debug("Failed to access database to run reconcile", exc=exc)
return
with schema_context(get_public_schema_name()):
self._reconcile(self.RECONCILE_GLOBAL_CATEGORY)
class AuthentikBlueprintsConfig(ManagedAppConfig):
@ -129,29 +112,19 @@ class AuthentikBlueprintsConfig(ManagedAppConfig):
verbose_name = "authentik Blueprints"
default = True
@ManagedAppConfig.reconcile_global
def load_blueprints_v1_tasks(self):
"""Load v1 tasks"""
self.import_module("authentik.blueprints.v1.tasks")
@ManagedAppConfig.reconcile_tenant
def blueprints_discovery(self):
"""Run blueprint discovery"""
from authentik.blueprints.v1.tasks import blueprints_discovery, clear_failed_blueprints
blueprints_discovery.delay()
clear_failed_blueprints.delay()
def import_models(self):
super().import_models()
self.import_module("authentik.blueprints.v1.meta.apply_blueprint")
@ManagedAppConfig.reconcile_global
def tasks_middlewares(self):
from authentik.blueprints.v1.tasks import BlueprintWatcherMiddleware
get_broker().add_middleware(BlueprintWatcherMiddleware())
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.blueprints.v1.tasks import blueprints_discovery, clear_failed_blueprints
return [
ScheduleSpec(
actor=blueprints_discovery,
crontab=f"{fqdn_rand('blueprints_v1_discover')} * * * *",
send_on_startup=True,
),
ScheduleSpec(
actor=clear_failed_blueprints,
crontab=f"{fqdn_rand('blueprints_v1_cleanup')} * * * *",
send_on_startup=True,
),
]

View File

@ -72,33 +72,20 @@ class Command(BaseCommand):
"additionalProperties": True,
},
"entries": {
"anyOf": [
{
"type": "array",
"items": {"$ref": "#/$defs/blueprint_entry"},
},
{
"type": "object",
"additionalProperties": {
"type": "array",
"items": {"$ref": "#/$defs/blueprint_entry"},
},
},
],
"type": "array",
"items": {
"oneOf": [],
},
},
},
"$defs": {"blueprint_entry": {"oneOf": []}},
"$defs": {},
}
def add_arguments(self, parser):
parser.add_argument("--file", type=str)
@no_translations
def handle(self, *args, file: str, **options):
def handle(self, *args, **options):
"""Generate JSON Schema for blueprints"""
self.build()
with open(file, "w") as _schema:
_schema.write(dumps(self.schema, indent=4, default=Command.json_default))
self.stdout.write(dumps(self.schema, indent=4, default=Command.json_default))
@staticmethod
def json_default(value: Any) -> Any:
@ -125,7 +112,7 @@ class Command(BaseCommand):
}
)
model_path = f"{model._meta.app_label}.{model._meta.model_name}"
self.schema["$defs"]["blueprint_entry"]["oneOf"].append(
self.schema["properties"]["entries"]["items"]["oneOf"].append(
self.template_entry(model_path, model, serializer)
)

View File

@ -3,7 +3,6 @@
from pathlib import Path
from uuid import uuid4
from django.contrib.contenttypes.fields import GenericRelation
from django.contrib.postgres.fields import ArrayField
from django.db import models
from django.utils.translation import gettext_lazy as _
@ -72,13 +71,6 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
enabled = models.BooleanField(default=True)
managed_models = ArrayField(models.TextField(), default=list)
# Manual link to tasks instead of using TasksModel because of loop imports
tasks = GenericRelation(
"authentik_tasks.Task",
content_type_field="rel_obj_content_type",
object_id_field="rel_obj_id",
)
class Meta:
verbose_name = _("Blueprint Instance")
verbose_name_plural = _("Blueprint Instances")

View File

@ -0,0 +1,18 @@
"""blueprint Settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"blueprints_v1_discover": {
"task": "authentik.blueprints.v1.tasks.blueprints_discovery",
"schedule": crontab(minute=fqdn_rand("blueprints_v1_discover"), hour="*"),
"options": {"queue": "authentik_scheduled"},
},
"blueprints_v1_cleanup": {
"task": "authentik.blueprints.v1.tasks.clear_failed_blueprints",
"schedule": crontab(minute=fqdn_rand("blueprints_v1_cleanup"), hour="*"),
"options": {"queue": "authentik_scheduled"},
},
}

View File

@ -1,2 +0,0 @@
# Import all v1 tasks for auto task discovery
from authentik.blueprints.v1.tasks import * # noqa: F403

View File

@ -1,11 +1,10 @@
version: 1
entries:
foo:
- identifiers:
name: "%(id)s"
slug: "%(id)s"
model: authentik_flows.flow
state: present
attrs:
designation: stage_configuration
title: foo
- identifiers:
name: "%(id)s"
slug: "%(id)s"
model: authentik_flows.flow
state: present
attrs:
designation: stage_configuration
title: foo

View File

@ -37,7 +37,6 @@ entries:
- attrs:
attributes:
env_null: !Env [bar-baz, null]
json_parse: !ParseJSON '{"foo": "bar"}'
policy_pk1:
!Format [
"%s-%s",

View File

@ -35,6 +35,6 @@ def blueprint_tester(file_name: Path) -> Callable:
for blueprint_file in Path("blueprints/").glob("**/*.yaml"):
if "local" in str(blueprint_file) or "testing" in str(blueprint_file):
if "local" in str(blueprint_file):
continue
setattr(TestPackaged, f"test_blueprint_{blueprint_file}", blueprint_tester(blueprint_file))

View File

@ -5,6 +5,7 @@ from collections.abc import Callable
from django.apps import apps
from django.test import TestCase
from authentik.blueprints.v1.importer import is_model_allowed
from authentik.lib.models import SerializerModel
from authentik.providers.oauth2.models import RefreshToken
@ -21,13 +22,10 @@ def serializer_tester_factory(test_model: type[SerializerModel]) -> Callable:
return
model_class = test_model()
self.assertTrue(isinstance(model_class, SerializerModel))
# Models that have subclasses don't have to have a serializer
if len(test_model.__subclasses__()) > 0:
return
self.assertIsNotNone(model_class.serializer)
if model_class.serializer.Meta().model == RefreshToken:
return
self.assertTrue(issubclass(test_model, model_class.serializer.Meta().model))
self.assertEqual(model_class.serializer.Meta().model, test_model)
return tester
@ -36,6 +34,6 @@ for app in apps.get_app_configs():
if not app.label.startswith("authentik"):
continue
for model in app.get_models():
if not issubclass(model, SerializerModel):
if not is_model_allowed(model):
continue
setattr(TestModels, f"test_{app.label}_{model.__name__}", serializer_tester_factory(model))

View File

@ -215,7 +215,6 @@ class TestBlueprintsV1(TransactionTestCase):
},
"nested_context": "context-nested-value",
"env_null": None,
"json_parse": {"foo": "bar"},
"at_index_sequence": "foo",
"at_index_sequence_default": "non existent",
"at_index_mapping": 2,

View File

@ -54,7 +54,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
file.seek(0)
file_hash = sha512(file.read().encode()).hexdigest()
file.flush()
blueprints_discovery.send()
blueprints_discovery()
instance = BlueprintInstance.objects.filter(name=blueprint_id).first()
self.assertEqual(instance.last_applied_hash, file_hash)
self.assertEqual(
@ -82,7 +82,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
)
)
file.flush()
blueprints_discovery.send()
blueprints_discovery()
blueprint = BlueprintInstance.objects.filter(name="foo").first()
self.assertEqual(
blueprint.last_applied_hash,
@ -107,7 +107,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
)
)
file.flush()
blueprints_discovery.send()
blueprints_discovery()
blueprint.refresh_from_db()
self.assertEqual(
blueprint.last_applied_hash,

View File

@ -6,7 +6,6 @@ from copy import copy
from dataclasses import asdict, dataclass, field, is_dataclass
from enum import Enum
from functools import reduce
from json import JSONDecodeError, loads
from operator import ixor
from os import getenv
from typing import Any, Literal, Union
@ -192,18 +191,11 @@ class Blueprint:
"""Dataclass used for a full export"""
version: int = field(default=1)
entries: list[BlueprintEntry] | dict[str, list[BlueprintEntry]] = field(default_factory=list)
entries: list[BlueprintEntry] = field(default_factory=list)
context: dict = field(default_factory=dict)
metadata: BlueprintMetadata | None = field(default=None)
def iter_entries(self) -> Iterable[BlueprintEntry]:
if isinstance(self.entries, dict):
for _section, entries in self.entries.items():
yield from entries
else:
yield from self.entries
class YAMLTag:
"""Base class for all YAML Tags"""
@ -234,7 +226,7 @@ class KeyOf(YAMLTag):
self.id_from = node.value
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
for _entry in blueprint.iter_entries():
for _entry in blueprint.entries:
if _entry.id == self.id_from and _entry._state.instance:
# Special handling for PolicyBindingModels, as they'll have a different PK
# which is used when creating policy bindings
@ -292,22 +284,6 @@ class Context(YAMLTag):
return value
class ParseJSON(YAMLTag):
"""Parse JSON from context/env/etc value"""
raw: str
def __init__(self, loader: "BlueprintLoader", node: ScalarNode) -> None:
super().__init__()
self.raw = node.value
def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
try:
return loads(self.raw)
except JSONDecodeError as exc:
raise EntryInvalidError.from_entry(exc, entry) from exc
class Format(YAMLTag):
"""Format a string"""
@ -683,7 +659,6 @@ class BlueprintLoader(SafeLoader):
self.add_constructor("!Value", Value)
self.add_constructor("!Index", Index)
self.add_constructor("!AtIndex", AtIndex)
self.add_constructor("!ParseJSON", ParseJSON)
class EntryInvalidError(SentryIgnoredException):

View File

@ -57,6 +57,7 @@ from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import (
EndpointDeviceConnection,
)
from authentik.events.logs import LogEvent, capture_logs
from authentik.events.models import SystemTask
from authentik.events.utils import cleanse_dict
from authentik.flows.models import FlowToken, Stage
from authentik.lib.models import SerializerModel
@ -76,7 +77,6 @@ from authentik.providers.scim.models import SCIMProviderGroup, SCIMProviderUser
from authentik.rbac.models import Role
from authentik.sources.scim.models import SCIMSourceGroup, SCIMSourceUser
from authentik.stages.authenticator_webauthn.models import WebAuthnDeviceType
from authentik.tasks.models import Task
from authentik.tenants.models import Tenant
# Context set when the serializer is created in a blueprint context
@ -118,7 +118,7 @@ def excluded_models() -> list[type[Model]]:
SCIMProviderGroup,
SCIMProviderUser,
Tenant,
Task,
SystemTask,
ConnectionToken,
AuthorizationCode,
AccessToken,
@ -384,7 +384,7 @@ class Importer:
def _apply_models(self, raise_errors=False) -> bool:
"""Apply (create/update) models yaml"""
self.__pk_map = {}
for entry in self._import.iter_entries():
for entry in self._import.entries:
model_app_label, model_name = entry.get_model(self._import).split(".")
try:
model: type[SerializerModel] = registry.get_model(model_app_label, model_name)

View File

@ -44,7 +44,7 @@ class ApplyBlueprintMetaSerializer(PassiveSerializer):
return MetaResult()
LOGGER.debug("Applying blueprint from meta model", blueprint=self.blueprint_instance)
apply_blueprint(self.blueprint_instance.pk)
apply_blueprint(str(self.blueprint_instance.pk))
return MetaResult()

View File

@ -4,17 +4,12 @@ from dataclasses import asdict, dataclass, field
from hashlib import sha512
from pathlib import Path
from sys import platform
from uuid import UUID
from dacite.core import from_dict
from django.conf import settings
from django.db import DatabaseError, InternalError, ProgrammingError
from django.utils.text import slugify
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask, CurrentTaskNotFound
from dramatiq.actor import actor
from dramatiq.middleware import Middleware
from structlog.stdlib import get_logger
from watchdog.events import (
FileCreatedEvent,
@ -36,13 +31,15 @@ from authentik.blueprints.v1.importer import Importer
from authentik.blueprints.v1.labels import LABEL_AUTHENTIK_INSTANTIATE
from authentik.blueprints.v1.oci import OCI_PREFIX
from authentik.events.logs import capture_logs
from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask, prefill_task
from authentik.events.utils import sanitize_dict
from authentik.lib.config import CONFIG
from authentik.tasks.models import Task
from authentik.tasks.schedules.models import Schedule
from authentik.root.celery import CELERY_APP
from authentik.tenants.models import Tenant
LOGGER = get_logger()
_file_watcher_started = False
@dataclass
@ -56,21 +53,22 @@ class BlueprintFile:
meta: BlueprintMetadata | None = field(default=None)
class BlueprintWatcherMiddleware(Middleware):
def start_blueprint_watcher(self):
"""Start blueprint watcher"""
observer = Observer()
kwargs = {}
if platform.startswith("linux"):
kwargs["event_filter"] = (FileCreatedEvent, FileModifiedEvent)
observer.schedule(
BlueprintEventHandler(), CONFIG.get("blueprints_dir"), recursive=True, **kwargs
)
observer.start()
def start_blueprint_watcher():
"""Start blueprint watcher, if it's not running already."""
# This function might be called twice since it's called on celery startup
def after_worker_boot(self, broker, worker):
if not settings.TEST:
self.start_blueprint_watcher()
global _file_watcher_started # noqa: PLW0603
if _file_watcher_started:
return
observer = Observer()
kwargs = {}
if platform.startswith("linux"):
kwargs["event_filter"] = (FileCreatedEvent, FileModifiedEvent)
observer.schedule(
BlueprintEventHandler(), CONFIG.get("blueprints_dir"), recursive=True, **kwargs
)
observer.start()
_file_watcher_started = True
class BlueprintEventHandler(FileSystemEventHandler):
@ -94,7 +92,7 @@ class BlueprintEventHandler(FileSystemEventHandler):
LOGGER.debug("new blueprint file created, starting discovery")
for tenant in Tenant.objects.filter(ready=True):
with tenant:
Schedule.dispatch_by_actor(blueprints_discovery)
blueprints_discovery.delay()
def on_modified(self, event: FileSystemEvent):
"""Process file modification"""
@ -105,14 +103,14 @@ class BlueprintEventHandler(FileSystemEventHandler):
with tenant:
for instance in BlueprintInstance.objects.filter(path=rel_path, enabled=True):
LOGGER.debug("modified blueprint file, starting apply", instance=instance)
apply_blueprint.send_with_options(args=(instance.pk,), rel_obj=instance)
apply_blueprint.delay(instance.pk.hex)
@actor(
description=_("Find blueprints as `blueprints_find` does, but return a safe dict."),
@CELERY_APP.task(
throws=(DatabaseError, ProgrammingError, InternalError),
)
def blueprints_find_dict():
"""Find blueprints as `blueprints_find` does, but return a safe dict"""
blueprints = []
for blueprint in blueprints_find():
blueprints.append(sanitize_dict(asdict(blueprint)))
@ -148,19 +146,21 @@ def blueprints_find() -> list[BlueprintFile]:
return blueprints
@actor(
description=_("Find blueprints and check if they need to be created in the database."),
throws=(DatabaseError, ProgrammingError, InternalError),
@CELERY_APP.task(
throws=(DatabaseError, ProgrammingError, InternalError), base=SystemTask, bind=True
)
def blueprints_discovery(path: str | None = None):
self: Task = CurrentTask.get_task()
@prefill_task
def blueprints_discovery(self: SystemTask, path: str | None = None):
"""Find blueprints and check if they need to be created in the database"""
count = 0
for blueprint in blueprints_find():
if path and blueprint.path != path:
continue
check_blueprint_v1_file(blueprint)
count += 1
self.info(f"Successfully imported {count} files.")
self.set_status(
TaskStatus.SUCCESSFUL, _("Successfully imported {count} files.".format(count=count))
)
def check_blueprint_v1_file(blueprint: BlueprintFile):
@ -187,26 +187,22 @@ def check_blueprint_v1_file(blueprint: BlueprintFile):
)
if instance.last_applied_hash != blueprint.hash:
LOGGER.info("Applying blueprint due to changed file", instance=instance, path=instance.path)
apply_blueprint.send_with_options(args=(instance.pk,), rel_obj=instance)
apply_blueprint.delay(str(instance.pk))
@actor(description=_("Apply single blueprint."))
def apply_blueprint(instance_pk: UUID):
try:
self: Task = CurrentTask.get_task()
except CurrentTaskNotFound:
self = Task()
self.set_uid(str(instance_pk))
@CELERY_APP.task(
bind=True,
base=SystemTask,
)
def apply_blueprint(self: SystemTask, instance_pk: str):
"""Apply single blueprint"""
self.save_on_success = False
instance: BlueprintInstance | None = None
try:
instance: BlueprintInstance = BlueprintInstance.objects.filter(pk=instance_pk).first()
if not instance:
self.warning(f"Could not find blueprint {instance_pk}, skipping")
if not instance or not instance.enabled:
return
self.set_uid(slugify(instance.name))
if not instance.enabled:
self.info(f"Blueprint {instance.name} is disabled, skipping")
return
blueprint_content = instance.retrieve()
file_hash = sha512(blueprint_content.encode()).hexdigest()
importer = Importer.from_string(blueprint_content, instance.context)
@ -216,18 +212,19 @@ def apply_blueprint(instance_pk: UUID):
if not valid:
instance.status = BlueprintInstanceStatus.ERROR
instance.save()
self.logs(logs)
self.set_status(TaskStatus.ERROR, *logs)
return
with capture_logs() as logs:
applied = importer.apply()
if not applied:
instance.status = BlueprintInstanceStatus.ERROR
instance.save()
self.logs(logs)
self.set_status(TaskStatus.ERROR, *logs)
return
instance.status = BlueprintInstanceStatus.SUCCESSFUL
instance.last_applied_hash = file_hash
instance.last_applied = now()
self.set_status(TaskStatus.SUCCESSFUL)
except (
OSError,
DatabaseError,
@ -238,14 +235,15 @@ def apply_blueprint(instance_pk: UUID):
) as exc:
if instance:
instance.status = BlueprintInstanceStatus.ERROR
self.error(exc)
self.set_error(exc)
finally:
if instance:
instance.save()
@actor(description=_("Remove blueprints which couldn't be fetched."))
@CELERY_APP.task()
def clear_failed_blueprints():
"""Remove blueprints which couldn't be fetched"""
# Exclude OCI blueprints as those might be temporarily unavailable
for blueprint in BlueprintInstance.objects.exclude(path__startswith=OCI_PREFIX):
try:

View File

@ -9,7 +9,6 @@ class AuthentikBrandsConfig(ManagedAppConfig):
name = "authentik.brands"
label = "authentik_brands"
verbose_name = "authentik Brands"
default = True
mountpoints = {
"authentik.brands.urls_root": "",
}

View File

@ -1,6 +1,8 @@
"""Authenticator Devices API Views"""
from drf_spectacular.utils import extend_schema
from django.utils.translation import gettext_lazy as _
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiParameter, extend_schema
from guardian.shortcuts import get_objects_for_user
from rest_framework.fields import (
BooleanField,
@ -13,7 +15,6 @@ from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ViewSet
from authentik.core.api.users import ParamUserSerializer
from authentik.core.api.utils import MetaNameSerializer
from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import EndpointDevice
from authentik.stages.authenticator import device_classes, devices_for_user
@ -22,7 +23,7 @@ from authentik.stages.authenticator_webauthn.models import WebAuthnDevice
class DeviceSerializer(MetaNameSerializer):
"""Serializer for authenticator devices"""
"""Serializer for Duo authenticator devices"""
pk = CharField()
name = CharField()
@ -32,27 +33,22 @@ class DeviceSerializer(MetaNameSerializer):
last_updated = DateTimeField(read_only=True)
last_used = DateTimeField(read_only=True, allow_null=True)
extra_description = SerializerMethodField()
external_id = SerializerMethodField()
def get_type(self, instance: Device) -> str:
"""Get type of device"""
return instance._meta.label
def get_extra_description(self, instance: Device) -> str | None:
def get_extra_description(self, instance: Device) -> str:
"""Get extra description"""
if isinstance(instance, WebAuthnDevice):
return instance.device_type.description if instance.device_type else None
return (
instance.device_type.description
if instance.device_type
else _("Extra description not available")
)
if isinstance(instance, EndpointDevice):
return instance.data.get("deviceSignals", {}).get("deviceModel")
return None
def get_external_id(self, instance: Device) -> str | None:
"""Get external Device ID"""
if isinstance(instance, WebAuthnDevice):
return instance.device_type.aaguid if instance.device_type else None
if isinstance(instance, EndpointDevice):
return instance.data.get("deviceSignals", {}).get("deviceModel")
return None
return ""
class DeviceViewSet(ViewSet):
@ -61,6 +57,7 @@ class DeviceViewSet(ViewSet):
serializer_class = DeviceSerializer
permission_classes = [IsAuthenticated]
@extend_schema(responses={200: DeviceSerializer(many=True)})
def list(self, request: Request) -> Response:
"""Get all devices for current user"""
devices = devices_for_user(request.user)
@ -82,11 +79,18 @@ class AdminDeviceViewSet(ViewSet):
yield from device_set
@extend_schema(
parameters=[ParamUserSerializer],
parameters=[
OpenApiParameter(
name="user",
location=OpenApiParameter.QUERY,
type=OpenApiTypes.INT,
)
],
responses={200: DeviceSerializer(many=True)},
)
def list(self, request: Request) -> Response:
"""Get all devices for current user"""
args = ParamUserSerializer(data=request.query_params)
args.is_valid(raise_exception=True)
return Response(DeviceSerializer(self.get_devices(**args.validated_data), many=True).data)
kwargs = {}
if "user" in request.query_params:
kwargs = {"user": request.query_params["user"]}
return Response(DeviceSerializer(self.get_devices(**kwargs), many=True).data)

View File

@ -90,12 +90,6 @@ from authentik.stages.email.utils import TemplateEmailMessage
LOGGER = get_logger()
class ParamUserSerializer(PassiveSerializer):
"""Partial serializer for query parameters to select a user"""
user = PrimaryKeyRelatedField(queryset=User.objects.all().exclude_anonymous(), required=False)
class UserGroupSerializer(ModelSerializer):
"""Simplified Group Serializer for user's groups"""
@ -392,23 +386,8 @@ class UserViewSet(UsedByMixin, ModelViewSet):
queryset = User.objects.none()
ordering = ["username"]
serializer_class = UserSerializer
filterset_class = UsersFilter
search_fields = ["username", "name", "is_active", "email", "uuid", "attributes"]
def get_ql_fields(self):
from djangoql.schema import BoolField, StrField
from authentik.enterprise.search.fields import ChoiceSearchField, JSONSearchField
return [
StrField(User, "username"),
StrField(User, "name"),
StrField(User, "email"),
StrField(User, "path"),
BoolField(User, "is_active", nullable=True),
ChoiceSearchField(User, "type"),
JSONSearchField(User, "attributes", suggest_nested=False),
]
filterset_class = UsersFilter
def get_queryset(self):
base_qs = User.objects.all().exclude_anonymous()

View File

@ -2,7 +2,6 @@
from typing import Any
from django.db import models
from django.db.models import Model
from drf_spectacular.extensions import OpenApiSerializerFieldExtension
from drf_spectacular.plumbing import build_basic_type
@ -31,27 +30,7 @@ def is_dict(value: Any):
raise ValidationError("Value must be a dictionary, and not have any duplicate keys.")
class JSONDictField(JSONField):
"""JSON Field which only allows dictionaries"""
default_validators = [is_dict]
class JSONExtension(OpenApiSerializerFieldExtension):
"""Generate API Schema for JSON fields as"""
target_class = "authentik.core.api.utils.JSONDictField"
def map_serializer_field(self, auto_schema, direction):
return build_basic_type(OpenApiTypes.OBJECT)
class ModelSerializer(BaseModelSerializer):
# By default, JSON fields we have are used to store dictionaries
serializer_field_mapping = BaseModelSerializer.serializer_field_mapping.copy()
serializer_field_mapping[models.JSONField] = JSONDictField
def create(self, validated_data):
instance = super().create(validated_data)
@ -92,6 +71,21 @@ class ModelSerializer(BaseModelSerializer):
return instance
class JSONDictField(JSONField):
"""JSON Field which only allows dictionaries"""
default_validators = [is_dict]
class JSONExtension(OpenApiSerializerFieldExtension):
"""Generate API Schema for JSON fields as"""
target_class = "authentik.core.api.utils.JSONDictField"
def map_serializer_field(self, auto_schema, direction):
return build_basic_type(OpenApiTypes.OBJECT)
class PassiveSerializer(Serializer):
"""Base serializer class which doesn't implement create/update methods"""

View File

@ -1,7 +1,8 @@
"""authentik core app config"""
from django.conf import settings
from authentik.blueprints.apps import ManagedAppConfig
from authentik.tasks.schedules.lib import ScheduleSpec
class AuthentikCoreConfig(ManagedAppConfig):
@ -13,6 +14,14 @@ class AuthentikCoreConfig(ManagedAppConfig):
mountpoint = ""
default = True
@ManagedAppConfig.reconcile_global
def debug_worker_hook(self):
"""Dispatch startup tasks inline when debugging"""
if settings.DEBUG:
from authentik.root.celery import worker_ready_hook
worker_ready_hook()
@ManagedAppConfig.reconcile_tenant
def source_inbuilt(self):
"""Reconcile inbuilt source"""
@ -25,18 +34,3 @@ class AuthentikCoreConfig(ManagedAppConfig):
},
managed=Source.MANAGED_INBUILT,
)
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.core.tasks import clean_expired_models, clean_temporary_users
return [
ScheduleSpec(
actor=clean_expired_models,
crontab="2-59/5 * * * *",
),
ScheduleSpec(
actor=clean_temporary_users,
crontab="9-59/5 * * * *",
),
]

View File

@ -0,0 +1,21 @@
"""Run bootstrap tasks"""
from django.core.management.base import BaseCommand
from django_tenants.utils import get_public_schema_name
from authentik.root.celery import _get_startup_tasks_all_tenants, _get_startup_tasks_default_tenant
from authentik.tenants.models import Tenant
class Command(BaseCommand):
"""Run bootstrap tasks to ensure certain objects are created"""
def handle(self, **options):
for task in _get_startup_tasks_default_tenant():
with Tenant.objects.get(schema_name=get_public_schema_name()):
task()
for task in _get_startup_tasks_all_tenants():
for tenant in Tenant.objects.filter(ready=True):
with tenant:
task()

View File

@ -13,6 +13,7 @@ class Command(TenantCommand):
parser.add_argument("usernames", nargs="*", type=str)
def handle_per_tenant(self, **options):
print(options)
new_type = UserTypes(options["type"])
qs = (
User.objects.exclude_anonymous()

View File

@ -0,0 +1,47 @@
"""Run worker"""
from sys import exit as sysexit
from tempfile import tempdir
from celery.apps.worker import Worker
from django.core.management.base import BaseCommand
from django.db import close_old_connections
from structlog.stdlib import get_logger
from authentik.lib.config import CONFIG
from authentik.lib.debug import start_debug_server
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
class Command(BaseCommand):
"""Run worker"""
def add_arguments(self, parser):
parser.add_argument(
"-b",
"--beat",
action="store_false",
help="When set, this worker will _not_ run Beat (scheduled) tasks",
)
def handle(self, **options):
LOGGER.debug("Celery options", **options)
close_old_connections()
start_debug_server()
worker: Worker = CELERY_APP.Worker(
no_color=False,
quiet=True,
optimization="fair",
autoscale=(CONFIG.get_int("worker.concurrency"), 1),
task_events=True,
beat=options.get("beat", True),
schedule_filename=f"{tempdir}/celerybeat-schedule",
queues=["authentik", "authentik_scheduled", "authentik_events"],
)
for task in CELERY_APP.tasks:
LOGGER.debug("Registered task", task=task)
worker.start()
sysexit(worker.exitcode)

View File

@ -18,7 +18,7 @@ from django.http import HttpRequest
from django.utils.functional import SimpleLazyObject, cached_property
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from django_cte import CTE, with_cte
from django_cte import CTEQuerySet, With
from guardian.conf import settings
from guardian.mixins import GuardianUserMixin
from model_utils.managers import InheritanceManager
@ -136,7 +136,7 @@ class AttributesMixin(models.Model):
return instance, False
class GroupQuerySet(QuerySet):
class GroupQuerySet(CTEQuerySet):
def with_children_recursive(self):
"""Recursively get all groups that have the current queryset as parents
or are indirectly related."""
@ -165,9 +165,9 @@ class GroupQuerySet(QuerySet):
)
# Build the recursive query, see above
cte = CTE.recursive(make_cte)
cte = With.recursive(make_cte)
# Return the result, as a usable queryset for Group.
return with_cte(cte, select=cte.join(Group, group_uuid=cte.col.group_uuid))
return cte.join(Group, group_uuid=cte.col.group_uuid).with_cte(cte)
class Group(SerializerModel, AttributesMixin):
@ -1082,12 +1082,6 @@ class AuthenticatedSession(SerializerModel):
user = models.ForeignKey(User, on_delete=models.CASCADE)
@property
def serializer(self) -> type[Serializer]:
from authentik.core.api.authenticated_sessions import AuthenticatedSessionSerializer
return AuthenticatedSessionSerializer
class Meta:
verbose_name = _("Authenticated Session")
verbose_name_plural = _("Authenticated Sessions")

View File

@ -3,9 +3,6 @@
from datetime import datetime, timedelta
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask
from dramatiq.actor import actor
from structlog.stdlib import get_logger
from authentik.core.models import (
@ -14,14 +11,17 @@ from authentik.core.models import (
ExpiringModel,
User,
)
from authentik.tasks.models import Task
from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
@actor(description=_("Remove expired objects."))
def clean_expired_models():
self: Task = CurrentTask.get_task()
@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def clean_expired_models(self: SystemTask):
"""Remove expired objects"""
messages = []
for cls in ExpiringModel.__subclasses__():
cls: ExpiringModel
objects = (
@ -31,13 +31,16 @@ def clean_expired_models():
for obj in objects:
obj.expire_action()
LOGGER.debug("Expired models", model=cls, amount=amount)
self.info(f"Expired {amount} {cls._meta.verbose_name_plural}")
messages.append(f"Expired {amount} {cls._meta.verbose_name_plural}")
self.set_status(TaskStatus.SUCCESSFUL, *messages)
@actor(description=_("Remove temporary users created by SAML Sources."))
def clean_temporary_users():
self: Task = CurrentTask.get_task()
@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def clean_temporary_users(self: SystemTask):
"""Remove temporary users created by SAML Sources"""
_now = datetime.now()
messages = []
deleted_users = 0
for user in User.objects.filter(**{f"attributes__{USER_ATTRIBUTE_GENERATED}": True}):
if not user.attributes.get(USER_ATTRIBUTE_EXPIRES):
@ -49,4 +52,5 @@ def clean_temporary_users():
LOGGER.debug("User is expired and will be deleted.", user=user, delta=delta)
user.delete()
deleted_users += 1
self.info(f"Successfully deleted {deleted_users} users.")
messages.append(f"Successfully deleted {deleted_users} users.")
self.set_status(TaskStatus.SUCCESSFUL, *messages)
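
Both tasks above follow the same conversion recipe: the dramatiq @actor plus CurrentTask.get_task() pair becomes a bound Celery task with a SystemTask base, which provides set_status() for reporting. Reduced to a sketch (the task body is illustrative):

from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task
from authentik.root.celery import CELERY_APP

@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def example_cleanup(self: SystemTask):
    """bind=True injects the task instance as `self`; base=SystemTask
    persists the outcome so it shows up in the System Tasks API."""
    self.set_status(TaskStatus.SUCCESSFUL, "nothing to do")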

View File

@ -114,7 +114,6 @@ class TestApplicationsAPI(APITestCase):
self.assertJSONEqual(
response.content.decode(),
{
"autocomplete": {},
"pagination": {
"next": 0,
"previous": 0,
@ -168,7 +167,6 @@ class TestApplicationsAPI(APITestCase):
self.assertJSONEqual(
response.content.decode(),
{
"autocomplete": {},
"pagination": {
"next": 0,
"previous": 0,

View File

@ -36,7 +36,7 @@ class TestTasks(APITestCase):
expires=now(), user=get_anonymous_user(), intent=TokenIntents.INTENT_API
)
key = token.key
clean_expired_models.send()
clean_expired_models.delay().get()
token.refresh_from_db()
self.assertNotEqual(key, token.key)
@ -50,5 +50,5 @@ class TestTasks(APITestCase):
USER_ATTRIBUTE_EXPIRES: mktime(now().timetuple()),
},
)
clean_temporary_users.send()
clean_temporary_users.delay().get()
self.assertFalse(User.objects.filter(username=username))

View File

@ -4,8 +4,6 @@ from datetime import UTC, datetime
from authentik.blueprints.apps import ManagedAppConfig
from authentik.lib.generators import generate_id
from authentik.lib.utils.time import fqdn_rand
from authentik.tasks.schedules.lib import ScheduleSpec
MANAGED_KEY = "goauthentik.io/crypto/jwt-managed"
@ -69,14 +67,3 @@ class AuthentikCryptoConfig(ManagedAppConfig):
"key_data": builder.private_key,
},
)
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.crypto.tasks import certificate_discovery
return [
ScheduleSpec(
actor=certificate_discovery,
crontab=f"{fqdn_rand('crypto_certificate_discovery')} * * * *",
),
]

View File

@ -0,0 +1,13 @@
"""Crypto task Settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"crypto_certificate_discovery": {
"task": "authentik.crypto.tasks.certificate_discovery",
"schedule": crontab(minute=fqdn_rand("crypto_certificate_discovery"), hour="*"),
"options": {"queue": "authentik_scheduled"},
},
}
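
fqdn_rand staggers these beat schedules across installs: it derives a stable pseudo-random minute from the host name plus a per-task seed, so different installs fire at different minutes while any single install stays deterministic. Roughly, a sketch of the idea (not authentik's exact implementation):

from hashlib import sha256
from socket import getfqdn

def fqdn_rand_sketch(seed: str, stop: int = 60) -> int:
    digest = sha256(f"{getfqdn()}:{seed}".encode()).hexdigest()
    return int(digest, 16) % stop

# e.g. crontab(minute=fqdn_rand_sketch("crypto_certificate_discovery"), hour="*")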

View File

@ -7,13 +7,13 @@ from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.serialization import load_pem_private_key
from cryptography.x509.base import load_pem_x509_certificate
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask
from dramatiq.actor import actor
from structlog.stdlib import get_logger
from authentik.crypto.models import CertificateKeyPair
from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask, prefill_task
from authentik.lib.config import CONFIG
from authentik.tasks.models import Task
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
@ -36,9 +36,10 @@ def ensure_certificate_valid(body: str):
return body
@actor(description=_("Discover, import and update certificates from the filesystem."))
def certificate_discovery():
self: Task = CurrentTask.get_task()
@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def certificate_discovery(self: SystemTask):
"""Discover, import and update certificates from the filesystem"""
certs = {}
private_keys = {}
discovered = 0
@ -83,4 +84,6 @@ def certificate_discovery():
dirty = True
if dirty:
cert.save()
self.info(f"Successfully imported {discovered} files.")
self.set_status(
TaskStatus.SUCCESSFUL, _("Successfully imported {count} files.").format(count=discovered)
)

View File

@ -338,7 +338,7 @@ class TestCrypto(APITestCase):
with open(f"{temp_dir}/foo.bar/privkey.pem", "w+", encoding="utf-8") as _key:
_key.write(builder.private_key)
with CONFIG.patch("cert_discovery_dir", temp_dir):
certificate_discovery.send()
certificate_discovery()
keypair: CertificateKeyPair = CertificateKeyPair.objects.filter(
managed=MANAGED_DISCOVERED % "foo"
).first()

View File

@ -3,8 +3,6 @@
from django.conf import settings
from authentik.blueprints.apps import ManagedAppConfig
from authentik.lib.utils.time import fqdn_rand
from authentik.tasks.schedules.lib import ScheduleSpec
class EnterpriseConfig(ManagedAppConfig):
@ -28,14 +26,3 @@ class AuthentikEnterpriseConfig(EnterpriseConfig):
from authentik.enterprise.license import LicenseKey
return LicenseKey.cached_summary().status.is_valid
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.enterprise.tasks import enterprise_update_usage
return [
ScheduleSpec(
actor=enterprise_update_usage,
crontab=f"{fqdn_rand('enterprise_update_usage')} */2 * * *",
),
]

View File

@ -1,8 +1,6 @@
"""authentik Unique Password policy app config"""
from authentik.enterprise.apps import EnterpriseConfig
from authentik.lib.utils.time import fqdn_rand
from authentik.tasks.schedules.lib import ScheduleSpec
class AuthentikEnterprisePoliciesUniquePasswordConfig(EnterpriseConfig):
@ -10,21 +8,3 @@ class AuthentikEnterprisePoliciesUniquePasswordConfig(EnterpriseConfig):
label = "authentik_policies_unique_password"
verbose_name = "authentik Enterprise.Policies.Unique Password"
default = True
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.enterprise.policies.unique_password.tasks import (
check_and_purge_password_history,
trim_password_histories,
)
return [
ScheduleSpec(
actor=trim_password_histories,
crontab=f"{fqdn_rand('policies_unique_password_trim')} */12 * * *",
),
ScheduleSpec(
actor=check_and_purge_password_history,
crontab=f"{fqdn_rand('policies_unique_password_purge')} */24 * * *",
),
]

View File

@ -0,0 +1,20 @@
"""Unique Password Policy settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"policies_unique_password_trim_history": {
"task": "authentik.enterprise.policies.unique_password.tasks.trim_password_histories",
"schedule": crontab(minute=fqdn_rand("policies_unique_password_trim"), hour="*/12"),
"options": {"queue": "authentik_scheduled"},
},
"policies_unique_password_check_purge": {
"task": (
"authentik.enterprise.policies.unique_password.tasks.check_and_purge_password_history"
),
"schedule": crontab(minute=fqdn_rand("policies_unique_password_purge"), hour="*/24"),
"options": {"queue": "authentik_scheduled"},
},
}

View File

@ -1,37 +1,35 @@
from django.db.models.aggregates import Count
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask
from dramatiq.actor import actor
from structlog import get_logger
from authentik.enterprise.policies.unique_password.models import (
UniquePasswordPolicy,
UserPasswordHistory,
)
from authentik.tasks.models import Task
from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
@actor(
description=_(
"Check if any UniquePasswordPolicy exists, and if not, purge the password history table."
)
)
def check_and_purge_password_history():
self: Task = CurrentTask.get_task()
@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def check_and_purge_password_history(self: SystemTask):
"""Check if any UniquePasswordPolicy exists, and if not, purge the password history table.
This is run on a schedule instead of being triggered by policy binding deletion.
"""
if not UniquePasswordPolicy.objects.exists():
UserPasswordHistory.objects.all().delete()
LOGGER.debug("Purged UserPasswordHistory table as no policies are in use")
self.info("Successfully purged UserPasswordHistory")
self.set_status(TaskStatus.SUCCESSFUL, "Successfully purged UserPasswordHistory")
return
self.info("Not purging password histories, a unique password policy exists")
self.set_status(
TaskStatus.SUCCESSFUL, "Not purging password histories, a unique password policy exists"
)
@actor(description=_("Remove user password history that are too old."))
def trim_password_histories():
@CELERY_APP.task(bind=True, base=SystemTask)
def trim_password_histories(self: SystemTask):
"""Removes rows from UserPasswordHistory older than
the `n` most recent entries.
@ -39,8 +37,6 @@ def trim_password_histories():
UniquePasswordPolicy policies.
"""
self: Task = CurrentTask.get_task()
# No policy, we'll let the cleanup above do its thing
if not UniquePasswordPolicy.objects.exists():
return
@ -67,4 +63,4 @@ def trim_password_histories():
num_deleted, _ = UserPasswordHistory.objects.exclude(pk__in=all_pks_to_keep).delete()
LOGGER.debug("Deleted stale password history records", count=num_deleted)
self.info(f"Delete {num_deleted} stale password history records")
self.set_status(TaskStatus.SUCCESSFUL, f"Delete {num_deleted} stale password history records")

View File

@ -76,7 +76,7 @@ class TestCheckAndPurgePasswordHistory(TestCase):
self.assertTrue(UserPasswordHistory.objects.exists())
# Run the task - should purge since no policy is in use
check_and_purge_password_history.send()
check_and_purge_password_history()
# Verify the table is empty
self.assertFalse(UserPasswordHistory.objects.exists())
@ -99,7 +99,7 @@ class TestCheckAndPurgePasswordHistory(TestCase):
self.assertTrue(UserPasswordHistory.objects.exists())
# Run the task - should NOT purge since a policy is in use
check_and_purge_password_history.send()
check_and_purge_password_history()
# Verify the entries still exist
self.assertTrue(UserPasswordHistory.objects.exists())
@ -119,17 +119,17 @@ class TestTrimPasswordHistory(TestCase):
[
UserPasswordHistory(
user=self.user,
old_password="hunter1", # nosec
old_password="hunter1", # nosec B106
created_at=_now - timedelta(days=3),
),
UserPasswordHistory(
user=self.user,
old_password="hunter2", # nosec
old_password="hunter2", # nosec B106
created_at=_now - timedelta(days=2),
),
UserPasswordHistory(
user=self.user,
old_password="hunter3", # nosec
old_password="hunter3", # nosec B106
created_at=_now,
),
]
@ -142,7 +142,7 @@ class TestTrimPasswordHistory(TestCase):
enabled=True,
order=0,
)
trim_password_histories.send()
trim_password_histories.delay()
user_pwd_history_qs = UserPasswordHistory.objects.filter(user=self.user)
self.assertEqual(len(user_pwd_history_qs), 1)
@ -159,7 +159,7 @@ class TestTrimPasswordHistory(TestCase):
enabled=False,
order=0,
)
trim_password_histories.send()
trim_password_histories.delay()
self.assertTrue(UserPasswordHistory.objects.filter(user=self.user).exists())
def test_trim_password_history_fewer_records_than_maximum_is_no_op(self):
@ -174,5 +174,5 @@ class TestTrimPasswordHistory(TestCase):
enabled=True,
order=0,
)
trim_password_histories.send()
trim_password_histories.delay()
self.assertTrue(UserPasswordHistory.objects.filter(user=self.user).exists())

View File

@ -55,5 +55,5 @@ class GoogleWorkspaceProviderViewSet(OutgoingSyncProviderStatusMixin, UsedByMixi
]
search_fields = ["name"]
ordering = ["name"]
sync_task = google_workspace_sync
sync_single_task = google_workspace_sync
sync_objects_task = google_workspace_sync_objects

View File

@ -7,7 +7,6 @@ from django.db import models
from django.db.models import QuerySet
from django.templatetags.static import static
from django.utils.translation import gettext_lazy as _
from dramatiq.actor import Actor
from google.oauth2.service_account import Credentials
from rest_framework.serializers import Serializer
@ -111,12 +110,6 @@ class GoogleWorkspaceProvider(OutgoingSyncProvider, BackchannelProvider):
help_text=_("Property mappings used for group creation/updating."),
)
@property
def sync_actor(self) -> Actor:
from authentik.enterprise.providers.google_workspace.tasks import google_workspace_sync
return google_workspace_sync
def client_for_model(
self,
model: type[User | Group | GoogleWorkspaceProviderUser | GoogleWorkspaceProviderGroup],

View File

@ -0,0 +1,13 @@
"""Google workspace provider task Settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"providers_google_workspace_sync": {
"task": "authentik.enterprise.providers.google_workspace.tasks.google_workspace_sync_all",
"schedule": crontab(minute=fqdn_rand("google_workspace_sync_all"), hour="*/4"),
"options": {"queue": "authentik_scheduled"},
},
}

View File

@ -2,13 +2,15 @@
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProvider
from authentik.enterprise.providers.google_workspace.tasks import (
google_workspace_sync_direct_dispatch,
google_workspace_sync_m2m_dispatch,
google_workspace_sync,
google_workspace_sync_direct,
google_workspace_sync_m2m,
)
from authentik.lib.sync.outgoing.signals import register_signals
register_signals(
GoogleWorkspaceProvider,
task_sync_direct_dispatch=google_workspace_sync_direct_dispatch,
task_sync_m2m_dispatch=google_workspace_sync_m2m_dispatch,
task_sync_single=google_workspace_sync,
task_sync_direct=google_workspace_sync_direct,
task_sync_m2m=google_workspace_sync_m2m,
)

View File

@ -1,48 +1,37 @@
"""Google Provider tasks"""
from django.utils.translation import gettext_lazy as _
from dramatiq.actor import actor
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProvider
from authentik.events.system_tasks import SystemTask
from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.lib.sync.outgoing.tasks import SyncTasks
from authentik.root.celery import CELERY_APP
sync_tasks = SyncTasks(GoogleWorkspaceProvider)
@actor(description=_("Sync Google Workspace provider objects."))
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def google_workspace_sync_objects(*args, **kwargs):
return sync_tasks.sync_objects(*args, **kwargs)
@actor(description=_("Full sync for Google Workspace provider."))
def google_workspace_sync(provider_pk: int, *args, **kwargs):
@CELERY_APP.task(
base=SystemTask, bind=True, autoretry_for=(TransientSyncException,), retry_backoff=True
)
def google_workspace_sync(self, provider_pk: int, *args, **kwargs):
"""Run full sync for Google Workspace provider"""
return sync_tasks.sync(provider_pk, google_workspace_sync_objects)
return sync_tasks.sync_single(self, provider_pk, google_workspace_sync_objects)
@actor(description=_("Sync a direct object (user, group) for Google Workspace provider."))
@CELERY_APP.task()
def google_workspace_sync_all():
return sync_tasks.sync_all(google_workspace_sync)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def google_workspace_sync_direct(*args, **kwargs):
return sync_tasks.sync_signal_direct(*args, **kwargs)
@actor(
description=_(
"Dispatch syncs for a direct object (user, group) for Google Workspace providers."
)
)
def google_workspace_sync_direct_dispatch(*args, **kwargs):
return sync_tasks.sync_signal_direct_dispatch(google_workspace_sync_direct, *args, **kwargs)
@actor(description=_("Sync a related object (memberships) for Google Workspace provider."))
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def google_workspace_sync_m2m(*args, **kwargs):
return sync_tasks.sync_signal_m2m(*args, **kwargs)
@actor(
description=_(
"Dispatch syncs for a related object (memberships) for Google Workspace providers."
)
)
def google_workspace_sync_m2m_dispatch(*args, **kwargs):
return sync_tasks.sync_signal_m2m_dispatch(google_workspace_sync_m2m, *args, **kwargs)
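
The autoretry_for=(TransientSyncException,) and retry_backoff=True arguments used throughout these tasks make Celery re-enqueue the task with exponential backoff whenever that exception escapes. A toy equivalent of what the decorator arranges (do_request is a hypothetical placeholder for the real HTTP call):

from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.root.celery import CELERY_APP

def do_request(url: str) -> str:
    # placeholder: pretend the upstream is flaky
    raise TransientSyncException(f"upstream unavailable: {url}")

@CELERY_APP.task(bind=True, max_retries=3)
def fetch_remote(self, url: str):
    try:
        return do_request(url)
    except TransientSyncException as exc:
        # exponential backoff: 1s, 2s, 4s, ... mirrors retry_backoff=True
        raise self.retry(exc=exc, countdown=2 ** self.request.retries)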

View File

@ -324,7 +324,7 @@ class GoogleWorkspaceGroupTests(TestCase):
"authentik.enterprise.providers.google_workspace.models.GoogleWorkspaceProvider.google_credentials",
MagicMock(return_value={"developerKey": self.api_key, "http": http}),
):
google_workspace_sync.send(self.provider.pk).get_result()
google_workspace_sync.delay(self.provider.pk).get()
self.assertTrue(
GoogleWorkspaceProviderGroup.objects.filter(
group=different_group, provider=self.provider

View File

@ -302,7 +302,7 @@ class GoogleWorkspaceUserTests(TestCase):
"authentik.enterprise.providers.google_workspace.models.GoogleWorkspaceProvider.google_credentials",
MagicMock(return_value={"developerKey": self.api_key, "http": http}),
):
google_workspace_sync.send(self.provider.pk).get_result()
google_workspace_sync.delay(self.provider.pk).get()
self.assertTrue(
GoogleWorkspaceProviderUser.objects.filter(
user=different_user, provider=self.provider

View File

@ -53,5 +53,5 @@ class MicrosoftEntraProviderViewSet(OutgoingSyncProviderStatusMixin, UsedByMixin
]
search_fields = ["name"]
ordering = ["name"]
sync_task = microsoft_entra_sync
sync_single_task = microsoft_entra_sync
sync_objects_task = microsoft_entra_sync_objects

View File

@ -8,7 +8,6 @@ from django.db import models
from django.db.models import QuerySet
from django.templatetags.static import static
from django.utils.translation import gettext_lazy as _
from dramatiq.actor import Actor
from rest_framework.serializers import Serializer
from authentik.core.models import (
@ -100,12 +99,6 @@ class MicrosoftEntraProvider(OutgoingSyncProvider, BackchannelProvider):
help_text=_("Property mappings used for group creation/updating."),
)
@property
def sync_actor(self) -> Actor:
from authentik.enterprise.providers.microsoft_entra.tasks import microsoft_entra_sync
return microsoft_entra_sync
def client_for_model(
self,
model: type[User | Group | MicrosoftEntraProviderUser | MicrosoftEntraProviderGroup],

View File

@ -0,0 +1,13 @@
"""Microsoft Entra provider task Settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"providers_microsoft_entra_sync": {
"task": "authentik.enterprise.providers.microsoft_entra.tasks.microsoft_entra_sync_all",
"schedule": crontab(minute=fqdn_rand("microsoft_entra_sync_all"), hour="*/4"),
"options": {"queue": "authentik_scheduled"},
},
}

View File

@ -2,13 +2,15 @@
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.enterprise.providers.microsoft_entra.tasks import (
microsoft_entra_sync_direct_dispatch,
microsoft_entra_sync_m2m_dispatch,
microsoft_entra_sync,
microsoft_entra_sync_direct,
microsoft_entra_sync_m2m,
)
from authentik.lib.sync.outgoing.signals import register_signals
register_signals(
MicrosoftEntraProvider,
task_sync_direct_dispatch=microsoft_entra_sync_direct_dispatch,
task_sync_m2m_dispatch=microsoft_entra_sync_m2m_dispatch,
task_sync_single=microsoft_entra_sync,
task_sync_direct=microsoft_entra_sync_direct,
task_sync_m2m=microsoft_entra_sync_m2m,
)

View File

@ -1,46 +1,37 @@
"""Microsoft Entra Provider tasks"""
from django.utils.translation import gettext_lazy as _
from dramatiq.actor import actor
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.events.system_tasks import SystemTask
from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.lib.sync.outgoing.tasks import SyncTasks
from authentik.root.celery import CELERY_APP
sync_tasks = SyncTasks(MicrosoftEntraProvider)
@actor(description=_("Sync Microsoft Entra provider objects."))
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def microsoft_entra_sync_objects(*args, **kwargs):
return sync_tasks.sync_objects(*args, **kwargs)
@actor(description=_("Full sync for Microsoft Entra provider."))
def microsoft_entra_sync(provider_pk: int, *args, **kwargs):
@CELERY_APP.task(
base=SystemTask, bind=True, autoretry_for=(TransientSyncException,), retry_backoff=True
)
def microsoft_entra_sync(self, provider_pk: int, *args, **kwargs):
"""Run full sync for Microsoft Entra provider"""
return sync_tasks.sync(provider_pk, microsoft_entra_sync_objects)
return sync_tasks.sync_single(self, provider_pk, microsoft_entra_sync_objects)
@actor(description=_("Sync a direct object (user, group) for Microsoft Entra provider."))
@CELERY_APP.task()
def microsoft_entra_sync_all():
return sync_tasks.sync_all(microsoft_entra_sync)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def microsoft_entra_sync_direct(*args, **kwargs):
return sync_tasks.sync_signal_direct(*args, **kwargs)
@actor(
description=_("Dispatch syncs for a direct object (user, group) for Microsoft Entra providers.")
)
def microsoft_entra_sync_direct_dispatch(*args, **kwargs):
return sync_tasks.sync_signal_direct_dispatch(microsoft_entra_sync_direct, *args, **kwargs)
@actor(description=_("Sync a related object (memberships) for Microsoft Entra provider."))
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def microsoft_entra_sync_m2m(*args, **kwargs):
return sync_tasks.sync_signal_m2m(*args, **kwargs)
@actor(
description=_(
"Dispatch syncs for a related object (memberships) for Microsoft Entra providers."
)
)
def microsoft_entra_sync_m2m_dispatch(*args, **kwargs):
return sync_tasks.sync_signal_m2m_dispatch(microsoft_entra_sync_m2m, *args, **kwargs)

View File

@ -252,13 +252,9 @@ class MicrosoftEntraGroupTests(TestCase):
member_add.assert_called_once()
self.assertEqual(
member_add.call_args[0][0].odata_id,
f"https://graph.microsoft.com/v1.0/directoryObjects/{
MicrosoftEntraProviderUser.objects.filter(
f"https://graph.microsoft.com/v1.0/directoryObjects/{MicrosoftEntraProviderUser.objects.filter(
provider=self.provider,
)
.first()
.microsoft_id
}",
).first().microsoft_id}",
)
def test_group_create_member_remove(self):
@ -315,13 +311,9 @@ class MicrosoftEntraGroupTests(TestCase):
member_add.assert_called_once()
self.assertEqual(
member_add.call_args[0][0].odata_id,
f"https://graph.microsoft.com/v1.0/directoryObjects/{
MicrosoftEntraProviderUser.objects.filter(
f"https://graph.microsoft.com/v1.0/directoryObjects/{MicrosoftEntraProviderUser.objects.filter(
provider=self.provider,
)
.first()
.microsoft_id
}",
).first().microsoft_id}",
)
member_remove.assert_called_once()
@ -421,7 +413,7 @@ class MicrosoftEntraGroupTests(TestCase):
),
) as group_list,
):
microsoft_entra_sync.send(self.provider.pk).get_result()
microsoft_entra_sync.delay(self.provider.pk).get()
self.assertTrue(
MicrosoftEntraProviderGroup.objects.filter(
group=different_group, provider=self.provider

View File

@ -397,7 +397,7 @@ class MicrosoftEntraUserTests(APITestCase):
AsyncMock(return_value=GroupCollectionResponse(value=[])),
),
):
microsoft_entra_sync.send(self.provider.pk).get_result()
microsoft_entra_sync.delay(self.provider.pk).get()
self.assertTrue(
MicrosoftEntraProviderUser.objects.filter(
user=different_user, provider=self.provider

View File

@ -17,7 +17,6 @@ from authentik.crypto.models import CertificateKeyPair
from authentik.lib.models import CreatedUpdatedModel
from authentik.lib.utils.time import timedelta_from_string, timedelta_string_validator
from authentik.providers.oauth2.models import JWTAlgorithms, OAuth2Provider
from authentik.tasks.models import TasksModel
class EventTypes(models.TextChoices):
@ -43,7 +42,7 @@ class SSFEventStatus(models.TextChoices):
SENT = "sent"
class SSFProvider(TasksModel, BackchannelProvider):
class SSFProvider(BackchannelProvider):
"""Shared Signals Framework provider to allow applications to
receive user events from authentik."""

View File

@ -1,8 +1,10 @@
from hashlib import sha256
from django.contrib.auth.signals import user_logged_out
from django.db.models import Model
from django.db.models.signals import post_delete, post_save, pre_delete
from django.dispatch import receiver
from django.http.request import HttpRequest
from guardian.shortcuts import assign_perm
from authentik.core.models import (
@ -18,7 +20,7 @@ from authentik.enterprise.providers.ssf.models import (
EventTypes,
SSFProvider,
)
from authentik.enterprise.providers.ssf.tasks import send_ssf_events
from authentik.enterprise.providers.ssf.tasks import send_ssf_event
from authentik.events.middleware import audit_ignore
from authentik.stages.authenticator.models import Device
from authentik.stages.authenticator_duo.models import DuoDevice
@ -60,13 +62,38 @@ def ssf_providers_post_save(sender: type[Model], instance: SSFProvider, created:
instance.save()
@receiver(user_logged_out)
def ssf_user_logged_out_session_revoked(sender, request: HttpRequest, user: User, **_):
"""Session revoked trigger (user logged out)"""
if not request.session or not request.session.session_key or not user:
return
send_ssf_event(
EventTypes.CAEP_SESSION_REVOKED,
{
"initiating_entity": "user",
},
sub_id={
"format": "complex",
"session": {
"format": "opaque",
"id": sha256(request.session.session_key.encode("ascii")).hexdigest(),
},
"user": {
"format": "email",
"email": user.email,
},
},
request=request,
)
@receiver(pre_delete, sender=AuthenticatedSession)
def ssf_user_session_delete_session_revoked(sender, instance: AuthenticatedSession, **_):
"""Session revoked trigger (users' session has been deleted)
As this signal is also triggered with a regular logout, we can't be sure
if the session has been deleted by an admin or by the user themselves."""
send_ssf_events(
send_ssf_event(
EventTypes.CAEP_SESSION_REVOKED,
{
"initiating_entity": "user",
@ -88,7 +115,7 @@ def ssf_user_session_delete_session_revoked(sender, instance: AuthenticatedSessi
@receiver(password_changed)
def ssf_password_changed_cred_change(sender, user: User, password: str | None, **_):
"""Credential change trigger (password changed)"""
send_ssf_events(
send_ssf_event(
EventTypes.CAEP_CREDENTIAL_CHANGE,
{
"credential_type": "password",
@ -126,7 +153,7 @@ def ssf_device_post_save(sender: type[Model], instance: Device, created: bool, *
}
if isinstance(instance, WebAuthnDevice) and instance.aaguid != UNKNOWN_DEVICE_TYPE_AAGUID:
data["fido2_aaguid"] = instance.aaguid
send_ssf_events(
send_ssf_event(
EventTypes.CAEP_CREDENTIAL_CHANGE,
data,
sub_id={
@ -153,7 +180,7 @@ def ssf_device_post_delete(sender: type[Model], instance: Device, **_):
}
if isinstance(instance, WebAuthnDevice) and instance.aaguid != UNKNOWN_DEVICE_TYPE_AAGUID:
data["fido2_aaguid"] = instance.aaguid
send_ssf_events(
send_ssf_event(
EventTypes.CAEP_CREDENTIAL_CHANGE,
data,
sub_id={

View File

@ -1,11 +1,7 @@
from typing import Any
from uuid import UUID
from celery import group
from django.http import HttpRequest
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask
from dramatiq.actor import actor
from requests.exceptions import RequestException
from structlog.stdlib import get_logger
@ -17,16 +13,19 @@ from authentik.enterprise.providers.ssf.models import (
Stream,
StreamEvent,
)
from authentik.events.logs import LogEvent
from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask
from authentik.lib.utils.http import get_http_session
from authentik.lib.utils.time import timedelta_from_string
from authentik.policies.engine import PolicyEngine
from authentik.tasks.models import Task
from authentik.root.celery import CELERY_APP
session = get_http_session()
LOGGER = get_logger()
def send_ssf_events(
def send_ssf_event(
event_type: EventTypes,
data: dict,
stream_filter: dict | None = None,
@ -34,7 +33,7 @@ def send_ssf_events(
**extra_data,
):
"""Wrapper to send an SSF event to multiple streams"""
events_data = {}
payload = []
if not stream_filter:
stream_filter = {}
stream_filter["events_requested__contains"] = [event_type]
@ -42,22 +41,16 @@ def send_ssf_events(
extra_data.setdefault("txn", request.request_id)
for stream in Stream.objects.filter(**stream_filter):
event_data = stream.prepare_event_payload(event_type, data, **extra_data)
events_data[stream.uuid] = event_data
ssf_events_dispatch.send(events_data)
payload.append((str(stream.uuid), event_data))
return _send_ssf_event.delay(payload)
@actor(description=_("Dispatch SSF events."))
def ssf_events_dispatch(events_data: dict[str, dict[str, Any]]):
for stream_uuid, event_data in events_data.items():
stream = Stream.objects.filter(pk=stream_uuid).first()
if not stream:
continue
send_ssf_event.send_with_options(args=(stream_uuid, event_data), rel_obj=stream.provider)
def _check_app_access(stream: Stream, event_data: dict) -> bool:
def _check_app_access(stream_uuid: str, event_data: dict) -> bool:
"""Check if event is related to user and if so, check
if the user has access to the application"""
stream = Stream.objects.filter(pk=stream_uuid).first()
if not stream:
return False
# `event_data` is a dict version of a StreamEvent
sub_id = event_data.get("payload", {}).get("sub_id", {})
email = sub_id.get("user", {}).get("email", None)
@ -72,22 +65,42 @@ def _check_app_access(stream: Stream, event_data: dict) -> bool:
return engine.passing
@actor(description=_("Send an SSF event."))
def send_ssf_event(stream_uuid: UUID, event_data: dict[str, Any]):
self: Task = CurrentTask.get_task()
@CELERY_APP.task()
def _send_ssf_event(event_data: list[tuple[str, dict]]):
tasks = []
for stream, data in event_data:
if not _check_app_access(stream, data):
continue
event = StreamEvent.objects.create(**data)
tasks.extend(send_single_ssf_event(stream, str(event.uuid)))
main_task = group(*tasks)
main_task()
stream = Stream.objects.filter(pk=stream_uuid).first()
def send_single_ssf_event(stream_id: str, evt_id: str):
stream = Stream.objects.filter(pk=stream_id).first()
if not stream:
return
if not _check_app_access(stream, event_data):
event = StreamEvent.objects.filter(pk=evt_id).first()
if not event:
return
event = StreamEvent.objects.create(**event_data)
self.set_uid(event.pk)
if event.status == SSFEventStatus.SENT:
return
if stream.delivery_method != DeliveryMethods.RISC_PUSH:
return
if stream.delivery_method == DeliveryMethods.RISC_PUSH:
return [ssf_push_event.si(str(event.pk))]
return []
@CELERY_APP.task(bind=True, base=SystemTask)
def ssf_push_event(self: SystemTask, event_id: str):
self.save_on_success = False
event = StreamEvent.objects.filter(pk=event_id).first()
if not event:
return
self.set_uid(event_id)
if event.status == SSFEventStatus.SENT:
self.set_status(TaskStatus.SUCCESSFUL)
return
try:
response = session.post(
event.stream.endpoint_url,
@ -97,17 +110,26 @@ def send_ssf_event(stream_uuid: UUID, event_data: dict[str, Any]):
response.raise_for_status()
event.status = SSFEventStatus.SENT
event.save()
self.set_status(TaskStatus.SUCCESSFUL)
return
except RequestException as exc:
LOGGER.warning("Failed to send SSF event", exc=exc)
self.set_status(TaskStatus.ERROR)
attrs = {}
if exc.response:
attrs["response"] = {
"content": exc.response.text,
"status": exc.response.status_code,
}
self.warning(exc)
self.warning("Failed to send request", **attrs)
self.set_error(
exc,
LogEvent(
_("Failed to send request"),
log_level="warning",
logger=self.__name__,
attributes=attrs,
),
)
# Re-up the expiry of the stream event
event.expires = now() + timedelta_from_string(event.stream.provider.event_retention)
event.status = SSFEventStatus.PENDING_FAILED
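
The new fan-out above leans on Celery's canvas primitives: .si() freezes a task call into an immutable signature (arguments fixed, no parent result passed in), and group() dispatches the collected signatures in parallel. Toy usage (the event IDs are placeholders):

from celery import group

signatures = [ssf_push_event.si(event_id) for event_id in ("evt-1", "evt-2")]
group(*signatures)()  # enqueue all pushes concurrently; returns a GroupResult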

View File

@ -13,7 +13,7 @@ from authentik.enterprise.providers.ssf.models import (
SSFProvider,
Stream,
)
from authentik.enterprise.providers.ssf.tasks import send_ssf_events
from authentik.enterprise.providers.ssf.tasks import send_ssf_event
from authentik.enterprise.providers.ssf.views.base import SSFView
LOGGER = get_logger()
@ -109,7 +109,7 @@ class StreamView(SSFView):
"User does not have permission to create stream for this provider."
)
instance: Stream = stream.save(provider=self.provider)
send_ssf_events(
send_ssf_event(
EventTypes.SET_VERIFICATION,
{
"state": None,

View File

@ -1,12 +0,0 @@
"""Enterprise app config"""
from authentik.enterprise.apps import EnterpriseConfig
class AuthentikEnterpriseSearchConfig(EnterpriseConfig):
"""Enterprise app config"""
name = "authentik.enterprise.search"
label = "authentik_search"
verbose_name = "authentik Enterprise.Search"
default = True

View File

@ -1,128 +0,0 @@
"""DjangoQL search"""
from collections import OrderedDict, defaultdict
from collections.abc import Generator
from django.db import connection
from django.db.models import Model, Q
from djangoql.compat import text_type
from djangoql.schema import StrField
class JSONSearchField(StrField):
"""JSON field for DjangoQL"""
model: Model
def __init__(self, model=None, name=None, nullable=None, suggest_nested=True):
# Set this in the constructor to not clobber the type variable
self.type = "relation"
self.suggest_nested = suggest_nested
super().__init__(model, name, nullable)
def get_lookup(self, path, operator, value):
search = "__".join(path)
op, invert = self.get_operator(operator)
q = Q(**{f"{search}{op}": self.get_lookup_value(value)})
return ~q if invert else q
def json_field_keys(self) -> Generator[tuple[str]]:
with connection.cursor() as cursor:
cursor.execute(
f"""
WITH RECURSIVE "{self.name}_keys" AS (
SELECT
ARRAY[jsonb_object_keys("{self.name}")] AS key_path_array,
"{self.name}" -> jsonb_object_keys("{self.name}") AS value
FROM {self.model._meta.db_table}
WHERE "{self.name}" IS NOT NULL
AND jsonb_typeof("{self.name}") = 'object'
UNION ALL
SELECT
ck.key_path_array || jsonb_object_keys(ck.value),
ck.value -> jsonb_object_keys(ck.value) AS value
FROM "{self.name}_keys" ck
WHERE jsonb_typeof(ck.value) = 'object'
),
unique_paths AS (
SELECT DISTINCT key_path_array
FROM "{self.name}_keys"
)
SELECT key_path_array FROM unique_paths;
""" # nosec
)
return (x[0] for x in cursor.fetchall())
def get_nested_options(self) -> OrderedDict:
"""Get keys of all nested objects to show autocomplete"""
if not self.suggest_nested:
return OrderedDict()
base_model_name = f"{self.model._meta.app_label}.{self.model._meta.model_name}_{self.name}"
def recursive_function(parts: list[str], parent_parts: list[str] | None = None):
if not parent_parts:
parent_parts = []
path = parts.pop(0)
parent_parts.append(path)
relation_key = "_".join(parent_parts)
if len(parts) > 1:
out_dict = {
relation_key: {
parts[0]: {
"type": "relation",
"relation": f"{relation_key}_{parts[0]}",
}
}
}
child_paths = recursive_function(parts.copy(), parent_parts.copy())
child_paths.update(out_dict)
return child_paths
else:
return {relation_key: {parts[0]: {}}}
relation_structure = defaultdict(dict)
for relations in self.json_field_keys():
result = recursive_function([base_model_name] + relations)
for relation_key, value in result.items():
for sub_relation_key, sub_value in value.items():
if not relation_structure[relation_key].get(sub_relation_key, None):
relation_structure[relation_key][sub_relation_key] = sub_value
else:
relation_structure[relation_key][sub_relation_key].update(sub_value)
final_dict = defaultdict(dict)
for key, value in relation_structure.items():
for sub_key, sub_value in value.items():
if not sub_value:
final_dict[key][sub_key] = {
"type": "str",
"nullable": True,
}
else:
final_dict[key][sub_key] = sub_value
return OrderedDict(final_dict)
def relation(self) -> str:
return f"{self.model._meta.app_label}.{self.model._meta.model_name}_{self.name}"
class ChoiceSearchField(StrField):
def __init__(self, model=None, name=None, nullable=None):
super().__init__(model, name, nullable, suggest_options=True)
def get_options(self, search):
result = []
choices = self._field_choices()
if choices:
search = search.lower()
for c in choices:
choice = text_type(c[0])
if search in choice.lower():
result.append(choice)
return result
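
To make the recursion above concrete: for a User row with attributes = {"foo": {"bar": "baz"}}, json_field_keys() yields key paths such as ["foo"] and ["foo", "bar"], which get_nested_options() folds into chained autocomplete relations, approximately:

{
    "authentik_core.user_attributes": {
        "foo": {"type": "relation", "relation": "authentik_core.user_attributes_foo"}
    },
    "authentik_core.user_attributes_foo": {
        "bar": {"type": "str", "nullable": True}
    },
}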

View File

@ -1,53 +0,0 @@
from rest_framework.response import Response
from authentik.api.pagination import Pagination
from authentik.enterprise.search.ql import AUTOCOMPLETE_COMPONENT_NAME, QLSearch
class AutocompletePagination(Pagination):
def paginate_queryset(self, queryset, request, view=None):
self.view = view
return super().paginate_queryset(queryset, request, view)
def get_autocomplete(self):
schema = QLSearch().get_schema(self.request, self.view)
introspections = {}
if hasattr(self.view, "get_ql_fields"):
from authentik.enterprise.search.schema import AKQLSchemaSerializer
introspections = AKQLSchemaSerializer().serialize(
schema(self.page.paginator.object_list.model)
)
return introspections
def get_paginated_response(self, data):
previous_page_number = 0
if self.page.has_previous():
previous_page_number = self.page.previous_page_number()
next_page_number = 0
if self.page.has_next():
next_page_number = self.page.next_page_number()
return Response(
{
"pagination": {
"next": next_page_number,
"previous": previous_page_number,
"count": self.page.paginator.count,
"current": self.page.number,
"total_pages": self.page.paginator.num_pages,
"start_index": self.page.start_index(),
"end_index": self.page.end_index(),
},
"results": data,
"autocomplete": self.get_autocomplete(),
}
)
def get_paginated_response_schema(self, schema):
final_schema = super().get_paginated_response_schema(schema)
final_schema["properties"]["autocomplete"] = {
"$ref": f"#/components/schemas/{AUTOCOMPLETE_COMPONENT_NAME}"
}
final_schema["required"].append("autocomplete")
return final_schema
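
For reference, the paginated payload assembled above has this shape (values abbreviated):

{
    "pagination": {
        "next": 2, "previous": 0, "count": 42, "current": 1,
        "total_pages": 3, "start_index": 1, "end_index": 20,
    },
    "results": [...],
    "autocomplete": {...},  # {} when the view defines no get_ql_fields()
}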

View File

@ -1,80 +0,0 @@
"""DjangoQL search"""
from django.apps import apps
from django.db.models import QuerySet
from djangoql.ast import Name
from djangoql.exceptions import DjangoQLError
from djangoql.queryset import apply_search
from djangoql.schema import DjangoQLSchema
from rest_framework.filters import BaseFilterBackend, SearchFilter
from rest_framework.request import Request
from structlog.stdlib import get_logger
from authentik.enterprise.search.fields import JSONSearchField
LOGGER = get_logger()
AUTOCOMPLETE_COMPONENT_NAME = "Autocomplete"
AUTOCOMPLETE_SCHEMA = {
"type": "object",
"additionalProperties": {},
}
class BaseSchema(DjangoQLSchema):
"""Base Schema which deals with JSON Fields"""
def resolve_name(self, name: Name):
model = self.model_label(self.current_model)
root_field = name.parts[0]
field = self.models[model].get(root_field)
# If the query goes into a JSON field, return the root
# field as the JSON field will do the rest
if isinstance(field, JSONSearchField):
# This is a workaround; build_filter will remove the right-most
# entry in the path as that is intended to be the same as the field
# however for JSON that is not the case
if name.parts[-1] != root_field:
name.parts.append(root_field)
return field
return super().resolve_name(name)
class QLSearch(BaseFilterBackend):
"""rest_framework search filter which uses DjangoQL"""
def __init__(self):
super().__init__()
self._fallback = SearchFilter()
@property
def enabled(self):
return apps.get_app_config("authentik_enterprise").enabled()
def get_search_terms(self, request: Request) -> str:
"""Search terms are set by a ?search=... query parameter,
and may be comma and/or whitespace delimited."""
params = request.query_params.get("search", "")
params = params.replace("\x00", "") # strip null characters
return params
def get_schema(self, request: Request, view) -> BaseSchema:
ql_fields = []
if hasattr(view, "get_ql_fields"):
ql_fields = view.get_ql_fields()
class InlineSchema(BaseSchema):
def get_fields(self, model):
return ql_fields or []
return InlineSchema
def filter_queryset(self, request: Request, queryset: QuerySet, view) -> QuerySet:
search_query = self.get_search_terms(request)
schema = self.get_schema(request, view)
if len(search_query) == 0 or not self.enabled:
return self._fallback.filter_queryset(request, queryset, view)
try:
return apply_search(queryset, search_query, schema=schema)
except DjangoQLError as exc:
LOGGER.debug("Failed to parse search expression", exc=exc)
return self._fallback.filter_queryset(request, queryset, view)

View File

@ -1,29 +0,0 @@
from djangoql.serializers import DjangoQLSchemaSerializer
from drf_spectacular.generators import SchemaGenerator
from authentik.api.schema import create_component
from authentik.enterprise.search.fields import JSONSearchField
from authentik.enterprise.search.ql import AUTOCOMPLETE_COMPONENT_NAME, AUTOCOMPLETE_SCHEMA
class AKQLSchemaSerializer(DjangoQLSchemaSerializer):
def serialize(self, schema):
serialization = super().serialize(schema)
for _, fields in schema.models.items():
for _, field in fields.items():
if not isinstance(field, JSONSearchField):
continue
serialization["models"].update(field.get_nested_options())
return serialization
def serialize_field(self, field):
result = super().serialize_field(field)
if isinstance(field, JSONSearchField):
result["relation"] = field.relation()
return result
def postprocess_schema_search_autocomplete(result, generator: SchemaGenerator, **kwargs):
create_component(generator, AUTOCOMPLETE_COMPONENT_NAME, AUTOCOMPLETE_SCHEMA)
return result

View File

@ -1,17 +0,0 @@
SPECTACULAR_SETTINGS = {
"POSTPROCESSING_HOOKS": [
"authentik.api.schema.postprocess_schema_responses",
"authentik.enterprise.search.schema.postprocess_schema_search_autocomplete",
"drf_spectacular.hooks.postprocess_schema_enums",
],
}
REST_FRAMEWORK = {
"DEFAULT_PAGINATION_CLASS": "authentik.enterprise.search.pagination.AutocompletePagination",
"DEFAULT_FILTER_BACKENDS": [
"authentik.enterprise.search.ql.QLSearch",
"authentik.rbac.filters.ObjectFilter",
"django_filters.rest_framework.DjangoFilterBackend",
"rest_framework.filters.OrderingFilter",
],
}

View File

@ -1,78 +0,0 @@
from json import loads
from unittest.mock import PropertyMock, patch
from urllib.parse import urlencode
from django.urls import reverse
from rest_framework.test import APITestCase
from authentik.core.tests.utils import create_test_admin_user
@patch(
"authentik.enterprise.audit.middleware.EnterpriseAuditMiddleware.enabled",
PropertyMock(return_value=True),
)
class QLTest(APITestCase):
def setUp(self):
self.user = create_test_admin_user()
# ensure we have more than 1 user
create_test_admin_user()
def test_search(self):
"""Test simple search query"""
self.client.force_login(self.user)
query = f'username = "{self.user.username}"'
res = self.client.get(
reverse(
"authentik_api:user-list",
)
+ f"?{urlencode({"search": query})}"
)
self.assertEqual(res.status_code, 200)
content = loads(res.content)
self.assertEqual(content["pagination"]["count"], 1)
self.assertEqual(content["results"][0]["username"], self.user.username)
def test_no_search(self):
"""Ensure works with no search query"""
self.client.force_login(self.user)
res = self.client.get(
reverse(
"authentik_api:user-list",
)
)
self.assertEqual(res.status_code, 200)
content = loads(res.content)
self.assertNotEqual(content["pagination"]["count"], 1)
def test_search_no_ql(self):
"""Test simple search query (no QL)"""
self.client.force_login(self.user)
res = self.client.get(
reverse(
"authentik_api:user-list",
)
+ f"?{urlencode({"search": self.user.username})}"
)
self.assertEqual(res.status_code, 200)
content = loads(res.content)
self.assertEqual(content["pagination"]["count"], 1)
self.assertEqual(content["results"][0]["username"], self.user.username)
def test_search_json(self):
"""Test search query with a JSON attribute"""
self.user.attributes = {"foo": {"bar": "baz"}}
self.user.save()
self.client.force_login(self.user)
query = 'attributes.foo.bar = "baz"'
res = self.client.get(
reverse(
"authentik_api:user-list",
)
+ f"?{urlencode({"search": query})}"
)
self.assertEqual(res.status_code, 200)
content = loads(res.content)
self.assertEqual(content["pagination"]["count"], 1)
self.assertEqual(content["results"][0]["username"], self.user.username)

View File

@ -1,12 +1,23 @@
"""Enterprise additional settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"enterprise_update_usage": {
"task": "authentik.enterprise.tasks.enterprise_update_usage",
"schedule": crontab(minute=fqdn_rand("enterprise_update_usage"), hour="*/2"),
"options": {"queue": "authentik_scheduled"},
}
}
TENANT_APPS = [
"authentik.enterprise.audit",
"authentik.enterprise.policies.unique_password",
"authentik.enterprise.providers.google_workspace",
"authentik.enterprise.providers.microsoft_entra",
"authentik.enterprise.providers.ssf",
"authentik.enterprise.search",
"authentik.enterprise.stages.authenticator_endpoint_gdtc",
"authentik.enterprise.stages.mtls",
"authentik.enterprise.stages.source",

View File

@ -10,7 +10,6 @@ from django.utils.timezone import get_current_timezone
from authentik.enterprise.license import CACHE_KEY_ENTERPRISE_LICENSE
from authentik.enterprise.models import License
from authentik.enterprise.tasks import enterprise_update_usage
from authentik.tasks.schedules.models import Schedule
@receiver(pre_save, sender=License)
@ -27,7 +26,7 @@ def pre_save_license(sender: type[License], instance: License, **_):
def post_save_license(sender: type[License], instance: License, **_):
"""Trigger license usage calculation when license is saved"""
cache.delete(CACHE_KEY_ENTERPRISE_LICENSE)
Schedule.dispatch_by_actor(enterprise_update_usage)
enterprise_update_usage.delay()
@receiver(post_delete, sender=License)

View File

@ -97,7 +97,6 @@ class SourceStageFinal(StageView):
token: FlowToken = self.request.session.get(SESSION_KEY_OVERRIDE_FLOW_TOKEN)
self.logger.info("Replacing source flow with overridden flow", flow=token.flow.slug)
plan = token.plan
plan.context.update(self.executor.plan.context)
plan.context[PLAN_CONTEXT_IS_RESTORED] = token
response = plan.to_redirect(self.request, token.flow)
token.delete()

View File

@ -90,17 +90,14 @@ class TestSourceStage(FlowTestCase):
plan: FlowPlan = session[SESSION_KEY_PLAN]
plan.insert_stage(in_memory_stage(SourceStageFinal), index=0)
plan.context[PLAN_CONTEXT_IS_RESTORED] = flow_token
plan.context["foo"] = "bar"
session[SESSION_KEY_PLAN] = plan
session.save()
# Pretend we've just returned from the source
with self.assertFlowFinishes() as ff:
response = self.client.get(
reverse("authentik_api:flow-executor", kwargs={"flow_slug": flow.slug}), follow=True
)
self.assertEqual(response.status_code, 200)
self.assertStageRedirects(
response, reverse("authentik_core:if-flow", kwargs={"flow_slug": flow.slug})
)
self.assertEqual(ff().context["foo"], "bar")
response = self.client.get(
reverse("authentik_api:flow-executor", kwargs={"flow_slug": flow.slug}), follow=True
)
self.assertEqual(response.status_code, 200)
self.assertStageRedirects(
response, reverse("authentik_core:if-flow", kwargs={"flow_slug": flow.slug})
)

View File

@ -1,11 +1,14 @@
"""Enterprise tasks"""
from django.utils.translation import gettext_lazy as _
from dramatiq.actor import actor
from authentik.enterprise.license import LicenseKey
from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask, prefill_task
from authentik.root.celery import CELERY_APP
@actor(description=_("Update enterprise license status."))
def enterprise_update_usage():
@CELERY_APP.task(bind=True, base=SystemTask)
@prefill_task
def enterprise_update_usage(self: SystemTask):
"""Update enterprise license status"""
LicenseKey.get_total().record_usage()
self.set_status(TaskStatus.SUCCESSFUL)

View File

@ -132,22 +132,6 @@ class EventViewSet(ModelViewSet):
]
filterset_class = EventsFilter
def get_ql_fields(self):
from djangoql.schema import DateTimeField, StrField
from authentik.enterprise.search.fields import ChoiceSearchField, JSONSearchField
return [
ChoiceSearchField(Event, "action"),
StrField(Event, "event_uuid"),
StrField(Event, "app", suggest_options=True),
StrField(Event, "client_ip"),
JSONSearchField(Event, "user", suggest_nested=False),
JSONSearchField(Event, "brand", suggest_nested=False),
JSONSearchField(Event, "context", suggest_nested=False),
DateTimeField(Event, "created", suggest_options=True),
]
@extend_schema(
methods=["GET"],
responses={200: EventTopPerUserSerializer(many=True)},

View File

@ -11,7 +11,7 @@ from authentik.events.models import NotificationRule
class NotificationRuleSerializer(ModelSerializer):
"""NotificationRule Serializer"""
destination_group_obj = GroupSerializer(read_only=True, source="destination_group")
group_obj = GroupSerializer(read_only=True, source="group")
class Meta:
model = NotificationRule
@ -20,9 +20,8 @@ class NotificationRuleSerializer(ModelSerializer):
"name",
"transports",
"severity",
"destination_group",
"destination_group_obj",
"destination_event_user",
"group",
"group_obj",
]
@ -31,6 +30,6 @@ class NotificationRuleViewSet(UsedByMixin, ModelViewSet):
queryset = NotificationRule.objects.all()
serializer_class = NotificationRuleSerializer
filterset_fields = ["name", "severity", "destination_group__name"]
filterset_fields = ["name", "severity", "group__name"]
ordering = ["name"]
search_fields = ["name", "destination_group__name"]
search_fields = ["name", "group__name"]

View File

@ -0,0 +1,104 @@
"""Tasks API"""
from importlib import import_module
from django.contrib import messages
from django.utils.translation import gettext_lazy as _
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiResponse, extend_schema
from rest_framework.decorators import action
from rest_framework.fields import (
CharField,
ChoiceField,
DateTimeField,
FloatField,
SerializerMethodField,
)
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ReadOnlyModelViewSet
from structlog.stdlib import get_logger
from authentik.core.api.utils import ModelSerializer
from authentik.events.logs import LogEventSerializer
from authentik.events.models import SystemTask, TaskStatus
from authentik.rbac.decorators import permission_required
LOGGER = get_logger()
class SystemTaskSerializer(ModelSerializer):
"""Serialize TaskInfo and TaskResult"""
name = CharField()
full_name = SerializerMethodField()
uid = CharField(required=False)
description = CharField()
start_timestamp = DateTimeField(read_only=True)
finish_timestamp = DateTimeField(read_only=True)
duration = FloatField(read_only=True)
status = ChoiceField(choices=[(x.value, x.name) for x in TaskStatus])
messages = LogEventSerializer(many=True)
def get_full_name(self, instance: SystemTask) -> str:
"""Get full name with UID"""
if instance.uid:
return f"{instance.name}:{instance.uid}"
return instance.name
class Meta:
model = SystemTask
fields = [
"uuid",
"name",
"full_name",
"uid",
"description",
"start_timestamp",
"finish_timestamp",
"duration",
"status",
"messages",
"expires",
"expiring",
]
class SystemTaskViewSet(ReadOnlyModelViewSet):
"""Read-only view set that returns all background tasks"""
queryset = SystemTask.objects.all()
serializer_class = SystemTaskSerializer
filterset_fields = ["name", "uid", "status"]
ordering = ["name", "uid", "status"]
search_fields = ["name", "description", "uid", "status"]
@permission_required(None, ["authentik_events.run_task"])
@extend_schema(
request=OpenApiTypes.NONE,
responses={
204: OpenApiResponse(description="Task retried successfully"),
404: OpenApiResponse(description="Task not found"),
500: OpenApiResponse(description="Failed to retry task"),
},
)
@action(detail=True, methods=["POST"], permission_classes=[])
def run(self, request: Request, pk=None) -> Response:
"""Run task"""
task: SystemTask = self.get_object()
try:
task_module = import_module(task.task_call_module)
task_func = getattr(task_module, task.task_call_func)
LOGGER.info("Running task", task=task_func)
task_func.delay(*task.task_call_args, **task.task_call_kwargs)
messages.success(
self.request,
_("Successfully started task {name}.".format_map({"name": task.name})),
)
return Response(status=204)
except (ImportError, AttributeError) as exc: # pragma: no cover
LOGGER.warning("Failed to run task, remove state", task=task.name, exc=exc)
# if we get an import error, the module path has probably changed
task.delete()
return Response(status=500)
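
The run() action above re-dispatches a task from its stored dotted path. The core mechanism, isolated into a sketch (names illustrative):

from importlib import import_module

def redispatch(module_path: str, func_name: str, *args, **kwargs):
    task_func = getattr(import_module(module_path), func_name)
    return task_func.delay(*args, **kwargs)  # hand the call back to the broker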

View File

@ -1,11 +1,12 @@
"""authentik events app"""
from celery.schedules import crontab
from prometheus_client import Gauge, Histogram
from authentik.blueprints.apps import ManagedAppConfig
from authentik.lib.config import CONFIG, ENV_PREFIX
from authentik.lib.utils.time import fqdn_rand
from authentik.tasks.schedules.lib import ScheduleSpec
from authentik.lib.utils.reflection import path_to_class
from authentik.root.celery import CELERY_APP
# TODO: Deprecated metric - remove in 2024.2 or later
GAUGE_TASKS = Gauge(
@ -34,17 +35,6 @@ class AuthentikEventsConfig(ManagedAppConfig):
verbose_name = "authentik Events"
default = True
@property
def tenant_schedule_specs(self) -> list[ScheduleSpec]:
from authentik.events.tasks import notification_cleanup
return [
ScheduleSpec(
actor=notification_cleanup,
crontab=f"{fqdn_rand('notification_cleanup')} */8 * * *",
),
]
@ManagedAppConfig.reconcile_global
def check_deprecations(self):
"""Check for config deprecations"""
@ -66,3 +56,41 @@ class AuthentikEventsConfig(ManagedAppConfig):
replacement_env=replace_env,
message=msg,
).save()
@ManagedAppConfig.reconcile_tenant
def prefill_tasks(self):
"""Prefill tasks"""
from authentik.events.models import SystemTask
from authentik.events.system_tasks import _prefill_tasks
for task in _prefill_tasks:
if SystemTask.objects.filter(name=task.name).exists():
continue
task.save()
self.logger.debug("prefilled task", task_name=task.name)
@ManagedAppConfig.reconcile_tenant
def run_scheduled_tasks(self):
"""Run schedule tasks which are behind schedule (only applies
to tasks of which we keep metrics)"""
from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask as CelerySystemTask
for task in CELERY_APP.conf["beat_schedule"].values():
schedule = task["schedule"]
if not isinstance(schedule, crontab):
continue
task_class: CelerySystemTask = path_to_class(task["task"])
if not isinstance(task_class, CelerySystemTask):
continue
db_task = task_class.db()
if not db_task:
continue
due, _ = schedule.is_due(db_task.finish_timestamp)
if due or db_task.status == TaskStatus.UNKNOWN:
self.logger.debug("Running past-due scheduled task", task=task["task"])
task_class.apply_async(
args=task.get("args", None),
kwargs=task.get("kwargs", None),
**task.get("options", {}),
)
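
run_scheduled_tasks above relies on celery.schedules.crontab.is_due(), which, given the time of the last run, reports whether the schedule is overdue and how long until the next tick. In isolation (last_finish is assumed to be a timezone-aware datetime):

from datetime import timedelta
from celery.schedules import crontab
from django.utils.timezone import now

schedule = crontab(minute=30, hour="*/2")
last_finish = now() - timedelta(hours=5)
due, next_in_seconds = schedule.is_due(last_finish)
if due:
    ...  # re-dispatch with the stored args/kwargs/options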

View File

@ -15,13 +15,13 @@ class MMDBContextProcessor(EventContextProcessor):
self.reader: Reader | None = None
self._last_mtime: float = 0.0
self.logger = get_logger()
self.load()
self.open()
def path(self) -> str | None:
"""Get the path to the MMDB file to load"""
raise NotImplementedError
def load(self):
def open(self):
"""Get GeoIP Reader, if configured, otherwise none"""
path = self.path()
if path == "" or not path:
@ -44,7 +44,7 @@ class MMDBContextProcessor(EventContextProcessor):
diff = self._last_mtime < mtime
if diff > 0:
self.logger.info("Found new MMDB Database, reopening", diff=diff, path=path)
self.load()
self.open()
except OSError as exc:
self.logger.warning("Failed to check MMDB age", exc=exc)
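
The rename above (load to open) doesn't change the hot-reload pattern: remember the file's mtime at open time and reopen when a newer mtime appears. Generic shape of that pattern (resource handling elided):

from os import stat

class HotReloadingReader:
    def __init__(self, path: str):
        self.path = path
        self._last_mtime = 0.0
        self.open()

    def open(self):
        self._last_mtime = stat(self.path).st_mtime
        # ... (re)build the underlying Reader here ...

    def check_expired(self):
        try:
            if stat(self.path).st_mtime > self._last_mtime:
                self.open()  # file replaced on disk, reopen it
        except OSError:
            pass  # keep serving from the stale handle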

View File

@ -19,7 +19,7 @@ from authentik.blueprints.v1.importer import excluded_models
from authentik.core.models import Group, User
from authentik.events.models import Event, EventAction, Notification
from authentik.events.utils import model_to_dict
from authentik.lib.sentry import should_ignore_exception
from authentik.lib.sentry import before_send
from authentik.lib.utils.errors import exception_to_string
from authentik.stages.authenticator_static.models import StaticToken
@ -173,7 +173,7 @@ class AuditMiddleware:
message=exception_to_string(exception),
)
thread.run()
elif not should_ignore_exception(exception):
elif before_send({}, {"exc_info": (None, exception, None)}) is not None:
thread = EventNewThread(
EventAction.SYSTEM_EXCEPTION,
request,

View File

@ -1,26 +0,0 @@
# Generated by Django 5.1.11 on 2025-06-16 23:21
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_events", "0009_remove_notificationtransport_webhook_mapping_and_more"),
]
operations = [
migrations.RenameField(
model_name="notificationrule",
old_name="group",
new_name="destination_group",
),
migrations.AddField(
model_name="notificationrule",
name="destination_event_user",
field=models.BooleanField(
default=False,
help_text="When enabled, notification will be sent to user the user that triggered the event.When destination_group is configured, notification is sent to both.",
),
),
]

Some files were not shown because too many files have changed in this diff.