Compare commits


17 Commits

SHA1 Message Date
3925f5a208 release: 2023.5.6 2023-08-29 19:36:52 +02:00
6add4a62b9 include cure53 report
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-08-29 19:35:50 +02:00
54d5aa20ba security: fix CVE-2023-39522 (#6665)
* stages/email: don't disclose whether a user exists or not when recovering

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* update website

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	website/docs/releases/2023/v2023.5.md
#	website/docs/releases/2023/v2023.6.md
2023-08-29 19:08:47 +02:00
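The fix above addresses user enumeration during recovery: the email stage must behave identically whether or not the entered address matches an existing user. A minimal, self-contained Python sketch of that general pattern follows; the function names and the in-memory store are illustrative, not authentik's actual code.

# Hypothetical sketch: respond identically whether or not the account exists,
# so the recovery endpoint cannot be used to enumerate users.
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    email: str

USERS = {"alice@example.com": User("alice@example.com")}

def send_recovery_email(user: User) -> None:
    # Placeholder for the real mail-sending logic.
    print(f"recovery mail queued for {user.email}")

def start_recovery(email: str) -> str:
    user: Optional[User] = USERS.get(email)
    if user is not None:
        send_recovery_email(user)
    # Always return the same message, regardless of whether a user was found.
    return "If an account with this email exists, a recovery link has been sent."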
b99ac01228 release: 2023.5.5 2023-07-06 18:15:56 +02:00
15026748d1 security: fix CVE-2023-36456
Signed-off-by: Jens Langhammer <jens@goauthentik.io>

# Conflicts:
#	website/sidebars.js
2023-07-06 18:15:46 +02:00
2739376a2a release: 2023.5.4 2023-06-22 21:45:33 +02:00
152121175b bump web api client
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-22 21:33:02 +02:00
1d57a258f3 ATH-01-012: escape quotation marks
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:48:08 +02:00
f15cac39c8 ATH-01-014: save authenticator validation state in flow context
Signed-off-by: Jens Langhammer <jens@goauthentik.io>

bugfixes

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:48:05 +02:00
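The idea behind saving authenticator validation state in the flow context is that the marker lives with the flow plan itself rather than in per-request state, so later stages can rely on it and it cannot be skipped by replaying a single request. A generic illustration with hypothetical names, not authentik's actual API:

# Hypothetical flow-context sketch: record that the authenticator check passed
# inside the flow's context dict, and require that marker before finishing.
CONTEXT_KEY_MFA_VALIDATED = "example/authenticator_validated"

def on_authenticator_success(flow_context: dict, device_id: int) -> None:
    # Store which device passed validation, scoped to this flow execution.
    flow_context[CONTEXT_KEY_MFA_VALIDATED] = {"device": device_id}

def can_complete_flow(flow_context: dict) -> bool:
    # A later stage refuses to complete unless validation was recorded earlier.
    return CONTEXT_KEY_MFA_VALIDATED in flow_context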
ce77d82b24 ATH-01-010: rework
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:48:03 +02:00
c3fe57197d ATH-01-009: migrate impersonation to use API
Signed-off-by: Jens Langhammer <jens@goauthentik.io>

# Conflicts:
#	authentik/core/urls.py
#	web/src/admin/AdminInterface.ts
#	web/src/admin/users/RelatedUserList.ts
#	web/src/admin/users/UserListPage.ts
#	web/src/admin/users/UserViewPage.ts
#	web/src/user/UserInterface.ts

# Conflicts:
#	authentik/core/urls.py
2023-06-19 13:47:53 +02:00
267938d435 ATH-01-005: use hmac.compare_digest for secret_key authentication
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:47:11 +02:00
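hmac.compare_digest compares its full inputs in constant time, so checking a presented token against the secret key does not leak how many leading characters matched through timing differences. A minimal sketch with a hypothetical helper name:

import hmac

def secret_key_matches(presented: str, secret_key: str) -> bool:
    # Unlike `==`, compare_digest does not return early on the first
    # mismatching byte, which removes the timing side channel.
    return hmac.compare_digest(presented.encode(), secret_key.encode())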
6a7c2e0662 ATH-01-003 / ATH-01-012: disable htmlLabels in mermaid
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:47:09 +02:00
5336afb1b4 ATH-01-004: remove env from admin system endpoint
This endpoint already required admin access, and the environment variables it exposed were rarely needed for debugging.

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:47:06 +02:00
9bb44055a3 ATH-01-008: fix web forms not submitting correctly when pressing enter
When submitting some forms with the Enter key instead of clicking "Confirm" or a similar button, the form would not get submitted correctly.

In the worst case, when setting a user's password, the new password could end up in the URL while not actually being saved to the user.

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

# Conflicts:
#	web/src/admin/applications/ApplicationCheckAccessForm.ts
#	web/src/admin/crypto/CertificateGenerateForm.ts
#	web/src/admin/flows/FlowImportForm.ts
#	web/src/admin/groups/RelatedGroupList.ts
#	web/src/admin/policies/PolicyTestForm.ts
#	web/src/admin/property-mappings/PropertyMappingTestForm.ts
#	web/src/admin/providers/saml/SAMLProviderImportForm.ts
#	web/src/admin/users/RelatedUserList.ts
#	web/src/admin/users/ServiceAccountForm.ts
#	web/src/admin/users/UserPasswordForm.ts
#	web/src/admin/users/UserResetEmailForm.ts

# Conflicts:
#	web/src/admin/property-mappings/PropertyMappingTestForm.ts
2023-06-19 13:46:52 +02:00
143663d293 ATH-01-010: fix missing user filter for webauthn device
This prevents an attack that is only possible when an attacker can intercept HTTP traffic and, in the case of HTTPS, decrypt it.
2023-06-19 13:46:16 +02:00
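The underlying issue class is an object-level authorization gap: device lookups must be scoped to the user the flow belongs to, not just to the device ID the client sends. A self-contained illustration with made-up model names, not authentik's actual models:

# Illustrative sketch: always filter candidate WebAuthn credentials by the
# owning user, so one user's device cannot satisfy another user's stage.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WebAuthnDevice:
    pk: int
    user_id: int
    credential_id: bytes

DEVICES = [
    WebAuthnDevice(pk=1, user_id=10, credential_id=b"aaa"),
    WebAuthnDevice(pk=2, user_id=20, credential_id=b"bbb"),
]

def get_device_for_user(device_pk: int, user_id: int) -> Optional[WebAuthnDevice]:
    # The user filter is the important part: without it, any valid device pk
    # would be accepted regardless of who owns it.
    for device in DEVICES:
        if device.pk == device_pk and device.user_id == user_id:
            return device
    return None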
bd54d034e1 ATH-01-001: resolve path and check start before loading blueprints
This is even less of an issue since 411ef239f6, as that commit already restricts loading to files returned by the listing

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-06-19 13:46:13 +02:00
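The commit message describes a classic path-traversal guard: resolve the requested path and verify it still starts inside the allowed blueprint directory before reading it. A standalone sketch; the base directory is an assumption for the example, not authentik's configuration:

from pathlib import Path

# Assumed base directory, for illustration only.
BLUEPRINT_DIR = Path("/blueprints").resolve()

def safe_blueprint_path(requested: str) -> Path:
    # Resolve symlinks and ".." components, then make sure the result is
    # still inside the allowed directory before it is ever opened.
    candidate = (BLUEPRINT_DIR / requested).resolve()
    if not candidate.is_relative_to(BLUEPRINT_DIR):
        raise ValueError(f"refusing to load blueprint outside {BLUEPRINT_DIR}")
    return candidate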
540 changed files with 121727 additions and 100386 deletions

View File

@ -1,5 +1,5 @@
[bumpversion]
current_version = 2023.6.2
current_version = 2023.5.6
tag = True
commit = True
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)

View File

@ -1,17 +0,0 @@
---
name: Hackathon Idea
about: Propose an idea for the hackathon
title: ""
labels: hackathon
assignees: ""
---
**Describe the idea**
A clear concise description of the idea you want to implement
You're also free to work on existing GitHub issues, whether they be feature requests or bugs, just link the existing GitHub issue here.
<!-- Don't modify below here -->
If you want to help working on this idea or want to contribute in any other way, react to this issue with a :rocket:

View File

@ -24,18 +24,6 @@ updates:
open-pull-requests-limit: 10
commit-message:
prefix: "web:"
groups:
sentry:
patterns:
- "@sentry/*"
babel:
patterns:
- "@babel/*"
- "babel-*"
storybook:
patterns:
- "@storybook/*"
- "*storybook*"
- package-ecosystem: npm
directory: "/website"
schedule:
@ -44,10 +32,6 @@ updates:
open-pull-requests-limit: 10
commit-message:
prefix: "website:"
groups:
docusaurus:
patterns:
- "@docusaurus/*"
- package-ecosystem: pip
directory: "/"
schedule:

.github/stale.yml vendored Normal file
View File

@ -0,0 +1,19 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 60
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7
# Issues with these labels will never be considered stale
exemptLabels:
- pinned
- security
- pr_wanted
- enhancement
- bug/confirmed
- enhancement/confirmed
- question
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
only: issues

View File

@ -2,11 +2,11 @@ git:
filters:
- filter_type: file
# all supported i18n types: https://docs.transifex.com/formats
file_format: XLIFF
file_format: PO
source_language: en
source_file: web/xliff/en.xlf
source_file: web/src/locales/en.po
# path expression to translation files, must contain <lang> placeholder
translation_files_expression: "web/xliff/<lang>.xlf"
translation_files_expression: "web/src/locales/<lang>.po"
- filter_type: file
# all supported i18n types: https://docs.transifex.com/formats
file_format: PO

View File

@ -190,7 +190,7 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v2.2.0
uses: docker/setup-qemu-action@v2.1.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: prepare variables
@ -218,7 +218,6 @@ jobs:
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
- name: Comment on PR
if: github.event_name == 'pull_request'
@ -235,7 +234,7 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v2.2.0
uses: docker/setup-qemu-action@v2.1.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: prepare variables
@ -263,6 +262,5 @@ jobs:
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}-arm64
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/arm64

View File

@ -29,8 +29,7 @@ jobs:
- name: golangci-lint
uses: golangci/golangci-lint-action@v3
with:
version: v1.52.2
args: --timeout 5000s --verbose
args: --timeout 5000s
skip-pkg-cache: true
test-unittest:
runs-on: ubuntu-latest
@ -68,7 +67,7 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v2.2.0
uses: docker/setup-qemu-action@v2.1.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: prepare variables
@ -95,7 +94,6 @@ jobs:
file: ${{ matrix.type }}.Dockerfile
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/amd64,linux/arm64
context: .
@ -120,7 +118,7 @@ jobs:
- uses: actions/setup-go@v4
with:
go-version-file: "go.mod"
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"

View File

@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"
@ -31,7 +31,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"
@ -47,7 +47,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"
@ -63,7 +63,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"
@ -95,7 +95,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"

View File

@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"
@ -29,7 +29,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"
@ -50,7 +50,7 @@ jobs:
- build-docs-only
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"

View File

@ -1,20 +0,0 @@
name: authentik-on-release-next-branch
on:
schedule:
- cron: "0 12 * * *" # every day at noon
workflow_dispatch:
permissions:
contents: write
jobs:
update-next:
runs-on: ubuntu-latest
environment: internal-production
steps:
- uses: actions/checkout@v3
with:
ref: main
- run: |
git push origin --force main:next

View File

@ -10,7 +10,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v2.2.0
uses: docker/setup-qemu-action@v2.1.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: prepare variables
@ -43,7 +43,6 @@ jobs:
ghcr.io/goauthentik/server:latest
platforms: linux/amd64,linux/arm64
build-args: |
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
build-outpost:
runs-on: ubuntu-latest
@ -60,7 +59,7 @@ jobs:
with:
go-version-file: "go.mod"
- name: Set up QEMU
uses: docker/setup-qemu-action@v2.2.0
uses: docker/setup-qemu-action@v2.1.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: prepare variables
@ -91,7 +90,6 @@ jobs:
file: ${{ matrix.type }}.Dockerfile
platforms: linux/amd64,linux/arm64
build-args: |
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
build-outpost-binary:
timeout-minutes: 120
@ -110,7 +108,7 @@ jobs:
- uses: actions/setup-go@v4
with:
go-version-file: "go.mod"
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
cache: "npm"

View File

@ -1,33 +0,0 @@
name: 'authentik-repo-stale'
on:
schedule:
- cron: '30 1 * * *'
workflow_dispatch:
permissions:
issues: write
pull-requests: write
jobs:
stale:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: tibdex/github-app-token@v1
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/stale@v8
with:
repo-token: ${{ steps.generate_token.outputs.token }}
days-before-stale: 60
days-before-close: 7
exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question
stale-issue-label: wontfix
stale-issue-message: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
# Don't stale PRs, so only apply to PRs with a non-existent label
only-pr-labels: foo

View File

@ -17,7 +17,7 @@ jobs:
- uses: actions/checkout@v3
with:
token: ${{ steps.generate_token.outputs.token }}
- uses: actions/setup-node@v3.7.0
- uses: actions/setup-node@v3.6.0
with:
node-version: "20"
registry-url: "https://registry.npmjs.org"
@ -45,8 +45,8 @@ jobs:
body: "web: bump API Client version"
delete-branch: true
signoff: true
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
team-reviewers: "@goauthentik/core"
author: authentik bot <github-bot@goauthentik.io>
- uses: peter-evans/enable-pull-request-automerge@v3
with:
token: ${{ steps.generate_token.outputs.token }}

.gitignore vendored
View File

@ -166,7 +166,6 @@ dmypy.json
# SageMath parsed files
# Environments
**/.DS_Store
# Spyder project settings

.vscode/launch.json vendored
View File

@ -1,27 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"name": "Python: PDB attach Server",
"type": "python",
"request": "attach",
"connect": {
"host": "localhost",
"port": 6800
},
"justMyCode": true,
"django": true
},
{
"name": "Python: PDB attach Worker",
"type": "python",
"request": "attach",
"connect": {
"host": "localhost",
"port": 6900
},
"justMyCode": true,
"django": true
},
]
}

View File

@ -20,7 +20,7 @@ WORKDIR /work/web
RUN npm ci --include=dev && npm run build
# Stage 3: Poetry to requirements.txt export
FROM docker.io/python:3.11.4-slim-bullseye AS poetry-locker
FROM docker.io/python:3.11.3-slim-bullseye AS poetry-locker
WORKDIR /work
COPY ./pyproject.toml /work
@ -31,7 +31,7 @@ RUN pip install --no-cache-dir poetry && \
poetry export -f requirements.txt --dev --output requirements-dev.txt
# Stage 4: Build go proxy
FROM docker.io/golang:1.20.5-bullseye AS go-builder
FROM docker.io/golang:1.20.4-bullseye AS go-builder
WORKDIR /work
@ -63,20 +63,17 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
"
# Stage 6: Run
FROM docker.io/python:3.11.4-slim-bullseye AS final-image
ARG GIT_BUILD_HASH
ARG VERSION
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
FROM docker.io/python:3.11.3-slim-bullseye AS final-image
LABEL org.opencontainers.image.url https://goauthentik.io
LABEL org.opencontainers.image.description goauthentik.io Main server image, see https://goauthentik.io for more info.
LABEL org.opencontainers.image.source https://github.com/goauthentik/authentik
LABEL org.opencontainers.image.version ${VERSION}
LABEL org.opencontainers.image.revision ${GIT_BUILD_HASH}
WORKDIR /
ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
COPY --from=poetry-locker /work/requirements.txt /
COPY --from=poetry-locker /work/requirements-dev.txt /
COPY --from=geoip /usr/share/GeoIP /geoip

View File

@ -52,7 +52,7 @@ lint:
migrate:
python -m lifecycle.migrate
i18n-extract: i18n-extract-core web-i18n-extract
i18n-extract: i18n-extract-core web-extract
i18n-extract-core:
ak makemessages --ignore web --ignore internal --ignore web --ignore web-api --ignore website -l en
@ -150,8 +150,8 @@ web-lint:
web-check-compile:
cd web && npm run tsc
web-i18n-extract:
cd web && npm run extract-locales
web-extract:
cd web && npm run extract
#########################
## Website

View File

@ -15,7 +15,7 @@
## What is authentik?
authentik is an open-source Identity Provider that emphasizes flexibility and versatility. It can be seamlessly integrated into existing environments to support new protocols. authentik is also a great solution for implementing sign-up, recovery, and other similar features in your application, saving you the hassle of dealing with them.
Authentik is an open-source Identity Provider that emphasizes flexibility and versatility. It can be seamlessly integrated into existing environments to support new protocols. Authentik is also a great solution for implementing sign-up, recovery, and other similar features in your application, saving you the hassle of dealing with them.
## Installation

View File

@ -1,4 +1,4 @@
authentik takes security very seriously. We follow the rules of [responsible disclosure](https://en.wikipedia.org/wiki/Responsible_disclosure), and we urge our community to do so as well, instead of reporting vulnerabilities publicly. This allows us to patch the issue quickly, announce it's existence and release the fixed version.
Authentik takes security very seriously. We follow the rules of [responsible disclosure](https://en.wikipedia.org/wiki/Responsible_disclosure), and we urge our community to do so as well, instead of reporting vulnerabilities publicly. This allows us to patch the issue quickly, announce it's existence and release the fixed version.
## Supported Versions

View File

@ -2,7 +2,7 @@
from os import environ
from typing import Optional
__version__ = "2023.6.2"
__version__ = "2023.5.6"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"

View File

@ -8,7 +8,6 @@ from rest_framework.viewsets import ViewSet
from authentik.core.api.utils import PassiveSerializer
from authentik.lib.utils.reflection import get_apps
from authentik.policies.event_matcher.models import model_choices
class AppSerializer(PassiveSerializer):
@ -30,17 +29,3 @@ class AppsViewSet(ViewSet):
for app in sorted(get_apps(), key=lambda app: app.name):
data.append({"name": app.name, "label": app.verbose_name})
return Response(AppSerializer(data, many=True).data)
class ModelViewSet(ViewSet):
"""Read-only view list all installed models"""
permission_classes = [IsAdminUser]
@extend_schema(responses={200: AppSerializer(many=True)})
def list(self, request: Request) -> Response:
"""Read-only view list all installed models"""
data = []
for name, label in model_choices():
data.append({"name": name, "label": label})
return Response(AppSerializer(data, many=True).data)

View File

@ -19,7 +19,7 @@ class WorkerView(APIView):
def get(self, request: Request) -> Response:
"""Get currently connected worker count."""
count = len(CELERY_APP.control.ping(timeout=0.5))
# In debug we run with `task_always_eager`, so tasks are ran on the main process
# In debug we run with `CELERY_TASK_ALWAYS_EAGER`, so tasks are ran on the main process
if settings.DEBUG: # pragma: no cover
count += 1
return Response({"count": count})

View File

@ -94,11 +94,6 @@ class TestAdminAPI(TestCase):
response = self.client.get(reverse("authentik_api:apps-list"))
self.assertEqual(response.status_code, 200)
def test_models(self):
"""Test models API"""
response = self.client.get(reverse("authentik_api:models-list"))
self.assertEqual(response.status_code, 200)
@reconcile_app("authentik_outposts")
def test_system(self):
"""Test system API"""

View File

@ -1,7 +1,7 @@
"""API URLs"""
from django.urls import path
from authentik.admin.api.meta import AppsViewSet, ModelViewSet
from authentik.admin.api.meta import AppsViewSet
from authentik.admin.api.metrics import AdministrationMetricsViewSet
from authentik.admin.api.system import SystemView
from authentik.admin.api.tasks import TaskViewSet
@ -11,7 +11,6 @@ from authentik.admin.api.workers import WorkerView
api_urlpatterns = [
("admin/system_tasks", TaskViewSet, "admin_system_tasks"),
("admin/apps", AppsViewSet, "apps"),
("admin/models", ModelViewSet, "models"),
path(
"admin/metrics/",
AdministrationMetricsViewSet.as_view(),

View File

@ -10,6 +10,8 @@ API Browser - {{ tenant.branding_title }}
<script src="{% static 'dist/standalone/api-browser/index.js' %}?version={{ version }}" type="module"></script>
<meta name="theme-color" content="#151515" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#151515" media="(prefers-color-scheme: dark)">
<link rel="icon" href="{{ tenant.branding_favicon }}">
<link rel="shortcut icon" href="{{ tenant.branding_favicon }}">
{% endblock %}
{% block body %}

View File

@ -11,37 +11,31 @@ metadata:
entries:
- model: authentik_core.token
identifiers:
identifier: "%(uid)s-token"
identifier: %(uid)s-token
attrs:
key: "%(uid)s"
user: "%(user)s"
key: %(uid)s
user: %(user)s
intent: api
- model: authentik_core.application
identifiers:
slug: "%(uid)s-app"
slug: %(uid)s-app
attrs:
name: "%(uid)s-app"
name: %(uid)s-app
icon: https://goauthentik.io/img/icon.png
- model: authentik_sources_oauth.oauthsource
identifiers:
slug: "%(uid)s-source"
slug: %(uid)s-source
attrs:
name: "%(uid)s-source"
name: %(uid)s-source
provider_type: azuread
consumer_key: "%(uid)s"
consumer_secret: "%(uid)s"
consumer_key: %(uid)s
consumer_secret: %(uid)s
icon: https://goauthentik.io/img/icon.png
- model: authentik_flows.flow
identifiers:
slug: "%(uid)s-flow"
slug: %(uid)s-flow
attrs:
name: "%(uid)s-flow"
title: "%(uid)s-flow"
name: %(uid)s-flow
title: %(uid)s-flow
designation: authentication
background: https://goauthentik.io/img/icon.png
- model: authentik_core.user
identifiers:
username: "%(uid)s"
attrs:
name: "%(uid)s"
password: "%(uid)s"

View File

@ -2,7 +2,7 @@
from django.test import TransactionTestCase
from authentik.blueprints.v1.importer import Importer
from authentik.core.models import Application, Token, User
from authentik.core.models import Application, Token
from authentik.core.tests.utils import create_test_admin_user
from authentik.flows.models import Flow
from authentik.lib.generators import generate_id
@ -45,9 +45,3 @@ class TestBlueprintsV1ConditionalFields(TransactionTestCase):
flow = Flow.objects.filter(slug=f"{self.uid}-flow").first()
self.assertIsNotNone(flow)
self.assertEqual(flow.background, "https://goauthentik.io/img/icon.png")
def test_user(self):
"""Test user"""
user: User = User.objects.filter(username=self.uid).first()
self.assertIsNotNone(user)
self.assertTrue(user.check_password(self.uid))

View File

@ -185,9 +185,9 @@ def apply_blueprint(self: MonitoredTask, instance_pk: str):
instance: Optional[BlueprintInstance] = None
try:
instance: BlueprintInstance = BlueprintInstance.objects.filter(pk=instance_pk).first()
self.set_uid(slugify(instance.name))
if not instance or not instance.enabled:
return
self.set_uid(slugify(instance.name))
blueprint_content = instance.retrieve()
file_hash = sha512(blueprint_content.encode()).hexdigest()
importer = Importer(blueprint_content, instance.context)

View File

@ -1,6 +1,5 @@
"""Groups API Viewset"""
from json import loads
from typing import Optional
from django.db.models.query import QuerySet
from django.http import Http404
@ -53,14 +52,6 @@ class GroupSerializer(ModelSerializer):
num_pk = IntegerField(read_only=True)
def validate_parent(self, parent: Optional[Group]):
"""Validate group parent (if set), ensuring the parent isn't itself"""
if not self.instance or not parent:
return parent
if str(parent.group_uuid) == str(self.instance.group_uuid):
raise ValidationError("Cannot set group as parent of itself.")
return parent
class Meta:
model = Group
fields = [

View File

@ -33,7 +33,7 @@ class TokenSerializer(ManagedSerializer, ModelSerializer):
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
if SERIALIZER_CONTEXT_BLUEPRINT in self.context:
self.fields["key"] = CharField(required=False)
self.fields["key"] = CharField()
def validate(self, attrs: dict[Any, str]) -> dict[Any, str]:
"""Ensure only API or App password tokens are created."""

View File

@ -15,7 +15,7 @@ from django.utils.http import urlencode
from django.utils.text import slugify
from django.utils.timezone import now
from django.utils.translation import gettext as _
from django_filters.filters import BooleanFilter, CharFilter, ModelMultipleChoiceFilter, UUIDFilter
from django_filters.filters import BooleanFilter, CharFilter, ModelMultipleChoiceFilter
from django_filters.filterset import FilterSet
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import (
@ -51,7 +51,6 @@ from structlog.stdlib import get_logger
from authentik.admin.api.metrics import CoordinateSerializer
from authentik.api.decorators import permission_required
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import LinkSerializer, PassiveSerializer, is_dict
from authentik.core.middleware import (
@ -114,30 +113,6 @@ class UserSerializer(ModelSerializer):
uid = CharField(read_only=True)
username = CharField(max_length=150, validators=[UniqueValidator(queryset=User.objects.all())])
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
if SERIALIZER_CONTEXT_BLUEPRINT in self.context:
self.fields["password"] = CharField(required=False)
def create(self, validated_data: dict) -> User:
"""If this serializer is used in the blueprint context, we allow for
directly setting a password. However should be done via the `set_password`
method instead of directly setting it like rest_framework."""
instance: User = super().create(validated_data)
if SERIALIZER_CONTEXT_BLUEPRINT in self.context and "password" in validated_data:
instance.set_password(validated_data["password"])
instance.save()
return instance
def update(self, instance: User, validated_data: dict) -> User:
"""Same as `create` above, set the password directly if we're in a blueprint
context"""
instance = super().update(instance, validated_data)
if SERIALIZER_CONTEXT_BLUEPRINT in self.context and "password" in validated_data:
instance.set_password(validated_data["password"])
instance.save()
return instance
def validate_path(self, path: str) -> str:
"""Validate path"""
if path[:1] == "/" or path[-1] == "/":
@ -284,7 +259,7 @@ class UsersFilter(FilterSet):
)
is_superuser = BooleanFilter(field_name="ak_groups", lookup_expr="is_superuser")
uuid = UUIDFilter(field_name="uuid")
uuid = CharFilter(field_name="uuid")
path = CharFilter(
field_name="path",

View File

@ -1,40 +0,0 @@
"""Run worker"""
from sys import exit as sysexit
from tempfile import tempdir
from celery.apps.worker import Worker
from django.core.management.base import BaseCommand
from django.db import close_old_connections
from structlog.stdlib import get_logger
from authentik.lib.config import CONFIG
from authentik.root.celery import CELERY_APP
LOGGER = get_logger()
class Command(BaseCommand):
"""Run worker"""
def handle(self, **options):
close_old_connections()
if CONFIG.y_bool("remote_debug"):
import debugpy
debugpy.listen(("0.0.0.0", 6900)) # nosec
worker: Worker = CELERY_APP.Worker(
no_color=False,
quiet=True,
optimization="fair",
max_tasks_per_child=1,
autoscale=(3, 1),
task_events=True,
beat=True,
schedule_filename=f"{tempdir}/celerybeat-schedule",
queues=["authentik", "authentik_scheduled", "authentik_events"],
)
for task in CELERY_APP.tasks:
LOGGER.debug("Registered task", task=task)
worker.start()
sysexit(worker.exitcode)

View File

@ -11,7 +11,7 @@ def backport_is_backchannel(apps: Apps, schema_editor: BaseDatabaseSchemaEditor)
for model in BackchannelProvider.__subclasses__():
try:
for obj in model.objects.only("is_backchannel"):
for obj in model.objects.all():
obj.is_backchannel = True
obj.save()
except (DatabaseError, InternalError, ProgrammingError):

View File

@ -5,6 +5,7 @@ from typing import Any, Optional
from uuid import uuid4
from deepmerge import always_merger
from django.conf import settings
from django.contrib.auth.hashers import check_password
from django.contrib.auth.models import AbstractUser
from django.contrib.auth.models import UserManager as DjangoUserManager
@ -32,7 +33,6 @@ from authentik.lib.models import (
)
from authentik.lib.utils.http import get_client_ip
from authentik.policies.models import PolicyBindingModel
from authentik.root.install_id import get_install_id
LOGGER = get_logger()
USER_ATTRIBUTE_DEBUG = "goauthentik.io/user/debug"
@ -217,7 +217,7 @@ class User(SerializerModel, GuardianUserMixin, AbstractUser):
@property
def uid(self) -> str:
"""Generate a globally unique UID, based on the user ID and the hashed secret key"""
return sha256(f"{self.id}-{get_install_id()}".encode("ascii")).hexdigest()
return sha256(f"{self.id}-{settings.SECRET_KEY}".encode("ascii")).hexdigest()
def locale(self, request: Optional[HttpRequest] = None) -> str:
"""Get the locale the user has configured"""
@ -376,10 +376,10 @@ class Application(SerializerModel, PolicyBindingModel):
def get_launch_url(self, user: Optional["User"] = None) -> Optional[str]:
"""Get launch URL if set, otherwise attempt to get launch URL based on provider."""
url = None
if provider := self.get_provider():
url = provider.launch_url
if self.meta_launch_url:
url = self.meta_launch_url
elif provider := self.get_provider():
url = provider.launch_url
if user and url:
if isinstance(user, SimpleLazyObject):
user._setup()

View File

@ -8,8 +8,7 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1">
<title>{% block title %}{% trans title|default:tenant.branding_title %}{% endblock %}</title>
<link rel="icon" href="{{ tenant.branding_favicon }}">
<link rel="shortcut icon" href="{{ tenant.branding_favicon }}">
<link rel="shortcut icon" type="image/png" href="{% static 'dist/assets/icons/icon.png' %}">
{% block head_before %}
{% endblock %}
<link rel="stylesheet" type="text/css" href="{% static 'dist/authentik.css' %}">

View File

@ -6,6 +6,8 @@
<script src="{% static 'dist/admin/AdminInterface.js' %}?version={{ version }}" type="module"></script>
<meta name="theme-color" content="#18191a" media="(prefers-color-scheme: dark)">
<meta name="theme-color" content="#ffffff" media="(prefers-color-scheme: light)">
<link rel="icon" href="{{ tenant.branding_favicon }}">
<link rel="shortcut icon" href="{{ tenant.branding_favicon }}">
{% include "base/header_js.html" %}
{% endblock %}

View File

@ -5,6 +5,8 @@
{% block head_before %}
{{ block.super }}
<link rel="prefetch" href="{{ flow.background_url }}" />
<link rel="icon" href="{{ tenant.branding_favicon }}">
<link rel="shortcut icon" href="{{ tenant.branding_favicon }}">
{% if flow.compatibility_mode and not inspector %}
<script>ShadyDOM = { force: !navigator.webdriver };</script>
{% endif %}

View File

@ -6,6 +6,8 @@
<script src="{% static 'dist/user/UserInterface.js' %}?version={{ version }}" type="module"></script>
<meta name="theme-color" content="#1c1e21" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#1c1e21" media="(prefers-color-scheme: dark)">
<link rel="icon" href="{{ tenant.branding_favicon }}">
<link rel="shortcut icon" href="{{ tenant.branding_favicon }}">
{% include "base/header_js.html" %}
{% endblock %}

View File

@ -67,16 +67,3 @@ class TestGroupsAPI(APITestCase):
},
)
self.assertEqual(res.status_code, 404)
def test_parent_self(self):
"""Test parent"""
group = Group.objects.create(name=generate_id())
self.client.force_login(self.admin)
res = self.client.patch(
reverse("authentik_api:group-detail", kwargs={"pk": group.pk}),
data={
"pk": self.user.pk + 3,
"parent": group.pk,
},
)
self.assertEqual(res.status_code, 400)

View File

@ -8,7 +8,7 @@ from authentik.core.api.utils import PassiveSerializer
from authentik.flows.challenge import Challenge
@dataclass(slots=True)
@dataclass
class UILoginButton:
"""Dataclass for Source's ui_login_button"""

View File

@ -7,7 +7,7 @@ from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec, rsa
from cryptography.hazmat.primitives.asymmetric.types import PrivateKeyTypes
from cryptography.hazmat.primitives.asymmetric.types import PRIVATE_KEY_TYPES
from cryptography.x509.oid import NameOID
from authentik import __version__
@ -40,7 +40,7 @@ class CertificateBuilder:
self.cert.save()
return self.cert
def generate_private_key(self) -> PrivateKeyTypes:
def generate_private_key(self) -> PRIVATE_KEY_TYPES:
"""Generate private key"""
if self._use_ec_private_key:
return ec.generate_private_key(curve=ec.SECP256R1)

View File

@ -6,7 +6,7 @@ from uuid import uuid4
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.types import PrivateKeyTypes, PublicKeyTypes
from cryptography.hazmat.primitives.asymmetric.types import PRIVATE_KEY_TYPES, PUBLIC_KEY_TYPES
from cryptography.hazmat.primitives.serialization import load_pem_private_key
from cryptography.x509 import Certificate, load_pem_x509_certificate
from django.db import models
@ -37,8 +37,8 @@ class CertificateKeyPair(SerializerModel, ManagedModel, CreatedUpdatedModel):
)
_cert: Optional[Certificate] = None
_private_key: Optional[PrivateKeyTypes] = None
_public_key: Optional[PublicKeyTypes] = None
_private_key: Optional[PRIVATE_KEY_TYPES] = None
_public_key: Optional[PUBLIC_KEY_TYPES] = None
@property
def serializer(self) -> Serializer:
@ -56,7 +56,7 @@ class CertificateKeyPair(SerializerModel, ManagedModel, CreatedUpdatedModel):
return self._cert
@property
def public_key(self) -> Optional[PublicKeyTypes]:
def public_key(self) -> Optional[PUBLIC_KEY_TYPES]:
"""Get public key of the private key"""
if not self._public_key:
self._public_key = self.private_key.public_key()
@ -65,7 +65,7 @@ class CertificateKeyPair(SerializerModel, ManagedModel, CreatedUpdatedModel):
@property
def private_key(
self,
) -> Optional[PrivateKeyTypes]:
) -> Optional[PRIVATE_KEY_TYPES]:
"""Get python cryptography PrivateKey instance"""
if not self._private_key and self.key_data != "":
try:

View File

@ -41,7 +41,6 @@ class TaskResult:
def with_error(self, exc: Exception) -> "TaskResult":
"""Since errors might not always be pickle-able, set the traceback"""
# TODO: Mark exception somehow so that is rendered as <pre> in frontend
self.messages.append(exception_to_string(exc))
return self
@ -70,10 +69,8 @@ class TaskInfo:
return cache.get_many(cache.keys(CACHE_KEY_PREFIX + "*"))
@staticmethod
def by_name(name: str) -> Optional["TaskInfo"] | Optional[list["TaskInfo"]]:
def by_name(name: str) -> Optional["TaskInfo"]:
"""Get TaskInfo Object by name"""
if "*" in name:
return cache.get_many(cache.keys(CACHE_KEY_PREFIX + name)).values()
return cache.get(CACHE_KEY_PREFIX + name, None)
def delete(self):

View File

@ -154,7 +154,7 @@ class AutosubmitChallenge(Challenge):
"""Autosubmit challenge used to send and navigate a POST request"""
url = CharField()
attrs = DictField(child=CharField(allow_blank=True), allow_empty=True)
attrs = DictField(child=CharField())
title = CharField(required=False)
component = CharField(default="ak-stage-autosubmit")

View File

@ -30,7 +30,7 @@ class StageMarker:
return binding
@dataclass(slots=True)
@dataclass
class ReevaluateMarker(StageMarker):
"""Reevaluate Marker, forces stage's policies to be evaluated again."""

View File

@ -45,7 +45,7 @@ def cache_key(flow: Flow, user: Optional[User] = None) -> str:
return prefix
@dataclass(slots=True)
@dataclass
class FlowPlan:
"""This data-class is the output of a FlowPlanner. It holds a flat list
of all Stages that should be run."""

View File

@ -1,28 +0,0 @@
"""flow views tests"""
from django.test import TestCase
from authentik.flows.challenge import AutosubmitChallenge, ChallengeTypes
class TestChallenges(TestCase):
"""Test generic challenges"""
def test_autosubmit_blank(self):
"""Test blank autosubmit"""
challenge = AutosubmitChallenge(
data={
"type": ChallengeTypes.NATIVE.value,
"url": "http://localhost",
"attrs": {},
}
)
self.assertTrue(challenge.is_valid(raise_exception=True))
# Test with an empty value
challenge = AutosubmitChallenge(
data={
"type": ChallengeTypes.NATIVE.value,
"url": "http://localhost",
"attrs": {"foo": ""},
}
)
self.assertTrue(challenge.is_valid(raise_exception=True))

View File

@ -23,7 +23,6 @@ from authentik.flows.api.bindings import FlowStageBindingSerializer
from authentik.flows.models import Flow
from authentik.flows.planner import FlowPlan
from authentik.flows.views.executor import SESSION_KEY_HISTORY, SESSION_KEY_PLAN
from authentik.root.install_id import get_install_id
class FlowInspectorPlanSerializer(PassiveSerializer):
@ -52,7 +51,7 @@ class FlowInspectorPlanSerializer(PassiveSerializer):
"""Get a unique session ID"""
request: Request = self.context["request"]
return sha256(
f"{request._request.session.session_key}-{get_install_id()}".encode("ascii")
f"{request._request.session.session_key}-{settings.SECRET_KEY}".encode("ascii")
).hexdigest()

View File

@ -33,7 +33,6 @@ redis:
cache_timeout_reputation: 300
debug: false
remote_debug: false
log_level: info
@ -73,7 +72,6 @@ outposts:
ldap:
task_timeout_hours: 2
page_size: 50
tls:
ciphers: null

View File

@ -28,7 +28,7 @@ class WebsocketMessageInstruction(IntEnum):
TRIGGER_UPDATE = 2
@dataclass(slots=True)
@dataclass
class WebsocketMessage:
"""Complete Websocket Message that is being sent"""

View File

@ -6,7 +6,7 @@ from rest_framework.viewsets import ModelViewSet
from authentik.core.api.used_by import UsedByMixin
from authentik.policies.api.policies import PolicySerializer
from authentik.policies.event_matcher.models import EventMatcherPolicy, app_choices, model_choices
from authentik.policies.event_matcher.models import EventMatcherPolicy, app_choices
class EventMatcherPolicySerializer(PolicySerializer):
@ -15,30 +15,15 @@ class EventMatcherPolicySerializer(PolicySerializer):
app = ChoiceField(
choices=app_choices(),
required=False,
allow_null=True,
allow_blank=True,
help_text=_(
"Match events created by selected application. When left empty, "
"all applications are matched."
),
)
model = ChoiceField(
choices=model_choices(),
required=False,
allow_null=True,
help_text=_(
"Match events created by selected model. "
"When left empty, all models are matched. When an app is selected, "
"all the application's models are matched."
),
)
def validate(self, attrs: dict) -> dict:
if (
attrs["action"] == ""
and attrs["client_ip"] == ""
and attrs["app"] == ""
and attrs["model"] == ""
):
if attrs["action"] == "" and attrs["client_ip"] == "" and attrs["app"] == "":
raise ValidationError(_("At least one criteria must be set."))
return super().validate(attrs)
@ -48,7 +33,6 @@ class EventMatcherPolicySerializer(PolicySerializer):
"action",
"client_ip",
"app",
"model",
]

View File

@ -1,21 +0,0 @@
# Generated by Django 4.1.7 on 2023-05-29 15:24
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_policies_event_matcher", "0021_alter_eventmatcherpolicy_app"),
]
operations = [
migrations.AddField(
model_name="eventmatcherpolicy",
name="model",
field=models.TextField(
blank=True,
default="",
help_text="Match events created by selected model. When left empty, all models are matched. When an app is selected, all the application's models are matched.",
),
),
]

View File

@ -1,103 +0,0 @@
# Generated by Django 4.1.7 on 2023-06-21 12:45
from django.apps.registry import Apps
from django.db import migrations, models
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
def replace_defaults(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
db_alias = schema_editor.connection.alias
eventmatcherpolicy = apps.get_model("authentik_policies_event_matcher", "eventmatcherpolicy")
for policy in eventmatcherpolicy.objects.using(db_alias).all():
changed = False
if policy.action == "":
policy.action = None
changed = True
if policy.app == "":
policy.app = None
changed = True
if policy.client_ip == "":
policy.client_ip = None
changed = True
if policy.model == "":
policy.model = None
changed = True
if not changed:
continue
policy.save()
class Migration(migrations.Migration):
dependencies = [
("authentik_policies_event_matcher", "0022_eventmatcherpolicy_model"),
]
operations = [
migrations.AlterField(
model_name="eventmatcherpolicy",
name="action",
field=models.TextField(
choices=[
("login", "Login"),
("login_failed", "Login Failed"),
("logout", "Logout"),
("user_write", "User Write"),
("suspicious_request", "Suspicious Request"),
("password_set", "Password Set"),
("secret_view", "Secret View"),
("secret_rotate", "Secret Rotate"),
("invitation_used", "Invite Used"),
("authorize_application", "Authorize Application"),
("source_linked", "Source Linked"),
("impersonation_started", "Impersonation Started"),
("impersonation_ended", "Impersonation Ended"),
("flow_execution", "Flow Execution"),
("policy_execution", "Policy Execution"),
("policy_exception", "Policy Exception"),
("property_mapping_exception", "Property Mapping Exception"),
("system_task_execution", "System Task Execution"),
("system_task_exception", "System Task Exception"),
("system_exception", "System Exception"),
("configuration_error", "Configuration Error"),
("model_created", "Model Created"),
("model_updated", "Model Updated"),
("model_deleted", "Model Deleted"),
("email_sent", "Email Sent"),
("update_available", "Update Available"),
("custom_", "Custom Prefix"),
],
default=None,
help_text="Match created events with this action type. When left empty, all action types will be matched.",
null=True,
),
),
migrations.AlterField(
model_name="eventmatcherpolicy",
name="app",
field=models.TextField(
default=None,
help_text="Match events created by selected application. When left empty, all applications are matched.",
null=True,
),
),
migrations.AlterField(
model_name="eventmatcherpolicy",
name="client_ip",
field=models.TextField(
default=None,
help_text="Matches Event's Client IP (strict matching, for network matching use an Expression Policy)",
null=True,
),
),
migrations.AlterField(
model_name="eventmatcherpolicy",
name="model",
field=models.TextField(
default=None,
help_text="Match events created by selected model. When left empty, all models are matched. When an app is selected, all the application's models are matched.",
null=True,
),
),
migrations.RunPython(replace_defaults),
]

View File

@ -1,19 +1,13 @@
"""Event Matcher models"""
from itertools import chain
from django.apps import apps
from django.db import models
from django.utils.translation import gettext as _
from rest_framework.serializers import BaseSerializer
from structlog.stdlib import get_logger
from authentik.blueprints.v1.importer import is_model_allowed
from authentik.events.models import Event, EventAction
from authentik.policies.models import Policy
from authentik.policies.types import PolicyRequest, PolicyResult
LOGGER = get_logger()
def app_choices() -> list[tuple[str, str]]:
"""Get a list of all installed applications that create events.
@ -25,50 +19,27 @@ def app_choices() -> list[tuple[str, str]]:
return choices
def model_choices() -> list[tuple[str, str]]:
"""Get a list of all installed models
Returns a list of tuples containing (dotted.model.path, name)"""
choices = []
for model in apps.get_models():
if not is_model_allowed(model):
continue
name = f"{model._meta.app_label}.{model._meta.model_name}"
choices.append((name, model._meta.verbose_name))
return choices
class EventMatcherPolicy(Policy):
"""Passes when Event matches selected criteria."""
action = models.TextField(
choices=EventAction.choices,
null=True,
default=None,
blank=True,
help_text=_(
"Match created events with this action type. "
"When left empty, all action types will be matched."
),
)
app = models.TextField(
null=True,
default=None,
blank=True,
default="",
help_text=_(
"Match events created by selected application. "
"When left empty, all applications are matched."
),
)
model = models.TextField(
null=True,
default=None,
help_text=_(
"Match events created by selected model. "
"When left empty, all models are matched. When an app is selected, "
"all the application's models are matched."
),
)
client_ip = models.TextField(
null=True,
default=None,
blank=True,
help_text=_(
"Matches Event's Client IP (strict matching, "
"for network matching use an Expression Policy)"
@ -89,55 +60,13 @@ class EventMatcherPolicy(Policy):
if "event" not in request.context:
return PolicyResult(False)
event: Event = request.context["event"]
matches: list[PolicyResult] = []
messages = []
checks = [
self.passes_action,
self.passes_client_ip,
self.passes_app,
self.passes_model,
]
for checker in checks:
result = checker(request, event)
if result is None:
continue
LOGGER.info(
"Event matcher check result",
checker=checker.__name__,
result=result,
)
matches.append(result)
passing = any(x.passing for x in matches)
messages = chain(*[x.messages for x in matches])
result = PolicyResult(passing, *messages)
result.source_results = matches
return result
def passes_action(self, request: PolicyRequest, event: Event) -> PolicyResult | None:
"""Check if `self.action` matches"""
if self.action is None:
return None
return PolicyResult(self.action == event.action, "Action matched.")
def passes_client_ip(self, request: PolicyRequest, event: Event) -> PolicyResult | None:
"""Check if `self.client_ip` matches"""
if self.client_ip is None:
return None
return PolicyResult(self.client_ip == event.client_ip, "Client IP matched.")
def passes_app(self, request: PolicyRequest, event: Event) -> PolicyResult | None:
"""Check if `self.app` matches"""
if self.app is None:
return None
return PolicyResult(self.app == event.app, "App matched.")
def passes_model(self, request: PolicyRequest, event: Event) -> PolicyResult | None:
"""Check if `self.model` is set, and pass if it matches the event's model"""
if self.model is None:
return None
event_model_info = event.context.get("model", {})
event_model = f"{event_model_info.get('app')}.{event_model_info.get('model_name')}"
return PolicyResult(event_model == self.model, "Model matched.")
if event.action == self.action:
return PolicyResult(True, "Action matched.")
if event.client_ip == self.client_ip:
return PolicyResult(True, "Client IP matched.")
if event.app == self.app:
return PolicyResult(True, "App matched.")
return PolicyResult(False)
class Meta(Policy.PolicyMeta):
verbose_name = _("Event Matcher Policy")

View File

@ -42,22 +42,6 @@ class TestEventMatcherPolicy(TestCase):
self.assertTrue(response.passing)
self.assertTupleEqual(response.messages, ("App matched.",))
def test_match_model(self):
"""Test match model"""
event = Event.new(EventAction.LOGIN)
event.context = {
"model": {
"app": "foo",
"model_name": "bar",
}
}
request = PolicyRequest(get_anonymous_user())
request.context["event"] = event
policy: EventMatcherPolicy = EventMatcherPolicy.objects.create(model="foo.bar")
response = policy.passes(request)
self.assertTrue(response.passing)
self.assertTupleEqual(response.messages, ("Model matched.",))
def test_drop(self):
"""Test drop event"""
event = Event.new(EventAction.LOGIN)
@ -68,19 +52,6 @@ class TestEventMatcherPolicy(TestCase):
response = policy.passes(request)
self.assertFalse(response.passing)
def test_drop_multiple(self):
"""Test drop event"""
event = Event.new(EventAction.LOGIN)
event.app = "foo"
event.client_ip = "1.2.3.4"
request = PolicyRequest(get_anonymous_user())
request.context["event"] = event
policy: EventMatcherPolicy = EventMatcherPolicy.objects.create(
client_ip="1.2.3.5", app="bar"
)
response = policy.passes(request)
self.assertFalse(response.passing)
def test_invalid(self):
"""Test passing event"""
request = PolicyRequest(get_anonymous_user())

View File

@ -19,7 +19,7 @@ LOGGER = get_logger()
CACHE_PREFIX = "goauthentik.io/policies/"
@dataclass(slots=True)
@dataclass
class PolicyRequest:
"""Data-class to hold policy request data"""
@ -27,14 +27,14 @@ class PolicyRequest:
http_request: Optional[HttpRequest]
obj: Optional[Model]
context: dict[str, Any]
debug: bool
debug: bool = False
def __init__(self, user: User):
super().__init__()
self.user = user
self.http_request = None
self.obj = None
self.context = {}
self.debug = False
def set_http_request(self, request: HttpRequest): # pragma: no cover
"""Load data from HTTP request, including geoip when enabled"""
@ -67,7 +67,7 @@ class PolicyRequest:
return text + ">"
@dataclass(slots=True)
@dataclass
class PolicyResult:
"""Result from evaluating a policy."""
@ -81,6 +81,7 @@ class PolicyResult:
log_messages: Optional[list[dict]]
def __init__(self, passing: bool, *messages: str):
super().__init__()
self.passing = passing
self.messages = messages
self.raw_result = None

View File

@ -29,7 +29,6 @@ class LDAPProviderSerializer(ProviderSerializer):
"outpost_set",
"search_mode",
"bind_mode",
"mfa_support",
]
extra_kwargs = ProviderSerializer.Meta.extra_kwargs
@ -100,16 +99,13 @@ class LDAPOutpostConfigSerializer(ModelSerializer):
"gid_start_number",
"search_mode",
"bind_mode",
"mfa_support",
]
class LDAPOutpostConfigViewSet(ReadOnlyModelViewSet):
"""LDAPProvider Viewset"""
queryset = LDAPProvider.objects.filter(
Q(application__isnull=False) | Q(backchannel_application__isnull=False)
)
queryset = LDAPProvider.objects.filter(application__isnull=False)
serializer_class = LDAPOutpostConfigSerializer
ordering = ["name"]
search_fields = ["name"]

View File

@ -1,37 +0,0 @@
# Generated by Django 4.1.7 on 2023-06-19 17:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_providers_ldap", "0002_ldapprovider_bind_mode"),
]
operations = [
migrations.AddField(
model_name="ldapprovider",
name="mfa_support",
field=models.BooleanField(
default=True,
help_text="When enabled, code-based multi-factor authentication can be used by appending a semicolon and the TOTP code to the password. This should only be enabled if all users that will bind to this provider have a TOTP device configured, as otherwise a password may incorrectly be rejected if it contains a semicolon.",
verbose_name="MFA Support",
),
),
migrations.AlterField(
model_name="ldapprovider",
name="gid_start_number",
field=models.IntegerField(
default=4000,
help_text="The start for gidNumbers, this number is added to a number generated from the group.pk to make sure that the numbers aren't too low for POSIX groups. Default is 4000 to ensure that we don't collide with local groups or users primary groups gidNumber",
),
),
migrations.AlterField(
model_name="ldapprovider",
name="uid_start_number",
field=models.IntegerField(
default=2000,
help_text="The start for uidNumbers, this number is added to the user.pk to make sure that the numbers aren't too low for POSIX users. Default is 2000 to ensure that we don't collide with local users uidNumber",
),
),
]

View File

@ -50,7 +50,7 @@ class LDAPProvider(OutpostModel, BackchannelProvider):
uid_start_number = models.IntegerField(
default=2000,
help_text=_(
"The start for uidNumbers, this number is added to the user.pk to make sure that the "
"The start for uidNumbers, this number is added to the user.Pk to make sure that the "
"numbers aren't too low for POSIX users. Default is 2000 to ensure that we don't "
"collide with local users uidNumber"
),
@ -60,7 +60,7 @@ class LDAPProvider(OutpostModel, BackchannelProvider):
default=4000,
help_text=_(
"The start for gidNumbers, this number is added to a number generated from the "
"group.pk to make sure that the numbers aren't too low for POSIX groups. Default "
"group.Pk to make sure that the numbers aren't too low for POSIX groups. Default "
"is 4000 to ensure that we don't collide with local groups or users "
"primary groups gidNumber"
),
@ -69,17 +69,6 @@ class LDAPProvider(OutpostModel, BackchannelProvider):
bind_mode = models.TextField(default=APIAccessMode.DIRECT, choices=APIAccessMode.choices)
search_mode = models.TextField(default=APIAccessMode.DIRECT, choices=APIAccessMode.choices)
mfa_support = models.BooleanField(
default=True,
verbose_name="MFA Support",
help_text=_(
"When enabled, code-based multi-factor authentication can be used by appending a "
"semicolon and the TOTP code to the password. This should only be enabled if all "
"users that will bind to this provider have a TOTP device configured, as otherwise "
"a password may incorrectly be rejected if it contains a semicolon."
),
)
@property
def launch_url(self) -> Optional[str]:
"""LDAP never has a launch URL"""

View File

@ -1,52 +0,0 @@
"""LDAP Provider API tests"""
from json import loads
from django.urls import reverse
from rest_framework.test import APITestCase
from authentik.core.models import Application
from authentik.core.tests.utils import create_test_admin_user, create_test_flow
from authentik.lib.generators import generate_id
from authentik.providers.ldap.models import LDAPProvider
class TestLDAPProviderAPI(APITestCase):
"""LDAP Provider API tests"""
def test_outpost_application(self):
"""Test outpost-like provider retrieval (direct connection)"""
provider = LDAPProvider.objects.create(
name=generate_id(),
authorization_flow=create_test_flow(),
)
Application.objects.create(
name=generate_id(),
slug=generate_id(),
provider=provider,
)
user = create_test_admin_user()
self.client.force_login(user)
res = self.client.get(reverse("authentik_api:ldapprovideroutpost-list"))
self.assertEqual(res.status_code, 200)
data = loads(res.content.decode())
self.assertEqual(data["pagination"]["count"], 1)
self.assertEqual(len(data["results"]), 1)
def test_outpost_application_backchannel(self):
"""Test outpost-like provider retrieval (backchannel connection)"""
provider = LDAPProvider.objects.create(
name=generate_id(),
authorization_flow=create_test_flow(),
)
app: Application = Application.objects.create(
name=generate_id(),
slug=generate_id(),
)
app.backchannel_providers.add(provider)
user = create_test_admin_user()
self.client.force_login(user)
res = self.client.get(reverse("authentik_api:ldapprovideroutpost-list"))
self.assertEqual(res.status_code, 200)
data = loads(res.content.decode())
self.assertEqual(data["pagination"]["count"], 1)
self.assertEqual(len(data["results"]), 1)

View File

@ -2,6 +2,6 @@
from authentik.providers.ldap.api import LDAPOutpostConfigViewSet, LDAPProviderViewSet
api_urlpatterns = [
("outposts/ldap", LDAPOutpostConfigViewSet, "ldapprovideroutpost"),
("outposts/ldap", LDAPOutpostConfigViewSet),
("providers/ldap", LDAPProviderViewSet),
]

View File

@ -19,11 +19,6 @@ SCOPE_OPENID = "openid"
SCOPE_OPENID_PROFILE = "profile"
SCOPE_OPENID_EMAIL = "email"
# https://www.iana.org/assignments/oauth-parameters/\
# oauth-parameters.xhtml#pkce-code-challenge-method
PKCE_METHOD_PLAIN = "plain"
PKCE_METHOD_S256 = "S256"
TOKEN_TYPE = "Bearer" # nosec
SCOPE_AUTHENTIK_API = "goauthentik.io/api"

View File

@ -41,7 +41,7 @@ class SubModes(models.TextChoices):
)
@dataclass(slots=True)
@dataclass
# pylint: disable=too-many-instance-attributes
class IDToken:
"""The primary extension that OpenID Connect makes to OAuth 2.0 to enable End-Users to be

View File

@ -9,7 +9,7 @@ from urllib.parse import urlparse, urlunparse
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey
from cryptography.hazmat.primitives.asymmetric.types import PrivateKeyTypes
from cryptography.hazmat.primitives.asymmetric.types import PRIVATE_KEY_TYPES
from dacite.core import from_dict
from django.db import models
from django.http import HttpRequest
@ -17,7 +17,6 @@ from django.urls import reverse
from django.utils.translation import gettext_lazy as _
from jwt import encode
from rest_framework.serializers import Serializer
from structlog.stdlib import get_logger
from authentik.core.models import ExpiringModel, PropertyMapping, Provider, User
from authentik.crypto.models import CertificateKeyPair
@ -27,8 +26,6 @@ from authentik.lib.utils.time import timedelta_string_validator
from authentik.providers.oauth2.id_token import IDToken, SubModes
from authentik.sources.oauth.models import OAuthSource
LOGGER = get_logger()
def generate_client_secret() -> str:
"""Generate client secret with adequate length"""
@ -218,7 +215,7 @@ class OAuth2Provider(Provider):
)
@cached_property
def jwt_key(self) -> tuple[str | PrivateKeyTypes, str]:
def jwt_key(self) -> tuple[str | PRIVATE_KEY_TYPES, str]:
"""Get either the configured certificate or the client secret"""
if not self.signing_key:
# No Certificate at all, assume HS256
@ -254,12 +251,8 @@ class OAuth2Provider(Provider):
if self.redirect_uris == "":
return None
main_url = self.redirect_uris.split("\n", maxsplit=1)[0]
try:
launch_url = urlparse(main_url)._replace(path="")
return urlunparse(launch_url)
except ValueError as exc:
LOGGER.warning("Failed to format launch url", exc=exc)
return None
launch_url = urlparse(main_url)._replace(path="")
return urlunparse(launch_url)
@property
def component(self) -> str:

View File

@ -1,7 +1,5 @@
"""Test OAuth2 API"""
from json import loads
from sys import version_info
from unittest import skipUnless
from django.urls import reverse
from rest_framework.test import APITestCase
@ -44,14 +42,3 @@ class TestAPI(APITestCase):
self.assertEqual(response.status_code, 200)
body = loads(response.content.decode())
self.assertEqual(body["issuer"], "http://testserver/application/o/test/")
# https://github.com/goauthentik/authentik/pull/5918
@skipUnless(version_info >= (3, 11, 4), "This behaviour is only Python 3.11.4 and up")
def test_launch_url(self):
"""Test launch_url"""
self.provider.redirect_uris = (
"https://[\\d\\w]+.pr.test.goauthentik.io/source/oauth/callback/authentik/\n"
)
self.provider.save()
self.provider.refresh_from_db()
self.assertIsNone(self.provider.launch_url)
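Both the guarded launch_url in the models.py hunk above and this launch_url test trace back to a CPython change: starting with Python 3.11.4, urlparse() validates bracketed hosts, so a regex-style redirect URI such as "https://[\d\w]+.pr.test.goauthentik.io/..." raises ValueError instead of parsing. A minimal standalone sketch of the guarded variant (the function name is illustrative; the original is a property on OAuth2Provider):

from urllib.parse import urlparse, urlunparse

def launch_url_from_redirect_uris(redirect_uris: str) -> str | None:
    """Best-effort launch URL: take the first redirect URI and strip its path."""
    if redirect_uris == "":
        return None
    main_url = redirect_uris.split("\n", maxsplit=1)[0]
    try:
        # Python >= 3.11.4 raises ValueError here for hosts like "[\d\w]+.example.com"
        launch_url = urlparse(main_url)._replace(path="")
        return urlunparse(launch_url)
    except ValueError:
        # Regex-style redirect URIs cannot be turned into a launch URL
        return None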

View File

@ -35,8 +35,6 @@ from authentik.lib.views import bad_request_message
from authentik.policies.types import PolicyRequest
from authentik.policies.views import PolicyAccessView, RequestValidationError
from authentik.providers.oauth2.constants import (
PKCE_METHOD_PLAIN,
PKCE_METHOD_S256,
PROMPT_CONSENT,
PROMPT_LOGIN,
PROMPT_NONE,
@ -76,7 +74,7 @@ SESSION_KEY_LAST_LOGIN_UID = "authentik/providers/oauth2/last_login_uid"
ALLOWED_PROMPT_PARAMS = {PROMPT_NONE, PROMPT_CONSENT, PROMPT_LOGIN}
@dataclass(slots=True)
@dataclass
# pylint: disable=too-many-instance-attributes
class OAuthAuthorizationParams:
"""Parameters required to authorize an OAuth Client"""
@ -256,10 +254,7 @@ class OAuthAuthorizationParams:
def check_code_challenge(self):
"""PKCE validation of the transformation method."""
if self.code_challenge and self.code_challenge_method not in [
PKCE_METHOD_PLAIN,
PKCE_METHOD_S256,
]:
if self.code_challenge and self.code_challenge_method not in ["plain", "S256"]:
raise AuthorizeError(
self.redirect_uri,
"invalid_request",

View File

@ -14,7 +14,7 @@ from authentik.providers.oauth2.utils import TokenResponse, authenticate_provide
LOGGER = get_logger()
@dataclass(slots=True)
@dataclass
class TokenIntrospectionParams:
"""Parameters for Token Introspection"""

View File

@ -17,8 +17,6 @@ from authentik.providers.oauth2.constants import (
GRANT_TYPE_IMPLICIT,
GRANT_TYPE_PASSWORD,
GRANT_TYPE_REFRESH_TOKEN,
PKCE_METHOD_PLAIN,
PKCE_METHOD_S256,
SCOPE_OPENID,
)
from authentik.providers.oauth2.models import (
@ -111,7 +109,6 @@ class ProviderInfoView(View):
"request_parameter_supported": False,
"claims_supported": self.get_claims(provider),
"claims_parameter_supported": False,
"code_challenge_methods_supported": [PKCE_METHOD_PLAIN, PKCE_METHOD_S256],
}
def get_claims(self, provider: OAuth2Provider) -> list[str]:

View File

@ -39,7 +39,6 @@ from authentik.providers.oauth2.constants import (
GRANT_TYPE_DEVICE_CODE,
GRANT_TYPE_PASSWORD,
GRANT_TYPE_REFRESH_TOKEN,
PKCE_METHOD_S256,
TOKEN_TYPE,
)
from authentik.providers.oauth2.errors import DeviceCodeError, TokenError, UserAuthError
@ -59,7 +58,7 @@ from authentik.stages.password.stage import PLAN_CONTEXT_METHOD, PLAN_CONTEXT_ME
LOGGER = get_logger()
@dataclass(slots=True)
@dataclass
# pylint: disable=too-many-instance-attributes
class TokenParams:
"""Token params"""
@ -222,7 +221,7 @@ class TokenParams:
# Validate PKCE parameters.
if self.code_verifier:
if self.authorization_code.code_challenge_method == PKCE_METHOD_S256:
if self.authorization_code.code_challenge_method == "S256":
new_code_challenge = (
urlsafe_b64encode(sha256(self.code_verifier.encode("ascii")).digest())
.decode("utf-8")

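The truncated comparison above is the S256 verification from RFC 7636: the token endpoint recomputes BASE64URL(SHA256(ASCII(code_verifier))), without padding, and compares it against the code_challenge stored with the authorization code. A self-contained sketch of that check (using hmac.compare_digest is a choice of this sketch, not something the hunk shows):

from base64 import urlsafe_b64encode
from hashlib import sha256
from hmac import compare_digest

def verify_s256(code_verifier: str, stored_code_challenge: str) -> bool:
    """Check a PKCE code_verifier against an S256 code_challenge (RFC 7636 section 4.6)."""
    new_code_challenge = (
        urlsafe_b64encode(sha256(code_verifier.encode("ascii")).digest())
        .decode("utf-8")
        .rstrip("=")  # base64url without padding
    )
    return compare_digest(new_code_challenge, stored_code_challenge)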
View File

@ -14,7 +14,7 @@ from authentik.providers.oauth2.utils import TokenResponse, authenticate_provide
LOGGER = get_logger()
@dataclass(slots=True)
@dataclass
class TokenRevocationParams:
"""Parameters for Token Revocation"""

View File

@ -1,40 +1,185 @@
"""Kubernetes Traefik Middleware Reconciler"""
from dataclasses import asdict, dataclass, field
from typing import TYPE_CHECKING
from dacite.core import from_dict
from kubernetes.client import ApiextensionsV1Api, CustomObjectsApi
from authentik.outposts.controllers.base import FIELD_MANAGER
from authentik.outposts.controllers.k8s.base import KubernetesObjectReconciler
from authentik.outposts.controllers.kubernetes import KubernetesController
from authentik.providers.proxy.controllers.k8s.traefik_2 import Traefik2MiddlewareReconciler
from authentik.providers.proxy.controllers.k8s.traefik_3 import (
Traefik3MiddlewareReconciler,
TraefikMiddleware,
)
from authentik.outposts.controllers.k8s.triggers import NeedsUpdate
from authentik.providers.proxy.models import ProxyMode, ProxyProvider
if TYPE_CHECKING:
from authentik.outposts.controllers.kubernetes import KubernetesController
class TraefikMiddlewareReconciler(KubernetesObjectReconciler):
@dataclass
class TraefikMiddlewareSpecForwardAuth:
"""traefik middleware forwardAuth spec"""
address: str
# pylint: disable=invalid-name
authResponseHeadersRegex: str = field(default="")
# pylint: disable=invalid-name
authResponseHeaders: list[str] = field(default_factory=list)
# pylint: disable=invalid-name
trustForwardHeader: bool = field(default=True)
@dataclass
class TraefikMiddlewareSpec:
"""Traefik middleware spec"""
# pylint: disable=invalid-name
forwardAuth: TraefikMiddlewareSpecForwardAuth
@dataclass
class TraefikMiddlewareMetadata:
"""Traefik Middleware metadata"""
name: str
namespace: str
labels: dict = field(default_factory=dict)
@dataclass
class TraefikMiddleware:
"""Traefik Middleware"""
# pylint: disable=invalid-name
apiVersion: str
kind: str
metadata: TraefikMiddlewareMetadata
spec: TraefikMiddlewareSpec
CRD_NAME = "middlewares.traefik.containo.us"
CRD_GROUP = "traefik.containo.us"
CRD_VERSION = "v1alpha1"
CRD_PLURAL = "middlewares"
class TraefikMiddlewareReconciler(KubernetesObjectReconciler[TraefikMiddleware]):
"""Kubernetes Traefik Middleware Reconciler"""
def __init__(self, controller: "KubernetesController") -> None:
super().__init__(controller)
self.reconciler = Traefik3MiddlewareReconciler(controller)
if not self.reconciler.crd_exists():
self.reconciler = Traefik2MiddlewareReconciler(controller)
self.api_ex = ApiextensionsV1Api(controller.client)
self.api = CustomObjectsApi(controller.client)
@property
def noop(self) -> bool:
return self.reconciler.noop
if not ProxyProvider.objects.filter(
outpost__in=[self.controller.outpost],
mode__in=[ProxyMode.FORWARD_SINGLE, ProxyMode.FORWARD_DOMAIN],
).exists():
self.logger.debug("No providers with forward auth enabled.")
return True
if not self._crd_exists():
self.logger.debug("CRD doesn't exist")
return True
return False
def _crd_exists(self) -> bool:
"""Check if the traefik middleware exists"""
return bool(
len(
self.api_ex.list_custom_resource_definition(
field_selector=f"metadata.name={CRD_NAME}"
).items
)
)
def reconcile(self, current: TraefikMiddleware, reference: TraefikMiddleware):
return self.reconciler.reconcile(current, reference)
super().reconcile(current, reference)
if current.spec.forwardAuth.address != reference.spec.forwardAuth.address:
raise NeedsUpdate()
if (
current.spec.forwardAuth.authResponseHeadersRegex
!= reference.spec.forwardAuth.authResponseHeadersRegex
):
raise NeedsUpdate()
# Ensure all of our headers are set, others can be added by the user.
if not set(current.spec.forwardAuth.authResponseHeaders).issubset(
reference.spec.forwardAuth.authResponseHeaders
):
raise NeedsUpdate()
def get_reference_object(self) -> TraefikMiddleware:
return self.reconciler.get_reference_object()
"""Get deployment object for outpost"""
return TraefikMiddleware(
apiVersion=f"{CRD_GROUP}/{CRD_VERSION}",
kind="Middleware",
metadata=TraefikMiddlewareMetadata(
name=self.name,
namespace=self.namespace,
labels=self.get_object_meta().labels,
),
spec=TraefikMiddlewareSpec(
forwardAuth=TraefikMiddlewareSpecForwardAuth(
address=(
f"http://{self.name}.{self.namespace}:9000/"
"outpost.goauthentik.io/auth/traefik"
),
authResponseHeaders=[
"X-authentik-username",
"X-authentik-groups",
"X-authentik-email",
"X-authentik-name",
"X-authentik-uid",
"X-authentik-jwt",
"X-authentik-meta-jwks",
"X-authentik-meta-outpost",
"X-authentik-meta-provider",
"X-authentik-meta-app",
"X-authentik-meta-version",
],
authResponseHeadersRegex="",
trustForwardHeader=True,
)
),
)
def create(self, reference: TraefikMiddleware):
return self.reconciler.create(reference)
return self.api.create_namespaced_custom_object(
group=CRD_GROUP,
version=CRD_VERSION,
plural=CRD_PLURAL,
namespace=self.namespace,
body=asdict(reference),
field_manager=FIELD_MANAGER,
)
def delete(self, reference: TraefikMiddleware):
return self.reconciler.delete(reference)
return self.api.delete_namespaced_custom_object(
group=CRD_GROUP,
version=CRD_VERSION,
namespace=self.namespace,
plural=CRD_PLURAL,
name=self.name,
)
def retrieve(self) -> TraefikMiddleware:
return self.reconciler.retrieve()
return from_dict(
TraefikMiddleware,
self.api.get_namespaced_custom_object(
group=CRD_GROUP,
version=CRD_VERSION,
namespace=self.namespace,
plural=CRD_PLURAL,
name=self.name,
),
)
def update(self, current: TraefikMiddleware, reference: TraefikMiddleware):
return self.reconciler.update(current, reference)
return self.api.patch_namespaced_custom_object(
group=CRD_GROUP,
version=CRD_VERSION,
namespace=self.namespace,
plural=CRD_PLURAL,
name=self.name,
body=asdict(reference),
field_manager=FIELD_MANAGER,
)
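For orientation: the dataclasses defined in this hunk are plain containers that asdict() turns into the custom-resource body sent to the Kubernetes API. A rough sketch of how that body is assembled, reusing the definitions above (name, namespace and the shortened header list are placeholders):

from dataclasses import asdict

middleware = TraefikMiddleware(
    apiVersion=f"{CRD_GROUP}/{CRD_VERSION}",  # "traefik.containo.us/v1alpha1"
    kind="Middleware",
    metadata=TraefikMiddlewareMetadata(name="ak-outpost-example", namespace="authentik"),
    spec=TraefikMiddlewareSpec(
        forwardAuth=TraefikMiddlewareSpecForwardAuth(
            address="http://ak-outpost-example.authentik:9000/outpost.goauthentik.io/auth/traefik",
            # subset of the headers from get_reference_object(), for brevity
            authResponseHeaders=["X-authentik-username", "X-authentik-groups"],
        )
    ),
)
# asdict() recursively converts the nested dataclasses into the plain dict that is
# passed as body= to create_namespaced_custom_object / patch_namespaced_custom_object.
body = asdict(middleware)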

View File

@ -1,18 +0,0 @@
"""Kubernetes Traefik Middleware Reconciler"""
from typing import TYPE_CHECKING
from authentik.providers.proxy.controllers.k8s.traefik_3 import Traefik3MiddlewareReconciler
if TYPE_CHECKING:
from authentik.outposts.controllers.kubernetes import KubernetesController
class Traefik2MiddlewareReconciler(Traefik3MiddlewareReconciler):
"""Kubernetes Traefik Middleware Reconciler"""
def __init__(self, controller: "KubernetesController") -> None:
super().__init__(controller)
self.crd_name = "middlewares.traefik.containo.us"
self.crd_group = "traefik.containo.us"
self.crd_version = "v1alpha1"
self.crd_plural = "middlewares"

View File

@ -1,183 +0,0 @@
"""Kubernetes Traefik Middleware Reconciler"""
from dataclasses import asdict, dataclass, field
from typing import TYPE_CHECKING
from dacite.core import from_dict
from kubernetes.client import ApiextensionsV1Api, CustomObjectsApi
from authentik.outposts.controllers.base import FIELD_MANAGER
from authentik.outposts.controllers.k8s.base import KubernetesObjectReconciler
from authentik.outposts.controllers.k8s.triggers import NeedsUpdate
from authentik.providers.proxy.models import ProxyMode, ProxyProvider
if TYPE_CHECKING:
from authentik.outposts.controllers.kubernetes import KubernetesController
@dataclass
class TraefikMiddlewareSpecForwardAuth:
"""traefik middleware forwardAuth spec"""
address: str
# pylint: disable=invalid-name
authResponseHeadersRegex: str = field(default="")
# pylint: disable=invalid-name
authResponseHeaders: list[str] = field(default_factory=list)
# pylint: disable=invalid-name
trustForwardHeader: bool = field(default=True)
@dataclass
class TraefikMiddlewareSpec:
"""Traefik middleware spec"""
# pylint: disable=invalid-name
forwardAuth: TraefikMiddlewareSpecForwardAuth
@dataclass
class TraefikMiddlewareMetadata:
"""Traefik Middleware metadata"""
name: str
namespace: str
labels: dict = field(default_factory=dict)
@dataclass
class TraefikMiddleware:
"""Traefik Middleware"""
# pylint: disable=invalid-name
apiVersion: str
kind: str
metadata: TraefikMiddlewareMetadata
spec: TraefikMiddlewareSpec
class Traefik3MiddlewareReconciler(KubernetesObjectReconciler[TraefikMiddleware]):
"""Kubernetes Traefik Middleware Reconciler"""
def __init__(self, controller: "KubernetesController") -> None:
super().__init__(controller)
self.api_ex = ApiextensionsV1Api(controller.client)
self.api = CustomObjectsApi(controller.client)
self.crd_name = "middlewares.traefik.io"
self.crd_group = "traefik.io"
self.crd_version = "v1alpha1"
self.crd_plural = "middlewares"
@property
def noop(self) -> bool:
if not ProxyProvider.objects.filter(
outpost__in=[self.controller.outpost],
mode__in=[ProxyMode.FORWARD_SINGLE, ProxyMode.FORWARD_DOMAIN],
).exists():
self.logger.debug("No providers with forward auth enabled.")
return True
if not self.crd_exists():
self.logger.debug("CRD doesn't exist")
return True
return False
def crd_exists(self) -> bool:
"""Check if the traefik middleware exists"""
return bool(
len(
self.api_ex.list_custom_resource_definition(
field_selector=f"metadata.name={self.crd_name}"
).items
)
)
def reconcile(self, current: TraefikMiddleware, reference: TraefikMiddleware):
super().reconcile(current, reference)
if current.spec.forwardAuth.address != reference.spec.forwardAuth.address:
raise NeedsUpdate()
if (
current.spec.forwardAuth.authResponseHeadersRegex
!= reference.spec.forwardAuth.authResponseHeadersRegex
):
raise NeedsUpdate()
# Ensure all of our headers are set, others can be added by the user.
if not set(current.spec.forwardAuth.authResponseHeaders).issubset(
reference.spec.forwardAuth.authResponseHeaders
):
raise NeedsUpdate()
def get_reference_object(self) -> TraefikMiddleware:
"""Get deployment object for outpost"""
return TraefikMiddleware(
apiVersion=f"{self.crd_group}/{self.crd_version}",
kind="Middleware",
metadata=TraefikMiddlewareMetadata(
name=self.name,
namespace=self.namespace,
labels=self.get_object_meta().labels,
),
spec=TraefikMiddlewareSpec(
forwardAuth=TraefikMiddlewareSpecForwardAuth(
address=(
f"http://{self.name}.{self.namespace}:9000/"
"outpost.goauthentik.io/auth/traefik"
),
authResponseHeaders=[
"X-authentik-username",
"X-authentik-groups",
"X-authentik-email",
"X-authentik-name",
"X-authentik-uid",
"X-authentik-jwt",
"X-authentik-meta-jwks",
"X-authentik-meta-outpost",
"X-authentik-meta-provider",
"X-authentik-meta-app",
"X-authentik-meta-version",
],
authResponseHeadersRegex="",
trustForwardHeader=True,
)
),
)
def create(self, reference: TraefikMiddleware):
return self.api.create_namespaced_custom_object(
group=self.crd_group,
version=self.crd_version,
plural=self.crd_plural,
namespace=self.namespace,
body=asdict(reference),
field_manager=FIELD_MANAGER,
)
def delete(self, reference: TraefikMiddleware):
return self.api.delete_namespaced_custom_object(
group=self.crd_group,
version=self.crd_version,
plural=self.crd_plural,
namespace=self.namespace,
name=self.name,
)
def retrieve(self) -> TraefikMiddleware:
return from_dict(
TraefikMiddleware,
self.api.get_namespaced_custom_object(
group=self.crd_group,
version=self.crd_version,
plural=self.crd_plural,
namespace=self.namespace,
name=self.name,
),
)
def update(self, current: TraefikMiddleware, reference: TraefikMiddleware):
return self.api.patch_namespaced_custom_object(
group=self.crd_group,
version=self.crd_version,
plural=self.crd_plural,
namespace=self.namespace,
name=self.name,
body=asdict(reference),
field_manager=FIELD_MANAGER,
)

View File

@ -2,6 +2,6 @@
from authentik.providers.proxy.api import ProxyOutpostConfigViewSet, ProxyProviderViewSet
api_urlpatterns = [
("outposts/proxy", ProxyOutpostConfigViewSet, "proxyprovideroutpost"),
("outposts/proxy", ProxyOutpostConfigViewSet),
("providers/proxy", ProxyProviderViewSet),
]

View File

@ -2,6 +2,6 @@
from authentik.providers.radius.api import RadiusOutpostConfigViewSet, RadiusProviderViewSet
api_urlpatterns = [
("outposts/radius", RadiusOutpostConfigViewSet, "radiusprovideroutpost"),
("outposts/radius", RadiusOutpostConfigViewSet),
("providers/radius", RadiusProviderViewSet),
]

View File

@ -31,7 +31,7 @@ ERROR_SIGNATURE_REQUIRED_BUT_ABSENT = (
ERROR_FAILED_TO_VERIFY = "Failed to verify signature"
@dataclass(slots=True)
@dataclass
class AuthNRequest:
"""AuthNRequest Dataclass"""

View File

@ -12,7 +12,7 @@ from authentik.providers.saml.utils.encoding import decode_base64_and_inflate
from authentik.sources.saml.processors.constants import NS_SAML_PROTOCOL
@dataclass(slots=True)
@dataclass
class LogoutRequest:
"""Logout Request"""

View File

@ -35,7 +35,7 @@ def format_pem_certificate(unformatted_cert: str) -> str:
return "\n".join(lines)
@dataclass(slots=True)
@dataclass
class ServiceProviderMetadata:
"""SP Metadata Dataclass"""

View File

@ -90,7 +90,6 @@ class TestAuthNRequest(TestCase):
issuer="authentik",
pre_authentication_flow=create_test_flow(),
signing_kp=cert,
verification_kp=cert,
)
def test_signed_valid(self):

View File

@ -130,7 +130,11 @@ class LivenessProbe(bootsteps.StartStopStep):
HEARTBEAT_FILE.touch()
CELERY_APP.config_from_object(settings.CELERY)
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
CELERY_APP.config_from_object(settings, namespace="CELERY")
# Load task modules from all registered Django app configs.
CELERY_APP.autodiscover_tasks()
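These two config_from_object calls are the two standard ways of wiring Django settings into Celery: the first passes a single settings.CELERY dict, while namespace="CELERY" makes Celery read every CELERY_-prefixed Django setting, stripping the prefix and lower-casing the rest. A minimal sketch of how the two styles line up (the broker URL is a placeholder); the settings.py hunk further down makes the same conversion:

# Style A, consumed via CELERY_APP.config_from_object(settings.CELERY):
CELERY = {
    "task_default_queue": "authentik",
    "broker_url": "redis://localhost:6379/0",  # placeholder
}

# Style B, consumed via CELERY_APP.config_from_object(settings, namespace="CELERY"):
# Celery strips the CELERY_ prefix and lower-cases the remainder, so these two
# settings configure exactly the same options as the dict above.
CELERY_TASK_DEFAULT_QUEUE = "authentik"
CELERY_BROKER_URL = "redis://localhost:6379/0"  # placeholder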

View File

@ -1,41 +0,0 @@
"""install ID"""
from functools import lru_cache
from uuid import uuid4
from psycopg2 import connect
from authentik.lib.config import CONFIG
@lru_cache
def get_install_id() -> str:
"""Get install ID of this instance. The method is cached as the install ID is
not expected to change"""
from django.conf import settings
from django.db import connection
if settings.TEST:
return str(uuid4())
with connection.cursor() as cursor:
cursor.execute("SELECT id FROM authentik_install_id LIMIT 1;")
return cursor.fetchone()[0]
@lru_cache
def get_install_id_raw():
"""Get install_id without django loaded, this is required for the startup when we get
the install_id but django isn't loaded yet and we can't use the function above."""
conn = connect(
dbname=CONFIG.y("postgresql.name"),
user=CONFIG.y("postgresql.user"),
password=CONFIG.y("postgresql.password"),
host=CONFIG.y("postgresql.host"),
port=int(CONFIG.y("postgresql.port")),
sslmode=CONFIG.y("postgresql.sslmode"),
sslrootcert=CONFIG.y("postgresql.sslrootcert"),
sslcert=CONFIG.y("postgresql.sslcert"),
sslkey=CONFIG.y("postgresql.sslkey"),
)
cursor = conn.cursor()
cursor.execute("SELECT id FROM authentik_install_id LIMIT 1;")
return cursor.fetchone()[0]

View File

@ -182,13 +182,13 @@ REST_FRAMEWORK = {
},
}
_redis_protocol_prefix = "redis://"
_redis_celery_tls_requirements = ""
REDIS_PROTOCOL_PREFIX = "redis://"
REDIS_CELERY_TLS_REQUIREMENTS = ""
if CONFIG.y_bool("redis.tls", False):
_redis_protocol_prefix = "rediss://"
_redis_celery_tls_requirements = f"?ssl_cert_reqs={CONFIG.y('redis.tls_reqs')}"
REDIS_PROTOCOL_PREFIX = "rediss://"
REDIS_CELERY_TLS_REQUIREMENTS = f"?ssl_cert_reqs={CONFIG.y('redis.tls_reqs')}"
_redis_url = (
f"{_redis_protocol_prefix}:"
f"{REDIS_PROTOCOL_PREFIX}:"
f"{quote_plus(CONFIG.y('redis.password'))}@{quote_plus(CONFIG.y('redis.host'))}:"
f"{int(CONFIG.y('redis.port'))}"
)
@ -326,27 +326,27 @@ USE_TZ = True
LOCALE_PATHS = ["./locale"]
CELERY = {
"task_soft_time_limit": 600,
"worker_max_tasks_per_child": 50,
"worker_concurrency": 2,
"beat_schedule": {
"clean_expired_models": {
"task": "authentik.core.tasks.clean_expired_models",
"schedule": crontab(minute="2-59/5"),
"options": {"queue": "authentik_scheduled"},
},
"user_cleanup": {
"task": "authentik.core.tasks.clean_temporary_users",
"schedule": crontab(minute="9-59/5"),
"options": {"queue": "authentik_scheduled"},
},
# Celery settings
# Add a 10 minute timeout to all Celery tasks.
CELERY_TASK_SOFT_TIME_LIMIT = 600
CELERY_WORKER_MAX_TASKS_PER_CHILD = 50
CELERY_WORKER_CONCURRENCY = 2
CELERY_BEAT_SCHEDULE = {
"clean_expired_models": {
"task": "authentik.core.tasks.clean_expired_models",
"schedule": crontab(minute="2-59/5"),
"options": {"queue": "authentik_scheduled"},
},
"user_cleanup": {
"task": "authentik.core.tasks.clean_temporary_users",
"schedule": crontab(minute="9-59/5"),
"options": {"queue": "authentik_scheduled"},
},
"task_create_missing_queues": True,
"task_default_queue": "authentik",
"broker_url": f"{_redis_url}/{CONFIG.y('redis.db')}{_redis_celery_tls_requirements}",
"result_backend": f"{_redis_url}/{CONFIG.y('redis.db')}{_redis_celery_tls_requirements}",
}
CELERY_TASK_CREATE_MISSING_QUEUES = True
CELERY_TASK_DEFAULT_QUEUE = "authentik"
CELERY_BROKER_URL = f"{_redis_url}/{CONFIG.y('redis.db')}{REDIS_CELERY_TLS_REQUIREMENTS}"
CELERY_RESULT_BACKEND = f"{_redis_url}/{CONFIG.y('redis.db')}{REDIS_CELERY_TLS_REQUIREMENTS}"
# Sentry integration
env = get_env()
@ -455,7 +455,7 @@ _DISALLOWED_ITEMS = [
"INSTALLED_APPS",
"MIDDLEWARE",
"AUTHENTICATION_BACKENDS",
"CELERY",
"CELERY_BEAT_SCHEDULE",
]
@ -466,7 +466,7 @@ def _update_settings(app_path: str):
INSTALLED_APPS.extend(getattr(settings_module, "INSTALLED_APPS", []))
MIDDLEWARE.extend(getattr(settings_module, "MIDDLEWARE", []))
AUTHENTICATION_BACKENDS.extend(getattr(settings_module, "AUTHENTICATION_BACKENDS", []))
CELERY["beat_schedule"].update(getattr(settings_module, "CELERY_BEAT_SCHEDULE", {}))
CELERY_BEAT_SCHEDULE.update(getattr(settings_module, "CELERY_BEAT_SCHEDULE", {}))
for _attr in dir(settings_module):
if not _attr.startswith("__") and _attr not in _DISALLOWED_ITEMS:
globals()[_attr] = getattr(settings_module, _attr)
@ -482,7 +482,7 @@ for _app in INSTALLED_APPS:
_update_settings("data.user_settings")
if DEBUG:
CELERY["task_always_eager"] = True
CELERY_TASK_ALWAYS_EAGER = True
os.environ[ENV_GIT_HASH_KEY] = "dev"
INSTALLED_APPS.append("silk")
SILKY_PYTHON_PROFILER = True

View File

@ -30,7 +30,7 @@ class PytestTestRunner: # pragma: no cover
self.args.append(f"--randomly-seed={kwargs['randomly_seed']}")
settings.TEST = True
settings.CELERY["task_always_eager"] = True
settings.CELERY_TASK_ALWAYS_EAGER = True
CONFIG.y_set("avatars", "none")
CONFIG.y_set("geoip", "tests/GeoLite2-City-Test.mmdb")
CONFIG.y_set("blueprints_dir", "./blueprints")

View File

@ -8,7 +8,6 @@ from drf_spectacular.utils import extend_schema, extend_schema_field, inline_ser
from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError
from rest_framework.fields import DictField, ListField
from rest_framework.relations import PrimaryKeyRelatedField
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ModelViewSet
@ -17,7 +16,6 @@ from authentik.admin.api.tasks import TaskSerializer
from authentik.core.api.propertymappings import PropertyMappingSerializer
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.crypto.models import CertificateKeyPair
from authentik.events.monitored_tasks import TaskInfo
from authentik.sources.ldap.models import LDAPPropertyMapping, LDAPSource
from authentik.sources.ldap.tasks import SYNC_CLASSES
@ -26,15 +24,6 @@ from authentik.sources.ldap.tasks import SYNC_CLASSES
class LDAPSourceSerializer(SourceSerializer):
"""LDAP Source Serializer"""
client_certificate = PrimaryKeyRelatedField(
allow_null=True,
help_text="Client certificate to authenticate against the LDAP Server's Certificate.",
queryset=CertificateKeyPair.objects.exclude(
key_data__exact="",
),
required=False,
)
def validate(self, attrs: dict[str, Any]) -> dict[str, Any]:
"""Check that only a single source has password_sync on"""
sync_users_password = attrs.get("sync_users_password", True)
@ -53,11 +42,9 @@ class LDAPSourceSerializer(SourceSerializer):
fields = SourceSerializer.Meta.fields + [
"server_uri",
"peer_certificate",
"client_certificate",
"bind_cn",
"bind_password",
"start_tls",
"sni",
"base_dn",
"additional_user_dn",
"additional_group_dn",
@ -88,9 +75,7 @@ class LDAPSourceViewSet(UsedByMixin, ModelViewSet):
"server_uri",
"bind_cn",
"peer_certificate",
"client_certificate",
"start_tls",
"sni",
"base_dn",
"additional_user_dn",
"additional_group_dn",
@ -118,9 +103,10 @@ class LDAPSourceViewSet(UsedByMixin, ModelViewSet):
"""Get source's sync status"""
source = self.get_object()
results = []
tasks = TaskInfo.by_name(f"ldap_sync:{source.slug}:*")
if tasks:
for task in tasks:
for sync_class in SYNC_CLASSES:
sync_name = sync_class.__name__.replace("LDAPSynchronizer", "").lower()
task = TaskInfo.by_name(f"ldap_sync:{source.slug}:{sync_name}")
if task:
results.append(task)
return Response(TaskSerializer(results, many=True).data)
@ -142,7 +128,7 @@ class LDAPSourceViewSet(UsedByMixin, ModelViewSet):
source = self.get_object()
all_objects = {}
for sync_class in SYNC_CLASSES:
class_name = sync_class.name()
class_name = sync_class.__name__.replace("LDAPSynchronizer", "").lower()
all_objects.setdefault(class_name, [])
for obj in sync_class(source).get_objects(size_limit=10):
obj: dict

View File

@ -57,13 +57,13 @@ class LDAPBackend(InbuiltBackend):
# Try to bind as new user
LOGGER.debug("Attempting to bind as user", user=user)
try:
# source.connection also attempts to bind
source.connection(
temp_connection = source.connection(
connection_kwargs={
"user": user.attributes.get(LDAP_DISTINGUISHED_NAME),
"password": password,
}
)
temp_connection.bind()
return user
except LDAPInvalidCredentialsResult as exc:
LOGGER.debug("invalid LDAP credentials", user=user, exc=exc)

View File

@ -2,8 +2,9 @@
from django.core.management.base import BaseCommand
from structlog.stdlib import get_logger
from authentik.lib.utils.reflection import class_to_path
from authentik.sources.ldap.models import LDAPSource
from authentik.sources.ldap.tasks import ldap_sync_single
from authentik.sources.ldap.tasks import SYNC_CLASSES, ldap_sync
LOGGER = get_logger()
@ -20,4 +21,7 @@ class Command(BaseCommand):
if not source:
LOGGER.warning("Source does not exist", slug=source_slug)
continue
ldap_sync_single(source)
for sync_class in SYNC_CLASSES:
LOGGER.info("Starting sync", cls=sync_class)
# pylint: disable=no-value-for-parameter
ldap_sync(source.pk, class_to_path(sync_class))

View File

@ -1,45 +0,0 @@
# Generated by Django 4.1.7 on 2023-06-06 18:33
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_crypto", "0004_alter_certificatekeypair_name"),
("authentik_sources_ldap", "0002_auto_20211203_0900"),
]
operations = [
migrations.AddField(
model_name="ldapsource",
name="client_certificate",
field=models.ForeignKey(
default=None,
help_text="Client certificate to authenticate against the LDAP Server's Certificate.",
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
related_name="ldap_client_certificates",
to="authentik_crypto.certificatekeypair",
),
),
migrations.AddField(
model_name="ldapsource",
name="sni",
field=models.BooleanField(
default=False, verbose_name="Use Server URI for SNI verification"
),
),
migrations.AlterField(
model_name="ldapsource",
name="peer_certificate",
field=models.ForeignKey(
default=None,
help_text="Optionally verify the LDAP Server's Certificate against the CA Chain in this keypair.",
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
related_name="ldap_peer_certificates",
to="authentik_crypto.certificatekeypair",
),
),
]

View File

@ -1,13 +1,11 @@
"""authentik LDAP Models"""
from os import chmod
from ssl import CERT_REQUIRED
from tempfile import NamedTemporaryFile, mkdtemp
from typing import Optional
from django.db import models
from django.utils.translation import gettext_lazy as _
from ldap3 import ALL, NONE, RANDOM, Connection, Server, ServerPool, Tls
from ldap3.core.exceptions import LDAPInsufficientAccessRightsResult, LDAPSchemaError
from ldap3.core.exceptions import LDAPSchemaError
from rest_framework.serializers import Serializer
from authentik.core.models import Group, PropertyMapping, Source
@ -41,24 +39,14 @@ class LDAPSource(Source):
on_delete=models.SET_DEFAULT,
default=None,
null=True,
related_name="ldap_peer_certificates",
help_text=_(
"Optionally verify the LDAP Server's Certificate against the CA Chain in this keypair."
),
)
client_certificate = models.ForeignKey(
CertificateKeyPair,
on_delete=models.SET_DEFAULT,
default=None,
null=True,
related_name="ldap_client_certificates",
help_text=_("Client certificate to authenticate against the LDAP Server's Certificate."),
)
bind_cn = models.TextField(verbose_name=_("Bind CN"), blank=True)
bind_password = models.TextField(blank=True)
start_tls = models.BooleanField(default=False, verbose_name=_("Enable Start TLS"))
sni = models.BooleanField(default=False, verbose_name=_("Use Server URI for SNI verification"))
base_dn = models.TextField(verbose_name=_("Base DN"))
additional_user_dn = models.TextField(
@ -124,22 +112,8 @@ class LDAPSource(Source):
if self.peer_certificate:
tls_kwargs["ca_certs_data"] = self.peer_certificate.certificate_data
tls_kwargs["validate"] = CERT_REQUIRED
if self.client_certificate:
temp_dir = mkdtemp()
with NamedTemporaryFile(mode="w", delete=False, dir=temp_dir) as temp_cert:
temp_cert.write(self.client_certificate.certificate_data)
certificate_file = temp_cert.name
chmod(certificate_file, 0o600)
with NamedTemporaryFile(mode="w", delete=False, dir=temp_dir) as temp_key:
temp_key.write(self.client_certificate.key_data)
private_key_file = temp_key.name
chmod(private_key_file, 0o600)
tls_kwargs["local_private_key_file"] = private_key_file
tls_kwargs["local_certificate_file"] = certificate_file
if ciphers := CONFIG.y("ldap.tls.ciphers", None):
tls_kwargs["ciphers"] = ciphers.strip()
if self.sni:
tls_kwargs["sni"] = self.server_uri.split(",", maxsplit=1)[0].strip()
server_kwargs = {
"get_info": ALL,
"connect_timeout": LDAP_TIMEOUT,
@ -151,7 +125,7 @@ class LDAPSource(Source):
servers.append(Server(server, **server_kwargs))
else:
servers = [Server(self.server_uri, **server_kwargs)]
return ServerPool(servers, RANDOM, active=5, exhaust=True)
return ServerPool(servers, RANDOM, active=True, exhaust=True)
def connection(
self, server_kwargs: Optional[dict] = None, connection_kwargs: Optional[dict] = None
@ -159,10 +133,8 @@ class LDAPSource(Source):
"""Get a fully connected and bound LDAP Connection"""
server_kwargs = server_kwargs or {}
connection_kwargs = connection_kwargs or {}
if self.bind_cn is not None:
connection_kwargs.setdefault("user", self.bind_cn)
if self.bind_password is not None:
connection_kwargs.setdefault("password", self.bind_password)
connection_kwargs.setdefault("user", self.bind_cn)
connection_kwargs.setdefault("password", self.bind_password)
connection = Connection(
self.server(**server_kwargs),
raise_exceptions=True,
@ -173,18 +145,15 @@ class LDAPSource(Source):
if self.start_tls:
connection.start_tls(read_server_info=False)
try:
successful = connection.bind()
if successful:
return connection
except (LDAPSchemaError, LDAPInsufficientAccessRightsResult) as exc:
connection.bind()
except LDAPSchemaError as exc:
# Schema error, so try connecting without schema info
# See https://github.com/goauthentik/authentik/issues/4590
# See also https://github.com/goauthentik/authentik/issues/3399
if server_kwargs.get("get_info", ALL) == NONE:
raise exc
server_kwargs["get_info"] = NONE
return self.connection(server_kwargs, connection_kwargs)
raise RuntimeError("Failed to bind")
return connection
class Meta:
verbose_name = _("LDAP Source")

View File

@ -4,7 +4,7 @@ from re import split
from typing import Optional
from ldap3 import BASE
from ldap3.core.exceptions import LDAPAttributeError, LDAPUnwillingToPerformResult
from ldap3.core.exceptions import LDAPAttributeError
from structlog.stdlib import get_logger
from authentik.core.models import User
@ -69,7 +69,7 @@ class LDAPPasswordChanger:
attributes=["pwdProperties"],
)
root_attrs = list(root_attrs)[0]
except (LDAPAttributeError, LDAPUnwillingToPerformResult, KeyError, IndexError):
except (LDAPAttributeError, KeyError, IndexError):
return False
raw_pwd_properties = root_attrs.get("attributes", {}).get("pwdProperties", None)
if not raw_pwd_properties:
@ -92,7 +92,7 @@ class LDAPPasswordChanger:
return
try:
self._connection.extend.microsoft.modify_password(user_dn, password)
except (LDAPAttributeError, LDAPUnwillingToPerformResult):
except LDAPAttributeError:
self._connection.extend.standard.modify_password(user_dn, new_password=password)
def _ad_check_password_existing(self, password: str, user_dn: str) -> bool:

View File

@ -12,9 +12,13 @@ from authentik.core.models import User
from authentik.core.signals import password_changed
from authentik.events.models import Event, EventAction
from authentik.flows.planner import PLAN_CONTEXT_PENDING_USER
from authentik.lib.utils.reflection import class_to_path
from authentik.sources.ldap.models import LDAPSource
from authentik.sources.ldap.password import LDAPPasswordChanger
from authentik.sources.ldap.tasks import ldap_sync_single
from authentik.sources.ldap.sync.groups import GroupLDAPSynchronizer
from authentik.sources.ldap.sync.membership import MembershipLDAPSynchronizer
from authentik.sources.ldap.sync.users import UserLDAPSynchronizer
from authentik.sources.ldap.tasks import ldap_sync
from authentik.stages.prompt.signals import password_validate
LOGGER = get_logger()
@ -31,7 +35,12 @@ def sync_ldap_source_on_save(sender, instance: LDAPSource, **_):
# and the mappings are created with an m2m event
if not instance.property_mappings.exists() or not instance.property_mappings_group.exists():
return
ldap_sync_single.delay(instance.pk)
for sync_class in [
UserLDAPSynchronizer,
GroupLDAPSynchronizer,
MembershipLDAPSynchronizer,
]:
ldap_sync.delay(instance.pk, class_to_path(sync_class))
@receiver(password_validate)

View File

@ -1,15 +1,13 @@
"""Sync LDAP Users and groups into authentik"""
from typing import Any, Generator
from django.conf import settings
from django.db.models.base import Model
from django.db.models.query import QuerySet
from ldap3 import DEREF_ALWAYS, SUBTREE, Connection
from ldap3 import Connection
from structlog.stdlib import BoundLogger, get_logger
from authentik.core.exceptions import PropertyMappingExpressionException
from authentik.events.models import Event, EventAction
from authentik.lib.config import CONFIG
from authentik.lib.merge import MERGE_LIST_UNIQUE
from authentik.sources.ldap.auth import LDAP_DISTINGUISHED_NAME
from authentik.sources.ldap.models import LDAPPropertyMapping, LDAPSource
@ -31,24 +29,6 @@ class BaseLDAPSynchronizer:
self._messages = []
self._logger = get_logger().bind(source=source, syncer=self.__class__.__name__)
@staticmethod
def name() -> str:
"""UI name for the type of object this class synchronizes"""
raise NotImplementedError
def sync_full(self):
"""Run full sync, this function should only be used in tests"""
if not settings.TEST: # noqa
raise RuntimeError(
f"{self.__class__.__name__}.sync_full() should only be used in tests"
)
for page in self.get_objects():
self.sync(page)
def sync(self, page_data: list) -> int:
"""Sync function, implemented in subclass"""
raise NotImplementedError()
@property
def messages(self) -> list[str]:
"""Get all UI messages"""
@ -80,47 +60,9 @@ class BaseLDAPSynchronizer:
"""Get objects from LDAP, implemented in subclass"""
raise NotImplementedError()
# pylint: disable=too-many-arguments
def search_paginator(
self,
search_base,
search_filter,
search_scope=SUBTREE,
dereference_aliases=DEREF_ALWAYS,
attributes=None,
size_limit=0,
time_limit=0,
types_only=False,
get_operational_attributes=False,
controls=None,
paged_size=int(CONFIG.y("ldap.page_size", 50)),
paged_criticality=False,
):
"""Search in pages, returns each page"""
cookie = True
while cookie:
self._connection.search(
search_base,
search_filter,
search_scope,
dereference_aliases,
attributes,
size_limit,
time_limit,
types_only,
get_operational_attributes,
controls,
paged_size,
paged_criticality,
None if cookie is True else cookie,
)
try:
cookie = self._connection.result["controls"]["1.2.840.113556.1.4.319"]["value"][
"cookie"
]
except KeyError:
cookie = None
yield self._connection.response
def sync(self) -> int:
"""Sync function, implemented in subclass"""
raise NotImplementedError()
def _flatten(self, value: Any) -> Any:
"""Flatten `value` if its a list"""

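The removed search_paginator above drives LDAP Simple Paged Results (control OID 1.2.840.113556.1.4.319) by hand, re-issuing the search until the server stops returning a cookie and yielding one page at a time. The restored synchronizers instead rely on ldap3's built-in generator, as the groups/membership/users hunks below show. A minimal standalone sketch of that call (server, credentials and search parameters are placeholders):

from ldap3 import ALL, SUBTREE, Connection, Server

server = Server("ldap://ldap.example.com", get_info=ALL)
conn = Connection(server, user="cn=admin,dc=example,dc=com", password="password", auto_bind=True)

# paged_search handles the paged-results cookie internally and lazily yields
# entries, fetching paged_size results per request.
entries = conn.extend.standard.paged_search(
    search_base="ou=users,dc=example,dc=com",
    search_filter="(objectClass=person)",
    search_scope=SUBTREE,
    attributes=["cn", "mail"],
    paged_size=50,
    generator=True,
)
for entry in entries:
    print(entry.get("dn"))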
View File

@ -13,12 +13,8 @@ from authentik.sources.ldap.sync.base import LDAP_UNIQUENESS, BaseLDAPSynchroniz
class GroupLDAPSynchronizer(BaseLDAPSynchronizer):
"""Sync LDAP Users and groups into authentik"""
@staticmethod
def name() -> str:
return "groups"
def get_objects(self, **kwargs) -> Generator:
return self.search_paginator(
return self._connection.extend.standard.paged_search(
search_base=self.base_dn_groups,
search_filter=self._source.group_object_filter,
search_scope=SUBTREE,
@ -26,13 +22,13 @@ class GroupLDAPSynchronizer(BaseLDAPSynchronizer):
**kwargs,
)
def sync(self, page_data: list) -> int:
def sync(self) -> int:
"""Iterate over all LDAP Groups and create authentik_core.Group instances"""
if not self._source.sync_groups:
self.message("Group syncing is disabled for this Source")
return -1
group_count = 0
for group in page_data:
for group in self.get_objects():
if "attributes" not in group:
continue
attributes = group.get("attributes", {})

View File

@ -19,12 +19,8 @@ class MembershipLDAPSynchronizer(BaseLDAPSynchronizer):
super().__init__(source)
self.group_cache: dict[str, Group] = {}
@staticmethod
def name() -> str:
return "membership"
def get_objects(self, **kwargs) -> Generator:
return self.search_paginator(
return self._connection.extend.standard.paged_search(
search_base=self.base_dn_groups,
search_filter=self._source.group_object_filter,
search_scope=SUBTREE,
@ -36,13 +32,13 @@ class MembershipLDAPSynchronizer(BaseLDAPSynchronizer):
**kwargs,
)
def sync(self, page_data: list) -> int:
def sync(self) -> int:
"""Iterate over all Users and assign Groups using memberOf Field"""
if not self._source.sync_groups:
self.message("Group syncing is disabled for this Source")
return -1
membership_count = 0
for group in page_data:
for group in self.get_objects():
if "attributes" not in group:
continue
members = group.get("attributes", {}).get(self._source.group_membership_field, [])

View File

@ -15,12 +15,8 @@ from authentik.sources.ldap.sync.vendor.ms_ad import MicrosoftActiveDirectory
class UserLDAPSynchronizer(BaseLDAPSynchronizer):
"""Sync LDAP Users into authentik"""
@staticmethod
def name() -> str:
return "users"
def get_objects(self, **kwargs) -> Generator:
return self.search_paginator(
return self._connection.extend.standard.paged_search(
search_base=self.base_dn_users,
search_filter=self._source.user_object_filter,
search_scope=SUBTREE,
@ -28,13 +24,13 @@ class UserLDAPSynchronizer(BaseLDAPSynchronizer):
**kwargs,
)
def sync(self, page_data: list) -> int:
def sync(self) -> int:
"""Iterate over all LDAP Users and create authentik_core.User instances"""
if not self._source.sync_users:
self.message("User syncing is disabled for this Source")
return -1
user_count = 0
for user in page_data:
for user in self.get_objects():
if "attributes" not in user:
continue
attributes = user.get("attributes", {})

View File

@ -11,10 +11,6 @@ from authentik.sources.ldap.sync.base import BaseLDAPSynchronizer
class FreeIPA(BaseLDAPSynchronizer):
"""FreeIPA-specific LDAP"""
@staticmethod
def name() -> str:
return "freeipa"
def get_objects(self, **kwargs) -> Generator:
yield None

View File

@ -42,10 +42,6 @@ class UserAccountControl(IntFlag):
class MicrosoftActiveDirectory(BaseLDAPSynchronizer):
"""Microsoft-specific LDAP"""
@staticmethod
def name() -> str:
return "microsoft_ad"
def get_objects(self, **kwargs) -> Generator:
yield None

View File

@ -1,8 +1,4 @@
"""LDAP Sync tasks"""
from uuid import uuid4
from celery import chain, group
from django.core.cache import cache
from ldap3.core.exceptions import LDAPException
from structlog.stdlib import get_logger
@ -12,7 +8,6 @@ from authentik.lib.utils.errors import exception_to_string
from authentik.lib.utils.reflection import class_to_path, path_to_class
from authentik.root.celery import CELERY_APP
from authentik.sources.ldap.models import LDAPSource
from authentik.sources.ldap.sync.base import BaseLDAPSynchronizer
from authentik.sources.ldap.sync.groups import GroupLDAPSynchronizer
from authentik.sources.ldap.sync.membership import MembershipLDAPSynchronizer
from authentik.sources.ldap.sync.users import UserLDAPSynchronizer
@ -23,46 +18,14 @@ SYNC_CLASSES = [
GroupLDAPSynchronizer,
MembershipLDAPSynchronizer,
]
CACHE_KEY_PREFIX = "goauthentik.io/sources/ldap/page/"
@CELERY_APP.task()
def ldap_sync_all():
"""Sync all sources"""
for source in LDAPSource.objects.filter(enabled=True):
ldap_sync_single(source.pk)
@CELERY_APP.task()
def ldap_sync_single(source_pk: str):
"""Sync a single source"""
source: LDAPSource = LDAPSource.objects.filter(pk=source_pk).first()
if not source:
return
task = chain(
# User and group sync can happen at once, they have no dependencies on each other
group(
ldap_sync_paginator(source, UserLDAPSynchronizer)
+ ldap_sync_paginator(source, GroupLDAPSynchronizer),
),
# Membership sync needs to run afterwards
group(
ldap_sync_paginator(source, MembershipLDAPSynchronizer),
),
)
task()
def ldap_sync_paginator(source: LDAPSource, sync: type[BaseLDAPSynchronizer]) -> list:
"""Return a list of task signatures with LDAP pagination data"""
sync_inst: BaseLDAPSynchronizer = sync(source)
signatures = []
for page in sync_inst.get_objects():
page_cache_key = CACHE_KEY_PREFIX + str(uuid4())
cache.set(page_cache_key, page)
page_sync = ldap_sync.si(source.pk, class_to_path(sync), page_cache_key)
signatures.append(page_sync)
return signatures
for sync_class in SYNC_CLASSES:
ldap_sync.delay(source.pk, class_to_path(sync_class))
@CELERY_APP.task(
@ -71,24 +34,20 @@ def ldap_sync_paginator(source: LDAPSource, sync: type[BaseLDAPSynchronizer]) ->
soft_time_limit=60 * 60 * int(CONFIG.y("ldap.task_timeout_hours")),
task_time_limit=60 * 60 * int(CONFIG.y("ldap.task_timeout_hours")),
)
def ldap_sync(self: MonitoredTask, source_pk: str, sync_class: str, page_cache_key: str):
def ldap_sync(self: MonitoredTask, source_pk: str, sync_class: str):
"""Synchronization of an LDAP Source"""
self.result_timeout_hours = int(CONFIG.y("ldap.task_timeout_hours"))
source: LDAPSource = LDAPSource.objects.filter(pk=source_pk).first()
if not source:
try:
source: LDAPSource = LDAPSource.objects.get(pk=source_pk)
except LDAPSource.DoesNotExist:
# Because the source couldn't be found, we don't have a UID
# to set the state with
return
sync: type[BaseLDAPSynchronizer] = path_to_class(sync_class)
uid = page_cache_key.replace(CACHE_KEY_PREFIX, "")
self.set_uid(f"{source.slug}:{sync.name()}:{uid}")
sync = path_to_class(sync_class)
self.set_uid(f"{source.slug}:{sync.__name__.replace('LDAPSynchronizer', '').lower()}")
try:
sync_inst: BaseLDAPSynchronizer = sync(source)
page = cache.get(page_cache_key)
if not page:
return
cache.touch(page_cache_key)
count = sync_inst.sync(page)
sync_inst = sync(source)
count = sync_inst.sync()
messages = sync_inst.messages
messages.append(f"Synced {count} objects.")
self.set_status(
@ -97,7 +56,6 @@ def ldap_sync(self: MonitoredTask, source_pk: str, sync_class: str, page_cache_k
messages,
)
)
cache.delete(page_cache_key)
except LDAPException as exc:
# No explicit event is created here as .set_status with an error will do that
LOGGER.warning(exception_to_string(exc))
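The removed ldap_sync_single above encodes the ordering constraint with Celery canvas primitives: user and group page tasks have no dependency on each other, so their signatures run concurrently inside one group, and the membership group is chained after it so it only starts once users and groups exist. A standalone sketch of the same pattern with placeholder task names (a reachable broker is assumed when it actually runs):

from celery import Celery, chain, group

app = Celery("sketch", broker="redis://localhost:6379/0")  # placeholder broker

@app.task
def sync_users(source_pk: str): ...

@app.task
def sync_groups(source_pk: str): ...

@app.task
def sync_membership(source_pk: str): ...

def sync_all(source_pk: str):
    # .si() builds immutable signatures, matching the ldap_sync.si(...) calls above,
    # so parent results are not passed into the next step.
    workflow = chain(
        group(sync_users.si(source_pk), sync_groups.si(source_pk)),
        group(sync_membership.si(source_pk)),
    )
    workflow()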

View File

@ -29,37 +29,6 @@ class LDAPSyncTests(TestCase):
additional_group_dn="ou=groups",
)
def test_auth_direct_user_ad(self):
"""Test direct auth"""
self.source.property_mappings.set(
LDAPPropertyMapping.objects.filter(
Q(managed__startswith="goauthentik.io/sources/ldap/default-")
| Q(managed__startswith="goauthentik.io/sources/ldap/ms-")
)
)
raw_conn = mock_ad_connection(LDAP_PASSWORD)
bind_mock = Mock(wraps=raw_conn.bind)
raw_conn.bind = bind_mock
connection = MagicMock(return_value=raw_conn)
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user = User.objects.get(username="user0_sn")
# auth_user_by_bind = Mock(return_value=user)
backend = LDAPBackend()
self.assertEqual(
backend.authenticate(None, username="user0_sn", password=LDAP_PASSWORD),
user,
)
connection.assert_called_with(
connection_kwargs={
"user": "cn=user0,ou=users,dc=goauthentik,dc=io",
"password": LDAP_PASSWORD,
}
)
bind_mock.assert_not_called()
def test_auth_synced_user_ad(self):
"""Test Cached auth"""
self.source.property_mappings.set(
@ -71,7 +40,7 @@ class LDAPSyncTests(TestCase):
connection = MagicMock(return_value=mock_ad_connection(LDAP_PASSWORD))
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user_sync.sync()
user = User.objects.get(username="user0_sn")
auth_user_by_bind = Mock(return_value=user)
@ -98,7 +67,7 @@ class LDAPSyncTests(TestCase):
connection = MagicMock(return_value=mock_slapd_connection(LDAP_PASSWORD))
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user_sync.sync()
user = User.objects.get(username="user0_sn")
auth_user_by_bind = Mock(return_value=user)

View File

@ -51,7 +51,7 @@ class LDAPSyncTests(TestCase):
connection = MagicMock(return_value=mock_ad_connection(LDAP_PASSWORD))
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user_sync.sync()
self.assertFalse(User.objects.filter(username="user0_sn").exists())
self.assertFalse(User.objects.filter(username="user1_sn").exists())
events = Event.objects.filter(
@ -87,7 +87,7 @@ class LDAPSyncTests(TestCase):
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user_sync.sync()
user = User.objects.filter(username="user0_sn").first()
self.assertEqual(user.attributes["foo"], "bar")
self.assertFalse(user.is_active)
@ -106,7 +106,7 @@ class LDAPSyncTests(TestCase):
connection = MagicMock(return_value=mock_slapd_connection(LDAP_PASSWORD))
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user_sync.sync()
self.assertTrue(User.objects.filter(username="user0_sn").exists())
self.assertFalse(User.objects.filter(username="user1_sn").exists())
@ -128,9 +128,9 @@ class LDAPSyncTests(TestCase):
self.source.sync_parent_group = parent_group
self.source.save()
group_sync = GroupLDAPSynchronizer(self.source)
group_sync.sync_full()
group_sync.sync()
membership_sync = MembershipLDAPSynchronizer(self.source)
membership_sync.sync_full()
membership_sync.sync()
group: Group = Group.objects.filter(name="test-group").first()
self.assertIsNotNone(group)
self.assertEqual(group.parent, parent_group)
@ -152,9 +152,9 @@ class LDAPSyncTests(TestCase):
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
self.source.save()
group_sync = GroupLDAPSynchronizer(self.source)
group_sync.sync_full()
group_sync.sync()
membership_sync = MembershipLDAPSynchronizer(self.source)
membership_sync.sync_full()
membership_sync.sync()
group = Group.objects.filter(name="group1")
self.assertTrue(group.exists())
@ -177,11 +177,11 @@ class LDAPSyncTests(TestCase):
with patch("authentik.sources.ldap.models.LDAPSource.connection", connection):
self.source.save()
user_sync = UserLDAPSynchronizer(self.source)
user_sync.sync_full()
user_sync.sync()
group_sync = GroupLDAPSynchronizer(self.source)
group_sync.sync_full()
group_sync.sync()
membership_sync = MembershipLDAPSynchronizer(self.source)
membership_sync.sync_full()
membership_sync.sync()
# Test if membership mapping based on memberUid works.
posix_group = Group.objects.filter(name="group-posix").first()
self.assertTrue(posix_group.users.filter(name="user-posix").exists())

View File

@ -1,8 +1,6 @@
"""OpenID Type tests"""
from django.test import RequestFactory, TestCase
from requests_mock import Mocker
from django.test import TestCase
from authentik.lib.generators import generate_id
from authentik.sources.oauth.models import OAuthSource
from authentik.sources.oauth.types.oidc import OpenIDConnectOAuth2Callback
@ -26,10 +24,9 @@ class TestTypeOpenID(TestCase):
slug="test",
provider_type="openidconnect",
authorization_url="",
profile_url="http://localhost/userinfo",
profile_url="",
consumer_key="",
)
self.factory = RequestFactory()
def test_enroll_context(self):
"""Test OpenID Enrollment context"""
@ -37,19 +34,3 @@ class TestTypeOpenID(TestCase):
self.assertEqual(ak_context["username"], OPENID_USER["nickname"])
self.assertEqual(ak_context["email"], OPENID_USER["email"])
self.assertEqual(ak_context["name"], OPENID_USER["name"])
@Mocker()
def test_userinfo(self, mock: Mocker):
"""Test userinfo API call"""
mock.get("http://localhost/userinfo", json=OPENID_USER)
token = generate_id()
OpenIDConnectOAuth2Callback(request=self.factory.get("/")).get_client(
self.source
).get_profile_info(
{
"token_type": "foo",
"access_token": token,
}
)
self.assertEqual(mock.last_request.query, "")
self.assertEqual(mock.last_request.headers["Authorization"], f"foo {token}")

Some files were not shown because too many files have changed in this diff.