Compare commits

..

59 Commits

Author SHA1 Message Date
e095e9f694 release: 2023.10.7 2024-01-29 17:44:48 +01:00
10e311534f security: fix CVE-2024-23647 (cherry-pick #8345) (#8347)
security: fix CVE-2024-23647 (#8345)

* security: fix CVE-2024-23647



* add tests



* add website



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-29 17:42:04 +01:00
46fdb45273 root: fix redis config not being updated to match previous change
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2024-01-29 14:02:18 +01:00
6d4125cb90 root: fix listen trusted_proxy_cidrs config loading from environment (#8075)
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
Signed-off-by: Jens Langhammer <jens@goauthentik.io>

# Conflicts:
#	go.mod
#	go.sum
#	internal/config/struct.go
2024-01-25 14:51:25 +01:00
bc83176962 stages/authenticator_validate: use friendly_name for stage selector when enrolling (cherry-pick #8255) (#8256)
stages/authenticator_validate: use friendly_name for stage selector when enrolling (#8255)

* stages/authenticator_validate: use friendly_name for stage selector when enrolling



* fix tests



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-22 16:27:46 +01:00
0fa8432b72 rbac: fix invitations listing with restricted permissions (cherry-pick #8227) (#8229)
rbac: fix invitations listing with restricted permissions (#8227)

* rbac: fix missing permission definition for list



* core: fix users's system_permissions not including role permissions



* core: don't require permissions for users/me/



* web/admin: catch error when listing stages on invitation page fails



* Revert "rbac: fix missing permission definition for list"

This reverts commit fd7572e699.

* Revert "core: don't require permissions for users/me/"

This reverts commit 9df0dbda8a.

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-18 23:24:58 +01:00
bb9a524b53 sources/oauth: fix URLs being overwritten by OIDC urls (cherry-pick #8147) (#8156)
sources/oauth: fix URLs being overwritten by OIDC urls (#8147)

* sources/oauth: fix URLs being overwritten by OIDC urls



* fix tests



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-13 16:37:47 +01:00
d31c05625b sources/oauth: fix azure_ad user_id and add test and fallback (cherry-pick #8146) (#8152) 2024-01-12 21:01:24 +01:00
399223b770 web/flows: fix icon for generic oauth source with dark theme (cherry-pick #8148) (#8151) 2024-01-12 21:01:11 +01:00
19197d3f9b sources/oauth: revert azure_ad profile URL change (cherry-pick #8139) (#8141)
sources/oauth: revert azure_ad profile URL change (#8139)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-12 16:21:59 +01:00
1cd000dfe2 release: 2023.10.6 2024-01-09 18:50:48 +01:00
00ae97944a providers/oauth2: fix CVE-2024-21637 (cherry-pick #8104) (#8105)
* providers/oauth2: fix CVE-2024-21637 (#8104)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* update changelog

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-09 18:32:03 +01:00
9f3ccfb7c7 web/flows: fix device picker incorrect foreground color (cherry-pick #8067) (#8069)
web/flows: fix device picker incorrect foreground color (#8067)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-05 15:31:08 +01:00
9ed9c39ac8 rbac: fix error when looking up permissions for now uninstalled apps (cherry-pick #8068) (#8070)
rbac: fix error when looking up permissions for now uninstalled apps (#8068)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2024-01-05 15:30:59 +01:00
30b6eeee9f outposts: disable deployment and secret reconciler for embedded outpost in code instead of in config (cherry-pick #8021) (#8024)
outposts: disable deployment and secret reconciler for embedded outpost in code instead of in config (#8021)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-30 21:40:54 +01:00
afe2621783 providers/proxy: use access token (cherry-pick #8022) (#8023)
providers/proxy: use access token (#8022)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-30 18:30:42 +01:00
8b12c6a01a outposts: fix Outpost reconcile not re-assigning managed attribute (cherry-pick #8014) (#8020)
outposts: fix Outpost reconcile not re-assigning managed attribute (#8014)

* outposts: fix Outpost reconcile not re-assigning managed attribute



* rework reconcile to find both name and managed outpost



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-30 15:37:36 +01:00
f63adfed96 core: fix PropertyMapping context not being available in request context
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-12-23 03:05:29 +01:00
9c8fec21cf providers/oauth2: remember session_id from initial token (cherry-pick #7976) (#7977)
providers/oauth2: remember session_id from initial token (#7976)

* providers/oauth2: remember session_id original token was created with for future access/refresh tokens



* providers/proxy: use hashed session as `sid`



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-23 01:20:48 +01:00
4776d2bcc5 sources/oauth: fix missing get_user_id for OIDC-like sources (Azure AD) (#7970)
* lib: add debug requests session that shows all sent requests

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* sources/oauth: fix missing get_user_id for OIDC-like OAuth Sources

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	authentik/lib/utils/http.py
2023-12-22 00:13:42 +01:00
a15a040362 release: 2023.10.5 2023-12-21 14:18:36 +01:00
fcd6dc1d60 events: fix lint (#7700)
* events: fix lint

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* test without explicit poetry env use?

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* delete previous poetry env

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* prevent invalid cached poetry envs

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* run test-from-stable as matrix and make required

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix missing postgres version

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* sigh

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* idk

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	.github/actions/setup/action.yml
#	.github/workflows/ci-main.yml
2023-12-19 18:40:14 +01:00
acc3b59869 events: add better fallback for sanitize_item to ensure everything can be saved as JSON (cherry-pick #7694) (#7937)
events: add better fallback for sanitize_item to ensure everything can be saved as JSON (#7694)

* events: fix events sanitizing not handling all types



* remove some leftover prints



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:31:20 +01:00
d9d5ac10e6 events: include user agent in events (cherry-pick #7693) (#7938)
events: include user agent in events (#7693)

* events: include user agent in events



* fix tests



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:31:06 +01:00
750669dcab stages/email: improve error handling for incorrect template syntax (cherry-pick #7758) (#7936)
stages/email: improve error handling for incorrect template syntax (#7758)

* stages/email: improve error handling for incorrect template syntax



* add tests



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:30:56 +01:00
88a3eed67e root: don't show warning when app has no URLs to import (cherry-pick #7765) (#7935)
root: don't show warning when app has no URLs to import (#7765)

Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:30:49 +01:00
6c214fffc4 blueprints: improve file change handler (cherry-pick #7813) (#7934)
blueprints: improve file change handler (#7813)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:30:37 +01:00
70100fc105 web/user: fix search not updating app (cherry-pick #7825) (#7933)
web/user: fix search not updating app (#7825)

web/user: fix app not updating

When two classes are combined in a single key of a classMap directive, the update fails (the error essentially says that each class must be a separate key). However, this error only surfaces when requestUpdate is called directly; it is swallowed somewhere when relying on the default render cycle. (See the sketch after this entry.)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:30:23 +01:00
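A minimal Lit sketch of the classMap behaviour described above; the component and class names are hypothetical, not taken from the authentik code base. The update fails because classList rejects tokens that contain spaces, so each class needs its own key.

```ts
import { html, LitElement } from "lit";
import { customElement, property } from "lit/decorators.js";
import { classMap } from "lit/directives/class-map.js";

@customElement("ak-demo-card")
export class DemoCard extends LitElement {
    @property({ type: Boolean }) selected = false;

    render() {
        // Broken: one key holding two classes. The first render happens to work,
        // but later updates go through classList, which rejects tokens that
        // contain spaces, and the error is swallowed unless requestUpdate() is
        // called directly.
        // const classes = { "pf-m-selectable pf-m-selected": this.selected };

        // Working: one key per class.
        const classes = {
            "pf-m-selectable": this.selected,
            "pf-m-selected": this.selected,
        };
        return html`<div class="pf-c-card ${classMap(classes)}"><slot></slot></div>`;
    }
}
```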
3c1163fabd root: Fix cache related image build issues (cherry-pick #7831) (#7932)
Fix cache related image build issues

Co-authored-by: Philipp Kolberg <philipp.kolberg@t-online.de>
2023-12-19 18:30:15 +01:00
539e8242ff web: fix overflow glitch on ak-page-header (cherry-pick #7883) (#7931)
web: fix overflow glitch on ak-page-header (#7883)

Because the header section had 'grow' but not 'shrink', the page was allowed to allocate
as much width as was available when the window opened, but was not allowed to give that width
back when it was pushed closed by zoom, a page resize, or summoning the sidebar.

This commit adds 'shrink' to the capabilities of the header. (See the sketch after this entry.)

Co-authored-by: Ken Sternberg <133134217+kensternberg-authentik@users.noreply.github.com>
2023-12-19 18:30:04 +01:00
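A minimal sketch of that flex behaviour using Lit's css helper; the selector is illustrative, not the actual ak-page-header stylesheet.

```ts
import { css } from "lit";

// Before: grow without shrink. The section expands to the width available at
// first layout but never gives width back when the viewport narrows.
export const headerBefore = css`
    .page-header-section {
        flex: 1 0 auto; /* grow: 1, shrink: 0 */
    }
`;

// After: allow the section to shrink as well, so it tracks zoom, resizes,
// and the sidebar being summoned.
export const headerAfter = css`
    .page-header-section {
        flex: 1 1 auto; /* grow: 1, shrink: 1 */
    }
`;
```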
2648333590 providers/scim: change familyName default (cherry-pick #7904) (#7930)
providers/scim: change familyName default (#7904)

* Update providers-scim.yaml



* fix: add formatted to match the givenName & familyName



* fix, update tests



---------

Signed-off-by: Antoine <antoine+github@jiveoff.fr>
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
Co-authored-by: Antoine <antoine+github@jiveoff.fr>
2023-12-19 18:29:55 +01:00
fe828ef993 tests: fix flaky tests (cherry-pick #7676) (#7939)
tests: fix flaky tests (#7676)

* tests: fix flaky tests



* make test-from-stable use actual latest version



* fix checkout



* remove hardcoded seed



* ignore tests for now i guess idk



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-12-19 18:29:44 +01:00
29a6530742 web: dark/light theme fixes (#7872)
* web: fix css for user tree-view

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix unrelated things

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix header button colors

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix missing fallback not showing default slant

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* move global theme-dark css to only use for SSR rendered pages

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	.github/workflows/ci-main.yml
#	web/xliff/fr.xlf
2023-12-19 18:18:19 +01:00
a6b9274c4f web/admin: always show oidc well-known URL fields when they're set (#7560)
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	web/xliff/de.xlf
#	web/xliff/en.xlf
#	web/xliff/es.xlf
#	web/xliff/fr.xlf
#	web/xliff/pl.xlf
#	web/xliff/pseudo-LOCALE.xlf
#	web/xliff/tr.xlf
#	web/xliff/zh-Hans.xlf
#	web/xliff/zh-Hant.xlf
#	web/xliff/zh_TW.xlf
2023-12-19 18:10:40 +01:00
a2a67161ac release: 2023.10.4 2023-11-21 18:38:24 +01:00
2e8263a99b web: fix locale
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2023-11-21 18:20:41 +01:00
6b9afed21f security: fix CVE-2023-48228 (cherry-pick #7666) (#7668)
security: fix CVE-2023-48228 (#7666)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-21 18:13:54 +01:00
1eb1f4e0b8 web/admin: fix admins not able to delete MFA devices (#7660)
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	web/xliff/zh-Hans.xlf
2023-11-21 15:24:37 +01:00
7c3d60ec3a events: don't update internal service accounts unless needed (cherry-pick #7611) (#7640)
events: stop spam (#7611)

* events: don't log updates to internal service accounts



* dont log reputation updates



* don't actually ignore things, stop updating outpost user when not required



* prevent updating internal service account users



* fix setattr call



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-20 19:43:30 +01:00
a494c6b6e8 root: specify node and python versions in respective config files, deduplicate in CI (#7620)
* root: specify node and python versions in respective config files, deduplicate in CI

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix engines missing for wdio

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* bump setup python version

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* actually don't bump a bunch of things

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	poetry.lock
#	website/package.json
2023-11-19 00:35:55 +01:00
6604d3577f core: bump golang from 1.21.3-bookworm to 1.21.4-bookworm (cherry-pick #7483) (#7622)
core: bump golang from 1.21.3-bookworm to 1.21.4-bookworm

Bumps golang from 1.21.3-bookworm to 1.21.4-bookworm.

---
updated-dependencies:
- dependency-name: golang
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-19 00:33:07 +01:00
f8bfa7e16a ci: fix permissions for release pipeline to publish binaries (cherry-pick #7512) (#7621)
ci: fix permissions for release pipeline to publish binaries (#7512)

ci: fix permissions

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-19 00:31:20 +01:00
ea6cf6eabf events: fix missing model_* events when not directly authenticated (cherry-pick #7588) (#7597)
events: fix missing model_* events when not directly authenticated (#7588)

* events: fix missing model_* events when not directly authenticated



* defer accessing database



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-16 12:59:41 +01:00
769ce3ce7b providers/scim: fix missing schemas attribute for User and Group (cherry-pick #7477) (#7596)
providers/scim: fix missing schemas attribute for User and Group (#7477)

* providers/scim: fix missing schemas attribute for User and Group



* make things actually work



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-16 12:06:01 +01:00
3891fb3fa8 events: sanitize functions (cherry-pick #7587) (#7589)
events: sanitize functions (#7587)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-15 23:24:13 +01:00
41eb965350 stages/email: use uuid for email confirmation token instead of username (cherry-pick #7581) (#7584)
stages/email: use uuid for email confirmation token instead of username (#7581)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-15 21:57:05 +01:00
8d95612287 providers/proxy: Fix duplicate cookies when using file system store. (cherry-pick #7541) (#7544)
providers/proxy: Fix duplicate cookies when using file system store. (#7541)

Fix duplicate cookies when using file system store.

Co-authored-by: thijs_a <thijs@thijsalders.nl>
2023-11-13 16:02:35 +01:00
82b5274b15 release: 2023.10.3 2023-11-09 18:37:22 +01:00
af56ce3d78 core: fix worker beat toggle inverted (cherry-pick #7508) (#7509)
core: fix worker beat toggle inverted (#7508)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-09 18:36:56 +01:00
f5c6e7aeb0 Web: bugfix: broken backchannel selector (cherry-pick #7480) (#7507)
Web: bugfix: broken backchannel selector (#7480)

* web: break circular dependency between AKElement & Interface.

This commit changes the way the root node of the web application shell is
discovered by child components, such that the base class shared by both
no longer results in a circular dependency between the two models.

I've run this in isolation and have seen no failures of discovery; the identity
token exists as soon as the Interface is constructed and is found by every item
on the page.

* web: fix broken typescript references

This built... and then it didn't?  Anyway, the current fix is to
provide type information the AkInterface for the data that consumers
require.

* web: rollback dependabot's upgrade of context

The most frustrating part of this is that I RAN THIS, dammit, with the updated
context and the current Wizard, and it finished the End-to-End tests without
complaint.

* web: bugfix: broken backchannel selector

There were two bugs here, both of them introduced by me because I didn't understand the
system well enough the first time through, and because I didn't test thoroughly enough.

The first is that I was calling the wrong confirmation code; the resulting syntax survived
because `confirm()` is actually a legitimate function call in the context of the DOM Window,
a legacy survivor similar to `alert()` but with a yes/no return value. Bleah.

The second is that the confirm code doesn't appear to pass back a dictionary with the
`{ items: Array<Provider> }` list; it passes back just the `items` as an Array. (See the
sketch after this entry.)

Co-authored-by: Ken Sternberg <133134217+kensternberg-authentik@users.noreply.github.com>
2023-11-09 17:58:38 +01:00
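A sketch of that second bug with hypothetical names (these are not the actual authentik types or handlers): a handler typed for `{ items: ... }` silently reads `undefined` when the confirmation actually delivers a bare array.

```ts
// Hypothetical shape of a backchannel provider entry.
interface Provider {
    pk: number;
    name: string;
}

// Buggy assumption: the confirmation passes back `{ items: Provider[] }`.
function onConfirmBroken(selection: { items: Provider[] }): number[] {
    // When the caller really sends a bare array, `selection.items` is undefined
    // and the handler silently selects nothing.
    return (selection.items ?? []).map((p) => p.pk);
}

// Fixed assumption: the confirmation passes back the array itself.
function onConfirmFixed(selection: Provider[]): number[] {
    return selection.map((p) => p.pk);
}

const providers: Provider[] = [{ pk: 1, name: "backchannel-ldap" }];
console.log(onConfirmFixed(providers)); // [1]

// The other trap mentioned above: a bare confirm("...") call compiles and runs
// because it resolves to the legacy window.confirm dialog, which is why the
// wrong call survived unnoticed.
```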
3809400e93 events: fix gdpr compliance always running (cherry-pick #7491) (#7505)
events: fix gdpr compliance always running

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2023-11-09 17:57:25 +01:00
1def9865cf web/flows: attempt to fix Bitwarden android compatibility (cherry-pick #7455) (#7457)
web/flows: attempt to fix Bitwarden android compatibility (#7455)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-06 23:58:44 +01:00
3716298639 sources/oauth: fix patreon (cherry-pick #7454) (#7456)
sources/oauth: fix patreon (#7454)

* web/admin: add note for potentially confusing consumer key/secret



* sources/oauth: fix patreon default scopes



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-06 16:36:22 +01:00
c16317d7cf providers/proxy: fix closed redis client (cherry-pick #7385) (#7429)
providers/proxy: fix closed redis client (#7385)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-03 15:46:17 +01:00
bbb8fa8269 ci: explicitly give write permissions to packages (cherry-pick #7428) (#7430)
ci: explicitly give write permissions to packages (#7428)

* ci: explicitly give write permissions to packages



* run full CI on cherry-picks



---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-03 15:46:00 +01:00
e4c251a178 web/admin: fix html error on oauth2 provider page (cherry-pick #7384) (#7424)
web/admin: fix html error on oauth2 provider page (#7384)

* web: break circular dependency between AKElement & Interface.

This commit changes the way the root node of the web application shell is
discovered by child components, such that the base class shared by both
no longer results in a circular dependency between the two models.

I've run this in isolation and have seen no failures of discovery; the identity
token exists as soon as the Interface is constructed and is found by every item
on the page.

* web: fix broken typescript references

This built... and then it didn't?  Anyway, the current fix is to
provide the AkInterface with type information for the data that consumers
require.

* \# Details

An extra `>` symbol screwed up the reading of the rest of the component.  Unfortunately,
too many fields in an input are optional, so it was easy for this bug to bypass any
checks by the validators.  I should have caught it myself, though. (See the sketch after this entry.)

Co-authored-by: Ken Sternberg <133134217+kensternberg-authentik@users.noreply.github.com>
2023-11-03 13:17:26 +01:00
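A sketch of that failure mode with a made-up field rather than the real oauth2 provider form: the stray `>` closes the tag early, the remaining attributes are treated as text, and because most fields are optional nothing complains.

```ts
import { html } from "lit";

// Broken: the first ">" ends the <input> tag, so `name="clientSecret"` becomes
// literal text and the .value binding never reaches the input.
export const brokenField = (value: string) => html`
    <input type="text" > name="clientSecret" .value=${value} />
`;

// Fixed: a single ">" at the end keeps every attribute and binding attached.
export const fixedField = (value: string) => html`
    <input type="text" name="clientSecret" .value=${value} />
`;
```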
0fefd5f522 stages/email: fix duplicate querystring encoding (cherry-pick #7386) (#7425)
stages/email: fix duplicate querystring encoding (#7386)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-03 13:17:18 +01:00
88057db0b0 providers/oauth2: set auth_via for token and other endpoints (cherry-pick #7417) (#7427)
providers/oauth2: set auth_via for token and other endpoints (#7417)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L <jens@goauthentik.io>
2023-11-03 13:17:10 +01:00
91cb6c9beb root: Improve multi arch Docker image build speed (cherry-pick #7355) (#7426)
root: Improve multi arch Docker image build speed (#7355)

* Improve multi arch Docker image build speed

Use only host architecture for GeoIP database update and for Go cross-compilation

* Speedup Go multi-arch compilation for other images

* Speedup multi-arch ldap image build

Co-authored-by: Philipp Kolberg <39984529+PKizzle@users.noreply.github.com>
2023-11-03 13:16:54 +01:00
2532 changed files with 90019 additions and 260648 deletions

View File

@@ -1,30 +1,18 @@
 [bumpversion]
-current_version = 2024.10.2
+current_version = 2023.10.7
 tag = True
 commit = True
-parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(?:-(?P<rc_t>[a-zA-Z-]+)(?P<rc_n>[1-9]\\d*))?
-serialize =
-	{major}.{minor}.{patch}-{rc_t}{rc_n}
-	{major}.{minor}.{patch}
+parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)
+serialize = {major}.{minor}.{patch}
 message = release: {new_version}
 tag_name = version/{new_version}
-[bumpversion:part:rc_t]
-values =
-	rc
-	final
-optional_value = final
 [bumpversion:file:pyproject.toml]
-[bumpversion:file:package.json]
 [bumpversion:file:docker-compose.yml]
 [bumpversion:file:schema.yml]
-[bumpversion:file:blueprints/schema.json]
 [bumpversion:file:authentik/__init__.py]
 [bumpversion:file:internal/constants/constants.go]

View File

@@ -9,4 +9,3 @@ blueprints/local
 .git
 !gen-ts-api/node_modules
 !gen-ts-api/dist/**
-!gen-go-api/

.github/FUNDING.yml (vendored)

View File

@@ -1 +1 @@
-custom: https://goauthentik.io/pricing/
+github: [BeryJu]

View File

@@ -9,7 +9,7 @@ assignees: ""
 **Describe your question/**
 A clear and concise description of what you're trying to do.
-**Relevant info**
+**Relevant infos**
 i.e. Version of other software you're using, specifics of your setup
 **Screenshots**

View File

@@ -9,6 +9,9 @@ inputs:
 runs:
   using: "composite"
   steps:
+    - name: Generate config
+      id: ev
+      uses: ./.github/actions/docker-push-variables
     - name: Find Comment
       uses: peter-evans/find-comment@v2
       id: fc
@@ -54,10 +57,9 @@ runs:
           authentik:
             outposts:
               container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
-          global:
-            image:
-              repository: ghcr.io/goauthentik/dev-server
-              tag: ${{ inputs.tag }}
+          image:
+            repository: ghcr.io/goauthentik/dev-server
+            tag: ${{ inputs.tag }}
           ```
           For arm64, use these values:
@@ -66,10 +68,9 @@ runs:
           authentik:
             outposts:
               container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
-          global:
-            image:
-              repository: ghcr.io/goauthentik/dev-server
-              tag: ${{ inputs.tag }}-arm64
+          image:
+            repository: ghcr.io/goauthentik/dev-server
+            tag: ${{ inputs.tag }}-arm64
           ```
           Afterwards, run the upgrade commands from the latest release notes.

View File

@ -1,53 +1,64 @@
---
name: "Prepare docker environment variables" name: "Prepare docker environment variables"
description: "Prepare docker environment variables" description: "Prepare docker environment variables"
inputs:
image-name:
required: true
description: "Docker image prefix"
image-arch:
required: false
description: "Docker image arch"
outputs: outputs:
shouldBuild: shouldBuild:
description: "Whether to build image or not" description: "Whether to build image or not"
value: ${{ steps.ev.outputs.shouldBuild }} value: ${{ steps.ev.outputs.shouldBuild }}
branchName:
description: "Branch name"
value: ${{ steps.ev.outputs.branchName }}
branchNameContainer:
description: "Branch name (for containers)"
value: ${{ steps.ev.outputs.branchNameContainer }}
timestamp:
description: "Timestamp"
value: ${{ steps.ev.outputs.timestamp }}
sha: sha:
description: "sha" description: "sha"
value: ${{ steps.ev.outputs.sha }} value: ${{ steps.ev.outputs.sha }}
shortHash:
description: "shortHash"
value: ${{ steps.ev.outputs.shortHash }}
version: version:
description: "Version" description: "version"
value: ${{ steps.ev.outputs.version }} value: ${{ steps.ev.outputs.version }}
prerelease: versionFamily:
description: "Prerelease" description: "versionFamily"
value: ${{ steps.ev.outputs.prerelease }} value: ${{ steps.ev.outputs.versionFamily }}
imageTags:
description: "Docker image tags"
value: ${{ steps.ev.outputs.imageTags }}
attestImageNames:
description: "Docker image names used for attestation"
value: ${{ steps.ev.outputs.attestImageNames }}
imageMainTag:
description: "Docker image main tag"
value: ${{ steps.ev.outputs.imageMainTag }}
imageMainName:
description: "Docker image main name"
value: ${{ steps.ev.outputs.imageMainName }}
runs: runs:
using: "composite" using: "composite"
steps: steps:
- name: Generate config - name: Generate config
id: ev id: ev
shell: bash shell: python
env:
IMAGE_NAME: ${{ inputs.image-name }}
IMAGE_ARCH: ${{ inputs.image-arch }}
PR_HEAD_SHA: ${{ github.event.pull_request.head.sha }}
run: | run: |
python3 ${{ github.action_path }}/push_vars.py """Helper script to get the actual branch name, docker safe"""
import configparser
import os
from time import time
parser = configparser.ConfigParser()
parser.read(".bumpversion.cfg")
branch_name = os.environ["GITHUB_REF"]
if os.environ.get("GITHUB_HEAD_REF", "") != "":
branch_name = os.environ["GITHUB_HEAD_REF"]
should_build = str(os.environ.get("DOCKER_USERNAME", "") != "").lower()
version = parser.get("bumpversion", "current_version")
version_family = ".".join(version.split(".")[:-1])
safe_branch_name = branch_name.replace("refs/heads/", "").replace("/", "-")
sha = os.environ["GITHUB_SHA"] if not "${{ github.event.pull_request.head.sha }}" else "${{ github.event.pull_request.head.sha }}"
with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print("branchName=%s" % branch_name, file=_output)
print("branchNameContainer=%s" % safe_branch_name, file=_output)
print("timestamp=%s" % int(time()), file=_output)
print("sha=%s" % sha, file=_output)
print("shortHash=%s" % sha[:7], file=_output)
print("shouldBuild=%s" % should_build, file=_output)
print("version=%s" % version, file=_output)
print("versionFamily=%s" % version_family, file=_output)

View File

@ -1,74 +0,0 @@
"""Helper script to get the actual branch name, docker safe"""
import configparser
import os
from time import time
parser = configparser.ConfigParser()
parser.read(".bumpversion.cfg")
should_build = str(len(os.environ.get("DOCKER_USERNAME", "")) > 0).lower()
branch_name = os.environ["GITHUB_REF"]
if os.environ.get("GITHUB_HEAD_REF", "") != "":
branch_name = os.environ["GITHUB_HEAD_REF"]
safe_branch_name = branch_name.replace("refs/heads/", "").replace("/", "-").replace("'", "-")
image_names = os.getenv("IMAGE_NAME").split(",")
image_arch = os.getenv("IMAGE_ARCH") or None
is_pull_request = bool(os.getenv("PR_HEAD_SHA"))
is_release = "dev" not in image_names[0]
sha = os.environ["GITHUB_SHA"] if not is_pull_request else os.getenv("PR_HEAD_SHA")
# 2042.1.0 or 2042.1.0-rc1
version = parser.get("bumpversion", "current_version")
# 2042.1
version_family = ".".join(version.split("-", 1)[0].split(".")[:-1])
prerelease = "-" in version
image_tags = []
if is_release:
for name in image_names:
image_tags += [
f"{name}:{version}",
]
if not prerelease:
image_tags += [
f"{name}:latest",
f"{name}:{version_family}",
]
else:
suffix = ""
if image_arch and image_arch != "amd64":
suffix = f"-{image_arch}"
for name in image_names:
image_tags += [
f"{name}:gh-{sha}{suffix}", # Used for ArgoCD and PR comments
f"{name}:gh-{safe_branch_name}{suffix}", # For convenience
f"{name}:gh-{safe_branch_name}-{int(time())}-{sha[:7]}{suffix}", # Use by FluxCD
]
image_main_tag = image_tags[0].split(":")[-1]
def get_attest_image_names(image_with_tags: list[str]):
"""Attestation only for GHCR"""
image_tags = []
for image_name in set(name.split(":")[0] for name in image_with_tags):
if not image_name.startswith("ghcr.io"):
continue
image_tags.append(image_name)
return ",".join(set(image_tags))
with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print(f"shouldBuild={should_build}", file=_output)
print(f"sha={sha}", file=_output)
print(f"version={version}", file=_output)
print(f"prerelease={prerelease}", file=_output)
print(f"imageTags={','.join(image_tags)}", file=_output)
print(f"attestImageNames={get_attest_image_names(image_tags)}", file=_output)
print(f"imageMainTag={image_main_tag}", file=_output)
print(f"imageMainName={image_tags[0]}", file=_output)

View File

@ -1,7 +0,0 @@
#!/bin/bash -x
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
GITHUB_OUTPUT=/dev/stdout \
GITHUB_REF=ref \
GITHUB_SHA=sha \
IMAGE_NAME=ghcr.io/goauthentik/server,beryju/authentik \
python $SCRIPT_DIR/push_vars.py

View File

@@ -4,7 +4,7 @@ description: "Setup authentik testing environment"
 inputs:
   postgresql_version:
     description: "Optional postgresql image tag"
-    default: "16"
+    default: "12"
 runs:
   using: "composite"
@@ -14,27 +14,27 @@ runs:
       run: |
         pipx install poetry || true
         sudo apt-get update
-        sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext libkrb5-dev krb5-kdc krb5-user krb5-admin-server
+        sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext
     - name: Setup python and restore poetry
-      uses: actions/setup-python@v5
+      uses: actions/setup-python@v4
       with:
-        python-version-file: "pyproject.toml"
+        python-version-file: 'pyproject.toml'
         cache: "poetry"
     - name: Setup node
-      uses: actions/setup-node@v4
+      uses: actions/setup-node@v3
       with:
         node-version-file: web/package.json
         cache: "npm"
         cache-dependency-path: web/package-lock.json
     - name: Setup go
-      uses: actions/setup-go@v5
+      uses: actions/setup-go@v4
       with:
         go-version-file: "go.mod"
     - name: Setup dependencies
       shell: bash
       run: |
         export PSQL_TAG=${{ inputs.postgresql_version }}
-        docker compose -f .github/actions/setup/docker-compose.yml up -d
+        docker-compose -f .github/actions/setup/docker-compose.yml up -d
         poetry install
         cd web && npm ci
     - name: Generate config

View File

@@ -1,6 +1,8 @@
+version: "3.7"
 services:
   postgresql:
-    image: docker.io/library/postgres:${PSQL_TAG:-16}
+    image: docker.io/library/postgres:${PSQL_TAG:-12}
     volumes:
       - db-data:/var/lib/postgresql/data
     environment:

View File

@@ -2,6 +2,3 @@ keypair
 keypairs
 hass
 warmup
-ontext
-singed
-assertIn

View File

@ -21,9 +21,7 @@ updates:
labels: labels:
- dependencies - dependencies
- package-ecosystem: npm - package-ecosystem: npm
directories: directory: "/web"
- "/web"
- "/web/sfe"
schedule: schedule:
interval: daily interval: daily
time: "04:00" time: "04:00"
@ -32,22 +30,20 @@ updates:
open-pull-requests-limit: 10 open-pull-requests-limit: 10
commit-message: commit-message:
prefix: "web:" prefix: "web:"
# TODO: deduplicate these groups
groups: groups:
sentry: sentry:
patterns: patterns:
- "@sentry/*" - "@sentry/*"
- "@spotlightjs/*"
babel: babel:
patterns: patterns:
- "@babel/*" - "@babel/*"
- "babel-*" - "babel-*"
eslint: eslint:
patterns: patterns:
- "@eslint/*"
- "@typescript-eslint/*" - "@typescript-eslint/*"
- "eslint-*"
- "eslint" - "eslint"
- "typescript-eslint" - "eslint-*"
storybook: storybook:
patterns: patterns:
- "@storybook/*" - "@storybook/*"
@ -55,16 +51,37 @@ updates:
esbuild: esbuild:
patterns: patterns:
- "@esbuild/*" - "@esbuild/*"
- "esbuild*" - package-ecosystem: npm
rollup: directory: "/tests/wdio"
schedule:
interval: daily
time: "04:00"
labels:
- dependencies
open-pull-requests-limit: 10
commit-message:
prefix: "web:"
# TODO: deduplicate these groups
groups:
sentry:
patterns: patterns:
- "@rollup/*" - "@sentry/*"
- "rollup-*" babel:
- "rollup*"
swc:
patterns: patterns:
- "@swc/*" - "@babel/*"
- "swc-*" - "babel-*"
eslint:
patterns:
- "@typescript-eslint/*"
- "eslint"
- "eslint-*"
storybook:
patterns:
- "@storybook/*"
- "*storybook*"
esbuild:
patterns:
- "@esbuild/*"
wdio: wdio:
patterns: patterns:
- "@wdio/*" - "@wdio/*"

View File

@@ -1,7 +1,7 @@
 <!--
 👋 Hi there! Welcome.
-Please check the Contributing guidelines: https://docs.goauthentik.io/docs/developer-docs/#how-can-i-contribute
+Please check the Contributing guidelines: https://goauthentik.io/developer-docs/#how-can-i-contribute
 -->
 ## Details
@@ -27,6 +27,7 @@ If an API change has been made
 If changes to the frontend have been made
 - [ ] The code has been formatted (`make web`)
+- [ ] The translation files have been updated (`make i18n-extract`)
 If applicable

View File

@ -1,65 +0,0 @@
name: authentik-api-py-publish
on:
push:
branches: [main]
paths:
- "schema.yml"
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
permissions:
id-token: write
steps:
- id: generate_token
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Install poetry & deps
shell: bash
run: |
pipx install poetry || true
sudo apt-get update
sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext
- name: Setup python and restore poetry
uses: actions/setup-python@v5
with:
python-version-file: "pyproject.toml"
cache: "poetry"
- name: Generate API Client
run: make gen-client-py
- name: Publish package
working-directory: gen-py-api/
run: |
poetry build
- name: Publish package to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
packages-dir: gen-py-api/dist/
# We can't easily upgrade the API client being used due to poetry being poetry
# so we'll have to rely on dependabot
# - name: Upgrade /
# run: |
# export VERSION=$(cd gen-py-api && poetry version -s)
# poetry add "authentik_client=$VERSION" --allow-prereleases --lock
# - uses: peter-evans/create-pull-request@v6
# id: cpr
# with:
# token: ${{ steps.generate_token.outputs.token }}
# branch: update-root-api-client
# commit-message: "root: bump API Client version"
# title: "root: bump API Client version"
# body: "root: bump API Client version"
# delete-branch: true
# signoff: true
# # ID from https://api.github.com/users/authentik-automation[bot]
# author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
# - uses: peter-evans/enable-pull-request-automerge@v3
# with:
# token: ${{ steps.generate_token.outputs.token }}
# pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
# merge-method: squash

View File

@ -1,4 +1,3 @@
---
name: authentik-ci-main name: authentik-ci-main
on: on:
@ -7,6 +6,8 @@ on:
- main - main
- next - next
- version-* - version-*
paths-ignore:
- website
pull_request: pull_request:
branches: branches:
- main - main
@ -26,7 +27,10 @@ jobs:
- bandit - bandit
- black - black
- codespell - codespell
- isort
- pending-migrations - pending-migrations
- pylint
- pyright
- ruff - ruff
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
@ -50,12 +54,17 @@ jobs:
fail-fast: false fail-fast: false
matrix: matrix:
psql: psql:
- 12-alpine
- 15-alpine - 15-alpine
- 16-alpine - 16-alpine
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
with: with:
fetch-depth: 0 fetch-depth: 0
- name: Setup authentik env
uses: ./.github/actions/setup
with:
postgresql_version: ${{ matrix.psql }}
- name: checkout stable - name: checkout stable
run: | run: |
# Delete all poetry envs # Delete all poetry envs
@ -64,10 +73,10 @@ jobs:
cp authentik/lib/default.yml local.env.yml cp authentik/lib/default.yml local.env.yml
cp -R .github .. cp -R .github ..
cp -R scripts .. cp -R scripts ..
git checkout $(git tag --sort=version:refname | grep '^version/' | grep -vE -- '-rc[0-9]+$' | tail -n1) git checkout version/$(python -c "from authentik import __version__; print(__version__)")
rm -rf .github/ scripts/ rm -rf .github/ scripts/
mv ../.github ../scripts . mv ../.github ../scripts .
- name: Setup authentik env (stable) - name: Setup authentik env (ensure stable deps are installed)
uses: ./.github/actions/setup uses: ./.github/actions/setup
with: with:
postgresql_version: ${{ matrix.psql }} postgresql_version: ${{ matrix.psql }}
@ -81,20 +90,14 @@ jobs:
git clean -d -fx . git clean -d -fx .
git checkout $GITHUB_SHA git checkout $GITHUB_SHA
# Delete previous poetry env # Delete previous poetry env
rm -rf /home/runner/.cache/pypoetry/virtualenvs/* rm -rf $(poetry env info --path)
poetry install
- name: Setup authentik env (ensure latest deps are installed) - name: Setup authentik env (ensure latest deps are installed)
uses: ./.github/actions/setup uses: ./.github/actions/setup
with: with:
postgresql_version: ${{ matrix.psql }} postgresql_version: ${{ matrix.psql }}
- name: migrate to latest - name: migrate to latest
run: | run: poetry run python -m lifecycle.migrate
poetry run python -m lifecycle.migrate
- name: run tests
env:
# Test in the main database that we just migrated from the previous stable version
AUTHENTIK_POSTGRESQL__TEST__NAME: authentik
run: |
poetry run make test
test-unittest: test-unittest:
name: test-unittest - PostgreSQL ${{ matrix.psql }} name: test-unittest - PostgreSQL ${{ matrix.psql }}
runs-on: ubuntu-latest runs-on: ubuntu-latest
@ -103,6 +106,7 @@ jobs:
fail-fast: false fail-fast: false
matrix: matrix:
psql: psql:
- 12-alpine
- 15-alpine - 15-alpine
- 16-alpine - 16-alpine
steps: steps:
@ -116,16 +120,9 @@ jobs:
poetry run make test poetry run make test
poetry run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v3
with: with:
flags: unit flags: unit
token: ${{ secrets.CODECOV_TOKEN }}
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: unit
file: unittest.xml
token: ${{ secrets.CODECOV_TOKEN }}
test-integration: test-integration:
runs-on: ubuntu-latest runs-on: ubuntu-latest
timeout-minutes: 30 timeout-minutes: 30
@ -134,22 +131,15 @@ jobs:
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: Create k8s Kind Cluster - name: Create k8s Kind Cluster
uses: helm/kind-action@v1.10.0 uses: helm/kind-action@v1.8.0
- name: run integration - name: run integration
run: | run: |
poetry run coverage run manage.py test tests/integration poetry run coverage run manage.py test tests/integration
poetry run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v3
with: with:
flags: integration flags: integration
token: ${{ secrets.CODECOV_TOKEN }}
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: integration
file: unittest.xml
token: ${{ secrets.CODECOV_TOKEN }}
test-e2e: test-e2e:
name: test-e2e (${{ matrix.job.name }}) name: test-e2e (${{ matrix.job.name }})
runs-on: ubuntu-latest runs-on: ubuntu-latest
@ -170,8 +160,6 @@ jobs:
glob: tests/e2e/test_provider_ldap* tests/e2e/test_source_ldap* glob: tests/e2e/test_provider_ldap* tests/e2e/test_source_ldap*
- name: radius - name: radius
glob: tests/e2e/test_provider_radius* glob: tests/e2e/test_provider_radius*
- name: scim
glob: tests/e2e/test_source_scim*
- name: flows - name: flows
glob: tests/e2e/test_flows* glob: tests/e2e/test_flows*
steps: steps:
@ -180,9 +168,9 @@ jobs:
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: Setup e2e env (chrome, etc) - name: Setup e2e env (chrome, etc)
run: | run: |
docker compose -f tests/e2e/docker-compose.yml up -d --quiet-pull docker-compose -f tests/e2e/docker-compose.yml up -d
- id: cache-web - id: cache-web
uses: actions/cache@v4 uses: actions/cache@v3
with: with:
path: web/dist path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**') }} key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**') }}
@ -198,16 +186,9 @@ jobs:
poetry run coverage run manage.py test ${{ matrix.job.glob }} poetry run coverage run manage.py test ${{ matrix.job.glob }}
poetry run coverage xml poetry run coverage xml
- if: ${{ always() }} - if: ${{ always() }}
uses: codecov/codecov-action@v5 uses: codecov/codecov-action@v3
with: with:
flags: e2e flags: e2e
token: ${{ secrets.CODECOV_TOKEN }}
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: e2e
file: unittest.xml
token: ${{ secrets.CODECOV_TOKEN }}
ci-core-mark: ci-core-mark:
needs: needs:
- lint - lint
@ -220,27 +201,18 @@ jobs:
steps: steps:
- run: echo mark - run: echo mark
build: build:
strategy:
fail-fast: false
matrix:
arch:
- amd64
- arm64
needs: ci-core-mark needs: ci-core-mark
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload contianer images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
timeout-minutes: 120 timeout-minutes: 120
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v3.0.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
@ -248,12 +220,9 @@ jobs:
id: ev id: ev
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-server
image-arch: ${{ matrix.arch }}
- name: Login to Container Registry - name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
uses: docker/login-action@v3 uses: docker/login-action@v3
if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
@ -261,49 +230,74 @@ jobs:
- name: generate ts client - name: generate ts client
run: make gen-client-ts run: make gen-client-ts
- name: Build Docker Image - name: Build Docker Image
uses: docker/build-push-action@v6 uses: docker/build-push-action@v5
id: push
with: with:
context: . context: .
secrets: | secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }} GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }} GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
tags: ${{ steps.ev.outputs.imageTags }}
push: ${{ steps.ev.outputs.shouldBuild == 'true' }} push: ${{ steps.ev.outputs.shouldBuild == 'true' }}
tags: |
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.sha }}
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}
build-args: | build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }} GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache VERSION=${{ steps.ev.outputs.version }}
cache-to: ${{ steps.ev.outputs.shouldBuild == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-server:buildcache,mode=max' || '' }} VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/${{ matrix.arch }} cache-from: type=gha
- uses: actions/attest-build-provenance@v1 cache-to: type=gha,mode=max
id: attest - name: Comment on PR
if: ${{ steps.ev.outputs.shouldBuild == 'true' }} if: github.event_name == 'pull_request'
continue-on-error: true
uses: ./.github/actions/comment-pr-instructions
with: with:
subject-name: ${{ steps.ev.outputs.attestImageNames }} tag: gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}
subject-digest: ${{ steps.push.outputs.digest }} build-arm64:
push-to-registry: true needs: ci-core-mark
pr-comment:
needs:
- build
runs-on: ubuntu-latest runs-on: ubuntu-latest
if: ${{ github.event_name == 'pull_request' }}
permissions: permissions:
# Needed to write comments on PRs # Needed to upload contianer images to ghcr.io
pull-requests: write packages: write
timeout-minutes: 120 timeout-minutes: 120
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.0.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with: - name: Login to Container Registry
image-name: ghcr.io/goauthentik/dev-server uses: docker/login-action@v3
- name: Comment on PR
if: ${{ steps.ev.outputs.shouldBuild == 'true' }} if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
uses: ./.github/actions/comment-pr-instructions
with: with:
tag: ${{ steps.ev.outputs.imageMainTag }} registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: generate ts client
run: make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@v5
with:
context: .
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
push: ${{ steps.ev.outputs.shouldBuild == 'true' }}
tags: |
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-arm64
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.sha }}-arm64
ghcr.io/goauthentik/dev-server:gh-${{ steps.ev.outputs.branchNameContainer }}-${{ steps.ev.outputs.timestamp }}-${{ steps.ev.outputs.shortHash }}-arm64
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/arm64
cache-from: type=gha
cache-to: type=gha,mode=max

View File

@ -1,4 +1,3 @@
---
name: authentik-ci-outpost name: authentik-ci-outpost
on: on:
@ -17,7 +16,7 @@ jobs:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Prepare and generate API - name: Prepare and generate API
@ -29,16 +28,16 @@ jobs:
- name: Generate API - name: Generate API
run: make gen-client-go run: make gen-client-go
- name: golangci-lint - name: golangci-lint
uses: golangci/golangci-lint-action@v6 uses: golangci/golangci-lint-action@v3
with: with:
version: latest version: v1.54.2
args: --timeout 5000s --verbose args: --timeout 5000s --verbose
skip-cache: true skip-cache: true
test-unittest: test-unittest:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Setup authentik env - name: Setup authentik env
@ -66,20 +65,16 @@ jobs:
- proxy - proxy
- ldap - ldap
- radius - radius
- rac
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload contianer images to ghcr.io # Needed to upload contianer images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v3.0.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
@ -87,11 +82,9 @@ jobs:
id: ev id: ev
env: env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }} DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/dev-${{ matrix.type }}
- name: Login to Container Registry - name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
uses: docker/login-action@v3 uses: docker/login-action@v3
if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
with: with:
registry: ghcr.io registry: ghcr.io
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
@ -99,25 +92,21 @@ jobs:
- name: Generate API - name: Generate API
run: make gen-client-go run: make gen-client-go
- name: Build Docker Image - name: Build Docker Image
id: push uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with: with:
tags: ${{ steps.ev.outputs.imageTags }}
file: ${{ matrix.type }}.Dockerfile
push: ${{ steps.ev.outputs.shouldBuild == 'true' }} push: ${{ steps.ev.outputs.shouldBuild == 'true' }}
tags: |
ghcr.io/goauthentik/dev-${{ matrix.type }}:gh-${{ steps.ev.outputs.branchNameContainer }}
ghcr.io/goauthentik/dev-${{ matrix.type }}:gh-${{ steps.ev.outputs.sha }}
file: ${{ matrix.type }}.Dockerfile
build-args: | build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }} GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
VERSION=${{ steps.ev.outputs.version }}
VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
context: . context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type }}:buildcache cache-from: type=gha
cache-to: ${{ steps.ev.outputs.shouldBuild == 'true' && format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max', matrix.type) || '' }} cache-to: type=gha,mode=max
- uses: actions/attest-build-provenance@v1
id: attest
if: ${{ steps.ev.outputs.shouldBuild == 'true' }}
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-binary: build-binary:
timeout-minutes: 120 timeout-minutes: 120
needs: needs:
@ -130,14 +119,13 @@ jobs:
- proxy - proxy
- ldap - ldap
- radius - radius
- rac
goos: [linux] goos: [linux]
goarch: [amd64, arm64] goarch: [amd64, arm64]
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
with: with:
ref: ${{ github.event.pull_request.head.sha }} ref: ${{ github.event.pull_request.head.sha }}
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- uses: actions/setup-node@v4 - uses: actions/setup-node@v4

View File

@ -12,23 +12,14 @@ on:
- version-* - version-*
jobs: jobs:
lint: lint-eslint:
runs-on: ubuntu-latest runs-on: ubuntu-latest
strategy: strategy:
fail-fast: false fail-fast: false
matrix: matrix:
command:
- lint
- lint:lockfile
- tsc
- prettier-check
project: project:
- web - web
include: - tests/wdio
- command: tsc
project: web
- command: lit-analyse
project: web
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-node@v4 - uses: actions/setup-node@v4
@ -37,14 +28,83 @@ jobs:
cache: "npm" cache: "npm"
cache-dependency-path: ${{ matrix.project }}/package-lock.json cache-dependency-path: ${{ matrix.project }}/package-lock.json
- working-directory: ${{ matrix.project }}/ - working-directory: ${{ matrix.project }}/
run: | run: npm ci
npm ci
- name: Generate API - name: Generate API
run: make gen-client-ts run: make gen-client-ts
- name: Lint - name: Eslint
working-directory: ${{ matrix.project }}/ working-directory: ${{ matrix.project }}/
run: npm run ${{ matrix.command }} run: npm run lint
lint-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
- name: Generate API
run: make gen-client-ts
- name: TSC
working-directory: web/
run: npm run tsc
lint-prettier:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
project:
- web
- tests/wdio
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: ${{ matrix.project }}/package.json
cache: "npm"
cache-dependency-path: ${{ matrix.project }}/package-lock.json
- working-directory: ${{ matrix.project }}/
run: npm ci
- name: Generate API
run: make gen-client-ts
- name: prettier
working-directory: ${{ matrix.project }}/
run: npm run prettier-check
lint-lit-analyse:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: |
npm ci
# lit-analyse doesn't understand path rewrites, so make it
# belive it's an actual module
cd node_modules/@goauthentik
ln -s ../../src/ web
- name: Generate API
run: make gen-client-ts
- name: lit-analyse
working-directory: web/
run: npm run lit-analyse
ci-web-mark:
needs:
- lint-eslint
- lint-prettier
- lint-lit-analyse
- lint-build
runs-on: ubuntu-latest
steps:
- run: echo mark
build: build:
needs:
- ci-web-mark
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
@ -60,28 +120,3 @@ jobs:
- name: build - name: build
working-directory: web/ working-directory: web/
run: npm run build run: npm run build
ci-web-mark:
needs:
- build
- lint
runs-on: ubuntu-latest
steps:
- run: echo mark
test:
needs:
- ci-web-mark
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
- name: Generate API
run: make gen-client-ts
- name: test
working-directory: web/
run: npm run test || exit 0

View File

@ -12,21 +12,20 @@ on:
- version-* - version-*
jobs: jobs:
lint: lint-prettier:
runs-on: ubuntu-latest runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
command:
- lint:lockfile
- prettier-check
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/ - working-directory: website/
run: npm ci run: npm ci
- name: Lint - name: prettier
working-directory: website/ working-directory: website/
run: npm run ${{ matrix.command }} run: npm run prettier-check
test: test:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
@ -49,6 +48,7 @@ jobs:
matrix: matrix:
job: job:
- build - build
- build-docs-only
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-node@v4 - uses: actions/setup-node@v4
@ -63,7 +63,7 @@ jobs:
run: npm run ${{ matrix.job }} run: npm run ${{ matrix.job }}
ci-website-mark: ci-website-mark:
needs: needs:
- lint - lint-prettier
- test - test
- build - build
runs-on: ubuntu-latest runs-on: ubuntu-latest


@ -27,10 +27,10 @@ jobs:
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: Initialize CodeQL - name: Initialize CodeQL
uses: github/codeql-action/init@v3 uses: github/codeql-action/init@v2
with: with:
languages: ${{ matrix.language }} languages: ${{ matrix.language }}
- name: Autobuild - name: Autobuild
uses: github/codeql-action/autobuild@v3 uses: github/codeql-action/autobuild@v2
- name: Perform CodeQL Analysis - name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3 uses: github/codeql-action/analyze@v2


@ -1,43 +0,0 @@
name: authentik-gen-update-webauthn-mds
on:
workflow_dispatch:
schedule:
- cron: '30 1 1,15 * *'
env:
POSTGRES_DB: authentik
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
jobs:
build:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v4
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env
uses: ./.github/actions/setup
- run: poetry run ak update_webauthn_mds
- uses: peter-evans/create-pull-request@v7
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
branch: update-fido-mds-client
commit-message: "stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs"
title: "stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs"
body: "stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs"
delete-branch: true
signoff: true
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
- uses: peter-evans/enable-pull-request-automerge@v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
merge-method: squash


@ -6,10 +6,6 @@ on:
types: types:
- closed - closed
permissions:
# Permission to delete cache
actions: write
jobs: jobs:
cleanup: cleanup:
runs-on: ubuntu-latest runs-on: ubuntu-latest


@ -42,7 +42,7 @@ jobs:
with: with:
githubToken: ${{ steps.generate_token.outputs.token }} githubToken: ${{ steps.generate_token.outputs.token }}
compressOnly: ${{ github.event_name != 'pull_request' }} compressOnly: ${{ github.event_name != 'pull_request' }}
- uses: peter-evans/create-pull-request@v7 - uses: peter-evans/create-pull-request@v5
if: "${{ github.event_name != 'pull_request' && steps.compress.outputs.markdown != '' }}" if: "${{ github.event_name != 'pull_request' && steps.compress.outputs.markdown != '' }}"
id: cpr id: cpr
with: with:


@ -1,4 +1,3 @@
---
name: authentik-on-release name: authentik-on-release
on: on:
@ -11,22 +10,15 @@ jobs:
permissions: permissions:
# Needed to upload container images to ghcr.io # Needed to upload container images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v3.0.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/server,beryju/authentik
- name: Docker Login Registry - name: Docker Login Registry
uses: docker/login-action@v3 uses: docker/login-action@v3
with: with:
@ -43,32 +35,29 @@ jobs:
mkdir -p ./gen-ts-api mkdir -p ./gen-ts-api
mkdir -p ./gen-go-api mkdir -p ./gen-go-api
- name: Build Docker Image - name: Build Docker Image
uses: docker/build-push-action@v6 uses: docker/build-push-action@v5
id: push
with: with:
context: . context: .
push: true push: ${{ github.event_name == 'release' }}
secrets: | secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }} GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }} GEOIPUPDATE_LICENSE_KEY=${{ secrets.GEOIPUPDATE_LICENSE_KEY }}
build-args: | tags: |
VERSION=${{ github.ref }} beryju/authentik:${{ steps.ev.outputs.version }},
tags: ${{ steps.ev.outputs.imageTags }} beryju/authentik:${{ steps.ev.outputs.versionFamily }},
beryju/authentik:latest,
ghcr.io/goauthentik/server:${{ steps.ev.outputs.version }},
ghcr.io/goauthentik/server:${{ steps.ev.outputs.versionFamily }},
ghcr.io/goauthentik/server:latest
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
- uses: actions/attest-build-provenance@v1 build-args: |
id: attest VERSION=${{ steps.ev.outputs.version }}
with: VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-outpost: build-outpost:
runs-on: ubuntu-latest runs-on: ubuntu-latest
permissions: permissions:
# Needed to upload container images to ghcr.io # Needed to upload container images to ghcr.io
packages: write packages: write
# Needed for attestation
id-token: write
attestations: write
strategy: strategy:
fail-fast: false fail-fast: false
matrix: matrix:
@ -76,23 +65,18 @@ jobs:
- proxy - proxy
- ldap - ldap
- radius - radius
- rac
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- name: Set up QEMU - name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0 uses: docker/setup-qemu-action@v3.0.0
- name: Set up Docker Buildx - name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3 uses: docker/setup-buildx-action@v3
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/${{ matrix.type }},beryju/authentik-${{ matrix.type }}
- name: make empty clients - name: make empty clients
run: | run: |
mkdir -p ./gen-ts-api mkdir -p ./gen-ts-api
@ -109,22 +93,22 @@ jobs:
username: ${{ github.repository_owner }} username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }} password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image - name: Build Docker Image
uses: docker/build-push-action@v6 uses: docker/build-push-action@v5
id: push
with: with:
push: true push: ${{ github.event_name == 'release' }}
build-args: | tags: |
VERSION=${{ github.ref }} beryju/authentik-${{ matrix.type }}:${{ steps.ev.outputs.version }},
tags: ${{ steps.ev.outputs.imageTags }} beryju/authentik-${{ matrix.type }}:${{ steps.ev.outputs.versionFamily }},
beryju/authentik-${{ matrix.type }}:latest,
ghcr.io/goauthentik/${{ matrix.type }}:${{ steps.ev.outputs.version }},
ghcr.io/goauthentik/${{ matrix.type }}:${{ steps.ev.outputs.versionFamily }},
ghcr.io/goauthentik/${{ matrix.type }}:latest
file: ${{ matrix.type }}.Dockerfile file: ${{ matrix.type }}.Dockerfile
platforms: linux/amd64,linux/arm64 platforms: linux/amd64,linux/arm64
context: . context: .
- uses: actions/attest-build-provenance@v1 build-args: |
id: attest VERSION=${{ steps.ev.outputs.version }}
with: VERSION_FAMILY=${{ steps.ev.outputs.versionFamily }}
subject-name: ${{ steps.ev.outputs.attestImageNames }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
build-outpost-binary: build-outpost-binary:
timeout-minutes: 120 timeout-minutes: 120
runs-on: ubuntu-latest runs-on: ubuntu-latest
@ -142,7 +126,7 @@ jobs:
goarch: [amd64, arm64] goarch: [amd64, arm64]
steps: steps:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- uses: actions/setup-go@v5 - uses: actions/setup-go@v4
with: with:
go-version-file: "go.mod" go-version-file: "go.mod"
- uses: actions/setup-node@v4 - uses: actions/setup-node@v4
@ -179,12 +163,12 @@ jobs:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- name: Run test suite in final docker images - name: Run test suite in final docker images
run: | run: |
echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env echo "PG_PASS=$(openssl rand -base64 32)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 32)" >> .env
docker compose pull -q docker-compose pull -q
docker compose up --no-start docker-compose up --no-start
docker compose start postgresql redis docker-compose start postgresql redis
docker compose run -u root server test-all docker-compose run -u root server test-all
sentry-release: sentry-release:
needs: needs:
- build-server - build-server
@ -196,18 +180,15 @@ jobs:
- name: prepare variables - name: prepare variables
uses: ./.github/actions/docker-push-variables uses: ./.github/actions/docker-push-variables
id: ev id: ev
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with:
image-name: ghcr.io/goauthentik/server
- name: Get static files from docker image - name: Get static files from docker image
run: | run: |
docker pull ${{ steps.ev.outputs.imageMainName }} docker pull ghcr.io/goauthentik/server:latest
container=$(docker container create ${{ steps.ev.outputs.imageMainName }}) container=$(docker container create ghcr.io/goauthentik/server:latest)
docker cp ${container}:web/ . docker cp ${container}:web/ .
- name: Create a Sentry.io release - name: Create a Sentry.io release
uses: getsentry/action-release@v1 uses: getsentry/action-release@v1
continue-on-error: true continue-on-error: true
if: ${{ github.event_name == 'release' }}
env: env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }} SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_ORG: authentik-security-inc SENTRY_ORG: authentik-security-inc


@ -1,4 +1,3 @@
---
name: authentik-on-tag name: authentik-on-tag
on: on:
@ -14,28 +13,28 @@ jobs:
- uses: actions/checkout@v4 - uses: actions/checkout@v4
- name: Pre-release test - name: Pre-release test
run: | run: |
echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env echo "PG_PASS=$(openssl rand -base64 32)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 32)" >> .env
docker buildx install docker buildx install
mkdir -p ./gen-ts-api mkdir -p ./gen-ts-api
docker build -t testing:latest . docker build -t testing:latest .
echo "AUTHENTIK_IMAGE=testing" >> .env echo "AUTHENTIK_IMAGE=testing" >> .env
echo "AUTHENTIK_TAG=latest" >> .env echo "AUTHENTIK_TAG=latest" >> .env
docker compose up --no-start docker-compose up --no-start
docker compose start postgresql redis docker-compose start postgresql redis
docker compose run -u root server test-all docker-compose run -u root server test-all
- id: generate_token - id: generate_token
uses: tibdex/github-app-token@v2 uses: tibdex/github-app-token@v2
with: with:
app_id: ${{ secrets.GH_APP_ID }} app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }} private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- name: prepare variables - name: Extract version number
uses: ./.github/actions/docker-push-variables id: get_version
id: ev uses: actions/github-script@v6
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
with: with:
image-name: ghcr.io/goauthentik/server github-token: ${{ steps.generate_token.outputs.token }}
script: |
return context.payload.ref.replace(/\/refs\/tags\/version\//, '');
- name: Create Release - name: Create Release
id: create_release id: create_release
uses: actions/create-release@v1.1.4 uses: actions/create-release@v1.1.4
@ -43,6 +42,6 @@ jobs:
GITHUB_TOKEN: ${{ steps.generate_token.outputs.token }} GITHUB_TOKEN: ${{ steps.generate_token.outputs.token }}
with: with:
tag_name: ${{ github.ref }} tag_name: ${{ github.ref }}
release_name: Release ${{ steps.ev.outputs.version }} release_name: Release ${{ steps.get_version.outputs.result }}
draft: true draft: true
prerelease: ${{ steps.ev.outputs.prerelease == 'true' }} prerelease: false


@ -18,12 +18,12 @@ jobs:
with: with:
app_id: ${{ secrets.GH_APP_ID }} app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }} private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/stale@v9 - uses: actions/stale@v8
with: with:
repo-token: ${{ steps.generate_token.outputs.token }} repo-token: ${{ steps.generate_token.outputs.token }}
days-before-stale: 60 days-before-stale: 60
days-before-close: 7 days-before-close: 7
exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question,status/reviewing exempt-issue-labels: pinned,security,pr_wanted,enhancement,bug/confirmed,enhancement/confirmed,question
stale-issue-label: wontfix stale-issue-label: wontfix
stale-issue-message: > stale-issue-message: >
This issue has been automatically marked as stale because it has not had This issue has been automatically marked as stale because it has not had


@ -7,26 +7,21 @@ on:
paths: paths:
- "!**" - "!**"
- "locale/**" - "locale/**"
- "!locale/en/**" - "web/src/locales/**"
- "web/xliff/**"
permissions:
# Permission to write comment
pull-requests: write
jobs: jobs:
post-comment: post-comment:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- name: Find Comment - name: Find Comment
uses: peter-evans/find-comment@v3 uses: peter-evans/find-comment@v2
id: fc id: fc
with: with:
issue-number: ${{ github.event.pull_request.number }} issue-number: ${{ github.event.pull_request.number }}
comment-author: "github-actions[bot]" comment-author: "github-actions[bot]"
body-includes: authentik translations instructions body-includes: authentik translations instructions
- name: Create or update comment - name: Create or update comment
uses: peter-evans/create-or-update-comment@v4 uses: peter-evans/create-or-update-comment@v3
with: with:
comment-id: ${{ steps.fc.outputs.comment-id }} comment-id: ${{ steps.fc.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }} issue-number: ${{ github.event.pull_request.number }}


@ -1,8 +1,9 @@
--- name: authentik-backend-translate-compile
name: authentik-backend-translate-extract-compile
on: on:
schedule: push:
- cron: "0 0 * * *" # every day at midnight branches: [main]
paths:
- "locale/**"
workflow_dispatch: workflow_dispatch:
env: env:
@ -24,20 +25,16 @@ jobs:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env - name: Setup authentik env
uses: ./.github/actions/setup uses: ./.github/actions/setup
- name: run extract
run: |
poetry run make i18n-extract
- name: run compile - name: run compile
run: | run: poetry run ak compilemessages
poetry run ak compilemessages
make web-check-compile
- name: Create Pull Request - name: Create Pull Request
uses: peter-evans/create-pull-request@v7 uses: peter-evans/create-pull-request@v5
id: cpr
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}
branch: extract-compile-backend-translation branch: compile-backend-translation
commit-message: "core, web: update translations" commit-message: "core: compile backend translations"
title: "core, web: update translations" title: "core: compile backend translations"
body: "core, web: update translations" body: "core: compile backend translations"
delete-branch: true delete-branch: true
signoff: true signoff: true


@ -6,10 +6,6 @@ on:
pull_request: pull_request:
types: [opened, reopened] types: [opened, reopened]
permissions:
# Permission to rename PR
pull-requests: write
jobs: jobs:
rename_pr: rename_pr:
runs-on: ubuntu-latest runs-on: ubuntu-latest


@ -1,4 +1,4 @@
name: authentik-api-ts-publish name: authentik-web-api-publish
on: on:
push: push:
branches: [main] branches: [main]
@ -31,16 +31,11 @@ jobs:
env: env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }} NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}
- name: Upgrade /web - name: Upgrade /web
working-directory: web working-directory: web/
run: | run: |
export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'` export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'`
npm i @goauthentik/api@$VERSION npm i @goauthentik/api@$VERSION
- name: Upgrade /web/packages/sfe - uses: peter-evans/create-pull-request@v5
working-directory: web/packages/sfe
run: |
export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'`
npm i @goauthentik/api@$VERSION
- uses: peter-evans/create-pull-request@v7
id: cpr id: cpr
with: with:
token: ${{ steps.generate_token.outputs.token }} token: ${{ steps.generate_token.outputs.token }}


@ -10,12 +10,12 @@
"Gruntfuggly.todo-tree", "Gruntfuggly.todo-tree",
"mechatroner.rainbow-csv", "mechatroner.rainbow-csv",
"ms-python.black-formatter", "ms-python.black-formatter",
"charliermarsh.ruff", "ms-python.isort",
"ms-python.pylint",
"ms-python.python", "ms-python.python",
"ms-python.vscode-pylance", "ms-python.vscode-pylance",
"ms-python.black-formatter",
"redhat.vscode-yaml", "redhat.vscode-yaml",
"Tobermory.es6-string-html", "Tobermory.es6-string-html",
"unifiedjs.vscode-mdx" "unifiedjs.vscode-mdx",
] ]
} }

.vscode/launch.json

@ -22,6 +22,6 @@
}, },
"justMyCode": true, "justMyCode": true,
"django": true "django": true
} },
] ]
} }

.vscode/settings.json

@ -4,36 +4,35 @@
"asgi", "asgi",
"authentik", "authentik",
"authn", "authn",
"entra",
"goauthentik", "goauthentik",
"jwe",
"jwks", "jwks",
"kubernetes",
"oidc", "oidc",
"openid", "openid",
"passwordless",
"plex", "plex",
"saml", "saml",
"scim",
"slo",
"sso",
"totp", "totp",
"webauthn",
"traefik", "traefik",
"webauthn" "passwordless",
"kubernetes",
"sso",
"slo",
"scim",
], ],
"python.linting.pylintEnabled": true,
"todo-tree.tree.showCountsInTree": true, "todo-tree.tree.showCountsInTree": true,
"todo-tree.tree.showBadges": true, "todo-tree.tree.showBadges": true,
"python.formatting.provider": "black",
"yaml.customTags": [ "yaml.customTags": [
"!Condition sequence",
"!Context scalar",
"!Enumerate sequence",
"!Env scalar",
"!Find sequence", "!Find sequence",
"!Format sequence",
"!If sequence",
"!Index scalar",
"!KeyOf scalar", "!KeyOf scalar",
"!Value scalar" "!Context scalar",
"!Context sequence",
"!Format sequence",
"!Condition sequence",
"!Env sequence",
"!Env scalar",
"!If sequence"
], ],
"typescript.preferences.importModuleSpecifier": "non-relative", "typescript.preferences.importModuleSpecifier": "non-relative",
"typescript.preferences.importModuleSpecifierEnding": "index", "typescript.preferences.importModuleSpecifierEnding": "index",
@ -50,7 +49,9 @@
"ignoreCase": false "ignoreCase": false
} }
], ],
"go.testFlags": ["-count=1"], "go.testFlags": [
"-count=1"
],
"github-actions.workflows.pinned.workflows": [ "github-actions.workflows.pinned.workflows": [
".github/workflows/ci-main.yml" ".github/workflows/ci-main.yml"
] ]

.vscode/tasks.json

@ -2,67 +2,85 @@
"version": "2.0.0", "version": "2.0.0",
"tasks": [ "tasks": [
{ {
"label": "authentik/core: make", "label": "authentik[core]: format & test",
"command": "poetry", "command": "poetry",
"args": ["run", "make", "lint-fix", "lint"], "args": [
"presentation": { "run",
"panel": "new" "make"
}, ],
"group": "test" "group": "build",
}, },
{ {
"label": "authentik/core: run", "label": "authentik[core]: run",
"command": "poetry", "command": "poetry",
"args": ["run", "ak", "server"], "args": [
"run",
"make",
"run",
],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
"group": "running" "group": "running"
} },
}, },
{ {
"label": "authentik/web: make", "label": "authentik[web]: format",
"command": "make", "command": "make",
"args": ["web"], "args": ["web"],
"group": "build" "group": "build",
}, },
{ {
"label": "authentik/web: watch", "label": "authentik[web]: watch",
"command": "make", "command": "make",
"args": ["web-watch"], "args": ["web-watch"],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
"group": "running" "group": "running"
} },
}, },
{ {
"label": "authentik: install", "label": "authentik: install",
"command": "make", "command": "make",
"args": ["install", "-j4"], "args": ["install"],
"group": "build" "group": "build",
}, },
{ {
"label": "authentik/website: make", "label": "authentik: i18n-extract",
"command": "poetry",
"args": [
"run",
"make",
"i18n-extract"
],
"group": "build",
},
{
"label": "authentik[website]: format",
"command": "make", "command": "make",
"args": ["website"], "args": ["website"],
"group": "build" "group": "build",
}, },
{ {
"label": "authentik/website: watch", "label": "authentik[website]: watch",
"command": "make", "command": "make",
"args": ["website-watch"], "args": ["website-watch"],
"group": "build", "group": "build",
"presentation": { "presentation": {
"panel": "dedicated", "panel": "dedicated",
"group": "running" "group": "running"
} },
}, },
{ {
"label": "authentik/api: generate", "label": "authentik[api]: generate",
"command": "poetry", "command": "poetry",
"args": ["run", "make", "gen"], "args": [
"run",
"make",
"gen"
],
"group": "build" "group": "build"
} },
] ]
} }


@ -11,8 +11,6 @@ scripts/ @goauthentik/backend
tests/ @goauthentik/backend tests/ @goauthentik/backend
pyproject.toml @goauthentik/backend pyproject.toml @goauthentik/backend
poetry.lock @goauthentik/backend poetry.lock @goauthentik/backend
go.mod @goauthentik/backend
go.sum @goauthentik/backend
# Infrastructure # Infrastructure
.github/ @goauthentik/infrastructure .github/ @goauthentik/infrastructure
Dockerfile @goauthentik/infrastructure Dockerfile @goauthentik/infrastructure


@ -1,7 +1,7 @@
# syntax=docker/dockerfile:1 # syntax=docker/dockerfile:1
# Stage 1: Build website # Stage 1: Build website
FROM --platform=${BUILDPLATFORM} docker.io/library/node:22 AS website-builder FROM --platform=${BUILDPLATFORM} docker.io/node:21 as website-builder
ENV NODE_ENV=production ENV NODE_ENV=production
@ -14,28 +14,22 @@ RUN --mount=type=bind,target=/work/website/package.json,src=./website/package.js
COPY ./website /work/website/ COPY ./website /work/website/
COPY ./blueprints /work/blueprints/ COPY ./blueprints /work/blueprints/
COPY ./schema.yml /work/
COPY ./SECURITY.md /work/ COPY ./SECURITY.md /work/
RUN npm run build-bundled RUN npm run build-docs-only
# Stage 2: Build webui # Stage 2: Build webui
FROM --platform=${BUILDPLATFORM} docker.io/library/node:22 AS web-builder FROM --platform=${BUILDPLATFORM} docker.io/node:21 as web-builder
ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
ENV NODE_ENV=production ENV NODE_ENV=production
WORKDIR /work/web WORKDIR /work/web
RUN --mount=type=bind,target=/work/web/package.json,src=./web/package.json \ RUN --mount=type=bind,target=/work/web/package.json,src=./web/package.json \
--mount=type=bind,target=/work/web/package-lock.json,src=./web/package-lock.json \ --mount=type=bind,target=/work/web/package-lock.json,src=./web/package-lock.json \
--mount=type=bind,target=/work/web/packages/sfe/package.json,src=./web/packages/sfe/package.json \
--mount=type=bind,target=/work/web/scripts,src=./web/scripts \
--mount=type=cache,id=npm-web,sharing=shared,target=/root/.npm \ --mount=type=cache,id=npm-web,sharing=shared,target=/root/.npm \
npm ci --include=dev npm ci --include=dev
COPY ./package.json /work
COPY ./web /work/web/ COPY ./web /work/web/
COPY ./website /work/website/ COPY ./website /work/website/
COPY ./gen-ts-api /work/web/node_modules/@goauthentik/api COPY ./gen-ts-api /work/web/node_modules/@goauthentik/api
@ -43,7 +37,7 @@ COPY ./gen-ts-api /work/web/node_modules/@goauthentik/api
RUN npm run build RUN npm run build
# Stage 3: Build go proxy # Stage 3: Build go proxy
FROM --platform=${BUILDPLATFORM} mcr.microsoft.com/oss/go/microsoft/golang:1.23-fips-bookworm AS go-builder FROM --platform=${BUILDPLATFORM} docker.io/golang:1.21.4-bookworm AS go-builder
ARG TARGETOS ARG TARGETOS
ARG TARGETARCH ARG TARGETARCH
@ -54,11 +48,6 @@ ARG GOARCH=$TARGETARCH
WORKDIR /go/src/goauthentik.io WORKDIR /go/src/goauthentik.io
RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \
dpkg --add-architecture arm64 && \
apt-get update && \
apt-get install -y --no-install-recommends crossbuild-essential-arm64 gcc-aarch64-linux-gnu
RUN --mount=type=bind,target=/go/src/goauthentik.io/go.mod,src=./go.mod \ RUN --mount=type=bind,target=/go/src/goauthentik.io/go.mod,src=./go.mod \
--mount=type=bind,target=/go/src/goauthentik.io/go.sum,src=./go.sum \ --mount=type=bind,target=/go/src/goauthentik.io/go.sum,src=./go.sum \
--mount=type=cache,target=/go/pkg/mod \ --mount=type=cache,target=/go/pkg/mod \
@ -73,17 +62,17 @@ COPY ./internal /go/src/goauthentik.io/internal
COPY ./go.mod /go/src/goauthentik.io/go.mod COPY ./go.mod /go/src/goauthentik.io/go.mod
COPY ./go.sum /go/src/goauthentik.io/go.sum COPY ./go.sum /go/src/goauthentik.io/go.sum
ENV CGO_ENABLED=0
RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \ RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \
--mount=type=cache,id=go-build-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/root/.cache/go-build \ --mount=type=cache,id=go-build-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/root/.cache/go-build \
if [ "$TARGETARCH" = "arm64" ]; then export CC=aarch64-linux-gnu-gcc && export CC_FOR_TARGET=gcc-aarch64-linux-gnu; fi && \ GOARM="${TARGETVARIANT#v}" go build -o /go/authentik ./cmd/server
CGO_ENABLED=1 GOEXPERIMENT="systemcrypto" GOFLAGS="-tags=requirefips" GOARM="${TARGETVARIANT#v}" \
go build -o /go/authentik ./cmd/server
# Stage 4: MaxMind GeoIP # Stage 4: MaxMind GeoIP
FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v7.1.0 AS geoip FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v6.0 as geoip
ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City GeoLite2-ASN" ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City"
ENV GEOIPUPDATE_VERBOSE="1" ENV GEOIPUPDATE_VERBOSE="true"
ENV GEOIPUPDATE_ACCOUNT_ID_FILE="/run/secrets/GEOIPUPDATE_ACCOUNT_ID" ENV GEOIPUPDATE_ACCOUNT_ID_FILE="/run/secrets/GEOIPUPDATE_ACCOUNT_ID"
ENV GEOIPUPDATE_LICENSE_KEY_FILE="/run/secrets/GEOIPUPDATE_LICENSE_KEY" ENV GEOIPUPDATE_LICENSE_KEY_FILE="/run/secrets/GEOIPUPDATE_LICENSE_KEY"
@ -94,10 +83,7 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
/bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0" /bin/sh -c "/usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 5: Python dependencies # Stage 5: Python dependencies
FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS python-deps FROM docker.io/python:3.11.5-bookworm AS python-deps
ARG TARGETARCH
ARG TARGETVARIANT
WORKDIR /ak-root/poetry WORKDIR /ak-root/poetry
@ -110,38 +96,36 @@ RUN rm -f /etc/apt/apt.conf.d/docker-clean; echo 'Binary::apt::APT::Keep-Downloa
RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \ RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/var/cache/apt \
apt-get update && \ apt-get update && \
# Required for installing pip packages # Required for installing pip packages
apt-get install -y --no-install-recommends build-essential pkg-config libpq-dev libkrb5-dev apt-get install -y --no-install-recommends build-essential pkg-config libxmlsec1-dev zlib1g-dev libpq-dev
RUN --mount=type=bind,target=./pyproject.toml,src=./pyproject.toml \ RUN --mount=type=bind,target=./pyproject.toml,src=./pyproject.toml \
--mount=type=bind,target=./poetry.lock,src=./poetry.lock \ --mount=type=bind,target=./poetry.lock,src=./poetry.lock \
--mount=type=cache,target=/root/.cache/pip \ --mount=type=cache,target=/root/.cache/pip \
--mount=type=cache,target=/root/.cache/pypoetry \ --mount=type=cache,target=/root/.cache/pypoetry \
python -m venv /ak-root/venv/ && \ python -m venv /ak-root/venv/ && \
bash -c "source ${VENV_PATH}/bin/activate && \
pip3 install --upgrade pip && \ pip3 install --upgrade pip && \
pip3 install poetry && \ pip3 install poetry && \
poetry install --only=main --no-ansi --no-interaction --no-root && \ poetry install --only=main --no-ansi --no-interaction
pip install --force-reinstall /wheels/*"
# Stage 6: Run # Stage 6: Run
FROM ghcr.io/goauthentik/fips-python:3.12.7-slim-bookworm-fips-full AS final-image FROM docker.io/python:3.11.5-slim-bookworm AS final-image
ARG VERSION
ARG GIT_BUILD_HASH ARG GIT_BUILD_HASH
ARG VERSION
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
LABEL org.opencontainers.image.url=https://goauthentik.io LABEL org.opencontainers.image.url https://goauthentik.io
LABEL org.opencontainers.image.description="goauthentik.io Main server image, see https://goauthentik.io for more info." LABEL org.opencontainers.image.description goauthentik.io Main server image, see https://goauthentik.io for more info.
LABEL org.opencontainers.image.source=https://github.com/goauthentik/authentik LABEL org.opencontainers.image.source https://github.com/goauthentik/authentik
LABEL org.opencontainers.image.version=${VERSION} LABEL org.opencontainers.image.version ${VERSION}
LABEL org.opencontainers.image.revision=${GIT_BUILD_HASH} LABEL org.opencontainers.image.revision ${GIT_BUILD_HASH}
WORKDIR / WORKDIR /
# We cannot cache this layer otherwise we'll end up with a bigger image # We cannot cache this layer otherwise we'll end up with a bigger image
RUN apt-get update && \ RUN apt-get update && \
# Required for runtime # Required for runtime
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 && \ apt-get install -y --no-install-recommends libpq5 openssl libxmlsec1-openssl libmaxminddb0 && \
# Required for bootstrap & healthcheck # Required for bootstrap & healthcheck
apt-get install -y --no-install-recommends runit && \ apt-get install -y --no-install-recommends runit && \
apt-get clean && \ apt-get clean && \
@ -161,12 +145,11 @@ COPY ./tests /tests
COPY ./manage.py / COPY ./manage.py /
COPY ./blueprints /blueprints COPY ./blueprints /blueprints
COPY ./lifecycle/ /lifecycle COPY ./lifecycle/ /lifecycle
COPY ./authentik/sources/kerberos/krb5.conf /etc/krb5.conf
COPY --from=go-builder /go/authentik /bin/authentik COPY --from=go-builder /go/authentik /bin/authentik
COPY --from=python-deps /ak-root/venv /ak-root/venv COPY --from=python-deps /ak-root/venv /ak-root/venv
COPY --from=web-builder /work/web/dist/ /web/dist/ COPY --from=web-builder /work/web/dist/ /web/dist/
COPY --from=web-builder /work/web/authentik/ /web/authentik/ COPY --from=web-builder /work/web/authentik/ /web/authentik/
COPY --from=website-builder /work/website/build/ /website/help/ COPY --from=website-builder /work/website/help/ /website/help/
COPY --from=geoip /usr/share/GeoIP /geoip COPY --from=geoip /usr/share/GeoIP /geoip
USER 1000 USER 1000
@ -178,8 +161,6 @@ ENV TMPDIR=/dev/shm/ \
VENV_PATH="/ak-root/venv" \ VENV_PATH="/ak-root/venv" \
POETRY_VIRTUALENVS_CREATE=false POETRY_VIRTUALENVS_CREATE=false
ENV GOFIPS=1
HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ] HEALTHCHECK --interval=30s --timeout=30s --start-period=60s --retries=3 CMD [ "ak", "healthcheck" ]
ENTRYPOINT [ "dumb-init", "--", "ak" ] ENTRYPOINT [ "dumb-init", "--", "ak" ]

Makefile

@ -5,13 +5,9 @@ PWD = $(shell pwd)
UID = $(shell id -u) UID = $(shell id -u)
GID = $(shell id -g) GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.npm_version) NPM_VERSION = $(shell python -m scripts.npm_version)
PY_SOURCES = authentik tests scripts lifecycle .github PY_SOURCES = authentik tests scripts lifecycle
DOCKER_IMAGE ?= "authentik:test" DOCKER_IMAGE ?= "authentik:test"
GEN_API_TS = "gen-ts-api"
GEN_API_PY = "gen-py-api"
GEN_API_GO = "gen-go-api"
pg_user := $(shell python -m authentik.lib.config postgresql.user 2>/dev/null) pg_user := $(shell python -m authentik.lib.config postgresql.user 2>/dev/null)
pg_host := $(shell python -m authentik.lib.config postgresql.host 2>/dev/null) pg_host := $(shell python -m authentik.lib.config postgresql.host 2>/dev/null)
pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null) pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null)
@ -19,13 +15,13 @@ pg_name := $(shell python -m authentik.lib.config postgresql.name 2>/dev/null)
CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \ CODESPELL_ARGS = -D - -D .github/codespell-dictionary.txt \
-I .github/codespell-words.txt \ -I .github/codespell-words.txt \
-S 'web/src/locales/**' \ -S 'web/src/locales/**' \
-S 'website/docs/developer-docs/api/reference/**' \
authentik \ authentik \
internal \ internal \
cmd \ cmd \
web/src \ web/src \
website/src \ website/src \
website/blog \ website/blog \
website/developer-docs \
website/docs \ website/docs \
website/integrations \ website/integrations \
website/src website/src
@ -42,16 +38,16 @@ help: ## Show this help
sort sort
@echo "" @echo ""
go-test: test-go:
go test -timeout 0 -v -race -cover ./... go test -timeout 0 -v -race -cover ./...
test-docker: ## Run all tests in a docker-compose test-docker: ## Run all tests in a docker-compose
echo "PG_PASS=$(shell openssl rand 32 | base64 -w 0)" >> .env echo "PG_PASS=$(openssl rand -base64 32)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(shell openssl rand 32 | base64 -w 0)" >> .env echo "AUTHENTIK_SECRET_KEY=$(openssl rand -base64 32)" >> .env
docker compose pull -q docker-compose pull -q
docker compose up --no-start docker-compose up --no-start
docker compose start postgresql redis docker-compose start postgresql redis
docker compose run -u root server test-all docker-compose run -u root server test-all
rm -f .env rm -f .env
test: ## Run the server tests and produce a coverage report (locally) test: ## Run the server tests and produce a coverage report (locally)
@ -59,37 +55,28 @@ test: ## Run the server tests and produce a coverage report (locally)
coverage html coverage html
coverage report coverage report
lint-fix: lint-codespell ## Lint and automatically fix errors in the python source code. Reports spelling errors. lint-fix: ## Lint and automatically fix errors in the python source code. Reports spelling errors.
isort $(PY_SOURCES)
black $(PY_SOURCES) black $(PY_SOURCES)
ruff check --fix $(PY_SOURCES) ruff $(PY_SOURCES)
lint-codespell: ## Reports spelling errors.
codespell -w $(CODESPELL_ARGS) codespell -w $(CODESPELL_ARGS)
lint: ## Lint the python and golang sources lint: ## Lint the python and golang sources
bandit -r $(PY_SOURCES) -x web/node_modules -x tests/wdio/node_modules -x website/node_modules bandit -r $(PY_SOURCES) -x node_modules
./web/node_modules/.bin/pyright $(PY_SOURCES)
pylint $(PY_SOURCES)
golangci-lint run -v golangci-lint run -v
core-install:
poetry install
migrate: ## Run the Authentik Django server's migrations migrate: ## Run the Authentik Django server's migrations
python -m lifecycle.migrate python -m lifecycle.migrate
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service i18n-extract: i18n-extract-core web-i18n-extract ## Extract strings that require translation into files to send to a translation service
core-i18n-extract: i18n-extract-core:
ak makemessages \ ak makemessages --ignore web --ignore internal --ignore web --ignore web-api --ignore website -l en
--add-location file \
--no-obsolete \
--ignore web \
--ignore internal \
--ignore ${GEN_API_TS} \
--ignore ${GEN_API_GO} \
--ignore website \
-l en
install: web-install website-install core-install ## Install all required dependencies for `web`, `website` and `core` install: web-install website-install ## Install all required dependencies for `web`, `website` and `core`
poetry install
dev-drop-db: dev-drop-db:
dropdb -U ${pg_user} -h ${pg_host} ${pg_name} dropdb -U ${pg_user} -h ${pg_host} ${pg_name}
@ -107,14 +94,8 @@ dev-reset: dev-drop-db dev-create-db migrate ## Drop and restore the Authentik
######################### #########################
gen-build: ## Extract the schema from the database gen-build: ## Extract the schema from the database
AUTHENTIK_DEBUG=true \ AUTHENTIK_DEBUG=true ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_TENANTS__ENABLED=true \ AUTHENTIK_DEBUG=true ak spectacular --file schema.yml
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
ak make_blueprint_schema > blueprints/schema.json
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
ak spectacular --file schema.yml
gen-changelog: ## (Release) generate the changelog based from the commits since the last tag gen-changelog: ## (Release) generate the changelog based from the commits since the last tag
git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md
@ -125,77 +106,53 @@ gen-diff: ## (Release) generate the changelog diff between the current schema a
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-diff:2.1.0-beta.8 \ docker.io/openapitools/openapi-diff:2.1.0-beta.6 \
--markdown /local/diff.md \ --markdown /local/diff.md \
/local/old_schema.yml /local/schema.yml /local/old_schema.yml /local/schema.yml
rm old_schema.yml rm old_schema.yml
sed -i 's/{/&#123;/g' diff.md
sed -i 's/}/&#125;/g' diff.md
npx prettier --write diff.md npx prettier --write diff.md
gen-clean-ts: ## Remove generated API client for Typescript gen-clean:
rm -rf ./${GEN_API_TS}/ rm -rf web/api/src/
rm -rf ./web/node_modules/@goauthentik/api/ rm -rf api/
gen-clean-go: ## Remove generated API client for Go gen-client-ts: ## Build and install the authentik API for Typescript into the authentik UI Application
rm -rf ./${GEN_API_GO}/
gen-clean-py: ## Remove generated API client for Python
rm -rf ./${GEN_API_PY}/
gen-clean: gen-clean-ts gen-clean-go gen-clean-py ## Remove generated API clients
gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescript into the authentik UI Application
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \ docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \ -i /local/schema.yml \
-g typescript-fetch \ -g typescript-fetch \
-o /local/${GEN_API_TS} \ -o /local/gen-ts-api \
-c /local/scripts/api-ts-config.yaml \ -c /local/scripts/api-ts-config.yaml \
--additional-properties=npmVersion=${NPM_VERSION} \ --additional-properties=npmVersion=${NPM_VERSION} \
--git-repo-id authentik \ --git-repo-id authentik \
--git-user-id goauthentik --git-user-id goauthentik
mkdir -p web/node_modules/@goauthentik/api mkdir -p web/node_modules/@goauthentik/api
cd ./${GEN_API_TS} && npm i cd gen-ts-api && npm i
\cp -rf ./${GEN_API_TS}/* web/node_modules/@goauthentik/api \cp -rfv gen-ts-api/* web/node_modules/@goauthentik/api
gen-client-py: gen-clean-py ## Build and install the authentik API for Python gen-client-go: ## Build and install the authentik API for Golang
mkdir -p ./gen-go-api ./gen-go-api/templates
wget https://raw.githubusercontent.com/goauthentik/client-go/main/config.yaml -O ./gen-go-api/config.yaml
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/README.mustache -O ./gen-go-api/templates/README.mustache
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/go.mod.mustache -O ./gen-go-api/templates/go.mod.mustache
cp schema.yml ./gen-go-api/
docker run \ docker run \
--rm -v ${PWD}:/local \ --rm -v ${PWD}/gen-go-api:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.4.0 generate \
-i /local/schema.yml \
-g python \
-o /local/${GEN_API_PY} \
-c /local/scripts/api-py-config.yaml \
--additional-properties=packageVersion=${NPM_VERSION} \
--git-repo-id authentik \
--git-user-id goauthentik
pip install ./${GEN_API_PY}
gen-client-go: gen-clean-go ## Build and install the authentik API for Golang
mkdir -p ./${GEN_API_GO} ./${GEN_API_GO}/templates
wget https://raw.githubusercontent.com/goauthentik/client-go/main/config.yaml -O ./${GEN_API_GO}/config.yaml
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/README.mustache -O ./${GEN_API_GO}/templates/README.mustache
wget https://raw.githubusercontent.com/goauthentik/client-go/main/templates/go.mod.mustache -O ./${GEN_API_GO}/templates/go.mod.mustache
cp schema.yml ./${GEN_API_GO}/
docker run \
--rm -v ${PWD}/${GEN_API_GO}:/local \
--user ${UID}:${GID} \ --user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \ docker.io/openapitools/openapi-generator-cli:v6.5.0 generate \
-i /local/schema.yml \ -i /local/schema.yml \
-g go \ -g go \
-o /local/ \ -o /local/ \
-c /local/config.yaml -c /local/config.yaml
go mod edit -replace goauthentik.io/api/v3=./${GEN_API_GO} go mod edit -replace goauthentik.io/api/v3=./gen-go-api
rm -rf ./${GEN_API_GO}/config.yaml ./${GEN_API_GO}/templates/ rm -rf ./gen-go-api/config.yaml ./gen-go-api/templates/
gen-dev-config: ## Generate a local development config file gen-dev-config: ## Generate a local development config file
python -m scripts.generate_config python -m scripts.generate_config
gen: gen-build gen-client-ts gen: gen-build gen-clean gen-client-ts
######################### #########################
## Web ## Web
@ -204,14 +161,11 @@ gen: gen-build gen-client-ts
web-build: web-install ## Build the Authentik UI web-build: web-install ## Build the Authentik UI
cd web && npm run build cd web && npm run build
web: web-lint-fix web-lint web-check-compile ## Automatically fix formatting issues in the Authentik UI source code, lint the code, and compile it web: web-lint-fix web-lint web-check-compile web-i18n-extract ## Automatically fix formatting issues in the Authentik UI source code, lint the code, and compile it
web-install: ## Install the necessary libraries to build the Authentik UI web-install: ## Install the necessary libraries to build the Authentik UI
cd web && npm ci cd web && npm ci
web-test: ## Run tests for the Authentik UI
cd web && npm run test
web-watch: ## Build and watch the Authentik UI for changes, updating automatically web-watch: ## Build and watch the Authentik UI for changes, updating automatically
rm -rf web/dist/ rm -rf web/dist/
mkdir web/dist/ mkdir web/dist/
@ -243,7 +197,7 @@ website: website-lint-fix website-build ## Automatically fix formatting issues
website-install: website-install:
cd website && npm ci cd website && npm ci
website-lint-fix: lint-codespell website-lint-fix:
cd website && npm run prettier cd website && npm run prettier
website-build: website-build:
@ -257,7 +211,6 @@ website-watch: ## Build and watch the documentation website, updating automatic
######################### #########################
docker: ## Build a docker image of the current source tree docker: ## Build a docker image of the current source tree
mkdir -p ${GEN_API_TS}
DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE} DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE}
######################### #########################
@ -270,6 +223,9 @@ ci--meta-debug:
python -V python -V
node --version node --version
ci-pylint: ci--meta-debug
pylint $(PY_SOURCES)
ci-black: ci--meta-debug ci-black: ci--meta-debug
black --check $(PY_SOURCES) black --check $(PY_SOURCES)
@ -279,8 +235,14 @@ ci-ruff: ci--meta-debug
ci-codespell: ci--meta-debug ci-codespell: ci--meta-debug
codespell $(CODESPELL_ARGS) -s codespell $(CODESPELL_ARGS) -s
ci-isort: ci--meta-debug
isort --check $(PY_SOURCES)
ci-bandit: ci--meta-debug ci-bandit: ci--meta-debug
bandit -r $(PY_SOURCES) bandit -r $(PY_SOURCES)
ci-pyright: ci--meta-debug
./web/node_modules/.bin/pyright $(PY_SOURCES)
ci-pending-migrations: ci--meta-debug ci-pending-migrations: ci--meta-debug
ak makemigrations --check ak makemigrations --check


@ -15,9 +15,7 @@
## What is authentik? ## What is authentik?
authentik is an open-source Identity Provider that emphasizes flexibility and versatility, with support for a wide set of protocols. authentik is an open-source Identity Provider that emphasizes flexibility and versatility. It can be seamlessly integrated into existing environments to support new protocols. authentik is also a great solution for implementing sign-up, recovery, and other similar features in your application, saving you the hassle of dealing with them.
Our [enterprise offer](https://goauthentik.io/pricing) can also be used as a self-hosted replacement for large-scale deployments of Okta/Auth0, Entra ID, Ping Identity, or other legacy IdPs for employees and B2B2C use.
## Installation ## Installation
@ -27,14 +25,14 @@ For bigger setups, there is a Helm Chart [here](https://github.com/goauthentik/h
## Screenshots ## Screenshots
| Light | Dark | | Light | Dark |
| ----------------------------------------------------------- | ---------------------------------------------------------- | | ------------------------------------------------------ | ----------------------------------------------------- |
| ![](https://docs.goauthentik.io/img/screen_apps_light.jpg) | ![](https://docs.goauthentik.io/img/screen_apps_dark.jpg) | | ![](https://goauthentik.io/img/screen_apps_light.jpg) | ![](https://goauthentik.io/img/screen_apps_dark.jpg) |
| ![](https://docs.goauthentik.io/img/screen_admin_light.jpg) | ![](https://docs.goauthentik.io/img/screen_admin_dark.jpg) | | ![](https://goauthentik.io/img/screen_admin_light.jpg) | ![](https://goauthentik.io/img/screen_admin_dark.jpg) |
## Development ## Development
See [Developer Documentation](https://docs.goauthentik.io/docs/developer-docs/?utm_source=github) See [Developer Documentation](https://goauthentik.io/developer-docs/?utm_source=github)
## Security ## Security


@ -1,9 +1,5 @@
authentik takes security very seriously. We follow the rules of [responsible disclosure](https://en.wikipedia.org/wiki/Responsible_disclosure), and we urge our community to do so as well, instead of reporting vulnerabilities publicly. This allows us to patch the issue quickly, announce its existence and release the fixed version. authentik takes security very seriously. We follow the rules of [responsible disclosure](https://en.wikipedia.org/wiki/Responsible_disclosure), and we urge our community to do so as well, instead of reporting vulnerabilities publicly. This allows us to patch the issue quickly, announce its existence and release the fixed version.
## Independent audits and pentests
In May/June of 2023 [Cure53](https://cure53.de) conducted an audit and pentest. The [results](https://cure53.de/pentest-report_authentik.pdf) are published on the [Cure53 website](https://cure53.de/#publications-2023). For more details about authentik's response to the findings of the audit refer to [2023-06 Cure53 Code audit](https://goauthentik.io/docs/security/2023-06-cure53).
## What authentik classifies as a CVE ## What authentik classifies as a CVE
CVE (Common Vulnerability and Exposure) is a system designed to aggregate all vulnerabilities. As such, a CVE will be issued when there is either a vulnerability or an exposure. Per NIST, a vulnerability is: CVE (Common Vulnerability and Exposure) is a system designed to aggregate all vulnerabilities. As such, a CVE will be issued when there is either a vulnerability or an exposure. Per NIST, a vulnerability is:
@ -18,10 +14,10 @@ Even if the issue is not a CVE, we still greatly appreciate your help in hardeni
(.x being the latest patch release for each version) (.x being the latest patch release for each version)
| Version | Supported | | Version | Supported |
| --------- | --------- | | --- | --- |
| 2024.8.x | ✅ | | 2023.6.x | ✅ |
| 2024.10.x | ✅ | | 2023.8.x | ✅ |
## Reporting a Vulnerability ## Reporting a Vulnerability
@ -31,12 +27,12 @@ To report a vulnerability, send an email to [security@goauthentik.io](mailto:se
authentik reserves the right to reclassify CVSS as necessary. To determine severity, we will use the CVSS calculator from NVD (https://nvd.nist.gov/vuln-metrics/cvss/v3-calculator). The calculated CVSS score will then be translated into one of the following categories: authentik reserves the right to reclassify CVSS as necessary. To determine severity, we will use the CVSS calculator from NVD (https://nvd.nist.gov/vuln-metrics/cvss/v3-calculator). The calculated CVSS score will then be translated into one of the following categories:
| Score | Severity | | Score | Severity |
| ---------- | -------- | | --- | --- |
| 0.0 | None | | 0.0 | None |
| 0.1 – 3.9 | Low | | 0.1 – 3.9 | Low |
| 4.0 – 6.9 | Medium | | 4.0 – 6.9 | Medium |
| 7.0 – 8.9 | High | | 7.0 – 8.9 | High |
| 9.0 – 10.0 | Critical | | 9.0 – 10.0 | Critical |
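For illustration only, a minimal Python sketch of the score-to-severity translation described above (the `cvss_to_severity` name and the thresholds-as-code are hypothetical, not part of authentik):

```python
def cvss_to_severity(score: float) -> str:
    """Map a CVSS 3.x base score onto the categories in the table above."""
    if score <= 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"


assert cvss_to_severity(7.5) == "High"
```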
## Disclosure process ## Disclosure process


@ -1,12 +1,12 @@
"""authentik root module""" """authentik root module"""
from os import environ from os import environ
from typing import Optional
__version__ = "2024.10.2" __version__ = "2023.10.7"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH" ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"
def get_build_hash(fallback: str | None = None) -> str: def get_build_hash(fallback: Optional[str] = None) -> str:
"""Get build hash""" """Get build hash"""
build_hash = environ.get(ENV_GIT_HASH_KEY, fallback if fallback else "") build_hash = environ.get(ENV_GIT_HASH_KEY, fallback if fallback else "")
return fallback if build_hash == "" and fallback else build_hash return fallback if build_hash == "" and fallback else build_hash
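As a usage illustration of the fallback logic above (a sketch, assuming `authentik` is importable in the current environment; the example hashes are made up):

```python
import os

from authentik import get_build_hash

os.environ.pop("GIT_BUILD_HASH", None)    # variable absent: the fallback is returned
assert get_build_hash("deadbeef") == "deadbeef"

os.environ["GIT_BUILD_HASH"] = "1a2b3c4"  # variable set: the environment value wins
assert get_build_hash("deadbeef") == "1a2b3c4"
```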


@ -1,5 +1,4 @@
"""Meta API""" """Meta API"""
from drf_spectacular.utils import extend_schema from drf_spectacular.utils import extend_schema
from rest_framework.fields import CharField from rest_framework.fields import CharField
from rest_framework.permissions import IsAuthenticated from rest_framework.permissions import IsAuthenticated

View File

@ -1,5 +1,4 @@
"""authentik administration metrics""" """authentik administration metrics"""
from datetime import timedelta from datetime import timedelta
from django.db.models.functions import ExtractHour from django.db.models.functions import ExtractHour

View File

@ -1,23 +1,18 @@
"""authentik administration overview""" """authentik administration overview"""
import platform import platform
from datetime import datetime from datetime import datetime
from ssl import OPENSSL_VERSION
from sys import version as python_version from sys import version as python_version
from typing import TypedDict from typing import TypedDict
from cryptography.hazmat.backends.openssl.backend import backend
from django.utils.timezone import now from django.utils.timezone import now
from drf_spectacular.utils import extend_schema from drf_spectacular.utils import extend_schema
from gunicorn import version_info as gunicorn_version
from rest_framework.fields import SerializerMethodField from rest_framework.fields import SerializerMethodField
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.views import APIView from rest_framework.views import APIView
from authentik import get_full_version
from authentik.core.api.utils import PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.enterprise.license import LicenseKey
-from authentik.lib.config import CONFIG
from authentik.lib.utils.reflection import get_env
from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.models import Outpost

@@ -28,25 +23,22 @@ class RuntimeDict(TypedDict):
    """Runtime information"""

    python_version: str
+    gunicorn_version: str
    environment: str
    architecture: str
    platform: str
    uname: str
-    openssl_version: str
-    openssl_fips_enabled: bool | None
-    authentik_version: str


-class SystemInfoSerializer(PassiveSerializer):
+class SystemSerializer(PassiveSerializer):
    """Get system information."""

    http_headers = SerializerMethodField()
    http_host = SerializerMethodField()
    http_is_secure = SerializerMethodField()
    runtime = SerializerMethodField()
-    brand = SerializerMethodField()
+    tenant = SerializerMethodField()
    server_time = SerializerMethodField()
-    embedded_outpost_disabled = SerializerMethodField()
    embedded_outpost_host = SerializerMethodField()

    def get_http_headers(self, request: Request) -> dict[str, str]:

@@ -69,30 +61,22 @@ class SystemInfoSerializer(PassiveSerializer):
    def get_runtime(self, request: Request) -> RuntimeDict:
        """Get versions"""
        return {
-            "architecture": platform.machine(),
-            "authentik_version": get_full_version(),
-            "environment": get_env(),
-            "openssl_fips_enabled": (
-                backend._fips_enabled if LicenseKey.get_total().status().is_valid else None
-            ),
-            "openssl_version": OPENSSL_VERSION,
-            "platform": platform.platform(),
            "python_version": python_version,
+            "gunicorn_version": ".".join(str(x) for x in gunicorn_version),
+            "environment": get_env(),
+            "architecture": platform.machine(),
+            "platform": platform.platform(),
            "uname": " ".join(platform.uname()),
        }

-    def get_brand(self, request: Request) -> str:
-        """Currently active brand"""
-        return str(request._request.brand)
+    def get_tenant(self, request: Request) -> str:
+        """Currently active tenant"""
+        return str(request._request.tenant)

    def get_server_time(self, request: Request) -> datetime:
        """Current server time"""
        return now()

-    def get_embedded_outpost_disabled(self, request: Request) -> bool:
-        """Whether the embedded outpost is disabled"""
-        return CONFIG.get_bool("outposts.disable_embedded_outpost", False)

    def get_embedded_outpost_host(self, request: Request) -> str:
        """Get the FQDN configured on the embedded outpost"""
        outposts = Outpost.objects.filter(managed=MANAGED_OUTPOST)

@@ -107,14 +91,14 @@ class SystemView(APIView):
    permission_classes = [HasPermission("authentik_rbac.view_system_info")]
    pagination_class = None
    filter_backends = []
-    serializer_class = SystemInfoSerializer
+    serializer_class = SystemSerializer

-    @extend_schema(responses={200: SystemInfoSerializer(many=False)})
+    @extend_schema(responses={200: SystemSerializer(many=False)})
    def get(self, request: Request) -> Response:
        """Get system information."""
-        return Response(SystemInfoSerializer(request).data)
+        return Response(SystemSerializer(request).data)

-    @extend_schema(responses={200: SystemInfoSerializer(many=False)})
+    @extend_schema(responses={200: SystemSerializer(many=False)})
    def post(self, request: Request) -> Response:
        """Get system information."""
-        return Response(SystemInfoSerializer(request).data)
+        return Response(SystemSerializer(request).data)
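The serializer above backs the admin system endpoint that is registered further down in this compare as path("admin/system/", SystemView.as_view(), name="admin_system"). As a rough, non-authoritative sketch of consuming it, where the host, token, and the /api/v3/ prefix are assumptions rather than part of this diff:

    # Sketch only: query the endpoint the serializer above backs.
    # AUTHENTIK_URL and TOKEN are placeholders; /api/v3/admin/system/ is inferred from the
    # "admin/system/" route and the "api/" mountpoint shown later in this compare.
    import requests

    AUTHENTIK_URL = "https://authentik.local"
    TOKEN = "<api token whose user has authentik_rbac.view_system_info>"

    resp = requests.get(
        f"{AUTHENTIK_URL}/api/v3/admin/system/",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    info = resp.json()
    # 2023.10 exposes "tenant"; newer builds expose the same value as "brand".
    print(info["runtime"], info.get("tenant") or info.get("brand"))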


@@ -0,0 +1,134 @@
"""Tasks API"""
from importlib import import_module
from django.contrib import messages
from django.http.response import Http404
from django.utils.translation import gettext_lazy as _
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_schema
from rest_framework.decorators import action
from rest_framework.fields import (
CharField,
ChoiceField,
DateTimeField,
ListField,
SerializerMethodField,
)
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.viewsets import ViewSet
from structlog.stdlib import get_logger
from authentik.api.decorators import permission_required
from authentik.core.api.utils import PassiveSerializer
from authentik.events.monitored_tasks import TaskInfo, TaskResultStatus
from authentik.rbac.permissions import HasPermission
LOGGER = get_logger()
class TaskSerializer(PassiveSerializer):
"""Serialize TaskInfo and TaskResult"""
task_name = CharField()
task_description = CharField()
task_finish_timestamp = DateTimeField(source="finish_time")
task_duration = SerializerMethodField()
status = ChoiceField(
source="result.status.name",
choices=[(x.name, x.name) for x in TaskResultStatus],
)
messages = ListField(source="result.messages")
def get_task_duration(self, instance: TaskInfo) -> int:
"""Get the duration a task took to run"""
return max(instance.finish_timestamp - instance.start_timestamp, 0)
def to_representation(self, instance: TaskInfo):
"""When a new version of authentik adds fields to TaskInfo,
the API will fail with an AttributeError, as the classes
are pickled in cache. In that case, just delete the info"""
try:
return super().to_representation(instance)
# pylint: disable=broad-except
except Exception: # pragma: no cover
if isinstance(self.instance, list):
for inst in self.instance:
inst.delete()
else:
self.instance.delete()
return {}
class TaskViewSet(ViewSet):
"""Read-only view set that returns all background tasks"""
permission_classes = [HasPermission("authentik_rbac.view_system_tasks")]
serializer_class = TaskSerializer
@extend_schema(
responses={
200: TaskSerializer(many=False),
404: OpenApiResponse(description="Task not found"),
},
parameters=[
OpenApiParameter(
"id",
type=OpenApiTypes.STR,
location=OpenApiParameter.PATH,
required=True,
),
],
)
def retrieve(self, request: Request, pk=None) -> Response:
"""Get a single system task"""
task = TaskInfo.by_name(pk)
if not task:
raise Http404
return Response(TaskSerializer(task, many=False).data)
@extend_schema(responses={200: TaskSerializer(many=True)})
def list(self, request: Request) -> Response:
"""List system tasks"""
tasks = sorted(TaskInfo.all().values(), key=lambda task: task.task_name)
return Response(TaskSerializer(tasks, many=True).data)
@permission_required(None, ["authentik_rbac.run_system_tasks"])
@extend_schema(
request=OpenApiTypes.NONE,
responses={
204: OpenApiResponse(description="Task retried successfully"),
404: OpenApiResponse(description="Task not found"),
500: OpenApiResponse(description="Failed to retry task"),
},
parameters=[
OpenApiParameter(
"id",
type=OpenApiTypes.STR,
location=OpenApiParameter.PATH,
required=True,
),
],
)
@action(detail=True, methods=["post"])
def retry(self, request: Request, pk=None) -> Response:
"""Retry task"""
task = TaskInfo.by_name(pk)
if not task:
raise Http404
try:
task_module = import_module(task.task_call_module)
task_func = getattr(task_module, task.task_call_func)
LOGGER.debug("Running task", task=task_func)
task_func.delay(*task.task_call_args, **task.task_call_kwargs)
messages.success(
self.request,
_("Successfully re-scheduled Task %(name)s!" % {"name": task.task_name}),
)
return Response(status=204)
except (ImportError, AttributeError): # pragma: no cover
LOGGER.warning("Failed to run task, remove state", task=task)
# if we get an import error, the module path has probably changed
task.delete()
return Response(status=500)
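This viewset is wired up later in this compare as ("admin/system_tasks", TaskViewSet, "admin_system_tasks"). A hedged sketch of calling it over HTTP, assuming the default /api/v3/ mountpoint, DRF's default action routing, and a token with the permissions named above:

    # Sketch only: drive the viewset above over HTTP. BASE and the token are placeholders.
    import requests

    BASE = "https://authentik.local/api/v3"
    HEADERS = {"Authorization": "Bearer <token with view_system_tasks / run_system_tasks>"}

    tasks = requests.get(f"{BASE}/admin/system_tasks/", headers=HEADERS, timeout=10).json()
    for task in tasks:
        print(task["task_name"], task["status"], task["task_duration"])

    # 204 on success, 404 for unknown task names, 500 if the task's module path changed.
    requests.post(f"{BASE}/admin/system_tasks/clean_expired_models/retry/", headers=HEADERS, timeout=10)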


@@ -1,5 +1,4 @@
"""authentik administration overview"""
-
from django.core.cache import cache
from drf_spectacular.utils import extend_schema
from packaging.version import parse
@@ -10,9 +9,8 @@ from rest_framework.response import Response
from rest_framework.views import APIView

from authentik import __version__, get_build_hash
-from authentik.admin.tasks import VERSION_CACHE_KEY, VERSION_NULL, update_latest_version
+from authentik.admin.tasks import VERSION_CACHE_KEY, update_latest_version
from authentik.core.api.utils import PassiveSerializer
-from authentik.outposts.models import Outpost


class VersionSerializer(PassiveSerializer):
@@ -20,10 +18,8 @@ class VersionSerializer(PassiveSerializer):
    version_current = SerializerMethodField()
    version_latest = SerializerMethodField()
-    version_latest_valid = SerializerMethodField()
    build_hash = SerializerMethodField()
    outdated = SerializerMethodField()
-    outpost_outdated = SerializerMethodField()

    def get_build_hash(self, _) -> str:
        """Get build hash, if version is not latest or released"""
@@ -41,23 +37,10 @@
            return __version__
        return version_in_cache

-    def get_version_latest_valid(self, _) -> bool:
-        """Check if latest version is valid"""
-        return cache.get(VERSION_CACHE_KEY) != VERSION_NULL

    def get_outdated(self, instance) -> bool:
        """Check if we're running the latest version"""
        return parse(self.get_version_current(instance)) < parse(self.get_version_latest(instance))

-    def get_outpost_outdated(self, _) -> bool:
-        """Check if any outpost is outdated/has a version mismatch"""
-        any_outdated = False
-        for outpost in Outpost.objects.all():
-            for state in outpost.state:
-                if state.version_outdated:
-                    any_outdated = True
-        return any_outdated


class VersionView(APIView):
    """Get running and latest version."""


@@ -1,33 +0,0 @@
from rest_framework.permissions import IsAdminUser
from rest_framework.viewsets import ReadOnlyModelViewSet
from authentik.admin.models import VersionHistory
from authentik.core.api.utils import ModelSerializer
class VersionHistorySerializer(ModelSerializer):
"""VersionHistory Serializer"""
class Meta:
model = VersionHistory
fields = [
"id",
"timestamp",
"version",
"build",
]
class VersionHistoryViewSet(ReadOnlyModelViewSet):
"""VersionHistory Viewset"""
queryset = VersionHistory.objects.all()
serializer_class = VersionHistorySerializer
permission_classes = [IsAdminUser]
filterset_fields = [
"version",
"build",
]
search_fields = ["version", "build"]
ordering = ["-timestamp"]
pagination_class = None
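This viewset exists only on the newer side of the compare and is registered below as ("admin/version/history", VersionHistoryViewSet, "version_history"). A hedged sketch of reading it, with the /api/v3/ prefix, host, and token as assumptions:

    # Sketch: read recorded version history (newer side only; absent on 2023.10).
    import requests

    history = requests.get(
        "https://authentik.local/api/v3/admin/version/history/",
        headers={"Authorization": "Bearer <admin token>"},  # IsAdminUser is required above
        timeout=10,
    ).json()
    for entry in history:
        print(entry["timestamp"], entry["version"], entry["build"])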


@@ -1,5 +1,4 @@
"""authentik administration overview"""
-
from django.conf import settings
from drf_spectacular.utils import extend_schema, inline_serializer
from rest_framework.fields import IntegerField


@@ -1,5 +1,4 @@
"""authentik admin app config"""
-
from prometheus_client import Gauge, Info

from authentik.blueprints.apps import ManagedAppConfig
@@ -15,3 +14,7 @@ class AuthentikAdminConfig(ManagedAppConfig):
    label = "authentik_admin"
    verbose_name = "authentik Admin"
    default = True
+
+    def reconcile_load_admin_signals(self):
+        """Load admin signals"""
+        self.import_module("authentik.admin.signals")


@@ -1,22 +0,0 @@
"""authentik admin models"""
from django.db import models
from django.utils.translation import gettext_lazy as _
class VersionHistory(models.Model):
id = models.BigAutoField(primary_key=True)
timestamp = models.DateTimeField()
version = models.TextField()
build = models.TextField()
class Meta:
managed = False
db_table = "authentik_version_history"
ordering = ("-timestamp",)
verbose_name = _("Version history")
verbose_name_plural = _("Version history")
default_permissions = []
def __str__(self):
return f"{self.version}.{self.build} ({self.timestamp})"


@@ -1,5 +1,4 @@
"""authentik admin settings"""
-
from celery.schedules import crontab

from authentik.lib.utils.time import fqdn_rand


@@ -1,7 +1,7 @@
"""admin signals"""
-
from django.dispatch import receiver

+from authentik.admin.api.tasks import TaskInfo
from authentik.admin.apps import GAUGE_WORKERS
from authentik.root.celery import CELERY_APP
from authentik.root.monitoring import monitoring_set
@@ -12,3 +12,10 @@ def monitoring_set_workers(sender, **kwargs):
    """Set worker gauge"""
    count = len(CELERY_APP.control.ping(timeout=0.5))
    GAUGE_WORKERS.set(count)
+
+
+@receiver(monitoring_set)
+def monitoring_set_tasks(sender, **kwargs):
+    """Set task gauges"""
+    for task in TaskInfo.all().values():
+        task.update_metrics()


@@ -1,8 +1,9 @@
"""authentik admin tasks"""
+import re

from django.core.cache import cache
+from django.core.validators import URLValidator
from django.db import DatabaseError, InternalError, ProgrammingError
-from django.utils.translation import gettext_lazy as _
from packaging.version import parse
from requests import RequestException
from structlog.stdlib import get_logger
@@ -10,15 +11,21 @@ from structlog.stdlib import get_logger
from authentik import __version__, get_build_hash
from authentik.admin.apps import PROM_INFO
from authentik.events.models import Event, EventAction, Notification
-from authentik.events.system_tasks import SystemTask, TaskStatus, prefill_task
+from authentik.events.monitored_tasks import (
+    MonitoredTask,
+    TaskResult,
+    TaskResultStatus,
+    prefill_task,
+)
from authentik.lib.config import CONFIG
from authentik.lib.utils.http import get_http_session
from authentik.root.celery import CELERY_APP

LOGGER = get_logger()
-VERSION_NULL = "0.0.0"
VERSION_CACHE_KEY = "authentik_latest_version"
VERSION_CACHE_TIMEOUT = 8 * 60 * 60  # 8 hours
+# Chop of the first ^ because we want to search the entire string
+URL_FINDER = URLValidator.regex.pattern[1:]
LOCAL_VERSION = parse(__version__)
@@ -47,13 +54,13 @@ def clear_update_notifications():
        notification.delete()


-@CELERY_APP.task(bind=True, base=SystemTask)
+@CELERY_APP.task(bind=True, base=MonitoredTask)
@prefill_task
-def update_latest_version(self: SystemTask):
+def update_latest_version(self: MonitoredTask):
    """Update latest version info"""
    if CONFIG.get_bool("disable_update_check"):
-        cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT)
-        self.set_status(TaskStatus.WARNING, "Version check disabled.")
+        cache.set(VERSION_CACHE_KEY, "0.0.0", VERSION_CACHE_TIMEOUT)
+        self.set_status(TaskResult(TaskResultStatus.WARNING, messages=["Version check disabled."]))
        return
    try:
        response = get_http_session().get(
@@ -63,7 +70,9 @@ def update_latest_version(self: SystemTask):
        data = response.json()
        upstream_version = data.get("stable", {}).get("version")
        cache.set(VERSION_CACHE_KEY, upstream_version, VERSION_CACHE_TIMEOUT)
-        self.set_status(TaskStatus.SUCCESSFUL, "Successfully updated latest Version")
+        self.set_status(
+            TaskResult(TaskResultStatus.SUCCESSFUL, ["Successfully updated latest Version"])
+        )
        _set_prom_info()
        # Check if upstream version is newer than what we're running,
        # and if no event exists yet, create one.
@@ -74,19 +83,13 @@ def update_latest_version(self: SystemTask):
            context__new_version=upstream_version,
        ).exists():
            return
-        Event.new(
-            EventAction.UPDATE_AVAILABLE,
-            message=_(
-                "New version {version} available!".format(
-                    version=upstream_version,
-                )
-            ),
-            new_version=upstream_version,
-            changelog=data.get("stable", {}).get("changelog_url"),
-        ).save()
+        event_dict = {"new_version": upstream_version}
+        if match := re.search(URL_FINDER, data.get("stable", {}).get("changelog", "")):
+            event_dict["message"] = f"Changelog: {match.group()}"
+        Event.new(EventAction.UPDATE_AVAILABLE, **event_dict).save()
    except (RequestException, IndexError) as exc:
-        cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT)
-        self.set_error(exc)
+        cache.set(VERSION_CACHE_KEY, "0.0.0", VERSION_CACHE_TIMEOUT)
+        self.set_status(TaskResult(TaskResultStatus.ERROR).with_error(exc))
    _set_prom_info()
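On both sides of this hunk the fetched version lands in the cache under VERSION_CACHE_KEY, which the version serializer shown earlier reads back. A small sketch for a Django shell, using only names from this file and assuming a running Celery worker:

    # Sketch: queue the version check and read the cached result afterwards.
    from django.core.cache import cache

    from authentik.admin.tasks import VERSION_CACHE_KEY, update_latest_version

    update_latest_version.delay()
    print(cache.get(VERSION_CACHE_KEY))  # "0.0.0" while disabled/unset, otherwise the upstream version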


@ -1,5 +1,4 @@
"""test admin api""" """test admin api"""
from json import loads from json import loads
from django.test import TestCase from django.test import TestCase
@ -8,6 +7,8 @@ from django.urls import reverse
from authentik import __version__ from authentik import __version__
from authentik.blueprints.tests import reconcile_app from authentik.blueprints.tests import reconcile_app
from authentik.core.models import Group, User from authentik.core.models import Group, User
from authentik.core.tasks import clean_expired_models
from authentik.events.monitored_tasks import TaskResultStatus
from authentik.lib.generators import generate_id from authentik.lib.generators import generate_id
@ -22,6 +23,53 @@ class TestAdminAPI(TestCase):
self.group.save() self.group.save()
self.client.force_login(self.user) self.client.force_login(self.user)
def test_tasks(self):
"""Test Task API"""
clean_expired_models.delay()
response = self.client.get(reverse("authentik_api:admin_system_tasks-list"))
self.assertEqual(response.status_code, 200)
body = loads(response.content)
self.assertTrue(any(task["task_name"] == "clean_expired_models" for task in body))
def test_tasks_single(self):
"""Test Task API (read single)"""
clean_expired_models.delay()
response = self.client.get(
reverse(
"authentik_api:admin_system_tasks-detail",
kwargs={"pk": "clean_expired_models"},
)
)
self.assertEqual(response.status_code, 200)
body = loads(response.content)
self.assertEqual(body["status"], TaskResultStatus.SUCCESSFUL.name)
self.assertEqual(body["task_name"], "clean_expired_models")
response = self.client.get(
reverse("authentik_api:admin_system_tasks-detail", kwargs={"pk": "qwerqwer"})
)
self.assertEqual(response.status_code, 404)
def test_tasks_retry(self):
"""Test Task API (retry)"""
clean_expired_models.delay()
response = self.client.post(
reverse(
"authentik_api:admin_system_tasks-retry",
kwargs={"pk": "clean_expired_models"},
)
)
self.assertEqual(response.status_code, 204)
def test_tasks_retry_404(self):
"""Test Task API (retry, 404)"""
response = self.client.post(
reverse(
"authentik_api:admin_system_tasks-retry",
kwargs={"pk": "qwerqewrqrqewrqewr"},
)
)
self.assertEqual(response.status_code, 404)
def test_version(self): def test_version(self):
"""Test Version API""" """Test Version API"""
response = self.client.get(reverse("authentik_api:admin_version")) response = self.client.get(reverse("authentik_api:admin_version"))


@ -1,5 +1,4 @@
"""test admin tasks""" """test admin tasks"""
from django.core.cache import cache from django.core.cache import cache
from django.test import TestCase from django.test import TestCase
from requests_mock import Mocker from requests_mock import Mocker
@ -17,7 +16,6 @@ RESPONSE_VALID = {
"stable": { "stable": {
"version": "99999999.9999999", "version": "99999999.9999999",
"changelog": "See https://goauthentik.io/test", "changelog": "See https://goauthentik.io/test",
"changelog_url": "https://goauthentik.io/test",
"reason": "bugfix", "reason": "bugfix",
}, },
} }
@ -36,7 +34,7 @@ class TestAdminTasks(TestCase):
Event.objects.filter( Event.objects.filter(
action=EventAction.UPDATE_AVAILABLE, action=EventAction.UPDATE_AVAILABLE,
context__new_version="99999999.9999999", context__new_version="99999999.9999999",
context__message="New version 99999999.9999999 available!", context__message="Changelog: https://goauthentik.io/test",
).exists() ).exists()
) )
# test that a consecutive check doesn't create a duplicate event # test that a consecutive check doesn't create a duplicate event
@ -46,7 +44,7 @@ class TestAdminTasks(TestCase):
Event.objects.filter( Event.objects.filter(
action=EventAction.UPDATE_AVAILABLE, action=EventAction.UPDATE_AVAILABLE,
context__new_version="99999999.9999999", context__new_version="99999999.9999999",
context__message="New version 99999999.9999999 available!", context__message="Changelog: https://goauthentik.io/test",
) )
), ),
1, 1,


@@ -1,15 +1,15 @@
"""API URLs"""
-
from django.urls import path

from authentik.admin.api.meta import AppsViewSet, ModelViewSet
from authentik.admin.api.metrics import AdministrationMetricsViewSet
from authentik.admin.api.system import SystemView
+from authentik.admin.api.tasks import TaskViewSet
from authentik.admin.api.version import VersionView
-from authentik.admin.api.version_history import VersionHistoryViewSet
from authentik.admin.api.workers import WorkerView

api_urlpatterns = [
+    ("admin/system_tasks", TaskViewSet, "admin_system_tasks"),
    ("admin/apps", AppsViewSet, "apps"),
    ("admin/models", ModelViewSet, "models"),
    path(
@@ -18,7 +18,6 @@ api_urlpatterns = [
        name="admin_metrics",
    ),
    path("admin/version/", VersionView.as_view(), name="admin_version"),
-    ("admin/version/history", VersionHistoryViewSet, "version_history"),
    path("admin/workers/", WorkerView.as_view(), name="admin_workers"),
    path("admin/system/", SystemView.as_view(), name="admin_system"),
]


@@ -10,3 +10,26 @@
label = "authentik_api" label = "authentik_api"
mountpoint = "api/" mountpoint = "api/"
verbose_name = "authentik API" verbose_name = "authentik API"
def ready(self) -> None:
from drf_spectacular.extensions import OpenApiAuthenticationExtension
from authentik.api.authentication import TokenAuthentication
# Class is defined here as it needs to be created early enough that drf-spectacular will
# find it, but also won't cause any import issues
# pylint: disable=unused-variable
class TokenSchema(OpenApiAuthenticationExtension):
"""Auth schema"""
target_class = TokenAuthentication
name = "authentik"
def get_security_definition(self, auto_schema):
"""Auth schema"""
return {
"type": "apiKey",
"in": "header",
"name": "Authorization",
"scheme": "bearer",
}
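Either placement registers an OpenAPI security scheme named "authentik"; roughly the fragment below ends up under components.securitySchemes in the generated schema (shape inferred from the return value above, not copied from a generated schema):

    # Approximate securitySchemes entry produced by the extension above (2023.10 variant).
    security_schemes = {
        "authentik": {
            "type": "apiKey",
            "in": "header",
            "name": "Authorization",
            "scheme": "bearer",
        }
    }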


@@ -1,10 +1,8 @@
"""API Authentication"""
-
from hmac import compare_digest
-from typing import Any
+from typing import Any, Optional

from django.conf import settings
-from drf_spectacular.extensions import OpenApiAuthenticationExtension
from rest_framework.authentication import BaseAuthentication, get_authorization_header
from rest_framework.exceptions import AuthenticationFailed
from rest_framework.request import Request
@@ -18,7 +16,7 @@ from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
LOGGER = get_logger()


-def validate_auth(header: bytes) -> str | None:
+def validate_auth(header: bytes) -> Optional[str]:
    """Validate that the header is in a correct format,
    returns type and credentials"""
    auth_credentials = header.decode().strip()
@@ -33,7 +31,7 @@ def validate_auth(header: bytes) -> str | None:
    return auth_credentials


-def bearer_auth(raw_header: bytes) -> User | None:
+def bearer_auth(raw_header: bytes) -> Optional[User]:
    """raw_header in the Format of `Bearer ....`"""
    user = auth_user_lookup(raw_header)
    if not user:
@@ -43,7 +41,7 @@ def bearer_auth(raw_header: bytes) -> User | None:
    return user


-def auth_user_lookup(raw_header: bytes) -> User | None:
+def auth_user_lookup(raw_header: bytes) -> Optional[User]:
    """raw_header in the Format of `Bearer ....`"""
    from authentik.providers.oauth2.models import AccessToken
@@ -76,7 +74,7 @@ def auth_user_lookup(raw_header: bytes) -> User | None:
    raise AuthenticationFailed("Token invalid/expired")


-def token_secret_key(value: str) -> User | None:
+def token_secret_key(value: str) -> Optional[User]:
    """Check if the token is the secret key
    and return the service account for the managed outpost"""
    from authentik.outposts.apps import MANAGED_OUTPOST
@@ -103,14 +101,3 @@ class TokenAuthentication(BaseAuthentication):
            return None
        return (user, None)  # pragma: no cover
-
-
-class TokenSchema(OpenApiAuthenticationExtension):
-    """Auth schema"""
-
-    target_class = TokenAuthentication
-    name = "authentik"
-
-    def get_security_definition(self, auto_schema):
-        """Auth schema"""
-        return {"type": "http", "scheme": "bearer"}
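The tests further down in this compare call these helpers directly; a condensed sketch of the same behaviour, assuming a Django test context where `token` is a pre-created API token:

    # Sketch mirroring the API auth tests shown later; `token` is assumed to exist.
    from rest_framework.exceptions import AuthenticationFailed

    from authentik.api.authentication import bearer_auth

    user = bearer_auth(f"Bearer {token.key}".encode())  # valid token -> User
    assert user is not None
    assert bearer_auth(b"Bearer ") is None               # empty credentials -> None
    try:
        bearer_auth(b"foo bar")                          # wrong scheme
    except AuthenticationFailed:
        pass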


@@ -1,5 +1,4 @@
"""API Authorization"""
-
from django.conf import settings
from django.db.models import Model
from django.db.models.query import QuerySet


@@ -1,7 +1,6 @@
"""API Decorators"""
-from collections.abc import Callable
from functools import wraps
+from typing import Callable, Optional

from rest_framework.request import Request
from rest_framework.response import Response
@@ -11,26 +10,21 @@ from structlog.stdlib import get_logger
LOGGER = get_logger()


-def permission_required(obj_perm: str | None = None, global_perms: list[str] | None = None):
+def permission_required(obj_perm: Optional[str] = None, global_perms: Optional[list[str]] = None):
    """Check permissions for a single custom action"""

-    def _check_obj_perm(self: ModelViewSet, request: Request):
-        # Check obj_perm both globally and on the specific object
-        # Having the global permission has higher priority
-        if request.user.has_perm(obj_perm):
-            return
-        obj = self.get_object()
-        if not request.user.has_perm(obj_perm, obj):
-            LOGGER.debug("denying access for object", user=request.user, perm=obj_perm, obj=obj)
-            self.permission_denied(request)

-    def wrapper_outer(func: Callable):
+    def wrapper_outter(func: Callable):
        """Check permissions for a single custom action"""

        @wraps(func)
        def wrapper(self: ModelViewSet, request: Request, *args, **kwargs) -> Response:
            if obj_perm:
-                _check_obj_perm(self, request)
+                obj = self.get_object()
+                if not request.user.has_perm(obj_perm, obj):
+                    LOGGER.debug(
+                        "denying access for object", user=request.user, perm=obj_perm, obj=obj
+                    )
+                    return self.permission_denied(request)
            if global_perms:
                for other_perm in global_perms:
                    if not request.user.has_perm(other_perm):
@@ -40,4 +34,4 @@ def permission_required(obj_perm: str | None = None, global_perms: list[str] | N
        return wrapper

-    return wrapper_outer
+    return wrapper_outter
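Call sites look the same on both sides; the admin tasks API above uses @permission_required(None, ["authentik_rbac.run_system_tasks"]), and on the newer side the decorator lives at authentik.rbac.decorators instead. A purely illustrative sketch of the object-permission variant on a custom action (the viewset is hypothetical, not from this diff):

    # Illustrative only: guard a custom viewset action with an object-level permission.
    from rest_framework.decorators import action
    from rest_framework.request import Request
    from rest_framework.response import Response
    from rest_framework.viewsets import ModelViewSet

    from authentik.api.decorators import permission_required


    class ExampleViewSet(ModelViewSet):  # hypothetical viewset
        @permission_required("authentik_core.view_application")
        @action(detail=True, methods=["get"])
        def metrics(self, request: Request, pk=None) -> Response:
            # the decorator calls self.get_object() and checks the permission against it
            return Response({"object": str(self.get_object())})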


@@ -1,5 +1,4 @@
"""Pagination which includes total pages and current page"""
-
from rest_framework import pagination
from rest_framework.response import Response


@ -1,5 +1,4 @@
"""Error Response schema, from https://github.com/axnsan12/drf-yasg/issues/224""" """Error Response schema, from https://github.com/axnsan12/drf-yasg/issues/224"""
from django.utils.translation import gettext_lazy as _ from django.utils.translation import gettext_lazy as _
from drf_spectacular.generators import SchemaGenerator from drf_spectacular.generators import SchemaGenerator
from drf_spectacular.plumbing import ( from drf_spectacular.plumbing import (
@ -12,7 +11,6 @@ from drf_spectacular.settings import spectacular_settings
from drf_spectacular.types import OpenApiTypes from drf_spectacular.types import OpenApiTypes
from rest_framework.settings import api_settings from rest_framework.settings import api_settings
from authentik.api.apps import AuthentikAPIConfig
from authentik.api.pagination import PAGINATION_COMPONENT_NAME, PAGINATION_SCHEMA from authentik.api.pagination import PAGINATION_COMPONENT_NAME, PAGINATION_SCHEMA
@ -102,12 +100,3 @@ def postprocess_schema_responses(result, generator: SchemaGenerator, **kwargs):
comp = result["components"]["schemas"][component] comp = result["components"]["schemas"][component]
comp["additionalProperties"] = {} comp["additionalProperties"] = {}
return result return result
def preprocess_schema_exclude_non_api(endpoints, **kwargs):
"""Filter out all API Views which are not mounted under /api"""
return [
(path, path_regex, method, callback)
for path, path_regex, method, callback in endpoints
if path.startswith("/" + AuthentikAPIConfig.mountpoint)
]


@ -1,13 +1,13 @@
{% extends "base/skeleton.html" %} {% extends "base/skeleton.html" %}
{% load authentik_core %} {% load static %}
{% block title %} {% block title %}
API Browser - {{ brand.branding_title }} API Browser - {{ tenant.branding_title }}
{% endblock %} {% endblock %}
{% block head %} {% block head %}
<script src="{% versioned_script 'dist/standalone/api-browser/index-%v.js' %}" type="module"></script> <script src="{% static 'dist/standalone/api-browser/index.js' %}?version={{ version }}" type="module"></script>
<meta name="theme-color" content="#151515" media="(prefers-color-scheme: light)"> <meta name="theme-color" content="#151515" media="(prefers-color-scheme: light)">
<meta name="theme-color" content="#151515" media="(prefers-color-scheme: dark)"> <meta name="theme-color" content="#151515" media="(prefers-color-scheme: dark)">
{% endblock %} {% endblock %}


@ -1,5 +1,4 @@
"""Test API Authentication""" """Test API Authentication"""
import json import json
from base64 import b64encode from base64 import b64encode
@ -13,8 +12,6 @@ from authentik.blueprints.tests import reconcile_app
from authentik.core.models import Token, TokenIntents, User, UserTypes from authentik.core.models import Token, TokenIntents, User, UserTypes
from authentik.core.tests.utils import create_test_admin_user, create_test_flow from authentik.core.tests.utils import create_test_admin_user, create_test_flow
from authentik.lib.generators import generate_id from authentik.lib.generators import generate_id
from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.models import Outpost
from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
from authentik.providers.oauth2.models import AccessToken, OAuth2Provider from authentik.providers.oauth2.models import AccessToken, OAuth2Provider
@ -25,17 +22,17 @@ class TestAPIAuth(TestCase):
def test_invalid_type(self): def test_invalid_type(self):
"""Test invalid type""" """Test invalid type"""
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
bearer_auth(b"foo bar") bearer_auth("foo bar".encode())
def test_invalid_empty(self): def test_invalid_empty(self):
"""Test invalid type""" """Test invalid type"""
self.assertIsNone(bearer_auth(b"Bearer ")) self.assertIsNone(bearer_auth("Bearer ".encode()))
self.assertIsNone(bearer_auth(b"")) self.assertIsNone(bearer_auth("".encode()))
def test_invalid_no_token(self): def test_invalid_no_token(self):
"""Test invalid with no token""" """Test invalid with no token"""
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
auth = b64encode(b":abc").decode() auth = b64encode(":abc".encode()).decode()
self.assertIsNone(bearer_auth(f"Basic :{auth}".encode())) self.assertIsNone(bearer_auth(f"Basic :{auth}".encode()))
def test_bearer_valid(self): def test_bearer_valid(self):
@ -52,12 +49,8 @@ class TestAPIAuth(TestCase):
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
bearer_auth(f"Bearer {token.key}".encode()) bearer_auth(f"Bearer {token.key}".encode())
@reconcile_app("authentik_outposts") def test_managed_outpost(self):
def test_managed_outpost_fail(self):
"""Test managed outpost""" """Test managed outpost"""
outpost = Outpost.objects.filter(managed=MANAGED_OUTPOST).first()
outpost.user.delete()
outpost.delete()
with self.assertRaises(AuthenticationFailed): with self.assertRaises(AuthenticationFailed):
bearer_auth(f"Bearer {settings.SECRET_KEY}".encode()) bearer_auth(f"Bearer {settings.SECRET_KEY}".encode())


@@ -1,5 +1,4 @@
"""Test config API"""
-
from json import loads

from django.urls import reverse


@@ -0,0 +1,34 @@
"""test decorators api"""
from django.urls import reverse
from guardian.shortcuts import assign_perm
from rest_framework.test import APITestCase
from authentik.core.models import Application, User
from authentik.lib.generators import generate_id
class TestAPIDecorators(APITestCase):
"""test decorators api"""
def setUp(self) -> None:
super().setUp()
self.user = User.objects.create(username="test-user")
def test_obj_perm_denied(self):
"""Test object perm denied"""
self.client.force_login(self.user)
app = Application.objects.create(name=generate_id(), slug=generate_id())
response = self.client.get(
reverse("authentik_api:application-metrics", kwargs={"slug": app.slug})
)
self.assertEqual(response.status_code, 403)
def test_other_perm_denied(self):
"""Test other perm denied"""
self.client.force_login(self.user)
app = Application.objects.create(name=generate_id(), slug=generate_id())
assign_perm("authentik_core.view_application", self.user, app)
response = self.client.get(
reverse("authentik_api:application-metrics", kwargs={"slug": app.slug})
)
self.assertEqual(response.status_code, 403)


@@ -1,5 +1,4 @@
"""Schema generation tests"""
-
from django.urls import reverse
from rest_framework.test import APITestCase
from yaml import safe_load


@@ -1,6 +1,5 @@
"""authentik API Modelviewset tests"""
-
-from collections.abc import Callable
+from typing import Callable

from django.test import TestCase
from rest_framework.viewsets import ModelViewSet, ReadOnlyModelViewSet
@@ -26,6 +25,6 @@ def viewset_tester_factory(test_viewset: type[ModelViewSet]) -> Callable:
for _, viewset, _ in router.registry:
-    if not issubclass(viewset, ModelViewSet | ReadOnlyModelViewSet):
+    if not issubclass(viewset, (ModelViewSet, ReadOnlyModelViewSet)):
        continue
    setattr(TestModelViewSets, f"test_viewset_{viewset.__name__}", viewset_tester_factory(viewset))


@@ -1,5 +1,4 @@
"""authentik api urls"""
-
from django.urls import include, path

from authentik.api.v3.urls import urlpatterns as v3_urls


@@ -1,5 +1,4 @@
"""core Configs API"""
-
from pathlib import Path

from django.conf import settings
@@ -20,7 +19,7 @@ from rest_framework.response import Response
from rest_framework.views import APIView

from authentik.core.api.utils import PassiveSerializer
-from authentik.events.context_processors.base import get_context_processors
+from authentik.events.geo import GEOIP_READER
from authentik.lib.config import CONFIG

capabilities = Signal()
@@ -31,7 +30,6 @@ class Capabilities(models.TextChoices):
    CAN_SAVE_MEDIA = "can_save_media"
    CAN_GEO_IP = "can_geo_ip"
-    CAN_ASN = "can_asn"
    CAN_IMPERSONATE = "can_impersonate"
    CAN_DEBUG = "can_debug"
    IS_ENTERPRISE = "is_enterprise"
@@ -68,16 +66,11 @@ class ConfigView(APIView):
        """Get all capabilities this server instance supports"""
        caps = []
        deb_test = settings.DEBUG or settings.TEST
-        if (
-            CONFIG.get("storage.media.backend", "file") == "s3"
-            or Path(settings.STORAGES["default"]["OPTIONS"]["location"]).is_mount()
-            or deb_test
-        ):
+        if Path(settings.MEDIA_ROOT).is_mount() or deb_test:
            caps.append(Capabilities.CAN_SAVE_MEDIA)
-        for processor in get_context_processors():
-            if cap := processor.capability():
-                caps.append(cap)
-        if self.request.tenant.impersonation:
+        if GEOIP_READER.enabled:
+            caps.append(Capabilities.CAN_GEO_IP)
+        if CONFIG.get_bool("impersonation"):
            caps.append(Capabilities.CAN_IMPERSONATE)
        if settings.DEBUG:  # pragma: no cover
            caps.append(Capabilities.CAN_DEBUG)
@@ -100,10 +93,10 @@ class ConfigView(APIView):
                    "traces_sample_rate": float(CONFIG.get("error_reporting.sample_rate", 0.4)),
                },
                "capabilities": self.get_capabilities(),
-                "cache_timeout": CONFIG.get_int("cache.timeout"),
-                "cache_timeout_flows": CONFIG.get_int("cache.timeout_flows"),
-                "cache_timeout_policies": CONFIG.get_int("cache.timeout_policies"),
-                "cache_timeout_reputation": CONFIG.get_int("cache.timeout_reputation"),
+                "cache_timeout": CONFIG.get_int("redis.cache_timeout"),
+                "cache_timeout_flows": CONFIG.get_int("redis.cache_timeout_flows"),
+                "cache_timeout_policies": CONFIG.get_int("redis.cache_timeout_policies"),
+                "cache_timeout_reputation": CONFIG.get_int("redis.cache_timeout_reputation"),
            }
        )
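The capability list ends up in the public config endpoint; a sketch of checking it from a client, where the /api/v3/root/config/ path and anonymous access are assumptions rather than something this diff shows:

    # Sketch: read reported capabilities. Host and path are placeholders/assumptions;
    # the capability values come from the choices shown above.
    import requests

    config = requests.get("https://authentik.local/api/v3/root/config/", timeout=10).json()
    caps = set(config.get("capabilities", []))
    if "can_save_media" in caps:
        print("media uploads available")
    if "can_geo_ip" in caps:
        print("GeoIP enrichment available")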


@@ -1,5 +1,4 @@
"""api v3 urls"""
-
from importlib import import_module

from django.urls import path
@@ -33,7 +32,7 @@ for _authentik_app in get_apps():
            app_name=_authentik_app.name,
        )
        continue
-    urls: list = api_urls.api_urlpatterns
+    urls: list = getattr(api_urls, "api_urlpatterns")
    for url in urls:
        if isinstance(url, URLPattern):
            _other_urls.append(url)


@@ -1,5 +1,4 @@
"""General API Views"""
-
from typing import Any

from django.urls import reverse


@ -1,22 +1,21 @@
"""Serializer mixin for managed models""" """Serializer mixin for managed models"""
from django.utils.translation import gettext_lazy as _ from django.utils.translation import gettext_lazy as _
from drf_spectacular.utils import extend_schema, inline_serializer from drf_spectacular.utils import extend_schema, inline_serializer
from rest_framework.decorators import action from rest_framework.decorators import action
from rest_framework.exceptions import ValidationError from rest_framework.exceptions import ValidationError
from rest_framework.fields import CharField, DateTimeField from rest_framework.fields import CharField, DateTimeField, JSONField
from rest_framework.request import Request from rest_framework.request import Request
from rest_framework.response import Response from rest_framework.response import Response
from rest_framework.serializers import ListSerializer, ModelSerializer from rest_framework.serializers import ListSerializer, ModelSerializer
from rest_framework.viewsets import ModelViewSet from rest_framework.viewsets import ModelViewSet
from authentik.api.decorators import permission_required
from authentik.blueprints.models import BlueprintInstance from authentik.blueprints.models import BlueprintInstance
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
from authentik.blueprints.v1.oci import OCI_PREFIX from authentik.blueprints.v1.oci import OCI_PREFIX
from authentik.blueprints.v1.tasks import apply_blueprint, blueprints_find_dict from authentik.blueprints.v1.tasks import apply_blueprint, blueprints_find_dict
from authentik.core.api.used_by import UsedByMixin from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.utils import JSONDictField, PassiveSerializer from authentik.core.api.utils import PassiveSerializer
from authentik.rbac.decorators import permission_required
class ManagedSerializer: class ManagedSerializer:
@ -29,7 +28,7 @@ class MetadataSerializer(PassiveSerializer):
"""Serializer for blueprint metadata""" """Serializer for blueprint metadata"""
name = CharField() name = CharField()
labels = JSONDictField() labels = JSONField()
class BlueprintInstanceSerializer(ModelSerializer): class BlueprintInstanceSerializer(ModelSerializer):
@ -51,12 +50,8 @@ class BlueprintInstanceSerializer(ModelSerializer):
context = self.instance.context if self.instance else {} context = self.instance.context if self.instance else {}
valid, logs = Importer.from_string(content, context).validate() valid, logs = Importer.from_string(content, context).validate()
if not valid: if not valid:
raise ValidationError( text_logs = "\n".join([x["event"] for x in logs])
[ raise ValidationError(_("Failed to validate blueprint: %(logs)s" % {"logs": text_logs}))
_("Failed to validate blueprint"),
*[f"- {x.event}" for x in logs],
]
)
return content return content
def validate(self, attrs: dict) -> dict: def validate(self, attrs: dict) -> dict:


@@ -1,6 +1,5 @@
"""authentik Blueprints app"""
-
-from collections.abc import Callable
from importlib import import_module
from inspect import ismethod
@@ -8,100 +7,40 @@ from django.apps import AppConfig
from django.db import DatabaseError, InternalError, ProgrammingError
from structlog.stdlib import BoundLogger, get_logger

-from authentik.root.signals import startup


class ManagedAppConfig(AppConfig):
    """Basic reconciliation logic for apps"""

-    logger: BoundLogger
+    _logger: BoundLogger

-    RECONCILE_GLOBAL_CATEGORY: str = "global"
-    RECONCILE_TENANT_CATEGORY: str = "tenant"

    def __init__(self, app_name: str, *args, **kwargs) -> None:
        super().__init__(app_name, *args, **kwargs)
-        self.logger = get_logger().bind(app_name=app_name)
+        self._logger = get_logger().bind(app_name=app_name)

    def ready(self) -> None:
-        self.import_related()
-        startup.connect(self._on_startup_callback, dispatch_uid=self.label)
+        self.reconcile()
        return super().ready()

-    def _on_startup_callback(self, sender, **_):
-        self._reconcile_global()
-        self._reconcile_tenant()
-
-    def import_related(self):
-        """Automatically import related modules which rely on just being imported
-        to register themselves (mainly django signals and celery tasks)"""
-
-        def import_relative(rel_module: str):
-            try:
-                module_name = f"{self.name}.{rel_module}"
-                import_module(module_name)
-                self.logger.info("Imported related module", module=module_name)
-            except ModuleNotFoundError:
-                pass
-
-        import_relative("checks")
-        import_relative("tasks")
-        import_relative("signals")
-
    def import_module(self, path: str):
        """Load module"""
        import_module(path)

-    def _reconcile(self, prefix: str) -> None:
+    def reconcile(self) -> None:
+        """reconcile ourselves"""
+        prefix = "reconcile_"
        for meth_name in dir(self):
            meth = getattr(self, meth_name)
            if not ismethod(meth):
                continue
-            category = getattr(meth, "_authentik_managed_reconcile", None)
-            if category != prefix:
+            if not meth_name.startswith(prefix):
                continue
            name = meth_name.replace(prefix, "")
            try:
-                self.logger.debug("Starting reconciler", name=name)
+                self._logger.debug("Starting reconciler", name=name)
                meth()
-                self.logger.debug("Successfully reconciled", name=name)
+                self._logger.debug("Successfully reconciled", name=name)
            except (DatabaseError, ProgrammingError, InternalError) as exc:
-                self.logger.warning("Failed to run reconcile", name=name, exc=exc)
+                self._logger.warning("Failed to run reconcile", name=name, exc=exc)

-    @staticmethod
-    def reconcile_tenant(func: Callable):
-        """Mark a function to be called on startup (for each tenant)"""
-        func._authentik_managed_reconcile = ManagedAppConfig.RECONCILE_TENANT_CATEGORY
-        return func
-
-    @staticmethod
-    def reconcile_global(func: Callable):
-        """Mark a function to be called on startup (globally)"""
-        func._authentik_managed_reconcile = ManagedAppConfig.RECONCILE_GLOBAL_CATEGORY
-        return func
-
-    def _reconcile_tenant(self) -> None:
-        """reconcile ourselves for tenanted methods"""
-        from authentik.tenants.models import Tenant
-
-        try:
-            tenants = list(Tenant.objects.filter(ready=True))
-        except (DatabaseError, ProgrammingError, InternalError) as exc:
-            self.logger.debug("Failed to get tenants to run reconcile", exc=exc)
-            return
-        for tenant in tenants:
-            with tenant:
-                self._reconcile(self.RECONCILE_TENANT_CATEGORY)
-
-    def _reconcile_global(self) -> None:
-        """
-        reconcile ourselves for global methods.
-        Used for signals, tasks, etc. Database queries should not be made in here.
-        """
-        from django_tenants.utils import get_public_schema_name, schema_context
-
-        with schema_context(get_public_schema_name()):
-            self._reconcile(self.RECONCILE_GLOBAL_CATEGORY)


class AuthentikBlueprintsConfig(ManagedAppConfig):
@@ -112,13 +51,11 @@ class AuthentikBlueprintsConfig(ManagedAppConfig):
    verbose_name = "authentik Blueprints"
    default = True

-    @ManagedAppConfig.reconcile_global
-    def load_blueprints_v1_tasks(self):
+    def reconcile_load_blueprints_v1_tasks(self):
        """Load v1 tasks"""
        self.import_module("authentik.blueprints.v1.tasks")

-    @ManagedAppConfig.reconcile_tenant
-    def blueprints_discovery(self):
+    def reconcile_blueprints_discovery(self):
        """Run blueprint discovery"""
        from authentik.blueprints.v1.tasks import blueprints_discovery, clear_failed_blueprints
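For app authors, the practical difference is how reconcilers are declared: the 2023.10 side discovers any method whose name starts with reconcile_ at ready(), while the newer side marks methods with decorators and splits them into global and per-tenant runs at startup. A hypothetical config class sketching both conventions:

    # Hypothetical app config showing the two conventions from the diff above.
    from authentik.blueprints.apps import ManagedAppConfig


    class AuthentikExampleConfig(ManagedAppConfig):
        name = "authentik.example"
        label = "authentik_example"
        default = True

        # 2023.10 side: discovered purely by the "reconcile_" name prefix at ready().
        def reconcile_load_example_signals(self):
            """Load example signals"""
            self.import_module("authentik.example.signals")

        # Newer side of this compare: explicit decorator, run once globally at startup.
        # @ManagedAppConfig.reconcile_global
        # def load_example_signals(self):
        #     self.import_module("authentik.example.signals")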


@ -1,5 +1,4 @@
"""Apply blueprint from commandline""" """Apply blueprint from commandline"""
from sys import exit as sys_exit from sys import exit as sys_exit
from django.core.management.base import BaseCommand, no_translations from django.core.management.base import BaseCommand, no_translations
@ -7,7 +6,6 @@ from structlog.stdlib import get_logger
from authentik.blueprints.models import BlueprintInstance from authentik.blueprints.models import BlueprintInstance
from authentik.blueprints.v1.importer import Importer from authentik.blueprints.v1.importer import Importer
from authentik.tenants.models import Tenant
LOGGER = get_logger() LOGGER = get_logger()
@ -18,18 +16,14 @@ class Command(BaseCommand):
@no_translations @no_translations
def handle(self, *args, **options): def handle(self, *args, **options):
"""Apply all blueprints in order, abort when one fails to import""" """Apply all blueprints in order, abort when one fails to import"""
for tenant in Tenant.objects.filter(ready=True): for blueprint_path in options.get("blueprints", []):
with tenant: content = BlueprintInstance(path=blueprint_path).retrieve()
for blueprint_path in options.get("blueprints", []): importer = Importer.from_string(content)
content = BlueprintInstance(path=blueprint_path).retrieve() valid, _ = importer.validate()
importer = Importer.from_string(content) if not valid:
valid, logs = importer.validate() self.stderr.write("blueprint invalid")
if not valid: sys_exit(1)
self.stderr.write("Blueprint invalid") importer.apply()
for log in logs:
self.stderr.write(f"\t{log.logger}: {log.event}: {log.attributes}")
sys_exit(1)
importer.apply()
def add_arguments(self, parser): def add_arguments(self, parser):
parser.add_argument("blueprints", nargs="+", type=str) parser.add_argument("blueprints", nargs="+", type=str)
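Both variants are invoked the same way as a Django management command; only the per-tenant loop differs. A minimal sketch, assuming the command module is named apply_blueprint (the filename is not visible in this compare) and using a placeholder blueprint path:

    # Sketch: run the command programmatically; path and command name are assumptions.
    from django.core.management import call_command

    call_command("apply_blueprint", "/blueprints/custom/example.yaml")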


@@ -1,19 +1,17 @@
"""Export blueprint of current authentik install"""
-
-from django.core.management.base import no_translations
+from django.core.management.base import BaseCommand, no_translations
from structlog.stdlib import get_logger

from authentik.blueprints.v1.exporter import Exporter
-from authentik.tenants.management import TenantCommand

LOGGER = get_logger()


-class Command(TenantCommand):
+class Command(BaseCommand):
    """Export blueprint of current authentik install"""

    @no_translations
-    def handle_per_tenant(self, *args, **options):
+    def handle(self, *args, **options):
        """Export blueprint of current authentik install"""
        exporter = Exporter()
        self.stdout.write(exporter.export_to_string())


@ -1,17 +1,14 @@
"""Generate JSON Schema for blueprints""" """Generate JSON Schema for blueprints"""
from json import dumps from json import dumps
from typing import Any from typing import Any
from django.core.management.base import BaseCommand, no_translations from django.core.management.base import BaseCommand, no_translations
from django.db.models import Model, fields from django.db.models import Model
from drf_jsonschema_serializer.convert import converter, field_to_converter from drf_jsonschema_serializer.convert import field_to_converter
from rest_framework.fields import Field, JSONField, UUIDField from rest_framework.fields import Field, JSONField, UUIDField
from rest_framework.relations import PrimaryKeyRelatedField
from rest_framework.serializers import Serializer from rest_framework.serializers import Serializer
from structlog.stdlib import get_logger from structlog.stdlib import get_logger
from authentik import __version__
from authentik.blueprints.v1.common import BlueprintEntryDesiredState from authentik.blueprints.v1.common import BlueprintEntryDesiredState
from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT, is_model_allowed from authentik.blueprints.v1.importer import SERIALIZER_CONTEXT_BLUEPRINT, is_model_allowed
from authentik.blueprints.v1.meta.registry import BaseMetaModel, registry from authentik.blueprints.v1.meta.registry import BaseMetaModel, registry
@ -20,23 +17,6 @@ from authentik.lib.models import SerializerModel
LOGGER = get_logger() LOGGER = get_logger()
@converter
class PrimaryKeyRelatedFieldConverter:
"""Custom primary key field converter which is aware of non-integer based PKs
This is not an exhaustive fix for other non-int PKs, however in authentik we either
use UUIDs or ints"""
field_class = PrimaryKeyRelatedField
def convert(self, field: PrimaryKeyRelatedField):
model: Model = field.queryset.model
pk_field = model._meta.pk
if isinstance(pk_field, fields.UUIDField):
return {"type": "string", "format": "uuid"}
return {"type": "integer"}
class Command(BaseCommand): class Command(BaseCommand):
"""Generate JSON Schema for blueprints""" """Generate JSON Schema for blueprints"""
@ -48,7 +28,7 @@ class Command(BaseCommand):
"$schema": "http://json-schema.org/draft-07/schema", "$schema": "http://json-schema.org/draft-07/schema",
"$id": "https://goauthentik.io/blueprints/schema.json", "$id": "https://goauthentik.io/blueprints/schema.json",
"type": "object", "type": "object",
"title": f"authentik {__version__} Blueprint schema", "title": "authentik Blueprint schema",
"required": ["version", "entries"], "required": ["version", "entries"],
"properties": { "properties": {
"version": { "version": {
@ -113,19 +93,16 @@ class Command(BaseCommand):
) )
model_path = f"{model._meta.app_label}.{model._meta.model_name}" model_path = f"{model._meta.app_label}.{model._meta.model_name}"
self.schema["properties"]["entries"]["items"]["oneOf"].append( self.schema["properties"]["entries"]["items"]["oneOf"].append(
self.template_entry(model_path, model, serializer) self.template_entry(model_path, serializer)
) )
def template_entry(self, model_path: str, model: type[Model], serializer: Serializer) -> dict: def template_entry(self, model_path: str, serializer: Serializer) -> dict:
"""Template entry for a single model""" """Template entry for a single model"""
model_schema = self.to_jsonschema(serializer) model_schema = self.to_jsonschema(serializer)
model_schema["required"] = [] model_schema["required"] = []
def_name = f"model_{model_path}" def_name = f"model_{model_path}"
def_path = f"#/$defs/{def_name}" def_path = f"#/$defs/{def_name}"
self.schema["$defs"][def_name] = model_schema self.schema["$defs"][def_name] = model_schema
def_name_perm = f"model_{model_path}_permissions"
def_path_perm = f"#/$defs/{def_name_perm}"
self.schema["$defs"][def_name_perm] = self.model_permissions(model)
return { return {
"type": "object", "type": "object",
"required": ["model", "identifiers"], "required": ["model", "identifiers"],
@ -138,7 +115,6 @@ class Command(BaseCommand):
"default": "present", "default": "present",
}, },
"conditions": {"type": "array", "items": {"type": "boolean"}}, "conditions": {"type": "array", "items": {"type": "boolean"}},
"permissions": {"$ref": def_path_perm},
"attrs": {"$ref": def_path}, "attrs": {"$ref": def_path},
"identifiers": {"$ref": def_path}, "identifiers": {"$ref": def_path},
}, },
@ -189,20 +165,3 @@ class Command(BaseCommand):
if required: if required:
result["required"] = required result["required"] = required
return result return result
def model_permissions(self, model: type[Model]) -> dict:
perms = [x[0] for x in model._meta.permissions]
for action in model._meta.default_permissions:
perms.append(f"{action}_{model._meta.model_name}")
return {
"type": "array",
"items": {
"type": "object",
"required": ["permission"],
"properties": {
"permission": {"type": "string", "enum": perms},
"user": {"type": "integer"},
"role": {"type": "string"},
},
},
}


@@ -14,7 +14,7 @@ from authentik.blueprints.v1.labels import LABEL_AUTHENTIK_SYSTEM
 from authentik.lib.config import CONFIG


-def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
+def check_blueprint_v1_file(BlueprintInstance: type, path: Path):
     """Check if blueprint should be imported"""
     from authentik.blueprints.models import BlueprintInstanceStatus
     from authentik.blueprints.v1.common import BlueprintLoader, BlueprintMetadata
@@ -29,7 +29,7 @@ def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
     if version != 1:
         return
     blueprint_file.seek(0)
-    instance = BlueprintInstance.objects.using(db_alias).filter(path=path).first()
+    instance: BlueprintInstance = BlueprintInstance.objects.filter(path=path).first()
     rel_path = path.relative_to(Path(CONFIG.get("blueprints_dir")))
     meta = None
     if metadata:
@@ -37,7 +37,7 @@ def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
         if meta.labels.get(LABEL_AUTHENTIK_INSTANTIATE, "").lower() == "false":
             return
     if not instance:
-        BlueprintInstance.objects.using(db_alias).create(
+        instance = BlueprintInstance(
             name=meta.name if meta else str(rel_path),
             path=str(rel_path),
             context={},
@@ -47,6 +47,7 @@ def check_blueprint_v1_file(BlueprintInstance: type, db_alias, path: Path):
             last_applied_hash="",
             metadata=metadata or {},
         )
+        instance.save()


 def migration_blueprint_import(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
@@ -55,7 +56,7 @@ def migration_blueprint_import(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
     db_alias = schema_editor.connection.alias
     for file in glob(f"{CONFIG.get('blueprints_dir')}/**/*.yaml", recursive=True):
-        check_blueprint_v1_file(BlueprintInstance, db_alias, Path(file))
+        check_blueprint_v1_file(BlueprintInstance, Path(file))
     for blueprint in BlueprintInstance.objects.using(db_alias).all():
         # If we already have flows (and we should always run before flow migrations)
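The removed lines above route every ORM call in this migration helper through the schema editor's connection alias. A minimal sketch of that general pattern in a Django data migration, assuming a hypothetical forwards function (the app label is taken from the permission codenames used elsewhere in this diff):

from django.db import migrations


def forwards(apps, schema_editor):
    # Use the historical model and pin queries to the database being migrated
    BlueprintInstance = apps.get_model("authentik_blueprints", "BlueprintInstance")
    db_alias = schema_editor.connection.alias
    for instance in BlueprintInstance.objects.using(db_alias).all():
        ...  # inspect or update each instance on that database


class Migration(migrations.Migration):
    dependencies = []
    operations = [migrations.RunPython(forwards, migrations.RunPython.noop)]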

View File

@@ -1,5 +1,4 @@
 """blueprint models"""
-
 from pathlib import Path
 from uuid import uuid4
@@ -71,19 +70,6 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
     enabled = models.BooleanField(default=True)
     managed_models = ArrayField(models.TextField(), default=list)

-    class Meta:
-        verbose_name = _("Blueprint Instance")
-        verbose_name_plural = _("Blueprint Instances")
-        unique_together = (
-            (
-                "name",
-                "path",
-            ),
-        )
-
-    def __str__(self) -> str:
-        return f"Blueprint Instance {self.name}"
-
     def retrieve_oci(self) -> str:
         """Get blueprint from an OCI registry"""
         client = BlueprintOCIClient(self.path.replace(OCI_PREFIX, "https://"))
@@ -102,7 +88,7 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
                 raise BlueprintRetrievalFailed("Invalid blueprint path")
             with full_path.open("r", encoding="utf-8") as _file:
                 return _file.read()
-        except OSError as exc:
+        except (IOError, OSError) as exc:
             raise BlueprintRetrievalFailed(exc) from exc

     def retrieve(self) -> str:
@@ -118,3 +104,16 @@ class BlueprintInstance(SerializerModel, ManagedModel, CreatedUpdatedModel):
         from authentik.blueprints.api import BlueprintInstanceSerializer

         return BlueprintInstanceSerializer
+
+    def __str__(self) -> str:
+        return f"Blueprint Instance {self.name}"
+
+    class Meta:
+        verbose_name = _("Blueprint Instance")
+        verbose_name_plural = _("Blueprint Instances")
+        unique_together = (
+            (
+                "name",
+                "path",
+            ),
+        )

View File

@@ -1,5 +1,4 @@
 """blueprint Settings"""
-
 from celery.schedules import crontab

 from authentik.lib.utils.time import fqdn_rand

View File

@@ -1,7 +1,6 @@
 """Blueprint helpers"""
-
-from collections.abc import Callable
 from functools import wraps
+from typing import Callable

 from django.apps import apps
@@ -39,7 +38,7 @@ def reconcile_app(app_name: str):
     def wrapper(*args, **kwargs):
         config = apps.get_app_config(app_name)
         if isinstance(config, ManagedAppConfig):
-            config._on_startup_callback(None)
+            config.reconcile()
         return func(*args, **kwargs)

     return wrapper

View File

@@ -1,24 +0,0 @@
-version: 1
-entries:
-  - model: authentik_core.user
-    id: user
-    identifiers:
-      username: "%(id)s"
-    attrs:
-      name: "%(id)s"
-  - model: authentik_rbac.role
-    id: role
-    identifiers:
-      name: "%(id)s"
-  - model: authentik_flows.flow
-    identifiers:
-      slug: "%(id)s"
-    attrs:
-      designation: authentication
-      name: foo
-      title: foo
-    permissions:
-      - permission: view_flow
-        user: !KeyOf user
-      - permission: view_flow
-        role: !KeyOf role
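The permissions block in the deleted fixture above is consumed by the blueprint Importer, as the deleted RBAC test further down in this diff shows. A hedged sketch of that flow, assuming the side of the diff that still ships blueprint RBAC support and a configured authentik environment; the YAML below is a trimmed stand-in, not a shipped fixture.

from authentik.blueprints.v1.importer import Importer

yaml_blueprint = """
version: 1
entries:
  - model: authentik_rbac.role
    identifiers:
      name: demo-role
    attrs:
      permissions:
        - authentik_blueprints.view_blueprintinstance
"""

importer = Importer.from_string(yaml_blueprint)
valid, logs = importer.validate()
assert valid, logs
assert importer.apply()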

View File

@@ -1,8 +0,0 @@
-version: 1
-entries:
-  - model: authentik_rbac.role
-    identifiers:
-      name: "%(id)s"
-    attrs:
-      permissions:
-        - authentik_blueprints.view_blueprintinstance

View File

@@ -1,9 +0,0 @@
-version: 1
-entries:
-  - model: authentik_core.user
-    identifiers:
-      username: "%(id)s"
-    attrs:
-      name: "%(id)s"
-      permissions:
-        - authentik_blueprints.view_blueprintinstance

View File

@@ -1,5 +1,4 @@
 """authentik managed models tests"""
-
 from django.test import TestCase

 from authentik.blueprints.models import BlueprintInstance, BlueprintRetrievalFailed

View File

@@ -1,5 +1,4 @@
 """Test blueprints OCI"""
-
 from django.test import TransactionTestCase
 from requests_mock import Mocker

View File

@@ -1,23 +1,22 @@
 """test packaged blueprints"""
-
-from collections.abc import Callable
 from pathlib import Path
+from typing import Callable

 from django.test import TransactionTestCase

 from authentik.blueprints.models import BlueprintInstance
 from authentik.blueprints.tests import apply_blueprint
 from authentik.blueprints.v1.importer import Importer
-from authentik.brands.models import Brand
+from authentik.tenants.models import Tenant


 class TestPackaged(TransactionTestCase):
     """Empty class, test methods are added dynamically"""

-    @apply_blueprint("default/default-brand.yaml")
+    @apply_blueprint("default/default-tenant.yaml")
     def test_decorator_static(self):
         """Test @apply_blueprint decorator"""
-        self.assertTrue(Brand.objects.filter(domain="authentik-default").exists())
+        self.assertTrue(Tenant.objects.filter(domain="authentik-default").exists())


 def blueprint_tester(file_name: Path) -> Callable:
@@ -27,8 +26,7 @@ def blueprint_tester(file_name: Path) -> Callable:
         base = Path("blueprints/")
         rel_path = Path(file_name).relative_to(base)
         importer = Importer.from_string(BlueprintInstance(path=str(rel_path)).retrieve())
-        validation, logs = importer.validate()
-        self.assertTrue(validation, logs)
+        self.assertTrue(importer.validate()[0])
         self.assertTrue(importer.apply())

     return tester
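One assertion detail in the hunk above: unpacking validate() into (validation, logs) lets the captured logs double as the failure message. A self-contained illustration of that unittest pattern, with made-up example data:

import unittest


class ValidateExample(unittest.TestCase):
    def test_reports_logs_on_failure(self):
        # stand-in for Importer.validate(), which returns (success, captured logs)
        validation, logs = True, [{"event": "example log line"}]
        self.assertTrue(validation, logs)  # logs are only shown if the assertion fails


if __name__ == "__main__":
    unittest.main()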

View File

@@ -1,6 +1,5 @@
 """authentik managed models tests"""
-
-from collections.abc import Callable
+from typing import Callable, Type

 from django.apps import apps
 from django.test import TestCase
@@ -14,7 +13,7 @@ class TestModels(TestCase):
     """Test Models"""


-def serializer_tester_factory(test_model: type[SerializerModel]) -> Callable:
+def serializer_tester_factory(test_model: Type[SerializerModel]) -> Callable:
     """Test serializer"""

     def tester(self: TestModels):

View File

@@ -1,5 +1,4 @@
 """Test blueprints v1"""
-
 from os import environ

 from django.test import TransactionTestCase

View File

@@ -1,5 +1,4 @@
 """Test blueprints v1 api"""
-
 from json import loads
 from tempfile import NamedTemporaryFile, mkdtemp
@@ -78,5 +77,5 @@ class TestBlueprintsV1API(APITestCase):
         self.assertEqual(res.status_code, 400)
         self.assertJSONEqual(
             res.content.decode(),
-            {"content": ["Failed to validate blueprint", "- Invalid blueprint version"]},
+            {"content": ["Failed to validate blueprint: Invalid blueprint version"]},
         )

View File

@@ -1,5 +1,4 @@
 """Test blueprints v1"""
-
 from django.test import TransactionTestCase

 from authentik.blueprints.v1.importer import Importer

View File

@@ -1,5 +1,4 @@
 """Test blueprints v1"""
-
 from django.test import TransactionTestCase

 from authentik.blueprints.v1.importer import Importer

View File

@@ -1,57 +0,0 @@
-"""Test blueprints v1"""
-
-from django.test import TransactionTestCase
-from guardian.shortcuts import get_perms
-
-from authentik.blueprints.v1.importer import Importer
-from authentik.core.models import User
-from authentik.flows.models import Flow
-from authentik.lib.generators import generate_id
-from authentik.lib.tests.utils import load_fixture
-from authentik.rbac.models import Role
-
-
-class TestBlueprintsV1RBAC(TransactionTestCase):
-    """Test Blueprints rbac attribute"""
-
-    def test_user_permission(self):
-        """Test permissions"""
-        uid = generate_id()
-        import_yaml = load_fixture("fixtures/rbac_user.yaml", id=uid)
-        importer = Importer.from_string(import_yaml)
-        self.assertTrue(importer.validate()[0])
-        self.assertTrue(importer.apply())
-
-        user = User.objects.filter(username=uid).first()
-        self.assertIsNotNone(user)
-        self.assertTrue(user.has_perms(["authentik_blueprints.view_blueprintinstance"]))
-
-    def test_role_permission(self):
-        """Test permissions"""
-        uid = generate_id()
-        import_yaml = load_fixture("fixtures/rbac_role.yaml", id=uid)
-        importer = Importer.from_string(import_yaml)
-        self.assertTrue(importer.validate()[0])
-        self.assertTrue(importer.apply())
-
-        role = Role.objects.filter(name=uid).first()
-        self.assertIsNotNone(role)
-        self.assertEqual(
-            list(role.group.permissions.all().values_list("codename", flat=True)),
-            ["view_blueprintinstance"],
-        )
-
-    def test_object_permission(self):
-        """Test permissions"""
-        uid = generate_id()
-        import_yaml = load_fixture("fixtures/rbac_object.yaml", id=uid)
-        importer = Importer.from_string(import_yaml)
-        self.assertTrue(importer.validate()[0])
-        self.assertTrue(importer.apply())
-
-        flow = Flow.objects.filter(slug=uid).first()
-        user = User.objects.filter(username=uid).first()
-        role = Role.objects.filter(name=uid).first()
-        self.assertIsNotNone(flow)
-        self.assertEqual(get_perms(user, flow), ["view_flow"])
-        self.assertEqual(get_perms(role.group, flow), ["view_flow"])
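The deleted test above exercises object-level permissions through django-guardian. A short sketch of those checks, assuming a configured authentik/Django environment with the guardian backend enabled; grant_and_check and its arguments are illustrative, not part of the project:

from guardian.shortcuts import assign_perm, get_perms


def grant_and_check(user, flow):
    # Grant view_flow on this specific flow only, then verify it
    assign_perm("authentik_flows.view_flow", user, flow)
    assert get_perms(user, flow) == ["view_flow"]
    assert user.has_perm("authentik_flows.view_flow", flow)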

View File

@@ -1,5 +1,4 @@
 """Test blueprints v1"""
-
 from django.test import TransactionTestCase

 from authentik.blueprints.v1.importer import Importer

View File

@@ -1,5 +1,4 @@
 """Test blueprints v1 tasks"""
-
 from hashlib import sha512
 from tempfile import NamedTemporaryFile, mkdtemp
@@ -54,7 +53,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
             file.seek(0)
             file_hash = sha512(file.read().encode()).hexdigest()
             file.flush()
-            blueprints_discovery()
+            blueprints_discovery()  # pylint: disable=no-value-for-parameter
             instance = BlueprintInstance.objects.filter(name=blueprint_id).first()
             self.assertEqual(instance.last_applied_hash, file_hash)
             self.assertEqual(
@@ -82,7 +81,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
                 )
             )
             file.flush()
-            blueprints_discovery()
+            blueprints_discovery()  # pylint: disable=no-value-for-parameter
             blueprint = BlueprintInstance.objects.filter(name="foo").first()
             self.assertEqual(
                 blueprint.last_applied_hash,
@@ -107,7 +106,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
                 )
             )
             file.flush()
-            blueprints_discovery()
+            blueprints_discovery()  # pylint: disable=no-value-for-parameter
             blueprint.refresh_from_db()
             self.assertEqual(
                 blueprint.last_applied_hash,
@@ -149,7 +148,7 @@ class TestBlueprintsV1Tasks(TransactionTestCase):
            instance.status,
            BlueprintInstanceStatus.UNKNOWN,
         )
-        apply_blueprint(instance.pk)
+        apply_blueprint(instance.pk)  # pylint: disable=no-value-for-parameter
         instance.refresh_from_db()
         self.assertEqual(instance.last_applied_hash, "")
         self.assertEqual(

View File

@@ -1,5 +1,4 @@
 """API URLs"""
-
 from authentik.blueprints.api import BlueprintInstanceViewSet

 api_urlpatterns = [

View File

@@ -1,14 +1,12 @@
 """transfer common classes"""
-
 from collections import OrderedDict
-from collections.abc import Generator, Iterable, Mapping
 from copy import copy
 from dataclasses import asdict, dataclass, field, is_dataclass
 from enum import Enum
 from functools import reduce
 from operator import ixor
 from os import getenv
-from typing import Any, Literal, Union
+from typing import Any, Iterable, Literal, Mapping, Optional, Union
 from uuid import UUID

 from deepmerge import always_merger
@@ -46,7 +44,7 @@ def get_attrs(obj: SerializerModel) -> dict[str, Any]:
 class BlueprintEntryState:
     """State of a single instance"""

-    instance: Model | None = None
+    instance: Optional[Model] = None


 class BlueprintEntryDesiredState(Enum):
@@ -58,15 +56,6 @@ class BlueprintEntryDesiredState(Enum):
     MUST_CREATED = "must_created"


-@dataclass
-class BlueprintEntryPermission:
-    """Describe object-level permissions"""
-
-    permission: Union[str, "YAMLTag"]
-    user: Union[int, "YAMLTag", None] = field(default=None)
-    role: Union[str, "YAMLTag", None] = field(default=None)
-
-
 @dataclass
 class BlueprintEntry:
     """Single entry of a blueprint"""
@@ -77,15 +66,14 @@ class BlueprintEntry:
     )
     conditions: list[Any] = field(default_factory=list)
     identifiers: dict[str, Any] = field(default_factory=dict)
-    attrs: dict[str, Any] | None = field(default_factory=dict)
-    permissions: list[BlueprintEntryPermission] = field(default_factory=list)
-    id: str | None = None
+    attrs: Optional[dict[str, Any]] = field(default_factory=dict)
+    id: Optional[str] = None

     _state: BlueprintEntryState = field(default_factory=BlueprintEntryState)

     def __post_init__(self, *args, **kwargs) -> None:
-        self.__tag_contexts: list[YAMLTagContext] = []
+        self.__tag_contexts: list["YAMLTagContext"] = []

     @staticmethod
     def from_model(model: SerializerModel, *extra_identifier_names: str) -> "BlueprintEntry":
@@ -103,10 +91,10 @@ class BlueprintEntry:
             attrs=all_attrs,
         )

-    def get_tag_context(
+    def _get_tag_context(
         self,
         depth: int = 0,
-        context_tag_type: type["YAMLTagContext"] | tuple["YAMLTagContext", ...] | None = None,
+        context_tag_type: Optional[type["YAMLTagContext"] | tuple["YAMLTagContext", ...]] = None,
     ) -> "YAMLTagContext":
         """Get a YAMLTagContext object located at a certain depth in the tag tree"""
         if depth < 0:
@@ -119,8 +107,8 @@ class BlueprintEntry:
         try:
             return contexts[-(depth + 1)]
-        except IndexError as exc:
-            raise ValueError(f"invalid depth: {depth}. Max depth: {len(contexts) - 1}") from exc
+        except IndexError:
+            raise ValueError(f"invalid depth: {depth}. Max depth: {len(contexts) - 1}")

     def tag_resolver(self, value: Any, blueprint: "Blueprint") -> Any:
         """Check if we have any special tags that need handling"""
@@ -160,17 +148,6 @@ class BlueprintEntry:
         """Get the blueprint model, with yaml tags resolved if present"""
         return str(self.tag_resolver(self.model, blueprint))

-    def get_permissions(
-        self, blueprint: "Blueprint"
-    ) -> Generator[BlueprintEntryPermission, None, None]:
-        """Get permissions of this entry, with all yaml tags resolved"""
-        for perm in self.permissions:
-            yield BlueprintEntryPermission(
-                permission=self.tag_resolver(perm.permission, blueprint),
-                user=self.tag_resolver(perm.user, blueprint),
-                role=self.tag_resolver(perm.role, blueprint),
-            )
-
     def check_all_conditions_match(self, blueprint: "Blueprint") -> bool:
         """Check all conditions of this entry match (evaluate to True)"""
         return all(self.tag_resolver(self.conditions, blueprint))
@@ -192,7 +169,7 @@ class Blueprint:
     entries: list[BlueprintEntry] = field(default_factory=list)
     context: dict = field(default_factory=dict)
-    metadata: BlueprintMetadata | None = field(default=None)
+    metadata: Optional[BlueprintMetadata] = field(default=None)


 class YAMLTag:
@@ -240,7 +217,7 @@ class Env(YAMLTag):
     """Lookup environment variable with optional default"""

     key: str
-    default: Any | None
+    default: Optional[Any]

     def __init__(self, loader: "BlueprintLoader", node: ScalarNode | SequenceNode) -> None:
         super().__init__()
@@ -259,7 +236,7 @@ class Context(YAMLTag):
     """Lookup key from instance context"""

     key: str
-    default: Any | None
+    default: Optional[Any]

     def __init__(self, loader: "BlueprintLoader", node: ScalarNode | SequenceNode) -> None:
         super().__init__()
@@ -303,7 +280,7 @@ class Format(YAMLTag):
         try:
             return self.format_string % tuple(args)
         except TypeError as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+            raise EntryInvalidError.from_entry(exc, entry)


 class Find(YAMLTag):
@@ -328,10 +305,7 @@ class Find(YAMLTag):
         else:
             model_name = self.model_name

-        try:
-            model_class = apps.get_model(*model_name.split("."))
-        except LookupError as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+        model_class = apps.get_model(*model_name.split("."))

         query = Q()
         for cond in self.conditions:
@@ -391,7 +365,7 @@ class Condition(YAMLTag):
             comparator = self._COMPARATORS[self.mode.upper()]
             return comparator(tuple(bool(x) for x in args))
         except (TypeError, KeyError) as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+            raise EntryInvalidError.from_entry(exc, entry)


 class If(YAMLTag):
@@ -423,7 +397,7 @@ class If(YAMLTag):
                 blueprint,
             )
         except TypeError as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+            raise EntryInvalidError.from_entry(exc, entry)


 class Enumerate(YAMLTag, YAMLTagContext):
@@ -437,7 +411,9 @@ class Enumerate(YAMLTag, YAMLTagContext):
         "SEQ": (list, lambda a, b: [*a, b]),
         "MAP": (
             dict,
-            lambda a, b: always_merger.merge(a, {b[0]: b[1]} if isinstance(b, tuple | list) else b),
+            lambda a, b: always_merger.merge(
+                a, {b[0]: b[1]} if isinstance(b, (tuple, list)) else b
+            ),
         ),
     }
@@ -479,7 +455,7 @@ class Enumerate(YAMLTag, YAMLTagContext):
         try:
             output_class, add_fn = self._OUTPUT_BODIES[self.output_body.upper()]
         except KeyError as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+            raise EntryInvalidError.from_entry(exc, entry)

         result = output_class()
@@ -507,13 +483,13 @@ class EnumeratedItem(YAMLTag):
     _SUPPORTED_CONTEXT_TAGS = (Enumerate,)

-    def __init__(self, _loader: "BlueprintLoader", node: ScalarNode) -> None:
+    def __init__(self, loader: "BlueprintLoader", node: ScalarNode) -> None:
         super().__init__()
         self.depth = int(node.value)

     def resolve(self, entry: BlueprintEntry, blueprint: Blueprint) -> Any:
         try:
-            context_tag: Enumerate = entry.get_tag_context(
+            context_tag: Enumerate = entry._get_tag_context(
                 depth=self.depth,
                 context_tag_type=EnumeratedItem._SUPPORTED_CONTEXT_TAGS,
             )
@@ -523,11 +499,9 @@ class EnumeratedItem(YAMLTag):
                     f"{self.__class__.__name__} tags are only usable "
                     f"inside an {Enumerate.__name__} tag",
                     entry,
-                ) from exc
+                )

-            raise EntryInvalidError.from_entry(
-                f"{self.__class__.__name__} tag: {exc}", entry
-            ) from exc
+            raise EntryInvalidError.from_entry(f"{self.__class__.__name__} tag: {exc}", entry)

         return context_tag.get_context(entry, blueprint)
@@ -540,8 +514,8 @@ class Index(EnumeratedItem):
         try:
             return context[0]
-        except IndexError as exc:  # pragma: no cover
-            raise EntryInvalidError.from_entry(f"Empty/invalid context: {context}", entry) from exc
+        except IndexError:  # pragma: no cover
+            raise EntryInvalidError.from_entry(f"Empty/invalid context: {context}", entry)


 class Value(EnumeratedItem):
@@ -552,8 +526,8 @@ class Value(EnumeratedItem):
         try:
             return context[1]
-        except IndexError as exc:  # pragma: no cover
-            raise EntryInvalidError.from_entry(f"Empty/invalid context: {context}", entry) from exc
+        except IndexError:  # pragma: no cover
+            raise EntryInvalidError.from_entry(f"Empty/invalid context: {context}", entry)


 class BlueprintDumper(SafeDumper):
@@ -580,11 +554,7 @@ class BlueprintDumper(SafeDumper):
         def factory(items):
             final_dict = dict(items)
-            # Remove internal state variables
             final_dict.pop("_state", None)
-            # Future-proof to only remove the ID if we don't set a value
-            if "id" in final_dict and final_dict.get("id") is None:
-                final_dict.pop("id")
             return final_dict

         data = asdict(data, dict_factory=factory)
@@ -611,13 +581,13 @@ class BlueprintLoader(SafeLoader):
 class EntryInvalidError(SentryIgnoredException):
     """Error raised when an entry is invalid"""

-    entry_model: str | None
-    entry_id: str | None
-    validation_error: ValidationError | None
-    serializer: Serializer | None = None
+    entry_model: Optional[str]
+    entry_id: Optional[str]
+    validation_error: Optional[ValidationError]
+    serializer: Optional[Serializer] = None

     def __init__(
-        self, *args: object, validation_error: ValidationError | None = None, **kwargs
+        self, *args: object, validation_error: Optional[ValidationError] = None, **kwargs
     ) -> None:
         super().__init__(*args)
         self.entry_model = None
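Most hunks in this file swap PEP 604 unions for typing.Optional; the two spellings are equivalent, but the X | None form needs Python 3.10 or newer. A minimal standalone illustration:

from typing import Optional

value_new: int | None = None     # PEP 604 union syntax (Python 3.10+)
value_old: Optional[int] = None  # typing.Optional, works on older interpreters
assert (int | None) == Optional[int]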

View File

@@ -1,6 +1,5 @@
 """Blueprint exporter"""
-
-from collections.abc import Iterable
+from typing import Iterable
 from uuid import UUID

 from django.apps import apps
@@ -8,6 +7,7 @@ from django.contrib.auth import get_user_model
 from django.db.models import Model, Q, QuerySet
 from django.utils.timezone import now
 from django.utils.translation import gettext as _
+from guardian.shortcuts import get_anonymous_user
 from yaml import dump

 from authentik.blueprints.v1.common import (
@@ -48,7 +48,7 @@ class Exporter:
         """Return a queryset for `model`. Can be used to filter some
         objects on some models"""
         if model == get_user_model():
-            return model.objects.exclude_anonymous()
+            return model.objects.exclude(pk=get_anonymous_user().pk)
         return model.objects.all()

     def _pre_export(self, blueprint: Blueprint):
@@ -59,7 +59,7 @@ class Exporter:
         blueprint = Blueprint()
         self._pre_export(blueprint)
         blueprint.metadata = BlueprintMetadata(
-            name=_("authentik Export - {date}".format_map({"date": str(now())})),
+            name=_("authentik Export - %(date)s" % {"date": str(now())}),
             labels={
                 LABEL_AUTHENTIK_GENERATED: "true",
             },
@@ -74,7 +74,7 @@ class Exporter:
 class FlowExporter(Exporter):
-    """Exporter customized to only return objects related to `flow`"""
+    """Exporter customised to only return objects related to `flow`"""

     flow: Flow
     with_policies: bool
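Both sides of the get_queryset change above exclude django-guardian's database-backed anonymous user from exports; one side goes through a manager helper, the other calls guardian directly. A sketch of the direct form, assuming a configured Django project with django-guardian installed:

from django.contrib.auth import get_user_model
from guardian.shortcuts import get_anonymous_user

exportable_users = get_user_model().objects.exclude(pk=get_anonymous_user().pk)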

View File

@@ -1,25 +1,22 @@
 """Blueprint importer"""
-
 from contextlib import contextmanager
 from copy import deepcopy
-from typing import Any
+from typing import Any, Optional

 from dacite.config import Config
 from dacite.core import from_dict
 from dacite.exceptions import DaciteError
 from deepmerge import always_merger
-from django.contrib.auth.models import Permission
-from django.contrib.contenttypes.models import ContentType
 from django.core.exceptions import FieldError
 from django.db.models import Model
 from django.db.models.query_utils import Q
 from django.db.transaction import atomic
 from django.db.utils import IntegrityError
-from guardian.models import UserObjectPermission
-from guardian.shortcuts import assign_perm
 from rest_framework.exceptions import ValidationError
 from rest_framework.serializers import BaseSerializer, Serializer
 from structlog.stdlib import BoundLogger, get_logger
+from structlog.testing import capture_logs
+from structlog.types import EventDict
 from yaml import load

 from authentik.blueprints.v1.common import (
@@ -33,70 +30,40 @@ from authentik.blueprints.v1.common import (
 from authentik.blueprints.v1.meta.registry import BaseMetaModel, registry
 from authentik.core.models import (
     AuthenticatedSession,
-    GroupSourceConnection,
     PropertyMapping,
     Provider,
     Source,
-    User,
     UserSourceConnection,
 )
-from authentik.enterprise.license import LicenseKey
 from authentik.enterprise.models import LicenseUsage
-from authentik.enterprise.providers.google_workspace.models import (
-    GoogleWorkspaceProviderGroup,
-    GoogleWorkspaceProviderUser,
-)
-from authentik.enterprise.providers.microsoft_entra.models import (
-    MicrosoftEntraProviderGroup,
-    MicrosoftEntraProviderUser,
-)
-from authentik.enterprise.providers.rac.models import ConnectionToken
-from authentik.enterprise.stages.authenticator_endpoint_gdtc.models import (
-    EndpointDevice,
-    EndpointDeviceConnection,
-)
-from authentik.events.logs import LogEvent, capture_logs
-from authentik.events.models import SystemTask
 from authentik.events.utils import cleanse_dict
 from authentik.flows.models import FlowToken, Stage
 from authentik.lib.models import SerializerModel
 from authentik.lib.sentry import SentryIgnoredException
-from authentik.lib.utils.reflection import get_apps
 from authentik.outposts.models import OutpostServiceConnection
 from authentik.policies.models import Policy, PolicyBindingModel
-from authentik.policies.reputation.models import Reputation
-from authentik.providers.oauth2.models import AccessToken, AuthorizationCode, RefreshToken
-from authentik.providers.scim.models import SCIMProviderGroup, SCIMProviderUser
-from authentik.rbac.models import Role
-from authentik.sources.scim.models import SCIMSourceGroup, SCIMSourceUser
-from authentik.stages.authenticator_webauthn.models import WebAuthnDeviceType
-from authentik.tenants.models import Tenant
+from authentik.providers.scim.models import SCIMGroup, SCIMUser

 # Context set when the serializer is created in a blueprint context
-# Update website/docs/customize/blueprints/v1/models.md when used
+# Update website/developer-docs/blueprints/v1/models.md when used
 SERIALIZER_CONTEXT_BLUEPRINT = "blueprint_entry"


 def excluded_models() -> list[type[Model]]:
     """Return a list of all excluded models that shouldn't be exposed via API
     or other means (internal only, base classes, non-used objects, etc)"""
+    # pylint: disable=imported-auth-user
     from django.contrib.auth.models import Group as DjangoGroup
     from django.contrib.auth.models import User as DjangoUser

     return (
-        # Django only classes
         DjangoUser,
         DjangoGroup,
-        ContentType,
-        Permission,
-        UserObjectPermission,
         # Base classes
         Provider,
         Source,
         PropertyMapping,
         UserSourceConnection,
-        GroupSourceConnection,
         Stage,
         OutpostServiceConnection,
         Policy,
@@ -104,33 +71,16 @@ def excluded_models() -> list[type[Model]]:
         # Classes that have other dependencies
         AuthenticatedSession,
         # Classes which are only internally managed
-        # FIXME: these shouldn't need to be explicitly listed, but rather based off of a mixin
         FlowToken,
         LicenseUsage,
-        SCIMProviderGroup,
-        SCIMProviderUser,
-        Tenant,
-        SystemTask,
-        ConnectionToken,
-        AuthorizationCode,
-        AccessToken,
-        RefreshToken,
-        Reputation,
-        WebAuthnDeviceType,
-        SCIMSourceUser,
-        SCIMSourceGroup,
-        GoogleWorkspaceProviderUser,
-        GoogleWorkspaceProviderGroup,
-        MicrosoftEntraProviderUser,
-        MicrosoftEntraProviderGroup,
-        EndpointDevice,
-        EndpointDeviceConnection,
+        SCIMGroup,
+        SCIMUser,
     )


 def is_model_allowed(model: type[Model]) -> bool:
     """Check if model is allowed"""
-    return model not in excluded_models() and issubclass(model, SerializerModel | BaseMetaModel)
+    return model not in excluded_models() and issubclass(model, (SerializerModel, BaseMetaModel))


 class DoRollback(SentryIgnoredException):
@@ -148,39 +98,22 @@ def transaction_rollback():
         pass


-def rbac_models() -> dict:
-    models = {}
-    for app in get_apps():
-        for model in app.get_models():
-            if not is_model_allowed(model):
-                continue
-            models[model._meta.model_name] = app.label
-    return models
-
-
 class Importer:
     """Import Blueprint from raw dict or YAML/JSON"""

     logger: BoundLogger
     _import: Blueprint

-    def __init__(self, blueprint: Blueprint, context: dict | None = None):
+    def __init__(self, blueprint: Blueprint, context: Optional[dict] = None):
         self.__pk_map: dict[Any, Model] = {}
         self._import = blueprint
         self.logger = get_logger()
-        ctx = self.default_context()
+        ctx = {}
         always_merger.merge(ctx, self._import.context)
         if context:
             always_merger.merge(ctx, context)
         self._import.context = ctx

-    def default_context(self):
-        """Default context"""
-        return {
-            "goauthentik.io/enterprise/licensed": LicenseKey.get_total().status().is_valid,
-            "goauthentik.io/rbac/models": rbac_models(),
-        }
-
     @staticmethod
     def from_string(yaml_input: str, context: dict | None = None) -> "Importer":
         """Parse YAML string and create blueprint importer from it"""
@@ -203,14 +136,14 @@ class Importer:
         def updater(value) -> Any:
             if value in self.__pk_map:
-                self.logger.debug("Updating reference in entry", value=value)
+                self.logger.debug("updating reference in entry", value=value)
                 return self.__pk_map[value]
             return value

         for key, value in attrs.items():
             try:
                 if isinstance(value, dict):
-                    for _, _inner_key in enumerate(value):
+                    for idx, _inner_key in enumerate(value):
                         value[_inner_key] = updater(value[_inner_key])
                 elif isinstance(value, list):
                     for idx, _inner_value in enumerate(value):
@@ -239,17 +172,15 @@ class Importer:
         return main_query | sub_query

-    def _validate_single(self, entry: BlueprintEntry) -> BaseSerializer | None:  # noqa: PLR0915
+    # pylint: disable-msg=too-many-locals
+    def _validate_single(self, entry: BlueprintEntry) -> Optional[BaseSerializer]:
         """Validate a single entry"""
         if not entry.check_all_conditions_match(self._import):
             self.logger.debug("One or more conditions of this entry are not fulfilled, skipping")
             return None

         model_app_label, model_name = entry.get_model(self._import).split(".")
-        try:
-            model: type[SerializerModel] = registry.get_model(model_app_label, model_name)
-        except LookupError as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+        model: type[SerializerModel] = registry.get_model(model_app_label, model_name)
         # Don't use isinstance since we don't want to check for inheritance
         if not is_model_allowed(model):
             raise EntryInvalidError.from_entry(f"Model {model} not allowed", entry)
@@ -293,13 +224,9 @@ class Importer:
         serializer_kwargs = {}
         model_instance = existing_models.first()
-        if (
-            not isinstance(model(), BaseMetaModel)
-            and model_instance
-            and entry.state != BlueprintEntryDesiredState.MUST_CREATED
-        ):
+        if not isinstance(model(), BaseMetaModel) and model_instance:
             self.logger.debug(
-                "Initialise serializer with instance",
+                "initialise serializer with instance",
                 model=model,
                 instance=model_instance,
                 pk=model_instance.pk,
@@ -307,17 +234,16 @@ class Importer:
             serializer_kwargs["instance"] = model_instance
             serializer_kwargs["partial"] = True
         elif model_instance and entry.state == BlueprintEntryDesiredState.MUST_CREATED:
-            msg = (
-                f"State is set to {BlueprintEntryDesiredState.MUST_CREATED.value} "
-                "and object exists already",
-            )
             raise EntryInvalidError.from_entry(
-                ValidationError({k: msg for k in entry.identifiers.keys()}, "unique"),
+                (
+                    f"state is set to {BlueprintEntryDesiredState.MUST_CREATED} "
+                    "and object exists already",
+                ),
                 entry,
             )
         else:
             self.logger.debug(
-                "Initialised new serializer instance",
+                "initialised new serializer instance",
                 model=model,
                 **cleanse_dict(updated_identifiers),
             )
@@ -329,7 +255,10 @@ class Importer:
         try:
             full_data = self.__update_pks_for_attrs(entry.get_attrs(self._import))
         except ValueError as exc:
-            raise EntryInvalidError.from_entry(exc, entry) from exc
+            raise EntryInvalidError.from_entry(
+                exc,
+                entry,
+            ) from exc
         always_merger.merge(full_data, updated_identifiers)
         serializer_kwargs["data"] = full_data
@@ -350,15 +279,6 @@ class Importer:
             ) from exc
         return serializer

-    def _apply_permissions(self, instance: Model, entry: BlueprintEntry):
-        """Apply object-level permissions for an entry"""
-        for perm in entry.get_permissions(self._import):
-            if perm.user is not None:
-                assign_perm(perm.permission, User.objects.get(pk=perm.user), instance)
-            if perm.role is not None:
-                role = Role.objects.get(pk=perm.role)
-                role.assign_permission(perm.permission, obj=instance)
-
     def apply(self) -> bool:
         """Apply (create/update) models yaml, in database transaction"""
         try:
@@ -380,7 +300,7 @@ class Importer:
                 model: type[SerializerModel] = registry.get_model(model_app_label, model_name)
             except LookupError:
                 self.logger.warning(
-                    "App or Model does not exist", app=model_app_label, model=model_name
+                    "app or model does not exist", app=model_app_label, model=model_name
                 )
                 return False
             # Validate each single entry
@@ -392,7 +312,7 @@ class Importer:
                 if entry.get_state(self._import) == BlueprintEntryDesiredState.ABSENT:
                     serializer = exc.serializer
                 else:
-                    self.logger.warning(f"Entry invalid: {exc}", entry=entry, error=exc)
+                    self.logger.warning(f"entry invalid: {exc}", entry=entry, error=exc)
                     if raise_errors:
                         raise exc
                     return False
@@ -412,42 +332,43 @@ class Importer:
                     and state == BlueprintEntryDesiredState.CREATED
                 ):
                     self.logger.debug(
-                        "Instance exists, skipping",
+                        "instance exists, skipping",
                         model=model,
                         instance=instance,
                         pk=instance.pk,
                     )
                 else:
                     instance = serializer.save()
-                    self.logger.debug("Updated model", model=instance)
+                    self.logger.debug("updated model", model=instance)
                 if "pk" in entry.identifiers:
                     self.__pk_map[entry.identifiers["pk"]] = instance.pk
                 entry._state = BlueprintEntryState(instance)
-                self._apply_permissions(instance, entry)
             elif state == BlueprintEntryDesiredState.ABSENT:
-                instance: Model | None = serializer.instance
+                instance: Optional[Model] = serializer.instance
                 if instance.pk:
                     instance.delete()
-                    self.logger.debug("Deleted model", mode=instance)
+                    self.logger.debug("deleted model", mode=instance)
                     continue
-                self.logger.debug("Entry to delete with no instance, skipping")
+                self.logger.debug("entry to delete with no instance, skipping")
         return True

-    def validate(self, raise_validation_errors=False) -> tuple[bool, list[LogEvent]]:
+    def validate(self, raise_validation_errors=False) -> tuple[bool, list[EventDict]]:
         """Validate loaded blueprint export, ensure all models are allowed
         and serializers have no errors"""
         self.logger.debug("Starting blueprint import validation")
         orig_import = deepcopy(self._import)
         if self._import.version != 1:
             self.logger.warning("Invalid blueprint version")
-            return False, [LogEvent("Invalid blueprint version", log_level="warning", logger=None)]
+            return False, [{"event": "Invalid blueprint version"}]
         with (
             transaction_rollback(),
             capture_logs() as logs,
         ):
             successful = self._apply_models(raise_errors=raise_validation_errors)
             if not successful:
-                self.logger.warning("Blueprint validation failed")
+                self.logger.debug("Blueprint validation failed")
-        for log in logs:
-            getattr(self.logger, log.get("log_level"))(**log)
         self.logger.debug("Finished blueprint import validation")
         self._import = orig_import
         return successful, logs
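The removed _apply_permissions method above is where entry-level permissions were turned into object-level grants: guardian's assign_perm for users, and the Role API for roles. A standalone sketch that mirrors those removed lines; apply_entry_permissions is an illustrative wrapper, and running it requires a configured authentik environment:

from guardian.shortcuts import assign_perm

from authentik.core.models import User
from authentik.rbac.models import Role


def apply_entry_permissions(instance, permissions):
    """Mirror of the removed logic: grant each declared permission on `instance`."""
    for perm in permissions:
        if perm.user is not None:
            assign_perm(perm.permission, User.objects.get(pk=perm.user), instance)
        if perm.role is not None:
            Role.objects.get(pk=perm.role).assign_permission(perm.permission, obj=instance)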

View File

@@ -1,13 +1,12 @@
 """Apply Blueprint meta model"""
-
 from typing import TYPE_CHECKING

 from rest_framework.exceptions import ValidationError
-from rest_framework.fields import BooleanField
+from rest_framework.fields import BooleanField, JSONField
 from structlog.stdlib import get_logger

 from authentik.blueprints.v1.meta.registry import BaseMetaModel, MetaResult, registry
-from authentik.core.api.utils import JSONDictField, PassiveSerializer
+from authentik.core.api.utils import PassiveSerializer, is_dict

 if TYPE_CHECKING:
     from authentik.blueprints.models import BlueprintInstance
@@ -18,7 +17,7 @@ LOGGER = get_logger()
 class ApplyBlueprintMetaSerializer(PassiveSerializer):
     """Serializer for meta apply blueprint model"""

-    identifiers = JSONDictField()
+    identifiers = JSONField(validators=[is_dict])
     required = BooleanField(default=True)

     # We cannot override `instance` as that will confuse rest_framework
@@ -43,7 +42,7 @@ class ApplyBlueprintMetaSerializer(PassiveSerializer):
             LOGGER.info("Blueprint does not exist, but not required")
             return MetaResult()
         LOGGER.debug("Applying blueprint from meta model", blueprint=self.blueprint_instance)
+        # pylint: disable=no-value-for-parameter
         apply_blueprint(str(self.blueprint_instance.pk))
         return MetaResult()
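The identifiers field change above expresses the same constraint two ways: a JSONField limited to mappings. A hedged sketch of the validator-based form, assuming Django REST framework; the is_dict implementation here is illustrative, not the project's:

from rest_framework.exceptions import ValidationError
from rest_framework.fields import JSONField


def is_dict(value):
    """Reject JSON payloads that are not objects/mappings."""
    if not isinstance(value, dict):
        raise ValidationError("Value must be a dictionary")


identifiers = JSONField(validators=[is_dict])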

View File

@@ -1,5 +1,4 @@
 """Base models"""
-
 from django.apps import apps
 from django.db.models import Model
 from rest_framework.serializers import Serializer
@@ -8,15 +7,15 @@ from rest_framework.serializers import Serializer
 class BaseMetaModel(Model):
     """Base models"""

-    class Meta:
-        abstract = True
-
     @staticmethod
     def serializer() -> Serializer:
         """Serializer similar to SerializerModel, but as a static method since
         this is an abstract model"""
         raise NotImplementedError

+    class Meta:
+        abstract = True
+

 class MetaResult:
     """Result returned by Meta Models' serializers. Empty class but we can't return none as

View File

@@ -1,5 +1,4 @@
 """OCI Client"""
-
 from typing import Any
 from urllib.parse import ParseResult, urlparse

Some files were not shown because too many files have changed in this diff.