Compare commits

..

42 Commits

Author SHA1 Message Date
8f995aab62 web: break application view into constituent parts
As part of the project to make the verticals more controllable and responsive, this commit breaks
the ApplicationView into two different parts: The API layer and the rendering layer.

The rendering layer is officially dumb beyond words; it knows nothing at all about Applications,
RBAC, or Outposts; it just draws what it's told to draw. It has parts inside that have their own
reactivity, but that reactivity means nothing to the renderer.

The Renderer itself is broken into two: The LoadingRenderer works when there is no application, and
the regular Renderer is built when there is. TypeScript's type checking makes it impossible to
attempt to use the standard renderer when there is no application, so all of the `this.application?`
checks just... go away.
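
As a minimal sketch of that split (the names here, including the `Application` shape and
`ak-application-view-sketch`, are illustrative assumptions rather than the actual authentik source),
the idea in Lit/TypeScript looks roughly like this:

```typescript
import { LitElement, html, type TemplateResult } from "lit";
import { customElement, state } from "lit/decorators.js";

// Illustrative shape only; the real Application model comes from the API client.
interface Application {
    name: string;
    launchUrl?: string;
}

// Rendered while there is no application yet; it knows nothing about Applications.
const renderLoading = (): TemplateResult => html`<p>Loading…</p>`;

// Only reachable when an application exists, so `app` is not optional and no
// `app?.name`-style guards are needed.
const renderApplication = (app: Application): TemplateResult => html`<h1>${app.name}</h1>`;

@customElement("ak-application-view-sketch")
export class ApplicationViewSketch extends LitElement {
    @state()
    application?: Application;

    render(): TemplateResult {
        // TypeScript narrows `this.application` to `Application` inside the ternary.
        return this.application ? renderApplication(this.application) : renderLoading();
    }
}
```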

A _huge_ section of the View is the control card, which offers the user the power to visit the
provider, provide access to the backchannel providers, edit the application, run an access check
against a given user, and launch the application. All of these features were heavily obscured by a
blizzard of dl/dt/dd HTML elements that made it hard to see what was in there. Each "description"
pair has been broken out into a tuple of Term and Description, with filters to remove the ones that
aren't applicable when an application doesn't have, for example, a launch URL or backchannel
providers. A utility function I wrote _ages_ ago renders the description-list markup for me, so I
don't have to do it all by hand.
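
A hedged sketch of that tuple-and-filter shape (the `DescriptionPair` type, the `descriptionList`
helper, and the fields on `Application` below are assumptions for illustration, not the real
utilities):

```typescript
import { html, type TemplateResult } from "lit";

// One control-card row: a Term and its Description.
type DescriptionPair = [term: string, description: TemplateResult | string];

// Illustrative shape only.
interface Application {
    launchUrl?: string;
    backchannelProviders: string[];
}

// The "utility function" role: turn the surviving pairs into dl/dt/dd markup.
export const descriptionList = (pairs: DescriptionPair[]): TemplateResult =>
    html`<dl>
        ${pairs.map(([term, desc]) => html`<dt>${term}</dt><dd>${desc}</dd>`)}
    </dl>`;

export const controlCard = (app: Application): TemplateResult => {
    const rows: (DescriptionPair | null)[] = [
        app.launchUrl ? ["Launch URL", app.launchUrl] : null,
        app.backchannelProviders.length
            ? ["Backchannel providers", app.backchannelProviders.join(", ")]
            : null,
    ];
    // Drop the rows that don't apply to this application, then render the rest.
    return descriptionList(rows.filter((row): row is DescriptionPair => row !== null));
};
```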

The nice thing about this work is that it now allows me to *see* where in the ApplicationView code
to focus my efforts on providing activation hooks for the "create a new policy," "assign a new
permission to a user," or "edit an application" commands that should be accessible by the palette,
or even from the sidebar.

The other nice thing is that it reveals just *where* in our code to focus our efforts on revamping
our styling, and making it better for ourselves and our users.

There's a reason I call this my *legibility project*.

It didn't even take that long... about 2½ hours, and I'm only going to get faster at it as the needs
of the different components become clear.
2024-05-08 17:47:26 -07:00
2846e49657 Added comments to explain something. 2024-05-08 13:56:46 -07:00
0e60e755d4 Eslint had opinions. 2024-05-08 13:50:47 -07:00
6cf2433e2b web: fix value handling inside controlled components
This is one of those stupid bugs that drive web developers crazy. The basics are straightforward:
when you cause a higher-level component to have a "big enough re-render," for some unknown
definition of "big enough," it will re-render the sub-components. In traditional web interaction,
those components should never be re-rendered while the user is interacting with the form, but in
frameworks where there's dynamic re-arrangement, part or all of the form could get re-rendered at
any moment. Since neither the form nor any of its intermediaries is tracking the values as they're
changed, it's up to the components themselves to preserve the user's input, and to be hardened
against property changes coming from the outside world.

So statically memoizing the initial value passed in, and aggressively walling off the values the
user generates in that field, are needed to protect the user's work from any framework's
dynamic DOM management. I remember struggling with this in React; I had hoped Lit was better, but in
this case, not better enough.

The protocol for "is it an ak-data-control" is "it has a `json()` method that returns the data ready
to be sent to the authentik server."  I missed that in one place, so that's on me.
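
A rough sketch of both ideas (memoize the first value handed in, let the user's input win over later
property writes, and expose `json()`); every name here is illustrative, not the actual authentik
component:

```typescript
import { LitElement, html, type TemplateResult } from "lit";
import { customElement, property, state } from "lit/decorators.js";

@customElement("ak-sketch-text-input")
export class SketchTextInput extends LitElement {
    // Value pushed in by the parent; it may be re-sent on any "big enough" re-render.
    @property({ type: String })
    value = "";

    // The user's own input, tracked locally once they start typing.
    @state()
    private userValue?: string;

    // Memoized copy of the very first value handed in.
    private initialValue?: string;

    willUpdate() {
        if (this.initialValue === undefined) {
            this.initialValue = this.value;
        }
    }

    // The user's input always wins over whatever a parent re-render pushes back in.
    private get effectiveValue(): string {
        return this.userValue ?? this.initialValue ?? "";
    }

    private onInput(ev: InputEvent) {
        this.userValue = (ev.target as HTMLInputElement).value;
    }

    // The "is it an ak-data-control" protocol: a json() method that returns the
    // data ready to be sent to the server.
    json(): string {
        return this.effectiveValue;
    }

    render(): TemplateResult {
        return html`<input .value=${this.effectiveValue} @input=${this.onInput} />`;
    }
}
```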
2024-05-08 13:38:49 -07:00
e1d565d40e Merge branch 'main' into dev
* main:
  core, web: update translations (#9633)
  core: bump google-api-python-client from 2.127.0 to 2.128.0 (#9641)
  core: bump goauthentik.io/api/v3 from 3.2024041.3 to 3.2024042.2 (#9635)
  core: bump golang from 1.22.2-bookworm to 1.22.3-bookworm (#9636)
  web: bump the esbuild group in /web with 2 updates (#9637)
  web: bump esbuild from 0.21.0 to 0.21.1 in /web (#9639)
  core: bump django from 5.0.5 to 5.0.6 (#9640)
  web: bump API Client version (#9630)
  enterprise/providers/google: initial account sync to google workspace (#9384)
  web/flows: fix error when using consecutive webauthn validator stages (#9629)
  web: bump API Client version (#9626)
  website/docs: refine intro page for sources (#9625)
  release: 2024.4.2
2024-05-08 08:06:05 -07:00
ee37e9235b Merge branch 'main' into dev
* main: (42 commits)
  website/docs: prepare 2024.4.2 release notes (#9555)
  web: bump the esbuild group in /web with 2 updates (#9616)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in ru (#9611)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#9560)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#9563)
  ci: bump golangci/golangci-lint-action from 5 to 6 (#9615)
  web: bump esbuild from 0.20.2 to 0.21.0 in /web (#9617)
  core: bump cryptography from 42.0.6 to 42.0.7 (#9618)
  core: bump sentry-sdk from 2.0.1 to 2.1.1 (#9619)
  core: bump django from 5.0.4 to 5.0.5 (#9620)
  core, web: update translations (#9613)
  website/docs: move Sources from Integrations into Docs (#9515)
  website/docs: add procedurals to flow inspector docs (#9556)
  core: bump jinja2 from 3.1.3 to 3.1.4 (#9610)
  web: clean up the options rendering in PromptForm (#9564)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in ru (#9608)
  website/docs: add instructions for deploying radius manually with docker compose (#9605)
  sources/scim: fix duplicate groups and invalid schema (#9466)
  core: fix condition in task clean_expiring_models (#9603)
  translate: Updates for file web/xliff/en.xlf in fr (#9600)
  ...
2024-05-07 08:49:15 -07:00
8248163958 Merge branch 'main' into dev
* main:
  website/docs: fix openssl rand commands (#9554)
  web: bump @sentry/browser from 7.112.2 to 7.113.0 in /web in the sentry group (#9549)
  core, web: update translations (#9548)
  core: bump goauthentik.io/api/v3 from 3.2024041.1 to 3.2024041.2 (#9551)
  core: bump django-model-utils from 4.5.0 to 4.5.1 (#9550)
  providers/scim: fix time_limit not set correctly (#9546)
2024-05-03 13:40:13 -07:00
9acebec1f6 Merge branch 'main' into dev
* main:
  web/flows: fix error when enrolling multiple WebAuthn devices consecutively (#9545)
  web: bump ejs from 3.1.9 to 3.1.10 in /tests/wdio (#9542)
  web: bump API Client version (#9543)
  providers/saml: fix ecdsa support (#9537)
  website/integrations: nextcloud: connect to existing user (#9155)
2024-05-03 08:22:55 -07:00
2a96900dc7 Merge branch 'main' into dev
* main: (43 commits)
  stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs (#9535)
  web: bump the rollup group across 1 directory with 3 updates (#9532)
  website/developer-docs: Add note for custom YAML tags in an IDE (#9528)
  lifecycle: close database connection after migrating (#9516)
  web: bump the babel group in /web with 3 updates (#9520)
  core: bump node from 21 to 22 (#9521)
  web: bump @codemirror/lang-python from 6.1.5 to 6.1.6 in /web (#9523)
  providers/rac: bump guacd to 1.5.5 (#9514)
  core: only prefetch related objects when required (#9476)
  website/integrations: move Fortimanager to Networking (#9505)
  website: bump react-tooltip from 5.26.3 to 5.26.4 in /website (#9494)
  web: bump the rollup group in /web with 3 updates (#9497)
  web: bump yaml from 2.4.1 to 2.4.2 in /web (#9499)
  core: bump goauthentik.io/api/v3 from 3.2024040.1 to 3.2024041.1 (#9503)
  core: bump pytest from 8.1.1 to 8.2.0 (#9501)
  website: bump react-dom from 18.3.0 to 18.3.1 in /website (#9495)
  website: bump react and @types/react in /website (#9496)
  web: bump react-dom from 18.3.0 to 18.3.1 in /web (#9498)
  core: bump sentry-sdk from 2.0.0 to 2.0.1 (#9502)
  web/flows: fix missing fallback for flow logo (#9487)
  ...
2024-05-01 17:24:34 -07:00
ca42506fa0 Merge branch 'main' into dev
* main:
  web: clean up some repetitive types (#9241)
  core: fix logic for token expiration (#9426)
  ci: fix ci pipeline (#9427)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in ru (#9424)
  web: Add resolved and integrity fields back to package-lock.json (#9419)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in ru (#9407)
  stages/identification: don't check source component (#9410)
  core: bump selenium from 4.19.0 to 4.20.0 (#9411)
  core: bump black from 24.4.0 to 24.4.1 (#9412)
  ci: bump golangci/golangci-lint-action from 4 to 5 (#9413)
  core: bump goauthentik.io/api/v3 from 3.2024023.2 to 3.2024040.1 (#9414)
  web: bump @sentry/browser from 7.112.1 to 7.112.2 in /web in the sentry group (#9416)
  sources/oauth: ensure all UI sources return a valid source (#9401)
  web: markdown: display markdown even when frontmatter is missing (#9404)
2024-04-25 08:38:08 -07:00
34de6bfd3a Merge branch 'main' into dev
* main:
  web: bump API Client version (#9400)
  release: 2024.4.0
  release: 2024.4.0-rc1
  root: bump blueprint schema version
  lifecycle: fix ak test-all command
  website/docs: finalize 2024.4 release notes (#9396)
  web: bump @sentry/browser from 7.111.0 to 7.112.1 in /web in the sentry group (#9387)
  web: bump the rollup group in /web with 3 updates (#9388)
  ci: bump helm/kind-action from 1.9.0 to 1.10.0 (#9389)
  website: bump clsx from 2.1.0 to 2.1.1 in /website (#9390)
  core: bump pydantic from 2.7.0 to 2.7.1 (#9391)
  core: bump freezegun from 1.4.0 to 1.5.0 (#9393)
  core: bump coverage from 7.4.4 to 7.5.0 (#9392)
  web: bump the storybook group in /web with 7 updates (#9380)
  web: bump the rollup group in /web with 3 updates (#9381)
2024-04-24 13:20:02 -07:00
2d94b16411 Merge branch 'main' into dev
* main: (24 commits)
  web: bump the wdio group in /tests/wdio with 4 updates (#9374)
  web: bump the rollup group in /web with 3 updates (#9371)
  core: bump ruff from 0.4.0 to 0.4.1 (#9372)
  core, web: update translations (#9366)
  web/admin: fix document title for admin interface (#9362)
  translate: Updates for file web/xliff/en.xlf in zh_CN (#9363)
  translate: Updates for file web/xliff/en.xlf in zh-Hans (#9364)
  core, web: update translations (#9360)
  website/docs: release notes 2024.4: add performance improvements values (#9356)
  translate: Updates for file web/xliff/en.xlf in zh_CN (#9317)
  translate: Updates for file web/xliff/en.xlf in zh-Hans (#9318)
  website/docs: 2024.4 release notes (#9267)
  sources/ldap: fix default blueprint for mapping user DN to path (#9355)
  web/admin: group form dual select (#9354)
  core: bump golang.org/x/net from 0.22.0 to 0.23.0 (#9351)
  core: bump goauthentik.io/api/v3 from 3.2024023.1 to 3.2024023.2 (#9345)
  web: bump chromedriver from 123.0.3 to 123.0.4 in /tests/wdio (#9348)
  core: bump twilio from 9.0.4 to 9.0.5 (#9346)
  core: bump ruff from 0.3.7 to 0.4.0 (#9347)
  web: bump @sentry/browser from 7.110.1 to 7.111.0 in /web in the sentry group (#9349)
  ...
2024-04-22 08:53:56 -07:00
98503f6009 Merge branch 'main' into dev
* main:
  stages/prompt: fix username field throwing error with existing user (#9342)
  root: expose session storage configuration (#9337)
  website/integrations: fix typo (#9340)
  root: fix go.mod for codeql checking (#9338)
  root: make redis settings more consistent (#9335)
  web/admin: fix error in admin interface due to un-hydrated context (#9336)
  web: bump API Client version (#9334)
  stages/authenticator_webauthn: fix attestation value (#9333)
  website/docs: fix SECRET_KEY length (#9328)
  website/docs: fix email template formatting (#9330)
  core, web: update translations (#9323)
  web: bump @patternfly/elements from 3.0.0 to 3.0.1 in /web (#9324)
  core: bump celery from 5.3.6 to 5.4.0 (#9325)
  core: bump goauthentik.io/api/v3 from 3.2024022.12 to 3.2024023.1 (#9327)
  sources/scim: service account should be internal (#9321)
  web: bump the storybook group in /web with 8 updates (#9266)
  sources/scim: cleanup service account when source is deleted (#9319)
2024-04-18 11:55:29 -07:00
ac4ba5d9e2 Merge branch 'main' into dev
* main: (23 commits)
  web: bump API Client version (#9316)
  release: 2024.2.3
  website/docs: 2024.2.3 release notes (#9313)
  web/admin: fix log viewer empty state (#9315)
  website/docs: fix formatting for stage changes (#9314)
  core: bump github.com/go-ldap/ldap/v3 from 3.4.7 to 3.4.8 (#9310)
  core: bump goauthentik.io/api/v3 from 3.2024022.11 to 3.2024022.12 (#9311)
  web: bump core-js from 3.36.1 to 3.37.0 in /web (#9309)
  core: bump gunicorn from 21.2.0 to 22.0.0 (#9308)
  core, web: update translations (#9307)
  website/docs: system settings: add default token duration and length (#9306)
  web/flows: update flow background (#9305)
  web: fix locale loading being skipped (#9301)
  translate: Updates for file web/xliff/en.xlf in fr (#9304)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in fr (#9303)
  core: replace authentik_signals_ignored_fields with audit_ignore (#9291)
  web/flow: fix form input rendering issue (#9297)
  events: fix incorrect user logged when using API token authentication (#9302)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#9293)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#9295)
  ...
2024-04-17 10:50:26 -07:00
f19ed14bf8 Merge branch 'main' into dev
* main: (34 commits)
  web: bump API Client version (#9299)
  core: fix api schema for users and groups (#9298)
  providers/oauth2: fix refresh_token grant returning incorrect id_token (#9275)
  web: bump @sentry/browser from 7.110.0 to 7.110.1 in /web in the sentry group (#9278)
  core, web: update translations (#9277)
  web: bump the rollup group in /web with 3 updates (#9280)
  web: bump lit from 3.1.2 to 3.1.3 in /web (#9282)
  web: bump @lit/context from 1.1.0 to 1.1.1 in /web (#9281)
  website: bump @types/react from 18.2.78 to 18.2.79 in /website (#9286)
  core: bump goauthentik.io/api/v3 from 3.2024022.10 to 3.2024022.11 (#9285)
  core: bump sqlparse from 0.4.4 to 0.5.0 (#9276)
  lifecycle: gunicorn: fix app preload (#9274)
  events: add indexes (#9272)
  web/flows: fix passwordless hidden without input (#9273)
  root: fix geoipupdate arguments (#9271)
  website/docs: cleanup more (#9249)
  web: bump API Client version (#9270)
  sources: add SCIM source (#3051)
  core: delegated group member management (#9254)
  web: bump API Client version (#9269)
  ...
2024-04-16 10:49:58 -07:00
085debf170 Merge branch 'main' into dev
* main: (21 commits)
  web: manage stacked modals with a stack (#9193)
  website/docs: ensure yaml code blocks have language tags (#9240)
  blueprints: only create default brand if no other default brand exists (#9222)
  web: bump API Client version (#9239)
  website/integrations: portainer: Fix Redirect URL mismatch (#9226)
  api: fix authentication schema (#9238)
  translate: Updates for file web/xliff/en.xlf in zh_CN (#9229)
  translate: Updates for file web/xliff/en.xlf in zh-Hans (#9230)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#9228)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#9231)
  core: bump pydantic from 2.6.4 to 2.7.0 (#9232)
  core: bump ruff from 0.3.5 to 0.3.7 (#9233)
  web: bump @sentry/browser from 7.109.0 to 7.110.0 in /web in the sentry group (#9234)
  website: bump @types/react from 18.2.75 to 18.2.77 in /website (#9236)
  core, web: update translations (#9225)
  website/integrations: add pfSense search scope (#9221)
  core: bump idna from 3.6 to 3.7 (#9224)
  website/docs: add websocket support to nginx snippets (#9220)
  internal: add tests to go flow executor (#9219)
  website/integrations: nextcloud: add tip to solve hashed groups configuring OAuth2 (#9153)
  ...
2024-04-12 14:27:20 -07:00
cacdf64408 Merge branch 'main' into dev
* main:
  website/docs: add more info and links about enforcing unique email addresses (#9154)
  core: bump goauthentik.io/api/v3 from 3.2024022.7 to 3.2024022.8 (#9215)
  web: bump API Client version (#9214)
  stages/authenticator_validate: add ability to limit webauthn device types (#9180)
  web: bump API Client version (#9213)
  core: add user settable token durations (#7410)
  core, web: update translations (#9205)
  web: bump typescript from 5.4.4 to 5.4.5 in /tests/wdio (#9206)
  web: bump chromedriver from 123.0.2 to 123.0.3 in /tests/wdio (#9207)
  core: bump sentry-sdk from 1.44.1 to 1.45.0 (#9208)
  web: bump typescript from 5.4.4 to 5.4.5 in /web (#9209)
  website: bump typescript from 5.4.4 to 5.4.5 in /website (#9210)
  core: bump python from 3.12.2-slim-bookworm to 3.12.3-slim-bookworm (#9211)
2024-04-11 08:10:41 -07:00
23665d173f Merge branch 'main' into dev
* main:
  website/docs: add note for flow compatibility mode (#9204)
2024-04-10 13:53:58 -07:00
272fdc516b Merge branch 'main' into dev
* main:
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#9194)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#9197)
  translate: Updates for file web/xliff/en.xlf in zh_CN (#9196)
  translate: Updates for file web/xliff/en.xlf in zh-Hans (#9198)
  web: preserve selected list when provider updates (#9200)
  web: bump API Client version (#9195)
  sources/oauth: make URLs not required, only check when no OIDC URLs are defined (#9182)
2024-04-10 08:17:38 -07:00
b08dcc2289 Merge branch 'main' into dev
* main:
  web/admin: fix SAML Provider preview (#9192)
  core, web: update translations (#9183)
  web: bump chromedriver from 123.0.1 to 123.0.2 in /tests/wdio (#9188)
  website: bump @types/react from 18.2.74 to 18.2.75 in /website (#9185)
  website/docs: update Postgresql username (#9190)
  core: bump maxmind/geoipupdate from v6.1 to v7.0 (#9186)
  events: add context manager to ignore/modify audit events being written (#9181)
  web: fix application library list display length and capability (#9094)
2024-04-09 08:47:08 -07:00
c84be1d961 Merge branch 'main' into dev
* main: (25 commits)
  root: fix readme (#9178)
  enterprise: fix audit middleware import (#9177)
  web: bump @spotlightjs/spotlight from 1.2.16 to 1.2.17 in /web in the sentry group (#9162)
  web: bump API Client version (#9174)
  stages/authenticator_webauthn: add MDS support (#9114)
  website/integrations: Update Nextcloud OIDC secret size limitation (#9139)
  translate: Updates for file web/xliff/en.xlf in zh_CN (#9170)
  translate: Updates for file web/xliff/en.xlf in zh-Hans (#9171)
  web: bump the rollup group in /web with 3 updates (#9164)
  web: bump @codemirror/legacy-modes from 6.3.3 to 6.4.0 in /web (#9166)
  web: bump ts-pattern from 5.1.0 to 5.1.1 in /web (#9167)
  core: bump github.com/go-ldap/ldap/v3 from 3.4.6 to 3.4.7 (#9168)
  core, web: update translations (#9156)
  root: fix redis username in lifecycle (#9158)
  web: ak-checkbox-group for short, static, multi-select events (#9138)
  root: fix startup (#9151)
  core: Bump golang.org/x/oauth2 from 0.18.0 to 0.19.0 (#9146)
  core: Bump twilio from 9.0.3 to 9.0.4 (#9143)
  web: Bump country-flag-icons from 1.5.10 to 1.5.11 in /web (#9144)
  web: Bump typescript from 5.4.3 to 5.4.4 in /web (#9145)
  ...
2024-04-08 09:22:54 -07:00
875fc5c735 Merge branch 'main' into dev
* main: (22 commits)
  blueprints: fix default username field in user-settings flow (#9136)
  website/docs: add procedural docs for RAC (#9006)
  web: bump API Client version (#9133)
  ci: fix python client generator (#9134)
  root: generate python client (#9107)
  web: Bump vite from 5.1.4 to 5.2.8 in /web (#9120)
  core, web: update translations (#9124)
  core: Bump golang from 1.22.1-bookworm to 1.22.2-bookworm (#9125)
  web: Bump the babel group in /web with 2 updates (#9126)
  web: Bump the eslint group in /web with 1 update (#9127)
  web: Bump the eslint group in /tests/wdio with 1 update (#9129)
  core: Bump sentry-sdk from 1.44.0 to 1.44.1 (#9130)
  core: Bump channels from 4.0.0 to 4.1.0 (#9131)
  core: Bump django from 5.0.3 to 5.0.4 (#9132)
  web: Bump the rollup group in /web with 3 updates (#9128)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh-Hans (#9110)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in zh_CN (#9109)
  translate: Updates for file web/xliff/en.xlf in zh_CN (#9111)
  translate: Updates for file web/xliff/en.xlf in zh-Hans (#9112)
  web: Bump @fortawesome/fontawesome-free from 6.5.1 to 6.5.2 in /web (#9116)
  ...
2024-04-04 10:54:11 -07:00
66cefcc918 Merge branch 'main' into dev
* main:
  root: fix missing imports after #9081 (#9106)
  root: move database calls from ready() to dedicated startup signal (#9081)
  web: fix console log leftover (#9096)
  web: bump the eslint group in /web with 2 updates (#9098)
  core: bump twilio from 9.0.2 to 9.0.3 (#9103)
  web: bump the eslint group in /tests/wdio with 2 updates (#9099)
  core: bump drf-spectacular from 0.27.1 to 0.27.2 (#9100)
  core: bump django-model-utils from 4.4.0 to 4.5.0 (#9101)
  core: bump ruff from 0.3.4 to 0.3.5 (#9102)
  website/docs:  update notes on SECRET_KEY (#9091)
  web: fix broken locale compile (#9095)
  website/integrations: add outline knowledge base (#8786)
  website/docs: fix typo (#9082)
  website/docs: email stage: fix example translation error (#9048)
2024-04-02 09:01:01 -07:00
5d4c38032f Merge branch 'main' into dev
* main:
  web: bump @patternfly/elements from 2.4.0 to 3.0.0 in /web (#9089)
  web: bump ts-pattern from 5.0.8 to 5.1.0 in /web (#9090)
  website: bump the docusaurus group in /website with 9 updates (#9087)
  web/admin: allow custom sorting for bound* tables (#9080)
2024-04-01 08:31:33 -07:00
7123b2c57b Merge branch 'main' into dev
* main:
  web: move context controllers into reactive controller plugins (#8996)
  web: maintenance: split tsconfig into “base” and “build” variants. (#9036)
  web: consistent style declarations internally (#9077)
2024-03-29 13:02:47 -07:00
fc00bdee63 Merge branch 'main' into dev
* main: (23 commits)
  providers/oauth2: fix interactive device flow (#9076)
  website/docs: fix transports example (#9074)
  events: fix log_capture (#9075)
  web: bump the sentry group in /web with 2 updates (#9065)
  core: bump goauthentik.io/api/v3 from 3.2024022.6 to 3.2024022.7 (#9064)
  web: bump @codemirror/lang-python from 6.1.4 to 6.1.5 in /web (#9068)
  web: bump the eslint group in /web with 1 update (#9066)
  web: bump glob from 10.3.10 to 10.3.12 in /web (#9069)
  web: bump the rollup group in /web with 3 updates (#9067)
  web: bump the eslint group in /tests/wdio with 1 update (#9071)
  core: bump webauthn from 2.0.0 to 2.1.0 (#9070)
  core: bump sentry-sdk from 1.43.0 to 1.44.0 (#9073)
  core: bump requests-mock from 1.12.0 to 1.12.1 (#9072)
  web: bump API Client version (#9061)
  events: rework log messages returned from API and their rendering (#8770)
  website/docs: update airgapped config (#9049)
  website: bump @types/react from 18.2.72 to 18.2.73 in /website (#9052)
  web: bump the rollup group in /web with 3 updates (#9053)
  core: bump django-filter from 24.1 to 24.2 (#9055)
  core: bump requests-mock from 1.11.0 to 1.12.0 (#9056)
  ...
2024-03-29 08:35:41 -07:00
a056703da0 Merge branch 'main' into dev
* main:
  web: a few minor bugfixes and lintfixes (#9044)
  website/integrations: add documentation for OIDC setup with Xen Orchestra (#9000)
  website: bump @types/react from 18.2.70 to 18.2.72 in /website (#9041)
  core: bump goauthentik.io/api/v3 from 3.2024022.5 to 3.2024022.6 (#9042)
  web: fix markdown rendering bug for alerts (#9037)
2024-03-27 10:51:02 -07:00
3f9502072d Merge branch 'main' into dev
* main:
  web: bump API Client version (#9035)
  website/docs: maintenance, re-add system settings (#9026)
  core: bump duo-client from 5.2.0 to 5.3.0 (#9029)
  website: bump express from 4.18.2 to 4.19.2 in /website (#9027)
  web: bump express from 4.18.3 to 4.19.2 in /web (#9028)
  web: bump the eslint group in /web with 2 updates (#9030)
  core: bump goauthentik.io/api/v3 from 3.2024022.3 to 3.2024022.5 (#9031)
  website: bump @types/react from 18.2.69 to 18.2.70 in /website (#9032)
  web: bump the eslint group in /tests/wdio with 2 updates (#9033)
  web: bump katex from 0.16.9 to 0.16.10 in /web (#9025)
  translate: Updates for file locale/en/LC_MESSAGES/django.po in fr (#9023)
  website/docs: include OS-specific docker-compose install instructions + minor fixes (#8975)
2024-03-26 08:58:18 -07:00
2d254d6a7e Merge branch 'main' into dev
* main:
  web: bump API Client version (#9021)
  sources/ldap: add ability to disable password write on login (#8377)
  web: bump API Client version (#9020)
  lifecycle: migrate: ensure template schema exists before migrating (#8952)
  website/integrations: Update nextcloud Admin Group Expression (#7314)
  web/flow: general ux improvements (#8558)
  website: bump @types/react from 18.2.67 to 18.2.69 in /website (#9016)
  core: bump requests-oauthlib from 1.4.0 to 2.0.0 (#9018)
  web: bump the sentry group in /web with 2 updates (#9017)
  web/admin: small fixes (#9002)
  website: bump webpack-dev-middleware from 5.3.3 to 5.3.4 in /website (#9001)
  core: bump ruff from 0.3.3 to 0.3.4 (#8998)
  website/docs: Upgrade nginx reverse proxy config (#8947)
  website/docs: improve flow inspector docs (#8993)
  website/developer-docs website/integrations: add links to integrations template (#8995)
2024-03-25 07:44:17 -07:00
a7e3dca917 Merge branch 'main' into dev
* main:
  website/docs: add example policy to enforce unique email address (#8955)
  web/admin: remove enterprise preview banner (#8991)
  core: bump uvicorn from 0.28.1 to 0.29.0 (#8980)
  core: bump sentry-sdk from 1.42.0 to 1.43.0 (#8981)
  web: bump the babel group in /web with 3 updates (#8983)
  web: bump typescript from 5.4.2 to 5.4.3 in /web (#8984)
  web: bump typescript from 5.4.2 to 5.4.3 in /tests/wdio (#8986)
  web: bump chromedriver from 122.0.6 to 123.0.0 in /tests/wdio (#8987)
  website: bump typescript from 5.4.2 to 5.4.3 in /website (#8989)
  core: bump importlib-metadata from 7.0.2 to 7.1.0 (#8982)
  web: bump the wdio group in /tests/wdio with 3 updates (#8985)
  website: bump postcss from 8.4.37 to 8.4.38 in /website (#8988)
2024-03-21 09:10:21 -07:00
5d8408287f Merge branch 'main' into dev
* main:
  website/docs: config: remove options moved to tenants (#8976)
  web: bump @types/grecaptcha from 3.0.8 to 3.0.9 in /web (#8971)
  web: bump country-flag-icons from 1.5.9 to 1.5.10 in /web (#8970)
  web: bump the babel group in /web with 7 updates (#8969)
  core: bump uvicorn from 0.28.0 to 0.28.1 (#8968)
  website: bump postcss from 8.4.36 to 8.4.37 in /website (#8967)
  internal: cleanup static file serving setup code (#8965)
  website/integrations: portainer: match portainer settings order (#8974)
2024-03-20 10:12:34 -07:00
30beca9118 Merge branch 'main' into dev
* main:
  web: improve build speeds even moar!!!!!! (#8954)
2024-03-19 14:37:17 -07:00
8946b81dbd Merge branch 'main' into dev
* main:
  outposts/proxy: Fix invalid redirect on external hosts containing path components (#8915)
  core: cache user application list under policies (#8895)
  web: bump the eslint group in /web with 2 updates (#8959)
  web: bump core-js from 3.36.0 to 3.36.1 in /web (#8960)
  website: bump @types/react from 18.2.66 to 18.2.67 in /website (#8962)
  web: bump the eslint group in /tests/wdio with 2 updates (#8963)
2024-03-19 14:36:12 -07:00
db96e1a901 Merge branch 'main' into dev
* main: (31 commits)
  root: support redis username (#8935)
  core: bump black from 24.2.0 to 24.3.0 (#8945)
  web: bump the wdio group in /tests/wdio with 2 updates (#8939)
  web: bump the sentry group in /web with 1 update (#8941)
  website: bump postcss from 8.4.35 to 8.4.36 in /website (#8940)
  core: bump twilio from 9.0.1 to 9.0.2 (#8942)
  core: bump ruff from 0.3.2 to 0.3.3 (#8943)
  events: discard notification if user has empty email (#8938)
  ci: always run ci-main on branch pushes (#8950)
  core: bump goauthentik.io/api/v3 from 3.2024022.2 to 3.2024022.3 (#8946)
  website/docs: add new name "Microsoft Entra ID" for Azure AD (#8930)
  outposts: Enhance config options for k8s outposts (#7363)
  website/docs: add link to CRUD docs (#8925)
  web: bump API Client version (#8927)
  outpost: improved set secret answers for flow execution (#8013)
  stages/user_write: ensure user data is json-serializable (#8926)
  website/docs: update example ldapsearch commands (#8906)
  admin: Handle latest version unknown in admin dashboard (#8858)
  core: bump coverage from 7.4.3 to 7.4.4 (#8917)
  core: bump urllib3 from 1.26.18 to 2.2.1 (#8918)
  ...
2024-03-18 07:58:44 -07:00
8b4e0361c4 Merge branch 'main' into dev
* main:
  web: clean up and remove redundant alias '@goauthentik/app' (#8889)
  web/admin: fix markdown table rendering (#8908)
2024-03-14 10:35:46 -07:00
22cb5b7379 Merge branch 'main' into dev
* main:
  web: bump chromedriver from 122.0.5 to 122.0.6 in /tests/wdio (#8902)
  web: bump vite-tsconfig-paths from 4.3.1 to 4.3.2 in /web (#8903)
  core: bump google.golang.org/protobuf from 1.32.0 to 1.33.0 (#8901)
  web: provide InstallID on EnterpriseListPage (#8898)
2024-03-14 08:14:43 -07:00
2d0117d096 Merge branch 'main' into dev
* main:
  api: capabilities: properly set can_save_media when s3 is enabled (#8896)
  web: bump the rollup group in /web with 3 updates (#8891)
  core: bump pydantic from 2.6.3 to 2.6.4 (#8892)
  core: bump twilio from 9.0.0 to 9.0.1 (#8893)
2024-03-13 14:05:11 -07:00
035bda4eac Merge branch 'main' into dev
* main:
  Update _envoy_istio.md (#8888)
  website/docs: new landing page for Providers (#8879)
  web: bump the sentry group in /web with 1 update (#8881)
  web: bump chromedriver from 122.0.4 to 122.0.5 in /tests/wdio (#8884)
  web: bump the eslint group in /tests/wdio with 2 updates (#8883)
  web: bump the eslint group in /web with 2 updates (#8885)
  website: bump @types/react from 18.2.64 to 18.2.65 in /website (#8886)
2024-03-12 13:31:35 -07:00
50906214e5 Merge branch 'main' into dev
* main:
  web: upgrade to lit 3 (#8781)
2024-03-11 11:03:04 -07:00
e505f274b6 Merge branch 'main' into dev
* main:
  web: fix esbuild issue with style sheets (#8856)
2024-03-11 10:28:05 -07:00
fe52f44dca Merge branch 'main' into dev
* main:
  tenants: really ensure default tenant cannot be deleted (#8875)
  core: bump github.com/go-openapi/runtime from 0.27.2 to 0.28.0 (#8867)
  core: bump pytest from 8.0.2 to 8.1.1 (#8868)
  core: bump github.com/go-openapi/strfmt from 0.22.2 to 0.23.0 (#8869)
  core: bump bandit from 1.7.7 to 1.7.8 (#8870)
  core: bump packaging from 23.2 to 24.0 (#8871)
  core: bump ruff from 0.3.1 to 0.3.2 (#8873)
  web: bump the wdio group in /tests/wdio with 3 updates (#8865)
  core: bump requests-oauthlib from 1.3.1 to 1.4.0 (#8866)
  core: bump uvicorn from 0.27.1 to 0.28.0 (#8872)
  core: bump django-filter from 23.5 to 24.1 (#8874)
2024-03-11 10:27:43 -07:00
3146e5a50f web: fix esbuild issue with style sheets
Getting ESBuild, Lit, and Storybook to all agree on how to read and parse stylesheets is a serious
pain. This fix better identifies the value types (instances) being passed from various sources in
the repo to the three *different* kinds of style processors we're using (the native one, the
polyfill one, and whatever the heck Storybook does internally).

Falling back to using older CSS instantiation techniques one era at a time seems to do the trick.
It's ugly, but in the face of the aggressive styling we use to avoid Flashes of Unstyled Content
(FOUC), it's the logic with which we're left.
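
A hedged sketch of the "one era at a time" fallback (this is not the actual authentik helper;
`StyleInput` and `adoptStyles` are illustrative): identify which kind of style value arrived, prefer
constructed stylesheets where the browser supports them, and drop back to `<style>` injection
otherwise.

```typescript
import { CSSResult } from "lit";

// The three shapes that tend to arrive from the bundler, the polyfill, and Storybook.
type StyleInput = CSSResult | CSSStyleSheet | string;

const toCssText = (input: StyleInput): string => {
    if (input instanceof CSSStyleSheet) {
        return Array.from(input.cssRules, (rule) => rule.cssText).join("\n");
    }
    return input instanceof CSSResult ? input.cssText : input;
};

export function adoptStyles(root: ShadowRoot, inputs: StyleInput[]): void {
    // Newest era: constructed stylesheets plus adoptedStyleSheets.
    if ("adoptedStyleSheets" in Document.prototype && "replaceSync" in CSSStyleSheet.prototype) {
        root.adoptedStyleSheets = inputs.map((input) => {
            if (input instanceof CSSStyleSheet) return input;
            const sheet = new CSSStyleSheet();
            sheet.replaceSync(toCssText(input));
            return sheet;
        });
        return;
    }
    // Older era: fall back to injecting <style> elements into the shadow root.
    for (const input of inputs) {
        const style = document.createElement("style");
        style.textContent = toCssText(input);
        root.appendChild(style);
    }
}
```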

In standard mode, the following warning appears on the console when running a Flow:

```
Autofocus processing was blocked because a document already has a focused element.
```

In compatibility mode, the following **error** appears on the console when running a Flow:

```
crawler-inject.js:1106 Uncaught TypeError: Failed to execute 'observe' on 'MutationObserver': parameter 1 is not of type 'Node'.
    at initDomMutationObservers (crawler-inject.js:1106:18)
    at crawler-inject.js:1114:24
    at Array.forEach (<anonymous>)
    at initDomMutationObservers (crawler-inject.js:1114:10)
    at crawler-inject.js:1549:1
initDomMutationObservers @ crawler-inject.js:1106
(anonymous) @ crawler-inject.js:1114
initDomMutationObservers @ crawler-inject.js:1114
(anonymous) @ crawler-inject.js:1549
```

Despite this error, nothing seems to be broken and flows work as anticipated.
2024-03-08 14:15:55 -08:00
157 changed files with 2119 additions and 9395 deletions

.vscode/settings.json vendored

@ -4,21 +4,20 @@
"asgi",
"authentik",
"authn",
"entra",
"goauthentik",
"jwks",
"kubernetes",
"oidc",
"openid",
"passwordless",
"plex",
"saml",
"scim",
"slo",
"sso",
"totp",
"traefik",
"webauthn",
"traefik",
"passwordless",
"kubernetes",
"sso",
"slo",
"scim",
],
"todo-tree.tree.showCountsInTree": true,
"todo-tree.tree.showBadges": true,


@ -4,6 +4,7 @@ from collections.abc import Iterable
from uuid import UUID
from django.apps import apps
from django.contrib.auth import get_user_model
from django.db.models import Model, Q, QuerySet
from django.utils.timezone import now
from django.utils.translation import gettext as _
@ -46,6 +47,8 @@ class Exporter:
def get_model_instances(self, model: type[Model]) -> QuerySet:
"""Return a queryset for `model`. Can be used to filter some
objects on some models"""
if model == get_user_model():
return model.objects.exclude_anonymous()
return model.objects.all()
def _pre_export(self, blueprint: Blueprint):


@ -39,14 +39,6 @@ from authentik.core.models import (
)
from authentik.enterprise.license import LicenseKey
from authentik.enterprise.models import LicenseUsage
from authentik.enterprise.providers.google_workspace.models import (
GoogleWorkspaceProviderGroup,
GoogleWorkspaceProviderUser,
)
from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProviderGroup,
MicrosoftEntraProviderUser,
)
from authentik.enterprise.providers.rac.models import ConnectionToken
from authentik.events.logs import LogEvent, capture_logs
from authentik.events.models import SystemTask
@ -94,7 +86,6 @@ def excluded_models() -> list[type[Model]]:
# Classes that have other dependencies
AuthenticatedSession,
# Classes which are only internally managed
# FIXME: these shouldn't need to be explicitly listed, but rather based off of a mixin
FlowToken,
LicenseUsage,
SCIMGroup,
@ -109,10 +100,6 @@ def excluded_models() -> list[type[Model]]:
WebAuthnDeviceType,
SCIMSourceUser,
SCIMSourceGroup,
GoogleWorkspaceProviderUser,
GoogleWorkspaceProviderGroup,
MicrosoftEntraProviderUser,
MicrosoftEntraProviderGroup,
)


@ -63,12 +63,8 @@ class ProviderFilter(FilterSet):
"""Filter for providers"""
application__isnull = BooleanFilter(method="filter_application__isnull")
backchannel = BooleanFilter(
method="filter_backchannel",
label=_(
"When not set all providers are returned. When set to true, only backchannel "
"providers are returned. When set to false, backchannel providers are excluded"
),
backchannel_only = BooleanFilter(
method="filter_backchannel_only",
)
def filter_application__isnull(self, queryset: QuerySet, name, value):
@ -79,9 +75,8 @@ class ProviderFilter(FilterSet):
| Q(application__isnull=value)
)
def filter_backchannel(self, queryset: QuerySet, name, value):
"""By default all providers are returned. When set to true, only backchannel providers are
returned. When set to false, backchannel providers are excluded"""
def filter_backchannel_only(self, queryset: QuerySet, name, value):
"""Only return backchannel providers"""
return queryset.filter(is_backchannel=value)


@ -408,7 +408,7 @@ class UserViewSet(UsedByMixin, ModelViewSet):
filterset_class = UsersFilter
def get_queryset(self):
base_qs = User.objects.all()
base_qs = User.objects.all().exclude_anonymous()
if self.serializer_class(context={"request": self.request})._should_include_groups:
base_qs = base_qs.prefetch_related("ak_groups")
return base_qs


@ -10,7 +10,7 @@ from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from django.db.models import Count
import authentik.core.models
import authentik.lib.validators
import authentik.lib.models
def migrate_sessions(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
@ -160,7 +160,7 @@ class Migration(migrations.Migration):
field=models.TextField(
blank=True,
default="",
validators=[authentik.lib.validators.DomainlessFormattedURLValidator()],
validators=[authentik.lib.models.DomainlessFormattedURLValidator()],
),
),
migrations.RunPython(


@ -1,23 +0,0 @@
# Generated by Django 5.0.4 on 2024-04-23 16:59
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0035_alter_group_options_and_more"),
]
operations = [
migrations.AddField(
model_name="group",
name="deleted_at",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="user",
name="deleted_at",
field=models.DateTimeField(blank=True, null=True),
),
]


@ -28,12 +28,10 @@ from authentik.lib.avatars import get_avatar
from authentik.lib.generators import generate_id
from authentik.lib.models import (
CreatedUpdatedModel,
DomainlessFormattedURLValidator,
SerializerModel,
SoftDeleteModel,
SoftDeleteQuerySet,
)
from authentik.lib.utils.time import timedelta_from_string
from authentik.lib.validators import DomainlessFormattedURLValidator
from authentik.policies.models import PolicyBindingModel
from authentik.tenants.models import DEFAULT_TOKEN_DURATION, DEFAULT_TOKEN_LENGTH
from authentik.tenants.utils import get_current_tenant, get_unique_identifier
@ -98,7 +96,7 @@ class UserTypes(models.TextChoices):
INTERNAL_SERVICE_ACCOUNT = "internal_service_account"
class Group(SoftDeleteModel, SerializerModel):
class Group(SerializerModel):
"""Group model which supports a basic hierarchy and has attributes"""
group_uuid = models.UUIDField(primary_key=True, editable=False, default=uuid4)
@ -188,21 +186,31 @@ class Group(SoftDeleteModel, SerializerModel):
]
class UserQuerySet(models.QuerySet):
"""User queryset"""
def exclude_anonymous(self):
"""Exclude anonymous user"""
return self.exclude(**{User.USERNAME_FIELD: settings.ANONYMOUS_USER_NAME})
class UserManager(DjangoUserManager):
"""User manager that doesn't assign is_superuser and is_staff"""
def get_queryset(self):
"""Create special user queryset"""
return SoftDeleteQuerySet(self.model, using=self._db).exclude(
**{User.USERNAME_FIELD: settings.ANONYMOUS_USER_NAME}
)
return UserQuerySet(self.model, using=self._db)
def create_user(self, username, email=None, password=None, **extra_fields):
"""User manager that doesn't assign is_superuser and is_staff"""
return self._create_user(username, email, password, **extra_fields)
def exclude_anonymous(self) -> QuerySet:
"""Exclude anonymous user"""
return self.get_queryset().exclude_anonymous()
class User(SoftDeleteModel, SerializerModel, GuardianUserMixin, AbstractUser):
class User(SerializerModel, GuardianUserMixin, AbstractUser):
"""authentik User model, based on django's contrib auth user model."""
uuid = models.UUIDField(default=uuid4, editable=False, unique=True)


@ -13,7 +13,7 @@ from django.utils.translation import gettext as _
from structlog.stdlib import get_logger
from authentik.core.models import Source, SourceUserMatchingModes, User, UserSourceConnection
from authentik.core.sources.stage import PLAN_CONTEXT_SOURCES_CONNECTION, PostSourceStage
from authentik.core.sources.stage import PLAN_CONTEXT_SOURCES_CONNECTION, PostUserEnrollmentStage
from authentik.events.models import Event, EventAction
from authentik.flows.exceptions import FlowNonApplicableException
from authentik.flows.models import Flow, FlowToken, Stage, in_memory_stage
@ -206,9 +206,13 @@ class SourceFlowManager:
def get_stages_to_append(self, flow: Flow) -> list[Stage]:
"""Hook to override stages which are appended to the flow"""
return [
in_memory_stage(PostSourceStage),
]
if not self.source.enrollment_flow:
return []
if flow.slug == self.source.enrollment_flow.slug:
return [
in_memory_stage(PostUserEnrollmentStage),
]
return []
def _prepare_flow(
self,
@ -262,9 +266,6 @@ class SourceFlowManager:
)
# We run the Flow planner here so we can pass the Pending user in the context
planner = FlowPlanner(flow)
# We append some stages so the initial flow we get might be empty
planner.allow_empty_flows = True
planner.use_cache = False
plan = planner.plan(self.request, kwargs)
for stage in self.get_stages_to_append(flow):
plan.append_stage(stage)
@ -323,7 +324,7 @@ class SourceFlowManager:
reverse(
"authentik_core:if-user",
)
+ "#/settings;page-sources"
+ f"#/settings;page-{self.source.slug}"
)
def handle_enroll(


@ -10,7 +10,7 @@ from authentik.flows.stage import StageView
PLAN_CONTEXT_SOURCES_CONNECTION = "goauthentik.io/sources/connection"
class PostSourceStage(StageView):
class PostUserEnrollmentStage(StageView):
"""Dynamically injected stage which saves the Connection after
the user has been enrolled."""
@ -21,12 +21,10 @@ class PostSourceStage(StageView):
]
user: User = self.executor.plan.context[PLAN_CONTEXT_PENDING_USER]
connection.user = user
linked = connection.pk is None
connection.save()
if linked:
Event.new(
EventAction.SOURCE_LINKED,
message="Linked Source",
source=connection.source,
).from_http(self.request)
Event.new(
EventAction.SOURCE_LINKED,
message="Linked Source",
source=connection.source,
).from_http(self.request)
return self.executor.stage_ok()


@ -2,15 +2,11 @@
from django.contrib.auth.models import AnonymousUser
from django.test import TestCase
from django.urls import reverse
from guardian.utils import get_anonymous_user
from authentik.core.models import SourceUserMatchingModes, User
from authentik.core.sources.flow_manager import Action
from authentik.core.sources.stage import PostSourceStage
from authentik.core.tests.utils import create_test_flow
from authentik.flows.planner import FlowPlan
from authentik.flows.views.executor import SESSION_KEY_PLAN
from authentik.lib.generators import generate_id
from authentik.lib.tests.utils import get_request
from authentik.policies.denied import AccessDeniedResponse
@ -25,55 +21,41 @@ class TestSourceFlowManager(TestCase):
def setUp(self) -> None:
super().setUp()
self.authentication_flow = create_test_flow()
self.enrollment_flow = create_test_flow()
self.source: OAuthSource = OAuthSource.objects.create(
name=generate_id(),
slug=generate_id(),
authentication_flow=self.authentication_flow,
enrollment_flow=self.enrollment_flow,
)
self.source: OAuthSource = OAuthSource.objects.create(name="test")
self.identifier = generate_id()
def test_unauthenticated_enroll(self):
"""Test un-authenticated user enrolling"""
request = get_request("/", user=AnonymousUser())
flow_manager = OAuthSourceFlowManager(self.source, request, self.identifier, {})
flow_manager = OAuthSourceFlowManager(
self.source, get_request("/", user=AnonymousUser()), self.identifier, {}
)
action, _ = flow_manager.get_action()
self.assertEqual(action, Action.ENROLL)
response = flow_manager.get_flow()
self.assertEqual(response.status_code, 302)
flow_plan: FlowPlan = request.session[SESSION_KEY_PLAN]
self.assertEqual(flow_plan.bindings[0].stage.view, PostSourceStage)
flow_manager.get_flow()
def test_unauthenticated_auth(self):
"""Test un-authenticated user authenticating"""
UserOAuthSourceConnection.objects.create(
user=get_anonymous_user(), source=self.source, identifier=self.identifier
)
request = get_request("/", user=AnonymousUser())
flow_manager = OAuthSourceFlowManager(self.source, request, self.identifier, {})
flow_manager = OAuthSourceFlowManager(
self.source, get_request("/", user=AnonymousUser()), self.identifier, {}
)
action, _ = flow_manager.get_action()
self.assertEqual(action, Action.AUTH)
response = flow_manager.get_flow()
self.assertEqual(response.status_code, 302)
flow_plan: FlowPlan = request.session[SESSION_KEY_PLAN]
self.assertEqual(flow_plan.bindings[0].stage.view, PostSourceStage)
flow_manager.get_flow()
def test_authenticated_link(self):
"""Test authenticated user linking"""
user = User.objects.create(username="foo", email="foo@bar.baz")
request = get_request("/", user=user)
flow_manager = OAuthSourceFlowManager(self.source, request, self.identifier, {})
flow_manager = OAuthSourceFlowManager(
self.source, get_request("/", user=user), self.identifier, {}
)
action, connection = flow_manager.get_action()
self.assertEqual(action, Action.LINK)
self.assertIsNone(connection.pk)
response = flow_manager.get_flow()
self.assertEqual(response.status_code, 302)
self.assertEqual(
response.url,
reverse("authentik_core:if-user") + "#/settings;page-sources",
)
flow_manager.get_flow()
def test_unauthenticated_link(self):
"""Test un-authenticated user linking"""


@ -132,7 +132,7 @@ class LicenseKey:
@staticmethod
def base_user_qs() -> QuerySet:
"""Base query set for all users"""
return User.objects.all().exclude(is_active=False)
return User.objects.all().exclude_anonymous().exclude(is_active=False)
@staticmethod
def get_default_user_count():


@ -1,33 +0,0 @@
"""GoogleWorkspaceProviderGroup API Views"""
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.users import UserGroupSerializer
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProviderGroup
class GoogleWorkspaceProviderGroupSerializer(SourceSerializer):
"""GoogleWorkspaceProviderGroup Serializer"""
group_obj = UserGroupSerializer(source="group", read_only=True)
class Meta:
model = GoogleWorkspaceProviderGroup
fields = [
"id",
"group",
"group_obj",
]
class GoogleWorkspaceProviderGroupViewSet(UsedByMixin, ModelViewSet):
"""GoogleWorkspaceProviderGroup Viewset"""
queryset = GoogleWorkspaceProviderGroup.objects.all().select_related("group")
serializer_class = GoogleWorkspaceProviderGroupSerializer
filterset_fields = ["provider__id", "group__name", "group__group_uuid"]
search_fields = ["provider__name", "group__name"]
ordering = ["group__name"]


@ -11,16 +11,16 @@ from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProviderMapping
class GoogleWorkspaceProviderMappingSerializer(PropertyMappingSerializer):
"""GoogleWorkspaceProviderMapping Serializer"""
class GoogleProviderMappingSerializer(PropertyMappingSerializer):
"""GoogleProviderMapping Serializer"""
class Meta:
model = GoogleWorkspaceProviderMapping
fields = PropertyMappingSerializer.Meta.fields
class GoogleWorkspaceProviderMappingFilter(FilterSet):
"""Filter for GoogleWorkspaceProviderMapping"""
class GoogleProviderMappingFilter(FilterSet):
"""Filter for GoogleProviderMapping"""
managed = extend_schema_field(OpenApiTypes.STR)(AllValuesMultipleFilter(field_name="managed"))
@ -29,11 +29,11 @@ class GoogleWorkspaceProviderMappingFilter(FilterSet):
fields = "__all__"
class GoogleWorkspaceProviderMappingViewSet(UsedByMixin, ModelViewSet):
"""GoogleWorkspaceProviderMapping Viewset"""
class GoogleProviderMappingViewSet(UsedByMixin, ModelViewSet):
"""GoogleProviderMapping Viewset"""
queryset = GoogleWorkspaceProviderMapping.objects.all()
serializer_class = GoogleWorkspaceProviderMappingSerializer
filterset_class = GoogleWorkspaceProviderMappingFilter
serializer_class = GoogleProviderMappingSerializer
filterset_class = GoogleProviderMappingFilter
search_fields = ["name"]
ordering = ["name"]


@ -10,8 +10,8 @@ from authentik.enterprise.providers.google_workspace.tasks import google_workspa
from authentik.lib.sync.outgoing.api import OutgoingSyncProviderStatusMixin
class GoogleWorkspaceProviderSerializer(EnterpriseRequiredMixin, ProviderSerializer):
"""GoogleWorkspaceProvider Serializer"""
class GoogleProviderSerializer(EnterpriseRequiredMixin, ProviderSerializer):
"""GoogleProvider Serializer"""
class Meta:
model = GoogleWorkspaceProvider
@ -38,11 +38,11 @@ class GoogleWorkspaceProviderSerializer(EnterpriseRequiredMixin, ProviderSeriali
extra_kwargs = {}
class GoogleWorkspaceProviderViewSet(OutgoingSyncProviderStatusMixin, UsedByMixin, ModelViewSet):
"""GoogleWorkspaceProvider Viewset"""
class GoogleProviderViewSet(OutgoingSyncProviderStatusMixin, UsedByMixin, ModelViewSet):
"""GoogleProvider Viewset"""
queryset = GoogleWorkspaceProvider.objects.all()
serializer_class = GoogleWorkspaceProviderSerializer
serializer_class = GoogleProviderSerializer
filterset_fields = [
"name",
"exclude_users_service_account",


@ -1,33 +0,0 @@
"""GoogleWorkspaceProviderUser API Views"""
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.groups import GroupMemberSerializer
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProviderUser
class GoogleWorkspaceProviderUserSerializer(SourceSerializer):
"""GoogleWorkspaceProviderUser Serializer"""
user_obj = GroupMemberSerializer(source="user", read_only=True)
class Meta:
model = GoogleWorkspaceProviderUser
fields = [
"id",
"user",
"user_obj",
]
class GoogleWorkspaceProviderUserViewSet(UsedByMixin, ModelViewSet):
"""GoogleWorkspaceProviderUser Viewset"""
queryset = GoogleWorkspaceProviderUser.objects.all().select_related("user")
serializer_class = GoogleWorkspaceProviderUserSerializer
filterset_fields = ["provider__id", "user__username", "user__id"]
search_fields = ["provider__name", "user__username"]
ordering = ["user__username"]


@ -1,5 +1,5 @@
from django.db.models import Model
from django.http import HttpResponseBadRequest, HttpResponseNotFound
from django.http import HttpResponseNotFound
from google.auth.exceptions import GoogleAuthError, TransportError
from googleapiclient.discovery import build
from googleapiclient.errors import Error, HttpError
@ -10,7 +10,6 @@ from authentik.enterprise.providers.google_workspace.models import GoogleWorkspa
from authentik.lib.sync.outgoing import HTTP_CONFLICT
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.exceptions import (
BadRequestSyncException,
NotFoundSyncException,
ObjectExistsSyncException,
StopSync,
@ -51,24 +50,22 @@ class GoogleWorkspaceSyncClient[TModel: Model, TConnection: Model, TSchema: dict
raise StopSync(exc) from exc
except HttpLib2Error as exc:
if isinstance(exc, HttpLib2ErrorWithResponse):
self._response_handle_status_code(request.body, exc.response.status, exc)
self._response_handle_status_code(exc.response.status, exc)
raise TransientSyncException(f"Failed to send request: {str(exc)}") from exc
except HttpError as exc:
self._response_handle_status_code(request.body, exc.status_code, exc)
self._response_handle_status_code(exc.status_code, exc)
raise TransientSyncException(f"Failed to send request: {str(exc)}") from exc
except Error as exc:
raise TransientSyncException(f"Failed to send request: {str(exc)}") from exc
return response
def _response_handle_status_code(self, request: dict, status_code: int, root_exc: Exception):
def _response_handle_status_code(self, status_code: int, root_exc: Exception):
if status_code == HttpResponseNotFound.status_code:
raise NotFoundSyncException("Object not found") from root_exc
if status_code == HTTP_CONFLICT:
raise ObjectExistsSyncException("Object exists") from root_exc
if status_code == HttpResponseBadRequest.status_code:
raise BadRequestSyncException("Bad request", request) from root_exc
def check_email_valid(self, *emails: str):
for email in emails:
if not any(email.endswith(f"@{domain_name}") for domain_name in self.domains):
raise BadRequestSyncException(f"Invalid email domain: {email}")
raise TransientSyncException(f"Invalid email domain: {email}")


@ -9,6 +9,7 @@ from authentik.core.expression.exceptions import (
from authentik.core.models import Group
from authentik.enterprise.providers.google_workspace.clients.base import GoogleWorkspaceSyncClient
from authentik.enterprise.providers.google_workspace.models import (
GoogleWorkspaceDeleteAction,
GoogleWorkspaceProviderGroup,
GoogleWorkspaceProviderMapping,
GoogleWorkspaceProviderUser,
@ -21,7 +22,6 @@ from authentik.lib.sync.outgoing.exceptions import (
StopSync,
TransientSyncException,
)
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.lib.utils.errors import exception_to_string
@ -34,7 +34,7 @@ class GoogleWorkspaceGroupClient(
connection_type_query = "group"
can_discover = True
def to_schema(self, obj: Group, creating: bool) -> dict:
def to_schema(self, obj: Group) -> dict:
"""Convert authentik group"""
raw_google_group = {
"email": f"{slugify(obj.name)}@{self.provider.default_group_email_domain}"
@ -45,12 +45,12 @@ class GoogleWorkspaceGroupClient(
if not isinstance(mapping, GoogleWorkspaceProviderMapping):
continue
try:
mapping: GoogleWorkspaceProviderMapping
value = mapping.evaluate(
user=None,
request=None,
group=obj,
provider=self.provider,
creating=creating,
)
if value is None:
continue
@ -79,7 +79,7 @@ class GoogleWorkspaceGroupClient(
self.logger.debug("Group does not exist in Google, skipping")
return None
with transaction.atomic():
if self.provider.group_delete_action == OutgoingSyncDeleteAction.DELETE:
if self.provider.group_delete_action == GoogleWorkspaceDeleteAction.DELETE:
self._request(
self.directory_service.groups().delete(groupKey=google_group.google_id)
)
@ -87,7 +87,7 @@ class GoogleWorkspaceGroupClient(
def create(self, group: Group):
"""Create group from scratch and create a connection object"""
google_group = self.to_schema(group, True)
google_group = self.to_schema(group)
self.check_email_valid(google_group["email"])
with transaction.atomic():
try:
@ -99,17 +99,17 @@ class GoogleWorkspaceGroupClient(
group_data = self._request(
self.directory_service.groups().get(groupKey=google_group["email"])
)
return GoogleWorkspaceProviderGroup.objects.create(
GoogleWorkspaceProviderGroup.objects.create(
provider=self.provider, group=group, google_id=group_data["id"]
)
else:
return GoogleWorkspaceProviderGroup.objects.create(
GoogleWorkspaceProviderGroup.objects.create(
provider=self.provider, group=group, google_id=response["id"]
)
def update(self, group: Group, connection: GoogleWorkspaceProviderGroup):
"""Update existing group"""
google_group = self.to_schema(group, False)
google_group = self.to_schema(group)
self.check_email_valid(google_group["email"])
try:
return self._request(
@ -124,16 +124,28 @@ class GoogleWorkspaceGroupClient(
def write(self, obj: Group):
google_group, created = super().write(obj)
self.create_sync_members(obj, google_group)
return google_group, created
if created:
self.create_sync_members(obj, google_group)
return google_group
def create_sync_members(self, obj: Group, google_group: GoogleWorkspaceProviderGroup):
def create_sync_members(self, obj: Group, google_group: dict):
"""Sync all members after a group was created"""
users = list(obj.users.order_by("id").values_list("id", flat=True))
connections = GoogleWorkspaceProviderUser.objects.filter(
provider=self.provider, user__pk__in=users
).values_list("google_id", flat=True)
self._patch(google_group.google_id, Direction.add, connections)
)
for user in connections:
try:
self._request(
self.directory_service.members().insert(
groupKey=google_group["id"],
body={
"email": user.google_id,
},
)
)
except TransientSyncException:
continue
def update_group(self, group: Group, action: Direction, users_set: set[int]):
"""Update a groups members"""


@ -8,6 +8,7 @@ from authentik.core.expression.exceptions import (
from authentik.core.models import User
from authentik.enterprise.providers.google_workspace.clients.base import GoogleWorkspaceSyncClient
from authentik.enterprise.providers.google_workspace.models import (
GoogleWorkspaceDeleteAction,
GoogleWorkspaceProviderMapping,
GoogleWorkspaceProviderUser,
)
@ -17,7 +18,6 @@ from authentik.lib.sync.outgoing.exceptions import (
StopSync,
TransientSyncException,
)
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.lib.utils.errors import exception_to_string
from authentik.policies.utils import delete_none_values
@ -29,18 +29,18 @@ class GoogleWorkspaceUserClient(GoogleWorkspaceSyncClient[User, GoogleWorkspaceP
connection_type_query = "user"
can_discover = True
def to_schema(self, obj: User, creating: bool) -> dict:
def to_schema(self, obj: User) -> dict:
"""Convert authentik user"""
raw_google_user = {}
for mapping in self.provider.property_mappings.all().order_by("name").select_subclasses():
if not isinstance(mapping, GoogleWorkspaceProviderMapping):
continue
try:
mapping: GoogleWorkspaceProviderMapping
value = mapping.evaluate(
user=obj,
request=None,
provider=self.provider,
creating=creating,
)
if value is None:
continue
@@ -71,11 +71,11 @@ class GoogleWorkspaceUserClient(GoogleWorkspaceSyncClient[User, GoogleWorkspaceP
return None
with transaction.atomic():
response = None
if self.provider.user_delete_action == OutgoingSyncDeleteAction.DELETE:
if self.provider.user_delete_action == GoogleWorkspaceDeleteAction.DELETE:
response = self._request(
self.directory_service.users().delete(userKey=google_user.google_id)
)
elif self.provider.user_delete_action == OutgoingSyncDeleteAction.SUSPEND:
elif self.provider.user_delete_action == GoogleWorkspaceDeleteAction.SUSPEND:
response = self._request(
self.directory_service.users().update(
userKey=google_user.google_id, body={"suspended": True}
@@ -86,7 +86,7 @@ class GoogleWorkspaceUserClient(GoogleWorkspaceSyncClient[User, GoogleWorkspaceP
def create(self, user: User):
"""Create user from scratch and create a connection object"""
google_user = self.to_schema(user, True)
google_user = self.to_schema(user)
self.check_email_valid(
google_user["primaryEmail"], *[x["address"] for x in google_user.get("emails", [])]
)
@@ -95,19 +95,19 @@ class GoogleWorkspaceUserClient(GoogleWorkspaceSyncClient[User, GoogleWorkspaceP
response = self._request(self.directory_service.users().insert(body=google_user))
except ObjectExistsSyncException:
# user already exists in google workspace, so we can connect them manually
return GoogleWorkspaceProviderUser.objects.create(
GoogleWorkspaceProviderUser.objects.create(
provider=self.provider, user=user, google_id=user.email
)
except TransientSyncException as exc:
raise exc
else:
return GoogleWorkspaceProviderUser.objects.create(
GoogleWorkspaceProviderUser.objects.create(
provider=self.provider, user=user, google_id=response["primaryEmail"]
)
def update(self, user: User, connection: GoogleWorkspaceProviderUser):
"""Update existing user"""
google_user = self.to_schema(user, False)
google_user = self.to_schema(user)
self.check_email_valid(
google_user["primaryEmail"], *[x["address"] for x in google_user.get("emails", [])]
)

View File

@@ -1,179 +0,0 @@
# Generated by Django 5.0.6 on 2024-05-09 12:57
import django.db.models.deletion
import uuid
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
replaces = [
("authentik_providers_google_workspace", "0001_initial"),
(
"authentik_providers_google_workspace",
"0002_alter_googleworkspaceprovidergroup_options_and_more",
),
]
initial = True
dependencies = [
("authentik_core", "0035_alter_group_options_and_more"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name="GoogleWorkspaceProviderMapping",
fields=[
(
"propertymapping_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="authentik_core.propertymapping",
),
),
],
options={
"verbose_name": "Google Workspace Provider Mapping",
"verbose_name_plural": "Google Workspace Provider Mappings",
},
bases=("authentik_core.propertymapping",),
),
migrations.CreateModel(
name="GoogleWorkspaceProvider",
fields=[
(
"provider_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="authentik_core.provider",
),
),
("delegated_subject", models.EmailField(max_length=254)),
("credentials", models.JSONField()),
(
"scopes",
models.TextField(
default="https://www.googleapis.com/auth/admin.directory.user,https://www.googleapis.com/auth/admin.directory.group,https://www.googleapis.com/auth/admin.directory.group.member,https://www.googleapis.com/auth/admin.directory.domain.readonly"
),
),
("default_group_email_domain", models.TextField()),
("exclude_users_service_account", models.BooleanField(default=False)),
(
"user_delete_action",
models.TextField(
choices=[
("do_nothing", "Do Nothing"),
("delete", "Delete"),
("suspend", "Suspend"),
],
default="delete",
),
),
(
"group_delete_action",
models.TextField(
choices=[
("do_nothing", "Do Nothing"),
("delete", "Delete"),
("suspend", "Suspend"),
],
default="delete",
),
),
(
"filter_group",
models.ForeignKey(
default=None,
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
to="authentik_core.group",
),
),
(
"property_mappings_group",
models.ManyToManyField(
blank=True,
default=None,
help_text="Property mappings used for group creation/updating.",
to="authentik_core.propertymapping",
),
),
],
options={
"verbose_name": "Google Workspace Provider",
"verbose_name_plural": "Google Workspace Providers",
},
bases=("authentik_core.provider", models.Model),
),
migrations.CreateModel(
name="GoogleWorkspaceProviderGroup",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4, editable=False, primary_key=True, serialize=False
),
),
("google_id", models.TextField()),
(
"group",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="authentik_core.group"
),
),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="authentik_providers_google_workspace.googleworkspaceprovider",
),
),
],
options={
"unique_together": {("google_id", "group", "provider")},
"verbose_name": "Google Workspace Provider Group",
"verbose_name_plural": "Google Workspace Provider Groups",
},
),
migrations.CreateModel(
name="GoogleWorkspaceProviderUser",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4, editable=False, primary_key=True, serialize=False
),
),
("google_id", models.TextField()),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="authentik_providers_google_workspace.googleworkspaceprovider",
),
),
(
"user",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL
),
),
],
options={
"unique_together": {("google_id", "user", "provider")},
"verbose_name": "Google Workspace Provider User",
"verbose_name_plural": "Google Workspace Provider Users",
},
),
]

View File

@@ -1,27 +0,0 @@
# Generated by Django 5.0.6 on 2024-05-08 14:35
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("authentik_providers_google_workspace", "0001_initial"),
]
operations = [
migrations.AlterModelOptions(
name="googleworkspaceprovidergroup",
options={
"verbose_name": "Google Workspace Provider Group",
"verbose_name_plural": "Google Workspace Provider Groups",
},
),
migrations.AlterModelOptions(
name="googleworkspaceprovideruser",
options={
"verbose_name": "Google Workspace Provider User",
"verbose_name_plural": "Google Workspace Provider Users",
},
),
]

View File

@@ -16,9 +16,8 @@ from authentik.core.models import (
User,
UserTypes,
)
from authentik.lib.models import SerializerModel
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction, OutgoingSyncProvider
from authentik.lib.sync.outgoing.models import OutgoingSyncProvider
def default_scopes() -> list[str]:
@@ -30,6 +29,15 @@ def default_scopes() -> list[str]:
]
class GoogleWorkspaceDeleteAction(models.TextChoices):
"""Action taken when a user/group is deleted in authentik. Suspend is not available for groups,
and will be treated as `do_nothing`"""
DO_NOTHING = "do_nothing"
DELETE = "delete"
SUSPEND = "suspend"
class GoogleWorkspaceProvider(OutgoingSyncProvider, BackchannelProvider):
"""Sync users from authentik into Google Workspace."""
@@ -40,10 +48,10 @@ class GoogleWorkspaceProvider(OutgoingSyncProvider, BackchannelProvider):
default_group_email_domain = models.TextField()
exclude_users_service_account = models.BooleanField(default=False)
user_delete_action = models.TextField(
choices=OutgoingSyncDeleteAction.choices, default=OutgoingSyncDeleteAction.DELETE
choices=GoogleWorkspaceDeleteAction.choices, default=GoogleWorkspaceDeleteAction.DELETE
)
group_delete_action = models.TextField(
choices=OutgoingSyncDeleteAction.choices, default=OutgoingSyncDeleteAction.DELETE
choices=GoogleWorkspaceDeleteAction.choices, default=GoogleWorkspaceDeleteAction.DELETE
)
filter_group = models.ForeignKey(
@@ -105,10 +113,10 @@ class GoogleWorkspaceProvider(OutgoingSyncProvider, BackchannelProvider):
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.google_workspace.api.providers import (
GoogleWorkspaceProviderSerializer,
GoogleProviderSerializer,
)
return GoogleWorkspaceProviderSerializer
return GoogleProviderSerializer
def __str__(self):
return f"Google Workspace Provider {self.name}"
@@ -128,10 +136,10 @@ class GoogleWorkspaceProviderMapping(PropertyMapping):
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.google_workspace.api.property_mappings import (
GoogleWorkspaceProviderMappingSerializer,
GoogleProviderMappingSerializer,
)
return GoogleWorkspaceProviderMappingSerializer
return GoogleProviderMappingSerializer
def __str__(self):
return f"Google Workspace Provider Mapping {self.name}"
@@ -141,7 +149,7 @@ class GoogleWorkspaceProviderMapping(PropertyMapping):
verbose_name_plural = _("Google Workspace Provider Mappings")
class GoogleWorkspaceProviderUser(SerializerModel):
class GoogleWorkspaceProviderUser(models.Model):
"""Mapping of a user and provider to a Google user ID"""
id = models.UUIDField(primary_key=True, editable=False, default=uuid4)
@@ -149,24 +157,14 @@ class GoogleWorkspaceProviderUser(SerializerModel):
user = models.ForeignKey(User, on_delete=models.CASCADE)
provider = models.ForeignKey(GoogleWorkspaceProvider, on_delete=models.CASCADE)
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.google_workspace.api.users import (
GoogleWorkspaceProviderUserSerializer,
)
return GoogleWorkspaceProviderUserSerializer
class Meta:
verbose_name = _("Google Workspace Provider User")
verbose_name_plural = _("Google Workspace Provider Users")
unique_together = (("google_id", "user", "provider"),)
def __str__(self) -> str:
return f"Google Workspace Provider User {self.user_id} to {self.provider_id}"
return f"Google Workspace User {self.user_id} to {self.provider_id}"
class GoogleWorkspaceProviderGroup(SerializerModel):
class GoogleWorkspaceProviderGroup(models.Model):
"""Mapping of a group and provider to a Google group ID"""
id = models.UUIDField(primary_key=True, editable=False, default=uuid4)
@@ -174,18 +172,8 @@ class GoogleWorkspaceProviderGroup(SerializerModel):
group = models.ForeignKey(Group, on_delete=models.CASCADE)
provider = models.ForeignKey(GoogleWorkspaceProvider, on_delete=models.CASCADE)
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.google_workspace.api.groups import (
GoogleWorkspaceProviderGroupSerializer,
)
return GoogleWorkspaceProviderGroupSerializer
class Meta:
verbose_name = _("Google Workspace Provider Group")
verbose_name_plural = _("Google Workspace Provider Groups")
unique_together = (("google_id", "group", "provider"),)
def __str__(self) -> str:
return f"Google Workspace Provider Group {self.group_id} to {self.provider_id}"
return f"Google Workspace Group {self.group_id} to {self.provider_id}"

View File

@@ -2,21 +2,18 @@
from authentik.enterprise.providers.google_workspace.models import GoogleWorkspaceProvider
from authentik.events.system_tasks import SystemTask
from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.lib.sync.outgoing.tasks import SyncTasks
from authentik.root.celery import CELERY_APP
sync_tasks = SyncTasks(GoogleWorkspaceProvider)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
@CELERY_APP.task()
def google_workspace_sync_objects(*args, **kwargs):
return sync_tasks.sync_objects(*args, **kwargs)
@CELERY_APP.task(
base=SystemTask, bind=True, autoretry_for=(TransientSyncException,), retry_backoff=True
)
@CELERY_APP.task(base=SystemTask, bind=True)
def google_workspace_sync(self, provider_pk: int, *args, **kwargs):
"""Run full sync for Google Workspace provider"""
return sync_tasks.sync_single(self, provider_pk, google_workspace_sync_objects)
@@ -27,11 +24,11 @@ def google_workspace_sync_all():
return sync_tasks.sync_all(google_workspace_sync)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
@CELERY_APP.task()
def google_workspace_sync_direct(*args, **kwargs):
return sync_tasks.sync_signal_direct(*args, **kwargs)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
@CELERY_APP.task()
def google_workspace_sync_m2m(*args, **kwargs):
return sync_tasks.sync_signal_m2m(*args, **kwargs)

View File

@@ -9,6 +9,7 @@ from authentik.core.models import Application, Group, User
from authentik.core.tests.utils import create_test_user
from authentik.enterprise.providers.google_workspace.clients.test_http import MockHTTP
from authentik.enterprise.providers.google_workspace.models import (
GoogleWorkspaceDeleteAction,
GoogleWorkspaceProvider,
GoogleWorkspaceProviderGroup,
GoogleWorkspaceProviderMapping,
@@ -16,7 +17,6 @@ from authentik.enterprise.providers.google_workspace.models import (
from authentik.enterprise.providers.google_workspace.tasks import google_workspace_sync
from authentik.events.models import Event, EventAction
from authentik.lib.generators import generate_id
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.lib.tests.utils import load_fixture
from authentik.tenants.models import Tenant
@@ -240,7 +240,7 @@ class GoogleWorkspaceGroupTests(TestCase):
def test_group_create_delete_do_nothing(self):
"""Test group deletion (delete action = do nothing)"""
self.provider.group_delete_action = OutgoingSyncDeleteAction.DO_NOTHING
self.provider.group_delete_action = GoogleWorkspaceDeleteAction.DO_NOTHING
self.provider.save()
uid = generate_id()
http = MockHTTP()

View File

@@ -9,6 +9,7 @@ from authentik.blueprints.tests import apply_blueprint
from authentik.core.models import Application, Group, User
from authentik.enterprise.providers.google_workspace.clients.test_http import MockHTTP
from authentik.enterprise.providers.google_workspace.models import (
GoogleWorkspaceDeleteAction,
GoogleWorkspaceProvider,
GoogleWorkspaceProviderMapping,
GoogleWorkspaceProviderUser,
@@ -16,7 +17,6 @@ from authentik.enterprise.providers.google_workspace.models import (
from authentik.enterprise.providers.google_workspace.tasks import google_workspace_sync
from authentik.events.models import Event, EventAction
from authentik.lib.generators import generate_id
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.lib.tests.utils import load_fixture
from authentik.tenants.models import Tenant
@@ -160,7 +160,7 @@ class GoogleWorkspaceUserTests(TestCase):
def test_user_create_delete_suspend(self):
"""Test user deletion (delete action = Suspend)"""
self.provider.user_delete_action = OutgoingSyncDeleteAction.SUSPEND
self.provider.user_delete_action = GoogleWorkspaceDeleteAction.SUSPEND
self.provider.save()
uid = generate_id()
http = MockHTTP()
@@ -209,7 +209,7 @@ class GoogleWorkspaceUserTests(TestCase):
def test_user_create_delete_do_nothing(self):
"""Test user deletion (delete action = do nothing)"""
self.provider.user_delete_action = OutgoingSyncDeleteAction.DO_NOTHING
self.provider.user_delete_action = GoogleWorkspaceDeleteAction.DO_NOTHING
self.provider.save()
uid = generate_id()
http = MockHTTP()

View File

@@ -1,21 +1,11 @@
"""google provider urls"""
from authentik.enterprise.providers.google_workspace.api.groups import (
GoogleWorkspaceProviderGroupViewSet,
)
from authentik.enterprise.providers.google_workspace.api.property_mappings import (
GoogleWorkspaceProviderMappingViewSet,
)
from authentik.enterprise.providers.google_workspace.api.providers import (
GoogleWorkspaceProviderViewSet,
)
from authentik.enterprise.providers.google_workspace.api.users import (
GoogleWorkspaceProviderUserViewSet,
GoogleProviderMappingViewSet,
)
from authentik.enterprise.providers.google_workspace.api.providers import GoogleProviderViewSet
api_urlpatterns = [
("providers/google_workspace", GoogleWorkspaceProviderViewSet),
("providers/google_workspace_users", GoogleWorkspaceProviderUserViewSet),
("providers/google_workspace_groups", GoogleWorkspaceProviderGroupViewSet),
("propertymappings/provider/google_workspace", GoogleWorkspaceProviderMappingViewSet),
("providers/google_workspace", GoogleProviderViewSet),
("propertymappings/provider/google_workspace", GoogleProviderMappingViewSet),
]

View File

@@ -1,33 +0,0 @@
"""MicrosoftEntraProviderGroup API Views"""
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.users import UserGroupSerializer
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProviderGroup
class MicrosoftEntraProviderGroupSerializer(SourceSerializer):
"""MicrosoftEntraProviderGroup Serializer"""
group_obj = UserGroupSerializer(source="group", read_only=True)
class Meta:
model = MicrosoftEntraProviderGroup
fields = [
"id",
"group",
"group_obj",
]
class MicrosoftEntraProviderGroupViewSet(UsedByMixin, ModelViewSet):
"""MicrosoftEntraProviderGroup Viewset"""
queryset = MicrosoftEntraProviderGroup.objects.all().select_related("group")
serializer_class = MicrosoftEntraProviderGroupSerializer
filterset_fields = ["provider__id", "group__name", "group__group_uuid"]
search_fields = ["provider__name", "group__name"]
ordering = ["group__name"]

View File

@@ -1,39 +0,0 @@
"""microsoft Property mappings API Views"""
from django_filters.filters import AllValuesMultipleFilter
from django_filters.filterset import FilterSet
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import extend_schema_field
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.propertymappings import PropertyMappingSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProviderMapping
class MicrosoftEntraProviderMappingSerializer(PropertyMappingSerializer):
"""MicrosoftEntraProviderMapping Serializer"""
class Meta:
model = MicrosoftEntraProviderMapping
fields = PropertyMappingSerializer.Meta.fields
class MicrosoftEntraProviderMappingFilter(FilterSet):
"""Filter for MicrosoftEntraProviderMapping"""
managed = extend_schema_field(OpenApiTypes.STR)(AllValuesMultipleFilter(field_name="managed"))
class Meta:
model = MicrosoftEntraProviderMapping
fields = "__all__"
class MicrosoftEntraProviderMappingViewSet(UsedByMixin, ModelViewSet):
"""MicrosoftEntraProviderMapping Viewset"""
queryset = MicrosoftEntraProviderMapping.objects.all()
serializer_class = MicrosoftEntraProviderMappingSerializer
filterset_class = MicrosoftEntraProviderMappingFilter
search_fields = ["name"]
ordering = ["name"]

View File

@@ -1,52 +0,0 @@
"""Microsoft Provider API Views"""
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.providers import ProviderSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.api import EnterpriseRequiredMixin
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.enterprise.providers.microsoft_entra.tasks import microsoft_entra_sync
from authentik.lib.sync.outgoing.api import OutgoingSyncProviderStatusMixin
class MicrosoftEntraProviderSerializer(EnterpriseRequiredMixin, ProviderSerializer):
"""MicrosoftEntraProvider Serializer"""
class Meta:
model = MicrosoftEntraProvider
fields = [
"pk",
"name",
"property_mappings",
"property_mappings_group",
"component",
"assigned_backchannel_application_slug",
"assigned_backchannel_application_name",
"verbose_name",
"verbose_name_plural",
"meta_model_name",
"client_id",
"client_secret",
"tenant_id",
"exclude_users_service_account",
"filter_group",
"user_delete_action",
"group_delete_action",
]
extra_kwargs = {}
class MicrosoftEntraProviderViewSet(OutgoingSyncProviderStatusMixin, UsedByMixin, ModelViewSet):
"""MicrosoftEntraProvider Viewset"""
queryset = MicrosoftEntraProvider.objects.all()
serializer_class = MicrosoftEntraProviderSerializer
filterset_fields = [
"name",
"exclude_users_service_account",
"filter_group",
]
search_fields = ["name"]
ordering = ["name"]
sync_single_task = microsoft_entra_sync

View File

@@ -1,33 +0,0 @@
"""MicrosoftEntraProviderUser API Views"""
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.groups import GroupMemberSerializer
from authentik.core.api.sources import SourceSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProviderUser
class MicrosoftEntraProviderUserSerializer(SourceSerializer):
"""MicrosoftEntraProviderUser Serializer"""
user_obj = GroupMemberSerializer(source="user", read_only=True)
class Meta:
model = MicrosoftEntraProviderUser
fields = [
"id",
"user",
"user_obj",
]
class MicrosoftEntraProviderUserViewSet(UsedByMixin, ModelViewSet):
"""MicrosoftEntraProviderUser Viewset"""
queryset = MicrosoftEntraProviderUser.objects.all().select_related("user")
serializer_class = MicrosoftEntraProviderUserSerializer
filterset_fields = ["provider__id", "user__username", "user__id"]
search_fields = ["provider__name", "user__username"]
ordering = ["user__username"]

View File

@@ -1,9 +0,0 @@
from authentik.enterprise.apps import EnterpriseConfig
class AuthentikEnterpriseProviderMicrosoftEntraConfig(EnterpriseConfig):
name = "authentik.enterprise.providers.microsoft_entra"
label = "authentik_providers_microsoft_entra"
verbose_name = "authentik Enterprise.Providers.Microsoft Entra"
default = True

View File

@@ -1,100 +0,0 @@
from asyncio import run
from collections.abc import Coroutine
from typing import Any
from azure.core.exceptions import (
ClientAuthenticationError,
ServiceRequestError,
ServiceResponseError,
)
from azure.identity.aio import ClientSecretCredential
from django.db.models import Model
from django.http import HttpResponseBadRequest, HttpResponseNotFound
from kiota_abstractions.api_error import APIError
from kiota_authentication_azure.azure_identity_authentication_provider import (
AzureIdentityAuthenticationProvider,
)
from kiota_http.kiota_client_factory import KiotaClientFactory
from msgraph.generated.models.o_data_errors.o_data_error import ODataError
from msgraph.graph_request_adapter import GraphRequestAdapter, options
from msgraph.graph_service_client import GraphServiceClient
from msgraph_core import GraphClientFactory
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.lib.sync.outgoing import HTTP_CONFLICT
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.exceptions import (
BadRequestSyncException,
NotFoundSyncException,
ObjectExistsSyncException,
StopSync,
TransientSyncException,
)
def get_request_adapter(
credentials: ClientSecretCredential, scopes: list[str] | None = None
) -> GraphRequestAdapter:
if scopes:
auth_provider = AzureIdentityAuthenticationProvider(credentials=credentials, scopes=scopes)
else:
auth_provider = AzureIdentityAuthenticationProvider(credentials=credentials)
return GraphRequestAdapter(
auth_provider=auth_provider,
client=GraphClientFactory.create_with_default_middleware(
options=options, client=KiotaClientFactory.get_default_client()
),
)
class MicrosoftEntraSyncClient[TModel: Model, TConnection: Model, TSchema: dict](
BaseOutgoingSyncClient[TModel, TConnection, TSchema, MicrosoftEntraProvider]
):
"""Base client for syncing to microsoft entra"""
domains: list
def __init__(self, provider: MicrosoftEntraProvider) -> None:
super().__init__(provider)
self.credentials = provider.microsoft_credentials()
self.__prefetch_domains()
@property
def client(self):
return GraphServiceClient(request_adapter=get_request_adapter(**self.credentials))
def _request[T](self, request: Coroutine[Any, Any, T]) -> T:
try:
return run(request)
except ClientAuthenticationError as exc:
raise StopSync(exc, None, None) from exc
except ODataError as exc:
raise StopSync(exc, None, None) from exc
except (ServiceRequestError, ServiceResponseError) as exc:
raise TransientSyncException("Failed to sent request") from exc
except APIError as exc:
if exc.response_status_code == HttpResponseNotFound.status_code:
raise NotFoundSyncException("Object not found") from exc
if exc.response_status_code == HttpResponseBadRequest.status_code:
raise BadRequestSyncException("Bad request", exc.response_headers) from exc
if exc.response_status_code == HTTP_CONFLICT:
raise ObjectExistsSyncException("Object exists", exc.response_headers) from exc
raise exc
def __prefetch_domains(self):
self.domains = []
organizations = self._request(self.client.organization.get())
next_link = True
while next_link:
for org in organizations.value:
self.domains.extend([x.name for x in org.verified_domains])
next_link = organizations.odata_next_link
if not next_link:
break
organizations = self._request(self.client.organization.with_url(next_link).get())
def check_email_valid(self, *emails: str):
for email in emails:
if not any(email.endswith(f"@{domain_name}") for domain_name in self.domains):
raise BadRequestSyncException(f"Invalid email domain: {email}")

View File

@@ -1,241 +0,0 @@
from deepmerge import always_merger
from django.db import transaction
from msgraph.generated.groups.groups_request_builder import GroupsRequestBuilder
from msgraph.generated.models.group import Group as MSGroup
from msgraph.generated.models.reference_create import ReferenceCreate
from authentik.core.expression.exceptions import (
PropertyMappingExpressionException,
SkipObjectException,
)
from authentik.core.models import Group
from authentik.enterprise.providers.microsoft_entra.clients.base import MicrosoftEntraSyncClient
from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProviderGroup,
MicrosoftEntraProviderMapping,
MicrosoftEntraProviderUser,
)
from authentik.events.models import Event, EventAction
from authentik.lib.sync.outgoing.base import Direction
from authentik.lib.sync.outgoing.exceptions import (
NotFoundSyncException,
ObjectExistsSyncException,
StopSync,
TransientSyncException,
)
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.lib.utils.errors import exception_to_string
class MicrosoftEntraGroupClient(
MicrosoftEntraSyncClient[Group, MicrosoftEntraProviderGroup, MSGroup]
):
"""Microsoft client for groups"""
connection_type = MicrosoftEntraProviderGroup
connection_type_query = "group"
can_discover = True
def to_schema(self, obj: Group, creating: bool) -> MSGroup:
"""Convert authentik group"""
raw_microsoft_group = {}
for mapping in (
self.provider.property_mappings_group.all().order_by("name").select_subclasses()
):
if not isinstance(mapping, MicrosoftEntraProviderMapping):
continue
try:
value = mapping.evaluate(
user=None,
request=None,
group=obj,
provider=self.provider,
creating=creating,
)
if value is None:
continue
always_merger.merge(raw_microsoft_group, value)
except SkipObjectException as exc:
raise exc from exc
except (PropertyMappingExpressionException, ValueError) as exc:
# Value error can be raised when assigning invalid data to an attribute
Event.new(
EventAction.CONFIGURATION_ERROR,
message=f"Failed to evaluate property-mapping {exception_to_string(exc)}",
mapping=mapping,
).save()
raise StopSync(exc, obj, mapping) from exc
if not raw_microsoft_group:
raise StopSync(ValueError("No group mappings configured"), obj)
try:
return MSGroup(**raw_microsoft_group)
except TypeError as exc:
raise StopSync(exc, obj) from exc
def delete(self, obj: Group):
"""Delete group"""
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=obj
).first()
if not microsoft_group:
self.logger.debug("Group does not exist in Microsoft, skipping")
return None
with transaction.atomic():
if self.provider.group_delete_action == OutgoingSyncDeleteAction.DELETE:
self._request(self.client.groups.by_group_id(microsoft_group.microsoft_id).delete())
microsoft_group.delete()
def create(self, group: Group):
"""Create group from scratch and create a connection object"""
microsoft_group = self.to_schema(group, True)
with transaction.atomic():
try:
response = self._request(self.client.groups.post(microsoft_group))
except ObjectExistsSyncException:
# group already exists in microsoft entra, so we can connect them manually
# for groups we need to fetch the group from microsoft as we connect on
# ID and not group email
query_params = GroupsRequestBuilder.GroupsRequestBuilderGetQueryParameters(
filter=f"displayName eq '{microsoft_group.display_name}'",
)
request_configuration = (
GroupsRequestBuilder.GroupsRequestBuilderGetRequestConfiguration(
query_parameters=query_params,
)
)
group_data = self._request(self.client.groups.get(request_configuration))
if group_data.odata_count < 1:
self.logger.warning(
"Group which could not be created also does not exist", group=group
)
return
return MicrosoftEntraProviderGroup.objects.create(
provider=self.provider, group=group, microsoft_id=group_data.value[0].id
)
else:
return MicrosoftEntraProviderGroup.objects.create(
provider=self.provider, group=group, microsoft_id=response.id
)
def update(self, group: Group, connection: MicrosoftEntraProviderGroup):
"""Update existing group"""
microsoft_group = self.to_schema(group, False)
microsoft_group.id = connection.microsoft_id
try:
return self._request(
self.client.groups.by_group_id(connection.microsoft_id).patch(microsoft_group)
)
except NotFoundSyncException:
# Resource missing is handled by self.write, which will re-create the group
raise
def write(self, obj: Group):
microsoft_group, created = super().write(obj)
self.create_sync_members(obj, microsoft_group)
return microsoft_group, created
def create_sync_members(self, obj: Group, microsoft_group: MicrosoftEntraProviderGroup):
"""Sync all members after a group was created"""
users = list(obj.users.order_by("id").values_list("id", flat=True))
connections = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user__pk__in=users
).values_list("microsoft_id", flat=True)
self._patch(microsoft_group.microsoft_id, Direction.add, connections)
def update_group(self, group: Group, action: Direction, users_set: set[int]):
"""Update a groups members"""
if action == Direction.add:
return self._patch_add_users(group, users_set)
if action == Direction.remove:
return self._patch_remove_users(group, users_set)
def _patch(self, microsoft_group_id: str, direction: Direction, members: list[str]):
for user in members:
try:
if direction == Direction.add:
request_body = ReferenceCreate(
odata_id=f"https://graph.microsoft.com/v1.0/directoryObjects/{user}",
)
self._request(
self.client.groups.by_group_id(microsoft_group_id).members.ref.post(
request_body
)
)
if direction == Direction.remove:
self._request(
self.client.groups.by_group_id(microsoft_group_id)
.members.by_directory_object_id(user)
.ref.delete()
)
except ObjectExistsSyncException:
pass
except TransientSyncException:
raise
def _patch_add_users(self, group: Group, users_set: set[int]):
"""Add users in users_set to group"""
if len(users_set) < 1:
return
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
if not microsoft_group:
self.logger.warning(
"could not sync group membership, group does not exist", group=group
)
return
user_ids = list(
MicrosoftEntraProviderUser.objects.filter(
user__pk__in=users_set, provider=self.provider
).values_list("microsoft_id", flat=True)
)
if len(user_ids) < 1:
return
self._patch(microsoft_group.microsoft_id, Direction.add, user_ids)
def _patch_remove_users(self, group: Group, users_set: set[int]):
"""Remove users in users_set from group"""
if len(users_set) < 1:
return
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
if not microsoft_group:
self.logger.warning(
"could not sync group membership, group does not exist", group=group
)
return
user_ids = list(
MicrosoftEntraProviderUser.objects.filter(
user__pk__in=users_set, provider=self.provider
).values_list("microsoft_id", flat=True)
)
if len(user_ids) < 1:
return
self._patch(microsoft_group.microsoft_id, Direction.remove, user_ids)
def discover(self):
"""Iterate through all groups and connect them with authentik groups if possible"""
groups = self._request(self.client.groups.get())
next_link = True
while next_link:
for group in groups.value:
self._discover_single_group(group)
next_link = groups.odata_next_link
if not next_link:
break
groups = self._request(self.client.groups.with_url(next_link).get())
def _discover_single_group(self, group: MSGroup):
"""handle discovery of a single group"""
microsoft_name = group.unique_name
matching_authentik_group = (
self.provider.get_object_qs(Group).filter(name=microsoft_name).first()
)
if not matching_authentik_group:
return
MicrosoftEntraProviderGroup.objects.get_or_create(
provider=self.provider,
group=matching_authentik_group,
microsoft_id=group.id,
)

View File

@@ -1,150 +0,0 @@
from deepmerge import always_merger
from django.db import transaction
from msgraph.generated.models.user import User as MSUser
from msgraph.generated.users.users_request_builder import UsersRequestBuilder
from authentik.core.expression.exceptions import (
PropertyMappingExpressionException,
SkipObjectException,
)
from authentik.core.models import User
from authentik.enterprise.providers.microsoft_entra.clients.base import MicrosoftEntraSyncClient
from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProviderMapping,
MicrosoftEntraProviderUser,
)
from authentik.events.models import Event, EventAction
from authentik.lib.sync.outgoing.exceptions import (
ObjectExistsSyncException,
StopSync,
TransientSyncException,
)
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.lib.utils.errors import exception_to_string
from authentik.policies.utils import delete_none_values
class MicrosoftEntraUserClient(MicrosoftEntraSyncClient[User, MicrosoftEntraProviderUser, MSUser]):
"""Sync authentik users into microsoft entra"""
connection_type = MicrosoftEntraProviderUser
connection_type_query = "user"
can_discover = True
def to_schema(self, obj: User, creating: bool) -> MSUser:
"""Convert authentik user"""
raw_microsoft_user = {}
for mapping in self.provider.property_mappings.all().order_by("name").select_subclasses():
if not isinstance(mapping, MicrosoftEntraProviderMapping):
continue
try:
value = mapping.evaluate(
user=obj,
request=None,
provider=self.provider,
creating=creating,
)
if value is None:
continue
always_merger.merge(raw_microsoft_user, value)
except SkipObjectException as exc:
raise exc from exc
except (PropertyMappingExpressionException, ValueError) as exc:
# Value error can be raised when assigning invalid data to an attribute
Event.new(
EventAction.CONFIGURATION_ERROR,
message=f"Failed to evaluate property-mapping {exception_to_string(exc)}",
mapping=mapping,
).save()
raise StopSync(exc, obj, mapping) from exc
if not raw_microsoft_user:
raise StopSync(ValueError("No user mappings configured"), obj)
try:
return MSUser(**delete_none_values(raw_microsoft_user))
except TypeError as exc:
raise StopSync(exc, obj) from exc
def delete(self, obj: User):
"""Delete user"""
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=obj
).first()
if not microsoft_user:
self.logger.debug("User does not exist in Microsoft, skipping")
return None
with transaction.atomic():
response = None
if self.provider.user_delete_action == OutgoingSyncDeleteAction.DELETE:
response = self._request(
self.client.users.by_user_id(microsoft_user.microsoft_id).delete()
)
elif self.provider.user_delete_action == OutgoingSyncDeleteAction.SUSPEND:
response = self._request(
self.client.users.by_user_id(microsoft_user.microsoft_id).patch(
MSUser(account_enabled=False)
)
)
microsoft_user.delete()
return response
def create(self, user: User):
"""Create user from scratch and create a connection object"""
microsoft_user = self.to_schema(user, True)
self.check_email_valid(microsoft_user.user_principal_name)
with transaction.atomic():
try:
response = self._request(self.client.users.post(microsoft_user))
except ObjectExistsSyncException:
# user already exists in microsoft entra, so we can connect them manually
query_params = UsersRequestBuilder.UsersRequestBuilderGetQueryParameters(
filter=f"mail eq '{microsoft_user.mail}'",
)
request_configuration = (
UsersRequestBuilder.UsersRequestBuilderGetRequestConfiguration(
query_parameters=query_params,
)
)
user_data = self._request(self.client.users.get(request_configuration))
if user_data.odata_count < 1:
self.logger.warning(
"User which could not be created also does not exist", user=user
)
return
return MicrosoftEntraProviderUser.objects.create(
provider=self.provider, user=user, microsoft_id=user_data.value[0].id
)
except TransientSyncException as exc:
raise exc
else:
return MicrosoftEntraProviderUser.objects.create(
provider=self.provider, user=user, microsoft_id=response.id
)
def update(self, user: User, connection: MicrosoftEntraProviderUser):
"""Update existing user"""
microsoft_user = self.to_schema(user, False)
self.check_email_valid(microsoft_user.user_principal_name)
self._request(self.client.users.by_user_id(connection.microsoft_id).patch(microsoft_user))
def discover(self):
"""Iterate through all users and connect them with authentik users if possible"""
users = self._request(self.client.users.get())
next_link = True
while next_link:
for user in users.value:
self._discover_single_user(user)
next_link = users.odata_next_link
if not next_link:
break
users = self._request(self.client.users.with_url(next_link).get())
def _discover_single_user(self, user: MSUser):
"""handle discovery of a single user"""
matching_authentik_user = self.provider.get_object_qs(User).filter(email=user.mail).first()
if not matching_authentik_user:
return
MicrosoftEntraProviderUser.objects.get_or_create(
provider=self.provider,
user=matching_authentik_user,
microsoft_id=user.id,
)

View File

@@ -1,165 +0,0 @@
# Generated by Django 5.0.6 on 2024-05-08 14:35
import django.db.models.deletion
import uuid
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
("authentik_core", "0035_alter_group_options_and_more"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name="MicrosoftEntraProviderMapping",
fields=[
(
"propertymapping_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="authentik_core.propertymapping",
),
),
],
options={
"verbose_name": "Microsoft Entra Provider Mapping",
"verbose_name_plural": "Microsoft Entra Provider Mappings",
},
bases=("authentik_core.propertymapping",),
),
migrations.CreateModel(
name="MicrosoftEntraProvider",
fields=[
(
"provider_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="authentik_core.provider",
),
),
("client_id", models.TextField()),
("client_secret", models.TextField()),
("tenant_id", models.TextField()),
("exclude_users_service_account", models.BooleanField(default=False)),
(
"user_delete_action",
models.TextField(
choices=[
("do_nothing", "Do Nothing"),
("delete", "Delete"),
("suspend", "Suspend"),
],
default="delete",
),
),
(
"group_delete_action",
models.TextField(
choices=[
("do_nothing", "Do Nothing"),
("delete", "Delete"),
("suspend", "Suspend"),
],
default="delete",
),
),
(
"filter_group",
models.ForeignKey(
default=None,
null=True,
on_delete=django.db.models.deletion.SET_DEFAULT,
to="authentik_core.group",
),
),
(
"property_mappings_group",
models.ManyToManyField(
blank=True,
default=None,
help_text="Property mappings used for group creation/updating.",
to="authentik_core.propertymapping",
),
),
],
options={
"verbose_name": "Microsoft Entra Provider",
"verbose_name_plural": "Microsoft Entra Providers",
},
bases=("authentik_core.provider", models.Model),
),
migrations.CreateModel(
name="MicrosoftEntraProviderGroup",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4, editable=False, primary_key=True, serialize=False
),
),
("microsoft_id", models.TextField()),
(
"group",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to="authentik_core.group"
),
),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="authentik_providers_microsoft_entra.microsoftentraprovider",
),
),
],
options={
"verbose_name": "Microsoft Entra Provider Group",
"verbose_name_plural": "Microsoft Entra Provider Groups",
"unique_together": {("microsoft_id", "group", "provider")},
},
),
migrations.CreateModel(
name="MicrosoftEntraProviderUser",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4, editable=False, primary_key=True, serialize=False
),
),
("microsoft_id", models.TextField()),
(
"provider",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="authentik_providers_microsoft_entra.microsoftentraprovider",
),
),
(
"user",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL
),
),
],
options={
"verbose_name": "Microsoft Entra Provider User",
"verbose_name_plural": "Microsoft Entra Provider User",
"unique_together": {("microsoft_id", "user", "provider")},
},
),
]

View File

@@ -1,180 +0,0 @@
"""Microsoft Entra sync provider"""
from typing import Any, Self
from uuid import uuid4
from azure.identity.aio import ClientSecretCredential
from django.db import models
from django.db.models import QuerySet
from django.utils.translation import gettext_lazy as _
from rest_framework.serializers import Serializer
from authentik.core.models import (
BackchannelProvider,
Group,
PropertyMapping,
User,
UserTypes,
)
from authentik.lib.models import SerializerModel
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction, OutgoingSyncProvider
class MicrosoftEntraProvider(OutgoingSyncProvider, BackchannelProvider):
"""Sync users from authentik into Microsoft Entra."""
client_id = models.TextField()
client_secret = models.TextField()
tenant_id = models.TextField()
exclude_users_service_account = models.BooleanField(default=False)
user_delete_action = models.TextField(
choices=OutgoingSyncDeleteAction.choices, default=OutgoingSyncDeleteAction.DELETE
)
group_delete_action = models.TextField(
choices=OutgoingSyncDeleteAction.choices, default=OutgoingSyncDeleteAction.DELETE
)
filter_group = models.ForeignKey(
"authentik_core.group", on_delete=models.SET_DEFAULT, default=None, null=True
)
property_mappings_group = models.ManyToManyField(
PropertyMapping,
default=None,
blank=True,
help_text=_("Property mappings used for group creation/updating."),
)
def client_for_model(
self, model: type[User | Group]
) -> BaseOutgoingSyncClient[User | Group, Any, Any, Self]:
if issubclass(model, User):
from authentik.enterprise.providers.microsoft_entra.clients.users import (
MicrosoftEntraUserClient,
)
return MicrosoftEntraUserClient(self)
if issubclass(model, Group):
from authentik.enterprise.providers.microsoft_entra.clients.groups import (
MicrosoftEntraGroupClient,
)
return MicrosoftEntraGroupClient(self)
raise ValueError(f"Invalid model {model}")
def get_object_qs(self, type: type[User | Group]) -> QuerySet[User | Group]:
if type == User:
# Get queryset of all users with consistent ordering
# according to the provider's settings
base = User.objects.all().exclude_anonymous()
if self.exclude_users_service_account:
base = base.exclude(type=UserTypes.SERVICE_ACCOUNT).exclude(
type=UserTypes.INTERNAL_SERVICE_ACCOUNT
)
if self.filter_group:
base = base.filter(ak_groups__in=[self.filter_group])
return base.order_by("pk")
if type == Group:
# Get queryset of all groups with consistent ordering
return Group.objects.all().order_by("pk")
raise ValueError(f"Invalid type {type}")
def microsoft_credentials(self):
return {
"credentials": ClientSecretCredential(
self.tenant_id, self.client_id, self.client_secret
)
}
@property
def component(self) -> str:
return "ak-provider-microsoft-entra-form"
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.microsoft_entra.api.providers import (
MicrosoftEntraProviderSerializer,
)
return MicrosoftEntraProviderSerializer
def __str__(self):
return f"Microsoft Entra Provider {self.name}"
class Meta:
verbose_name = _("Microsoft Entra Provider")
verbose_name_plural = _("Microsoft Entra Providers")
class MicrosoftEntraProviderMapping(PropertyMapping):
"""Map authentik data to outgoing Microsoft requests"""
@property
def component(self) -> str:
return "ak-property-mapping-microsoft-entra-form"
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.microsoft_entra.api.property_mappings import (
MicrosoftEntraProviderMappingSerializer,
)
return MicrosoftEntraProviderMappingSerializer
def __str__(self):
return f"Microsoft Entra Provider Mapping {self.name}"
class Meta:
verbose_name = _("Microsoft Entra Provider Mapping")
verbose_name_plural = _("Microsoft Entra Provider Mappings")
class MicrosoftEntraProviderUser(SerializerModel):
"""Mapping of a user and provider to a Microsoft user ID"""
id = models.UUIDField(primary_key=True, editable=False, default=uuid4)
microsoft_id = models.TextField()
user = models.ForeignKey(User, on_delete=models.CASCADE)
provider = models.ForeignKey(MicrosoftEntraProvider, on_delete=models.CASCADE)
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.microsoft_entra.api.users import (
MicrosoftEntraProviderUserSerializer,
)
return MicrosoftEntraProviderUserSerializer
class Meta:
verbose_name = _("Microsoft Entra Provider User")
verbose_name_plural = _("Microsoft Entra Provider User")
unique_together = (("microsoft_id", "user", "provider"),)
def __str__(self) -> str:
return f"Microsoft Entra Provider User {self.user_id} to {self.provider_id}"
class MicrosoftEntraProviderGroup(SerializerModel):
"""Mapping of a group and provider to a Microsoft group ID"""
id = models.UUIDField(primary_key=True, editable=False, default=uuid4)
microsoft_id = models.TextField()
group = models.ForeignKey(Group, on_delete=models.CASCADE)
provider = models.ForeignKey(MicrosoftEntraProvider, on_delete=models.CASCADE)
@property
def serializer(self) -> type[Serializer]:
from authentik.enterprise.providers.microsoft_entra.api.groups import (
MicrosoftEntraProviderGroupSerializer,
)
return MicrosoftEntraProviderGroupSerializer
class Meta:
verbose_name = _("Microsoft Entra Provider Group")
verbose_name_plural = _("Microsoft Entra Provider Groups")
unique_together = (("microsoft_id", "group", "provider"),)
def __str__(self) -> str:
return f"Microsoft Entra Provider Group {self.group_id} to {self.provider_id}"

View File

@@ -1,13 +0,0 @@
"""Microsoft Entra provider task Settings"""
from celery.schedules import crontab
from authentik.lib.utils.time import fqdn_rand
CELERY_BEAT_SCHEDULE = {
"providers_microsoft_entra_sync": {
"task": "authentik.enterprise.providers.microsoft_entra.tasks.microsoft_entra_sync_all",
"schedule": crontab(minute=fqdn_rand("microsoft_entra_sync_all"), hour="*/4"),
"options": {"queue": "authentik_scheduled"},
},
}

View File

@@ -1,16 +0,0 @@
"""Microsoft provider signals"""
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.enterprise.providers.microsoft_entra.tasks import (
microsoft_entra_sync,
microsoft_entra_sync_direct,
microsoft_entra_sync_m2m,
)
from authentik.lib.sync.outgoing.signals import register_signals
register_signals(
MicrosoftEntraProvider,
task_sync_single=microsoft_entra_sync,
task_sync_direct=microsoft_entra_sync_direct,
task_sync_m2m=microsoft_entra_sync_m2m,
)

View File

@@ -1,37 +0,0 @@
"""Microsoft Entra Provider tasks"""
from authentik.enterprise.providers.microsoft_entra.models import MicrosoftEntraProvider
from authentik.events.system_tasks import SystemTask
from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.lib.sync.outgoing.tasks import SyncTasks
from authentik.root.celery import CELERY_APP
sync_tasks = SyncTasks(MicrosoftEntraProvider)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def microsoft_entra_sync_objects(*args, **kwargs):
return sync_tasks.sync_objects(*args, **kwargs)
@CELERY_APP.task(
base=SystemTask, bind=True, autoretry_for=(TransientSyncException,), retry_backoff=True
)
def microsoft_entra_sync(self, provider_pk: int, *args, **kwargs):
"""Run full sync for Microsoft Entra provider"""
return sync_tasks.sync_single(self, provider_pk, microsoft_entra_sync_objects)
@CELERY_APP.task()
def microsoft_entra_sync_all():
return sync_tasks.sync_all(microsoft_entra_sync)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def microsoft_entra_sync_direct(*args, **kwargs):
return sync_tasks.sync_signal_direct(*args, **kwargs)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def microsoft_entra_sync_m2m(*args, **kwargs):
return sync_tasks.sync_signal_m2m(*args, **kwargs)

View File

@@ -1,392 +0,0 @@
"""Microsoft Entra Group tests"""
from unittest.mock import AsyncMock, MagicMock, patch
from azure.identity.aio import ClientSecretCredential
from django.test import TestCase
from msgraph.generated.models.group import Group as MSGroup
from msgraph.generated.models.group_collection_response import GroupCollectionResponse
from msgraph.generated.models.organization import Organization
from msgraph.generated.models.organization_collection_response import OrganizationCollectionResponse
from msgraph.generated.models.user import User as MSUser
from msgraph.generated.models.user_collection_response import UserCollectionResponse
from msgraph.generated.models.verified_domain import VerifiedDomain
from authentik.blueprints.tests import apply_blueprint
from authentik.core.models import Application, Group, User
from authentik.core.tests.utils import create_test_user
from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProvider,
MicrosoftEntraProviderGroup,
MicrosoftEntraProviderMapping,
MicrosoftEntraProviderUser,
)
from authentik.enterprise.providers.microsoft_entra.tasks import microsoft_entra_sync
from authentik.events.models import Event, EventAction
from authentik.lib.generators import generate_id
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.tenants.models import Tenant
class MicrosoftEntraGroupTests(TestCase):
"""Microsoft Entra Group tests"""
@apply_blueprint("system/providers-microsoft-entra.yaml")
def setUp(self) -> None:
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple groups
Tenant.objects.update(avatars="none")
User.objects.all().exclude_anonymous().delete()
Group.objects.all().delete()
self.provider: MicrosoftEntraProvider = MicrosoftEntraProvider.objects.create(
name=generate_id(),
client_id=generate_id(),
client_secret=generate_id(),
tenant_id=generate_id(),
exclude_users_service_account=True,
)
self.app: Application = Application.objects.create(
name=generate_id(),
slug=generate_id(),
)
self.app.backchannel_providers.add(self.provider)
self.provider.property_mappings.add(
MicrosoftEntraProviderMapping.objects.get(
managed="goauthentik.io/providers/microsoft_entra/user"
)
)
self.provider.property_mappings_group.add(
MicrosoftEntraProviderMapping.objects.get(
managed="goauthentik.io/providers/microsoft_entra/group"
)
)
self.creds = ClientSecretCredential(generate_id(), generate_id(), generate_id())
def test_group_create(self):
"""Test group creation"""
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=generate_id())),
) as group_create,
):
group = Group.objects.create(name=uid)
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
self.assertIsNotNone(microsoft_group)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
group_create.assert_called_once()
def test_group_create_update(self):
"""Test group updating"""
uid = generate_id()
ext_id = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=ext_id)),
) as group_create,
patch(
"msgraph.generated.groups.item.group_item_request_builder.GroupItemRequestBuilder.patch",
AsyncMock(return_value=MSGroup(id=ext_id)),
) as group_patch,
):
group = Group.objects.create(name=uid)
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
self.assertIsNotNone(microsoft_group)
group.name = "new name"
group.save()
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
group_create.assert_called_once()
group_patch.assert_called_once()
def test_group_create_delete(self):
"""Test group deletion"""
uid = generate_id()
ext_id = generate_id()
with (
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=ext_id)),
) as group_create,
patch(
"msgraph.generated.groups.item.group_item_request_builder.GroupItemRequestBuilder.delete",
AsyncMock(return_value=MSGroup(id=ext_id)),
) as group_delete,
):
group = Group.objects.create(name=uid)
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
self.assertIsNotNone(microsoft_group)
group.delete()
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
group_create.assert_called_once()
group_delete.assert_called_once()
def test_group_create_member_add(self):
"""Test group creation"""
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=uid)),
) as group_create,
patch(
"msgraph.generated.groups.item.members.ref.ref_request_builder.RefRequestBuilder.post",
AsyncMock(),
) as member_add,
):
user = create_test_user(uid)
group = Group.objects.create(name=uid)
group.users.add(user)
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
self.assertIsNotNone(microsoft_group)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_create.assert_called_once()
group_create.assert_called_once()
member_add.assert_called_once()
self.assertEqual(
member_add.call_args[0][0].odata_id,
f"https://graph.microsoft.com/v1.0/directoryObjects/{MicrosoftEntraProviderUser.objects.filter(
provider=self.provider,
).first().microsoft_id}",
)
def test_group_create_member_remove(self):
"""Test group creation and member removal"""
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=uid)),
) as group_create,
patch(
"msgraph.generated.groups.item.members.ref.ref_request_builder.RefRequestBuilder.post",
AsyncMock(),
) as member_add,
patch(
"msgraph.generated.groups.item.members.item.ref.ref_request_builder.RefRequestBuilder.delete",
AsyncMock(),
) as member_remove,
):
user = create_test_user(uid)
group = Group.objects.create(name=uid)
group.users.add(user)
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
self.assertIsNotNone(microsoft_group)
group.users.remove(user)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_create.assert_called_once()
group_create.assert_called_once()
member_add.assert_called_once()
self.assertEqual(
member_add.call_args[0][0].odata_id,
f"https://graph.microsoft.com/v1.0/directoryObjects/{MicrosoftEntraProviderUser.objects.filter(
provider=self.provider,
).first().microsoft_id}",
)
member_remove.assert_called_once()
def test_group_create_delete_do_nothing(self):
"""Test group deletion (delete action = do nothing)"""
self.provider.group_delete_action = OutgoingSyncDeleteAction.DO_NOTHING
self.provider.save()
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=uid)),
) as group_create,
patch(
"msgraph.generated.groups.item.group_item_request_builder.GroupItemRequestBuilder.delete",
AsyncMock(return_value=MSGroup(id=uid)),
) as group_delete,
):
group = Group.objects.create(name=uid)
microsoft_group = MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group=group
).first()
self.assertIsNotNone(microsoft_group)
group.delete()
self.assertFalse(
MicrosoftEntraProviderGroup.objects.filter(
provider=self.provider, group__name=uid
).exists()
)
group_create.assert_called_once()
group_delete.assert_not_called()
def test_sync_task(self):
"""Test group discovery"""
uid = generate_id()
self.app.backchannel_providers.remove(self.provider)
different_group = Group.objects.create(
name=uid,
)
self.app.backchannel_providers.add(self.provider)
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
),
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.post",
AsyncMock(return_value=MSGroup(id=generate_id())),
),
patch(
"msgraph.generated.groups.item.group_item_request_builder.GroupItemRequestBuilder.patch",
AsyncMock(return_value=MSGroup(id=uid)),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.get",
AsyncMock(
return_value=UserCollectionResponse(
value=[MSUser(mail=f"{uid}@goauthentik.io", id=uid)]
)
),
) as user_list,
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.get",
AsyncMock(
return_value=GroupCollectionResponse(
value=[MSGroup(display_name=uid, unique_name=uid, id=uid)]
)
),
) as group_list,
):
microsoft_entra_sync.delay(self.provider.pk).get()
self.assertTrue(
MicrosoftEntraProviderGroup.objects.filter(
group=different_group, provider=self.provider
).exists()
)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_list.assert_called_once()
group_list.assert_called_once()
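
The tests in this file all lean on the same stubbing pattern: the msgraph SDK's generated request-builder methods are coroutines, so they are replaced with AsyncMock instances, and the request body sent to the remote side is later recovered through call_args. A minimal, self-contained sketch of that mechanic follows; sync_group is an illustrative stand-in for the provider's signal handler and is not part of the codebase.

import asyncio
from unittest.mock import AsyncMock

async def sync_group(post):
    """Illustrative stand-in for the outgoing-sync path that awaits a Graph call."""
    return await post({"displayName": "example"})

group_create = AsyncMock(return_value={"id": "remote-id"})
result = asyncio.run(sync_group(group_create))

group_create.assert_called_once()
assert result == {"id": "remote-id"}
request_body = group_create.call_args[0][0]  # first positional argument, as asserted on in the tests above
assert request_body["displayName"] == "example"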

View File

@ -1,337 +0,0 @@
"""Microsoft Entra User tests"""
from unittest.mock import AsyncMock, MagicMock, patch
from azure.identity.aio import ClientSecretCredential
from django.test import TestCase
from msgraph.generated.models.group_collection_response import GroupCollectionResponse
from msgraph.generated.models.organization import Organization
from msgraph.generated.models.organization_collection_response import OrganizationCollectionResponse
from msgraph.generated.models.user import User as MSUser
from msgraph.generated.models.user_collection_response import UserCollectionResponse
from msgraph.generated.models.verified_domain import VerifiedDomain
from authentik.blueprints.tests import apply_blueprint
from authentik.core.models import Application, Group, User
from authentik.enterprise.providers.microsoft_entra.models import (
MicrosoftEntraProvider,
MicrosoftEntraProviderMapping,
MicrosoftEntraProviderUser,
)
from authentik.enterprise.providers.microsoft_entra.tasks import microsoft_entra_sync
from authentik.events.models import Event, EventAction
from authentik.lib.generators import generate_id
from authentik.lib.sync.outgoing.models import OutgoingSyncDeleteAction
from authentik.tenants.models import Tenant
class MicrosoftEntraUserTests(TestCase):
"""Microsoft Entra User tests"""
@apply_blueprint("system/providers-microsoft-entra.yaml")
def setUp(self) -> None:
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple users
Tenant.objects.update(avatars="none")
User.objects.all().exclude_anonymous().delete()
Group.objects.all().delete()
self.provider: MicrosoftEntraProvider = MicrosoftEntraProvider.objects.create(
name=generate_id(),
client_id=generate_id(),
client_secret=generate_id(),
tenant_id=generate_id(),
exclude_users_service_account=True,
)
self.app: Application = Application.objects.create(
name=generate_id(),
slug=generate_id(),
)
self.app.backchannel_providers.add(self.provider)
self.provider.property_mappings.add(
MicrosoftEntraProviderMapping.objects.get(
managed="goauthentik.io/providers/microsoft_entra/user"
)
)
self.provider.property_mappings_group.add(
MicrosoftEntraProviderMapping.objects.get(
managed="goauthentik.io/providers/microsoft_entra/group"
)
)
self.creds = ClientSecretCredential(generate_id(), generate_id(), generate_id())
def test_user_create(self):
"""Test user creation"""
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
):
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=user
).first()
self.assertIsNotNone(microsoft_user)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_create.assert_called_once()
def test_user_create_update(self):
"""Test user updating"""
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_patch,
):
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=user
).first()
self.assertIsNotNone(microsoft_user)
user.name = "new name"
user.save()
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_create.assert_called_once()
user_patch.assert_called_once()
def test_user_create_delete(self):
"""Test user deletion"""
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.delete",
AsyncMock(),
) as user_delete,
):
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=user
).first()
self.assertIsNotNone(microsoft_user)
user.delete()
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_create.assert_called_once()
user_delete.assert_called_once()
def test_user_create_delete_suspend(self):
"""Test user deletion (delete action = Suspend)"""
self.provider.user_delete_action = OutgoingSyncDeleteAction.SUSPEND
self.provider.save()
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_patch,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.delete",
AsyncMock(),
) as user_delete,
):
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=user
).first()
self.assertIsNotNone(microsoft_user)
user.delete()
self.assertFalse(
MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user__username=uid
).exists()
)
user_create.assert_called_once()
user_patch.assert_called_once()
self.assertFalse(user_patch.call_args[0][0].account_enabled)
user_delete.assert_not_called()
def test_user_create_delete_do_nothing(self):
"""Test user deletion (delete action = do nothing)"""
self.provider.user_delete_action = OutgoingSyncDeleteAction.DO_NOTHING
self.provider.save()
uid = generate_id()
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.post",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_create,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
) as user_patch,
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.delete",
AsyncMock(),
) as user_delete,
):
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
microsoft_user = MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user=user
).first()
self.assertIsNotNone(microsoft_user)
user.delete()
self.assertFalse(
MicrosoftEntraProviderUser.objects.filter(
provider=self.provider, user__username=uid
).exists()
)
user_create.assert_called_once()
user_patch.assert_not_called()
user_delete.assert_not_called()
def test_sync_task(self):
"""Test user discovery"""
uid = generate_id()
self.app.backchannel_providers.remove(self.provider)
different_user = User.objects.create(
username=uid,
email=f"{uid}@goauthentik.io",
)
self.app.backchannel_providers.add(self.provider)
with (
patch(
"authentik.enterprise.providers.microsoft_entra.models.MicrosoftEntraProvider.microsoft_credentials",
MagicMock(return_value={"credentials": self.creds}),
),
patch(
"msgraph.generated.organization.organization_request_builder.OrganizationRequestBuilder.get",
AsyncMock(
return_value=OrganizationCollectionResponse(
value=[
Organization(verified_domains=[VerifiedDomain(name="goauthentik.io")])
]
)
),
),
patch(
"msgraph.generated.users.item.user_item_request_builder.UserItemRequestBuilder.patch",
AsyncMock(return_value=MSUser(id=generate_id())),
),
patch(
"msgraph.generated.users.users_request_builder.UsersRequestBuilder.get",
AsyncMock(
return_value=UserCollectionResponse(
value=[MSUser(mail=f"{uid}@goauthentik.io", id=uid)]
)
),
) as user_list,
patch(
"msgraph.generated.groups.groups_request_builder.GroupsRequestBuilder.get",
AsyncMock(return_value=GroupCollectionResponse(value=[])),
),
):
microsoft_entra_sync.delay(self.provider.pk).get()
self.assertTrue(
MicrosoftEntraProviderUser.objects.filter(
user=different_user, provider=self.provider
).exists()
)
self.assertFalse(Event.objects.filter(action=EventAction.SYSTEM_EXCEPTION).exists())
user_list.assert_called_once()

View File

@ -1,21 +0,0 @@
"""microsoft provider urls"""
from authentik.enterprise.providers.microsoft_entra.api.groups import (
MicrosoftEntraProviderGroupViewSet,
)
from authentik.enterprise.providers.microsoft_entra.api.property_mappings import (
MicrosoftEntraProviderMappingViewSet,
)
from authentik.enterprise.providers.microsoft_entra.api.providers import (
MicrosoftEntraProviderViewSet,
)
from authentik.enterprise.providers.microsoft_entra.api.users import (
MicrosoftEntraProviderUserViewSet,
)
api_urlpatterns = [
("providers/microsoft_entra", MicrosoftEntraProviderViewSet),
("providers/microsoft_entra_users", MicrosoftEntraProviderUserViewSet),
("providers/microsoft_entra_groups", MicrosoftEntraProviderGroupViewSet),
("propertymappings/provider/microsoft_entra", MicrosoftEntraProviderMappingViewSet),
]

View File

@ -15,7 +15,6 @@ CELERY_BEAT_SCHEDULE = {
TENANT_APPS = [
"authentik.enterprise.audit",
"authentik.enterprise.providers.google_workspace",
"authentik.enterprise.providers.microsoft_entra",
"authentik.enterprise.providers.rac",
"authentik.enterprise.stages.source",
]

View File

@ -10,7 +10,7 @@ from django.db import migrations, models
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
import authentik.events.models
import authentik.lib.validators
import authentik.lib.models
from authentik.lib.migrations import progress_bar
@ -377,7 +377,7 @@ class Migration(migrations.Migration):
model_name="notificationtransport",
name="webhook_url",
field=models.TextField(
blank=True, validators=[authentik.lib.validators.DomainlessURLValidator()]
blank=True, validators=[authentik.lib.models.DomainlessURLValidator()]
),
),
]

View File

@ -41,11 +41,10 @@ from authentik.events.utils import (
sanitize_dict,
sanitize_item,
)
from authentik.lib.models import SerializerModel
from authentik.lib.models import DomainlessURLValidator, SerializerModel
from authentik.lib.sentry import SentryIgnoredException
from authentik.lib.utils.http import get_http_session
from authentik.lib.utils.time import timedelta_from_string
from authentik.lib.validators import DomainlessURLValidator
from authentik.policies.models import PolicyBindingModel
from authentik.root.middleware import ClientIPMiddleware
from authentik.stages.email.utils import TemplateEmailMessage

View File

@ -203,8 +203,7 @@ class FlowPlanner:
"f(plan): building plan",
)
plan = self._build_plan(user, request, default_context)
if self.use_cache:
cache.set(cache_key(self.flow, user), plan, CACHE_TIMEOUT)
cache.set(cache_key(self.flow, user), plan, CACHE_TIMEOUT)
if not plan.bindings and not self.allow_empty_flows:
raise EmptyFlowException()
return plan

View File

@ -101,7 +101,6 @@ def get_logger_config():
"uvicorn": "WARNING",
"gunicorn": "INFO",
"requests_mock": "WARNING",
"hpack": "WARNING",
}
for handler_name, level in handler_level_map.items():
base_config["loggers"][handler_name] = {

View File

@ -1,16 +1,13 @@
"""Generic models"""
from typing import Any
import re
from django.core.validators import URLValidator
from django.db import models
from django.dispatch import Signal
from django.utils import timezone
from django.utils.regex_helper import _lazy_re_compile
from model_utils.managers import InheritanceManager
from rest_framework.serializers import BaseSerializer
pre_soft_delete = Signal()
post_soft_delete = Signal()
class SerializerModel(models.Model):
"""Base Abstract Model which has a serializer"""
@ -54,57 +51,46 @@ class InheritanceForeignKey(models.ForeignKey):
forward_related_accessor_class = InheritanceForwardManyToOneDescriptor
class SoftDeleteQuerySet(models.QuerySet):
class DomainlessURLValidator(URLValidator):
"""Subclass of URLValidator which doesn't check the domain
(to allow hostnames without domain)"""
def delete(self):
for obj in self.all():
obj.delete()
def hard_delete(self):
return super().delete()
class SoftDeleteManager(models.Manager):
def get_queryset(self):
return SoftDeleteQuerySet(self.model, using=self._db).filter(deleted_at__isnull=True)
class DeletedSoftDeleteManager(models.Manager):
def get_queryset(self):
return super().get_queryset().exclude(deleted_at__isnull=True)
class SoftDeleteModel(models.Model):
"""Model which doesn't fully delete itself, but rather saves the delete status
so cleanup events can run."""
deleted_at = models.DateTimeField(blank=True, null=True)
objects = SoftDeleteManager()
deleted = DeletedSoftDeleteManager()
class Meta:
abstract = True
@property
def is_deleted(self):
return self.deleted_at is not None
def delete(self, using: Any = ..., keep_parents: bool = ...) -> tuple[int, dict[str, int]]:
pre_soft_delete.send(sender=self.__class__, instance=self)
now = timezone.now()
self.deleted_at = now
self.save(
update_fields=[
"deleted_at",
]
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
self.host_re = "(" + self.hostname_re + self.domain_re + "|localhost)"
self.regex = _lazy_re_compile(
r"^(?:[a-z0-9.+-]*)://" # scheme is validated separately
r"(?:[^\s:@/]+(?::[^\s:@/]*)?@)?" # user:pass authentication
r"(?:" + self.ipv4_re + "|" + self.ipv6_re + "|" + self.host_re + ")"
r"(?::\d{2,5})?" # port
r"(?:[/?#][^\s]*)?" # resource path
r"\Z",
re.IGNORECASE,
)
post_soft_delete.send(sender=self.__class__, instance=self)
return tuple()
self.schemes = ["http", "https", "blank"] + list(self.schemes)
def force_delete(self, using: Any = ...):
if not self.deleted_at:
raise models.ProtectedError("Refusing to force delete non-deleted model", {self})
return super().delete(using=using)
def __call__(self, value: str):
# Check if the scheme is valid.
scheme = value.split("://")[0].lower()
if scheme not in self.schemes:
value = "default" + value
super().__call__(value)
class DomainlessFormattedURLValidator(DomainlessURLValidator):
"""URL validator which allows for python format strings"""
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
self.formatter_re = r"([%\(\)a-zA-Z])*"
self.host_re = "(" + self.formatter_re + self.hostname_re + self.domain_re + "|localhost)"
self.regex = _lazy_re_compile(
r"^(?:[a-z0-9.+-]*)://" # scheme is validated separately
r"(?:[^\s:@/]+(?::[^\s:@/]*)?@)?" # user:pass authentication
r"(?:" + self.ipv4_re + "|" + self.ipv6_re + "|" + self.host_re + ")"
r"(?::\d{2,5})?" # port
r"(?:[/?#][^\s]*)?" # resource path
r"\Z",
re.IGNORECASE,
)
self.schemes = ["http", "https", "blank"] + list(self.schemes)
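
The hunk above mixes two unrelated pieces of lib/models.py: the DomainlessURLValidator (with its format-string-aware subclass) and the soft-delete machinery. A short usage sketch of the validator with illustrative values; the import path depends on which side of this diff you are on (authentik.lib.models here, authentik.lib.validators on the other).

from django.core.exceptions import ValidationError

from authentik.lib.models import DomainlessURLValidator

validator = DomainlessURLValidator(schemes=("http", "https"))
validator("http://localhost:9000/outpost.goauthentik.io")  # accepted: bare hostname, no TLD needed
validator("https://authentik.company/api/v3/")              # accepted: ordinary URLs still validate
try:
    validator("ldap://ak-outpost")                           # rejected: scheme not in the allow-list
except ValidationError:
    pass

And a hedged sketch of the soft-delete flow those managers define, assuming the Outpost model keeps the SoftDeleteModel base shown in the outposts hunk further down; the instance itself is illustrative.

from authentik.outposts.models import Outpost

outpost = Outpost.objects.create(name="demo-outpost")
outpost.delete()                                    # soft delete: sets deleted_at, fires pre/post_soft_delete
Outpost.objects.filter(pk=outpost.pk).exists()      # False: the default manager hides soft-deleted rows
Outpost.deleted.filter(pk=outpost.pk).exists()      # True: the dedicated manager exposes them
Outpost.deleted.get(pk=outpost.pk).force_delete()   # hard delete, refused unless deleted_at is already set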

View File

@ -39,25 +39,26 @@ class BaseOutgoingSyncClient[
"""Create object in remote destination"""
raise NotImplementedError()
def update(self, obj: TModel, connection: TConnection):
def update(self, obj: TModel, connection: object):
"""Update object in remote destination"""
raise NotImplementedError()
def write(self, obj: TModel) -> tuple[TConnection, bool]:
"""Write object to destination. Uses self.create and self.update, but
can be overridden for further logic"""
connection = self.connection_type.objects.filter(
remote_obj = self.connection_type.objects.filter(
provider=self.provider, **{self.connection_type_query: obj}
).first()
connection: TConnection | None = None
try:
if not connection:
if not remote_obj:
connection = self.create(obj)
return connection, True
try:
self.update(obj, connection)
return connection, False
self.update(obj, remote_obj)
return remote_obj, False
except NotFoundSyncException:
connection.delete()
remote_obj.delete()
connection = self.create(obj)
return connection, True
except DatabaseError as exc:
@ -70,7 +71,7 @@ class BaseOutgoingSyncClient[
"""Delete object from destination"""
raise NotImplementedError()
def to_schema(self, obj: TModel, creating: bool) -> TSchema:
def to_schema(self, obj: TModel) -> TSchema:
"""Convert object to destination schema"""
raise NotImplementedError()
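
Both renderings of write() above keep the same contract: look up the existing connection row, call create() when there is none, otherwise call update(), and recreate the remote object when update() raises NotFoundSyncException. A hedged sketch of a concrete client honouring that contract; DemoConnection and the _post/_patch helpers are assumed to exist for illustration only, and the single-argument to_schema signature follows one side of this hunk.

from authentik.core.models import Group
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient

class DemoGroupClient(BaseOutgoingSyncClient[Group, DemoConnection, dict]):
    connection_type = DemoConnection        # hypothetical model linking a Group to its remote id
    connection_type_query = "group"

    def to_schema(self, obj: Group) -> dict:
        return {"displayName": obj.name}

    def create(self, obj: Group) -> DemoConnection:
        remote_id = self._post("/groups", self.to_schema(obj))              # hypothetical HTTP helper
        return DemoConnection.objects.create(
            provider=self.provider, group=obj, remote_id=remote_id
        )

    def update(self, obj: Group, connection: DemoConnection):
        # Raising NotFoundSyncException here makes write() drop the stale row
        # and fall back to create() for a fresh remote object.
        self._patch(f"/groups/{connection.remote_id}", self.to_schema(obj))  # hypothetical HTTP helper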

View File

@ -17,10 +17,6 @@ class ObjectExistsSyncException(BaseSyncException):
"""Exception when an object already exists in the remote system"""
class BadRequestSyncException(BaseSyncException):
"""Exception when invalid data was sent to the remote system"""
class StopSync(BaseSyncException):
"""Exception raised when a configuration error should stop the sync process"""

View File

@ -1,7 +1,7 @@
from typing import Any, Self
from django.core.cache import cache
from django.db.models import Model, QuerySet, TextChoices
from django.db.models import Model, QuerySet
from redis.lock import Lock
from authentik.core.models import Group, User
@ -9,15 +9,6 @@ from authentik.lib.sync.outgoing import PAGE_TIMEOUT
from authentik.lib.sync.outgoing.base import BaseOutgoingSyncClient
class OutgoingSyncDeleteAction(TextChoices):
"""Action taken when a user/group is deleted in authentik. Suspend is not available for groups,
and will be treated as `do_nothing`"""
DO_NOTHING = "do_nothing"
DELETE = "delete"
SUSPEND = "suspend"
class OutgoingSyncProvider(Model):
class Meta:

View File

@ -47,7 +47,7 @@ def register_signals(
return
task_sync_direct.delay(
class_to_path(instance.__class__), instance.pk, Direction.remove.value
).get(propagate=False)
)
pre_delete.connect(model_pre_delete, User, dispatch_uid=uid, weak=False)
pre_delete.connect(model_pre_delete, Group, dispatch_uid=uid, weak=False)

View File

@ -1,6 +1,5 @@
from collections.abc import Callable
from celery.exceptions import Retry
from celery.result import allow_join_result
from django.core.paginator import Paginator
from django.db.models import Model, QuerySet
@ -15,11 +14,7 @@ from authentik.events.models import TaskStatus
from authentik.events.system_tasks import SystemTask
from authentik.lib.sync.outgoing import PAGE_SIZE, PAGE_TIMEOUT
from authentik.lib.sync.outgoing.base import Direction
from authentik.lib.sync.outgoing.exceptions import (
BadRequestSyncException,
StopSync,
TransientSyncException,
)
from authentik.lib.sync.outgoing.exceptions import StopSync, TransientSyncException
from authentik.lib.sync.outgoing.models import OutgoingSyncProvider
from authentik.lib.utils.reflection import class_to_path, path_to_class
@ -126,27 +121,6 @@ class SyncTasks:
client.write(obj)
except SkipObjectException:
continue
except BadRequestSyncException as exc:
self.logger.warning("failed to sync object", exc=exc, obj=obj)
messages.append(
LogEvent(
_(
(
"Failed to sync {object_type} {object_name} "
"due to error: {error}"
).format_map(
{
"object_type": obj._meta.verbose_name,
"object_name": str(obj),
"error": str(exc),
}
)
),
log_level="warning",
logger="",
attributes={"arguments": exc.args[1:]},
)
)
except TransientSyncException as exc:
self.logger.warning("failed to sync object", exc=exc, user=obj)
messages.append(
@ -211,9 +185,7 @@ class SyncTasks:
client.write(instance)
if operation == Direction.remove:
client.delete(instance)
except TransientSyncException as exc:
raise Retry() from exc
except StopSync as exc:
except (StopSync, TransientSyncException) as exc:
self.logger.warning(exc, provider_pk=provider.pk)
def sync_signal_m2m(self, group_pk: str, action: str, pk_set: list[int]):
@ -239,7 +211,5 @@ class SyncTasks:
if action == "post_remove":
operation = Direction.remove
client.update_group(group, operation, pk_set)
except TransientSyncException as exc:
raise Retry() from exc
except StopSync as exc:
except (StopSync, TransientSyncException) as exc:
self.logger.warning(exc, provider_pk=provider.pk)

View File

@ -1,9 +1,5 @@
"""Serializer validators"""
import re
from django.core.validators import URLValidator
from django.utils.regex_helper import _lazy_re_compile
from django.utils.translation import gettext_lazy as _
from rest_framework.exceptions import ValidationError
from rest_framework.serializers import Serializer
@ -33,48 +29,3 @@ class RequiredTogetherValidator:
def __repr__(self):
return f"<{self.__class__.__name__}(fields={smart_repr(self.fields)})>"
class DomainlessURLValidator(URLValidator):
"""Subclass of URLValidator which doesn't check the domain
(to allow hostnames without domain)"""
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
self.host_re = "(" + self.hostname_re + self.domain_re + "|localhost)"
self.regex = _lazy_re_compile(
r"^(?:[a-z0-9.+-]*)://" # scheme is validated separately
r"(?:[^\s:@/]+(?::[^\s:@/]*)?@)?" # user:pass authentication
r"(?:" + self.ipv4_re + "|" + self.ipv6_re + "|" + self.host_re + ")"
r"(?::\d{2,5})?" # port
r"(?:[/?#][^\s]*)?" # resource path
r"\Z",
re.IGNORECASE,
)
self.schemes = ["http", "https", "blank"] + list(self.schemes)
def __call__(self, value: str):
# Check if the scheme is valid.
scheme = value.split("://")[0].lower()
if scheme not in self.schemes:
value = "default" + value
super().__call__(value)
class DomainlessFormattedURLValidator(DomainlessURLValidator):
"""URL validator which allows for python format strings"""
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
self.formatter_re = r"([%\(\)a-zA-Z])*"
self.host_re = "(" + self.formatter_re + self.hostname_re + self.domain_re + "|localhost)"
self.regex = _lazy_re_compile(
r"^(?:[a-z0-9.+-]*)://" # scheme is validated separately
r"(?:[^\s:@/]+(?::[^\s:@/]*)?@)?" # user:pass authentication
r"(?:" + self.ipv4_re + "|" + self.ipv6_re + "|" + self.host_re + ")"
r"(?::\d{2,5})?" # port
r"(?:[/?#][^\s]*)?" # resource path
r"\Z",
re.IGNORECASE,
)
self.schemes = ["http", "https", "blank"] + list(self.schemes)

View File

@ -1,18 +0,0 @@
# Generated by Django 5.0.4 on 2024-04-23 21:00
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_outposts", "0021_alter_outpost_type"),
]
operations = [
migrations.AddField(
model_name="outpost",
name="deleted_at",
field=models.DateTimeField(blank=True, null=True),
),
]

View File

@ -33,7 +33,7 @@ from authentik.core.models import (
from authentik.crypto.models import CertificateKeyPair
from authentik.events.models import Event, EventAction
from authentik.lib.config import CONFIG
from authentik.lib.models import InheritanceForeignKey, SerializerModel, SoftDeleteModel
from authentik.lib.models import InheritanceForeignKey, SerializerModel
from authentik.lib.sentry import SentryIgnoredException
from authentik.lib.utils.errors import exception_to_string
from authentik.outposts.controllers.k8s.utils import get_namespace
@ -131,7 +131,7 @@ class OutpostServiceConnection(models.Model):
verbose_name = _("Outpost Service-Connection")
verbose_name_plural = _("Outpost Service-Connections")
def __str__(self):
def __str__(self) -> __version__:
return f"Outpost service connection {self.name}"
@property
@ -241,7 +241,7 @@ class KubernetesServiceConnection(SerializerModel, OutpostServiceConnection):
return "ak-service-connection-kubernetes-form"
class Outpost(SoftDeleteModel, SerializerModel, ManagedModel):
class Outpost(SerializerModel, ManagedModel):
"""Outpost instance which manages a service user and token"""
uuid = models.UUIDField(default=uuid4, editable=False, primary_key=True)

View File

@ -2,14 +2,13 @@
from django.core.cache import cache
from django.db.models import Model
from django.db.models.signals import m2m_changed, post_save, pre_save
from django.db.models.signals import m2m_changed, post_save, pre_delete, pre_save
from django.dispatch import receiver
from structlog.stdlib import get_logger
from authentik.brands.models import Brand
from authentik.core.models import Provider
from authentik.crypto.models import CertificateKeyPair
from authentik.lib.models import post_soft_delete
from authentik.lib.utils.reflection import class_to_path
from authentik.outposts.models import Outpost, OutpostServiceConnection
from authentik.outposts.tasks import CACHE_KEY_OUTPOST_DOWN, outpost_controller, outpost_post_save
@ -68,7 +67,9 @@ def post_save_update(sender, instance: Model, created: bool, **_):
outpost_post_save.delay(class_to_path(instance.__class__), instance.pk)
@receiver(post_soft_delete, sender=Outpost)
def outpost_cleanup(sender, instance: Outpost, **_):
@receiver(pre_delete, sender=Outpost)
def pre_delete_cleanup(sender, instance: Outpost, **_):
"""Ensure that Outpost's user is deleted (which will delete the token through cascade)"""
outpost_controller.delay(instance.pk.hex, action="down")
instance.user.delete()
cache.set(CACHE_KEY_OUTPOST_DOWN % instance.pk.hex, instance)
outpost_controller.delay(instance.pk.hex, action="down", from_cache=True)

View File

@ -129,14 +129,17 @@ def outpost_controller_all():
@CELERY_APP.task(bind=True, base=SystemTask)
def outpost_controller(self: SystemTask, outpost_pk: str, action: str = "up"):
def outpost_controller(
self: SystemTask, outpost_pk: str, action: str = "up", from_cache: bool = False
):
"""Create/update/monitor/delete the deployment of an Outpost"""
logs = []
outpost: Outpost = None
if action == "up":
outpost = Outpost.objects.filter(pk=outpost_pk).first()
elif action == "down":
outpost = Outpost.deleted.filter(pk=outpost_pk).first()
if from_cache:
outpost: Outpost = cache.get(CACHE_KEY_OUTPOST_DOWN % outpost_pk)
LOGGER.debug("Getting outpost from cache to delete")
else:
outpost: Outpost = Outpost.objects.filter(pk=outpost_pk).first()
LOGGER.debug("Getting outpost from DB")
if not outpost:
LOGGER.warning("No outpost")
return
@ -152,10 +155,9 @@ def outpost_controller(self: SystemTask, outpost_pk: str, action: str = "up"):
except (ControllerException, ServiceConnectionInvalid) as exc:
self.set_error(exc)
else:
if from_cache:
cache.delete(CACHE_KEY_OUTPOST_DOWN % outpost_pk)
self.set_status(TaskStatus.SUCCESSFUL, *logs)
finally:
if outpost.deleted_at:
outpost.force_delete()
@CELERY_APP.task(bind=True, base=SystemTask)

View File

@ -6,7 +6,7 @@ from django.core.exceptions import FieldError
from django.db import migrations, models
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
import authentik.lib.validators
import authentik.lib.models
import authentik.providers.proxy.models
@ -80,9 +80,7 @@ class Migration(migrations.Migration):
models.TextField(
blank=True,
validators=[
authentik.lib.validators.DomainlessURLValidator(
schemes=("http", "https")
)
authentik.lib.models.DomainlessURLValidator(schemes=("http", "https"))
],
),
),
@ -90,9 +88,7 @@ class Migration(migrations.Migration):
"external_host",
models.TextField(
validators=[
authentik.lib.validators.DomainlessURLValidator(
schemes=("http", "https")
)
authentik.lib.models.DomainlessURLValidator(schemes=("http", "https"))
]
),
),

View File

@ -10,7 +10,7 @@ from django.utils.translation import gettext as _
from rest_framework.serializers import Serializer
from authentik.crypto.models import CertificateKeyPair
from authentik.lib.validators import DomainlessURLValidator
from authentik.lib.models import DomainlessURLValidator
from authentik.outposts.models import OutpostModel
from authentik.providers.oauth2.models import ClientTypes, OAuth2Provider, ScopeMapping

View File

@ -34,7 +34,7 @@ class SCIMGroupClient(SCIMClient[Group, SCIMGroup, SCIMGroupSchema]):
connection_type = SCIMGroup
connection_type_query = "group"
def to_schema(self, obj: Group, creating: bool) -> SCIMGroupSchema:
def to_schema(self, obj: Group) -> SCIMGroupSchema:
"""Convert authentik user into SCIM"""
raw_scim_group = {
"schemas": ("urn:ietf:params:scim:schemas:core:2.0:Group",),
@ -51,7 +51,6 @@ class SCIMGroupClient(SCIMClient[Group, SCIMGroup, SCIMGroupSchema]):
request=None,
group=obj,
provider=self.provider,
creating=creating,
)
if value is None:
continue
@ -100,7 +99,7 @@ class SCIMGroupClient(SCIMClient[Group, SCIMGroup, SCIMGroupSchema]):
def create(self, group: Group):
"""Create group from scratch and create a connection object"""
scim_group = self.to_schema(group, True)
scim_group = self.to_schema(group)
response = self._request(
"POST",
"/Groups",
@ -112,11 +111,11 @@ class SCIMGroupClient(SCIMClient[Group, SCIMGroup, SCIMGroupSchema]):
scim_id = response.get("id")
if not scim_id or scim_id == "":
raise StopSync("SCIM Response with missing or invalid `id`")
return SCIMGroup.objects.create(provider=self.provider, group=group, scim_id=scim_id)
SCIMGroup.objects.create(provider=self.provider, group=group, scim_id=scim_id)
def update(self, group: Group, connection: SCIMGroup):
"""Update existing group"""
scim_group = self.to_schema(group, False)
scim_group = self.to_schema(group)
scim_group.id = connection.scim_id
try:
return self._request(

View File

@ -23,7 +23,7 @@ class SCIMUserClient(SCIMClient[User, SCIMUser, SCIMUserSchema]):
connection_type = SCIMUser
connection_type_query = "user"
def to_schema(self, obj: User, creating: bool) -> SCIMUserSchema:
def to_schema(self, obj: User) -> SCIMUserSchema:
"""Convert authentik user into SCIM"""
raw_scim_user = {
"schemas": ("urn:ietf:params:scim:schemas:core:2.0:User",),
@ -37,7 +37,6 @@ class SCIMUserClient(SCIMClient[User, SCIMUser, SCIMUserSchema]):
user=obj,
request=None,
provider=self.provider,
creating=creating,
)
if value is None:
continue
@ -74,7 +73,7 @@ class SCIMUserClient(SCIMClient[User, SCIMUser, SCIMUserSchema]):
def create(self, user: User):
"""Create user from scratch and create a connection object"""
scim_user = self.to_schema(user, True)
scim_user = self.to_schema(user)
response = self._request(
"POST",
"/Users",
@ -86,11 +85,11 @@ class SCIMUserClient(SCIMClient[User, SCIMUser, SCIMUserSchema]):
scim_id = response.get("id")
if not scim_id or scim_id == "":
raise StopSync("SCIM Response with missing or invalid `id`")
return SCIMUser.objects.create(provider=self.provider, user=user, scim_id=scim_id)
SCIMUser.objects.create(provider=self.provider, user=user, scim_id=scim_id)
def update(self, user: User, connection: SCIMUser):
"""Update existing user"""
scim_user = self.to_schema(user, False)
scim_user = self.to_schema(user)
scim_user.id = connection.scim_id
self._request(
"PUT",

View File

@ -49,7 +49,7 @@ class SCIMProvider(OutgoingSyncProvider, BackchannelProvider):
if type == User:
# Get queryset of all users with consistent ordering
# according to the provider's settings
base = User.objects.all()
base = User.objects.all().exclude_anonymous()
if self.exclude_users_service_account:
base = base.exclude(type=UserTypes.SERVICE_ACCOUNT).exclude(
type=UserTypes.INTERNAL_SERVICE_ACCOUNT

View File

@ -1,7 +1,6 @@
"""SCIM Provider tasks"""
from authentik.events.system_tasks import SystemTask
from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.lib.sync.outgoing.tasks import SyncTasks
from authentik.providers.scim.models import SCIMProvider
from authentik.root.celery import CELERY_APP
@ -9,14 +8,12 @@ from authentik.root.celery import CELERY_APP
sync_tasks = SyncTasks(SCIMProvider)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
@CELERY_APP.task()
def scim_sync_objects(*args, **kwargs):
return sync_tasks.sync_objects(*args, **kwargs)
@CELERY_APP.task(
base=SystemTask, bind=True, autoretry_for=(TransientSyncException,), retry_backoff=True
)
@CELERY_APP.task(base=SystemTask, bind=True)
def scim_sync(self, provider_pk: int, *args, **kwargs):
"""Run full sync for SCIM provider"""
return sync_tasks.sync_single(self, provider_pk, scim_sync_objects)
@ -27,11 +24,11 @@ def scim_sync_all():
return sync_tasks.sync_all(scim_sync)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
@CELERY_APP.task()
def scim_sync_direct(*args, **kwargs):
return sync_tasks.sync_signal_direct(*args, **kwargs)
@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
@CELERY_APP.task()
def scim_sync_m2m(*args, **kwargs):
return sync_tasks.sync_signal_m2m(*args, **kwargs)
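
One side of this hunk wires the SCIM tasks up with Celery's automatic retry options. A small illustrative task showing what those options do; demo_sync_object is not part of the codebase.

from authentik.lib.sync.outgoing.exceptions import TransientSyncException
from authentik.root.celery import CELERY_APP

@CELERY_APP.task(autoretry_for=(TransientSyncException,), retry_backoff=True)
def demo_sync_object():
    # Any TransientSyncException escaping the task body makes Celery re-queue the
    # task automatically, with exponentially increasing delays (retry_backoff=True).
    raise TransientSyncException("remote API throttled")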

View File

@ -19,7 +19,7 @@ class SCIMGroupTests(TestCase):
def setUp(self) -> None:
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple users
User.objects.all().delete()
User.objects.all().exclude_anonymous().delete()
Group.objects.all().delete()
self.provider: SCIMProvider = SCIMProvider.objects.create(
name=generate_id(),
@ -127,7 +127,7 @@ class SCIMGroupTests(TestCase):
"id": scim_id,
},
)
mock.delete(f"https://localhost/Groups/{scim_id}", status_code=204)
mock.delete("https://localhost/Groups", status_code=204)
uid = generate_id()
group = Group.objects.create(
name=uid,

View File

@ -21,7 +21,7 @@ class SCIMMembershipTests(TestCase):
def setUp(self) -> None:
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple users
User.objects.all().delete()
User.objects.all().exclude_anonymous().delete()
Group.objects.all().delete()
Tenant.objects.update(avatars="none")

View File

@ -22,7 +22,7 @@ class SCIMUserTests(TestCase):
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple users
Tenant.objects.update(avatars="none")
User.objects.all().delete()
User.objects.all().exclude_anonymous().delete()
Group.objects.all().delete()
self.provider: SCIMProvider = SCIMProvider.objects.create(
name=generate_id(),
@ -230,7 +230,7 @@ class SCIMUserTests(TestCase):
"id": scim_id,
},
)
mock.delete(f"https://localhost/Users/{scim_id}", status_code=204)
mock.delete("https://localhost/Users", status_code=204)
uid = generate_id()
user = User.objects.create(
username=uid,

View File

@ -63,7 +63,7 @@ def task_prerun_hook(task_id: str, task, *args, **kwargs):
@task_postrun.connect
def task_postrun_hook(task_id: str, task, *args, retval=None, state=None, **kwargs):
def task_postrun_hook(task_id, task, *args, retval=None, state=None, **kwargs):
"""Log task_id on worker"""
CTX_TASK_ID.set(...)
LOGGER.info(
@ -73,16 +73,14 @@ def task_postrun_hook(task_id: str, task, *args, retval=None, state=None, **kwar
@task_failure.connect
@task_internal_error.connect
def task_error_hook(task_id: str, exception: Exception, traceback, *args, **kwargs):
def task_error_hook(task_id, exception: Exception, traceback, *args, **kwargs):
"""Create system event for failed task"""
from authentik.events.models import Event, EventAction
LOGGER.warning("Task failure", task_id=task_id.replace("-", ""), exc=exception)
LOGGER.warning("Task failure", exc=exception)
CTX_TASK_ID.set(...)
if before_send({}, {"exc_info": (None, exception, None)}) is not None:
Event.new(
EventAction.SYSTEM_EXCEPTION, message=exception_to_string(exception), task_id=task_id
).save()
Event.new(EventAction.SYSTEM_EXCEPTION, message=exception_to_string(exception)).save()
def _get_startup_tasks_default_tenant() -> list[Callable]:

View File

@ -155,7 +155,9 @@ SPECTACULAR_SETTINGS = {
"LDAPAPIAccessMode": "authentik.providers.ldap.models.APIAccessMode",
"UserVerificationEnum": "authentik.stages.authenticator_webauthn.models.UserVerification",
"UserTypeEnum": "authentik.core.models.UserTypes",
"OutgoingSyncDeleteAction": "authentik.lib.sync.outgoing.models.OutgoingSyncDeleteAction",
"GoogleWorkspaceDeleteAction": (
"authentik.enterprise.providers.google_workspace.models.GoogleWorkspaceDeleteAction"
),
},
"ENUM_ADD_EXPLICIT_BLANK_NULL_CHOICE": False,
"ENUM_GENERATE_CHOICE_DESCRIPTION": False,

View File

@ -4,7 +4,7 @@ import django.db.models.deletion
from django.apps.registry import Apps
from django.db import migrations, models
import authentik.lib.validators
import authentik.lib.models
def set_managed_flag(apps: Apps, schema_editor):
@ -105,9 +105,7 @@ class Migration(migrations.Migration):
"server_uri",
models.TextField(
validators=[
authentik.lib.validators.DomainlessURLValidator(
schemes=["ldap", "ldaps"]
)
authentik.lib.models.DomainlessURLValidator(schemes=["ldap", "ldaps"])
],
verbose_name="Server URI",
),

View File

@ -17,7 +17,7 @@ from rest_framework.serializers import Serializer
from authentik.core.models import Group, PropertyMapping, Source
from authentik.crypto.models import CertificateKeyPair
from authentik.lib.config import CONFIG
from authentik.lib.validators import DomainlessURLValidator
from authentik.lib.models import DomainlessURLValidator
LDAP_TIMEOUT = 15

View File

@ -2594,80 +2594,6 @@
}
}
},
{
"type": "object",
"required": [
"model",
"identifiers"
],
"properties": {
"model": {
"const": "authentik_providers_microsoft_entra.microsoftentraprovider"
},
"id": {
"type": "string"
},
"state": {
"type": "string",
"enum": [
"absent",
"present",
"created",
"must_created"
],
"default": "present"
},
"conditions": {
"type": "array",
"items": {
"type": "boolean"
}
},
"attrs": {
"$ref": "#/$defs/model_authentik_providers_microsoft_entra.microsoftentraprovider"
},
"identifiers": {
"$ref": "#/$defs/model_authentik_providers_microsoft_entra.microsoftentraprovider"
}
}
},
{
"type": "object",
"required": [
"model",
"identifiers"
],
"properties": {
"model": {
"const": "authentik_providers_microsoft_entra.microsoftentraprovidermapping"
},
"id": {
"type": "string"
},
"state": {
"type": "string",
"enum": [
"absent",
"present",
"created",
"must_created"
],
"default": "present"
},
"conditions": {
"type": "array",
"items": {
"type": "boolean"
}
},
"attrs": {
"$ref": "#/$defs/model_authentik_providers_microsoft_entra.microsoftentraprovidermapping"
},
"identifiers": {
"$ref": "#/$defs/model_authentik_providers_microsoft_entra.microsoftentraprovidermapping"
}
}
},
{
"type": "object",
"required": [
@ -3486,7 +3412,6 @@
"authentik.enterprise",
"authentik.enterprise.audit",
"authentik.enterprise.providers.google_workspace",
"authentik.enterprise.providers.microsoft_entra",
"authentik.enterprise.providers.rac",
"authentik.enterprise.stages.source",
"authentik.events"
@ -3570,8 +3495,6 @@
"authentik_enterprise.license",
"authentik_providers_google_workspace.googleworkspaceprovider",
"authentik_providers_google_workspace.googleworkspaceprovidermapping",
"authentik_providers_microsoft_entra.microsoftentraprovider",
"authentik_providers_microsoft_entra.microsoftentraprovidermapping",
"authentik_providers_rac.racprovider",
"authentik_providers_rac.endpoint",
"authentik_providers_rac.racpropertymapping",
@ -8379,102 +8302,6 @@
},
"required": []
},
"model_authentik_providers_microsoft_entra.microsoftentraprovider": {
"type": "object",
"properties": {
"name": {
"type": "string",
"minLength": 1,
"title": "Name"
},
"property_mappings": {
"type": "array",
"items": {
"type": "string",
"format": "uuid"
},
"title": "Property mappings"
},
"property_mappings_group": {
"type": "array",
"items": {
"type": "string",
"format": "uuid",
"description": "Property mappings used for group creation/updating."
},
"title": "Property mappings group",
"description": "Property mappings used for group creation/updating."
},
"client_id": {
"type": "string",
"minLength": 1,
"title": "Client id"
},
"client_secret": {
"type": "string",
"minLength": 1,
"title": "Client secret"
},
"tenant_id": {
"type": "string",
"minLength": 1,
"title": "Tenant id"
},
"exclude_users_service_account": {
"type": "boolean",
"title": "Exclude users service account"
},
"filter_group": {
"type": "string",
"format": "uuid",
"title": "Filter group"
},
"user_delete_action": {
"type": "string",
"enum": [
"do_nothing",
"delete",
"suspend"
],
"title": "User delete action"
},
"group_delete_action": {
"type": "string",
"enum": [
"do_nothing",
"delete",
"suspend"
],
"title": "Group delete action"
}
},
"required": []
},
"model_authentik_providers_microsoft_entra.microsoftentraprovidermapping": {
"type": "object",
"properties": {
"managed": {
"type": [
"string",
"null"
],
"minLength": 1,
"title": "Managed by authentik",
"description": "Objects that are managed by authentik. These objects are created and updated automatically. This flag only indicates that an object can be overwritten by migrations. You can still modify the objects via the API, but expect changes to be overwritten in a later update."
},
"name": {
"type": "string",
"minLength": 1,
"title": "Name"
},
"expression": {
"type": "string",
"minLength": 1,
"title": "Expression"
}
},
"required": []
},
"model_authentik_providers_rac.racprovider": {
"type": "object",
"properties": {

View File

@ -2,7 +2,7 @@ version: 1
metadata:
labels:
blueprints.goauthentik.io/system: "true"
name: System - Google Workspace Provider - Mappings
name: System - Google Provider - Mappings
entries:
- identifiers:
managed: goauthentik.io/providers/google_workspace/user

View File

@ -1,39 +0,0 @@
version: 1
metadata:
labels:
blueprints.goauthentik.io/system: "true"
name: System - Microsoft Entra Provider - Mappings
entries:
- identifiers:
managed: goauthentik.io/providers/microsoft_entra/user
model: authentik_providers_microsoft_entra.microsoftentraprovidermapping
attrs:
name: "authentik default Microsoft Entra Mapping: User"
# https://learn.microsoft.com/en-us/graph/api/resources/user?view=graph-rest-1.0
expression: |
from msgraph.generated.models.password_profile import PasswordProfile
user = {
"display_name": request.user.name,
"account_enabled": request.user.is_active,
"mail_nickname": request.user.username,
"user_principal_name": request.user.email,
}
if creating:
user["password_profile"] = PasswordProfile(
password=request.user.password
)
return user
- identifiers:
managed: goauthentik.io/providers/microsoft_entra/group
model: authentik_providers_microsoft_entra.microsoftentraprovidermapping
attrs:
name: "authentik default Microsoft Entra Mapping: Group"
# https://learn.microsoft.com/en-us/graph/api/group-post-groups?view=graph-rest-1.0&tabs=http#request-body
expression: |
return {
"display_name": group.name,
"mail_enabled": False,
"security_enabled": True,
"mail_nickname": slugify(group.name),
}

6
go.mod
View File

@ -21,14 +21,14 @@ require (
github.com/mitchellh/mapstructure v1.5.0
github.com/nmcclain/asn1-ber v0.0.0-20170104154839-2661553a0484
github.com/pires/go-proxyproto v0.7.0
github.com/prometheus/client_golang v1.19.1
github.com/prometheus/client_golang v1.19.0
github.com/redis/go-redis/v9 v9.5.1
github.com/sethvargo/go-envconfig v1.0.2
github.com/sethvargo/go-envconfig v1.0.1
github.com/sirupsen/logrus v1.9.3
github.com/spf13/cobra v1.8.0
github.com/stretchr/testify v1.9.0
github.com/wwt/guac v1.3.2
goauthentik.io/api/v3 v3.2024042.4
goauthentik.io/api/v3 v3.2024042.2
golang.org/x/exp v0.0.0-20230210204819-062eb4c674ab
golang.org/x/oauth2 v0.20.0
golang.org/x/sync v0.7.0

12
go.sum
View File

@ -233,8 +233,8 @@ github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZb
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/pquerna/cachecontrol v0.0.0-20201205024021-ac21108117ac h1:jWKYCNlX4J5s8M0nHYkh7Y7c9gRVDEb3mq51j5J0F5M=
github.com/pquerna/cachecontrol v0.0.0-20201205024021-ac21108117ac/go.mod h1:hoLfEwdY11HjRfKFH6KqnPsfxlo3BP6bJehpDv8t6sQ=
github.com/prometheus/client_golang v1.19.1 h1:wZWJDwK+NameRJuPGDhlnFgx8e8HN3XHQeLaYJFJBOE=
github.com/prometheus/client_golang v1.19.1/go.mod h1:mP78NwGzrVks5S2H6ab8+ZZGJLZUq1hoULYBAYBw1Ho=
github.com/prometheus/client_golang v1.19.0 h1:ygXvpU1AoN1MhdzckN+PyD9QJOSD4x7kmXYlnfbA6JU=
github.com/prometheus/client_golang v1.19.0/go.mod h1:ZRM9uEAypZakd+q/x7+gmsvXdURP+DABIEIjnmDdp+k=
github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/client_model v0.5.0 h1:VQw1hfvPvk3Uv6Qf29VrPF32JB6rtbgI6cYPYQjL0Qw=
github.com/prometheus/client_model v0.5.0/go.mod h1:dTiFglRmd66nLR9Pv9f0mZi7B7fk5Pm3gvsjB5tr+kI=
@ -248,8 +248,8 @@ github.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFR
github.com/rogpeppe/go-internal v1.11.0 h1:cWPaGQEPrBb5/AsnsZesgZZ9yb1OQ+GOISoDNXVBh4M=
github.com/rogpeppe/go-internal v1.11.0/go.mod h1:ddIwULY96R17DhadqLgMfk9H9tvdUzkipdSkR5nkCZA=
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/sethvargo/go-envconfig v1.0.2 h1:BAQnzBLK/mPN3R3pC0d46MLN0htc64YZBVrz/sZfAX4=
github.com/sethvargo/go-envconfig v1.0.2/go.mod h1:OKZ02xFaD3MvWBBmEW45fQr08sJEsonGrrOdicvQmQA=
github.com/sethvargo/go-envconfig v1.0.1 h1:9wglip/5fUfaH0lQecLM8AyOClMw0gT0A9K2c2wozao=
github.com/sethvargo/go-envconfig v1.0.1/go.mod h1:OKZ02xFaD3MvWBBmEW45fQr08sJEsonGrrOdicvQmQA=
github.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=
github.com/sirupsen/logrus v1.9.3 h1:dueUQJ1C2q9oE3F7wvmSGAaVtTmUizReu6fjN8uqzbQ=
github.com/sirupsen/logrus v1.9.3/go.mod h1:naHLuLoDiP4jHNo9R0sCBMtWGeIprob74mVsIT4qYEQ=
@ -294,8 +294,8 @@ go.opentelemetry.io/otel/trace v1.24.0 h1:CsKnnL4dUAr/0llH9FKuc698G04IrpWV0MQA/Y
go.opentelemetry.io/otel/trace v1.24.0/go.mod h1:HPc3Xr/cOApsBI154IU0OI0HJexz+aw5uPdbs3UCjNU=
go.uber.org/goleak v1.2.1 h1:NBol2c7O1ZokfZ0LEU9K6Whx/KnwvepVetCUhtKja4A=
go.uber.org/goleak v1.2.1/go.mod h1:qlT2yGI9QafXHhZZLxlSuNsMw3FFLxBr+tBRlmO1xH4=
goauthentik.io/api/v3 v3.2024042.4 h1:9FnCx4gF1HpMeBzqhRWg2uWj+mKwLhwNZ+Rqzc0TCMM=
goauthentik.io/api/v3 v3.2024042.4/go.mod h1:zz+mEZg8rY/7eEjkMGWJ2DnGqk+zqxuybGCGrR2O4Kw=
goauthentik.io/api/v3 v3.2024042.2 h1:aGfIVrNXEWVuvKH3YDZpGINhnhWNwcVAGTla/Ck4hD8=
goauthentik.io/api/v3 v3.2024042.2/go.mod h1:zz+mEZg8rY/7eEjkMGWJ2DnGqk+zqxuybGCGrR2O4Kw=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20190510104115-cbcb75029529/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/crypto v0.0.0-20190605123033-f99c8df09eb5/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=

View File

@ -8,7 +8,7 @@ msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-05-13 00:08+0000\n"
"POT-Creation-Date: 2024-05-08 00:07+0000\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
@ -87,12 +87,6 @@ msgstr ""
msgid "Brands"
msgstr ""
#: authentik/core/api/providers.py
msgid ""
"When not set all providers are returned. When set to true, only backchannel "
"providers are returned. When set to false, backchannel providers are excluded"
msgstr ""
#: authentik/core/api/providers.py
msgid "SAML Provider from Metadata"
msgstr ""
@ -424,7 +418,6 @@ msgid "Feature only accessible for internal users."
msgstr ""
#: authentik/enterprise/providers/google_workspace/models.py
#: authentik/enterprise/providers/microsoft_entra/models.py
#: authentik/providers/scim/models.py authentik/sources/ldap/models.py
msgid "Property mappings used for group creation/updating."
msgstr ""
@ -445,50 +438,6 @@ msgstr ""
msgid "Google Workspace Provider Mappings"
msgstr ""
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider User"
msgstr ""
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Users"
msgstr ""
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Group"
msgstr ""
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Groups"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Providers"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Mapping"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Mappings"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider User"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Group"
msgstr ""
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Groups"
msgstr ""
#: authentik/enterprise/providers/rac/models.py
#: authentik/stages/user_login/models.py
msgid ""

Binary file not shown.

View File

@ -14,7 +14,7 @@ msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-05-13 00:08+0000\n"
"POT-Creation-Date: 2024-05-03 00:08+0000\n"
"PO-Revision-Date: 2022-09-26 16:47+0000\n"
"Last-Translator: deluxghost, 2024\n"
"Language-Team: Chinese Simplified (https://app.transifex.com/authentik/teams/119923/zh-Hans/)\n"
@ -95,13 +95,6 @@ msgstr "品牌"
msgid "Brands"
msgstr "品牌"
#: authentik/core/api/providers.py
msgid ""
"When not set all providers are returned. When set to true, only backchannel "
"providers are returned. When set to false, backchannel providers are "
"excluded"
msgstr "如果未设置,则返回所有提供程序。如果启用,仅返回反向通道提供程序。如果禁用,则返回非反向通道提供程序"
#: authentik/core/api/providers.py
msgid "SAML Provider from Metadata"
msgstr "来自元数据的 SAML 提供程序"
@ -440,72 +433,6 @@ msgstr "访问此功能需要企业版。"
msgid "Feature only accessible for internal users."
msgstr "仅内部用户能访问此功能。"
#: authentik/enterprise/providers/google_workspace/models.py
#: authentik/enterprise/providers/microsoft_entra/models.py
#: authentik/providers/scim/models.py authentik/sources/ldap/models.py
msgid "Property mappings used for group creation/updating."
msgstr "用于创建/更新组的属性映射。"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider"
msgstr "Google Workspace 提供程序"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Providers"
msgstr "Google Workspace 提供程序"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Mapping"
msgstr "Google Workspace 提供程序映射"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Mappings"
msgstr "Google Workspace 提供程序映射"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider User"
msgstr "Google Workspace 提供程序用户"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Users"
msgstr "Google Workspace 提供程序用户"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Group"
msgstr "Google Workspace 提供程序组"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Groups"
msgstr "Google Workspace 提供程序组"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider"
msgstr "Microsoft Entra 提供程序"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Providers"
msgstr "Microsoft Entra 提供程序"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Mapping"
msgstr "Microsoft Entra 提供程序映射"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Mappings"
msgstr "Microsoft Entra 提供程序映射"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider User"
msgstr "Microsoft Entra 提供程序用户"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Group"
msgstr "Microsoft Entra 提供程序组"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Groups"
msgstr "Microsoft Entra 提供程序组"
#: authentik/enterprise/providers/rac/models.py
#: authentik/stages/user_login/models.py
msgid ""
@@ -851,25 +778,6 @@ msgstr "流程令牌"
msgid "Invalid next URL"
msgstr "无效的 next URL"
#: authentik/lib/sync/outgoing/tasks.py
msgid "Starting full provider sync"
msgstr "开始全量提供程序同步"
#: authentik/lib/sync/outgoing/tasks.py
#, python-format
msgid "Syncing page %(page)d of users"
msgstr "正在同步用户页面 %(page)d"
#: authentik/lib/sync/outgoing/tasks.py
#, python-format
msgid "Syncing page %(page)d of groups"
msgstr "正在同步群组页面 %(page)d"
#: authentik/lib/sync/outgoing/tasks.py
#, python-brace-format
msgid "Stopping sync due to error: {error}"
msgstr "由于以下错误,同步停止:{error}"
#: authentik/lib/utils/time.py
#, python-format
msgid "%(value)s is not in the correct format of 'hours=3;minutes=1'."
@@ -1746,6 +1654,10 @@ msgstr "SCIM 请求的基础 URL通常以 /v2 结尾"
msgid "Authentication token"
msgstr "身份验证令牌"
#: authentik/providers/scim/models.py authentik/sources/ldap/models.py
msgid "Property mappings used for group creation/updating."
msgstr "用于创建/更新组的属性映射。"
#: authentik/providers/scim/models.py
msgid "SCIM Provider"
msgstr "SCIM 提供程序"
@@ -1762,6 +1674,35 @@ msgstr "SCIM 映射"
msgid "SCIM Mappings"
msgstr "SCIM 映射"
#: authentik/providers/scim/tasks.py
msgid "Starting full SCIM sync"
msgstr "开始全量 SCIM 同步"
#: authentik/providers/scim/tasks.py
#, python-format
msgid "Syncing page %(page)d of users"
msgstr "正在同步用户页面 %(page)d"
#: authentik/providers/scim/tasks.py
#, python-format
msgid "Syncing page %(page)d of groups"
msgstr "正在同步群组页面 %(page)d"
#: authentik/providers/scim/tasks.py
#, python-brace-format
msgid "Failed to sync user {user_name} due to remote error: {error}"
msgstr "由于远端错误,同步用户 {user_name} 失败:{error}"
#: authentik/providers/scim/tasks.py
#, python-brace-format
msgid "Stopping sync due to error: {error}"
msgstr "由于以下错误,同步停止:{error}"
#: authentik/providers/scim/tasks.py
#, python-brace-format
msgid "Failed to sync group {group_name} due to remote error: {error}"
msgstr "由于远端错误,同步组 {group_name} 失败:{error}"
#: authentik/rbac/models.py
msgid "Role"
msgstr "角色"


@@ -14,7 +14,7 @@ msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-05-13 00:08+0000\n"
"POT-Creation-Date: 2024-05-03 00:08+0000\n"
"PO-Revision-Date: 2022-09-26 16:47+0000\n"
"Last-Translator: deluxghost, 2024\n"
"Language-Team: Chinese (China) (https://app.transifex.com/authentik/teams/119923/zh_CN/)\n"
@@ -95,13 +95,6 @@ msgstr "品牌"
msgid "Brands"
msgstr "品牌"
#: authentik/core/api/providers.py
msgid ""
"When not set all providers are returned. When set to true, only backchannel "
"providers are returned. When set to false, backchannel providers are "
"excluded"
msgstr "如果未设置,则返回所有提供程序。如果启用,仅返回反向通道提供程序。如果禁用,则返回非反向通道提供程序"
#: authentik/core/api/providers.py
msgid "SAML Provider from Metadata"
msgstr "来自元数据的 SAML 提供程序"
@@ -440,72 +433,6 @@ msgstr "访问此功能需要企业版。"
msgid "Feature only accessible for internal users."
msgstr "仅内部用户能访问此功能。"
#: authentik/enterprise/providers/google_workspace/models.py
#: authentik/enterprise/providers/microsoft_entra/models.py
#: authentik/providers/scim/models.py authentik/sources/ldap/models.py
msgid "Property mappings used for group creation/updating."
msgstr "用于创建/更新组的属性映射。"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider"
msgstr "Google Workspace 提供程序"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Providers"
msgstr "Google Workspace 提供程序"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Mapping"
msgstr "Google Workspace 提供程序映射"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Mappings"
msgstr "Google Workspace 提供程序映射"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider User"
msgstr "Google Workspace 提供程序用户"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Users"
msgstr "Google Workspace 提供程序用户"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Group"
msgstr "Google Workspace 提供程序组"
#: authentik/enterprise/providers/google_workspace/models.py
msgid "Google Workspace Provider Groups"
msgstr "Google Workspace 提供程序组"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider"
msgstr "Microsoft Entra 提供程序"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Providers"
msgstr "Microsoft Entra 提供程序"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Mapping"
msgstr "Microsoft Entra 提供程序映射"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Mappings"
msgstr "Microsoft Entra 提供程序映射"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider User"
msgstr "Microsoft Entra 提供程序用户"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Group"
msgstr "Microsoft Entra 提供程序组"
#: authentik/enterprise/providers/microsoft_entra/models.py
msgid "Microsoft Entra Provider Groups"
msgstr "Microsoft Entra 提供程序组"
#: authentik/enterprise/providers/rac/models.py
#: authentik/stages/user_login/models.py
msgid ""
@@ -851,25 +778,6 @@ msgstr "流程令牌"
msgid "Invalid next URL"
msgstr "无效的 next URL"
#: authentik/lib/sync/outgoing/tasks.py
msgid "Starting full provider sync"
msgstr "开始全量提供程序同步"
#: authentik/lib/sync/outgoing/tasks.py
#, python-format
msgid "Syncing page %(page)d of users"
msgstr "正在同步用户页面 %(page)d"
#: authentik/lib/sync/outgoing/tasks.py
#, python-format
msgid "Syncing page %(page)d of groups"
msgstr "正在同步群组页面 %(page)d"
#: authentik/lib/sync/outgoing/tasks.py
#, python-brace-format
msgid "Stopping sync due to error: {error}"
msgstr "由于以下错误,同步停止:{error}"
#: authentik/lib/utils/time.py
#, python-format
msgid "%(value)s is not in the correct format of 'hours=3;minutes=1'."
@@ -1746,6 +1654,10 @@ msgstr "SCIM 请求的基础 URL通常以 /v2 结尾"
msgid "Authentication token"
msgstr "身份验证令牌"
#: authentik/providers/scim/models.py authentik/sources/ldap/models.py
msgid "Property mappings used for group creation/updating."
msgstr "用于创建/更新组的属性映射。"
#: authentik/providers/scim/models.py
msgid "SCIM Provider"
msgstr "SCIM 提供程序"
@@ -1762,6 +1674,35 @@ msgstr "SCIM 映射"
msgid "SCIM Mappings"
msgstr "SCIM 映射"
#: authentik/providers/scim/tasks.py
msgid "Starting full SCIM sync"
msgstr "开始全量 SCIM 同步"
#: authentik/providers/scim/tasks.py
#, python-format
msgid "Syncing page %(page)d of users"
msgstr "正在同步用户页面 %(page)d"
#: authentik/providers/scim/tasks.py
#, python-format
msgid "Syncing page %(page)d of groups"
msgstr "正在同步群组页面 %(page)d"
#: authentik/providers/scim/tasks.py
#, python-brace-format
msgid "Failed to sync user {user_name} due to remote error: {error}"
msgstr "由于远端错误,同步用户 {user_name} 失败:{error}"
#: authentik/providers/scim/tasks.py
#, python-brace-format
msgid "Stopping sync due to error: {error}"
msgstr "由于以下错误,同步停止:{error}"
#: authentik/providers/scim/tasks.py
#, python-brace-format
msgid "Failed to sync group {group_name} due to remote error: {error}"
msgstr "由于远端错误,同步组 {group_name} 失败:{error}"
#: authentik/rbac/models.py
msgid "Role"
msgstr "角色"

poetry.lock (generated)

@@ -315,42 +315,6 @@ six = "*"
[package.extras]
visualize = ["Twisted (>=16.1.1)", "graphviz (>0.5.1)"]
[[package]]
name = "azure-core"
version = "1.30.1"
description = "Microsoft Azure Core Library for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "azure-core-1.30.1.tar.gz", hash = "sha256:26273a254131f84269e8ea4464f3560c731f29c0c1f69ac99010845f239c1a8f"},
{file = "azure_core-1.30.1-py3-none-any.whl", hash = "sha256:7c5ee397e48f281ec4dd773d67a0a47a0962ed6fa833036057f9ea067f688e74"},
]
[package.dependencies]
requests = ">=2.21.0"
six = ">=1.11.0"
typing-extensions = ">=4.6.0"
[package.extras]
aio = ["aiohttp (>=3.0)"]
[[package]]
name = "azure-identity"
version = "1.16.0"
description = "Microsoft Azure Identity Library for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "azure-identity-1.16.0.tar.gz", hash = "sha256:6ff1d667cdcd81da1ceab42f80a0be63ca846629f518a922f7317a7e3c844e1b"},
{file = "azure_identity-1.16.0-py3-none-any.whl", hash = "sha256:722fdb60b8fdd55fa44dc378b8072f4b419b56a5e54c0de391f644949f3a826f"},
]
[package.dependencies]
azure-core = ">=1.23.0"
cryptography = ">=2.5"
msal = ">=1.24.0"
msal-extensions = ">=0.3.0"
[[package]]
name = "bandit"
version = "1.7.8"
@@ -1157,23 +1121,6 @@ files = [
{file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"},
]
[[package]]
name = "deprecated"
version = "1.2.14"
description = "Python @deprecated decorator to deprecate old python classes, functions or methods."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
{file = "Deprecated-1.2.14-py2.py3-none-any.whl", hash = "sha256:6fac8b097794a90302bdbb17b9b815e732d3c4720583ff1b198499d78470466c"},
{file = "Deprecated-1.2.14.tar.gz", hash = "sha256:e5323eb936458dccc2582dc6f9c322c852a775a27065ff2b0c4970b9d53d01b3"},
]
[package.dependencies]
wrapt = ">=1.10,<2"
[package.extras]
dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "sphinx (<2)", "tox"]
[[package]]
name = "django"
version = "5.0.6"
@@ -1523,13 +1470,13 @@ tornado = ">=5.0.0,<7.0.0"
[[package]]
name = "freezegun"
version = "1.5.1"
version = "1.5.0"
description = "Let your Python tests travel through time"
optional = false
python-versions = ">=3.7"
files = [
{file = "freezegun-1.5.1-py3-none-any.whl", hash = "sha256:bf111d7138a8abe55ab48a71755673dbaa4ab87f4cff5634a4442dfec34c15f1"},
{file = "freezegun-1.5.1.tar.gz", hash = "sha256:b29dedfcda6d5e8e083ce71b2b542753ad48cfec44037b3fc79702e2980a89e9"},
{file = "freezegun-1.5.0-py3-none-any.whl", hash = "sha256:ec3f4ba030e34eb6cf7e1e257308aee2c60c3d038ff35996d7475760c9ff3719"},
{file = "freezegun-1.5.0.tar.gz", hash = "sha256:200a64359b363aa3653d8aac289584078386c7c3da77339d257e46a01fb5c77c"},
]
[package.dependencies]
@@ -1666,13 +1613,13 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]
[[package]]
name = "google-api-python-client"
version = "2.129.0"
version = "2.128.0"
description = "Google API Client Library for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "google-api-python-client-2.129.0.tar.gz", hash = "sha256:984cc8cc8eb4923468b1926d2b8effc5b459a4dda3c845896eb87c153b28ef84"},
{file = "google_api_python_client-2.129.0-py2.py3-none-any.whl", hash = "sha256:d50f7e2dfdbb7fc2732f6a0cba1c54d7bb676390679526c6bb628c901e43ec86"},
{file = "google-api-python-client-2.128.0.tar.gz", hash = "sha256:908af182dfc1cd79412a489b37fe45e4f3cc99c74e80c7c477ca5babaa54eea5"},
{file = "google_api_python_client-2.128.0-py2.py3-none-any.whl", hash = "sha256:99da6acb0acc648e309102b0e0262d7fef30f07f6bf56c6eeaa0504ceca113e3"},
]
[package.dependencies]
@@ -1769,53 +1716,6 @@ files = [
{file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"},
]
[[package]]
name = "h2"
version = "4.1.0"
description = "HTTP/2 State-Machine based protocol implementation"
optional = false
python-versions = ">=3.6.1"
files = [
{file = "h2-4.1.0-py3-none-any.whl", hash = "sha256:03a46bcf682256c95b5fd9e9a99c1323584c3eec6440d379b9903d709476bc6d"},
{file = "h2-4.1.0.tar.gz", hash = "sha256:a83aca08fbe7aacb79fec788c9c0bac936343560ed9ec18b82a13a12c28d2abb"},
]
[package.dependencies]
hpack = ">=4.0,<5"
hyperframe = ">=6.0,<7"
[[package]]
name = "hpack"
version = "4.0.0"
description = "Pure-Python HPACK header compression"
optional = false
python-versions = ">=3.6.1"
files = [
{file = "hpack-4.0.0-py3-none-any.whl", hash = "sha256:84a076fad3dc9a9f8063ccb8041ef100867b1878b25ef0ee63847a5d53818a6c"},
{file = "hpack-4.0.0.tar.gz", hash = "sha256:fc41de0c63e687ebffde81187a948221294896f6bdc0ae2312708df339430095"},
]
[[package]]
name = "httpcore"
version = "1.0.5"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
files = [
{file = "httpcore-1.0.5-py3-none-any.whl", hash = "sha256:421f18bac248b25d310f3cacd198d55b8e6125c107797b609ff9b7a6ba7991b5"},
{file = "httpcore-1.0.5.tar.gz", hash = "sha256:34a38e2f9291467ee3b44e89dd52615370e152954ba21721378a87b2960f7a61"},
]
[package.dependencies]
certifi = "*"
h11 = ">=0.13,<0.15"
[package.extras]
asyncio = ["anyio (>=4.0,<5.0)"]
http2 = ["h2 (>=3,<5)"]
socks = ["socksio (==1.*)"]
trio = ["trio (>=0.22.0,<0.26.0)"]
[[package]]
name = "httplib2"
version = "0.22.0"
@@ -1878,31 +1778,6 @@ files = [
[package.extras]
test = ["Cython (>=0.29.24,<0.30.0)"]
[[package]]
name = "httpx"
version = "0.27.0"
description = "The next generation HTTP client."
optional = false
python-versions = ">=3.8"
files = [
{file = "httpx-0.27.0-py3-none-any.whl", hash = "sha256:71d5465162c13681bff01ad59b2cc68dd838ea1f10e51574bac27103f00c91a5"},
{file = "httpx-0.27.0.tar.gz", hash = "sha256:a0cb88a46f32dc874e04ee956e4c2764aba2aa228f650b06788ba6bda2962ab5"},
]
[package.dependencies]
anyio = "*"
certifi = "*"
h2 = {version = ">=3,<5", optional = true, markers = "extra == \"http2\""}
httpcore = "==1.*"
idna = "*"
sniffio = "*"
[package.extras]
brotli = ["brotli", "brotlicffi"]
cli = ["click (==8.*)", "pygments (==2.*)", "rich (>=10,<14)"]
http2 = ["h2 (>=3,<5)"]
socks = ["socksio (==1.*)"]
[[package]]
name = "humanize"
version = "4.9.0"
@@ -1917,17 +1792,6 @@ files = [
[package.extras]
tests = ["freezegun", "pytest", "pytest-cov"]
[[package]]
name = "hyperframe"
version = "6.0.1"
description = "HTTP/2 framing layer for Python"
optional = false
python-versions = ">=3.6.1"
files = [
{file = "hyperframe-6.0.1-py3-none-any.whl", hash = "sha256:0ec6bafd80d8ad2195c4f03aacba3a8265e57bc4cff261e802bf39970ed02a15"},
{file = "hyperframe-6.0.1.tar.gz", hash = "sha256:ae510046231dc8e9ecb1a6586f63d2347bf4c8905914aa84ba585ae85f28a914"},
]
[[package]]
name = "hyperlink"
version = "21.0.0"
@@ -1955,22 +1819,22 @@ files = [
[[package]]
name = "importlib-metadata"
version = "7.0.0"
version = "7.1.0"
description = "Read metadata from Python packages"
optional = false
python-versions = ">=3.8"
files = [
{file = "importlib_metadata-7.0.0-py3-none-any.whl", hash = "sha256:d97503976bb81f40a193d41ee6570868479c69d5068651eb039c40d850c59d67"},
{file = "importlib_metadata-7.0.0.tar.gz", hash = "sha256:7fc841f8b8332803464e5dc1c63a2e59121f46ca186c0e2e182e80bf8c1319f7"},
{file = "importlib_metadata-7.1.0-py3-none-any.whl", hash = "sha256:30962b96c0c223483ed6cc7280e7f0199feb01a0e40cfae4d4450fc6fab1f570"},
{file = "importlib_metadata-7.1.0.tar.gz", hash = "sha256:b78938b926ee8d5f020fc4772d487045805a55ddbad2ecf21c6d60938dc7fcd2"},
]
[package.dependencies]
zipp = ">=0.5"
[package.extras]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-lint"]
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
perf = ["ipython"]
testing = ["flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf (>=0.9.2)", "pytest-ruff"]
testing = ["flufl.flake8", "importlib-resources (>=1.3)", "jaraco.test (>=5.4)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy", "pytest-perf (>=0.9.2)", "pytest-ruff (>=0.2.1)"]
[[package]]
name = "incremental"
@@ -2521,154 +2385,6 @@ files = [
{file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"},
]
[[package]]
name = "microsoft-kiota-abstractions"
version = "1.3.2"
description = "Core abstractions for kiota generated libraries in Python"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_abstractions-1.3.2-py2.py3-none-any.whl", hash = "sha256:ec4335df425874b1c0171a97c4b5ccdc4a9d076e1ecd3a5c2582af1cacc25016"},
{file = "microsoft_kiota_abstractions-1.3.2.tar.gz", hash = "sha256:acac0b34b443d3fc10a3a86dd996cdf92248080553a3768a77c23350541f1aa2"},
]
[package.dependencies]
opentelemetry-api = ">=1.19.0"
opentelemetry-sdk = ">=1.19.0"
std-uritemplate = ">=0.0.38"
[[package]]
name = "microsoft-kiota-authentication-azure"
version = "1.0.0"
description = "Authentication provider for Kiota using Azure Identity"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_authentication_azure-1.0.0-py2.py3-none-any.whl", hash = "sha256:289fe002951ae661415a6d3fa7c422c096b739165acb32d786316988120a1b27"},
{file = "microsoft_kiota_authentication_azure-1.0.0.tar.gz", hash = "sha256:752304f8d94b884cfec12583dd763ec0478805c7f80b29344e78c6d55a97bd01"},
]
[package.dependencies]
aiohttp = ">=3.8.0"
azure-core = ">=1.21.1"
microsoft-kiota-abstractions = ">=1.0.0,<2.0.0"
opentelemetry-api = ">=1.20.0"
opentelemetry-sdk = ">=1.20.0"
[[package]]
name = "microsoft-kiota-http"
version = "1.3.1"
description = "Kiota http request adapter implementation for httpx library"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_http-1.3.1-py2.py3-none-any.whl", hash = "sha256:d62972c6ed4c785f9808a15479a7421abb38a9519b39e6933e5d05555b9fb427"},
{file = "microsoft_kiota_http-1.3.1.tar.gz", hash = "sha256:09d85310379f88af0a0967925d1fcbe82f2520a9fe6fa1fd50e79af813bc451d"},
]
[package.dependencies]
httpx = {version = ">=0.23.0", extras = ["http2"]}
microsoft-kiota_abstractions = ">=1.0.0,<2.0.0"
opentelemetry-api = ">=1.20.0"
opentelemetry-sdk = ">=1.20.0"
[[package]]
name = "microsoft-kiota-serialization-form"
version = "0.1.0"
description = "Implementation of Kiota Serialization Interfaces for URI-Form encoded serialization"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_serialization_form-0.1.0-py2.py3-none-any.whl", hash = "sha256:5bc76fb2fc67d7c1f878f876d252ea814e4fc38df505099b9b86de52d974380a"},
{file = "microsoft_kiota_serialization_form-0.1.0.tar.gz", hash = "sha256:663ece0cb1a41fe9ddfc9195aa3f15f219e14d2a1ee51e98c53ad8d795b2785d"},
]
[package.dependencies]
microsoft-kiota_abstractions = ">=1.0.0,<2.0.0"
pendulum = ">=3.0.0"
[[package]]
name = "microsoft-kiota-serialization-json"
version = "1.2.0"
description = "Implementation of Kiota Serialization interfaces for JSON"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_serialization_json-1.2.0-py2.py3-none-any.whl", hash = "sha256:cf68ef323157b3566b043d2282b292479bca6af0ffcf08385c806c812e507a58"},
{file = "microsoft_kiota_serialization_json-1.2.0.tar.gz", hash = "sha256:89a4ec0128958bc92287db0cf5b6616a9f66ac42f6c7bcfe8894393d2156bed9"},
]
[package.dependencies]
microsoft-kiota_abstractions = ">=1.0.0,<2.0.0"
pendulum = ">=3.0.0b1"
[[package]]
name = "microsoft-kiota-serialization-multipart"
version = "0.1.0"
description = "Implementation of Kiota Serialization Interfaces for Multipart serialization"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_serialization_multipart-0.1.0-py2.py3-none-any.whl", hash = "sha256:ef183902e77807806b8a181cdde53ba5bc04c6c9bdb2f7d80f8bad5d720e0015"},
{file = "microsoft_kiota_serialization_multipart-0.1.0.tar.gz", hash = "sha256:14e89e92582e6630ddbc70ac67b70bf189dacbfc41a96d3e1d10339e86c8dde5"},
]
[package.dependencies]
microsoft-kiota_abstractions = ">=1.0.0,<2.0.0"
[[package]]
name = "microsoft-kiota-serialization-text"
version = "1.0.0"
description = "Implementation of Kiota Serialization interfaces for text/plain"
optional = false
python-versions = "*"
files = [
{file = "microsoft_kiota_serialization_text-1.0.0-py2.py3-none-any.whl", hash = "sha256:1d3789e012b603e059a36cc675d1fd08cb81e0dde423d970c0af2eabce9c0d43"},
{file = "microsoft_kiota_serialization_text-1.0.0.tar.gz", hash = "sha256:c3dd3f409b1c4f4963bd1e41d51b65f7e53e852130bb441d79b77dad88ee76ed"},
]
[package.dependencies]
microsoft-kiota_abstractions = ">=1.0.0,<2.0.0"
python-dateutil = ">=2.8.2"
[[package]]
name = "msal"
version = "1.28.0"
description = "The Microsoft Authentication Library (MSAL) for Python library enables your app to access the Microsoft Cloud by supporting authentication of users with Microsoft Azure Active Directory accounts (AAD) and Microsoft Accounts (MSA) using industry standard OAuth2 and OpenID Connect."
optional = false
python-versions = ">=3.7"
files = [
{file = "msal-1.28.0-py3-none-any.whl", hash = "sha256:3064f80221a21cd535ad8c3fafbb3a3582cd9c7e9af0bb789ae14f726a0ca99b"},
{file = "msal-1.28.0.tar.gz", hash = "sha256:80bbabe34567cb734efd2ec1869b2d98195c927455369d8077b3c542088c5c9d"},
]
[package.dependencies]
cryptography = ">=0.6,<45"
PyJWT = {version = ">=1.0.0,<3", extras = ["crypto"]}
requests = ">=2.0.0,<3"
[package.extras]
broker = ["pymsalruntime (>=0.13.2,<0.15)"]
[[package]]
name = "msal-extensions"
version = "1.1.0"
description = "Microsoft Authentication Library extensions (MSAL EX) provides a persistence API that can save your data on disk, encrypted on Windows, macOS and Linux. Concurrent data access will be coordinated by a file lock mechanism."
optional = false
python-versions = ">=3.7"
files = [
{file = "msal-extensions-1.1.0.tar.gz", hash = "sha256:6ab357867062db7b253d0bd2df6d411c7891a0ee7308d54d1e4317c1d1c54252"},
{file = "msal_extensions-1.1.0-py3-none-any.whl", hash = "sha256:01be9711b4c0b1a151450068eeb2c4f0997df3bba085ac299de3a66f585e382f"},
]
[package.dependencies]
msal = ">=0.4.1,<2.0.0"
packaging = "*"
portalocker = [
{version = ">=1.0,<3", markers = "platform_system != \"Windows\""},
{version = ">=1.6,<3", markers = "platform_system == \"Windows\""},
]
[[package]]
name = "msgpack"
version = "1.0.8"
@@ -2734,51 +2450,6 @@ files = [
{file = "msgpack-1.0.8.tar.gz", hash = "sha256:95c02b0e27e706e48d0e5426d1710ca78e0f0628d6e89d5b5a5b91a5f12274f3"},
]
[[package]]
name = "msgraph-core"
version = "1.0.0"
description = "Core component of the Microsoft Graph Python SDK"
optional = false
python-versions = ">=3.8"
files = [
{file = "msgraph-core-1.0.0.tar.gz", hash = "sha256:f26bcbbb3cd149dd7f1613159e0c2ed862888d61bfd20ef0b08b9408eb670c9d"},
{file = "msgraph_core-1.0.0-py3-none-any.whl", hash = "sha256:f3de5149e246833b4b03605590d0b4eacf58d9c5a10fd951c37e53f0a345afd5"},
]
[package.dependencies]
httpx = {version = ">=0.23.0", extras = ["http2"]}
microsoft-kiota-abstractions = ">=1.0.0,<2.0.0"
microsoft-kiota-authentication-azure = ">=1.0.0,<2.0.0"
microsoft-kiota-http = ">=1.0.0,<2.0.0"
[package.extras]
dev = ["bumpver", "isort", "mypy", "pylint", "pytest", "yapf"]
[[package]]
name = "msgraph-sdk"
version = "1.4.0"
description = "The Microsoft Graph Python SDK"
optional = false
python-versions = ">=3.8"
files = [
{file = "msgraph_sdk-1.4.0-py3-none-any.whl", hash = "sha256:24f99082475ea129c3d45e44269bd64a7c6bfef8dda4f8ea692bbc9e47b71b78"},
{file = "msgraph_sdk-1.4.0.tar.gz", hash = "sha256:715907272c240e579d7669a690504488e25ae15fec904e2918c49ca328dc4a14"},
]
[package.dependencies]
azure-identity = ">=1.12.0"
microsoft-kiota-abstractions = ">=1.0.0,<2.0.0"
microsoft-kiota-authentication-azure = ">=1.0.0,<2.0.0"
microsoft-kiota-http = ">=1.0.0,<2.0.0"
microsoft-kiota-serialization-form = ">=0.1.0"
microsoft-kiota-serialization-json = ">=1.0.0,<2.0.0"
microsoft-kiota-serialization-multipart = ">=0.1.0"
microsoft-kiota-serialization-text = ">=1.0.0,<2.0.0"
msgraph-core = ">=1.0.0"
[package.extras]
dev = ["bumpver", "isort", "mypy", "pylint", "pytest", "yapf"]
[[package]]
name = "multidict"
version = "6.0.5"
@@ -2929,48 +2600,6 @@ files = [
{file = "opencontainers-0.0.14.tar.gz", hash = "sha256:fde3b8099b56b5c956415df8933e2227e1914e805a277b844f2f9e52341738f2"},
]
[[package]]
name = "opentelemetry-api"
version = "1.24.0"
description = "OpenTelemetry Python API"
optional = false
python-versions = ">=3.8"
files = [
{file = "opentelemetry_api-1.24.0-py3-none-any.whl", hash = "sha256:0f2c363d98d10d1ce93330015ca7fd3a65f60be64e05e30f557c61de52c80ca2"},
{file = "opentelemetry_api-1.24.0.tar.gz", hash = "sha256:42719f10ce7b5a9a73b10a4baf620574fb8ad495a9cbe5c18d76b75d8689c67e"},
]
[package.dependencies]
deprecated = ">=1.2.6"
importlib-metadata = ">=6.0,<=7.0"
[[package]]
name = "opentelemetry-sdk"
version = "1.24.0"
description = "OpenTelemetry Python SDK"
optional = false
python-versions = ">=3.8"
files = [
{file = "opentelemetry_sdk-1.24.0-py3-none-any.whl", hash = "sha256:fa731e24efe832e98bcd90902085b359dcfef7d9c9c00eb5b9a18587dae3eb59"},
{file = "opentelemetry_sdk-1.24.0.tar.gz", hash = "sha256:75bc0563affffa827700e0f4f4a68e1e257db0df13372344aebc6f8a64cde2e5"},
]
[package.dependencies]
opentelemetry-api = "1.24.0"
opentelemetry-semantic-conventions = "0.45b0"
typing-extensions = ">=3.7.4"
[[package]]
name = "opentelemetry-semantic-conventions"
version = "0.45b0"
description = "OpenTelemetry Semantic Conventions"
optional = false
python-versions = ">=3.8"
files = [
{file = "opentelemetry_semantic_conventions-0.45b0-py3-none-any.whl", hash = "sha256:a4a6fb9a7bacd9167c082aa4681009e9acdbfa28ffb2387af50c2fef3d30c864"},
{file = "opentelemetry_semantic_conventions-0.45b0.tar.gz", hash = "sha256:7c84215a44ac846bc4b8e32d5e78935c5c43482e491812a0bb8aaf87e4d92118"},
]
[[package]]
name = "outcome"
version = "1.3.0.post0"
@@ -3058,105 +2687,6 @@ pygments = ">=2.12.0"
[package.extras]
dev = ["hypothesis", "mypy", "pdoc-pyo3-sample-library (==1.0.11)", "pygments (>=2.14.0)", "pytest", "pytest-cov", "pytest-timeout", "ruff", "tox", "types-pygments"]
[[package]]
name = "pendulum"
version = "3.0.0"
description = "Python datetimes made easy"
optional = false
python-versions = ">=3.8"
files = [
{file = "pendulum-3.0.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2cf9e53ef11668e07f73190c805dbdf07a1939c3298b78d5a9203a86775d1bfd"},
{file = "pendulum-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fb551b9b5e6059377889d2d878d940fd0bbb80ae4810543db18e6f77b02c5ef6"},
{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c58227ac260d5b01fc1025176d7b31858c9f62595737f350d22124a9a3ad82d"},
{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:60fb6f415fea93a11c52578eaa10594568a6716602be8430b167eb0d730f3332"},
{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b69f6b4dbcb86f2c2fe696ba991e67347bcf87fe601362a1aba6431454b46bde"},
{file = "pendulum-3.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:138afa9c373ee450ede206db5a5e9004fd3011b3c6bbe1e57015395cd076a09f"},
{file = "pendulum-3.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:83d9031f39c6da9677164241fd0d37fbfc9dc8ade7043b5d6d62f56e81af8ad2"},
{file = "pendulum-3.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0c2308af4033fa534f089595bcd40a95a39988ce4059ccd3dc6acb9ef14ca44a"},
{file = "pendulum-3.0.0-cp310-none-win_amd64.whl", hash = "sha256:9a59637cdb8462bdf2dbcb9d389518c0263799189d773ad5c11db6b13064fa79"},
{file = "pendulum-3.0.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:3725245c0352c95d6ca297193192020d1b0c0f83d5ee6bb09964edc2b5a2d508"},
{file = "pendulum-3.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6c035f03a3e565ed132927e2c1b691de0dbf4eb53b02a5a3c5a97e1a64e17bec"},
{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:597e66e63cbd68dd6d58ac46cb7a92363d2088d37ccde2dae4332ef23e95cd00"},
{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99a0f8172e19f3f0c0e4ace0ad1595134d5243cf75985dc2233e8f9e8de263ca"},
{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:77d8839e20f54706aed425bec82a83b4aec74db07f26acd039905d1237a5e1d4"},
{file = "pendulum-3.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afde30e8146292b059020fbc8b6f8fd4a60ae7c5e6f0afef937bbb24880bdf01"},
{file = "pendulum-3.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:660434a6fcf6303c4efd36713ca9212c753140107ee169a3fc6c49c4711c2a05"},
{file = "pendulum-3.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dee9e5a48c6999dc1106eb7eea3e3a50e98a50651b72c08a87ee2154e544b33e"},
{file = "pendulum-3.0.0-cp311-none-win_amd64.whl", hash = "sha256:d4cdecde90aec2d67cebe4042fd2a87a4441cc02152ed7ed8fb3ebb110b94ec4"},
{file = "pendulum-3.0.0-cp311-none-win_arm64.whl", hash = "sha256:773c3bc4ddda2dda9f1b9d51fe06762f9200f3293d75c4660c19b2614b991d83"},
{file = "pendulum-3.0.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:409e64e41418c49f973d43a28afe5df1df4f1dd87c41c7c90f1a63f61ae0f1f7"},
{file = "pendulum-3.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a38ad2121c5ec7c4c190c7334e789c3b4624798859156b138fcc4d92295835dc"},
{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fde4d0b2024b9785f66b7f30ed59281bd60d63d9213cda0eb0910ead777f6d37"},
{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b2c5675769fb6d4c11238132962939b960fcb365436b6d623c5864287faa319"},
{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8af95e03e066826f0f4c65811cbee1b3123d4a45a1c3a2b4fc23c4b0dff893b5"},
{file = "pendulum-3.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2165a8f33cb15e06c67070b8afc87a62b85c5a273e3aaa6bc9d15c93a4920d6f"},
{file = "pendulum-3.0.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ad5e65b874b5e56bd942546ea7ba9dd1d6a25121db1c517700f1c9de91b28518"},
{file = "pendulum-3.0.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:17fe4b2c844bbf5f0ece69cfd959fa02957c61317b2161763950d88fed8e13b9"},
{file = "pendulum-3.0.0-cp312-none-win_amd64.whl", hash = "sha256:78f8f4e7efe5066aca24a7a57511b9c2119f5c2b5eb81c46ff9222ce11e0a7a5"},
{file = "pendulum-3.0.0-cp312-none-win_arm64.whl", hash = "sha256:28f49d8d1e32aae9c284a90b6bb3873eee15ec6e1d9042edd611b22a94ac462f"},
{file = "pendulum-3.0.0-cp37-cp37m-macosx_10_12_x86_64.whl", hash = "sha256:d4e2512f4e1a4670284a153b214db9719eb5d14ac55ada5b76cbdb8c5c00399d"},
{file = "pendulum-3.0.0-cp37-cp37m-macosx_11_0_arm64.whl", hash = "sha256:3d897eb50883cc58d9b92f6405245f84b9286cd2de6e8694cb9ea5cb15195a32"},
{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e169cc2ca419517f397811bbe4589cf3cd13fca6dc38bb352ba15ea90739ebb"},
{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f17c3084a4524ebefd9255513692f7e7360e23c8853dc6f10c64cc184e1217ab"},
{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:826d6e258052715f64d05ae0fc9040c0151e6a87aae7c109ba9a0ed930ce4000"},
{file = "pendulum-3.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2aae97087872ef152a0c40e06100b3665d8cb86b59bc8471ca7c26132fccd0f"},
{file = "pendulum-3.0.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ac65eeec2250d03106b5e81284ad47f0d417ca299a45e89ccc69e36130ca8bc7"},
{file = "pendulum-3.0.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:a5346d08f3f4a6e9e672187faa179c7bf9227897081d7121866358af369f44f9"},
{file = "pendulum-3.0.0-cp37-none-win_amd64.whl", hash = "sha256:235d64e87946d8f95c796af34818c76e0f88c94d624c268693c85b723b698aa9"},
{file = "pendulum-3.0.0-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:6a881d9c2a7f85bc9adafcfe671df5207f51f5715ae61f5d838b77a1356e8b7b"},
{file = "pendulum-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d7762d2076b9b1cb718a6631ad6c16c23fc3fac76cbb8c454e81e80be98daa34"},
{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e8e36a8130819d97a479a0e7bf379b66b3b1b520e5dc46bd7eb14634338df8c"},
{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7dc843253ac373358ffc0711960e2dd5b94ab67530a3e204d85c6e8cb2c5fa10"},
{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0a78ad3635d609ceb1e97d6aedef6a6a6f93433ddb2312888e668365908c7120"},
{file = "pendulum-3.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b30a137e9e0d1f751e60e67d11fc67781a572db76b2296f7b4d44554761049d6"},
{file = "pendulum-3.0.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c95984037987f4a457bb760455d9ca80467be792236b69d0084f228a8ada0162"},
{file = "pendulum-3.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d29c6e578fe0f893766c0d286adbf0b3c726a4e2341eba0917ec79c50274ec16"},
{file = "pendulum-3.0.0-cp38-none-win_amd64.whl", hash = "sha256:deaba8e16dbfcb3d7a6b5fabdd5a38b7c982809567479987b9c89572df62e027"},
{file = "pendulum-3.0.0-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b11aceea5b20b4b5382962b321dbc354af0defe35daa84e9ff3aae3c230df694"},
{file = "pendulum-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a90d4d504e82ad236afac9adca4d6a19e4865f717034fc69bafb112c320dcc8f"},
{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:825799c6b66e3734227756fa746cc34b3549c48693325b8b9f823cb7d21b19ac"},
{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad769e98dc07972e24afe0cff8d365cb6f0ebc7e65620aa1976fcfbcadc4c6f3"},
{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a6fc26907eb5fb8cc6188cc620bc2075a6c534d981a2f045daa5f79dfe50d512"},
{file = "pendulum-3.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c717eab1b6d898c00a3e0fa7781d615b5c5136bbd40abe82be100bb06df7a56"},
{file = "pendulum-3.0.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:3ddd1d66d1a714ce43acfe337190be055cdc221d911fc886d5a3aae28e14b76d"},
{file = "pendulum-3.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:822172853d7a9cf6da95d7b66a16c7160cb99ae6df55d44373888181d7a06edc"},
{file = "pendulum-3.0.0-cp39-none-win_amd64.whl", hash = "sha256:840de1b49cf1ec54c225a2a6f4f0784d50bd47f68e41dc005b7f67c7d5b5f3ae"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3b1f74d1e6ffe5d01d6023870e2ce5c2191486928823196f8575dcc786e107b1"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:729e9f93756a2cdfa77d0fc82068346e9731c7e884097160603872686e570f07"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e586acc0b450cd21cbf0db6bae386237011b75260a3adceddc4be15334689a9a"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22e7944ffc1f0099a79ff468ee9630c73f8c7835cd76fdb57ef7320e6a409df4"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:fa30af36bd8e50686846bdace37cf6707bdd044e5cb6e1109acbad3277232e04"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:440215347b11914ae707981b9a57ab9c7b6983ab0babde07063c6ee75c0dc6e7"},
{file = "pendulum-3.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:314c4038dc5e6a52991570f50edb2f08c339debdf8cea68ac355b32c4174e820"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5acb1d386337415f74f4d1955c4ce8d0201978c162927d07df8eb0692b2d8533"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a789e12fbdefaffb7b8ac67f9d8f22ba17a3050ceaaa635cd1cc4645773a4b1e"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:860aa9b8a888e5913bd70d819306749e5eb488e6b99cd6c47beb701b22bdecf5"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:5ebc65ea033ef0281368217fbf59f5cb05b338ac4dd23d60959c7afcd79a60a0"},
{file = "pendulum-3.0.0-pp37-pypy37_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:d9fef18ab0386ef6a9ac7bad7e43ded42c83ff7ad412f950633854f90d59afa8"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:1c134ba2f0571d0b68b83f6972e2307a55a5a849e7dac8505c715c531d2a8795"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:385680812e7e18af200bb9b4a49777418c32422d05ad5a8eb85144c4a285907b"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9eec91cd87c59fb32ec49eb722f375bd58f4be790cae11c1b70fac3ee4f00da0"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4386bffeca23c4b69ad50a36211f75b35a4deb6210bdca112ac3043deb7e494a"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:dfbcf1661d7146d7698da4b86e7f04814221081e9fe154183e34f4c5f5fa3bf8"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:04a1094a5aa1daa34a6b57c865b25f691848c61583fb22722a4df5699f6bf74c"},
{file = "pendulum-3.0.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:5b0ec85b9045bd49dd3a3493a5e7ddfd31c36a2a60da387c419fa04abcaecb23"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:0a15b90129765b705eb2039062a6daf4d22c4e28d1a54fa260892e8c3ae6e157"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:bb8f6d7acd67a67d6fedd361ad2958ff0539445ef51cbe8cd288db4306503cd0"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fd69b15374bef7e4b4440612915315cc42e8575fcda2a3d7586a0d88192d0c88"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc00f8110db6898360c53c812872662e077eaf9c75515d53ecc65d886eec209a"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:83a44e8b40655d0ba565a5c3d1365d27e3e6778ae2a05b69124db9e471255c4a"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:1a3604e9fbc06b788041b2a8b78f75c243021e0f512447806a6d37ee5214905d"},
{file = "pendulum-3.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:92c307ae7accebd06cbae4729f0ba9fa724df5f7d91a0964b1b972a22baa482b"},
{file = "pendulum-3.0.0.tar.gz", hash = "sha256:5d034998dea404ec31fae27af6b22cff1708f830a1ed7353be4d1019bb9f584e"},
]
[package.dependencies]
python-dateutil = ">=2.6"
tzdata = ">=2020.1"
[package.extras]
test = ["time-machine (>=2.6.0)"]
[[package]]
name = "platformdirs"
version = "4.2.1"
@@ -3188,25 +2718,6 @@ files = [
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "portalocker"
version = "2.8.2"
description = "Wraps the portalocker recipe for easy usage"
optional = false
python-versions = ">=3.8"
files = [
{file = "portalocker-2.8.2-py3-none-any.whl", hash = "sha256:cfb86acc09b9aa7c3b43594e19be1345b9d16af3feb08bf92f23d4dce513a28e"},
{file = "portalocker-2.8.2.tar.gz", hash = "sha256:2b035aa7828e46c58e9b31390ee1f169b98e1066ab10b9a6a861fe7e25ee4f33"},
]
[package.dependencies]
pywin32 = {version = ">=226", markers = "platform_system == \"Windows\""}
[package.extras]
docs = ["sphinx (>=1.7.1)"]
redis = ["redis"]
tests = ["pytest (>=5.4.1)", "pytest-cov (>=2.8.1)", "pytest-mypy (>=0.8.0)", "pytest-timeout (>=2.1.0)", "redis", "sphinx (>=6.0.0)", "types-redis"]
[[package]]
name = "prometheus-client"
version = "0.20.0"
@@ -3274,23 +2785,23 @@ files = [
[[package]]
name = "psycopg"
version = "3.1.19"
version = "3.1.18"
description = "PostgreSQL database adapter for Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "psycopg-3.1.19-py3-none-any.whl", hash = "sha256:dca5e5521c859f6606686432ae1c94e8766d29cc91f2ee595378c510cc5b0731"},
{file = "psycopg-3.1.19.tar.gz", hash = "sha256:92d7b78ad82426cdcf1a0440678209faa890c6e1721361c2f8901f0dccd62961"},
{file = "psycopg-3.1.18-py3-none-any.whl", hash = "sha256:4d5a0a5a8590906daa58ebd5f3cfc34091377354a1acced269dd10faf55da60e"},
{file = "psycopg-3.1.18.tar.gz", hash = "sha256:31144d3fb4c17d78094d9e579826f047d4af1da6a10427d91dfcfb6ecdf6f12b"},
]
[package.dependencies]
psycopg-c = {version = "3.1.19", optional = true, markers = "implementation_name != \"pypy\" and extra == \"c\""}
psycopg-c = {version = "3.1.18", optional = true, markers = "implementation_name != \"pypy\" and extra == \"c\""}
typing-extensions = ">=4.1"
tzdata = {version = "*", markers = "sys_platform == \"win32\""}
[package.extras]
binary = ["psycopg-binary (==3.1.19)"]
c = ["psycopg-c (==3.1.19)"]
binary = ["psycopg-binary (==3.1.18)"]
c = ["psycopg-c (==3.1.18)"]
dev = ["black (>=24.1.0)", "codespell (>=2.2)", "dnspython (>=2.1)", "flake8 (>=4.0)", "mypy (>=1.4.1)", "types-setuptools (>=57.4)", "wheel (>=0.37)"]
docs = ["Sphinx (>=5.0)", "furo (==2022.6.21)", "sphinx-autobuild (>=2021.3.14)", "sphinx-autodoc-typehints (>=1.12)"]
pool = ["psycopg-pool"]
@@ -3298,12 +2809,12 @@ test = ["anyio (>=3.6.2,<4.0)", "mypy (>=1.4.1)", "pproxy (>=2.7)", "pytest (>=6
[[package]]
name = "psycopg-c"
version = "3.1.19"
version = "3.1.18"
description = "PostgreSQL database adapter for Python -- C optimisation distribution"
optional = false
python-versions = ">=3.7"
files = [
{file = "psycopg_c-3.1.19.tar.gz", hash = "sha256:8e90f53c430e7d661cb3a9298e2761847212ead1b24c5fb058fc9d0fd9616017"},
{file = "psycopg-c-3.1.18.tar.gz", hash = "sha256:ffff0c4a9c0e0b7aadb1acb7b61eb8f886365dd8ef00120ce14676235846ba73"},
]
[[package]]
@@ -3496,9 +3007,6 @@ files = [
{file = "PyJWT-2.8.0.tar.gz", hash = "sha256:57e28d156e3d5c10088e0c68abb90bfac3df82b40a71bd0daa20c65ccd5c23de"},
]
[package.dependencies]
cryptography = {version = ">=3.4.0", optional = true, markers = "extra == \"crypto\""}
[package.extras]
crypto = ["cryptography (>=3.4.0)"]
dev = ["coverage[toml] (==5.0.4)", "cryptography (>=3.4.0)", "pre-commit", "pytest (>=6.0.0,<7.0.0)", "sphinx (>=4.5.0,<5.0.0)", "sphinx-rtd-theme", "zope.interface"]
@@ -4020,28 +3528,28 @@ pyasn1 = ">=0.1.3"
[[package]]
name = "ruff"
version = "0.4.4"
version = "0.4.3"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false
python-versions = ">=3.7"
files = [
{file = "ruff-0.4.4-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:29d44ef5bb6a08e235c8249294fa8d431adc1426bfda99ed493119e6f9ea1bf6"},
{file = "ruff-0.4.4-py3-none-macosx_11_0_arm64.whl", hash = "sha256:c4efe62b5bbb24178c950732ddd40712b878a9b96b1d02b0ff0b08a090cbd891"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4c8e2f1e8fc12d07ab521a9005d68a969e167b589cbcaee354cb61e9d9de9c15"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:60ed88b636a463214905c002fa3eaab19795679ed55529f91e488db3fe8976ab"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b90fc5e170fc71c712cc4d9ab0e24ea505c6a9e4ebf346787a67e691dfb72e85"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:8e7e6ebc10ef16dcdc77fd5557ee60647512b400e4a60bdc4849468f076f6eef"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b9ddb2c494fb79fc208cd15ffe08f32b7682519e067413dbaf5f4b01a6087bcd"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c51c928a14f9f0a871082603e25a1588059b7e08a920f2f9fa7157b5bf08cfe9"},
{file = "ruff-0.4.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b5eb0a4bfd6400b7d07c09a7725e1a98c3b838be557fee229ac0f84d9aa49c36"},
{file = "ruff-0.4.4-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:b1867ee9bf3acc21778dcb293db504692eda5f7a11a6e6cc40890182a9f9e595"},
{file = "ruff-0.4.4-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:1aecced1269481ef2894cc495647392a34b0bf3e28ff53ed95a385b13aa45768"},
{file = "ruff-0.4.4-py3-none-musllinux_1_2_i686.whl", hash = "sha256:9da73eb616b3241a307b837f32756dc20a0b07e2bcb694fec73699c93d04a69e"},
{file = "ruff-0.4.4-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:958b4ea5589706a81065e2a776237de2ecc3e763342e5cc8e02a4a4d8a5e6f95"},
{file = "ruff-0.4.4-py3-none-win32.whl", hash = "sha256:cb53473849f011bca6e754f2cdf47cafc9c4f4ff4570003a0dad0b9b6890e876"},
{file = "ruff-0.4.4-py3-none-win_amd64.whl", hash = "sha256:424e5b72597482543b684c11def82669cc6b395aa8cc69acc1858b5ef3e5daae"},
{file = "ruff-0.4.4-py3-none-win_arm64.whl", hash = "sha256:39df0537b47d3b597293edbb95baf54ff5b49589eb7ff41926d8243caa995ea6"},
{file = "ruff-0.4.4.tar.gz", hash = "sha256:f87ea42d5cdebdc6a69761a9d0bc83ae9b3b30d0ad78952005ba6568d6c022af"},
{file = "ruff-0.4.3-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b70800c290f14ae6fcbb41bbe201cf62dfca024d124a1f373e76371a007454ce"},
{file = "ruff-0.4.3-py3-none-macosx_11_0_arm64.whl", hash = "sha256:08a0d6a22918ab2552ace96adeaca308833873a4d7d1d587bb1d37bae8728eb3"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba1f14df3c758dd7de5b55fbae7e1c8af238597961e5fb628f3de446c3c40c5"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:819fb06d535cc76dfddbfe8d3068ff602ddeb40e3eacbc90e0d1272bb8d97113"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0bfc9e955e6dc6359eb6f82ea150c4f4e82b660e5b58d9a20a0e42ec3bb6342b"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:510a67d232d2ebe983fddea324dbf9d69b71c4d2dfeb8a862f4a127536dd4cfb"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc9ff11cd9a092ee7680a56d21f302bdda14327772cd870d806610a3503d001f"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:29efff25bf9ee685c2c8390563a5b5c006a3fee5230d28ea39f4f75f9d0b6f2f"},
{file = "ruff-0.4.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18b00e0bcccf0fc8d7186ed21e311dffd19761cb632241a6e4fe4477cc80ef6e"},
{file = "ruff-0.4.3-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:262f5635e2c74d80b7507fbc2fac28fe0d4fef26373bbc62039526f7722bca1b"},
{file = "ruff-0.4.3-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:7363691198719c26459e08cc17c6a3dac6f592e9ea3d2fa772f4e561b5fe82a3"},
{file = "ruff-0.4.3-py3-none-musllinux_1_2_i686.whl", hash = "sha256:eeb039f8428fcb6725bb63cbae92ad67b0559e68b5d80f840f11914afd8ddf7f"},
{file = "ruff-0.4.3-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:927b11c1e4d0727ce1a729eace61cee88a334623ec424c0b1c8fe3e5f9d3c865"},
{file = "ruff-0.4.3-py3-none-win32.whl", hash = "sha256:25cacda2155778beb0d064e0ec5a3944dcca9c12715f7c4634fd9d93ac33fd30"},
{file = "ruff-0.4.3-py3-none-win_amd64.whl", hash = "sha256:7a1c3a450bc6539ef00da6c819fb1b76b6b065dec585f91456e7c0d6a0bbc725"},
{file = "ruff-0.4.3-py3-none-win_arm64.whl", hash = "sha256:71ca5f8ccf1121b95a59649482470c5601c60a416bf189d553955b0338e34614"},
{file = "ruff-0.4.3.tar.gz", hash = "sha256:ff0a3ef2e3c4b6d133fbedcf9586abfbe38d076041f2dc18ffb2c7e0485d5a07"},
]
[[package]]
@@ -4080,13 +3588,13 @@ django-query = ["django (>=3.2)"]
[[package]]
name = "selenium"
version = "4.21.0"
version = "4.20.0"
description = ""
optional = false
python-versions = ">=3.8"
files = [
{file = "selenium-4.21.0-py3-none-any.whl", hash = "sha256:4770ffe5a5264e609de7dc914be6b89987512040d5a8efb2abb181330d097993"},
{file = "selenium-4.21.0.tar.gz", hash = "sha256:650dbfa5159895ff00ad16e5ddb6ceecb86b90c7ed2012b3f041f64e6e4904fe"},
{file = "selenium-4.20.0-py3-none-any.whl", hash = "sha256:b1d0c33b38ca27d0499183e48e1dd09ff26973481f5d3ef2983073813ae6588d"},
{file = "selenium-4.20.0.tar.gz", hash = "sha256:0bd564ee166980d419a8aaf4ac00289bc152afcf2eadca5efe8c8e36711853fd"},
]
[package.dependencies]
@@ -4098,13 +3606,13 @@ urllib3 = {version = ">=1.26,<3", extras = ["socks"]}
[[package]]
name = "sentry-sdk"
version = "2.2.0"
version = "2.1.1"
description = "Python client for Sentry (https://sentry.io)"
optional = false
python-versions = ">=3.6"
files = [
{file = "sentry_sdk-2.2.0-py2.py3-none-any.whl", hash = "sha256:674f58da37835ea7447fe0e34c57b4a4277fad558b0a7cb4a6c83bcb263086be"},
{file = "sentry_sdk-2.2.0.tar.gz", hash = "sha256:70eca103cf4c6302365a9d7cf522e7ed7720828910eb23d43ada8e50d1ecda9d"},
{file = "sentry_sdk-2.1.1-py2.py3-none-any.whl", hash = "sha256:99aeb78fb76771513bd3b2829d12613130152620768d00cd3e45ac00cb17950f"},
{file = "sentry_sdk-2.1.1.tar.gz", hash = "sha256:95d8c0bb41c8b0bc37ab202c2c4a295bb84398ee05f4cdce55051cd75b926ec1"},
]
[package.dependencies]
@@ -4345,17 +3853,6 @@ files = [
dev = ["build", "hatch"]
doc = ["sphinx"]
[[package]]
name = "std-uritemplate"
version = "0.0.57"
description = "std-uritemplate implementation for Python"
optional = false
python-versions = "<4.0,>=3.8"
files = [
{file = "std_uritemplate-0.0.57-py3-none-any.whl", hash = "sha256:66691cb6ff1d1b3612741053d6f5573ec7eb1c1a33ffb5ca49557e8aa2372aa8"},
{file = "std_uritemplate-0.0.57.tar.gz", hash = "sha256:f4adc717aec138562e652b95da74fc6815a942231d971314856b81f434c1b94c"},
]
[[package]]
name = "stevedore"
version = "5.2.0"
@@ -4967,85 +4464,6 @@ files = [
{file = "websockets-12.0.tar.gz", hash = "sha256:81df9cbcbb6c260de1e007e58c011bfebe2dafc8435107b0537f393dd38c8b1b"},
]
[[package]]
name = "wrapt"
version = "1.16.0"
description = "Module for decorators, wrappers and monkey patching."
optional = false
python-versions = ">=3.6"
files = [
{file = "wrapt-1.16.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ffa565331890b90056c01db69c0fe634a776f8019c143a5ae265f9c6bc4bd6d4"},
{file = "wrapt-1.16.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e4fdb9275308292e880dcbeb12546df7f3e0f96c6b41197e0cf37d2826359020"},
{file = "wrapt-1.16.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb2dee3874a500de01c93d5c71415fcaef1d858370d405824783e7a8ef5db440"},
{file = "wrapt-1.16.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2a88e6010048489cda82b1326889ec075a8c856c2e6a256072b28eaee3ccf487"},
{file = "wrapt-1.16.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac83a914ebaf589b69f7d0a1277602ff494e21f4c2f743313414378f8f50a4cf"},
{file = "wrapt-1.16.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:73aa7d98215d39b8455f103de64391cb79dfcad601701a3aa0dddacf74911d72"},
{file = "wrapt-1.16.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:807cc8543a477ab7422f1120a217054f958a66ef7314f76dd9e77d3f02cdccd0"},
{file = "wrapt-1.16.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:bf5703fdeb350e36885f2875d853ce13172ae281c56e509f4e6eca049bdfb136"},
{file = "wrapt-1.16.0-cp310-cp310-win32.whl", hash = "sha256:f6b2d0c6703c988d334f297aa5df18c45e97b0af3679bb75059e0e0bd8b1069d"},
{file = "wrapt-1.16.0-cp310-cp310-win_amd64.whl", hash = "sha256:decbfa2f618fa8ed81c95ee18a387ff973143c656ef800c9f24fb7e9c16054e2"},
{file = "wrapt-1.16.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1a5db485fe2de4403f13fafdc231b0dbae5eca4359232d2efc79025527375b09"},
{file = "wrapt-1.16.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:75ea7d0ee2a15733684badb16de6794894ed9c55aa5e9903260922f0482e687d"},
{file = "wrapt-1.16.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a452f9ca3e3267cd4d0fcf2edd0d035b1934ac2bd7e0e57ac91ad6b95c0c6389"},
{file = "wrapt-1.16.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:43aa59eadec7890d9958748db829df269f0368521ba6dc68cc172d5d03ed8060"},
{file = "wrapt-1.16.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:72554a23c78a8e7aa02abbd699d129eead8b147a23c56e08d08dfc29cfdddca1"},
{file = "wrapt-1.16.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d2efee35b4b0a347e0d99d28e884dfd82797852d62fcd7ebdeee26f3ceb72cf3"},
{file = "wrapt-1.16.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:6dcfcffe73710be01d90cae08c3e548d90932d37b39ef83969ae135d36ef3956"},
{file = "wrapt-1.16.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:eb6e651000a19c96f452c85132811d25e9264d836951022d6e81df2fff38337d"},
{file = "wrapt-1.16.0-cp311-cp311-win32.whl", hash = "sha256:66027d667efe95cc4fa945af59f92c5a02c6f5bb6012bff9e60542c74c75c362"},
{file = "wrapt-1.16.0-cp311-cp311-win_amd64.whl", hash = "sha256:aefbc4cb0a54f91af643660a0a150ce2c090d3652cf4052a5397fb2de549cd89"},
{file = "wrapt-1.16.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:5eb404d89131ec9b4f748fa5cfb5346802e5ee8836f57d516576e61f304f3b7b"},
{file = "wrapt-1.16.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9090c9e676d5236a6948330e83cb89969f433b1943a558968f659ead07cb3b36"},
{file = "wrapt-1.16.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94265b00870aa407bd0cbcfd536f17ecde43b94fb8d228560a1e9d3041462d73"},
{file = "wrapt-1.16.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2058f813d4f2b5e3a9eb2eb3faf8f1d99b81c3e51aeda4b168406443e8ba809"},
{file = "wrapt-1.16.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98b5e1f498a8ca1858a1cdbffb023bfd954da4e3fa2c0cb5853d40014557248b"},
{file = "wrapt-1.16.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:14d7dc606219cdd7405133c713f2c218d4252f2a469003f8c46bb92d5d095d81"},
{file = "wrapt-1.16.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:49aac49dc4782cb04f58986e81ea0b4768e4ff197b57324dcbd7699c5dfb40b9"},
{file = "wrapt-1.16.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:418abb18146475c310d7a6dc71143d6f7adec5b004ac9ce08dc7a34e2babdc5c"},
{file = "wrapt-1.16.0-cp312-cp312-win32.whl", hash = "sha256:685f568fa5e627e93f3b52fda002c7ed2fa1800b50ce51f6ed1d572d8ab3e7fc"},
{file = "wrapt-1.16.0-cp312-cp312-win_amd64.whl", hash = "sha256:dcdba5c86e368442528f7060039eda390cc4091bfd1dca41e8046af7c910dda8"},
{file = "wrapt-1.16.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:d462f28826f4657968ae51d2181a074dfe03c200d6131690b7d65d55b0f360f8"},
{file = "wrapt-1.16.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a33a747400b94b6d6b8a165e4480264a64a78c8a4c734b62136062e9a248dd39"},
{file = "wrapt-1.16.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b3646eefa23daeba62643a58aac816945cadc0afaf21800a1421eeba5f6cfb9c"},
{file = "wrapt-1.16.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ebf019be5c09d400cf7b024aa52b1f3aeebeff51550d007e92c3c1c4afc2a40"},
{file = "wrapt-1.16.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:0d2691979e93d06a95a26257adb7bfd0c93818e89b1406f5a28f36e0d8c1e1fc"},
{file = "wrapt-1.16.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:1acd723ee2a8826f3d53910255643e33673e1d11db84ce5880675954183ec47e"},
{file = "wrapt-1.16.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:bc57efac2da352a51cc4658878a68d2b1b67dbe9d33c36cb826ca449d80a8465"},
{file = "wrapt-1.16.0-cp36-cp36m-win32.whl", hash = "sha256:da4813f751142436b075ed7aa012a8778aa43a99f7b36afe9b742d3ed8bdc95e"},
{file = "wrapt-1.16.0-cp36-cp36m-win_amd64.whl", hash = "sha256:6f6eac2360f2d543cc875a0e5efd413b6cbd483cb3ad7ebf888884a6e0d2e966"},
{file = "wrapt-1.16.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a0ea261ce52b5952bf669684a251a66df239ec6d441ccb59ec7afa882265d593"},
{file = "wrapt-1.16.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7bd2d7ff69a2cac767fbf7a2b206add2e9a210e57947dd7ce03e25d03d2de292"},
{file = "wrapt-1.16.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9159485323798c8dc530a224bd3ffcf76659319ccc7bbd52e01e73bd0241a0c5"},
{file = "wrapt-1.16.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a86373cf37cd7764f2201b76496aba58a52e76dedfaa698ef9e9688bfd9e41cf"},
{file = "wrapt-1.16.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:73870c364c11f03ed072dda68ff7aea6d2a3a5c3fe250d917a429c7432e15228"},
{file = "wrapt-1.16.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:b935ae30c6e7400022b50f8d359c03ed233d45b725cfdd299462f41ee5ffba6f"},
{file = "wrapt-1.16.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:db98ad84a55eb09b3c32a96c576476777e87c520a34e2519d3e59c44710c002c"},
{file = "wrapt-1.16.0-cp37-cp37m-win32.whl", hash = "sha256:9153ed35fc5e4fa3b2fe97bddaa7cbec0ed22412b85bcdaf54aeba92ea37428c"},
{file = "wrapt-1.16.0-cp37-cp37m-win_amd64.whl", hash = "sha256:66dfbaa7cfa3eb707bbfcd46dab2bc6207b005cbc9caa2199bcbc81d95071a00"},
{file = "wrapt-1.16.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1dd50a2696ff89f57bd8847647a1c363b687d3d796dc30d4dd4a9d1689a706f0"},
{file = "wrapt-1.16.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:44a2754372e32ab315734c6c73b24351d06e77ffff6ae27d2ecf14cf3d229202"},
{file = "wrapt-1.16.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e9723528b9f787dc59168369e42ae1c3b0d3fadb2f1a71de14531d321ee05b0"},
{file = "wrapt-1.16.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dbed418ba5c3dce92619656802cc5355cb679e58d0d89b50f116e4a9d5a9603e"},
{file = "wrapt-1.16.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:941988b89b4fd6b41c3f0bfb20e92bd23746579736b7343283297c4c8cbae68f"},
{file = "wrapt-1.16.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6a42cd0cfa8ffc1915aef79cb4284f6383d8a3e9dcca70c445dcfdd639d51267"},
{file = "wrapt-1.16.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:1ca9b6085e4f866bd584fb135a041bfc32cab916e69f714a7d1d397f8c4891ca"},
{file = "wrapt-1.16.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d5e49454f19ef621089e204f862388d29e6e8d8b162efce05208913dde5b9ad6"},
{file = "wrapt-1.16.0-cp38-cp38-win32.whl", hash = "sha256:c31f72b1b6624c9d863fc095da460802f43a7c6868c5dda140f51da24fd47d7b"},
{file = "wrapt-1.16.0-cp38-cp38-win_amd64.whl", hash = "sha256:490b0ee15c1a55be9c1bd8609b8cecd60e325f0575fc98f50058eae366e01f41"},
{file = "wrapt-1.16.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9b201ae332c3637a42f02d1045e1d0cccfdc41f1f2f801dafbaa7e9b4797bfc2"},
{file = "wrapt-1.16.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2076fad65c6736184e77d7d4729b63a6d1ae0b70da4868adeec40989858eb3fb"},
{file = "wrapt-1.16.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5cd603b575ebceca7da5a3a251e69561bec509e0b46e4993e1cac402b7247b8"},
{file = "wrapt-1.16.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b47cfad9e9bbbed2339081f4e346c93ecd7ab504299403320bf85f7f85c7d46c"},
{file = "wrapt-1.16.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8212564d49c50eb4565e502814f694e240c55551a5f1bc841d4fcaabb0a9b8a"},
{file = "wrapt-1.16.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5f15814a33e42b04e3de432e573aa557f9f0f56458745c2074952f564c50e664"},
{file = "wrapt-1.16.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:db2e408d983b0e61e238cf579c09ef7020560441906ca990fe8412153e3b291f"},
{file = "wrapt-1.16.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:edfad1d29c73f9b863ebe7082ae9321374ccb10879eeabc84ba3b69f2579d537"},
{file = "wrapt-1.16.0-cp39-cp39-win32.whl", hash = "sha256:ed867c42c268f876097248e05b6117a65bcd1e63b779e916fe2e33cd6fd0d3c3"},
{file = "wrapt-1.16.0-cp39-cp39-win_amd64.whl", hash = "sha256:eb1b046be06b0fce7249f1d025cd359b4b80fc1c3e24ad9eca33e0dcdb2e4a35"},
{file = "wrapt-1.16.0-py3-none-any.whl", hash = "sha256:6906c4100a8fcbf2fa735f6059214bb13b97f75b1a61777fcf6432121ef12ef1"},
{file = "wrapt-1.16.0.tar.gz", hash = "sha256:5f370f952971e7d17c7d1ead40e49f32345a7f7a5373571ef44d800d06b1899d"},
]
[[package]]
name = "wsproto"
version = "1.2.0"
@ -5314,4 +4732,4 @@ files = [
[metadata]
lock-version = "2.0"
python-versions = "~3.12"
content-hash = "6399c90a2adca3e7119277bf4d0649fe0826d5fb4454a23b1b1fad3e64a1fe90"
content-hash = "1ef87ed82de9c403cc569f4858d874e85da73924a610190379f7db3a76458d5b"

View File

@ -76,7 +76,7 @@ show_missing = true
DJANGO_SETTINGS_MODULE = "authentik.root.settings"
python_files = ["tests.py", "test_*.py", "*_tests.py"]
junit_family = "xunit2"
addopts = "-p no:celery -p authentik.root.test_plugin --junitxml=unittest.xml -vv --full-trace --doctest-modules --import-mode=importlib"
addopts = "-p no:celery -p authentik.root.test_plugin --junitxml=unittest.xml -vv --full-trace --doctest-modules"
filterwarnings = [
"ignore:defusedxml.lxml is no longer supported and will be removed in a future release.:DeprecationWarning",
"ignore:SelectableGroups dict interface is deprecated. Use select.:DeprecationWarning",
@ -118,7 +118,6 @@ jsonpatch = "*"
kubernetes = "*"
ldap3 = "*"
lxml = "*"
msgraph-sdk = "*"
opencontainers = { extras = ["reggie"], version = "*" }
packaging = "*"
paramiko = "*"

schema.yml (2280 changed lines)

File diff suppressed because it is too large

View File

@ -161,6 +161,7 @@ class TestSourceSAML(SeleniumTestCase):
self.assert_user(
User.objects.exclude(username="akadmin")
.exclude(username__startswith="ak-outpost")
.exclude_anonymous()
.exclude(pk=self.user.pk)
.first()
)
@ -243,6 +244,7 @@ class TestSourceSAML(SeleniumTestCase):
self.assert_user(
User.objects.exclude(username="akadmin")
.exclude(username__startswith="ak-outpost")
.exclude_anonymous()
.exclude(pk=self.user.pk)
.first()
)
@ -312,6 +314,7 @@ class TestSourceSAML(SeleniumTestCase):
self.assert_user(
User.objects.exclude(username="akadmin")
.exclude(username__startswith="ak-outpost")
.exclude_anonymous()
.exclude(pk=self.user.pk)
.first()
)

View File

@ -6,7 +6,7 @@
"": {
"name": "@goauthentik/web-tests",
"dependencies": {
"chromedriver": "^125.0.0"
"chromedriver": "^124.0.1"
},
"devDependencies": {
"@trivago/prettier-plugin-sort-imports": "^4.3.0",
@ -2084,9 +2084,9 @@
}
},
"node_modules/chromedriver": {
"version": "125.0.0",
"resolved": "https://registry.npmjs.org/chromedriver/-/chromedriver-125.0.0.tgz",
"integrity": "sha512-wWXrxWLWqXRTmRZDtPigs+ys44srlpHTpsL7MHnZc9iaE1oIB0hslSVeem6TcsEb1Ou8nvPx3vs5bPwCI6+VHg==",
"version": "124.0.1",
"resolved": "https://registry.npmjs.org/chromedriver/-/chromedriver-124.0.1.tgz",
"integrity": "sha512-hxd1tpAUhgMFBZd1h3W7KyMckxofOYCuKAMtcvBDAU0YKKorZcWuq6zP06+Ph0Z1ynPjtgAj0hP9VphCwesjZw==",
"hasInstallScript": true,
"dependencies": {
"@testim/chrome-version": "^1.1.4",

View File

@ -32,6 +32,6 @@
"node": ">=20"
},
"dependencies": {
"chromedriver": "^125.0.0"
"chromedriver": "^124.0.1"
}
}

web/package-lock.json (generated, 1982 changed lines)

File diff suppressed because it is too large

View File

@ -38,7 +38,7 @@
"@codemirror/theme-one-dark": "^6.1.2",
"@formatjs/intl-listformat": "^7.5.5",
"@fortawesome/fontawesome-free": "^6.5.2",
"@goauthentik/api": "^2024.4.2-1715271029",
"@goauthentik/api": "^2024.4.2-1715104360",
"@lit-labs/task": "^3.1.0",
"@lit/context": "^1.1.1",
"@lit/localize": "^0.12.1",
@ -46,7 +46,7 @@
"@open-wc/lit-helpers": "^0.7.0",
"@patternfly/elements": "^3.0.1",
"@patternfly/patternfly": "^4.224.2",
"@sentry/browser": "^8.2.1",
"@sentry/browser": "^7.113.0",
"@webcomponents/webcomponentsjs": "^2.8.0",
"base64-js": "^1.5.1",
"chart.js": "^4.4.2",
@ -81,13 +81,13 @@
"@lit/localize-tools": "^0.7.2",
"@rollup/plugin-replace": "^5.0.5",
"@spotlightjs/spotlight": "^1.2.17",
"@storybook/addon-essentials": "^8.1.1",
"@storybook/addon-links": "^8.1.1",
"@storybook/addon-essentials": "^8.0.10",
"@storybook/addon-links": "^8.0.10",
"@storybook/api": "^7.6.17",
"@storybook/blocks": "^8.0.8",
"@storybook/manager-api": "^8.1.1",
"@storybook/web-components": "^8.1.1",
"@storybook/web-components-vite": "^8.1.1",
"@storybook/manager-api": "^8.0.10",
"@storybook/web-components": "^8.0.10",
"@storybook/web-components-vite": "^8.0.10",
"@trivago/prettier-plugin-sort-imports": "^4.3.0",
"@types/chart.js": "^2.9.41",
"@types/codemirror": "5.60.15",
@ -100,7 +100,7 @@
"babel-plugin-tsconfig-paths": "^1.0.3",
"chokidar": "^3.6.0",
"cross-env": "^7.0.3",
"esbuild": "^0.21.3",
"esbuild": "^0.21.1",
"eslint": "^8.57.0",
"eslint-config-google": "^0.14.0",
"eslint-plugin-custom-elements": "0.0.8",
@ -108,7 +108,7 @@
"eslint-plugin-sonarjs": "^0.25.1",
"eslint-plugin-storybook": "^0.8.0",
"github-slugger": "^2.0.0",
"glob": "^10.3.15",
"glob": "^10.3.12",
"lit-analyzer": "^2.0.3",
"npm-run-all": "^4.1.5",
"prettier": "^3.2.5",
@ -117,7 +117,7 @@
"react-dom": "^18.3.1",
"rollup-plugin-modify": "^3.0.0",
"rollup-plugin-postcss-lit": "^2.1.0",
"storybook": "^8.1.1",
"storybook": "^8.0.10",
"storybook-addon-mock": "^5.0.0",
"ts-lit-plugin": "^2.0.2",
"tslib": "^2.6.2",
@ -126,9 +126,9 @@
"vite-tsconfig-paths": "^4.3.2"
},
"optionalDependencies": {
"@esbuild/darwin-arm64": "^0.21.3",
"@esbuild/darwin-arm64": "^0.21.1",
"@esbuild/linux-amd64": "^0.18.11",
"@esbuild/linux-arm64": "^0.21.3",
"@esbuild/linux-arm64": "^0.21.1",
"@rollup/rollup-darwin-arm64": "4.17.2",
"@rollup/rollup-linux-arm64-gnu": "4.17.2",
"@rollup/rollup-linux-x64-gnu": "4.17.2"

View File

@ -18,7 +18,7 @@ export interface SummarizedSyncStatus {
}
@customElement("ak-admin-status-chart-sync")
export class SyncStatusChart extends AKChart<SummarizedSyncStatus[]> {
export class LDAPSyncStatusChart extends AKChart<SummarizedSyncStatus[]> {
getChartType(): string {
return "doughnut";
}
@ -102,19 +102,6 @@ export class SyncStatusChart extends AKChart<SummarizedSyncStatus[]> {
},
msg("Google Workspace Provider"),
),
await this.fetchStatus(
() => {
return new ProvidersApi(DEFAULT_CONFIG).providersMicrosoftEntraList();
},
(element) => {
return new ProvidersApi(
DEFAULT_CONFIG,
).providersMicrosoftEntraSyncStatusRetrieve({
id: element.pk,
});
},
msg("Microsoft Entra Provider"),
),
await this.fetchStatus(
() => {
return new SourcesApi(DEFAULT_CONFIG).sourcesLdapList();

View File

@ -3,7 +3,6 @@ import "@goauthentik/admin/applications/ApplicationCheckAccessForm";
import "@goauthentik/admin/applications/ApplicationForm";
import "@goauthentik/admin/policies/BoundPoliciesList";
import { DEFAULT_CONFIG } from "@goauthentik/common/api/config";
import { PFSize } from "@goauthentik/common/enums.js";
import "@goauthentik/components/ak-app-icon";
import "@goauthentik/components/events/ObjectChangelog";
import { AKElement } from "@goauthentik/elements/Base";
@ -13,10 +12,8 @@ import "@goauthentik/elements/Tabs";
import "@goauthentik/elements/buttons/SpinnerButton";
import "@goauthentik/elements/rbac/ObjectPermissionsPage";
import { msg } from "@lit/localize";
import { CSSResult, PropertyValues, TemplateResult, html } from "lit";
import { PropertyValues } from "lit";
import { customElement, property, state } from "lit/decorators.js";
import { ifDefined } from "lit/directives/if-defined.js";
import PFBanner from "@patternfly/patternfly/components/Banner/banner.css";
import PFButton from "@patternfly/patternfly/components/Button/button.css";
@ -35,18 +32,14 @@ import {
RbacPermissionsAssignedByUsersListModelEnum,
} from "@goauthentik/api";
import {
ApplicationViewPageLoadingRenderer,
ApplicationViewPageRenderer,
} from "./ApplicationViewPageRenderers.js";
@customElement("ak-application-view")
export class ApplicationViewPage extends AKElement {
@property({ type: String })
applicationSlug?: string;
@state()
application?: Application;
@state()
missingOutpost = false;
static get styles(): CSSResult[] {
static get styles() {
return [
PFBase,
PFList,
@ -60,6 +53,15 @@ export class ApplicationViewPage extends AKElement {
];
}
@property({ type: String })
applicationSlug?: string;
@state()
application?: Application;
@state()
missingOutpost = false;
fetchIsMissingOutpost(providersByPk: Array<number>) {
new OutpostsApi(DEFAULT_CONFIG)
.outpostsInstancesList({
@ -94,231 +96,15 @@ export class ApplicationViewPage extends AKElement {
}
}
render(): TemplateResult {
return html`<ak-page-header
header=${this.application?.name || msg("Loading")}
description=${ifDefined(this.application?.metaPublisher)}
.iconImage=${true}
>
<ak-app-icon
size=${PFSize.Medium}
slot="icon"
.app=${this.application}
></ak-app-icon>
</ak-page-header>
${this.renderApp()}`;
}
render() {
const renderer = this.application
? new ApplicationViewPageRenderer(
this.application,
this.missingOutpost,
RbacPermissionsAssignedByUsersListModelEnum.CoreApplication,
)
: new ApplicationViewPageLoadingRenderer();
renderApp(): TemplateResult {
if (!this.application) {
return html`<ak-empty-state ?loading="${true}" header=${msg("Loading")}>
</ak-empty-state>`;
}
return html`<ak-tabs>
${this.missingOutpost
? html`<div slot="header" class="pf-c-banner pf-m-warning">
${msg("Warning: Application is not used by any Outpost.")}
</div>`
: html``}
<section
slot="page-overview"
data-tab-title="${msg("Overview")}"
class="pf-c-page__main-section pf-m-no-padding-mobile"
>
<div class="pf-l-grid pf-m-gutter">
<div
class="pf-c-card pf-l-grid__item pf-m-12-col pf-m-2-col-on-xl pf-m-2-col-on-2xl"
>
<div class="pf-c-card__title">${msg("Related")}</div>
<div class="pf-c-card__body">
<dl class="pf-c-description-list">
${this.application.providerObj
? html`<div class="pf-c-description-list__group">
<dt class="pf-c-description-list__term">
<span class="pf-c-description-list__text"
>${msg("Provider")}</span
>
</dt>
<dd class="pf-c-description-list__description">
<div class="pf-c-description-list__text">
<a
href="#/core/providers/${this.application
.providerObj?.pk}"
>
${this.application.providerObj?.name}
(${this.application.providerObj?.verboseName})
</a>
</div>
</dd>
</div>`
: html``}
${(this.application.backchannelProvidersObj || []).length > 0
? html`<div class="pf-c-description-list__group">
<dt class="pf-c-description-list__term">
<span class="pf-c-description-list__text"
>${msg("Backchannel Providers")}</span
>
</dt>
<dd class="pf-c-description-list__description">
<div class="pf-c-description-list__text">
<ul class="pf-c-list">
${this.application.backchannelProvidersObj.map(
(provider) => {
return html`
<li>
<a
href="#/core/providers/${provider.pk}"
>
${provider.name}
(${provider.verboseName})
</a>
</li>
`;
},
)}
</ul>
</div>
</dd>
</div>`
: html``}
<div class="pf-c-description-list__group">
<dt class="pf-c-description-list__term">
<span class="pf-c-description-list__text"
>${msg("Policy engine mode")}</span
>
</dt>
<dd class="pf-c-description-list__description">
<div class="pf-c-description-list__text">
${this.application.policyEngineMode?.toUpperCase()}
</div>
</dd>
</div>
<div class="pf-c-description-list__group">
<dt class="pf-c-description-list__term">
<span class="pf-c-description-list__text"
>${msg("Edit")}</span
>
</dt>
<dd class="pf-c-description-list__description">
<div class="pf-c-description-list__text">
<ak-forms-modal>
<span slot="submit"> ${msg("Update")} </span>
<span slot="header">
${msg("Update Application")}
</span>
<ak-application-form
slot="form"
.instancePk=${this.application.slug}
>
</ak-application-form>
<button
slot="trigger"
class="pf-c-button pf-m-secondary"
>
${msg("Edit")}
</button>
</ak-forms-modal>
</div>
</dd>
</div>
<div class="pf-c-description-list__group">
<dt class="pf-c-description-list__term">
<span class="pf-c-description-list__text"
>${msg("Check access")}</span
>
</dt>
<dd class="pf-c-description-list__description">
<div class="pf-c-description-list__text">
<ak-forms-modal .closeAfterSuccessfulSubmit=${false}>
<span slot="submit"> ${msg("Check")} </span>
<span slot="header">
${msg("Check Application access")}
</span>
<ak-application-check-access-form
slot="form"
.application=${this.application}
>
</ak-application-check-access-form>
<button
slot="trigger"
class="pf-c-button pf-m-secondary"
>
${msg("Test")}
</button>
</ak-forms-modal>
</div>
</dd>
</div>
${this.application.launchUrl
? html`<div class="pf-c-description-list__group">
<dt class="pf-c-description-list__term">
<span class="pf-c-description-list__text"
>${msg("Launch")}</span
>
</dt>
<dd class="pf-c-description-list__description">
<div class="pf-c-description-list__text">
<a
target="_blank"
href=${this.application.launchUrl}
slot="trigger"
class="pf-c-button pf-m-secondary"
>
${msg("Launch")}
</a>
</div>
</dd>
</div>`
: html``}
</dl>
</div>
</div>
<div
class="pf-c-card pf-l-grid__item pf-m-12-col pf-m-10-col-on-xl pf-m-10-col-on-2xl"
>
<div class="pf-c-card__title">
${msg("Logins over the last week (per 8 hours)")}
</div>
<div class="pf-c-card__body">
${this.application &&
html` <ak-charts-application-authorize
applicationSlug=${this.application.slug}
>
</ak-charts-application-authorize>`}
</div>
</div>
<div class="pf-c-card pf-l-grid__item pf-m-12-col">
<div class="pf-c-card__title">${msg("Changelog")}</div>
<div class="pf-c-card__body">
<ak-object-changelog
targetModelPk=${this.application.pk || ""}
targetModelApp="authentik_core"
targetModelName="application"
>
</ak-object-changelog>
</div>
</div>
</div>
</section>
<section
slot="page-policy-bindings"
data-tab-title="${msg("Policy / Group / User Bindings")}"
class="pf-c-page__main-section pf-m-no-padding-mobile"
>
<div class="pf-c-card">
<div class="pf-c-card__title">
${msg("These policies control which users can access this application.")}
</div>
<ak-bound-policies-list .target=${this.application.pk}>
</ak-bound-policies-list>
</div>
</section>
<ak-rbac-object-permission-page
slot="page-permissions"
data-tab-title="${msg("Permissions")}"
model=${RbacPermissionsAssignedByUsersListModelEnum.CoreApplication}
objectPk=${this.application.pk}
></ak-rbac-object-permission-page>
</ak-tabs>`;
return renderer.render();
}
}

View File

@ -0,0 +1,214 @@
import { PFSize } from "@goauthentik/common/enums.js";
import { DescriptionPair, renderDescriptionList } from "@goauthentik/components/DescriptionList.js";
import { msg } from "@lit/localize";
import { html, nothing } from "lit";
import { ifDefined } from "lit/directives/if-defined.js";
import type { Application, RbacPermissionsAssignedByUsersListModelEnum } from "@goauthentik/api";
export class ApplicationViewPageLoadingRenderer {
constructor() {}
render() {
return html`<ak-page-header header=${msg("Loading")}
><ak-empty-state ?loading="${true}" header=${msg("Loading")}> </ak-empty-state
></ak-page-header>`;
}
}
export class ApplicationViewPageRenderer {
constructor(
private app: Application,
private noOutpost: boolean,
private rbacModel: RbacPermissionsAssignedByUsersListModelEnum,
) {}
missingOutpostMessage() {
return this.noOutpost
? html`<div slot="header" class="pf-c-banner pf-m-warning">
${msg("Warning: Application is not used by any Outpost.")}
</div>`
: nothing;
}
controlCardContents(app: Application): DescriptionPair[] {
// prettier-ignore
const rows: (DescriptionPair | null)[] = [
app.providerObj
? [
msg("Provider"),
html`
<a href="#/core/providers/${app.providerObj?.pk}">
${app.providerObj?.name} (${app.providerObj?.verboseName})
</a>
`,
]
: null,
(app.backchannelProvidersObj || []).length > 0
? [
msg("Backchannel Providers"),
html`
<ul class="pf-c-list">
${app.backchannelProvidersObj.map((provider) => {
return html`
<li>
<a href="#/core/providers/${provider.pk}">
${provider.name} (${provider.verboseName})
</a>
</li>
`;
})}
</ul>
`,
]
: null,
[
msg("Policy engine mode"),
app.policyEngineMode?.toUpperCase()
],
[
msg("Edit"),
html`
<ak-forms-modal>
<span slot="submit"> ${msg("Update")} </span>
<span slot="header"> ${msg("Update Application")} </span>
<ak-application-form slot="form" .instancePk=${app.slug}>
</ak-application-form>
<button slot="trigger" class="pf-c-button pf-m-secondary">
${msg("Edit")}
</button>
</ak-forms-modal>
`,
],
[
msg("Check access"),
html`
<ak-forms-modal .closeAfterSuccessfulSubmit=${false}>
<span slot="submit"> ${msg("Check")} </span>
<span slot="header"> ${msg("Check Application access")} </span>
<ak-application-check-access-form slot="form" .application=${app}>
</ak-application-check-access-form>
<button slot="trigger" class="pf-c-button pf-m-secondary">
${msg("Test")}
</button>
</ak-forms-modal>
`,
],
app.launchUrl
? [
msg("Launch"),
html`
<a
target="_blank"
href=${app.launchUrl}
slot="trigger"
class="pf-c-button pf-m-secondary"
>
${msg("Launch")}
</a>
`,
]
: null,
];
return rows.filter((row) => row !== null) as DescriptionPair[];
}
controlCard(app: Application) {
return html`
<div class="pf-c-card pf-l-grid__item pf-m-12-col pf-m-2-col-on-xl pf-m-2-col-on-2xl">
<div class="pf-c-card__title">${msg("Related")}</div>
<div class="pf-c-card__body">
${renderDescriptionList(this.controlCardContents(app))}
</div>
</div>
`;
}
loginsChart(app: Application) {
return html`<div
class="pf-c-card pf-l-grid__item pf-m-12-col pf-m-10-col-on-xl pf-m-10-col-on-2xl"
>
<div class="pf-c-card__title">${msg("Logins over the last week (per 8 hours)")}</div>
<div class="pf-c-card__body">
${app &&
html` <ak-charts-application-authorize applicationSlug=${app.slug}>
</ak-charts-application-authorize>`}
</div>
</div>`;
}
changelog(app: Application) {
return html`
<div class="pf-c-card pf-l-grid__item pf-m-12-col">
<div class="pf-c-card__title">${msg("Changelog")}</div>
<div class="pf-c-card__body">
<ak-object-changelog
targetModelPk=${app.pk || ""}
targetModelApp="authentik_core"
targetModelName="application"
>
</ak-object-changelog>
</div>
</div>
`;
}
overview(app: Application) {
return html`
<div class="pf-l-grid pf-m-gutter">
${this.controlCard(app)} ${this.loginsChart(app)} ${this.changelog(app)}
</div>
</section>`;
}
policiesList(app: Application) {
return html`
<div class="pf-c-card">
<div class="pf-c-card__title">
${msg("These policies control which users can access this application.")}
</div>
<ak-bound-policies-list .target=${app.pk}> </ak-bound-policies-list>
</div>
`;
}
render() {
return html` <ak-page-header
header=${this.app.name}
description=${ifDefined(this.app.metaPublisher)}
.iconImage=${true}
>
<ak-app-icon size=${PFSize.Medium} slot="icon" .app=${this.app}></ak-app-icon>
</ak-page-header>
<ak-tabs>
${this.missingOutpostMessage()}
<section
slot="page-overview"
data-tab-title="${msg("Overview")}"
class="pf-c-page__main-section pf-m-no-padding-mobile"
>
${this.overview(this.app)}
</section>
<section
slot="page-policy-bindings"
data-tab-title="${msg("Policy / Group / User Bindings")}"
class="pf-c-page__main-section pf-m-no-padding-mobile"
>
${this.policiesList(this.app)}
</section>
<ak-rbac-object-permission-page
slot="page-permissions"
data-tab-title="${msg("Permissions")}"
model=${this.rbacModel}
objectPk=${this.app.pk}
></ak-rbac-object-permission-page>
</ak-tabs>`;
}
}

View File

@ -21,7 +21,7 @@ export class ProviderSelectModal extends TableModal<Provider> {
}
@property({ type: Boolean })
backchannel?: boolean;
backchannelOnly = false;
@property()
confirm!: (selectedItems: Provider[]) => Promise<unknown>;
@ -34,7 +34,7 @@ export class ProviderSelectModal extends TableModal<Provider> {
page: page,
pageSize: (await uiConfig()).pagination.perPage,
search: this.search || "",
backchannel: this.backchannel,
backchannelOnly: this.backchannelOnly,
});
}

View File

@ -64,7 +64,7 @@ export class AkBackchannelProvidersInput extends AKElement {
return html`
<ak-form-element-horizontal label=${this.label} name=${this.name}>
<div class="pf-c-input-group">
<ak-provider-select-table ?backchannel=${true} .confirm=${this.confirm}>
<ak-provider-select-table ?backchannelOnly=${true} .confirm=${this.confirm}>
<button slot="trigger" class="pf-c-button pf-m-control" type="button">
${this.tooltip ? this.tooltip : nothing}
<i class="fas fa-plus" aria-hidden="true"></i>

View File

@ -15,7 +15,6 @@ const doGroupBy = (items: Provider[]) => groupBy(items, (item) => item.verboseNa
async function fetch(query?: string) {
const args: ProvidersAllListRequest = {
ordering: "name",
backchannel: false,
};
if (query !== undefined) {
args.search = query;

View File

@ -10,11 +10,11 @@ import { TemplateResult, html } from "lit";
import { customElement } from "lit/decorators.js";
import { ifDefined } from "lit/directives/if-defined.js";
import { GoogleWorkspaceProviderMapping, PropertymappingsApi } from "@goauthentik/api";
import { GoogleProviderMapping, PropertymappingsApi } from "@goauthentik/api";
@customElement("ak-property-mapping-google-workspace-form")
export class PropertyMappingGoogleWorkspaceForm extends BasePropertyMappingForm<GoogleWorkspaceProviderMapping> {
loadInstance(pk: string): Promise<GoogleWorkspaceProviderMapping> {
export class PropertyMappingGoogleWorkspaceForm extends BasePropertyMappingForm<GoogleProviderMapping> {
loadInstance(pk: string): Promise<GoogleProviderMapping> {
return new PropertymappingsApi(
DEFAULT_CONFIG,
).propertymappingsProviderGoogleWorkspaceRetrieve({
@ -22,19 +22,19 @@ export class PropertyMappingGoogleWorkspaceForm extends BasePropertyMappingForm<
});
}
async send(data: GoogleWorkspaceProviderMapping): Promise<GoogleWorkspaceProviderMapping> {
async send(data: GoogleProviderMapping): Promise<GoogleProviderMapping> {
if (this.instance) {
return new PropertymappingsApi(
DEFAULT_CONFIG,
).propertymappingsProviderGoogleWorkspaceUpdate({
pmUuid: this.instance.pk,
googleWorkspaceProviderMappingRequest: data,
pmUuid: this.instance.pk || "",
googleProviderMappingRequest: data,
});
} else {
return new PropertymappingsApi(
DEFAULT_CONFIG,
).propertymappingsProviderGoogleWorkspaceCreate({
googleWorkspaceProviderMappingRequest: data,
googleProviderMappingRequest: data,
});
}
}

Some files were not shown because too many files have changed in this diff.