Compare commits


385 commits
v0.7.2 ... main

Author SHA1 Message Date
Bart Schuurmans d90e8e56d5
Merge pull request #3341 from Minnozz/django-4.2
Upgrade to Django 4.2
2024-05-31 17:04:17 +02:00
Bart Schuurmans eca246fc61 Fix lint 2024-05-31 16:59:24 +02:00
Bart Schuurmans aa2312e8af Merge branch 'main' into django-4.2 2024-05-31 16:52:48 +02:00
Bart Schuurmans 61d9e0c260
Move comment to separate line
Co-authored-by: Adeodato Simó <73768+dato@users.noreply.github.com>
2024-05-31 16:49:34 +02:00
Bart Schuurmans 44eedd09d9
Merge pull request #1 from dato/django-4.2
minor contributions to django 4.2 upgrade
2024-05-31 16:49:06 +02:00
Mouse Reeve 4e987a0e66
Merge pull request #3369 from bookwyrm-social/dependabot/pip/requests-2.32.0
Bump requests from 2.31.0 to 2.32.0
2024-05-21 14:47:34 -07:00
dependabot[bot] 332286cdff
---
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-05-21 05:41:48 +00:00
Adeodato Simó e6ee169c3e
Narrow down bare type: ignore pragmas 2024-04-26 21:20:43 -03:00
Adeodato Simó 29f852b57e
consolidate multiple cache.delete() calls into cache.delete_many() 2024-04-26 15:36:38 -03:00
Bart Schuurmans acae063652 Fix new warnings from pylint upgrade 2024-04-26 13:59:16 +02:00
Bart Schuurmans c32f9faaa0 Upgrade pylint to 2.17.7 2024-04-26 13:41:01 +02:00
Bart Schuurmans e7f95ef4c2 Modify update_fields in save() when modifying objects
https://docs.djangoproject.com/en/5.0/releases/4.2/#setting-update-fields-in-model-save-may-now-be-required
2024-04-25 15:53:53 +02:00
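
For context, Django 4.2 expects fields that are modified inside an overridden save() to be added to update_fields when the caller passes that argument. A minimal sketch of the pattern, using a hypothetical model and field names rather than the actual BookWyrm code:

    from django.db import models

    class Book(models.Model):  # hypothetical model, for illustration only
        title = models.CharField(max_length=255)
        sort_title = models.CharField(max_length=255, blank=True)

        def save(self, *args, **kwargs):
            # sort_title is derived here, so it must be added to update_fields,
            # otherwise save(update_fields=["title"]) would silently skip it
            self.sort_title = self.title.lower()
            if (update_fields := kwargs.get("update_fields")) is not None:
                kwargs["update_fields"] = set(update_fields) | {"sort_title"}
            super().save(*args, **kwargs)
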
Bart Schuurmans a6c2ce15dd Early return 2024-04-25 15:51:32 +02:00
Bart Schuurmans 7604d0acdb Simplify ObjectMixin broadcast kwarg 2024-04-25 10:31:24 +02:00
Bart Schuurmans 77832cbec7 Add merge migration 2024-04-25 10:14:07 +02:00
Bart Schuurmans de67c73237 Add merge migration 2024-04-25 10:14:07 +02:00
Bart Schuurmans f38622fdc9 Define CSRF_TRUSTED_ORIGINS 2024-04-25 10:14:07 +02:00
Bart Schuurmans 051dab77bb Replace deprecated CICharField with custom collation for case-insensitivity 2024-04-25 10:14:07 +02:00
Bart Schuurmans 2896219e88 Switch from django-redis to the built-in Redis cache backend 2024-04-25 10:14:07 +02:00
Bart Schuurmans 03ac846b5d Migrate from pytz to zoneinfo 2024-04-25 10:14:07 +02:00
Bart Schuurmans 39c2a0feae Update qrcode to 7.4.2 2024-04-25 10:14:07 +02:00
Bart Schuurmans 22986a08f0 Update pytest to 8.1.1 2024-04-25 10:14:07 +02:00
Bart Schuurmans f6bbe673ca Update responses to 0.25.0 2024-04-25 10:14:07 +02:00
Bart Schuurmans f324a3cd1d Update pytest-xdist to 3.5.0 2024-04-25 10:14:07 +02:00
Bart Schuurmans 039160e004 Update pytest-env to 1.1.3 2024-04-25 10:14:07 +02:00
Bart Schuurmans a1ff5a478e Update types-Pillow to 10.2.0.20240331 2024-04-25 10:14:07 +02:00
Bart Schuurmans 1cb86197d5 Update types-requests to 2.31.0.20240311 2024-04-25 10:14:07 +02:00
Bart Schuurmans 2537886b4d Group version constraints for indirect dependencies and change to >= 2024-04-25 10:14:07 +02:00
Bart Schuurmans 1474c0d3aa Remove protobuf as a direct dependency 2024-04-25 10:14:07 +02:00
Bart Schuurmans e46bc2e9a1 Update redis-py to 5.0.3 2024-04-25 10:14:07 +02:00
Bart Schuurmans 01b37026eb Update Markdown to 3.6 2024-04-25 10:14:07 +02:00
Bart Schuurmans 9ebda3fbe8 Update celery to 5.3.6 2024-04-25 10:14:07 +02:00
Bart Schuurmans b6174d9101 Update bleach to 6.1.0 2024-04-25 10:14:06 +02:00
Bart Schuurmans 1303f539c3 Update psycopg to 2.9.9 2024-04-25 10:13:21 +02:00
Bart Schuurmans 624115bf11 Use headers dict instead of HTTP_* kwargs or request.META 2024-04-25 10:13:21 +02:00
Bart Schuurmans 224fae7a87 Fix mypy errors 2024-04-25 10:13:21 +02:00
Bart Schuurmans 869bc5a376 Update mypy to 1.7.1 2024-04-25 10:13:21 +02:00
Bart Schuurmans d80a0146bd Update django-stubs to 4.2.7 2024-04-25 10:13:21 +02:00
Bart Schuurmans e1fd57a1d6 Fix constructor arguments to SessionMiddleware in tests 2024-04-25 10:13:21 +02:00
Bart Schuurmans 1f8ba4df3e Update python-dateutil to 2.9.0.post0 2024-04-25 10:13:21 +02:00
Bart Schuurmans c11725a5c8 Update pyotp to 2.9.0 2024-04-25 10:13:21 +02:00
Bart Schuurmans 309147bd98 Update pycryptodome to 3.20.0 2024-04-25 10:13:21 +02:00
Bart Schuurmans 1276112214 Update opentelemetry dependencies 2024-04-25 10:13:21 +02:00
Bart Schuurmans e9325b8798 Update libsass to 0.23.0 2024-04-25 10:13:21 +02:00
Bart Schuurmans e0a14ea2ba Update django-sass-processor to 1.4 2024-04-25 10:13:21 +02:00
Bart Schuurmans 69c273486c Update django-model-utils to 4.4.0 2024-04-25 10:13:19 +02:00
Bart Schuurmans ffb3549e06 Update django-imagekit to 5.0.0 2024-04-25 10:12:30 +02:00
Bart Schuurmans 16e1b17a33 Update django-csp to 3.8 2024-04-25 10:12:30 +02:00
Bart Schuurmans 3dfbc44c9a Update django-celery-beat to 2.6.0 2024-04-25 10:12:30 +02:00
Bart Schuurmans 23bf089004 Update boto3 to 1.34.74 2024-04-25 10:12:30 +02:00
Bart Schuurmans b5ef9f6241 Configure STORAGES using OPTIONS instead of subclassing 2024-04-25 10:12:30 +02:00
Bart Schuurmans 4fa823e8df Update django-storages to 1.14.2
The problem of boto3 closing files has been worked around in django-storages.
2024-04-25 10:12:30 +02:00
Bart Schuurmans cfcb873235 Update pytest-cov to 5.0.0 2024-04-25 10:12:30 +02:00
Bart Schuurmans 0007c86a2c Update environs to 11.0.0 2024-04-25 10:12:30 +02:00
Bart Schuurmans 984d7fb7d8 Update pytest-django to 4.8.0 2024-04-25 10:12:30 +02:00
Bart Schuurmans 92a94d2fdc django.utils.timezone.utc alias is deprecated 2024-04-25 10:12:30 +02:00
Bart Schuurmans 0d621b68e0 Reorder operations in save() overrides
Accessing many-to-many relations before saving is no longer allowed.

Reorder all operations consistently:
1. Validations
2. Modify own fields
3. Perform save by calling super().save()
4. Modify related objects and clear caches

Clearing caches, especially, should be done after actually saving; otherwise the old data can be
re-added immediately by another request before the new data is written.
2024-04-25 10:12:30 +02:00
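
A condensed sketch of that ordering (field and cache-key names here are illustrative, not the real BookWyrm models):

    from django.core.cache import cache
    from django.db import models
    from django.utils import timezone

    class Status(models.Model):  # heavily reduced for illustration
        user = models.ForeignKey("auth.User", on_delete=models.CASCADE)
        published_date = models.DateTimeField(null=True)

        def save(self, *args, **kwargs):
            # 1. validations
            if not self.user_id:
                raise ValueError("status must have a user")
            # 2. modify own fields (still allowed before saving)
            self.published_date = self.published_date or timezone.now()
            # 3. perform the save
            super().save(*args, **kwargs)
            # 4. modify related objects and clear caches only after the row exists;
            #    clearing earlier lets a concurrent request re-populate stale data
            cache.delete_many([f"status-{self.id}", f"user-feed-{self.user_id}"])
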
Bart Schuurmans 47fdad9c87 Use new STORAGES setting 2024-04-25 10:12:30 +02:00
Bart Schuurmans 3349817a0b settings.USE_L10N is deprecated 2024-04-25 10:12:30 +02:00
Bart Schuurmans 45bd67cb04 Add migration resulting from Django 4.2 upgrade 2024-04-25 10:12:29 +02:00
Bart Schuurmans 2f4010b93b Upgrade Django to 4.2
- https://docs.djangoproject.com/en/5.0/releases/4.0/
- https://docs.djangoproject.com/en/5.0/releases/4.1/
- https://docs.djangoproject.com/en/5.0/releases/4.2/
2024-04-25 10:12:29 +02:00
Mouse Reeve c4b21ee258
Merge pull request #3114 from SMillerDev/feat/api/oauth
feat: add OAuth authentication
2024-04-24 15:45:54 -07:00
Mouse Reeve ad830dd885
Merge pull request #3350 from Minnozz/custom-port
Correctly handle serving BookWyrm on custom port
2024-04-24 15:27:01 -07:00
Mouse Reeve 366c647585
Merge pull request #3359 from bookwyrm-social/dependabot/pip/aiohttp-3.9.4
Bump aiohttp from 3.9.2 to 3.9.4
2024-04-24 15:13:30 -07:00
Bart Schuurmans 4f58b11330 Include the correct protocol and port in remote IDs 2024-04-24 15:35:19 +02:00
Bart Schuurmans 609bc15406 Support http:// protocol in BookWyrm connector 2024-04-24 15:30:47 +02:00
Bart Schuurmans c42db40a63 Construct absolute URLs with the correct protocol and port 2024-04-24 15:30:47 +02:00
Bart Schuurmans 3aefbb548e Allow serving BookWyrm on a non-standard port 2024-04-24 15:30:47 +02:00
Bart Schuurmans baea105c18 pytest.ini env values should be unquoted
Otherwise the quotes end up in the strings.
2024-04-24 15:30:47 +02:00
Bart Schuurmans c73d1fff6a Remove unnecessary exceptions from validate_url_domain 2024-04-24 15:30:47 +02:00
Bart Schuurmans 3d183a393f
Merge pull request #3360 from hughrun/move-fix
refactor Move for more redundancy
2024-04-24 15:30:19 +02:00
Bart Schuurmans f24fdf73b5 Update to match newer code style 2024-04-24 15:08:48 +02:00
Bart Schuurmans 839ab2fafd
Merge branch 'main' into move-fix 2024-04-24 14:56:32 +02:00
Bart Schuurmans 637f19b208
Merge pull request #3336 from Minnozz/s3-url-protocol
Support AWS_S3_URL_PROTOCOL
2024-04-24 14:53:55 +02:00
Bart Schuurmans 031223104f Clarify AWS_S3_URL_PROTOCOL in .env.example 2024-04-24 14:46:57 +02:00
Hugh Rundle 6684d60526
refactor Move for more redundancy
As outlined in #3354, a user `Move` fails if the user is moving from one BookWyrm server to another BookWyrm server.
This is because:

1. the original code did not announce changes to alsoKnownAs;
2. the original code always checked the locally saved profile rather than refetching the remote data;

This commit fixes both these problems by forcing `MoveUser` to always perform a "refresh" of the local data from the remote, and by saving the user with broadcast=True when updating alsoKnownAs ids.
2024-04-22 13:35:08 +10:00
dependabot[bot] cca58023ed
Bump aiohttp from 3.9.2 to 3.9.4
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.9.2 to 3.9.4.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.9.2...v3.9.4)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-18 15:51:34 +00:00
Bart Schuurmans bf5c08dbf3 Add docker-compose.override.yml to .gitignore 2024-04-15 13:17:00 +02:00
Bart Schuurmans be872ed672 Support AWS_S3_URL_PROTOCOL
- Allow setting in .env
- Default to PROTOCOL (same as before)
- Propagate to django-storages so it generates the correct URLs in sass_src
2024-04-15 13:16:51 +02:00
Bart Schuurmans 70f803a1f6
Merge pull request #3353 from dato/fix_quotation_str_pagenum
Fix creation of quotations with no end position
2024-04-15 13:11:55 +02:00
Adeodato Simó 4304cd4a79
use re.escape 2024-04-13 21:26:41 -03:00
Adeodato Simó 8733369605
test_quotation_page_serialization: add test with no position 2024-04-13 21:26:41 -03:00
Adeodato Simó df78cc64a6
Quotation._format_position: do not treat page numbers as integers
Fixes: #3352
2024-04-13 21:26:41 -03:00
Adeodato Simó f844abcad9
test_quotation_page_serialization: use strings for page numbers
This follows from #3273, "Allow page numbers to be text, instead of
integers".
2024-04-13 21:26:39 -03:00
Bart Schuurmans 21a39f8170
Merge pull request #3228 from hughrun/user-export
Fix user exports to deal with s3 storage
2024-04-13 22:53:58 +02:00
Hugh Rundle c3c46144fe
add merge migration 2024-04-13 12:39:40 +10:00
Hugh Rundle d48d312c0a
Merge branch 'main' into user-export 2024-04-13 12:26:13 +10:00
Hugh Rundle 501fb45528
export avatars to own directory
Saving avatars to /images is problematic because it changes the original filepath from avatars/filename to images/avatars/filename.
In this PR, prior to this commit, imports failed because they look for a file path beginning with "avatar".
2024-04-13 12:03:35 +10:00
Bart Schuurmans 7d581759da
Merge pull request #3342 from hbrunn/main-pilkit
[FIX] make sure to get Pillow>=10 compatible pilkit
2024-04-11 14:52:22 +02:00
Bart Schuurmans d5a536ae36 Change pilkit constraint to the version that does work 2024-04-11 14:45:13 +02:00
Bart Schuurmans 26f92db5d8 Merge branch 'main' into main-pilkit 2024-04-11 14:43:10 +02:00
Bart Schuurmans 5686c5ae5d
Merge pull request #3356 from Minnozz/quick-fix-frontend-ci
Install same version of eslint in CI as in dev-tools
2024-04-10 22:10:07 +02:00
Bart Schuurmans 9d9e64399c Install same version of eslint in CI as in dev-tools 2024-04-10 21:26:34 +02:00
Mouse Reeve b6aba44e42
Merge pull request #3355 from bookwyrm-social/merge-migration
Adds merge migration
2024-04-09 06:04:15 -05:00
Mouse Reeve 3ffbb242a4 Black 2024-04-09 05:59:01 -05:00
Mouse Reeve af0bd90c15 Adds merge migration 2024-04-09 05:57:27 -05:00
Mouse Reeve 73630331d1
Merge pull request #3299 from Minnozz/absorb
Track which Author/Work/Edition a duplicate has been merged into
2024-04-09 05:55:44 -05:00
Mouse Reeve ca6dbcb483
Merge pull request #3348 from Minnozz/more-indexes
Define more indexes for slow queries
2024-04-04 15:18:07 -07:00
Bart Schuurmans e1c54b2933 Remove optimizations with adverse effects
`if not audience` actually causes the entire query to be evaluated, before .values_list() is called.
2024-04-04 13:47:56 +02:00
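
The pitfall being removed, sketched on an arbitrary queryset: truthiness checks evaluate the entire queryset, so the "shortcut" fetched every row before the narrower .values_list() query ever ran.

    from bookwyrm.models import Status  # any queryset shows the same behaviour

    audience = Status.objects.filter(privacy="public")  # lazy, nothing fetched yet

    # anti-pattern: bool(queryset) fetches and caches every row just to test emptiness
    if not audience:
        recipient_ids = []
    else:
        recipient_ids = audience.values_list("id", flat=True)

    # cheaper: issue only the narrow query; if an emptiness check is really needed,
    # .exists() runs a LIMIT 1 query instead of loading everything
    recipient_ids = Status.objects.filter(privacy="public").values_list("id", flat=True)
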
Bart Schuurmans 439cb3ccaa Remove unnecessary conversions between list and set 2024-04-04 13:15:31 +02:00
Bart Schuurmans 321397a349 Specify which column DISTINCT should apply to 2024-04-03 21:28:22 +02:00
Bart Schuurmans 464a0298c6 Add index for finding active (and local) users 2024-04-03 21:27:52 +02:00
Bart Schuurmans 0501ce39cd Add index for looking up User by username 2024-04-03 21:15:24 +02:00
Bart Schuurmans 4d5a30d953 Add index for looking up KeyPair by remote id 2024-04-03 21:11:27 +02:00
Bart Schuurmans 5cfe7eca6f Add index for finding all statuses in a thread 2024-04-03 21:11:09 +02:00
Bart Schuurmans 5082806b82
Merge pull request #3338 from Minnozz/fix-nginx-location
Make nginx config safer
2024-04-03 19:22:16 +02:00
Mouse Reeve d1d91f0c2b
Merge pull request #3347 from bookwyrm-social/dependabot/pip/pillow-10.3.0
Bump pillow from 10.2.0 to 10.3.0
2024-04-03 10:01:59 -07:00
dependabot[bot] ea0ade955b
Bump pillow from 10.2.0 to 10.3.0
Bumps [pillow](https://github.com/python-pillow/Pillow) from 10.2.0 to 10.3.0.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](https://github.com/python-pillow/Pillow/compare/10.2.0...10.3.0)

---
updated-dependencies:
- dependency-name: pillow
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-03 16:45:11 +00:00
Mouse Reeve f085d3d0fe
Merge pull request #3346 from Minnozz/status-remote-id-index
Add index on Status.remote_id
2024-04-02 13:02:35 -07:00
Bart Schuurmans 4bbdd0b2d0 Add index on Status.remote_id
This field is often used in WHERE-clauses in queries that are very slow on bookwyrm.social.
2024-04-02 21:54:30 +02:00
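
In model terms the index amounts to a Meta declaration like the following sketch (the real field is defined differently in BookWyrm):

    from django.db import models

    class Status(models.Model):  # reduced sketch
        remote_id = models.CharField(max_length=255, null=True)

        class Meta:
            indexes = [models.Index(fields=["remote_id"])]
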
Sean Molenaar d5fb21f330
Merge branch 'main' into feat/api/oauth 2024-04-01 22:35:19 +02:00
Mouse Reeve f28800af7f
Merge pull request #3339 from Minnozz/fix-file-leaks
Fix resource leaks
2024-03-31 12:43:19 -07:00
Mouse Reeve cb3fd0cfc1
Merge branch 'main' into feat/api/oauth 2024-03-31 12:41:12 -07:00
Bart Schuurmans 72ed878eeb
Merge pull request #3343 from Minnozz/update-codeql
Update CodeQL workflows to v3
2024-03-30 22:01:49 +01:00
Bart Schuurmans f666951934 Update CodeQL workflows to v3
https://github.blog/changelog/2024-01-12-code-scanning-deprecation-of-codeql-action-v2/
2024-03-30 21:56:44 +01:00
Holger Brunn fcd0087589 [FIX] make sure to get Pillow>=10 compatible pilkit 2024-03-30 01:58:41 +01:00
Bart Schuurmans ffee29d8e2 Fix resource leaks
Rewrite places where files (or other resources) are opened but not closed to "with" blocks, which
automatically call close() at the end of the scope.

Also simplify some tests where images need to be saved to a model field: an opened file can be
passed directly to FileField.save().
2024-03-29 20:14:10 +01:00
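
The shape of the change, on a hypothetical cover upload (the commit applies this pattern across tests and views):

    def set_cover_leaky(book):
        # nothing guarantees this handle is closed, especially on error paths
        image = open("covers/example.jpg", "rb")
        book.cover.save("example.jpg", image)

    def set_cover_fixed(book):
        # the with-block closes the handle when the scope ends, and an opened
        # file can be passed directly to FileField.save()
        with open("covers/example.jpg", "rb") as image:
            book.cover.save("example.jpg", image)
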
Bart Schuurmans 75bc4f8cb0 Make nginx config safer
Instead of allowing all image files anywhere and disallowing non-image files under /images/, only
allow image files under /images/ and don't match non-image files elsewhere; those get proxied to the
web container instead and result in a 404 there.

For example, the old config allowed /exports/foo.jpg to be served, while the new config does not.
2024-03-29 15:04:38 +01:00
Bart Schuurmans e7ae0fdf93
Merge pull request #3337 from prolibre/apport-perso
flower 2.0.1 fixes a few link bugs (particularly for favicon)
2024-03-29 14:45:59 +01:00
Bart Schuurmans 5d597f1ca9 Use new "with ()" style 2024-03-29 14:25:08 +01:00
Bart Schuurmans 0ac9d12d1c Merge branch 'main' into user-export 2024-03-29 14:23:10 +01:00
Bart Schuurmans e74de94640
Merge pull request #3334 from ccamara/patch-1
Remove twitter from README.md
2024-03-29 14:21:49 +01:00
Bart Schuurmans 1464d09a43
Merge pull request #3320 from dato/better-fmt-patch-calls
bulk-fmt: bracket-wrap calls to patch() for better readability
2024-03-29 14:19:16 +01:00
Anthony 2272e7a326 flower 2.0.1 fixes a few link bugs (particularly for favicon) 2024-03-29 12:07:52 +01:00
Bart Schuurmans 2bbe3d4c32 Test user export archive contents 2024-03-28 13:50:55 +01:00
Bart Schuurmans bb5d8152f1 Fix mypy error 2024-03-28 13:21:30 +01:00
Bart Schuurmans dabf7c6e10 User export testing fixes 2024-03-28 13:09:21 +01:00
Bart Schuurmans cdbc1d172c Fix double exports subdir in S3 user export 2024-03-27 23:28:24 +01:00
Adeodato Simó 3133a47b7c
Merge from main into 'better-fmt-patch-calls'
Conflicts:
	bookwyrm/tests/test_book_search.py
2024-03-27 17:13:08 -03:00
Bart Schuurmans c6ca547d58 Fix migration formatting 2024-03-27 20:41:59 +01:00
Bart Schuurmans 797d5cb508 Update BookwyrmExportJob tests 2024-03-27 20:39:57 +01:00
Adeodato Simó 699d637bae
Fix detection of unlisted posts (#3258)
Merged from dato/fix_unlisted_set_from_activity.
2024-03-27 16:29:09 -03:00
Bart Schuurmans 9afd0ebb54 Update migrations 2024-03-27 20:15:06 +01:00
Bart Schuurmans 9685ae5a0a Consolidate BookwyrmExportJob into two tasks
Creating the export JSON and export TAR are now the only two tasks.
2024-03-27 20:13:49 +01:00
Carlos Cámara 98600440d8
Remove twitter from README.md
The Twitter/X account doesn't seem to exist, so removing the badge
2024-03-26 17:14:09 +00:00
Bart Schuurmans ed2e9e5ea8 Merge migration 2024-03-26 13:41:39 +01:00
Bart Schuurmans ef57c0bc8b Check last user export too in post handler 2024-03-26 13:41:39 +01:00
Bart Schuurmans 145c67dd21 Merge BookwyrmExportJob export_data field back into one with dynamic storage backend 2024-03-26 13:41:39 +01:00
Bart Schuurmans 6a67943408
Merge branch 'main' into user-export 2024-03-26 13:15:40 +01:00
Mouse Reeve 9dfa218ba5
Merge pull request #3333 from bookwyrm-social/locales
Updates locales and version number
2024-03-25 16:36:51 -07:00
Mouse Reeve bf52eeaa9e Bump version to 0.7.3. 2024-03-25 16:15:02 -07:00
Mouse Reeve 011e4a27a6 Updates locales and adds missing trimmed on blocktrans 2024-03-25 16:13:00 -07:00
Mouse Reeve 7192449b21
Merge pull request #3325 from Minnozz/author-search-vector
Rework author search
2024-03-25 14:41:25 -07:00
Bart Schuurmans d9bf848cfa Fix pylint warnings 2024-03-25 18:25:43 +01:00
Bart Schuurmans bd95bcd50b Add test for special character in cover filename 2024-03-25 18:14:45 +01:00
Bart Schuurmans f721289b1d Simplify logic for rendering user exports 2024-03-25 18:14:45 +01:00
Bart Schuurmans a51402241b Refactor creation of user export archive 2024-03-25 18:14:45 +01:00
Bart Schuurmans e0decbfd1d Fix urlescaped relative path to cover image in export
Fixes #3292
2024-03-25 17:59:39 +01:00
Bart Schuurmans aee8dc16af Fix pylint warning 2024-03-24 13:27:01 +01:00
Bart Schuurmans 5bd66cb3f7 Only generate signed S3 link to user export when user clicks download 2024-03-24 13:08:33 +01:00
Bart Schuurmans ab7b0893e0 User exports: handle files that no longer exist on file storage 2024-03-24 12:47:26 +01:00
Bart Schuurmans 471233c1dc Use different export job fields for the different storage backends
This way, the database definition is not dependent on the runtime configuration.
2024-03-24 12:46:42 +01:00
Bart Schuurmans 073f62d5bb Add exports_volume to docker-compose.yml
Exports should be written to a Docker volume instead of to the bind mount (= source directory). This
way they are shared between different containers even when they run on different machines.
2024-03-24 12:08:29 +01:00
Bart Schuurmans a770689245 Merge branch 'main' into user-export 2024-03-24 12:07:14 +01:00
Bart Schuurmans 69f464418d Remove problematic migration
This migration is dependent on the runtime configuration (.env); a structural fix will follow.
2024-03-24 12:06:44 +01:00
Bart Schuurmans f11c80162a
Merge pull request #3331 from Minnozz/revert-docker-mount-ro
Revert "docker-compose.yml: make all bind mounts read only"
2024-03-24 11:30:56 +01:00
Bart Schuurmans 7c2fa746ae Revert "docker-compose.yml: make all bind mounts read only"
This reverts commit 864304f128.
2024-03-24 11:23:23 +01:00
Hugh Rundle 03587dfdc7
migrations 2024-03-24 20:56:20 +11:00
Hugh Rundle dd27684d4b
set signed s3 url expiry with env value
Adds S3_SIGNED_URL_EXPIRY val to .env and settings (defaults to 15 mins)
Note that this is reset every time the user loads the exports page
and is independent of the _creation_ of export files.
2024-03-24 20:53:49 +11:00
Bart Schuurmans caebebeb37
Merge pull request #3261 from bSolt/book-series-3256
Add book series by title in feed posts
2024-03-23 20:01:03 +01:00
Bart Schuurmans 592914dc91 Render series number with comma and outside of link on book page 2024-03-23 19:51:20 +01:00
Bart Schuurmans 2915133223
Merge branch 'main' into book-series-3256 2024-03-23 19:37:07 +01:00
Bart Schuurmans 2d2ccd51df Factor out book series info into separate template 2024-03-23 19:35:24 +01:00
Bart Schuurmans 4a690e675a BookDataModel: add dry_run argument to merge_into 2024-03-23 19:28:57 +01:00
Bart Schuurmans fb82c7a579 Add test for merging authors 2024-03-23 19:28:57 +01:00
Bart Schuurmans 6f191acb27 BookDataModel: fix absorbing data from array and partial date fields 2024-03-23 19:28:57 +01:00
Bart Schuurmans 7fb079cb43 PartialDate: fix __eq__ method 2024-03-23 19:28:57 +01:00
Bart Schuurmans 7066e2815b BookDataModel.merge_into: return and log absorbed fields 2024-03-23 19:28:57 +01:00
Bart Schuurmans e04cd79ff8 Redirect to new URL when a merged object is requested 2024-03-23 19:28:57 +01:00
Bart Schuurmans 5e123972e8 BookDataModel: implement merge_into method 2024-03-23 19:28:57 +01:00
Bart Schuurmans b3753ab6da Add MergedBookDataModel 2024-03-23 19:28:57 +01:00
Bart Schuurmans b8995bd4b1 Add tests for author search 2024-03-23 19:26:51 +01:00
Bart Schuurmans 769d9726e5 Add book search test cases for author aliases 2024-03-23 19:26:51 +01:00
Bart Schuurmans 36222afa79 Switch author search from TrigramSimilarity to SearchQuery 2024-03-23 19:26:51 +01:00
Bart Schuurmans 0795b4d171 Include Author aliases in Book search vector 2024-03-23 19:26:51 +01:00
Bart Schuurmans 2de35f3fc7 Calculate Author search vector with name and aliases 2024-03-23 19:26:51 +01:00
Mouse Reeve bac52eef3e
Merge pull request #3275 from ccamara/wikidata
Add wikidata field for authors
2024-03-23 08:12:09 -07:00
Mouse Reeve 8bbac458a6
Merge pull request #3217 from dato/switch_edition_invalidate_active_shelves
Invalidate `active_shelf` when switching editions
2024-03-23 07:59:40 -07:00
Mouse Reeve 5b71e94888
Merge branch 'main' into user-export 2024-03-23 07:55:46 -07:00
Mouse Reeve a914a44fba
Removes unnecessary redeclaration of wikidata model field in Author 2024-03-23 07:54:54 -07:00
Mouse Reeve 8e088a6d53
Merge branch 'main' into switch_edition_invalidate_active_shelves 2024-03-23 07:53:24 -07:00
Mouse Reeve b508b4cd33
Merge pull request #3323 from Minnozz/docker-bind-ro
Docker: make bind mounts of source code read only
2024-03-23 07:51:00 -07:00
Mouse Reeve 886d6ec9f7
Merge branch 'main' into docker-bind-ro 2024-03-23 07:48:27 -07:00
Mouse Reeve 21f75da75e
Merge pull request #3328 from Minnozz/escape-query-in-link
Escape search query in generated URLs
2024-03-23 07:46:04 -07:00
Mouse Reeve 20db968315
Merge pull request #3322 from Minnozz/fix-font-download
Fix font download
2024-03-23 07:36:43 -07:00
Bart Schuurmans c3d25c59c5 Escape search query in generated URLs
Otherwise, a query containing '&' or other special characters results in a broken URL.
2024-03-21 16:48:34 +01:00
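
Roughly the failure mode and the fix; the real change escapes the query where the links are built, urlencode is shown here only for illustration:

    from urllib.parse import urlencode

    query = "dungeons & dragons"

    # broken: '&' starts a new query parameter, so the search term is truncated
    bad_url = f"/search?q={query}"                   # /search?q=dungeons & dragons

    # fixed: escape the query so special characters survive the round trip
    good_url = f"/search?{urlencode({'q': query})}"  # /search?q=dungeons+%26+dragons
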
Bart Schuurmans 3cde6dbe5a
Merge pull request #3326 from Minnozz/black-required-version
black: specify major version 22 only
2024-03-21 16:30:56 +01:00
Bart Schuurmans 682bb3b62f dev-tools: relax black version constraint 2024-03-21 16:25:29 +01:00
Bart Schuurmans b5b9eddaf0 CI: relax black version constraints 2024-03-20 12:46:37 +01:00
Bart Schuurmans ab430e0208 requirements.txt: add black
This way, IDEs can be set up to use the black version from the environment instead of a globally
available/bundled black version.
2024-03-20 12:43:17 +01:00
Bart Schuurmans e13e4237f4 black: specify required-version
This ensures consistent formatting among different contributors / development setups.

https://black.readthedocs.io/en/stable/usage_and_configuration/the_basics.html#required-version
2024-03-20 12:26:21 +01:00
Bart Schuurmans 762786839c
Merge pull request #3134 from dato/trigger_migrations
Support trigger migrations
2024-03-20 12:11:34 +01:00
Bart Schuurmans 4ca52c0b38
Merge branch 'main' into trigger_migrations 2024-03-20 11:47:54 +01:00
Bart Schuurmans 6a87713f9f Recalculate all book search vectors after fixing the author trigger 2024-03-20 11:45:12 +01:00
Mouse Reeve d08147c6d9
Merge pull request #3244 from bookwyrm-social/dependabot/pip/pillow-10.2.0
Bump pillow from 10.0.1 to 10.2.0
2024-03-19 15:10:30 -07:00
Bart Schuurmans f423834bd0 Catch the correct exception type from Pillow 2024-03-19 12:42:52 +01:00
Mouse Reeve d304ceb437
Merge pull request #3324 from bookwyrm-social/dependabot/pip/django-3.2.25
Bump django from 3.2.24 to 3.2.25
2024-03-18 15:05:30 -07:00
dependabot[bot] 47afe34d97
Bump django from 3.2.24 to 3.2.25
Bumps [django](https://github.com/django/django) from 3.2.24 to 3.2.25.
- [Commits](https://github.com/django/django/compare/3.2.24...3.2.25)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-18 21:48:21 +00:00
Bart Schuurmans 4d23edddca Make sure /images/ and /static/ exist now that the bind mount is read only
Otherwise the static_volume and media_volume can't be mounted there.
2024-03-18 21:35:12 +01:00
Bart Schuurmans 68cb94daf2 docker-compose.yml: don't automatically start dev-tools by assigning profile 2024-03-18 21:34:51 +01:00
Bart Schuurmans 864304f128 docker-compose.yml: make all bind mounts read only
Except dev-tools, since it needs to be able to change the source.
2024-03-18 21:34:09 +01:00
Bart Schuurmans 7690247ab4 Font download: log the exact error 2024-03-18 20:34:47 +01:00
Bart Schuurmans 3367b20965 Font download: destination dir is allowed to exist
Without this argument, an existing directory (but not the file) causes an error.
2024-03-18 20:23:31 +01:00
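
The argument in question is exist_ok; a sketch with a hypothetical fonts directory:

    import os

    font_dir = "images/fonts"  # hypothetical destination path
    # without exist_ok=True, a directory that already exists raises FileExistsError
    # even though the font file itself was never downloaded
    os.makedirs(font_dir, exist_ok=True)
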
Bart Schuurmans 748418590f docker-compose.yml: mount static_volume for flower
Because flower also uses BookwyrmConfig, it wants to download fonts, and will download them to an
incorrect location if the static_volume is not mounted.
2024-03-18 20:22:19 +01:00
Bart Schuurmans ccf2b16d73 requirements.txt: make typing-Pillow match Pillow 2024-03-18 19:52:40 +01:00
dependabot[bot] 3be227fc86 Bump pillow from 10.0.1 to 10.2.0
Bumps [pillow](https://github.com/python-pillow/Pillow) from 10.0.1 to 10.2.0.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](https://github.com/python-pillow/Pillow/compare/10.0.1...10.2.0)

---
updated-dependencies:
- dependency-name: pillow
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-18 19:51:24 +01:00
Adeodato Simó a6dc5bd13f
Make get_file_size robust against typing errors 2024-03-18 15:03:07 -03:00
Adeodato Simó 518da3b9cf Merge from main into 'user-export'
Conflicts:
	bookwyrm/models/bookwyrm_export_job.py
	requirements.txt
2024-03-18 14:47:34 -03:00
Adeodato Simó 2cf7ed477d Consolidate test_posgres.py into test_book_search.py
These are tests I missed when first writing trigger tests in
test_book_search.py.
2024-03-17 22:38:44 -03:00
Adeodato Simó cceccd1ecf
Merge from main into 'trigger_migrations'
Conflicts:
	requirements.txt
2024-03-17 21:54:15 -03:00
Adeodato Simó beb49af514
Upgrade django-pgtrigger to 4.11 2024-03-17 21:46:34 -03:00
Adeodato Simó 90bd893568 Fix remaining instances of bad-classmethod-argument 2024-03-17 21:28:55 -03:00
Adeodato Simó e2c9ea3cd2 Fix instances of bad-classmethod-argument in recently edited files 2024-03-17 21:28:55 -03:00
Adeodato Simó 4b9fe0af0c Remove nesting in several with..patch calls 2024-03-17 20:57:39 -03:00
Adeodato Simó 1b9e0546e6 Bracket-wrap calls to patch() for better readability 2024-03-17 20:34:12 -03:00
Bart Schuurmans 8cf52e0a77
Merge pull request #3318 from Minnozz/ci-annotations
CI: update pytest setup and show annotations on PRs
2024-03-17 11:24:01 +01:00
Bart Schuurmans 0282e20b89
Merge branch 'main' into book-series-3256 2024-03-16 11:23:40 +01:00
Bart Schuurmans 4e20e43037 CI: merge all Python actions into one file 2024-03-13 23:36:26 +01:00
Bart Schuurmans 383e6533e1 CI: use pytest-github-actions-annotate-failures 2024-03-13 23:35:05 +01:00
Bart Schuurmans 74fdd9a85a CI: simplify pytest setup 2024-03-13 23:35:05 +01:00
Bart Schuurmans 6af0a08838 CI: use actions/setup-python@v5 and cache pip 2024-03-13 23:35:03 +01:00
Bart Schuurmans 12b469a0d6 CI: use actions/checkout@v4 2024-03-13 23:33:40 +01:00
Mouse Reeve 288743b686
Merge pull request #3315 from Minnozz/fix-pytest-env
pytest.ini: define ALLOWED_HOSTS
2024-03-13 15:29:15 -07:00
Mouse Reeve a3465e6154
Merge pull request #3303 from MaggieFero/main
Upgrade Python Version and Several Other Packages for Security
2024-03-13 15:28:54 -07:00
Bart Schuurmans 3ba528ecdd pytest.ini: define ALLOWED_HOSTS
This fixes running `./bw-dev pytest` locally when having a different value defined for
`ALLOWED_HOSTS` in `.env`.
2024-03-11 20:12:46 +01:00
Adeodato Simó 304c47863b
FileLinkForm: fix duplicate check (#3311)
Merged from: Minnozz/filelink-duplicate-check.
2024-03-11 15:10:28 -03:00
Mouse Reeve b68a4cc392
Merge branch 'main' into filelink-duplicate-check 2024-03-09 07:37:26 -08:00
Mouse Reeve 6dfb5000cc
Merge pull request #3305 from dato/export_catch_missing_key_icon
json_export: also detect absent "icon" key
2024-03-09 07:37:14 -08:00
Bart Schuurmans 8d018b872f FileLinkForm: fix duplicate check 2024-03-09 15:49:42 +01:00
Adeodato Simó 9e7b040b73
Fix shelving date changing when changing editions (#3193)
Merged from jakejack13/switch-edition
Fixes: #3139.
2024-03-03 18:48:04 -03:00
Adeodato Simó 09c3d9c0dc
json_export: also detect absent "icon" key 2024-03-03 18:42:27 -03:00
Mouse Reeve dd9d68c97d
Merge pull request #3096 from bookwyrm-social/image-ap-serialization
Changes to how images are serialized
2024-03-02 18:58:08 -08:00
Margaret Fero d138395c75 Add linter exclusion for TBookWyrmModel 2024-03-02 17:43:49 -08:00
Margaret Fero 91fe4ad535 Fix spacing for linter 2024-03-02 17:31:16 -08:00
Margaret Fero 9fa09d5ebe Add extra space required by linter 2024-03-02 17:30:37 -08:00
Margaret Fero eadb0e640f Fix typo in operator 2024-03-02 17:29:42 -08:00
Margaret Fero be140d5e5a Pin setuptools at 65.5.1 2024-03-02 17:20:48 -08:00
Margaret Fero 22c4155c7c Upgrade pytest to 6.2.5 2024-03-02 16:09:34 -08:00
Margaret Fero 498dc35d99 Upgrade Pylint to 2.15.0 2024-03-02 16:09:06 -08:00
Margaret Fero 0f5a3e9163 Pin Tornado at 6.3.3 2024-03-02 16:08:41 -08:00
Margaret Fero da2636fa29 Add grpcio pin @ 1.57.0 2024-03-02 16:07:50 -08:00
Margaret Fero c1520da56d Upgrade flower to 2.0.0 2024-03-02 16:05:11 -08:00
Margaret Fero fee3fdd5a8 Upgrade django-compressor to 4.4 2024-03-02 16:04:37 -08:00
Margaret Fero c944824ac7 Upgrade django-celery-beat to 2.5.0 2024-03-02 16:04:06 -08:00
Margaret Fero 4312e9bba0 Upgrade Celery to 5.3.1 2024-03-02 16:03:19 -08:00
Margaret Fero 39da471f79 Disable Pylint Failure for imghdr deprecation for now 2024-03-02 15:59:17 -08:00
Margaret Fero 570017d3b0 Upgrade Python Version from 3.9 to 3.11 2024-03-02 15:57:06 -08:00
Margaret Fero 3652ac8100
Alphabetize requirements.txt
Alphabetize requirements.txt for developer convenience; this helps to find duplicates and unnecessarily-pinned subdependencies, as well as making the file easier to read and use.
2024-03-02 15:41:06 -08:00
Margaret Fero f8fd76cff0
Remove duplicate types-requests==2.31.0.2
The types-requests==2.31.0.2 dependency was listed twice in a row; this commit removes one.
2024-03-02 13:57:09 -08:00
Margaret Fero 206ed9f7fb
Merge pull request #2 from bookwyrm-social/main
No Actual Changes
2024-03-02 13:55:24 -08:00
Mouse Reeve 218171e9bc
Merge pull request #3300 from MaggieFero/MaggieFero-add-timeouts-to-requests.get
Add timeouts to requests.get
2024-03-01 22:49:44 -08:00
Margaret Fero 50b811d9aa
Typo fix
Add a comma
2024-03-01 20:11:14 -08:00
Margaret Fero 1ae9870862
Add timeout to base_activity.py
An instance of requests.get was missing a timeout; this commit adds a timeout of 15, as used in other places in this codebase that already have timeouts.
2024-03-01 20:02:40 -08:00
Margaret Fero db97d76a24
Add timeout to isbn.py
An instance of requests.get in isbn.py lacks a timeout; this commit adds one with a default of 15, as used in other places in the code where requests.get already has a timeout.
2024-03-01 19:58:11 -08:00
Mouse Reeve 354388cc8f
Merge pull request #3238 from hughrun/export-fixes
fix multiple issues from user exports config changes
2024-02-29 16:16:25 -08:00
Mouse Reeve 2c59908ddd
Merge branch 'main' into export-fixes 2024-02-29 16:10:20 -08:00
Mouse Reeve 6a70eadba8
Merge pull request #3284 from NetspherePub/072NginxSecurityFixed
Adds production.conf security configuration missing in version 0.7.2
2024-02-29 15:55:56 -08:00
Mouse Reeve ec52460f02
Merge pull request #3274 from Minnozz/author-search
Add search for author
2024-02-29 15:55:12 -08:00
Adeodato Simó 1fabe51261
Move ratings and reviews when switching editions (#3117)
Merged from mattlehrer/move-ratings-and-reviews-when-switching-editions.
Fixes: #2926.
2024-02-21 18:48:32 -03:00
Adeodato Simó e6b6bd648d
Merge branch 'main' into move-ratings-and-reviews-when-switching-editions 2024-02-21 18:42:18 -03:00
Mouse Reeve 9d7965780d
Merge pull request #3285 from polarbirke/fix-label-input-association-for-shelves-filter
Fix label and input association for shelves filter
2024-02-20 16:56:57 -08:00
Mouse Reeve 333fb03c2c
Merge pull request #3290 from bookwyrm-social/korean-locale
Korean locale
2024-02-20 16:56:26 -08:00
Mouse Reeve 8f537ef56a Adds missing migration for Korean locale 2024-02-20 16:45:16 -08:00
Mouse Reeve 6163e1a6be
Merge pull request #3283 from NetspherePub/ko_KR
Add Korean (ko-kr) to LANGUAGES and locale.
2024-02-20 16:44:31 -08:00
Ross Chapman dd1999eb8e
Adds view tests for shelf filters (#3162)
* Adds test file

* Adds success assertion

* Updates tests

* Updates shelf books creation

* Updates assertion to use isbn for Edition model

* Updates query

* trigger workflow test

* Updates validate_html

* Updates comment and test

* Fixes none test

* Adds management command to clear all deleted user data

* Adds success message

---------

Co-authored-by: Mouse Reeve <mousereeve@riseup.net>
Co-authored-by: Mouse Reeve <mouse.reeve@gmail.com>
2024-02-20 16:25:01 -08:00
Søren Birkemeyer 4c0d5ede86 Fix label and input association for shelves filter
This PR correctly associates label and text input of the shelves
filter via for- and id-attributes. With the association in place,
the aria-label can be removed (the label will be announced by
assistive software when the input is focused). This also fixes the
issue that the aria-label was not translated, whereas the label is.
2024-02-10 16:24:52 +00:00
FoW 1c587c5e53 Adds production.conf security configuration missing in version 0.7.2 2024-02-10 17:54:25 +09:00
FoW ddd13a3e2e Add Korean (ko-kr) to LANGUAGES and locale 2024-02-10 16:17:25 +09:00
Mouse Reeve 7469f1f4ca
Merge pull request #3281 from bookwyrm-social/dependabot/pip/django-3.2.24
Bump django from 3.2.23 to 3.2.24
2024-02-07 14:54:50 -08:00
dependabot[bot] 363cb79951
Bump django from 3.2.23 to 3.2.24
Bumps [django](https://github.com/django/django) from 3.2.23 to 3.2.24.
- [Commits](https://github.com/django/django/compare/3.2.23...3.2.24)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-07 22:50:00 +00:00
Hugh Rundle 46a158d701
Merge branch 'main' into export-fixes 2024-02-06 18:31:19 +11:00
Hugh Rundle 8773caa26b
Merge pull request #4 from dato/data_upload_max_size_mb
Support DATA_UPLOAD_MAX_MEMORY_MiB, only, in .env
2024-02-06 18:25:44 +11:00
Carlos Camara 89d8537e1b Add wikidata field to author's template 2024-02-05 22:08:34 +00:00
Carlos Cámara 71f527eb1b
Merge branch 'main' into wikidata 2024-02-04 20:34:51 +01:00
Adeodato Simó 4a9d69e169
Support DATA_UPLOAD_MAX_MEMORY_MiB, only, in .env
Since arithmetic is not allowed in .env files, a change in unit for
the variable seems most usable.
2024-02-04 15:34:04 -03:00
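
A sketch of how settings.py can turn the MiB value into the byte count Django expects (BookWyrm reads its environment with environs; the variable name follows the commit message):

    from environs import Env

    env = Env()
    env.read_env()

    # .env cannot express 1024 ** 2 * 100, so the value is given in MiB
    # and converted to bytes here
    DATA_UPLOAD_MAX_MEMORY_SIZE = env.int("DATA_UPLOAD_MAX_MEMORY_MiB", 100) * 1024**2
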
Mouse Reeve d97747078e
Merge pull request #3276 from bookwyrm-social/fixes-version-number
Fixes version number mistakenly reverted
2024-02-03 19:49:48 -08:00
Mouse Reeve db629255db Fixes version number mistakenly reverted 2024-02-03 18:27:59 -08:00
Carlos Cámara 6ac38564e2 Add wikidata field for authors 2024-02-03 22:55:33 +00:00
Bart Schuurmans 6c9ca0bf19 Add search for author 2024-02-03 21:55:46 +01:00
Mouse Reeve 6b1ffbc634
Merge pull request #3185 from bookwyrm-social/check-version-number
Check version number asynchronously
2024-02-03 08:25:52 -08:00
Mouse Reeve 748c934986 Merge migrations upon merge migrations 2024-02-03 08:20:12 -08:00
Mouse Reeve f7580c59a5 Merge branch 'main' into check-version-number 2024-02-03 08:19:46 -08:00
Mouse Reeve 4e2b8af147 Adds merge migration 2024-02-03 08:02:51 -08:00
Mouse Reeve 48f8ee57a6 Merge branch 'main' into check-version-number 2024-02-03 08:02:15 -08:00
Mouse Reeve faf45cf956
Merge pull request #3273 from bookwyrm-social/WesleyAC-freeform-page-number
Allow page numbers to be text, instead of integers
2024-02-03 08:00:34 -08:00
Mouse Reeve a1ac9494b2 Allow admins to un-schedule tasks 2024-02-03 08:00:07 -08:00
Mouse Reeve 6d5752fb4e Adds merge migration for page numbering fix 2024-02-03 07:40:23 -08:00
Mouse Reeve 37aa7ad2f6 Merge branch 'freeform-page-number' of github.com:WesleyAC/bookwyrm into WesleyAC-freeform-page-number 2024-02-03 07:38:02 -08:00
Mouse Reeve e0667c6a03
Merge pull request #3237 from Minnozz/status-title-description
Improve OpenGraph tags for status and book pages
2024-02-03 07:37:00 -08:00
Mouse Reeve 103da863c4
Merge pull request #3239 from Minnozz/user-agent
Replace python-requests with BookWyrm in user agent
2024-02-03 07:27:58 -08:00
Mouse Reeve fa66284000
Merge pull request #3253 from skobkin/patch-autocomplete-fictionbook-format
Adding FictionBook format ("FB2", "FB3") to autocomplete options in "get a copy" block.
2024-02-03 07:26:58 -08:00
Mouse Reeve 0f0420ce04
Merge pull request #3257 from dato/prefer_shared_inbox
Use shared inboxes for mentions too
2024-02-03 07:25:51 -08:00
Mouse Reeve 438d88d8d4
Merge pull request #3260 from bSolt/fix-widths-2023
Fix awkward layout for tablets on /confirm-email, /login, /invite, and /preferences/reactivate
2024-02-03 07:18:35 -08:00
Mouse Reeve 5f2f321ed5
Merge branch 'main' into export-fixes 2024-02-03 07:04:05 -08:00
Mouse Reeve 45cc3dc979
Merge pull request #3249 from dato/cookie_age_setting
Set SESSION_COOKIE_AGE from environment
2024-02-03 07:03:12 -08:00
Mouse Reeve 9c5f6c527b
Fixes translation tags 2024-02-03 06:51:23 -08:00
Mouse Reeve efa29b269c
Merge pull request #3269 from bookwyrm-social/dependabot/pip/aiohttp-3.9.2
Bump aiohttp from 3.9.0 to 3.9.2
2024-01-30 18:03:00 -08:00
Jacob Kerr 2ba7dff845 Fixed shelving date changing when changing editions 2024-01-30 16:53:59 -05:00
Hugh Rundle 21a8570035
Merge pull request #3207 from rsk2/issue-3187
Hide "year in the books" for newly registered users
2024-01-31 07:03:43 +11:00
Hugh Rundle ef6fd608fa
Merge branch 'main' into issue-3187 2024-01-30 18:47:07 +11:00
dependabot[bot] b05621005e
Bump aiohttp from 3.9.0 to 3.9.2
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.9.0 to 3.9.2.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.9.0...v3.9.2)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-30 00:10:13 +00:00
Hugh Rundle 3675a4cf3f
disable user exports if using azure 2024-01-29 14:28:30 +11:00
Hugh Rundle 5f7be848fc
subclass boto3 session instead of adding new env value
Thanks Dato!
2024-01-29 14:10:36 +11:00
Hugh Rundle f96ddaa3e1
Merge pull request #3 from dato/export_job_inject_aws_endpoint_setting
Subclass boto3.Session to use AWS_S3_ENDPOINT_URL
2024-01-29 13:49:45 +11:00
Hugh Rundle adff3c4251
allow user exports with s3
also undoes a line space change in settings.py to make the PR cleaner
2024-01-29 13:45:35 +11:00
Hugh Rundle 765fc1e43d
fix tests 2024-01-29 12:28:37 +11:00
Adeodato Simó c106b2a988
Subclass boto3.Session to use AWS_S3_ENDPOINT_URL
As of 0.1.13, the s3-tar library uses an environment variable
(`S3_ENDPOINT_URL`) to determine the AWS endpoint. See:
https://github.com/xtream1101/s3-tar/blob/0.1.13/s3_tar/utils.py#L25-L29.

To save BookWyrm admins from having to set it (e.g., through `.env`)
when they are already setting `AWS_S3_ENDPOINT_URL`, we create a Session
class that unconditionally uses that URL, and feed it to S3Tar.
2024-01-28 22:21:44 -03:00
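
A minimal sketch of such a Session subclass (the class name is illustrative; the production code may differ):

    import boto3
    from django.conf import settings

    class BookwyrmAwsSession(boto3.Session):
        """Hand the configured endpoint to every client created from this session."""

        def client(self, *args, **kwargs):
            kwargs["endpoint_url"] = settings.AWS_S3_ENDPOINT_URL
            return super().client(*args, **kwargs)

    # per the commit message, an instance of this session is then fed to S3Tar
    # instead of setting the S3_ENDPOINT_URL environment variable
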
Hugh Rundle 2c231acebe
linting and tests 2024-01-28 20:35:47 +11:00
Hugh Rundle a3e05254b5
fix avatar import path 2024-01-28 15:56:44 +11:00
Hugh Rundle 582e97e4a5
Merge branch 'image-serialize' into user-export
pulls Mouse's fix for imagefile serialization
2024-01-28 15:12:15 +11:00
Hugh Rundle 0d619f7eb4
Merge branch 'main' into user-export 2024-01-28 15:11:02 +11:00
Hugh Rundle 2bb9a85591
various fixes
- use signed url for s3 downloads
- re-arrange tar.gz file to match original
- delete all working files after tarring
- import from s3 export

TODO

- check local export and import
- fix error when avatar missing
- deal with multiple s3 storage options (e.g. Azure)
2024-01-28 15:07:55 +11:00
Braden Solt 6add81cf15 move outside of authors "if" 2024-01-27 11:02:42 -07:00
Braden Solt 629acbaa19 add series number on posts in the feed 2024-01-27 10:58:57 -07:00
Braden Solt 940274b1c2 classes that fix widths 2024-01-26 15:47:55 -07:00
Adeodato Simó accb3273f1
When determining privacy, check for unlisted early
If `followers_url` is found in `to`, the post may still be _unlisted_
if `"https://www.w3.org/ns/activitystreams#Public"` appears in `cc`.
Hence this should be checked earlier.
2024-01-26 06:45:54 -03:00
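
The ordering this describes, sketched as a small helper (names are illustrative; the real logic lives in BookWyrm's ActivityPub deserialization):

    PUBLIC = "https://www.w3.org/ns/activitystreams#Public"

    def resolve_privacy(to, cc, followers_url):
        if PUBLIC in to:
            return "public"
        if followers_url in to:
            # rule out unlisted before concluding "followers": a post addressed
            # to followers is unlisted when Public appears in cc
            return "unlisted" if PUBLIC in cc else "followers"
        return "direct"
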
Adeodato Simó 8ac873419f
refactor: eagerly use a set in recipients, get_recipients 2024-01-26 06:29:59 -03:00
Adeodato Simó 31babdfa51
Always prefer shared inboxes when computing recipient lists
This avoids duplicate submissions to remote instances when mentioning
followers (i.e., `POST /user/foo/inbox` followed by `POST /inbox`, which
results in two separate `add_status` tasks, and might generate duplicates
in the target instance).
2024-01-26 06:18:02 -03:00
Adeodato Simó 80ad36e75b
Include SESSION_COOKIE_AGE in .env.example
Suggested-by: Alexey Skobkin <skobkin-ru@ya.ru>
2024-01-25 20:28:15 +01:00
Adeodato Simó 500e4eb4f5
Merge from main to avoid conflicts 2024-01-25 20:27:54 +01:00
Adeodato Simó 82f9aa9da4
Set SESSION_COOKIE_AGE from environment, default to one month
While we do wish for a longer maximum age (up to one year, see #3082),
we only want to do that after termination of active sessions is
implemented (see #2278).

In the meantime, by reading and setting the variable from settings,
we allow site admins to alter the default.
2024-01-25 20:27:24 +01:00
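
A sketch of the resulting setting (BookWyrm reads configuration through environs; the default matches one month):

    from environs import Env

    env = Env()
    env.read_env()

    # admins can shorten or lengthen sessions via .env; the default is 30 days
    SESSION_COOKIE_AGE = env.int("SESSION_COOKIE_AGE", 3600 * 24 * 30)  # 2592000
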
Alexey Skobkin 2d4b11aaee
Adding FictionBook format ("FB2", "FB3") to autocomplete options in "Get a copy" block. 2024-01-25 01:50:10 +03:00
Mouse Reeve 193aeff4d2
Merge pull request #3245 from WesleyAC/redis-aof-auto-compact
Add redis automatic rewrite configuration.
2024-01-24 08:26:45 -08:00
Rohan Sureshkumar c4596544a3 Issue-3187: fix failing tests 2024-01-24 19:18:46 +05:30
Wesley Aptekar-Cassels 30ba8d37dc Add redis automatic rewrite configuration.
This should hopefully prevent the AOF file from growing too large.
2024-01-23 18:19:31 -05:00
Bart Schuurmans eb6bea013f Fix pylint warning 2024-01-21 11:04:08 +01:00
Bart Schuurmans 646b27b7a7 OpenGraph: fall back on book cover when preview images are disabled 2024-01-20 17:34:52 +01:00
Bart Schuurmans ea9d3f8ba1 Use Status.page_image for OpenGraph tags 2024-01-20 17:34:52 +01:00
Bart Schuurmans 290ee997b3 Refactor OpenGraph tags logic 2024-01-20 17:34:52 +01:00
Bart Schuurmans ad56024ffe Add Status.page_image property 2024-01-20 17:34:52 +01:00
Bart Schuurmans f7b4d9ea50 Give individual status page a title and OpenGraph description 2024-01-20 17:34:52 +01:00
Bart Schuurmans 6cb3b97144 Replace python-requests with BookWyrm in user agent
Fixes #3108
2024-01-20 16:15:17 +01:00
Hugh Rundle a563275308
fix comment in env example 2024-01-20 13:27:30 +11:00
Hugh Rundle ddc35a7a52
fix multiple issues from user exports config changes
- improve nginx config
- fix DATA_UPLOAD_MAX_MEMORY_SIZE default not being an int
- translate fallback value in id_to_username template tag
- make the setting to turn on user exports easier for admins to locate

fixes #3227
fixes #3231
fixes #3232
fixes #3236
2024-01-20 13:19:13 +11:00
Hugh Rundle 26c37de2d4
linting 2024-01-20 07:16:42 +11:00
Mouse Reeve fd0b1d90b0
Merge pull request #3229 from verymilan/nginx-ttf
nginx: fix missing ttf static files
2024-01-18 14:43:05 -08:00
Milan dd5c314bd5 nginx: also serve svg static files 2024-01-18 22:29:43 +01:00
Milan a59dcfc890 nginx: fix missing ttf static files 2024-01-18 17:03:02 +01:00
Rohan Sureshkumar 8e2649ba3b Issue-3187: change variable name and code formatting 2024-01-18 21:23:25 +05:30
Rohan d73141792d
Merge branch 'main' into issue-3187 2024-01-18 21:19:20 +05:30
Hugh Rundle 469172947b
cleanup and linting 2024-01-18 18:43:45 +11:00
Hugh Rundle 833f26fd0e
Merge branch 'main' into user-export 2024-01-18 18:24:56 +11:00
Mouse Reeve fb5fae4251
Merge pull request #3219 from bSolt/issue-3178
Fix awkward clipping on about page
2024-01-17 15:31:52 -08:00
Mouse Reeve c22f189c86
Merge pull request #3216 from dato/dev-tools_require_bookworm
Ensure dev-tools uses bookworm
2024-01-17 15:31:43 -08:00
bSolt 76a3874662 add bulma classes to fix awkward spacing 2024-01-15 23:25:52 -07:00
Rohan 8144507893
Merge branch 'main' into issue-3187 2024-01-15 17:25:36 +05:30
Rohan Sureshkumar 70adf878e8 Merge branch 'issue-3187' of https://github.com/rsk2/bookwyrm into issue-3187 2024-01-15 17:23:17 +05:30
Rohan Sureshkumar 5ef104b802 Issue-3187: addressing review comments 2024-01-15 17:22:33 +05:30
Hugh Rundle d4d2734dab
ignore exports dir 2024-01-14 14:14:20 +11:00
Hugh Rundle 62cc6c298f
oops
- remove test export files
- check in emblackened files
2024-01-14 12:19:59 +11:00
Hugh Rundle cbd08127ef
initial work on fixing user exports with s3
- custom storages
- tar.gz within bucket using s3_tar
- slightly changes export directory structure
- major problems still outstanding re delivering s3 files to end users
2024-01-14 12:14:44 +11:00
Adeodato Simó eb13eb9882
Invalidate active_shelf when switching editions 2024-01-13 19:00:57 +01:00
Adeodato Simó 9a487b0442
Ensure dev-tools uses bookworm
In 1937177e1 ("dev-tools: use apt source for Node instead of setup script"),
I introduced the use of `Signed-By` with a public key block, which is only
supported in bookworm (bullseye only supports fingerprints, TTBOMK).

Python's Docker images already use bookworm by default, but we explicitly
require it now to avoid build errors if someone has a very old image laying
around (see, e.g., #3190).

(This can be dropped after Debian 13 ‘trixie’ is released.)
2024-01-13 17:55:21 +01:00
Rohan 6dc95a82d6
Merge branch 'bookwyrm-social:main' into issue-3187 2024-01-09 17:06:22 +05:30
Rohan Sureshkumar 1a682753c0 Issue-3187: changes 2024-01-09 15:31:05 +05:30
Wesley Aptekar-Cassels 6cd2c91135 Allow page numbers to be text, instead of integers.
Fixes: #2640
2024-01-04 19:09:39 -05:00
Mouse Reeve d287581620 Fixes html validation error 2024-01-02 13:31:18 -08:00
Mouse Reeve 193a1c7d54 updates wording and fixes get or create logic 2024-01-02 13:28:25 -08:00
Mouse Reeve 8be9e91d21 Re-use schedules rather than creating new ones 2024-01-02 13:18:26 -08:00
Mouse Reeve f36af42f41 Adds view to see scheduled tasks 2024-01-02 13:05:44 -08:00
Mouse Reeve 5509941aa4 Adds schedule-able task to check for version updates 2024-01-02 13:05:26 -08:00
Mouse Reeve d6f7f76c4d Removes outdated/unused version and updating code
I had the bright idea of creating this update script but it doesn't work
and hasn't been maintained, so it's just sitting there causing confusion
and requiring weird things to exist in other places.

Now, the unused `version` field can be removed and I can scrap the
management command for getting versions.
2024-01-02 11:37:01 -08:00
Sean Molenaar 5d09c54e57
Merge branch 'main' into feat/api/oauth 2023-12-07 15:38:19 +01:00
Sean Molenaar b7ba6f1a36
urls.py: fix style 2023-11-30 11:25:51 +01:00
Matt Lehrer 7f55495287 Merge branch 'move-ratings-and-reviews-when-switching-editions' of github.com:mattlehrer/bookwyrm into move-ratings-and-reviews-when-switching-editions 2023-11-30 11:15:33 +01:00
Matt Lehrer 31a78a5c9e linted 2023-11-30 11:13:11 +01:00
Adeodato Simó d6eb390cee
Add test that forces book_authors_search_vector_trigger to execute 2023-11-26 15:59:17 -03:00
Adeodato Simó b5805accac
Minor improvements to bookwyrm_book trigger code
- do not COALESCE columns that cannot be NULL
- do not bring bookwyrm_book into the author names JOIN
- add comments documenting the four steps
2023-11-25 21:49:15 -03:00
Adeodato Simó bbfbd1e97a
Add tests for trigger code (i.e. how search_vector is computed) 2023-11-25 20:54:49 -03:00
Adeodato Simó 9bcb5b80ea
Further simplify bookwyrm_author trigger 2023-11-25 18:13:40 -03:00
Adeodato Simó 8df408e07e
Define search_vector_trigger via Book.Meta.triggers 2023-11-25 17:02:54 -03:00
Adeodato Simó bcb3a343d4
Fix JOIN in author_search_vector_trigger, add missing WHERE clause 2023-11-25 16:23:21 -03:00
Adeodato Simó 416a6caf2d
Define author_search_vector_trigger via Author.Meta.triggers
Previously, triggers lived only in a particular migration file. With
this change, code for the triggers resides in the model, and their
lifecycle is managed through normal Django migrations.
2023-11-25 16:17:51 -03:00
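
With django-pgtrigger, a trigger declared on the model looks roughly like this reduced sketch (the real search-vector SQL is more involved):

    import pgtrigger
    from django.contrib.postgres.search import SearchVectorField
    from django.db import models

    class Author(models.Model):  # reduced sketch of the idea
        name = models.CharField(max_length=255)
        search_vector = SearchVectorField(null=True)

        class Meta:
            triggers = [
                pgtrigger.Trigger(
                    name="update_author_search_vector",
                    when=pgtrigger.Before,
                    operation=pgtrigger.Insert | pgtrigger.UpdateOf("name"),
                    func="NEW.search_vector := to_tsvector('simple', NEW.name); RETURN NEW;",
                ),
            ]

Because the trigger is part of Meta, changes to it show up in makemigrations like any other schema change.
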
Adeodato Simó 44ef928c3c
Alter object row IDs to force test failure in original code 2023-11-25 16:11:01 -03:00
Adeodato Simó e4d688665c
Remove index for author.search_vector, which is never used 2023-11-24 22:43:12 -03:00
Adeodato Simó 0299f2e235
Add functional tests for search_vector triggers
As metadata changes, search continues to work.
2023-11-24 22:28:41 -03:00
Matt Lehrer 6933f70af3
Merge branch 'bookwyrm-social:main' into move-ratings-and-reviews-when-switching-editions 2023-11-20 09:31:45 +01:00
Matt Lehrer f4da9fbf34 remove unnecessary loop.
ReviewRatings are a subclass and are included in the models.Review block
2023-11-16 20:37:46 +01:00
Matt Lehrer bf81192d73
Merge branch 'main' into move-ratings-and-reviews-when-switching-editions 2023-11-16 10:49:05 +01:00
Sean Molenaar e144ce19fa
fix: add include import from django.urls 2023-11-16 10:48:06 +01:00
Matt Lehrer bd920a4630 move reviews to new edition 2023-11-16 10:38:45 +01:00
Matt Lehrer 7684101f15 move ratings to new edition 2023-11-16 10:38:41 +01:00
Sean Molenaar da4214ad61 feat: add OAuth authentication
Issue GH-2292
2023-11-14 14:18:35 +01:00
Mouse Reeve 44b14f4933 Fixes workflow errors 2023-11-08 16:00:10 -08:00
Mouse Reeve 0bb4b0d71d Changes to how images are serialized
I'm just going to see if any tests fail?
2023-11-08 15:24:47 -08:00
356 changed files with 20875 additions and 5034 deletions


@@ -16,6 +16,11 @@ DEFAULT_LANGUAGE="English"
  ## Leave unset to allow all hosts
  # ALLOWED_HOSTS="localhost,127.0.0.1,[::1]"
+ # Specify when the site is served from a port that is not the default
+ # for the protocol (80 for HTTP or 443 for HTTPS).
+ # Probably only necessary in development.
+ # PORT=1333
  MEDIA_ROOT=images/
  # Database configuration
@@ -71,14 +76,20 @@ ENABLE_THUMBNAIL_GENERATION=true
  USE_S3=false
  AWS_ACCESS_KEY_ID=
  AWS_SECRET_ACCESS_KEY=
+ # seconds for signed S3 urls to expire
+ # this is currently only used for user export files
+ S3_SIGNED_URL_EXPIRY=900
  # Commented are example values if you use a non-AWS, S3-compatible service
  # AWS S3 should work with only AWS_STORAGE_BUCKET_NAME and AWS_S3_REGION_NAME
  # non-AWS S3-compatible services will need AWS_STORAGE_BUCKET_NAME,
- # along with both AWS_S3_CUSTOM_DOMAIN and AWS_S3_ENDPOINT_URL
+ # along with both AWS_S3_CUSTOM_DOMAIN and AWS_S3_ENDPOINT_URL.
+ # AWS_S3_URL_PROTOCOL must end in ":" and defaults to the same protocol as
+ # the BookWyrm instance ("http:" or "https:", based on USE_SSL).
  # AWS_STORAGE_BUCKET_NAME= # "example-bucket-name"
  # AWS_S3_CUSTOM_DOMAIN=None # "example-bucket-name.s3.fr-par.scw.cloud"
+ # AWS_S3_URL_PROTOCOL=None # "http:"
  # AWS_S3_REGION_NAME=None # "fr-par"
  # AWS_S3_ENDPOINT_URL=None # "https://s3.fr-par.scw.cloud"
@@ -133,10 +144,14 @@ HTTP_X_FORWARDED_PROTO=false
  TWO_FACTOR_LOGIN_VALIDITY_WINDOW=2
  TWO_FACTOR_LOGIN_MAX_SECONDS=60
- # Additional hosts to allow in the Content-Security-Policy, "self" (should be DOMAIN)
- # and AWS_S3_CUSTOM_DOMAIN (if used) are added by default.
- # Value should be a comma-separated list of host names.
+ # Additional hosts to allow in the Content-Security-Policy, "self" (should be
+ # DOMAIN with optionally ":" + PORT) and AWS_S3_CUSTOM_DOMAIN (if used) are
+ # added by default. Value should be a comma-separated list of host names.
  CSP_ADDITIONAL_HOSTS=
- # The last number here means "megabytes"
- # Increase if users are having trouble uploading BookWyrm export files.
- DATA_UPLOAD_MAX_MEMORY_SIZE = (1024**2 * 100)
+ # Time before being logged out (in seconds)
+ # SESSION_COOKIE_AGE=2592000 # current default: 30 days
+ # Maximum allowed memory for file uploads (increase if users are having trouble
+ # uploading BookWyrm export files).
+ # DATA_UPLOAD_MAX_MEMORY_MiB=100


@@ -1,17 +0,0 @@
name: Python Formatting (run ./bw-dev black to fix)
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: psf/black@22.12.0
with:
version: 22.12.0


@@ -36,11 +36,11 @@ jobs:
  steps:
  - name: Checkout repository
- uses: actions/checkout@v3
+ uses: actions/checkout@v4
  # Initializes the CodeQL tools for scanning.
  - name: Initialize CodeQL
- uses: github/codeql-action/init@v2
+ uses: github/codeql-action/init@v3
  with:
  languages: ${{ matrix.language }}
  # If you wish to specify custom queries, you can do so here or in a config file.
@@ -51,7 +51,7 @@ jobs:
  # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
  # If this step fails, then you should remove it and run the build manually (see below)
  - name: Autobuild
- uses: github/codeql-action/autobuild@v2
+ uses: github/codeql-action/autobuild@v3
  # Command-line programs to run using the OS shell.
  # 📚 https://git.io/JvXDl
@@ -65,4 +65,4 @@ jobs:
  # make release
  - name: Perform CodeQL Analysis
- uses: github/codeql-action/analyze@v2
+ uses: github/codeql-action/analyze@v3


@@ -10,7 +10,7 @@ jobs:
  lint:
  runs-on: ubuntu-latest
  steps:
- - uses: actions/checkout@v3
+ - uses: actions/checkout@v4
  - name: Install curlylint
  run: pip install curlylint


@@ -1,70 +0,0 @@
name: Run Python Tests
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-20.04
services:
postgres:
image: postgres:13
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: hunter2
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Check migrations up-to-date
run: |
python ./manage.py makemigrations --check
env:
SECRET_KEY: beepbeep
DOMAIN: your.domain.here
EMAIL_HOST: ""
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
- name: Run Tests
env:
SECRET_KEY: beepbeep
DEBUG: false
USE_HTTPS: true
DOMAIN: your.domain.here
BOOKWYRM_DATABASE_BACKEND: postgres
MEDIA_ROOT: images/
POSTGRES_PASSWORD: hunter2
POSTGRES_USER: postgres
POSTGRES_DB: github_actions
POSTGRES_HOST: 127.0.0.1
CELERY_BROKER: ""
REDIS_BROKER_PORT: 6379
REDIS_BROKER_PASSWORD: beep
USE_DUMMY_CACHE: true
FLOWER_PORT: 8888
EMAIL_HOST: "smtp.mailgun.org"
EMAIL_PORT: 587
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
EMAIL_USE_TLS: true
ENABLE_PREVIEW_IMAGES: false
ENABLE_THUMBNAIL_GENERATION: true
HTTP_X_FORWARDED_PROTO: false
run: |
pytest -n 3

View file

@ -19,10 +19,11 @@ jobs:
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it.
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install modules
run: npm install stylelint stylelint-config-recommended stylelint-config-standard stylelint-order eslint
# run: npm install stylelint stylelint-config-recommended stylelint-config-standard stylelint-order eslint
run: npm install eslint@^8.9.0
# See .stylelintignore for files that are not linted.
# - name: Run stylelint

View file

@ -1,50 +0,0 @@
name: Mypy
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.9
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Analysing the code with mypy
env:
SECRET_KEY: beepbeep
DEBUG: false
USE_HTTPS: true
DOMAIN: your.domain.here
BOOKWYRM_DATABASE_BACKEND: postgres
MEDIA_ROOT: images/
POSTGRES_PASSWORD: hunter2
POSTGRES_USER: postgres
POSTGRES_DB: github_actions
POSTGRES_HOST: 127.0.0.1
CELERY_BROKER: ""
REDIS_BROKER_PORT: 6379
REDIS_BROKER_PASSWORD: beep
USE_DUMMY_CACHE: true
FLOWER_PORT: 8888
EMAIL_HOST: "smtp.mailgun.org"
EMAIL_PORT: 587
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
EMAIL_USE_TLS: true
ENABLE_PREVIEW_IMAGES: false
ENABLE_THUMBNAIL_GENERATION: true
HTTP_X_FORWARDED_PROTO: false
run: |
mypy bookwyrm celerywyrm

View file

@ -14,7 +14,7 @@ jobs:
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it.
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install modules
run: npm install prettier@2.5.1

View file

@ -1,27 +0,0 @@
name: Pylint
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.9
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Analysing the code with pylint
run: |
pylint bookwyrm/

.github/workflows/python.yml (new file)
View file

@ -0,0 +1,99 @@
name: Python
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
# overrides for .env.example
env:
POSTGRES_HOST: 127.0.0.1
PGPORT: 5432
POSTGRES_USER: postgres
POSTGRES_PASSWORD: hunter2
POSTGRES_DB: github_actions
SECRET_KEY: beepbeep
EMAIL_HOST_USER: ""
EMAIL_HOST_PASSWORD: ""
jobs:
pytest:
name: Tests (pytest)
runs-on: ubuntu-latest
services:
postgres:
image: postgres:13
env: # does not inherit from jobs.build.env
POSTGRES_USER: postgres
POSTGRES_PASSWORD: hunter2
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: 3.11
cache: pip
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pytest-github-actions-annotate-failures
- name: Set up .env
run: cp .env.example .env
- name: Check migrations up-to-date
run: python ./manage.py makemigrations --check
- name: Run Tests
run: pytest -n 3
pylint:
name: Linting (pylint)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: 3.11
cache: pip
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Analyse code with pylint
run: pylint bookwyrm/
mypy:
name: Typing (mypy)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: 3.11
cache: pip
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
- name: Set up .env
run: cp .env.example .env
- name: Analyse code with mypy
run: mypy bookwyrm celerywyrm
black:
name: Formatting (black; run ./bw-dev black to fix)
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- uses: psf/black@stable
with:
version: "22.*"

.gitignore
View file

@ -16,6 +16,8 @@
# BookWyrm
.env
/images/
/exports/
/static/
bookwyrm/static/css/bookwyrm.css
bookwyrm/static/css/themes/
!bookwyrm/static/css/themes/bookwyrm-*.scss
@ -36,3 +38,6 @@ nginx/default.conf
#macOS
**/.DS_Store
# Docker
docker-compose.override.yml

View file

@ -1,4 +1,4 @@
FROM python:3.9
FROM python:3.11
ENV PYTHONUNBUFFERED 1

View file

@ -10,7 +10,6 @@ BookWyrm is a social network for tracking your reading, talking about books, wri
## Links
[![Mastodon Follow](https://img.shields.io/mastodon/follow/000146121?domain=https%3A%2F%2Ftech.lgbt&style=social)](https://tech.lgbt/@bookwyrm)
[![Twitter Follow](https://img.shields.io/twitter/follow/BookWyrmSocial?style=social)](https://twitter.com/BookWyrmSocial)
- [Project homepage](https://joinbookwyrm.com/)
- [Support](https://patreon.com/bookwyrm)

View file

@ -1 +1 @@
0.7.1
0.7.3

View file

@ -20,6 +20,7 @@ from bookwyrm.tasks import app, MISC
logger = logging.getLogger(__name__)
# pylint: disable=invalid-name
TBookWyrmModel = TypeVar("TBookWyrmModel", bound=base_model.BookWyrmModel)
@ -399,11 +400,11 @@ def get_representative():
to sign outgoing HTTP GET requests"""
return models.User.objects.get_or_create(
username=f"{INSTANCE_ACTOR_USERNAME}@{DOMAIN}",
defaults=dict(
email="bookwyrm@localhost",
local=True,
localname=INSTANCE_ACTOR_USERNAME,
),
defaults={
"email": "bookwyrm@localhost",
"local": True,
"localname": INSTANCE_ACTOR_USERNAME,
},
)[0]
@ -423,6 +424,7 @@ def get_activitypub_data(url):
"Date": now,
"Signature": make_signature("get", sender, url, now),
},
timeout=15,
)
except requests.RequestException:
raise ConnectorException()
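The timeout=15 added here (and to other requests calls in this changeset) bounds how long a signed GET can hang on an unresponsive remote server. A minimal standalone sketch of the behaviour; the URL is hypothetical:

    import requests

    try:
        # Without timeout=, this call could block indefinitely on a stalled server;
        # with timeout=15 it raises requests.exceptions.Timeout once connecting or
        # reading exceeds 15 seconds. Timeout is a subclass of RequestException,
        # so the existing except clause still catches it.
        resp = requests.get("https://example.social/user/mouse", timeout=15)
    except requests.RequestException as err:
        print(f"remote fetch failed: {err}")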

View file

@ -1,5 +1,5 @@
""" actor serializer """
from dataclasses import dataclass, field
from dataclasses import dataclass
from typing import Dict
from .base_activity import ActivityObject
@ -35,7 +35,7 @@ class Person(ActivityObject):
endpoints: Dict = None
name: str = None
summary: str = None
icon: Image = field(default_factory=lambda: {})
icon: Image = None
bookwyrmUser: bool = False
manuallyApprovesFollowers: str = False
discoverable: str = False
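A small self-contained illustration of the default change (Before and After are stand-ins, not the project's Person class): with default_factory every instance carried a fresh empty dict as its icon, whereas a None default leaves the attribute unset unless a value is provided, presumably so an absent icon can be omitted during serialization rather than emitted as an empty object.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Before:
        icon: dict = field(default_factory=dict)  # every instance gets a fresh {}

    @dataclass
    class After:
        icon: Optional[dict] = None  # stays unset unless a value is given

    print(Before().icon)  # {}
    print(After().icon)   # None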

View file

@ -139,14 +139,14 @@ class ActivityStream(RedisStore):
| (
Q(following=status.user) & Q(following=status.reply_parent.user)
) # if the user is following both authors
).distinct()
)
# only visible to the poster's followers and tagged users
elif status.privacy == "followers":
audience = audience.filter(
Q(following=status.user) # if the user is following the author
)
return audience.distinct()
return audience.distinct("id")
@tracer.start_as_current_span("ActivityStream.get_audience")
def get_audience(self, status):
@ -156,7 +156,7 @@ class ActivityStream(RedisStore):
status_author = models.User.objects.filter(
is_active=True, local=True, id=status.user.id
).values_list("id", flat=True)
return list(set(list(audience) + list(status_author)))
return list(set(audience) | set(status_author))
def get_stores_for_users(self, user_ids):
"""convert a list of user ids into redis store ids"""
@ -183,15 +183,13 @@ class HomeStream(ActivityStream):
def get_audience(self, status):
trace.get_current_span().set_attribute("stream_id", self.key)
audience = super()._get_audience(status)
if not audience:
return []
# if the user is following the author
audience = audience.filter(following=status.user).values_list("id", flat=True)
# if the user is the post's author
status_author = models.User.objects.filter(
is_active=True, local=True, id=status.user.id
).values_list("id", flat=True)
return list(set(list(audience) + list(status_author)))
return list(set(audience) | set(status_author))
def get_statuses_for_user(self, user):
return models.Status.privacy_filter(
@ -239,9 +237,7 @@ class BooksStream(ActivityStream):
)
audience = super()._get_audience(status)
if not audience:
return models.User.objects.none()
return audience.filter(shelfbook__book__parent_work=work).distinct()
return audience.filter(shelfbook__book__parent_work=work)
def get_audience(self, status):
# only show public statuses on the books feed,

View file

@ -1,4 +1,5 @@
"""Do further startup configuration and initialization"""
import os
import urllib
import logging
@ -14,16 +15,16 @@ def download_file(url, destination):
"""Downloads a file to the given path"""
try:
# Ensure our destination directory exists
os.makedirs(os.path.dirname(destination))
os.makedirs(os.path.dirname(destination), exist_ok=True)
with urllib.request.urlopen(url) as stream:
with open(destination, "b+w") as outfile:
outfile.write(stream.read())
except (urllib.error.HTTPError, urllib.error.URLError):
logger.info("Failed to download file %s", url)
except OSError:
logger.info("Couldn't open font file %s for writing", destination)
except: # pylint: disable=bare-except
logger.info("Unknown error in file download")
except (urllib.error.HTTPError, urllib.error.URLError) as err:
logger.error("Failed to download file %s: %s", url, err)
except OSError as err:
logger.error("Couldn't open font file %s for writing: %s", destination, err)
except Exception as err: # pylint:disable=broad-except
logger.error("Unknown error in file download: %s", err)
class BookwyrmConfig(AppConfig):

View file

@ -3,7 +3,9 @@ from __future__ import annotations
from abc import ABC, abstractmethod
from typing import Optional, TypedDict, Any, Callable, Union, Iterator
from urllib.parse import quote_plus
import imghdr
# pylint: disable-next=deprecated-module
import imghdr # Deprecated in 3.11 for removal in 3.13; no good alternative yet
import logging
import re
import asyncio

View file

@ -118,9 +118,11 @@ def get_connectors() -> Iterator[abstract_connector.AbstractConnector]:
def get_or_create_connector(remote_id: str) -> abstract_connector.AbstractConnector:
"""get the connector related to the object's server"""
url = urlparse(remote_id)
identifier = url.netloc
identifier = url.hostname
if not identifier:
raise ValueError("Invalid remote id")
raise ValueError(f"Invalid remote id: {remote_id}")
base_url = f"{url.scheme}://{url.netloc}"
try:
connector_info = models.Connector.objects.get(identifier=identifier)
@ -128,10 +130,10 @@ def get_or_create_connector(remote_id: str) -> abstract_connector.AbstractConnec
connector_info = models.Connector.objects.create(
identifier=identifier,
connector_file="bookwyrm_connector",
base_url=f"https://{identifier}",
books_url=f"https://{identifier}/book",
covers_url=f"https://{identifier}/images/covers",
search_url=f"https://{identifier}/search?q=",
base_url=base_url,
books_url=f"{base_url}/book",
covers_url=f"{base_url}/images/covers",
search_url=f"{base_url}/search?q=",
priority=2,
)
@ -143,7 +145,9 @@ def load_more_data(connector_id: str, book_id: str) -> None:
"""background the work of getting all 10,000 editions of LoTR"""
connector_info = models.Connector.objects.get(id=connector_id)
connector = load_connector(connector_info)
book = models.Book.objects.select_subclasses().get(id=book_id)
book = models.Book.objects.select_subclasses().get( # type: ignore[no-untyped-call]
id=book_id
)
connector.expand_book_data(book)
@ -154,7 +158,9 @@ def create_edition_task(
"""separate task for each of the 10,000 editions of LoTR"""
connector_info = models.Connector.objects.get(id=connector_id)
connector = load_connector(connector_info)
work = models.Work.objects.select_subclasses().get(id=work_id)
work = models.Work.objects.select_subclasses().get( # type: ignore[no-untyped-call]
id=work_id
)
connector.create_edition_from_data(work, data)
@ -188,8 +194,11 @@ def raise_not_valid_url(url: str) -> None:
if not parsed.scheme in ["http", "https"]:
raise ConnectorException("Invalid scheme: ", url)
if not parsed.hostname:
raise ConnectorException("Hostname missing: ", url)
try:
ipaddress.ip_address(parsed.netloc)
ipaddress.ip_address(parsed.hostname)
raise ConnectorException("Provided url is an IP address: ", url)
except ValueError:
# it's not an IP address, which is good

View file

@ -229,7 +229,7 @@ class Connector(AbstractConnector):
data = get_data(url)
except ConnectorException:
return ""
return data.get("extract", "")
return str(data.get("extract", ""))
def get_remote_id_from_model(self, obj: models.BookDataModel) -> str:
"""use get_remote_id to figure out the link from a model obj"""

View file

@ -4,7 +4,7 @@ from django.template.loader import get_template
from bookwyrm import models, settings
from bookwyrm.tasks import app, EMAIL
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import DOMAIN, BASE_URL
def email_data():
@ -14,6 +14,7 @@ def email_data():
"site_name": site.name,
"logo": site.logo_small_url,
"domain": DOMAIN,
"base_url": BASE_URL,
"user": None,
}

View file

@ -15,6 +15,7 @@ class AuthorForm(CustomForm):
"aliases",
"bio",
"wikipedia_link",
"wikidata",
"website",
"born",
"died",
@ -32,6 +33,7 @@ class AuthorForm(CustomForm):
"wikipedia_link": forms.TextInput(
attrs={"aria-describedby": "desc_wikipedia_link"}
),
"wikidata": forms.TextInput(attrs={"aria-describedby": "desc_wikidata"}),
"website": forms.TextInput(attrs={"aria-describedby": "desc_website"}),
"born": forms.SelectDateWidget(attrs={"aria-describedby": "desc_born"}),
"died": forms.SelectDateWidget(attrs={"aria-describedby": "desc_died"}),

View file

@ -1,4 +1,5 @@
""" using django model forms """
from urllib.parse import urlparse
from django.utils.translation import gettext_lazy as _
@ -25,7 +26,7 @@ class FileLinkForm(CustomForm):
url = cleaned_data.get("url")
filetype = cleaned_data.get("filetype")
book = cleaned_data.get("book")
domain = urlparse(url).netloc
domain = urlparse(url).hostname
if models.LinkDomain.objects.filter(domain=domain).exists():
status = models.LinkDomain.objects.get(domain=domain).status
if status == "blocked":
@ -37,10 +38,9 @@ class FileLinkForm(CustomForm):
),
)
if (
not self.instance
and models.FileLink.objects.filter(
url=url, book=book, filetype=filetype
).exists()
models.FileLink.objects.filter(url=url, book=book, filetype=filetype)
.exclude(pk=self.instance)
.exists()
):
# pylint: disable=line-too-long
self.add_error(

View file

@ -14,15 +14,10 @@ class CalibreImporter(Importer):
def __init__(self, *args: Any, **kwargs: Any):
# Add timestamp to row_mappings_guesses for date_added to avoid
# integrity error
row_mappings_guesses = []
for field, mapping in self.row_mappings_guesses:
if field in ("date_added",):
row_mappings_guesses.append((field, mapping + ["timestamp"]))
else:
row_mappings_guesses.append((field, mapping))
self.row_mappings_guesses = row_mappings_guesses
self.row_mappings_guesses = [
(field, mapping + (["timestamp"] if field == "date_added" else []))
for field, mapping in self.row_mappings_guesses
]
super().__init__(*args, **kwargs)
def get_shelf(self, normalized_row: dict[str, Optional[str]]) -> Optional[str]:

View file

@ -26,7 +26,7 @@ class IsbnHyphenator:
def update_range_message(self) -> None:
"""Download the range message xml file and save it locally"""
response = requests.get(self.__range_message_url)
response = requests.get(self.__range_message_url, timeout=15)
with open(self.__range_file_path, "w", encoding="utf-8") as file:
file.write(response.text)
self.__element_tree = None

View file

@ -1,13 +1,14 @@
""" PROCEED WITH CAUTION: uses deduplication fields to permanently
merge book data objects """
from django.core.management.base import BaseCommand
from django.db.models import Count
from bookwyrm import models
from bookwyrm.management.merge import merge_objects
def dedupe_model(model):
def dedupe_model(model, dry_run=False):
"""combine duplicate editions and update related models"""
print(f"deduplicating {model.__name__}:")
fields = model._meta.get_fields()
dedupe_fields = [
f for f in fields if hasattr(f, "deduplication_field") and f.deduplication_field
@ -16,30 +17,42 @@ def dedupe_model(model):
dupes = (
model.objects.values(field.name)
.annotate(Count(field.name))
.filter(**{"%s__count__gt" % field.name: 1})
.filter(**{f"{field.name}__count__gt": 1})
.exclude(**{field.name: ""})
.exclude(**{f"{field.name}__isnull": True})
)
for dupe in dupes:
value = dupe[field.name]
if not value or value == "":
continue
print("----------")
print(dupe)
objs = model.objects.filter(**{field.name: value}).order_by("id")
canonical = objs.first()
print("keeping", canonical.remote_id)
action = "would merge" if dry_run else "merging"
print(
f"{action} into {model.__name__} {canonical.remote_id} based on {field.name} {value}:"
)
for obj in objs[1:]:
print(obj.remote_id)
merge_objects(canonical, obj)
print(f"- {obj.remote_id}")
absorbed_fields = obj.merge_into(canonical, dry_run=dry_run)
print(f" absorbed fields: {absorbed_fields}")
class Command(BaseCommand):
"""deduplicate allllll the book data models"""
help = "merges duplicate book data"
def add_arguments(self, parser):
"""add the arguments for this command"""
parser.add_argument(
"--dry_run",
action="store_true",
help="don't actually merge, only print what would happen",
)
# pylint: disable=no-self-use,unused-argument
def handle(self, *args, **options):
"""run deduplications"""
dedupe_model(models.Edition)
dedupe_model(models.Work)
dedupe_model(models.Author)
dedupe_model(models.Edition, dry_run=options["dry_run"])
dedupe_model(models.Work, dry_run=options["dry_run"])
dedupe_model(models.Author, dry_run=options["dry_run"])

View file

@ -1,54 +0,0 @@
""" Get your admin code to allow install """
from django.core.management.base import BaseCommand
from bookwyrm import models
from bookwyrm.settings import VERSION
# pylint: disable=no-self-use
class Command(BaseCommand):
"""command-line options"""
help = "What version is this?"
def add_arguments(self, parser):
"""specify which function to run"""
parser.add_argument(
"--current",
action="store_true",
help="Version stored in database",
)
parser.add_argument(
"--target",
action="store_true",
help="Version stored in settings",
)
parser.add_argument(
"--update",
action="store_true",
help="Update database version",
)
# pylint: disable=unused-argument
def handle(self, *args, **options):
"""execute init"""
site = models.SiteSettings.objects.get()
current = site.version or "0.0.1"
target = VERSION
if options.get("current"):
print(current)
return
if options.get("target"):
print(target)
return
if options.get("update"):
site.version = target
site.save()
return
if current != target:
print(f"{current}/{target}")
else:
print(current)

View file

@ -1,50 +0,0 @@
from django.db.models import ManyToManyField
def update_related(canonical, obj):
"""update all the models with fk to the object being removed"""
# move related models to canonical
related_models = [
(r.remote_field.name, r.related_model) for r in canonical._meta.related_objects
]
for (related_field, related_model) in related_models:
# Skip the ManyToMany fields that arent auto-created. These
# should have a corresponding OneToMany field in the model for
# the linking table anyway. If we update it through that model
# instead then we wont lose the extra fields in the linking
# table.
related_field_obj = related_model._meta.get_field(related_field)
if isinstance(related_field_obj, ManyToManyField):
through = related_field_obj.remote_field.through
if not through._meta.auto_created:
continue
related_objs = related_model.objects.filter(**{related_field: obj})
for related_obj in related_objs:
print("replacing in", related_model.__name__, related_field, related_obj.id)
try:
setattr(related_obj, related_field, canonical)
related_obj.save()
except TypeError:
getattr(related_obj, related_field).add(canonical)
getattr(related_obj, related_field).remove(obj)
def copy_data(canonical, obj):
"""try to get the most data possible"""
for data_field in obj._meta.get_fields():
if not hasattr(data_field, "activitypub_field"):
continue
data_value = getattr(obj, data_field.name)
if not data_value:
continue
if not getattr(canonical, data_field.name):
print("setting data field", data_field.name, data_value)
setattr(canonical, data_field.name, data_value)
canonical.save()
def merge_objects(canonical, obj):
copy_data(canonical, obj)
update_related(canonical, obj)
# remove the outdated entry
obj.delete()

View file

@ -1,4 +1,3 @@
from bookwyrm.management.merge import merge_objects
from django.core.management.base import BaseCommand
@ -9,6 +8,11 @@ class MergeCommand(BaseCommand):
"""add the arguments for this command"""
parser.add_argument("--canonical", type=int, required=True)
parser.add_argument("--other", type=int, required=True)
parser.add_argument(
"--dry_run",
action="store_true",
help="don't actually merge, only print what would happen",
)
# pylint: disable=no-self-use,unused-argument
def handle(self, *args, **options):
@ -26,4 +30,8 @@ class MergeCommand(BaseCommand):
print("other book doesnt exist!")
return
merge_objects(canonical, other)
absorbed_fields = other.merge_into(canonical, dry_run=options["dry_run"])
action = "would be" if options["dry_run"] else "has been"
print(f"{other.remote_id} {action} merged into {canonical.remote_id}")
print(f"absorbed fields: {absorbed_fields}")

View file

@ -1,5 +1,5 @@
""" Makes the app aware of the users timezone """
import pytz
import zoneinfo
from django.utils import timezone
@ -12,9 +12,7 @@ class TimezoneMiddleware:
def __call__(self, request):
if request.user.is_authenticated:
timezone.activate(pytz.timezone(request.user.preferred_timezone))
timezone.activate(zoneinfo.ZoneInfo(request.user.preferred_timezone))
else:
timezone.activate(pytz.utc)
response = self.get_response(request)
timezone.deactivate()
return response
timezone.deactivate()
return self.get_response(request)
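This is part of the wider pytz-to-zoneinfo migration in this changeset: zoneinfo is in the standard library since Python 3.9, and its keys are the same IANA names stored in preferred_timezone. A minimal standalone sketch of the replacement call, with an illustrative timezone key:

    import zoneinfo
    from django.utils import timezone

    # django.utils.timezone.activate() accepts any tzinfo instance, so
    # zoneinfo.ZoneInfo is a drop-in replacement for pytz.timezone() here.
    timezone.activate(zoneinfo.ZoneInfo("Europe/Amsterdam"))
    print(timezone.get_current_timezone_name())  # typically "Europe/Amsterdam" on Django 4.x
    timezone.deactivate()  # falls back to settings.TIME_ZONE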

View file

@ -10,6 +10,7 @@ class Migration(migrations.Migration):
]
operations = [
# The new timezones are "Factory" and "localtime"
migrations.AlterField(
model_name="user",
name="preferred_timezone",

View file

@ -0,0 +1,16 @@
# Generated by Django 3.2.20 on 2023-11-24 17:11
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0188_theme_loads"),
]
operations = [
migrations.RemoveIndex(
model_name="author",
name="bookwyrm_au_search__b050a8_gin",
),
]

View file

@ -0,0 +1,76 @@
# Generated by Django 3.2.20 on 2023-11-25 00:47
from importlib import import_module
import re
from django.db import migrations
import pgtrigger.compiler
import pgtrigger.migrations
trigger_migration = import_module("bookwyrm.migrations.0077_auto_20210623_2155")
# it's _very_ convenient for development that this migration be reversible
search_vector_trigger = trigger_migration.Migration.operations[4]
author_search_vector_trigger = trigger_migration.Migration.operations[5]
assert re.search(r"\bCREATE TRIGGER search_vector_trigger\b", search_vector_trigger.sql)
assert re.search(
r"\bCREATE TRIGGER author_search_vector_trigger\b",
author_search_vector_trigger.sql,
)
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0190_book_search_updates"),
]
operations = [
pgtrigger.migrations.AddTrigger(
model_name="book",
trigger=pgtrigger.compiler.Trigger(
name="update_search_vector_on_book_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="new.search_vector := setweight(coalesce(nullif(to_tsvector('english', new.title), ''), to_tsvector('simple', new.title)), 'A') || setweight(to_tsvector('english', coalesce(new.subtitle, '')), 'B') || (SELECT setweight(to_tsvector('simple', coalesce(array_to_string(array_agg(bookwyrm_author.name), ' '), '')), 'C') FROM bookwyrm_author LEFT JOIN bookwyrm_book_authors ON bookwyrm_author.id = bookwyrm_book_authors.author_id WHERE bookwyrm_book_authors.book_id = new.id ) || setweight(to_tsvector('english', coalesce(new.series, '')), 'D');RETURN NEW;",
hash="77d6399497c0a89b0bf09d296e33c396da63705c",
operation='INSERT OR UPDATE OF "title", "subtitle", "series", "search_vector"',
pgid="pgtrigger_update_search_vector_on_book_edit_bec58",
table="bookwyrm_book",
when="BEFORE",
),
),
),
pgtrigger.migrations.AddTrigger(
model_name="author",
trigger=pgtrigger.compiler.Trigger(
name="reset_search_vector_on_author_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="WITH updated_books AS (SELECT book_id FROM bookwyrm_book_authors WHERE author_id = new.id ) UPDATE bookwyrm_book SET search_vector = '' FROM updated_books WHERE id = updated_books.book_id;RETURN NEW;",
hash="e7bbf08711ff3724c58f4d92fb7a082ffb3d7826",
operation='UPDATE OF "name"',
pgid="pgtrigger_reset_search_vector_on_author_edit_a447c",
table="bookwyrm_author",
when="AFTER",
),
),
),
migrations.RunSQL(
sql="""DROP TRIGGER IF EXISTS search_vector_trigger ON bookwyrm_book;
DROP FUNCTION IF EXISTS book_trigger;
""",
reverse_sql=search_vector_trigger.sql,
),
migrations.RunSQL(
sql="""DROP TRIGGER IF EXISTS author_search_vector_trigger ON bookwyrm_author;
DROP FUNCTION IF EXISTS author_trigger;
""",
reverse_sql=author_search_vector_trigger.sql,
),
migrations.RunSQL(
# Recalculate book search vector for any missed author name changes
# due to bug in JOIN in the old trigger.
sql="UPDATE bookwyrm_book SET search_vector = NULL;",
reverse_sql=migrations.RunSQL.noop,
),
]

View file

@ -0,0 +1,23 @@
# Generated by Django 3.2.23 on 2024-01-04 23:56
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0191_merge_20240102_0326"),
]
operations = [
migrations.AlterField(
model_name="quotation",
name="endposition",
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name="quotation",
name="position",
field=models.TextField(blank=True, null=True),
),
]

View file

@ -0,0 +1,18 @@
# Generated by Django 3.2.23 on 2024-01-02 19:36
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0191_merge_20240102_0326"),
]
operations = [
migrations.RenameField(
model_name="sitesettings",
old_name="version",
new_name="available_version",
),
]

View file

@ -0,0 +1,92 @@
# Generated by Django 3.2.23 on 2024-01-28 02:49
import django.core.serializers.json
from django.db import migrations, models
import django.db.models.deletion
from django.core.files.storage import storages
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0192_sitesettings_user_exports_enabled"),
]
operations = [
migrations.AddField(
model_name="bookwyrmexportjob",
name="export_json",
field=models.JSONField(
encoder=django.core.serializers.json.DjangoJSONEncoder, null=True
),
),
migrations.AddField(
model_name="bookwyrmexportjob",
name="json_completed",
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name="bookwyrmexportjob",
name="export_data",
field=models.FileField(
null=True,
storage=storages["exports"],
upload_to="",
),
),
migrations.CreateModel(
name="AddFileToTar",
fields=[
(
"childjob_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="bookwyrm.childjob",
),
),
(
"parent_export_job",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="child_edition_export_jobs",
to="bookwyrm.bookwyrmexportjob",
),
),
],
options={
"abstract": False,
},
bases=("bookwyrm.childjob",),
),
migrations.CreateModel(
name="AddBookToUserExportJob",
fields=[
(
"childjob_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="bookwyrm.childjob",
),
),
(
"edition",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="bookwyrm.edition",
),
),
],
options={
"abstract": False,
},
bases=("bookwyrm.childjob",),
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.23 on 2024-02-03 15:39
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0192_make_page_positions_text"),
("bookwyrm", "0192_sitesettings_user_exports_enabled"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.23 on 2024-02-03 16:19
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0192_rename_version_sitesettings_available_version"),
("bookwyrm", "0193_merge_20240203_1539"),
]
operations = []

View file

@ -0,0 +1,46 @@
# Generated by Django 3.2.23 on 2024-02-21 00:45
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0194_merge_20240203_1619"),
]
operations = [
migrations.AlterField(
model_name="user",
name="preferred_language",
field=models.CharField(
blank=True,
choices=[
("en-us", "English"),
("ca-es", "Català (Catalan)"),
("de-de", "Deutsch (German)"),
("eo-uy", "Esperanto (Esperanto)"),
("es-es", "Español (Spanish)"),
("eu-es", "Euskara (Basque)"),
("gl-es", "Galego (Galician)"),
("it-it", "Italiano (Italian)"),
("ko-kr", "한국어 (Korean)"),
("fi-fi", "Suomi (Finnish)"),
("fr-fr", "Français (French)"),
("lt-lt", "Lietuvių (Lithuanian)"),
("nl-nl", "Nederlands (Dutch)"),
("no-no", "Norsk (Norwegian)"),
("pl-pl", "Polski (Polish)"),
("pt-br", "Português do Brasil (Brazilian Portuguese)"),
("pt-pt", "Português Europeu (European Portuguese)"),
("ro-ro", "Română (Romanian)"),
("sv-se", "Svenska (Swedish)"),
("uk-ua", "Українська (Ukrainian)"),
("zh-hans", "简体中文 (Simplified Chinese)"),
("zh-hant", "繁體中文 (Traditional Chinese)"),
],
max_length=255,
null=True,
),
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.23 on 2024-03-18 17:37
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0193_auto_20240128_0249"),
("bookwyrm", "0195_alter_user_preferred_language"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.23 on 2024-03-18 00:48
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0191_migrate_search_vec_triggers_to_pgtriggers"),
("bookwyrm", "0195_alter_user_preferred_language"),
]
operations = []

View file

@ -0,0 +1,41 @@
# Generated by Django 3.2.25 on 2024-03-20 15:15
import django.contrib.postgres.indexes
from django.db import migrations
import pgtrigger.compiler
import pgtrigger.migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0196_merge_pr3134_into_main"),
]
operations = [
migrations.AddIndex(
model_name="author",
index=django.contrib.postgres.indexes.GinIndex(
fields=["search_vector"], name="bookwyrm_au_search__b050a8_gin"
),
),
pgtrigger.migrations.AddTrigger(
model_name="author",
trigger=pgtrigger.compiler.Trigger(
name="update_search_vector_on_author_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="new.search_vector := setweight(to_tsvector('simple', new.name), 'A') || setweight(to_tsvector('simple', coalesce(array_to_string(new.aliases, ' '), '')), 'B');RETURN NEW;",
hash="b97919016236d74d0ade51a0769a173ea269da64",
operation='INSERT OR UPDATE OF "name", "aliases", "search_vector"',
pgid="pgtrigger_update_search_vector_on_author_edit_c61cb",
table="bookwyrm_author",
when="BEFORE",
),
),
),
migrations.RunSQL(
# Calculate search vector for all Authors.
sql="UPDATE bookwyrm_author SET search_vector = NULL;",
reverse_sql="UPDATE bookwyrm_author SET search_vector = NULL;",
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-03-24 02:35
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0196_merge_20240318_1737"),
("bookwyrm", "0196_merge_pr3134_into_main"),
]
operations = []

View file

@ -0,0 +1,48 @@
# Generated by Django 3.2.24 on 2024-02-28 21:30
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0196_merge_pr3134_into_main"),
]
operations = [
migrations.CreateModel(
name="MergedBook",
fields=[
("deleted_id", models.IntegerField(primary_key=True, serialize=False)),
(
"merged_into",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="absorbed",
to="bookwyrm.book",
),
),
],
options={
"abstract": False,
},
),
migrations.CreateModel(
name="MergedAuthor",
fields=[
("deleted_id", models.IntegerField(primary_key=True, serialize=False)),
(
"merged_into",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="absorbed",
to="bookwyrm.author",
),
),
],
options={
"abstract": False,
},
),
]

View file

@ -0,0 +1,23 @@
# Generated by Django 3.2.25 on 2024-03-26 11:37
import bookwyrm.models.bookwyrm_export_job
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0197_merge_20240324_0235"),
]
operations = [
migrations.AlterField(
model_name="bookwyrmexportjob",
name="export_data",
field=models.FileField(
null=True,
storage=bookwyrm.models.bookwyrm_export_job.select_exports_storage,
upload_to="",
),
),
]

View file

@ -0,0 +1,57 @@
# Generated by Django 3.2.25 on 2024-03-20 15:52
from django.db import migrations
import pgtrigger.compiler
import pgtrigger.migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0197_author_search_vector"),
]
operations = [
pgtrigger.migrations.RemoveTrigger(
model_name="author",
name="reset_search_vector_on_author_edit",
),
pgtrigger.migrations.RemoveTrigger(
model_name="book",
name="update_search_vector_on_book_edit",
),
pgtrigger.migrations.AddTrigger(
model_name="author",
trigger=pgtrigger.compiler.Trigger(
name="reset_book_search_vector_on_author_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="WITH updated_books AS (SELECT book_id FROM bookwyrm_book_authors WHERE author_id = new.id ) UPDATE bookwyrm_book SET search_vector = '' FROM updated_books WHERE id = updated_books.book_id;RETURN NEW;",
hash="68422c0f29879c5802b82159dde45297eff53e73",
operation='UPDATE OF "name", "aliases"',
pgid="pgtrigger_reset_book_search_vector_on_author_edit_a50c7",
table="bookwyrm_author",
when="AFTER",
),
),
),
pgtrigger.migrations.AddTrigger(
model_name="book",
trigger=pgtrigger.compiler.Trigger(
name="update_search_vector_on_book_edit",
sql=pgtrigger.compiler.UpsertTriggerSql(
func="WITH author_names AS (SELECT array_to_string(bookwyrm_author.name || bookwyrm_author.aliases, ' ') AS name_and_aliases FROM bookwyrm_author LEFT JOIN bookwyrm_book_authors ON bookwyrm_author.id = bookwyrm_book_authors.author_id WHERE bookwyrm_book_authors.book_id = new.id ) SELECT setweight(coalesce(nullif(to_tsvector('english', new.title), ''), to_tsvector('simple', new.title)), 'A') || setweight(to_tsvector('english', coalesce(new.subtitle, '')), 'B') || (SELECT setweight(to_tsvector('simple', coalesce(array_to_string(array_agg(name_and_aliases), ' '), '')), 'C') FROM author_names) || setweight(to_tsvector('english', coalesce(new.series, '')), 'D') INTO new.search_vector;RETURN NEW;",
hash="9324f5ca76a6f5e63931881d62d11da11f595b2c",
operation='INSERT OR UPDATE OF "title", "subtitle", "series", "search_vector"',
pgid="pgtrigger_update_search_vector_on_book_edit_bec58",
table="bookwyrm_book",
when="BEFORE",
),
),
),
migrations.RunSQL(
# Recalculate search vector for all Books because it now includes
# Author aliases.
sql="UPDATE bookwyrm_book SET search_vector = NULL;",
reverse_sql="UPDATE bookwyrm_book SET search_vector = NULL;",
),
]

View file

@ -0,0 +1,70 @@
# Generated by Django 4.2.11 on 2024-03-29 19:25
import bookwyrm.models.fields
from django.conf import settings
from django.db import migrations
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0198_book_search_vector_author_aliases"),
]
operations = [
migrations.AlterField(
model_name="userblocks",
name="user_object",
field=bookwyrm.models.fields.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="%(class)s_user_object",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name="userblocks",
name="user_subject",
field=bookwyrm.models.fields.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="%(class)s_user_subject",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name="userfollowrequest",
name="user_object",
field=bookwyrm.models.fields.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="%(class)s_user_object",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name="userfollowrequest",
name="user_subject",
field=bookwyrm.models.fields.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="%(class)s_user_subject",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name="userfollows",
name="user_object",
field=bookwyrm.models.fields.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="%(class)s_user_object",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AlterField(
model_name="userfollows",
name="user_subject",
field=bookwyrm.models.fields.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="%(class)s_user_subject",
to=settings.AUTH_USER_MODEL,
),
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-03-26 12:17
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0198_alter_bookwyrmexportjob_export_data"),
("bookwyrm", "0198_book_search_vector_author_aliases"),
]
operations = []

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-02 19:53
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0198_book_search_vector_author_aliases"),
]
operations = [
migrations.AddIndex(
model_name="status",
index=models.Index(
fields=["remote_id"], name="bookwyrm_st_remote__06aeba_idx"
),
),
]

View file

@ -0,0 +1,633 @@
# Generated by Django 4.2.11 on 2024-04-01 20:19
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0199_alter_userblocks_user_object_and_more"),
]
operations = [
migrations.AlterField(
model_name="user",
name="preferred_timezone",
field=models.CharField(
choices=[
("Africa/Abidjan", "Africa/Abidjan"),
("Africa/Accra", "Africa/Accra"),
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
("Africa/Algiers", "Africa/Algiers"),
("Africa/Asmara", "Africa/Asmara"),
("Africa/Asmera", "Africa/Asmera"),
("Africa/Bamako", "Africa/Bamako"),
("Africa/Bangui", "Africa/Bangui"),
("Africa/Banjul", "Africa/Banjul"),
("Africa/Bissau", "Africa/Bissau"),
("Africa/Blantyre", "Africa/Blantyre"),
("Africa/Brazzaville", "Africa/Brazzaville"),
("Africa/Bujumbura", "Africa/Bujumbura"),
("Africa/Cairo", "Africa/Cairo"),
("Africa/Casablanca", "Africa/Casablanca"),
("Africa/Ceuta", "Africa/Ceuta"),
("Africa/Conakry", "Africa/Conakry"),
("Africa/Dakar", "Africa/Dakar"),
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
("Africa/Djibouti", "Africa/Djibouti"),
("Africa/Douala", "Africa/Douala"),
("Africa/El_Aaiun", "Africa/El_Aaiun"),
("Africa/Freetown", "Africa/Freetown"),
("Africa/Gaborone", "Africa/Gaborone"),
("Africa/Harare", "Africa/Harare"),
("Africa/Johannesburg", "Africa/Johannesburg"),
("Africa/Juba", "Africa/Juba"),
("Africa/Kampala", "Africa/Kampala"),
("Africa/Khartoum", "Africa/Khartoum"),
("Africa/Kigali", "Africa/Kigali"),
("Africa/Kinshasa", "Africa/Kinshasa"),
("Africa/Lagos", "Africa/Lagos"),
("Africa/Libreville", "Africa/Libreville"),
("Africa/Lome", "Africa/Lome"),
("Africa/Luanda", "Africa/Luanda"),
("Africa/Lubumbashi", "Africa/Lubumbashi"),
("Africa/Lusaka", "Africa/Lusaka"),
("Africa/Malabo", "Africa/Malabo"),
("Africa/Maputo", "Africa/Maputo"),
("Africa/Maseru", "Africa/Maseru"),
("Africa/Mbabane", "Africa/Mbabane"),
("Africa/Mogadishu", "Africa/Mogadishu"),
("Africa/Monrovia", "Africa/Monrovia"),
("Africa/Nairobi", "Africa/Nairobi"),
("Africa/Ndjamena", "Africa/Ndjamena"),
("Africa/Niamey", "Africa/Niamey"),
("Africa/Nouakchott", "Africa/Nouakchott"),
("Africa/Ouagadougou", "Africa/Ouagadougou"),
("Africa/Porto-Novo", "Africa/Porto-Novo"),
("Africa/Sao_Tome", "Africa/Sao_Tome"),
("Africa/Timbuktu", "Africa/Timbuktu"),
("Africa/Tripoli", "Africa/Tripoli"),
("Africa/Tunis", "Africa/Tunis"),
("Africa/Windhoek", "Africa/Windhoek"),
("America/Adak", "America/Adak"),
("America/Anchorage", "America/Anchorage"),
("America/Anguilla", "America/Anguilla"),
("America/Antigua", "America/Antigua"),
("America/Araguaina", "America/Araguaina"),
(
"America/Argentina/Buenos_Aires",
"America/Argentina/Buenos_Aires",
),
("America/Argentina/Catamarca", "America/Argentina/Catamarca"),
(
"America/Argentina/ComodRivadavia",
"America/Argentina/ComodRivadavia",
),
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
("America/Argentina/La_Rioja", "America/Argentina/La_Rioja"),
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
(
"America/Argentina/Rio_Gallegos",
"America/Argentina/Rio_Gallegos",
),
("America/Argentina/Salta", "America/Argentina/Salta"),
("America/Argentina/San_Juan", "America/Argentina/San_Juan"),
("America/Argentina/San_Luis", "America/Argentina/San_Luis"),
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
("America/Aruba", "America/Aruba"),
("America/Asuncion", "America/Asuncion"),
("America/Atikokan", "America/Atikokan"),
("America/Atka", "America/Atka"),
("America/Bahia", "America/Bahia"),
("America/Bahia_Banderas", "America/Bahia_Banderas"),
("America/Barbados", "America/Barbados"),
("America/Belem", "America/Belem"),
("America/Belize", "America/Belize"),
("America/Blanc-Sablon", "America/Blanc-Sablon"),
("America/Boa_Vista", "America/Boa_Vista"),
("America/Bogota", "America/Bogota"),
("America/Boise", "America/Boise"),
("America/Buenos_Aires", "America/Buenos_Aires"),
("America/Cambridge_Bay", "America/Cambridge_Bay"),
("America/Campo_Grande", "America/Campo_Grande"),
("America/Cancun", "America/Cancun"),
("America/Caracas", "America/Caracas"),
("America/Catamarca", "America/Catamarca"),
("America/Cayenne", "America/Cayenne"),
("America/Cayman", "America/Cayman"),
("America/Chicago", "America/Chicago"),
("America/Chihuahua", "America/Chihuahua"),
("America/Ciudad_Juarez", "America/Ciudad_Juarez"),
("America/Coral_Harbour", "America/Coral_Harbour"),
("America/Cordoba", "America/Cordoba"),
("America/Costa_Rica", "America/Costa_Rica"),
("America/Creston", "America/Creston"),
("America/Cuiaba", "America/Cuiaba"),
("America/Curacao", "America/Curacao"),
("America/Danmarkshavn", "America/Danmarkshavn"),
("America/Dawson", "America/Dawson"),
("America/Dawson_Creek", "America/Dawson_Creek"),
("America/Denver", "America/Denver"),
("America/Detroit", "America/Detroit"),
("America/Dominica", "America/Dominica"),
("America/Edmonton", "America/Edmonton"),
("America/Eirunepe", "America/Eirunepe"),
("America/El_Salvador", "America/El_Salvador"),
("America/Ensenada", "America/Ensenada"),
("America/Fort_Nelson", "America/Fort_Nelson"),
("America/Fort_Wayne", "America/Fort_Wayne"),
("America/Fortaleza", "America/Fortaleza"),
("America/Glace_Bay", "America/Glace_Bay"),
("America/Godthab", "America/Godthab"),
("America/Goose_Bay", "America/Goose_Bay"),
("America/Grand_Turk", "America/Grand_Turk"),
("America/Grenada", "America/Grenada"),
("America/Guadeloupe", "America/Guadeloupe"),
("America/Guatemala", "America/Guatemala"),
("America/Guayaquil", "America/Guayaquil"),
("America/Guyana", "America/Guyana"),
("America/Halifax", "America/Halifax"),
("America/Havana", "America/Havana"),
("America/Hermosillo", "America/Hermosillo"),
("America/Indiana/Indianapolis", "America/Indiana/Indianapolis"),
("America/Indiana/Knox", "America/Indiana/Knox"),
("America/Indiana/Marengo", "America/Indiana/Marengo"),
("America/Indiana/Petersburg", "America/Indiana/Petersburg"),
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
("America/Indiana/Vevay", "America/Indiana/Vevay"),
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
("America/Indiana/Winamac", "America/Indiana/Winamac"),
("America/Indianapolis", "America/Indianapolis"),
("America/Inuvik", "America/Inuvik"),
("America/Iqaluit", "America/Iqaluit"),
("America/Jamaica", "America/Jamaica"),
("America/Jujuy", "America/Jujuy"),
("America/Juneau", "America/Juneau"),
("America/Kentucky/Louisville", "America/Kentucky/Louisville"),
("America/Kentucky/Monticello", "America/Kentucky/Monticello"),
("America/Knox_IN", "America/Knox_IN"),
("America/Kralendijk", "America/Kralendijk"),
("America/La_Paz", "America/La_Paz"),
("America/Lima", "America/Lima"),
("America/Los_Angeles", "America/Los_Angeles"),
("America/Louisville", "America/Louisville"),
("America/Lower_Princes", "America/Lower_Princes"),
("America/Maceio", "America/Maceio"),
("America/Managua", "America/Managua"),
("America/Manaus", "America/Manaus"),
("America/Marigot", "America/Marigot"),
("America/Martinique", "America/Martinique"),
("America/Matamoros", "America/Matamoros"),
("America/Mazatlan", "America/Mazatlan"),
("America/Mendoza", "America/Mendoza"),
("America/Menominee", "America/Menominee"),
("America/Merida", "America/Merida"),
("America/Metlakatla", "America/Metlakatla"),
("America/Mexico_City", "America/Mexico_City"),
("America/Miquelon", "America/Miquelon"),
("America/Moncton", "America/Moncton"),
("America/Monterrey", "America/Monterrey"),
("America/Montevideo", "America/Montevideo"),
("America/Montreal", "America/Montreal"),
("America/Montserrat", "America/Montserrat"),
("America/Nassau", "America/Nassau"),
("America/New_York", "America/New_York"),
("America/Nipigon", "America/Nipigon"),
("America/Nome", "America/Nome"),
("America/Noronha", "America/Noronha"),
("America/North_Dakota/Beulah", "America/North_Dakota/Beulah"),
("America/North_Dakota/Center", "America/North_Dakota/Center"),
(
"America/North_Dakota/New_Salem",
"America/North_Dakota/New_Salem",
),
("America/Nuuk", "America/Nuuk"),
("America/Ojinaga", "America/Ojinaga"),
("America/Panama", "America/Panama"),
("America/Pangnirtung", "America/Pangnirtung"),
("America/Paramaribo", "America/Paramaribo"),
("America/Phoenix", "America/Phoenix"),
("America/Port-au-Prince", "America/Port-au-Prince"),
("America/Port_of_Spain", "America/Port_of_Spain"),
("America/Porto_Acre", "America/Porto_Acre"),
("America/Porto_Velho", "America/Porto_Velho"),
("America/Puerto_Rico", "America/Puerto_Rico"),
("America/Punta_Arenas", "America/Punta_Arenas"),
("America/Rainy_River", "America/Rainy_River"),
("America/Rankin_Inlet", "America/Rankin_Inlet"),
("America/Recife", "America/Recife"),
("America/Regina", "America/Regina"),
("America/Resolute", "America/Resolute"),
("America/Rio_Branco", "America/Rio_Branco"),
("America/Rosario", "America/Rosario"),
("America/Santa_Isabel", "America/Santa_Isabel"),
("America/Santarem", "America/Santarem"),
("America/Santiago", "America/Santiago"),
("America/Santo_Domingo", "America/Santo_Domingo"),
("America/Sao_Paulo", "America/Sao_Paulo"),
("America/Scoresbysund", "America/Scoresbysund"),
("America/Shiprock", "America/Shiprock"),
("America/Sitka", "America/Sitka"),
("America/St_Barthelemy", "America/St_Barthelemy"),
("America/St_Johns", "America/St_Johns"),
("America/St_Kitts", "America/St_Kitts"),
("America/St_Lucia", "America/St_Lucia"),
("America/St_Thomas", "America/St_Thomas"),
("America/St_Vincent", "America/St_Vincent"),
("America/Swift_Current", "America/Swift_Current"),
("America/Tegucigalpa", "America/Tegucigalpa"),
("America/Thule", "America/Thule"),
("America/Thunder_Bay", "America/Thunder_Bay"),
("America/Tijuana", "America/Tijuana"),
("America/Toronto", "America/Toronto"),
("America/Tortola", "America/Tortola"),
("America/Vancouver", "America/Vancouver"),
("America/Virgin", "America/Virgin"),
("America/Whitehorse", "America/Whitehorse"),
("America/Winnipeg", "America/Winnipeg"),
("America/Yakutat", "America/Yakutat"),
("America/Yellowknife", "America/Yellowknife"),
("Antarctica/Casey", "Antarctica/Casey"),
("Antarctica/Davis", "Antarctica/Davis"),
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
("Antarctica/Macquarie", "Antarctica/Macquarie"),
("Antarctica/Mawson", "Antarctica/Mawson"),
("Antarctica/McMurdo", "Antarctica/McMurdo"),
("Antarctica/Palmer", "Antarctica/Palmer"),
("Antarctica/Rothera", "Antarctica/Rothera"),
("Antarctica/South_Pole", "Antarctica/South_Pole"),
("Antarctica/Syowa", "Antarctica/Syowa"),
("Antarctica/Troll", "Antarctica/Troll"),
("Antarctica/Vostok", "Antarctica/Vostok"),
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
("Asia/Aden", "Asia/Aden"),
("Asia/Almaty", "Asia/Almaty"),
("Asia/Amman", "Asia/Amman"),
("Asia/Anadyr", "Asia/Anadyr"),
("Asia/Aqtau", "Asia/Aqtau"),
("Asia/Aqtobe", "Asia/Aqtobe"),
("Asia/Ashgabat", "Asia/Ashgabat"),
("Asia/Ashkhabad", "Asia/Ashkhabad"),
("Asia/Atyrau", "Asia/Atyrau"),
("Asia/Baghdad", "Asia/Baghdad"),
("Asia/Bahrain", "Asia/Bahrain"),
("Asia/Baku", "Asia/Baku"),
("Asia/Bangkok", "Asia/Bangkok"),
("Asia/Barnaul", "Asia/Barnaul"),
("Asia/Beirut", "Asia/Beirut"),
("Asia/Bishkek", "Asia/Bishkek"),
("Asia/Brunei", "Asia/Brunei"),
("Asia/Calcutta", "Asia/Calcutta"),
("Asia/Chita", "Asia/Chita"),
("Asia/Choibalsan", "Asia/Choibalsan"),
("Asia/Chongqing", "Asia/Chongqing"),
("Asia/Chungking", "Asia/Chungking"),
("Asia/Colombo", "Asia/Colombo"),
("Asia/Dacca", "Asia/Dacca"),
("Asia/Damascus", "Asia/Damascus"),
("Asia/Dhaka", "Asia/Dhaka"),
("Asia/Dili", "Asia/Dili"),
("Asia/Dubai", "Asia/Dubai"),
("Asia/Dushanbe", "Asia/Dushanbe"),
("Asia/Famagusta", "Asia/Famagusta"),
("Asia/Gaza", "Asia/Gaza"),
("Asia/Harbin", "Asia/Harbin"),
("Asia/Hebron", "Asia/Hebron"),
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
("Asia/Hong_Kong", "Asia/Hong_Kong"),
("Asia/Hovd", "Asia/Hovd"),
("Asia/Irkutsk", "Asia/Irkutsk"),
("Asia/Istanbul", "Asia/Istanbul"),
("Asia/Jakarta", "Asia/Jakarta"),
("Asia/Jayapura", "Asia/Jayapura"),
("Asia/Jerusalem", "Asia/Jerusalem"),
("Asia/Kabul", "Asia/Kabul"),
("Asia/Kamchatka", "Asia/Kamchatka"),
("Asia/Karachi", "Asia/Karachi"),
("Asia/Kashgar", "Asia/Kashgar"),
("Asia/Kathmandu", "Asia/Kathmandu"),
("Asia/Katmandu", "Asia/Katmandu"),
("Asia/Khandyga", "Asia/Khandyga"),
("Asia/Kolkata", "Asia/Kolkata"),
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
("Asia/Kuching", "Asia/Kuching"),
("Asia/Kuwait", "Asia/Kuwait"),
("Asia/Macao", "Asia/Macao"),
("Asia/Macau", "Asia/Macau"),
("Asia/Magadan", "Asia/Magadan"),
("Asia/Makassar", "Asia/Makassar"),
("Asia/Manila", "Asia/Manila"),
("Asia/Muscat", "Asia/Muscat"),
("Asia/Nicosia", "Asia/Nicosia"),
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
("Asia/Novosibirsk", "Asia/Novosibirsk"),
("Asia/Omsk", "Asia/Omsk"),
("Asia/Oral", "Asia/Oral"),
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
("Asia/Pontianak", "Asia/Pontianak"),
("Asia/Pyongyang", "Asia/Pyongyang"),
("Asia/Qatar", "Asia/Qatar"),
("Asia/Qostanay", "Asia/Qostanay"),
("Asia/Qyzylorda", "Asia/Qyzylorda"),
("Asia/Rangoon", "Asia/Rangoon"),
("Asia/Riyadh", "Asia/Riyadh"),
("Asia/Saigon", "Asia/Saigon"),
("Asia/Sakhalin", "Asia/Sakhalin"),
("Asia/Samarkand", "Asia/Samarkand"),
("Asia/Seoul", "Asia/Seoul"),
("Asia/Shanghai", "Asia/Shanghai"),
("Asia/Singapore", "Asia/Singapore"),
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
("Asia/Taipei", "Asia/Taipei"),
("Asia/Tashkent", "Asia/Tashkent"),
("Asia/Tbilisi", "Asia/Tbilisi"),
("Asia/Tehran", "Asia/Tehran"),
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
("Asia/Thimbu", "Asia/Thimbu"),
("Asia/Thimphu", "Asia/Thimphu"),
("Asia/Tokyo", "Asia/Tokyo"),
("Asia/Tomsk", "Asia/Tomsk"),
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
("Asia/Urumqi", "Asia/Urumqi"),
("Asia/Ust-Nera", "Asia/Ust-Nera"),
("Asia/Vientiane", "Asia/Vientiane"),
("Asia/Vladivostok", "Asia/Vladivostok"),
("Asia/Yakutsk", "Asia/Yakutsk"),
("Asia/Yangon", "Asia/Yangon"),
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
("Asia/Yerevan", "Asia/Yerevan"),
("Atlantic/Azores", "Atlantic/Azores"),
("Atlantic/Bermuda", "Atlantic/Bermuda"),
("Atlantic/Canary", "Atlantic/Canary"),
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
("Atlantic/Faeroe", "Atlantic/Faeroe"),
("Atlantic/Faroe", "Atlantic/Faroe"),
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
("Atlantic/Madeira", "Atlantic/Madeira"),
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
("Atlantic/St_Helena", "Atlantic/St_Helena"),
("Atlantic/Stanley", "Atlantic/Stanley"),
("Australia/ACT", "Australia/ACT"),
("Australia/Adelaide", "Australia/Adelaide"),
("Australia/Brisbane", "Australia/Brisbane"),
("Australia/Broken_Hill", "Australia/Broken_Hill"),
("Australia/Canberra", "Australia/Canberra"),
("Australia/Currie", "Australia/Currie"),
("Australia/Darwin", "Australia/Darwin"),
("Australia/Eucla", "Australia/Eucla"),
("Australia/Hobart", "Australia/Hobart"),
("Australia/LHI", "Australia/LHI"),
("Australia/Lindeman", "Australia/Lindeman"),
("Australia/Lord_Howe", "Australia/Lord_Howe"),
("Australia/Melbourne", "Australia/Melbourne"),
("Australia/NSW", "Australia/NSW"),
("Australia/North", "Australia/North"),
("Australia/Perth", "Australia/Perth"),
("Australia/Queensland", "Australia/Queensland"),
("Australia/South", "Australia/South"),
("Australia/Sydney", "Australia/Sydney"),
("Australia/Tasmania", "Australia/Tasmania"),
("Australia/Victoria", "Australia/Victoria"),
("Australia/West", "Australia/West"),
("Australia/Yancowinna", "Australia/Yancowinna"),
("Brazil/Acre", "Brazil/Acre"),
("Brazil/DeNoronha", "Brazil/DeNoronha"),
("Brazil/East", "Brazil/East"),
("Brazil/West", "Brazil/West"),
("CET", "CET"),
("CST6CDT", "CST6CDT"),
("Canada/Atlantic", "Canada/Atlantic"),
("Canada/Central", "Canada/Central"),
("Canada/Eastern", "Canada/Eastern"),
("Canada/Mountain", "Canada/Mountain"),
("Canada/Newfoundland", "Canada/Newfoundland"),
("Canada/Pacific", "Canada/Pacific"),
("Canada/Saskatchewan", "Canada/Saskatchewan"),
("Canada/Yukon", "Canada/Yukon"),
("Chile/Continental", "Chile/Continental"),
("Chile/EasterIsland", "Chile/EasterIsland"),
("Cuba", "Cuba"),
("EET", "EET"),
("EST", "EST"),
("EST5EDT", "EST5EDT"),
("Egypt", "Egypt"),
("Eire", "Eire"),
("Etc/GMT", "Etc/GMT"),
("Etc/GMT+0", "Etc/GMT+0"),
("Etc/GMT+1", "Etc/GMT+1"),
("Etc/GMT+10", "Etc/GMT+10"),
("Etc/GMT+11", "Etc/GMT+11"),
("Etc/GMT+12", "Etc/GMT+12"),
("Etc/GMT+2", "Etc/GMT+2"),
("Etc/GMT+3", "Etc/GMT+3"),
("Etc/GMT+4", "Etc/GMT+4"),
("Etc/GMT+5", "Etc/GMT+5"),
("Etc/GMT+6", "Etc/GMT+6"),
("Etc/GMT+7", "Etc/GMT+7"),
("Etc/GMT+8", "Etc/GMT+8"),
("Etc/GMT+9", "Etc/GMT+9"),
("Etc/GMT-0", "Etc/GMT-0"),
("Etc/GMT-1", "Etc/GMT-1"),
("Etc/GMT-10", "Etc/GMT-10"),
("Etc/GMT-11", "Etc/GMT-11"),
("Etc/GMT-12", "Etc/GMT-12"),
("Etc/GMT-13", "Etc/GMT-13"),
("Etc/GMT-14", "Etc/GMT-14"),
("Etc/GMT-2", "Etc/GMT-2"),
("Etc/GMT-3", "Etc/GMT-3"),
("Etc/GMT-4", "Etc/GMT-4"),
("Etc/GMT-5", "Etc/GMT-5"),
("Etc/GMT-6", "Etc/GMT-6"),
("Etc/GMT-7", "Etc/GMT-7"),
("Etc/GMT-8", "Etc/GMT-8"),
("Etc/GMT-9", "Etc/GMT-9"),
("Etc/GMT0", "Etc/GMT0"),
("Etc/Greenwich", "Etc/Greenwich"),
("Etc/UCT", "Etc/UCT"),
("Etc/UTC", "Etc/UTC"),
("Etc/Universal", "Etc/Universal"),
("Etc/Zulu", "Etc/Zulu"),
("Europe/Amsterdam", "Europe/Amsterdam"),
("Europe/Andorra", "Europe/Andorra"),
("Europe/Astrakhan", "Europe/Astrakhan"),
("Europe/Athens", "Europe/Athens"),
("Europe/Belfast", "Europe/Belfast"),
("Europe/Belgrade", "Europe/Belgrade"),
("Europe/Berlin", "Europe/Berlin"),
("Europe/Bratislava", "Europe/Bratislava"),
("Europe/Brussels", "Europe/Brussels"),
("Europe/Bucharest", "Europe/Bucharest"),
("Europe/Budapest", "Europe/Budapest"),
("Europe/Busingen", "Europe/Busingen"),
("Europe/Chisinau", "Europe/Chisinau"),
("Europe/Copenhagen", "Europe/Copenhagen"),
("Europe/Dublin", "Europe/Dublin"),
("Europe/Gibraltar", "Europe/Gibraltar"),
("Europe/Guernsey", "Europe/Guernsey"),
("Europe/Helsinki", "Europe/Helsinki"),
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
("Europe/Istanbul", "Europe/Istanbul"),
("Europe/Jersey", "Europe/Jersey"),
("Europe/Kaliningrad", "Europe/Kaliningrad"),
("Europe/Kiev", "Europe/Kiev"),
("Europe/Kirov", "Europe/Kirov"),
("Europe/Kyiv", "Europe/Kyiv"),
("Europe/Lisbon", "Europe/Lisbon"),
("Europe/Ljubljana", "Europe/Ljubljana"),
("Europe/London", "Europe/London"),
("Europe/Luxembourg", "Europe/Luxembourg"),
("Europe/Madrid", "Europe/Madrid"),
("Europe/Malta", "Europe/Malta"),
("Europe/Mariehamn", "Europe/Mariehamn"),
("Europe/Minsk", "Europe/Minsk"),
("Europe/Monaco", "Europe/Monaco"),
("Europe/Moscow", "Europe/Moscow"),
("Europe/Nicosia", "Europe/Nicosia"),
("Europe/Oslo", "Europe/Oslo"),
("Europe/Paris", "Europe/Paris"),
("Europe/Podgorica", "Europe/Podgorica"),
("Europe/Prague", "Europe/Prague"),
("Europe/Riga", "Europe/Riga"),
("Europe/Rome", "Europe/Rome"),
("Europe/Samara", "Europe/Samara"),
("Europe/San_Marino", "Europe/San_Marino"),
("Europe/Sarajevo", "Europe/Sarajevo"),
("Europe/Saratov", "Europe/Saratov"),
("Europe/Simferopol", "Europe/Simferopol"),
("Europe/Skopje", "Europe/Skopje"),
("Europe/Sofia", "Europe/Sofia"),
("Europe/Stockholm", "Europe/Stockholm"),
("Europe/Tallinn", "Europe/Tallinn"),
("Europe/Tirane", "Europe/Tirane"),
("Europe/Tiraspol", "Europe/Tiraspol"),
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
("Europe/Uzhgorod", "Europe/Uzhgorod"),
("Europe/Vaduz", "Europe/Vaduz"),
("Europe/Vatican", "Europe/Vatican"),
("Europe/Vienna", "Europe/Vienna"),
("Europe/Vilnius", "Europe/Vilnius"),
("Europe/Volgograd", "Europe/Volgograd"),
("Europe/Warsaw", "Europe/Warsaw"),
("Europe/Zagreb", "Europe/Zagreb"),
("Europe/Zaporozhye", "Europe/Zaporozhye"),
("Europe/Zurich", "Europe/Zurich"),
("Factory", "Factory"),
("GB", "GB"),
("GB-Eire", "GB-Eire"),
("GMT", "GMT"),
("GMT+0", "GMT+0"),
("GMT-0", "GMT-0"),
("GMT0", "GMT0"),
("Greenwich", "Greenwich"),
("HST", "HST"),
("Hongkong", "Hongkong"),
("Iceland", "Iceland"),
("Indian/Antananarivo", "Indian/Antananarivo"),
("Indian/Chagos", "Indian/Chagos"),
("Indian/Christmas", "Indian/Christmas"),
("Indian/Cocos", "Indian/Cocos"),
("Indian/Comoro", "Indian/Comoro"),
("Indian/Kerguelen", "Indian/Kerguelen"),
("Indian/Mahe", "Indian/Mahe"),
("Indian/Maldives", "Indian/Maldives"),
("Indian/Mauritius", "Indian/Mauritius"),
("Indian/Mayotte", "Indian/Mayotte"),
("Indian/Reunion", "Indian/Reunion"),
("Iran", "Iran"),
("Israel", "Israel"),
("Jamaica", "Jamaica"),
("Japan", "Japan"),
("Kwajalein", "Kwajalein"),
("Libya", "Libya"),
("MET", "MET"),
("MST", "MST"),
("MST7MDT", "MST7MDT"),
("Mexico/BajaNorte", "Mexico/BajaNorte"),
("Mexico/BajaSur", "Mexico/BajaSur"),
("Mexico/General", "Mexico/General"),
("NZ", "NZ"),
("NZ-CHAT", "NZ-CHAT"),
("Navajo", "Navajo"),
("PRC", "PRC"),
("PST8PDT", "PST8PDT"),
("Pacific/Apia", "Pacific/Apia"),
("Pacific/Auckland", "Pacific/Auckland"),
("Pacific/Bougainville", "Pacific/Bougainville"),
("Pacific/Chatham", "Pacific/Chatham"),
("Pacific/Chuuk", "Pacific/Chuuk"),
("Pacific/Easter", "Pacific/Easter"),
("Pacific/Efate", "Pacific/Efate"),
("Pacific/Enderbury", "Pacific/Enderbury"),
("Pacific/Fakaofo", "Pacific/Fakaofo"),
("Pacific/Fiji", "Pacific/Fiji"),
("Pacific/Funafuti", "Pacific/Funafuti"),
("Pacific/Galapagos", "Pacific/Galapagos"),
("Pacific/Gambier", "Pacific/Gambier"),
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
("Pacific/Guam", "Pacific/Guam"),
("Pacific/Honolulu", "Pacific/Honolulu"),
("Pacific/Johnston", "Pacific/Johnston"),
("Pacific/Kanton", "Pacific/Kanton"),
("Pacific/Kiritimati", "Pacific/Kiritimati"),
("Pacific/Kosrae", "Pacific/Kosrae"),
("Pacific/Kwajalein", "Pacific/Kwajalein"),
("Pacific/Majuro", "Pacific/Majuro"),
("Pacific/Marquesas", "Pacific/Marquesas"),
("Pacific/Midway", "Pacific/Midway"),
("Pacific/Nauru", "Pacific/Nauru"),
("Pacific/Niue", "Pacific/Niue"),
("Pacific/Norfolk", "Pacific/Norfolk"),
("Pacific/Noumea", "Pacific/Noumea"),
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
("Pacific/Palau", "Pacific/Palau"),
("Pacific/Pitcairn", "Pacific/Pitcairn"),
("Pacific/Pohnpei", "Pacific/Pohnpei"),
("Pacific/Ponape", "Pacific/Ponape"),
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
("Pacific/Rarotonga", "Pacific/Rarotonga"),
("Pacific/Saipan", "Pacific/Saipan"),
("Pacific/Samoa", "Pacific/Samoa"),
("Pacific/Tahiti", "Pacific/Tahiti"),
("Pacific/Tarawa", "Pacific/Tarawa"),
("Pacific/Tongatapu", "Pacific/Tongatapu"),
("Pacific/Truk", "Pacific/Truk"),
("Pacific/Wake", "Pacific/Wake"),
("Pacific/Wallis", "Pacific/Wallis"),
("Pacific/Yap", "Pacific/Yap"),
("Poland", "Poland"),
("Portugal", "Portugal"),
("ROC", "ROC"),
("ROK", "ROK"),
("Singapore", "Singapore"),
("Turkey", "Turkey"),
("UCT", "UCT"),
("US/Alaska", "US/Alaska"),
("US/Aleutian", "US/Aleutian"),
("US/Arizona", "US/Arizona"),
("US/Central", "US/Central"),
("US/East-Indiana", "US/East-Indiana"),
("US/Eastern", "US/Eastern"),
("US/Hawaii", "US/Hawaii"),
("US/Indiana-Starke", "US/Indiana-Starke"),
("US/Michigan", "US/Michigan"),
("US/Mountain", "US/Mountain"),
("US/Pacific", "US/Pacific"),
("US/Samoa", "US/Samoa"),
("UTC", "UTC"),
("Universal", "Universal"),
("W-SU", "W-SU"),
("WET", "WET"),
("Zulu", "Zulu"),
("localtime", "localtime"),
],
default="UTC",
max_length=255,
),
),
]
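These regenerated choices presumably come from the standard library's zoneinfo database; a minimal sketch of how such a list is typically built (not the project's actual code):

import zoneinfo

# sketch: sorted IANA names, each used as both stored value and display label
TIMEZONE_CHOICES = [(tz, tz) for tz in sorted(zoneinfo.available_timezones())]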

View file

@ -0,0 +1,27 @@
# Generated by Django 3.2.25 on 2024-03-27 19:14
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0199_merge_20240326_1217"),
]
operations = [
migrations.RemoveField(
model_name="addfiletotar",
name="childjob_ptr",
),
migrations.RemoveField(
model_name="addfiletotar",
name="parent_export_job",
),
migrations.DeleteModel(
name="AddBookToUserExportJob",
),
migrations.DeleteModel(
name="AddFileToTar",
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0199_status_bookwyrm_st_remote__06aeba_idx"),
]
operations = [
migrations.AddIndex(
model_name="status",
index=models.Index(
fields=["thread_id"], name="bookwyrm_st_thread__cf064f_idx"
),
),
]

View file

@ -0,0 +1,39 @@
# Generated by Django 4.2.11 on 2024-04-01 21:09
import bookwyrm.models.fields
from django.db import migrations, models
from django.contrib.postgres.operations import CreateCollation
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0200_alter_user_preferred_timezone"),
]
operations = [
CreateCollation(
"case_insensitive",
provider="icu",
locale="und-u-ks-level2",
deterministic=False,
),
migrations.AlterField(
model_name="hashtag",
name="name",
field=bookwyrm.models.fields.CharField(
db_collation="case_insensitive", max_length=256
),
),
migrations.AlterField(
model_name="user",
name="localname",
field=models.CharField(
db_collation="case_insensitive",
max_length=255,
null=True,
unique=True,
validators=[bookwyrm.models.fields.validate_localname],
),
),
]
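A quick, hypothetical illustration of what the case_insensitive collation buys at query time, assuming the hashtag field defined above:

from bookwyrm.models import Hashtag

# hypothetical usage: with the non-deterministic ICU collation, equality
# lookups ignore case, so both filters match the row created here
Hashtag.objects.create(name="BookWyrm")
Hashtag.objects.filter(name="bookwyrm").exists()  # True
Hashtag.objects.filter(name="BOOKWYRM").exists()  # True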

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:10
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0200_status_bookwyrm_st_thread__cf064f_idx"),
]
operations = [
migrations.AddIndex(
model_name="keypair",
index=models.Index(
fields=["remote_id"], name="bookwyrm_ke_remote__472927_idx"
),
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:14
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0201_keypair_bookwyrm_ke_remote__472927_idx"),
]
operations = [
migrations.AddIndex(
model_name="user",
index=models.Index(
fields=["username"], name="bookwyrm_us_usernam_b2546d_idx"
),
),
]

View file

@ -0,0 +1,19 @@
# Generated by Django 3.2.25 on 2024-04-03 19:22
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0202_user_bookwyrm_us_usernam_b2546d_idx"),
]
operations = [
migrations.AddIndex(
model_name="user",
index=models.Index(
fields=["is_active", "local"], name="bookwyrm_us_is_acti_972dc4_idx"
),
),
]

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-04-09 10:42
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0197_mergedauthor_mergedbook"),
("bookwyrm", "0203_user_bookwyrm_us_is_acti_972dc4_idx"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 4.2.11 on 2024-04-10 20:22
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0201_alter_hashtag_name_alter_user_localname"),
("bookwyrm", "0204_merge_20240409_1042"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 3.2.25 on 2024-04-13 02:32
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0200_auto_20240327_1914"),
("bookwyrm", "0204_merge_20240409_1042"),
]
operations = []

View file

@ -0,0 +1,13 @@
# Generated by Django 4.2.11 on 2024-04-15 15:37
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("bookwyrm", "0205_merge_20240410_2022"),
("bookwyrm", "0205_merge_20240413_0232"),
]
operations = []

View file

@ -152,8 +152,9 @@ class ActivitypubMixin:
# find anyone who's tagged in a status, for example
mentions = self.recipients if hasattr(self, "recipients") else []
# we always send activities to explicitly mentioned users' inboxes
recipients = [u.inbox for u in mentions or [] if not u.local]
# we always send activities to explicitly mentioned users (using shared inboxes
# where available to avoid duplicate submissions to a given instance)
recipients = {u.shared_inbox or u.inbox for u in mentions if not u.local}
# unless it's a dm, all the followers should receive the activity
if privacy != "direct":
@ -168,23 +169,23 @@ class ActivitypubMixin:
# filter users first by whether they're using the desired software
# this lets us send book updates only to other bw servers
if software:
queryset = queryset.filter(bookwyrm_user=(software == "bookwyrm"))
queryset = queryset.filter(bookwyrm_user=software == "bookwyrm")
# if there's a user, we only want to send to the user's followers
if user:
queryset = queryset.filter(following=user)
# ideally, we will send to shared inboxes for efficiency
shared_inboxes = (
queryset.filter(shared_inbox__isnull=False)
.values_list("shared_inbox", flat=True)
.distinct()
# as above, we prefer shared inboxes if available
recipients.update(
queryset.filter(shared_inbox__isnull=False).values_list(
"shared_inbox", flat=True
)
)
# but not everyone has a shared inbox
inboxes = queryset.filter(shared_inbox__isnull=True).values_list(
"inbox", flat=True
recipients.update(
queryset.filter(shared_inbox__isnull=True).values_list(
"inbox", flat=True
)
)
recipients += list(shared_inboxes) + list(inboxes)
return list(set(recipients))
return list(recipients)
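The move from list concatenation to a set makes the shared-inbox de-duplication explicit; a self-contained illustration with made-up inbox URLs:

# made-up data: two followers on the same instance share an inbox
followers = [
    {"shared_inbox": "https://a.example/inbox", "inbox": "https://a.example/u/1/inbox"},
    {"shared_inbox": "https://a.example/inbox", "inbox": "https://a.example/u/2/inbox"},
    {"shared_inbox": None, "inbox": "https://b.example/u/3/inbox"},
]
recipients = {f["shared_inbox"] or f["inbox"] for f in followers}
# -> {"https://a.example/inbox", "https://b.example/u/3/inbox"}: one delivery per instance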
def to_activity_dataclass(self):
"""convert from a model to an activity"""
@ -205,14 +206,10 @@ class ObjectMixin(ActivitypubMixin):
created: Optional[bool] = None,
software: Any = None,
priority: str = BROADCAST,
broadcast: bool = True,
**kwargs: Any,
) -> None:
"""broadcast created/updated/deleted objects as appropriate"""
broadcast = kwargs.get("broadcast", True)
# this bonus kwarg would cause an error in the base save method
if "broadcast" in kwargs:
del kwargs["broadcast"]
created = created or not bool(self.id)
# first off, we want to save normally no matter what
super().save(*args, **kwargs)

View file

@ -1,20 +1,25 @@
""" database schema for info about authors """
import re
from typing import Tuple, Any
from django.contrib.postgres.indexes import GinIndex
import re
from typing import Any
from django.db import models
from django.contrib.postgres.indexes import GinIndex
import pgtrigger
from bookwyrm import activitypub
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from bookwyrm.utils.db import format_trigger
from .book import BookDataModel
from .book import BookDataModel, MergedAuthor
from . import fields
class Author(BookDataModel):
"""basic biographic info"""
merged_model = MergedAuthor
wikipedia_link = fields.CharField(
max_length=255, blank=True, null=True, deduplication_field=True
)
@ -40,12 +45,12 @@ class Author(BookDataModel):
)
bio = fields.HtmlField(null=True, blank=True)
def save(self, *args: Tuple[Any, ...], **kwargs: dict[str, Any]) -> None:
def save(self, *args: Any, **kwargs: Any) -> None:
"""normalize isni format"""
if self.isni:
if self.isni is not None:
self.isni = re.sub(r"\s", "", self.isni)
return super().save(*args, **kwargs)
super().save(*args, **kwargs)
@property
def isni_link(self):
@ -65,11 +70,48 @@ class Author(BookDataModel):
def get_remote_id(self):
"""editions and works both use "book" instead of model_name"""
return f"https://{DOMAIN}/author/{self.id}"
activity_serializer = activitypub.Author
return f"{BASE_URL}/author/{self.id}"
class Meta:
"""sets up postgres GIN index field"""
"""sets up indexes and triggers"""
# pylint: disable=line-too-long
indexes = (GinIndex(fields=["search_vector"]),)
triggers = [
pgtrigger.Trigger(
name="update_search_vector_on_author_edit",
when=pgtrigger.Before,
operation=pgtrigger.Insert
| pgtrigger.UpdateOf("name", "aliases", "search_vector"),
func=format_trigger(
"""new.search_vector :=
-- author name, with priority A
setweight(to_tsvector('simple', new.name), 'A') ||
-- author aliases, with priority B
setweight(to_tsvector('simple', coalesce(array_to_string(new.aliases, ' '), '')), 'B');
RETURN new;
"""
),
),
pgtrigger.Trigger(
name="reset_book_search_vector_on_author_edit",
when=pgtrigger.After,
operation=pgtrigger.UpdateOf("name", "aliases"),
func=format_trigger(
"""WITH updated_books AS (
SELECT book_id
FROM bookwyrm_book_authors
WHERE author_id = new.id
)
UPDATE bookwyrm_book
SET search_vector = ''
FROM updated_books
WHERE id = updated_books.book_id;
RETURN new;
"""
),
),
]
activity_serializer = activitypub.Author

View file

@ -10,7 +10,7 @@ from django.http import Http404
from django.utils.translation import gettext_lazy as _
from django.utils.text import slugify
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .fields import RemoteIdField
@ -38,7 +38,7 @@ class BookWyrmModel(models.Model):
def get_remote_id(self):
"""generate the url that resolves to the local object, without a slug"""
base_path = f"https://{DOMAIN}"
base_path = BASE_URL
if hasattr(self, "user"):
base_path = f"{base_path}{self.user.local_path}"
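BASE_URL replaces the hand-built f"https://{DOMAIN}" prefixes throughout these models; its definition isn't part of this diff, but it plausibly bundles scheme and domain along the lines of this assumed sketch:

# assumed shape of the settings value, not the project's actual settings.py
USE_HTTPS = True
DOMAIN = "bookwyrm.example"
PROTOCOL = "https" if USE_HTTPS else "http"
BASE_URL = f"{PROTOCOL}://{DOMAIN}"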
@ -53,7 +53,7 @@ class BookWyrmModel(models.Model):
@property
def local_path(self):
"""how to link to this object in the local app, with a slug"""
local = self.get_remote_id().replace(f"https://{DOMAIN}", "")
local = self.get_remote_id().replace(BASE_URL, "")
name = None
if hasattr(self, "name_field"):

View file

@ -1,29 +1,33 @@
""" database schema for books and shelves """
from itertools import chain
import re
from typing import Any
from typing import Any, Dict, Optional, Iterable
from typing_extensions import Self
from django.contrib.postgres.search import SearchVectorField
from django.contrib.postgres.indexes import GinIndex
from django.core.cache import cache
from django.db import models, transaction
from django.db.models import Prefetch
from django.db.models import Prefetch, ManyToManyField
from django.dispatch import receiver
from django.utils.translation import gettext_lazy as _
from model_utils import FieldTracker
from model_utils.managers import InheritanceManager
from imagekit.models import ImageSpecField
import pgtrigger
from bookwyrm import activitypub
from bookwyrm.isbn.isbn import hyphenator_singleton as hyphenator
from bookwyrm.preview_images import generate_edition_preview_image_task
from bookwyrm.settings import (
DOMAIN,
BASE_URL,
DEFAULT_LANGUAGE,
LANGUAGE_ARTICLES,
ENABLE_PREVIEW_IMAGES,
ENABLE_THUMBNAIL_GENERATION,
)
from bookwyrm.utils.db import format_trigger, add_update_fields
from .activitypub_mixin import OrderedCollectionPageMixin, ObjectMixin
from .base_model import BookWyrmModel
@ -92,24 +96,134 @@ class BookDataModel(ObjectMixin, BookWyrmModel):
abstract = True
def save(self, *args: Any, **kwargs: Any) -> None:
def save(
self, *args: Any, update_fields: Optional[Iterable[str]] = None, **kwargs: Any
) -> None:
"""ensure that the remote_id is within this instance"""
if self.id:
self.remote_id = self.get_remote_id()
update_fields = add_update_fields(update_fields, "remote_id")
else:
self.origin_id = self.remote_id
self.remote_id = None
return super().save(*args, **kwargs)
update_fields = add_update_fields(update_fields, "origin_id", "remote_id")
super().save(*args, update_fields=update_fields, **kwargs)
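The save() overrides in this changeset thread extra field names through add_update_fields from bookwyrm.utils.db; its implementation isn't shown in this hunk, but judging from how it is called it behaves roughly like this sketch (the body is an assumption):

from typing import Iterable, Optional, Set

def add_update_fields(
    update_fields: Optional[Iterable[str]], *fields: str
) -> Optional[Set[str]]:
    """sketch: only extend update_fields when the caller passed one,
    so a full save (update_fields=None) stays a full save"""
    return set(fields) | set(update_fields) if update_fields is not None else None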
# pylint: disable=arguments-differ
def broadcast(self, activity, sender, software="bookwyrm", **kwargs):
"""only send book data updates to other bookwyrm instances"""
super().broadcast(activity, sender, software=software, **kwargs)
def merge_into(self, canonical: Self, dry_run=False) -> Dict[str, Any]:
"""merge this entity into another entity"""
if canonical.id == self.id:
raise ValueError(f"Cannot merge {self} into itself")
absorbed_fields = canonical.absorb_data_from(self, dry_run=dry_run)
if dry_run:
return absorbed_fields
canonical.save()
self.merged_model.objects.create(deleted_id=self.id, merged_into=canonical)
# move related models to canonical
related_models = [
(r.remote_field.name, r.related_model) for r in self._meta.related_objects
]
# pylint: disable=protected-access
for related_field, related_model in related_models:
# Skip the ManyToMany fields that aren't auto-created. These
# should have a corresponding OneToMany field in the model for
# the linking table anyway. If we update it through that model
# instead then we won't lose the extra fields in the linking
# table.
# pylint: disable=protected-access
related_field_obj = related_model._meta.get_field(related_field)
if isinstance(related_field_obj, ManyToManyField):
through = related_field_obj.remote_field.through
if not through._meta.auto_created:
continue
related_objs = related_model.objects.filter(**{related_field: self})
for related_obj in related_objs:
try:
setattr(related_obj, related_field, canonical)
related_obj.save()
except TypeError:
getattr(related_obj, related_field).add(canonical)
getattr(related_obj, related_field).remove(self)
self.delete()
return absorbed_fields
def absorb_data_from(self, other: Self, dry_run=False) -> Dict[str, Any]:
"""fill empty fields with values from another entity"""
absorbed_fields = {}
for data_field in self._meta.get_fields():
if not hasattr(data_field, "activitypub_field"):
continue
canonical_value = getattr(self, data_field.name)
other_value = getattr(other, data_field.name)
if not other_value:
continue
if isinstance(data_field, fields.ArrayField):
if new_values := list(set(other_value) - set(canonical_value)):
# append at the end (in no particular order)
if not dry_run:
setattr(self, data_field.name, canonical_value + new_values)
absorbed_fields[data_field.name] = new_values
elif isinstance(data_field, fields.PartialDateField):
if (
(not canonical_value)
or (other_value.has_day and not canonical_value.has_day)
or (other_value.has_month and not canonical_value.has_month)
):
if not dry_run:
setattr(self, data_field.name, other_value)
absorbed_fields[data_field.name] = other_value
else:
if not canonical_value:
if not dry_run:
setattr(self, data_field.name, other_value)
absorbed_fields[data_field.name] = other_value
return absorbed_fields
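A rough usage sketch of the merge API defined above; author_a and author_b are hypothetical duplicate records:

# dry run: report which fields the canonical record would absorb, change nothing
preview = author_b.merge_into(author_a, dry_run=True)

# real merge: author_a absorbs missing data, related rows are repointed,
# author_b is deleted and a MergedAuthor row keeps its old id for redirects
absorbed = author_b.merge_into(author_a)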
class MergedBookDataModel(models.Model):
"""a BookDataModel instance that has been merged into another instance. kept
to be able to redirect old URLs"""
deleted_id = models.IntegerField(primary_key=True)
class Meta:
"""abstract just like BookDataModel"""
abstract = True
class MergedBook(MergedBookDataModel):
"""an Book that has been merged into another one"""
merged_into = models.ForeignKey(
"Book", on_delete=models.PROTECT, related_name="absorbed"
)
class MergedAuthor(MergedBookDataModel):
"""an Author that has been merged into another one"""
merged_into = models.ForeignKey(
"Author", on_delete=models.PROTECT, related_name="absorbed"
)
class Book(BookDataModel):
"""a generic book, which can mean either an edition or a work"""
merged_model = MergedBook
connector = models.ForeignKey("Connector", on_delete=models.PROTECT, null=True)
# book/work metadata
@ -190,9 +304,13 @@ class Book(BookDataModel):
"""properties of this edition, as a string"""
items = [
self.physical_format if hasattr(self, "physical_format") else None,
f"{self.languages[0]} language"
if self.languages and self.languages[0] and self.languages[0] != "English"
else None,
(
f"{self.languages[0]} language"
if self.languages
and self.languages[0]
and self.languages[0] != "English"
else None
),
str(self.published_date.year) if self.published_date else None,
", ".join(self.publishers) if hasattr(self, "publishers") else None,
]
@ -210,11 +328,11 @@ class Book(BookDataModel):
if not isinstance(self, (Edition, Work)):
raise ValueError("Books should be added as Editions or Works")
return super().save(*args, **kwargs)
super().save(*args, **kwargs)
def get_remote_id(self):
"""editions and works both use "book" instead of model_name"""
return f"https://{DOMAIN}/book/{self.id}"
return f"{BASE_URL}/book/{self.id}"
def guess_sort_title(self):
"""Get a best-guess sort title for the current book"""
@ -232,9 +350,49 @@ class Book(BookDataModel):
)
class Meta:
"""sets up postgres GIN index field"""
"""set up indexes and triggers"""
# pylint: disable=line-too-long
indexes = (GinIndex(fields=["search_vector"]),)
triggers = [
pgtrigger.Trigger(
name="update_search_vector_on_book_edit",
when=pgtrigger.Before,
operation=pgtrigger.Insert
| pgtrigger.UpdateOf("title", "subtitle", "series", "search_vector"),
func=format_trigger(
"""
WITH author_names AS (
SELECT array_to_string(bookwyrm_author.name || bookwyrm_author.aliases, ' ') AS name_and_aliases
FROM bookwyrm_author
LEFT JOIN bookwyrm_book_authors
ON bookwyrm_author.id = bookwyrm_book_authors.author_id
WHERE bookwyrm_book_authors.book_id = new.id
)
SELECT
-- title, with priority A (parse in English, default to simple if empty)
setweight(COALESCE(nullif(
to_tsvector('english', new.title), ''),
to_tsvector('simple', new.title)), 'A') ||
-- subtitle, with priority B (always in English?)
setweight(to_tsvector('english', COALESCE(new.subtitle, '')), 'B') ||
-- list of authors' names and aliases (with priority C)
(SELECT setweight(to_tsvector('simple', COALESCE(array_to_string(ARRAY_AGG(name_and_aliases), ' '), '')), 'C')
FROM author_names
) ||
--- last: series name, with lowest priority
setweight(to_tsvector('english', COALESCE(new.series, '')), 'D')
INTO new.search_vector;
RETURN new;
"""
),
)
]
class Work(OrderedCollectionPageMixin, Book):
@ -247,10 +405,11 @@ class Work(OrderedCollectionPageMixin, Book):
def save(self, *args, **kwargs):
"""set some fields on the edition object"""
super().save(*args, **kwargs)
# set rank
for edition in self.editions.all():
edition.save()
return super().save(*args, **kwargs)
@property
def default_edition(self):
@ -356,33 +515,48 @@ class Edition(Book):
# max rank is 9
return rank
def save(self, *args: Any, **kwargs: Any) -> None:
def save(
self, *args: Any, update_fields: Optional[Iterable[str]] = None, **kwargs: Any
) -> None:
"""set some fields on the edition object"""
# calculate isbn 10/13
if self.isbn_13 and self.isbn_13[:3] == "978" and not self.isbn_10:
if (
self.isbn_10 is None
and self.isbn_13 is not None
and self.isbn_13[:3] == "978"
):
self.isbn_10 = isbn_13_to_10(self.isbn_13)
if self.isbn_10 and not self.isbn_13:
update_fields = add_update_fields(update_fields, "isbn_10")
if self.isbn_13 is None and self.isbn_10 is not None:
self.isbn_13 = isbn_10_to_13(self.isbn_10)
update_fields = add_update_fields(update_fields, "isbn_13")
# normalize isbn format
if self.isbn_10:
if self.isbn_10 is not None:
self.isbn_10 = normalize_isbn(self.isbn_10)
if self.isbn_13:
if self.isbn_13 is not None:
self.isbn_13 = normalize_isbn(self.isbn_13)
# set rank
self.edition_rank = self.get_rank()
# clear author cache
if self.id:
for author_id in self.authors.values_list("id", flat=True):
cache.delete(f"author-books-{author_id}")
if (new := self.get_rank()) != self.edition_rank:
self.edition_rank = new
update_fields = add_update_fields(update_fields, "edition_rank")
# Create sort title by removing articles from title
if self.sort_title in [None, ""]:
self.sort_title = self.guess_sort_title()
update_fields = add_update_fields(update_fields, "sort_title")
return super().save(*args, **kwargs)
super().save(*args, update_fields=update_fields, **kwargs)
# clear author cache
if self.id:
cache.delete_many(
[
f"author-books-{author_id}"
for author_id in self.authors.values_list("id", flat=True)
]
)
@transaction.atomic
def repair(self):

View file

@ -1,216 +1,317 @@
"""Export user account to tar.gz file for import into another Bookwyrm instance"""
import dataclasses
import logging
from uuid import uuid4
import os
from django.db.models import FileField
from boto3.session import Session as BotoSession
from s3_tar import S3Tar
from django.db.models import BooleanField, FileField, JSONField
from django.db.models import Q
from django.core.serializers.json import DjangoJSONEncoder
from django.core.files.base import ContentFile
from django.core.files.storage import storages
from bookwyrm.models import AnnualGoal, ReadThrough, ShelfBook, List, ListItem
from bookwyrm import settings
from bookwyrm.models import AnnualGoal, ReadThrough, ShelfBook, ListItem
from bookwyrm.models import Review, Comment, Quotation
from bookwyrm.models import Edition
from bookwyrm.models import UserFollows, User, UserBlocks
from bookwyrm.models.job import ParentJob, ParentTask
from bookwyrm.models.job import ParentJob
from bookwyrm.tasks import app, IMPORTS
from bookwyrm.utils.tar import BookwyrmTarFile
logger = logging.getLogger(__name__)
class BookwyrmAwsSession(BotoSession):
"""a boto session that always uses settings.AWS_S3_ENDPOINT_URL"""
def client(self, *args, **kwargs): # pylint: disable=arguments-differ
kwargs["endpoint_url"] = settings.AWS_S3_ENDPOINT_URL
return super().client("s3", *args, **kwargs)
def select_exports_storage():
"""callable to allow for dependency on runtime configuration"""
return storages["exports"]
class BookwyrmExportJob(ParentJob):
"""entry for a specific request to export a bookwyrm user"""
export_data = FileField(null=True)
export_data = FileField(null=True, storage=select_exports_storage)
export_json = JSONField(null=True, encoder=DjangoJSONEncoder)
json_completed = BooleanField(default=False)
def start_job(self):
"""Start the job"""
start_export_task.delay(job_id=self.id, no_children=True)
"""schedule the first task"""
return self
task = create_export_json_task.delay(job_id=self.id)
self.task_id = task.id
self.save(update_fields=["task_id"])
@app.task(queue=IMPORTS, base=ParentTask)
def start_export_task(**kwargs):
"""trigger the child tasks for each row"""
job = BookwyrmExportJob.objects.get(id=kwargs["job_id"])
@app.task(queue=IMPORTS)
def create_export_json_task(job_id):
"""create the JSON data for the export"""
job = BookwyrmExportJob.objects.get(id=job_id)
# don't start the job if it was stopped from the UI
if job.complete:
return
try:
# This is where ChildJobs get made
job.export_data = ContentFile(b"", str(uuid4()))
json_data = json_export(job.user)
tar_export(json_data, job.user, job.export_data)
job.save(update_fields=["export_data"])
job.set_status("active")
# generate JSON structure
job.export_json = export_json(job.user)
job.save(update_fields=["export_json"])
# create archive in separate task
create_archive_task.delay(job_id=job.id)
except Exception as err: # pylint: disable=broad-except
logger.exception("User Export Job %s Failed with error: %s", job.id, err)
logger.exception(
"create_export_json_task for %s failed with error: %s", job, err
)
job.set_status("failed")
job.set_status("complete")
def archive_file_location(file, directory="") -> str:
"""get the relative location of a file inside the archive"""
return os.path.join(directory, file.name)
def tar_export(json_data: str, user, file):
"""wrap the export information in a tar file"""
file.open("wb")
with BookwyrmTarFile.open(mode="w:gz", fileobj=file) as tar:
tar.write_bytes(json_data.encode("utf-8"))
def add_file_to_s3_tar(s3_tar: S3Tar, storage, file, directory=""):
"""
add file to S3Tar inside directory, keeping any directories under its
storage location
"""
s3_tar.add_file(
os.path.join(storage.location, file.name),
folder=os.path.dirname(archive_file_location(file, directory=directory)),
)
# Add avatar image if present
if getattr(user, "avatar", False):
tar.add_image(user.avatar, filename="avatar")
@app.task(queue=IMPORTS)
def create_archive_task(job_id):
"""create the archive containing the JSON file and additional files"""
job = BookwyrmExportJob.objects.get(id=job_id)
# don't start the job if it was stopped from the UI
if job.complete:
return
try:
export_task_id = str(job.task_id)
archive_filename = f"{export_task_id}.tar.gz"
export_json_bytes = DjangoJSONEncoder().encode(job.export_json).encode("utf-8")
user = job.user
editions = get_books_for_user(user)
for book in editions:
if getattr(book, "cover", False):
tar.add_image(book.cover)
file.close()
if settings.USE_S3:
# Storage for writing temporary files
exports_storage = storages["exports"]
# Handle for creating the final archive
s3_tar = S3Tar(
exports_storage.bucket_name,
os.path.join(exports_storage.location, archive_filename),
session=BookwyrmAwsSession(),
)
# Save JSON file to a temporary location
export_json_tmp_file = os.path.join(export_task_id, "archive.json")
exports_storage.save(
export_json_tmp_file,
ContentFile(export_json_bytes),
)
s3_tar.add_file(
os.path.join(exports_storage.location, export_json_tmp_file)
)
# Add images to TAR
images_storage = storages["default"]
if user.avatar:
add_file_to_s3_tar(s3_tar, images_storage, user.avatar)
for edition in editions:
if edition.cover:
add_file_to_s3_tar(
s3_tar, images_storage, edition.cover, directory="images"
)
# Create archive and store file name
s3_tar.tar()
job.export_data = archive_filename
job.save(update_fields=["export_data"])
# Delete temporary files
exports_storage.delete(export_json_tmp_file)
else:
job.export_data = archive_filename
with job.export_data.open("wb") as tar_file:
with BookwyrmTarFile.open(mode="w:gz", fileobj=tar_file) as tar:
# save json file
tar.write_bytes(export_json_bytes)
# Add avatar image if present
if user.avatar:
tar.add_image(user.avatar)
for edition in editions:
if edition.cover:
tar.add_image(edition.cover, directory="images")
job.save(update_fields=["export_data"])
job.set_status("completed")
except Exception as err: # pylint: disable=broad-except
logger.exception("create_archive_task for %s failed with error: %s", job, err)
job.set_status("failed")
def json_export(
user,
): # pylint: disable=too-many-locals, too-many-statements, too-many-branches
"""Generate an export for a user"""
def export_json(user: User):
"""create export JSON"""
data = export_user(user) # in the root of the JSON structure
data["settings"] = export_settings(user)
data["goals"] = export_goals(user)
data["books"] = export_books(user)
data["saved_lists"] = export_saved_lists(user)
data["follows"] = export_follows(user)
data["blocks"] = export_blocks(user)
return data
# User as AP object
exported_user = user.to_activity()
# I don't love this but it prevents a JSON encoding error
# when there is no user image
if isinstance(
exported_user["icon"],
dataclasses._MISSING_TYPE, # pylint: disable=protected-access
):
exported_user["icon"] = {}
def export_user(user: User):
"""export user data"""
data = user.to_activity()
if user.avatar:
data["icon"]["url"] = archive_file_location(user.avatar)
else:
# change the URL to be relative to the JSON file
file_type = exported_user["icon"]["url"].rsplit(".", maxsplit=1)[-1]
filename = f"avatar.{file_type}"
exported_user["icon"]["url"] = filename
data["icon"] = {}
return data
# Additional settings - can't be serialized as AP
def export_settings(user: User):
"""Additional settings - can't be serialized as AP"""
vals = [
"show_goal",
"preferred_timezone",
"default_post_privacy",
"show_suggested_users",
]
exported_user["settings"] = {}
for k in vals:
exported_user["settings"][k] = getattr(user, k)
return {k: getattr(user, k) for k in vals}
# Reading goals - can't be serialized as AP
reading_goals = AnnualGoal.objects.filter(user=user).distinct()
exported_user["goals"] = []
for goal in reading_goals:
exported_user["goals"].append(
{"goal": goal.goal, "year": goal.year, "privacy": goal.privacy}
)
# Reading history - can't be serialized as AP
readthroughs = ReadThrough.objects.filter(user=user).distinct().values()
readthroughs = list(readthroughs)
def export_saved_lists(user: User):
"""add user saved lists to export JSON"""
return [l.remote_id for l in user.saved_lists.all()]
# Books
editions = get_books_for_user(user)
exported_user["books"] = []
for edition in editions:
book = {}
book["work"] = edition.parent_work.to_activity()
book["edition"] = edition.to_activity()
if book["edition"].get("cover"):
# change the URL to be relative to the JSON file
filename = book["edition"]["cover"]["url"].rsplit("/", maxsplit=1)[-1]
book["edition"]["cover"]["url"] = f"covers/{filename}"
# authors
book["authors"] = []
for author in edition.authors.all():
book["authors"].append(author.to_activity())
# Shelves this book is on
# Every ShelfItem is this book so we don't bother serializing
book["shelves"] = []
shelf_books = (
ShelfBook.objects.select_related("shelf")
.filter(user=user, book=edition)
.distinct()
)
for shelfbook in shelf_books:
book["shelves"].append(shelfbook.shelf.to_activity())
# Lists and ListItems
# ListItems include "notes" and "approved" so we need them
# even though we know it's this book
book["lists"] = []
list_items = ListItem.objects.filter(book=edition, user=user).distinct()
for item in list_items:
list_info = item.book_list.to_activity()
list_info[
"privacy"
] = item.book_list.privacy # this isn't serialized so we add it
list_info["list_item"] = item.to_activity()
book["lists"].append(list_info)
# Statuses
# Can't use select_subclasses here because
# we need to filter on the "book" value,
# which is not available on an ordinary Status
for status in ["comments", "quotations", "reviews"]:
book[status] = []
comments = Comment.objects.filter(user=user, book=edition).all()
for status in comments:
obj = status.to_activity()
obj["progress"] = status.progress
obj["progress_mode"] = status.progress_mode
book["comments"].append(obj)
quotes = Quotation.objects.filter(user=user, book=edition).all()
for status in quotes:
obj = status.to_activity()
obj["position"] = status.position
obj["endposition"] = status.endposition
obj["position_mode"] = status.position_mode
book["quotations"].append(obj)
reviews = Review.objects.filter(user=user, book=edition).all()
for status in reviews:
obj = status.to_activity()
book["reviews"].append(obj)
# readthroughs can't be serialized to activity
book_readthroughs = (
ReadThrough.objects.filter(user=user, book=edition).distinct().values()
)
book["readthroughs"] = list(book_readthroughs)
# append everything
exported_user["books"].append(book)
# saved book lists - just the remote id
saved_lists = List.objects.filter(id__in=user.saved_lists.all()).distinct()
exported_user["saved_lists"] = [l.remote_id for l in saved_lists]
# follows - just the remote id
def export_follows(user: User):
"""add user follows to export JSON"""
follows = UserFollows.objects.filter(user_subject=user).distinct()
following = User.objects.filter(userfollows_user_object__in=follows).distinct()
exported_user["follows"] = [f.remote_id for f in following]
return [f.remote_id for f in following]
# blocks - just the remote id
def export_blocks(user: User):
"""add user blocks to export JSON"""
blocks = UserBlocks.objects.filter(user_subject=user).distinct()
blocking = User.objects.filter(userblocks_user_object__in=blocks).distinct()
return [b.remote_id for b in blocking]
exported_user["blocks"] = [b.remote_id for b in blocking]
return DjangoJSONEncoder().encode(exported_user)
def export_goals(user: User):
"""add user reading goals to export JSON"""
reading_goals = AnnualGoal.objects.filter(user=user).distinct()
return [
{"goal": goal.goal, "year": goal.year, "privacy": goal.privacy}
for goal in reading_goals
]
def export_books(user: User):
"""add books to export JSON"""
editions = get_books_for_user(user)
return [export_book(user, edition) for edition in editions]
def export_book(user: User, edition: Edition):
"""add book to export JSON"""
data = {}
data["work"] = edition.parent_work.to_activity()
data["edition"] = edition.to_activity()
if edition.cover:
data["edition"]["cover"]["url"] = archive_file_location(
edition.cover, directory="images"
)
# authors
data["authors"] = [author.to_activity() for author in edition.authors.all()]
# Shelves this book is on
# Every ShelfItem is this book so we don't bother serializing
shelf_books = (
ShelfBook.objects.select_related("shelf")
.filter(user=user, book=edition)
.distinct()
)
data["shelves"] = [shelfbook.shelf.to_activity() for shelfbook in shelf_books]
# Lists and ListItems
# ListItems include "notes" and "approved" so we need them
# even though we know it's this book
list_items = ListItem.objects.filter(book=edition, user=user).distinct()
data["lists"] = []
for item in list_items:
list_info = item.book_list.to_activity()
list_info[
"privacy"
] = item.book_list.privacy # this isn't serialized so we add it
list_info["list_item"] = item.to_activity()
data["lists"].append(list_info)
# Statuses
# Can't use select_subclasses here because
# we need to filter on the "book" value,
# which is not available on an ordinary Status
for status in ["comments", "quotations", "reviews"]:
data[status] = []
comments = Comment.objects.filter(user=user, book=edition).all()
for status in comments:
obj = status.to_activity()
obj["progress"] = status.progress
obj["progress_mode"] = status.progress_mode
data["comments"].append(obj)
quotes = Quotation.objects.filter(user=user, book=edition).all()
for status in quotes:
obj = status.to_activity()
obj["position"] = status.position
obj["endposition"] = status.endposition
obj["position_mode"] = status.position_mode
data["quotations"].append(obj)
reviews = Review.objects.filter(user=user, book=edition).all()
data["reviews"] = [status.to_activity() for status in reviews]
# readthroughs can't be serialized to activity
book_readthroughs = (
ReadThrough.objects.filter(user=user, book=edition).distinct().values()
)
data["readthroughs"] = list(book_readthroughs)
return data
def get_books_for_user(user):

View file

@ -42,20 +42,23 @@ def start_import_task(**kwargs):
try:
archive_file.open("rb")
with BookwyrmTarFile.open(mode="r:gz", fileobj=archive_file) as tar:
job.import_data = json.loads(tar.read("archive.json").decode("utf-8"))
json_filename = next(
filter(lambda n: n.startswith("archive"), tar.getnames())
)
job.import_data = json.loads(tar.read(json_filename).decode("utf-8"))
if "include_user_profile" in job.required:
update_user_profile(job.user, tar, job.import_data)
if "include_user_settings" in job.required:
update_user_settings(job.user, job.import_data)
if "include_goals" in job.required:
update_goals(job.user, job.import_data.get("goals"))
update_goals(job.user, job.import_data.get("goals", []))
if "include_saved_lists" in job.required:
upsert_saved_lists(job.user, job.import_data.get("saved_lists"))
upsert_saved_lists(job.user, job.import_data.get("saved_lists", []))
if "include_follows" in job.required:
upsert_follows(job.user, job.import_data.get("follows"))
upsert_follows(job.user, job.import_data.get("follows", []))
if "include_blocks" in job.required:
upsert_user_blocks(job.user, job.import_data.get("blocks"))
upsert_user_blocks(job.user, job.import_data.get("blocks", []))
process_books(job, tar)
@ -212,7 +215,7 @@ def upsert_statuses(user, cls, data, book_remote_id):
instance.save() # save and broadcast
else:
logger.info("User does not have permission to import statuses")
logger.warning("User does not have permission to import statuses")
def upsert_lists(user, lists, book_id):

View file

@ -11,7 +11,7 @@ ConnectorFiles = models.TextChoices("ConnectorFiles", CONNECTORS)
class Connector(BookWyrmModel):
"""book data source connectors"""
identifier = models.CharField(max_length=255, unique=True)
identifier = models.CharField(max_length=255, unique=True) # domain
priority = models.IntegerField(default=2)
name = models.CharField(max_length=255, null=True, blank=True)
connector_file = models.CharField(max_length=255, choices=ConnectorFiles.choices)

View file

@ -16,7 +16,7 @@ FederationStatus = [
class FederatedServer(BookWyrmModel):
"""store which servers we federate with"""
server_name = models.CharField(max_length=255, unique=True)
server_name = models.CharField(max_length=255, unique=True) # domain
status = models.CharField(
max_length=255, default="federated", choices=FederationStatus
)
@ -64,5 +64,4 @@ class FederatedServer(BookWyrmModel):
def is_blocked(cls, url: str) -> bool:
"""look up if a domain is blocked"""
url = urlparse(url)
domain = url.netloc
return cls.objects.filter(server_name=domain, status="blocked").exists()
return cls.objects.filter(server_name=url.hostname, status="blocked").exists()
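Matching on hostname rather than netloc avoids false negatives when a remote id carries a port or userinfo; a quick illustration:

from urllib.parse import urlparse

parsed = urlparse("https://example.social:8443/users/mouse")
parsed.netloc    # 'example.social:8443' -- never equals a stored bare domain
parsed.hostname  # 'example.social'      -- comparable to server_name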

View file

@ -260,12 +260,12 @@ class PrivacyField(ActivitypubFieldMixin, models.CharField):
if to == [self.public]:
setattr(instance, self.name, "public")
elif self.public in cc:
setattr(instance, self.name, "unlisted")
elif to == [user.followers_url]:
setattr(instance, self.name, "followers")
elif cc == []:
setattr(instance, self.name, "direct")
elif self.public in cc:
setattr(instance, self.name, "unlisted")
else:
setattr(instance, self.name, "followers")
return original == getattr(instance, self.name)
@ -482,7 +482,7 @@ class ImageField(ActivitypubFieldMixin, models.ImageField):
if not url:
return None
return activitypub.Document(url=url, name=alt)
return activitypub.Image(url=url, name=alt)
def field_from_activity(self, value, allow_external_connections=True):
image_slug = value

View file

@ -1,7 +1,7 @@
""" do book related things with other users """
from django.db import models, IntegrityError, transaction
from django.db.models import Q
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .base_model import BookWyrmModel
from . import fields
from .relationship import UserBlocks
@ -17,7 +17,7 @@ class Group(BookWyrmModel):
def get_remote_id(self):
"""don't want the user to be in there in this case"""
return f"https://{DOMAIN}/group/{self.id}"
return f"{BASE_URL}/group/{self.id}"
@classmethod
def followers_filter(cls, queryset, viewer):

View file

@ -2,18 +2,19 @@
from bookwyrm import activitypub
from .activitypub_mixin import ActivitypubMixin
from .base_model import BookWyrmModel
from .fields import CICharField
from .fields import CharField
class Hashtag(ActivitypubMixin, BookWyrmModel):
"a hashtag which can be used in statuses"
name = CICharField(
name = CharField(
max_length=256,
blank=False,
null=False,
activitypub_field="name",
deduplication_field=True,
db_collation="case_insensitive",
)
name_field = "name"

View file

@ -135,8 +135,7 @@ class ParentJob(Job):
)
app.control.revoke(list(tasks))
for task in self.pending_child_jobs:
task.update(status=self.Status.STOPPED)
self.pending_child_jobs.update(status=self.Status.STOPPED)
@property
def has_completed(self):
@ -248,7 +247,7 @@ class SubTask(app.Task):
"""
def before_start(
self, task_id, args, kwargs
self, task_id, *args, **kwargs
): # pylint: disable=no-self-use, unused-argument
"""Handler called before the task starts. Override.
@ -272,7 +271,7 @@ class SubTask(app.Task):
child_job.set_status(ChildJob.Status.ACTIVE)
def on_success(
self, retval, task_id, args, kwargs
self, retval, task_id, *args, **kwargs
): # pylint: disable=no-self-use, unused-argument
"""Run by the worker if the task executes successfully. Override.

View file

@ -1,4 +1,5 @@
""" outlink data """
from typing import Optional, Iterable
from urllib.parse import urlparse
from django.core.exceptions import PermissionDenied
@ -6,6 +7,7 @@ from django.db import models
from django.utils.translation import gettext_lazy as _
from bookwyrm import activitypub
from bookwyrm.utils.db import add_update_fields
from .activitypub_mixin import ActivitypubMixin
from .base_model import BookWyrmModel
from . import fields
@ -34,17 +36,19 @@ class Link(ActivitypubMixin, BookWyrmModel):
"""link name via the associated domain"""
return self.domain.name
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""create a link"""
# get or create the associated domain
if not self.domain:
domain = urlparse(self.url).netloc
domain = urlparse(self.url).hostname
self.domain, _ = LinkDomain.objects.get_or_create(domain=domain)
update_fields = add_update_fields(update_fields, "domain")
# this is never broadcast, the owning model broadcasts an update
if "broadcast" in kwargs:
del kwargs["broadcast"]
return super().save(*args, **kwargs)
super().save(*args, update_fields=update_fields, **kwargs)
AvailabilityChoices = [
@ -88,8 +92,10 @@ class LinkDomain(BookWyrmModel):
return
raise PermissionDenied()
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""set a default name"""
if not self.name:
self.name = self.domain
super().save(*args, **kwargs)
update_fields = add_update_fields(update_fields, "name")
super().save(*args, update_fields=update_fields, **kwargs)

View file

@ -1,4 +1,5 @@
""" make a list of books!! """
from typing import Optional, Iterable
import uuid
from django.core.exceptions import PermissionDenied
@ -7,7 +8,8 @@ from django.db.models import Q
from django.utils import timezone
from bookwyrm import activitypub
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from bookwyrm.utils.db import add_update_fields
from .activitypub_mixin import CollectionItemMixin, OrderedCollectionMixin
from .base_model import BookWyrmModel
@ -50,7 +52,7 @@ class List(OrderedCollectionMixin, BookWyrmModel):
def get_remote_id(self):
"""don't want the user to be in there in this case"""
return f"https://{DOMAIN}/list/{self.id}"
return f"{BASE_URL}/list/{self.id}"
@property
def collection_queryset(self):
@ -124,11 +126,13 @@ class List(OrderedCollectionMixin, BookWyrmModel):
group=None, curation="closed"
)
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""on save, update embed_key and avoid clash with existing code"""
if not self.embed_key:
self.embed_key = uuid.uuid4()
super().save(*args, **kwargs)
update_fields = add_update_fields(update_fields, "embed_key")
super().save(*args, update_fields=update_fields, **kwargs)
class ListItem(CollectionItemMixin, BookWyrmModel):

View file

@ -10,7 +10,7 @@ from .notification import Notification, NotificationType
class Move(ActivityMixin, BookWyrmModel):
"""migrating an activitypub user account"""
"""migrating an activitypub object"""
user = fields.ForeignKey(
"User", on_delete=models.PROTECT, activitypub_field="actor"
@ -48,24 +48,21 @@ class MoveUser(Move):
"""update user info and broadcast it"""
# only allow if the source is listed in the target's alsoKnownAs
if self.user in self.target.also_known_as.all():
self.user.also_known_as.add(self.target.id)
self.user.update_active_date()
self.user.moved_to = self.target.remote_id
self.user.save(update_fields=["moved_to"])
if self.user.local:
kwargs[
"broadcast"
] = True # Only broadcast if we are initiating the Move
super().save(*args, **kwargs)
for follower in self.user.followers.all():
if follower.local:
Notification.notify(
follower, self.user, notification_type=NotificationType.MOVE
)
else:
if self.user not in self.target.also_known_as.all():
raise PermissionDenied()
self.user.also_known_as.add(self.target.id)
self.user.update_active_date()
self.user.moved_to = self.target.remote_id
self.user.save(update_fields=["moved_to"])
if self.user.local:
kwargs["broadcast"] = True # Only broadcast if we are initiating the Move
super().save(*args, **kwargs)
for follower in self.user.followers.all():
if follower.local:
Notification.notify(
follower, self.user, notification_type=NotificationType.MOVE
)

View file

@ -1,9 +1,13 @@
""" progress in a book """
from typing import Optional, Iterable
from django.core import validators
from django.core.cache import cache
from django.db import models
from django.db.models import F, Q
from bookwyrm.utils.db import add_update_fields
from .base_model import BookWyrmModel
@ -30,14 +34,17 @@ class ReadThrough(BookWyrmModel):
stopped_date = models.DateTimeField(blank=True, null=True)
is_active = models.BooleanField(default=True)
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""update user active time"""
cache.delete(f"latest_read_through-{self.user_id}-{self.book_id}")
self.user.update_active_date()
# an active readthrough must have an unset finish date
if self.finish_date or self.stopped_date:
self.is_active = False
super().save(*args, **kwargs)
update_fields = add_update_fields(update_fields, "is_active")
super().save(*args, update_fields=update_fields, **kwargs)
cache.delete(f"latest_read_through-{self.user_id}-{self.book_id}")
self.user.update_active_date()
def create_update(self):
"""add update to the readthrough"""

View file

@ -38,14 +38,16 @@ class UserRelationship(BookWyrmModel):
def save(self, *args, **kwargs):
"""clear the template cache"""
clear_cache(self.user_subject, self.user_object)
super().save(*args, **kwargs)
clear_cache(self.user_subject, self.user_object)
def delete(self, *args, **kwargs):
"""clear the template cache"""
clear_cache(self.user_subject, self.user_object)
super().delete(*args, **kwargs)
clear_cache(self.user_subject, self.user_object)
class Meta:
"""relationships should be unique"""

View file

@ -3,7 +3,7 @@ from django.core.exceptions import PermissionDenied
from django.db import models
from django.utils.translation import gettext_lazy as _
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from .base_model import BookWyrmModel
@ -46,7 +46,7 @@ class Report(BookWyrmModel):
raise PermissionDenied()
def get_remote_id(self):
return f"https://{DOMAIN}/settings/reports/{self.id}"
return f"{BASE_URL}/settings/reports/{self.id}"
def comment(self, user, note):
"""comment on a report"""

View file

@ -1,13 +1,15 @@
""" puttin' books on shelves """
import re
from typing import Optional, Iterable
from django.core.cache import cache
from django.core.exceptions import PermissionDenied
from django.db import models
from django.utils import timezone
from bookwyrm import activitypub
from bookwyrm.settings import DOMAIN
from bookwyrm.settings import BASE_URL
from bookwyrm.tasks import BROADCAST
from bookwyrm.utils.db import add_update_fields
from .activitypub_mixin import CollectionItemMixin, OrderedCollectionMixin
from .base_model import BookWyrmModel
from . import fields
@ -44,8 +46,9 @@ class Shelf(OrderedCollectionMixin, BookWyrmModel):
"""set the identifier"""
super().save(*args, priority=priority, **kwargs)
if not self.identifier:
# this needs the auto increment ID from the save() above
self.identifier = self.get_identifier()
super().save(*args, **kwargs, broadcast=False)
super().save(*args, **kwargs, broadcast=False, update_fields={"identifier"})
def get_identifier(self):
"""custom-shelf-123 for the url"""
@ -71,7 +74,7 @@ class Shelf(OrderedCollectionMixin, BookWyrmModel):
@property
def local_path(self):
"""No slugs"""
return self.get_remote_id().replace(f"https://{DOMAIN}", "")
return self.get_remote_id().replace(BASE_URL, "")
def raise_not_deletable(self, viewer):
"""don't let anyone delete a default shelf"""
@ -100,10 +103,21 @@ class ShelfBook(CollectionItemMixin, BookWyrmModel):
activity_serializer = activitypub.ShelfItem
collection_field = "shelf"
def save(self, *args, priority=BROADCAST, **kwargs):
def save(
self,
*args,
priority=BROADCAST,
update_fields: Optional[Iterable[str]] = None,
**kwargs,
):
if not self.user:
self.user = self.shelf.user
if self.id and self.user.local:
update_fields = add_update_fields(update_fields, "user")
is_update = self.id is not None
super().save(*args, priority=priority, update_fields=update_fields, **kwargs)
if is_update and self.user.local:
# remove all caches related to all editions of this book
cache.delete_many(
[
@ -111,7 +125,6 @@ class ShelfBook(CollectionItemMixin, BookWyrmModel):
for book in self.book.parent_work.editions.all()
]
)
super().save(*args, priority=priority, **kwargs)
def delete(self, *args, **kwargs):
if self.id and self.user.local:

View file

@ -1,5 +1,6 @@
""" the particulars for this instance of BookWyrm """
import datetime
from typing import Optional, Iterable
from urllib.parse import urljoin
import uuid
@ -10,8 +11,12 @@ from django.dispatch import receiver
from django.utils import timezone
from model_utils import FieldTracker
from bookwyrm.connectors.abstract_connector import get_data
from bookwyrm.preview_images import generate_site_preview_image_task
from bookwyrm.settings import DOMAIN, ENABLE_PREVIEW_IMAGES, STATIC_FULL_URL
from bookwyrm.settings import BASE_URL, ENABLE_PREVIEW_IMAGES, STATIC_FULL_URL
from bookwyrm.settings import RELEASE_API
from bookwyrm.tasks import app, MISC
from bookwyrm.utils.db import add_update_fields
from .base_model import BookWyrmModel, new_access_code
from .user import User
from .fields import get_absolute_url
@ -45,7 +50,7 @@ class SiteSettings(SiteModel):
default_theme = models.ForeignKey(
"Theme", null=True, blank=True, on_delete=models.SET_NULL
)
version = models.CharField(null=True, blank=True, max_length=10)
available_version = models.CharField(null=True, blank=True, max_length=10)
# admin setup options
install_mode = models.BooleanField(default=False)
@ -133,16 +138,19 @@ class SiteSettings(SiteModel):
return get_absolute_url(uploaded)
return urljoin(STATIC_FULL_URL, default_path)
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""if require_confirm_email is disabled, make sure no users are pending,
if enabled, make sure invite_question_text is not empty"""
if not self.invite_question_text:
self.invite_question_text = "What is your favourite book?"
update_fields = add_update_fields(update_fields, "invite_question_text")
super().save(*args, update_fields=update_fields, **kwargs)
if not self.require_confirm_email:
User.objects.filter(is_active=False, deactivation_reason="pending").update(
is_active=True, deactivation_reason=None
)
if not self.invite_question_text:
self.invite_question_text = "What is your favourite book?"
super().save(*args, **kwargs)
class Theme(SiteModel):
@ -185,7 +193,7 @@ class SiteInvite(models.Model):
@property
def link(self):
"""formats the invite link"""
return f"https://{DOMAIN}/invite/{self.code}"
return f"{BASE_URL}/invite/{self.code}"
class InviteRequest(BookWyrmModel):
@ -232,7 +240,7 @@ class PasswordReset(models.Model):
@property
def link(self):
"""formats the invite link"""
return f"https://{DOMAIN}/password-reset/{self.code}"
return f"{BASE_URL}/password-reset/{self.code}"
# pylint: disable=unused-argument
@ -245,3 +253,14 @@ def preview_image(instance, *args, **kwargs):
if len(changed_fields) > 0:
generate_site_preview_image_task.delay()
@app.task(queue=MISC)
def check_for_updates_task():
"""See if git remote knows about a new version"""
site = SiteSettings.objects.get()
release = get_data(RELEASE_API, timeout=3)
available_version = release.get("tag_name", None)
if available_version:
site.available_version = available_version
site.save(update_fields=["available_version"])
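The task only needs a JSON payload with a tag_name key from RELEASE_API (presumably a GitHub-style releases endpoint); an illustrative payload:

# illustrative response shape only; the real endpoint may include more fields
release = {"tag_name": "v0.7.3", "name": "BookWyrm 0.7.3"}
release.get("tag_name", None)  # -> "v0.7.3", stored as available_version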

View file

@ -1,6 +1,6 @@
""" models for storing different kinds of Activities """
from dataclasses import MISSING
from typing import Optional
from typing import Optional, Iterable
import re
from django.apps import apps
@ -12,12 +12,15 @@ from django.db.models import Q
from django.dispatch import receiver
from django.template.loader import get_template
from django.utils import timezone
from django.utils.translation import gettext_lazy as _
from django.utils.translation import ngettext_lazy
from model_utils import FieldTracker
from model_utils.managers import InheritanceManager
from bookwyrm import activitypub
from bookwyrm.preview_images import generate_edition_preview_image_task
from bookwyrm.settings import ENABLE_PREVIEW_IMAGES
from bookwyrm.utils.db import add_update_fields
from .activitypub_mixin import ActivitypubMixin, ActivityMixin
from .activitypub_mixin import OrderedCollectionPageMixin
from .base_model import BookWyrmModel
@ -78,13 +81,18 @@ class Status(OrderedCollectionPageMixin, BookWyrmModel):
"""default sorting"""
ordering = ("-published_date",)
indexes = [
models.Index(fields=["remote_id"]),
models.Index(fields=["thread_id"]),
]
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""save and notify"""
if self.reply_parent:
if self.thread_id is None and self.reply_parent:
self.thread_id = self.reply_parent.thread_id or self.reply_parent_id
update_fields = add_update_fields(update_fields, "thread_id")
super().save(*args, **kwargs)
super().save(*args, update_fields=update_fields, **kwargs)
if not self.reply_parent:
self.thread_id = self.id
@ -107,14 +115,14 @@ class Status(OrderedCollectionPageMixin, BookWyrmModel):
@property
def recipients(self):
"""tagged users who definitely need to get this status in broadcast"""
mentions = [u for u in self.mention_users.all() if not u.local]
mentions = {u for u in self.mention_users.all() if not u.local}
if (
hasattr(self, "reply_parent")
and self.reply_parent
and not self.reply_parent.user.local
):
mentions.append(self.reply_parent.user)
return list(set(mentions))
mentions.add(self.reply_parent.user)
return list(mentions)
@classmethod
def ignore_activity(
@ -178,6 +186,24 @@ class Status(OrderedCollectionPageMixin, BookWyrmModel):
"""you can't boost dms"""
return self.privacy in ["unlisted", "public"]
@property
def page_title(self):
"""title of the page when only this status is shown"""
return _("%(display_name)s's status") % {"display_name": self.user.display_name}
@property
def page_description(self):
"""description of the page in meta tags when only this status is shown"""
return None
@property
def page_image(self):
"""image to use as preview in meta tags when only this status is shown"""
if self.mention_books.exists():
book = self.mention_books.first()
return book.preview_image or book.cover
return self.user.preview_image
def to_replies(self, **kwargs):
"""helper function for loading AP serialized replies to a status"""
return self.to_ordered_collection(
@ -301,6 +327,10 @@ class BookStatus(Status):
abstract = True
@property
def page_image(self):
return self.book.preview_image or self.book.cover or super().page_image
class Comment(BookStatus):
"""like a review but without a rating and transient"""
@ -332,17 +362,26 @@ class Comment(BookStatus):
activity_serializer = activitypub.Comment
@property
def page_title(self):
return _("%(display_name)s's comment on %(book_title)s") % {
"display_name": self.user.display_name,
"book_title": self.book.title,
}
class Quotation(BookStatus):
"""like a review but without a rating and transient"""
quote = fields.HtmlField()
raw_quote = models.TextField(blank=True, null=True)
position = models.IntegerField(
validators=[MinValueValidator(0)], null=True, blank=True
position = models.TextField(
null=True,
blank=True,
)
endposition = models.IntegerField(
validators=[MinValueValidator(0)], null=True, blank=True
endposition = models.TextField(
null=True,
blank=True,
)
position_mode = models.CharField(
max_length=3,
@ -355,10 +394,10 @@ class Quotation(BookStatus):
def _format_position(self) -> Optional[str]:
"""serialize page position"""
beg = self.position
end = self.endposition or 0
end = self.endposition
if self.position_mode != "PG" or not beg:
return None
return f"pp. {beg}-{end}" if end > beg else f"p. {beg}"
return f"pp. {beg}-{end}" if end else f"p. {beg}"
@property
def pure_content(self):
@ -374,6 +413,13 @@ class Quotation(BookStatus):
activity_serializer = activitypub.Quotation
@property
def page_title(self):
return _("%(display_name)s's quote from %(book_title)s") % {
"display_name": self.user.display_name,
"book_title": self.book.title,
}
class Review(BookStatus):
"""a book review"""
@ -403,14 +449,22 @@ class Review(BookStatus):
"""indicate the book in question for mastodon (or w/e) users"""
return self.content
@property
def page_title(self):
return _("%(display_name)s's review of %(book_title)s") % {
"display_name": self.user.display_name,
"book_title": self.book.title,
}
activity_serializer = activitypub.Review
pure_type = "Article"
def save(self, *args, **kwargs):
"""clear rating caches"""
super().save(*args, **kwargs)
if self.book.parent_work:
cache.delete(f"book-rating-{self.book.parent_work.id}")
super().save(*args, **kwargs)
class ReviewRating(Review):
@ -426,6 +480,18 @@ class ReviewRating(Review):
template = get_template("snippets/generated_status/rating.html")
return template.render({"book": self.book, "rating": self.rating}).strip()
@property
def page_description(self):
return ngettext_lazy(
"%(display_name)s rated %(book_title)s: %(display_rating).1f star",
"%(display_name)s rated %(book_title)s: %(display_rating).1f stars",
"display_rating",
) % {
"display_name": self.user.display_name,
"book_title": self.book.title,
"display_rating": self.rating,
}
activity_serializer = activitypub.Rating
pure_type = "Note"
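
ReviewRating.page_description above passes the string "display_rating" as the third argument to ngettext_lazy; Django then chooses singular or plural lazily, from whatever value sits under that key when the % interpolation runs. A small illustration of the idea (assumes a configured Django settings module; the star wording is illustrative, not a string from this codebase):

from django.utils.translation import ngettext_lazy

# The third argument names the key whose value decides singular vs. plural
# at interpolation time, rather than fixing the form up front.
stars = ngettext_lazy("%(count)d star", "%(count)d stars", "count")

print(stars % {"count": 1})  # "1 star"
print(stars % {"count": 4})  # "4 stars"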


@ -1,28 +1,31 @@
""" database schema for user data """
import datetime
import re
import zoneinfo
from typing import Optional, Iterable
from urllib.parse import urlparse
from uuid import uuid4
from django.apps import apps
from django.contrib.auth.models import AbstractUser
from django.contrib.postgres.fields import ArrayField, CICharField
from django.contrib.postgres.fields import ArrayField as DjangoArrayField
from django.core.exceptions import PermissionDenied, ObjectDoesNotExist
from django.dispatch import receiver
from django.db import models, transaction, IntegrityError
from django.utils import timezone
from django.utils.translation import gettext_lazy as _
from model_utils import FieldTracker
import pytz
from bookwyrm import activitypub
from bookwyrm.connectors import get_data, ConnectorException
from bookwyrm.models.shelf import Shelf
from bookwyrm.models.status import Status
from bookwyrm.preview_images import generate_user_preview_image_task
from bookwyrm.settings import DOMAIN, ENABLE_PREVIEW_IMAGES, USE_HTTPS, LANGUAGES
from bookwyrm.settings import BASE_URL, ENABLE_PREVIEW_IMAGES, LANGUAGES
from bookwyrm.signatures import create_key_pair
from bookwyrm.tasks import app, MISC
from bookwyrm.utils import regex
from bookwyrm.utils.db import add_update_fields
from .activitypub_mixin import OrderedCollectionPageMixin, ActivitypubMixin
from .base_model import BookWyrmModel, DeactivationReason, new_access_code
from .federated_server import FederatedServer
@ -42,12 +45,6 @@ def get_feed_filter_choices():
return [f[0] for f in FeedFilterChoices]
def site_link():
"""helper for generating links to the site"""
protocol = "https" if USE_HTTPS else "http"
return f"{protocol}://{DOMAIN}"
# pylint: disable=too-many-public-methods
class User(OrderedCollectionPageMixin, AbstractUser):
"""a user who wants to read books"""
@ -81,11 +78,12 @@ class User(OrderedCollectionPageMixin, AbstractUser):
summary = fields.HtmlField(null=True, blank=True)
local = models.BooleanField(default=False)
bookwyrm_user = fields.BooleanField(default=True)
localname = CICharField(
localname = models.CharField(
max_length=255,
null=True,
unique=True,
validators=[fields.validate_localname],
db_collation="case_insensitive",
)
# name is your display name, which you can change at will
name = fields.CharField(max_length=100, null=True, blank=True)
@ -162,7 +160,7 @@ class User(OrderedCollectionPageMixin, AbstractUser):
show_guided_tour = models.BooleanField(default=True)
# feed options
feed_status_types = ArrayField(
feed_status_types = DjangoArrayField(
models.CharField(max_length=10, blank=False, choices=FeedFilterChoices),
size=8,
default=get_feed_filter_choices,
@ -171,8 +169,8 @@ class User(OrderedCollectionPageMixin, AbstractUser):
summary_keys = models.JSONField(null=True)
preferred_timezone = models.CharField(
choices=[(str(tz), str(tz)) for tz in pytz.all_timezones],
default=str(pytz.utc),
choices=[(str(tz), str(tz)) for tz in sorted(zoneinfo.available_timezones())],
default=str(datetime.timezone.utc),
max_length=255,
)
preferred_language = models.CharField(
@ -198,6 +196,14 @@ class User(OrderedCollectionPageMixin, AbstractUser):
hotp_secret = models.CharField(max_length=32, default=None, blank=True, null=True)
hotp_count = models.IntegerField(default=0, blank=True, null=True)
class Meta(AbstractUser.Meta):
"""indexes"""
indexes = [
models.Index(fields=["username"]),
models.Index(fields=["is_active", "local"]),
]
@property
def active_follower_requests(self):
"""Follow requests from active users"""
@ -206,8 +212,7 @@ class User(OrderedCollectionPageMixin, AbstractUser):
@property
def confirmation_link(self):
"""helper for generating confirmation links"""
link = site_link()
return f"{link}/confirm-email/{self.confirmation_code}"
return f"{BASE_URL}/confirm-email/{self.confirmation_code}"
@property
def following_link(self):
@ -335,13 +340,14 @@ class User(OrderedCollectionPageMixin, AbstractUser):
]
return activity_object
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""populate fields for new local users"""
created = not bool(self.id)
if not self.local and not re.match(regex.FULL_USERNAME, self.username):
# generate a username that uses the domain (webfinger format)
actor_parts = urlparse(self.remote_id)
self.username = f"{self.username}@{actor_parts.netloc}"
self.username = f"{self.username}@{actor_parts.hostname}"
update_fields = add_update_fields(update_fields, "username")
# this user already exists, no need to populate fields
if not created:
@ -350,26 +356,34 @@ class User(OrderedCollectionPageMixin, AbstractUser):
elif not self.deactivation_date:
self.deactivation_date = timezone.now()
super().save(*args, **kwargs)
super().save(*args, update_fields=update_fields, **kwargs)
return
# this is a new remote user, we need to set their remote server field
if not self.local:
super().save(*args, **kwargs)
super().save(*args, update_fields=update_fields, **kwargs)
transaction.on_commit(lambda: set_remote_server(self.id))
return
with transaction.atomic():
# populate fields for local users
link = site_link()
self.remote_id = f"{link}/user/{self.localname}"
self.remote_id = f"{BASE_URL}/user/{self.localname}"
self.followers_url = f"{self.remote_id}/followers"
self.inbox = f"{self.remote_id}/inbox"
self.shared_inbox = f"{link}/inbox"
self.shared_inbox = f"{BASE_URL}/inbox"
self.outbox = f"{self.remote_id}/outbox"
update_fields = add_update_fields(
update_fields,
"remote_id",
"followers_url",
"inbox",
"shared_inbox",
"outbox",
)
# an id needs to be set before we can proceed with related models
super().save(*args, **kwargs)
super().save(*args, update_fields=update_fields, **kwargs)
# make users editors by default
try:
@ -509,18 +523,30 @@ class KeyPair(ActivitypubMixin, BookWyrmModel):
activity_serializer = activitypub.PublicKey
serialize_reverse_fields = [("owner", "owner", "id")]
class Meta:
"""indexes"""
indexes = [
models.Index(fields=["remote_id"]),
]
def get_remote_id(self):
# self.owner is set by the OneToOneField on User
return f"{self.owner.remote_id}/#main-key"
def save(self, *args, **kwargs):
def save(self, *args, update_fields: Optional[Iterable[str]] = None, **kwargs):
"""create a key pair"""
# no broadcasting happening here
if "broadcast" in kwargs:
del kwargs["broadcast"]
if not self.public_key:
self.private_key, self.public_key = create_key_pair()
return super().save(*args, **kwargs)
update_fields = add_update_fields(
update_fields, "private_key", "public_key"
)
super().save(*args, update_fields=update_fields, **kwargs)
@app.task(queue=MISC)
@ -543,7 +569,7 @@ def set_remote_server(user_id, allow_external_connections=False):
user = User.objects.get(id=user_id)
actor_parts = urlparse(user.remote_id)
federated_server = get_or_create_remote_server(
actor_parts.netloc, allow_external_connections=allow_external_connections
actor_parts.hostname, allow_external_connections=allow_external_connections
)
# if we were unable to find the server, we need to create a new entry for it
if not federated_server:
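
The switch from actor_parts.netloc to actor_parts.hostname above changes what ends up in generated usernames and in the federated-server lookup: netloc keeps any explicit port and the original casing, while hostname is lower-cased and port-free. A quick illustration with a made-up actor URL:

from urllib.parse import urlparse

parts = urlparse("https://Books.Example.Social:8443/user/mouse")  # hypothetical actor id
print(parts.netloc)    # 'Books.Example.Social:8443' -- keeps the port and the casing
print(parts.hostname)  # 'books.example.social'      -- normalised, no port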


@ -1,4 +1,5 @@
""" Generate social media preview images for twitter/mastodon/etc """
import math
import os
import textwrap
@ -42,8 +43,8 @@ def get_imagefont(name, size):
return ImageFont.truetype(path, size)
except KeyError:
logger.error("Font %s not found in config", name)
except OSError:
logger.error("Could not load font %s from file", name)
except OSError as err:
logger.error("Could not load font %s from file: %s", name, err)
return ImageFont.load_default()
@ -59,7 +60,7 @@ def get_font(weight, size=28):
font.set_variation_by_name("Bold")
if weight == "regular":
font.set_variation_by_name("Regular")
except AttributeError:
except OSError:
pass
return font
@ -174,11 +175,13 @@ def generate_instance_layer(content_width):
site = models.SiteSettings.objects.get()
if site.logo_small:
logo_img = Image.open(site.logo_small)
with Image.open(site.logo_small) as logo_img:
logo_img.load()
else:
try:
static_path = os.path.join(settings.STATIC_ROOT, "images/logo-small.png")
logo_img = Image.open(static_path)
with Image.open(static_path) as logo_img:
logo_img.load()
except FileNotFoundError:
logo_img = None
@ -210,18 +213,9 @@ def generate_instance_layer(content_width):
def generate_rating_layer(rating, content_width):
"""Places components for rating preview"""
try:
icon_star_full = Image.open(
os.path.join(settings.STATIC_ROOT, "images/icons/star-full.png")
)
icon_star_empty = Image.open(
os.path.join(settings.STATIC_ROOT, "images/icons/star-empty.png")
)
icon_star_half = Image.open(
os.path.join(settings.STATIC_ROOT, "images/icons/star-half.png")
)
except FileNotFoundError:
return None
path_star_full = os.path.join(settings.STATIC_ROOT, "images/icons/star-full.png")
path_star_empty = os.path.join(settings.STATIC_ROOT, "images/icons/star-empty.png")
path_star_half = os.path.join(settings.STATIC_ROOT, "images/icons/star-half.png")
icon_size = 64
icon_margin = 10
@ -236,17 +230,23 @@ def generate_rating_layer(rating, content_width):
position_x = 0
for _ in range(math.floor(rating)):
rating_layer_mask.alpha_composite(icon_star_full, (position_x, 0))
position_x = position_x + icon_size + icon_margin
try:
with Image.open(path_star_full) as icon_star_full:
for _ in range(math.floor(rating)):
rating_layer_mask.alpha_composite(icon_star_full, (position_x, 0))
position_x = position_x + icon_size + icon_margin
if math.floor(rating) != math.ceil(rating):
rating_layer_mask.alpha_composite(icon_star_half, (position_x, 0))
position_x = position_x + icon_size + icon_margin
if math.floor(rating) != math.ceil(rating):
with Image.open(path_star_half) as icon_star_half:
rating_layer_mask.alpha_composite(icon_star_half, (position_x, 0))
position_x = position_x + icon_size + icon_margin
for _ in range(5 - math.ceil(rating)):
rating_layer_mask.alpha_composite(icon_star_empty, (position_x, 0))
position_x = position_x + icon_size + icon_margin
with Image.open(path_star_empty) as icon_star_empty:
for _ in range(5 - math.ceil(rating)):
rating_layer_mask.alpha_composite(icon_star_empty, (position_x, 0))
position_x = position_x + icon_size + icon_margin
except FileNotFoundError:
return None
rating_layer_mask = rating_layer_mask.getchannel("A")
rating_layer_mask = ImageOps.invert(rating_layer_mask)
@ -289,7 +289,8 @@ def generate_preview_image(
texts = texts or {}
# Cover
try:
inner_img_layer = Image.open(picture)
with Image.open(picture) as inner_img_layer:
inner_img_layer.load()
inner_img_layer.thumbnail(
(inner_img_width, inner_img_height), Image.Resampling.LANCZOS
)
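
The preview-image changes above wrap each Image.open() in a with block and call load() before the handle closes; Pillow reads pixel data lazily, so without the explicit load() the image could fail later, once its underlying file is gone. A minimal sketch of the pattern, using a hypothetical path:

from PIL import Image

def open_loaded(path):
    """Open an image and force its pixel data into memory before the file closes."""
    with Image.open(path) as img:
        img.load()  # open() only reads headers; load() pulls in the pixel data now
    return img      # still usable even though the file handle is closed

logo = open_loaded("static/images/logo-small.png")  # hypothetical path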


@ -19,7 +19,6 @@ DOMAIN = env("DOMAIN")
with open("VERSION", encoding="utf-8") as f:
version = f.read()
version = version.replace("\n", "")
f.close()
VERSION = version
@ -30,6 +29,9 @@ RELEASE_API = env(
PAGE_LENGTH = env.int("PAGE_LENGTH", 15)
DEFAULT_LANGUAGE = env("DEFAULT_LANGUAGE", "English")
# TODO: extend maximum age to 1 year once termination of active sessions
# is implemented (see bookwyrm-social#2278, bookwyrm-social#3082).
SESSION_COOKIE_AGE = env.int("SESSION_COOKIE_AGE", 3600 * 24 * 30) # 1 month
JS_CACHE = "8a89cad7"
@ -99,12 +101,14 @@ INSTALLED_APPS = [
"django.contrib.messages",
"django.contrib.staticfiles",
"django.contrib.humanize",
"oauth2_provider",
"file_resubmit",
"sass_processor",
"bookwyrm",
"celery",
"django_celery_beat",
"imagekit",
"pgtrigger",
"storages",
]
@ -253,11 +257,8 @@ if env.bool("USE_DUMMY_CACHE", False):
else:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"BACKEND": "django.core.cache.backends.redis.RedisCache",
"LOCATION": REDIS_ACTIVITY_URL,
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
},
},
"file_resubmit": {
"BACKEND": "django.core.cache.backends.filebased.FileBasedCache",
@ -318,6 +319,7 @@ LANGUAGES = [
("eu-es", _("Euskara (Basque)")),
("gl-es", _("Galego (Galician)")),
("it-it", _("Italiano (Italian)")),
("ko-kr", _("한국어 (Korean)")),
("fi-fi", _("Suomi (Finnish)")),
("fr-fr", _("Français (French)")),
("lt-lt", _("Lietuvių (Lithuanian)")),
@ -342,35 +344,37 @@ TIME_ZONE = "UTC"
USE_I18N = True
USE_L10N = True
USE_TZ = True
agent = requests.utils.default_user_agent()
USER_AGENT = f"{agent} (BookWyrm/{VERSION}; +https://{DOMAIN}/)"
# Imagekit generated thumbnails
ENABLE_THUMBNAIL_GENERATION = env.bool("ENABLE_THUMBNAIL_GENERATION", False)
IMAGEKIT_CACHEFILE_DIR = "thumbnails"
IMAGEKIT_DEFAULT_CACHEFILE_STRATEGY = "bookwyrm.thumbnail_generation.Strategy"
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/3.2/howto/static-files/
PROJECT_DIR = os.path.dirname(os.path.abspath(__file__))
CSP_ADDITIONAL_HOSTS = env.list("CSP_ADDITIONAL_HOSTS", [])
# Storage
PROTOCOL = "http"
if USE_HTTPS:
PROTOCOL = "https"
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
PORT = env.int("PORT", 443 if USE_HTTPS else 80)
if (USE_HTTPS and PORT == 443) or (not USE_HTTPS and PORT == 80):
NETLOC = DOMAIN
else:
NETLOC = f"{DOMAIN}:{PORT}"
BASE_URL = f"{PROTOCOL}://{NETLOC}"
CSRF_TRUSTED_ORIGINS = [BASE_URL]
USER_AGENT = f"BookWyrm (BookWyrm/{VERSION}; +{BASE_URL})"
# Storage
USE_S3 = env.bool("USE_S3", False)
USE_AZURE = env.bool("USE_AZURE", False)
S3_SIGNED_URL_EXPIRY = env.int("S3_SIGNED_URL_EXPIRY", 900)
if USE_S3:
# AWS settings
@ -382,44 +386,117 @@ if USE_S3:
AWS_S3_ENDPOINT_URL = env("AWS_S3_ENDPOINT_URL", None)
AWS_DEFAULT_ACL = "public-read"
AWS_S3_OBJECT_PARAMETERS = {"CacheControl": "max-age=86400"}
AWS_S3_URL_PROTOCOL = env("AWS_S3_URL_PROTOCOL", f"{PROTOCOL}:")
# Storages
STORAGES = {
"default": {
"BACKEND": "storages.backends.s3.S3Storage",
"OPTIONS": {
"location": "images",
"default_acl": "public-read",
"file_overwrite": False,
},
},
"staticfiles": {
"BACKEND": "storages.backends.s3.S3Storage",
"OPTIONS": {
"location": "static",
"default_acl": "public-read",
},
},
"exports": {
"BACKEND": "storages.backends.s3.S3Storage",
"OPTIONS": {
"location": "images",
"default_acl": None,
"file_overwrite": False,
},
},
}
# S3 Static settings
STATIC_LOCATION = "static"
STATIC_URL = f"{PROTOCOL}://{AWS_S3_CUSTOM_DOMAIN}/{STATIC_LOCATION}/"
STATICFILES_STORAGE = "bookwyrm.storage_backends.StaticStorage"
STATIC_URL = f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}/{STATIC_LOCATION}/"
STATIC_FULL_URL = STATIC_URL
# S3 Media settings
MEDIA_LOCATION = "images"
MEDIA_URL = f"{PROTOCOL}://{AWS_S3_CUSTOM_DOMAIN}/{MEDIA_LOCATION}/"
MEDIA_URL = f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}/{MEDIA_LOCATION}/"
MEDIA_FULL_URL = MEDIA_URL
STATIC_FULL_URL = STATIC_URL
DEFAULT_FILE_STORAGE = "bookwyrm.storage_backends.ImagesStorage"
CSP_DEFAULT_SRC = ["'self'", AWS_S3_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = ["'self'", AWS_S3_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
# Content Security Policy
CSP_DEFAULT_SRC = [
"'self'",
f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}"
if AWS_S3_CUSTOM_DOMAIN
else None,
] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = [
"'self'",
f"{AWS_S3_URL_PROTOCOL}//{AWS_S3_CUSTOM_DOMAIN}"
if AWS_S3_CUSTOM_DOMAIN
else None,
] + CSP_ADDITIONAL_HOSTS
elif USE_AZURE:
# Azure settings
AZURE_ACCOUNT_NAME = env("AZURE_ACCOUNT_NAME")
AZURE_ACCOUNT_KEY = env("AZURE_ACCOUNT_KEY")
AZURE_CONTAINER = env("AZURE_CONTAINER")
AZURE_CUSTOM_DOMAIN = env("AZURE_CUSTOM_DOMAIN")
# Storages
STORAGES = {
"default": {
"BACKEND": "storages.backends.azure_storage.AzureStorage",
"OPTIONS": {
"location": "images",
"overwrite_files": False,
},
},
"staticfiles": {
"BACKEND": "storages.backends.azure_storage.AzureStorage",
"OPTIONS": {
"location": "static",
},
},
"exports": {
"BACKEND": None, # not implemented yet
},
}
# Azure Static settings
STATIC_LOCATION = "static"
STATIC_URL = (
f"{PROTOCOL}://{AZURE_CUSTOM_DOMAIN}/{AZURE_CONTAINER}/{STATIC_LOCATION}/"
)
STATICFILES_STORAGE = "bookwyrm.storage_backends.AzureStaticStorage"
STATIC_FULL_URL = STATIC_URL
# Azure Media settings
MEDIA_LOCATION = "images"
MEDIA_URL = (
f"{PROTOCOL}://{AZURE_CUSTOM_DOMAIN}/{AZURE_CONTAINER}/{MEDIA_LOCATION}/"
)
MEDIA_FULL_URL = MEDIA_URL
STATIC_FULL_URL = STATIC_URL
DEFAULT_FILE_STORAGE = "bookwyrm.storage_backends.AzureImagesStorage"
# Content Security Policy
CSP_DEFAULT_SRC = ["'self'", AZURE_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = ["'self'", AZURE_CUSTOM_DOMAIN] + CSP_ADDITIONAL_HOSTS
else:
# Storages
STORAGES = {
"default": {
"BACKEND": "django.core.files.storage.FileSystemStorage",
},
"staticfiles": {
"BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
},
"exports": {
"BACKEND": "django.core.files.storage.FileSystemStorage",
"OPTIONS": {
"location": "exports",
},
},
}
# Static settings
STATIC_URL = "/static/"
STATIC_FULL_URL = BASE_URL + STATIC_URL
# Media settings
MEDIA_URL = "/images/"
MEDIA_FULL_URL = f"{PROTOCOL}://{DOMAIN}{MEDIA_URL}"
STATIC_FULL_URL = f"{PROTOCOL}://{DOMAIN}{STATIC_URL}"
MEDIA_FULL_URL = BASE_URL + MEDIA_URL
# Content Security Policy
CSP_DEFAULT_SRC = ["'self'"] + CSP_ADDITIONAL_HOSTS
CSP_SCRIPT_SRC = ["'self'"] + CSP_ADDITIONAL_HOSTS
@ -443,4 +520,6 @@ if HTTP_X_FORWARDED_PROTO:
# user with the same username - in which case you should change it!
INSTANCE_ACTOR_USERNAME = "bookwyrm.instance.actor"
DATA_UPLOAD_MAX_MEMORY_SIZE = env.int("DATA_UPLOAD_MAX_MEMORY_SIZE", (1024**2 * 100))
# We only allow specifying DATA_UPLOAD_MAX_MEMORY_SIZE in MiB from .env
# (note the difference in variable names).
DATA_UPLOAD_MAX_MEMORY_SIZE = env.int("DATA_UPLOAD_MAX_MEMORY_MiB", 100) << 20
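
Two settings changes in this file are easier to read outside the diff: BASE_URL only carries the port when it is not the default for the scheme, and the upload limit is now configured in MiB (DATA_UPLOAD_MAX_MEMORY_MiB) and shifted into bytes. A standalone recap with illustrative values (example.net and 8443 are placeholders):

USE_HTTPS = True
DOMAIN = "example.net"
PORT = 8443  # non-default port, so it is kept in the URL

PROTOCOL = "https" if USE_HTTPS else "http"
if (USE_HTTPS and PORT == 443) or (not USE_HTTPS and PORT == 80):
    NETLOC = DOMAIN
else:
    NETLOC = f"{DOMAIN}:{PORT}"
BASE_URL = f"{PROTOCOL}://{NETLOC}"        # -> "https://example.net:8443"

DATA_UPLOAD_MAX_MEMORY_SIZE = 100 << 20    # 100 MiB -> 104857600 bytes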


@ -111,6 +111,10 @@ const tries = {
},
},
f: {
b: {
2: "FB2",
3: "FB3",
},
l: {
a: {
c: "FLAC",


@ -1,63 +0,0 @@
"""Handles backends for storages"""
import os
from tempfile import SpooledTemporaryFile
from storages.backends.s3boto3 import S3Boto3Storage
from storages.backends.azure_storage import AzureStorage
class StaticStorage(S3Boto3Storage): # pylint: disable=abstract-method
"""Storage class for Static contents"""
location = "static"
default_acl = "public-read"
class ImagesStorage(S3Boto3Storage): # pylint: disable=abstract-method
"""Storage class for Image files"""
location = "images"
default_acl = "public-read"
file_overwrite = False
"""
This is our custom version of S3Boto3Storage that fixes a bug in
boto3 where the passed in file is closed upon upload.
From:
https://github.com/matthewwithanm/django-imagekit/issues/391#issuecomment-275367006
https://github.com/boto/boto3/issues/929
https://github.com/matthewwithanm/django-imagekit/issues/391
"""
def _save(self, name, content):
"""
We create a clone of the content file as when this is passed to
boto3 it wrongly closes the file upon upload whereas the storage
backend expects it to still be open
"""
# Seek our content back to the start
content.seek(0, os.SEEK_SET)
# Create a temporary file that will write to disk after a specified
# size. This file will be automatically deleted when closed by
# boto3 or after exiting the `with` statement if the boto3 is fixed
with SpooledTemporaryFile() as content_autoclose:
# Write our original content into our copy that will be closed by boto3
content_autoclose.write(content.read())
# Upload the object which will auto close the
# content_autoclose instance
return super()._save(name, content_autoclose)
class AzureStaticStorage(AzureStorage): # pylint: disable=abstract-method
"""Storage class for Static contents"""
location = "static"
class AzureImagesStorage(AzureStorage): # pylint: disable=abstract-method
"""Storage class for Image files"""
location = "images"
overwrite_files = False
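
With these custom storage classes removed, backends are declared in the STORAGES setting shown earlier in this diff, and Django 4.2 exposes them through a storages handler. A minimal sketch of how calling code might reach the exports backend under that setup (the "exports" alias comes from the settings diff, the file name is illustrative, and the Azure branch leaves this alias unimplemented):

from django.core.files.base import ContentFile
from django.core.files.storage import storages

# Fetch the backend registered under the "exports" alias in STORAGES
# and save a file through it, letting the backend pick the final name.
exports_storage = storages["exports"]
saved_name = exports_storage.save("user-export.tar.gz", ContentFile(b"..."))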


@ -8,7 +8,7 @@
<h1 class="title">{% trans "File too large" %}</h1>
<p class="content">{% trans "The file you are uploading is too large." %}</p>
<p class="content">
{% blocktrans %}
{% blocktrans trimmed %}
You can try using a smaller file, or ask your BookWyrm server administrator to increase the <code>DATA_UPLOAD_MAX_MEMORY_SIZE</code> setting.
{% endblocktrans %}
</p>


@ -31,10 +31,10 @@
</p>
</div>
<div class="columns">
<div class="columns is-multiline">
{% if superlatives.top_rated %}
{% with book=superlatives.top_rated.default_edition rating=superlatives.top_rated.rating %}
<div class="column is-one-third is-flex">
<div class="column is-half-tablet is-one-third-desktop is-flex-grow-1 is-flex">
<div class="media notification is-clipped">
<div class="media-left">
<a href="{{ book.local_path }}">
@ -53,7 +53,7 @@
{% if superlatives.wanted %}
{% with book=superlatives.wanted.default_edition %}
<div class="column is-one-third is-flex">
<div class="column is-half-tablet is-one-third-desktop is-flex-grow-1 is-flex">
<div class="media notification is-clipped">
<div class="media-left">
<a href="{{ book.local_path }}">
@ -72,7 +72,7 @@
{% if superlatives.controversial %}
{% with book=superlatives.controversial.default_edition %}
<div class="column is-one-third is-flex">
<div class="column is-half-tablet is-one-third-desktop is-flex-grow-1 is-flex">
<div class="media notification is-clipped">
<div class="media-left">
<a href="{{ book.local_path }}">


@ -55,6 +55,8 @@
<p class="field"><label class="label" for="id_wikipedia_link">{% trans "Wikipedia link:" %}</label> {{ form.wikipedia_link }}</p>
<p class="field"><label class="label" for="id_wikidata">{% trans "Wikidata:" %}</label> {{ form.wikidata }}</p>
{% include 'snippets/form_errors.html' with errors_list=form.wikipedia_link.errors id="desc_wikipedia_link" %}
<p class="field"><label class="label" for="id_website">{% trans "Website:" %}</label> {{ form.website }}</p>


@ -9,7 +9,8 @@
{% block title %}{{ book|book_title }}{% endblock %}
{% block opengraph %}
{% include 'snippets/opengraph.html' with title=book.title description=book|book_description image=book.preview_image %}
{% firstof book.preview_image book.cover as book_image %}
{% include 'snippets/opengraph.html' with title=book.title description=book|book_description image=book_image %}
{% endblock %}
{% block content %}
@ -44,18 +45,22 @@
{% endif %}
{% if book.series %}
<meta itemprop="position" content="{{ book.series_number }}">
{% spaceless %}
<span itemprop="isPartOf" itemscope itemtype="https://schema.org/BookSeries">
{% if book.authors.exists %}
<a href="{% url 'book-series-by' book.authors.first.id %}?series_name={{ book.series | urlencode }}"
itemprop="url">
{% endif %}
<span itemprop="name">{{ book.series }}</span>
{% if book.series_number %} #{{ book.series_number }}{% endif %}
{% if book.authors.exists %}
</a>
{% endif %}
</span>
{% if book.series_number %}
<span>, #</span>
<span itemprop="position">{{ book.series_number }}</span>
{% endif %}
{% endspaceless %}
{% endif %}
</p>
{% endif %}


@ -6,8 +6,8 @@
{% block content %}
<h1 class="title">{% trans "Confirm your email address" %}</h1>
<div class="columns">
<div class="column">
<div class="columns is-multiline">
<div class="column is-full is-half-desktop">
<div class="block content">
<section class="block">
<p>{% trans "A confirmation code has been sent to the email address you used to register your account." %}</p>


@ -2,10 +2,10 @@
<div style="font-family: BlinkMacSystemFont,-apple-system,'Segoe UI',Roboto,Oxygen,Ubuntu,Cantarell,'Fira Sans','Droid Sans','Helvetica Neue',Helvetica,Arial,sans-serif; border-radius: 6px; background-color: #efefef; max-width: 632px;">
<div style="padding: 1rem; overflow: auto;">
<div style="float: left; margin-right: 1rem;">
<a style="color: #3273dc;" href="https://{{ domain }}" style="text-decoration: none;"><img src="{{ logo }}" alt="logo" loading="lazy" decoding="async"></a>
<a style="color: #3273dc;" href="{{ base_url }}" style="text-decoration: none;"><img src="{{ logo }}" alt="logo" loading="lazy" decoding="async"></a>
</div>
<div>
<a style="color: black; text-decoration: none" href="https://{{ domain }}" style="text-decoration: none;"><strong>{{ site_name }}</strong><br>
<a style="color: black; text-decoration: none" href="{{ base_url }}" style="text-decoration: none;"><strong>{{ site_name }}</strong><br>
{{ domain }}</a>
</div>
</div>
@ -18,9 +18,9 @@
</div>
<div style="padding: 1rem; font-size: 0.8rem;">
<p style="margin: 0; color: #333;">{% blocktrans %}BookWyrm hosted on <a style="color: #3273dc;" href="https://{{ domain }}">{{ site_name }}</a>{% endblocktrans %}</p>
<p style="margin: 0; color: #333;">{% blocktrans %}BookWyrm hosted on <a style="color: #3273dc;" href="{{ base_url }}">{{ site_name }}</a>{% endblocktrans %}</p>
{% if user %}
<p style="margin: 0; color: #333;"><a style="color: #3273dc;" href="https://{{ domain }}{% url 'prefs-profile' %}">{% trans "Email preference" %}</a></p>
<p style="margin: 0; color: #333;"><a style="color: #3273dc;" href="{{ base_url }}{% url 'prefs-profile' %}">{% trans "Email preference" %}</a></p>
{% endif %}
</div>
</div>


@ -12,6 +12,6 @@
<p>
{% url 'code-of-conduct' as coc_path %}
{% url 'about' as about_path %}
{% blocktrans %}Learn more <a href="https://{{ domain }}{{ about_path }}">about {{ site_name }}</a>.{% endblocktrans %}
{% blocktrans %}Learn more <a href="{{ base_url }}{{ about_path }}">about {{ site_name }}</a>.{% endblocktrans %}
</p>
{% endblock %}


@ -5,6 +5,6 @@
{{ invite_link }}
{% blocktrans %}Learn more about {{ site_name }}:{% endblocktrans %} https://{{ domain }}{% url 'about' %}
{% blocktrans %}Learn more about {{ site_name }}:{% endblocktrans %} {{ base_url }}{% url 'about' %}
{% endblock %}


@ -41,7 +41,7 @@
</section>
{% endif %}
{% if annual_summary_year and tab.key == 'home' %}
{% if annual_summary_year and tab.key == 'home' and has_summary_read_throughs %}
<section class="block is-hidden" data-hide="hide_annual_summary_{{ annual_summary_year }}">
{% include 'feed/summary_card.html' with year=annual_summary_year %}
<hr>


@ -2,13 +2,11 @@
{% load feed_page_tags %}
{% load i18n %}
{% block title %}{{ title }}{% endblock %}
{% block opengraph %}
{% firstof status.book status.mention_books.first as book %}
{% if book %}
{% include 'snippets/opengraph.html' with image=preview %}
{% else %}
{% include 'snippets/opengraph.html' %}
{% endif %}
{% include 'snippets/opengraph.html' with image=page_image %}
{% endblock %}


@ -6,8 +6,8 @@
{% block content %}
<h1 class="title">{% trans "Create an Account" %}</h1>
<div class="columns">
<div class="column">
<div class="columns is-multiline">
<div class="column is-full is-half-desktop">
<div class="block">
{% if valid %}
<div>


@ -6,7 +6,7 @@
{% block content %}
<h1 class="title">{% trans "Log in" %}</h1>
<div class="columns is-multiline">
<div class="column is-half">
<div class="column {% if site.allow_registration %}is-half{% else %}is-full is-half-desktop{% endif %}">
{% if login_form.non_field_errors %}
<p class="notification is-danger">{{ login_form.non_field_errors }}</p>
{% endif %}
@ -20,13 +20,15 @@
<div class="field">
<label class="label" for="id_localname_confirm">{% trans "Username:" %}</label>
<div class="control">
<input type="text" name="localname" maxlength="255" class="input" required="" id="id_localname_confirm" value="{{ login_form.localname.value|default:'' }}">
<input type="text" name="localname" maxlength="255" class="input" required=""
id="id_localname_confirm" value="{{ login_form.localname.value|default:'' }}">
</div>
</div>
<div class="field">
<label class="label" for="id_password_confirm">{% trans "Password:" %}</label>
<div class="control">
<input type="password" name="password" maxlength="128" class="input" required="" id="id_password_confirm" aria-describedby="desc_password">
<input type="password" name="password" maxlength="128" class="input" required=""
id="id_password_confirm" aria-describedby="desc_password">
</div>
{% include 'snippets/form_errors.html' with errors_list=login_form.password.errors id="desc_password" %}
@ -58,10 +60,10 @@
{% include 'snippets/about.html' %}
<p class="block">
<a href="{% url 'about' %}">{% trans "More about this site" %}</a>
<a href="{% url 'about' %}">{% trans "More about this site" %}</a>
</p>
</div>
</div>
</div>
{% endblock %}
{% endblock %}


@ -4,8 +4,8 @@
{% block title %}{% trans "Reset Password" %}{% endblock %}
{% block content %}
<div class="columns">
<div class="column">
<div class="columns is-multiline">
<div class="column is-full is-half-desktop">
<div class="block">
<h1 class="title">{% trans "Reset Password" %}</h1>

Some files were not shown because too many files have changed in this diff.