Mirror of https://github.com/bookwyrm-social/bookwyrm.git (synced 2025-04-24 03:04:10 +00:00)

Merge branch 'main' into openreads

Commit ba2ba9f265: 76 changed files with 5424 additions and 409 deletions
Changed files:

.env.example, CONTRIBUTING.md, FEDERATION.md, README.md, urls.py, requirements.txt
bookwyrm/
    activitypub/
    connectors/
    forms/
    importers/
    management/commands/
    migrations/
        0210_alter_connector_connector_file.py, 0211_author_finna_key_book_finna_key.py, 0212_userrelationshipimport_and_more.py
    models/
    templates/
        book/, import/, preferences/, rss/, search/, shelf/
    templatetags/
    tests/
        connectors/
        data/
            default_avi_exif.jpg, finna_author_search.json, finna_isbn_search.json, finna_record.json, finna_search.json, finna_versions.json, user_import.json
        importers/
        management/
        models/
        test_utils.py, views/
    utils/
    views/
.env.example

@@ -21,6 +21,7 @@ DEFAULT_LANGUAGE="English"
# Probably only necessary in development.
# PORT=1333

STATIC_ROOT=static/
MEDIA_ROOT=images/

# Database configuration
CONTRIBUTING.md (new file, +53 lines)

@@ -0,0 +1,53 @@
# Contributing to BookWyrm

Our goal is to make BookWyrm a kind and welcoming place where everyone can contribute to the success of the project. Here are some ways you can join the project:

## Report things that are confusing

We want BookWyrm to be a fun experience that is intuitive to understand. If you're confused by something, it's probably because it is confusing! We are always keen to improve our [documentation](https://docs.joinbookwyrm.com) and Guided Tour as well as the platform itself.

You can [create an issue to improve our documentation](https://github.com/bookwyrm-social/documentation/issues) or, if you prefer, [ask for help in our Matrix chat room](https://app.element.io/#/room/#bookwyrm:matrix.org).

## Report bugs

Sometimes things don't work the way we intended. We would love to have fewer bugs, but we can only fix them if we know about them.

You can [report bugs](https://github.com/bookwyrm-social/bookwyrm/issues) by clicking "New Issue". The more information you can provide, the easier it will be to understand the problem and squash that bug!

It's a good idea to search the Issues for key words associated with your bug first, because someone else may have already reported it.

## Request and discuss new features

Got a great idea for an improvement to BookWyrm? You can [request new features](https://github.com/bookwyrm-social/bookwyrm/issues) by clicking "New Issue".

It's a good idea to search the Issues for key words associated with your feature suggestion first, because someone else may have already requested it.

## Translate BookWyrm into international languages

Books are written in many languages, and BookWyrm should be too. If you know more than one language, you might be able to help us to [translate BookWyrm](https://translate.joinbookwyrm.com/). You can find out more about translation [in the documentation](https://docs.joinbookwyrm.com/translation.html).

## Keep the documentation up to date

Good documentation is crucial so that people know how to use, contribute to, and administer BookWyrm. No matter how you are involved with BookWyrm, your perspective is valuable and you can contribute to our documentation.

We manage documentation in [a separate GitHub repository](https://github.com/bookwyrm-social/documentation), where you can [log a documentation issue](https://github.com/bookwyrm-social/documentation/issues) or contribute to the documentation yourself.

## Test draft versions

Are you a BookWyrm instance administrator? You can help to test new features when we release them in a draft version of BookWyrm, and report back on your experiences. This is crucial to helping us release stable versions with fewer bugs.

## Contribute code

If you're able to write code, you can contribute that way! Check out the [Guide to the developer environment](https://docs.joinbookwyrm.com/install-dev.html) and our code [style guide](https://docs.joinbookwyrm.com/style_guide.html).

## Provide expert advice

Bibliographic metadata wizard? Celery nerd? ActivityPub expert? SQL query obsessive? We need all kinds of expertise! You can contribute to discussions in [the Issues](https://github.com/bookwyrm-social/bookwyrm/issues) or reach out to make suggestions [in our Matrix chat room](https://app.element.io/#/room/#bookwyrm:matrix.org) or via an Issue of your own.

## More information

You can find out more about BookWyrm and contributing at [JoinBookWyrm.com](https://joinbookwyrm.com/get-involved/).

Ensure you are aware of and agree to our [Code of Conduct](https://github.com/bookwyrm-social/bookwyrm/blob/main/CODE_OF_CONDUCT.md).

Please note that the BookWyrm project is licensed under the [Anti-capitalist Software License](https://github.com/bookwyrm-social/bookwyrm/blob/main/LICENSE.md).
FEDERATION.md

@@ -321,6 +321,8 @@ Bookwyrm uses the [Webfinger](https://datatracker.ietf.org/doc/html/rfc7033) sta

Bookwyrm uses and requires HTTP signatures for all `POST` requests. `GET` requests are not signed by default, but if Bookwyrm receives a `403` response to a `GET` it will re-send the request, signed by the default server user. This user will usually have an id of `https://example.net/user/bookwyrm.instance.actor`.

As of the first version to be released in 2025, all `GET` requests will be signed by the instance user instead of re-sending requests that are rejected.
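To illustrate what such a signed `GET` can look like, here is a minimal sketch (not BookWyrm's actual code) of producing an HTTP Signature header with an instance actor's RSA key using the `cryptography` package; the covered headers, function name, and `keyId` value are illustrative assumptions.

```python
# Minimal sketch only: builds headers for a GET signed in the HTTP Signatures
# style common to ActivityPub servers. The keyId and covered headers are
# assumptions, not BookWyrm's implementation.
import base64
from datetime import datetime, timezone
from urllib.parse import urlparse

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding


def signed_get_headers(url: str, key_pem: bytes, key_id: str) -> dict:
    """Build Date, Host, and Signature headers for a signed GET request."""
    parsed = urlparse(url)
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    # the string that gets signed, covering (request-target), host, and date
    signing_string = "\n".join(
        [
            f"(request-target): get {parsed.path}",
            f"host: {parsed.netloc}",
            f"date: {date}",
        ]
    )
    key = serialization.load_pem_private_key(key_pem, password=None)
    signature = base64.b64encode(
        key.sign(signing_string.encode("utf-8"), padding.PKCS1v15(), hashes.SHA256())
    ).decode("utf-8")
    signature_header = (
        f'keyId="{key_id}",headers="(request-target) host date",signature="{signature}"'
    )
    return {"Date": date, "Host": parsed.netloc, "Signature": signature_header}
```

A client would pass these headers to something like `requests.get(url, headers=...)`, with `key_id` pointing at an actor key such as `https://example.net/user/bookwyrm.instance.actor/#main-key`.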
#### publicKey id

In older versions of Bookwyrm the `publicKey.id` was incorrectly listed in request headers as `https://example.net/user/username#main-key`. As of v0.6.3 the id is now listed correctly, as `https://example.net/user/username/#main-key`. In most ActivityPub implementations this will make no difference as the URL will usually resolve to the same place.
README.md

@@ -24,6 +24,8 @@ BookWyrm is built on [ActivityPub](http://activitypub.rocks/). With ActivityPub,

Federation makes it possible to have small, self-determining communities, in contrast to the monolithic service you find on GoodReads or Twitter. An instance can be focused on a particular interest, be just for a group of friends, or anything else that brings people together. Each community can choose which other instances they want to federate with, and moderate and run their community autonomously. Check out https://runyourown.social/ to get a sense of the philosophy and logistics behind small, high-trust social networks.

Developers of other ActivityPub software can find out more about BookWyrm's implementation at [`FEDERATION.md`](https://github.com/bookwyrm-social/bookwyrm/blob/main/FEDERATION.md).

## Features

### Post about books

@@ -61,3 +63,7 @@ Deployment

## Set up BookWyrm

The [documentation website](https://docs.joinbookwyrm.com/) has instructions on how to set up BookWyrm in a [developer environment](https://docs.joinbookwyrm.com/install-dev.html) or [production](https://docs.joinbookwyrm.com/install-prod.html).

## Contributing

There are many ways you can contribute to the success and health of the BookWyrm project! You do not have to know how to write code, and we are always keen to see more people get involved. Find out how you can join the project at [CONTRIBUTING.md](https://github.com/bookwyrm-social/bookwyrm/blob/main/CONTRIBUTING.md).
@@ -120,6 +120,7 @@ class ActivityObject:
        save: bool = True,
        overwrite: bool = True,
        allow_external_connections: bool = True,
        trigger=None,
    ) -> Optional[TBookWyrmModel]:
        """convert from an activity to a model instance. Args:
            model: the django model that this object is being converted to

@@ -133,6 +134,9 @@ class ActivityObject:
                only update blank fields if false
            allow_external_connections: look up missing data if true,
                throw an exception if false and an external connection is needed
            trigger: the object that originally triggered this
                self.to_model. e.g. if this is a Work being dereferenced from
                an incoming Edition
        """
        model = model or get_model_from_type(self.type)

@@ -223,6 +227,8 @@ class ActivityObject:
            related_field_name = model_field.field.name

            for item in values:
                if trigger and item == trigger.remote_id:
                    continue
                set_related_field.delay(
                    related_model.__name__,
                    instance.__class__.__name__,
@@ -13,6 +13,7 @@ class BookData(ActivityObject):

    openlibraryKey: Optional[str] = None
    inventaireId: Optional[str] = None
    finnaKey: Optional[str] = None
    librarythingKey: Optional[str] = None
    goodreadsKey: Optional[str] = None
    bnfId: Optional[str] = None
@@ -50,6 +50,7 @@ class Note(ActivityObject):
        save=True,
        overwrite=True,
        allow_external_connections=True,
        trigger=None,
    ):
        instance = super().to_model(
            model, instance, allow_create, save, overwrite, allow_external_connections
@@ -4,11 +4,10 @@ from abc import ABC, abstractmethod
from typing import Optional, TypedDict, Any, Callable, Union, Iterator
from urllib.parse import quote_plus

# pylint: disable-next=deprecated-module
import imghdr  # Deprecated in 3.11 for removal in 3.13; no good alternative yet
import logging
import re
import asyncio
from PIL import Image, UnidentifiedImageError

import requests
from requests.exceptions import RequestException
import aiohttp

@@ -370,13 +369,14 @@ def get_image(
        return None, None

    image_content = ContentFile(resp.content)
    extension = imghdr.what(None, image_content.read())
    if not extension:
    try:
        with Image.open(image_content) as im:
            extension = str(im.format).lower()
            return image_content, extension
    except UnidentifiedImageError:
        logger.info("File requested was not an image: %s", url)
        return None, None

    return image_content, extension


class Mapping:
    """associate a local database field with a field in an external dataset"""
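The second hunk above swaps the deprecated `imghdr` probe for Pillow's own format detection. As a reference, here is a small self-contained sketch of that approach (an illustration under assumed names, not the connector's exact code):

```python
# Detect an image format with Pillow instead of the deprecated imghdr module;
# returns None when the bytes are not a recognisable image.
from io import BytesIO

from PIL import Image, UnidentifiedImageError


def detect_extension(raw: bytes) -> str | None:
    """Return a lowercase format name such as 'jpeg' or 'png', or None."""
    try:
        with Image.open(BytesIO(raw)) as image:
            return str(image.format).lower()
    except UnidentifiedImageError:
        return None
```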
bookwyrm/connectors/finna.py (new file, +398 lines)

@@ -0,0 +1,398 @@
"""finna data connector"""

import re
from typing import Iterator

from bookwyrm import models
from bookwyrm.book_search import SearchResult
from bookwyrm.models.book import FormatChoices
from .abstract_connector import AbstractConnector, Mapping, JsonDict
from .abstract_connector import get_data
from .connector_manager import ConnectorException, create_edition_task
from .openlibrary_languages import languages


class Connector(AbstractConnector):
    """instantiate a connector for finna"""

    generated_remote_link_field = "id"

    def __init__(self, identifier: str):
        super().__init__(identifier)

        get_first = lambda x, *args: x[0] if x else None
        format_remote_id = lambda x: f"{self.books_url}{x}"
        format_cover_url = lambda x: f"{self.covers_url}{x[0]}" if x else None
        self.book_mappings = [
            Mapping("id", remote_field="id", formatter=format_remote_id),
            Mapping("finnaKey", remote_field="id"),
            Mapping("title", remote_field="shortTitle"),
            Mapping("title", remote_field="title"),
            Mapping("subtitle", remote_field="subTitle"),
            Mapping("isbn10", remote_field="cleanIsbn"),
            Mapping("languages", remote_field="languages", formatter=resolve_languages),
            Mapping("authors", remote_field="authors", formatter=parse_authors),
            Mapping("subjects", formatter=join_subject_list),
            Mapping("publishedDate", remote_field="year"),
            Mapping("cover", remote_field="images", formatter=format_cover_url),
            Mapping("description", remote_field="summary", formatter=get_first),
            Mapping("series", remote_field="series", formatter=parse_series_name),
            Mapping(
                "seriesNumber",
                remote_field="series",
                formatter=parse_series_number,
            ),
            Mapping("publishers", remote_field="publishers"),
            Mapping(
                "physicalFormat",
                remote_field="formats",
                formatter=describe_physical_format,
            ),
            Mapping(
                "physicalFormatDetail",
                remote_field="physicalDescriptions",
                formatter=get_first,
            ),
            Mapping(
                "pages",
                remote_field="physicalDescriptions",
                formatter=guess_page_numbers,
            ),
        ]

        self.author_mappings = [
            Mapping("id", remote_field="authors", formatter=self.get_remote_author_id),
            Mapping("name", remote_field="authors", formatter=get_first_author),
        ]

    def get_book_data(self, remote_id: str) -> JsonDict:
        request_parameters = {
            "field[]": [
                "authors",
                "cleanIsbn",
                "formats",
                "id",
                "images",
                "isbns",
                "languages",
                "physicalDescriptions",
                "publishers",
                "recordPage",
                "series",
                "shortTitle",
                "subjects",
                "subTitle",
                "summary",
                "title",
                "year",
            ]
        }
        data = get_data(
            url=remote_id, params=request_parameters  # type:ignore[arg-type]
        )
        extracted = data.get("records", [])
        try:
            data = extracted[0]
        except (KeyError, IndexError):
            raise ConnectorException("Invalid book data")
        return data

    def get_remote_author_id(self, data: JsonDict) -> str | None:
        """return a search url for author info, as we don't
        have a way to retrieve the author id with the query"""
        author = get_first_author(data)
        if author:
            return f"{self.search_url}{author}&type=Author"
        return None

    def get_remote_id(self, data: JsonDict) -> str:
        """return the record-id page as the book-id"""
        return f"{self.books_url}{data.get('id')}"

    def parse_search_data(
        self, data: JsonDict, min_confidence: float
    ) -> Iterator[SearchResult]:
        for idx, search_result in enumerate(data.get("records", [])):
            authors = search_result.get("authors")
            author = None
            if authors:
                author_list = parse_authors(authors)
                if author_list:
                    author = "; ".join(author_list)

            confidence = 1 / (idx + 1)
            if confidence < min_confidence:
                break

            # Add some extra info to the edition title if it is an audiobook or e-book
            edition_info_title = describe_physical_format(search_result.get("formats"))
            edition_info = ""
            if edition_info_title and edition_info_title != "Hardcover":
                for book_format, info_title in FormatChoices:
                    if book_format == edition_info_title:
                        edition_info = f" {info_title}"
                        break

            search_result = SearchResult(
                title=f"{search_result.get('title')}{edition_info}",
                key=f"{self.books_url}{search_result.get('id')}",
                author=author,
                cover=f"{self.covers_url}{search_result.get('images')[0]}"
                if search_result.get("images")
                else None,
                year=search_result.get("year"),
                view_link=f"{self.base_url}{search_result.get('recordPage')}",
                confidence=confidence,
                connector=self,
            )
            yield search_result

    def parse_isbn_search_data(self, data: JsonDict) -> Iterator[SearchResult]:
        """got some data"""
        for idx, search_result in enumerate(data.get("records", [])):
            authors = search_result.get("authors")
            author = None
            if authors:
                author_list = parse_authors(authors)
                if author_list:
                    author = "; ".join(author_list)

            confidence = 1 / (idx + 1)
            yield SearchResult(
                title=search_result.get("title"),
                key=f"{self.books_url}{search_result.get('id')}",
                author=author,
                cover=f"{self.covers_url}{search_result.get('images')[0]}"
                if search_result.get("images")
                else None,
                year=search_result.get("year"),
                view_link=f"{self.base_url}{search_result.get('recordPage')}",
                confidence=confidence,
                connector=self,
            )

    def get_authors_from_data(self, data: JsonDict) -> Iterator[models.Author]:
        authors = data.get("authors")
        if authors:
            for author in parse_authors(authors):
                model = self.get_or_create_author(
                    f"{self.search_url}{author}&type=Author"
                )
                if model:
                    yield model

    def expand_book_data(self, book: models.Book) -> None:
        work = book
        # go from the edition to the work, if necessary
        if isinstance(book, models.Edition):
            work = book.parent_work

        try:
            edition_options = retrieve_versions(work.finna_key)
        except ConnectorException:
            return

        for edition in edition_options:
            remote_id = self.get_remote_id(edition)
            if remote_id:
                create_edition_task.delay(self.connector.id, work.id, edition)

    def get_remote_id_from_model(self, obj: models.BookDataModel) -> str:
        """use get_remote_id to figure out the link from a model obj"""
        return f"{self.books_url}{obj.finna_key}"

    def is_work_data(self, data: JsonDict) -> bool:
        """
        https://api.finna.fi/v1/search?id=anders.1946700&search=versions&view=&lng=fi&field[]=formats&field[]=series&field[]=title&field[]=authors&field[]=summary&field[]=cleanIsbn&field[]=id

        There is no real ordering of what is a work and what is an edition,
        so pick the first version as the work
        """
        edition_list = retrieve_versions(data.get("id"))
        if edition_list:
            return data.get("id") == edition_list[0].get("id")
        return True

    def get_edition_from_work_data(self, data: JsonDict) -> JsonDict:
        """No real distinction between work and edition,
        so check all versions and pick the preferred edition"""
        edition_list = retrieve_versions(data.get("id"))
        if not edition_list:
            raise ConnectorException("No editions found for work")
        edition = pick_preferred_edition(edition_list)
        if not edition:
            raise ConnectorException("No editions found for work")
        return edition

    def get_work_from_edition_data(self, data: JsonDict) -> JsonDict:
        return retrieve_versions(data.get("id"))[0]


def guess_page_numbers(data: JsonDict) -> str | None:
    """Try to retrieve the page count of an edition"""
    for row in data:
        # Try to match page count text in the style of '134 pages' or '134 sivua'
        page_search = re.search(r"(\d+) (sivua|s\.|sidor|pages)", row)
        page_count = page_search.group(1) if page_search else None
        if page_count:
            return page_count
        # If we didn't match, try a leading number
        page_search = re.search(r"^(\d+)", row)
        page_count = page_search.group(1) if page_search else None
        if page_count:
            return page_count
    return None


def resolve_languages(data: JsonDict) -> list[str]:
    """Use the openlibrary language code list to resolve ISO language codes"""
    result_languages = []
    for language_code in data:
        result_languages.append(
            languages.get(f"/languages/{language_code}", language_code)
        )
    return result_languages


def join_subject_list(data: list[JsonDict]) -> list[str]:
    """Join the nested lists of subject topics into one list"""
    return [" ".join(info) for info in data]


def describe_physical_format(formats: list[JsonDict]) -> str:
    """Map whether the book is a physical book, eBook, or audiobook"""
    found_format = "Hardcover"
    # Map Finnish finna formats to bookwyrm codes
    format_mapping = {
        "1/Book/Book/": "Hardcover",
        "1/Book/AudioBook/": "AudiobookFormat",
        "1/Book/eBook/": "EBook",
    }
    for format_to_check in formats:
        format_value = format_to_check.get("value")
        if not isinstance(format_value, str):
            continue
        if (mapping_match := format_mapping.get(format_value, None)) is not None:
            found_format = mapping_match
    return found_format


def parse_series_name(series: list[JsonDict]) -> str | None:
    """Parse the series name if given"""
    for info in series:
        if "name" in info:
            return info.get("name")
    return None


def parse_series_number(series: list[JsonDict]) -> str | None:
    """Parse the series number from additional info if given"""
    for info in series:
        if "additional" in info:
            return info.get("additional")
    return None


def retrieve_versions(book_id: str | None) -> list[JsonDict]:
    """
    https://api.finna.fi/v1/search?id=anders.1946700&search=versions&view=&

    Search all editions/versions of the book that finna is aware of
    """

    if not book_id:
        return []

    request_parameters = {
        "id": book_id,
        "search": "versions",
        "view": "",
        "field[]": [
            "authors",
            "cleanIsbn",
            "edition",
            "formats",
            "id",
            "images",
            "isbns",
            "languages",
            "physicalDescriptions",
            "publishers",
            "recordPage",
            "series",
            "shortTitle",
            "subjects",
            "subTitle",
            "summary",
            "title",
            "year",
        ],
    }
    data = get_data(
        url="https://api.finna.fi/api/v1/search",
        params=request_parameters,  # type: ignore[arg-type]
    )
    result = data.get("records", [])
    if isinstance(result, list):
        return result
    return []


def get_first_author(data: JsonDict) -> str | None:
    """Parse authors and return the first one, usually the main author"""
    authors = parse_authors(data)
    if authors:
        return authors[0]
    return None


def parse_authors(data: JsonDict) -> list[str]:
    """Search author info; names are given in 'Surname, FirstName' style,
    return them in 'FirstName Surname' order"""
    if author_keys := data.get("primary", None):
        if author_keys:
            # we search for the 'kirjoittaja' (author) role, if any are found
            tulos = list(
                # Convert from 'Lewis, Michael' to 'Michael Lewis'
                " ".join(reversed(author_key.split(", ")))
                for author_key, author_info in author_keys.items()
                if "kirjoittaja" in author_info.get("role", [])
            )
            if tulos:
                return tulos
            # if none are found, fall back to authors whose role is unspecified ('-')
            tulos = list(
                " ".join(reversed(author_key.split(", ")))
                for author_key, author_info in author_keys.items()
                if "-" in author_info.get("role", [])
            )
            return tulos
    return []


def pick_preferred_edition(options: list[JsonDict]) -> JsonDict | None:
    """favor physical copies, preferring Finnish/Swedish editions and ones with an ISBN"""
    if not options:
        return None
    if len(options) == 1:
        return options[0]

    # pick a hardcover book, if present, over eBook/audiobook
    formats = ["1/Book/Book/"]
    format_selection = []
    for edition in options:
        for edition_format in edition.get("formats", []):
            if edition_format.get("value") in formats:
                format_selection.append(edition)
    options = format_selection or options

    # Prefer Finnish/Swedish language editions if any are found
    language_list = ["fin", "swe"]
    languages_selection = []
    for edition in options:
        for edition_language in edition.get("languages", []):
            if edition_language in language_list:
                languages_selection.append(edition)
    options = languages_selection or options

    options = [e for e in options if e.get("cleanIsbn")] or options
    return options[0]
@@ -1,3 +1,3 @@
""" settings book data connectors """

CONNECTORS = ["openlibrary", "inventaire", "bookwyrm_connector"]
CONNECTORS = ["openlibrary", "inventaire", "bookwyrm_connector", "finna"]
@@ -40,6 +40,7 @@ class EditionForm(CustomForm):
            "openlibrary_key",
            "inventaire_id",
            "goodreads_key",
            "finna_key",
            "oclc_number",
            "asin",
            "aasin",

@@ -93,6 +94,7 @@ class EditionForm(CustomForm):
            "oclc_number": forms.TextInput(
                attrs={"aria-describedby": "desc_oclc_number"}
            ),
            "finna_key": forms.TextInput(attrs={"aria-describedby": "desc_finna_key"}),
            "ASIN": forms.TextInput(attrs={"aria-describedby": "desc_ASIN"}),
            "AASIN": forms.TextInput(attrs={"aria-describedby": "desc_AASIN"}),
            "isfdb": forms.TextInput(attrs={"aria-describedby": "desc_isfdb"}),
|
@ -22,6 +22,21 @@ class BookwyrmImporter:
|
|||
job = BookwyrmImportJob.objects.create(
|
||||
user=user, archive_file=archive_file, required=required
|
||||
)
|
||||
|
||||
return job
|
||||
|
||||
def create_retry_job(
|
||||
self, user: User, original_job: BookwyrmImportJob
|
||||
) -> BookwyrmImportJob:
|
||||
"""retry items that didn't import"""
|
||||
|
||||
job = BookwyrmImportJob.objects.create(
|
||||
user=user,
|
||||
archive_file=original_job.archive_file,
|
||||
required=original_job.required,
|
||||
retry=True,
|
||||
)
|
||||
|
||||
return job
|
||||
|
||||
|
||||
|
|
|
@@ -1,4 +1,5 @@
""" handle reading a csv from goodreads """
from typing import Optional
from . import Importer


@@ -7,3 +8,10 @@ class GoodreadsImporter(Importer):
    For a more complete example of overriding see librarything_import.py"""

    service = "Goodreads"

    def normalize_row(
        self, entry: dict[str, str], mappings: dict[str, Optional[str]]
    ) -> dict[str, Optional[str]]:
        normalized = super().normalize_row(entry, mappings)
        normalized["goodreads_key"] = normalized["id"]
        return normalized
bookwyrm/management/commands/add_finna_connector.py (new file, +58 lines)

@@ -0,0 +1,58 @@
""" Add finna connector to connectors """
from django.core.management.base import BaseCommand

from bookwyrm import models


def enable_finna_connector():

    models.Connector.objects.create(
        identifier="api.finna.fi",
        name="Finna API",
        connector_file="finna",
        base_url="https://www.finna.fi",
        books_url="https://api.finna.fi/api/v1/record" "?id=",
        covers_url="https://api.finna.fi",
        search_url="https://api.finna.fi/api/v1/search?limit=20"
        "&filter[]=format%3a%220%2fBook%2f%22"
        "&field[]=title&field[]=recordPage&field[]=authors"
        "&field[]=year&field[]=id&field[]=formats&field[]=images"
        "&lookfor=",
        isbn_search_url="https://api.finna.fi/api/v1/search?limit=1"
        "&filter[]=format%3a%220%2fBook%2f%22"
        "&field[]=title&field[]=recordPage&field[]=authors&field[]=year"
        "&field[]=id&field[]=formats&field[]=images"
        "&lookfor=isbn:",
    )


def remove_finna_connector():
    models.Connector.objects.filter(identifier="api.finna.fi").update(
        active=False, deactivation_reason="Disabled by management command"
    )
    print("Finna connector deactivated")


# pylint: disable=no-self-use
# pylint: disable=unused-argument
class Command(BaseCommand):
    """command-line options"""

    help = "Setup Finna API connector"

    def add_arguments(self, parser):
        """specify argument to remove connector"""
        parser.add_argument(
            "--deactivate",
            action="store_true",
            help="Deactivate the finna connector from config",
        )

    def handle(self, *args, **options):
        """enable or remove connector"""
        if options.get("deactivate"):
            print("Deactivate finna connector config if one present")
            remove_finna_connector()
        else:
            print("Adding Finna API connector to configuration")
            enable_finna_connector()
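For reference, the command defined above would be run through Django's management interface; a hedged usage sketch is below (the programmatic form assumes an already-configured Django environment).

```python
# Equivalent to running `python manage.py add_finna_connector` on an instance;
# pass --deactivate (or deactivate=True) to disable the connector again.
from django.core.management import call_command

call_command("add_finna_connector")
call_command("add_finna_connector", deactivate=True)
```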
@@ -99,7 +99,7 @@ def init_connectors():
        covers_url="https://inventaire.io",
        search_url="https://inventaire.io/api/search?types=works&types=works&search=",
        isbn_search_url="https://inventaire.io/api/entities?action=by-uris&uris=isbn%3A",
        priority=1,
        priority=3,
    )

    models.Connector.objects.create(

@@ -111,7 +111,7 @@ def init_connectors():
        covers_url="https://covers.openlibrary.org",
        search_url="https://openlibrary.org/search?q=",
        isbn_search_url="https://openlibrary.org/api/books?jscmd=data&format=json&bibkeys=ISBN:",
        priority=1,
        priority=3,
    )
bookwyrm/migrations/0210_alter_connector_connector_file.py (new file, +26 lines)

@@ -0,0 +1,26 @@
# Generated by Django 4.2.17 on 2025-02-02 20:22

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("bookwyrm", "0209_user_show_ratings"),
    ]

    operations = [
        migrations.AlterField(
            model_name="connector",
            name="connector_file",
            field=models.CharField(
                choices=[
                    ("openlibrary", "Openlibrary"),
                    ("inventaire", "Inventaire"),
                    ("bookwyrm_connector", "Bookwyrm Connector"),
                    ("finna", "Finna"),
                ],
                max_length=255,
            ),
        ),
    ]
bookwyrm/migrations/0211_author_finna_key_book_finna_key.py (new file, +28 lines)

@@ -0,0 +1,28 @@
# Generated by Django 4.2.17 on 2025-02-08 16:14

import bookwyrm.models.fields
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ("bookwyrm", "0210_alter_connector_connector_file"),
    ]

    operations = [
        migrations.AddField(
            model_name="author",
            name="finna_key",
            field=bookwyrm.models.fields.CharField(
                blank=True, max_length=255, null=True
            ),
        ),
        migrations.AddField(
            model_name="book",
            name="finna_key",
            field=bookwyrm.models.fields.CharField(
                blank=True, max_length=255, null=True
            ),
        ),
    ]
bookwyrm/migrations/0212_userrelationshipimport_and_more.py (new file, +151 lines)

@@ -0,0 +1,151 @@
# Generated by Django 4.2.20 on 2025-03-28 07:37

import bookwyrm.models.fields
import django.contrib.postgres.fields
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ("bookwyrm", "0211_author_finna_key_book_finna_key"),
    ]

    operations = [
        migrations.CreateModel(
            name="UserRelationshipImport",
            fields=[
                (
                    "childjob_ptr",
                    models.OneToOneField(
                        auto_created=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        parent_link=True,
                        primary_key=True,
                        serialize=False,
                        to="bookwyrm.childjob",
                    ),
                ),
                (
                    "relationship",
                    bookwyrm.models.fields.CharField(
                        choices=[("follow", "Follow"), ("block", "Block")],
                        max_length=10,
                        null=True,
                    ),
                ),
                (
                    "remote_id",
                    bookwyrm.models.fields.RemoteIdField(
                        max_length=255,
                        null=True,
                        validators=[bookwyrm.models.fields.validate_remote_id],
                    ),
                ),
            ],
            options={
                "abstract": False,
            },
            bases=("bookwyrm.childjob",),
        ),
        migrations.RemoveField(
            model_name="bookwyrmexportjob",
            name="json_completed",
        ),
        migrations.AddField(
            model_name="bookwyrmimportjob",
            name="retry",
            field=models.BooleanField(default=False),
        ),
        migrations.AddField(
            model_name="childjob",
            name="fail_reason",
            field=models.TextField(null=True),
        ),
        migrations.AddField(
            model_name="parentjob",
            name="fail_reason",
            field=models.TextField(null=True),
        ),
        migrations.AlterField(
            model_name="bookwyrmimportjob",
            name="required",
            field=django.contrib.postgres.fields.ArrayField(
                base_field=bookwyrm.models.fields.CharField(blank=True, max_length=50),
                blank=True,
                size=None,
            ),
        ),
        migrations.CreateModel(
            name="UserImportPost",
            fields=[
                (
                    "childjob_ptr",
                    models.OneToOneField(
                        auto_created=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        parent_link=True,
                        primary_key=True,
                        serialize=False,
                        to="bookwyrm.childjob",
                    ),
                ),
                ("json", models.JSONField()),
                (
                    "status_type",
                    bookwyrm.models.fields.CharField(
                        choices=[
                            ("comment", "Comment"),
                            ("review", "Review"),
                            ("quote", "Quotation"),
                        ],
                        default="comment",
                        max_length=10,
                        null=True,
                    ),
                ),
                (
                    "book",
                    bookwyrm.models.fields.ForeignKey(
                        on_delete=django.db.models.deletion.PROTECT,
                        to="bookwyrm.edition",
                    ),
                ),
            ],
            options={
                "abstract": False,
            },
            bases=("bookwyrm.childjob",),
        ),
        migrations.CreateModel(
            name="UserImportBook",
            fields=[
                (
                    "childjob_ptr",
                    models.OneToOneField(
                        auto_created=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        parent_link=True,
                        primary_key=True,
                        serialize=False,
                        to="bookwyrm.childjob",
                    ),
                ),
                ("book_data", models.JSONField()),
                (
                    "book",
                    models.ForeignKey(
                        blank=True,
                        null=True,
                        on_delete=django.db.models.deletion.SET_NULL,
                        to="bookwyrm.book",
                    ),
                ),
            ],
            options={
                "abstract": False,
            },
            bases=("bookwyrm.childjob",),
        ),
    ]
@@ -26,7 +26,12 @@ from .federated_server import FederatedServer
from .group import Group, GroupMember, GroupMemberInvitation

from .import_job import ImportJob, ImportItem
from .bookwyrm_import_job import BookwyrmImportJob
from .bookwyrm_import_job import (
    BookwyrmImportJob,
    UserImportBook,
    UserImportPost,
    import_book_task,
)
from .bookwyrm_export_job import BookwyrmExportJob

from .move import MoveUser
@@ -129,7 +129,20 @@ class ActivitypubMixin:

    def broadcast(self, activity, sender, software=None, queue=BROADCAST):
        """send out an activity"""

        # if we're posting about ShelfBooks, set a delay to give the base activity
        # time to add the book on remote servers first to avoid race conditions
        countdown = (
            10
            if (
                isinstance(activity, object)
                and not isinstance(activity["object"], str)
                and activity["object"].get("type", None) in ["GeneratedNote", "Comment"]
            )
            else 0
        )
        broadcast_task.apply_async(
            countdown=countdown,
            args=(
                sender.id,
                json.dumps(activity, cls=activitypub.ActivityEncoder),

@@ -227,6 +240,7 @@ class ObjectMixin(ActivitypubMixin):
            return

        try:
            # TODO: here is where we might use an ActivityPub extension instead
            # do we have a "pure" activitypub version of this for mastodon?
            if software != "bookwyrm" and hasattr(self, "pure_content"):
                pure_activity = self.to_create_activity(user, pure=True)
@@ -41,6 +41,9 @@ class BookDataModel(ObjectMixin, BookWyrmModel):
    openlibrary_key = fields.CharField(
        max_length=255, blank=True, null=True, deduplication_field=True
    )
    finna_key = fields.CharField(
        max_length=255, blank=True, null=True, deduplication_field=True
    )
    inventaire_id = fields.CharField(
        max_length=255, blank=True, null=True, deduplication_field=True
    )
@@ -6,7 +6,7 @@ import os
from boto3.session import Session as BotoSession
from s3_tar import S3Tar

from django.db.models import BooleanField, FileField, JSONField
from django.db.models import FileField, JSONField
from django.core.serializers.json import DjangoJSONEncoder
from django.core.files.base import ContentFile
from django.core.files.storage import storages

@@ -17,7 +17,7 @@ from bookwyrm.models import AnnualGoal, ReadThrough, ShelfBook, ListItem
from bookwyrm.models import Review, Comment, Quotation
from bookwyrm.models import Edition
from bookwyrm.models import UserFollows, User, UserBlocks
from bookwyrm.models.job import ParentJob
from bookwyrm.models.job import ParentJob, ParentTask
from bookwyrm.tasks import app, IMPORTS
from bookwyrm.utils.tar import BookwyrmTarFile

@@ -42,38 +42,41 @@ class BookwyrmExportJob(ParentJob):

    export_data = FileField(null=True, storage=select_exports_storage)
    export_json = JSONField(null=True, encoder=DjangoJSONEncoder)
    json_completed = BooleanField(default=False)

    def start_job(self):
        """schedule the first task"""

        task = create_export_json_task.delay(job_id=self.id)
        self.task_id = task.id
        self.save(update_fields=["task_id"])
        self.set_status("active")
        create_export_json_task.delay(job_id=self.id)


@app.task(queue=IMPORTS)
def create_export_json_task(job_id):
@app.task(queue=IMPORTS, base=ParentTask)
def create_export_json_task(**kwargs):
    """create the JSON data for the export"""

    job = BookwyrmExportJob.objects.get(id=job_id)

    job = BookwyrmExportJob.objects.get(id=kwargs["job_id"])
    # don't start the job if it was stopped from the UI
    if job.complete:
    if job.status == "stopped":
        return

    try:
        job.set_status("active")

        # generate JSON structure
        job.export_json = export_json(job.user)
        # generate JSON
        data = export_user(job.user)
        data["settings"] = export_settings(job.user)
        data["goals"] = export_goals(job.user)
        data["books"] = export_books(job.user)
        data["saved_lists"] = export_saved_lists(job.user)
        data["follows"] = export_follows(job.user)
        data["blocks"] = export_blocks(job.user)
        job.export_json = data
        job.save(update_fields=["export_json"])

        # create archive in separate task
        # trigger task to create tar file
        create_archive_task.delay(job_id=job.id)

    except Exception as err:  # pylint: disable=broad-except
        logger.exception(
            "create_export_json_task for %s failed with error: %s", job, err
            "create_export_json_task for job %s failed with error: %s", job.id, err
        )
        job.set_status("failed")

@@ -94,21 +97,20 @@ def add_file_to_s3_tar(s3_tar: S3Tar, storage, file, directory=""):
    )


@app.task(queue=IMPORTS)
def create_archive_task(job_id):
@app.task(queue=IMPORTS, base=ParentTask)
def create_archive_task(**kwargs):
    """create the archive containing the JSON file and additional files"""

    job = BookwyrmExportJob.objects.get(id=job_id)
    job = BookwyrmExportJob.objects.get(id=kwargs["job_id"])

    # don't start the job if it was stopped from the UI
    if job.complete:
    if job.status == "stopped":
        return

    try:
        export_task_id = str(job.task_id)
        archive_filename = f"{export_task_id}.tar.gz"
        export_json_bytes = DjangoJSONEncoder().encode(job.export_json).encode("utf-8")

        user = job.user
        editions = get_books_for_user(user)

@@ -169,25 +171,15 @@ def create_archive_task(job_id):
                    tar.add_image(edition.cover, directory="images")
        job.save(update_fields=["export_data"])

        job.set_status("completed")
        job.complete_job()

    except Exception as err:  # pylint: disable=broad-except
        logger.exception("create_archive_task for %s failed with error: %s", job, err)
        logger.exception(
            "create_archive_task for job %s failed with error: %s", job.id, err
        )
        job.set_status("failed")


def export_json(user: User):
    """create export JSON"""
    data = export_user(user)  # in the root of the JSON structure
    data["settings"] = export_settings(user)
    data["goals"] = export_goals(user)
    data["books"] = export_books(user)
    data["saved_lists"] = export_saved_lists(user)
    data["follows"] = export_follows(user)
    data["blocks"] = export_blocks(user)
    return data


def export_user(user: User):
    """export user data"""
    data = user.to_activity()

@@ -316,11 +308,9 @@ def export_book(user: User, edition: Edition):
def get_books_for_user(user):
    """
    Get all the books and editions related to a user.

    We use union() instead of Q objects because it creates
    multiple simple queries in stead of a much more complex DB query
    multiple simple queries instead of a complex DB query
    that can time out.

    """

    shelf_eds = Edition.objects.select_related("parent_work").filter(shelves__user=user)
@ -2,16 +2,26 @@
|
|||
|
||||
import json
|
||||
import logging
|
||||
import math
|
||||
|
||||
from django.db.models import FileField, JSONField, CharField
|
||||
from django.db.models import (
|
||||
BooleanField,
|
||||
ForeignKey,
|
||||
FileField,
|
||||
JSONField,
|
||||
TextChoices,
|
||||
PROTECT,
|
||||
SET_NULL,
|
||||
)
|
||||
from django.utils import timezone
|
||||
from django.utils.html import strip_tags
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.contrib.postgres.fields import ArrayField as DjangoArrayField
|
||||
|
||||
from bookwyrm import activitypub
|
||||
from bookwyrm import models
|
||||
from bookwyrm.tasks import app, IMPORTS
|
||||
from bookwyrm.models.job import ParentJob, ParentTask, SubTask
|
||||
from bookwyrm.models.job import ParentJob, ChildJob, ParentTask, SubTask
|
||||
from bookwyrm.utils.tar import BookwyrmTarFile
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
@ -22,23 +32,130 @@ class BookwyrmImportJob(ParentJob):
|
|||
|
||||
archive_file = FileField(null=True, blank=True)
|
||||
import_data = JSONField(null=True)
|
||||
required = DjangoArrayField(CharField(max_length=50, blank=True), blank=True)
|
||||
required = DjangoArrayField(
|
||||
models.fields.CharField(max_length=50, blank=True), blank=True
|
||||
)
|
||||
retry = BooleanField(default=False)
|
||||
|
||||
def start_job(self):
|
||||
"""Start the job"""
|
||||
start_import_task.delay(job_id=self.id, no_children=True)
|
||||
start_import_task.delay(job_id=self.id)
|
||||
|
||||
@property
|
||||
def book_tasks(self):
|
||||
"""How many import book tasks are there?"""
|
||||
return UserImportBook.objects.filter(parent_job=self).all()
|
||||
|
||||
@property
|
||||
def status_tasks(self):
|
||||
"""How many import status tasks are there?"""
|
||||
return UserImportPost.objects.filter(parent_job=self).all()
|
||||
|
||||
@property
|
||||
def relationship_tasks(self):
|
||||
"""How many import relationship tasks are there?"""
|
||||
return UserRelationshipImport.objects.filter(parent_job=self).all()
|
||||
|
||||
@property
|
||||
def item_count(self):
|
||||
"""How many total tasks are there?"""
|
||||
return self.book_tasks.count() + self.status_tasks.count()
|
||||
|
||||
@property
|
||||
def pending_item_count(self):
|
||||
"""How many tasks are incomplete?"""
|
||||
status = BookwyrmImportJob.Status
|
||||
book_tasks = self.book_tasks.filter(
|
||||
status__in=[status.PENDING, status.ACTIVE]
|
||||
).count()
|
||||
|
||||
status_tasks = self.status_tasks.filter(
|
||||
status__in=[status.PENDING, status.ACTIVE]
|
||||
).count()
|
||||
|
||||
relationship_tasks = self.relationship_tasks.filter(
|
||||
status__in=[status.PENDING, status.ACTIVE]
|
||||
).count()
|
||||
|
||||
return book_tasks + status_tasks + relationship_tasks
|
||||
|
||||
@property
|
||||
def percent_complete(self):
|
||||
"""How far along?"""
|
||||
item_count = self.item_count
|
||||
if not item_count:
|
||||
return 0
|
||||
return math.floor((item_count - self.pending_item_count) / item_count * 100)
|
||||
|
||||
|
||||
class UserImportBook(ChildJob):
|
||||
"""ChildJob to import each book.
|
||||
Equivalent to ImportItem when importing a csv file of books"""
|
||||
|
||||
book = ForeignKey(models.Book, on_delete=SET_NULL, null=True, blank=True)
|
||||
book_data = JSONField(null=False)
|
||||
|
||||
def start_job(self):
|
||||
"""Start the job"""
|
||||
import_book_task.delay(child_id=self.id)
|
||||
|
||||
|
||||
class UserImportPost(ChildJob):
|
||||
"""ChildJob for comments, quotes, and reviews"""
|
||||
|
||||
class StatusType(TextChoices):
|
||||
"""Possible status types."""
|
||||
|
||||
COMMENT = "comment", _("Comment")
|
||||
REVIEW = "review", _("Review")
|
||||
QUOTE = "quote", _("Quotation")
|
||||
|
||||
json = JSONField(null=False)
|
||||
book = models.fields.ForeignKey(
|
||||
"Edition", on_delete=PROTECT, activitypub_field="inReplyToBook"
|
||||
)
|
||||
status_type = models.fields.CharField(
|
||||
max_length=10, choices=StatusType.choices, default=StatusType.COMMENT, null=True
|
||||
)
|
||||
|
||||
def start_job(self):
|
||||
"""Start the job"""
|
||||
upsert_status_task.delay(child_id=self.id)
|
||||
|
||||
|
||||
class UserRelationshipImport(ChildJob):
|
||||
"""ChildJob for follows and blocks"""
|
||||
|
||||
class RelationshipType(TextChoices):
|
||||
"""Possible relationship types."""
|
||||
|
||||
FOLLOW = "follow", _("Follow")
|
||||
BLOCK = "block", _("Block")
|
||||
|
||||
relationship = models.fields.CharField(
|
||||
max_length=10, choices=RelationshipType.choices, null=True
|
||||
)
|
||||
remote_id = models.fields.RemoteIdField(null=True, unique=False)
|
||||
|
||||
def start_job(self):
|
||||
"""Start the job"""
|
||||
import_user_relationship_task.delay(child_id=self.id)
|
||||
|
||||
|
||||
@app.task(queue=IMPORTS, base=ParentTask)
|
||||
def start_import_task(**kwargs):
|
||||
"""trigger the child import tasks for each user data"""
|
||||
"""trigger the child import tasks for each user data
|
||||
We always import the books even if not assigning
|
||||
them to shelves, lists etc"""
|
||||
job = BookwyrmImportJob.objects.get(id=kwargs["job_id"])
|
||||
archive_file = job.archive_file
|
||||
archive_file = job.bookwyrmimportjob.archive_file
|
||||
|
||||
# don't start the job if it was stopped from the UI
|
||||
if job.complete:
|
||||
if job.status == "stopped":
|
||||
return
|
||||
|
||||
job.status = "active"
|
||||
job.save(update_fields=["status"])
|
||||
|
||||
try:
|
||||
archive_file.open("rb")
|
||||
with BookwyrmTarFile.open(mode="r:gz", fileobj=archive_file) as tar:
|
||||
|
@ -56,13 +173,23 @@ def start_import_task(**kwargs):
|
|||
if "include_saved_lists" in job.required:
|
||||
upsert_saved_lists(job.user, job.import_data.get("saved_lists", []))
|
||||
if "include_follows" in job.required:
|
||||
upsert_follows(job.user, job.import_data.get("follows", []))
|
||||
for remote_id in job.import_data.get("follows", []):
|
||||
UserRelationshipImport.objects.create(
|
||||
parent_job=job, remote_id=remote_id, relationship="follow"
|
||||
)
|
||||
if "include_blocks" in job.required:
|
||||
upsert_user_blocks(job.user, job.import_data.get("blocks", []))
|
||||
for remote_id in job.import_data.get("blocks", []):
|
||||
UserRelationshipImport.objects.create(
|
||||
parent_job=job, remote_id=remote_id, relationship="block"
|
||||
)
|
||||
|
||||
process_books(job, tar)
|
||||
for item in UserRelationshipImport.objects.filter(parent_job=job).all():
|
||||
item.start_job()
|
||||
|
||||
for data in job.import_data.get("books"):
|
||||
book_job = UserImportBook.objects.create(parent_job=job, book_data=data)
|
||||
book_job.start_job()
|
||||
|
||||
job.set_status("complete")
|
||||
archive_file.close()
|
||||
|
||||
except Exception as err: # pylint: disable=broad-except
|
||||
|
@ -70,89 +197,191 @@ def start_import_task(**kwargs):
|
|||
job.set_status("failed")
|
||||
|
||||
|
||||
def process_books(job, tar):
|
||||
"""
|
||||
Process user import data related to books
|
||||
We always import the books even if not assigning
|
||||
them to shelves, lists etc
|
||||
"""
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def import_book_task(**kwargs): # pylint: disable=too-many-locals,too-many-branches
|
||||
"""Take work and edition data,
|
||||
find or create the edition and work in the database"""
|
||||
|
||||
books = job.import_data.get("books")
|
||||
task = UserImportBook.objects.get(id=kwargs["child_id"])
|
||||
job = task.parent_job
|
||||
archive_file = job.bookwyrmimportjob.archive_file
|
||||
book_data = task.book_data
|
||||
|
||||
for data in books:
|
||||
book = get_or_create_edition(data, tar)
|
||||
if task.complete or job.status == "stopped":
|
||||
return
|
||||
|
||||
if "include_shelves" in job.required:
|
||||
upsert_shelves(book, job.user, data)
|
||||
try:
|
||||
edition = book_data.get("edition")
|
||||
work = book_data.get("work")
|
||||
book = models.Edition.find_existing(edition)
|
||||
if not book:
|
||||
# make sure we have the authors in the local DB
|
||||
# replace the old author ids in the edition JSON
|
||||
edition["authors"] = []
|
||||
work["authors"] = []
|
||||
for author in book_data.get("authors"):
|
||||
instance = activitypub.parse(author).to_model(
|
||||
model=models.Author, save=True, overwrite=False
|
||||
)
|
||||
|
||||
if "include_readthroughs" in job.required:
|
||||
upsert_readthroughs(data.get("readthroughs"), job.user, book.id)
|
||||
edition["authors"].append(instance.remote_id)
|
||||
work["authors"].append(instance.remote_id)
|
||||
|
||||
if "include_comments" in job.required:
|
||||
upsert_statuses(
|
||||
job.user, models.Comment, data.get("comments"), book.remote_id
|
||||
)
|
||||
if "include_quotations" in job.required:
|
||||
upsert_statuses(
|
||||
job.user, models.Quotation, data.get("quotations"), book.remote_id
|
||||
# we will add the cover later from the tar
|
||||
# don't try to load it from the old server
|
||||
cover = edition.get("cover", {})
|
||||
cover_path = cover.get("url", None)
|
||||
edition["cover"] = {}
|
||||
|
||||
# first we need the parent work to exist
|
||||
work["editions"] = []
|
||||
work_instance = activitypub.parse(work).to_model(
|
||||
model=models.Work, save=True, overwrite=False
|
||||
)
|
||||
|
||||
if "include_reviews" in job.required:
|
||||
upsert_statuses(
|
||||
job.user, models.Review, data.get("reviews"), book.remote_id
|
||||
# now we have a work we can add it to the edition
|
||||
# and create the edition model instance
|
||||
edition["work"] = work_instance.remote_id
|
||||
book = activitypub.parse(edition).to_model(
|
||||
model=models.Edition, save=True, overwrite=False
|
||||
)
|
||||
|
||||
if "include_lists" in job.required:
|
||||
upsert_lists(job.user, data.get("lists"), book.id)
|
||||
# set the cover image from the tar
|
||||
if cover_path:
|
||||
archive_file.open("rb")
|
||||
with BookwyrmTarFile.open(mode="r:gz", fileobj=archive_file) as tar:
|
||||
tar.write_image_to_file(cover_path, book.cover)
|
||||
archive_file.close()
|
||||
|
||||
task.book = book
|
||||
task.save(update_fields=["book"])
|
||||
required = task.parent_job.bookwyrmimportjob.required
|
||||
|
||||
def get_or_create_edition(book_data, tar):
|
||||
"""Take a JSON string of work and edition data,
|
||||
find or create the edition and work in the database and
|
||||
return an edition instance"""
|
||||
if "include_shelves" in required:
|
||||
upsert_shelves(task.parent_job.user, book, book_data.get("shelves"))
|
||||
|
||||
edition = book_data.get("edition")
|
||||
existing = models.Edition.find_existing(edition)
|
||||
if existing:
|
||||
return existing
|
||||
if "include_readthroughs" in required:
|
||||
upsert_readthroughs(
|
||||
task.parent_job.user, book.id, book_data.get("readthroughs")
|
||||
)
|
||||
|
||||
# make sure we have the authors in the local DB
|
||||
# replace the old author ids in the edition JSON
|
||||
edition["authors"] = []
|
||||
for author in book_data.get("authors"):
|
||||
parsed_author = activitypub.parse(author)
|
||||
instance = parsed_author.to_model(
|
||||
model=models.Author, save=True, overwrite=True
|
||||
if "include_lists" in required:
|
||||
upsert_lists(task.parent_job.user, book.id, book_data.get("lists"))
|
||||
|
||||
except Exception as err: # pylint: disable=broad-except
|
||||
logger.exception(
|
||||
"Book Import Task %s for Job %s Failed with error: %s", task.id, job.id, err
|
||||
)
|
||||
task.fail_reason = _("unknown")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
edition["authors"].append(instance.remote_id)
|
||||
# Now import statuses
|
||||
# These are also subtasks so that we can isolate anything that fails
|
||||
if "include_comments" in job.bookwyrmimportjob.required:
|
||||
for status in book_data.get("comments"):
|
||||
UserImportPost.objects.create(
|
||||
parent_job=task.parent_job,
|
||||
json=status,
|
||||
book=book,
|
||||
status_type=UserImportPost.StatusType.COMMENT,
|
||||
)
|
||||
|
||||
# we will add the cover later from the tar
|
||||
# don't try to load it from the old server
|
||||
cover = edition.get("cover", {})
|
||||
cover_path = cover.get("url", None)
|
||||
edition["cover"] = {}
|
||||
if "include_quotations" in job.bookwyrmimportjob.required:
|
||||
for status in book_data.get("quotations"):
|
||||
UserImportPost.objects.create(
|
||||
parent_job=task.parent_job,
|
||||
json=status,
|
||||
book=book,
|
||||
status_type=UserImportPost.StatusType.QUOTE,
|
||||
)
|
||||
|
||||
# first we need the parent work to exist
|
||||
work = book_data.get("work")
|
||||
work["editions"] = []
|
||||
parsed_work = activitypub.parse(work)
|
||||
work_instance = parsed_work.to_model(model=models.Work, save=True, overwrite=True)
|
||||
if "include_reviews" in job.bookwyrmimportjob.required:
|
||||
for status in book_data.get("reviews"):
|
||||
UserImportPost.objects.create(
|
||||
parent_job=task.parent_job,
|
||||
json=status,
|
||||
book=book,
|
||||
status_type=UserImportPost.StatusType.REVIEW,
|
||||
)
|
||||
|
||||
# now we have a work we can add it to the edition
|
||||
# and create the edition model instance
|
||||
edition["work"] = work_instance.remote_id
|
||||
parsed_edition = activitypub.parse(edition)
|
||||
book = parsed_edition.to_model(model=models.Edition, save=True, overwrite=True)
|
||||
for item in UserImportPost.objects.filter(parent_job=job).all():
|
||||
item.start_job()
|
||||
|
||||
# set the cover image from the tar
|
||||
if cover_path:
|
||||
tar.write_image_to_file(cover_path, book.cover)
|
||||
|
||||
return book
|
||||
task.complete_job()
|
||||
|
||||
|
||||
def upsert_readthroughs(data, user, book_id):
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def upsert_status_task(**kwargs):
|
||||
"""Find or create book statuses"""
|
||||
|
||||
task = UserImportPost.objects.get(id=kwargs["child_id"])
|
||||
job = task.parent_job
|
||||
user = job.user
|
||||
status = task.json
|
||||
status_class = (
|
||||
models.Review
|
||||
if task.status_type == "review"
|
||||
else models.Quotation
|
||||
if task.status_type == "quote"
|
||||
else models.Comment
|
||||
)
|
||||
|
||||
if task.complete or job.status == "stopped":
|
||||
return
|
||||
|
||||
try:
|
||||
# only add statuses if this is the same user
|
||||
if is_alias(user, status.get("attributedTo", False)):
|
||||
status["attributedTo"] = user.remote_id
|
||||
status["to"] = update_followers_address(user, status["to"])
|
||||
status["cc"] = update_followers_address(user, status["cc"])
|
||||
status["replies"] = {}  # this parses incorrectly but we can't set it without knowing the new id
|
||||
status["inReplyToBook"] = task.book.remote_id
|
||||
parsed = activitypub.parse(status)
|
||||
if not status_already_exists(
|
||||
user, parsed
|
||||
): # don't duplicate posts on multiple imports
|
||||
|
||||
instance = parsed.to_model(
|
||||
model=status_class, save=True, overwrite=True
|
||||
)
|
||||
|
||||
for val in [
|
||||
"progress",
|
||||
"progress_mode",
|
||||
"position",
|
||||
"endposition",
|
||||
"position_mode",
|
||||
]:
|
||||
if status.get(val):
|
||||
setattr(instance, val, status[val])
|
||||
|
||||
instance.remote_id = instance.get_remote_id() # update the remote_id
|
||||
instance.save() # save and broadcast
|
||||
|
||||
task.complete_job()
|
||||
|
||||
else:
|
||||
logger.warning(
|
||||
"User not authorized to import statuses, or status is tombstone"
|
||||
)
|
||||
task.fail_reason = _("unauthorized")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
except Exception as err: # pylint: disable=broad-except
|
||||
logger.exception("User Import Task %s Failed with error: %s", task.id, err)
|
||||
task.fail_reason = _("unknown")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
|
||||
def upsert_readthroughs(user, book_id, data):
|
||||
"""Take a JSON string of readthroughs and
|
||||
find or create the instances in the database"""
|
||||
|
||||
|
@ -176,49 +405,11 @@ def upsert_readthroughs(data, user, book_id):
|
|||
models.ReadThrough.objects.create(**obj)
|
||||
|
||||
|
||||
def upsert_statuses(user, cls, data, book_remote_id):
|
||||
"""Take a JSON string of a status and
|
||||
find or create the instances in the database"""
|
||||
|
||||
for status in data:
|
||||
if is_alias(
|
||||
user, status["attributedTo"]
|
||||
): # don't let l33t hax0rs steal other people's posts
|
||||
# update ids and remove replies
|
||||
status["attributedTo"] = user.remote_id
|
||||
status["to"] = update_followers_address(user, status["to"])
|
||||
status["cc"] = update_followers_address(user, status["cc"])
|
||||
status["replies"] = {}  # this parses incorrectly but we can't set it without knowing the new id
|
||||
status["inReplyToBook"] = book_remote_id
|
||||
parsed = activitypub.parse(status)
|
||||
if not status_already_exists(
|
||||
user, parsed
|
||||
): # don't duplicate posts on multiple imports
|
||||
|
||||
instance = parsed.to_model(model=cls, save=True, overwrite=True)
|
||||
|
||||
for val in [
|
||||
"progress",
|
||||
"progress_mode",
|
||||
"position",
|
||||
"endposition",
|
||||
"position_mode",
|
||||
]:
|
||||
if status.get(val):
|
||||
setattr(instance, val, status[val])
|
||||
|
||||
instance.remote_id = instance.get_remote_id() # update the remote_id
|
||||
instance.save() # save and broadcast
|
||||
|
||||
else:
|
||||
logger.warning("User does not have permission to import statuses")
|
||||
|
||||
|
||||
def upsert_lists(user, lists, book_id):
|
||||
def upsert_lists(
|
||||
user,
|
||||
book_id,
|
||||
lists,
|
||||
):
|
||||
"""Take a list of objects each containing
|
||||
a list and list item as AP objects
|
||||
|
||||
|
@ -254,11 +445,10 @@ def upsert_lists(user, lists, book_id):
|
|||
)
|
||||
|
||||
|
||||
def upsert_shelves(book, user, book_data):
|
||||
def upsert_shelves(user, book, shelves):
|
||||
"""Take shelf JSON objects and create
|
||||
DB entries if they don't already exist"""
|
||||
|
||||
shelves = book_data["shelves"]
|
||||
for shelf in shelves:
|
||||
|
||||
book_shelf = models.Shelf.objects.filter(name=shelf["name"], user=user).first()
|
||||
|
@ -275,6 +465,10 @@ def upsert_shelves(book, user, book_data):
|
|||
)
|
||||
|
||||
|
||||
# user updates
|
||||
##############
|
||||
|
||||
|
||||
def update_user_profile(user, tar, data):
|
||||
"""update the user's profile from import data"""
|
||||
name = data.get("name", None)
|
||||
|
@ -315,14 +509,6 @@ def update_user_settings(user, data):
|
|||
user.save(update_fields=update_fields)
|
||||
|
||||
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def update_user_settings_task(job_id):
|
||||
"""wrapper task for user's settings import"""
|
||||
parent_job = BookwyrmImportJob.objects.get(id=job_id)
|
||||
|
||||
return update_user_settings(parent_job.user, parent_job.import_data.get("user"))
|
||||
|
||||
|
||||
def update_goals(user, data):
|
||||
"""update the user's goals from import data"""
|
||||
|
||||
|
@ -340,14 +526,6 @@ def update_goals(user, data):
|
|||
models.AnnualGoal.objects.create(**goal)
|
||||
|
||||
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def update_goals_task(job_id):
|
||||
"""wrapper task for user's goals import"""
|
||||
parent_job = BookwyrmImportJob.objects.get(id=job_id)
|
||||
|
||||
return update_goals(parent_job.user, parent_job.import_data.get("goals"))
|
||||
|
||||
|
||||
def upsert_saved_lists(user, values):
|
||||
"""Take a list of remote ids and add as saved lists"""
|
||||
|
||||
|
@ -358,67 +536,85 @@ def upsert_saved_lists(user, values):
|
|||
|
||||
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def upsert_saved_lists_task(job_id):
|
||||
"""wrapper task for user's saved lists import"""
|
||||
parent_job = BookwyrmImportJob.objects.get(id=job_id)
|
||||
def import_user_relationship_task(**kwargs):
|
||||
"""import a user follow or block from an import file"""
|
||||
|
||||
return upsert_saved_lists(
|
||||
parent_job.user, parent_job.import_data.get("saved_lists")
|
||||
)
|
||||
task = UserRelationshipImport.objects.get(id=kwargs["child_id"])
|
||||
job = task.parent_job
|
||||
|
||||
try:
|
||||
if task.relationship == "follow":
|
||||
|
||||
def upsert_follows(user, values):
|
||||
"""Take a list of remote ids and add as follows"""
|
||||
|
||||
for remote_id in values:
|
||||
followee = activitypub.resolve_remote_id(remote_id, models.User)
|
||||
if followee:
|
||||
(follow_request, created,) = models.UserFollowRequest.objects.get_or_create(
|
||||
user_subject=user,
|
||||
user_object=followee,
|
||||
)
|
||||
|
||||
if not created:
|
||||
# this request probably failed to connect with the remote
|
||||
# and should save to trigger a re-broadcast
|
||||
follow_request.save()
|
||||
|
||||
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def upsert_follows_task(job_id):
|
||||
"""wrapper task for user's follows import"""
|
||||
parent_job = BookwyrmImportJob.objects.get(id=job_id)
|
||||
|
||||
return upsert_follows(parent_job.user, parent_job.import_data.get("follows"))
|
||||
|
||||
|
||||
def upsert_user_blocks(user, user_ids):
|
||||
"""block users"""
|
||||
|
||||
for user_id in user_ids:
|
||||
user_object = activitypub.resolve_remote_id(user_id, models.User)
|
||||
if user_object:
|
||||
exists = models.UserBlocks.objects.filter(
|
||||
user_subject=user, user_object=user_object
|
||||
).exists()
|
||||
if not exists:
|
||||
models.UserBlocks.objects.create(
|
||||
user_subject=user, user_object=user_object
|
||||
followee = activitypub.resolve_remote_id(task.remote_id, models.User)
|
||||
if followee:
|
||||
(
|
||||
follow_request,
|
||||
created,
|
||||
) = models.UserFollowRequest.objects.get_or_create(
|
||||
user_subject=job.user,
|
||||
user_object=followee,
|
||||
)
|
||||
# remove the blocked user's lists from the groups
|
||||
models.List.remove_from_group(user, user_object)
|
||||
# remove the blocked user from all blocker's owned groups
|
||||
models.GroupMember.remove(user, user_object)
|
||||
|
||||
if not created:
|
||||
# this request probably failed to connect with the remote
|
||||
# and should save to trigger a re-broadcast
|
||||
follow_request.save()
|
||||
|
||||
task.complete_job()
|
||||
|
||||
else:
|
||||
logger.exception(
|
||||
"Could not resolve user %s task %s", task.remote_id, task.id
|
||||
)
|
||||
task.fail_reason = _("connection_error")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
elif task.relationship == "block":
|
||||
|
||||
user_object = activitypub.resolve_remote_id(task.remote_id, models.User)
|
||||
if user_object:
|
||||
exists = models.UserBlocks.objects.filter(
|
||||
user_subject=job.user, user_object=user_object
|
||||
).exists()
|
||||
if not exists:
|
||||
models.UserBlocks.objects.create(
|
||||
user_subject=job.user, user_object=user_object
|
||||
)
|
||||
# remove the blocked user's lists from the groups
|
||||
models.List.remove_from_group(job.user, user_object)
|
||||
# remove the blocked user from all blocker's owned groups
|
||||
models.GroupMember.remove(job.user, user_object)
|
||||
|
||||
task.complete_job()
|
||||
|
||||
else:
|
||||
logger.exception(
|
||||
"Could not resolve user %s task %s", task.remote_id, task.id
|
||||
)
|
||||
task.fail_reason = _("connection_error")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
else:
|
||||
logger.exception(
|
||||
"Invalid relationship %s type specified in task %s",
|
||||
task.relationship,
|
||||
task.id,
|
||||
)
|
||||
task.fail_reason = _("invalid_relationship")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
except Exception as err: # pylint: disable=broad-except
|
||||
logger.exception("User Import Task %s Failed with error: %s", task.id, err)
|
||||
task.fail_reason = _("unknown")
|
||||
task.save(update_fields=["fail_reason"])
|
||||
task.set_status("failed")
|
||||
|
||||
|
||||
@app.task(queue=IMPORTS, base=SubTask)
|
||||
def upsert_user_blocks_task(job_id):
|
||||
"""wrapper task for user's blocks import"""
|
||||
parent_job = BookwyrmImportJob.objects.get(id=job_id)
|
||||
|
||||
return upsert_user_blocks(
|
||||
parent_job.user, parent_job.import_data.get("blocked_users")
|
||||
)
|
||||
# utilities
|
||||
###########
|
||||
|
||||
|
||||
def update_followers_address(user, field):
|
||||
|
@ -433,19 +629,21 @@ def update_followers_address(user, field):
|
|||
|
||||
|
||||
def is_alias(user, remote_id):
|
||||
"""check that the user is listed as movedTo or also_known_as
|
||||
in the remote user's profile"""
|
||||
"""check that the user is listed as moved_to
|
||||
or also_known_as in the remote user's profile"""
|
||||
|
||||
if not remote_id:
|
||||
return False
|
||||
|
||||
remote_user = activitypub.resolve_remote_id(
|
||||
remote_id=remote_id, model=models.User, save=False
|
||||
)
|
||||
|
||||
if remote_user:
|
||||
|
||||
if remote_user.moved_to:
|
||||
if getattr(remote_user, "moved_to", None) is not None:
|
||||
return user.remote_id == remote_user.moved_to
|
||||
|
||||
if remote_user.also_known_as:
|
||||
if hasattr(remote_user, "also_known_as"):
|
||||
return user in remote_user.also_known_as.all()
|
||||
|
||||
return False
|
||||
|
|
|
@ -86,7 +86,9 @@ class ActivitypubFieldMixin:
|
|||
raise
|
||||
value = getattr(data, "actor")
|
||||
formatted = self.field_from_activity(
|
||||
value, allow_external_connections=allow_external_connections
|
||||
value,
|
||||
allow_external_connections=allow_external_connections,
|
||||
trigger=instance,
|
||||
)
|
||||
if formatted is None or formatted is MISSING or formatted == {}:
|
||||
return False
|
||||
|
@ -128,7 +130,7 @@ class ActivitypubFieldMixin:
|
|||
return value
|
||||
|
||||
# pylint: disable=unused-argument
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
"""formatter to convert activitypub into a model value"""
|
||||
if value and hasattr(self, "activitypub_wrapper"):
|
||||
value = value.get(self.activitypub_wrapper)
|
||||
|
@ -150,7 +152,9 @@ class ActivitypubRelatedFieldMixin(ActivitypubFieldMixin):
|
|||
self.load_remote = load_remote
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
"""trigger: the object that triggered this deserialization.
|
||||
For example the Edition for which self is the parent Work"""
|
||||
if not value:
|
||||
return None
|
||||
|
||||
|
@ -160,7 +164,7 @@ class ActivitypubRelatedFieldMixin(ActivitypubFieldMixin):
|
|||
# only look in the local database
|
||||
return related_model.find_existing(value.serialize())
|
||||
# this is an activitypub object, which we can deserialize
|
||||
return value.to_model(model=related_model)
|
||||
return value.to_model(model=related_model, trigger=trigger)
|
||||
try:
|
||||
# make sure the value looks like a remote id
|
||||
validate_remote_id(value)
|
||||
|
@ -336,7 +340,7 @@ class ManyToManyField(ActivitypubFieldMixin, models.ManyToManyField):
|
|||
return f"{value.instance.remote_id}/{self.name}"
|
||||
return [i.remote_id for i in value.all()]
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
if value is None or value is MISSING:
|
||||
return None
|
||||
if not isinstance(value, list):
|
||||
|
@ -386,7 +390,7 @@ class TagField(ManyToManyField):
|
|||
)
|
||||
return tags
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
if not isinstance(value, list):
|
||||
# GoToSocial DMs and single-user mentions are
|
||||
# sent as objects, not as an array of objects
|
||||
|
@ -481,7 +485,7 @@ class ImageField(ActivitypubFieldMixin, models.ImageField):
|
|||
|
||||
return activitypub.Image(url=url, name=alt)
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
image_slug = value
|
||||
# when it's an inline image (User avatar/icon, Book cover), it's a json
|
||||
# blob, but when it's an attached image, it's just a url
|
||||
|
@ -538,7 +542,7 @@ class DateTimeField(ActivitypubFieldMixin, models.DateTimeField):
|
|||
return None
|
||||
return value.isoformat()
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
missing_fields = datetime(1970, 1, 1) # "2022-10" => "2022-10-01"
|
||||
try:
|
||||
date_value = dateutil.parser.parse(value, default=missing_fields)
|
||||
|
@ -556,7 +560,7 @@ class PartialDateField(ActivitypubFieldMixin, PartialDateModel):
|
|||
def field_to_activity(self, value) -> str:
|
||||
return value.partial_isoformat() if value else None
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
# pylint: disable=no-else-return
|
||||
try:
|
||||
return from_partial_isoformat(value)
|
||||
|
@ -584,7 +588,7 @@ class PartialDateField(ActivitypubFieldMixin, PartialDateModel):
|
|||
class HtmlField(ActivitypubFieldMixin, models.TextField):
|
||||
"""a text field for storing html"""
|
||||
|
||||
def field_from_activity(self, value, allow_external_connections=True):
|
||||
def field_from_activity(self, value, allow_external_connections=True, trigger=None):
|
||||
if not value or value == MISSING:
|
||||
return None
|
||||
return clean(value)
|
||||
|
|
|
@ -29,6 +29,7 @@ class Job(models.Model):
|
|||
status = models.CharField(
|
||||
max_length=50, choices=Status.choices, default=Status.PENDING, null=True
|
||||
)
|
||||
fail_reason = models.TextField(null=True)
|
||||
|
||||
class Meta:
|
||||
"""Make it abstract"""
|
||||
|
@ -133,7 +134,8 @@ class ParentJob(Job):
|
|||
tasks = self.pending_child_jobs.filter(task_id__isnull=False).values_list(
|
||||
"task_id", flat=True
|
||||
)
|
||||
app.control.revoke(list(tasks))
|
||||
tasklist = [str(task) for task in list(tasks)]
|
||||
app.control.revoke(tasklist)
|
||||
|
||||
self.pending_child_jobs.update(status=self.Status.STOPPED)
|
||||
|
||||
|
@ -208,7 +210,7 @@ class ParentTask(app.Task):
|
|||
job.task_id = task_id
|
||||
job.save(update_fields=["task_id"])
|
||||
|
||||
if kwargs["no_children"]:
|
||||
if kwargs.get("no_children"):
|
||||
job.set_status(ChildJob.Status.ACTIVE)
|
||||
|
||||
def on_success(
|
||||
|
@ -233,7 +235,7 @@ class ParentTask(app.Task):
|
|||
None: The return value of this handler is ignored.
|
||||
"""
|
||||
|
||||
if kwargs["no_children"]:
|
||||
if kwargs.get("no_children"):
|
||||
job = ParentJob.objects.get(id=kwargs["job_id"])
|
||||
job.complete_job()
|
||||
|
||||
|
@ -247,7 +249,7 @@ class SubTask(app.Task):
|
|||
"""
|
||||
|
||||
def before_start(
|
||||
self, task_id, *args, **kwargs
|
||||
self, task_id, args, kwargs
|
||||
): # pylint: disable=no-self-use, unused-argument
|
||||
"""Handler called before the task starts. Override.
|
||||
|
||||
|
@ -271,7 +273,7 @@ class SubTask(app.Task):
|
|||
child_job.set_status(ChildJob.Status.ACTIVE)
|
||||
|
||||
def on_success(
|
||||
self, retval, task_id, *args, **kwargs
|
||||
self, retval, task_id, args, kwargs
|
||||
): # pylint: disable=no-self-use, unused-argument
|
||||
"""Run by the worker if the task executes successfully. Override.
|
||||
|
||||
|
|
|
@ -52,6 +52,13 @@
|
|||
<dd>{{ book.goodreads_key }}</dd>
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
{% if book.finna_key %}
|
||||
<div class="is-flex is-flex-wrap-wrap">
|
||||
<dt class="mr-1">{% trans "Finna ID:" %}</dt>
|
||||
<dd>{{ book.finna_key }}</dd>
|
||||
</div>
|
||||
{% endif %}
|
||||
</dl>
|
||||
{% endif %}
|
||||
{% endspaceless %}
|
||||
|
|
|
@ -382,6 +382,15 @@
|
|||
|
||||
{% include 'snippets/form_errors.html' with errors_list=form.isfdb.errors id="desc_isfdb" %}
|
||||
</div>
|
||||
|
||||
<div class="field">
|
||||
<label class="label" for="id_finna_key">
|
||||
{% trans "Finna ID:" %}
|
||||
</label>
|
||||
{{ form.finna_key }}
|
||||
|
||||
{% include 'snippets/form_errors.html' with errors_list=form.finna_key.errors id="desc_finna_key" %}
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
</div>
|
||||
|
|
|
@ -29,10 +29,25 @@
|
|||
</div>
|
||||
{% elif next_available %}
|
||||
<div class="notification is-warning">
|
||||
<p>{% blocktrans %}Currently you are allowed to import one user every {{ user_import_hours }} hours.{% endblocktrans %}</p>
|
||||
<p>{% blocktrans %}You will next be able to import a user file at {{ next_available }}{% endblocktrans %}</p>
|
||||
<p>{% blocktrans with hours=next_available.1 %}Currently you are allowed to import one user every {{ hours }} hours.{% endblocktrans %}</p>
|
||||
<p>{% blocktrans with next_time=next_available.0 %}You will next be able to import a user file at {{ next_time }}{% endblocktrans %}</p>
|
||||
</div>
|
||||
{% else %}
|
||||
{% if recent_avg_hours or recent_avg_minutes %}
|
||||
<div class="notification">
|
||||
<p>
|
||||
{% if recent_avg_hours %}
|
||||
{% blocktrans trimmed with hours=recent_avg_hours|floatformat:0|intcomma %}
|
||||
On average, recent imports have taken {{ hours }} hours.
|
||||
{% endblocktrans %}
|
||||
{% else %}
|
||||
{% blocktrans trimmed with minutes=recent_avg_minutes|floatformat:0|intcomma %}
|
||||
On average, recent imports have taken {{ minutes }} minutes.
|
||||
{% endblocktrans %}
|
||||
{% endif %}
|
||||
</p>
|
||||
</div>
|
||||
{% endif %}
|
||||
<form class="box content" name="import-user" action="/user-import" method="post" enctype="multipart/form-data">
|
||||
{% csrf_token %}
|
||||
|
||||
|
@ -186,10 +201,10 @@
|
|||
{% endif %}
|
||||
{% for job in jobs %}
|
||||
<tr>
|
||||
<td><a href="{% url 'user-import-status' job.id %}">{{ job.created_date }}</a></td>
|
||||
<td>
|
||||
<p>{{ job.created_date }}</p>
|
||||
<p>{{ job.updated_date }}</p>
|
||||
</td>
|
||||
<td>{{ job.updated_date }}</td>
|
||||
<td>
|
||||
<span
|
||||
{% if job.status == "stopped" or job.status == "failed" %}
|
||||
|
@ -197,14 +212,13 @@
|
|||
{% elif job.status == "pending" %}
|
||||
class="tag is-warning"
|
||||
{% elif job.complete %}
|
||||
class="tag"
|
||||
{% else %}
|
||||
class="tag is-success"
|
||||
{% else %}
|
||||
class="tag"
|
||||
{% endif %}
|
||||
>
|
||||
{% if job.status %}
|
||||
{{ job.status }}
|
||||
{{ job.status_display }}
|
||||
{{ job.get_status_display }}
|
||||
{% elif job.complete %}
|
||||
{% trans "Complete" %}
|
||||
{% else %}
|
||||
|
|
bookwyrm/templates/import/user_import_status.html (new file, 210 lines)
|
@ -0,0 +1,210 @@
|
|||
{% extends 'layout.html' %}
|
||||
{% load i18n %}
|
||||
{% load humanize %}
|
||||
{% load static %}
|
||||
|
||||
{% block title %}{% trans "User Import Status" %}{% endblock %}
|
||||
|
||||
{% block content %}{% spaceless %}
|
||||
<header class="block">
|
||||
<h1 class="title">
|
||||
{% block page_title %}
|
||||
{% if job.retry %}
|
||||
{% trans "User Import Retry Status" %}
|
||||
{% else %}
|
||||
{% trans "User Import Status" %}
|
||||
{% endif %}
|
||||
{% endblock %}
|
||||
</h1>
|
||||
|
||||
<nav class="breadcrumb subtitle" aria-label="breadcrumbs">
|
||||
<ul>
|
||||
<li><a href="{% url 'user-import' %}">{% trans "User Imports" %}</a></li>
|
||||
{% url 'user-import-status' job.id as path %}
|
||||
<li{% if request.path in path %} class="is-active"{% endif %}>
|
||||
<a href="{{ path }}" {% if request.path in path %}aria-current="page"{% endif %}>
|
||||
{% trans "User Import Status" %}
|
||||
</a>
|
||||
</li>
|
||||
{% block breadcrumbs %}{% endblock %}
|
||||
</ul>
|
||||
</nav>
|
||||
|
||||
<div class="block">
|
||||
<dl>
|
||||
<dt class="is-pulled-left mr-5 has-text-weight-bold">{% trans "Import started:" %}</dt>
|
||||
<dd>{{ job.created_date | naturaltime }}</dd>
|
||||
<dt class="is-pulled-left mr-5 has-text-weight-bold">Import Job Status: </dt>
|
||||
<dd>
|
||||
<span
|
||||
{% if job.status == "stopped" or job.status == "failed" %}
|
||||
class="tag is-danger"
|
||||
{% elif job.status == "pending" %}
|
||||
class="tag is-warning"
|
||||
{% elif job.complete %}
|
||||
class="tag"
|
||||
{% else %}
|
||||
class="tag is-success"
|
||||
{% endif %}
|
||||
>
|
||||
{% if job.status %}
|
||||
{{ job.status }}
|
||||
{{ job.status_display }}
|
||||
{% elif job.complete %}
|
||||
{% trans "Complete" %}
|
||||
{% else %}
|
||||
{% trans "Active" %}
|
||||
{% endif %}
|
||||
</span>
|
||||
</dd>
|
||||
</dl>
|
||||
</div>
|
||||
{% block import_counts %}
|
||||
<div class="block">
|
||||
<div class="table-container">
|
||||
<table class="table is-striped is-fullwidth">
|
||||
<tr>
|
||||
<th></th>
|
||||
<th class="has-text-centered">{% trans "Imported" %}</th>
|
||||
<th class="has-text-centered">{% trans "Failed" %}</th>
|
||||
<th class="has-text-centered">{% trans "Total" %}</th>
|
||||
</tr>
|
||||
<tr>
|
||||
<th>{% trans "Books" %}</th>
|
||||
<td class="has-text-centered">{{ completed_books_count }}</td>
|
||||
<td class="has-text-centered">{{ failed_books_count }}</td>
|
||||
<td class="has-text-centered">{{ book_jobs_count }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th>{% trans "Statuses" %}</th>
|
||||
<td class="has-text-centered">{{ completed_statuses_count }}</td>
|
||||
<td class="has-text-centered">{{ failed_statuses_count }}</td>
|
||||
<td class="has-text-centered">{{ status_jobs_count }}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<th>{% trans "Follows & Blocks" %}</th>
|
||||
<td class="has-text-centered">{{ completed_relationships_count }}</td>
|
||||
<td class="has-text-centered">{{ failed_relationships_count }}</td>
|
||||
<td class="has-text-centered">{{ relationship_jobs_count }}</td>
|
||||
</tr>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% if job.status == "active" and show_progress %}
|
||||
<div class="box is-processing">
|
||||
<div class="block">
|
||||
<span class="icon icon-spinner is-pulled-left" aria-hidden="true"></span>
|
||||
<span>{% trans "In progress" %}</span>
|
||||
<span class="is-pulled-right">
|
||||
<a href="{% url 'user-import-status' job.id %}" class="button is-small">{% trans "Refresh" %}</a>
|
||||
</span>
|
||||
</div>
|
||||
<div class="is-flex">
|
||||
<progress
|
||||
class="progress is-success is-medium mr-2"
|
||||
role="progressbar"
|
||||
aria-min="0"
|
||||
value="{{ complete_count }}"
|
||||
aria-valuenow="{{ complete_count }}"
|
||||
max="{{ item_count }}"
|
||||
aria-valuemax="{{ item_count }}">
|
||||
{{ percent }} %
|
||||
</progress>
|
||||
<span>{{ percent }}%</span>
|
||||
</div>
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
{% if not job.complete %}
|
||||
<form name="stop-import" action="{% url 'user-import-stop' job.id %}" method="POST">
|
||||
{% csrf_token %}
|
||||
<button class="button is-danger" type="submit">{% trans "Stop import" %}</button>
|
||||
</form>
|
||||
{% endif %}
|
||||
|
||||
{% if job.complete and fail_count and not job.retry %}
|
||||
<div class="notification is-warning">
|
||||
{% blocktrans trimmed count counter=fail_count with display_counter=fail_count|intcomma %}
|
||||
{{ display_counter }} item failed to import.
|
||||
{% plural %}
|
||||
{{ display_counter }} items failed to import.
|
||||
{% endblocktrans %}
|
||||
<a href="{% url 'user-import-troubleshoot' job.id %}">
|
||||
{% trans "View and troubleshoot failed items" %}
|
||||
</a>
|
||||
</div>
|
||||
{% endif %}
|
||||
</header>
|
||||
|
||||
<div class="block">
|
||||
{% block actions %}{% endblock %}
|
||||
{% block item_list %}
|
||||
<h2 class="title">{% trans "Imported books" %}</h2>
|
||||
<div class="table-container">
|
||||
<table class="table is-striped is-fullwidth">
|
||||
<tr>
|
||||
<th>
|
||||
{% trans "Title" %}
|
||||
</th>
|
||||
<th>
|
||||
{% trans "ISBN" %}
|
||||
</th>
|
||||
<th>
|
||||
{% trans "Authors" %}
|
||||
</th>
|
||||
{% block import_cols_headers %}
|
||||
<th>
|
||||
{% trans "Book" %}
|
||||
</th>
|
||||
<th>
|
||||
{% trans "Status" %}
|
||||
</th>
|
||||
{% endblock %}
|
||||
</tr>
|
||||
{% for item in items %}
|
||||
<tr>
|
||||
<td>
|
||||
{{ item.book_data.edition.title }}
|
||||
</td>
|
||||
<td>
|
||||
{{ item.book_data.edition.isbn13|default:'' }}
|
||||
|
||||
</td>
|
||||
<td>
|
||||
{% for author in item.book_data.authors %}
|
||||
<p>{{ author.name }}</p>
|
||||
{% endfor %}
|
||||
</td>
|
||||
{% block import_cols %}
|
||||
<td>
|
||||
{% if item.book %}
|
||||
<a href="{{ item.book.local_path }}">
|
||||
{% include 'snippets/book_cover.html' with book=item.book cover_class='is-h-s' size='small' %}
|
||||
</a>
|
||||
{% endif %}
|
||||
</td>
|
||||
<td>
|
||||
{% if item.book %}
|
||||
<span class="icon icon-check" aria-hidden="true"></span>
|
||||
<span class="is-sr-only-mobile">{% trans "Imported" %}</span>
|
||||
{% else %}
|
||||
<div class="is-flex">
|
||||
<span class="is-sr-only-mobile">{{ item.status }}</span>
|
||||
</div>
|
||||
{% endif %}
|
||||
</td>
|
||||
{% endblock %}
|
||||
</tr>
|
||||
{% block action_row %}{% endblock %}
|
||||
{% endfor %}
|
||||
</table>
|
||||
</div>
|
||||
{% endblock %}
|
||||
</div>
|
||||
|
||||
<div>
|
||||
{% include 'snippets/pagination.html' with page=items path=page_path %}
|
||||
</div>
|
||||
{% endspaceless %}{% endblock %}
|
bookwyrm/templates/import/user_troubleshoot.html (new file, 91 lines)
|
@ -0,0 +1,91 @@
|
|||
{% extends 'import/user_import_status.html' %}
|
||||
{% load i18n %}
|
||||
{% load utilities %}
|
||||
|
||||
{% block title %}{% trans "User Import Troubleshooting" %}{% endblock %}
|
||||
|
||||
{% block page_title %}
|
||||
{% trans "Failed items" %}
|
||||
{% endblock %}
|
||||
|
||||
{% block breadcrumbs %}
|
||||
<li class="is-active">
|
||||
<a href="#" aria-current="page">{% trans "Troubleshooting" %}</a>
|
||||
</li>
|
||||
{% endblock %}
|
||||
|
||||
{% block import_counts %}{% endblock %}
|
||||
|
||||
{% block actions %}
|
||||
<div class="block">
|
||||
<div class="notification content">
|
||||
<p>
|
||||
{% trans "Re-trying an import can fix missing items in cases such as:" %}
|
||||
</p>
|
||||
<ul>
|
||||
<li>{% trans "Your account was not set as an alias of the original user account" %}</li>
|
||||
<li>{% trans "A transient error or timeout caused the external data source to be unavailable." %}</li>
|
||||
<li>{% trans "BookWyrm has been updated since this import with a bug fix" %}</li>
|
||||
</ul>
|
||||
<p>
|
||||
{% trans "Re-trying an import will not work in cases such as:" %}
|
||||
</p>
|
||||
<ul>
|
||||
<li>{% trans "A user, status, or BookWyrm server was deleted after your import file was created" %}</li>
|
||||
<li>{% trans "Importing statuses when your old account has been deleted" %}</li>
|
||||
</ul>
|
||||
<p>
|
||||
{% trans "Contact your admin or <a href='https://github.com/bookwyrm-social/bookwyrm/issues'>open an issue</a> if you are seeing unexpected failed items." %}
|
||||
</p>
|
||||
</div>
|
||||
{% if next_available %}
|
||||
<div class="notification is-warning">
|
||||
<p>{% blocktrans with hours=next_available.1 %}Currently you are allowed to import or retry one user every {{ hours }} hours.{% endblocktrans %}</p>
|
||||
<p>{% blocktrans with next_time=next_available.0 %}You will be able to retry this import at {{ next_time }}{% endblocktrans %}</p>
|
||||
</div>
|
||||
{% else %}
|
||||
<form name="retry" method="post" action="{% url 'user-import-troubleshoot' job.id %}">
|
||||
{% csrf_token %}
|
||||
<button type="submit" class="button">Retry all</button>
|
||||
</form>
|
||||
{% endif %}
|
||||
</div>
|
||||
{% endblock %}
|
||||
{% block item_list %}
|
||||
<div class="table-container">
|
||||
<table class="table is-striped is-fullwidth">
|
||||
<tr>
|
||||
<th>
|
||||
{% trans "Book" %}
|
||||
</th>
|
||||
<th>
|
||||
{% trans "Status" %}
|
||||
</th>
|
||||
<th>
|
||||
{% trans "Relationship" %}
|
||||
</th>
|
||||
<th>
|
||||
{% trans "Reason" %}
|
||||
</th>
|
||||
</tr>
|
||||
{% for item in items %}
|
||||
<tr>
|
||||
<td class="is-italic">{{ item.userimportpost.book.title }}</td>
|
||||
<td>{{ item.userimportpost.json.type }}</td>
|
||||
<td>{% id_to_username item.userrelationshipimport.remote_id True %}</td>
|
||||
<td>
|
||||
{% if item.fail_reason == "unauthorized" %}
|
||||
{% trans "Not authorized to import statuses" %}
|
||||
{% elif item.fail_reason == "connection_error" %}
|
||||
{% trans "Could not connect to remote identity" %}
|
||||
{% elif item.fail_reason == "invalid_relationship" %}
|
||||
{% trans "Invalid relationship type - please log an issue" %}
|
||||
{% else %}
|
||||
{% trans "Unknown error" %}
|
||||
{% endif %}
|
||||
</td>
|
||||
</tr>
|
||||
{% endfor %}
|
||||
</table>
|
||||
</div>
|
||||
{% endblock %}
|
|
@ -1,5 +1,6 @@
|
|||
{% extends 'preferences/layout.html' %}
|
||||
{% load i18n %}
|
||||
{% load humanize %}
|
||||
{% load utilities %}
|
||||
|
||||
{% block title %}{% trans "Export BookWyrm Account" %}{% endblock %}
|
||||
|
@ -48,12 +49,12 @@
|
|||
<p class="notification is-danger">
|
||||
{% trans "New user exports are currently disabled." %}
|
||||
{% if perms.bookwyrm.edit_instance_settings %}
|
||||
<br/>
|
||||
{% url 'settings-imports' as url %}
|
||||
{% blocktrans trimmed %}
|
||||
User exports settings can be changed from <a href="{{ url }}">the Imports page</a> in the Admin dashboard.
|
||||
{% endblocktrans %}
|
||||
{% endif %}
|
||||
<br/>
|
||||
{% url 'settings-imports' as url %}
|
||||
{% blocktrans trimmed %}
|
||||
User exports settings can be changed from <a href="{{ url }}">the Imports page</a> in the Admin dashboard.
|
||||
{% endblocktrans %}
|
||||
{% endif %}
|
||||
</p>
|
||||
{% elif next_available %}
|
||||
<p class="notification is-warning">
|
||||
|
@ -61,7 +62,25 @@
|
|||
You will be able to create a new export file at {{ next_available }}
|
||||
{% endblocktrans %}
|
||||
</p>
|
||||
|
||||
{% else %}
|
||||
|
||||
{% if recent_avg_hours or recent_avg_minutes %}
|
||||
<div class="notification">
|
||||
<p>
|
||||
{% if recent_avg_hours %}
|
||||
{% blocktrans trimmed with hours=recent_avg_hours|floatformat:0|intcomma %}
|
||||
On average, recent exports have taken {{ hours }} hours.
|
||||
{% endblocktrans %}
|
||||
{% else %}
|
||||
{% blocktrans trimmed with minutes=recent_avg_minutes|floatformat:0|intcomma %}
|
||||
On average, recent exports have taken {{ minutes }} minutes.
|
||||
{% endblocktrans %}
|
||||
{% endif %}
|
||||
</p>
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
<form name="export" method="POST" href="{% url 'prefs-user-export' %}">
|
||||
{% csrf_token %}
|
||||
<button type="submit" class="button">
|
||||
|
@ -107,14 +126,13 @@
|
|||
{% elif export.job.status == "pending" %}
|
||||
class="tag is-warning"
|
||||
{% elif export.job.complete %}
|
||||
class="tag"
|
||||
{% else %}
|
||||
class="tag is-success"
|
||||
{% else %}
|
||||
class="tag"
|
||||
{% endif %}
|
||||
>
|
||||
{% if export.job.status %}
|
||||
{{ export.job.status }}
|
||||
{{ export.job.status_display }}
|
||||
{{ export.job.get_status_display }}
|
||||
{% elif export.job.complete %}
|
||||
{% trans "Complete" %}
|
||||
{% else %}
|
||||
|
|
bookwyrm/templates/rss/edition.html (new file, 10 lines)
|
@ -0,0 +1,10 @@
|
|||
{% load i18n %}
|
||||
{% load shelf_tags %}
|
||||
‘{{ obj.title }}’ {% if obj.author_text %} by {{ obj.author_text }} {% endif %}
|
||||
<p>{{ obj.description|default_if_none:obj.parent_work.description }}</p>
|
||||
{% if obj.isbn_13 %}{% trans "ISBN 13:" %} {{ obj.isbn_13 }}<br>{% endif %}
|
||||
{% if obj.oclc_number %}{% trans "OCLC Number:" %} {{ obj.oclc_number }}<br>{% endif %}
|
||||
{% if obj.asin %}{% trans "ASIN:" %} {{ obj.asin }}<br>{% endif %}
|
||||
{% if obj.aasin %}{% trans "Audible ASIN:" %} {{ obj.aasin }}<br>{% endif %}
|
||||
{% if obj.isfdb %}{% trans "ISFDB ID:" %} {{ obj.isfdb }}<br>{% endif %}
|
||||
{% if obj.goodreads_key %}{% trans "Goodreads:" %} {{ obj.goodreads_key }}{% endif %}
|
|
@ -8,6 +8,9 @@
|
|||
<li class="">
|
||||
<a href="{{ author.local_path }}" class="author" itemprop="author" itemscope itemtype="https://schema.org/Thing">
|
||||
<span itemprop="name">{{ author.name }}</span>
|
||||
{% if author.born or author.died %}
|
||||
<span>({{ author.born|date:"Y" }}-{{ author.died|date:"Y" }})</span>
|
||||
{% endif %}
|
||||
</a>
|
||||
</li>
|
||||
{% endfor %}
|
||||
|
|
|
@ -12,6 +12,11 @@
|
|||
{% include 'snippets/opengraph.html' with image=user.preview_image %}
|
||||
{% endblock %}
|
||||
|
||||
|
||||
{% block head_links %}
|
||||
<link rel="alternate" type="application/rss+xml" href="{{ request.get_full_path }}/rss" title="{{ user.display_name }} - {{ shelf.name }}" />
|
||||
{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<header class="block">
|
||||
<h1 class="title">
|
||||
|
|
|
@ -116,7 +116,7 @@ def get_isni(existing, author, autoescape=True):
|
|||
|
||||
|
||||
@register.simple_tag(takes_context=False)
|
||||
def id_to_username(user_id):
|
||||
def id_to_username(user_id, return_empty=False):
|
||||
"""given an arbitrary remote id, return the username"""
|
||||
if user_id:
|
||||
url = urlparse(user_id)
|
||||
|
@ -126,6 +126,10 @@ def id_to_username(user_id):
|
|||
value = f"{name}@{domain}"
|
||||
|
||||
return value
|
||||
|
||||
if return_empty:
|
||||
return ""
|
||||
|
||||
return _("a new user account")
|
||||
|
||||
|
||||
|
|
bookwyrm/tests/connectors/test_finna_connector.py (new file, 142 lines)
|
@ -0,0 +1,142 @@
|
|||
""" testing book data connectors """
|
||||
import json
|
||||
import pathlib
|
||||
|
||||
from django.test import TestCase
|
||||
import responses
|
||||
|
||||
from bookwyrm import models
|
||||
from bookwyrm.connectors.finna import Connector, guess_page_numbers
|
||||
|
||||
|
||||
class Finna(TestCase):
|
||||
"""test loading data from finna.fi"""
|
||||
|
||||
@classmethod
|
||||
def setUpTestData(cls):
|
||||
"""creates the connector in the database"""
|
||||
models.Connector.objects.create(
|
||||
identifier="api.finna.fi",
|
||||
name="Finna API",
|
||||
connector_file="finna",
|
||||
base_url="https://www.finna.fi",
|
||||
books_url="https://api.finna.fi/api/v1/record" "?id=",
|
||||
covers_url="https://api.finna.fi",
|
||||
search_url="https://api.finna.fi/api/v1/search?limit=20"
|
||||
"&filter[]=format%3a%220%2fBook%2f%22"
|
||||
"&field[]=title&field[]=recordPage&field[]=authors"
|
||||
"&field[]=year&field[]=id&field[]=formats&field[]=images"
|
||||
"&lookfor=",
|
||||
isbn_search_url="https://api.finna.fi/api/v1/search?limit=1"
|
||||
"&filter[]=format%3a%220%2fBook%2f%22"
|
||||
"&field[]=title&field[]=recordPage&field[]=authors&field[]=year"
|
||||
"&field[]=id&field[]=formats&field[]=images"
|
||||
"&lookfor=isbn:",
|
||||
)
|
||||
|
||||
def setUp(self):
|
||||
"""connector instance"""
|
||||
self.connector = Connector("api.finna.fi")
|
||||
|
||||
def test_parse_search_data(self):
|
||||
"""json to search result objs"""
|
||||
search_file = pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/finna_search.json"
|
||||
)
|
||||
search_results = json.loads(search_file.read_bytes())
|
||||
formatted = list(self.connector.parse_search_data(search_results, 0))
|
||||
|
||||
self.assertEqual(formatted[0].title, "Sarvijumala")
|
||||
self.assertEqual(formatted[0].author, "Magdalena Hai")
|
||||
self.assertEqual(
|
||||
formatted[0].key, "https://api.finna.fi/api/v1/record?id=anders.1920022"
|
||||
)
|
||||
self.assertEqual(
|
||||
formatted[0].cover,
|
||||
None,
|
||||
)
|
||||
# Test that edition info is parsed correctly to title
|
||||
self.assertEqual(formatted[1].title, "Sarvijumala Audiobook")
|
||||
self.assertEqual(formatted[2].title, "Sarvijumala")
|
||||
self.assertEqual(formatted[3].title, "Sarvijumala eBook")
|
||||
|
||||
def test_parse_isbn_search_data(self):
|
||||
"""another search type"""
|
||||
search_file = pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/finna_isbn_search.json"
|
||||
)
|
||||
search_results = json.loads(search_file.read_bytes())
|
||||
|
||||
formatted = list(self.connector.parse_isbn_search_data(search_results))[0]
|
||||
|
||||
self.assertEqual(formatted.title, "Ilmakirja : painovoimainen ilmanvaihto")
|
||||
self.assertEqual(
|
||||
formatted.key, "https://api.finna.fi/api/v1/record?id=3amk.308439"
|
||||
)
|
||||
|
||||
def test_parse_isbn_search_data_empty(self):
|
||||
"""another search type"""
|
||||
search_results = {"resultCount": 0, "records": []}
|
||||
results = list(self.connector.parse_isbn_search_data(search_results))
|
||||
self.assertEqual(results, [])
|
||||
|
||||
def test_page_count_parsing(self):
|
||||
"""Test page count parsing flow"""
|
||||
for data in [
|
||||
"123 sivua",
|
||||
"123, [4] sivua",
|
||||
"sidottu 123 sivua",
|
||||
"123s; [4]; 9cm",
|
||||
]:
|
||||
page_count = guess_page_numbers([data])
|
||||
self.assertEqual(page_count, "123")
|
||||
for data in [" sivua", "xx, [4] sivua", "sidottu", "[4]; 9cm"]:
|
||||
page_count = guess_page_numbers([data])
|
||||
self.assertEqual(page_count, None)
|
||||
|
||||
@responses.activate
|
||||
def test_get_book_data(self):
|
||||
"""Test book data parsing from example json files"""
|
||||
record_file = pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/finna_record.json"
|
||||
)
|
||||
version_file = pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/finna_versions.json"
|
||||
)
|
||||
author_file = pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/finna_author_search.json"
|
||||
)
|
||||
record_result = json.loads(record_file.read_bytes())
|
||||
versions_result = json.loads(version_file.read_bytes())
|
||||
author_search_result = json.loads(author_file.read_bytes())
|
||||
responses.add(
|
||||
responses.GET,
|
||||
"https://api.finna.fi/api/v1/search?id=anders.1819084&search=versions"
|
||||
"&view=&field%5B%5D=authors&field%5B%5D=cleanIsbn&field%5B%5D=edition"
|
||||
"&field%5B%5D=formats&field%5B%5D=id&field%5B%5D=images&field%5B%5D=isbns"
|
||||
"&field%5B%5D=languages&field%5B%5D=physicalDescriptions"
|
||||
"&field%5B%5D=publishers&field%5B%5D=recordPage&field%5B%5D=series"
|
||||
"&field%5B%5D=shortTitle&field%5B%5D=subjects&field%5B%5D=subTitle"
|
||||
"&field%5B%5D=summary&field%5B%5D=title&field%5B%5D=year",
|
||||
json=versions_result,
|
||||
)
|
||||
responses.add(
|
||||
responses.GET,
|
||||
"https://api.finna.fi/api/v1/search?limit=20&filter%5B%5D=format%3A%220%2F"
|
||||
"Book%2F%22&field%5B%5D=title&field%5B%5D=recordPage&field%5B%5D=authors&"
|
||||
"field%5B%5D=year&field%5B%5D=id&field%5B%5D=formats&field%5B%5D=images&"
|
||||
"lookfor=Emmi%20It%C3%A4ranta&type=Author&field%5B%5D=authors&field%5B%5D"
|
||||
"=cleanIsbn&field%5B%5D=formats&field%5B%5D=id&field%5B%5D=images&field"
|
||||
"%5B%5D=isbns&field%5B%5D=languages&field%5B%5D=physicalDescriptions&"
|
||||
"field%5B%5D=publishers&field%5B%5D=recordPage&field%5B%5D=series&field"
|
||||
"%5B%5D=shortTitle&field%5B%5D=subjects&field%5B%5D=subTitle"
|
||||
"&field%5B%5D=summary&field%5B%5D=title&field%5B%5D=year",
|
||||
json=author_search_result,
|
||||
)
|
||||
responses.add(responses.GET, "https://test.url/id", json=record_result)
|
||||
book = self.connector.get_or_create_book("https://test.url/id")
|
||||
self.assertEqual(book.languages[0], "Finnish")
|
bookwyrm/tests/data/default_avi_exif.jpg (new binary file, 6.6 KiB; not shown)
bookwyrm/tests/data/finna_author_search.json (new file, 1512 lines; diff suppressed because it is too large)
bookwyrm/tests/data/finna_isbn_search.json (new file, 54 lines)
|
@ -0,0 +1,54 @@
|
|||
{
|
||||
"resultCount": 1,
|
||||
"records": [
|
||||
{
|
||||
"title": "Ilmakirja : painovoimainen ilmanvaihto",
|
||||
"recordPage": "/Record/3amk.308439",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Mikkola, Juulia": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Kuuluvainen, Leino": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Böök, Netta": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Moreeni": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": {
|
||||
"Moreeni, kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"year": "2024",
|
||||
"id": "3amk.308439",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
"status": "OK"
|
||||
}
|
bookwyrm/tests/data/finna_record.json (new file, 86 lines)
|
@ -0,0 +1,86 @@
|
|||
{
|
||||
"resultCount": 1,
|
||||
"records": [
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"cleanIsbn": "9523630873",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
],
|
||||
"id": "anders.1819084",
|
||||
"languages": [
|
||||
"fin"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"381 sivua ; 22 cm"
|
||||
],
|
||||
"publishers": [
|
||||
"Kustannusosakeyhtiö Teos"
|
||||
],
|
||||
"recordPage": "/Record/anders.1819084",
|
||||
"series": [],
|
||||
"shortTitle": "Kuunpäivän kirjeet",
|
||||
"subjects": [
|
||||
[
|
||||
"Salo, Lumi",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"Soli",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"katoaminen"
|
||||
],
|
||||
[
|
||||
"etsintä"
|
||||
],
|
||||
[
|
||||
"identiteetti"
|
||||
],
|
||||
[
|
||||
"luokkaerot"
|
||||
],
|
||||
[
|
||||
"riisto"
|
||||
],
|
||||
[
|
||||
"puolisot"
|
||||
],
|
||||
[
|
||||
"sisäkertomukset"
|
||||
],
|
||||
[
|
||||
"muistikirjat"
|
||||
],
|
||||
[
|
||||
"siirtokunnat"
|
||||
],
|
||||
[
|
||||
"vaihtoehtoiset todellisuudet"
|
||||
]
|
||||
],
|
||||
"summary": [],
|
||||
"title": "Kuunpäivän kirjeet",
|
||||
"year": "2020"
|
||||
}
|
||||
],
|
||||
"status": "OK"
|
||||
}
|
bookwyrm/tests/data/finna_search.json (new file, 677 lines)
|
@ -0,0 +1,677 @@
|
|||
{
|
||||
"resultCount": 87,
|
||||
"records": [
|
||||
{
|
||||
"title": "Sarvijumala",
|
||||
"recordPage": "/Record/anders.1920022",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2023",
|
||||
"id": "anders.1920022",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Sarvijumala",
|
||||
"recordPage": "/Record/fikka.5591048",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Toiviainen, Miiko": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Otava": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Toiviainen, Miiko, lukija": {
|
||||
"role": [
|
||||
"lukija"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": {
|
||||
"Otava, kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"year": "2023",
|
||||
"id": "fikka.5591048",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Online/",
|
||||
"translated": "E-äänikirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Sarvijumala",
|
||||
"recordPage": "/Record/anders.1961374",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Takala, Tuija": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Otava, kustannusosakeyhtiö": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": {
|
||||
"Otava, kustannusosakeyhtiö, kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"year": "2024",
|
||||
"id": "anders.1961374",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Sarvijumala",
|
||||
"recordPage": "/Record/fikka.5591040",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Otava, kustannusosakeyhtiö": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": {
|
||||
"Otava, kustannusosakeyhtiö, kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"year": "2023",
|
||||
"id": "fikka.5591040",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/eBook/",
|
||||
"translated": "E-kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Sarvijumala",
|
||||
"recordPage": "/Record/keski.3324001",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Toiviainen, Miiko": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Toiviainen, Miiko, lukija": {
|
||||
"role": [
|
||||
"lukija"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2023",
|
||||
"id": "keski.3324001",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Daisy/",
|
||||
"translated": "Celia-äänikirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Kirjapaketti kouluille : Sarvijumala",
|
||||
"recordPage": "/Record/vaski.4353064",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2023",
|
||||
"id": "vaski.4353064",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Sarvijumala : Daisy-äänikirja vain lukemisesteisille",
|
||||
"recordPage": "/Record/kyyti.1536498",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Hai, Magdalena": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Toiviainen, Miiko, lukija": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2023",
|
||||
"id": "kyyti.1536498",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Daisy/",
|
||||
"translated": "Celia-äänikirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Tulta, verta, savupatsaita!. Osa 4, New Yorkin tulipätsissä Jumala antoi valokuvata Luciferin sarvet",
|
||||
"recordPage": "/Record/fikka.3958203",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Meller, Leo": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Kuva ja sana (yhtiö)": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": {
|
||||
"Kuva ja sana (yhtiö), kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"year": "2002",
|
||||
"id": "fikka.3958203",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Cassette/",
|
||||
"translated": "Kasettiäänikirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Nico Bravo and the trial of Vulcan",
|
||||
"recordPage": "/Record/helmet.2531986",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Cavallaro, Michael": {
|
||||
"role": [
|
||||
"sarjakuvantekijä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2022",
|
||||
"id": "helmet.2531986",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Nico Bravo and the cellar dwellers",
|
||||
"recordPage": "/Record/helmet.2536894",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Cavallaro, Michael": {
|
||||
"role": [
|
||||
"sarjakuvantekijä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2020",
|
||||
"id": "helmet.2536894",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Nico Bravo and the cellar dwellers",
|
||||
"recordPage": "/Record/helmet.2536962",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Cavallaro, Michael": {
|
||||
"role": [
|
||||
"sarjakuvantekijä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2020",
|
||||
"id": "helmet.2536962",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Nico Bravo and the hound of Hades",
|
||||
"recordPage": "/Record/helmet.2536893",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Cavallaro, Mike": {
|
||||
"role": [
|
||||
"sarjakuvantekijä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2019",
|
||||
"id": "helmet.2536893",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Monsters : a bestiary of the bizarre",
|
||||
"recordPage": "/Record/karelia.99700956005967",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Dell, Christopher": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2016",
|
||||
"id": "karelia.99700956005967",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Wendigo ja muita yliluonnollisia kauhukertomuksia",
|
||||
"recordPage": "/Record/anders.1471241",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Sadelehto, Markku": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Rosvall, Matti": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Sadelehto, Markku, toimittaja": {
|
||||
"role": [
|
||||
"toimittaja"
|
||||
]
|
||||
},
|
||||
"Rosvall, Matti, kääntäjä": {
|
||||
"role": [
|
||||
"kääntäjä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2015",
|
||||
"id": "anders.1471241",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Sairas kertomus",
|
||||
"recordPage": "/Record/jykdok.1506488",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Sivonen, Hannu": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Sivonen, Hannu, kirjoittaja": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2015",
|
||||
"id": "jykdok.1506488",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Kuvitteellisten olentojen kirja",
|
||||
"recordPage": "/Record/anders.891877",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Borges, Jorge Luis": {
|
||||
"role": [
|
||||
"tekijä"
|
||||
]
|
||||
},
|
||||
"Guerrero, Margarita": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Selander, Sari": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Selander, Sari, kääntäjä": {
|
||||
"role": [
|
||||
"kääntäjä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2009",
|
||||
"id": "anders.891877",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Suutele minulle siivet",
|
||||
"recordPage": "/Record/anders.201550",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Tabermann, Tommy": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "2004",
|
||||
"id": "anders.201550",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Elämänuskon kirja",
|
||||
"recordPage": "/Record/anders.910579",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Jaatinen, Sanna": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "1996",
|
||||
"id": "anders.910579",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Suomen kansan eläinkirja : kertomus Metsolan ja Ilmolan väestä ja elämästä",
|
||||
"recordPage": "/Record/anders.88891",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Railo, Eino": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "1934",
|
||||
"id": "anders.88891",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Iloinen laulu Eevasta ja Aatamista : runoja",
|
||||
"recordPage": "/Record/anders.909304",
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Ahti, Risto": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"year": "1995",
|
||||
"id": "anders.909304",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
"status": "OK"
|
||||
}
|
507  bookwyrm/tests/data/finna_versions.json  Normal file
|
@ -0,0 +1,507 @@
|
|||
{
|
||||
"resultCount": 7,
|
||||
"records": [
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"cleanIsbn": "9523630873",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
],
|
||||
"id": "anders.1819084",
|
||||
"isbns": [
|
||||
"978-952-363-087-1 sidottu"
|
||||
],
|
||||
"languages": [
|
||||
"fin"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"381 sivua ; 22 cm"
|
||||
],
|
||||
"publishers": [
|
||||
"Kustannusosakeyhtiö Teos"
|
||||
],
|
||||
"recordPage": "/Record/anders.1819084",
|
||||
"series": [],
|
||||
"shortTitle": "Kuunpäivän kirjeet",
|
||||
"subjects": [
|
||||
[
|
||||
"Salo, Lumi",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"Soli",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"katoaminen"
|
||||
],
|
||||
[
|
||||
"etsintä"
|
||||
],
|
||||
[
|
||||
"identiteetti"
|
||||
],
|
||||
[
|
||||
"luokkaerot"
|
||||
],
|
||||
[
|
||||
"riisto"
|
||||
],
|
||||
[
|
||||
"puolisot"
|
||||
],
|
||||
[
|
||||
"sisäkertomukset"
|
||||
],
|
||||
[
|
||||
"muistikirjat"
|
||||
],
|
||||
[
|
||||
"siirtokunnat"
|
||||
],
|
||||
[
|
||||
"vaihtoehtoiset todellisuudet"
|
||||
]
|
||||
],
|
||||
"summary": [],
|
||||
"title": "Kuunpäivän kirjeet",
|
||||
"year": "2020"
|
||||
},
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": []
|
||||
},
|
||||
"cleanIsbn": "1803360445",
|
||||
"edition": "First Titan edition",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
],
|
||||
"id": "anders.1906854",
|
||||
"isbns": [
|
||||
"978-1-80336-044-7 pehmeäkantinen"
|
||||
],
|
||||
"languages": [
|
||||
"eng"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"365 sivua ; 20 cm"
|
||||
],
|
||||
"publishers": [
|
||||
"Titan Books"
|
||||
],
|
||||
"recordPage": "/Record/anders.1906854",
|
||||
"series": [],
|
||||
"shortTitle": "The moonday letters",
|
||||
"subjects": [
|
||||
[
|
||||
"Salo, Lumi",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"Sol",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"katoaminen"
|
||||
],
|
||||
[
|
||||
"etsintä"
|
||||
],
|
||||
[
|
||||
"identiteetti"
|
||||
],
|
||||
[
|
||||
"luokkaerot"
|
||||
],
|
||||
[
|
||||
"riisto"
|
||||
],
|
||||
[
|
||||
"puolisot"
|
||||
],
|
||||
[
|
||||
"sisäkertomukset"
|
||||
],
|
||||
[
|
||||
"muistikirjat"
|
||||
],
|
||||
[
|
||||
"siirtokunnat"
|
||||
],
|
||||
[
|
||||
"vaihtoehtoiset todellisuudet"
|
||||
]
|
||||
],
|
||||
"summary": [
|
||||
"Sol has disappeared. Their Earth-born wife Lumi sets out to find them but it is no simple feat: each clue uncovers another enigma. Their disappearance leads back to underground environmental groups and a web of mystery that spans the space between the planets themselves. Told through letters and extracts, the course of Lumi's journey takes her not only from the affluent colonies of Mars to the devastated remnants of Earth, but into the hidden depths of Sol's past and the long-forgotten secrets of her own."
|
||||
],
|
||||
"title": "The moonday letters",
|
||||
"year": "2022"
|
||||
},
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Varjomäki, Elina": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Varjomäki, Elina, lukija": {
|
||||
"role": [
|
||||
"lukija"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Daisy/",
|
||||
"translated": "Celia-äänikirja"
|
||||
}
|
||||
],
|
||||
"id": "keski.3127858",
|
||||
"isbns": [],
|
||||
"languages": [
|
||||
"fin"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"1 CD-levy (Daisy) (11 h 41 min)"
|
||||
],
|
||||
"publishers": [
|
||||
"Celia"
|
||||
],
|
||||
"recordPage": "/Record/keski.3127858",
|
||||
"series": [],
|
||||
"shortTitle": "Kuunpäivän kirjeet",
|
||||
"subjects": [
|
||||
[
|
||||
"Salo, Lumi",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"Sol",
|
||||
"(fiktiivinen hahmo)"
|
||||
],
|
||||
[
|
||||
"katoaminen"
|
||||
],
|
||||
[
|
||||
"etsintä"
|
||||
],
|
||||
[
|
||||
"identiteetti"
|
||||
],
|
||||
[
|
||||
"luokkaerot"
|
||||
],
|
||||
[
|
||||
"riisto"
|
||||
],
|
||||
[
|
||||
"puolisot"
|
||||
],
|
||||
[
|
||||
"sisäkertomukset"
|
||||
],
|
||||
[
|
||||
"muistikirjat"
|
||||
],
|
||||
[
|
||||
"siirtokunnat"
|
||||
],
|
||||
[
|
||||
"vaihtoehtoiset todellisuudet"
|
||||
]
|
||||
],
|
||||
"summary": [],
|
||||
"title": "Kuunpäivän kirjeet",
|
||||
"year": "2020"
|
||||
},
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Varjomäki, Elina": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Teos (kustantamo)": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Varjomäki, Elina, lukija": {
|
||||
"role": [
|
||||
"lukija"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": {
|
||||
"Teos (kustantamo), kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"cleanIsbn": "9523631020",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Online/",
|
||||
"translated": "E-äänikirja"
|
||||
}
|
||||
],
|
||||
"id": "fikka.5456913",
|
||||
"isbns": [
|
||||
"978-952-363-102-1 MP3"
|
||||
],
|
||||
"languages": [
|
||||
"fin"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"1 verkkoaineisto (1 äänitiedosto (11 h 39 min))"
|
||||
],
|
||||
"publishers": [
|
||||
"Kustannusosakeyhtiö Teos"
|
||||
],
|
||||
"recordPage": "/Record/fikka.5456913",
|
||||
"series": [],
|
||||
"shortTitle": "Kuunpäivän kirjeet",
|
||||
"subjects": [],
|
||||
"summary": [],
|
||||
"title": "Kuunpäivän kirjeet",
|
||||
"year": "2020"
|
||||
},
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Kustannusosakeyhtiö Teos": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": [],
|
||||
"corporate": {
|
||||
"Kustannusosakeyhtiö Teos, kustantaja": {
|
||||
"role": [
|
||||
"kustantaja"
|
||||
]
|
||||
}
|
||||
}
|
||||
},
|
||||
"cleanIsbn": "9523631012",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/eBook/",
|
||||
"translated": "E-kirja"
|
||||
}
|
||||
],
|
||||
"id": "fikka.5458151",
|
||||
"isbns": [
|
||||
"978-952-363-101-4 EPUB"
|
||||
],
|
||||
"languages": [
|
||||
"fin"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"1 verkkoaineisto"
|
||||
],
|
||||
"publishers": [
|
||||
"Teos"
|
||||
],
|
||||
"recordPage": "/Record/fikka.5458151",
|
||||
"series": [],
|
||||
"shortTitle": "Kuunpäivän kirjeet",
|
||||
"subjects": [],
|
||||
"summary": [],
|
||||
"title": "Kuunpäivän kirjeet",
|
||||
"year": "2020"
|
||||
},
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"kirjoittaja"
|
||||
]
|
||||
},
|
||||
"Švec, Michal": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Švec, Michal, kääntäjä": {
|
||||
"role": [
|
||||
"kääntäjä"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"cleanIsbn": "807662634X",
|
||||
"edition": "Vydání první",
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/Book/",
|
||||
"translated": "Kirja"
|
||||
}
|
||||
],
|
||||
"id": "fikka.5773795",
|
||||
"isbns": [
|
||||
"978-80-7662-634-8 kovakantinen"
|
||||
],
|
||||
"languages": [
|
||||
"ces"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"332 sivua ; 21 cm"
|
||||
],
|
||||
"publishers": [
|
||||
"Kniha Zlin"
|
||||
],
|
||||
"recordPage": "/Record/fikka.5773795",
|
||||
"series": [],
|
||||
"shortTitle": "Dopisy měsíčního dne",
|
||||
"subjects": [],
|
||||
"summary": [],
|
||||
"title": "Dopisy měsíčního dne",
|
||||
"year": "2024"
|
||||
},
|
||||
{
|
||||
"authors": {
|
||||
"primary": {
|
||||
"Itäranta, Emmi": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
},
|
||||
"Varjomäki, Elina": {
|
||||
"role": [
|
||||
"-"
|
||||
]
|
||||
}
|
||||
},
|
||||
"secondary": {
|
||||
"Varjomäki, Elina , lukija": {
|
||||
"role": [
|
||||
"lukija"
|
||||
]
|
||||
}
|
||||
},
|
||||
"corporate": []
|
||||
},
|
||||
"formats": [
|
||||
{
|
||||
"value": "0/Book/",
|
||||
"translated": "Kirja"
|
||||
},
|
||||
{
|
||||
"value": "1/Book/AudioBook/",
|
||||
"translated": "Äänikirja"
|
||||
},
|
||||
{
|
||||
"value": "2/Book/AudioBook/Daisy/",
|
||||
"translated": "Celia-äänikirja"
|
||||
}
|
||||
],
|
||||
"id": "ratamo.2045073",
|
||||
"isbns": [],
|
||||
"languages": [
|
||||
"fin"
|
||||
],
|
||||
"physicalDescriptions": [
|
||||
"1 CD-äänilevy (MP3) ( 11 h 41 min)"
|
||||
],
|
||||
"publishers": [
|
||||
"Celia"
|
||||
],
|
||||
"recordPage": "/Record/ratamo.2045073",
|
||||
"series": [],
|
||||
"shortTitle": "Kuunpäivän kirjeet",
|
||||
"subjects": [],
|
||||
"summary": [],
|
||||
"title": "Kuunpäivän kirjeet",
|
||||
"year": "2020"
|
||||
}
|
||||
],
|
||||
"status": "OK"
|
||||
}
|
|
@@ -86,7 +86,7 @@
     "id": "https://www.example.com/book/2",
     "type": "Edition",
     "openlibraryKey": "OL680025M",
-    "title": "Seeking Like A State",
+    "title": "Seeing Like A State",
     "sortTitle": "seeing like a state",
     "subtitle": "",
     "description": "<p>Examines how (sometimes quasi-) authoritarian high-modernist planning fails to deliver the goods, be they increased resources for the state or a better life for the people.</p>",
46  bookwyrm/tests/importers/test_bookwyrm_user_import.py  Normal file
@@ -0,0 +1,46 @@
""" testing bookwyrm user import """
from unittest.mock import patch
from django.test import TestCase
from bookwyrm import models
from bookwyrm.importers import BookwyrmImporter


class BookwyrmUserImport(TestCase):
    """importing from BookWyrm user import"""

    def setUp(self):
        """setting stuff up"""
        with (
            patch("bookwyrm.suggested_users.rerank_suggestions_task.delay"),
            patch("bookwyrm.activitystreams.populate_stream_task.delay"),
            patch("bookwyrm.lists_stream.populate_lists_task.delay"),
            patch("bookwyrm.suggested_users.rerank_user_task.delay"),
        ):
            self.user = models.User.objects.create_user(
                "mouse", "mouse@mouse.mouse", "password", local=True, localname="mouse"
            )

    def test_create_retry_job(self):
        """test retrying a user import"""

        job = models.bookwyrm_import_job.BookwyrmImportJob.objects.create(
            user=self.user, required=[]
        )

        job.complete_job()
        self.assertEqual(job.retry, False)
        self.assertEqual(
            models.bookwyrm_import_job.BookwyrmImportJob.objects.count(), 1
        )

        # retry the job
        importer = BookwyrmImporter()
        importer.create_retry_job(user=self.user, original_job=job)

        retry_job = models.bookwyrm_import_job.BookwyrmImportJob.objects.last()

        self.assertEqual(
            models.bookwyrm_import_job.BookwyrmImportJob.objects.count(), 2
        )
        self.assertEqual(retry_job.retry, True)
        self.assertNotEqual(job.id, retry_job.id)
|
|
@ -63,6 +63,7 @@ class GoodreadsImport(TestCase):
|
|||
self.assertEqual(import_items[0].data["Book Id"], "42036538")
|
||||
self.assertEqual(import_items[0].normalized_data["isbn_13"], '="9781250313195"')
|
||||
self.assertEqual(import_items[0].normalized_data["isbn_10"], '="1250313198"')
|
||||
self.assertEqual(import_items[0].normalized_data["goodreads_key"], "42036538")
|
||||
|
||||
self.assertEqual(import_items[1].index, 1)
|
||||
self.assertEqual(import_items[1].data["Book Id"], "52691223")
|
||||
|
|
|
@ -1,6 +1,7 @@
|
|||
""" testing import """
|
||||
from collections import namedtuple
|
||||
import pathlib
|
||||
import io
|
||||
from unittest.mock import patch
|
||||
import datetime
|
||||
|
||||
|
@ -159,22 +160,11 @@ class GenericImporter(TestCase):
|
|||
|
||||
def test_complete_job(self, *_):
|
||||
"""test notification"""
|
||||
import_job = self.importer.create_job(
|
||||
self.local_user, self.csv, False, "unlisted"
|
||||
)
|
||||
items = import_job.items.all()
|
||||
for item in items[:3]:
|
||||
item.fail_reason = "hello"
|
||||
item.save()
|
||||
item.update_job()
|
||||
self.assertFalse(
|
||||
models.Notification.objects.filter(
|
||||
user=self.local_user,
|
||||
related_import=import_job,
|
||||
notification_type="IMPORT",
|
||||
).exists()
|
||||
)
|
||||
|
||||
# csv content not important
|
||||
csv = io.StringIO("title,author_text,remote_id\nbeep,boop,blurp")
|
||||
import_job = self.importer.create_job(self.local_user, csv, False, "unlisted")
|
||||
items = import_job.items.all()
|
||||
item = items.last()
|
||||
item.fail_reason = "hello"
|
||||
item.save()
|
||||
|
|
34  bookwyrm/tests/management/test_add_finna_connector.py  Normal file
@@ -0,0 +1,34 @@
""" test the add_finna_connector management command """
from django.test import TestCase

from bookwyrm.models import Connector
from bookwyrm.management.commands import add_finna_connector


class InitDB(TestCase):
    """Add/remove finna connector"""

    def test_adding_connector(self):
        """enable the Finna connector"""
        add_finna_connector.enable_finna_connector()
        self.assertTrue(
            Connector.objects.filter(identifier="api.finna.fi", active=True).exists()
        )

    def test_command_no_args(self):
        """command line calls"""
        command = add_finna_connector.Command()
        command.handle()
        self.assertTrue(
            Connector.objects.filter(identifier="api.finna.fi", active=True).exists()
        )

    def test_command_with_args(self):
        """command line calls"""
        command = add_finna_connector.Command()
        command.handle(deactivate=True)

        # everything should have been cleaned
        self.assertFalse(
            Connector.objects.filter(identifier="api.finna.fi", active=True).exists()
        )
|
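# For reference, a minimal sketch of invoking the management command exercised by the
# tests above through Django's standard call_command API. The command name comes from
# the module path bookwyrm/management/commands/add_finna_connector.py; the "deactivate"
# option is an assumption, inferred from the handle(deactivate=True) call in the test.
from django.core.management import call_command

call_command("add_finna_connector")  # creates/activates the api.finna.fi connector
call_command("add_finna_connector", deactivate=True)  # assumed option: deactivates it again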
|
@ -1,19 +1,21 @@
|
|||
""" testing models """
|
||||
|
||||
import json
|
||||
import os
|
||||
import pathlib
|
||||
from unittest.mock import patch
|
||||
|
||||
from django.core.files import File
|
||||
from django.db.models import Q
|
||||
from django.utils.dateparse import parse_datetime
|
||||
from django.test import TestCase
|
||||
|
||||
from bookwyrm import models
|
||||
from bookwyrm import activitypub, models
|
||||
from bookwyrm.utils.tar import BookwyrmTarFile
|
||||
from bookwyrm.models import bookwyrm_import_job
|
||||
|
||||
|
||||
class BookwyrmImport(TestCase):
|
||||
class BookwyrmImport(TestCase): # pylint: disable=too-many-public-methods
|
||||
"""testing user import functions"""
|
||||
|
||||
def setUp(self):
|
||||
|
@ -49,8 +51,9 @@ class BookwyrmImport(TestCase):
|
|||
"badger",
|
||||
"badger@badger.badger",
|
||||
"password",
|
||||
local=True,
|
||||
local=False,
|
||||
localname="badger",
|
||||
remote_id="badger@remote.remote",
|
||||
)
|
||||
|
||||
self.work = models.Work.objects.create(title="Sand Talk")
|
||||
|
@ -71,8 +74,14 @@ class BookwyrmImport(TestCase):
|
|||
with open(self.json_file, "r", encoding="utf-8") as jsonfile:
|
||||
self.json_data = json.loads(jsonfile.read())
|
||||
|
||||
self.archive_file = pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/bookwyrm_account_export.tar.gz"
|
||||
self.archive_file_path = os.path.relpath(
|
||||
pathlib.Path(__file__).parent.joinpath(
|
||||
"../data/bookwyrm_account_export.tar.gz"
|
||||
)
|
||||
)
|
||||
|
||||
self.job = bookwyrm_import_job.BookwyrmImportJob.objects.create(
|
||||
user=self.local_user, required=[]
|
||||
)
|
||||
|
||||
def test_update_user_profile(self):
|
||||
|
@ -84,7 +93,7 @@ class BookwyrmImport(TestCase):
|
|||
patch("bookwyrm.suggested_users.rerank_user_task.delay"),
|
||||
):
|
||||
with (
|
||||
open(self.archive_file, "rb") as fileobj,
|
||||
open(self.archive_file_path, "rb") as fileobj,
|
||||
BookwyrmTarFile.open(mode="r:gz", fileobj=fileobj) as tarfile,
|
||||
):
|
||||
models.bookwyrm_import_job.update_user_profile(
|
||||
|
@ -195,8 +204,14 @@ class BookwyrmImport(TestCase):
|
|||
|
||||
self.assertTrue(self.local_user.saved_lists.filter(id=book_list.id).exists())
|
||||
|
||||
def test_upsert_follows(self):
|
||||
"""Test take a list of remote ids and add as follows"""
|
||||
def test_follow_relationship(self):
|
||||
"""Test take a remote ID and create a follow"""
|
||||
|
||||
task = bookwyrm_import_job.UserRelationshipImport.objects.create(
|
||||
parent_job=self.job,
|
||||
relationship="follow",
|
||||
remote_id="https://blah.blah/user/rat",
|
||||
)
|
||||
|
||||
before_follow = models.UserFollows.objects.filter(
|
||||
user_subject=self.local_user, user_object=self.rat_user
|
||||
|
@ -208,18 +223,168 @@ class BookwyrmImport(TestCase):
|
|||
patch("bookwyrm.activitystreams.add_user_statuses_task.delay"),
|
||||
patch("bookwyrm.lists_stream.add_user_lists_task.delay"),
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch("bookwyrm.activitypub.resolve_remote_id", return_value=self.rat_user),
|
||||
):
|
||||
models.bookwyrm_import_job.upsert_follows(
|
||||
self.local_user, self.json_data.get("follows")
|
||||
)
|
||||
|
||||
bookwyrm_import_job.import_user_relationship_task(child_id=task.id)
|
||||
|
||||
after_follow = models.UserFollows.objects.filter(
|
||||
user_subject=self.local_user, user_object=self.rat_user
|
||||
).exists()
|
||||
self.assertTrue(after_follow)
|
||||
|
||||
def test_upsert_user_blocks(self):
|
||||
"""test adding blocked users"""
|
||||
def test_import_book_task_existing_author(self):
|
||||
"""Test importing a book with an author
|
||||
already known to the server does not overwrite"""
|
||||
|
||||
self.assertEqual(models.Author.objects.count(), 0)
|
||||
models.Author.objects.create(
|
||||
id=1,
|
||||
name="James C. Scott",
|
||||
wikipedia_link="https://en.wikipedia.org/wiki/James_C._Scott",
|
||||
wikidata="Q3025403",
|
||||
aliases=["Test Alias"],
|
||||
)
|
||||
|
||||
with open(self.archive_file_path, "rb") as fileobj:
|
||||
self.job.archive_file = File(fileobj)
|
||||
self.job.save()
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job, book_data=self.json_data.get("books")[0]
|
||||
)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
|
||||
# run the task
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
self.assertTrue(models.Edition.objects.filter(isbn_13="9780300070163").exists())
|
||||
self.assertEqual(models.Edition.objects.count(), 2)
|
||||
|
||||
# Check the existing author did not get overwritten
|
||||
author = models.Author.objects.first()
|
||||
self.assertEqual(author.name, "James C. Scott")
|
||||
self.assertIn(author.aliases[0], "Test Alias")
|
||||
|
||||
def test_import_book_task_existing_edition(self):
|
||||
"""Test importing a book with an edition
|
||||
already known to the server does not overwrite"""
|
||||
|
||||
with open(self.archive_file_path, "rb") as fileobj:
|
||||
self.job.archive_file = File(fileobj)
|
||||
self.job.save()
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job, book_data=self.json_data.get("books")[1]
|
||||
)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
self.assertTrue(models.Edition.objects.filter(isbn_13="9780062975645").exists())
|
||||
|
||||
# run the task
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
# Check the existing Edition did not get overwritten
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
self.assertEqual(models.Edition.objects.first().title, "Sand Talk")
|
||||
|
||||
def test_import_book_task_existing_work(self):
|
||||
"""Test importing a book with a work unknown to the server"""
|
||||
|
||||
with open(self.archive_file_path, "rb") as fileobj:
|
||||
self.job.archive_file = File(fileobj)
|
||||
self.job.save()
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job, book_data=self.json_data.get("books")[1]
|
||||
)
|
||||
|
||||
self.assertEqual(models.Work.objects.count(), 1)
|
||||
|
||||
# run the task
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
# Check the existing Work did not get overwritten
|
||||
self.assertEqual(models.Work.objects.count(), 1)
|
||||
self.assertNotEqual(
|
||||
self.json_data.get("books")[1]["work"]["title"], models.Work.objects.first()
|
||||
)
|
||||
|
||||
def test_import_book_task_new_author(self):
|
||||
"""Test importing a book with author not known
|
||||
to the server imports the new author"""
|
||||
|
||||
with open(self.archive_file_path, "rb") as fileobj:
|
||||
self.job.archive_file = File(fileobj)
|
||||
self.job.save()
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job, book_data=self.json_data.get("books")[0]
|
||||
)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
|
||||
# run the task
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
self.assertTrue(models.Edition.objects.filter(isbn_13="9780300070163").exists())
|
||||
self.assertEqual(models.Edition.objects.count(), 2)
|
||||
|
||||
# Check the author was created
|
||||
author = models.Author.objects.get()
|
||||
self.assertEqual(author.name, "James C. Scott")
|
||||
self.assertIn(author.aliases[0], "James Campbell Scott")
|
||||
|
||||
def test_import_book_task_new_edition(self):
|
||||
"""Test importing a book with an edition
|
||||
unknown to the server"""
|
||||
|
||||
with open(self.archive_file_path, "rb") as fileobj:
|
||||
self.job.archive_file = File(fileobj)
|
||||
self.job.save()
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job, book_data=self.json_data.get("books")[0]
|
||||
)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
self.assertFalse(
|
||||
models.Edition.objects.filter(isbn_13="9780300070163").exists()
|
||||
)
|
||||
|
||||
# run the task
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
# Check the Edition was added
|
||||
self.assertEqual(models.Edition.objects.count(), 2)
|
||||
self.assertEqual(models.Edition.objects.first().title, "Sand Talk")
|
||||
self.assertEqual(models.Edition.objects.last().title, "Seeing Like A State")
|
||||
self.assertTrue(models.Edition.objects.filter(isbn_13="9780300070163").exists())
|
||||
|
||||
def test_import_book_task_new_work(self):
|
||||
"""Test importing a book with a work unknown to the server"""
|
||||
|
||||
with open(self.archive_file_path, "rb") as fileobj:
|
||||
self.job.archive_file = File(fileobj)
|
||||
self.job.save()
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job, book_data=self.json_data.get("books")[0]
|
||||
)
|
||||
|
||||
self.assertEqual(models.Work.objects.count(), 1)
|
||||
|
||||
# run the task
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
# Check the Work was added
|
||||
self.assertEqual(models.Work.objects.count(), 2)
|
||||
self.assertEqual(models.Work.objects.first().title, "Sand Talk")
|
||||
self.assertEqual(models.Work.objects.last().title, "Seeing Like a State")
|
||||
|
||||
def test_block_relationship(self):
|
||||
"""test adding blocks for users"""
|
||||
|
||||
task = bookwyrm_import_job.UserRelationshipImport.objects.create(
|
||||
parent_job=self.job,
|
||||
relationship="block",
|
||||
remote_id="https://blah.blah/user/badger",
|
||||
)
|
||||
|
||||
blocked_before = models.UserBlocks.objects.filter(
|
||||
Q(
|
||||
|
@ -234,10 +399,11 @@ class BookwyrmImport(TestCase):
|
|||
patch("bookwyrm.activitystreams.remove_user_statuses_task.delay"),
|
||||
patch("bookwyrm.lists_stream.remove_user_lists_task.delay"),
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch(
|
||||
"bookwyrm.activitypub.resolve_remote_id", return_value=self.badger_user
|
||||
),
|
||||
):
|
||||
models.bookwyrm_import_job.upsert_user_blocks(
|
||||
self.local_user, self.json_data.get("blocks")
|
||||
)
|
||||
bookwyrm_import_job.import_user_relationship_task(child_id=task.id)
|
||||
|
||||
blocked_after = models.UserBlocks.objects.filter(
|
||||
Q(
|
||||
|
@ -248,37 +414,29 @@ class BookwyrmImport(TestCase):
|
|||
self.assertTrue(blocked_after)
|
||||
|
||||
def test_get_or_create_edition_existing(self):
|
||||
"""Test take a JSON string of books and editions,
|
||||
find or create the editions in the database and
|
||||
return a list of edition instances"""
|
||||
"""Test import existing book"""
|
||||
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job,
|
||||
book_data=self.json_data["books"][1],
|
||||
)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
|
||||
with (
|
||||
open(self.archive_file, "rb") as fileobj,
|
||||
BookwyrmTarFile.open(mode="r:gz", fileobj=fileobj) as tarfile,
|
||||
):
|
||||
bookwyrm_import_job.get_or_create_edition(
|
||||
self.json_data["books"][1], tarfile
|
||||
) # Sand Talk
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
|
||||
def test_get_or_create_edition_not_existing(self):
|
||||
"""Test take a JSON string of books and editions,
|
||||
find or create the editions in the database and
|
||||
return a list of edition instances"""
|
||||
"""Test import new book"""
|
||||
|
||||
task = bookwyrm_import_job.UserImportBook.objects.create(
|
||||
parent_job=self.job,
|
||||
book_data=self.json_data["books"][0],
|
||||
)
|
||||
|
||||
self.assertEqual(models.Edition.objects.count(), 1)
|
||||
|
||||
with (
|
||||
open(self.archive_file, "rb") as fileobj,
|
||||
BookwyrmTarFile.open(mode="r:gz", fileobj=fileobj) as tarfile,
|
||||
):
|
||||
bookwyrm_import_job.get_or_create_edition(
|
||||
self.json_data["books"][0], tarfile
|
||||
) # Seeing like a state
|
||||
|
||||
bookwyrm_import_job.import_book_task(child_id=task.id)
|
||||
self.assertTrue(models.Edition.objects.filter(isbn_13="9780300070163").exists())
|
||||
self.assertEqual(models.Edition.objects.count(), 2)
|
||||
|
||||
|
@ -305,7 +463,7 @@ class BookwyrmImport(TestCase):
|
|||
|
||||
self.assertEqual(models.ReadThrough.objects.count(), 0)
|
||||
bookwyrm_import_job.upsert_readthroughs(
|
||||
readthroughs, self.local_user, self.book.id
|
||||
self.local_user, self.book.id, readthroughs
|
||||
)
|
||||
|
||||
self.assertEqual(models.ReadThrough.objects.count(), 1)
|
||||
|
@ -318,17 +476,19 @@ class BookwyrmImport(TestCase):
|
|||
self.assertEqual(models.ReadThrough.objects.first().user, self.local_user)
|
||||
|
||||
def test_get_or_create_review(self):
|
||||
"""Test get_or_create_review_status with a review"""
|
||||
"""Test upsert_status_task with a review"""
|
||||
|
||||
task = bookwyrm_import_job.UserImportPost.objects.create(
|
||||
parent_job=self.job,
|
||||
book=self.book,
|
||||
json=self.json_data["books"][0]["reviews"][0],
|
||||
status_type="review",
|
||||
)
|
||||
|
||||
self.assertEqual(models.Review.objects.filter(user=self.local_user).count(), 0)
|
||||
reviews = self.json_data["books"][0]["reviews"]
|
||||
with (
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=True),
|
||||
):
|
||||
bookwyrm_import_job.upsert_statuses(
|
||||
self.local_user, models.Review, reviews, self.book.remote_id
|
||||
)
|
||||
|
||||
with patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=True):
|
||||
bookwyrm_import_job.upsert_status_task(child_id=task.id)
|
||||
|
||||
self.assertEqual(models.Review.objects.filter(user=self.local_user).count(), 1)
|
||||
self.assertEqual(
|
||||
|
@ -354,18 +514,20 @@ class BookwyrmImport(TestCase):
|
|||
)
|
||||
|
||||
def test_get_or_create_comment(self):
|
||||
"""Test get_or_create_review_status with a comment"""
|
||||
"""Test upsert_status_task with a comment"""
|
||||
|
||||
task = bookwyrm_import_job.UserImportPost.objects.create(
|
||||
parent_job=self.job,
|
||||
book=self.book,
|
||||
json=self.json_data["books"][1]["comments"][0],
|
||||
status_type="comment",
|
||||
)
|
||||
|
||||
self.assertEqual(models.Comment.objects.filter(user=self.local_user).count(), 0)
|
||||
comments = self.json_data["books"][1]["comments"]
|
||||
|
||||
with (
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=True),
|
||||
):
|
||||
bookwyrm_import_job.upsert_statuses(
|
||||
self.local_user, models.Comment, comments, self.book.remote_id
|
||||
)
|
||||
with patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=True):
|
||||
bookwyrm_import_job.upsert_status_task(child_id=task.id)
|
||||
|
||||
self.assertEqual(models.Comment.objects.filter(user=self.local_user).count(), 1)
|
||||
self.assertEqual(
|
||||
models.Comment.objects.filter(book=self.book).first().content,
|
||||
|
@ -382,20 +544,22 @@ class BookwyrmImport(TestCase):
|
|||
)
|
||||
|
||||
def test_get_or_create_quote(self):
|
||||
"""Test get_or_create_review_status with a quote"""
|
||||
"""Test upsert_status_task with a quote"""
|
||||
|
||||
task = bookwyrm_import_job.UserImportPost.objects.create(
|
||||
parent_job=self.job,
|
||||
book=self.book,
|
||||
json=self.json_data["books"][1]["quotations"][0],
|
||||
status_type="quote",
|
||||
)
|
||||
|
||||
self.assertEqual(
|
||||
models.Quotation.objects.filter(user=self.local_user).count(), 0
|
||||
)
|
||||
quotes = self.json_data["books"][1]["quotations"]
|
||||
with (
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=True),
|
||||
):
|
||||
|
||||
bookwyrm_import_job.upsert_statuses(
|
||||
self.local_user, models.Quotation, quotes, self.book.remote_id
|
||||
)
|
||||
with patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=True):
|
||||
bookwyrm_import_job.upsert_status_task(child_id=task.id)
|
||||
|
||||
self.assertEqual(
|
||||
models.Quotation.objects.filter(user=self.local_user).count(), 1
|
||||
)
|
||||
|
@ -416,20 +580,20 @@ class BookwyrmImport(TestCase):
|
|||
)
|
||||
|
||||
def test_get_or_create_quote_unauthorized(self):
|
||||
"""Test get_or_create_review_status with a quote but not authorized"""
|
||||
"""Test upsert_status_task with a quote but not authorized"""
|
||||
|
||||
task = bookwyrm_import_job.UserImportPost.objects.create(
|
||||
parent_job=self.job,
|
||||
book=self.book,
|
||||
json=self.json_data["books"][1]["quotations"][0],
|
||||
status="quote",
|
||||
)
|
||||
|
||||
self.assertEqual(
|
||||
models.Quotation.objects.filter(user=self.local_user).count(), 0
|
||||
)
|
||||
quotes = self.json_data["books"][1]["quotations"]
|
||||
with (
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=False),
|
||||
):
|
||||
|
||||
bookwyrm_import_job.upsert_statuses(
|
||||
self.local_user, models.Quotation, quotes, self.book.remote_id
|
||||
)
|
||||
with patch("bookwyrm.models.bookwyrm_import_job.is_alias", return_value=False):
|
||||
bookwyrm_import_job.upsert_status_task(child_id=task.id)
|
||||
self.assertEqual(
|
||||
models.Quotation.objects.filter(user=self.local_user).count(), 0
|
||||
)
|
||||
|
@ -438,8 +602,6 @@ class BookwyrmImport(TestCase):
|
|||
"""Take a list and ListItems as JSON and create DB entries
|
||||
if they don't already exist"""
|
||||
|
||||
book_data = self.json_data["books"][0]
|
||||
|
||||
other_book = models.Edition.objects.create(
|
||||
title="Another Book", remote_id="https://example.com/book/9876"
|
||||
)
|
||||
|
@ -471,8 +633,8 @@ class BookwyrmImport(TestCase):
|
|||
):
|
||||
bookwyrm_import_job.upsert_lists(
|
||||
self.local_user,
|
||||
book_data["lists"],
|
||||
other_book.id,
|
||||
self.json_data["books"][0]["lists"],
|
||||
)
|
||||
|
||||
self.assertEqual(models.List.objects.filter(user=self.local_user).count(), 1)
|
||||
|
@ -488,8 +650,6 @@ class BookwyrmImport(TestCase):
|
|||
"""Take a list and ListItems as JSON and create DB entries
|
||||
if they don't already exist"""
|
||||
|
||||
book_data = self.json_data["books"][0]
|
||||
|
||||
self.assertEqual(models.List.objects.filter(user=self.local_user).count(), 0)
|
||||
self.assertFalse(models.ListItem.objects.filter(book=self.book.id).exists())
|
||||
|
||||
|
@ -499,8 +659,8 @@ class BookwyrmImport(TestCase):
|
|||
):
|
||||
bookwyrm_import_job.upsert_lists(
|
||||
self.local_user,
|
||||
book_data["lists"],
|
||||
self.book.id,
|
||||
self.json_data["books"][0]["lists"],
|
||||
)
|
||||
|
||||
self.assertEqual(models.List.objects.filter(user=self.local_user).count(), 1)
|
||||
|
@ -526,12 +686,13 @@ class BookwyrmImport(TestCase):
|
|||
book=self.book, shelf=shelf, user=self.local_user
|
||||
)
|
||||
|
||||
book_data = self.json_data["books"][0]
|
||||
with (
|
||||
patch("bookwyrm.activitystreams.add_book_statuses_task.delay"),
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
):
|
||||
bookwyrm_import_job.upsert_shelves(self.book, self.local_user, book_data)
|
||||
bookwyrm_import_job.upsert_shelves(
|
||||
self.local_user, self.book, self.json_data["books"][0].get("shelves")
|
||||
)
|
||||
|
||||
self.assertEqual(
|
||||
models.ShelfBook.objects.filter(user=self.local_user.id).count(), 2
|
||||
|
@ -545,13 +706,13 @@ class BookwyrmImport(TestCase):
|
|||
models.ShelfBook.objects.filter(user=self.local_user.id).count(), 0
|
||||
)
|
||||
|
||||
book_data = self.json_data["books"][0]
|
||||
|
||||
with (
|
||||
patch("bookwyrm.activitystreams.add_book_statuses_task.delay"),
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
):
|
||||
bookwyrm_import_job.upsert_shelves(self.book, self.local_user, book_data)
|
||||
bookwyrm_import_job.upsert_shelves(
|
||||
self.local_user, self.book, self.json_data["books"][0].get("shelves")
|
||||
)
|
||||
|
||||
self.assertEqual(
|
||||
models.ShelfBook.objects.filter(user=self.local_user.id).count(), 2
|
||||
|
@ -561,3 +722,49 @@ class BookwyrmImport(TestCase):
|
|||
self.assertEqual(
|
||||
models.Shelf.objects.filter(user=self.local_user.id).count(), 4
|
||||
)
|
||||
|
||||
def test_update_followers_address(self):
|
||||
"""test updating followers address to local"""
|
||||
|
||||
user = self.local_user
|
||||
followers = ["https://old.address/user/oldusername/followers"]
|
||||
new_followers = bookwyrm_import_job.update_followers_address(user, followers)
|
||||
|
||||
self.assertEqual(new_followers, [f"{self.local_user.remote_id}/followers"])
|
||||
|
||||
def test_is_alias(self):
|
||||
"""test checking for valid alias"""
|
||||
|
||||
self.rat_user.also_known_as.add(self.local_user)
|
||||
|
||||
with patch(
|
||||
"bookwyrm.activitypub.resolve_remote_id", return_value=self.rat_user
|
||||
):
|
||||
|
||||
alias = bookwyrm_import_job.is_alias(
|
||||
self.local_user, self.rat_user.remote_id
|
||||
)
|
||||
|
||||
self.assertTrue(alias)
|
||||
|
||||
def test_status_already_exists(self):
|
||||
"""test status checking"""
|
||||
|
||||
string = '{"id":"https://www.example.com/user/rat/comment/4","type":"Comment","published":"2023-08-14T04:48:18.746+00:00","attributedTo":"https://www.example.com/user/rat","content":"<p>this is a comment about an amazing book</p>","to":["https://www.w3.org/ns/activitystreams#Public"],"cc":["https://www.example.com/user/rat/followers"],"replies":{"id":"https://www.example.com/user/rat/comment/4/replies","type":"OrderedCollection","totalItems":0,"first":"https://www.example.com/user/rat/comment/4/replies?page=1","last":"https://www.example.com/user/rat/comment/4/replies?page=1","@context":"https://www.w3.org/ns/activitystreams"},"tag":[],"attachment":[],"sensitive":false,"inReplyToBook":"https://www.example.com/book/4","readingStatus":null,"@context":"https://www.w3.org/ns/activitystreams"}' # pylint: disable=line-too-long
|
||||
|
||||
status = json.loads(string)
|
||||
parsed = activitypub.parse(status)
|
||||
exists = bookwyrm_import_job.status_already_exists(self.local_user, parsed)
|
||||
|
||||
self.assertFalse(exists)
|
||||
|
||||
comment = models.Comment.objects.create(
|
||||
user=self.local_user, book=self.book, content="<p>hi</p>"
|
||||
)
|
||||
status_two = comment.to_activity()
|
||||
parsed_two = activitypub.parse(status_two)
|
||||
exists_two = bookwyrm_import_job.status_already_exists(
|
||||
self.local_user, parsed_two
|
||||
)
|
||||
|
||||
self.assertTrue(exists_two)
|
||||
|
|
|
@ -1,9 +1,14 @@
|
|||
""" test searching for books """
|
||||
import os
|
||||
import re
|
||||
from PIL import Image
|
||||
|
||||
from django.core.files.uploadedfile import InMemoryUploadedFile
|
||||
from django.test import TestCase
|
||||
|
||||
from bookwyrm.settings import BASE_URL
|
||||
from bookwyrm.utils import regex
|
||||
from bookwyrm.utils.images import remove_uploaded_image_exif
|
||||
from bookwyrm.utils.validate import validate_url_domain
|
||||
|
||||
|
||||
|
@ -24,3 +29,18 @@ class TestUtils(TestCase):
|
|||
self.assertIsNone(
|
||||
validate_url_domain("https://up-to-no-good.tld/bad-actor.exe")
|
||||
)
|
||||
|
||||
def test_remove_uploaded_image_exif(self):
|
||||
"""Check that EXIF data is removed from image"""
|
||||
image_path = "bookwyrm/tests/data/default_avi_exif.jpg"
|
||||
with open(image_path, "rb") as image_file:
|
||||
source = InMemoryUploadedFile(
|
||||
image_file,
|
||||
"cover",
|
||||
"default_avi_exif.jpg",
|
||||
"image/jpeg",
|
||||
os.fstat(image_file.fileno()).st_size,
|
||||
None,
|
||||
)
|
||||
sanitized_image = Image.open(remove_uploaded_image_exif(source).open())
|
||||
self.assertNotIn("exif", sanitized_image.info)
|
||||
|
|
|
@ -145,11 +145,21 @@ class ListViews(TestCase):
|
|||
def test_user_lists_page_logged_out(self):
|
||||
"""there are so many views, this just makes sure it LOADS"""
|
||||
view = views.UserLists.as_view()
|
||||
with (
|
||||
patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"),
|
||||
patch("bookwyrm.lists_stream.remove_list_task.delay"),
|
||||
):
|
||||
models.List.objects.create(name="Public list", user=self.local_user)
|
||||
models.List.objects.create(
|
||||
name="Private list", privacy="direct", user=self.local_user
|
||||
)
|
||||
request = self.factory.get("")
|
||||
request.user = self.anonymous_user
|
||||
|
||||
result = view(request, self.local_user.username)
|
||||
self.assertEqual(result.status_code, 302)
|
||||
self.assertIsInstance(result, TemplateResponse)
|
||||
validate_html(result.render())
|
||||
self.assertEqual(result.status_code, 200)
|
||||
|
||||
def test_lists_create(self):
|
||||
"""create list view"""
|
||||
|
|
|
@ -68,7 +68,7 @@ class ExportViews(TestCase):
|
|||
# pylint: disable=line-too-long
|
||||
self.assertEqual(
|
||||
export.content,
|
||||
b"title,author_text,remote_id,openlibrary_key,inventaire_id,librarything_key,goodreads_key,bnf_id,viaf,wikidata,asin,aasin,isfdb,isbn_10,isbn_13,oclc_number,start_date,finish_date,stopped_date,rating,review_name,review_cw,review_content,review_published,shelf,shelf_name,shelf_date\r\n"
|
||||
+ b"Test Book,,%b,,,,,beep,,,,,,123456789X,9781234567890,,,,,,,,,,to-read,To Read,%b\r\n"
|
||||
b"title,author_text,remote_id,openlibrary_key,finna_key,inventaire_id,librarything_key,goodreads_key,bnf_id,viaf,wikidata,asin,aasin,isfdb,isbn_10,isbn_13,oclc_number,start_date,finish_date,stopped_date,rating,review_name,review_cw,review_content,review_published,shelf,shelf_name,shelf_date\r\n"
|
||||
+ b"Test Book,,%b,,,,,,beep,,,,,,123456789X,9781234567890,,,,,,,,,,to-read,To Read,%b\r\n"
|
||||
% (self.book.remote_id.encode("utf-8"), book_date),
|
||||
)
|
||||
|
|
|
@ -70,7 +70,9 @@ class ReadingViews(TestCase):
|
|||
},
|
||||
)
|
||||
request.user = self.local_user
|
||||
with patch("bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"):
|
||||
with patch(
|
||||
"bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"
|
||||
) as mock:
|
||||
views.ReadingStatus.as_view()(request, "start", self.book.id)
|
||||
|
||||
self.assertEqual(shelf.books.get(), self.book)
|
||||
|
@ -86,6 +88,12 @@ class ReadingViews(TestCase):
|
|||
self.assertEqual(readthrough.user, self.local_user)
|
||||
self.assertEqual(readthrough.book, self.book)
|
||||
|
||||
# Three broadcast tasks:
|
||||
# 1. Create Readthrough
|
||||
# 2. Create post as pure_content (for non-BookWyrm)
|
||||
# 3. Create post with book attached - this should only happen once!
|
||||
self.assertEqual(len(mock.mock_calls), 3)
|
||||
|
||||
def test_start_reading_with_comment(self, *_):
|
||||
"""begin a book"""
|
||||
shelf = self.local_user.shelf_set.get(identifier=models.Shelf.READING)
|
||||
|
|
|
@ -132,3 +132,27 @@ class RssFeedView(TestCase):
|
|||
self.assertEqual(result.status_code, 200)
|
||||
|
||||
self.assertIn(b"a sickening sense", result.content)
|
||||
|
||||
def test_rss_shelf(self, *_):
|
||||
"""load the rss feed of a shelf"""
|
||||
with patch(
|
||||
"bookwyrm.models.activitypub_mixin.broadcast_task.apply_async"
|
||||
), patch("bookwyrm.activitystreams.add_book_statuses_task.delay"):
|
||||
# make the shelf
|
||||
shelf = models.Shelf.objects.create(
|
||||
name="Test Shelf", identifier="test-shelf", user=self.local_user
|
||||
)
|
||||
# put the shelf on the book
|
||||
models.ShelfBook.objects.create(
|
||||
book=self.book,
|
||||
shelf=shelf,
|
||||
user=self.local_user,
|
||||
)
|
||||
view = rss_feed.RssShelfFeed()
|
||||
request = self.factory.get("/user/books/test-shelf/rss")
|
||||
request.user = self.local_user
|
||||
result = view(
|
||||
request, username=self.local_user.username, shelf_identifier="test-shelf"
|
||||
)
|
||||
self.assertEqual(result.status_code, 200)
|
||||
self.assertIn(b"Example Edition", result.content)
|
||||
|
|
|
@ -434,6 +434,11 @@ urlpatterns = [
|
|||
# imports
|
||||
re_path(r"^import/?$", views.Import.as_view(), name="import"),
|
||||
re_path(r"^user-import/?$", views.UserImport.as_view(), name="user-import"),
|
||||
re_path(
|
||||
r"^user-import/(?P<job_id>\d+)/?$",
|
||||
views.UserImportStatus.as_view(),
|
||||
name="user-import-status",
|
||||
),
|
||||
re_path(
|
||||
r"^import/(?P<job_id>\d+)/?$",
|
||||
views.ImportStatus.as_view(),
|
||||
|
@ -444,6 +449,11 @@ urlpatterns = [
|
|||
views.stop_import,
|
||||
name="import-stop",
|
||||
),
|
||||
re_path(
|
||||
r"^user-import/(?P<job_id>\d+)/stop/?$",
|
||||
views.stop_user_import,
|
||||
name="user-import-stop",
|
||||
),
|
||||
re_path(
|
||||
r"^import/(?P<job_id>\d+)/retry/(?P<item_id>\d+)/?$",
|
||||
views.retry_item,
|
||||
|
@ -454,6 +464,11 @@ urlpatterns = [
|
|||
views.ImportTroubleshoot.as_view(),
|
||||
name="import-troubleshoot",
|
||||
),
|
||||
re_path(
|
||||
r"^user-import/(?P<job_id>\d+)/failed/?$",
|
||||
views.UserImportTroubleshoot.as_view(),
|
||||
name="user-import-troubleshoot",
|
||||
),
|
||||
re_path(
|
||||
r"^import/(?P<job_id>\d+)/review/?$",
|
||||
views.ImportManualReview.as_view(),
|
||||
|
@ -577,11 +592,21 @@ urlpatterns = [
|
|||
views.Shelf.as_view(),
|
||||
name="shelf",
|
||||
),
|
||||
re_path(
|
||||
rf"^{USER_PATH}/(shelf|books)/(?P<shelf_identifier>[\w-]+)/rss/?$",
|
||||
views.rss_feed.RssShelfFeed(),
|
||||
name="shelf-rss",
|
||||
),
|
||||
re_path(
|
||||
rf"^{LOCAL_USER_PATH}/(books|shelf)/(?P<shelf_identifier>[\w-]+)(.json)?/?$",
|
||||
views.Shelf.as_view(),
|
||||
name="shelf",
|
||||
),
|
||||
re_path(
|
||||
rf"^{LOCAL_USER_PATH}/(books|shelf)/(?P<shelf_identifier>[\w-]+)/rss/?$",
|
||||
views.rss_feed.RssShelfFeed(),
|
||||
name="shelf-rss",
|
||||
),
|
||||
re_path(r"^create-shelf/?$", views.create_shelf, name="shelf-create"),
|
||||
re_path(r"^delete-shelf/(?P<shelf_id>\d+)/?$", views.delete_shelf),
|
||||
re_path(r"^shelve/?$", views.shelve),
|
||||
|
|
27  bookwyrm/utils/images.py  Normal file
@@ -0,0 +1,27 @@
""" Image utilities """

from io import BytesIO
from PIL import Image
from django.core.files.uploadedfile import InMemoryUploadedFile


def remove_uploaded_image_exif(source: InMemoryUploadedFile) -> InMemoryUploadedFile:
    """Removes EXIF data from provided image and returns a sanitized copy"""
    io = BytesIO()
    with Image.open(source) as image:
        if "exif" in image.info:
            del image.info["exif"]

        if image.format == "JPEG":
            image.save(io, format=image.format, quality="keep")
        else:
            image.save(io, format=image.format)

    return InMemoryUploadedFile(
        io,
        source.field_name,
        source.name,
        source.content_type,
        len(io.getvalue()),
        source.charset,
    )
|
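# A minimal usage sketch for the helper above, mirroring the EXIF test added in this
# commit (bookwyrm/tests/data/default_avi_exif.jpg is the fixture it ships with); the
# "cover" field name and file name here are illustrative only.
import os
from django.core.files.uploadedfile import InMemoryUploadedFile
from PIL import Image

from bookwyrm.utils.images import remove_uploaded_image_exif

with open("bookwyrm/tests/data/default_avi_exif.jpg", "rb") as image_file:
    upload = InMemoryUploadedFile(
        image_file,
        "cover",
        "default_avi_exif.jpg",
        "image/jpeg",
        os.fstat(image_file.fileno()).st_size,
        None,
    )
    sanitized = remove_uploaded_image_exif(upload)
    # the sanitized copy no longer carries EXIF metadata
    assert "exif" not in Image.open(sanitized.open()).info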
|
@ -85,10 +85,17 @@ from .shelf.shelf import Shelf
|
|||
from .shelf.shelf_actions import create_shelf, delete_shelf
|
||||
from .shelf.shelf_actions import shelve, unshelve
|
||||
|
||||
# csv import
|
||||
from .imports.import_data import Import, UserImport
|
||||
from .imports.import_status import ImportStatus, retry_item, stop_import
|
||||
# csv and user import
|
||||
from .imports.import_data import Import, UserImport, user_import_available
|
||||
from .imports.import_status import (
|
||||
ImportStatus,
|
||||
UserImportStatus,
|
||||
retry_item,
|
||||
stop_import,
|
||||
stop_user_import,
|
||||
)
|
||||
from .imports.troubleshoot import ImportTroubleshoot
|
||||
from .imports.user_troubleshoot import UserImportTroubleshoot
|
||||
from .imports.manually_review import (
|
||||
ImportManualReview,
|
||||
approve_import_item,
|
||||
|
|
|
@ -7,6 +7,7 @@ from django.template.response import TemplateResponse
|
|||
from django.utils.decorators import method_decorator
|
||||
from django.views import View
|
||||
from django.views.decorators.http import require_POST
|
||||
from django.views.decorators.vary import vary_on_headers
|
||||
|
||||
from bookwyrm import forms, models
|
||||
from bookwyrm.activitypub import ActivitypubResponse
|
||||
|
@ -24,6 +25,7 @@ class Author(View):
|
|||
"""this person wrote a book"""
|
||||
|
||||
# pylint: disable=unused-argument
|
||||
@vary_on_headers("Accept")
|
||||
def get(self, request, author_id, slug=None):
|
||||
"""landing page for an author"""
|
||||
author = get_mergeable_object_or_404(models.Author, id=author_id)
|
||||
|
|
|
@ -10,12 +10,14 @@ from django.shortcuts import get_object_or_404, redirect
|
|||
from django.template.response import TemplateResponse
|
||||
from django.views import View
|
||||
from django.views.decorators.http import require_POST
|
||||
from django.views.decorators.vary import vary_on_headers
|
||||
|
||||
from bookwyrm import forms, models
|
||||
from bookwyrm.activitypub import ActivitypubResponse
|
||||
from bookwyrm.connectors import connector_manager, ConnectorException
|
||||
from bookwyrm.connectors.abstract_connector import get_image
|
||||
from bookwyrm.settings import PAGE_LENGTH
|
||||
from bookwyrm.utils.images import remove_uploaded_image_exif
|
||||
from bookwyrm.views.helpers import (
|
||||
is_api_request,
|
||||
maybe_redirect_local_path,
|
||||
|
@ -27,6 +29,7 @@ from bookwyrm.views.helpers import (
|
|||
class Book(View):
|
||||
"""a book! this is the stuff"""
|
||||
|
||||
@vary_on_headers("Accept")
|
||||
def get(self, request, book_id, **kwargs):
|
||||
"""info about a book"""
|
||||
if is_api_request(request):
|
||||
|
@ -156,7 +159,7 @@ def upload_cover(request, book_id):
|
|||
if not form.is_valid() or not form.files.get("cover"):
|
||||
return redirect(book.local_path)
|
||||
|
||||
book.cover = form.files["cover"]
|
||||
book.cover = remove_uploaded_image_exif(form.files["cover"])
|
||||
book.save()
|
||||
|
||||
return redirect(book.local_path)
|
||||
|
|
|
@ -1,6 +1,7 @@
|
|||
""" the good stuff! the books! """
|
||||
|
||||
from re import sub, findall
|
||||
|
||||
from django.contrib.auth.decorators import login_required, permission_required
|
||||
from django.contrib.postgres.search import SearchRank, SearchVector
|
||||
from django.db import transaction
|
||||
|
@ -12,6 +13,7 @@ from django.views.decorators.http import require_POST
|
|||
from django.views import View
|
||||
|
||||
from bookwyrm import book_search, forms, models
|
||||
from bookwyrm.utils.images import remove_uploaded_image_exif
|
||||
|
||||
# from bookwyrm.activitypub.base_activity import ActivityObject
|
||||
from bookwyrm.utils.isni import (
|
||||
|
@ -71,6 +73,8 @@ class EditBook(View):
|
|||
image = set_cover_from_url(url)
|
||||
if image:
|
||||
book.cover.save(*image, save=False)
|
||||
elif "cover" in form.files:
|
||||
book.cover = remove_uploaded_image_exif(form.files["cover"])
|
||||
|
||||
book.save()
|
||||
return redirect(f"/book/{book.id}")
|
||||
|
@ -142,6 +146,8 @@ class CreateBook(View):
|
|||
image = set_cover_from_url(url)
|
||||
if image:
|
||||
book.cover.save(*image, save=False)
|
||||
elif "cover" in form.files:
|
||||
book.cover = remove_uploaded_image_exif(form.files["cover"])
|
||||
|
||||
book.save()
|
||||
return redirect(f"/book/{book.id}")
|
||||
|
@ -311,6 +317,8 @@ class ConfirmEditBook(View):
|
|||
image = set_cover_from_url(url)
|
||||
if image:
|
||||
book.cover.save(*image, save=False)
|
||||
elif "cover" in form.files:
|
||||
book.cover = remove_uploaded_image_exif(form.files["cover"])
|
||||
|
||||
# we don't tell the world when creating a book
|
||||
book.save(broadcast=False)
|
||||
|
|
|
@ -12,6 +12,7 @@ from django.shortcuts import redirect
|
|||
from django.template.response import TemplateResponse
|
||||
from django.views import View
|
||||
from django.views.decorators.http import require_POST
|
||||
from django.views.decorators.vary import vary_on_headers
|
||||
|
||||
from bookwyrm import forms, models
|
||||
from bookwyrm.activitypub import ActivitypubResponse
|
||||
|
@ -23,6 +24,7 @@ from bookwyrm.views.helpers import is_api_request, get_mergeable_object_or_404
|
|||
class Editions(View):
|
||||
"""list of editions"""
|
||||
|
||||
@vary_on_headers("Accept")
|
||||
def get(self, request, book_id):
|
||||
"""list of editions of a book"""
|
||||
work = get_mergeable_object_or_404(models.Work, id=book_id)
|
||||
|
|
|
@ -3,6 +3,7 @@
|
|||
from sys import float_info
|
||||
from django.views import View
|
||||
from django.template.response import TemplateResponse
|
||||
from django.views.decorators.vary import vary_on_headers
|
||||
|
||||
from bookwyrm.views.helpers import is_api_request, get_mergeable_object_or_404
|
||||
from bookwyrm import models
|
||||
|
@ -20,6 +21,7 @@ def sort_by_series(book):
|
|||
class BookSeriesBy(View):
|
||||
"""book series by author"""
|
||||
|
||||
@vary_on_headers("Accept")
|
||||
def get(self, request, author_id):
|
||||
"""lists all books in a series"""
|
||||
series_name = request.GET.get("series_name")
|
||||
|
|
|
@ -9,6 +9,7 @@ from django.template.response import TemplateResponse
|
|||
from django.utils import timezone
|
||||
from django.utils.decorators import method_decorator
|
||||
from django.views import View
|
||||
from django.views.decorators.vary import vary_on_headers
|
||||
|
||||
from bookwyrm import activitystreams, forms, models
|
||||
from bookwyrm.models.user import FeedFilterChoices
|
||||
|
@ -130,6 +131,7 @@ class Status(View):
|
|||
"""get posting"""
|
||||
|
||||
# pylint: disable=unused-argument
|
||||
@vary_on_headers("Accept")
|
||||
def get(self, request, username, status_id, slug=None):
|
||||
"""display a particular status (and replies, etc)"""
|
||||
user = get_user_from_username(request.user, username)
|
||||
|
@ -217,6 +219,7 @@ class Status(View):
|
|||
class Replies(View):
|
||||
"""replies page (a json view of status)"""
|
||||
|
||||
@vary_on_headers("Accept")
|
||||
def get(self, request, username, status_id):
|
||||
"""ordered collection of replies to a status"""
|
||||
# the html view is the same as Status
|
||||
|
|
|
@ -156,8 +156,7 @@ def handle_reading_status(user, shelf, book, privacy):
|
|||
# it's a non-standard shelf, don't worry about it
|
||||
return
|
||||
|
||||
status = create_generated_note(user, message, mention_books=[book], privacy=privacy)
|
||||
status.save()
|
||||
create_generated_note(user, message, mention_books=[book], privacy=privacy)
|
||||
|
||||
|
||||
def load_date_in_user_tz_as_utc(date_str: str, user: models.User) -> datetime:
|
||||
|
|
|
@@ -1,6 +1,7 @@
""" import books from another app """
from io import TextIOWrapper
import datetime
from typing import Optional

from django.contrib.auth.decorators import login_required
from django.db.models import Avg, ExpressionWrapper, F, fields

@@ -152,36 +153,35 @@ class UserImport(View):
        jobs = BookwyrmImportJob.objects.filter(user=request.user).order_by(
            "-created_date"
        )
        site = models.SiteSettings.objects.get()
        hours = site.user_import_time_limit
        allowed = (
            jobs.first().created_date < timezone.now() - datetime.timedelta(hours=hours)
            if jobs.first()
            else True
        )
        next_available = (
            jobs.first().created_date + datetime.timedelta(hours=hours)
            if not allowed
            else False
        )
        paginated = Paginator(jobs, PAGE_LENGTH)
        page = paginated.get_page(request.GET.get("page"))
        data = {
            "import_form": forms.ImportUserForm(),
            "jobs": page,
            "user_import_hours": hours,
            "next_available": next_available,
            "next_available": user_import_available(user=request.user),
            "page_range": paginated.get_elided_page_range(
                page.number, on_each_side=2, on_ends=1
            ),
            "invalid": invalid,
        }

        seconds = get_or_set(
            "avg-user-import-time", get_average_user_import_time, timeout=86400
        )
        if seconds and seconds > 60**2:
            data["recent_avg_hours"] = seconds / (60**2)
        elif seconds:
            data["recent_avg_minutes"] = seconds / 60

        return TemplateResponse(request, "import/import_user.html", data)

    def post(self, request):
        """ingest a Bookwyrm json file"""

        site = models.SiteSettings.objects.get()
        if not site.imports_enabled:
            raise PermissionDenied()

        importer = BookwyrmImporter()

        form = forms.ImportUserForm(request.POST, request.FILES)
@@ -197,3 +197,45 @@ class UserImport(View):
        job.start_job()

        return redirect("user-import")


def user_import_available(user: models.User) -> Optional[tuple[datetime, int]]:
    """for a given user, determine whether they are allowed to run
    a user import and if not, return a tuple with the next available
    time they can import, and how many hours between imports allowed"""

    jobs = BookwyrmImportJob.objects.filter(user=user).order_by("-created_date")
    site = models.SiteSettings.objects.get()
    hours = site.user_import_time_limit
    allowed = (
        jobs.first().created_date < timezone.now() - datetime.timedelta(hours=hours)
        if jobs.first()
        else True
    )
    if allowed and site.imports_enabled:
        return False

    return (jobs.first().created_date + datetime.timedelta(hours=hours), hours)


def get_average_user_import_time() -> float:
    """Helper to figure out how long imports are taking (returns seconds)"""
    last_week = timezone.now() - datetime.timedelta(days=7)
    recent_avg = (
        models.BookwyrmImportJob.objects.filter(
            created_date__gte=last_week, complete=True
        )
        .exclude(status="stopped")
        .annotate(
            runtime=ExpressionWrapper(
                F("updated_date") - F("created_date"),
                output_field=fields.DurationField(),
            )
        )
        .aggregate(Avg("runtime"))
        .get("runtime__avg")
    )

    if recent_avg:
        return recent_avg.total_seconds()
    return None
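A short usage sketch, not part of the diff, showing how a caller reads the two return shapes of the new user_import_available helper; request is assumed to be an authenticated Django request and the prints are placeholders for whatever the caller does with the result:

next_available = user_import_available(user=request.user)
if not next_available:
    # falsy: the user may start an import right now
    print("user import allowed")
else:
    # otherwise: (next allowed datetime, hours required between imports)
    wait_until, limit_hours = next_available
    print(f"next import possible at {wait_until} (one import every {limit_hours}h)")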
@@ -83,3 +83,79 @@ def stop_import(request, job_id):
    job = get_object_or_404(models.ImportJob, id=job_id, user=request.user)
    job.stop_job()
    return redirect("import-status", job_id)


# pylint: disable= no-self-use
@method_decorator(login_required, name="dispatch")
class UserImportStatus(View):
    """status of an existing import"""

    def get(self, request, job_id):
        """status of an import job"""
        job = get_object_or_404(models.BookwyrmImportJob, id=job_id)
        if job.user != request.user:
            raise PermissionDenied()

        jobs = job.book_tasks.all().order_by("created_date")
        item_count = job.item_count or 1

        paginated = Paginator(jobs, PAGE_LENGTH)
        page = paginated.get_page(request.GET.get("page"))

        book_jobs_count = job.book_tasks.count() or "(pending...)"
        if job.complete and not job.book_tasks.count():
            book_jobs_count = 0

        status_jobs_count = job.status_tasks.count() or "(pending...)"
        if job.complete and not job.status_tasks.count():
            status_jobs_count = 0

        relationship_jobs_count = job.relationship_tasks.count() or "(pending...)"
        if job.complete and not job.relationship_tasks.count():
            relationship_jobs_count = 0

        data = {
            "job": job,
            "items": page,
            "completed_books_count": job.book_tasks.filter(status="complete").count()
            or 0,
            "completed_statuses_count": job.status_tasks.filter(
                status="complete"
            ).count()
            or 0,
            "completed_relationships_count": job.relationship_tasks.filter(
                status="complete"
            ).count()
            or 0,
            "failed_books_count": job.book_tasks.filter(status="failed").count() or 0,
            "failed_statuses_count": job.status_tasks.filter(status="failed").count()
            or 0,
            "failed_relationships_count": job.relationship_tasks.filter(
                status="failed"
            ).count()
            or 0,
            "fail_count": job.child_jobs.filter(status="failed").count(),
            "book_jobs_count": book_jobs_count,
            "status_jobs_count": status_jobs_count,
            "relationship_jobs_count": relationship_jobs_count,
            "page_range": paginated.get_elided_page_range(
                page.number, on_each_side=2, on_ends=1
            ),
            "show_progress": True,
            "item_count": item_count,
            "complete_count": item_count - job.pending_item_count,
            "percent": job.percent_complete,
            # hours since last import item update
            "inactive_time": (job.updated_date - timezone.now()).seconds / 60 / 60,
        }

        return TemplateResponse(request, "import/user_import_status.html", data)


@login_required
@require_POST
def stop_user_import(request, job_id):
    """scrap that"""
    job = get_object_or_404(models.BookwyrmImportJob, id=job_id, user=request.user)
    job.stop_job()
    return redirect("user-import-status", job_id)
bookwyrm/views/imports/user_troubleshooting.py (new file, 50 lines)
@@ -0,0 +1,50 @@
""" import books from another app """
from django.contrib.auth.decorators import login_required
from django.core.exceptions import PermissionDenied
from django.core.paginator import Paginator
from django.shortcuts import get_object_or_404, redirect
from django.template.response import TemplateResponse
from django.utils.decorators import method_decorator
from django.urls import reverse
from django.views import View

from bookwyrm import models
from bookwyrm.importers import BookwyrmImporter
from bookwyrm.views import user_import_available
from bookwyrm.settings import PAGE_LENGTH

# pylint: disable= no-self-use
@method_decorator(login_required, name="dispatch")
class UserImportTroubleshoot(View):
    """failed items in an existing user import"""

    def get(self, request, job_id):
        """status of an import job"""
        job = get_object_or_404(models.BookwyrmImportJob, id=job_id)
        if job.user != request.user:
            raise PermissionDenied()

        items = job.child_jobs.order_by("task_id").filter(status="failed")
        paginated = Paginator(items, PAGE_LENGTH)
        page = paginated.get_page(request.GET.get("page"))
        data = {
            "next_available": user_import_available(user=request.user),
            "job": job,
            "items": page,
            "page_range": paginated.get_elided_page_range(
                page.number, on_each_side=2, on_ends=1
            ),
            "complete": True,
            "page_path": reverse("user-import-troubleshoot", args=[job.id]),
        }

        return TemplateResponse(request, "import/user_troubleshoot.html", data)

    def post(self, request, job_id):
        """retry lines from a user import"""
        job = get_object_or_404(models.BookwyrmImportJob, id=job_id)

        importer = BookwyrmImporter()
        job = importer.create_retry_job(request.user, job)
        job.start_job()
        return redirect(f"/user-import/{job.id}")
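The new view reverses the "user-import-troubleshoot" route name, so urls.py needs a matching entry. A hypothetical sketch of what that wiring could look like; the exact path string in the project's urls.py may differ:

from django.urls import path

from bookwyrm.views.imports.user_troubleshoot import UserImportTroubleshoot

urlpatterns = [
    path(
        "user-import/troubleshoot/<int:job_id>",
        UserImportTroubleshoot.as_view(),
        name="user-import-troubleshoot",
    ),
]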
@@ -3,6 +3,7 @@ from django.core.paginator import Paginator
from django.http import JsonResponse
from django.template.response import TemplateResponse
from django.views import View
from django.views.decorators.vary import vary_on_headers

from bookwyrm import book_search
from bookwyrm.settings import PAGE_LENGTH

@@ -12,6 +13,7 @@ from .helpers import is_api_request
class Isbn(View):
    """search a book by isbn"""

    @vary_on_headers("Accept")
    def get(self, request, isbn):
        """info about a book"""
        book_results = book_search.isbn_search(isbn)

@@ -14,6 +14,7 @@ from django.urls import reverse
from django.utils.decorators import method_decorator
from django.views import View
from django.views.decorators.http import require_POST
from django.views.decorators.vary import vary_on_headers

from bookwyrm import book_search, forms, models
from bookwyrm.activitypub import ActivitypubResponse

@@ -29,6 +30,7 @@ from bookwyrm.views.helpers import (
class List(View):
    """book list page"""

    @vary_on_headers("Accept")
    def get(self, request, list_id, **kwargs):
        """display a book list"""
        add_failed = kwargs.get("add_failed", False)

@@ -10,6 +10,9 @@ from bookwyrm import forms, models
from bookwyrm.lists_stream import ListsStream
from bookwyrm.views.helpers import get_user_from_username

import logging

logger = logging.getLogger(__name__)

# pylint: disable=no-self-use
class Lists(View):

@@ -64,12 +67,12 @@ class SavedLists(View):
        return TemplateResponse(request, "lists/lists.html", data)


@method_decorator(login_required, name="dispatch")
class UserLists(View):
    """a user's book list page"""

    def get(self, request, username):
        """display a book list"""

        user = get_user_from_username(request.user, username)
        lists = models.List.privacy_filter(request.user).filter(user=user)
        paginated = Paginator(lists, 12)
@@ -1,9 +1,12 @@
""" Let users export their book data """
from datetime import timedelta
import csv
import datetime
import io

from django.contrib.auth.decorators import login_required
from django.db.models import Avg, ExpressionWrapper, F
from django.db.models.fields import DurationField
from django.core.paginator import Paginator
from django.db.models import Q
from django.http import HttpResponse, HttpResponseServerError, Http404

@@ -19,7 +22,7 @@ from storages.backends.s3 import S3Storage
from bookwyrm import models
from bookwyrm.models.bookwyrm_export_job import BookwyrmExportJob
from bookwyrm import settings

from bookwyrm.utils.cache import get_or_set

# pylint: disable=no-self-use,too-many-locals
@method_decorator(login_required, name="dispatch")

@@ -203,6 +206,14 @@ class ExportUser(View):
            ),
        }

        seconds = get_or_set(
            "avg-user-export-time", get_average_export_time, timeout=86400
        )
        if seconds and seconds > 60**2:
            data["recent_avg_hours"] = seconds / (60**2)
        elif seconds:
            data["recent_avg_minutes"] = seconds / 60

        return TemplateResponse(request, "preferences/export-user.html", data)

    def post(self, request):

@@ -253,3 +264,26 @@ class ExportArchive(View):
            )
        except FileNotFoundError:
            raise Http404()


def get_average_export_time() -> float:
    """Helper to figure out how long exports are taking (returns seconds)"""
    last_week = timezone.now() - datetime.timedelta(days=7)
    recent_avg = (
        models.BookwyrmExportJob.objects.filter(
            created_date__gte=last_week, complete=True
        )
        .exclude(status="stopped")
        .annotate(
            runtime=ExpressionWrapper(
                F("updated_date") - F("created_date"),
                output_field=DurationField(),
            )
        )
        .aggregate(Avg("runtime"))
        .get("runtime__avg")
    )

    if recent_avg:
        return recent_avg.total_seconds()
    return None
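The get_or_set call in the hunk above comes from bookwyrm.utils.cache; assuming it mirrors Django's stock cache.get_or_set (compute on a cache miss, then reuse the value for the given timeout), the once-a-day averaging pattern looks roughly like this sketch, which is illustrative and not part of the diff:

from django.core.cache import cache

# recompute the 7-day average at most once per day (timeout is in seconds)
avg_seconds = cache.get_or_set("avg-user-export-time", get_average_export_time, 86400)

if avg_seconds and avg_seconds > 60**2:
    display = f"about {avg_seconds / 60**2:.1f} hours"  # over an hour: show hours
elif avg_seconds:
    display = f"about {avg_seconds / 60:.0f} minutes"  # under an hour: show minutes
else:
    display = "no completed exports in the last week"  # helper returned None
print(display)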
@@ -4,6 +4,7 @@ from django.core.paginator import Paginator
from django.db.models import Q, Count
from django.template.response import TemplateResponse
from django.views import View
from django.views.decorators.vary import vary_on_headers

from bookwyrm.activitypub import ActivitypubResponse
from bookwyrm.settings import PAGE_LENGTH

@@ -14,6 +15,7 @@ from .helpers import get_user_from_username, is_api_request
class Relationships(View):
    """list of followers/following view"""

    @vary_on_headers("Accept")
    def get(self, request, username, direction):
        """list of followers"""
        user = get_user_from_username(request.user, username)

@@ -3,6 +3,7 @@
from django.contrib.syndication.views import Feed
from django.template.loader import get_template
from django.utils.translation import gettext_lazy as _
from django.shortcuts import get_object_or_404
from ..models import Review, Quotation, Comment

from .helpers import get_user_from_username

@@ -177,3 +178,61 @@ class RssCommentsOnlyFeed(Feed):
    def item_pubdate(self, item):
        """publication date of the item"""
        return item.published_date


class RssShelfFeed(Feed):
    """serialize a shelf activity in rss"""

    description_template = "rss/edition.html"

    def item_title(self, item):
        """render the item title"""
        authors = item.authors
        if item.author_text:
            authors.display_name = f"{item.author_text}:"
        else:
            authors.description = ""
        template = get_template("rss/title.html")
        return template.render({"user": authors, "item_title": item.title}).strip()

    def get_object(
        self, request, shelf_identifier, username
    ):  # pylint: disable=arguments-differ
        """the shelf that gets serialized"""
        user = get_user_from_username(request.user, username)
        # always get privacy, don't support rss over anything private
        # get the SHELF of the object
        shelf = get_object_or_404(
            user.shelf_set,
            identifier=shelf_identifier,
            privacy__in=["public", "unlisted"],
        )
        shelf.raise_visible_to_user(request.user)
        return shelf

    def link(self, obj):
        """link to the shelf"""
        return obj.local_path

    def title(self, obj):
        """title of the rss feed entry"""
        return _(f"{obj.user.display_name}’s {obj.name} shelf")

    def items(self, obj):
        """the user's activity feed"""
        return obj.books.order_by("-shelfbook__shelved_date")[:10]

    def item_link(self, item):
        """link to the status"""
        return item.local_path

    def item_pubdate(self, item):
        """publication date of the item"""
        return item.published_date

    def description(self, obj):
        """description of the shelf including the shelf name and user."""
        # if there's a description, lets add it. Not everyone puts a description in.
        if desc := obj.description:
            return _(f"{obj.user.display_name}’s {obj.name} shelf: {desc}")
        return _(f"Books added to {obj.user.name}’s {obj.name} shelf")
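Django syndication Feed subclasses such as the new RssShelfFeed are mounted in urls.py as callable instances rather than through as_view(). A hypothetical sketch of that wiring; the module path, URL pattern, and route name are assumptions, not taken from the diff:

from django.urls import path

from bookwyrm.views.rss_feed import RssShelfFeed  # assumed module path

urlpatterns = [
    path(
        "user/<str:username>/shelf/<slug:shelf_identifier>/rss",
        RssShelfFeed(),  # a Feed instance is itself the view callable
        name="shelf-rss",  # hypothetical route name
    ),
]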
@@ -9,6 +9,7 @@ from django.db.models.functions import Greatest
from django.http import JsonResponse
from django.template.response import TemplateResponse
from django.views import View
from django.views.decorators.vary import vary_on_headers

from csp.decorators import csp_update

@@ -26,6 +27,7 @@ class Search(View):
    """search users or books"""

    @csp_update(IMG_SRC="*")
    @vary_on_headers("Accept")
    def get(self, request):
        """that search bar up top"""
        if is_api_request(request):

@@ -10,6 +10,7 @@ from django.template.response import TemplateResponse
from django.utils.decorators import method_decorator
from django.utils.translation import gettext_lazy as _
from django.views import View
from django.views.decorators.vary import vary_on_headers

from bookwyrm import forms, models
from bookwyrm.activitypub import ActivitypubResponse

@@ -23,6 +24,7 @@ class Shelf(View):
    """shelf page"""

    # pylint: disable=R0914
    @vary_on_headers("Accept")
    def get(self, request, username, shelf_identifier=None):
        """display a shelf"""
        user = get_user_from_username(request.user, username)

@@ -8,6 +8,7 @@ from django.template.response import TemplateResponse
from django.utils import timezone
from django.views import View
from django.views.decorators.http import require_POST
from django.views.decorators.vary import vary_on_headers

from bookwyrm import models
from bookwyrm.activitypub import ActivitypubResponse

@@ -19,6 +20,7 @@ from .helpers import get_user_from_username, is_api_request
class User(View):
    """user profile page"""

    @vary_on_headers("Accept")
    def get(self, request, username):
        """profile page for a user"""
        user = get_user_from_username(request.user, username)
@@ -4,7 +4,7 @@ boto3==1.34.74
bw-file-resubmit==0.6.0rc2
celery==5.3.6
colorthief==0.2.1
-Django==4.2.18
+Django==4.2.20
django-celery-beat==2.6.0
django-compressor==4.4
django-csp==3.8
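After the Django 4.2.18 to 4.2.20 bump, a quick sanity check that the rebuilt environment actually loads the new patch release:

import django

print(django.get_version())  # expected: 4.2.20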