1 Commits

Author SHA1 Message Date
8762112e37 feat: Add Markdown export 2023-01-29 15:24:16 +01:00
17 changed files with 268 additions and 852 deletions

23
.drone.yml Normal file
View File

@@ -0,0 +1,23 @@
---
kind: pipeline
type: exec
name: build
platform:
os: linux
arch: arm64
steps:
- name: build
commands:
- docker build -t gcrkrause/mastodon-blocklist-deploy .
- name: push
environment:
USERNAME:
from_secret: docker-hub-user
PASSWORD:
from_secret: docker-hub-pw
commands:
- docker login -u $USERNAME -p $PASSWORD
- docker push gcrkrause/mastodon-blocklist-deploy
- docker image prune -a -f

3
.gitignore vendored
View File

@@ -1,6 +1,3 @@
*.toml
*.csv
*.json
# ---> Python
# Byte-compiled / optimized / DLL files

View File

@@ -1,14 +0,0 @@
steps:
test:
image: python:3.11
commands:
- pip install poetry
- poetry run pytest --cov-report term-missing --cov=fediverse_blocklist_deploy tests
create_docker_image:
image: woodpeckerci/plugin-docker-buildx
secrets: [ docker_username, docker_password ]
settings:
repo: moanos/fedivers-blocklist-tool
dockerfile: Dockerfile
tag: latest

View File

@@ -4,14 +4,14 @@
In order to have a common development environment, it's nice to use Docker. It's quite easy. To build a new image, simply run
`docker build . -t fediverse_blocklist_tool`
`docker build . -t mastodon_blocklist_deploy`
Now you can execute any commands using
`docker run --rm fediverse_blocklist_tool --help`
`docker run --rm mastodon_blocklist_deploy --help`
If you want to avoid building new containers for each change, simply mount your code into the container using
`docker run --rm -v $(pwd):/app fediverse_blocklist_tool`
`docker run --rm -v $(pwd):/app mastodon_blocklist_deploy`
Please be aware that changes to the package itself require a rebuild anyway.

View File

@@ -4,9 +4,9 @@ ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
COPY pyproject.toml poetry.lock README.md /app/
COPY fediverse_blocklist_tool /app/fediverse_blocklist_tool
COPY mastodon_blocklist_deploy /app/mastodon_blocklist_deploy
WORKDIR /app
ENTRYPOINT ["fediverse_blocklist_deploy"]
ENTRYPOINT ["mastodon_blocklist_deploy"]
RUN pip install -e .

View File

@@ -1,44 +1,37 @@
# fediverse-blocklist-tool
# mastodon-blocklist-deploy
A small tool to deploy blocklist updates to a fediverse server using its API.
A small tool to deploy blocklist updates to a mastodon server using its API.
## Concept
The idea is to maintain a blocklist in a simple structured file in this repository. All changes need to be deployed to
the fediverse server; this is supposed to be automated with Drone CI.
the mastodon server; this is supposed to be automated with Drone CI.
In order to compare the list entries, we can read the whole blocklist
using [the get endpoint](https://docs.joinmastodon.org/methods/admin/domain_blocks/#get). At the same time we read the
whole file in the repository, make a comparison
whole file in the repository, make a comparision
and [remove](https://docs.joinmastodon.org/methods/admin/domain_blocks/#delete) unblocked domains from the blocklist
and [add](https://docs.joinmastodon.org/methods/admin/domain_blocks/#create) newly added.
Since we have several attributes for a domain block, a simple `.txt` file might not be sufficient. We probably want to
Since we have several attributes for a domain blog, a simple `.txt` file might not be sufficient. We probably want to
set the severity, reject_media, reject_reports and comments. This means we need a human-readable, easily python-readable
and structured file format. Since Python 3.11 got native support for [toml](https://toml.io/) and it
supports [Array of Tables](https://toml.io/en/v1.0.0#array-of-tables), I'd prefer to use this.
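A rough sketch of this comparison flow in Python (the `[[instances]]` table name and the fields shown mirror what the tool itself reads and exports; the `remote` dict below is only a stand-in for the JSON returned by the get endpoint):
```
# Sketch only: compare a local TOML blocklist against the remote one by domain.
# Uses the stdlib tomllib (Python 3.11+); the tool itself uses the toml package.
import tomllib

local_toml = """
[[instances]]
domain = "example.bad"
severity = "suspend"
public_comment = "Spam"
reject_media = true
reject_reports = true
obfuscate = false
"""

local = {i["domain"]: i for i in tomllib.loads(local_toml)["instances"]}

# Stand-in for the list returned by GET /api/v1/admin/domain_blocks
remote = {"example.bad": {"severity": "suspend"}, "stale.example": {"severity": "silence"}}

to_create = local.keys() - remote.keys()   # add these via the create endpoint
to_delete = remote.keys() - local.keys()   # remove these via the delete endpoint
print(sorted(to_create), sorted(to_delete))
```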
# Supported server types
- [x] Mastodon
- [X] GoToSocial
# Basic usage
##
```
usage: fediverse_blocklist_deploy [-h] [-s SERVER] [-t TOKEN] [-i INPUT_FILE] [-r REMOTE_BLOCKLIST] [-o OUTPUT] [-v] [-n]
[--format FORMAT] [--private]
{diff,deploy,export}
$ mastodon_blocklist_deploy -h
usage: mastodon_blocklist_deploy [-h] [-s SERVER] [-t TOKEN] [-i INPUT_FILE] [-r REMOTE_BLOCKLIST] [-o OUTPUT] [-v] [-n] {diff,deploy,export}
Deploy blocklist updates to a fediverse server
Deploy blocklist updates to a mastodon server
positional arguments:
{diff,deploy,export} Either use 'diff' to check the difference between local blocklist and the blocklist on the server, 'deploy'
to apply the current local blocklist or 'export' to export the remote blocklist into a local file.
{diff,deploy,export} Either use 'diff' to check the difference between local blocklist and the blocklist on the server, 'deploy' to apply the current local blocklist or 'export' to export the remote blocklist into a local file.
options:
-h, --help show this help message and exit
@@ -54,8 +47,6 @@ options:
Filename where to export the blocklist
-v, --verbose
-n, --no-delete Do not delete existing blocks
--format FORMAT Export format: toml|markdown|csv|json
--private When the flag is set, private comment will also be exported.
```
## Obtain a server token
@@ -69,7 +60,7 @@ options:
1. **Export the current blocklist from the server**
```
fediverse_blocklist_deploy export -s yourserver -t yourtoken -o blocklist.toml
mastodon_blocklist_deploy export -s yourserver -t yourtoken -o blocklist.toml
```
2. **Manually add something to the blocklist**
@@ -88,12 +79,12 @@ private_comment = "We discussed this after X and Y and now that Z happend we dec
3. **Check the difference between the local and remote blocklist**
```
fediverse_blocklist_deploy diff -s yourserver -t yourtoken -i blocklist.toml
mastodon_blocklist_deploy diff -s yourserver -t yourtoken -i blocklist.toml
```
4. **Apply the local blocklist to the server**
```
fediverse_blocklist_deploy apply -s yourserver -t yourtoken -i blocklist.toml
mastodon_blocklist_deploy apply -s yourserver -t yourtoken -i blocklist.toml
```

View File

@@ -1 +0,0 @@
version = "0.1.0"

View File

@@ -1,234 +0,0 @@
# import tomllib
import argparse
import json
import logging
from typing import List
import requests
import os
import toml
import validators
from fediverse_blocklist_tool.models import Instance
from fediverse_blocklist_tool.helpers import blocklist_to_markdown, blocklist_to_toml, blocklist_to_csv, \
blocklist_to_json
SUPPORTED_LOADING_FORMATS = ["toml", "csv", "json"]
def determine_location(location: str) -> (bool, str, str):
"""
Determines whether a location is remote and splits off an access token if necessary
:return: A set of (is_remote, location, access_token) where access token can be None
"""
# If the file extension is in supported formats the location is assumed to be local
if location.split(".")[-1] in SUPPORTED_LOADING_FORMATS:
return False, location, None
elif "@" in location:
access_token, remote_server = location.split("@")
else:
remote_server = location
access_token = None
# Validate that the domain is valid
if not validators.domain(remote_server):
error_message = f"Could not parse blocklist location: {location}. Local blocklists must have one of the following formats: {SUPPORTED_LOADING_FORMATS} or a valid domain"
logging.error(error_message)
raise ValueError(error_message)
return True, remote_server, access_token
def load_blocklist(location: str) -> List[Instance]:
"""Load blocklist from the given location
:param location: location of the blocklist. Remote locations are allowed to contain an access token. Examples: local_blocklist.toml, ACCESS_TOKEN@fediserver1.org
"""
is_remote, location, access_token = determine_location(location)
if not is_remote:
return load_blocklist_file(location)
else:
if access_token is None:
access_token = os.getenv('FBT_TOKEN')
return load_blocklist_from_instance(location, access_token)
def load_blocklist_file(filename: str) -> [Instance]:
if filename.endswith("json"):
with open(filename) as f:
instances = blocklist_json_to_instances(json.load(f))
else:
with open(filename, "r") as f:
data = toml.load(f)
instances = []
for instance_dict in data["instances"]:
instance = Instance(instance_dict)
instances.append(instance)
return instances
def blocklist_json_to_instances(blocklist_json: str) -> [Instance]:
instances = []
for i in blocklist_json:
instances.append(Instance(i))
return instances
def load_blocklist_from_instance(server: str, token: str) -> [Instance]:
headers = {
f'Authorization': f'Bearer {token}',
}
response = requests.get(f'https://{server}/api/v1/admin/domain_blocks', headers=headers)
if response.status_code == 200:
blocklist_json = json.loads(response.content)
return blocklist_json_to_instances(blocklist_json)
else:
raise ConnectionError(f"Could not connect to the server ({response.status_code}: {response.reason})")
def remove_key_from_dict(dict, key):
del dict[key]
return dict
def exporter(blocklist, output=None, format: str = "toml", private: bool = False):
if format == "toml":
exported_text = blocklist_to_toml(blocklist, private)
if format == "csv":
exported_text = blocklist_to_csv(blocklist, private)
if format == "md":
exported_text = blocklist_to_markdown(blocklist, private)
if format == "json":
exported_text = blocklist_to_json(blocklist, private)
# Output the text
if output is not None:
with open(output, "w") as f:
f.write(exported_text)
else:
print(exported_text)
def merge_lists(input_blocklist, merge_target_blocklist, overwrite=False):
"""Shows a table in the CLI comparing the local and remote blocklist"""
from rich.table import Table
from rich.console import Console
for input_instance in input_blocklist:
# If the block is already there with the same parameters we do nothing
if input_instance in merge_target_blocklist:
continue
# Check if there is a domain in the merge target where the input domain is similar
try:
merge_target_instance = [merge_target_instance for merge_target_instance in merge_target_blocklist if
input_instance.domain == merge_target_instance.domain][0]
key_input = ""
if not overwrite:
while key_input not in ("i", "O"):
print(f"Different settings for {input_instance.domain} detected.")
Instance.show_diff(input_instance, merge_target_instance)
key_input = input("Keep input (i) or original (O) [i/O]")
elif key_input == "i":
merge_target_blocklist.append(merge_target_instance)
else:
merge_target_blocklist.append(input_instance)
except KeyError:
pass
return merge_target_blocklist
def get_format_from_filename(filename: str) -> str:
detected_format = filename.split(".")[-1]
if detected_format not in ["toml", "csv", "md", "json"]:
detected_format = "toml"
return detected_format
def diff(args):
if len(args.blocklist) > 2:
raise NotImplementedError("Comparing more than two blocklists is not yet supported")
blocklists_to_compare = args.blocklist
blocklist_a = load_blocklist_file(blocklists_to_compare[0])
blocklist_b = load_blocklist_file(blocklists_to_compare[1])
Instance.show_diffs(blocklist_a, blocklist_b)
def export(args):
blocklist_to_export = load_blocklist_file(args.blocklist_to_export)
export_format = get_format_from_filename(args.target)
exporter(blocklist_to_export, args.target, export_format, args.private)
def deploy(args):
blocklist_to_deploy = load_blocklist_file(args.blocklist_to_deploy)
target_blocklist = load_blocklist_file(args.target_blocklist)
diffs = Instance.list_diffs(blocklist_to_deploy, target_blocklist)
Instance.apply_blocks_from_diff(diffs, args.server, token, args.no_delete)
def merge(args):
blocklist_to_merge = load_blocklist_file(args.blocklist_to_merge)
target_blocklist = load_blocklist_file(args.target)
export_format = get_format_from_filename(args.target)
merge_lists(blocklist_to_merge, target_blocklist, export_format)
def cli():
parser = argparse.ArgumentParser(description='Compare, merge, export and deploy blocklist of a fediverse server')
parser.add_argument('-v', '--verbose', action='store_true')
subparsers = parser.add_subparsers(required=True)
diff_parser = subparsers.add_parser('diff')
diff_parser.add_argument('blocklist',
nargs="+",
type=str,
help="The blocklists you want to compare. Provide at least two. Example: "
"lockalblocklist.toml ACCESS_TOKEN@fediserver.org")
diff_parser.set_defaults(func=diff)
export_parser = subparsers.add_parser('export')
export_parser.add_argument('blocklist_to_export',
type=str,
help="The blocklist you want to export. Example: ACCESS_TOKEN@fediserver.org")
export_parser.add_argument('target',
type=str,
help="Filename where to save the blocklist. The extension will determine the format, "
"if none ist given it'll default to toml")
export_parser.add_argument('--private',
action='store_true',
default=False,
help="When the flag is set, private comment will also be exported.")
export_parser.set_defaults(func=export)
merge_parser = subparsers.add_parser('merge')
merge_parser.add_argument('blocklist_to_merge',
type=str,
help="The blocklist you want to merge to the target. Example: lockalblocklist.toml "
"ACCESS_TOKEN@fediserver.org")
merge_parser.add_argument('target',
type=str,
help="Merge target that will hold the merged blocklist. Example: lockalblocklist.toml "
"ACCESS_TOKEN@fediserver.org")
merge_parser.set_defaults(func=merge)
deploy_parser = subparsers.add_parser('deploy')
deploy_parser.add_argument('blocklist_to_deploy',
type=str,
help="The blocklist you want to deploy to the target. Example: lockalblocklist.toml "
"ACCESS_TOKEN@fediserver.org")
deploy_parser.add_argument('target',
type=str,
help="Deploy target that will hold the merged blocklist. Example: lockalblocklist.toml "
"ACCESS_TOKEN@fediserver.org")
deploy_parser.add_argument('-n', '--no-delete', action='store_true', help="Do not delete existing blocks")
deploy_parser.set_defaults(func=deploy)
args = parser.parse_args()
if args.verbose:
logging.basicConfig(level=logging.DEBUG)
else:
logging.basicConfig(level=logging.WARN)
args.func(args)
if __name__ == "__main__":
cli()

View File

@@ -1,36 +0,0 @@
from fediverse_blocklist_tool.models import Instance
import toml
import io
import csv
import json
def blocklist_to_markdown(blocklist: [Instance], private: bool = False):
if private:
markdown_string = "| Instance | Status | Reason | Private Comment |\n | --- | --- | --- |\n"
else:
markdown_string = "| Instance | Status | Reason |\n | --- | --- | --- |\n"
for instance in blocklist:
if private:
markdown_string += f"| {instance.domain} | {instance.severity} | {instance.public_comment} | {instance.private_comment} |\n"
else:
markdown_string += f"| {instance.domain} | {instance.severity} | {instance.public_comment} |\n"
return markdown_string
def blocklist_to_toml(blocklist: [Instance], private: bool = False):
toml_string = toml.dumps({"instances": [b.as_dict(private) for b in blocklist]})
return toml_string
def blocklist_to_csv(blocklist: [Instance], private: bool = False):
csv_string = io.StringIO()
blocklist_as_dict = [b.as_dict(private) for b in blocklist]
keys = blocklist_as_dict[0].keys()
w = csv.DictWriter(csv_string, keys)
w.writeheader()
w.writerows(blocklist_as_dict)
return csv_string.getvalue()
def blocklist_to_json(blocklist: [Instance], private: bool = False):
json_string = json.dumps([b.as_dict(private) for b in blocklist])
return json_string

View File

@@ -0,0 +1,113 @@
# import tomllib
import argparse
import json
import logging
import requests
import os
import toml
from mastodon_blocklist_deploy.models import Instance
from mastodon_blocklist_deploy.helpers import blocklist_to_markdown
def load_blocklist_file(filename: str) -> [Instance]:
with open(filename, "r") as f:
data = toml.load(f)
instances = []
for instance_dict in data["instances"]:
instance = Instance(instance_dict)
instances.append(instance)
return instances
def blocklist_json_to_instances(blocklist_json: str) -> [Instance]:
instances = []
for i in blocklist_json:
instances.append(Instance(i))
return instances
def load_blocklist_from_instance(server: str, token: str) -> [Instance]:
headers = {
f'Authorization': f'Bearer {token}',
}
response = requests.get(f'https://{server}/api/v1/admin/domain_blocks', headers=headers)
if response.status_code == 200:
blocklist_json = json.loads(response.content)
return blocklist_json_to_instances(blocklist_json)
else:
raise ConnectionError(f"Could not connect to the server ({response.status_code}: {response.reason})")
def remove_key_from_dict(dict, key):
del dict[key]
return dict
def cli():
parser = argparse.ArgumentParser(description='Deploy blocklist updates to a mastodon server')
parser.add_argument('action', choices=['diff', 'deploy', 'export'],
help="Either use 'diff' to check the difference between local blockĺist and the blocklist on "
"the server, 'deploy' to apply the current local blocklist or 'export' to export the remote "
"blocklist into a local file.")
parser.add_argument('-s', '--server', help="The address of the server where you want to deploy (e.g. "
"mastodon.social)")
parser.add_argument('-t', '--token', help="Authorization token")
parser.add_argument('-i', '--input-file', help="The blocklist to use")
parser.add_argument('-r', '--remote-blocklist', help="The remote blocklist as json for debugging reasons")
parser.add_argument('-o', '--output', help="Filename where to export the blocklist")
parser.add_argument('-v', '--verbose', action='store_true')
parser.add_argument('-n', '--no-delete', action='store_true', help="Do not delete existing blocks")
parser.add_argument('--markdown', action='store_true', help="Export as markdown table")
args = parser.parse_args()
if args.verbose:
logging.basicConfig(level=logging.DEBUG)
else:
logging.basicConfig(level=logging.WARN)
if args.token:
token = args.token
else:
token = os.getenv('MBD_TOKEN')
"""if there is a remote blocklist provided load this instead of fetching it from a server (for debugging reasons)"""
if args.remote_blocklist:
with open(args.remote_blocklist) as f:
remote_blocklist = blocklist_json_to_instances(json.load(f))
else:
remote_blocklist = load_blocklist_from_instance(server=args.server, token=token)
"""Load local blocklist only when needed"""
if args.action in ["diff", "deploy"]:
if args.input_file:
blocklist_filename = args.input_file
else:
blocklist_filename = "../blocklist.toml"
try:
local_blocklist = load_blocklist_file(blocklist_filename)
except FileNotFoundError:
print("Local blocklist file was not found. Make sure to specify it's location via -i")
exit()
if args.action == "diff":
Instance.show_diffs(local_blocklist, remote_blocklist)
elif args.action == "deploy":
diffs = Instance.list_diffs(local_blocklist, remote_blocklist)
Instance.apply_blocks_from_diff(diffs, args.server, token, args.no_delete)
elif args.action == "export":
if not args.output:
if args.markdown:
print(blocklist_to_markdown(remote_blocklist))
else:
print(toml.dumps({"instances": [remove_key_from_dict(b.__dict__, 'id') for b in remote_blocklist]}))
else:
with open(args.output, "w") as f:
if args.markdown:
f.write(blocklist_to_markdown(remote_blocklist))
else:
toml.dump({"instances": [remove_key_from_dict(b.__dict__, 'id') for b in remote_blocklist]}, f)
if __name__ == "__main__":
cli()

View File

@@ -0,0 +1,6 @@
from mastodon_blocklist_deploy.models import Instance
def blocklist_to_markdown(blocklist:[Instance]):
markdown_string = "| Instance | Status | Reason |\n | --- | --- | --- |\n"
for instance in blocklist:
markdown_string += f"| {instance.domain} | {instance.severity} | {instance.public_comment} |\n"
return markdown_string
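To illustrate the Markdown export this commit adds, here is a self-contained sketch of the same table-building logic; `SimpleNamespace` only stands in for the real `Instance` objects from `mastodon_blocklist_deploy.models`:
```
# Sketch: what blocklist_to_markdown produces for a single entry.
from types import SimpleNamespace

def blocklist_to_markdown(blocklist):
    markdown_string = "| Instance | Status | Reason |\n | --- | --- | --- |\n"
    for instance in blocklist:
        markdown_string += f"| {instance.domain} | {instance.severity} | {instance.public_comment} |\n"
    return markdown_string

demo = [SimpleNamespace(domain="example.bad", severity="suspend",
                        public_comment="Spam and harassment")]
print(blocklist_to_markdown(demo))
# | Instance | Status | Reason |
#  | --- | --- | --- |
# | example.bad | suspend | Spam and harassment |
```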

View File

@@ -2,22 +2,21 @@ import logging
import requests
from typing import Dict, Union
class Instance:
def __init__(self, instance_dict : Dict):
def __init__(self, instance_dict):
"""If obfuscate, reject_media or reject_reports are not specified default to False"""
self.severity: str = "suspend"
self.obfuscate: bool = False
self.reject_media: bool = False
self.reject_reports: bool = False
self.id: Union[int, None] = None
self.domain: str = ""
self.private_comment: str = ""
self.public_comment: str = ""
self.obfuscate = False
self.reject_media = False
self.reject_reports = False
self.id = None
self.parse_block(instance_dict)
"""Remote blocks and local blocks are parsed differently"""
try:
instance_dict["id"]
self.parse_remote_block(instance_dict)
except KeyError:
self.parse_local_block(instance_dict)
def __str__(self):
return f"{self.domain}: {self.severity}"
@@ -28,36 +27,43 @@ class Instance:
def status_str(self):
return f"{self.severity}\nReject reports: {self.reject_reports}\nReject media: {self.reject_media}\nObfuscate: {self.obfuscate}"
def as_dict(self, private=False):
keys = ["domain", "severity", "public_comment", "obfuscate", "reject_media", "reject_reports"]
if private:
keys.append("private_comment")
exportable = {}
for key in keys:
exportable[key] = getattr(self, key)
return exportable
def parse_remote_block(self, instance_dict):
self.domain = instance_dict["domain"]
self.id = instance_dict["id"]
self.severity = instance_dict["severity"]
self.public_comment = instance_dict["public_comment"]
self.private_comment = instance_dict["private_comment"]
self.obfuscate = instance_dict["obfuscate"]
self.reject_media = instance_dict["reject_media"]
self.reject_reports = instance_dict["reject_reports"]
def parse_block(self, instance_dict):
# this specifies possible properties and default values if not found on the remote source. If a default is None
# the value is required and the parse will fail
properties_and_defaults = [("domain", None), ("severity", "suspend"), ("public_comment", ""),
("private_comment", ""), ("obfuscate", False), ("reject_media", False),
("reject_reports", False)]
for key, default in properties_and_defaults:
def parse_local_block(self, instance_dict):
try:
setattr(self, key, instance_dict[key])
self.name = instance_dict["name"]
except KeyError:
if default is not None:
setattr(self, key, default)
else:
raise KeyError(f"The key {key} was not in the instance_dict response.")
pass
self.domain = instance_dict["domain"]
self.severity = instance_dict["severity"]
self.public_comment = instance_dict["public_comment"]
self.private_comment = instance_dict["private_comment"]
try:
self.obfuscate = instance_dict["obfuscate"]
except KeyError:
pass
try:
self.reject_media = instance_dict["reject_media"]
except KeyError:
pass
try:
self.reject_reports = instance_dict["reject_reports"]
except KeyError:
pass
def apply(self, server, token, block_id=None):
"""Applies instance block on the remote server"""
headers = {
f'Authorization': f'Bearer {token}',
}
# As long as we generate this inside of apply we cannot properly test for the correct format
data = {"domain": self.domain,
"severity": self.severity,
"reject_media": str(self.reject_media).lower(),
@@ -72,7 +78,7 @@ class Instance:
response = requests.put(f'https://{server}/api/v1/admin/domain_blocks/{block_id}', data=data,
headers=headers)
if response.status_code != 200:
raise ConnectionError(f"Could not apply block for {self.domain} ({response.status_code}: {response.reason})")
raise ConnectionError(f"Could not apply block ({response.status_code}: {response.reason})")
def delete(self, server: str, token: str):
"""Deletes the instance from the blocklist on the remote server"""
@@ -145,18 +151,3 @@ class Instance:
table.add_row(diff["local"].domain, diff["local"].status_str(), diff["remote"].status_str())
console = Console()
console.print(table)
@staticmethod
def show_diff(instanceA, instanceB, column_names=('Input', 'Original')):
from rich.table import Table
from rich.console import Console
table = Table(title="Differences", expand=True, show_lines=True)
table.add_column("Attribute", style="cyan")
table.add_column(column_names[0], style="green")
table.add_column(column_names[1], style="magenta")
compare_attributes = ["domain", "severity", "obfuscate", "private_comment", "public_comment", "reject_media", "reject_reports"]
for attr in compare_attributes:
table.add_row(attr, str(getattr(instanceA, attr)), str(getattr(instanceB, attr)))
console = Console()
console.print(table)
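For reference, a minimal sketch (not the module's own `apply()` implementation) of creating a single new block via the create endpoint the README links to; the server name and token are placeholders, and the 200 check mirrors the convention used above:
```
# Hedged sketch: create one domain block via the documented Mastodon admin API
# (POST /api/v1/admin/domain_blocks). Server and token below are placeholders.
import requests

def create_block(server: str, token: str, domain: str, severity: str = "suspend") -> dict:
    response = requests.post(
        f"https://{server}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {token}"},
        data={"domain": domain, "severity": severity},
    )
    if response.status_code != 200:
        raise ConnectionError(f"Could not create block for {domain} "
                              f"({response.status_code}: {response.reason})")
    return response.json()

# create_block("example.social", "ADMIN_TOKEN", "spam.example")
```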

420
poetry.lock generated
View File

@@ -1,399 +1,120 @@
# This file is automatically @generated by Poetry 1.7.1 and should not be changed by hand.
# This file is automatically @generated by Poetry and should not be changed by hand.
[[package]]
name = "certifi"
version = "2023.11.17"
version = "2022.12.7"
description = "Python package for providing Mozilla's CA Bundle."
category = "main"
optional = false
python-versions = ">=3.6"
files = [
{file = "certifi-2023.11.17-py3-none-any.whl", hash = "sha256:e036ab49d5b79556f99cfc2d9320b34cfbe5be05c5871b51de9329f0603b0474"},
{file = "certifi-2023.11.17.tar.gz", hash = "sha256:9b469f3a900bf28dc19b8cfbf8019bf47f7fdd1a65a1d4ffb98fc14166beb4d1"},
{file = "certifi-2022.12.7-py3-none-any.whl", hash = "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"},
{file = "certifi-2022.12.7.tar.gz", hash = "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3"},
]
[[package]]
name = "charset-normalizer"
version = "3.3.2"
version = "2.1.1"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
category = "main"
optional = false
python-versions = ">=3.7.0"
python-versions = ">=3.6.0"
files = [
{file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
{file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
{file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
{file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
{file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
{file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
{file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
{file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
{file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
{file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
{file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
{file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
]
[[package]]
name = "colorama"
version = "0.4.6"
description = "Cross-platform colored terminal text."
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [
{file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
]
[[package]]
name = "coverage"
version = "7.3.3"
description = "Code coverage measurement for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "coverage-7.3.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d874434e0cb7b90f7af2b6e3309b0733cde8ec1476eb47db148ed7deeb2a9494"},
{file = "coverage-7.3.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ee6621dccce8af666b8c4651f9f43467bfbf409607c604b840b78f4ff3619aeb"},
{file = "coverage-7.3.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1367aa411afb4431ab58fd7ee102adb2665894d047c490649e86219327183134"},
{file = "coverage-7.3.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f0f8f0c497eb9c9f18f21de0750c8d8b4b9c7000b43996a094290b59d0e7523"},
{file = "coverage-7.3.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db0338c4b0951d93d547e0ff8d8ea340fecf5885f5b00b23be5aa99549e14cfd"},
{file = "coverage-7.3.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d31650d313bd90d027f4be7663dfa2241079edd780b56ac416b56eebe0a21aab"},
{file = "coverage-7.3.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:9437a4074b43c177c92c96d051957592afd85ba00d3e92002c8ef45ee75df438"},
{file = "coverage-7.3.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:9e17d9cb06c13b4f2ef570355fa45797d10f19ca71395910b249e3f77942a837"},
{file = "coverage-7.3.3-cp310-cp310-win32.whl", hash = "sha256:eee5e741b43ea1b49d98ab6e40f7e299e97715af2488d1c77a90de4a663a86e2"},
{file = "coverage-7.3.3-cp310-cp310-win_amd64.whl", hash = "sha256:593efa42160c15c59ee9b66c5f27a453ed3968718e6e58431cdfb2d50d5ad284"},
{file = "coverage-7.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:8c944cf1775235c0857829c275c777a2c3e33032e544bcef614036f337ac37bb"},
{file = "coverage-7.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:eda7f6e92358ac9e1717ce1f0377ed2b9320cea070906ece4e5c11d172a45a39"},
{file = "coverage-7.3.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c854c1d2c7d3e47f7120b560d1a30c1ca221e207439608d27bc4d08fd4aeae8"},
{file = "coverage-7.3.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:222b038f08a7ebed1e4e78ccf3c09a1ca4ac3da16de983e66520973443b546bc"},
{file = "coverage-7.3.3-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff4800783d85bff132f2cc7d007426ec698cdce08c3062c8d501ad3f4ea3d16c"},
{file = "coverage-7.3.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fc200cec654311ca2c3f5ab3ce2220521b3d4732f68e1b1e79bef8fcfc1f2b97"},
{file = "coverage-7.3.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:307aecb65bb77cbfebf2eb6e12009e9034d050c6c69d8a5f3f737b329f4f15fb"},
{file = "coverage-7.3.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:ffb0eacbadb705c0a6969b0adf468f126b064f3362411df95f6d4f31c40d31c1"},
{file = "coverage-7.3.3-cp311-cp311-win32.whl", hash = "sha256:79c32f875fd7c0ed8d642b221cf81feba98183d2ff14d1f37a1bbce6b0347d9f"},
{file = "coverage-7.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:243576944f7c1a1205e5cd658533a50eba662c74f9be4c050d51c69bd4532936"},
{file = "coverage-7.3.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:a2ac4245f18057dfec3b0074c4eb366953bca6787f1ec397c004c78176a23d56"},
{file = "coverage-7.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f9191be7af41f0b54324ded600e8ddbcabea23e1e8ba419d9a53b241dece821d"},
{file = "coverage-7.3.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:31c0b1b8b5a4aebf8fcd227237fc4263aa7fa0ddcd4d288d42f50eff18b0bac4"},
{file = "coverage-7.3.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ee453085279df1bac0996bc97004771a4a052b1f1e23f6101213e3796ff3cb85"},
{file = "coverage-7.3.3-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1191270b06ecd68b1d00897b2daddb98e1719f63750969614ceb3438228c088e"},
{file = "coverage-7.3.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:007a7e49831cfe387473e92e9ff07377f6121120669ddc39674e7244350a6a29"},
{file = "coverage-7.3.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:af75cf83c2d57717a8493ed2246d34b1f3398cb8a92b10fd7a1858cad8e78f59"},
{file = "coverage-7.3.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:811ca7373da32f1ccee2927dc27dc523462fd30674a80102f86c6753d6681bc6"},
{file = "coverage-7.3.3-cp312-cp312-win32.whl", hash = "sha256:733537a182b5d62184f2a72796eb6901299898231a8e4f84c858c68684b25a70"},
{file = "coverage-7.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:e995efb191f04b01ced307dbd7407ebf6e6dc209b528d75583277b10fd1800ee"},
{file = "coverage-7.3.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:fbd8a5fe6c893de21a3c6835071ec116d79334fbdf641743332e442a3466f7ea"},
{file = "coverage-7.3.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:50c472c1916540f8b2deef10cdc736cd2b3d1464d3945e4da0333862270dcb15"},
{file = "coverage-7.3.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e9223a18f51d00d3ce239c39fc41410489ec7a248a84fab443fbb39c943616c"},
{file = "coverage-7.3.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f501e36ac428c1b334c41e196ff6bd550c0353c7314716e80055b1f0a32ba394"},
{file = "coverage-7.3.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:475de8213ed95a6b6283056d180b2442eee38d5948d735cd3d3b52b86dd65b92"},
{file = "coverage-7.3.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:afdcc10c01d0db217fc0a64f58c7edd635b8f27787fea0a3054b856a6dff8717"},
{file = "coverage-7.3.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:fff0b2f249ac642fd735f009b8363c2b46cf406d3caec00e4deeb79b5ff39b40"},
{file = "coverage-7.3.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a1f76cfc122c9e0f62dbe0460ec9cc7696fc9a0293931a33b8870f78cf83a327"},
{file = "coverage-7.3.3-cp38-cp38-win32.whl", hash = "sha256:757453848c18d7ab5d5b5f1827293d580f156f1c2c8cef45bfc21f37d8681069"},
{file = "coverage-7.3.3-cp38-cp38-win_amd64.whl", hash = "sha256:ad2453b852a1316c8a103c9c970db8fbc262f4f6b930aa6c606df9b2766eee06"},
{file = "coverage-7.3.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3b15e03b8ee6a908db48eccf4e4e42397f146ab1e91c6324da44197a45cb9132"},
{file = "coverage-7.3.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:89400aa1752e09f666cc48708eaa171eef0ebe3d5f74044b614729231763ae69"},
{file = "coverage-7.3.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c59a3e59fb95e6d72e71dc915e6d7fa568863fad0a80b33bc7b82d6e9f844973"},
{file = "coverage-7.3.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9ede881c7618f9cf93e2df0421ee127afdfd267d1b5d0c59bcea771cf160ea4a"},
{file = "coverage-7.3.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f3bfd2c2f0e5384276e12b14882bf2c7621f97c35320c3e7132c156ce18436a1"},
{file = "coverage-7.3.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:7f3bad1a9313401ff2964e411ab7d57fb700a2d5478b727e13f156c8f89774a0"},
{file = "coverage-7.3.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:65d716b736f16e250435473c5ca01285d73c29f20097decdbb12571d5dfb2c94"},
{file = "coverage-7.3.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a702e66483b1fe602717020a0e90506e759c84a71dbc1616dd55d29d86a9b91f"},
{file = "coverage-7.3.3-cp39-cp39-win32.whl", hash = "sha256:7fbf3f5756e7955174a31fb579307d69ffca91ad163467ed123858ce0f3fd4aa"},
{file = "coverage-7.3.3-cp39-cp39-win_amd64.whl", hash = "sha256:cad9afc1644b979211989ec3ff7d82110b2ed52995c2f7263e7841c846a75348"},
{file = "coverage-7.3.3-pp38.pp39.pp310-none-any.whl", hash = "sha256:d299d379b676812e142fb57662a8d0d810b859421412b4d7af996154c00c31bb"},
{file = "coverage-7.3.3.tar.gz", hash = "sha256:df04c64e58df96b4427db8d0559e95e2df3138c9916c96f9f6a4dd220db2fdb7"},
{file = "charset-normalizer-2.1.1.tar.gz", hash = "sha256:5a3d016c7c547f69d6f81fb0db9449ce888b418b5b9952cc5e6e66843e9dd845"},
{file = "charset_normalizer-2.1.1-py3-none-any.whl", hash = "sha256:83e9a75d1911279afd89352c68b45348559d1fc0506b054b346651b5e7fee29f"},
]
[package.extras]
toml = ["tomli"]
unicode-backport = ["unicodedata2"]
[[package]]
name = "commonmark"
version = "0.9.1"
description = "Python parser for the CommonMark Markdown spec"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "commonmark-0.9.1-py2.py3-none-any.whl", hash = "sha256:da2f38c92590f83de410ba1a3cbceafbc74fee9def35f9251ba9a971d6d66fd9"},
{file = "commonmark-0.9.1.tar.gz", hash = "sha256:452f9dc859be7f06631ddcb328b6919c67984aca654e5fefb3914d54691aed60"},
]
[package.extras]
test = ["flake8 (==3.7.8)", "hypothesis (==3.55.3)"]
[[package]]
name = "idna"
version = "3.6"
version = "3.4"
description = "Internationalized Domain Names in Applications (IDNA)"
category = "main"
optional = false
python-versions = ">=3.5"
files = [
{file = "idna-3.6-py3-none-any.whl", hash = "sha256:c05567e9c24a6b9faaa835c4821bad0590fbb9d5779e7caa6e1cc4978e7eb24f"},
{file = "idna-3.6.tar.gz", hash = "sha256:9ecdbbd083b06798ae1e86adcbfe8ab1479cf864e4ee30fe4e46a003d12491ca"},
{file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
{file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
]
[[package]]
name = "iniconfig"
version = "2.0.0"
description = "brain-dead simple config-ini parsing"
optional = false
python-versions = ">=3.7"
files = [
{file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"},
{file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"},
]
[[package]]
name = "markdown-it-py"
version = "3.0.0"
description = "Python port of markdown-it. Markdown parsing, done right!"
optional = false
python-versions = ">=3.8"
files = [
{file = "markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb"},
{file = "markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1"},
]
[package.dependencies]
mdurl = ">=0.1,<1.0"
[package.extras]
benchmarking = ["psutil", "pytest", "pytest-benchmark"]
code-style = ["pre-commit (>=3.0,<4.0)"]
compare = ["commonmark (>=0.9,<1.0)", "markdown (>=3.4,<4.0)", "mistletoe (>=1.0,<2.0)", "mistune (>=2.0,<3.0)", "panflute (>=2.3,<3.0)"]
linkify = ["linkify-it-py (>=1,<3)"]
plugins = ["mdit-py-plugins"]
profiling = ["gprof2dot"]
rtd = ["jupyter_sphinx", "mdit-py-plugins", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx_book_theme"]
testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
[[package]]
name = "mdurl"
version = "0.1.2"
description = "Markdown URL utilities"
optional = false
python-versions = ">=3.7"
files = [
{file = "mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8"},
{file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"},
]
[[package]]
name = "packaging"
version = "23.2"
description = "Core utilities for Python packages"
optional = false
python-versions = ">=3.7"
files = [
{file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
{file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
]
[[package]]
name = "pluggy"
version = "1.3.0"
description = "plugin and hook calling mechanisms for python"
optional = false
python-versions = ">=3.8"
files = [
{file = "pluggy-1.3.0-py3-none-any.whl", hash = "sha256:d89c696a773f8bd377d18e5ecda92b7a3793cbe66c87060a6fb58c7b6e1061f7"},
{file = "pluggy-1.3.0.tar.gz", hash = "sha256:cf61ae8f126ac6f7c451172cf30e3e43d3ca77615509771b3a984a0730651e12"},
]
[package.extras]
dev = ["pre-commit", "tox"]
testing = ["pytest", "pytest-benchmark"]
[[package]]
name = "pygments"
version = "2.17.2"
version = "2.14.0"
description = "Pygments is a syntax highlighting package written in Python."
category = "main"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.6"
files = [
{file = "pygments-2.17.2-py3-none-any.whl", hash = "sha256:b27c2826c47d0f3219f29554824c30c5e8945175d888647acd804ddd04af846c"},
{file = "pygments-2.17.2.tar.gz", hash = "sha256:da46cec9fd2de5be3a8a784f434e4c4ab670b4ff54d605c4c2717e9d49c4c367"},
{file = "Pygments-2.14.0-py3-none-any.whl", hash = "sha256:fa7bd7bd2771287c0de303af8bfdfc731f51bd2c6a47ab69d117138893b82717"},
{file = "Pygments-2.14.0.tar.gz", hash = "sha256:b3ed06a9e8ac9a9aae5a6f5dbe78a8a58655d17b43b93c078f094ddc476ae297"},
]
[package.extras]
plugins = ["importlib-metadata"]
windows-terminal = ["colorama (>=0.4.6)"]
[[package]]
name = "pytest"
version = "7.4.3"
description = "pytest: simple powerful testing with Python"
optional = false
python-versions = ">=3.7"
files = [
{file = "pytest-7.4.3-py3-none-any.whl", hash = "sha256:0d009c083ea859a71b76adf7c1d502e4bc170b80a8ef002da5806527b9591fac"},
{file = "pytest-7.4.3.tar.gz", hash = "sha256:d989d136982de4e3b29dabcc838ad581c64e8ed52c11fbe86ddebd9da0818cd5"},
]
[package.dependencies]
colorama = {version = "*", markers = "sys_platform == \"win32\""}
iniconfig = "*"
packaging = "*"
pluggy = ">=0.12,<2.0"
[package.extras]
testing = ["argcomplete", "attrs (>=19.2.0)", "hypothesis (>=3.56)", "mock", "nose", "pygments (>=2.7.2)", "requests", "setuptools", "xmlschema"]
[[package]]
name = "pytest-cov"
version = "4.1.0"
description = "Pytest plugin for measuring coverage."
optional = false
python-versions = ">=3.7"
files = [
{file = "pytest-cov-4.1.0.tar.gz", hash = "sha256:3904b13dfbfec47f003b8e77fd5b589cd11904a21ddf1ab38a64f204d6a10ef6"},
{file = "pytest_cov-4.1.0-py3-none-any.whl", hash = "sha256:6ba70b9e97e69fcc3fb45bfeab2d0a138fb65c4d0d6a41ef33983ad114be8c3a"},
]
[package.dependencies]
coverage = {version = ">=5.2.1", extras = ["toml"]}
pytest = ">=4.6"
[package.extras]
testing = ["fields", "hunter", "process-tests", "pytest-xdist", "six", "virtualenv"]
[[package]]
name = "requests"
version = "2.31.0"
version = "2.28.1"
description = "Python HTTP for Humans."
category = "main"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.7, <4"
files = [
{file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
{file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
{file = "requests-2.28.1-py3-none-any.whl", hash = "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"},
{file = "requests-2.28.1.tar.gz", hash = "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983"},
]
[package.dependencies]
certifi = ">=2017.4.17"
charset-normalizer = ">=2,<4"
charset-normalizer = ">=2,<3"
idna = ">=2.5,<4"
urllib3 = ">=1.21.1,<3"
urllib3 = ">=1.21.1,<1.27"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
[[package]]
name = "requests-mock"
version = "1.11.0"
description = "Mock out responses from the requests package"
optional = false
python-versions = "*"
files = [
{file = "requests-mock-1.11.0.tar.gz", hash = "sha256:ef10b572b489a5f28e09b708697208c4a3b2b89ef80a9f01584340ea357ec3c4"},
{file = "requests_mock-1.11.0-py2.py3-none-any.whl", hash = "sha256:f7fae383f228633f6bececebdab236c478ace2284d6292c6e7e2867b9ab74d15"},
]
[package.dependencies]
requests = ">=2.3,<3"
six = "*"
[package.extras]
fixture = ["fixtures"]
test = ["fixtures", "mock", "purl", "pytest", "requests-futures", "sphinx", "testtools"]
[[package]]
name = "rich"
version = "13.7.0"
version = "13.0.1"
description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal"
category = "main"
optional = false
python-versions = ">=3.7.0"
files = [
{file = "rich-13.7.0-py3-none-any.whl", hash = "sha256:6da14c108c4866ee9520bbffa71f6fe3962e193b7da68720583850cd4548e235"},
{file = "rich-13.7.0.tar.gz", hash = "sha256:5cb5123b5cf9ee70584244246816e9114227e0b98ad9176eede6ad54bf5403fa"},
{file = "rich-13.0.1-py3-none-any.whl", hash = "sha256:41fe1d05f433b0f4724cda8345219213d2bfa472ef56b2f64f415b5b94d51b04"},
{file = "rich-13.0.1.tar.gz", hash = "sha256:25f83363f636995627a99f6e4abc52ed0970ebbd544960cc63cbb43aaac3d6f0"},
]
[package.dependencies]
markdown-it-py = ">=2.2.0"
pygments = ">=2.13.0,<3.0.0"
commonmark = ">=0.9.0,<0.10.0"
pygments = ">=2.6.0,<3.0.0"
[package.extras]
jupyter = ["ipywidgets (>=7.5.1,<9)"]
[[package]]
name = "six"
version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
]
jupyter = ["ipywidgets (>=7.5.1,<8.0.0)"]
[[package]]
name = "toml"
version = "0.10.2"
description = "Python Library for Tom's Obvious, Minimal Language"
category = "main"
optional = false
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
@@ -403,43 +124,22 @@ files = [
[[package]]
name = "urllib3"
version = "2.1.0"
version = "1.26.14"
description = "HTTP library with thread-safe connection pooling, file post, and more."
category = "main"
optional = false
python-versions = ">=3.8"
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
files = [
{file = "urllib3-2.1.0-py3-none-any.whl", hash = "sha256:55901e917a5896a349ff771be919f8bd99aff50b79fe58fec595eb37bbc56bb3"},
{file = "urllib3-2.1.0.tar.gz", hash = "sha256:df7aa8afb0148fa78488e7899b2c59b5f4ffcfa82e6c54ccb9dd37c1d7b52d54"},
{file = "urllib3-1.26.14-py2.py3-none-any.whl", hash = "sha256:75edcdc2f7d85b137124a6c3c9fc3933cdeaa12ecb9a6a959f22797a0feca7e1"},
{file = "urllib3-1.26.14.tar.gz", hash = "sha256:076907bf8fd355cde77728471316625a4d2f7e713c125f51953bb5b3eecf4f72"},
]
[package.extras]
brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
zstd = ["zstandard (>=0.18.0)"]
[[package]]
name = "validators"
version = "0.22.0"
description = "Python Data Validation for Humans™"
optional = false
python-versions = ">=3.8"
files = [
{file = "validators-0.22.0-py3-none-any.whl", hash = "sha256:61cf7d4a62bbae559f2e54aed3b000cea9ff3e2fdbe463f51179b92c58c9585a"},
{file = "validators-0.22.0.tar.gz", hash = "sha256:77b2689b172eeeb600d9605ab86194641670cdb73b60afd577142a9397873370"},
]
[package.extras]
docs-offline = ["myst-parser (>=2.0.0)", "pypandoc-binary (>=1.11)", "sphinx (>=7.1.1)"]
docs-online = ["mkdocs (>=1.5.2)", "mkdocs-git-revision-date-localized-plugin (>=1.2.0)", "mkdocs-material (>=9.2.6)", "mkdocstrings[python] (>=0.22.0)", "pyaml (>=23.7.0)"]
hooks = ["pre-commit (>=3.3.3)"]
package = ["build (>=1.0.0)", "twine (>=4.0.2)"]
runner = ["tox (>=4.11.1)"]
sast = ["bandit[toml] (>=1.7.5)"]
testing = ["pytest (>=7.4.0)"]
tooling = ["black (>=23.7.0)", "pyright (>=1.1.325)", "ruff (>=0.0.287)"]
tooling-extras = ["pyaml (>=23.7.0)", "pypandoc-binary (>=1.11)", "pytest (>=7.4.0)"]
brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)", "brotlipy (>=0.6.0)"]
secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"]
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[metadata]
lock-version = "2.0"
python-versions = "^3.11"
content-hash = "c0c1087bf3d383422333cbca7dc301927f981d97fa7d6ce1c91d01fd8e0e2125"
python-versions = "^3.10"
content-hash = "f6e631371be67516f200e86805f1fab33dcd481a779c93f97a510f7348d0e2aa"

View File

@@ -1,31 +1,24 @@
[tool.poetry]
name = "fediverse-blocklist-tool"
name = "mastodon-blocklist-deploy"
version = "0.1.0"
description = "A small tool to export, compareof merge and deploy blocklists of a fediverse server"
description = "A small tool to deploy blocklist updates to a mastodon server using its API."
authors = ["Georg Krause <mail@georg-krause.net>", "Julian-Samuel Gebühr <julian-samuel@gebuehr.net>"]
maintainers = ["Julian-Samuel Gebühr <julian-samuel@gebuehr.net>"]
readme = "README.md"
packages = [{include = "fediverse_blocklist_tool"}]
packages = [{include = "mastodon_blocklist_deploy"}]
license = "MIT"
keywords = ["fediverse", "blocklist", "mastodon", "gotosocial", "safety"]
keywords = ["mastodon", "blocklist", "fediverse"]
[tool.poetry.dependencies]
python = "^3.11"
python = "^3.10"
requests = "^2.28.1"
rich = "^13.0.1"
toml = "^0.10.2"
validators = "*"
[tool.poetry.scripts]
fbt = 'fediverse_blocklist_tool.cli:cli'
mastodon_blocklist_deploy = 'mastodon_blocklist_deploy.cli:cli'
[tool.poetry.group.test.dependencies]
pytest = "^7.4.0"
pytest-cov = "^4.1.0"
requests-mock = "^1.11.0"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

View File

View File

@@ -1,21 +0,0 @@
from fediverse_blocklist_tool.cli import get_format_from_filename
def test_get_format_from_filename():
assert "toml" == get_format_from_filename("temp.toml")
assert "csv" == get_format_from_filename("temp.csv")
assert "json" == get_format_from_filename("temp.json")
assert "md" == get_format_from_filename("temp.md")
assert "md" == get_format_from_filename("csv.md")
assert "md" == get_format_from_filename("toml.md")
assert "md" == get_format_from_filename("json.md")
assert "toml" == get_format_from_filename("csv.toml")
assert "toml" == get_format_from_filename("md.toml")
assert "toml" == get_format_from_filename("json.toml")
assert "toml" == get_format_from_filename("tmp")
assert "json" == get_format_from_filename("csv.json")
assert "json" == get_format_from_filename("md.json")
assert "json" == get_format_from_filename("toml.json")

View File

@@ -1,92 +0,0 @@
import pytest
from fediverse_blocklist_deploy import models
def test_empty_instance():
with pytest.raises(KeyError):
models.Instance({})
def test_minimal_init():
i: models.Instance = models.Instance({"domain": "abc.xyz"})
assert i.id is None
assert i.domain == "abc.xyz"
assert i.obfuscate == False
assert i.reject_media == False
assert i.reject_reports == False
def test_string_representation():
i: models.Instance = models.Instance({"domain": "abc.xyz"})
assert str(i) == "abc.xyz: suspend"
def test_status():
i: models.Instance = models.Instance({"domain": "abc.xyz"})
assert i.status_str() == "suspend\nReject reports: False\nReject media: False\nObfuscate: False"
def test_equality():
a1: models.Instance = models.Instance({"domain": "a"})
a2: models.Instance = models.Instance({"domain": "a"})
b: models.Instance = models.Instance({"domain": "b"})
assert a1 == a2
assert a2 != b
def test_as_dict():
test_data = {"domain": "abc.xyz", "severity": "suspend", "private_comment": "hidden", "public_comment": "", "obfuscate": True, "reject_media": False, "reject_reports": False}
i: models.Instance = models.Instance(test_data)
test_data.pop("private_comment")
assert i.as_dict() == test_data
def test_as_dict_private():
test_data = {"domain": "abc.xyz", "severity": "suspend", "private_comment": "hidden", "public_comment": "", "obfuscate": True, "reject_media": False, "reject_reports": False}
i: models.Instance = models.Instance(test_data)
assert i.as_dict(private=True) == test_data
def test_apply(requests_mock):
requests_mock.post("https://server.org/api/v1/admin/domain_blocks", text="success")
i: models.Instance = models.Instance({"domain": "abc.xyz"})
i.apply("server.org", token="abcdef")
assert requests_mock.called
def test_apply_with_id(requests_mock):
requests_mock.put("https://server.org/api/v1/admin/domain_blocks/123", text="success")
i: models.Instance = models.Instance({"domain": "abc.xyz"})
i.apply("server.org", token="abcdef", block_id=123)
assert requests_mock.called
def test_apply_error(requests_mock):
requests_mock.post("https://server.org/api/v1/admin/domain_blocks", status_code=400)
with pytest.raises(ConnectionError):
i: models.Instance = models.Instance({"domain": "abc.xyz"})
i.apply("server.org", token="abcdef")
assert requests_mock.called
def test_delete(requests_mock):
requests_mock.delete("https://server.org/api/v1/admin/domain_blocks/123", text="success")
i: models.Instance = models.Instance({"domain": "abc.xyz"})
i.id = 123
i.delete("server.org", token="abcdef")
assert requests_mock.called
def test_delete_error(requests_mock):
requests_mock.delete("https://server.org/api/v1/admin/domain_blocks/123", status_code=400)
i: models.Instance = models.Instance({"domain": "abc.xyz"})
i.id = 123
with pytest.raises(ConnectionError):
i.delete("server.org", token="abcdef")
assert requests_mock.called
def test_diff_equal():
a1: models.Instance = models.Instance({"domain": "a"})
a2: models.Instance = models.Instance({"domain": "a"})
assert models.Instance.list_diffs([a1], [a2]) == []
def test_diff_not_equal():
a1: models.Instance = models.Instance({"domain": "a2"})
a2: models.Instance = models.Instance({"domain": "a1"})
assert models.Instance.list_diffs([a1], [a2]) == [{"local": a1, "remote": None}, {"local": None, "remote": a2}]
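
The mocked endpoints in the deleted test_models.py pin down the request flow: POST to `/api/v1/admin/domain_blocks` creates a block, PUT to `/api/v1/admin/domain_blocks/{id}` updates an existing one, DELETE removes it, and any non-2xx response surfaces as a `ConnectionError`. A rough sketch of that flow — field names, defaults and the payload shape are assumptions, not the project's real model class — might be:

```python
# Hedged sketch of the API calls exercised by the mocked tests above.
import requests


class Instance:
    def __init__(self, data: dict):
        self.id = data.get("id")
        self.domain = data["domain"]  # raises KeyError for an empty dict, as test_empty_instance expects
        self.severity = data.get("severity", "suspend")
        self.obfuscate = data.get("obfuscate", False)
        self.reject_media = data.get("reject_media", False)
        self.reject_reports = data.get("reject_reports", False)

    def apply(self, server: str, token: str, block_id: int | None = None) -> None:
        """Create the block via POST, or update an existing block via PUT when block_id is given."""
        headers = {"Authorization": f"Bearer {token}"}
        payload = {
            "domain": self.domain,
            "severity": self.severity,
            "obfuscate": self.obfuscate,
            "reject_media": self.reject_media,
            "reject_reports": self.reject_reports,
        }
        if block_id is None:
            response = requests.post(f"https://{server}/api/v1/admin/domain_blocks",
                                     headers=headers, json=payload)
        else:
            response = requests.put(f"https://{server}/api/v1/admin/domain_blocks/{block_id}",
                                    headers=headers, json=payload)
        if not response.ok:
            raise ConnectionError(f"Could not apply block for {self.domain}: {response.status_code}")

    def delete(self, server: str, token: str) -> None:
        """Remove this block from the server using its block id."""
        response = requests.delete(f"https://{server}/api/v1/admin/domain_blocks/{self.id}",
                                   headers={"Authorization": f"Bearer {token}"})
        if not response.ok:
            raise ConnectionError(f"Could not delete block for {self.domain}: {response.status_code}")
```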