Compare commits

...

44 Commits

Author SHA1 Message Date
Fabio Manganiello b8215d2736 A more robust cron start logic
It may happen (usually because of a race condition) that a cronjob has
already been started, but it hasn't yet changed its status from IDLE to
RUNNING when the scheduler checks it.

This fix guards the application against such events. If they occur, we
should just report them and move on, not terminate the whole scheduler.
2022-10-27 10:45:59 +02:00
Fabio Manganiello a5db599268
FIX: Skip empty lines on `config.include` 2022-10-14 20:56:18 +02:00
Fabio Manganiello b88983f055
Added `qos` argument to `mqtt.publish`. 2022-10-05 01:13:47 +02:00
Fabio Manganiello 85f583a0ad
Reduced :maxdepth: of toctree in documentation.
Recent versions of Sphinx get a bit too zealous about generating deeply
nested toctrees.
2022-09-30 11:47:19 +02:00
Fabio Manganiello fed7c2c6ff
Fixed typo in schema path 2022-09-30 11:30:57 +02:00
Fabio Manganiello 1d78c3e753
FIX: Broken docstring 2022-09-30 10:56:08 +02:00
Fabio Manganiello 00d47731c5 Merge pull request 'Mimic3 integration' (#227) from 226-mimic3-integration into master
Reviewed-on: platypush/platypush#227
2022-09-30 10:52:53 +02:00
Fabio Manganiello ae226a5b01
Added `tts.mimic3` integration.
Closes: #226
2022-09-30 10:51:17 +02:00
Fabio Manganiello fef7aff245
LINT fixes for mpv plugin 2022-09-30 10:41:56 +02:00
Fabio Manganiello 82ab7face2
A more robust logic to detect the webserver local bind address 2022-09-30 03:10:37 +02:00
Fabio Manganiello 3ed10092ae Merge pull request 'Wallabag integration' (#225) from 222-wallabag-integration into master
Reviewed-on: platypush/platypush#225
2022-09-29 10:52:16 +02:00
Fabio Manganiello 4bab9d2607
[#224] Implemented Wallabag integration 2022-09-29 10:51:16 +02:00
Fabio Manganiello a0575ed6de
Bump version: 0.23.5 → 0.23.6 2022-09-19 20:41:02 +02:00
Fabio Manganiello 3d74f0a11f
Updated CHANGELOG 2022-09-19 20:40:54 +02:00
Fabio Manganiello 09baceab4b
Include album_id and the list of tracks in music.tidal.get_album 2022-09-19 20:39:21 +02:00
Fabio Manganiello c2a3f2f4f3
Bump version: 0.23.4 → 0.23.5 2022-09-18 19:55:05 +02:00
Fabio Manganiello 36dd645209
Use session.playlist instead of session.user.playlist to query playlists 2022-09-18 06:04:53 +02:00
Fabio Manganiello 61cda60751
Proper implementation for Tidal's add_to_playlist and remove_from_playlist methods
- Using tidalapi's `UserPlaylist.add` and `UserPlaylist.delete` methods
  instead of defining my own through `_api_request`, so we won't have to
  deal with the logic to set the ETag header.

- Added `remove_from_playlist` method.
2022-09-18 05:22:12 +02:00
Fabio Manganiello 7c610adc84
FIX: Apply expanduser to the credentials_file setting in music.tidal 2022-09-17 06:30:20 +02:00
Fabio Manganiello a9ebb4805a
Fixed doc warnings 2022-09-17 06:25:28 +02:00
Fabio Manganiello 1b405de0d5
Added missing docs 2022-09-17 06:09:39 +02:00
Fabio Manganiello e1aa214bad tidal-integration (#223)
Reviewed-on: platypush/platypush#223
2022-09-16 21:48:09 +02:00
Fabio Manganiello 41acf4b253
Generate event IDs as truly random strings, not MD5 hashes of UUIDs 2022-09-05 03:08:39 +02:00
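The commit above swaps MD5-hashed UUIDs for directly generated random strings. A minimal stdlib sketch of the difference (the actual id length and format used by platypush are assumptions here):

```python
import hashlib
import secrets
import uuid

# Old-style approach: hash a UUID. The result is only as random as the
# UUID itself; piping it through MD5 adds no entropy.
old_id = hashlib.md5(str(uuid.uuid4()).encode()).hexdigest()

# New-style approach: draw random bytes straight from the OS CSPRNG.
new_id = secrets.token_hex(16)

print(len(old_id), len(new_id))  # both 32 hex characters
```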
Fabio Manganiello c77746e278 If the output of a hook is null, make sure to normalize it to an empty string before pushing it to Redis 2022-09-04 16:16:02 +02:00
Fabio Manganiello 4682fb4210
Throw an assertion error when on_duplicate_update is specified on db.insert with no key_columns 2022-09-04 16:02:37 +02:00
Fabio Manganiello 0143dac216
Improved support for bulk database statements
- Wrapped insert/update/delete operations in transactions
- Proper (and much more efficient) bulk logic
- Better upsert logic
- Return inserted/updated records if the engine supports it
2022-09-04 13:30:35 +02:00
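The bulk-statement improvements above (transactions, batched statements, upserts) can be sketched with the standard-library `sqlite3` module — an illustration of the idea, not the `db` plugin's actual API (table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE sensor (id INTEGER PRIMARY KEY, value REAL)')

rows = [(1, 20.5), (2, 21.0), (3, 19.8)]

# One transaction for the whole batch: either all rows land or none,
# and executemany() is far cheaper than one round-trip per row.
with conn:
    conn.executemany('INSERT INTO sensor (id, value) VALUES (?, ?)', rows)

# Bulk upsert (SQLite >= 3.24), akin to an on_duplicate_update option:
# existing keys are updated, new keys are inserted, all in one batch.
with conn:
    conn.executemany(
        'INSERT INTO sensor (id, value) VALUES (?, ?) '
        'ON CONFLICT(id) DO UPDATE SET value = excluded.value',
        [(1, 25.0), (4, 18.2)],
    )

values = dict(conn.execute('SELECT id, value FROM sensor').fetchall())
print(values)  # {1: 25.0, 2: 21.0, 3: 19.8, 4: 18.2}
```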
Fabio Manganiello a90aa2cb2e Make sure that a webhook function never returns a null response 2022-09-04 00:52:41 +02:00
Fabio Manganiello 1ea53a6f50
Support for query placeholders in `db.select` 2022-09-04 00:28:08 +02:00
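Query placeholders let the driver bind values instead of interpolating them into the SQL string. A self-contained sketch with `sqlite3` named parameters (the exact placeholder syntax accepted by `db.select` may differ by backend):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE users (name TEXT, role TEXT)')
conn.executemany(
    'INSERT INTO users VALUES (?, ?)',
    [('alice', 'admin'), ('bob', 'user')],
)

# The :role placeholder is bound by the driver, so quoting and
# injection concerns are handled for us.
rows = conn.execute(
    'SELECT name FROM users WHERE role = :role',
    {'role': 'admin'},
).fetchall()
print(rows)  # [('alice',)]
```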
Fabio Manganiello e77d6a4ad4 Merge pull request 'Add support for OPML import and export in the RSS plugin' (#220) from 219-opml-import-export into master
Reviewed-on: platypush/platypush#220
2022-09-02 00:24:37 +02:00
Fabio Manganiello 61c96612bc Merge branch 'master' into 219-opml-import-export 2022-09-02 00:23:57 +02:00
Fabio Manganiello 6c6e68b512
Added support for OPML import and export in the RSS plugin.
[closes #219]
2022-09-02 00:21:40 +02:00
Fabio Manganiello a286cf5000 Updated PopcornTime base URL 2022-09-01 11:13:16 +02:00
Fabio Manganiello c5b12403d0
Implemented support for returning richer HTTP responses on webhooks.
A `WebhookEvent` hook can now return a tuple in the format `(data,
http_code, headers)` in order to customize the HTTP status code and the
headers of a response.
2022-09-01 01:37:18 +02:00
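On the server side, the `(data, http_code, headers)` tuple is serialized into a dict with `___data___`/`___code___`/`___headers___` keys before being unpacked into the HTTP response. A simplified, framework-free mirror of that unpacking step (not the actual route code):

```python
import json

def unpack_hook_response(raw, default_code=200):
    """
    A hook that returned (data, http_code, headers) arrives serialized
    as a dict with ___data___/___code___/___headers___ keys; anything
    else is passed through as the response body unchanged.
    """
    code, headers = default_code, {}
    try:
        body = json.loads(raw)
    except (TypeError, ValueError):
        body = raw

    if isinstance(body, dict) and '___data___' in body:
        headers = body.get('___headers___', {})
        code = body.get('___code___', default_code)
        body = body['___data___']
    return body, code, headers

body, code, headers = unpack_hook_response(
    json.dumps({
        '___data___': 'created',
        '___code___': 201,
        '___headers___': {'X-Request-Id': 'abc123'},
    })
)
print(body, code, headers)
```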
Fabio Manganiello 96b2ad148c
A smarter way of building and matching the event condition 2022-08-31 02:19:21 +02:00
Fabio Manganiello 67413c02cd
Handle the case where the condition is a serialized dictionary 2022-08-31 01:55:21 +02:00
Fabio Manganiello db45d7ecbf
FIX: More robust logic against section configurations that may not be maps 2022-08-31 01:27:53 +02:00
Fabio Manganiello a675fe6a92
Updated CHANGELOG 2022-08-31 00:49:08 +02:00
Fabio Manganiello c3fa3315f5
Implemented synchronization with webhook responses.
When a client triggers a `WebhookEvent` by calling a configured webhook
over `/hook/<hook_name>`, the server will now wait for the configured
`@hook` function to complete and it will return the returned response
back to the client.

This makes webhooks much more powerful, as they can be used to proxy
HTTP calls to other services and, in general, to return something to the
client instead of just executing actions.
2022-08-30 23:35:19 +02:00
Fabio Manganiello e08947a3b7
Merge pull request #311 from BlackLight/dependabot/npm_and_yarn/platypush/backend/http/webapp/terser-5.14.2
Bump terser from 5.12.1 to 5.14.2 in /platypush/backend/http/webapp
2022-08-29 00:59:55 +02:00
Fabio Manganiello 6d63d2fc74
Merge pull request #305 from BlackLight/dependabot/npm_and_yarn/platypush/backend/http/webapp/shell-quote-1.7.3
Bump shell-quote from 1.7.2 to 1.7.3 in /platypush/backend/http/webapp
2022-08-29 00:59:19 +02:00
Fabio Manganiello 540a7d469e
- Fixed documentation errors and warnings
- Split Matrix integration into `plugin` and `client` files.
2022-08-29 00:55:46 +02:00
Fabio Manganiello b11a0e8bbb
Bump version: 0.23.3 → 0.23.4 2022-08-28 15:27:54 +02:00
dependabot[bot] c7927a3d2f
Bump terser from 5.12.1 to 5.14.2 in /platypush/backend/http/webapp
Bumps [terser](https://github.com/terser/terser) from 5.12.1 to 5.14.2.
- [Release notes](https://github.com/terser/terser/releases)
- [Changelog](https://github.com/terser/terser/blob/master/CHANGELOG.md)
- [Commits](https://github.com/terser/terser/commits)

---
updated-dependencies:
- dependency-name: terser
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-21 00:35:29 +00:00
dependabot[bot] 06168d4ebd
Bump shell-quote from 1.7.2 to 1.7.3 in /platypush/backend/http/webapp
Bumps [shell-quote](https://github.com/substack/node-shell-quote) from 1.7.2 to 1.7.3.
- [Release notes](https://github.com/substack/node-shell-quote/releases)
- [Changelog](https://github.com/substack/node-shell-quote/blob/master/CHANGELOG.md)
- [Commits](https://github.com/substack/node-shell-quote/compare/v1.7.2...1.7.3)

---
updated-dependencies:
- dependency-name: shell-quote
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-23 11:29:13 +00:00
48 changed files with 3659 additions and 1282 deletions

View File

@ -1,10 +1,51 @@
# Changelog
All notable changes to this project will be documented in this file.
Given the high speed of development in the first phase, changes are being reported only starting from v0.20.2.
Given the high speed of development in the first phase, changes are being
reported only starting from v0.20.2.
## [Unreleased]
### Added
- Added [Wallabag integration](https://git.platypush.tech/platypush/platypush/issues/224).
- Added [Mimic3 TTS integration](https://git.platypush.tech/platypush/platypush/issues/226).
## [0.23.6] - 2022-09-19
### Fixed
- Fixed album_id and list of tracks on `music.tidal.get_album`.
## [0.23.5] - 2022-09-18
### Added
- Added support for webhooks returning their hook method responses back to the
HTTP client.
- Added [Tidal integration](https://git.platypush.tech/platypush/platypush/pulls/223)
- Added support for [OPML
subscriptions](https://git.platypush.tech/platypush/platypush/pulls/220) to
the `rss` plugin.
- Better support for bulk database operations on the `db` plugin.
### Fixed
- Now supporting YAML sections with empty configurations.
## [0.23.4] - 2022-08-28
### Added
- Added `matrix` integration
([issue](https://git.platypush.tech/platypush/platypush/issues/2),
[PR](https://git.platypush.tech/platypush/platypush/pulls/217)).
### Changed
- Removed `clipboard` backend. Enabling the `clipboard` plugin will also enable
clipboard monitoring, with no need for an additional backend.

View File

@ -3,7 +3,7 @@ Backends
========
.. toctree::
:maxdepth: 2
:maxdepth: 1
:caption: Backends:
platypush/backend/adafruit.io.rst

View File

@ -71,7 +71,7 @@ master_doc = 'index'
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
language = 'en'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
@ -195,7 +195,8 @@ intersphinx_mapping = {'https://docs.python.org/': None}
todo_include_todos = True
autodoc_default_options = {
'inherited-members': True,
'members': True,
'show-inheritance': True,
}
autodoc_mock_imports = [
@ -294,6 +295,7 @@ autodoc_mock_imports = [
'nio',
'aiofiles',
'aiofiles.os',
'async_lru',
]
sys.path.insert(0, os.path.abspath('../..'))

View File

@ -3,7 +3,7 @@ Events
======
.. toctree::
:maxdepth: 2
:maxdepth: 1
:caption: Events:
platypush/events/adafruit.rst
@ -47,6 +47,7 @@ Events
platypush/events/mqtt.rst
platypush/events/music.rst
platypush/events/music.snapcast.rst
platypush/events/music.tidal.rst
platypush/events/nextcloud.rst
platypush/events/nfc.rst
platypush/events/ngrok.rst

View File

@ -16,7 +16,7 @@ For more information on Platypush check out:
.. _Blog articles: https://blog.platypush.tech
.. toctree::
:maxdepth: 3
:maxdepth: 2
:caption: Contents:
backends

View File

@ -0,0 +1,5 @@
``music.tidal``
===============
.. automodule:: platypush.message.event.music.tidal
:members:

View File

@ -2,4 +2,4 @@
==========================
.. automodule:: platypush.plugins.dbus
:members:
:exclude-members: DBusService, BusType

View File

@ -2,4 +2,4 @@
==========
.. automodule:: platypush.plugins.matrix
:members:
:members: MatrixPlugin

View File

@ -0,0 +1,5 @@
``music.tidal``
===============
.. automodule:: platypush.plugins.music.tidal
:members:

View File

@ -0,0 +1,5 @@
``tts.mimic3``
==============
.. automodule:: platypush.plugins.tts.mimic3
:members:

View File

@ -0,0 +1,5 @@
``wallabag``
============
.. automodule:: platypush.plugins.wallabag
:members:

View File

@ -3,7 +3,7 @@ Plugins
=======
.. toctree::
:maxdepth: 2
:maxdepth: 1
:caption: Plugins:
platypush/plugins/adafruit.io.rst
@ -94,6 +94,7 @@ Plugins
platypush/plugins/music.mpd.rst
platypush/plugins/music.snapcast.rst
platypush/plugins/music.spotify.rst
platypush/plugins/music.tidal.rst
platypush/plugins/nextcloud.rst
platypush/plugins/ngrok.rst
platypush/plugins/nmap.rst
@ -131,12 +132,14 @@ Plugins
platypush/plugins/trello.rst
platypush/plugins/tts.rst
platypush/plugins/tts.google.rst
platypush/plugins/tts.mimic3.rst
platypush/plugins/tv.samsung.ws.rst
platypush/plugins/twilio.rst
platypush/plugins/udp.rst
platypush/plugins/user.rst
platypush/plugins/utils.rst
platypush/plugins/variable.rst
platypush/plugins/wallabag.rst
platypush/plugins/weather.buienradar.rst
platypush/plugins/weather.darksky.rst
platypush/plugins/weather.openweathermap.rst

View File

@ -3,7 +3,7 @@ Responses
=========
.. toctree::
:maxdepth: 2
:maxdepth: 1
:caption: Responses:
platypush/responses/bluetooth.rst

View File

@ -23,7 +23,7 @@ from .message.response import Response
from .utils import set_thread_name, get_enabled_plugins
__author__ = 'Fabio Manganiello <info@fabiomanganiello.com>'
__version__ = '0.23.3'
__version__ = '0.23.6'
logger = logging.getLogger('platypush')

View File

@ -1,9 +1,11 @@
import json
from flask import Blueprint, abort, request, Response
from flask import Blueprint, abort, request, make_response
from platypush.backend.http.app import template_folder
from platypush.backend.http.app.utils import logger, send_message
from platypush.config import Config
from platypush.event.hook import EventCondition
from platypush.message.event.http.hook import WebhookEvent
@ -15,9 +17,23 @@ __routes__ = [
]
@hook.route('/hook/<hook_name>', methods=['GET', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS'])
def _hook(hook_name):
""" Endpoint for custom webhooks """
def matches_condition(event: WebhookEvent, hook):
if isinstance(hook, dict):
if_ = hook['if'].copy()
if_['type'] = '.'.join([event.__module__, event.__class__.__qualname__])
condition = EventCondition.build(if_)
else:
condition = hook.condition
return event.matches_condition(condition)
@hook.route(
'/hook/<hook_name>', methods=['GET', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS']
)
def hook_route(hook_name):
"""Endpoint for custom webhooks"""
event_args = {
'hook': hook_name,
@ -28,20 +44,54 @@ def _hook(hook_name):
}
if event_args['data']:
# noinspection PyBroadException
try:
event_args['data'] = json.loads(event_args['data'])
except Exception as e:
logger().warning('Not a valid JSON string: {}: {}'.format(event_args['data'], str(e)))
logger().warning(
'Not a valid JSON string: %s: %s', event_args['data'], str(e)
)
event = WebhookEvent(**event_args)
matching_hooks = [
hook
for hook in Config.get_event_hooks().values()
if matches_condition(event, hook)
]
try:
send_message(event)
return Response(json.dumps({'status': 'ok', **event_args}), mimetype='application/json')
rs = default_rs = make_response(json.dumps({'status': 'ok', **event_args}))
headers = {}
status_code = 200
# If there are matching hooks, wait for their completion before returning
if matching_hooks:
rs = event.wait_response(timeout=60)
try:
rs = json.loads(rs.decode()) # type: ignore
except Exception:
pass
if isinstance(rs, dict) and '___data___' in rs:
# data + http_code + custom_headers return format
headers = rs.get('___headers___', {})
status_code = rs.get('___code___', status_code)
rs = rs['___data___']
if rs is None:
rs = default_rs
headers = {'Content-Type': 'application/json'}
rs = make_response(rs)
else:
headers = {'Content-Type': 'application/json'}
rs.status_code = status_code
rs.headers.update(headers)
return rs
except Exception as e:
logger().exception(e)
logger().error('Error while dispatching webhook event {}: {}'.format(event, str(e)))
logger().error('Error while dispatching webhook event %s: %s', event, str(e))
abort(500, str(e))

View File

@ -0,0 +1,46 @@
import requests
from urllib.parse import urljoin
from flask import abort, request, Blueprint
from platypush.backend.http.app import template_folder
mimic3 = Blueprint('mimic3', __name__, template_folder=template_folder)
# Declare routes list
__routes__ = [
mimic3,
]
@mimic3.route('/tts/mimic3/say', methods=['GET'])
def proxy_tts_request():
"""
This route is used to proxy the POST request to the Mimic3 TTS server
through a GET, so it can be easily processed as a URL through a media
plugin.
"""
required_args = {
'text',
'server_url',
'voice',
}
missing_args = required_args.difference(set(request.args.keys()))
if missing_args:
abort(400, f'Missing parameters: {missing_args}')
args = {arg: request.args[arg] for arg in required_args}
rs = requests.post(
urljoin(args['server_url'], '/api/tts'),
data=args['text'],
params={
'voice': args['voice'],
},
)
return rs.content
# vim:sw=4:ts=4:et:

View File

@ -35,13 +35,15 @@ def logger():
'format': '%(asctime)-15s|%(levelname)5s|%(name)s|%(message)s',
}
level = (Config.get('backend.http') or {}).get('logging') or \
(Config.get('logging') or {}).get('level')
level = (Config.get('backend.http') or {}).get('logging') or (
Config.get('logging') or {}
).get('level')
filename = (Config.get('backend.http') or {}).get('filename')
if level:
log_args['level'] = getattr(logging, level.upper()) \
if isinstance(level, str) else level
log_args['level'] = (
getattr(logging, level.upper()) if isinstance(level, str) else level
)
if filename:
log_args['filename'] = filename
@ -65,6 +67,7 @@ def get_message_response(msg):
# noinspection PyProtectedMember
def get_http_port():
from platypush.backend.http import HttpBackend
http_conf = Config.get('backend.http')
return http_conf.get('port', HttpBackend._DEFAULT_HTTP_PORT)
@ -72,6 +75,7 @@ def get_http_port():
# noinspection PyProtectedMember
def get_websocket_port():
from platypush.backend.http import HttpBackend
http_conf = Config.get('backend.http')
return http_conf.get('websocket_port', HttpBackend._DEFAULT_WEBSOCKET_PORT)
@ -89,17 +93,13 @@ def send_message(msg, wait_for_response=True):
if isinstance(msg, Request) and wait_for_response:
response = get_message_response(msg)
logger().debug('Processing response on the HTTP backend: {}'.
format(response))
logger().debug('Processing response on the HTTP backend: {}'.format(response))
return response
def send_request(action, wait_for_response=True, **kwargs):
msg = {
'type': 'request',
'action': action
}
msg = {'type': 'request', 'action': action}
if kwargs:
msg['args'] = kwargs
@ -113,8 +113,10 @@ def _authenticate_token():
if 'X-Token' in request.headers:
user_token = request.headers['X-Token']
elif 'Authorization' in request.headers and request.headers['Authorization'].startswith('Bearer '):
user_token = request.headers['Authorization'][len('Bearer '):]
elif 'Authorization' in request.headers and request.headers[
'Authorization'
].startswith('Bearer '):
user_token = request.headers['Authorization'][7:]
elif 'token' in request.args:
user_token = request.args.get('token')
else:
@ -176,7 +178,10 @@ def _authenticate_csrf_token():
if user is None:
return False
return session.csrf_token is None or request.form.get('csrf_token') == session.csrf_token
return (
session.csrf_token is None
or request.form.get('csrf_token') == session.csrf_token
)
def authenticate(redirect_page='', skip_auth_methods=None, check_csrf_token=False):
@ -208,7 +213,9 @@ def authenticate(redirect_page='', skip_auth_methods=None, check_csrf_token=Fals
if session_auth_ok:
return f(*args, **kwargs)
return redirect('/login?redirect=' + (redirect_page or request.url), 307)
return redirect(
'/login?redirect=' + (redirect_page or request.url), 307
)
# CSRF token check
if check_csrf_token:
@ -217,15 +224,22 @@ def authenticate(redirect_page='', skip_auth_methods=None, check_csrf_token=Fals
return abort(403, 'Invalid or missing csrf_token')
if n_users == 0 and 'session' not in skip_methods:
return redirect('/register?redirect=' + (redirect_page or request.url), 307)
return redirect(
'/register?redirect=' + (redirect_page or request.url), 307
)
if ('http' not in skip_methods and http_auth_ok) or \
('token' not in skip_methods and token_auth_ok) or \
('session' not in skip_methods and session_auth_ok):
if (
('http' not in skip_methods and http_auth_ok)
or ('token' not in skip_methods and token_auth_ok)
or ('session' not in skip_methods and session_auth_ok)
):
return f(*args, **kwargs)
return Response('Authentication required', 401,
{'WWW-Authenticate': 'Basic realm="Login required"'})
return Response(
'Authentication required',
401,
{'WWW-Authenticate': 'Basic realm="Login required"'},
)
return wrapper
@ -233,42 +247,57 @@ def authenticate(redirect_page='', skip_auth_methods=None, check_csrf_token=Fals
def get_routes():
routes_dir = os.path.join(
os.path.dirname(os.path.abspath(__file__)), 'routes')
routes_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'routes')
routes = []
base_module = '.'.join(__name__.split('.')[:-1])
for path, dirs, files in os.walk(routes_dir):
for path, _, files in os.walk(routes_dir):
for f in files:
if f.endswith('.py'):
mod_name = '.'.join(
(base_module + '.' + os.path.join(path, f).replace(
os.path.dirname(__file__), '')[1:].replace(os.sep, '.')).split('.')
[:(-2 if f == '__init__.py' else -1)])
(
base_module
+ '.'
+ os.path.join(path, f)
.replace(os.path.dirname(__file__), '')[1:]
.replace(os.sep, '.')
).split('.')[: (-2 if f == '__init__.py' else -1)]
)
try:
mod = importlib.import_module(mod_name)
if hasattr(mod, '__routes__'):
routes.extend(mod.__routes__)
except Exception as e:
logger().warning('Could not import routes from {}/{}: {}: {}'.
format(path, f, type(e), str(e)))
logger().warning(
'Could not import routes from {}/{}: {}: {}'.format(
path, f, type(e), str(e)
)
)
return routes
def get_local_base_url():
http_conf = Config.get('backend.http') or {}
return '{proto}://localhost:{port}'.format(
bind_address = http_conf.get('bind_address')
if not bind_address or bind_address == '0.0.0.0':
bind_address = 'localhost'
return '{proto}://{host}:{port}'.format(
proto=('https' if http_conf.get('ssl_cert') else 'http'),
port=get_http_port())
host=bind_address,
port=get_http_port(),
)
def get_remote_base_url():
http_conf = Config.get('backend.http') or {}
return '{proto}://{host}:{port}'.format(
proto=('https' if http_conf.get('ssl_cert') else 'http'),
host=get_ip_or_hostname(), port=get_http_port())
host=get_ip_or_hostname(),
port=get_http_port(),
)
# vim:sw=4:ts=4:et:
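The `bind_address` fix above falls back to `localhost` only when the server binds to all interfaces, since `0.0.0.0` is not a routable host for a client URL. A standalone sketch of the same normalization (the `8008` default port and config keys mirror the diff, but the function here is illustrative, not the actual helper):

```python
def get_local_base_url(http_conf):
    # 0.0.0.0 means "listen on every interface"; a client URL still
    # needs a concrete host, so fall back to localhost in that case.
    host = http_conf.get('bind_address')
    if not host or host == '0.0.0.0':
        host = 'localhost'
    proto = 'https' if http_conf.get('ssl_cert') else 'http'
    return f"{proto}://{host}:{http_conf.get('port', 8008)}"

local = get_local_base_url({'bind_address': '0.0.0.0'})
secure = get_local_base_url(
    {'bind_address': '192.168.1.10', 'ssl_cert': '/etc/ssl/cert.pem'}
)
print(local)   # http://localhost:8008
print(secure)  # https://192.168.1.10:8008
```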

View File

@ -1759,6 +1759,58 @@
"integrity": "sha512-ZnQMnLV4e7hDlUvw8H+U8ASL02SS2Gn6+9Ac3wGGLIe7+je2AeAOxPY+izIPJDfFDb7eDjev0Us8MO1iFRN8hA==",
"dev": true
},
"node_modules/@jridgewell/gen-mapping": {
"version": "0.3.2",
"resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.2.tgz",
"integrity": "sha512-mh65xKQAzI6iBcFzwv28KVWSmCkdRBWoOh+bYQGW3+6OZvbbN3TqMGo5hqYxQniRcH9F2VZIoJCm4pa3BPDK/A==",
"dependencies": {
"@jridgewell/set-array": "^1.0.1",
"@jridgewell/sourcemap-codec": "^1.4.10",
"@jridgewell/trace-mapping": "^0.3.9"
},
"engines": {
"node": ">=6.0.0"
}
},
"node_modules/@jridgewell/resolve-uri": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.0.tgz",
"integrity": "sha512-F2msla3tad+Mfht5cJq7LSXcdudKTWCVYUgw6pLFOOHSTtZlj6SWNYAp+AhuqLmWdBO2X5hPrLcu8cVP8fy28w==",
"engines": {
"node": ">=6.0.0"
}
},
"node_modules/@jridgewell/set-array": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/@jridgewell/set-array/-/set-array-1.1.2.tgz",
"integrity": "sha512-xnkseuNADM0gt2bs+BvhO0p78Mk762YnZdsuzFV018NoG1Sj1SCQvpSqa7XUaTam5vAGasABV9qXASMKnFMwMw==",
"engines": {
"node": ">=6.0.0"
}
},
"node_modules/@jridgewell/source-map": {
"version": "0.3.2",
"resolved": "https://registry.npmjs.org/@jridgewell/source-map/-/source-map-0.3.2.tgz",
"integrity": "sha512-m7O9o2uR8k2ObDysZYzdfhb08VuEml5oWGiosa1VdaPZ/A6QyPkAJuwN0Q1lhULOf6B7MtQmHENS743hWtCrgw==",
"dependencies": {
"@jridgewell/gen-mapping": "^0.3.0",
"@jridgewell/trace-mapping": "^0.3.9"
}
},
"node_modules/@jridgewell/sourcemap-codec": {
"version": "1.4.14",
"resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.4.14.tgz",
"integrity": "sha512-XPSJHWmi394fuUuzDnGz1wiKqWfo1yXecHQMRf2l6hztTO+nPru658AyDngaBe7isIxEkRsPR3FZh+s7iVa4Uw=="
},
"node_modules/@jridgewell/trace-mapping": {
"version": "0.3.14",
"resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.14.tgz",
"integrity": "sha512-bJWEfQ9lPTvm3SneWwRFVLzrh6nhjwqw7TUFFBEMzwvg7t7PCDenf2lDwqo4NQXzdpgBXyFgDWnQA+2vkruksQ==",
"dependencies": {
"@jridgewell/resolve-uri": "^3.0.3",
"@jridgewell/sourcemap-codec": "^1.4.10"
}
},
"node_modules/@node-ipc/js-queue": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/@node-ipc/js-queue/-/js-queue-2.0.3.tgz",
@ -9710,9 +9762,9 @@
}
},
"node_modules/shell-quote": {
"version": "1.7.2",
"resolved": "https://registry.npmjs.org/shell-quote/-/shell-quote-1.7.2.tgz",
"integrity": "sha512-mRz/m/JVscCrkMyPqHc/bczi3OQHkLTqXHEFu0zDhK/qfv3UcOA4SVmRCLmos4bhjr9ekVQubj/R7waKapmiQg==",
"version": "1.7.3",
"resolved": "https://registry.npmjs.org/shell-quote/-/shell-quote-1.7.3.tgz",
"integrity": "sha512-Vpfqwm4EnqGdlsBFNmHhxhElJYrdfcxPThu+ryKS5J8L/fhAwLazFZtq+S+TWZ9ANj2piSQLGj6NQg+lKPmxrw==",
"dev": true
},
"node_modules/signal-exit": {
@ -10223,13 +10275,13 @@
}
},
"node_modules/terser": {
"version": "5.12.1",
"resolved": "https://registry.npmjs.org/terser/-/terser-5.12.1.tgz",
"integrity": "sha512-NXbs+7nisos5E+yXwAD+y7zrcTkMqb0dEJxIGtSKPdCBzopf7ni4odPul2aechpV7EXNvOudYOX2bb5tln1jbQ==",
"version": "5.14.2",
"resolved": "https://registry.npmjs.org/terser/-/terser-5.14.2.tgz",
"integrity": "sha512-oL0rGeM/WFQCUd0y2QrWxYnq7tfSuKBiqTjRPWrRgB46WD/kiwHwF8T23z78H6Q6kGCuuHcPB+KULHRdxvVGQA==",
"dependencies": {
"@jridgewell/source-map": "^0.3.2",
"acorn": "^8.5.0",
"commander": "^2.20.0",
"source-map": "~0.7.2",
"source-map-support": "~0.5.20"
},
"bin": {
@ -10308,14 +10360,6 @@
"node": ">=0.4.0"
}
},
"node_modules/terser/node_modules/source-map": {
"version": "0.7.3",
"resolved": "https://registry.npmjs.org/source-map/-/source-map-0.7.3.tgz",
"integrity": "sha512-CkCj6giN3S+n9qrYiBTX5gystlENnRW5jZeNLHpe6aue+SrHcG5VYwujhW9s4dY31mEGsxBDrHR6oI69fTXsaQ==",
"engines": {
"node": ">= 8"
}
},
"node_modules/text-table": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/text-table/-/text-table-0.2.0.tgz",
@ -12938,6 +12982,49 @@
"integrity": "sha512-ZnQMnLV4e7hDlUvw8H+U8ASL02SS2Gn6+9Ac3wGGLIe7+je2AeAOxPY+izIPJDfFDb7eDjev0Us8MO1iFRN8hA==",
"dev": true
},
"@jridgewell/gen-mapping": {
"version": "0.3.2",
"resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.2.tgz",
"integrity": "sha512-mh65xKQAzI6iBcFzwv28KVWSmCkdRBWoOh+bYQGW3+6OZvbbN3TqMGo5hqYxQniRcH9F2VZIoJCm4pa3BPDK/A==",
"requires": {
"@jridgewell/set-array": "^1.0.1",
"@jridgewell/sourcemap-codec": "^1.4.10",
"@jridgewell/trace-mapping": "^0.3.9"
}
},
"@jridgewell/resolve-uri": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.0.tgz",
"integrity": "sha512-F2msla3tad+Mfht5cJq7LSXcdudKTWCVYUgw6pLFOOHSTtZlj6SWNYAp+AhuqLmWdBO2X5hPrLcu8cVP8fy28w=="
},
"@jridgewell/set-array": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/@jridgewell/set-array/-/set-array-1.1.2.tgz",
"integrity": "sha512-xnkseuNADM0gt2bs+BvhO0p78Mk762YnZdsuzFV018NoG1Sj1SCQvpSqa7XUaTam5vAGasABV9qXASMKnFMwMw=="
},
"@jridgewell/source-map": {
"version": "0.3.2",
"resolved": "https://registry.npmjs.org/@jridgewell/source-map/-/source-map-0.3.2.tgz",
"integrity": "sha512-m7O9o2uR8k2ObDysZYzdfhb08VuEml5oWGiosa1VdaPZ/A6QyPkAJuwN0Q1lhULOf6B7MtQmHENS743hWtCrgw==",
"requires": {
"@jridgewell/gen-mapping": "^0.3.0",
"@jridgewell/trace-mapping": "^0.3.9"
}
},
"@jridgewell/sourcemap-codec": {
"version": "1.4.14",
"resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.4.14.tgz",
"integrity": "sha512-XPSJHWmi394fuUuzDnGz1wiKqWfo1yXecHQMRf2l6hztTO+nPru658AyDngaBe7isIxEkRsPR3FZh+s7iVa4Uw=="
},
"@jridgewell/trace-mapping": {
"version": "0.3.14",
"resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.14.tgz",
"integrity": "sha512-bJWEfQ9lPTvm3SneWwRFVLzrh6nhjwqw7TUFFBEMzwvg7t7PCDenf2lDwqo4NQXzdpgBXyFgDWnQA+2vkruksQ==",
"requires": {
"@jridgewell/resolve-uri": "^3.0.3",
"@jridgewell/sourcemap-codec": "^1.4.10"
}
},
"@node-ipc/js-queue": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/@node-ipc/js-queue/-/js-queue-2.0.3.tgz",
@ -18912,9 +18999,9 @@
"dev": true
},
"shell-quote": {
"version": "1.7.2",
"resolved": "https://registry.npmjs.org/shell-quote/-/shell-quote-1.7.2.tgz",
"integrity": "sha512-mRz/m/JVscCrkMyPqHc/bczi3OQHkLTqXHEFu0zDhK/qfv3UcOA4SVmRCLmos4bhjr9ekVQubj/R7waKapmiQg==",
"version": "1.7.3",
"resolved": "https://registry.npmjs.org/shell-quote/-/shell-quote-1.7.3.tgz",
"integrity": "sha512-Vpfqwm4EnqGdlsBFNmHhxhElJYrdfcxPThu+ryKS5J8L/fhAwLazFZtq+S+TWZ9ANj2piSQLGj6NQg+lKPmxrw==",
"dev": true
},
"signal-exit": {
@ -19314,13 +19401,13 @@
"integrity": "sha512-GNzQvQTOIP6RyTfE2Qxb8ZVlNmw0n88vp1szwWRimP02mnTsx3Wtn5qRdqY9w2XduFNUgvOwhNnQsjwCp+kqaQ=="
},
"terser": {
"version": "5.12.1",
"resolved": "https://registry.npmjs.org/terser/-/terser-5.12.1.tgz",
"integrity": "sha512-NXbs+7nisos5E+yXwAD+y7zrcTkMqb0dEJxIGtSKPdCBzopf7ni4odPul2aechpV7EXNvOudYOX2bb5tln1jbQ==",
"version": "5.14.2",
"resolved": "https://registry.npmjs.org/terser/-/terser-5.14.2.tgz",
"integrity": "sha512-oL0rGeM/WFQCUd0y2QrWxYnq7tfSuKBiqTjRPWrRgB46WD/kiwHwF8T23z78H6Q6kGCuuHcPB+KULHRdxvVGQA==",
"requires": {
"@jridgewell/source-map": "^0.3.2",
"acorn": "^8.5.0",
"commander": "^2.20.0",
"source-map": "~0.7.2",
"source-map-support": "~0.5.20"
},
"dependencies": {
@ -19328,11 +19415,6 @@
"version": "8.7.0",
"resolved": "https://registry.npmjs.org/acorn/-/acorn-8.7.0.tgz",
"integrity": "sha512-V/LGr1APy+PXIwKebEWrkZPwoeoF+w1jiOBUmuxuiUIaOHtob8Qc9BTrYo7VuI5fR8tqsy+buA2WFooR5olqvQ=="
},
"source-map": {
"version": "0.7.3",
"resolved": "https://registry.npmjs.org/source-map/-/source-map-0.7.3.tgz",
"integrity": "sha512-CkCj6giN3S+n9qrYiBTX5gystlENnRW5jZeNLHpe6aue+SrHcG5VYwujhW9s4dY31mEGsxBDrHR6oI69fTXsaQ=="
}
}
},

View File

@ -201,6 +201,8 @@ class Config:
)
for include_file in include_files:
if not include_file:
continue
if not os.path.isabs(include_file):
include_file = os.path.join(cfgfile_dir, include_file)
self._included_files.add(include_file)
@ -215,7 +217,9 @@ class Config:
)
else:
section_config = file_config.get(section, {}) or {}
if not section_config.get('disabled'):
if not (
hasattr(section_config, 'get') and section_config.get('disabled')
):
config[section] = section_config
return config
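The `hasattr(section_config, 'get')` guard above covers YAML sections whose parsed value isn't a mapping — e.g. an empty section that YAML loads as `None`. A self-contained sketch of that filtering logic (section names are made up):

```python
def enabled_sections(file_config):
    """
    Keep a section unless it's a mapping with disabled: true.
    Sections that aren't maps (None from an empty YAML block, scalars,
    lists) no longer break the check; they're normalized and kept.
    """
    config = {}
    for section, section_config in file_config.items():
        section_config = section_config or {}
        if not (
            hasattr(section_config, 'get') and section_config.get('disabled')
        ):
            config[section] = section_config
    return config

parsed = {
    'light.hue': {'bridge': '192.168.1.2'},
    'tts': None,                   # empty YAML section -> parsed as None
    'old.plugin': {'disabled': True},
}
result = enabled_sections(parsed)
print(sorted(result))  # ['light.hue', 'tts']
```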

View File

@ -153,7 +153,10 @@ class CronScheduler(threading.Thread):
for (job_name, job_config) in self.jobs_config.items():
job = self._get_job(name=job_name, config=job_config)
if job.state == CronjobState.IDLE:
job.start()
try:
job.start()
except Exception as e:
logger.warning(f'Could not start cronjob {job_name}: {e}')
t_before_wait = get_now().timestamp()
self._should_stop.wait(timeout=self._poll_seconds)
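The race the guard above protects against is easy to reproduce with plain threads: calling `start()` on a thread that has already been started raises `RuntimeError`, which is why the scheduler now wraps `job.start()` in a try/except, logs a warning, and moves on instead of dying:

```python
import threading
import time

def sleep_briefly():
    time.sleep(0.1)

job = threading.Thread(target=sleep_briefly)
job.start()  # first start succeeds

# A second start() on the same thread object always raises RuntimeError,
# mirroring the scheduler racing a job that is already RUNNING.
try:
    job.start()
    outcome = 'started twice'
except RuntimeError:
    outcome = 'guarded'  # report it and move on, as the fix does

job.join()
print(outcome)  # guarded
```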

View File

@ -15,10 +15,10 @@ logger = logging.getLogger('platypush')
def parse(msg):
""" Builds a dict given another dictionary or
a JSON UTF-8 encoded string/bytearray """
"""Builds a dict given another dictionary or
a JSON UTF-8 encoded string/bytearray"""
if isinstance(msg, bytes) or isinstance(msg, bytearray):
if isinstance(msg, (bytes, bytearray)):
msg = msg.decode('utf-8')
if isinstance(msg, str):
try:
@ -30,8 +30,8 @@ def parse(msg):
return msg
class EventCondition(object):
""" Event hook condition class """
class EventCondition:
"""Event hook condition class"""
def __init__(self, type=Event.__class__, priority=None, **kwargs):
"""
@ -55,8 +55,8 @@ class EventCondition(object):
@classmethod
def build(cls, rule):
""" Builds a rule given either another EventRule, a dictionary or
a JSON UTF-8 encoded string/bytearray """
"""Builds a rule given either another EventRule, a dictionary or
a JSON UTF-8 encoded string/bytearray"""
if isinstance(rule, cls):
return rule
@ -64,8 +64,7 @@ class EventCondition(object):
rule = parse(rule)
assert isinstance(rule, dict), f'Not a valid rule: {rule}'
type = get_event_class_by_type(
rule.pop('type') if 'type' in rule else 'Event')
type = get_event_class_by_type(rule.pop('type') if 'type' in rule else 'Event')
args = {}
for (key, value) in rule.items():
@ -75,8 +74,8 @@ class EventCondition(object):
class EventAction(Request):
""" Event hook action class. It is a special type of runnable request
whose fields can be configured later depending on the event context """
"""Event hook action class. It is a special type of runnable request
whose fields can be configured later depending on the event context"""
def __init__(self, target=None, action=None, **args):
if target is None:
@@ -99,16 +98,16 @@ class EventAction(Request):
return super().build(action)
class EventHook(object):
""" Event hook class. It consists of one conditions and
one or multiple actions to be executed """
class EventHook:
"""Event hook class. It consists of one conditions and
one or multiple actions to be executed"""
def __init__(self, name, priority=None, condition=None, actions=None):
""" Constructor. Takes a name, a EventCondition object and an event action
procedure as input. It may also have a priority attached
as a positive number. If multiple hooks match against an event,
only the ones that have either the maximum match score or the
maximum pre-configured priority will be run. """
"""Constructor. Takes a name, a EventCondition object and an event action
procedure as input. It may also have a priority attached
as a positive number. If multiple hooks match against an event,
only the ones that have either the maximum match score or the
maximum pre-configured priority will be run."""
self.name = name
self.condition = EventCondition.build(condition or {})
@@ -118,8 +117,8 @@ class EventHook(object):
@classmethod
def build(cls, name, hook):
""" Builds a rule given either another EventRule, a dictionary or
a JSON UTF-8 encoded string/bytearray """
"""Builds a rule given either another EventRule, a dictionary or
a JSON UTF-8 encoded string/bytearray"""
if isinstance(hook, cls):
return hook
@@ -146,14 +145,14 @@ class EventHook(object):
return cls(name=name, condition=condition, actions=actions, priority=priority)
def matches_event(self, event):
""" Returns an EventMatchResult object containing the information
about the match between the event and this hook """
"""Returns an EventMatchResult object containing the information
about the match between the event and this hook"""
return event.matches_condition(self.condition)
def run(self, event):
""" Checks the condition of the hook against a particular event and
runs the hook actions if the condition is met """
"""Checks the condition of the hook against a particular event and
runs the hook actions if the condition is met"""
def _thread_func(result):
set_thread_name('Event-' + self.name)
@@ -163,7 +162,9 @@ class EventHook(object):
if result.is_match:
logger.info('Running hook {} triggered by an event'.format(self.name))
threading.Thread(target=_thread_func, name='Event-' + self.name, args=(result,)).start()
threading.Thread(
target=_thread_func, name='Event-' + self.name, args=(result,)
).start()
def hook(event_type=Event, **condition):
@@ -172,8 +173,14 @@ def hook(event_type=Event, **condition):
f.condition = EventCondition(type=event_type, **condition)
@wraps(f)
def wrapped(*args, **kwargs):
return exec_wrapper(f, *args, **kwargs)
def wrapped(event, *args, **kwargs):
from platypush.message.event.http.hook import WebhookEvent
response = exec_wrapper(f, event, *args, **kwargs)
if isinstance(event, WebhookEvent):
event.send_response(response)
return response
return wrapped
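The shape of the new `wrapped` closure can be illustrated standalone: the decorator runs the handler, and if the triggering event knows how to carry a response back to its caller (as `WebhookEvent` now does), the handler's return value is forwarded to it. A minimal sketch with stand-in classes (these are illustrative, not the platypush types):

```python
from functools import wraps


class Event:
    pass


class RespondingEvent(Event):
    # Stand-in for WebhookEvent: an event that can push a response
    # back to whoever triggered it.
    def __init__(self):
        self.sent = None

    def send_response(self, response):
        self.sent = response


def hook(f):
    @wraps(f)
    def wrapped(event, *args, **kwargs):
        response = f(event, *args, **kwargs)
        # Only events that support responses get one delivered
        if isinstance(event, RespondingEvent):
            event.send_response(response)
        return response

    return wrapped


@hook
def on_event(event):
    return {'status': 'ok'}
```

In the real code the same check is done against `WebhookEvent`, so hooks attached to plain events keep their old behaviour while webhook hooks gain a way to answer the HTTP request.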

View File

@@ -1,10 +1,9 @@
import copy
import hashlib
import json
import random
import re
import sys
import time
import uuid
from datetime import date
@@ -79,9 +78,7 @@ class Event(Message):
@staticmethod
def _generate_id():
"""Generate a unique event ID"""
return hashlib.md5(
str(uuid.uuid1()).encode()
).hexdigest() # lgtm [py/weak-sensitive-data-hashing]
return ''.join(['{:02x}'.format(random.randint(0, 255)) for _ in range(16)])
def matches_condition(self, condition):
"""

View File

@@ -1,11 +1,26 @@
import json
import uuid
from platypush.message.event import Event
from platypush.utils import get_redis
class WebhookEvent(Event):
"""
Event triggered when a custom webhook is called.
"""
def __init__(self, *argv, hook, method, data=None, args=None, headers=None, **kwargs):
def __init__(
self,
*argv,
hook,
method,
data=None,
args=None,
headers=None,
response=None,
**kwargs,
):
"""
:param hook: Name of the invoked web hook, from http://host:port/hook/<hook>
:type hook: str
@@ -21,10 +36,56 @@ class WebhookEvent(Event):
:param headers: Request headers
:type headers: dict
"""
super().__init__(hook=hook, method=method, data=data, args=args or {},
headers=headers or {}, *argv, **kwargs)
:param response: Response returned by the hook.
:type response: dict | list | str
"""
# This queue is used to synchronize with the hook and wait for its completion
kwargs['response_queue'] = kwargs.get(
'response_queue', f'platypush/webhook/{str(uuid.uuid1())}'
)
super().__init__(
*argv,
hook=hook,
method=method,
data=data,
args=args or {},
headers=headers or {},
response=response,
**kwargs,
)
def send_response(self, response):
output = response.output
if isinstance(output, tuple):
# A 3-element tuple where the second element is an int and the third
# is a dict represents an HTTP response in the format `(data,
# http_code, headers)`.
if (
len(output) == 3
and isinstance(output[1], int)
and isinstance(output[2], dict)
):
output = {
'___data___': output[0],
'___code___': output[1],
'___headers___': output[2],
}
else:
# Normalize tuples to lists before serialization
output = list(output)
if isinstance(output, (dict, list)):
output = json.dumps(output)
if output is None:
output = ''
get_redis().rpush(self.args['response_queue'], output)
def wait_response(self, timeout=None):
rs = get_redis().blpop(self.args['response_queue'], timeout=timeout)
if rs and len(rs) > 1:
return rs[1]
# vim:sw=4:ts=4:et:
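The response round-trip above works by pushing the hook's (normalized) output onto a per-event Redis list named by `response_queue`, on which `wait_response` blocks. The serialization rules and the queue handshake can be sketched self-contained, with an in-memory queue standing in for Redis (`FakeWebhookEvent` is illustrative, not the real class):

```python
import json
from queue import Empty, Queue


def normalize_output(output):
    # Mirrors send_response's serialization rules:
    # (data, http_code, headers) tuples become a sentinel dict,
    # other tuples become lists, dict/list are JSON-encoded, None -> ''.
    if isinstance(output, tuple):
        if (
            len(output) == 3
            and isinstance(output[1], int)
            and isinstance(output[2], dict)
        ):
            output = {
                '___data___': output[0],
                '___code___': output[1],
                '___headers___': output[2],
            }
        else:
            output = list(output)
    if isinstance(output, (dict, list)):
        output = json.dumps(output)
    return '' if output is None else output


class FakeWebhookEvent:
    # A Queue stands in for the per-event Redis list keyed by
    # response_queue (rpush on one side, blpop on the other).
    def __init__(self):
        self._queue = Queue()

    def send_response(self, output):
        self._queue.put(normalize_output(output))

    def wait_response(self, timeout=None):
        try:
            return self._queue.get(timeout=timeout)
        except Empty:
            return None
```

In the real implementation `send_response` receives a `Response` object and serializes its `.output` attribute; the sentinel keys let the HTTP layer reconstruct the status code and headers on the other side of the queue.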

View File

@@ -0,0 +1,14 @@
from platypush.message.event import Event
class TidalEvent(Event):
"""Base class for Tidal events"""
class TidalPlaylistUpdatedEvent(TidalEvent):
"""
Event fired when a Tidal playlist is updated.
"""
def __init__(self, playlist_id: str, *args, **kwargs):
super().__init__(*args, playlist_id=playlist_id, **kwargs)

View File

@@ -3,9 +3,12 @@
"""
import time
from typing import Optional
from sqlalchemy import create_engine, Table, MetaData
from sqlalchemy.engine import Engine
from sqlalchemy.exc import CompileError
from sqlalchemy.sql import and_, or_, text
from platypush.plugins import Plugin, action
@@ -23,10 +26,17 @@ class DbPlugin(Plugin):
def __init__(self, engine=None, *args, **kwargs):
"""
:param engine: Default SQLAlchemy connection engine string (e.g. ``sqlite:///:memory:`` or ``mysql://user:pass@localhost/test``) that will be used. You can override the default engine in the db actions.
:param engine: Default SQLAlchemy connection engine string (e.g.
``sqlite:///:memory:`` or ``mysql://user:pass@localhost/test``)
that will be used. You can override the default engine in the db
actions.
:type engine: str
:param args: Extra arguments that will be passed to ``sqlalchemy.create_engine`` (see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to ``sqlalchemy.create_engine`` (seehttps:///docs.sqlalchemy.org/en/latest/core/engines.html)
:param args: Extra arguments that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to
``sqlalchemy.create_engine``
(see https://docs.sqlalchemy.org/en/latest/core/engines.html)
"""
super().__init__()
@@ -41,11 +51,11 @@ class DbPlugin(Plugin):
return create_engine(engine, *args, **kwargs)
assert self.engine
return self.engine
# noinspection PyUnusedLocal
@staticmethod
def _build_condition(table, column, value):
def _build_condition(table, column, value): # type: ignore
if isinstance(value, str):
value = "'{}'".format(value)
elif not isinstance(value, int) and not isinstance(value, float):
@@ -69,8 +79,12 @@ class DbPlugin(Plugin):
:type statement: str
:param engine: Engine to be used (default: default class engine)
:type engine: str
:param args: Extra arguments that will be passed to ``sqlalchemy.create_engine`` (see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to ``sqlalchemy.create_engine`` (seehttps:///docs.sqlalchemy.org/en/latest/core/engines.html)
:param args: Extra arguments that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to
``sqlalchemy.create_engine``
(see https://docs.sqlalchemy.org/en/latest/core/engines.html)
"""
engine = self._get_engine(engine, *args, **kwargs)
@@ -106,24 +120,42 @@ class DbPlugin(Plugin):
return table, engine
@action
def select(self, query=None, table=None, filter=None, engine=None, *args, **kwargs):
def select(
self,
query=None,
table=None,
filter=None,
engine=None,
data: Optional[dict] = None,
*args,
**kwargs
):
"""
Returns rows (as a list of hashes) given a query.
:param query: SQL to be executed
:type query: str
:param filter: Query WHERE filter expressed as a dictionary. This approach is preferred over specifying raw SQL
in ``query`` as the latter approach may be prone to SQL injection, unless you need to build some complex
SQL logic.
:param filter: Query WHERE filter expressed as a dictionary. This
approach is preferred over specifying raw SQL
in ``query``, as the latter may be prone to SQL injection,
unless you need to build some complex SQL logic.
:type filter: dict
:param table: If you specified a filter instead of a raw query, you'll have to specify the target table
:param table: If you specified a filter instead of a raw query, you'll
have to specify the target table
:type table: str
:param engine: Engine to be used (default: default class engine)
:type engine: str
:param args: Extra arguments that will be passed to ``sqlalchemy.create_engine``
(see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to ``sqlalchemy.create_engine``
(seehttps:///docs.sqlalchemy.org/en/latest/core/engines.html)
:param data: If ``query`` is an SQL string, then you can use
SQLAlchemy's *placeholders* mechanism. You can specify placeholders
in the query for values that you want to be safely serialized, and
their values can be specified in the ``data`` attribute as a
``name`` → ``value`` mapping.
:param args: Extra arguments that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:returns: List of hashes representing the result rows.
Examples:
@@ -136,7 +168,10 @@ class DbPlugin(Plugin):
"action": "db.select",
"args": {
"engine": "sqlite:///:memory:",
"query": "SELECT id, name FROM table"
"query": "SELECT id, name FROM table WHERE name = :name",
"data": {
"name": "foobar"
}
}
}
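SQLAlchemy's `text()` construct uses the same `:name` placeholder syntax that the stdlib `sqlite3` driver understands, so the injection-safe pattern the new `data` parameter enables can be sketched with nothing but the standard library (table and values are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (id INTEGER, name TEXT)')
conn.execute("INSERT INTO t VALUES (1, 'foobar'), (2, 'other')")

# The values travel separately from the SQL text, so a malicious
# `name` value cannot alter the statement itself.
rows = conn.execute(
    'SELECT id, name FROM t WHERE name = :name', {'name': 'foobar'}
).fetchall()
```

With the plugin, the equivalent is passing the placeholder-bearing SQL in `query` and the mapping in `data`, as in the JSON example above.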
@@ -165,19 +200,24 @@ class DbPlugin(Plugin):
engine = self._get_engine(engine, *args, **kwargs)
if isinstance(query, str):
query = text(query)
if table:
table, engine = self._get_table(table, engine=engine, *args, **kwargs)
query = table.select()
if filter:
for (k,v) in filter.items():
for (k, v) in filter.items():
query = query.where(self._build_condition(table, k, v))
if query is None:
raise RuntimeError('You need to specify either "query", or "table" and "filter"')
raise RuntimeError(
'You need to specify either "query", or "table" and "filter"'
)
with engine.connect() as connection:
result = connection.execute(query)
result = connection.execute(query, **(data or {}))
columns = result.keys()
rows = [
{col: row[i] for i, col in enumerate(list(columns))}
@@ -187,8 +227,16 @@ class DbPlugin(Plugin):
return rows
@action
def insert(self, table, records, engine=None, key_columns=None,
on_duplicate_update=False, *args, **kwargs):
def insert(
self,
table,
records,
engine=None,
key_columns=None,
on_duplicate_update=False,
*args,
**kwargs
):
"""
Inserts records (as a list of hashes) into a table.
@@ -198,12 +246,25 @@ class DbPlugin(Plugin):
:type records: list
:param engine: Engine to be used (default: default class engine)
:type engine: str
:param key_columns: Set it to specify the names of the key columns for ``table``. Set it if you want your statement to be executed with the ``on_duplicate_update`` flag.
:param key_columns: Set it to specify the names of the key columns for
``table``. Set it if you want your statement to be executed with
the ``on_duplicate_update`` flag.
:type key_columns: list
:param on_duplicate_update: If set, update the records in case of duplicate rows (default: False). If set, you'll need to specify ``key_columns`` as well.
:param on_duplicate_update: If set, update the records in case of
duplicate rows (default: False). If set, you'll need to specify
``key_columns`` as well. If ``key_columns`` is set and existing
records are found, but ``on_duplicate_update`` is false, then
the existing records will be ignored.
:type on_duplicate_update: bool
:param args: Extra arguments that will be passed to ``sqlalchemy.create_engine`` (see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to ``sqlalchemy.create_engine`` (seehttps:///docs.sqlalchemy.org/en/latest/core/engines.html)
:param args: Extra arguments that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to
``sqlalchemy.create_engine``
(see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:return: The inserted records, if the underlying engine supports the
``RETURNING`` statement, otherwise nothing.
Example:
@@ -231,24 +292,107 @@ class DbPlugin(Plugin):
}
"""
if on_duplicate_update:
assert (
key_columns
), 'on_duplicate_update requires key_columns to be specified'
if key_columns is None:
key_columns = []
engine = self._get_engine(engine, *args, **kwargs)
table, engine = self._get_table(table, engine=engine, *args, **kwargs)
insert_records = records
update_records = []
returned_records = []
with engine.connect() as connection:
# Upsert case
if key_columns:
insert_records, update_records = self._get_new_and_existing_records(
connection, table, records, key_columns
)
with connection.begin():
if insert_records:
insert = table.insert().values(insert_records)
ret = self._execute_try_returning(connection, insert)
if ret:
returned_records += ret
if update_records and on_duplicate_update:
ret = self._update(connection, table, update_records, key_columns)
if ret:
returned_records = ret + returned_records
if returned_records:
return returned_records
@staticmethod
def _execute_try_returning(connection, stmt):
ret = None
stmt_with_ret = stmt.returning('*')
try:
ret = connection.execute(stmt_with_ret)
except CompileError as e:
if str(e).startswith('RETURNING is not supported'):
connection.execute(stmt)
else:
raise e
if ret:
return [
{col.name: getattr(row, col.name, None) for col in stmt.table.c}
for row in ret
]
def _get_new_and_existing_records(self, connection, table, records, key_columns):
records_by_key = {
tuple(record.get(k) for k in key_columns): record for record in records
}
query = table.select().where(
or_(
and_(
self._build_condition(table, k, record.get(k)) for k in key_columns
)
for record in records
)
)
existing_records = {
tuple(getattr(record, k, None) for k in key_columns): record
for record in connection.execute(query).all()
}
update_records = [
record for k, record in records_by_key.items() if k in existing_records
]
insert_records = [
record for k, record in records_by_key.items() if k not in existing_records
]
return insert_records, update_records
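The upsert path above first partitions the incoming records into inserts and updates by matching their key-column tuples against rows already in the table. That partitioning can be sketched over plain dicts, with the set of existing key tuples taken as given (no SQL; names are illustrative):

```python
def split_records(records, existing_keys, key_columns):
    # Index incoming records by their key tuple; a later duplicate
    # wins, as in the plugin's records_by_key dict.
    records_by_key = {
        tuple(r.get(k) for k in key_columns): r for r in records
    }
    inserts = [
        r for key, r in records_by_key.items() if key not in existing_keys
    ]
    updates = [
        r for key, r in records_by_key.items() if key in existing_keys
    ]
    return inserts, updates
```

In the plugin, `existing_keys` comes from a single `SELECT` built as an OR of per-record AND conditions over the key columns, so the split costs one round-trip regardless of the batch size.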
def _update(self, connection, table, records, key_columns):
updated_records = []
for record in records:
table, engine = self._get_table(table, engine=engine, *args, **kwargs)
insert = table.insert().values(**record)
key = {k: v for (k, v) in record.items() if k in key_columns}
values = {k: v for (k, v) in record.items() if k not in key_columns}
update = table.update()
try:
engine.execute(insert)
except Exception as e:
if on_duplicate_update and key_columns:
self.update(table=table, records=records,
key_columns=key_columns, engine=engine,
*args, **kwargs)
else:
raise e
for (k, v) in key.items():
update = update.where(self._build_condition(table, k, v))
update = update.values(**values)
ret = self._execute_try_returning(connection, update)
if ret:
updated_records += ret
if updated_records:
return updated_records
@action
def update(self, table, records, key_columns, engine=None, *args, **kwargs):
@@ -263,8 +407,15 @@ class DbPlugin(Plugin):
:type key_columns: list
:param engine: Engine to be used (default: default class engine)
:type engine: str
:param args: Extra arguments that will be passed to ``sqlalchemy.create_engine`` (see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to ``sqlalchemy.create_engine`` (seehttps:///docs.sqlalchemy.org/en/latest/core/engines.html)
:param args: Extra arguments that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to
``sqlalchemy.create_engine``
(see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:return: The inserted records, if the underlying engine supports the
``RETURNING`` statement, otherwise nothing.
Example:
@@ -292,21 +443,10 @@ class DbPlugin(Plugin):
}
}
"""
engine = self._get_engine(engine, *args, **kwargs)
for record in records:
with engine.connect() as connection:
table, engine = self._get_table(table, engine=engine, *args, **kwargs)
key = { k:v for (k,v) in record.items() if k in key_columns }
values = { k:v for (k,v) in record.items() if k not in key_columns }
update = table.update()
for (k,v) in key.items():
update = update.where(self._build_condition(table, k, v))
update = update.values(**values)
engine.execute(update)
return self._update(connection, table, records, key_columns)
@action
def delete(self, table, records, engine=None, *args, **kwargs):
@@ -319,8 +459,12 @@ class DbPlugin(Plugin):
:type records: list
:param engine: Engine to be used (default: default class engine)
:type engine: str
:param args: Extra arguments that will be passed to ``sqlalchemy.create_engine`` (see https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to ``sqlalchemy.create_engine`` (seehttps:///docs.sqlalchemy.org/en/latest/core/engines.html)
:param args: Extra arguments that will be passed to
``sqlalchemy.create_engine`` (see
https://docs.sqlalchemy.org/en/latest/core/engines.html)
:param kwargs: Extra kwargs that will be passed to
``sqlalchemy.create_engine``
(see https://docs.sqlalchemy.org/en/latest/core/engines.html)
Example:
@@ -343,14 +487,15 @@ class DbPlugin(Plugin):
engine = self._get_engine(engine, *args, **kwargs)
for record in records:
table, engine = self._get_table(table, engine=engine, *args, **kwargs)
delete = table.delete()
with engine.connect() as connection, connection.begin():
for record in records:
table, engine = self._get_table(table, engine=engine, *args, **kwargs)
delete = table.delete()
for (k,v) in record.items():
delete = delete.where(self._build_condition(table, k, v))
for (k, v) in record.items():
delete = delete.where(self._build_condition(table, k, v))
engine.execute(delete)
connection.execute(delete)
# vim:sw=4:ts=4:et:

View File

@@ -1,100 +1,26 @@
import asyncio
import datetime
import json
import logging
import os
import pathlib
import re
import threading
from dataclasses import dataclass
from typing import Collection, Coroutine, Dict, Sequence
from typing import Collection, Coroutine, Sequence
from urllib.parse import urlparse
from async_lru import alru_cache
from nio import (
Api,
AsyncClient,
AsyncClientConfig,
CallAnswerEvent,
CallHangupEvent,
CallInviteEvent,
ErrorResponse,
Event,
InviteEvent,
KeyVerificationStart,
KeyVerificationAccept,
KeyVerificationMac,
KeyVerificationKey,
KeyVerificationCancel,
LocalProtocolError,
LoginResponse,
MatrixRoom,
MegolmEvent,
ProfileGetResponse,
RoomCreateEvent,
RoomEncryptedAudio,
RoomEncryptedFile,
RoomEncryptedImage,
RoomEncryptedMedia,
RoomEncryptedVideo,
RoomGetEventError,
RoomGetStateResponse,
RoomMemberEvent,
RoomMessage,
RoomMessageAudio,
RoomMessageFile,
RoomMessageFormatted,
RoomMessageText,
RoomMessageImage,
RoomMessageMedia,
RoomMessageVideo,
RoomTopicEvent,
RoomUpgradeEvent,
StickerEvent,
SyncResponse,
ToDeviceError,
UnknownEncryptedEvent,
UnknownEvent,
)
import aiofiles
import aiofiles.os
from nio.api import MessageDirection, RoomVisibility
from nio.client.async_client import client_session
from nio.client.base_client import logged_in
from nio.crypto import decrypt_attachment
from nio.crypto.device import OlmDevice
from nio.events.ephemeral import ReceiptEvent, TypingNoticeEvent
from nio.events.presence import PresenceEvent
from nio.exceptions import OlmUnverifiedDeviceError
from nio.responses import DownloadResponse, RoomMessagesResponse
from platypush.config import Config
from platypush.context import get_bus
from platypush.message.event.matrix import (
MatrixCallAnswerEvent,
MatrixCallHangupEvent,
MatrixCallInviteEvent,
MatrixEncryptedMessageEvent,
MatrixMessageAudioEvent,
MatrixMessageEvent,
MatrixMessageFileEvent,
MatrixMessageImageEvent,
MatrixMessageVideoEvent,
MatrixReactionEvent,
MatrixRoomCreatedEvent,
MatrixRoomInviteEvent,
MatrixRoomJoinEvent,
MatrixRoomLeaveEvent,
MatrixRoomSeenReceiptEvent,
MatrixRoomTopicChangedEvent,
MatrixRoomTypingStartEvent,
MatrixRoomTypingStopEvent,
MatrixSyncEvent,
MatrixUserPresenceEvent,
)
from platypush.plugins import AsyncRunnablePlugin, action
from platypush.schemas.matrix import (
@@ -111,6 +37,8 @@ from platypush.schemas.matrix import (
from platypush.utils import get_mime_type
from .client import MatrixClient
logger = logging.getLogger(__name__)
@@ -130,746 +58,6 @@ class Credentials:
}
class MatrixClient(AsyncClient):
def __init__(
self,
*args,
credentials_file: str,
store_path: str | None = None,
config: AsyncClientConfig | None = None,
autojoin_on_invite=True,
autotrust_devices=False,
autotrust_devices_whitelist: Collection[str] | None = None,
autotrust_rooms_whitelist: Collection[str] | None = None,
autotrust_users_whitelist: Collection[str] | None = None,
**kwargs,
):
credentials_file = os.path.abspath(os.path.expanduser(credentials_file))
if not store_path:
store_path = os.path.join(Config.get('workdir'), 'matrix', 'store') # type: ignore
assert store_path
store_path = os.path.abspath(os.path.expanduser(store_path))
pathlib.Path(store_path).mkdir(exist_ok=True, parents=True)
if not config:
config = AsyncClientConfig(
max_limit_exceeded=0,
max_timeouts=0,
store_sync_tokens=True,
encryption_enabled=True,
)
super().__init__(*args, config=config, store_path=store_path, **kwargs)
self.logger = logging.getLogger(self.__class__.__name__)
self._credentials_file = credentials_file
self._autojoin_on_invite = autojoin_on_invite
self._autotrust_devices = autotrust_devices
self._autotrust_devices_whitelist = autotrust_devices_whitelist
self._autotrust_rooms_whitelist = autotrust_rooms_whitelist or set()
self._autotrust_users_whitelist = autotrust_users_whitelist or set()
self._first_sync_performed = asyncio.Event()
self._last_batches_by_room = {}
self._typing_users_by_room = {}
self._encrypted_attachments_keystore_path = os.path.join(
store_path, 'attachment_keys.json'
)
self._encrypted_attachments_keystore = {}
self._sync_store_timer: threading.Timer | None = None
keystore = {}
try:
with open(self._encrypted_attachments_keystore_path, 'r') as f:
keystore = json.load(f)
except (ValueError, OSError):
with open(self._encrypted_attachments_keystore_path, 'w') as f:
f.write(json.dumps({}))
pathlib.Path(self._encrypted_attachments_keystore_path).touch(
mode=0o600, exist_ok=True
)
self._encrypted_attachments_keystore = {
tuple(key.split('|')): data for key, data in keystore.items()
}
async def _autojoin_room_callback(self, room: MatrixRoom, *_):
await self.join(room.room_id) # type: ignore
def _load_from_file(self):
if not os.path.isfile(self._credentials_file):
return
try:
with open(self._credentials_file, 'r') as f:
credentials = json.load(f)
except json.JSONDecodeError:
self.logger.warning(
'Could not read credentials_file %s - overwriting it',
self._credentials_file,
)
return
assert credentials.get('user_id'), 'Missing user_id'
assert credentials.get('access_token'), 'Missing access_token'
self.access_token = credentials['access_token']
self.user_id = credentials['user_id']
self.homeserver = credentials.get('server_url', self.homeserver)
if credentials.get('device_id'):
self.device_id = credentials['device_id']
self.load_store()
async def login(
self,
password: str | None = None,
device_name: str | None = None,
token: str | None = None,
) -> LoginResponse:
self._load_from_file()
login_res = None
if self.access_token:
self.load_store()
self.logger.info(
'Logged in to %s as %s using the stored access token',
self.homeserver,
self.user_id,
)
login_res = LoginResponse(
user_id=self.user_id,
device_id=self.device_id,
access_token=self.access_token,
)
else:
assert self.user, 'No credentials file found and no user provided'
login_args = {'device_name': device_name}
if token:
login_args['token'] = token
else:
assert (
password
), 'No credentials file found and no password nor access token provided'
login_args['password'] = password
login_res = await super().login(**login_args)
assert isinstance(login_res, LoginResponse), f'Failed to login: {login_res}'
self.logger.info(login_res)
credentials = Credentials(
server_url=self.homeserver,
user_id=login_res.user_id,
access_token=login_res.access_token,
device_id=login_res.device_id,
)
with open(self._credentials_file, 'w') as f:
json.dump(credentials.to_dict(), f)
os.chmod(self._credentials_file, 0o600)
if self.should_upload_keys:
self.logger.info('Uploading encryption keys')
await self.keys_upload()
self.logger.info('Synchronizing state')
self._first_sync_performed.clear()
self._add_callbacks()
sync_token = self.loaded_sync_token
self.loaded_sync_token = ''
await self.sync(sync_filter={'room': {'timeline': {'limit': 1}}})
self.loaded_sync_token = sync_token
self._sync_devices_trust()
self._first_sync_performed.set()
get_bus().post(MatrixSyncEvent(server_url=self.homeserver))
self.logger.info('State synchronized')
return login_res
@logged_in
async def sync(self, *args, **kwargs) -> SyncResponse:
response = await super().sync(*args, **kwargs)
assert isinstance(response, SyncResponse), str(response)
self._last_batches_by_room.update(
{
room_id: {
'prev_batch': room.timeline.prev_batch,
'next_batch': response.next_batch,
}
for room_id, room in response.rooms.join.items()
}
)
return response
@logged_in
async def room_messages(
self, room_id: str, start: str | None = None, *args, **kwargs
) -> RoomMessagesResponse:
if not start:
start = self._last_batches_by_room.get(room_id, {}).get('prev_batch')
assert start, (
f'No sync batches were found for room {room_id} and no start '
'batch has been provided'
)
response = await super().room_messages(room_id, start, *args, **kwargs)
assert isinstance(response, RoomMessagesResponse), str(response)
return response
def _sync_devices_trust(self):
all_devices = self.get_devices()
devices_to_trust: Dict[str, OlmDevice] = {}
untrusted_devices = {
device_id: device
for device_id, device in all_devices.items()
if not device.verified
}
if self._autotrust_devices:
devices_to_trust.update(untrusted_devices)
else:
if self._autotrust_devices_whitelist:
devices_to_trust.update(
{
device_id: device
for device_id, device in all_devices.items()
if device_id in self._autotrust_devices_whitelist
and device_id in untrusted_devices
}
)
if self._autotrust_rooms_whitelist:
devices_to_trust.update(
{
device_id: device
for room_id, devices in self.get_devices_by_room().items()
for device_id, device in devices.items() # type: ignore
if room_id in self._autotrust_rooms_whitelist
and device_id in untrusted_devices
}
)
if self._autotrust_users_whitelist:
devices_to_trust.update(
{
device_id: device
for user_id, devices in self.get_devices_by_user().items()
for device_id, device in devices.items() # type: ignore
if user_id in self._autotrust_users_whitelist
and device_id in untrusted_devices
}
)
for device in devices_to_trust.values():
self.verify_device(device)
self.logger.info(
'Device %s by user %s added to the whitelist', device.id, device.user_id
)
def get_devices_by_user(
self, user_id: str | None = None
) -> Dict[str, Dict[str, OlmDevice]] | Dict[str, OlmDevice]:
devices = {user: devices for user, devices in self.device_store.items()}
if user_id:
devices = devices.get(user_id, {})
return devices
def get_devices(self) -> Dict[str, OlmDevice]:
return {
device_id: device
for _, devices in self.device_store.items()
for device_id, device in devices.items()
}
def get_device(self, device_id: str) -> OlmDevice | None:
return self.get_devices().get(device_id)
def get_devices_by_room(
self, room_id: str | None = None
) -> Dict[str, Dict[str, OlmDevice]] | Dict[str, OlmDevice]:
devices = {
room_id: {
device_id: device
for _, devices in self.room_devices(room_id).items()
for device_id, device in devices.items()
}
for room_id in self.rooms.keys()
}
if room_id:
devices = devices.get(room_id, {})
return devices
def _add_callbacks(self):
self.add_event_callback(self._event_catch_all, Event)
self.add_event_callback(self._on_invite, InviteEvent) # type: ignore
self.add_event_callback(self._on_message, RoomMessageText) # type: ignore
self.add_event_callback(self._on_message, RoomMessageMedia) # type: ignore
self.add_event_callback(self._on_message, RoomEncryptedMedia) # type: ignore
self.add_event_callback(self._on_message, StickerEvent) # type: ignore
self.add_event_callback(self._on_room_member, RoomMemberEvent) # type: ignore
self.add_event_callback(self._on_room_topic_changed, RoomTopicEvent) # type: ignore
self.add_event_callback(self._on_call_invite, CallInviteEvent) # type: ignore
self.add_event_callback(self._on_call_answer, CallAnswerEvent) # type: ignore
self.add_event_callback(self._on_call_hangup, CallHangupEvent) # type: ignore
self.add_event_callback(self._on_unknown_event, UnknownEvent) # type: ignore
self.add_event_callback(self._on_unknown_encrypted_event, UnknownEncryptedEvent) # type: ignore
self.add_event_callback(self._on_unknown_encrypted_event, MegolmEvent) # type: ignore
self.add_to_device_callback(self._on_key_verification_start, KeyVerificationStart) # type: ignore
self.add_to_device_callback(self._on_key_verification_cancel, KeyVerificationCancel) # type: ignore
self.add_to_device_callback(self._on_key_verification_key, KeyVerificationKey) # type: ignore
self.add_to_device_callback(self._on_key_verification_mac, KeyVerificationMac) # type: ignore
self.add_to_device_callback(self._on_key_verification_accept, KeyVerificationAccept) # type: ignore
self.add_ephemeral_callback(self._on_typing, TypingNoticeEvent) # type: ignore
self.add_ephemeral_callback(self._on_receipt, ReceiptEvent) # type: ignore
self.add_presence_callback(self._on_presence, PresenceEvent) # type: ignore
if self._autojoin_on_invite:
self.add_event_callback(self._autojoin_room_callback, InviteEvent) # type: ignore
def _sync_store(self):
self.logger.info('Synchronizing keystore')
serialized_keystore = json.dumps(
{
f'{server}|{media_id}': data
for (
server,
media_id,
), data in self._encrypted_attachments_keystore.items()
}
)
try:
with open(self._encrypted_attachments_keystore_path, 'w') as f:
f.write(serialized_keystore)
finally:
self._sync_store_timer = None
@alru_cache(maxsize=500)
@client_session
async def get_profile(self, user_id: str | None = None) -> ProfileGetResponse:
"""
Cached version of get_profile.
"""
ret = await super().get_profile(user_id)
assert isinstance(
ret, ProfileGetResponse
), f'Could not retrieve profile for user {user_id}: {ret.message}'
return ret
@alru_cache(maxsize=500)
@client_session
async def room_get_state(self, room_id: str) -> RoomGetStateResponse:
"""
Cached version of room_get_state.
"""
ret = await super().room_get_state(room_id)
assert isinstance(
ret, RoomGetStateResponse
), f'Could not retrieve state for room {room_id}: {ret.message}'
return ret
@client_session
async def download(
self,
server_name: str,
media_id: str,
filename: str | None = None,
allow_remote: bool = True,
):
response = await super().download(
server_name, media_id, filename, allow_remote=allow_remote
)
assert isinstance(
response, DownloadResponse
), f'Could not download media {media_id}: {response}'
encryption_data = self._encrypted_attachments_keystore.get(
(server_name, media_id)
)
if encryption_data:
self.logger.info('Decrypting media %s using the available keys', media_id)
response.filename = encryption_data.get('body', response.filename)
response.content_type = encryption_data.get(
'mimetype', response.content_type
)
response.body = decrypt_attachment(
response.body,
key=encryption_data.get('key'),
hash=encryption_data.get('hash'),
iv=encryption_data.get('iv'),
)
return response
async def _event_base_args(
self, room: MatrixRoom | None, event: Event | None = None
) -> dict:
sender_id = getattr(event, 'sender', None)
sender = (
await self.get_profile(sender_id) if sender_id else None # type: ignore
)
return {
'server_url': self.homeserver,
'sender_id': sender_id,
'sender_display_name': sender.displayname if sender else None,
'sender_avatar_url': sender.avatar_url if sender else None,
**(
{
'room_id': room.room_id,
'room_name': room.name,
'room_topic': room.topic,
}
if room
else {}
),
'server_timestamp': (
datetime.datetime.fromtimestamp(event.server_timestamp / 1000)
if event and getattr(event, 'server_timestamp', None)
else None
),
}
async def _event_catch_all(self, room: MatrixRoom, event: Event):
self.logger.debug('Received event on room %s: %r', room.room_id, event)
async def _on_invite(self, room: MatrixRoom, event: RoomMessageText):
get_bus().post(
MatrixRoomInviteEvent(
**(await self._event_base_args(room, event)),
)
)
async def _on_message(
self,
room: MatrixRoom,
event: RoomMessageText | RoomMessageMedia | RoomEncryptedMedia | StickerEvent,
):
if self._first_sync_performed.is_set():
evt_type = MatrixMessageEvent
evt_args = {
'body': event.body,
'url': getattr(event, 'url', None),
**(await self._event_base_args(room, event)),
}
if isinstance(event, (RoomMessageMedia, RoomEncryptedMedia, StickerEvent)):
evt_args['url'] = event.url
if isinstance(event, RoomEncryptedMedia):
evt_args['thumbnail_url'] = event.thumbnail_url
evt_args['mimetype'] = event.mimetype
self._store_encrypted_media_keys(event)
if isinstance(event, RoomMessageFormatted):
evt_args['format'] = event.format
evt_args['formatted_body'] = event.formatted_body
if isinstance(event, (RoomMessageImage, RoomEncryptedImage)):
evt_type = MatrixMessageImageEvent
elif isinstance(event, (RoomMessageAudio, RoomEncryptedAudio)):
evt_type = MatrixMessageAudioEvent
elif isinstance(event, (RoomMessageVideo, RoomEncryptedVideo)):
evt_type = MatrixMessageVideoEvent
elif isinstance(event, (RoomMessageFile, RoomEncryptedFile)):
evt_type = MatrixMessageFileEvent
get_bus().post(evt_type(**evt_args))
def _store_encrypted_media_keys(self, event: RoomEncryptedMedia):
url = event.url.strip('/')
parsed_url = urlparse(url)
homeserver = parsed_url.netloc.strip('/')
media_key = (homeserver, parsed_url.path.strip('/'))
self._encrypted_attachments_keystore[media_key] = {
'url': url,
'body': event.body,
'key': event.key['k'],
'hash': event.hashes['sha256'],
'iv': event.iv,
'homeserver': homeserver,
'mimetype': event.mimetype,
}
if not self._sync_store_timer:
self._sync_store_timer = threading.Timer(5, self._sync_store)
self._sync_store_timer.start()
async def _on_room_member(self, room: MatrixRoom, event: RoomMemberEvent):
evt_type = None
if event.membership == 'join':
evt_type = MatrixRoomJoinEvent
elif event.membership == 'leave':
evt_type = MatrixRoomLeaveEvent
if evt_type and self._first_sync_performed.is_set():
get_bus().post(
evt_type(
**(await self._event_base_args(room, event)),
)
)
async def _on_room_topic_changed(self, room: MatrixRoom, event: RoomTopicEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixRoomTopicChangedEvent(
**(await self._event_base_args(room, event)),
topic=event.topic,
)
)
async def _on_call_invite(self, room: MatrixRoom, event: CallInviteEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixCallInviteEvent(
call_id=event.call_id,
version=event.version,
invite_validity=event.lifetime / 1000.0,
sdp=event.offer.get('sdp'),
**(await self._event_base_args(room, event)),
)
)
async def _on_call_answer(self, room: MatrixRoom, event: CallAnswerEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixCallAnswerEvent(
call_id=event.call_id,
version=event.version,
sdp=event.answer.get('sdp'),
**(await self._event_base_args(room, event)),
)
)
async def _on_call_hangup(self, room: MatrixRoom, event: CallHangupEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixCallHangupEvent(
call_id=event.call_id,
version=event.version,
**(await self._event_base_args(room, event)),
)
)
async def _on_room_created(self, room: MatrixRoom, event: RoomCreateEvent):
get_bus().post(
MatrixRoomCreatedEvent(
**(await self._event_base_args(room, event)),
)
)
def _get_sas(self, event):
sas = self.key_verifications.get(event.transaction_id)
if not sas:
self.logger.debug(
'Received a key verification event with no associated transaction ID'
)
return sas
async def _on_key_verification_start(self, event: KeyVerificationStart):
self.logger.info('Received a key verification request from %s', event.sender)
if 'emoji' not in event.short_authentication_string:
self.logger.warning(
'Only emoji verification is supported, but the verifying device '
'provided the following authentication methods: %r',
event.short_authentication_string,
)
return
sas = self._get_sas(event)
if not sas:
return
rs = await self.accept_key_verification(sas.transaction_id)
assert not isinstance(
rs, ToDeviceError
), f'accept_key_verification failed: {rs}'
rs = await self.to_device(sas.share_key())
assert not isinstance(rs, ToDeviceError), f'Shared key exchange failed: {rs}'
async def _on_key_verification_accept(self, event: KeyVerificationAccept):
self.logger.info('Key verification from device %s accepted', event.sender)
async def _on_key_verification_cancel(self, event: KeyVerificationCancel):
self.logger.info(
'The device %s cancelled a key verification request. Reason: %s',
event.sender,
event.reason,
)
async def _on_key_verification_key(self, event: KeyVerificationKey):
sas = self._get_sas(event)
if not sas:
return
self.logger.info(
'Received emoji verification from device %s: %s',
event.sender,
sas.get_emoji(),
)
rs = await self.confirm_short_auth_string(sas.transaction_id)
assert not isinstance(
rs, ToDeviceError
), f'confirm_short_auth_string failed: {rs}'
async def _on_key_verification_mac(self, event: KeyVerificationMac):
self.logger.info('Received MAC verification request from %s', event.sender)
sas = self._get_sas(event)
if not sas:
return
try:
mac = sas.get_mac()
except LocalProtocolError as e:
self.logger.warning(
'Verification from %s cancelled or unexpected protocol error. '
'Reason: %s',
event.sender,
e,
)
return
rs = await self.to_device(mac)
assert not isinstance(
rs, ToDeviceError
), f'Sending of the verification MAC to {event.sender} failed: {rs}'
self.logger.info('This device has been successfully verified!')
async def _on_room_upgrade(self, room: MatrixRoom, event: RoomUpgradeEvent):
self.logger.info(
'The room %s has been moved to %s', room.room_id, event.replacement_room
)
await self.room_leave(room.room_id)
await self.join(event.replacement_room)
async def _on_typing(self, room: MatrixRoom, event: TypingNoticeEvent):
users = set(event.users)
typing_users = self._typing_users_by_room.get(room.room_id, set())
start_typing_users = users.difference(typing_users)
stop_typing_users = typing_users.difference(users)
for user in start_typing_users:
event.sender = user # type: ignore
get_bus().post(
MatrixRoomTypingStartEvent(
**(await self._event_base_args(room, event)), # type: ignore
sender=user,
)
)
for user in stop_typing_users:
event.sender = user # type: ignore
get_bus().post(
MatrixRoomTypingStopEvent(
**(await self._event_base_args(room, event)), # type: ignore
)
)
self._typing_users_by_room[room.room_id] = users
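The start/stop detection in `_on_typing` is plain set arithmetic against the previous snapshot of typers for the room. A minimal sketch of the same logic, with made-up user IDs:

```python
# Previous snapshot of users typing in the room.
typing_users = {'@alice:example.org', '@bob:example.org'}
# Users reported as typing by the new TypingNoticeEvent.
users = {'@bob:example.org', '@carol:example.org'}

# Users present now but not before trigger MatrixRoomTypingStartEvent...
start_typing_users = users.difference(typing_users)
# ...and users present before but not now trigger MatrixRoomTypingStopEvent.
stop_typing_users = typing_users.difference(users)

assert start_typing_users == {'@carol:example.org'}
assert stop_typing_users == {'@alice:example.org'}
```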
async def _on_receipt(self, room: MatrixRoom, event: ReceiptEvent):
if self._first_sync_performed.is_set():
for receipt in event.receipts:
event.sender = receipt.user_id # type: ignore
get_bus().post(
MatrixRoomSeenReceiptEvent(
**(await self._event_base_args(room, event)), # type: ignore
)
)
async def _on_presence(self, event: PresenceEvent):
if self._first_sync_performed.is_set():
last_active = (
(
datetime.datetime.now()
- datetime.timedelta(seconds=event.last_active_ago / 1000)
)
if event.last_active_ago
else None
)
event.sender = event.user_id # type: ignore
get_bus().post(
MatrixUserPresenceEvent(
**(await self._event_base_args(None, event)), # type: ignore
is_active=event.currently_active or False,
last_active=last_active,
)
)
async def _on_unknown_encrypted_event(
self, room: MatrixRoom, event: UnknownEncryptedEvent | MegolmEvent
):
if self._first_sync_performed.is_set():
body = getattr(event, 'ciphertext', '')
get_bus().post(
MatrixEncryptedMessageEvent(
body=body,
**(await self._event_base_args(room, event)),
)
)
async def _on_unknown_event(self, room: MatrixRoom, event: UnknownEvent):
evt = None
if event.type == 'm.reaction' and self._first_sync_performed.is_set():
# Get the ID of the event this was a reaction to
relation_dict = event.source.get('content', {}).get('m.relates_to', {})
reacted_to = relation_dict.get('event_id')
if reacted_to and relation_dict.get('rel_type') == 'm.annotation':
event_response = await self.room_get_event(room.room_id, reacted_to)
if isinstance(event_response, RoomGetEventError):
self.logger.warning(
'Error getting event that was reacted to (%s)', reacted_to
)
else:
evt = MatrixReactionEvent(
in_response_to_event_id=event_response.event.event_id,
**(await self._event_base_args(room, event)),
)
if evt:
get_bus().post(evt)
else:
self.logger.info(
'Received an unknown event on room %s: %r', room.room_id, event
)
async def upload_file(
self,
file: str,
name: str | None = None,
content_type: str | None = None,
encrypt: bool = False,
):
file = os.path.expanduser(file)
file_stat = await aiofiles.os.stat(file)
async with aiofiles.open(file, 'rb') as f:
return await super().upload(
f, # type: ignore
content_type=(
content_type or get_mime_type(file) or 'application/octet-stream'
),
filename=name or os.path.basename(file),
encrypt=encrypt,
filesize=file_stat.st_size,
)
class MatrixPlugin(AsyncRunnablePlugin):
"""
Matrix chat integration.
@@ -903,7 +91,7 @@ class MatrixPlugin(AsyncRunnablePlugin):
- In the _Security_ section, you should see that at least one device is
marked as unverified, and you can start the verification process by
clicking on it.
- Select "_Verify through emoji_". A list of emojis should be prompted.
- Select "*Verify through emoji*". A list of emojis should be prompted.
Optionally, verify the logs of the application to check that you see
the same list. Then confirm that you see the same emojis, and your
device will be automatically marked as trusted.
@@ -960,7 +148,7 @@ class MatrixPlugin(AsyncRunnablePlugin):
def __init__(
self,
server_url: str = 'https://matrix.to',
server_url: str = 'https://matrix-client.matrix.org',
user_id: str | None = None,
password: str | None = None,
access_token: str | None = None,
@@ -985,7 +173,8 @@
associated field instead of using ``password``. This may be required if
the user has 2FA enabled.
:param server_url: Default Matrix instance base URL (default: ``https://matrix.to``).
:param server_url: Default Matrix instance base URL (default:
``https://matrix-client.matrix.org``).
:param user_id: user_id, in the format ``@user:example.org``, or just
the username if the account is hosted on the same server configured in
the ``server_url``.
@@ -1366,7 +555,7 @@ class MatrixPlugin(AsyncRunnablePlugin):
def room_alias_to_id(self, alias: str) -> str:
"""
Convert a room alias (in the format ``#alias:matrix.example.org``) to a
room ID (in the format ``!aBcDeFgHiJkMnO:matrix.example.org').
room ID (in the format ``!aBcDeFgHiJkMnO:matrix.example.org``).
:param alias: The room alias.
:return: The room ID, as a string.

View File

@@ -0,0 +1,856 @@
import asyncio
import datetime
import json
import logging
import os
import pathlib
import threading
from dataclasses import dataclass
from typing import Collection, Dict, Optional, Union
from urllib.parse import urlparse
from async_lru import alru_cache
from nio import (
AsyncClient,
AsyncClientConfig,
CallAnswerEvent,
CallHangupEvent,
CallInviteEvent,
Event,
InviteEvent,
KeyVerificationStart,
KeyVerificationAccept,
KeyVerificationMac,
KeyVerificationKey,
KeyVerificationCancel,
LocalProtocolError,
LoginResponse,
MatrixRoom,
MegolmEvent,
ProfileGetResponse,
RoomCreateEvent,
RoomEncryptedAudio,
RoomEncryptedFile,
RoomEncryptedImage,
RoomEncryptedMedia,
RoomEncryptedVideo,
RoomGetEventError,
RoomGetStateResponse,
RoomMemberEvent,
RoomMessageAudio,
RoomMessageFile,
RoomMessageFormatted,
RoomMessageText,
RoomMessageImage,
RoomMessageMedia,
RoomMessageVideo,
RoomTopicEvent,
RoomUpgradeEvent,
StickerEvent,
SyncResponse,
ToDeviceError,
UnknownEncryptedEvent,
UnknownEvent,
)
import aiofiles
import aiofiles.os
from nio.client.async_client import client_session
from nio.client.base_client import logged_in
from nio.crypto import decrypt_attachment
from nio.crypto.device import OlmDevice
from nio.events.ephemeral import ReceiptEvent, TypingNoticeEvent
from nio.events.presence import PresenceEvent
from nio.responses import DownloadResponse, RoomMessagesResponse
from platypush.config import Config
from platypush.context import get_bus
from platypush.message.event.matrix import (
MatrixCallAnswerEvent,
MatrixCallHangupEvent,
MatrixCallInviteEvent,
MatrixEncryptedMessageEvent,
MatrixMessageAudioEvent,
MatrixMessageEvent,
MatrixMessageFileEvent,
MatrixMessageImageEvent,
MatrixMessageVideoEvent,
MatrixReactionEvent,
MatrixRoomCreatedEvent,
MatrixRoomInviteEvent,
MatrixRoomJoinEvent,
MatrixRoomLeaveEvent,
MatrixRoomSeenReceiptEvent,
MatrixRoomTopicChangedEvent,
MatrixRoomTypingStartEvent,
MatrixRoomTypingStopEvent,
MatrixSyncEvent,
MatrixUserPresenceEvent,
)
from platypush.utils import get_mime_type
logger = logging.getLogger(__name__)
@dataclass
class Credentials:
server_url: str
user_id: str
access_token: str
device_id: str | None
def to_dict(self) -> dict:
return {
'server_url': self.server_url,
'user_id': self.user_id,
'access_token': self.access_token,
'device_id': self.device_id,
}
class MatrixClient(AsyncClient):
def __init__(
self,
*args,
credentials_file: str,
store_path: str | None = None,
config: Optional[AsyncClientConfig] = None,
autojoin_on_invite=True,
autotrust_devices=False,
autotrust_devices_whitelist: Collection[str] | None = None,
autotrust_rooms_whitelist: Collection[str] | None = None,
autotrust_users_whitelist: Collection[str] | None = None,
**kwargs,
):
credentials_file = os.path.abspath(os.path.expanduser(credentials_file))
if not store_path:
store_path = os.path.join(Config.get('workdir'), 'matrix', 'store') # type: ignore
assert store_path
store_path = os.path.abspath(os.path.expanduser(store_path))
pathlib.Path(store_path).mkdir(exist_ok=True, parents=True)
if not config:
config = AsyncClientConfig(
max_limit_exceeded=0,
max_timeouts=0,
store_sync_tokens=True,
encryption_enabled=True,
)
super().__init__(*args, config=config, store_path=store_path, **kwargs)
self.logger = logging.getLogger(self.__class__.__name__)
self._credentials_file = credentials_file
self._autojoin_on_invite = autojoin_on_invite
self._autotrust_devices = autotrust_devices
self._autotrust_devices_whitelist = autotrust_devices_whitelist
self._autotrust_rooms_whitelist = autotrust_rooms_whitelist or set()
self._autotrust_users_whitelist = autotrust_users_whitelist or set()
self._first_sync_performed = asyncio.Event()
self._last_batches_by_room = {}
self._typing_users_by_room = {}
self._encrypted_attachments_keystore_path = os.path.join(
store_path, 'attachment_keys.json'
)
self._encrypted_attachments_keystore = {}
self._sync_store_timer: threading.Timer | None = None
keystore = {}
try:
with open(self._encrypted_attachments_keystore_path, 'r') as f:
keystore = json.load(f)
except (ValueError, OSError):
with open(self._encrypted_attachments_keystore_path, 'w') as f:
f.write(json.dumps({}))
pathlib.Path(self._encrypted_attachments_keystore_path).touch(
mode=0o600, exist_ok=True
)
self._encrypted_attachments_keystore = {
tuple(key.split('|')): data for key, data in keystore.items()
}
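The attachment keystore above is keyed by `(server, media_id)` tuples, which JSON cannot represent directly, so keys are flattened to `server|media_id` strings on save (see `_sync_store`) and split back into tuples on load. A round-trip sketch of that scheme, with a made-up server and media ID:

```python
import json

# In-memory keystore: keyed by (homeserver, media_id) tuples.
keystore = {('matrix.example.org', 'AbCdEfGh'): {'mimetype': 'image/png'}}

# Flatten tuple keys to 'server|media_id' strings for JSON serialization.
serialized = json.dumps(
    {f'{server}|{media_id}': data for (server, media_id), data in keystore.items()}
)

# Split the flattened keys back into tuples when loading the file.
restored = {
    tuple(key.split('|')): data for key, data in json.loads(serialized).items()
}
assert restored == keystore
```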
async def _autojoin_room_callback(self, room: MatrixRoom, *_):
await self.join(room.room_id) # type: ignore
def _load_from_file(self):
if not os.path.isfile(self._credentials_file):
return
try:
with open(self._credentials_file, 'r') as f:
credentials = json.load(f)
except json.JSONDecodeError:
self.logger.warning(
'Could not read credentials_file %s - overwriting it',
self._credentials_file,
)
return
assert credentials.get('user_id'), 'Missing user_id'
assert credentials.get('access_token'), 'Missing access_token'
self.access_token = credentials['access_token']
self.user_id = credentials['user_id']
self.homeserver = credentials.get('server_url', self.homeserver)
if credentials.get('device_id'):
self.device_id = credentials['device_id']
self.load_store()
async def login(
self,
password: str | None = None,
device_name: str | None = None,
token: str | None = None,
) -> LoginResponse:
self._load_from_file()
login_res = None
if self.access_token:
self.load_store()
self.logger.info(
'Logged in to %s as %s using the stored access token',
self.homeserver,
self.user_id,
)
login_res = LoginResponse(
user_id=self.user_id,
device_id=self.device_id,
access_token=self.access_token,
)
else:
assert self.user, 'No credentials file found and no user provided'
login_args = {'device_name': device_name}
if token:
login_args['token'] = token
else:
assert (
password
), 'No credentials file found and no password nor access token provided'
login_args['password'] = password
login_res = await super().login(**login_args)
assert isinstance(login_res, LoginResponse), f'Failed to login: {login_res}'
self.logger.info(login_res)
credentials = Credentials(
server_url=self.homeserver,
user_id=login_res.user_id,
access_token=login_res.access_token,
device_id=login_res.device_id,
)
with open(self._credentials_file, 'w') as f:
json.dump(credentials.to_dict(), f)
os.chmod(self._credentials_file, 0o600)
if self.should_upload_keys:
self.logger.info('Uploading encryption keys')
await self.keys_upload()
self.logger.info('Synchronizing state')
self._first_sync_performed.clear()
self._add_callbacks()
sync_token = self.loaded_sync_token
self.loaded_sync_token = ''
await self.sync(sync_filter={'room': {'timeline': {'limit': 1}}})
self.loaded_sync_token = sync_token
self._sync_devices_trust()
self._first_sync_performed.set()
get_bus().post(MatrixSyncEvent(server_url=self.homeserver))
self.logger.info('State synchronized')
return login_res
@logged_in
async def sync(self, *args, **kwargs) -> SyncResponse:
response = await super().sync(*args, **kwargs)
assert isinstance(response, SyncResponse), str(response)
self._last_batches_by_room.update(
{
room_id: {
'prev_batch': room.timeline.prev_batch,
'next_batch': response.next_batch,
}
for room_id, room in response.rooms.join.items()
}
)
return response
@logged_in
async def room_messages(
self, room_id: str, start: str | None = None, *args, **kwargs
) -> RoomMessagesResponse:
if not start:
start = self._last_batches_by_room.get(room_id, {}).get('prev_batch')
assert start, (
f'No sync batches were found for room {room_id} and no start '
'batch has been provided'
)
response = await super().room_messages(room_id, start, *args, **kwargs)
assert isinstance(response, RoomMessagesResponse), str(response)
return response
def _sync_devices_trust(self):
all_devices = self.get_devices()
devices_to_trust: Dict[str, OlmDevice] = {}
untrusted_devices = {
device_id: device
for device_id, device in all_devices.items()
if not device.verified
}
if self._autotrust_devices:
devices_to_trust.update(untrusted_devices)
else:
if self._autotrust_devices_whitelist:
devices_to_trust.update(
{
device_id: device
for device_id, device in all_devices.items()
if device_id in self._autotrust_devices_whitelist
and device_id in untrusted_devices
}
)
if self._autotrust_rooms_whitelist:
devices_to_trust.update(
{
device_id: device
for room_id, devices in self.get_devices_by_room().items()
for device_id, device in devices.items() # type: ignore
if room_id in self._autotrust_rooms_whitelist
and device_id in untrusted_devices
}
)
if self._autotrust_users_whitelist:
devices_to_trust.update(
{
device_id: device
for user_id, devices in self.get_devices_by_user().items()
for device_id, device in devices.items() # type: ignore
if user_id in self._autotrust_users_whitelist
and device_id in untrusted_devices
}
)
for device in devices_to_trust.values():
self.verify_device(device)
self.logger.info(
'Device %s by user %s added to the whitelist', device.id, device.user_id
)
def get_devices_by_user(
self, user_id: str | None = None
) -> Dict[str, Dict[str, OlmDevice]] | Dict[str, OlmDevice]:
devices = {user: devices for user, devices in self.device_store.items()}
if user_id:
devices = devices.get(user_id, {})
return devices
def get_devices(self) -> Dict[str, OlmDevice]:
return {
device_id: device
for _, devices in self.device_store.items()
for device_id, device in devices.items()
}
def get_device(self, device_id: str) -> Optional[OlmDevice]:
return self.get_devices().get(device_id)
def get_devices_by_room(
self, room_id: str | None = None
) -> Dict[str, Dict[str, OlmDevice]] | Dict[str, OlmDevice]:
devices = {
room_id: {
device_id: device
for _, devices in self.room_devices(room_id).items()
for device_id, device in devices.items()
}
for room_id in self.rooms.keys()
}
if room_id:
devices = devices.get(room_id, {})
return devices
def _add_callbacks(self):
self.add_event_callback(self._event_catch_all, Event)
self.add_event_callback(self._on_invite, InviteEvent) # type: ignore
self.add_event_callback(self._on_message, RoomMessageText) # type: ignore
self.add_event_callback(self._on_message, RoomMessageMedia) # type: ignore
self.add_event_callback(self._on_message, RoomEncryptedMedia) # type: ignore
self.add_event_callback(self._on_message, StickerEvent) # type: ignore
self.add_event_callback(self._on_room_member, RoomMemberEvent) # type: ignore
self.add_event_callback(self._on_room_topic_changed, RoomTopicEvent) # type: ignore
self.add_event_callback(self._on_call_invite, CallInviteEvent) # type: ignore
self.add_event_callback(self._on_call_answer, CallAnswerEvent) # type: ignore
self.add_event_callback(self._on_call_hangup, CallHangupEvent) # type: ignore
self.add_event_callback(self._on_unknown_event, UnknownEvent) # type: ignore
self.add_event_callback(self._on_unknown_encrypted_event, UnknownEncryptedEvent) # type: ignore
self.add_event_callback(self._on_unknown_encrypted_event, MegolmEvent) # type: ignore
self.add_to_device_callback(self._on_key_verification_start, KeyVerificationStart) # type: ignore
self.add_to_device_callback(self._on_key_verification_cancel, KeyVerificationCancel) # type: ignore
self.add_to_device_callback(self._on_key_verification_key, KeyVerificationKey) # type: ignore
self.add_to_device_callback(self._on_key_verification_mac, KeyVerificationMac) # type: ignore
self.add_to_device_callback(self._on_key_verification_accept, KeyVerificationAccept) # type: ignore
self.add_ephemeral_callback(self._on_typing, TypingNoticeEvent) # type: ignore
self.add_ephemeral_callback(self._on_receipt, ReceiptEvent) # type: ignore
self.add_presence_callback(self._on_presence, PresenceEvent) # type: ignore
if self._autojoin_on_invite:
self.add_event_callback(self._autojoin_room_callback, InviteEvent) # type: ignore
def _sync_store(self):
self.logger.info('Synchronizing keystore')
serialized_keystore = json.dumps(
{
f'{server}|{media_id}': data
for (
server,
media_id,
), data in self._encrypted_attachments_keystore.items()
}
)
try:
with open(self._encrypted_attachments_keystore_path, 'w') as f:
f.write(serialized_keystore)
finally:
self._sync_store_timer = None
@alru_cache(maxsize=500)
@client_session
async def get_profile(self, user_id: str | None = None) -> ProfileGetResponse:
"""
Cached version of get_profile.
"""
ret = await super().get_profile(user_id)
assert isinstance(
ret, ProfileGetResponse
), f'Could not retrieve profile for user {user_id}: {ret.message}'
return ret
@alru_cache(maxsize=500)
@client_session
async def room_get_state(self, room_id: str) -> RoomGetStateResponse:
"""
Cached version of room_get_state.
"""
ret = await super().room_get_state(room_id)
assert isinstance(
ret, RoomGetStateResponse
), f'Could not retrieve state for room {room_id}: {ret.message}'
return ret
@client_session
async def download(
self,
server_name: str,
media_id: str,
filename: str | None = None,
allow_remote: bool = True,
):
response = await super().download(
server_name, media_id, filename, allow_remote=allow_remote
)
assert isinstance(
response, DownloadResponse
), f'Could not download media {media_id}: {response}'
encryption_data = self._encrypted_attachments_keystore.get(
(server_name, media_id)
)
if encryption_data:
self.logger.info('Decrypting media %s using the available keys', media_id)
response.filename = encryption_data.get('body', response.filename)
response.content_type = encryption_data.get(
'mimetype', response.content_type
)
response.body = decrypt_attachment(
response.body,
key=encryption_data.get('key'),
hash=encryption_data.get('hash'),
iv=encryption_data.get('iv'),
)
return response
async def _event_base_args(
self, room: Optional[MatrixRoom], event: Optional[Event] = None
) -> dict:
sender_id = getattr(event, 'sender', None)
sender = (
await self.get_profile(sender_id) if sender_id else None # type: ignore
)
return {
'server_url': self.homeserver,
'sender_id': sender_id,
'sender_display_name': sender.displayname if sender else None,
'sender_avatar_url': sender.avatar_url if sender else None,
**(
{
'room_id': room.room_id,
'room_name': room.name,
'room_topic': room.topic,
}
if room
else {}
),
'server_timestamp': (
datetime.datetime.fromtimestamp(event.server_timestamp / 1000)
if event and getattr(event, 'server_timestamp', None)
else None
),
}
async def _event_catch_all(self, room: MatrixRoom, event: Event):
self.logger.debug('Received event on room %s: %r', room.room_id, event)
async def _on_invite(self, room: MatrixRoom, event: RoomMessageText):
get_bus().post(
MatrixRoomInviteEvent(
**(await self._event_base_args(room, event)),
)
)
async def _on_message(
self,
room: MatrixRoom,
event: Union[
RoomMessageText, RoomMessageMedia, RoomEncryptedMedia, StickerEvent
],
):
if self._first_sync_performed.is_set():
evt_type = MatrixMessageEvent
evt_args = {
'body': event.body,
'url': getattr(event, 'url', None),
**(await self._event_base_args(room, event)),
}
if isinstance(event, (RoomMessageMedia, RoomEncryptedMedia, StickerEvent)):
evt_args['url'] = event.url
if isinstance(event, RoomEncryptedMedia):
evt_args['thumbnail_url'] = event.thumbnail_url
evt_args['mimetype'] = event.mimetype
self._store_encrypted_media_keys(event)
if isinstance(event, RoomMessageFormatted):
evt_args['format'] = event.format
evt_args['formatted_body'] = event.formatted_body
if isinstance(event, (RoomMessageImage, RoomEncryptedImage)):
evt_type = MatrixMessageImageEvent
elif isinstance(event, (RoomMessageAudio, RoomEncryptedAudio)):
evt_type = MatrixMessageAudioEvent
elif isinstance(event, (RoomMessageVideo, RoomEncryptedVideo)):
evt_type = MatrixMessageVideoEvent
elif isinstance(event, (RoomMessageFile, RoomEncryptedFile)):
evt_type = MatrixMessageFileEvent
get_bus().post(evt_type(**evt_args))
def _store_encrypted_media_keys(self, event: RoomEncryptedMedia):
url = event.url.strip('/')
parsed_url = urlparse(url)
homeserver = parsed_url.netloc.strip('/')
media_key = (homeserver, parsed_url.path.strip('/'))
self._encrypted_attachments_keystore[media_key] = {
'url': url,
'body': event.body,
'key': event.key['k'],
'hash': event.hashes['sha256'],
'iv': event.iv,
'homeserver': homeserver,
'mimetype': event.mimetype,
}
if not self._sync_store_timer:
self._sync_store_timer = threading.Timer(5, self._sync_store)
self._sync_store_timer.start()
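`_store_encrypted_media_keys` derives the keystore key from the event's `mxc://` URL: `urlparse` puts the homeserver in `netloc` and the media ID in `path`. A small sketch with a made-up attachment URL:

```python
from urllib.parse import urlparse

# Example mxc:// attachment URL (made-up server and media ID).
url = 'mxc://matrix.example.org/AbCdEfGh'.strip('/')
parsed_url = urlparse(url)

# netloc carries the homeserver, path the media ID, yielding the
# same (homeserver, media_id) tuple used to key the keystore.
homeserver = parsed_url.netloc.strip('/')
media_key = (homeserver, parsed_url.path.strip('/'))
assert media_key == ('matrix.example.org', 'AbCdEfGh')
```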
async def _on_room_member(self, room: MatrixRoom, event: RoomMemberEvent):
evt_type = None
if event.membership == 'join':
evt_type = MatrixRoomJoinEvent
elif event.membership == 'leave':
evt_type = MatrixRoomLeaveEvent
if evt_type and self._first_sync_performed.is_set():
get_bus().post(
evt_type(
**(await self._event_base_args(room, event)),
)
)
async def _on_room_topic_changed(self, room: MatrixRoom, event: RoomTopicEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixRoomTopicChangedEvent(
**(await self._event_base_args(room, event)),
topic=event.topic,
)
)
async def _on_call_invite(self, room: MatrixRoom, event: CallInviteEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixCallInviteEvent(
call_id=event.call_id,
version=event.version,
invite_validity=event.lifetime / 1000.0,
sdp=event.offer.get('sdp'),
**(await self._event_base_args(room, event)),
)
)
async def _on_call_answer(self, room: MatrixRoom, event: CallAnswerEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixCallAnswerEvent(
call_id=event.call_id,
version=event.version,
sdp=event.answer.get('sdp'),
**(await self._event_base_args(room, event)),
)
)
async def _on_call_hangup(self, room: MatrixRoom, event: CallHangupEvent):
if self._first_sync_performed.is_set():
get_bus().post(
MatrixCallHangupEvent(
call_id=event.call_id,
version=event.version,
**(await self._event_base_args(room, event)),
)
)
async def _on_room_created(self, room: MatrixRoom, event: RoomCreateEvent):
get_bus().post(
MatrixRoomCreatedEvent(
**(await self._event_base_args(room, event)),
)
)
def _get_sas(self, event):
sas = self.key_verifications.get(event.transaction_id)
if not sas:
self.logger.debug(
'Received a key verification event with no associated transaction ID'
)
return sas
async def _on_key_verification_start(self, event: KeyVerificationStart):
self.logger.info('Received a key verification request from %s', event.sender)
if 'emoji' not in event.short_authentication_string:
self.logger.warning(
'Only emoji verification is supported, but the verifying device '
'provided the following authentication methods: %r',
event.short_authentication_string,
)
return
sas = self._get_sas(event)
if not sas:
return
rs = await self.accept_key_verification(sas.transaction_id)
assert not isinstance(
rs, ToDeviceError
), f'accept_key_verification failed: {rs}'
rs = await self.to_device(sas.share_key())
assert not isinstance(rs, ToDeviceError), f'Shared key exchange failed: {rs}'
async def _on_key_verification_accept(self, event: KeyVerificationAccept):
self.logger.info('Key verification from device %s accepted', event.sender)
async def _on_key_verification_cancel(self, event: KeyVerificationCancel):
self.logger.info(
'The device %s cancelled a key verification request. Reason: %s',
event.sender,
event.reason,
)
async def _on_key_verification_key(self, event: KeyVerificationKey):
sas = self._get_sas(event)
if not sas:
return
self.logger.info(
'Received emoji verification from device %s: %s',
event.sender,
sas.get_emoji(),
)
rs = await self.confirm_short_auth_string(sas.transaction_id)
assert not isinstance(
rs, ToDeviceError
), f'confirm_short_auth_string failed: {rs}'
async def _on_key_verification_mac(self, event: KeyVerificationMac):
self.logger.info('Received MAC verification request from %s', event.sender)
sas = self._get_sas(event)
if not sas:
return
try:
mac = sas.get_mac()
except LocalProtocolError as e:
self.logger.warning(
'Verification from %s cancelled or unexpected protocol error. '
'Reason: %s',
event.sender,
e,
)
return
rs = await self.to_device(mac)
assert not isinstance(
rs, ToDeviceError
), f'Sending of the verification MAC to {event.sender} failed: {rs}'
self.logger.info('This device has been successfully verified!')
async def _on_room_upgrade(self, room: MatrixRoom, event: RoomUpgradeEvent):
self.logger.info(
'The room %s has been moved to %s', room.room_id, event.replacement_room
)
await self.room_leave(room.room_id)
await self.join(event.replacement_room)
async def _on_typing(self, room: MatrixRoom, event: TypingNoticeEvent):
users = set(event.users)
typing_users = self._typing_users_by_room.get(room.room_id, set())
start_typing_users = users.difference(typing_users)
stop_typing_users = typing_users.difference(users)
for user in start_typing_users:
event.sender = user # type: ignore
get_bus().post(
MatrixRoomTypingStartEvent(
**(await self._event_base_args(room, event)), # type: ignore
sender=user,
)
)
for user in stop_typing_users:
event.sender = user # type: ignore
get_bus().post(
MatrixRoomTypingStopEvent(
**(await self._event_base_args(room, event)), # type: ignore
)
)
self._typing_users_by_room[room.room_id] = users
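The `_on_typing` handler above reduces to two set differences between the previous and the current typing snapshots. A minimal standalone sketch (the `diff_typing` name is illustrative, not part of the plugin):

```python
def diff_typing(previous: set, current: set) -> tuple:
    """Return (started, stopped): users who began/ceased typing between snapshots."""
    return current - previous, previous - current

# alice keeps typing, bob stops, carol starts
started, stopped = diff_typing({'alice', 'bob'}, {'alice', 'carol'})
```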
async def _on_receipt(self, room: MatrixRoom, event: ReceiptEvent):
if self._first_sync_performed.is_set():
for receipt in event.receipts:
event.sender = receipt.user_id # type: ignore
get_bus().post(
MatrixRoomSeenReceiptEvent(
**(await self._event_base_args(room, event)), # type: ignore
)
)
async def _on_presence(self, event: PresenceEvent):
if self._first_sync_performed.is_set():
last_active = (
(
datetime.datetime.now()
- datetime.timedelta(seconds=event.last_active_ago / 1000)
)
if event.last_active_ago
else None
)
event.sender = event.user_id # type: ignore
get_bus().post(
MatrixUserPresenceEvent(
**(await self._event_base_args(None, event)), # type: ignore
is_active=event.currently_active or False,
last_active=last_active,
)
)
async def _on_unknown_encrypted_event(
self, room: MatrixRoom, event: Union[UnknownEncryptedEvent, MegolmEvent]
):
if self._first_sync_performed.is_set():
body = getattr(event, 'ciphertext', '')
get_bus().post(
MatrixEncryptedMessageEvent(
body=body,
**(await self._event_base_args(room, event)),
)
)
async def _on_unknown_event(self, room: MatrixRoom, event: UnknownEvent):
evt = None
if event.type == 'm.reaction' and self._first_sync_performed.is_set():
# Get the ID of the event this was a reaction to
relation_dict = event.source.get('content', {}).get('m.relates_to', {})
reacted_to = relation_dict.get('event_id')
if reacted_to and relation_dict.get('rel_type') == 'm.annotation':
event_response = await self.room_get_event(room.room_id, reacted_to)
if isinstance(event_response, RoomGetEventError):
self.logger.warning(
'Error getting event that was reacted to (%s)', reacted_to
)
else:
evt = MatrixReactionEvent(
in_response_to_event_id=event_response.event.event_id,
**(await self._event_base_args(room, event)),
)
if evt:
get_bus().post(evt)
else:
self.logger.info(
'Received an unknown event on room %s: %r', room.room_id, event
)
async def upload_file(
self,
file: str,
name: Optional[str] = None,
content_type: Optional[str] = None,
encrypt: bool = False,
):
file = os.path.expanduser(file)
file_stat = await aiofiles.os.stat(file)
async with aiofiles.open(file, 'rb') as f:
return await super().upload(
f, # type: ignore
content_type=(
content_type or get_mime_type(file) or 'application/octet-stream'
),
filename=name or os.path.basename(file),
encrypt=encrypt,
filesize=file_stat.st_size,
)
# vim:sw=4:ts=4:et:

View File

@ -3,8 +3,15 @@ import threading
from platypush.context import get_bus
from platypush.plugins.media import PlayerState, MediaPlugin
from platypush.message.event.media import MediaPlayEvent, MediaPlayRequestEvent, \
MediaPauseEvent, MediaStopEvent, NewPlayingMediaEvent, MediaSeekEvent, MediaResumeEvent
from platypush.message.event.media import (
MediaPlayEvent,
MediaPlayRequestEvent,
MediaPauseEvent,
MediaStopEvent,
NewPlayingMediaEvent,
MediaSeekEvent,
MediaResumeEvent,
)
from platypush.plugins import action
@ -66,29 +73,58 @@ class MediaMpvPlugin(MediaPlugin):
def _event_callback(self):
def callback(event):
from mpv import MpvEventID as Event
from mpv import MpvEventEndFile as EndFile
from mpv import (
MpvEvent,
MpvEventID as Event,
MpvEventEndFile as EndFile,
)
self.logger.info('Received mpv event: %s', event)
if isinstance(event, MpvEvent):
event = event.as_dict()
evt = event.get('event_id')
if not evt:
return
if (evt == Event.FILE_LOADED or evt == Event.START_FILE) and self._get_current_resource():
if (
evt == Event.FILE_LOADED or evt == Event.START_FILE
) and self._get_current_resource():
self._playback_rebounce_event.set()
self._post_event(NewPlayingMediaEvent, resource=self._get_current_resource(),
title=self._player.filename)
self._post_event(
NewPlayingMediaEvent,
resource=self._get_current_resource(),
title=self._player.filename,
)
elif evt == Event.PLAYBACK_RESTART:
self._playback_rebounce_event.set()
self._post_event(MediaPlayEvent, resource=self._get_current_resource(), title=self._player.filename)
self._post_event(
MediaPlayEvent,
resource=self._get_current_resource(),
title=self._player.filename,
)
elif evt == Event.PAUSE:
self._post_event(MediaPauseEvent, resource=self._get_current_resource(), title=self._player.filename)
self._post_event(
MediaPauseEvent,
resource=self._get_current_resource(),
title=self._player.filename,
)
elif evt == Event.UNPAUSE:
self._post_event(MediaResumeEvent, resource=self._get_current_resource(), title=self._player.filename)
elif evt == Event.SHUTDOWN or evt == Event.IDLE or (
evt == Event.END_FILE and event.get('event', {}).get('reason') in
[EndFile.EOF, EndFile.ABORTED, EndFile.QUIT]):
self._post_event(
MediaResumeEvent,
resource=self._get_current_resource(),
title=self._player.filename,
)
elif (
evt == Event.SHUTDOWN
or evt == Event.IDLE
or (
evt == Event.END_FILE
and event.get('event', {}).get('reason')
in [EndFile.EOF, EndFile.ABORTED, EndFile.QUIT]
)
):
playback_rebounced = self._playback_rebounce_event.wait(timeout=0.5)
if playback_rebounced:
self._playback_rebounce_event.clear()
@ -147,7 +183,7 @@ class MediaMpvPlugin(MediaPlugin):
@action
def pause(self):
""" Toggle the paused state """
"""Toggle the paused state"""
if not self._player:
return None, 'No mpv instance is running'
@ -156,7 +192,7 @@ class MediaMpvPlugin(MediaPlugin):
@action
def quit(self):
""" Stop and quit the player """
"""Stop and quit the player"""
if not self._player:
return None, 'No mpv instance is running'
@ -167,19 +203,19 @@ class MediaMpvPlugin(MediaPlugin):
@action
def stop(self):
""" Stop and quit the player """
"""Stop and quit the player"""
return self.quit()
@action
def voldown(self, step=10.0):
""" Volume down by (default: 10)% """
"""Volume down by (default: 10)%"""
if not self._player:
return None, 'No mpv instance is running'
return self.set_volume(self._player.volume - step)
@action
def volup(self, step=10.0):
""" Volume up by (default: 10)% """
"""Volume up by (default: 10)%"""
if not self._player:
return None, 'No mpv instance is running'
return self.set_volume(self._player.volume + step)
@ -211,14 +247,13 @@ class MediaMpvPlugin(MediaPlugin):
return None, 'No mpv instance is running'
if not self._player.seekable:
return None, 'The resource is not seekable'
pos = min(self._player.time_pos + self._player.time_remaining,
max(0, position))
pos = min(self._player.time_pos + self._player.time_remaining, max(0, position))
self._player.time_pos = pos
return self.status()
@action
def back(self, offset=30.0):
""" Back by (default: 30) seconds """
"""Back by (default: 30) seconds"""
if not self._player:
return None, 'No mpv instance is running'
if not self._player.seekable:
@ -228,47 +263,44 @@ class MediaMpvPlugin(MediaPlugin):
@action
def forward(self, offset=30.0):
""" Forward by (default: 30) seconds """
"""Forward by (default: 30) seconds"""
if not self._player:
return None, 'No mpv instance is running'
if not self._player.seekable:
return None, 'The resource is not seekable'
pos = min(self._player.time_pos + self._player.time_remaining,
self._player.time_pos + offset)
pos = min(
self._player.time_pos + self._player.time_remaining,
self._player.time_pos + offset,
)
return self.seek(pos)
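The clamping used by `seek()` and `forward()` above can be factored into a pure helper; a sketch, assuming `time_pos + time_remaining` equals the track duration (the `clamp_seek` name is illustrative):

```python
def clamp_seek(time_pos: float, time_remaining: float, target: float) -> float:
    """Clamp a seek target into the valid [0, duration] window."""
    duration = time_pos + time_remaining
    return min(duration, max(0.0, target))

clamp_seek(10.0, 50.0, -5.0)   # negative targets clamp to 0.0
clamp_seek(10.0, 50.0, 120.0)  # past-the-end targets clamp to the 60.0s duration
```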
@action
def next(self):
""" Play the next item in the queue """
"""Play the next item in the queue"""
if not self._player:
return None, 'No mpv instance is running'
self._player.playlist_next()
@action
def prev(self):
""" Play the previous item in the queue """
"""Play the previous item in the queue"""
if not self._player:
return None, 'No mpv instance is running'
self._player.playlist_prev()
@action
def toggle_subtitles(self, visible=None):
""" Toggle the subtitles visibility """
"""Toggle the subtitles visibility"""
return self.toggle_property('sub_visibility')
@action
def add_subtitles(self, filename):
""" Add a subtitles file """
"""Add a subtitles file"""
return self._player.sub_add(filename)
@action
def remove_subtitles(self, sub_id):
""" Remove a subtitles track by id """
return self._player.sub_remove(sub_id)
@action
def toggle_fullscreen(self):
""" Toggle the fullscreen mode """
"""Toggle the fullscreen mode"""
return self.toggle_property('fullscreen')
# noinspection PyShadowingBuiltins
@ -319,15 +351,17 @@ class MediaMpvPlugin(MediaPlugin):
@action
def set_subtitles(self, filename, *args, **kwargs):
""" Sets media subtitles from filename """
"""Sets media subtitles from filename"""
# noinspection PyTypeChecker
return self.set_property(subfile=filename, sub_visibility=True)
@action
def remove_subtitles(self):
""" Removes (hides) the subtitles """
def remove_subtitles(self, sub_id=None):
"""Removes (hides) the subtitles"""
if not self._player:
return None, 'No mpv instance is running'
if sub_id:
return self._player.sub_remove(sub_id)
self._player.sub_visibility = False
@action
@ -350,7 +384,7 @@ class MediaMpvPlugin(MediaPlugin):
@action
def mute(self):
""" Toggle mute state """
"""Toggle mute state"""
if not self._player:
return None, 'No mpv instance is running'
mute = not self._player.mute
@ -382,28 +416,35 @@ class MediaMpvPlugin(MediaPlugin):
return {'state': PlayerState.STOP.value}
return {
'audio_channels': getattr(self._player, 'audio_channels'),
'audio_codec': getattr(self._player, 'audio_codec_name'),
'delay': getattr(self._player, 'delay'),
'duration': getattr(self._player, 'playback_time', 0) + getattr(self._player, 'playtime_remaining', 0)
if getattr(self._player, 'playtime_remaining') else None,
'filename': getattr(self._player, 'filename'),
'file_size': getattr(self._player, 'file_size'),
'fullscreen': getattr(self._player, 'fs'),
'mute': getattr(self._player, 'mute'),
'name': getattr(self._player, 'name'),
'pause': getattr(self._player, 'pause'),
'percent_pos': getattr(self._player, 'percent_pos'),
'position': getattr(self._player, 'playback_time'),
'seekable': getattr(self._player, 'seekable'),
'state': (PlayerState.PAUSE.value if self._player.pause else PlayerState.PLAY.value),
'title': getattr(self._player, 'media_title') or getattr(self._player, 'filename'),
'audio_channels': getattr(self._player, 'audio_channels', None),
'audio_codec': getattr(self._player, 'audio_codec_name', None),
'delay': getattr(self._player, 'delay', None),
'duration': getattr(self._player, 'playback_time', 0)
+ getattr(self._player, 'playtime_remaining', 0)
if getattr(self._player, 'playtime_remaining', None)
else None,
'filename': getattr(self._player, 'filename', None),
'file_size': getattr(self._player, 'file_size', None),
'fullscreen': getattr(self._player, 'fs', None),
'mute': getattr(self._player, 'mute', None),
'name': getattr(self._player, 'name', None),
'pause': getattr(self._player, 'pause', None),
'percent_pos': getattr(self._player, 'percent_pos', None),
'position': getattr(self._player, 'playback_time', None),
'seekable': getattr(self._player, 'seekable', None),
'state': (
PlayerState.PAUSE.value
if self._player.pause
else PlayerState.PLAY.value
),
'title': getattr(self._player, 'media_title', None)
or getattr(self._player, 'filename', None),
'url': self._get_current_resource(),
'video_codec': getattr(self._player, 'video_codec'),
'video_format': getattr(self._player, 'video_format'),
'volume': getattr(self._player, 'volume'),
'volume_max': getattr(self._player, 'volume_max'),
'width': getattr(self._player, 'width'),
'video_codec': getattr(self._player, 'video_codec', None),
'video_format': getattr(self._player, 'video_format', None),
'volume': getattr(self._player, 'volume', None),
'volume_max': getattr(self._player, 'volume_max', None),
'width': getattr(self._player, 'width', None),
}
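The status diff above replaces bare `getattr(obj, name)` calls, which raise `AttributeError` when a property is missing, with the two-argument form that falls back to `None`. A minimal illustration with a stand-in object (not a real mpv player):

```python
class FakePlayer:
    # Only a subset of mpv's properties is defined
    filename = 'video.mkv'

player = FakePlayer()

# getattr(player, 'media_title') would raise AttributeError here;
# the default makes the status dict robust to missing attributes
status = {
    'filename': getattr(player, 'filename', None),
    'title': getattr(player, 'media_title', None)
    or getattr(player, 'filename', None),
}
```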
def on_stop(self, callback):
@ -413,12 +454,13 @@ class MediaMpvPlugin(MediaPlugin):
if not self._player or not self._player.stream_path:
return
return ('file://' if os.path.isfile(self._player.stream_path)
else '') + self._player.stream_path
return (
'file://' if os.path.isfile(self._player.stream_path) else ''
) + self._player.stream_path
def _get_resource(self, resource):
if self._is_youtube_resource(resource):
return resource # mpv can handle YouTube streaming natively
return resource # mpv can handle YouTube streaming natively
return super()._get_resource(resource)

View File

@ -21,44 +21,68 @@ class MqttPlugin(Plugin):
"""
def __init__(self, host=None, port=1883, tls_cafile=None,
tls_certfile=None, tls_keyfile=None,
tls_version=None, tls_ciphers=None, tls_insecure=False,
username=None, password=None, client_id=None, timeout=None, **kwargs):
def __init__(
self,
host=None,
port=1883,
tls_cafile=None,
tls_certfile=None,
tls_keyfile=None,
tls_version=None,
tls_ciphers=None,
tls_insecure=False,
username=None,
password=None,
client_id=None,
timeout=None,
**kwargs,
):
"""
:param host: If set, MQTT messages will by default routed to this host unless overridden in `send_message` (default: None)
:param host: If set, MQTT messages will by default be routed to this host
unless overridden in `send_message` (default: None)
:type host: str
:param port: If a default host is set, specify the listen port (default: 1883)
:param port: If a default host is set, specify the listen port
(default: 1883)
:type port: int
:param tls_cafile: If a default host is set and requires TLS/SSL, specify the certificate authority file (default: None)
:param tls_cafile: If a default host is set and requires TLS/SSL,
specify the certificate authority file (default: None)
:type tls_cafile: str
:param tls_certfile: If a default host is set and requires TLS/SSL, specify the certificate file (default: None)
:param tls_certfile: If a default host is set and requires TLS/SSL,
specify the certificate file (default: None)
:type tls_certfile: str
:param tls_keyfile: If a default host is set and requires TLS/SSL, specify the key file (default: None)
:param tls_keyfile: If a default host is set and requires TLS/SSL,
specify the key file (default: None)
:type tls_keyfile: str
:param tls_version: If TLS/SSL is enabled on the MQTT server and it requires a certain TLS version, specify it
here (default: None). Supported versions: ``tls`` (automatic), ``tlsv1``, ``tlsv1.1``, ``tlsv1.2``.
:param tls_version: If TLS/SSL is enabled on the MQTT server and it
requires a certain TLS version, specify it here (default: None).
Supported versions: ``tls`` (automatic), ``tlsv1``, ``tlsv1.1``,
``tlsv1.2``.
:type tls_version: str
:param tls_ciphers: If a default host is set and requires TLS/SSL, specify the supported ciphers (default: None)
:param tls_ciphers: If a default host is set and requires TLS/SSL,
specify the supported ciphers (default: None)
:type tls_ciphers: str
:param tls_insecure: Set to True to ignore TLS insecure warnings (default: False).
:param tls_insecure: Set to True to ignore TLS insecure warnings
(default: False).
:type tls_insecure: bool
:param username: If a default host is set and requires user authentication, specify the username ciphers (default: None)
:param username: If a default host is set and requires user
authentication, specify the username (default: None)
:type username: str
:param password: If a default host is set and requires user authentication, specify the password ciphers (default: None)
:param password: If a default host is set and requires user
authentication, specify the password (default: None)
:type password: str
:param client_id: ID used to identify the client on the MQTT server (default: None).
If None is specified then ``Config.get('device_id')`` will be used.
:param client_id: ID used to identify the client on the MQTT server
(default: None). If None is specified then
``Config.get('device_id')`` will be used.
:type client_id: str
:param timeout: Client timeout in seconds (default: None).
@ -83,10 +107,11 @@ class MqttPlugin(Plugin):
@staticmethod
def get_tls_version(version: Optional[str] = None):
import ssl
if not version:
return None
if type(version) == type(ssl.PROTOCOL_TLS):
if isinstance(version, type(ssl.PROTOCOL_TLS)):
return version
if isinstance(version, str):
@ -120,10 +145,17 @@ class MqttPlugin(Plugin):
def _expandpath(path: Optional[str] = None) -> Optional[str]:
return os.path.abspath(os.path.expanduser(path)) if path else None
def _get_client(self, tls_cafile: Optional[str] = None, tls_certfile: Optional[str] = None,
tls_keyfile: Optional[str] = None, tls_version: Optional[str] = None,
tls_ciphers: Optional[str] = None, tls_insecure: Optional[bool] = None,
username: Optional[str] = None, password: Optional[str] = None):
def _get_client(
self,
tls_cafile: Optional[str] = None,
tls_certfile: Optional[str] = None,
tls_keyfile: Optional[str] = None,
tls_version: Optional[str] = None,
tls_ciphers: Optional[str] = None,
tls_insecure: Optional[bool] = None,
username: Optional[str] = None,
password: Optional[str] = None,
):
from paho.mqtt.client import Client
tls_cafile = self._expandpath(tls_cafile or self.tls_cafile)
@ -144,43 +176,77 @@ class MqttPlugin(Plugin):
if username and password:
client.username_pw_set(username, password)
if tls_cafile:
client.tls_set(ca_certs=tls_cafile, certfile=tls_certfile, keyfile=tls_keyfile,
tls_version=tls_version, ciphers=tls_ciphers)
client.tls_set(
ca_certs=tls_cafile,
certfile=tls_certfile,
keyfile=tls_keyfile,
tls_version=tls_version,
ciphers=tls_ciphers,
)
client.tls_insecure_set(tls_insecure)
return client
@action
def publish(self, topic: str, msg: Any, host: Optional[str] = None, port: Optional[int] = None,
reply_topic: Optional[str] = None, timeout: int = 60,
tls_cafile: Optional[str] = None, tls_certfile: Optional[str] = None,
tls_keyfile: Optional[str] = None, tls_version: Optional[str] = None,
tls_ciphers: Optional[str] = None, tls_insecure: Optional[bool] = None,
username: Optional[str] = None, password: Optional[str] = None):
def publish(
self,
topic: str,
msg: Any,
host: Optional[str] = None,
port: Optional[int] = None,
reply_topic: Optional[str] = None,
timeout: int = 60,
tls_cafile: Optional[str] = None,
tls_certfile: Optional[str] = None,
tls_keyfile: Optional[str] = None,
tls_version: Optional[str] = None,
tls_ciphers: Optional[str] = None,
tls_insecure: Optional[bool] = None,
username: Optional[str] = None,
password: Optional[str] = None,
qos: int = 0,
):
"""
Sends a message to a topic.
:param topic: Topic/channel where the message will be delivered
:param msg: Message to be sent. It can be a list, a dict, or a Message object.
:param host: MQTT broker hostname/IP (default: default host configured on the plugin).
:param port: MQTT broker port (default: default port configured on the plugin).
:param reply_topic: If a ``reply_topic`` is specified, then the action will wait for a response on this topic.
:param timeout: If ``reply_topic`` is set, use this parameter to specify the maximum amount of time to
wait for a response (default: 60 seconds).
:param tls_cafile: If TLS/SSL is enabled on the MQTT server and the certificate requires a certificate authority
to authenticate it, `ssl_cafile` will point to the provided ca.crt file (default: None).
:param tls_certfile: If TLS/SSL is enabled on the MQTT server and a client certificate it required, specify it
here (default: None).
:param tls_keyfile: If TLS/SSL is enabled on the MQTT server and a client certificate key it required, specify
it here (default: None).
:param tls_version: If TLS/SSL is enabled on the MQTT server and it requires a certain TLS version, specify it
here (default: None). Supported versions: ``tls`` (automatic), ``tlsv1``, ``tlsv1.1``, ``tlsv1.2``.
:param tls_insecure: Set to True to ignore TLS insecure warnings (default: False).
:param tls_ciphers: If TLS/SSL is enabled on the MQTT server and an explicit list of supported ciphers is
required, specify it here (default: None).
:param username: Specify it if the MQTT server requires authentication (default: None).
:param password: Specify it if the MQTT server requires authentication (default: None).
:param msg: Message to be sent. It can be a list, a dict, or a Message
object.
:param host: MQTT broker hostname/IP (default: default host configured
on the plugin).
:param port: MQTT broker port (default: default port configured on the
plugin).
:param reply_topic: If a ``reply_topic`` is specified, then the action
will wait for a response on this topic.
:param timeout: If ``reply_topic`` is set, use this parameter to
specify the maximum amount of time to wait for a response (default:
60 seconds).
:param tls_cafile: If TLS/SSL is enabled on the MQTT server and the
certificate requires a certificate authority to authenticate it,
`ssl_cafile` will point to the provided ca.crt file (default:
None).
:param tls_certfile: If TLS/SSL is enabled on the MQTT server and a
client certificate is required, specify it here (default: None).
:param tls_keyfile: If TLS/SSL is enabled on the MQTT server and a
client certificate key is required, specify it here (default:
None).
:param tls_version: If TLS/SSL is enabled on the MQTT server and it
requires a certain TLS version, specify it here (default: None).
Supported versions: ``tls`` (automatic), ``tlsv1``, ``tlsv1.1``,
``tlsv1.2``.
:param tls_insecure: Set to True to ignore TLS insecure warnings
(default: False).
:param tls_ciphers: If TLS/SSL is enabled on the MQTT server and an
explicit list of supported ciphers is required, specify it here
(default: None).
:param username: Specify it if the MQTT server requires authentication
(default: None).
:param password: Specify it if the MQTT server requires authentication
(default: None).
:param qos: Quality of Service (_QoS_) for the message - see `MQTT QoS
<https://assetwolf.com/learn/mqtt-qos-understanding-quality-of-service>`_
(default: 0).
"""
response_buffer = io.BytesIO()
client = None
@ -199,20 +265,29 @@ class MqttPlugin(Plugin):
port = port or self.port or 1883
assert host, 'No host specified'
client = self._get_client(tls_cafile=tls_cafile, tls_certfile=tls_certfile, tls_keyfile=tls_keyfile,
tls_version=tls_version, tls_ciphers=tls_ciphers, tls_insecure=tls_insecure,
username=username, password=password)
client = self._get_client(
tls_cafile=tls_cafile,
tls_certfile=tls_certfile,
tls_keyfile=tls_keyfile,
tls_version=tls_version,
tls_ciphers=tls_ciphers,
tls_insecure=tls_insecure,
username=username,
password=password,
)
client.connect(host, port, keepalive=timeout)
response_received = threading.Event()
if reply_topic:
client.on_message = self._response_callback(reply_topic=reply_topic,
event=response_received,
buffer=response_buffer)
client.on_message = self._response_callback(
reply_topic=reply_topic,
event=response_received,
buffer=response_buffer,
)
client.subscribe(reply_topic)
client.publish(topic, str(msg))
client.publish(topic, str(msg), qos=qos)
if not reply_topic:
return
@ -241,6 +316,7 @@ class MqttPlugin(Plugin):
buffer.write(msg.payload)
client.loop_stop()
event.set()
return on_message
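The `_response_callback` factory above closes over a buffer and an event so the publisher can block until a reply arrives. A simplified sketch of the same closure pattern, with a stubbed message object instead of a live broker (and omitting the `client.loop_stop()` call):

```python
import io
import threading

def make_response_callback(reply_topic, event, buffer):
    # paho-mqtt invokes on_message(client, userdata, message)
    def on_message(_client, _userdata, msg):
        if msg.topic != reply_topic:
            return  # not the reply we are waiting for
        buffer.write(msg.payload)
        event.set()  # wake up the waiting publisher
    return on_message

class StubMessage:
    topic = 'response/topic'
    payload = b'{"status": "ok"}'

evt, buf = threading.Event(), io.BytesIO()
callback = make_response_callback('response/topic', evt, buf)
callback(None, None, StubMessage())
```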
@action

View File

@ -6,9 +6,17 @@ from platypush.message.response import Response
from platypush.plugins import action
from platypush.plugins.media import PlayerState
from platypush.plugins.music import MusicPlugin
from platypush.schemas.spotify import SpotifyDeviceSchema, SpotifyStatusSchema, SpotifyTrackSchema, \
SpotifyHistoryItemSchema, SpotifyPlaylistSchema, SpotifyAlbumSchema, SpotifyEpisodeSchema, SpotifyShowSchema, \
SpotifyArtistSchema
from platypush.schemas.spotify import (
SpotifyDeviceSchema,
SpotifyStatusSchema,
SpotifyTrackSchema,
SpotifyHistoryItemSchema,
SpotifyPlaylistSchema,
SpotifyAlbumSchema,
SpotifyEpisodeSchema,
SpotifyShowSchema,
SpotifyArtistSchema,
)
class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
@ -45,9 +53,16 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
be printed on the application logs/stdout.
"""
def __init__(self, client_id: Optional[str] = None, client_secret: Optional[str] = None, **kwargs):
def __init__(
self,
client_id: Optional[str] = None,
client_secret: Optional[str] = None,
**kwargs,
):
MusicPlugin.__init__(self, **kwargs)
SpotifyMixin.__init__(self, client_id=client_id, client_secret=client_secret, **kwargs)
SpotifyMixin.__init__(
self, client_id=client_id, client_secret=client_secret, **kwargs
)
self._players_by_id = {}
self._players_by_name = {}
# Playlist ID -> snapshot ID and tracks cache
@ -63,14 +78,16 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
return dev
@staticmethod
def _parse_datetime(dt: Optional[Union[str, datetime, int, float]]) -> Optional[datetime]:
def _parse_datetime(
dt: Optional[Union[str, datetime, int, float]]
) -> Optional[datetime]:
if isinstance(dt, str):
try:
dt = float(dt)
except (ValueError, TypeError):
return datetime.fromisoformat(dt)
if isinstance(dt, int) or isinstance(dt, float):
if isinstance(dt, (int, float)):
return datetime.fromtimestamp(dt)
return dt
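`_parse_datetime` accepts ISO-8601 strings, epoch numbers (also as strings), or `datetime` objects; its behavior can be reproduced standalone:

```python
from datetime import datetime
from typing import Optional, Union

def parse_datetime(
    dt: Optional[Union[str, datetime, int, float]]
) -> Optional[datetime]:
    if isinstance(dt, str):
        try:
            dt = float(dt)  # numeric strings are treated as epoch timestamps
        except (ValueError, TypeError):
            return datetime.fromisoformat(dt)
    if isinstance(dt, (int, float)):
        return datetime.fromtimestamp(dt)
    return dt  # datetime (or None) passes through

parse_datetime('2022-09-19T20:39:21')  # ISO string -> datetime
parse_datetime('1663612761')           # epoch string -> local-time datetime
```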
@ -85,18 +102,12 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
devices = self.spotify_user_call('/v1/me/player/devices').get('devices', [])
self._players_by_id = {
**self._players_by_id,
**{
dev['id']: dev
for dev in devices
}
**{dev['id']: dev for dev in devices},
}
self._players_by_name = {
**self._players_by_name,
**{
dev['name']: dev
for dev in devices
}
**{dev['name']: dev for dev in devices},
}
return SpotifyDeviceSchema().dump(devices, many=True)
@ -118,7 +129,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
params={
'volume_percent': volume,
**({'device_id': device} if device else {}),
}
},
)
def _get_volume(self, device: Optional[str] = None) -> Optional[int]:
@ -138,10 +149,13 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
if device:
device = self._get_device(device)['id']
self.spotify_user_call('/v1/me/player/volume', params={
'volume_percent': min(100, (self._get_volume() or 0) + delta),
**({'device_id': device} if device else {}),
})
self.spotify_user_call(
'/v1/me/player/volume',
params={
'volume_percent': min(100, (self._get_volume() or 0) + delta),
**({'device_id': device} if device else {}),
},
)
@action
def voldown(self, delta: int = 5, device: Optional[str] = None):
@ -154,10 +168,13 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
if device:
device = self._get_device(device)['id']
self.spotify_user_call('/v1/me/player/volume', params={
'volume_percent': max(0, (self._get_volume() or 0) - delta),
**({'device_id': device} if device else {}),
})
self.spotify_user_call(
'/v1/me/player/volume',
params={
'volume_percent': max(0, (self._get_volume() or 0) - delta),
**({'device_id': device} if device else {}),
},
)
@action
def play(self, resource: Optional[str] = None, device: Optional[str] = None):
@ -192,8 +209,12 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
# noinspection PyUnresolvedReferences
status = self.status().output
state = 'play' \
if status.get('device_id') != device or status.get('state') != PlayerState.PLAY.value else 'pause'
state = (
'play'
if status.get('device_id') != device
or status.get('state') != PlayerState.PLAY.value
else 'pause'
)
self.spotify_user_call(
f'/v1/me/player/{state}',
@ -212,7 +233,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
status = self.status().output
if status.get('state') == PlayerState.PLAY.value:
self.spotify_user_call(
f'/v1/me/player/pause',
'/v1/me/player/pause',
method='put',
)
@ -230,7 +251,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
status = self.status().output
if status.get('state') != PlayerState.PLAY.value:
self.spotify_user_call(
f'/v1/me/player/play',
'/v1/me/player/play',
method='put',
params={
**({'device_id': device} if device else {}),
@ -261,7 +282,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
"""
device = self._get_device(device)['id']
self.spotify_user_call(
f'/v1/me/player',
'/v1/me/player',
method='put',
json={
'device_ids': [device],
@ -279,7 +300,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
device = self._get_device(device)['id']
self.spotify_user_call(
f'/v1/me/player/next',
'/v1/me/player/next',
method='post',
params={
**({'device_id': device} if device else {}),
@ -297,7 +318,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
device = self._get_device(device)['id']
self.spotify_user_call(
f'/v1/me/player/previous',
'/v1/me/player/previous',
method='post',
params={
**({'device_id': device} if device else {}),
@ -316,7 +337,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
device = self._get_device(device)['id']
self.spotify_user_call(
f'/v1/me/player/seek',
'/v1/me/player/seek',
method='put',
params={
'position_ms': int(position * 1000),
@ -338,13 +359,16 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
if value is None:
# noinspection PyUnresolvedReferences
status = self.status().output
state = 'context' \
if status.get('device_id') != device or not status.get('repeat') else 'off'
state = (
'context'
if status.get('device_id') != device or not status.get('repeat')
else 'off'
)
else:
state = value is True
self.spotify_user_call(
f'/v1/me/player/repeat',
'/v1/me/player/repeat',
method='put',
params={
'state': 'context' if state else 'off',
@ -366,12 +390,12 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
if value is None:
# noinspection PyUnresolvedReferences
status = self.status().output
state = True if status.get('device_id') != device or not status.get('random') else False
state = bool(status.get('device_id') != device or not status.get('random'))
else:
state = value is True
self.spotify_user_call(
f'/v1/me/player/shuffle',
'/v1/me/player/shuffle',
method='put',
params={
'state': state,
@ -380,8 +404,12 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
)
@action
def history(self, limit: int = 20, before: Optional[Union[datetime, str, int]] = None,
after: Optional[Union[datetime, str, int]] = None):
def history(
self,
limit: int = 20,
before: Optional[Union[datetime, str, int]] = None,
after: Optional[Union[datetime, str, int]] = None,
):
"""
Get the list of recently played tracks on the account.
@ -396,21 +424,26 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
after = self._parse_datetime(after)
assert not (before and after), 'before and after cannot both be set'
results = self._spotify_paginate_results('/v1/me/player/recently-played',
limit=limit,
params={
'limit': min(limit, 50),
**({'before': before} if before else {}),
**({'after': after} if after else {}),
})
results = self._spotify_paginate_results(
'/v1/me/player/recently-played',
limit=limit,
params={
'limit': min(limit, 50),
**({'before': before} if before else {}),
**({'after': after} if after else {}),
},
)
return SpotifyHistoryItemSchema().dump([
{
**item.pop('track'),
**item,
}
for item in results
], many=True)
return SpotifyHistoryItemSchema().dump(
[
{
**item.pop('track'),
**item,
}
for item in results
],
many=True,
)
@action
def add(self, resource: str, device: Optional[str] = None, **kwargs):
@ -424,7 +457,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
device = self._get_device(device)['id']
self.spotify_user_call(
f'/v1/me/player/queue',
'/v1/me/player/queue',
method='post',
params={
'uri': resource,
@ -472,7 +505,9 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
return SpotifyTrackSchema().dump(track)
@action
def get_playlists(
self, limit: int = 1000, offset: int = 0, user: Optional[str] = None
):
"""
Get the user's playlists.
@ -483,7 +518,8 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
"""
playlists = self._spotify_paginate_results(
f'/v1/{"users/" + user if user else "me"}/playlists',
limit=limit,
offset=offset,
)
return SpotifyPlaylistSchema().dump(playlists, many=True)
@ -491,36 +527,45 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
def _get_playlist(self, playlist: str) -> dict:
playlists = self.get_playlists().output
playlists = [
pl
for pl in playlists
if (pl['id'] == playlist or pl['uri'] == playlist or pl['name'] == playlist)
]
assert playlists, f'No such playlist ID, URI or name: {playlist}'
return playlists[0]
def _get_playlist_tracks_from_cache(
self, id: str, snapshot_id: str, limit: Optional[int] = None, offset: int = 0
) -> Optional[Iterable]:
snapshot = self._playlist_snapshots.get(id)
if (
not snapshot
or snapshot['snapshot_id'] != snapshot_id
or (limit is None and snapshot['limit'] is not None)
):
return
if limit is not None and snapshot['limit'] is not None:
stored_range = (snapshot['limit'], snapshot['limit'] + snapshot['offset'])
requested_range = (limit, limit + offset)
if (
requested_range[0] < stored_range[0]
or requested_range[1] > stored_range[1]
):
return
return snapshot['tracks']
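The cache lookup above hinges on Spotify's `snapshot_id`: stored tracks are only reused when the snapshot still matches and the requested window fits inside what was cached. A simplified, self-contained sketch of the same invalidation rule (the plugin's own bookkeeping differs slightly in how it stores the window):

```python
from typing import Optional


def cached_tracks(
    cache: dict,
    playlist_id: str,
    snapshot_id: str,
    limit: Optional[int] = None,
    offset: int = 0,
) -> Optional[list]:
    """Return cached tracks only if the snapshot is current and the
    requested [offset, offset + limit) window fits in the stored one."""
    snap = cache.get(playlist_id)
    if not snap or snap['snapshot_id'] != snapshot_id:
        return None  # playlist changed (or was never cached): refetch
    if limit is not None and snap['limit'] is not None:
        stored = (snap['offset'], snap['offset'] + snap['limit'])
        wanted = (offset, offset + limit)
        if wanted[0] < stored[0] or wanted[1] > stored[1]:
            return None  # requested window exceeds the cached one
    return snap['tracks']


cache = {
    'pl1': {'snapshot_id': 'v2', 'limit': 10, 'offset': 0, 'tracks': ['a', 'b']}
}
```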
def _cache_playlist_data(
self,
id: str,
snapshot_id: str,
tracks: Iterable[dict],
limit: Optional[int] = None,
offset: int = 0,
**_,
):
self._playlist_snapshots[id] = {
'id': id,
'tracks': tracks,
@ -530,7 +575,13 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
}
@action
def get_playlist(
self,
playlist: str,
with_tracks: bool = True,
limit: Optional[int] = None,
offset: int = 0,
):
"""
Get a playlist's content.
@ -544,8 +595,10 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
playlist = self._get_playlist(playlist)
if with_tracks:
playlist['tracks'] = self._get_playlist_tracks_from_cache(
playlist['id'],
snapshot_id=playlist['snapshot_id'],
limit=limit,
offset=offset,
)
if playlist['tracks'] is None:
@ -554,13 +607,16 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
**track,
'track': {
**track['track'],
'position': offset + i + 1,
},
}
for i, track in enumerate(
self._spotify_paginate_results(
f'/v1/playlists/{playlist["id"]}/tracks',
limit=limit,
offset=offset,
)
)
]
self._cache_playlist_data(**playlist, limit=limit, offset=offset)
@ -568,7 +624,12 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
return SpotifyPlaylistSchema().dump(playlist)
@action
def add_to_playlist(
self,
playlist: str,
resources: Union[str, Iterable[str]],
position: Optional[int] = None,
):
"""
Add one or more items to a playlist.
@ -585,11 +646,14 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
},
json={
'uris': [
uri.strip()
for uri in (
resources.split(',')
if isinstance(resources, str)
else resources
)
]
},
)
snapshot_id = response.get('snapshot_id')
@ -611,18 +675,27 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
'tracks': [
{'uri': uri.strip()}
for uri in (
resources.split(',')
if isinstance(resources, str)
else resources
)
]
},
)
snapshot_id = response.get('snapshot_id')
assert snapshot_id is not None, 'Could not save playlist'
@action
def playlist_move(
self,
playlist: str,
from_pos: int,
to_pos: int,
range_length: int = 1,
resources: Optional[Union[str, Iterable[str]]] = None,
**_,
):
"""
Move or replace elements in a playlist.
@ -641,12 +714,21 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
'range_start': int(from_pos) + 1,
'range_length': int(range_length),
'insert_before': int(to_pos) + 1,
**(
{
'uris': [
uri.strip()
for uri in (
resources.split(',')
if isinstance(resources, str)
else resources
)
]
}
if resources
else {}
),
},
)
snapshot_id = response.get('snapshot_id')
@ -673,8 +755,14 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
# noinspection PyShadowingBuiltins
@action
def search(
self,
query: Optional[Union[str, dict]] = None,
limit: int = 50,
offset: int = 0,
type: str = 'track',
**filter,
) -> Iterable[dict]:
"""
Search for tracks matching certain criteria.
@ -714,12 +802,16 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
}.get('uri', [])
uris = uri.split(',') if isinstance(uri, str) else uri
params = (
{
'ids': ','.join([uri.split(':')[-1].strip() for uri in uris]),
}
if uris
else {
'q': self._make_filter(query, **filter),
'type': type,
}
)
response = self._spotify_paginate_results(
f'/v1/{type + "s" if uris else "search"}',
@ -739,7 +831,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
track.get('track'),
track.get('title'),
track.get('popularity'),
),
)
schema_class = None
@ -759,6 +851,31 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
return response
@action
def create_playlist(
self, name: str, description: Optional[str] = None, public: bool = False
):
"""
Create a playlist.
:param name: Playlist name.
:param description: Optional playlist description.
:param public: Whether the new playlist should be public
(default: False).
:return: .. schema:: spotify.SpotifyPlaylistSchema
"""
ret = self.spotify_user_call(
'/v1/users/me/playlists',
method='post',
json={
'name': name,
'description': description,
'public': public,
},
)
return SpotifyPlaylistSchema().dump(ret)
@action
def follow_playlist(self, playlist: str, public: bool = True):
"""
@ -774,7 +891,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
method='put',
json={
'public': public,
},
)
@action
@ -792,10 +909,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
@staticmethod
def _uris_to_id(*uris: str) -> Iterable[str]:
return [uri.split(':')[-1] for uri in uris]
@action
def get_albums(self, limit: int = 50, offset: int = 0) -> List[dict]:
@ -811,7 +925,8 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
'/v1/me/albums',
limit=limit,
offset=offset,
),
many=True,
)
@action
@ -852,9 +967,7 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
return [
SpotifyTrackSchema().dump(item['track'])
for item in self._spotify_paginate_results(
'/v1/me/tracks', limit=limit, offset=offset
)
]
@ -898,7 +1011,8 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
'/v1/me/episodes',
limit=limit,
offset=offset,
),
many=True,
)
@action
@ -941,7 +1055,8 @@ class MusicSpotifyPlugin(MusicPlugin, SpotifyMixin):
'/v1/me/shows',
limit=limit,
offset=offset,
),
many=True,
)
@action
@ -0,0 +1,397 @@
import json
import os
import pathlib
from datetime import datetime
from typing import Iterable, Optional, Union
from platypush.config import Config
from platypush.context import Variable, get_bus
from platypush.message.event.music.tidal import TidalPlaylistUpdatedEvent
from platypush.plugins import RunnablePlugin, action
from platypush.plugins.music.tidal.workers import get_items
from platypush.schemas.tidal import (
TidalAlbumSchema,
TidalPlaylistSchema,
TidalArtistSchema,
TidalSearchResultsSchema,
TidalTrackSchema,
)
class MusicTidalPlugin(RunnablePlugin):
"""
Plugin to interact with the user's Tidal account and library.
Upon the first login, the application will prompt you with a link to
connect to your Tidal account. Once authorized, you should no longer be
required to explicitly login.
Triggers:
* :class:`platypush.message.event.music.tidal.TidalPlaylistUpdatedEvent`: when a user playlist
is updated.
Requires:
* **tidalapi** (``pip install 'tidalapi >= 0.7.0'``)
"""
_base_url = 'https://api.tidalhifi.com/v1/'
_default_credentials_file = os.path.join(
str(Config.get('workdir')), 'tidal', 'credentials.json'
)
def __init__(
self,
quality: str = 'high',
credentials_file: str = _default_credentials_file,
**kwargs,
):
"""
:param quality: Default audio quality. Default: ``high``.
Supported: [``lossless``, ``master``, ``high``, ``low``].
:param credentials_file: Path to the file where the OAuth session
parameters will be stored (default:
``<WORKDIR>/tidal/credentials.json``).
"""
from tidalapi import Quality
super().__init__(**kwargs)
self._credentials_file = os.path.expanduser(credentials_file)
self._user_playlists = {}
try:
self._quality = getattr(Quality, quality.lower())
except AttributeError:
raise AssertionError(
f'Invalid quality: {quality}. Supported values: '
f'{[q.name for q in Quality]}'
)
self._session = None
def _oauth_open_saved_session(self):
if not self._session:
return
try:
with open(self._credentials_file, 'r') as f:
data = json.load(f)
self._session.load_oauth_session(
data['token_type'], data['access_token'], data['refresh_token']
)
except Exception as e:
self.logger.warning('Could not load %s: %s', self._credentials_file, e)
def _oauth_create_new_session(self):
if not self._session:
return
self._session.login_oauth_simple(function=self.logger.warning) # type: ignore
if self._session.check_login():
data = {
'token_type': self._session.token_type,
'session_id': self._session.session_id,
'access_token': self._session.access_token,
'refresh_token': self._session.refresh_token,
}
pathlib.Path(os.path.dirname(self._credentials_file)).mkdir(
parents=True, exist_ok=True
)
with open(self._credentials_file, 'w') as outfile:
json.dump(data, outfile)
@property
def session(self):
from tidalapi import Config, Session
if self._session and self._session.check_login():
return self._session
# Attempt to reload the existing session from file
self._session = Session(config=Config(quality=self._quality))
self._oauth_open_saved_session()
if not self._session.check_login():
# Create a new session if we couldn't load an existing one
self._oauth_create_new_session()
assert (
self._session.user and self._session.check_login()
), 'Could not connect to TIDAL'
return self._session
@property
def user(self):
user = self.session.user
assert user, 'Not logged in'
return user
@action
def create_playlist(self, name: str, description: Optional[str] = None):
"""
Create a new playlist.
:param name: Playlist name.
:param description: Optional playlist description.
:return: .. schema:: tidal.TidalPlaylistSchema
"""
ret = self.user.create_playlist(name, description)
return TidalPlaylistSchema().dump(ret)
@action
def delete_playlist(self, playlist_id: str):
"""
Delete a playlist by ID.
:param playlist_id: ID of the playlist to delete.
"""
pl = self.session.playlist(playlist_id)
pl.delete()
@action
def edit_playlist(self, playlist_id: str, title=None, description=None):
"""
Edit a playlist's metadata.
:param playlist_id: Playlist ID.
:param title: New title.
:param description: New description.
"""
pl = self.session.playlist(playlist_id)
pl.edit(title=title, description=description)
@action
def get_playlists(self):
"""
Get the user's playlists (track lists are excluded).
:return: .. schema:: tidal.TidalPlaylistSchema(many=True)
"""
ret = self.user.playlists() + self.user.favorites.playlists()
return TidalPlaylistSchema().dump(ret, many=True)
@action
def get_playlist(self, playlist_id: str):
"""
Get the details of a playlist (including tracks).
:param playlist_id: Playlist ID.
:return: .. schema:: tidal.TidalPlaylistSchema
"""
pl = self.session.playlist(playlist_id)
pl._tracks = get_items(pl.tracks)
return TidalPlaylistSchema().dump(pl)
@action
def get_artist(self, artist_id: Union[str, int]):
"""
Get the details of an artist.
:param artist_id: Artist ID.
:return: .. schema:: tidal.TidalArtistSchema
"""
ret = self.session.artist(artist_id)
ret.albums = get_items(ret.get_albums)
return TidalArtistSchema().dump(ret)
@action
def get_album(self, album_id: Union[str, int]):
"""
Get the details of an album.
:param album_id: Album ID.
:return: .. schema:: tidal.TidalAlbumSchema
"""
ret = self.session.album(album_id)
return TidalAlbumSchema(with_tracks=True).dump(ret)
@action
def get_track(self, track_id: Union[str, int]):
"""
Get the details of a track.
:param track_id: Track ID.
:return: .. schema:: tidal.TidalTrackSchema
"""
ret = self.session.track(track_id)
return TidalTrackSchema().dump(ret)
@action
def search(
self,
query: str,
limit: int = 50,
offset: int = 0,
type: Optional[str] = None,
):
"""
Perform a search.
:param query: Query string.
:param limit: Maximum results that should be returned (default: 50).
:param offset: Search offset (default: 0).
:param type: Type of results that should be returned. Default: None
(return all the results that match the query). Supported:
``artist``, ``album``, ``track`` and ``playlist``.
:return: .. schema:: tidal.TidalSearchResultsSchema
"""
from tidalapi.artist import Artist
from tidalapi.album import Album
from tidalapi.media import Track
from tidalapi.playlist import Playlist
models = None
if type is not None:
if type == 'artist':
models = [Artist]
elif type == 'album':
models = [Album]
elif type == 'track':
models = [Track]
elif type == 'playlist':
models = [Playlist]
else:
raise AssertionError(f'Unsupported search type: {type}')
ret = self.session.search(query, models=models, limit=limit, offset=offset)
return TidalSearchResultsSchema().dump(ret)
@action
def get_download_url(self, track_id: str) -> str:
"""
Get the direct download URL of a track.
:param track_id: Track ID.
"""
return self.session.track(track_id).get_url()
@action
def add_to_playlist(self, playlist_id: str, track_ids: Iterable[Union[str, int]]):
"""
Append one or more tracks to a playlist.
:param playlist_id: Target playlist ID.
:param track_ids: List of track IDs to append.
"""
pl = self.session.playlist(playlist_id)
pl.add(track_ids)
@action
def remove_from_playlist(
self,
playlist_id: str,
track_id: Optional[Union[str, int]] = None,
index: Optional[int] = None,
):
"""
Remove a track from a playlist.
Specify either the ``track_id`` or the ``index``.
:param playlist_id: Target playlist ID.
:param track_id: ID of the track to remove.
:param index: Index of the track to remove.
"""
assert not (
track_id is None and index is None
), 'Please specify either track_id or index'
pl = self.session.playlist(playlist_id)
if index is not None:
pl.remove_by_index(index)
if track_id is not None:
pl.remove_by_id(track_id)
@action
def add_track(self, track_id: Union[str, int]):
"""
Add a track to the user's collection.
:param track_id: Track ID.
"""
self.user.favorites.add_track(track_id)
@action
def add_album(self, album_id: Union[str, int]):
"""
Add an album to the user's collection.
:param album_id: Album ID.
"""
self.user.favorites.add_album(album_id)
@action
def add_artist(self, artist_id: Union[str, int]):
"""
Add an artist to the user's collection.
:param artist_id: Artist ID.
"""
self.user.favorites.add_artist(artist_id)
@action
def add_playlist(self, playlist_id: str):
"""
Add a playlist to the user's collection.
:param playlist_id: Playlist ID.
"""
self.user.favorites.add_playlist(playlist_id)
@action
def remove_track(self, track_id: Union[str, int]):
"""
Remove a track from the user's collection.
:param track_id: Track ID.
"""
self.user.favorites.remove_track(track_id)
@action
def remove_album(self, album_id: Union[str, int]):
"""
Remove an album from the user's collection.
:param album_id: Album ID.
"""
self.user.favorites.remove_album(album_id)
@action
def remove_artist(self, artist_id: Union[str, int]):
"""
Remove an artist from the user's collection.
:param artist_id: Artist ID.
"""
self.user.favorites.remove_artist(artist_id)
@action
def remove_playlist(self, playlist_id: str):
"""
Remove a playlist from the user's collection.
:param playlist_id: Playlist ID.
"""
self.user.favorites.remove_playlist(playlist_id)
def main(self):
while not self.should_stop():
playlists = self.session.user.playlists() # type: ignore
for pl in playlists:
last_updated_var = Variable(f'TIDAL_PLAYLIST_LAST_UPDATE[{pl.id}]')
prev_last_updated = last_updated_var.get()
if prev_last_updated:
prev_last_updated = datetime.fromisoformat(prev_last_updated)
if pl.last_updated > prev_last_updated:
get_bus().post(TidalPlaylistUpdatedEvent(playlist_id=pl.id))
if not prev_last_updated or pl.last_updated > prev_last_updated:
last_updated_var.set(pl.last_updated.isoformat())
self.wait_stop(self.poll_interval)
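The polling loop above only fires `TidalPlaylistUpdatedEvent` when a playlist's `last_updated` timestamp moves past the value persisted in a `Variable`, so restarts don't replay old updates. The comparison logic can be sketched against a plain in-memory store (hypothetical names; the plugin persists through `Variable` and posts to the event bus instead):

```python
from datetime import datetime


def detect_updates(playlists, store: dict) -> list:
    """Return the IDs of playlists updated since the stored timestamp,
    advancing the stored timestamps as a side effect."""
    updated = []
    for pl_id, last_updated in playlists:
        prev = store.get(pl_id)
        if prev is not None:
            prev = datetime.fromisoformat(prev)
            if last_updated > prev:
                updated.append(pl_id)  # this is where an event would fire
        if prev is None or last_updated > prev:
            store[pl_id] = last_updated.isoformat()
    return updated


store = {}
first = detect_updates([('pl1', datetime(2022, 9, 19, 20, 0))], store)
second = detect_updates([('pl1', datetime(2022, 9, 19, 20, 0))], store)
third = detect_updates([('pl1', datetime(2022, 9, 20, 8, 30))], store)
```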
@ -0,0 +1,9 @@
manifest:
events:
- platypush.message.event.music.TidalPlaylistUpdatedEvent: when a user playlist
is updated.
install:
pip:
- tidalapi >= 0.7.0
package: platypush.plugins.music.tidal
type: plugin
@ -0,0 +1,56 @@
from concurrent.futures import ThreadPoolExecutor
from typing import Callable
def func_wrapper(args):
(f, offset, *args) = args
items = f(*args)
return [(i + offset, item) for i, item in enumerate(items)]
def get_items(
func: Callable,
*args,
parse: Callable = lambda _: _,
chunk_size: int = 100,
processes: int = 5,
):
"""
Paginate over a function that supports `limit`/`offset` parameters,
running the API requests in parallel to speed things up.
"""
items = []
offsets = [-chunk_size]
remaining = chunk_size * processes
with ThreadPoolExecutor(
processes, thread_name_prefix=f'mopidy-tidal-{func.__name__}-'
) as pool:
while remaining == chunk_size * processes:
offsets = [offsets[-1] + chunk_size * (i + 1) for i in range(processes)]
pool_results = pool.map(
func_wrapper,
[
(
func,
offset,
*args,
chunk_size, # limit
offset, # offset
)
for offset in offsets
],
)
new_items = []
for results in pool_results:
new_items.extend(results)
remaining = len(new_items)
items.extend(new_items)
items = sorted([_ for _ in items if _], key=lambda item: item[0])
sorted_items = [item[1] for item in items]
return list(map(parse, sorted_items))
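`get_items` above fans a `limit`/`offset` API out across a thread pool in fixed-size chunks, stops as soon as a round of chunks comes back short, and re-sorts the results by their original offsets. A simplified sketch of the same idea (ordering comes for free here because `ThreadPoolExecutor.map` preserves input order), run against a fake paginated endpoint:

```python
from concurrent.futures import ThreadPoolExecutor


def get_all(func, chunk_size: int = 100, workers: int = 5) -> list:
    """Fetch `workers` chunks at a time until one comes back short."""
    items, offset = [], 0
    with ThreadPoolExecutor(workers) as pool:
        while True:
            offsets = [offset + i * chunk_size for i in range(workers)]
            chunks = list(pool.map(lambda o: func(chunk_size, o), offsets))
            for chunk in chunks:
                items.extend(chunk)
            if any(len(chunk) < chunk_size for chunk in chunks):
                return items
            offset += chunk_size * workers


def fake_tracks(limit: int, offset: int) -> list:
    """Stand-in for an API call that accepts limit/offset."""
    data = [f'track-{i}' for i in range(230)]
    return data[offset:offset + limit]


tracks = get_all(fake_tracks)
```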
@ -1,8 +1,13 @@
import datetime
import os
import queue
import re
import threading
import time
from dateutil.tz import tzutc
from typing import Iterable, Optional, Collection, Set
from xml.etree import ElementTree
import dateutil.parser
import requests
@ -24,56 +29,67 @@ class RssPlugin(RunnablePlugin):
Requires:
* **feedparser** (``pip install feedparser``)
* **defusedxml** (``pip install defusedxml``)
"""
user_agent = (
'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) '
+ 'Chrome/62.0.3202.94 Safari/537.36'
)
def __init__(
self,
subscriptions: Optional[Collection[str]] = None,
poll_seconds: int = 300,
user_agent: str = user_agent,
**kwargs,
):
"""
:param subscriptions: List of feeds to monitor for updates, as URLs.
OPML URLs/local files are also supported.
:param poll_seconds: How often we should check for updates (default: 300 seconds).
:param user_agent: Custom user agent to use for the requests.
"""
super().__init__(**kwargs)
self.poll_seconds = poll_seconds
self.user_agent = user_agent
self._feeds_metadata = {}
self._feed_worker_queues = [queue.Queue() for _ in range(5)]
self._feed_response_queue = queue.Queue()
self._feed_workers = []
self._latest_entries = []
self.subscriptions = list(self._parse_subscriptions(subscriptions or []))
self._latest_timestamps = self._get_latest_timestamps()
@staticmethod
def _get_feed_latest_timestamp_varname(url: str) -> str:
return f'LATEST_FEED_TIMESTAMP[{url}]'
@classmethod
def _get_feed_latest_timestamp(cls, url: str) -> Optional[datetime.datetime]:
t = (
get_plugin('variable')
.get(cls._get_feed_latest_timestamp_varname(url))
.output.get(cls._get_feed_latest_timestamp_varname(url))
)
if t:
return dateutil.parser.isoparse(t)
def _get_latest_timestamps(self) -> dict:
return {url: self._get_feed_latest_timestamp(url) for url in self.subscriptions}
def _update_latest_timestamps(self) -> None:
variable = get_plugin('variable')
variable.set(
**{
self._get_feed_latest_timestamp_varname(url): latest_timestamp
for url, latest_timestamp in self._latest_timestamps.items()
}
)
@staticmethod
def _parse_content(entry) -> Optional[str]:
@ -96,23 +112,30 @@ class RssPlugin(RunnablePlugin):
"""
import feedparser
feed = feedparser.parse(
requests.get(url, headers={'User-Agent': self.user_agent}).text
)
return RssFeedEntrySchema().dump(
sorted(
[
{
'feed_url': url,
'feed_title': getattr(feed.feed, 'title', None),
'id': getattr(entry, 'id', None),
'url': entry.link,
'published': datetime.datetime.fromtimestamp(
time.mktime(entry.published_parsed)
),
'title': entry.title,
'summary': getattr(entry, 'summary', None),
'content': self._parse_content(entry),
}
for entry in feed.entries
if getattr(entry, 'published_parsed', None)
],
key=lambda e: e['published'],
),
many=True,
)
@action
@ -123,7 +146,9 @@ class RssPlugin(RunnablePlugin):
:param limit: Maximum number of entries to return (default: 20).
:return: .. schema:: rss.RssFeedEntrySchema(many=True)
"""
return sorted(self._latest_entries, key=lambda e: e['published'], reverse=True)[
:limit
]
def _feed_worker(self, q: queue.Queue):
while not self.should_stop():
@ -133,18 +158,157 @@ class RssPlugin(RunnablePlugin):
continue
try:
self._feed_response_queue.put(
{
'url': url,
'content': self.parse_feed(url).output,
}
)
except Exception as e:
self._feed_response_queue.put(
{
'url': url,
'error': e,
}
)
self._feed_response_queue.put(None)
def _parse_opml_lists(self, subs: Iterable[str]) -> Set[str]:
from defusedxml import ElementTree
feeds = set()
subs = set(subs)
content_by_sub = {}
urls = {sub for sub in subs if re.search(r'^https?://', sub)}
files = {os.path.expanduser(sub) for sub in subs if sub not in urls}
for url in urls:
try:
content_by_sub[url] = requests.get(
url,
headers={
'User-Agent': self.user_agent,
},
).text
except Exception as e:
self.logger.warning('Could not retrieve subscription %s: %s', url, e)
for file in files:
try:
with open(file, 'r') as f:
content_by_sub[file] = f.read()
except Exception as e:
self.logger.warning('Could not open file %s: %s', file, e)
for sub, content in content_by_sub.items():
root = ElementTree.fromstring(content.strip())
if root.tag != 'opml':
self.logger.warning('%s is not a valid OPML resource', sub)
continue
feeds.update(self._parse_feeds_from_outlines(root.findall('body/outline')))
return feeds
def _parse_feeds_from_outlines(
self,
outlines: Iterable[ElementTree.Element],
) -> Set[str]:
feeds = set()
outlines = list(outlines)
while outlines:
outline = outlines.pop(0)
if 'xmlUrl' in outline.attrib:
url = outline.attrib['xmlUrl']
feeds.add(url)
self._feeds_metadata[url] = {
**self._feeds_metadata.get(url, {}),
'title': outline.attrib.get('title'),
'description': outline.attrib.get('text'),
'url': outline.attrib.get('htmlUrl'),
}
for i, child in enumerate(outline.iter()):
if i > 0:
outlines.append(child)
return feeds
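`_parse_feeds_from_outlines` above walks the outline tree iteratively, collecting every `xmlUrl` it finds and queueing child outlines for later. The traversal can be sketched standalone over a small OPML body (stdlib `ElementTree` here for brevity; the plugin parses untrusted input through `defusedxml`):

```python
from xml.etree import ElementTree

opml = """<opml version="2.0"><body>
  <outline text="Tech">
    <outline text="Blog" xmlUrl="https://example.org/feed.xml"/>
    <outline text="News" xmlUrl="https://example.org/news.rss"/>
  </outline>
</body></opml>"""


def feed_urls(root: ElementTree.Element) -> set:
    """Iterative walk over <outline> nodes, collecting every xmlUrl."""
    feeds = set()
    pending = list(root.findall('body/outline'))
    while pending:
        outline = pending.pop(0)
        if 'xmlUrl' in outline.attrib:
            feeds.add(outline.attrib['xmlUrl'])
        pending.extend(outline.findall('outline'))
    return feeds


urls = feed_urls(ElementTree.fromstring(opml))
```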
def _parse_subscriptions(self, subs: Iterable[str]) -> Iterable[str]:
import feedparser
self.logger.info('Parsing feed subscriptions')
feeds = set()
lists = set()
for sub in subs:
try:
# Check if it's an OPML list of feeds or an individual feed
feed = feedparser.parse(sub)
if feed.feed.get('opml'):
lists.add(sub)
else:
channel = feed.get('channel', {})
self._feeds_metadata[sub] = {
**self._feeds_metadata.get(sub, {}),
'title': channel.get('title'),
'description': channel.get('description'),
'url': channel.get('link'),
}
feeds.add(sub)
except Exception as e:
self.logger.warning('Could not parse %s: %s', sub, e)
feeds.update(self._parse_opml_lists(lists))
return feeds
@staticmethod
def _datetime_to_string(dt: datetime.datetime) -> str:
return dt.replace(tzinfo=tzutc()).strftime('%a, %d %b %Y %H:%M:%S %Z')
@action
def export_to_opml(self) -> str:
"""
Export the list of subscriptions into OPML format.
:return: The list of subscriptions as a string in OPML format.
"""
root = ElementTree.Element('opml', {'version': '2.0'})
head = ElementTree.Element('head')
title = ElementTree.Element('title')
title.text = 'Platypush feed subscriptions'
created = ElementTree.Element('dateCreated')
created.text = self._datetime_to_string(datetime.datetime.utcnow())
head.append(title)
head.append(created)
body = ElementTree.Element('body')
feeds = ElementTree.Element('outline', {'text': 'Feeds'})
for sub in self.subscriptions:
metadata = self._feeds_metadata.get(sub, {})
feed = ElementTree.Element(
'outline',
{
'xmlUrl': sub,
'text': metadata.get('description', metadata.get('title', sub)),
**({'htmlUrl': metadata['url']} if metadata.get('url') else {}),
**({'title': metadata['title']} if metadata.get('title') else {}),
},
)
feeds.append(feed)
body.append(feeds)
root.append(head)
root.append(body)
return ElementTree.tostring(root, encoding='utf-8', method='xml').decode()
def main(self):
self._feed_workers = [
threading.Thread(target=self._feed_worker, args=(q,))
@ -154,12 +318,16 @@ class RssPlugin(RunnablePlugin):
for worker in self._feed_workers:
worker.start()
self.logger.info(
f'Initialized RSS plugin with {len(self.subscriptions)} subscriptions'
)
while not self.should_stop():
responses = {}
for i, url in enumerate(self.subscriptions):
worker_queue = self._feed_worker_queues[
i % len(self._feed_worker_queues)
]
worker_queue.put(url)
time_start = time.time()
@ -168,12 +336,14 @@ class RssPlugin(RunnablePlugin):
new_entries = []
while (
not self.should_stop()
and len(responses) < len(self.subscriptions)
and time.time() - time_start <= timeout
):
try:
response = self._feed_response_queue.get(
block=True, timeout=max_time - time_start
)
except queue.Empty:
self.logger.warning('RSS parse timeout')
break
@ -189,7 +359,9 @@ class RssPlugin(RunnablePlugin):
else:
responses[url] = response['content']
responses = {
k: v for k, v in responses.items() if not isinstance(v, Exception)
}
for url, response in responses.items():
latest_timestamp = self._latest_timestamps.get(url)
@ -205,7 +377,7 @@ class RssPlugin(RunnablePlugin):
self._update_latest_timestamps()
self._latest_entries = new_entries
self.wait_stop(self.poll_seconds)
def stop(self):
super().stop()
@ -4,5 +4,6 @@ manifest:
install:
pip:
- feedparser
- defusedxml
package: platypush.plugins.rss
type: plugin
@ -37,7 +37,7 @@ class TorrentPlugin(Plugin):
torrent_state = {}
transfers = {}
# noinspection HttpUrlsUsage
default_popcorn_base_url = 'http://popcorn-time.ga'
def __init__(self, download_dir=None, torrent_ports=None, imdb_key=None, popcorn_base_url=default_popcorn_base_url,
**kwargs):
@ -0,0 +1,119 @@
import requests
from typing import Optional
from urllib.parse import urljoin, urlencode
from platypush.backend.http.app.utils import get_local_base_url
from platypush.context import get_backend
from platypush.plugins import action
from platypush.plugins.tts import TtsPlugin
from platypush.schemas.tts.mimic3 import Mimic3VoiceSchema
class TtsMimic3Plugin(TtsPlugin):
"""
TTS plugin that uses the `Mimic3 webserver
<https://github.com/MycroftAI/mimic3>`_ provided by `Mycroft
<https://mycroft.ai/>`_ as a text-to-speech engine.
The easiest way to deploy a Mimic3 instance is probably via Docker:
.. code-block:: bash
$ mkdir -p "$HOME/.local/share/mycroft/mimic3"
$ chmod a+rwx "$HOME/.local/share/mycroft/mimic3"
$ docker run --rm \
-p 59125:59125 \
-v "$HOME/.local/share/mycroft/mimic3:/home/mimic3/.local/share/mycroft/mimic3" \
'mycroftai/mimic3'
Requires:
* At least a *media plugin* (see
:class:`platypush.plugins.media.MediaPlugin`) enabled/configured -
used for speech playback.
* The ``http`` backend (:class:`platypush.backend.http.HttpBackend`)
enabled - used for proxying the API calls.
"""
def __init__(
self,
server_url: str,
voice: str = 'en_UK/apope_low',
media_plugin: Optional[str] = None,
player_args: Optional[dict] = None,
**kwargs
):
"""
:param server_url: Base URL of the web server that runs the Mimic3 engine.
:param voice: Default voice to be used (default: ``en_UK/apope_low``).
You can get a full list of the voices available on the server
through :meth:`.voices`.
:param media_plugin: Media plugin to be used for audio playback. Supported:
- ``media.gstreamer``
- ``media.omxplayer``
- ``media.mplayer``
- ``media.mpv``
- ``media.vlc``
:param player_args: Optional arguments that should be passed to the player plugin's
:meth:`platypush.plugins.media.MediaPlugin.play` method.
"""
super().__init__(media_plugin=media_plugin, player_args=player_args, **kwargs)
self.server_url = server_url
self.voice = voice
@action
def say(
self,
text: str,
server_url: Optional[str] = None,
voice: Optional[str] = None,
player_args: Optional[dict] = None,
):
"""
Say some text.
:param text: Text to say.
:param server_url: Default ``server_url`` override.
:param voice: Default ``voice`` override.
:param player_args: Default ``player_args`` override.
"""
server_url = server_url or self.server_url
voice = voice or self.voice
player_args = player_args or self.player_args
http = get_backend('http')
assert http, 'http backend not configured'
assert self.media_plugin, 'No media plugin configured'
url = (
urljoin(get_local_base_url(), '/tts/mimic3/say')
+ '?'
+ urlencode(
{
'text': text,
'server_url': server_url,
'voice': voice,
}
)
)
self.media_plugin.play(url, **player_args)
@action
def voices(self, server_url: Optional[str] = None):
"""
List the voices available on the server.
:param server_url: Default ``server_url`` override.
:return: .. schema:: tts.mimic3.Mimic3VoiceSchema(many=True)
"""
server_url = server_url or self.server_url
rs = requests.get(urljoin(server_url, '/api/voices'))
rs.raise_for_status()
return Mimic3VoiceSchema().dump(rs.json(), many=True)
# vim:sw=4:ts=4:et:
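The `say` action doesn't talk to Mimic3 directly: it hands the media plugin a URL on the local HTTP backend, which proxies the request to the Mimic3 server. A minimal standalone sketch of that URL construction (the base and server addresses are hypothetical examples, not defaults):

```python
from urllib.parse import urljoin, urlencode

def build_say_url(base_url: str, text: str, server_url: str, voice: str) -> str:
    # Same construction as TtsMimic3Plugin.say: the http backend's
    # /tts/mimic3/say endpoint proxies the TTS request to the Mimic3 server.
    return (
        urljoin(base_url, '/tts/mimic3/say')
        + '?'
        + urlencode({'text': text, 'server_url': server_url, 'voice': voice})
    )

print(build_say_url(
    'http://localhost:8008',   # hypothetical local webserver address
    'Hello world',
    'http://localhost:59125',  # Mimic3 server from the Docker example above
    'en_UK/apope_low',
))
```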
@@ -0,0 +1,6 @@
manifest:
events: {}
install:
pip: []
package: platypush.plugins.tts.mimic3
type: plugin
@@ -0,0 +1,405 @@
import json
import os
import pathlib
import requests
import time
from datetime import datetime, timedelta
from typing import Iterable, List, Optional
from urllib.parse import urljoin
from platypush.config import Config
from platypush.plugins import Plugin, action
from platypush.schemas.wallabag import WallabagEntrySchema
class WallabagPlugin(Plugin):
"""
Plugin to interact with Wallabag (https://wallabag.it),
an open-source alternative to Instapaper and Pocket.
"""
_default_credentials_file = os.path.join(
str(Config.get('workdir')), 'wallabag', 'credentials.json'
)
def __init__(
self,
client_id: str,
client_secret: str,
server_url: str = 'https://wallabag.it',
username: Optional[str] = None,
password: Optional[str] = None,
credentials_file: str = _default_credentials_file,
**kwargs,
):
"""
:param client_id: Client ID for your application - you can create one
at ``<server_url>/developer``.
:param client_secret: Client secret for your application - you can
create one at ``<server_url>/developer``.
:param server_url: Base URL of the Wallabag server (default: ``https://wallabag.it``).
:param username: Wallabag username. Only needed for the first login,
you can remove it afterwards. Alternatively, you can provide it
on the :meth:`.login` method.
:param password: Wallabag password. Only needed for the first login,
you can remove it afterwards. Alternatively, you can provide it
on the :meth:`.login` method.
:param credentials_file: Path to the file where the OAuth session
parameters will be stored (default:
``<WORKDIR>/wallabag/credentials.json``).
"""
super().__init__(**kwargs)
self._client_id = client_id
self._client_secret = client_secret
self._server_url = server_url
self._username = username
self._password = password
self._credentials_file = os.path.expanduser(credentials_file)
self._session = {}
def _oauth_open_saved_session(self):
try:
with open(self._credentials_file, 'r') as f:
data = json.load(f)
except Exception as e:
self.logger.warning('Could not load %s: %s', self._credentials_file, e)
return
self._session = {
'username': data['username'],
'client_id': data.get('client_id', self._client_id),
'client_secret': data.get('client_secret', self._client_secret),
'access_token': data['access_token'],
'refresh_token': data['refresh_token'],
}
if data.get('expires_at') and time.time() > data['expires_at']:
self.logger.info('OAuth token expired, refreshing it')
self._oauth_refresh_token()
def _oauth_refresh_token(self):
url = urljoin(self._server_url, '/oauth/v2/token')
rs = requests.post(
url,
json={
'grant_type': 'refresh_token',
'client_id': self._client_id,
'client_secret': self._client_secret,
'access_token': self._session['access_token'],
'refresh_token': self._session['refresh_token'],
},
)
rs.raise_for_status()
rs = rs.json()
self._session.update(
{
'access_token': rs['access_token'],
'refresh_token': rs['refresh_token'],
'expires_at': (
int(
(
datetime.now() + timedelta(seconds=rs['expires_in'])
).timestamp()
)
if rs.get('expires_in')
else None
),
}
)
self._oauth_flush_session()
def _oauth_create_new_session(self, username: str, password: str):
url = urljoin(self._server_url, '/oauth/v2/token')
rs = requests.post(
url,
json={
'grant_type': 'password',
'client_id': self._client_id,
'client_secret': self._client_secret,
'username': username,
'password': password,
},
)
rs.raise_for_status()
rs = rs.json()
self._session = {
'client_id': self._client_id,
'client_secret': self._client_secret,
'username': username,
'access_token': rs['access_token'],
'refresh_token': rs['refresh_token'],
'expires_at': (
int((datetime.now() + timedelta(seconds=rs['expires_in'])).timestamp())
if rs.get('expires_in')
else None
),
}
self._oauth_flush_session()
def _oauth_flush_session(self):
pathlib.Path(self._credentials_file).parent.mkdir(parents=True, exist_ok=True)
pathlib.Path(self._credentials_file).touch(mode=0o600, exist_ok=True)
with open(self._credentials_file, 'w') as f:
f.write(json.dumps(self._session))
@action
def login(self, username: Optional[str] = None, password: Optional[str] = None):
"""
Create a new user session if not logged in.
:param username: Default ``username`` override.
:param password: Default ``password`` override.
"""
self._oauth_open_saved_session()
if self._session:
return
username = username or self._username
password = password or self._password
assert (
username and password
), 'No stored user session and no username/password provided'
self._oauth_create_new_session(username, password)
def _request(self, url: str, method: str, *args, as_json=True, **kwargs):
url = urljoin(self._server_url, f'api/{url}')
func = getattr(requests, method.lower())
self.login()
kwargs['headers'] = {
**kwargs.get('headers', {}),
'Authorization': f'Bearer {self._session["access_token"]}',
}
rs = func(url, *args, **kwargs)
rs.raise_for_status()
return rs.json() if as_json else rs.text
@action
def list(
self,
archived: bool = True,
starred: bool = False,
sort: str = 'created',
descending: bool = False,
page: int = 1,
limit: int = 30,
tags: Optional[Iterable[str]] = None,
since: Optional[int] = None,
full: bool = True,
) -> List[dict]:
"""
List saved links.
:param archived: Include archived items (default: ``True``).
:param starred: Include only starred items (default: ``False``).
:param sort: Timestamp sort criteria. Supported: ``created``,
``updated``, ``archived`` (default: ``created``).
:param descending: Sort in descending order (default: ``False``).
:param page: Results page to be retrieved (default: ``1``).
:param limit: Maximum number of entries per page (default: ``30``).
:param tags: Filter by a list of tags.
:param since: Return entries created after this timestamp (as a UNIX
timestamp).
:param full: Include the full parsed body of the saved entry.
:return: .. schema:: wallabag.WallabagEntrySchema(many=True)
"""
rs = self._request(
'/entries.json',
method='get',
params={
'archived': int(archived),
'starred': int(starred),
'sort': sort,
'order': 'desc' if descending else 'asc',
'page': page,
'perPage': limit,
'tags': ','.join(tags or []),
'since': since or 0,
'detail': 'full' if full else 'metadata',
},
)
return WallabagEntrySchema().dump(
rs.get('_embedded', {}).get('items', []), many=True
)
@action
def search(
self,
term: str,
page: int = 1,
limit: int = 30,
) -> List[dict]:
"""
Search links by some text.
:param term: Term to be searched.
:param page: Results page to be retrieved (default: ``1``).
:param limit: Maximum number of entries per page (default: ``30``).
:return: .. schema:: wallabag.WallabagEntrySchema(many=True)
"""
rs = self._request(
'/search.json',
method='get',
params={
'term': term,
'page': page,
'perPage': limit,
},
)
return WallabagEntrySchema().dump(
rs.get('_embedded', {}).get('items', []), many=True
)
@action
def get(self, id: int) -> Optional[dict]:
"""
Get the content and metadata of a link by ID.
:param id: Entry ID.
:return: .. schema:: wallabag.WallabagEntrySchema
"""
rs = self._request(f'/entries/{id}.json', method='get')
return WallabagEntrySchema().dump(rs) # type: ignore
@action
def export(self, id: int, file: str, format: str = 'txt'):
"""
Export a saved entry to a file in the specified format.
:param id: Entry ID.
:param file: Output filename.
:param format: Output format. Supported: ``txt``, ``xml``, ``csv``,
``pdf``, ``epub`` and ``mobi`` (default: ``txt``).
"""
rs = self._request(
f'/entries/{id}/export.{format}', method='get', as_json=False
)
if isinstance(rs, str):
rs = rs.encode()
with open(os.path.expanduser(file), 'wb') as f:
f.write(rs)
@action
def save(
self,
url: str,
title: Optional[str] = None,
content: Optional[str] = None,
tags: Optional[Iterable[str]] = None,
authors: Optional[Iterable[str]] = None,
archived: bool = False,
starred: bool = False,
public: bool = False,
language: Optional[str] = None,
preview_picture: Optional[str] = None,
) -> Optional[dict]:
"""
Save a link to Wallabag.
:param url: URL to be saved.
:param title: Entry title (default: parsed from the page content).
:param content: Entry content (default: parsed from the entry itself).
:param tags: List of tags to attach to the entry.
:param authors: List of authors of the entry (default: parsed from the
page content).
:param archived: Whether the entry should be created in the archive
(default: ``False``).
:param starred: Whether the entry should be starred (default:
``False``).
:param public: Whether the entry should be publicly available. If so, a
public URL will be generated (default: ``False``).
:param language: Language of the entry.
:param preview_picture: URL of a picture to be used for the preview
(default: parsed from the page itself).
:return: .. schema:: wallabag.WallabagEntrySchema
"""
rs = self._request(
'/entries.json',
method='post',
json={
'url': url,
'title': title,
'content': content,
'tags': ','.join(tags or []),
'authors': ','.join(authors or []),
'archive': int(archived),
'starred': int(starred),
'public': int(public),
'language': language,
'preview_picture': preview_picture,
},
)
return WallabagEntrySchema().dump(rs) # type: ignore
@action
def update(
self,
id: int,
title: Optional[str] = None,
content: Optional[str] = None,
tags: Optional[Iterable[str]] = None,
authors: Optional[Iterable[str]] = None,
archived: bool = False,
starred: bool = False,
public: bool = False,
language: Optional[str] = None,
preview_picture: Optional[str] = None,
) -> Optional[dict]:
"""
Update a link entry saved to Wallabag.
:param id: Entry ID.
:param title: New entry title.
:param content: New entry content.
:param tags: List of tags to attach to the entry.
:param authors: List of authors of the entry.
:param archived: Archive/unarchive the entry.
:param starred: Star/unstar the entry.
:param public: Mark the entry as public/private.
:param language: Change the language of the entry.
:param preview_picture: Change the preview picture URL.
:return: .. schema:: wallabag.WallabagEntrySchema
"""
rs = self._request(
f'/entries/{id}.json',
method='patch',
json={
'title': title,
'content': content,
'tags': ','.join(tags or []),
'authors': ','.join(authors or []),
'archive': int(archived),
'starred': int(starred),
'public': int(public),
'language': language,
'preview_picture': preview_picture,
},
)
return WallabagEntrySchema().dump(rs) # type: ignore
@action
def delete(self, id: int) -> Optional[dict]:
"""
Delete an entry by ID.
:param id: Entry ID.
:return: .. schema:: wallabag.WallabagEntrySchema
"""
rs = self._request(
f'/entries/{id}.json',
method='delete',
)
return WallabagEntrySchema().dump(rs) # type: ignore
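The login/refresh logic above hinges on one computation: turning the server's relative `expires_in` into an absolute `expires_at` UNIX timestamp, then comparing it against the clock when a saved session is reopened. A self-contained sketch of that bookkeeping (function names are illustrative, not part of the plugin):

```python
import time
from datetime import datetime, timedelta
from typing import Optional

def expires_at(expires_in: Optional[int]) -> Optional[int]:
    # Same computation as _oauth_create_new_session: absolute UNIX
    # timestamp at which the OAuth access token goes stale.
    if not expires_in:
        return None
    return int((datetime.now() + timedelta(seconds=expires_in)).timestamp())

def needs_refresh(session: dict) -> bool:
    # Mirrors the check in _oauth_open_saved_session.
    return bool(session.get('expires_at')) and time.time() > session['expires_at']

fresh = {'expires_at': expires_at(3600)}          # valid for another hour
stale = {'expires_at': int(time.time()) - 10}     # expired 10 seconds ago
print(needs_refresh(fresh), needs_refresh(stale))  # False True
```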
@@ -0,0 +1,3 @@
manifest:
package: platypush.plugins.wallabag
type: plugin
@@ -340,7 +340,7 @@ class MatrixMessageSchema(Schema):
class MatrixMessagesResponseSchema(Schema):
messages = fields.Nested(
-        MatrixMessageSchema(),
+        MatrixMessageSchema,
many=True,
required=True,
attribute='chunk',
platypush/schemas/tidal.py
@@ -0,0 +1,228 @@
from marshmallow import Schema, fields, pre_dump, post_dump
from platypush.schemas import DateTime
class TidalSchema(Schema):
pass
class TidalArtistSchema(TidalSchema):
id = fields.String(
required=True,
dump_only=True,
metadata={
'example': '3288612',
'description': 'Artist ID',
},
)
url = fields.String(
required=True,
dump_only=True,
metadata={
'description': 'Artist Tidal URL',
'example': 'https://tidal.com/artist/3288612',
},
)
name = fields.String(required=True)
@pre_dump
def _prefill_url(self, data, *_, **__):
data.url = f'https://tidal.com/artist/{data.id}'
return data
class TidalAlbumSchema(TidalSchema):
def __init__(self, *args, with_tracks=False, **kwargs):
super().__init__(*args, **kwargs)
self._with_tracks = with_tracks
id = fields.String(
required=True,
dump_only=True,
metadata={
'example': '45288612',
'description': 'Album ID',
},
)
url = fields.String(
required=True,
dump_only=True,
metadata={
'description': 'Album Tidal URL',
'example': 'https://tidal.com/album/45288612',
},
)
name = fields.String(required=True)
artist = fields.Nested(TidalArtistSchema)
duration = fields.Int(metadata={'description': 'Album duration, in seconds'})
year = fields.Integer(metadata={'example': 2003})
num_tracks = fields.Int(metadata={'example': 10})
tracks = fields.List(fields.Dict(), attribute='_tracks')
@pre_dump
def _prefill_url(self, data, *_, **__):
data.url = f'https://tidal.com/album/{data.id}'
return data
@pre_dump
def _cache_tracks(self, data, *_, **__):
if self._with_tracks:
album_id = str(data.id)
self.context[album_id] = {
'tracks': data.tracks(),
}
return data
@post_dump
def _dump_tracks(self, data, *_, **__):
if self._with_tracks:
album_id = str(data['id'])
ctx = self.context.pop(album_id, {})
data['tracks'] = TidalTrackSchema().dump(ctx.pop('tracks', []), many=True)
return data
class TidalTrackSchema(TidalSchema):
id = fields.String(
required=True,
dump_only=True,
metadata={
'example': '25288614',
'description': 'Track ID',
},
)
url = fields.String(
required=True,
dump_only=True,
metadata={
'description': 'Track Tidal URL',
'example': 'https://tidal.com/track/25288614',
},
)
artist = fields.Nested(TidalArtistSchema)
album = fields.Nested(TidalAlbumSchema)
name = fields.String(metadata={'description': 'Track title'})
duration = fields.Int(metadata={'description': 'Track duration, in seconds'})
track_num = fields.Int(
metadata={'description': 'Index of the track within the album'}
)
@pre_dump
def _prefill_url(self, data, *_, **__):
data.url = f'https://tidal.com/track/{data.id}'
return data
class TidalPlaylistSchema(TidalSchema):
id = fields.String(
required=True,
dump_only=True,
attribute='uuid',
metadata={
'example': '2b288612-34f5-11ed-b42d-001500e8f607',
'description': 'Playlist ID',
},
)
url = fields.String(
required=True,
dump_only=True,
metadata={
'description': 'Playlist Tidal URL',
'example': 'https://tidal.com/playlist/2b288612-34f5-11ed-b42d-001500e8f607',
},
)
name = fields.String(required=True)
description = fields.String()
duration = fields.Int(metadata={'description': 'Playlist duration, in seconds'})
public = fields.Boolean(attribute='publicPlaylist')
owner = fields.String(
attribute='creator',
metadata={
'description': 'Playlist creator/owner ID',
},
)
num_tracks = fields.Int(
attribute='numberOfTracks',
default=0,
metadata={
'example': 42,
'description': 'Number of tracks in the playlist',
},
)
created_at = DateTime(
attribute='created',
metadata={
'description': 'When the playlist was created',
},
)
last_updated_at = DateTime(
attribute='lastUpdated',
metadata={
'description': 'When the playlist was last updated',
},
)
tracks = fields.Nested(TidalTrackSchema, many=True)
def _flatten_object(self, data, *_, **__):
if not isinstance(data, dict):
data = {
'created': data.created,
'creator': data.creator.id,
'description': data.description,
'duration': data.duration,
'lastUpdated': data.last_updated,
'uuid': data.id,
'name': data.name,
'numberOfTracks': data.num_tracks,
'publicPlaylist': data.public,
'tracks': getattr(data, '_tracks', []),
}
return data
def _normalize_owner(self, data, *_, **__):
owner = data.pop('owner', data.pop('creator', None))
if owner:
if isinstance(owner, dict):
owner = owner['id']
data['creator'] = owner
return data
def _normalize_name(self, data, *_, **__):
if data.get('title'):
data['name'] = data.pop('title')
return data
@pre_dump
def normalize(self, data, *_, **__):
if not isinstance(data, dict):
data = self._flatten_object(data)
self._normalize_name(data)
self._normalize_owner(data)
if 'tracks' not in data:
data['tracks'] = []
return data
class TidalSearchResultsSchema(TidalSchema):
artists = fields.Nested(TidalArtistSchema, many=True)
albums = fields.Nested(TidalAlbumSchema, many=True)
tracks = fields.Nested(TidalTrackSchema, many=True)
playlists = fields.Nested(TidalPlaylistSchema, many=True)
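`TidalAlbumSchema` fetches each album's tracks in a `@pre_dump` hook, parks them in the schema `context` keyed by album ID, and attaches them to the serialized dict in `@post_dump`, popping the cache entry so it can't leak across dumps. The same two-phase pattern with marshmallow stripped out (class and key names are hypothetical):

```python
class AlbumDumper:
    def __init__(self, with_tracks: bool = False):
        self.with_tracks = with_tracks
        self.context = {}  # album_id -> cached per-album payload

    def pre_dump(self, album: dict) -> dict:
        # Phase 1: cache expensive per-album data before serialization.
        if self.with_tracks:
            self.context[str(album['id'])] = {'tracks': album.get('_tracks', [])}
        return album

    def post_dump(self, data: dict) -> dict:
        # Phase 2: attach (and evict) the cached tracks after serialization.
        if self.with_tracks:
            ctx = self.context.pop(str(data['id']), {})
            data['tracks'] = ctx.get('tracks', [])
        return data

d = AlbumDumper(with_tracks=True)
d.pre_dump({'id': 45288612, '_tracks': [{'id': 1, 'name': 'Track 1'}]})
out = d.post_dump({'id': 45288612, 'name': 'Some album'})
print(out['tracks'])  # [{'id': 1, 'name': 'Track 1'}]
```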
@@ -0,0 +1,51 @@
from marshmallow import Schema, fields
class Mimic3Schema(Schema):
pass
class Mimic3VoiceSchema(Mimic3Schema):
key = fields.String(
required=True,
dump_only=True,
metadata={
'description': 'Unique voice ID',
'example': 'en_UK/apope_low',
},
)
language = fields.String(
required=True,
dump_only=True,
metadata={
'example': 'en_UK',
},
)
language_english = fields.String(
metadata={
'description': 'Name of the language (in English)',
}
)
language_native = fields.String(
metadata={
'description': 'Name of the language (in the native language)',
}
)
name = fields.String(
metadata={
'example': 'apope_low',
}
)
sample_text = fields.String(
metadata={
'example': 'Some text',
}
)
description = fields.String()
aliases = fields.List(fields.String)
@@ -0,0 +1,147 @@
from marshmallow import Schema, fields
from platypush.schemas import DateTime
class WallabagSchema(Schema):
pass
class WallabagAnnotationSchema(WallabagSchema):
id = fields.Integer(
required=True,
dump_only=True,
metadata={'example': 2345},
)
text = fields.String(
attribute='quote',
metadata={
'example': 'Some memorable quote',
},
)
comment = fields.String(
attribute='text',
metadata={
'example': 'My comment on this memorable quote',
},
)
ranges = fields.Function(
lambda data: [
[int(r['startOffset']), int(r['endOffset'])] for r in data.get('ranges', [])
],
metadata={
'example': [[100, 180]],
},
)
created_at = DateTime(
metadata={
'description': 'When the annotation was created',
},
)
updated_at = DateTime(
metadata={
'description': 'When the annotation was last updated',
},
)
class WallabagEntrySchema(WallabagSchema):
id = fields.Integer(
required=True,
dump_only=True,
metadata={'example': 1234},
)
url = fields.URL(
required=True,
metadata={
'description': 'Original URL',
'example': 'https://example.com/article/some-title',
},
)
preview_picture = fields.URL(
metadata={
'description': 'Preview picture URL',
'example': 'https://example.com/article/some-title.jpg',
},
)
is_archived = fields.Boolean()
is_starred = fields.Boolean()
is_public = fields.Boolean()
mimetype = fields.String(
metadata={
'example': 'text/html',
},
)
title = fields.String(
metadata={
'description': 'Title of the saved page',
},
)
content = fields.String(
metadata={
'description': 'Parsed content',
}
)
language = fields.String(
metadata={
'example': 'en',
}
)
annotations = fields.List(fields.Nested(WallabagAnnotationSchema))
published_by = fields.List(
fields.String,
metadata={
'example': ['Author 1', 'Author 2'],
},
)
tags = fields.Function(
lambda data: [tag['label'] for tag in data.get('tags', [])],
metadata={
'example': ['tech', 'programming'],
},
)
reading_time = fields.Integer(
metadata={
'description': 'Estimated reading time, in minutes',
'example': 10,
}
)
created_at = DateTime(
metadata={
'description': 'When the entry was created',
},
)
updated_at = DateTime(
metadata={
'description': 'When the entry was last updated',
},
)
starred_at = DateTime(
metadata={
'description': 'If the entry is starred, when was it last marked',
},
)
published_at = DateTime(
metadata={
'description': 'When the entry was initially published',
},
)
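The two `fields.Function` columns above flatten Wallabag's nested payloads at dump time: annotation ranges become `[start, end]` integer pairs, and tag objects collapse to their labels. The same transforms as plain functions, with hypothetical sample payloads:

```python
def annotation_ranges(data: dict) -> list:
    # Same transform as WallabagAnnotationSchema.ranges: the API returns
    # offsets as strings inside range objects; flatten to int pairs.
    return [[int(r['startOffset']), int(r['endOffset'])]
            for r in data.get('ranges', [])]

def entry_tags(data: dict) -> list:
    # Same transform as WallabagEntrySchema.tags: keep only the labels.
    return [tag['label'] for tag in data.get('tags', [])]

annotation = {'ranges': [{'startOffset': '100', 'endOffset': '180'}]}
entry = {'tags': [{'id': 1, 'label': 'tech'}, {'id': 2, 'label': 'programming'}]}
print(annotation_ranges(annotation))  # [[100, 180]]
print(entry_tags(entry))              # ['tech', 'programming']
```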
@@ -1,5 +1,5 @@
[bumpversion]
-current_version = 0.23.3
+current_version = 0.23.6
commit = True
tag = True
@@ -28,7 +28,7 @@ backend = pkg_files('platypush/backend')
setup(
name="platypush",
-    version="0.23.3",
+    version="0.23.6",
author="Fabio Manganiello",
author_email="info@fabiomanganiello.com",
description="Platypush service",
@@ -64,7 +64,7 @@ setup(
'zeroconf>=0.27.0',
'tz',
'python-dateutil',
-        'cryptography',
+        # 'cryptography',
'pyjwt',
'marshmallow',
'frozendict',
@@ -86,7 +86,7 @@ setup(
# Support for MQTT backends
'mqtt': ['paho-mqtt'],
# Support for RSS feeds parser
-        'rss': ['feedparser'],
+        'rss': ['feedparser', 'defusedxml'],
# Support for PDF generation
'pdf': ['weasyprint'],
# Support for Philips Hue plugin