Compare commits


64 commits

Author SHA1 Message Date
Rajesh Rajendran
14932b1742 Revert "Moving cli to scripts folder (#1196)"
This reverts commit c947e48d99.
2023-04-22 11:41:03 +02:00
Rajesh Rajendran
c947e48d99
Moving cli to scripts folder (#1196) 2023-04-22 11:37:31 +02:00
Rajesh Rajendran
4f18c7a6c3
Bump image tags (#1194)
* chore(helm): Updating frontend image release

* chore(helm): Updating chalice image release
2023-04-21 11:53:42 +02:00
Kraiem Taha Yassine
2b52768196
Merge pull request #1193 from openreplay/v1.11.0-patch
feat(chalice): refactored records list
2023-04-21 00:49:52 +01:00
Taha Yassine Kraiem
de338247b6 feat(chalice): refactored records list 2023-04-21 00:47:45 +01:00
Kraiem Taha Yassine
837067157c
Merge pull request #1192 from openreplay/v1.11.0-patch
feat(chalice): return all records if date is not specified
2023-04-21 00:33:08 +01:00
Taha Yassine Kraiem
2063d55522 feat(chalice): return all records if date is not specified 2023-04-21 00:31:48 +01:00
Rajesh Rajendran
a28faab96b
chore(helm): Updating frontend image release (#1191) 2023-04-20 17:33:07 +02:00
Shekar Siri
ada7e74aed
fix(player): fix vroot context getter (#1190)
Co-authored-by: nick-delirium <nikita@openreplay.com>
2023-04-20 15:17:43 +02:00
Rajesh Rajendran
4abd69e0e2
chore(cli): Updating comment (#1188)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-19 18:09:51 +02:00
Rajesh Rajendran
1a6277c29f
chore(cli): removing log message (#1186)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-19 15:01:48 +02:00
Rajesh Rajendran
9a520b6352
chore(cli): Adding option to keep backup directories (#1185)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-19 11:42:51 +02:00
Rajesh Rajendran
606ac0d906
chore(cli): Updating comment (#1184)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-19 11:32:16 +02:00
Rajesh Rajendran
34eb81b590
Update cli for fetch latest patches and kubeconfig file hierarchy (#1183)
* chore(helm): Kubeconfig file hierarchy

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* chore(cli): openreplay -u fetches update from current version, unless
flag set

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-19 11:23:34 +02:00
Mehdi Osman
22c4f491b9
Updated tracker version 2023-04-18 18:11:43 +02:00
Rajesh Rajendran
974dbbd06b
chore(helm): Updating frontend image release (#1180) 2023-04-18 17:07:48 +02:00
Shekar Siri
d406c720b8
fix(ui) - check for error status and force logout (#1179)
* fix(ui) - token expire

* fix(ui) - token expire
2023-04-18 13:08:18 +02:00
Rajesh Rajendran
c0e9205780
chore(helm): Updating frontend image release (#1178) 2023-04-18 12:45:08 +02:00
Shekar Siri
0ac3067f4f
fix(ui) - sessions reload (#1177) 2023-04-18 09:26:47 +02:00
Rajesh Rajendran
45a87bfc81
chore(helm): Updating frontend image release (#1176) 2023-04-17 15:26:16 +02:00
Shekar Siri
c3ce2dfeb8
fix(ui) - fixes from dev (#1175)
* fix(player): consider stringDict before any CreateDocument (fastfix)

* style(player/DOMManager/safeCSSRules): depend on interfaces

* fixup! fix(player): consider stringDict before any CreateDocument (fastfix)

* fix(ui) - user sessions modal - navigation

* fix(player): proper unmount

---------

Co-authored-by: Alex Kaminskii <alex@openreplay.com>
Co-authored-by: nick-delirium <nikita@openreplay.com>
2023-04-17 14:40:26 +02:00
Shekar Siri
8d8f320ddb
Player virtual dom lazy creation (#1174)
* feat(player): lazy JS DOM node creation; (need fixes for reaching  full potential)

* fix(player): drasticly reduce amount of node getter call during virtual node insertion

* feat(player/VirtualDOM): OnloadVRoot & OnloadStyleSheet for lazy iframe innerContent initialisation & elimination of forceInsertion requirement in this case;; few renamings

* style(player): few renamings; comments improved

* feat(player/DOMManager): VirtualNodes insertion prioretization (for styles)

---------

Co-authored-by: Alex Kaminskii <alex@openreplay.com>
2023-04-17 14:37:33 +02:00
Rajesh Rajendran
687ebfbaae
fix(helm): Variable override, prioriry to the user created one. (#1173) 2023-04-17 14:09:26 +02:00
Shekar Siri
ad2ecc167c
fix(ui) - search url to wait for metadata to load (#1172) 2023-04-17 11:59:29 +02:00
Rajesh Rajendran
50ca8538af
chore(helm): Updating frontend image release (#1166)
* chore(helm): Updating frontend image release

* fix(helm): PG custom port

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-14 17:07:03 +02:00
Mehdi Osman
506d8c289d
Updated hero 2023-04-13 19:25:39 +02:00
Shekar Siri
7021eec51c
change(ui) - player fixes from dev (#1165)
* conflicts from 72be865c5f

* fix(player): priority and await for message processing

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2023-04-13 18:29:55 +02:00
Kraiem Taha Yassine
be4e16901a
Merge pull request #1163 from openreplay/v1.11.0-patch
feat(crons): added missing dependency
2023-04-13 13:20:45 +01:00
Taha Yassine Kraiem
f099b642ce feat(crons): added missing dependency 2023-04-13 13:20:17 +01:00
Rajesh Rajendran
f241bfbef7
Bump image versions (#1162)
* chore(helm): Updating ender image release

* chore(helm): Updating storage image release
2023-04-13 13:19:06 +02:00
Rajesh Rajendran
8e3b89eea1
Changing default encryption to false (#1161) 2023-04-13 13:07:43 +02:00
Alexander
6f195a0ff0
feat(backend): enabled ecnryption and added metrics (#1160) 2023-04-13 12:47:44 +02:00
Rajesh Rajendran
0610965130
chore(helm): Enabling redis string for helm template variable (#1159)
fix #1158

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-13 10:10:20 +02:00
Shekar Siri
45c5dfc1bf
Add files via upload (#1157) 2023-04-12 19:10:50 +02:00
Shekar Siri
d020a8f8d7
Add files via upload (#1156) 2023-04-12 18:47:06 +02:00
Rajesh Rajendran
360d51d637
chore(helm): Updating chalice image release (#1155) 2023-04-12 18:11:19 +02:00
Kraiem Taha Yassine
83ea01762d
Merge pull request #1154 from openreplay/v1.11.0-patch
V1.11.0 patch
2023-04-12 16:50:05 +01:00
Taha Yassine Kraiem
a229f91501 chore(build): testing EE cron-Jobs 2023-04-12 16:36:51 +01:00
Taha Yassine Kraiem
b67300b462 feat(chalice): changed corn-Job execution time 2023-04-12 16:19:33 +01:00
Taha Yassine Kraiem
86017ec2cf feat(chalice): fixing Jobs 2023-04-12 15:59:17 +01:00
Rajesh Rajendran
62f238cdd0
chore(helm): disabling redis string if not enabled (#1153)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-12 16:25:45 +02:00
Taha Yassine Kraiem
92f4ffa1fb chore(build): test patch branch 2023-04-12 15:24:42 +01:00
Taha Yassine Kraiem
585d893063 feat(chalice): refactored Jobs
feat(chalice): added limits on Jobs
2023-04-12 15:23:50 +01:00
Kraiem Taha Yassine
76bb483505
Merge pull request #1152 from openreplay/v1.11.0-patch
V1.11.0 patch
2023-04-12 15:08:47 +01:00
Taha Yassine Kraiem
0d01afbcb5 feat(chalice): changes 2023-04-12 15:07:23 +01:00
Taha Yassine Kraiem
5c0faea838 feat(chalice): configurable mobs expiration 2023-04-12 14:51:15 +01:00
Taha Yassine Kraiem
e25bfabba0 feat(chalice): fixing jobs execution 2023-04-12 14:36:51 +01:00
Taha Yassine Kraiem
4113ffaa3b feat(chalice): debugging jobs execution 2023-04-12 14:07:47 +01:00
Taha Yassine Kraiem
82e2856d99 feat(chalice): debugging jobs execution 2023-04-12 13:48:52 +01:00
Taha Yassine Kraiem
b82be4c540 feat(chalice): debugging jobs execution 2023-04-12 13:37:16 +01:00
Taha Yassine Kraiem
048a9767ac feat(chalice): fixed jobs execution 2023-04-12 13:06:38 +01:00
Rajesh Rajendran
b14bcbb342
chore(build): Bump image version of frontend assets while building (#1149)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-11 16:07:36 +02:00
Rajesh Rajendran
dc032bf370
chore(helm): Updating frontend image release (#1147) 2023-04-11 15:06:23 +02:00
Rajesh Rajendran
d855bffb12
chore(helm): Adding option for records bucket (#1146)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-11 14:24:29 +02:00
Shekar Siri
6993104a02
fix(ui): fix player destruction on id change (#1145)
Co-authored-by: nick-delirium <nikita@openreplay.com>
2023-04-11 12:13:52 +02:00
Rajesh Rajendran
28a1ccf63e
chore(cli): Adding verbose logging (#1144)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-10 11:38:42 +02:00
Rajesh Rajendran
7784fdcdae
chore(helm): Updating chalice image release (#1143) 2023-04-09 15:47:12 +02:00
Rajesh Rajendran
ece075e2f3
fix(ee): chalice health check (#1142)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-09 15:43:38 +02:00
Rajesh Rajendran
99fa7e18c1
fix(helm): clickhouse username (#1141) 2023-04-09 15:09:02 +02:00
Rajesh Rajendran
191ae35311
chore(helm): Updating chalice image release (#1139) 2023-04-08 10:17:38 +02:00
Rajesh Rajendran
8c1e6ec02e
fix redis endpoint and chalice health endpoints (#1138)
* chore(helm): Adding redis string from global config

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(chalice): health check url for alerts and assist

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2023-04-08 10:14:10 +02:00
Rajesh Rajendran
0eb4b66b9d
chore(helm): Updating chalice image release (#1136) 2023-04-07 18:08:47 +02:00
Kraiem Taha Yassine
1d0f330118
Merge pull request #1135 from openreplay/v1.11.0-patch
feat(chalice): skip mob existence verification
2023-04-07 16:15:14 +01:00
Taha Yassine Kraiem
16e7be5e99 feat(chalice): skip mob existence verification 2023-04-07 16:14:33 +01:00
69 changed files with 1390 additions and 1115 deletions

View file

@@ -10,6 +10,7 @@ on:
   branches:
     - dev
     - api-*
+    - v1.11.0-patch
   paths:
     - "ee/api/**"
     - "api/**"

View file

@@ -10,6 +10,7 @@ on:
   branches:
     - dev
     - api-*
+    - v1.11.0-patch
   paths:
     - "api/**"
     - "!api/.gitignore"

View file

@@ -10,6 +10,7 @@ on:
   branches:
     - dev
     - api-*
+    - v1.11.0-patch
   paths:
     - "ee/api/**"
     - "api/**"

View file

@@ -14,9 +14,9 @@ def app_connection_string(name, port, path):

 HEALTH_ENDPOINTS = {
-    "alerts": app_connection_string("alerts-openreplay", 8888, "metrics"),
+    "alerts": app_connection_string("alerts-openreplay", 8888, "health"),
     "assets": app_connection_string("assets-openreplay", 8888, "metrics"),
-    "assist": app_connection_string("assist-openreplay", 8888, "metrics"),
+    "assist": app_connection_string("assist-openreplay", 8888, "health"),
     "chalice": app_connection_string("chalice-openreplay", 8888, "metrics"),
     "db": app_connection_string("db-openreplay", 8888, "metrics"),
     "ender": app_connection_string("ender-openreplay", 8888, "metrics"),

View file

@@ -1,6 +1,6 @@
 from chalicelib.utils import pg_client, helper
 from chalicelib.utils.TimeUTC import TimeUTC
-from chalicelib.core import sessions, sessions_mobs
+from chalicelib.core import sessions_mobs, sessions_devtool


 class Actions:
@@ -17,11 +17,9 @@ class JobStatus:
 def get(job_id):
     with pg_client.PostgresClient() as cur:
         query = cur.mogrify(
-            """\
-            SELECT
-                *
-            FROM public.jobs
-            WHERE job_id = %(job_id)s;""",
+            """SELECT *
+               FROM public.jobs
+               WHERE job_id = %(job_id)s;""",
             {"job_id": job_id}
         )
         cur.execute(query=query)
@@ -37,11 +35,9 @@ def get(job_id):
 def get_all(project_id):
     with pg_client.PostgresClient() as cur:
         query = cur.mogrify(
-            """\
-            SELECT
-                *
-            FROM public.jobs
-            WHERE project_id = %(project_id)s;""",
+            """SELECT *
+               FROM public.jobs
+               WHERE project_id = %(project_id)s;""",
             {"project_id": project_id}
         )
         cur.execute(query=query)
@@ -51,23 +47,19 @@ def get_all(project_id):
     return helper.list_to_camel_case(data)


-def create(project_id, data):
+def create(project_id, user_id):
     with pg_client.PostgresClient() as cur:
-        job = {
-            "status": "scheduled",
-            "project_id": project_id,
-            **data
-        }
-        query = cur.mogrify("""\
-            INSERT INTO public.jobs(
-                project_id, description, status, action,
-                reference_id, start_at
-            )
-            VALUES (
-                %(project_id)s, %(description)s, %(status)s, %(action)s,
-                %(reference_id)s, %(start_at)s
-            ) RETURNING *;""", job)
+        job = {"status": "scheduled",
+               "project_id": project_id,
+               "action": Actions.DELETE_USER_DATA,
+               "reference_id": user_id,
+               "description": f"Delete user sessions of userId = {user_id}",
+               "start_at": TimeUTC.to_human_readable(TimeUTC.midnight(1))}
+        query = cur.mogrify(
+            """INSERT INTO public.jobs(project_id, description, status, action, reference_id, start_at)
+               VALUES (%(project_id)s, %(description)s, %(status)s, %(action)s, %(reference_id)s, %(start_at)s)
+               RETURNING *;""", job)
         cur.execute(query=query)
@@ -90,14 +82,13 @@ def update(job_id, job):
             **job
         }
-        query = cur.mogrify("""\
-            UPDATE public.jobs
-            SET
-                updated_at = timezone('utc'::text, now()),
-                status = %(status)s,
-                errors = %(errors)s
-            WHERE
-                job_id = %(job_id)s RETURNING *;""", job_data)
+        query = cur.mogrify(
+            """UPDATE public.jobs
+               SET updated_at = timezone('utc'::text, now()),
+                   status = %(status)s,
+                   errors = %(errors)s
+               WHERE job_id = %(job_id)s
+               RETURNING *;""", job_data)
         cur.execute(query=query)
@@ -113,61 +104,64 @@ def format_datetime(r):
     r["start_at"] = TimeUTC.datetime_to_timestamp(r["start_at"])


+def __get_session_ids_by_user_ids(project_id, user_ids):
+    with pg_client.PostgresClient() as cur:
+        query = cur.mogrify(
+            """SELECT session_id
+               FROM public.sessions
+               WHERE project_id = %(project_id)s
+                 AND user_id IN %(userId)s
+               LIMIT 1000;""",
+            {"project_id": project_id, "userId": tuple(user_ids)})
+        cur.execute(query=query)
+        ids = cur.fetchall()
+    return [s["session_id"] for s in ids]
+
+
+def __delete_sessions_by_session_ids(session_ids):
+    with pg_client.PostgresClient(unlimited_query=True) as cur:
+        query = cur.mogrify(
+            """DELETE FROM public.sessions
+               WHERE session_id IN %(session_ids)s""",
+            {"session_ids": tuple(session_ids)})
+        cur.execute(query=query)
+
+
 def get_scheduled_jobs():
     with pg_client.PostgresClient() as cur:
         query = cur.mogrify(
-            """\
-            SELECT * FROM public.jobs
-            WHERE status = %(status)s AND start_at <= (now() at time zone 'utc');""",
-            {"status": JobStatus.SCHEDULED}
-        )
+            """SELECT *
+               FROM public.jobs
+               WHERE status = %(status)s
+                 AND start_at <= (now() at time zone 'utc');""",
+            {"status": JobStatus.SCHEDULED})
         cur.execute(query=query)
         data = cur.fetchall()
+        for record in data:
+            format_datetime(record)
     return helper.list_to_camel_case(data)


 def execute_jobs():
     jobs = get_scheduled_jobs()
+    if len(jobs) == 0:
+        # No jobs to execute
+        return
     for job in jobs:
-        print(f"job can be executed {job['id']}")
+        print(f"Executing jobId:{job['jobId']}")
         try:
             if job["action"] == Actions.DELETE_USER_DATA:
-                session_ids = sessions.get_session_ids_by_user_ids(project_id=job["projectId"],
-                                                                   user_ids=job["referenceId"])
-                sessions.delete_sessions_by_session_ids(session_ids)
-                sessions_mobs.delete_mobs(session_ids=session_ids, project_id=job["projectId"])
+                session_ids = __get_session_ids_by_user_ids(project_id=job["projectId"],
+                                                            user_ids=[job["referenceId"]])
+                if len(session_ids) > 0:
+                    print(f"Deleting {len(session_ids)} sessions")
+                    __delete_sessions_by_session_ids(session_ids)
+                    sessions_mobs.delete_mobs(session_ids=session_ids, project_id=job["projectId"])
+                    sessions_devtool.delete_mobs(session_ids=session_ids, project_id=job["projectId"])
             else:
-                raise Exception(f"The action {job['action']} not supported.")
+                raise Exception(f"The action '{job['action']}' not supported.")
             job["status"] = JobStatus.COMPLETED
-            print(f"job completed {job['id']}")
+            print(f"Job completed {job['jobId']}")
         except Exception as e:
             job["status"] = JobStatus.FAILED
-            job["error"] = str(e)
+            job["errors"] = str(e)
-            print(f"job failed {job['id']}")
+            print(f"Job failed {job['jobId']}")
-        update(job["job_id"], job)
+        update(job["jobId"], job)
-
-
-def group_user_ids_by_project_id(jobs, now):
-    project_id_user_ids = {}
-    for job in jobs:
-        if job["startAt"] > now:
-            continue
-        project_id = job["projectId"]
-        if project_id not in project_id_user_ids:
-            project_id_user_ids[project_id] = []
-        project_id_user_ids[project_id].append(job)
-    return project_id_user_ids
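The jobs.py refactor above moves session lookup and deletion into private helpers and keys the bookkeeping on the camel-cased `jobId`. A runnable sketch of the resulting control flow, with in-memory stand-ins for the database and storage layers (the names below are illustrative, not the real chalicelib APIs):

```python
DELETE_USER_DATA = "delete_user_data"
COMPLETED, FAILED = "completed", "failed"

# In-memory stand-in for the public.sessions table.
SESSIONS = [{"session_id": 101, "user_id": "u1"},
            {"session_id": 102, "user_id": "u2"}]

def get_session_ids_by_user_ids(project_id, user_ids):
    # Mirrors __get_session_ids_by_user_ids (the real query caps at 1000 rows).
    return [s["session_id"] for s in SESSIONS if s["user_id"] in user_ids][:1000]

def execute_jobs(jobs):
    deleted = []
    for job in jobs:
        try:
            if job["action"] == DELETE_USER_DATA:
                session_ids = get_session_ids_by_user_ids(job["projectId"], [job["referenceId"]])
                if len(session_ids) > 0:
                    # The real code deletes the DB rows, then schedules the
                    # mob and devtools files for deletion in object storage.
                    deleted.extend(session_ids)
            else:
                raise Exception(f"The action '{job['action']}' not supported.")
            job["status"] = COMPLETED
        except Exception as e:
            job["status"] = FAILED
            job["errors"] = str(e)
    return jobs, deleted
```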

View file

@@ -1065,47 +1065,6 @@ def get_session_user(project_id, user_id):
     return helper.dict_to_camel_case(data)


-def get_session_ids_by_user_ids(project_id, user_ids):
-    with pg_client.PostgresClient() as cur:
-        query = cur.mogrify(
-            """\
-            SELECT session_id FROM public.sessions
-            WHERE
-                project_id = %(project_id)s AND user_id IN %(userId)s;""",
-            {"project_id": project_id, "userId": tuple(user_ids)}
-        )
-        ids = cur.execute(query=query)
-    return ids
-
-
-def delete_sessions_by_session_ids(session_ids):
-    with pg_client.PostgresClient(unlimited_query=True) as cur:
-        query = cur.mogrify(
-            """\
-            DELETE FROM public.sessions
-            WHERE
-                session_id IN %(session_ids)s;""",
-            {"session_ids": tuple(session_ids)}
-        )
-        cur.execute(query=query)
-    return True
-
-
-def delete_sessions_by_user_ids(project_id, user_ids):
-    with pg_client.PostgresClient(unlimited_query=True) as cur:
-        query = cur.mogrify(
-            """\
-            DELETE FROM public.sessions
-            WHERE
-                project_id = %(project_id)s AND user_id IN %(userId)s;""",
-            {"project_id": project_id, "userId": tuple(user_ids)}
-        )
-        cur.execute(query=query)
-    return True
-
-
 def count_all():
     with pg_client.PostgresClient(unlimited_query=True) as cur:
         cur.execute(query="SELECT COUNT(session_id) AS count FROM public.sessions")

View file

@@ -24,3 +24,9 @@ def get_urls(session_id, project_id, check_existence: bool = True):
             ExpiresIn=config("PRESIGNED_URL_EXPIRATION", cast=int, default=900)
         ))
     return results
+
+
+def delete_mobs(project_id, session_ids):
+    for session_id in session_ids:
+        for k in __get_devtools_keys(project_id=project_id, session_id=session_id):
+            s3.schedule_for_deletion(config("sessions_bucket"), k)

View file

@@ -57,5 +57,6 @@ def get_ios(session_id):

 def delete_mobs(project_id, session_ids):
     for session_id in session_ids:
-        for k in __get_mob_keys(project_id=project_id, session_id=session_id):
+        for k in __get_mob_keys(project_id=project_id, session_id=session_id) \
+                + __get_mob_keys_deprecated(session_id=session_id):
             s3.schedule_for_deletion(config("sessions_bucket"), k)

View file

@@ -69,9 +69,11 @@ def get_by_id2_pg(project_id, session_id, context: schemas.CurrentContext, full_
                                       if e['source'] == "js_exception"][:500]
             data['userEvents'] = events.get_customs_by_session_id(project_id=project_id,
                                                                   session_id=session_id)
-            data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id)
-            data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id)
-            data['devtoolsURL'] = sessions_devtool.get_urls(session_id=session_id, project_id=project_id)
+            data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id,
+                                                    check_existence=False)
+            data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id, check_existence=False)
+            data['devtoolsURL'] = sessions_devtool.get_urls(session_id=session_id, project_id=project_id,
+                                                            check_existence=False)
             data['resources'] = resources.get_by_session_id(session_id=session_id, project_id=project_id,
                                                             start_ts=data["startTs"], duration=data["duration"])
@@ -126,9 +128,11 @@ def get_replay(project_id, session_id, context: schemas.CurrentContext, full_dat
     if data["platform"] == 'ios':
         data['mobsUrl'] = sessions_mobs.get_ios(session_id=session_id)
     else:
-        data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id)
-        data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id)
-        data['devtoolsURL'] = sessions_devtool.get_urls(session_id=session_id, project_id=project_id)
+        data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id,
+                                                check_existence=False)
+        data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id, check_existence=False)
+        data['devtoolsURL'] = sessions_devtool.get_urls(session_id=session_id, project_id=project_id,
+                                                        check_existence=False)
     data['metadata'] = __group_metadata(project_metadata=data.pop("projectMetadata"), session=data)
     data['live'] = live and assist.is_live(project_id=project_id, session_id=session_id,

View file

@@ -110,11 +110,14 @@ def rename(source_bucket, source_key, target_bucket, target_key):

 def schedule_for_deletion(bucket, key):
+    if not exists(bucket, key):
+        return False
     s3 = __get_s3_resource()
     s3_object = s3.Object(bucket, key)
     s3_object.copy_from(CopySource={'Bucket': bucket, 'Key': key},
-                        Expires=datetime.now() + timedelta(days=7),
+                        Expires=datetime.utcnow() + timedelta(days=config("SCH_DELETE_DAYS", cast=int, default=30)),
                         MetadataDirective='REPLACE')
+    return True


 def generate_file_key(project_id, key):
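The change above gates deletion scheduling on object existence and makes the retention window configurable through `SCH_DELETE_DAYS` (default 30, previously a hard-coded 7 days). A minimal sketch of the expiry computation, with `os.environ` standing in for decouple's `config()`:

```python
import os
from datetime import datetime, timedelta

def deletion_expiry(now=None):
    # Mirrors the Expires value passed to s3_object.copy_from(...):
    # current UTC time plus the configured retention window.
    days = int(os.environ.get("SCH_DELETE_DAYS", "30"))
    return (now or datetime.utcnow()) + timedelta(days=days)
```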

View file

@@ -53,4 +53,5 @@ PRESIGNED_URL_EXPIRATION=3600
 ASSIST_JWT_EXPIRATION=144000
 ASSIST_JWT_SECRET=
 PYTHONUNBUFFERED=1
 REDIS_STRING=redis://redis-master.db.svc.cluster.local:6379
+SCH_DELETE_DAYS=30

View file

@@ -1,5 +1,4 @@
 from apscheduler.triggers.cron import CronTrigger
-from apscheduler.triggers.interval import IntervalTrigger

 from chalicelib.core import telemetry
 from chalicelib.core import weekly_report, jobs
@@ -20,7 +19,7 @@ async def telemetry_cron() -> None:
 cron_jobs = [
     {"func": telemetry_cron, "trigger": CronTrigger(day_of_week="*"),
      "misfire_grace_time": 60 * 60, "max_instances": 1},
-    {"func": run_scheduled_jobs, "trigger": IntervalTrigger(minutes=1),
+    {"func": run_scheduled_jobs, "trigger": CronTrigger(day_of_week="*", hour=0, minute=15),
      "misfire_grace_time": 20, "max_instances": 1},
     {"func": weekly_report2, "trigger": CronTrigger(day_of_week="mon", hour=5),
      "misfire_grace_time": 60 * 60, "max_instances": 1}

View file

@@ -2,7 +2,6 @@ from fastapi import Depends, Body

 import schemas
 from chalicelib.core import sessions, events, jobs, projects
-from chalicelib.utils.TimeUTC import TimeUTC
 from or_dependencies import OR_context
 from routers.base import get_routers
@@ -15,7 +14,7 @@ async def get_user_sessions(projectKey: str, userId: str, start_date: int = None
     if projectId is None:
         return {"errors": ["invalid projectKey"]}
     return {
-        'data': sessions.get_user_sessions(
+        "data": sessions.get_user_sessions(
             project_id=projectId,
             user_id=userId,
             start_date=start_date,
@@ -30,7 +29,7 @@ async def get_session_events(projectKey: str, sessionId: int):
     if projectId is None:
         return {"errors": ["invalid projectKey"]}
     return {
-        'data': events.get_by_session_id(
+        "data": events.get_by_session_id(
             project_id=projectId,
             session_id=sessionId
         )
@@ -43,7 +42,7 @@ async def get_user_details(projectKey: str, userId: str):
     if projectId is None:
         return {"errors": ["invalid projectKey"]}
     return {
-        'data': sessions.get_session_user(
+        "data": sessions.get_session_user(
             project_id=projectId,
             user_id=userId
         )
@@ -55,14 +54,8 @@ async def schedule_to_delete_user_data(projectKey: str, userId: str):
     projectId = projects.get_internal_project_id(projectKey)
     if projectId is None:
         return {"errors": ["invalid projectKey"]}
-    data = {"action": "delete_user_data",
-            "reference_id": userId,
-            "description": f"Delete user sessions of userId = {userId}",
-            "start_at": TimeUTC.to_human_readable(TimeUTC.midnight(1))}
-    record = jobs.create(project_id=projectId, data=data)
-    return {
-        'data': record
-    }
+    record = jobs.create(project_id=projectId, user_id=userId)
+    return {"data": record}


 @app_apikey.get('/v1/{projectKey}/jobs', tags=["api"])
@@ -70,16 +63,12 @@ async def get_jobs(projectKey: str):
     projectId = projects.get_internal_project_id(projectKey)
     if projectId is None:
         return {"errors": ["invalid projectKey"]}
-    return {
-        'data': jobs.get_all(project_id=projectId)
-    }
+    return {"data": jobs.get_all(project_id=projectId)}


 @app_apikey.get('/v1/{projectKey}/jobs/{jobId}', tags=["api"])
 async def get_job(projectKey: str, jobId: int):
-    return {
-        'data': jobs.get(job_id=jobId)
-    }
+    return {"data": jobs.get(job_id=jobId)}


 @app_apikey.delete('/v1/{projectKey}/jobs/{jobId}', tags=["api"])
@@ -93,9 +82,7 @@ async def cancel_job(projectKey: str, jobId: int):
         return {"errors": ["The request job has already been canceled/completed."]}
     job["status"] = "cancelled"
-    return {
-        'data': jobs.update(job_id=jobId, job=job)
-    }
+    return {"data": jobs.update(job_id=jobId, job=job)}


 @app_apikey.get('/v1/projects', tags=["api"])
@@ -104,15 +91,13 @@ async def get_projects(context: schemas.CurrentContext = Depends(OR_context)):
     for record in records:
         del record['projectId']
-    return {
-        'data': records
-    }
+    return {"data": records}


 @app_apikey.get('/v1/projects/{projectKey}', tags=["api"])
 async def get_project(projectKey: str, context: schemas.CurrentContext = Depends(OR_context)):
     return {
-        'data': projects.get_project_by_key(tenant_id=context.tenant_id, project_key=projectKey)
+        "data": projects.get_project_by_key(tenant_id=context.tenant_id, project_key=projectKey)
     }
@@ -125,5 +110,5 @@ async def create_project(data: schemas.CreateProjectSchema = Body(...),
         data=data,
         skip_authorization=True
     )
-    del record['data']['projectId']
+    del record["data"]['projectId']
     return record

View file

@@ -84,7 +84,8 @@ ENV TZ=UTC \
     CH_PASSWORD="" \
     CH_DATABASE="default" \
     # Max file size to process, default to 100MB
-    MAX_FILE_SIZE=100000000
+    MAX_FILE_SIZE=100000000 \
+    USE_ENCRYPTION=false

 RUN if [ "$SERVICE_NAME" = "http" ]; then \

View file

@@ -33,10 +33,28 @@ func (t FileType) String() string {
 }

 type Task struct {
-	id   string
-	doms *bytes.Buffer
-	dome *bytes.Buffer
-	dev  *bytes.Buffer
+	id     string
+	key    string
+	domRaw []byte
+	devRaw []byte
+	doms   *bytes.Buffer
+	dome   *bytes.Buffer
+	dev    *bytes.Buffer
+}
+
+func (t *Task) SetMob(mob []byte, tp FileType) {
+	if tp == DOM {
+		t.domRaw = mob
+	} else {
+		t.devRaw = mob
+	}
+}
+
+func (t *Task) Mob(tp FileType) []byte {
+	if tp == DOM {
+		return t.domRaw
+	}
+	return t.devRaw
 }

 type Storage struct {
@@ -75,7 +93,8 @@ func (s *Storage) Upload(msg *messages.SessionEnd) (err error) {
 	filePath := s.cfg.FSDir + "/" + sessionID

 	// Prepare sessions
 	newTask := &Task{
-		id: sessionID,
+		id:  sessionID,
+		key: msg.EncryptionKey,
 	}
 	wg := &sync.WaitGroup{}
 	wg.Add(2)
@@ -108,6 +127,9 @@ func (s *Storage) Upload(msg *messages.SessionEnd) (err error) {
 }

 func (s *Storage) openSession(filePath string, tp FileType) ([]byte, error) {
+	if tp == DEV {
+		filePath += "devtools"
+	}
 	// Check file size before download into memory
 	info, err := os.Stat(filePath)
 	if err == nil && info.Size() > s.cfg.MaxFileSize {
@@ -142,51 +164,87 @@ func (s *Storage) sortSessionMessages(raw []byte) ([]byte, error) {
 }

 func (s *Storage) prepareSession(path string, tp FileType, task *Task) error {
-	// Open mob file
-	if tp == DEV {
-		path += "devtools"
-	}
+	// Open session file
 	startRead := time.Now()
 	mob, err := s.openSession(path, tp)
 	if err != nil {
 		return err
 	}
-	metrics.RecordSessionSize(float64(len(mob)), tp.String())
 	metrics.RecordSessionReadDuration(float64(time.Now().Sub(startRead).Milliseconds()), tp.String())
+	metrics.RecordSessionSize(float64(len(mob)), tp.String())

-	// Encode and compress session
-	if tp == DEV {
-		start := time.Now()
-		task.dev = s.compressSession(mob)
-		metrics.RecordSessionCompressDuration(float64(time.Now().Sub(start).Milliseconds()), tp.String())
-	} else {
-		if len(mob) <= s.cfg.FileSplitSize {
-			start := time.Now()
-			task.doms = s.compressSession(mob)
-			metrics.RecordSessionCompressDuration(float64(time.Now().Sub(start).Milliseconds()), tp.String())
-			return nil
-		}
-		wg := &sync.WaitGroup{}
-		wg.Add(2)
-		var firstPart, secondPart int64
-		go func() {
-			start := time.Now()
-			task.doms = s.compressSession(mob[:s.cfg.FileSplitSize])
-			firstPart = time.Now().Sub(start).Milliseconds()
-			wg.Done()
-		}()
-		go func() {
-			start := time.Now()
-			task.dome = s.compressSession(mob[s.cfg.FileSplitSize:])
-			secondPart = time.Now().Sub(start).Milliseconds()
-			wg.Done()
-		}()
-		wg.Wait()
-		metrics.RecordSessionCompressDuration(float64(firstPart+secondPart), tp.String())
-	}
+	// Put opened session file into task struct
+	task.SetMob(mob, tp)
+
+	// Encrypt and compress session
+	s.packSession(task, tp)
 	return nil
 }
+
+func (s *Storage) packSession(task *Task, tp FileType) {
+	// Prepare mob file
+	mob := task.Mob(tp)
+
+	if tp == DEV || len(mob) <= s.cfg.FileSplitSize {
+		// Encryption
+		start := time.Now()
+		data := s.encryptSession(mob, task.key)
+		metrics.RecordSessionEncryptionDuration(float64(time.Now().Sub(start).Milliseconds()), tp.String())
+		// Compression
+		start = time.Now()
+		result := s.compressSession(data)
+		metrics.RecordSessionCompressDuration(float64(time.Now().Sub(start).Milliseconds()), tp.String())
+		if tp == DOM {
+			task.doms = result
+		} else {
+			task.dev = result
+		}
+		return
+	}
+
+	// Prepare two workers
+	wg := &sync.WaitGroup{}
+	wg.Add(2)
+	var firstPart, secondPart, firstEncrypt, secondEncrypt int64
+
+	// DomStart part
+	go func() {
+		// Encryption
+		start := time.Now()
+		data := s.encryptSession(mob[:s.cfg.FileSplitSize], task.key)
+		firstEncrypt = time.Since(start).Milliseconds()
+		// Compression
+		start = time.Now()
+		task.doms = s.compressSession(data)
+		firstPart = time.Since(start).Milliseconds()
+		// Finish task
+		wg.Done()
+	}()
+
+	// DomEnd part
+	go func() {
+		// Encryption
+		start := time.Now()
+		data := s.encryptSession(mob[s.cfg.FileSplitSize:], task.key)
+		secondEncrypt = time.Since(start).Milliseconds()
+		// Compression
+		start = time.Now()
+		task.dome = s.compressSession(data)
+		secondPart = time.Since(start).Milliseconds()
+		// Finish task
+		wg.Done()
+	}()
+	wg.Wait()
+
+	// Record metrics
+	metrics.RecordSessionEncryptionDuration(float64(firstEncrypt+secondEncrypt), tp.String())
metrics.RecordSessionCompressDuration(float64(firstPart+secondPart), tp.String())
}
func (s *Storage) encryptSession(data []byte, encryptionKey string) []byte { func (s *Storage) encryptSession(data []byte, encryptionKey string) []byte {
var encryptedData []byte var encryptedData []byte
var err error var err error

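The new `packSession` above splits a large DOM file at `FileSplitSize` and runs encrypt-then-compress on each half concurrently. A minimal Python sketch of the same split-and-pack idea, with a XOR stand-in for the real encryption and `zlib` for compression (both are illustrative, not the service's actual primitives):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

FILE_SPLIT_SIZE = 8  # stand-in for s.cfg.FileSplitSize


def encrypt(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher standing in for the real session encryption
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def pack_session(mob: bytes, key: bytes) -> list[bytes]:
    # Small files: encrypt then compress in one piece
    if len(mob) <= FILE_SPLIT_SIZE:
        return [zlib.compress(encrypt(mob, key))]
    # Large files: pack the two halves concurrently, like the two goroutines
    parts = [mob[:FILE_SPLIT_SIZE], mob[FILE_SPLIT_SIZE:]]
    with ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(lambda p: zlib.compress(encrypt(p, key)), parts))
```

Because XOR is its own inverse here, decompressing and re-applying `encrypt` per part restores the original bytes, which makes the split boundary easy to verify.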

@@ -85,18 +85,18 @@ func RecordSessionSortDuration(durMillis float64, fileType string) {
 	storageSessionSortDuration.WithLabelValues(fileType).Observe(durMillis / 1000.0)
 }

-var storageSessionEncodeDuration = prometheus.NewHistogramVec(
+var storageSessionEncryptionDuration = prometheus.NewHistogramVec(
 	prometheus.HistogramOpts{
 		Namespace: "storage",
-		Name:      "encode_duration_seconds",
+		Name:      "encryption_duration_seconds",
 		Help:      "A histogram displaying the duration of encoding for each session in seconds.",
 		Buckets:   common.DefaultDurationBuckets,
 	},
 	[]string{"file_type"},
 )

-func RecordSessionEncodeDuration(durMillis float64, fileType string) {
-	storageSessionEncodeDuration.WithLabelValues(fileType).Observe(durMillis / 1000.0)
+func RecordSessionEncryptionDuration(durMillis float64, fileType string) {
+	storageSessionEncryptionDuration.WithLabelValues(fileType).Observe(durMillis / 1000.0)
 }

 var storageSessionCompressDuration = prometheus.NewHistogramVec(
@@ -133,7 +133,7 @@ func List() []prometheus.Collector {
 	storageTotalSessions,
 	storageSessionReadDuration,
 	storageSessionSortDuration,
-	storageSessionEncodeDuration,
+	storageSessionEncryptionDuration,
 	storageSessionCompressDuration,
 	storageSessionUploadDuration,
 }

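The renamed metric keeps the same recording convention: durations arrive in milliseconds and are observed in seconds, labeled by file type. A stdlib-only sketch of that labeled-histogram bookkeeping (a minimal stand-in, not the Prometheus client's cumulative-bucket implementation):

```python
from collections import defaultdict


class LabeledHistogram:
    """Minimal stand-in for a prometheus HistogramVec: per-label bucket counts."""

    def __init__(self, name: str, buckets: tuple[float, ...]):
        self.name = name
        self.buckets = buckets
        # One extra slot plays the role of the +Inf bucket
        self.counts = defaultdict(lambda: [0] * (len(buckets) + 1))

    def observe(self, label: str, seconds: float) -> None:
        for i, upper in enumerate(self.buckets):
            if seconds <= upper:
                self.counts[label][i] += 1
                return
        self.counts[label][-1] += 1  # +Inf bucket


encryption_duration = LabeledHistogram("storage_encryption_duration_seconds", (0.1, 1.0, 10.0))


def record_session_encryption_duration(dur_millis: float, file_type: str) -> None:
    # Mirrors RecordSessionEncryptionDuration: milliseconds in, seconds observed
    encryption_duration.observe(file_type, dur_millis / 1000.0)
```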

@@ -41,9 +41,13 @@ def save_record(project_id, data: schemas_ee.AssistRecordSavePayloadSchema, cont
 def search_records(project_id, data: schemas_ee.AssistRecordSearchPayloadSchema, context: schemas_ee.CurrentContext):
     conditions = ["projects.tenant_id=%(tenant_id)s",
                   "projects.deleted_at ISNULL",
-                  "assist_records.created_at>=%(startDate)s",
-                  "assist_records.created_at<=%(endDate)s",
+                  "projects.project_id=%(project_id)s",
                   "assist_records.deleted_at ISNULL"]
+    if data.startDate:
+        conditions.append("assist_records.created_at>=%(startDate)s")
+    if data.endDate:
+        conditions.append("assist_records.created_at<=%(endDate)s")
     params = {"tenant_id": context.tenant_id, "project_id": project_id,
               "startDate": data.startDate, "endDate": data.endDate,
               "p_start": (data.page - 1) * data.limit, "p_limit": data.limit,

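The change above makes the date bounds optional: fixed conditions are always present, and the `created_at` bounds are appended only when the payload supplies them, so an unbounded search returns all records. A self-contained sketch of this conditional-WHERE pattern:

```python
def build_where(project_id, tenant_id, start_date=None, end_date=None):
    """Fixed conditions plus optional date bounds, as in search_records above."""
    conditions = ["projects.tenant_id=%(tenant_id)s",
                  "projects.deleted_at ISNULL",
                  "projects.project_id=%(project_id)s",
                  "assist_records.deleted_at ISNULL"]
    # Only constrain by date when a bound was actually provided
    if start_date:
        conditions.append("assist_records.created_at>=%(startDate)s")
    if end_date:
        conditions.append("assist_records.created_at<=%(endDate)s")
    params = {"tenant_id": tenant_id, "project_id": project_id,
              "startDate": start_date, "endDate": end_date}
    return " AND ".join(conditions), params
```

Keeping values in a `params` dict (rather than interpolating them) preserves the driver-side parameterization the original relies on.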

@@ -15,9 +15,9 @@ def app_connection_string(name, port, path):
 HEALTH_ENDPOINTS = {
-    "alerts": app_connection_string("alerts-openreplay", 8888, "metrics"),
+    "alerts": app_connection_string("alerts-openreplay", 8888, "health"),
     "assets": app_connection_string("assets-openreplay", 8888, "metrics"),
-    "assist": app_connection_string("assist-openreplay", 8888, "metrics"),
+    "assist": app_connection_string("assist-openreplay", 8888, "health"),
     "chalice": app_connection_string("chalice-openreplay", 8888, "metrics"),
     "db": app_connection_string("db-openreplay", 8888, "metrics"),
     "ender": app_connection_string("ender-openreplay", 8888, "metrics"),


@@ -31,3 +31,9 @@ def get_urls(session_id, project_id, context: schemas_ee.CurrentContext, check_e
             ExpiresIn=config("PRESIGNED_URL_EXPIRATION", cast=int, default=900)
         ))
     return results
+
+
+def delete_mobs(project_id, session_ids):
+    for session_id in session_ids:
+        for k in __get_devtools_keys(project_id=project_id, session_id=session_id):
+            s3.schedule_for_deletion(config("sessions_bucket"), k)


@@ -1396,47 +1396,6 @@ def get_session_user(project_id, user_id):
     return helper.dict_to_camel_case(data)

-
-def get_session_ids_by_user_ids(project_id, user_ids):
-    with pg_client.PostgresClient() as cur:
-        query = cur.mogrify(
-            """\
-            SELECT session_id FROM public.sessions
-            WHERE
-                project_id = %(project_id)s AND user_id IN %(userId)s;""",
-            {"project_id": project_id, "userId": tuple(user_ids)}
-        )
-        ids = cur.execute(query=query)
-    return ids
-
-
-def delete_sessions_by_session_ids(session_ids):
-    with pg_client.PostgresClient(unlimited_query=True) as cur:
-        query = cur.mogrify(
-            """\
-            DELETE FROM public.sessions
-            WHERE
-                session_id IN %(session_ids)s;""",
-            {"session_ids": tuple(session_ids)}
-        )
-        cur.execute(query=query)
-    return True
-
-
-def delete_sessions_by_user_ids(project_id, user_ids):
-    with pg_client.PostgresClient(unlimited_query=True) as cur:
-        query = cur.mogrify(
-            """\
-            DELETE FROM public.sessions
-            WHERE
-                project_id = %(project_id)s AND user_id IN %(userId)s;""",
-            {"project_id": project_id, "userId": tuple(user_ids)}
-        )
-        cur.execute(query=query)
-    return True
-
-
 def count_all():
     with ch_client.ClickHouseClient() as cur:
         row = cur.execute(query=f"SELECT COUNT(session_id) AS count FROM {exp_ch_helper.get_main_sessions_table()}")


@@ -72,10 +72,11 @@ def get_by_id2_pg(project_id, session_id, context: schemas_ee.CurrentContext, fu
                               if e['source'] == "js_exception"][:500]
             data['userEvents'] = events.get_customs_by_session_id(project_id=project_id,
                                                                   session_id=session_id)
-            data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id)
-            data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id)
+            data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id,
+                                                    check_existence=False)
+            data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id, check_existence=False)
             data['devtoolsURL'] = sessions_devtool.get_urls(session_id=session_id, project_id=project_id,
-                                                            context=context)
+                                                            context=context, check_existence=False)
             data['resources'] = resources.get_by_session_id(session_id=session_id, project_id=project_id,
                                                             start_ts=data["startTs"], duration=data["duration"])
@@ -132,10 +133,11 @@ def get_replay(project_id, session_id, context: schemas.CurrentContext, full_dat
         if data["platform"] == 'ios':
             data['mobsUrl'] = sessions_mobs.get_ios(session_id=session_id)
         else:
-            data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id)
-            data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id)
+            data['domURL'] = sessions_mobs.get_urls(session_id=session_id, project_id=project_id,
+                                                    check_existence=False)
+            data['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session_id, check_existence=False)
             data['devtoolsURL'] = sessions_devtool.get_urls(session_id=session_id, project_id=project_id,
-                                                            context=context)
+                                                            context=context, check_existence=False)
     data['metadata'] = __group_metadata(project_metadata=data.pop("projectMetadata"), session=data)
     data['live'] = live and assist.is_live(project_id=project_id, session_id=session_id,


@@ -73,4 +73,5 @@ PRESIGNED_URL_EXPIRATION=3600
 ASSIST_JWT_EXPIRATION=144000
 ASSIST_JWT_SECRET=
 KAFKA_SERVERS=kafka.db.svc.cluster.local:9092
 KAFKA_USE_SSL=false
+SCH_DELETE_DAYS=30

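The new `SCH_DELETE_DAYS=30` variable is the kind of value these services read through `python-decouple`'s `config(key, default=…, cast=…)` helper. A stdlib-only sketch of that pattern (a stand-in for `decouple.config`, not its implementation):

```python
import os


def config(key: str, default=None, cast=str):
    """Minimal stand-in for decouple.config: env var lookup with default and cast."""
    raw = os.environ.get(key)
    if raw is None:
        return default
    return cast(raw)


# The new variable would then be read as an integer:
# delete_days = config("SCH_DELETE_DAYS", default=30, cast=int)
```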

@@ -7,7 +7,7 @@ elasticsearch==8.6.2
 jira==3.5.0
+python-decouple==3.8
 apscheduler==3.10.1
 clickhouse-driver==0.2.5


@@ -1,5 +1,4 @@
 from apscheduler.triggers.cron import CronTrigger
-from apscheduler.triggers.interval import IntervalTrigger
 from decouple import config

 from chalicelib.core import jobs
@@ -29,13 +28,14 @@ cron_jobs = [
     {"func": unlock_cron, "trigger": CronTrigger(day="*")},
 ]

-SINGLE_CRONS = [{"func": telemetry_cron, "trigger": CronTrigger(day_of_week="*"),
-                 "misfire_grace_time": 60 * 60, "max_instances": 1},
-                {"func": run_scheduled_jobs, "trigger": IntervalTrigger(minutes=60),
-                 "misfire_grace_time": 20, "max_instances": 1},
-                {"func": weekly_report, "trigger": CronTrigger(day_of_week="mon", hour=5),
-                 "misfire_grace_time": 60 * 60, "max_instances": 1}
-                ]
+SINGLE_CRONS = [
+    {"func": telemetry_cron, "trigger": CronTrigger(day_of_week="*"),
+     "misfire_grace_time": 60 * 60, "max_instances": 1},
+    {"func": run_scheduled_jobs, "trigger": CronTrigger(day_of_week="*", hour=0, minute=15),
+     "misfire_grace_time": 20, "max_instances": 1},
+    {"func": weekly_report, "trigger": CronTrigger(day_of_week="mon", hour=5),
+     "misfire_grace_time": 60 * 60, "max_instances": 1}
+]

 if config("LOCAL_CRONS", default=False, cast=bool):
     cron_jobs += SINGLE_CRONS

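The scheduling change swaps `IntervalTrigger(minutes=60)` (fire every hour, anchored to process start) for `CronTrigger(day_of_week="*", hour=0, minute=15)` (fire once a day at a fixed wall-clock time). A small sketch of what the daily trigger computes, using only `datetime` (this is an illustration of the semantics, not APScheduler's own next-fire-time code):

```python
from datetime import datetime, timedelta


def next_daily_run(now: datetime, hour: int = 0, minute: int = 15) -> datetime:
    """Next fire time for a daily cron like CronTrigger(day_of_week="*", hour=0, minute=15)."""
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    # If today's slot has already passed, fire tomorrow at the same wall-clock time
    if candidate <= now:
        candidate += timedelta(days=1)
    return candidate
```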

@@ -137,8 +137,8 @@ class AssistRecordSavePayloadSchema(AssistRecordPayloadSchema):
 class AssistRecordSearchPayloadSchema(schemas._PaginatedSchema):
     limit: int = Field(default=200, gt=0)
-    startDate: int = Field(default=TimeUTC.now(-7))
-    endDate: int = Field(default=TimeUTC.now(1))
+    startDate: Optional[int] = Field(default=None)
+    endDate: Optional[int] = Field(default=None)
     user_id: Optional[int] = Field(default=None)
     query: Optional[str] = Field(default=None)
     order: Literal["asc", "desc"] = Field(default="desc")

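One likely reason to prefer `default=None` over `default=TimeUTC.now(-7)`: a plain default expression is evaluated once, when the class is defined, so a long-running process would keep serving a stale "last 7 days" window. Switching to `Optional` with `None` lets the query layer decide at request time (as the `if data.startDate:` checks above do). The pitfall itself is plain Python:

```python
import time


def stale_default(window=time.time()):
    # The default is evaluated ONCE, at definition time - every call sees the same value
    return window


def fresh_default(window=None):
    # None is a sentinel; the real value is computed on every call
    return time.time() if window is None else window
```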

@@ -22,5 +22,5 @@ MINIO_ACCESS_KEY = ''
 MINIO_SECRET_KEY = ''

 # APP and TRACKER VERSIONS
-VERSION = '1.11.0'
-TRACKER_VERSION = '6.0.0'
+VERSION = 1.11.13
+TRACKER_VERSION = '6.0.1'


@@ -13,9 +13,9 @@ export default () => (next) => (action) => {
   return call(client)
     .then(async (response) => {
-      if (response.status === 403) {
-        next({ type: FETCH_ACCOUNT.FAILURE });
-      }
+      // if (response.status === 403) {
+      //   next({ type: FETCH_ACCOUNT.FAILURE });
+      // }
       if (!response.ok) {
         const text = await response.text();
         return Promise.reject(text);
@@ -34,6 +34,10 @@ export default () => (next) => (action) => {
       }
     })
     .catch(async (e) => {
+      if (e.response?.status === 403) {
+        next({ type: FETCH_ACCOUNT.FAILURE });
+      }
       const data = await e.response?.json();
       logger.error('Error during API request. ', e);
       return next({ type: FAILURE, errors: data ? parseError(data.errors) : [] });


@@ -1,37 +1,29 @@
 import { useModal } from 'App/components/Modal';
-import React, { useState } from 'react';
+import React from 'react';
 import SessionList from '../SessionList';
-import stl from './assistTabs.module.css'
+import stl from './assistTabs.module.css';

 interface Props {
-  userId: any,
-  userNumericHash: any,
+  userId: any;
 }

 const AssistTabs = (props: Props) => {
-  const [showMenu, setShowMenu] = useState(false)
   const { showModal } = useModal();

   return (
     <div className="relative mr-4">
       <div className="flex items-center">
         {props.userId && (
-          <>
-            <div
-              className={stl.btnLink}
-              onClick={() => showModal(<SessionList userId={props.userId} />, { right: true, width: 700 })}
-            >
-              Active Sessions
-            </div>
-          </>
+          <div
+            className={stl.btnLink}
+            onClick={() =>
+              showModal(<SessionList userId={props.userId} />, { right: true, width: 700 })
+            }
+          >
+            Active Sessions
+          </div>
         )}
       </div>
-      {/* <SlideModal
-        title={ <div>{props.userId}'s <span className="color-gray-medium">Live Sessions</span> </div> }
-        isDisplayed={ showMenu }
-        content={ showMenu && <SessionList /> }
-        onClose={ () => setShowMenu(false) }
-      /> */}
     </div>
   );
 };


@@ -46,7 +46,7 @@ function SessionList(props: Props) {
     >
       <div className="p-4">
         {props.list.map((session: any) => (
-          <div className="mb-6">
+          <div className="mb-6" key={session.sessionId}>
             {session.pageTitle && session.pageTitle !== '' && (
               <div className="flex items-center mb-2">
                 <Label size="small" className="p-1">
@@ -55,7 +55,7 @@ function SessionList(props: Props) {
                 <span className="ml-2 font-medium">{session.pageTitle}</span>
               </div>
             )}
-            <SessionItem compact={true} onClick={() => hideModal()} key={session.sessionId} session={session} />
+            <SessionItem compact={true} onClick={hideModal} session={session} />
           </div>
         ))}
       </div>


@@ -44,7 +44,7 @@ export default observer(({ player }) => {
   <Performance
     performanceChartTime={ current ? current.tmie : 0 }
     performanceChartData={ player.lists[PERFORMANCE].list }
-    avaliability={ player.lists[PERFORMANCE].availability }
+    availability={ player.lists[PERFORMANCE].availability }
     hiddenScreenMarker={ false }
    player={ player }
  />


@@ -220,12 +220,12 @@ export default class Performance extends React.PureComponent {
   render() {
     const {
       performanceChartTime,
-      avaliability = {},
+      availability = {},
       hiddenScreenMarker = true,
     } = this.props;
-    const { fps, cpu, heap, nodes, memory, battery } = avaliability;
-    const avaliableCount = [ fps, cpu, heap, nodes, memory, battery ].reduce((c, av) => av ? c + 1 : c, 0);
-    const height = avaliableCount === 0 ? "0" : `${100 / avaliableCount}%`;
+    const { fps, cpu, heap, nodes, memory, battery } = availability;
+    const availableCount = [ fps, cpu, heap, nodes, memory, battery ].reduce((c, av) => av ? c + 1 : c, 0);
+    const height = availableCount === 0 ? "0" : `${100 / availableCount}%`;

     return (
       <>
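Beyond the `avaliability` → `availability` spelling fix, the logic here splits 100% of the panel height evenly across whichever performance channels have data. The same computation in a self-contained sketch:

```python
def chart_height(availability: dict) -> str:
    """Mirrors the availableCount/height logic: split 100% across available charts."""
    channels = ("fps", "cpu", "heap", "nodes", "memory", "battery")
    # Count only channels that actually reported data
    available = sum(1 for ch in channels if availability.get(ch))
    return "0" if available == 0 else f"{100 / available}%"
```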

@@ -71,12 +71,13 @@ function LivePlayer({
     return () => {
       if (!location.pathname.includes('multiview') || !location.pathname.includes(usedSession.sessionId)) {
+        console.debug('unmount', usedSession.sessionId)
         playerInst?.clean?.();
         // @ts-ignore default empty
         setContextValue(defaultContextValue)
       }
     }
-  }, [location.pathname]);
+  }, [location.pathname, usedSession.sessionId]);

   // LAYOUT (TODO: local layout state - useContext or something..)
   useEffect(() => {


@@ -46,7 +46,7 @@ function LivePlayerBlockHeader(props: any) {
     history.push(withSiteId(ASSIST_ROUTE, siteId));
   };

-  const { userId, userNumericHash, metadata, isCallActive, agentIds } = session;
+  const { userId, metadata, isCallActive, agentIds } = session;

   let _metaList = Object.keys(metadata)
     .filter((i) => metaList.includes(i))
     .map((key) => {
@@ -87,7 +87,7 @@ function LivePlayerBlockHeader(props: any) {
           </div>
         )}
         <UserCard className="" width={width} height={height} />
-        <AssistTabs userId={userId} userNumericHash={userNumericHash} />
+        <AssistTabs userId={userId} />
         <div className={cn('ml-auto flex items-center h-full', { hidden: closedLive })}>
           {_metaList.length > 0 && (

@@ -13,6 +13,7 @@ import PlayerContent from './Player/ReplayPlayer/PlayerContent';
 import { IPlayerContext, PlayerContext, defaultContextValue } from './playerContext';
 import { observer } from 'mobx-react-lite';
 import { Note } from "App/services/NotesService";
+import { useParams } from 'react-router-dom'

 const TABS = {
   EVENTS: 'User Steps',
@@ -35,6 +36,7 @@ function WebPlayer(props: any) {
   // @ts-ignore
   const [contextValue, setContextValue] = useState<IPlayerContext>(defaultContextValue);
   let playerInst: IPlayerContext['player'];
+  const params: { sessionId: string } = useParams()

   useEffect(() => {
     if (!session.sessionId || contextValue.player !== undefined) return;
@@ -91,13 +93,14 @@ function WebPlayer(props: any) {
   // LAYOUT (TODO: local layout state - useContext or something..)
   useEffect(
     () => () => {
+      console.debug('cleaning up player after', params.sessionId)
       toggleFullscreen(false);
       closeBottomBlock();
       playerInst?.clean();
       // @ts-ignore
       setContextValue(defaultContextValue);
     },
-    []
+    [params.sessionId]
   );

   const onNoteClose = () => {


@@ -187,7 +187,7 @@ function Performance({
     performanceChartData,
     connType,
     connBandwidth,
-    performanceAvaliability: avaliability,
+    performanceAvailability: availability,
   } = store.get();

   React.useState(() => {
@@ -212,9 +212,9 @@ function Performance({
     }
   };

-  const { fps, cpu, heap, nodes } = avaliability;
-  const avaliableCount = [fps, cpu, heap, nodes].reduce((c, av) => (av ? c + 1 : c), 0);
-  const height = avaliableCount === 0 ? '0' : `${100 / avaliableCount}%`;
+  const { fps, cpu, heap, nodes } = availability;
+  const availableCount = [fps, cpu, heap, nodes].reduce((c, av) => (av ? c + 1 : c), 0);
+  const height = availableCount === 0 ? '0' : `${100 / availableCount}%`;

   return (
     <BottomBlock>


@@ -55,6 +55,7 @@ interface Props {
   compact?: boolean;
   isDisabled?: boolean;
   isAdd?: boolean;
+  ignoreAssist?: boolean;
 }

 function SessionItem(props: RouteComponentProps & Props) {
@@ -70,6 +71,7 @@ function SessionItem(props: RouteComponentProps & Props) {
     lastPlayedSessionId,
     onClick = null,
     compact = false,
+    ignoreAssist = false,
   } = props;

   const {
@@ -99,9 +101,10 @@ function SessionItem(props: RouteComponentProps & Props) {
   const hasUserId = userId || userAnonymousId;
   const isSessions = isRoute(SESSIONS_ROUTE, location.pathname);
   const isAssist =
-    isRoute(ASSIST_ROUTE, location.pathname) ||
-    isRoute(ASSIST_LIVE_SESSION, location.pathname) ||
-    location.pathname.includes('multiview');
+    !ignoreAssist &&
+    (isRoute(ASSIST_ROUTE, location.pathname) ||
+      isRoute(ASSIST_LIVE_SESSION, location.pathname) ||
+      location.pathname.includes('multiview'));
   const isLastPlayed = lastPlayedSessionId === sessionId;

   const _metaList = Object.keys(metadata)


@@ -21,11 +21,11 @@ interface Props {
   updateFilter: typeof updateFilter;
 }

 function SessionSearch(props: Props) {
-  const { appliedFilter, saveRequestPayloads = false, metaLoading } = props;
+  const { appliedFilter, saveRequestPayloads = false, metaLoading = false } = props;
   const hasEvents = appliedFilter.filters.filter((i: any) => i.isEvent).size > 0;
   const hasFilters = appliedFilter.filters.filter((i: any) => !i.isEvent).size > 0;

-  useSessionSearchQueryHandler({ appliedFilter, applyFilter: props.updateFilter });
+  useSessionSearchQueryHandler({ appliedFilter, applyFilter: props.updateFilter, loading: metaLoading });

   useEffect(() => {
     debounceFetch = debounce(() => props.fetchSessions(), 500);
@@ -89,11 +89,9 @@ function SessionSearch(props: Props) {
       <div className="border-t px-5 py-1 flex items-center -mx-2">
         <div>
           <FilterSelection filter={undefined} onFilterClick={onAddFilter}>
-            {/* <IconButton primaryText label="ADD STEP" icon="plus" /> */}
             <Button
               variant="text-primary"
               className="mr-2"
-              // onClick={() => setshowModal(true)}
               icon="plus"
             >
               ADD STEP


@@ -70,7 +70,7 @@ function UserSessionsModal(props: Props) {
       <Loader loading={loading}>
         {data.sessions.map((session: any) => (
           <div className="border-b last:border-none" key={session.sessionId}>
-            <SessionItem key={session.sessionId} session={session} compact={true} onClick={hideModal} />
+            <SessionItem key={session.sessionId} session={session} compact={true} onClick={hideModal} ignoreAssist={true} />
           </div>
         ))}
       </Loader>


@@ -5,29 +5,34 @@ import { createUrlQuery, getFiltersFromQuery } from 'App/utils/search';
 interface Props {
   appliedFilter: any;
   applyFilter: any;
+  loading: boolean;
 }

 const useSessionSearchQueryHandler = (props: Props) => {
-  const { appliedFilter, applyFilter } = props;
+  const { appliedFilter, applyFilter, loading } = props;
   const history = useHistory();

   useEffect(() => {
     const applyFilterFromQuery = () => {
-      const filter = getFiltersFromQuery(history.location.search, appliedFilter);
-      applyFilter(filter, true, false);
+      if (!loading) {
+        const filter = getFiltersFromQuery(history.location.search, appliedFilter);
+        applyFilter(filter, true, false);
+      }
     };

     applyFilterFromQuery();
-  }, []);
+  }, [loading]);

   useEffect(() => {
     const generateUrlQuery = () => {
-      const search: any = createUrlQuery(appliedFilter);
-      history.replace({ search });
+      if (!loading) {
+        const search: any = createUrlQuery(appliedFilter);
+        history.replace({ search });
+      }
     };

     generateUrlQuery();
-  }, [appliedFilter]);
+  }, [appliedFilter, loading]);

   return null;
 };


@@ -105,11 +105,11 @@ export default class ListWalker<T extends Timed> {
       : null
   }

-  /*
-    Returns last message with the time <= t.
-    Assumed that the current message is already handled so
-    if pointer doesn't cahnge <null> is returned.
-  */
+  /**
+   * @returns last message with the time <= t.
+   * Assumed that the current message is already handled so
+   * if pointer doesn't cahnge <null> is returned.
+   */
   moveGetLast(t: number, index?: number): T | null {
     let key: string = "time"; //TODO
     let val = t;
@@ -130,7 +130,13 @@ export default class ListWalker<T extends Timed> {
     return changed ? this.list[ this.p - 1 ] : null;
   }

-  async moveWait(t: number, callback: (msg: T) => Promise<any> | undefined): Promise<void> {
+  /**
+   * Moves over the messages starting from the current+1 to the last one with the time <= t
+   * applying callback on each of them
+   * @param t - max message time to move to; will move & apply callback while msg.time <= t
+   * @param callback - a callback to apply on each message passing by while moving
+   */
+  moveApply(t: number, callback: (msg: T) => void): void {
     // Applying only in increment order for now
     if (t < this.timeNow) {
       this.reset();
@@ -138,8 +144,7 @@ export default class ListWalker<T extends Timed> {
     const list = this.list

     while (list[this.p] && list[this.p].time <= t) {
-      const maybePromise = callback(this.list[ this.p++ ]);
-      if (maybePromise) { await maybePromise }
+      callback(this.list[ this.p++ ])
     }
   }

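The `moveApply` rewrite above keeps `ListWalker`'s core idea: a pointer walks forward over a time-sorted message list, applying a callback to every message with `time <= t`, and resets when asked to rewind. A compact Python sketch of that pointer discipline (class and field names here are illustrative):

```python
class ListWalkerSketch:
    """Python sketch of ListWalker.moveApply: a pointer over time-sorted messages."""

    def __init__(self, messages):
        self.list = sorted(messages, key=lambda m: m["time"])
        self.p = 0  # index of the next unhandled message

    @property
    def time_now(self):
        # Time of the last handled message, 0 before any movement
        return self.list[self.p - 1]["time"] if self.p > 0 else 0

    def reset(self):
        self.p = 0

    def move_apply(self, t, callback):
        # Only forward movement is supported; rewinding resets the pointer
        if t < self.time_now:
            self.reset()
        while self.p < len(self.list) and self.list[self.p]["time"] <= t:
            callback(self.list[self.p])
            self.p += 1
```

Walking to a later `t` only visits the messages not yet handled, which is what makes replay seeking cheap in the forward direction.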

@@ -1,6 +1,5 @@
 import { Store } from './types'

-// (not a type)
 export default class SimpleSore<G, S=G> implements Store<G, S> {
   constructor(private state: G){}
   get(): G {


@@ -52,7 +52,7 @@ export interface State extends ScreenState, ListsState {
   connBandwidth?: number,
   location?: string,
   performanceChartTime?: number,
-  performanceAvaliability?: PerformanceTrackManager['avaliability']
+  performanceAvailability?: PerformanceTrackManager['availability']

   domContentLoadedTime?: { time: number, value: number },
   domBuildingTime?: number,
@@ -191,7 +191,7 @@ export default class MessageManager {
   private onFileReadSuccess = () => {
     const stateToUpdate : Partial<State>= {
       performanceChartData: this.performanceTrackManager.chartData,
-      performanceAvaliability: this.performanceTrackManager.avaliability,
+      performanceAvailability: this.performanceTrackManager.availability,
       ...this.lists.getFullListsState(),
     }

     if (this.activityManager) {
@@ -214,7 +214,7 @@ export default class MessageManager {
   async loadMessages(isClickmap: boolean = false) {
     this.setMessagesLoading(true)
     // TODO: reusable decryptor instance
-    const createNewParser = (shouldDecrypt = true, file) => {
+    const createNewParser = (shouldDecrypt = true, file?: string) => {
       const decrypt = shouldDecrypt && this.session.fileKey
         ? (b: Uint8Array) => decryptSessionBytes(b, this.session.fileKey)
         : (b: Uint8Array) => Promise.resolve(b)
@@ -228,7 +228,7 @@ export default class MessageManager {
     }

     const sorted = msgs.sort((m1, m2) => {
       // @ts-ignore
-      if (m1.time === m2.time) return m1._index - m2._index
+      if (!m1.time || !m2.time || m1.time === m2.time) return m1._index - m2._index
       return m1.time - m2.time
     })
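The comparator fix above extends the arrival-order tie-break: when either message is missing a timestamp (not just when timestamps are equal), ordering falls back to `_index`. The same comparator, ported to Python via `cmp_to_key` (field names mirror the TypeScript; the dict shape is illustrative):

```python
from functools import cmp_to_key


def compare(m1, m2):
    # Mirrors the fixed comparator: index tie-break when time is missing or equal
    if not m1.get("time") or not m2.get("time") or m1["time"] == m2["time"]:
        return m1["_index"] - m2["_index"]
    return m1["time"] - m2["time"]


def sort_messages(msgs):
    return sorted(msgs, key=cmp_to_key(compare))
```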
@ -250,34 +250,60 @@ export default class MessageManager {
this.waitingForFiles = true this.waitingForFiles = true
// TODO: refactor this stuff; split everything to async/await
const loadMethod = this.session.domURL && this.session.domURL.length > 0 const loadMethod = this.session.domURL && this.session.domURL.length > 0
? { url: this.session.domURL, parser: () => createNewParser(true, 'dom') } ? { url: this.session.domURL, parser: () => createNewParser(true, 'dom') }
: { url: this.session.mobsUrl, parser: () => createNewParser(false, 'dom')} : { url: this.session.mobsUrl, parser: () => createNewParser(false, 'dom')}
loadFiles(loadMethod.url, loadMethod.parser()) const parser = loadMethod.parser()
// EFS fallback
.catch((e) => /**
requestEFSDom(this.session.sessionId) * We load first dom mobfile before the rest
.then(createNewParser(false, 'domEFS')) * to speed up time to replay
* but as a tradeoff we have to have some copy-paste
* for the devtools file
* */
loadFiles([loadMethod.url[0]], parser)
.then(() => {
const domPromise = loadMethod.url.length > 1
? loadFiles([loadMethod.url[1]], parser, true)
: Promise.resolve()
const devtoolsPromise = !isClickmap
? this.loadDevtools(createNewParser)
: Promise.resolve()
return Promise.all([domPromise, devtoolsPromise])
})
/**
* EFS fallback for unprocessed sessions (which are live)
* */
.catch(() => {
requestEFSDom(this.session.sessionId)
.then(createNewParser(false, 'domEFS'))
.catch(this.onFileReadFailed)
if (!isClickmap) {
this.loadDevtools(createNewParser)
}
}
) )
.then(this.onFileReadSuccess) .then(this.onFileReadSuccess)
.catch(this.onFileReadFailed)
.finally(this.onFileReadFinally); .finally(this.onFileReadFinally);
}
// load devtools (TODO: start after the first DOM file download) loadDevtools(createNewParser: (shouldDecrypt: boolean, file: string) => (b: Uint8Array) => Promise<void>) {
if (isClickmap) return;
this.state.update({ devtoolsLoading: true }) this.state.update({ devtoolsLoading: true })
loadFiles(this.session.devtoolsURL, createNewParser(true, 'devtools')) return loadFiles(this.session.devtoolsURL, createNewParser(true, 'devtools'))
// EFS fallback // EFS fallback
.catch(() => .catch(() =>
requestEFSDevtools(this.session.sessionId) requestEFSDevtools(this.session.sessionId)
.then(createNewParser(false, 'devtoolsEFS')) .then(createNewParser(false, 'devtoolsEFS'))
) )
.then(() => { // TODO: also in case of dynamic update through assist
this.state.update(this.lists.getFullListsState()) // TODO: also in case of dynamic update through assist .then(() => {
}) this.state.update({ ...this.lists.getFullListsState() })
.catch(e => logger.error("Can not download the devtools file", e)) })
.finally(() => this.state.update({ devtoolsLoading: false })) .catch(e => logger.error("Can not download the devtools file", e))
.finally(() => this.state.update({ devtoolsLoading: false }))
} }
resetMessageManagers() { resetMessageManagers() {
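The loading order above (first DOM file alone, then the remaining DOM files and devtools data in parallel) can be sketched in isolation. This is a hypothetical stand-in, not OpenReplay's actual `loadFiles`/`MessageManager` API — the loader and url names are invented for illustration:

```typescript
type Parser = (chunk: string) => void

// Hypothetical stand-in for loadFiles(): feeds each url's content to the parser.
async function loadFiles(urls: string[], parse: Parser): Promise<void> {
  for (const url of urls) {
    parse(`content-of-${url}`)
  }
}

const applied: string[] = []
const parser: Parser = (chunk) => applied.push(chunk)

async function loadSession(domUrls: string[], devtoolsUrls: string[]): Promise<string[]> {
  // The first DOM mobfile is awaited alone: the replay can start as soon as it is parsed.
  await loadFiles([domUrls[0]], parser)
  // The rest of the DOM and the devtools data arrive later, in parallel.
  const domRest = domUrls.length > 1 ? loadFiles(domUrls.slice(1), parser) : Promise.resolve()
  const devtools = loadFiles(devtoolsUrls, parser)
  await Promise.all([domRest, devtools])
  return applied
}
```

The tradeoff mentioned in the diff's comment is visible here: the devtools path has to be kicked off separately in both the success and the fallback branches, duplicating a little logic in exchange for a faster time-to-first-frame.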
@@ -90,7 +90,7 @@ export default class WebLivePlayer extends WebPlayer {
   clean = () => {
     this.incomingMessages.length = 0
     this.assistManager.clean()
-    this.screen.clean()
+    this.screen?.clean?.()
     // @ts-ignore
     this.screen = undefined;
     super.clean()
@@ -9,40 +9,35 @@ import FocusManager from './FocusManager';
 import SelectionManager from './SelectionManager';
 import type { StyleElement } from './VirtualDOM';
 import {
-  PostponedStyleSheet,
+  OnloadStyleSheet,
   VDocument,
   VElement,
+  VHTMLElement,
   VNode,
   VShadowRoot,
-  VStyleElement,
   VText,
+  OnloadVRoot,
 } from './VirtualDOM';
 import { deleteRule, insertRule } from './safeCSSRules';

-type HTMLElementWithValue = HTMLInputElement | HTMLTextAreaElement | HTMLSelectElement;
+function isStyleVElement(vElem: VElement): vElem is VElement & { node: StyleElement } {
+  return vElem.tagName.toLowerCase() === "style"
+}

-const IGNORED_ATTRS = [ "autocomplete" ];
-const ATTR_NAME_REGEXP = /([^\t\n\f \/>"'=]+)/; // regexp costs ~
-
-// TODO: filter out non-relevant prefixes
-// function replaceCSSPrefixes(css: string) {
-//   return css
-//     .replace(/\-ms\-/g, "")
-//     .replace(/\-webkit\-/g, "")
-//     .replace(/\-moz\-/g, "")
-//     .replace(/\-webkit\-/g, "")
-// }
+const IGNORED_ATTRS = [ "autocomplete" ]
+const ATTR_NAME_REGEXP = /([^\t\n\f \/>"'=]+)/

 export default class DOMManager extends ListWalker<Message> {
   private readonly vTexts: Map<number, VText> = new Map() // map vs object here?
   private readonly vElements: Map<number, VElement> = new Map()
-  private readonly vRoots: Map<number, VShadowRoot | VDocument> = new Map()
-  private styleSheets: Map<number, CSSStyleSheet> = new Map()
-  private ppStyleSheets: Map<number, PostponedStyleSheet> = new Map()
-  private stringDict: Record<number,string> = {}
-  private attrsBacktrack: Message[] = []
+  private readonly olVRoots: Map<number, OnloadVRoot> = new Map()
+  /** Constructed StyleSheets https://developer.mozilla.org/en-US/docs/Web/API/Document/adoptedStyleSheets
+   * as well as <style> tag owned StyleSheets
+   */
+  private olStyleSheets: Map<number, OnloadStyleSheet> = new Map()
+  /** @deprecated since tracker 4.0.2. Mapping by nodeID */
+  private olStyleSheetsDeprecated: Map<number, OnloadStyleSheet> = new Map()

   private upperBodyId: number = -1;
   private nodeScrollManagers: Map<number, ListWalker<SetNodeScroll>> = new Map()
@@ -53,6 +48,7 @@ export default class DOMManager extends ListWalker<Message> {
   constructor(
     private readonly screen: Screen,
     private readonly isMobile: boolean,
+    private stringDict: Record<number,string>,
     public readonly time: number,
     setCssLoading: ConstructorParameters<typeof StylesManager>[1],
   ) {
@@ -61,6 +57,10 @@ export default class DOMManager extends ListWalker<Message> {
     this.stylesManager = new StylesManager(screen, setCssLoading)
   }

+  setStringDict(stringDict: Record<number,string>) {
+    this.stringDict = stringDict
+  }
+
   append(m: Message): void {
     if (m.tp === MType.SetNodeScroll) {
       let scrollManager = this.nodeScrollManagers.get(m.id)
@@ -91,21 +91,20 @@ export default class DOMManager extends ListWalker<Message> {
     super.append(m)
   }

-  private removeBodyScroll(id: number, vn: VElement): void {
+  private removeBodyScroll(id: number, vElem: VElement): void {
     if (this.isMobile && this.upperBodyId === id) { // Need more type safety!
-      (vn.node as HTMLBodyElement).style.overflow = "hidden"
+      (vElem.node as HTMLBodyElement).style.overflow = "hidden"
     }
   }

-  // May be make it as a message on message add?
-  private removeAutocomplete(node: Element): boolean {
-    const tag = node.tagName
+  private removeAutocomplete(vElem: VElement): boolean {
+    const tag = vElem.tagName
     if ([ "FORM", "TEXTAREA", "SELECT" ].includes(tag)) {
-      node.setAttribute("autocomplete", "off");
+      vElem.setAttribute("autocomplete", "off");
       return true;
     }
     if (tag === "INPUT") {
-      node.setAttribute("autocomplete", "new-password");
+      vElem.setAttribute("autocomplete", "new-password");
       return true;
     }
     return false;
@@ -117,22 +116,24 @@ export default class DOMManager extends ListWalker<Message> {
       logger.error("Insert error. Node not found", id);
       return;
     }
-    const parent = this.vElements.get(parentID) || this.vRoots.get(parentID)
+    const parent = this.vElements.get(parentID) || this.olVRoots.get(parentID)
     if (!parent) {
-      logger.error("Insert error. Parent node not found", parentID, this.vElements, this.vRoots);
+      logger.error("Insert error. Parent vNode not found", parentID, this.vElements, this.olVRoots);
       return;
     }
-    const pNode = parent.node
-    if ((pNode instanceof HTMLStyleElement) && // TODO: correct ordering OR filter in tracker
-      pNode.sheet &&
-      pNode.sheet.cssRules &&
-      pNode.sheet.cssRules.length > 0 &&
-      pNode.innerText &&
-      pNode.innerText.trim().length === 0
-    ) {
-      logger.log("Trying to insert child to a style tag with virtual rules: ", parent, child);
-      return;
+    if (parent instanceof VElement && isStyleVElement(parent)) {
+      // TODO: does this ever happen? Maybe do not send empty TextNodes in the tracker
+      const styleNode = parent.node
+      if (styleNode.sheet &&
+        styleNode.sheet.cssRules &&
+        styleNode.sheet.cssRules.length > 0 &&
+        styleNode.textContent &&
+        styleNode.textContent.trim().length === 0
+      ) {
+        logger.log("Trying to insert child to a style tag with virtual rules: ", parent, child);
+        return;
+      }
     }
     parent.insertChildAt(child, index)
@@ -143,25 +144,27 @@ export default class DOMManager extends ListWalker<Message> {
     const vn = this.vElements.get(msg.id)
     if (!vn) { logger.error("SetNodeAttribute: Node not found", msg); return }
-    if (vn.node.tagName === "INPUT" && name === "name") {
-      // Otherwise binds local autocomplete values (maybe should ignore on the tracker level)
+    if (vn.tagName === "INPUT" && name === "name") {
+      // Otherwise binds local autocomplete values (maybe should ignore on the tracker level?)
       return
     }
-    if (name === "href" && vn.node.tagName === "LINK") {
+    if (name === "href" && vn.tagName === "LINK") {
       // @ts-ignore ?global ENV type // It's been done on backend (remove after testing in saas)
       // if (value.startsWith(window.env.ASSETS_HOST || window.location.origin + '/assets')) {
       //   value = value.replace("?", "%3F");
       // }
       if (!value.startsWith("http")) {
+        /* blob:... values can happen here for some reason,
+         * which will result in that link being unable to load and hitting a 4 sec timeout in the function below.
+         */
         return
       }
-      // blob:... value can happen here for some reason.
-      // which will result in that link being unable to load and having 4sec timeout in the below function.
       // TODO: check if node actually exists on the page, not just in memory
       this.stylesManager.setStyleHandlers(vn.node as HTMLLinkElement, value);
     }
-    if (vn.node.namespaceURI === 'http://www.w3.org/2000/svg' && value.startsWith("url(")) {
+    if (vn.isSVG && value.startsWith("url(")) {
+      /* SVG shape IDs for masks etc. are sometimes referred to with the full-page url, which we don't have in replay */
       value = "url(#" + (value.split("#")[1] ||")")
     }
     vn.setAttribute(name, value)
@@ -169,12 +172,9 @@ export default class DOMManager extends ListWalker<Message> {
   }

   private applyMessage = (msg: Message): Promise<any> | undefined => {
-    let vn: VNode | undefined
-    let doc: Document | null
-    let styleSheet: CSSStyleSheet | PostponedStyleSheet | undefined
     switch (msg.tp) {
-      case MType.CreateDocument:
-        doc = this.screen.document;
+      case MType.CreateDocument: {
+        const doc = this.screen.document;
         if (!doc) {
           logger.error("No root iframe document found", msg, this.screen)
           return;
@@ -185,84 +185,71 @@ export default class DOMManager extends ListWalker<Message> {
         const fRoot = doc.documentElement;
         fRoot.innerText = '';
-        vn = new VElement(fRoot)
+        const vHTMLElement = new VHTMLElement(fRoot)
         this.vElements.clear()
-        this.vElements.set(0, vn)
-        const vDoc = new VDocument(doc)
-        vDoc.insertChildAt(vn, 0)
-        this.vRoots.clear()
-        this.vRoots.set(0, vDoc) // watchout: id==0 for both Document and documentElement
+        this.vElements.set(0, vHTMLElement)
+        const vDoc = OnloadVRoot.fromDocumentNode(doc)
+        vDoc.insertChildAt(vHTMLElement, 0)
+        this.olVRoots.clear()
+        this.olVRoots.set(0, vDoc) // watchout: id==0 for both Document and documentElement
         // this is done for the AdoptedCSS logic
-        // todo: start from 0-node (sync logic with tracker)
+        // Maybetodo: start Document as 0-node in tracker
         this.vTexts.clear()
         this.stylesManager.reset()
-        this.stringDict = {}
         return
-      case MType.CreateTextNode:
-        vn = new VText()
-        this.vTexts.set(msg.id, vn)
+      }
+      case MType.CreateTextNode: {
+        const vText = new VText()
+        this.vTexts.set(msg.id, vText)
         this.insertNode(msg)
         return
-      case MType.CreateElementNode:
-        let element: Element
-        if (msg.svg) {
-          element = document.createElementNS('http://www.w3.org/2000/svg', msg.tag)
-        } else {
-          element = document.createElement(msg.tag)
-        }
-        if (msg.tag === "STYLE" || msg.tag === "style") {
-          vn = new VStyleElement(element as StyleElement)
-        } else {
-          vn = new VElement(element)
-        }
-        this.vElements.set(msg.id, vn)
+      }
+      case MType.CreateElementNode: {
+        const vElem = new VElement(msg.tag, msg.svg)
+        if (['STYLE', 'style', 'LINK'].includes(msg.tag)) {
+          vElem.prioritized = true
+        }
+        this.vElements.set(msg.id, vElem)
         this.insertNode(msg)
-        this.removeBodyScroll(msg.id, vn)
-        this.removeAutocomplete(element)
-        if (['STYLE', 'style', 'LINK'].includes(msg.tag)) { // Styles in priority
-          vn.enforceInsertion()
-        }
+        this.removeBodyScroll(msg.id, vElem)
+        this.removeAutocomplete(vElem)
         return
+      }
       case MType.MoveNode:
-        this.insertNode(msg);
+        this.insertNode(msg)
         return
-      case MType.RemoveNode:
-        vn = this.vElements.get(msg.id) || this.vTexts.get(msg.id)
-        if (!vn) { logger.error("RemoveNode: Node not found", msg); return }
-        if (!vn.parentNode) { logger.error("RemoveNode: Parent node not found", msg); return }
-        vn.parentNode.removeChild(vn)
+      case MType.RemoveNode: {
+        const vChild = this.vElements.get(msg.id) || this.vTexts.get(msg.id)
+        if (!vChild) { logger.error("RemoveNode: Node not found", msg); return }
+        if (!vChild.parentNode) { logger.error("RemoveNode: Parent node not found", msg); return }
+        vChild.parentNode.removeChild(vChild)
         this.vElements.delete(msg.id)
         this.vTexts.delete(msg.id)
         return
+      }
       case MType.SetNodeAttribute:
-        if (msg.name === 'href') this.attrsBacktrack.push(msg)
-        else this.setNodeAttribute(msg)
-        return
-      case MType.StringDict:
-        this.stringDict[msg.key] = msg.value
+        this.setNodeAttribute(msg)
         return
       case MType.SetNodeAttributeDict:
-        this.stringDict[msg.nameKey] === undefined && logger.error("No dictionary key for msg 'name': ", msg)
-        this.stringDict[msg.valueKey] === undefined && logger.error("No dictionary key for msg 'value': ", msg)
+        this.stringDict[msg.nameKey] === undefined && logger.error("No dictionary key for msg 'name': ", msg, this.stringDict)
+        this.stringDict[msg.valueKey] === undefined && logger.error("No dictionary key for msg 'value': ", msg, this.stringDict)
         if (this.stringDict[msg.nameKey] === undefined || this.stringDict[msg.valueKey] === undefined) { return }
-        if (this.stringDict[msg.nameKey] === 'href') this.attrsBacktrack.push(msg)
-        else {
-          this.setNodeAttribute({
-            id: msg.id,
-            name: this.stringDict[msg.nameKey],
-            value: this.stringDict[msg.valueKey],
-          })
-        }
+        this.setNodeAttribute({
+          id: msg.id,
+          name: this.stringDict[msg.nameKey],
+          value: this.stringDict[msg.valueKey],
+        })
         return
-      case MType.RemoveNodeAttribute:
-        vn = this.vElements.get(msg.id)
-        if (!vn) { logger.error("RemoveNodeAttribute: Node not found", msg); return }
-        vn.removeAttribute(msg.name)
+      case MType.RemoveNodeAttribute: {
+        const vElem = this.vElements.get(msg.id)
+        if (!vElem) { logger.error("RemoveNodeAttribute: Node not found", msg); return }
+        vElem.removeAttribute(msg.name)
         return
-      case MType.SetInputValue:
-        vn = this.vElements.get(msg.id)
-        if (!vn) { logger.error("SetInoputValue: Node not found", msg); return }
-        const nodeWithValue = vn.node
+      }
+      case MType.SetInputValue: {
+        const vElem = this.vElements.get(msg.id)
+        if (!vElem) { logger.error("SetInputValue: Node not found", msg); return }
+        const nodeWithValue = vElem.node
         if (!(nodeWithValue instanceof HTMLInputElement
           || nodeWithValue instanceof HTMLTextAreaElement
           || nodeWithValue instanceof HTMLSelectElement)
@@ -271,222 +258,183 @@ export default class DOMManager extends ListWalker<Message> {
           return
         }
         const val = msg.mask > 0 ? '*'.repeat(msg.mask) : msg.value
-        doc = this.screen.document
+        const doc = this.screen.document
         if (doc && nodeWithValue === doc.activeElement) {
           // For the case of Remote Control
           nodeWithValue.onblur = () => { nodeWithValue.value = val }
           return
         }
-        nodeWithValue.value = val
+        nodeWithValue.value = val // Maybe make a special VInputValueElement type for lazy value updates
         return
-      case MType.SetInputChecked:
-        vn = this.vElements.get(msg.id)
-        if (!vn) { logger.error("SetInputChecked: Node not found", msg); return }
-        (vn.node as HTMLInputElement).checked = msg.checked
+      }
+      case MType.SetInputChecked: {
+        const vElem = this.vElements.get(msg.id)
+        if (!vElem) { logger.error("SetInputChecked: Node not found", msg); return }
+        (vElem.node as HTMLInputElement).checked = msg.checked // Maybe make a special VCheckableElement type for lazy checking
         return
+      }
       case MType.SetNodeData:
-      case MType.SetCssData: // mbtodo: remove css transitions when timeflow is not natural (on jumps)
-        vn = this.vTexts.get(msg.id)
-        if (!vn) { logger.error("SetCssData: Node not found", msg); return }
-        vn.setData(msg.data)
-        if (msg.tp === MType.SetCssData) { // Styles in priority (do we need inlines as well?)
-          vn.applyChanges()
-        }
+      case MType.SetCssData: { // mbtodo: remove css transitions when timeflow is not natural (on jumps)
+        const vText = this.vTexts.get(msg.id)
+        if (!vText) { logger.error("SetNodeData/SetCssData: Node not found", msg); return }
+        vText.setData(msg.data)
         return
+      }
-      // @deprecated since 4.0.2 in favor of adopted_ss_insert/delete_rule + add_owner as being common case for StyleSheets
-      case MType.CssInsertRule:
-        vn = this.vElements.get(msg.id)
-        if (!vn) { logger.error("CssInsertRule: Node not found", msg); return }
-        if (!(vn instanceof VStyleElement)) {
-          logger.warn("Non-style node in CSS rules message (or sheet is null)", msg, vn);
-          return
-        }
-        vn.onStyleSheet(sheet => insertRule(sheet, msg))
+      /** @deprecated
+       * since 4.0.2 in favor of AdoptedSsInsertRule/DeleteRule + AdoptedSsAddOwner as a common case for StyleSheets
+       */
+      case MType.CssInsertRule: {
+        let styleSheet = this.olStyleSheetsDeprecated.get(msg.id)
+        if (!styleSheet) {
+          const vElem = this.vElements.get(msg.id)
+          if (!vElem) { logger.error("CssInsertRule: Node not found", msg); return }
+          if (!isStyleVElement(vElem)) { logger.error("CssInsertRule: Non-style element", msg); return }
+          styleSheet = OnloadStyleSheet.fromStyleElement(vElem.node)
+          this.olStyleSheetsDeprecated.set(msg.id, styleSheet)
+        }
+        styleSheet.insertRule(msg.rule, msg.index)
         return
-      case MType.CssDeleteRule:
-        vn = this.vElements.get(msg.id)
-        if (!vn) { logger.error("CssDeleteRule: Node not found", msg); return }
-        if (!(vn instanceof VStyleElement)) {
-          logger.warn("Non-style node in CSS rules message (or sheet is null)", msg, vn);
-          return
-        }
-        vn.onStyleSheet(sheet => deleteRule(sheet, msg))
+      }
+      case MType.CssDeleteRule: {
+        const styleSheet = this.olStyleSheetsDeprecated.get(msg.id)
+        if (!styleSheet) { logger.error("CssDeleteRule: StyleSheet was not created", msg); return }
+        styleSheet.deleteRule(msg.index)
         return
-      // end @deprecated
+      }
+      /* end @deprecated */
-      case MType.CreateIFrameDocument:
-        vn = this.vElements.get(msg.frameID)
-        if (!vn) { logger.error("CreateIFrameDocument: Node not found", msg); return }
-        vn.enforceInsertion()
-        const host = vn.node
-        if (host instanceof HTMLIFrameElement) {
-          const doc = host.contentDocument
-          if (!doc) {
-            logger.warn("No default iframe doc", msg, host)
-            return
-          }
-          const vDoc = new VDocument(doc)
-          this.vRoots.set(msg.id, vDoc)
-          return;
-        } else if (host instanceof Element) { // shadow DOM
-          try {
-            const shadowRoot = host.attachShadow({ mode: 'open' })
-            vn = new VShadowRoot(shadowRoot)
-            this.vRoots.set(msg.id, vn)
-          } catch(e) {
-            logger.warn("Can not attach shadow dom", e, msg)
-          }
-        } else {
-          logger.warn("Context message host is not Element", msg)
-        }
+      case MType.CreateIFrameDocument: {
+        const vElem = this.vElements.get(msg.frameID)
+        if (!vElem) { logger.error("CreateIFrameDocument: Node not found", msg); return }
+        const vRoot = OnloadVRoot.fromVElement(vElem)
+        vRoot.catch(e => logger.warn(e, msg))
+        this.olVRoots.set(msg.id, vRoot)
         return
+      }
-      case MType.AdoptedSsInsertRule:
-        styleSheet = this.styleSheets.get(msg.sheetID) || this.ppStyleSheets.get(msg.sheetID)
+      case MType.AdoptedSsInsertRule: {
+        const styleSheet = this.olStyleSheets.get(msg.sheetID)
         if (!styleSheet) {
           logger.warn("No stylesheet was created for ", msg)
           return
         }
         insertRule(styleSheet, msg)
         return
-      case MType.AdoptedSsDeleteRule:
-        styleSheet = this.styleSheets.get(msg.sheetID) || this.ppStyleSheets.get(msg.sheetID)
+      }
+      case MType.AdoptedSsDeleteRule: {
+        const styleSheet = this.olStyleSheets.get(msg.sheetID)
         if (!styleSheet) {
           logger.warn("No stylesheet was created for ", msg)
           return
         }
         deleteRule(styleSheet, msg)
         return
-      case MType.AdoptedSsReplace:
-        styleSheet = this.styleSheets.get(msg.sheetID)
+      }
+      case MType.AdoptedSsReplace: {
+        const styleSheet = this.olStyleSheets.get(msg.sheetID)
         if (!styleSheet) {
           logger.warn("No stylesheet was created for ", msg)
           return
         }
-        // @ts-ignore
+        // @ts-ignore (configure ts with recent Web API types)
         styleSheet.replaceSync(msg.text)
         return
+      }
-      case MType.AdoptedSsAddOwner:
-        vn = this.vRoots.get(msg.id)
-        if (!vn) {
-          // non-constructed case
-          vn = this.vElements.get(msg.id)
-          if (!vn) { logger.error("AdoptedSsAddOwner: Node not found", msg); return }
-          if (!(vn instanceof VStyleElement)) { logger.error("Non-style owner", msg); return }
-          this.ppStyleSheets.set(msg.sheetID, new PostponedStyleSheet(vn.node))
-          return
-        }
-        styleSheet = this.styleSheets.get(msg.sheetID)
-        if (!styleSheet) {
-          let context: typeof globalThis
-          const rootNode = vn.node
-          if (rootNode.nodeType === Node.DOCUMENT_NODE) {
-            context = (rootNode as Document).defaultView
-          } else {
-            context = (rootNode as ShadowRoot).ownerDocument.defaultView
-          }
-          styleSheet = new context.CSSStyleSheet()
-          this.styleSheets.set(msg.sheetID, styleSheet)
-        }
-        //@ts-ignore
-        vn.node.adoptedStyleSheets = [...vn.node.adoptedStyleSheets, styleSheet]
-        return
-      case MType.AdoptedSsRemoveOwner:
-        styleSheet = this.styleSheets.get(msg.sheetID)
-        if (!styleSheet) {
-          logger.warn("No stylesheet was created for ", msg)
-          return
-        }
-        vn = this.vRoots.get(msg.id)
-        if (!vn) { logger.error("AdoptedSsRemoveOwner: Node not found", msg); return }
-        //@ts-ignore
-        vn.node.adoptedStyleSheets = [...vn.node.adoptedStyleSheets].filter(s => s !== styleSheet)
-        return
-      case MType.LoadFontFace:
-        vn = this.vRoots.get(msg.parentID)
-        if (!vn) { logger.error("LoadFontFace: Node not found", msg); return }
-        if (vn instanceof VShadowRoot) { logger.error(`Node ${vn} expected to be a Document`, msg); return }
-        let descr: Object
-        try {
-          descr = JSON.parse(msg.descriptors)
-          descr = typeof descr === 'object' ? descr : undefined
-        } catch {
-          logger.warn("Can't parse font-face descriptors: ", msg)
-        }
-        const ff = new FontFace(msg.family, msg.source, descr)
-        vn.node.fonts.add(ff)
-        return ff.load()
+      case MType.AdoptedSsAddOwner: {
+        const vRoot = this.olVRoots.get(msg.id)
+        if (!vRoot) {
+          /* <style> tag case */
+          const vElem = this.vElements.get(msg.id)
+          if (!vElem) { logger.error("AdoptedSsAddOwner: Node not found", msg); return }
+          if (!isStyleVElement(vElem)) { logger.error("Non-style owner", msg); return }
+          this.olStyleSheets.set(msg.sheetID, OnloadStyleSheet.fromStyleElement(vElem.node))
+          return
+        }
+        /* Constructed StyleSheet case */
+        let olStyleSheet = this.olStyleSheets.get(msg.sheetID)
+        if (!olStyleSheet) {
+          olStyleSheet = OnloadStyleSheet.fromVRootContext(vRoot)
+          this.olStyleSheets.set(msg.sheetID, olStyleSheet)
+        }
+        olStyleSheet.whenReady(styleSheet => {
+          vRoot.onNode(node => {
+            // @ts-ignore
+            node.adoptedStyleSheets = [...node.adoptedStyleSheets, styleSheet]
+          })
+        })
+        return
+      }
+      case MType.AdoptedSsRemoveOwner: {
+        const olStyleSheet = this.olStyleSheets.get(msg.sheetID)
+        if (!olStyleSheet) {
+          logger.warn("AdoptedSsRemoveOwner: No stylesheet was created for ", msg)
+          return
+        }
+        const vRoot = this.olVRoots.get(msg.id)
+        if (!vRoot) { logger.error("AdoptedSsRemoveOwner: Owner node not found", msg); return }
+        olStyleSheet.whenReady(styleSheet => {
+          vRoot.onNode(node => {
+            // @ts-ignore
+            node.adoptedStyleSheets = [...vRoot.node.adoptedStyleSheets].filter(s => s !== styleSheet)
+          })
+        })
+        return
+      }
+      case MType.LoadFontFace: {
+        const vRoot = this.olVRoots.get(msg.parentID)
+        if (!vRoot) { logger.error("LoadFontFace: Node not found", msg); return }
+        vRoot.whenReady(vNode => {
+          if (vNode instanceof VShadowRoot) { logger.error(`Node ${vNode} expected to be a Document`, msg); return }
+          let descr: Object | undefined
+          try {
+            descr = JSON.parse(msg.descriptors)
+            descr = typeof descr === 'object' ? descr : undefined
+          } catch {
+            logger.warn("Can't parse font-face descriptors: ", msg)
+          }
+          const ff = new FontFace(msg.family, msg.source, descr)
+          vNode.node.fonts.add(ff)
+          ff.load() // TODO: wait for this one in StylesManager in a common way with styles
+        })
+        return
+      }
     }
   }
-
-  applyBacktrack(msg: Message) {
-    // @ts-ignore
-    const target = this.vElements.get(msg.id)
-    if (!target) {
-      return
-    }
-    switch (msg.tp) {
-      case MType.SetNodeAttribute: {
-        this.setNodeAttribute(msg)
-        return
-      }
-      case MType.SetNodeAttributeDict: {
-        this.stringDict[msg.nameKey] === undefined && logger.error("No dictionary key for msg 'name': ", msg)
-        this.stringDict[msg.valueKey] === undefined && logger.error("No dictionary key for msg 'value': ", msg)
-        if (this.stringDict[msg.nameKey] === undefined || this.stringDict[msg.valueKey] === undefined) {
-          return
-        }
-        this.setNodeAttribute({
-          id: msg.id,
-          name: this.stringDict[msg.nameKey],
-          value: this.stringDict[msg.valueKey],
-        })
-        return;
-      }
-    }
-  }
+  /**
+   * Moves and applies all the messages from the current one (or from the beginning, if t < current.time)
+   * to the one with msg.time >= `t`
+   *
+   * This function autoresets the pointer if necessary (better name?)
+   *
+   * @returns Promise that fulfills when the necessary changes get applied
+   * (the async part exists mostly due to styles loading)
+   */
   async moveReady(t: number): Promise<void> {
-    // MBTODO (back jump optimisation):
-    //   - store intemediate virtual dom state
-    //   - cancel previous moveReady tasks (is it possible?) if new timestamp is less
-    // This function autoresets pointer if necessary (better name?)
-    /**
-     * Basically just skipping all set attribute with attrs being "href" if user is 'jumping'
-     * to the other point of replay to save time on NOT downloading any resources before the dom tree changes
-     * are applied, so it won't try to download and then cancel when node is created in msg N and removed in msg N+2
-     * which produces weird bug when asset is cached (10-25ms delay)
-     * */
-    // http://0.0.0.0:3333/5/session/8452905874437457
-    // 70 iframe, 8 create element - STYLE tag
-    await this.moveWait(t, this.applyMessage)
-    this.attrsBacktrack.forEach(msg => {
-      this.applyBacktrack(msg)
-    })
-    this.attrsBacktrack = []
-    this.vRoots.forEach(rt => rt.applyChanges()) // MBTODO (optimisation): affected set
+    this.moveApply(t, this.applyMessage)
+    this.olVRoots.forEach(rt => rt.applyChanges()) // MBTODO (optimisation): affected set
     // Thinkabout (read): css preload
     // What if we go back before it is ready? We'll have two handlers?
     return this.stylesManager.moveReady(t).then(() => {
-      // Apply focus
+      /* Waiting for styles to be applied first */
+      /* Applying focus */
       this.focusManager.move(t)
+      /* Applying text selections */
       this.selectionManager.move(t)
-      // Apply all scrolls after the styles got applied
+      /* Applying all scrolls */
       this.nodeScrollManagers.forEach(manager => {
         const msg = manager.moveGetLast(t)
         if (msg) {
-          let vNode: VNode
-          if (vNode = this.vElements.get(msg.id)) {
-            vNode.node.scrollLeft = msg.x
-            vNode.node.scrollTop = msg.y
-          } else if ((vNode = this.vRoots.get(msg.id)) && vNode instanceof VDocument){
-            vNode.node.defaultView?.scrollTo(msg.x, msg.y)
+          let scrollVHost: VElement | OnloadVRoot | undefined
+          if (scrollVHost = this.vElements.get(msg.id)) {
+            scrollVHost.node.scrollLeft = msg.x
+            scrollVHost.node.scrollTop = msg.y
+          } else if ((scrollVHost = this.olVRoots.get(msg.id))) {
+            scrollVHost.whenReady(vNode => {
+              if (vNode instanceof VDocument) {
+                vNode.node.defaultView?.scrollTo(msg.x, msg.y)
+              }
+            })
           }
         }
       })
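The ordering that `moveReady` enforces (flush virtual DOM changes, wait for styles, then apply focus, selection, and scroll state) can be sketched with stand-in functions. This is a hypothetical illustration of the sequencing only; none of the names below are the real manager APIs:

```typescript
const order: string[] = []

// Stand-in for stylesManager.moveReady(t): resolves once styles are applied.
const stylesReady = (): Promise<void> => {
  order.push('styles')
  return Promise.resolve()
}

async function moveReady(): Promise<string[]> {
  order.push('dom')        // olVRoots.forEach(rt => rt.applyChanges())
  await stylesReady()      // await styles before anything layout-dependent
  order.push('focus')      // focusManager.move(t)
  order.push('selection')  // selectionManager.move(t)
  order.push('scroll')     // nodeScrollManagers: scroll offsets depend on final layout
  return order
}
```

Scrolls come last on purpose: restoring `scrollLeft`/`scrollTop` before stylesheets load would be overwritten once the layout reflows.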
@@ -1,69 +1,119 @@
 import { insertRule, deleteRule } from './safeCSSRules';
-export type VNode = VDocument | VShadowRoot | VElement | VText
-type VChild = VElement | VText
-abstract class VParent {
-  abstract node: Node | null
+type Callback<T> = (o: T) => void
+/**
+ * Virtual Node base class.
+ * Implements common abstract methods and lazy node creation logic.
+ *
+ * @privateRemarks
+ * Would be better to export type-only, but didn't find a nice way to do that.
+ */
+export abstract class VNode<T extends Node = Node> {
+  protected abstract createNode(): T
+  private _node: T | null
+  /**
+   * JS DOM Node getter with lazy node creation
+   *
+   * @returns underneath JS DOM Node
+   * @remarks should not be called unless the real node is required since creation might be expensive
+   * It is better to use `onNode` callback applicator unless in the `applyChanges` implementation
+   */
+  get node(): T {
+    if (!this._node) {
+      const node = this._node = this.createNode()
+      this.nodeCallbacks.forEach(cb => cb(node))
+      this.nodeCallbacks = []
+    }
+    return this._node
+  }
+  private nodeCallbacks: Callback<T>[] = []
+  /**
+   * Lazy Node callback applicator
+   *
+   * @param callback - Callback that fires on existing JS DOM Node instantly if it exists
+   * or whenever it gets created. Call sequence is concerned.
+   */
+  onNode(callback: Callback<T>) {
+    if (this._node) {
+      callback(this._node)
+      return
+    }
+    this.nodeCallbacks.push(callback)
+  }
+  /**
+   * Abstract method, should be implemented by the actual classes
+   * It is supposed to apply virtual changes into the actual DOM
+   */
+  public abstract applyChanges(): void
+}
+type VChild = VElement | VText
+abstract class VParent<T extends Node = Node> extends VNode<T>{
   protected children: VChild[] = []
-  private insertedChildren: Set<VChild> = new Set()
+  private childrenToMount: Set<VChild> = new Set()
   insertChildAt(child: VChild, index: number) {
     if (child.parentNode) {
       child.parentNode.removeChild(child)
     }
     this.children.splice(index, 0, child)
-    this.insertedChildren.add(child)
+    this.childrenToMount.add(child)
     child.parentNode = this
   }
   removeChild(child: VChild) {
     this.children = this.children.filter(ch => ch !== child)
-    this.insertedChildren.delete(child)
+    this.childrenToMount.delete(child)
     child.parentNode = null
   }
-  applyChanges() {
-    const node = this.node
-    if (!node) {
-      // log err
-      console.error("No node found", this)
-      return
-    }
-    // inserting
+  protected mountChildren(shouldInsert?: (child: VChild) => boolean) {
+    let nextMounted: VChild | null = null
     for (let i = this.children.length-1; i >= 0; i--) {
       const child = this.children[i]
-      child.applyChanges()
-      if (this.insertedChildren.has(child)) {
-        const nextVSibling = this.children[i+1]
-        node.insertBefore(child.node, nextVSibling ? nextVSibling.node : null)
+      if (this.childrenToMount.has(child) &&
+        (!shouldInsert || shouldInsert(child)) // is there a better way of not-knowing about subclass logic on prioritized insertion?
+      ) {
+        this.node.insertBefore(child.node, nextMounted ? nextMounted.node : null)
+        this.childrenToMount.delete(child)
+      }
+      if (!this.childrenToMount.has(child)) {
+        nextMounted = child
       }
     }
-    this.insertedChildren.clear()
-    // removing
+  }
+  applyChanges() {
+    /* Building a sub-trees first (in-memory for non-mounted children) */
+    this.children.forEach(child => child.applyChanges())
+    /* Inserting */
+    this.mountChildren()
+    if (this.childrenToMount.size !== 0) {
+      console.error("VParent: Something went wrong with children insertion")
+    }
+    /* Removing in-between */
+    const node = this.node
     const realChildren = node.childNodes
     for(let j = 0; j < this.children.length; j++) {
       while (realChildren[j] !== this.children[j].node) {
         node.removeChild(realChildren[j])
       }
     }
-    // removing rest
+    /* Removing tail */
     while(realChildren.length > this.children.length) {
-      node.removeChild(node.lastChild)
+      node.removeChild(node.lastChild as Node) /* realChildren.length > this.children.length >= 0 so it is not null */
     }
   }
 }
-export class VDocument extends VParent {
-  constructor(public readonly node: Document) { super() }
+export class VDocument extends VParent<Document> {
+  constructor(protected readonly createNode: () => Document) { super() }
   applyChanges() {
     if (this.children.length > 1) {
-      // log err
-    }
-    if (!this.node) {
-      // iframe not mounted yet
-      return
+      console.error("VDocument expected to have a single child.", this)
     }
     const child = this.children[0]
     if (!child) { return }
@@ -75,14 +125,22 @@ export class VDocument extends VParent {
   }
 }
-export class VShadowRoot extends VParent {
-  constructor(public readonly node: ShadowRoot) { super() }
+export class VShadowRoot extends VParent<ShadowRoot> {
+  constructor(protected readonly createNode: () => ShadowRoot) { super() }
 }
-export class VElement extends VParent {
-  parentNode: VParent | null = null
+export type VRoot = VDocument | VShadowRoot
+export class VElement extends VParent<Element> {
+  parentNode: VParent | null = null /** Should be modified only by the parent itself */
   private newAttributes: Map<string, string | false> = new Map()
-  constructor(public readonly node: Element) { super() }
+  constructor(readonly tagName: string, readonly isSVG = false) { super() }
+  protected createNode() {
+    return this.isSVG
+      ? document.createElementNS('http://www.w3.org/2000/svg', this.tagName)
+      : document.createElement(this.tagName)
+  }
   setAttribute(name: string, value: string) {
     this.newAttributes.set(name, value)
   }
@@ -90,18 +148,7 @@ export class VElement extends VParent {
     this.newAttributes.set(name, false)
   }
-  // mbtodo: priority insertion instead.
-  // rn this is for styles that should be inserted as prior,
-  // otherwise it will show visual styling lag if there is a transition CSS property)
-  enforceInsertion() {
-    let vNode: VElement = this
-    while (vNode.parentNode instanceof VElement) {
-      vNode = vNode.parentNode
-    }
-    (vNode.parentNode || vNode).applyChanges()
-  }
-  applyChanges() {
+  private applyAttributeChanges() { // "changes" -> "updates" ?
     this.newAttributes.forEach((value, key) => {
       if (value === false) {
         this.node.removeAttribute(key)
@@ -114,88 +161,57 @@ export class VElement extends VParent {
       }
     })
     this.newAttributes.clear()
+  }
+  applyChanges() {
+    this.prioritized && this.applyPrioritizedChanges()
+    this.applyAttributeChanges()
     super.applyChanges()
   }
-}
-
-type StyleSheetCallback = (s: CSSStyleSheet) => void
-export type StyleElement = HTMLStyleElement | SVGStyleElement
-
-// @deprecated TODO: remove in favor of PostponedStyleSheet
-export class VStyleElement extends VElement {
-  private loaded = false
-  private stylesheetCallbacks: StyleSheetCallback[] = []
-  constructor(public readonly node: StyleElement) {
-    super(node) // Is it compiled correctly or with 2 node assignments?
-    node.onload = () => {
-      const sheet = node.sheet
-      if (sheet) {
-        this.stylesheetCallbacks.forEach(cb => cb(sheet))
-        this.stylesheetCallbacks = []
-      } else {
-        // console.warn("Style onload: sheet is null") ?
-        // sometimes logs sheet ton of errors for some reason
-      }
-      this.loaded = true
-    }
-  }
-  onStyleSheet(cb: StyleSheetCallback) {
-    if (this.loaded) {
-      if (!this.node.sheet) {
-        console.warn("Style tag is loaded, but sheet is null")
-        return
-      }
-      cb(this.node.sheet)
-    } else {
-      this.stylesheetCallbacks.push(cb)
-    }
-  }
-}
-
-export class PostponedStyleSheet {
-  private loaded = false
-  private stylesheetCallbacks: StyleSheetCallback[] = []
-
-  constructor(private readonly node: StyleElement) {
-    node.onload = () => {
-      const sheet = node.sheet
-      if (sheet) {
-        this.stylesheetCallbacks.forEach(cb => cb(sheet))
-        this.stylesheetCallbacks = []
-      } else {
-        console.warn("Style node onload: sheet is null")
-      }
-      this.loaded = true
-    }
-  }
-  private applyCallback(cb: StyleSheetCallback) {
-    if (this.loaded) {
-      if (!this.node.sheet) {
-        console.warn("Style tag is loaded, but sheet is null")
-        return
-      }
-      cb(this.node.sheet)
-    } else {
-      this.stylesheetCallbacks.push(cb)
-    }
-  }
-  insertRule(rule: string, index: number) {
-    this.applyCallback(s => insertRule(s, { rule, index }))
-  }
-  deleteRule(index: number) {
-    this.applyCallback(s => deleteRule(s, { index }))
-  }
-}
-
-export class VText {
+  /** Insertion Prioritization
+   * Made for styles that should be inserted as prior,
+   * otherwise it will show visual styling lag if there is a transition CSS property)
+   */
+  prioritized = false
+  insertChildAt(child: VChild, index: number) {
+    super.insertChildAt(child, index)
+    /* Bubble prioritization */
+    if (child instanceof VElement && child.prioritized) {
+      let parent: VParent | null = this
+      while (parent instanceof VElement && !parent.prioritized) {
+        parent.prioritized = true
+        parent = parent.parentNode
+      }
+    }
+  }
+  private applyPrioritizedChanges() {
+    this.children.forEach(child => {
+      if (child instanceof VText) {
+        child.applyChanges()
+      } else if (child.prioritized) {
+        /* Update prioritized VElement-s */
+        child.applyPrioritizedChanges()
+        child.applyAttributeChanges()
+      }
+    })
+    this.mountChildren(child => child instanceof VText || child.prioritized)
+  }
+}
+export class VHTMLElement extends VElement {
+  constructor(node: HTMLElement) {
+    super("HTML", false)
+    this.createNode = () => node
+  }
+}
+export class VText extends VNode<Text> {
   parentNode: VParent | null = null
-  constructor(public readonly node: Text = new Text()) {}
+  protected createNode() {
+    return new Text()
+  }
   private data: string = ""
   private changed: boolean = false
   setData(data: string) {
@@ -210,3 +226,112 @@ export class VText {
   }
 }
+class PromiseQueue<T> {
+  constructor(private promise: Promise<T>) {}
+  /**
+   * Call sequence is concerned.
+   */
+  // Doing this with callbacks list instead might be more efficient (but more wordy). TODO: research
+  whenReady(cb: Callback<T>) {
+    this.promise = this.promise.then(vRoot => {
+      cb(vRoot)
+      return vRoot
+    })
+  }
+  catch(cb: Parameters<Promise<T>['catch']>[0]) {
+    this.promise.catch(cb)
+  }
+}
+/**
+ * VRoot wrapper that allows to defer all the API calls till the moment
+ * when VNode CAN be created (for example, on <iframe> mount&load)
+ */
+export class OnloadVRoot extends PromiseQueue<VRoot> {
+  static fromDocumentNode(doc: Document): OnloadVRoot {
+    return new OnloadVRoot(Promise.resolve(new VDocument(() => doc)))
+  }
+  static fromVElement(vElem: VElement): OnloadVRoot {
+    return new OnloadVRoot(new Promise((resolve, reject) => {
+      vElem.onNode(host => {
+        if (host instanceof HTMLIFrameElement) {
+          /* IFrame case: creating Document */
+          const doc = host.contentDocument
+          if (doc) {
+            resolve(new VDocument(() => doc))
+          } else {
+            host.addEventListener('load', () => {
+              const doc = host.contentDocument
+              if (doc) {
+                resolve(new VDocument(() => doc))
+              } else {
+                reject("No default Document found on iframe load") // Send `host` for logging as well
+              }
+            })
+          }
+        } else {
+          /* ShadowDom case */
+          try {
+            const shadowRoot = host.attachShadow({ mode: 'open' })
+            resolve(new VShadowRoot(() => shadowRoot))
+          } catch(e) {
+            reject(e) // "Can not attach shadow dom"
+          }
+        }
+      })
+    }))
+  }
+  onNode(cb: Callback<Document | ShadowRoot>) {
+    this.whenReady(vRoot => vRoot.onNode(cb))
+  }
+  applyChanges() {
+    this.whenReady(vRoot => vRoot.applyChanges())
+  }
+  insertChildAt(...args: Parameters<VParent['insertChildAt']>) {
+    this.whenReady(vRoot => vRoot.insertChildAt(...args))
+  }
+}
+export type StyleElement = HTMLStyleElement | SVGStyleElement
+/**
+ * CSSStyleSheet wrapper that collects all the insertRule/deleteRule calls
+ * and then applies them when the sheet is ready
+ */
+export class OnloadStyleSheet extends PromiseQueue<CSSStyleSheet> {
+  static fromStyleElement(node: StyleElement) {
+    return new OnloadStyleSheet(new Promise((resolve, reject) => {
+      node.addEventListener("load", () => {
+        const sheet = node.sheet
+        if (sheet) {
+          resolve(sheet)
+        } else {
+          reject("Style node onload: sheet is null")
+        }
+      })
+    }))
+  }
+  static fromVRootContext(vRoot: OnloadVRoot) {
+    return new OnloadStyleSheet(new Promise((resolve, reject) =>
+      vRoot.onNode(node => {
+        let context: typeof globalThis | null
+        if (node instanceof Document || node.nodeName === '#document') {
+          context = node.defaultView
+        } else {
+          context = node.ownerDocument.defaultView
+        }
+        if (!context) { reject("Root node default view not found"); return }
+        /* a StyleSheet from another Window context won't work */
+        resolve(new context.CSSStyleSheet())
+      })
+    ))
+  }
+  insertRule(rule: string, index: number) {
+    this.whenReady(s => insertRule(s, { rule, index }))
+  }
+  deleteRule(index: number) {
+    this.whenReady(s => deleteRule(s, { index }))
+  }
+}
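The lazy node-creation pattern in `VNode` above (create on first `.node` access, queue `onNode` callbacks until then) can be sketched in isolation. This is a hedged, standalone illustration with invented names (`Lazy`, `onValue`), not the actual player code:

```typescript
type Callback<T> = (o: T) => void

// Minimal sketch of lazy creation with a callback queue: the expensive
// resource is built on first access, and callbacks registered before
// creation are flushed (in registration order) once it exists.
class Lazy<T> {
  private _value: T | null = null
  private callbacks: Callback<T>[] = []
  constructor(private readonly create: () => T) {}
  get value(): T {
    if (!this._value) {
      const v = (this._value = this.create())
      this.callbacks.forEach((cb) => cb(v))
      this.callbacks = []
    }
    return this._value
  }
  onValue(cb: Callback<T>) {
    if (this._value) { cb(this._value); return }
    this.callbacks.push(cb)
  }
}

// Usage: callbacks fire only when the value is first materialized.
const seen: string[] = []
const lazy = new Lazy(() => "node")
lazy.onValue((v) => seen.push(v))       // queued, nothing created yet
const v = lazy.value                    // creation happens here, queue is flushed
lazy.onValue((x) => seen.push(x + "!")) // value exists now, fires immediately
```

This mirrors why `VNode.node` carries the remark about expense: nothing touches the real DOM until some consumer actually needs the node.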


@@ -1,14 +1,15 @@
 import logger from 'App/logger';
-export type { PostponedStyleSheet } from './VirtualDOM'
-
-export function insertRule(sheet: CSSStyleSheet | PostponedStyleSheet, msg: { rule: string, index: number }) {
+export function insertRule(
+  sheet: { insertRule: (rule: string, index: number) => void },
+  msg: { rule: string, index: number }
+) {
   try {
     sheet.insertRule(msg.rule, msg.index)
   } catch (e) {
     logger.warn(e, msg)
     try {
-      sheet.insertRule(msg.rule, 0)
+      sheet.insertRule(msg.rule, 0) // TODO: index renumeration in case of subsequent rule deletion
       logger.warn("Inserting rule into 0-index", e, msg)
     } catch (e) {
       logger.warn("Cannot insert rule.", e, msg)
@@ -16,7 +17,10 @@ export function insertRule(sheet: CSSStyleSheet | PostponedStyleSheet, msg: { ru
     }
   }
 }
-export function deleteRule(sheet: CSSStyleSheet | PostponedStyleSheet, msg: { index: number }) {
+export function deleteRule(
+  sheet: { deleteRule: (index: number) => void },
+  msg: { index: number }
+) {
   try {
     sheet.deleteRule(msg.index)
   } catch (e) {
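The fallback behavior of `insertRule` above can be demonstrated without a browser. This is a sketch under stated assumptions: `RuleSink` and the fake sheet are invented stand-ins for `CSSStyleSheet`, whose real `insertRule` likewise throws on out-of-range indices:

```typescript
interface RuleSink { insertRule(rule: string, index: number): void }

// Sketch of the safeCSSRules strategy: try the requested index, and if the
// sheet rejects it, retry at index 0 so the rule is not silently dropped.
function safeInsertRule(sheet: RuleSink, msg: { rule: string; index: number }) {
  try {
    sheet.insertRule(msg.rule, msg.index)
  } catch (e) {
    try {
      sheet.insertRule(msg.rule, 0) // fallback: keep the rule, lose the position
    } catch (e2) {
      // both attempts failed; the real code logs a warning here
    }
  }
}

// Fake sheet that, like CSSStyleSheet.insertRule, throws on out-of-range indices.
const rules: string[] = []
const fakeSheet: RuleSink = {
  insertRule(rule, index) {
    if (index > rules.length) throw new RangeError("index out of bounds")
    rules.splice(index, 0, rule)
  },
}
safeInsertRule(fakeSheet, { rule: "a { color: red }", index: 0 })
safeInsertRule(fakeSheet, { rule: "b { color: blue }", index: 5 }) // falls back to 0
```

The `TODO` in the diff hints at the cost of this fallback: once a rule lands at index 0, later index-based deletions refer to shifted positions.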


@@ -1,13 +1,21 @@
-import type Screen from '../Screen/Screen';
-import type { Message } from '../messages';
-import { MType } from '../messages';
+import logger from 'App/logger';
+import type Screen from '../Screen/Screen';
+import type { Message, StringDict } from '../messages';
+import { MType } from '../messages';
 import ListWalker from '../../common/ListWalker';
 import DOMManager from './DOM/DOMManager';
 export default class PagesManager extends ListWalker<DOMManager> {
   private currentPage: DOMManager | null = null
+  /**
+   * String Dictionary in tracker may be desync with CreateDocument (why???)
+   * e.g. some StringDictionary and other messages before any 'CreateDocument' one
+   * TODO: understand why and fix
+   */
+  private currentStringDict: Record<number, string> = {}
   constructor(
     private screen: Screen,
@@ -19,11 +27,19 @@ export default class PagesManager extends ListWalker<DOMManager> {
   Assumed that messages added in a correct time sequence.
   */
   appendMessage(m: Message): void {
+    if (m.tp === MType.StringDict) {
+      if (this.currentStringDict[m.key] !== undefined) {
+        this.currentStringDict = {} /* refresh stringDict */
+        this.last?.setStringDict(this.currentStringDict)
+      }
+      this.currentStringDict[m.key] = m.value
+      return
+    }
     if (m.tp === MType.CreateDocument) {
-      super.append(new DOMManager(this.screen, this.isMobile, m.time, this.setCssLoading))
+      super.append(new DOMManager(this.screen, this.isMobile, this.currentStringDict, m.time, this.setCssLoading))
     }
     if (this.last === null) {
-      // Log wrong
+      logger.warn("DOMMessage before any document created, skipping:", m)
       return;
     }
     this.last.append(m)
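The string-dictionary handling above uses a repeated key as the signal that the tracker started a new dictionary. A hedged standalone sketch (the `DictTracker` class and `swaps` counter are invented for illustration; `setStringDict` stands in for handing the dict to the current `DOMManager`):

```typescript
type StringDictMsg = { key: number; value: string }

// Sketch: dictionary entries accumulate until a key repeats, which signals
// a fresh tracker-side dictionary; the manager then swaps in a new object
// and hands the same reference to the consumer, so later entries are visible.
class DictTracker {
  current: Record<number, string> = {}
  swaps = 0
  private setStringDict(_d: Record<number, string>) { this.swaps++ }
  append(m: StringDictMsg) {
    if (this.current[m.key] !== undefined) {
      this.current = {} // repeated key => refresh the dictionary
      this.setStringDict(this.current)
    }
    this.current[m.key] = m.value
  }
}

const t = new DictTracker()
t.append({ key: 1, value: "div" })
t.append({ key: 2, value: "span" })
t.append({ key: 1, value: "p" }) // key 1 arrives again: old dict is dropped
```

Note that passing the freshly emptied object by reference (as the diff does) means the consumer sees entries added to it afterwards without another call.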


@@ -15,9 +15,9 @@ export default class PerformanceTrackManager extends ListWalker<PerformanceTrack
   private chart: Array<PerformanceChartPoint> = [];
   private isHidden: boolean = false;
   private timeCorrection: number = 0;
-  private heapAvaliable: boolean = false;
-  private fpsAvaliable: boolean = false;
-  private cpuAvaliable: boolean = false;
+  private heapAvailable: boolean = false;
+  private fpsAvailable: boolean = false;
+  private cpuAvailable: boolean = false;
   private prevTime: number | null = null;
   private prevNodesCount: number = 0;
@@ -29,7 +29,7 @@ export default class PerformanceTrackManager extends ListWalker<PerformanceTrack
     let timePassed = msg.time - this.prevTime + this.timeCorrection;
     if (timePassed > 0 && msg.frames >= 0) {
-      if (msg.frames > 0) { this.fpsAvaliable = true; }
+      if (msg.frames > 0) { this.fpsAvailable = true; }
       fps = msg.frames*1e3/timePassed; // Multiply by 1e3 as time in ms;
       fps = Math.min(fps,60); // What if 120? TODO: alert if more than 60
       if (this.chart.length === 1) {
@@ -38,7 +38,7 @@ export default class PerformanceTrackManager extends ListWalker<PerformanceTrack
     }
     if (timePassed > 0 && msg.ticks >= 0) {
-      this.cpuAvaliable = true;
+      this.cpuAvailable = true;
       let tickRate = msg.ticks * 30 / timePassed;
       if (tickRate > 1) {
         tickRate = 1;
@@ -53,7 +53,7 @@ export default class PerformanceTrackManager extends ListWalker<PerformanceTrack
     this.prevTime = msg.time;
     this.timeCorrection = 0
-    this.heapAvaliable = this.heapAvaliable || msg.usedJSHeapSize > 0;
+    this.heapAvailable = this.heapAvailable || msg.usedJSHeapSize > 0;
     this.chart.push({
       usedHeap: msg.usedJSHeapSize,
       totalHeap: msg.totalJSHeapSize,
@@ -109,11 +109,11 @@ export default class PerformanceTrackManager extends ListWalker<PerformanceTrack
     return this.chart;
   }
-  get avaliability(): { cpu: boolean, fps: boolean, heap: boolean, nodes: boolean } {
+  get availability(): { cpu: boolean, fps: boolean, heap: boolean, nodes: boolean } {
     return {
-      cpu: this.cpuAvaliable,
-      fps: this.fpsAvaliable,
-      heap: this.heapAvaliable,
+      cpu: this.cpuAvailable,
+      fps: this.fpsAvailable,
+      heap: this.heapAvailable,
       nodes: true,
     }
   }
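The FPS arithmetic that the hunks above rename around is simple but worth isolating: frames over elapsed milliseconds, scaled to per-second and clamped at 60. A minimal sketch (the `computeFps` helper is invented; the formula and clamp come from the diff):

```typescript
// Sketch of the FPS computation in PerformanceTrackManager:
// frames divided by elapsed time, multiplied by 1e3 because time is in ms,
// then clamped to 60 (the diff's TODO notes 120Hz displays would overshoot).
function computeFps(frames: number, timePassedMs: number): number {
  const fps = (frames * 1e3) / timePassedMs
  return Math.min(fps, 60)
}

const a = computeFps(30, 1000) // 30 frames over one second
const b = computeFps(120, 500) // 240 raw, clamped to 60
```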


@@ -15,6 +15,18 @@ import { MType } from '../raw.gen'
 import { resolveURL, resolveCSS } from './urlResolve'
 import { HOVER_CLASSNAME, FOCUS_CLASSNAME } from './constants'
+/* maybetodo: filter out non-relevant prefixes in CSS-rules.
+  They might cause an error in console, but not sure if it breaks the replay.
+  (research required)
+*/
+// function replaceCSSPrefixes(css: string) {
+//   return css
+//     .replace(/\-ms\-/g, "")
+//     .replace(/\-webkit\-/g, "")
+//     .replace(/\-moz\-/g, "")
+//     .replace(/\-webkit\-/g, "")
+// }
 const HOVER_SELECTOR = `.${HOVER_CLASSNAME}`
 const FOCUS_SELECTOR = `.${FOCUS_CLASSNAME}`
 export function replaceCSSPseudoclasses(cssText: string): string {


@@ -8,15 +8,16 @@ export const NO_URLS = 'No-urls-provided'
 export async function loadFiles(
   urls: string[],
   onData: (data: Uint8Array) => void,
-): Promise<any> {
+  canSkip: boolean = false,
+): Promise<void> {
   if (!urls.length) {
     throw NO_URLS
   }
   try {
     for (let url of urls) {
       const response = await window.fetch(url)
-      const data = await processAPIStreamResponse(response, url !== url[0])
-      onData(data)
+      const data = await processAPIStreamResponse(response, urls.length > 1 ? url !== urls[0] : canSkip)
+      await onData(data)
     }
   } catch(e) {
     if (e === ALLOWED_404) {
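The fix above also corrects an old bug (`url !== url[0]` compared a URL against its own first character). The resulting skip rule can be sketched on its own; `canSkipFile` and the sample `.mob` paths are invented for illustration:

```typescript
// Sketch of the loadFiles skip rule: with several files, a 404 on any file
// after the first is tolerated (later session chunks may simply not exist),
// while the first file must load; for a single file, `canSkip` decides.
function canSkipFile(urls: string[], url: string, canSkip: boolean): boolean {
  return urls.length > 1 ? url !== urls[0] : canSkip
}

const urls = ["/sessions/1/dom.mob", "/sessions/1/dom.mobe1"]
const first = canSkipFile(urls, urls[0], false)  // first file is mandatory
const second = canSkipFile(urls, urls[1], false) // later chunks may 404
const single = canSkipFile(["/sessions/1/dom.mob"], "/sessions/1/dom.mob", true)
```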


@@ -1,7 +1,7 @@
 import { getFilterKeyTypeByKey, setQueryParamKeyFromFilterkey } from 'Types/filter/filterType';
-import Period, { LAST_24_HOURS, LAST_7_DAYS, LAST_30_DAYS, CUSTOM_RANGE } from 'Types/app/period';
+import Period, { CUSTOM_RANGE } from 'Types/app/period';
 import Filter from 'Types/filter/filter';
-import { filtersMap } from 'App/types/filter/newFilter';
+import { filtersMap } from 'Types/filter/newFilter';
 export const createUrlQuery = (filter: any) => {
   const query = [];
@@ -83,13 +83,13 @@ const getFiltersFromEntries = (entires: any) => {
     filter.operator = operator;
     if (filter.icon === "filters/metadata") {
       filter.source = filter.type;
-      filter.type = 'METADATA';
+      filter.type = 'MULTIPLE';
     } else {
       filter.source = sourceArr && sourceArr.length > 0 ? sourceArr : null;
       filter.sourceOperator = !!sourceOperator ? decodeURI(sourceOperator) : null;
     }
-    if (!filter.filters || filter.filters.size === 0) {
+    if (!filter.filters || filter.filters.size === 0) { // TODO support subfilters in url
       filters.push(filter);
     }
   });


@@ -20,7 +20,9 @@ check_prereq() {
 chart=frontend
 [[ $1 == ee ]] && ee=true
 [[ $PATCH -eq 1 ]] && {
-    image_tag="$(grep -ER ^.ppVersion ../scripts/helmcharts/openreplay/charts/$chart | xargs | awk '{print $2}' | awk -F. -v OFS=. '{$NF += 1 ; print}')"
+    __app_version="$(grep -ER ^.ppVersion ../scripts/helmcharts/openreplay/charts/${chart} | xargs | awk '{print $2}' | awk -F. -v OFS=. '{$NF += 1 ; print}' | cut -d 'v' -f2)"
+    sed -i "s/^VERSION = .*/VERSION = $__app_version/g" .env.sample
+    image_tag="v${__app_version}"
     [[ $ee == "true" ]] && {
         image_tag="${image_tag}-ee"
     }
@@ -30,8 +32,9 @@ update_helm_release() {
     # Update the chart version
     sed -i "s#^version.*#version: $HELM_TAG# g" ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
     # Update image tags
-    sed -i "s#ppVersion.*#ppVersion: \"$image_tag\"#g" ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
+    sed -i "s#ppVersion.*#ppVersion: \"v${__app_version}\"#g" ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
     # Commit the changes
+    git add .env.sample
     git add ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
     git commit -m "chore(helm): Updating $chart image release"
 }


@@ -84,4 +84,4 @@ nodeSelector: {}
 tolerations: []
 affinity: {}
-storageSize: 100G
+storageSize: 100Gi


@@ -5,10 +5,6 @@ OR_DIR="/var/lib/openreplay"
 APP_NS="${APP_NS:-app}"
 DB_NS="${DB_NS:-db}"
 OR_REPO="https://github.com/openreplay/openreplay"
-# To run kubeconfig run
-# `KUBECONFIG=/path/to/file openreplay -s`
-export KUBECONFIG=${KUBECONFIG:-"/etc/rancher/k3s/k3s.yaml"}
-tmp_dir=$(mktemp -d)
 # For example HELM_OPTIONS="--set dbMigrationUpstreamBranch=dev"
 #HELM_OPTIONS=""
 # If you want to install the dev version. It can be any branch or tag.
@@ -79,6 +75,14 @@ function log () {
     exit 100
 }
+# To run kubeconfig run
+# `KUBECONFIG=/path/to/file openreplay -s`
+[[ -f /etc/rancher/k3s/k3s.yaml ]] && k3s_path="/etc/rancher/k3s/k3s.yaml"
+[[ -f "${HOME}/.kube/config" ]] && local_kube_config_path="${HOME}/.kube/config"
+export KUBECONFIG=${KUBECONFIG:-$k3s_path:$local_kube_config_path}
+[[ -z $KUBECONFIG ]] && log err "No kubeconfig file found. Exiting"
+tmp_dir=$(mktemp -d)
 function install_packages() {
     [[ -e "$OR_DIR/eget" ]] || {
@@ -117,11 +121,11 @@ cat <<"EOF"
 EOF
 echo -e ${NC}
-log info '
+log info "
 Usage: openreplay [ -h | --help ]
                   [ -s | --status ]
                   [ -i | --install DOMAIN_NAME ]
-                  [ -u | --upgrade ]
+                  [ -u | --upgrade (fetch lastest patches for installed release. ${BWHITE}RELEASE_UPGRADE=1 openreplay -u${GREEN} to upgrade release.)]
                   [ -U | --deprecated-upgrade /path/to/old_vars.yaml]
                   [ -r | --restart ]
                   [ -R | --Reload ]
@@ -133,7 +137,7 @@ log info '
                   db ender frontend heuristics
                   http integrations nginx-controller
                   peers sink sourcemapreader storage
-'
+"
 return
 }
@@ -186,6 +190,12 @@ function or_helm_upgrade() {
 function upgrade_old() {
     old_vars_path="$1"
+    [[ -f $old_vars_path ]] || log err "No configuration file ${BWHITE}$old_vars_path${RED}.
+    If you're updating from version older than ${BWHITE}v1.10.0${RED}, for example ${BWHITE}v1.9.0${RED}:
+    ${BWHITE}openreplay --deprecated-upgrade ~/openreplay_v1.9.0/scripts/helmcharts/vars.yaml${RED}.
+    If you're having a custom installation,
+    ${BWHITE}openreplay --deprecated-upgrade /path/to/vars.yaml${RED}.
+    "
     or_version=$(busybox awk '/fromVersion/{print $2}' < "${old_vars_path}")
     sudo cp "${old_vars_path}" ${OR_DIR}/vars.yaml.backup."${or_version//\"}"_"$(date +%Y%m%d-%H%M%S)" || log err "Not able to copy old vars.yaml"
     sudo cp "${old_vars_path}" ${OR_DIR}/vars.yaml || log err "Not able to copy old vars.yaml"
@@ -266,14 +276,24 @@ function upgrade() {
     # 3. How to update package. Because openreplay -u will be done from old update script
     # 4. Update from Version
     exists git || log err "Git not found. Please install"
+    [[ -f ${OR_DIR}/vars.yaml ]] || log err "No configuration file ${BWHITE}${OR_DIR}/vars.yaml${RED}.
+    If you're updating from version older than ${BWHITE}v1.10.0${RED}, for example ${BWHITE}v1.9.0${RED}:
+    ${BWHITE}openreplay --deprecated-upgrade ~/openreplay_v1.9.0/scripts/helmcharts/vars.yaml${RED}.
+    If you're having a custom installation,
+    ${BWHITE}openreplay --deprecated-upgrade /path/to/vars.yaml${RED}.
+    "
     or_version=$(busybox awk '/fromVersion/{print $2}' < "${OR_DIR}/vars.yaml") || {
         log err "${BWHITE}${OR_DIR}/vars.yaml${RED} not found.
         Please do ${BWHITE}openreplay --deprecated-upgrade /path/to/vars.yaml${RED}
         "
     }
+    # Unless its upgrade release, always checkout same tag.
+    [[ $RELEASE_UPGRADE -eq 1 ]] || OR_VERSION=$or_version
+    time_now=$(date +%m-%d-%Y-%I%M%S)
     # Creating backup dir of current installation
-    [[ -d "$OR_DIR/openreplay" ]] && sudo cp -rfb "$OR_DIR/openreplay" "$OR_DIR/openreplay_${or_version//\"}" && sudo rm -rf ${OR_DIR}/openreplay
+    [[ -d "$OR_DIR/openreplay" ]] && sudo mv "$OR_DIR/openreplay" "$OR_DIR/openreplay_${or_version//\"}_${time_now}"
     clone_repo
     err_cd openreplay/scripts/helmcharts
@@ -345,7 +365,11 @@ do
         exit 0
         ;;
     -u | --upgrade)
-        log title "Upgrading OpenReplay"
+        if [[ $RELEASE_UPGRADE -eq 1 ]]; then
+          log title "Upgrading OpenReplay to Latest Release"
+        else
+          log title "Applying Latest OpenReplay Patches"
+        fi
         upgrade
         clean_tmp_dir
         exit 0
@@ -391,7 +415,7 @@ do
         clean_tmp_dir
         exit 100
     }
-    if /var/lib/openreplay/busybox md5sum -c "${tmp_dir}/var.yaml.md5"; then
+    if /var/lib/openreplay/busybox md5sum -c "${tmp_dir}/var.yaml.md5" &> /dev/null; then
        log info "No change detected in ${BWHITE}${OR_DIR}/vars.yaml${GREEN}. Not reloading"
     else
         reload

@@ -48,7 +48,7 @@ spec:
           - name: pg_host
             value: '{{ .Values.global.postgresql.postgresqlHost }}'
           - name: pg_port
-            value: "5432"
+            value: '{{ .Values.global.postgresql.postgresqlPort }}'
          - name: pg_dbname
            value: "{{ .Values.global.postgresql.postgresqlDatabase }}"
          - name: ch_host


@@ -43,6 +43,14 @@ spec:
           {{- .Values.healthCheck | toYaml | nindent 10}}
           {{- end}}
           env:
+            {{- range $key, $val := .Values.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end}}
+            {{- range $key, $val := .Values.global.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end }}
            - name: AWS_ACCESS_KEY_ID
            {{- if .Values.global.s3.existingSecret }}
              valueFrom:
@@ -94,14 +102,6 @@ spec:
              value: '{{ .Values.global.s3.endpoint }}/{{.Values.global.s3.assetsBucket}}'
            {{- end }}
            {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
-            {{- range $key, $val := .Values.global.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end }}
-            {{- range $key, $val := .Values.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end}}
           ports:
            {{- range $key, $val := .Values.service.ports }}
            - name: {{ $key }}

@@ -15,10 +15,10 @@ type: application
 # This is the chart version. This version number should be incremented each time you make changes
 # to the chart and its templates, including the app version.
 # Versions are expected to follow Semantic Versioning (https://semver.org/)
-version: 0.1.7
+version: 0.1.12
 # This is the version number of the application being deployed. This version number should be
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.11.7"
+AppVersion: "v1.11.12"


@@ -43,10 +43,9 @@ spec:
           {{- .Values.healthCheck | toYaml | nindent 10}}
           {{- end}}
           env:
+            {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
            - name: KAFKA_SERVERS
              value: "{{ .Values.global.kafka.kafkaHost }}"
-            - name: REDIS_STRING
-              value: "{{ .Values.global.redis.redisHost }}"
            - name: ch_username
              value: "{{ .Values.global.clickhouse.username }}"
            - name: ch_password
@@ -114,6 +113,8 @@ spec:
              value: '{{ .Values.global.s3.region }}'
            - name: sessions_region
              value: '{{ .Values.global.s3.region }}'
+            - name: ASSIST_RECORDS_BUCKET
+              value: {{ .Values.global.s3.assistRecordsBucket }}
            - name: sessions_bucket
              value: {{ .Values.global.s3.recordingsBucket }}
            - name: sourcemaps_bucket


@@ -44,7 +44,7 @@ spec:
           {{- end}}
           env:
             - name: CH_USERNAME
-              value: '{{ .Values.global.clickhouse.userame }}'
+              value: '{{ .Values.global.clickhouse.username }}'
             - name: CH_PASSWORD
               value: '{{ .Values.global.clickhouse.password }}'
             - name: CLICKHOUSE_STRING


@@ -15,10 +15,10 @@ type: application
 # This is the chart version. This version number should be incremented each time you make changes
 # to the chart and its templates, including the app version.
 # Versions are expected to follow Semantic Versioning (https://semver.org/)
-version: 0.1.1
+version: 0.1.2
 # This is the version number of the application being deployed. This version number should be
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.11.1"
+AppVersion: "v1.11.2"


@@ -15,10 +15,10 @@ type: application
 # This is the chart version. This version number should be incremented each time you make changes
 # to the chart and its templates, including the app version.
-# Versions are expected to follow Semantic Versioning (frontends://semver.org/)
-version: 0.1.7
+# Versions are expected to follow Semantic Versioning (https://semver.org/)
+version: 0.1.14
 # This is the version number of the application being deployed. This version number should be
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.11.6"
+AppVersion: "v1.11.13"


@@ -43,6 +43,14 @@ spec:
           {{- .Values.healthCheck | toYaml | nindent 10}}
           {{- end}}
           env:
+            {{- range $key, $val := .Values.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end}}
+            {{- range $key, $val := .Values.global.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end }}
             - name: AWS_ACCESS_KEY_ID
             {{- if .Values.global.s3.existingSecret }}
               valueFrom:
@@ -101,14 +109,6 @@ spec:
               value: '{{ .Values.global.s3.endpoint }}/{{.Values.global.s3.assetsBucket}}'
             {{- end }}
             {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
-            {{- range $key, $val := .Values.global.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end }}
-            {{- range $key, $val := .Values.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end}}
           ports:
             {{- range $key, $val := .Values.service.ports }}
             - name: {{ $key }}
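The hunk above moves the per-service `.Values.env` and `.Values.global.env` loops from the bottom of the container env list to the top, ahead of the chart's hard-coded variables. A rough Python sketch of the resulting ordering, assuming hypothetical dict inputs (Helm's `range` over a map iterates in sorted key order, which the `sorted()` calls mirror; `fixed_env` stands in for the chart's hard-coded entries):

```python
def render_env(service_env, global_env, fixed_env):
    """Sketch of the env-list ordering after this change: per-service
    .Values.env first, then .Values.global.env, then the chart's
    hard-coded variables (previously the loops came last)."""
    entries = []
    for key, val in sorted(service_env.items()):
        entries.append({"name": key, "value": str(val)})
    for key, val in sorted(global_env.items()):
        entries.append({"name": key, "value": str(val)})
    entries.extend(fixed_env)
    return entries


if __name__ == "__main__":
    # Illustrative values only; not taken from the chart.
    env = render_env(
        {"DEBUG": "1"},
        {"LICENSE_KEY": "abc"},
        [{"name": "KAFKA_SERVERS", "value": "kafka:9092"}],
    )
    for entry in env:
        print(entry["name"], "=", entry["value"])
```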


@@ -43,6 +43,14 @@ spec:
           {{- .Values.healthCheck | toYaml | nindent 10}}
           {{- end}}
           env:
+            {{- range $key, $val := .Values.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end}}
+            {{- range $key, $val := .Values.global.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end }}
             - name: LICENSE_KEY
               value: '{{ .Values.global.enterpriseEditionLicense }}'
             - name: KAFKA_SERVERS
@@ -70,14 +78,6 @@ spec:
               value: '{{ .Values.global.s3.endpoint }}/{{.Values.global.s3.assetsBucket}}'
             {{- end }}
             {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
-            {{- range $key, $val := .Values.global.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end }}
-            {{- range $key, $val := .Values.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end}}
           ports:
             {{- range $key, $val := .Values.service.ports }}
             - name: {{ $key }}


@@ -43,6 +43,14 @@ spec:
           {{- .Values.healthCheck | toYaml | nindent 10}}
           {{- end}}
           env:
+            {{- range $key, $val := .Values.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end}}
+            {{- range $key, $val := .Values.global.env }}
+            - name: {{ $key }}
+              value: '{{ $val }}'
+            {{- end }}
             - name: AWS_ACCESS_KEY_ID
               value: {{ .Values.global.s3.accessKey }}
             - name: AWS_SECRET_ACCESS_KEY
@@ -51,8 +59,7 @@ spec:
               value: '{{ .Values.global.s3.region }}'
             - name: LICENSE_KEY
               value: '{{ .Values.global.enterpriseEditionLicense }}'
-            - name: REDIS_STRING
-              value: '{{ .Values.global.redis.redisHost }}:{{ .Values.global.redis.redisPort }}'
+            {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
             - name: KAFKA_SERVERS
               value: '{{ .Values.global.kafka.kafkaHost }}:{{ .Values.global.kafka.kafkaPort }}'
             - name: KAFKA_USE_SSL
@@ -79,14 +86,6 @@ spec:
               # S3 compatible storage
               value: '{{ .Values.global.s3.endpoint }}/{{.Values.global.s3.assetsBucket}}'
             {{- end }}
-            {{- range $key, $val := .Values.global.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end }}
-            {{- range $key, $val := .Values.env }}
-            - name: {{ $key }}
-              value: '{{ $val }}'
-            {{- end}}
           ports:
             {{- range $key, $val := .Values.service.ports }}
             - name: {{ $key }}


@@ -15,10 +15,10 @@ type: application
 # This is the chart version. This version number should be incremented each time you make changes
 # to the chart and its templates, including the app version.
 # Versions are expected to follow Semantic Versioning (https://semver.org/)
-version: 0.1.1
+version: 0.1.2
 # This is the version number of the application being deployed. This version number should be
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.11.1"
+AppVersion: "v1.11.2"


@@ -65,6 +65,7 @@ Create the name of the service account to use
 Create the environment configuration for REDIS_STRING
 */}}
 {{- define "openreplay.env.redis_string" -}}
+{{- if .enabled }}
 {{- $scheme := (eq (.tls | default dict).enabled true) | ternary "rediss" "redis" -}}
 {{- $auth := "" -}}
 {{- if or .existingSecret .redisPassword -}}
@@ -83,6 +84,7 @@ Create the environment configuration for REDIS_STRING
 - name: REDIS_STRING
   value: '{{ $scheme }}://{{ $auth }}{{ .redisHost }}:{{ .redisPort }}'
 {{- end }}
+{{- end }}
 {{/*
 Create the volume mount config for redis TLS certificates
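The new `{{- if .enabled }}` guard makes the helper emit no `REDIS_STRING` entry at all when the Redis block is disabled; otherwise it composes `scheme://auth@host:port`, with the scheme flipping to `rediss` under TLS. A minimal Python sketch of that composition (the `$auth` construction is partly elided in the hunk, so the `:password@` form below is an assumption, not the template's confirmed output):

```python
def redis_string(enabled, host, port, password=None, tls=False):
    """Sketch of the openreplay.env.redis_string helper: returns the
    REDIS_STRING value, or None when the redis block is disabled
    (mirroring the new {{- if .enabled }} guard)."""
    if not enabled:
        return None
    scheme = "rediss" if tls else "redis"  # ternary on .tls.enabled
    auth = f":{password}@" if password else ""  # assumed auth format
    return f"{scheme}://{auth}{host}:{port}"
```

For example, with the default values from this chart, `redis_string(True, "redis-master.db.svc.cluster.local", "6379")` yields a plain `redis://` URL with no auth fragment.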


@@ -50,7 +50,7 @@ kafka: &kafka
 #     value: "3000000"
 redis: &redis
-  # enabled: false
+  enabled: true
   redisHost: "redis-master.db.svc.cluster.local"
   redisPort: "6379"
@@ -117,6 +117,7 @@ global:
     assetsBucket: "sessions-assets"
     recordingsBucket: "mobs"
     sourcemapsBucket: "sourcemaps"
+    assistRecordsBucket: "records"
     vaultBucket: "vault-data"
     # This is only for enterpriseEdition
     quickwitBucket: "quickwit"
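With `redis.enabled` now honored by the `openreplay.env.redis_string` helper, a values override could disable the bundled Redis and inject an external connection string through `global.env`, which the deployment templates render verbatim into each container's env list. This is a hypothetical sketch, not a documented recipe; the host, port, and password are illustrative:

```yaml
# Hypothetical override file (e.g. passed with `helm -f overrides.yaml`):
# disabling the redis block suppresses the helper's REDIS_STRING entry,
# and global.env supplies one pointing at an external instance instead.
global:
  redis:
    enabled: false
  env:
    REDIS_STRING: "rediss://:s3cret@redis.example.com:6380"
```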

(image diff suppressed: SVG changed, 570 KiB before, 567 KiB after)

static/replay-thumbnail.svg (new file, 148 lines, 99 KiB; diff suppressed)