Compare commits


42 commits

Author SHA1 Message Date
Shekar Siri
6edf729bd9 change(ui): sentry dep update 2024-10-15 16:25:15 +02:00
Mehdi Osman
b43a35e458
Increment frontend chart version (#2646)
Co-authored-by: GitHub Action <action@github.com>
2024-10-10 14:28:25 +02:00
Delirium
28a9b53d05
port tracker-14 fixes to latest (#2645) 2024-10-10 14:21:56 +02:00
Mehdi Osman
111e9c6474
Increment chalice chart version (#2642)
Co-authored-by: GitHub Action <action@github.com>
2024-10-08 15:54:17 +02:00
Kraiem Taha Yassine
f8d8cc5150
fix(chalice): use existing user attributes for SSO if they are missing in the list of claims (#2641) 2024-10-08 15:31:14 +02:00
Mehdi Osman
aa25b0e882
Increment frontend chart version (#2639)
Co-authored-by: GitHub Action <action@github.com>
2024-10-07 16:58:34 +02:00
Delirium
b53b14ae5f
rm console line (#2637) 2024-10-07 16:45:17 +02:00
Delirium
e3f6a8fadc
ui: fix audioplayer time comp (#2636) 2024-10-07 16:43:00 +02:00
Chris Weaver
e95611c1a6
fix #2360 Check ping or Wget to confirm Github is up in job.yaml (#2631) 2024-10-03 16:39:57 +02:00
Mehdi Osman
46aebe9a8c
Updated patch build from main e9a9d2ff2a (#2619)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-27 15:10:07 +02:00
Kraiem Taha Yassine
e9a9d2ff2a
Patch/api v1.20.0 (#2618)
* chore(actions): show patch diff

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(chalice): fixed session's search ignore injected durations

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-27 14:58:36 +02:00
rjshrjndrn
1f7d587796 chore(actions): show patch diff
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-27 10:49:31 +02:00
Mehdi Osman
7c20b608c5
Increment frontend chart version (#2615) 2024-09-26 14:39:11 -04:00
Mehdi Osman
88a82acb8b
Update .env.sample 2024-09-26 12:37:28 -04:00
rjshrjndrn
36c9b5e234 chore(actions): git clone should be from the specific tag for submodule
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-26 10:20:59 +02:00
Mehdi Osman
4cfdee28c3
Updated patch build from main 62ef3ca2dd (#2611)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-25 17:30:24 +02:00
Kraiem Taha Yassine
62ef3ca2dd
Patch/api v1.20.0 (#2610)
* fix(chalice): remove null referrer from table of referrers

* fix(chalice): fixed add MSTeams integration with wrong URL

* fix(chalice): session's search ignore injected durations
2024-09-25 17:25:18 +02:00
Mehdi Osman
9d0f3b34ae
Increment frontend chart version (#2609)
Co-authored-by: GitHub Action <action@github.com>
2024-09-25 16:16:20 +02:00
Delirium
93c605a28e
UI path evs cons (#2608)
* ui: support payload for events search

* ui: assist console size and init fixes
2024-09-25 16:11:03 +02:00
rjshrjndrn
872263624d chore(build): Support for multi arch
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-24 16:47:54 +02:00
Mehdi Osman
1dee5853a5
Increment frontend chart version (#2607)
Co-authored-by: GitHub Action <action@github.com>
2024-09-24 12:16:21 +02:00
Delirium
5cf584e8e1
UI patch 1337 (#2606)
* ui: debugging audio

* ui: debugging audio pt2

* ui: remove select-none from console rows

* ui: fix audioplayer file length calculation and checks
2024-09-24 12:12:50 +02:00
rjshrjndrn
cfc1f807ec chore(cli): proper cleanup
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-20 19:03:52 +02:00
Mehdi Osman
de19f0397d
Increment frontend chart version (#2599)
Co-authored-by: GitHub Action <action@github.com>
2024-09-20 17:12:18 +02:00
Delirium
a11c683baf
fix ui: prevent audioplayer from looping after playing once unless scrolled backwards (#2598) 2024-09-20 16:47:47 +02:00
rjshrjndrn
f5949cc08e chore(helm): check github availability before clone
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-20 15:30:59 +02:00
Sudheer Salavadi
d7cb49d490
New Git hero 2024-09-19 19:20:05 +05:30
Mehdi Osman
6e5d92ed79
Increment chalice chart version (#2596)
Co-authored-by: GitHub Action <action@github.com>
2024-09-17 20:06:18 +02:00
Kraiem Taha Yassine
018bf9c0be
fix(chalice): fixed spot refresh logic for EE (#2595) 2024-09-17 20:03:44 +02:00
Mehdi Osman
c56a2c2d25
Increment chalice chart version (#2594)
Co-authored-by: GitHub Action <action@github.com>
2024-09-17 12:46:22 +02:00
Kraiem Taha Yassine
5d786bde56
fix(chalice): fixed issues-tracking error handler (#2593) 2024-09-17 12:42:34 +02:00
Mehdi Osman
c7e6f31941
Updated patch build from main ad0ef00842 (#2591)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-16 16:36:39 +02:00
Kraiem Taha Yassine
ad0ef00842
fix(alerts): fixed missing dependency for EE (#2590)
fix(crons): fixed missing dependency for EE
2024-09-16 16:24:23 +02:00
Kraiem Taha Yassine
2ffec26d02
fix(chalice): fixed wrong default logging level (#2589) 2024-09-16 16:11:12 +02:00
Mehdi Osman
b63962b51a
Increment frontend chart version (#2588)
Co-authored-by: GitHub Action <action@github.com>
2024-09-16 16:05:36 +02:00
Delirium
abe440f729
fix ui: revert spots check (#2587) 2024-09-16 15:59:25 +02:00
Mehdi Osman
71e7552899
Updated patch build from main 7906384fe7 (#2586)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-16 14:10:07 +02:00
Kraiem Taha Yassine
7906384fe7
Patch/api v1.20.0 (#2585)
* fix(chalice): fixed top fetchUrl values for EE-exp
* fix(alerts): fixed missing logger
* fix(chalice): JIRA integration support expired credentials
2024-09-16 13:45:51 +02:00
Mehdi Osman
bdd564f49c
Increment spot chart version (#2579)
Co-authored-by: GitHub Action <action@github.com>
2024-09-14 12:24:00 +05:30
Mehdi Osman
b89248067a
Increment frontend chart version (#2578)
Co-authored-by: GitHub Action <action@github.com>
2024-09-13 12:16:54 -04:00
Delirium
9ed207abb1
Dev (#2577)
* ui: use enum state for spot ready checker

* ui: force worker for hls

* ui: fix spot list header behavior, change spot login flow?

* ui: bump spot v

* ui: spot signup fixes
2024-09-13 18:13:15 +02:00
Mehdi Osman
cbe2d62def
Increment frontend chart version (#2576)
Co-authored-by: GitHub Action <action@github.com>
2024-09-13 12:03:40 -04:00
53 changed files with 1279 additions and 821 deletions


@@ -83,8 +83,12 @@ jobs:
 [ -d $MSAAS_REPO_FOLDER ] || {
 git clone -b dev --recursive https://x-access-token:$MSAAS_REPO_CLONE_TOKEN@$MSAAS_REPO_URL $MSAAS_REPO_FOLDER
 cd $MSAAS_REPO_FOLDER
+cd openreplay && git fetch origin && git checkout main # This have to be changed to specific tag
+git log -1
+cd $MSAAS_REPO_FOLDER
 bash git-init.sh
 git checkout
+git --git-dir=./openreplay/.git status
 }
 }
 function build_managed() {
@@ -97,7 +101,7 @@ jobs:
 else
 cd $MSAAS_REPO_FOLDER/openreplay/$service
 fi
-IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash build.sh >> /tmp/arm.txt
+IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash -x build.sh >> /tmp/arm.txt
 }
 # Checking for backend images
 ls backend/cmd >> /tmp/backend.txt


@@ -12,6 +12,8 @@ from chalicelib.core.collaboration_slack import Slack
 from chalicelib.utils import pg_client, helper, email_helper, smtp
 from chalicelib.utils.TimeUTC import TimeUTC
 
+logger = logging.getLogger(__name__)
+
 def get(id):
     with pg_client.PostgresClient() as cur:


@@ -26,17 +26,23 @@ class MSTeams(BaseCollaboration):
     @classmethod
     def say_hello(cls, url):
-        r = requests.post(
-            url=url,
-            json={
-                "@type": "MessageCard",
-                "@context": "https://schema.org/extensions",
-                "summary": "Welcome to OpenReplay",
-                "title": "Welcome to OpenReplay"
-            })
-        if r.status_code != 200:
-            logger.warning("MSTeams integration failed")
-            logger.warning(r.text)
+        try:
+            r = requests.post(
+                url=url,
+                json={
+                    "@type": "MessageCard",
+                    "@context": "https://schema.org/extensions",
+                    "summary": "Welcome to OpenReplay",
+                    "title": "Welcome to OpenReplay"
+                },
+                timeout=3)
+            if r.status_code != 200:
+                logger.warning("MSTeams integration failed")
+                logger.warning(r.text)
+                return False
+        except Exception as e:
+            logger.warning("!!! MSTeams integration failed")
+            logger.exception(e)
             return False
         return True
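A minimal usage sketch of the hardened say_hello above (illustrative only; the import path and webhook URL are assumptions, not taken from this diff):

# Illustrative sketch, not part of the change: with the try/except and timeout added
# above, a wrong or unreachable webhook URL makes say_hello() return False instead of
# raising, so the add-integration endpoint can report a clean error to the user.
from chalicelib.core.collaboration_msteams import MSTeams  # assumed module path

if not MSTeams.say_hello("https://example.webhook.office.com/webhookb2/wrong-id"):
    print("MSTeams webhook could not be verified")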


@@ -41,6 +41,7 @@ class JIRAIntegration(integration_base.BaseIntegration):
         except Exception as e:
             self._issue_handler = None
             self.integration["valid"] = False
+            return {"errors": ["Something went wrong, please check your JIRA credentials."]}
         return self._issue_handler
 
     # TODO: remove this once jira-oauth is done
# TODO: remove this once jira-oauth is done # TODO: remove this once jira-oauth is done


@@ -336,10 +336,13 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
                 if v not in extra_conditions[e.operator].value:
                     extra_conditions[e.operator].value.append(v)
         extra_conditions = list(extra_conditions.values())
     elif metric_of == schemas.MetricOfTable.ISSUES and len(metric_value) > 0:
         data.filters.append(schemas.SessionSearchFilterSchema(value=metric_value, type=schemas.FilterType.ISSUE,
                                                               operator=schemas.SearchEventOperator.IS))
+    elif metric_of == schemas.MetricOfTable.REFERRER:
+        data.filters.append(schemas.SessionSearchFilterSchema(value=metric_value, type=schemas.FilterType.REFERRER,
+                                                              operator=schemas.SearchEventOperator.IS_ANY))
     full_args, query_part = search_query_parts(data=data, error_status=None, errors_only=False,
                                                favorite_only=False, issue=None, project_id=project_id,
                                                user_id=None, extra_event=extra_event, extra_conditions=extra_conditions)


@@ -5,7 +5,7 @@ from decouple import config
 from . import smtp
 
 logger = logging.getLogger(__name__)
-logging.basicConfig(level=config("LOGLEVEL", default=logging.info))
+logging.basicConfig(level=config("LOGLEVEL", default=logging.INFO))
 
 if smtp.has_smtp():
     logger.info("valid SMTP configuration found")


@@ -394,8 +394,11 @@ def get_all_issue_tracking_projects(context: schemas.CurrentContext = Depends(OR
                                                          user_id=context.user_id)
     if error is not None:
         return error
-    data = integration.issue_handler.get_projects()
-    if "errors" in data:
+    data = integration.issue_handler
+    if isinstance(data, dict) and "errors" in data:
+        return data
+    data = data.get_projects()
+    if isinstance(data, dict) and "errors" in data:
         return data
     return {"data": data}
@@ -406,8 +409,11 @@ def get_integration_metadata(integrationProjectId: int, context: schemas.Current
                                                          user_id=context.user_id)
     if error is not None:
         return error
-    data = integration.issue_handler.get_metas(integrationProjectId)
-    if "errors" in data.keys():
+    data = integration
+    if isinstance(data, dict) and "errors" in data:
+        return data
+    data = data.issue_handler.get_metas(integrationProjectId)
+    if isinstance(data, dict) and "errors" in data:
         return data
     return {"data": data}


@@ -777,7 +777,8 @@ class SessionsSearchPayloadSchema(_TimedSchema, _PaginatedSchema):
         for f in values.get("filters", []):
             vals = []
             for v in f.get("value", []):
-                if v is not None:
+                if v is not None and (f.get("type", "") != FilterType.DURATION.value
+                                      or str(v).isnumeric()):
                     vals.append(v)
             f["value"] = vals
         return values
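A worked example of the stricter filter validation above (illustrative only; assumes FilterType.DURATION.value == "duration" and uses made-up values):

# Illustrative sketch of the validator's effect on a single filter entry.
f = {"type": "duration", "value": [None, "", "1500"]}
vals = [v for v in f.get("value", [])
        if v is not None and (f.get("type", "") != "duration" or str(v).isnumeric())]
# vals == ["1500"]: empty and non-numeric durations no longer reach the SQL condition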

ee/api/.gitignore

@@ -273,7 +273,6 @@ Pipfile.lock
 /chalicelib/core/usability_testing/
 /NOTES.md
 /chalicelib/core/db_request_handler.py
-/routers/subs/spot.py
 /chalicelib/utils/or_cache/
 /routers/subs/health.py
 /chalicelib/core/spot.py


@@ -1,3 +1,4 @@
+import chalicelib.utils.exp_ch_helper
 import schemas
 from chalicelib.core import countries, events, metadata
 from chalicelib.utils import ch_client
@@ -325,12 +326,13 @@ def get_top_values(project_id, event_type, event_key=None):
                     FROM raw;"""
     else:
         colname = TYPE_TO_COLUMN.get(event_type)
+        event_type = exp_ch_helper.get_event_type(event_type)
         query = f"""WITH raw AS (SELECT DISTINCT {colname} AS c_value,
                     COUNT(1) OVER (PARTITION BY c_value) AS row_count,
                     COUNT(1) OVER () AS total_count
                     FROM experimental.events
                     WHERE project_id = %(project_id)s
-                    AND event_type = '{event_type.upper()}'
+                    AND event_type = '{event_type}'
                     AND isNotNull(c_value)
                     AND notEmpty(c_value)
                     ORDER BY row_count DESC


@@ -450,6 +450,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
     elif metric_of == schemas.MetricOfTable.REFERRER:
         main_col = "referrer"
         extra_col = ", referrer"
+        extra_where = "WHERE isNotNull(referrer)"
     elif metric_of == schemas.MetricOfTable.FETCH:
         main_col = "url_path"
         extra_col = ", s.url_path"
@@ -554,39 +555,6 @@ def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema2):
                  event.filters is None or len(event.filters) == 0))
 
 
-def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEventType], platform="web"):
-    defs = {
-        schemas.EventType.CLICK: "CLICK",
-        schemas.EventType.INPUT: "INPUT",
-        schemas.EventType.LOCATION: "LOCATION",
-        schemas.PerformanceEventType.LOCATION_DOM_COMPLETE: "LOCATION",
-        schemas.PerformanceEventType.LOCATION_LARGEST_CONTENTFUL_PAINT_TIME: "LOCATION",
-        schemas.PerformanceEventType.LOCATION_TTFB: "LOCATION",
-        schemas.EventType.CUSTOM: "CUSTOM",
-        schemas.EventType.REQUEST: "REQUEST",
-        schemas.EventType.REQUEST_DETAILS: "REQUEST",
-        schemas.PerformanceEventType.FETCH_FAILED: "REQUEST",
-        schemas.EventType.STATE_ACTION: "STATEACTION",
-        schemas.EventType.ERROR: "ERROR",
-        schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD: 'PERFORMANCE',
-        schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE: 'PERFORMANCE'
-    }
-    defs_mobile = {
-        schemas.EventType.CLICK_MOBILE: "TAP",
-        schemas.EventType.INPUT_MOBILE: "INPUT",
-        schemas.EventType.CUSTOM_MOBILE: "CUSTOM",
-        schemas.EventType.REQUEST_MOBILE: "REQUEST",
-        schemas.EventType.ERROR_MOBILE: "CRASH",
-        schemas.EventType.VIEW_MOBILE: "VIEW",
-        schemas.EventType.SWIPE_MOBILE: "SWIPE"
-    }
-    if platform != "web" and event_type in defs_mobile:
-        return defs_mobile.get(event_type)
-    if event_type not in defs:
-        raise Exception(f"unsupported EventType:{event_type}")
-    return defs.get(event_type)
-
-
 # this function generates the query and return the generated-query with the dict of query arguments
 def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_status, errors_only, favorite_only, issue,
                           project_id, user_id, platform="web", extra_event=None, extra_deduplication=[],
@@ -925,7 +893,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 if platform == "web":
 _column = events.EventType.CLICK.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(
+    f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if schemas.ClickEventExtraOperator.has_value(event.operator):
@@ -937,7 +906,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{
+    "type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -945,14 +915,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 else:
 _column = events.EventType.CLICK_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(
+    f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{
+    "type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -963,14 +935,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 if platform == "web":
 _column = events.EventType.INPUT.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(
+    f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{
+    "type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -982,14 +956,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 full_args = {**full_args, **_multiple_values(event.source, value_key=f"custom{i}")}
 else:
 _column = events.EventType.INPUT_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(
+    f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{
+    "type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -1000,14 +976,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 if platform == "web":
 _column = 'url_path'
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(
+    f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{
+    "type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1015,14 +993,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 else:
 _column = events.EventType.VIEW_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(
+    f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{
+    "type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1031,14 +1011,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type == events.EventType.CUSTOM.ui_type:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 _column = events.EventType.CUSTOM.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -1047,14 +1027,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type == events.EventType.REQUEST.ui_type:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 _column = 'url_path'
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -1072,14 +1052,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type == events.EventType.STATEACTION.ui_type:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 _column = events.EventType.STATEACTION.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1089,7 +1069,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type == events.EventType.ERROR.ui_type:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main"
 events_extra_join = f"SELECT * FROM {MAIN_EVENTS_TABLE} AS main1 WHERE main1.project_id=%(project_id)s"
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 event.source = tuple(event.source)
 events_conditions[-1]["condition"] = []
@@ -1109,14 +1089,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 # ----- Mobile
 elif event_type == events.EventType.CLICK_MOBILE.ui_type:
 _column = events.EventType.CLICK_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -1124,14 +1104,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 elif event_type == events.EventType.INPUT_MOBILE.ui_type:
 _column = events.EventType.INPUT_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -1139,14 +1119,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 elif event_type == events.EventType.VIEW_MOBILE.ui_type:
 _column = events.EventType.VIEW_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1154,14 +1134,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
 _column = events.EventType.CUSTOM_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1170,14 +1150,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type == events.EventType.REQUEST_MOBILE.ui_type:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 _column = 'url_path'
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -1185,14 +1165,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 elif event_type == events.EventType.CRASH_MOBILE.ui_type:
 _column = events.EventType.CRASH_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1200,14 +1180,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 events_conditions[-1]["condition"] = event_where[-1]
 elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform != "web":
 _column = events.EventType.SWIPE_MOBILE.column
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 if not is_any:
 if is_not:
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1217,7 +1197,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type == schemas.PerformanceEventType.FETCH_FAILED:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 _column = 'url_path'
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 events_conditions[-1]["condition"] = []
 if not is_any:
@@ -1225,7 +1205,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
 value_key=e_k))
 events_conditions_not.append(
-{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
 events_conditions_not[-1]["condition"] = event_where[-1]
 else:
 event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -1256,7 +1236,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 schemas.PerformanceEventType.LOCATION_LARGEST_CONTENTFUL_PAINT_TIME,
 schemas.PerformanceEventType.LOCATION_TTFB]:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 events_conditions[-1]["condition"] = []
 col = performance_event.get_col(event_type)
@@ -1279,7 +1259,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 elif event_type in [schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD,
 schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE]:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 events_conditions[-1]["condition"] = []
 col = performance_event.get_col(event_type)
@@ -1302,9 +1282,9 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 # elif event_type == schemas.PerformanceEventType.time_between_events:
 #     event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
 #     # event_from = event_from % f"{getattr(events.event_type, event.value[0].type).table} AS main INNER JOIN {getattr(events.event_type, event.value[1].type).table} AS main2 USING(session_id) "
-#     event_where.append(f"main.event_type='{__get_event_type(event.value[0].type, platform=platform)}'")
+#     event_where.append(f"main.event_type='{__exp_ch_helper.get_event_type(event.value[0].type, platform=platform)}'")
 #     events_conditions.append({"type": event_where[-1]})
-#     event_where.append(f"main.event_type='{__get_event_type(event.value[0].type, platform=platform)}'")
+#     event_where.append(f"main.event_type='{__exp_ch_helper.get_event_type(event.value[0].type, platform=platform)}'")
 #     events_conditions.append({"type": event_where[-1]})
 #
 # if not isinstance(event.value[0].value, list):
@@ -1352,7 +1332,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 # TODO: no isNot for RequestDetails
 elif event_type == schemas.EventType.REQUEST_DETAILS:
 event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
 events_conditions.append({"type": event_where[-1]})
 apply = False
 events_conditions[-1]["condition"] = []


@@ -1,3 +1,6 @@
+from typing import Union
+
+import schemas
 from chalicelib.utils.TimeUTC import TimeUTC
 from decouple import config
 import logging
@@ -51,3 +54,37 @@ def get_main_js_errors_sessions_table(timestamp=0):
 # return "experimental.js_errors_sessions_mv"  # \
 #     if config("EXP_7D_MV", cast=bool, default=True) \
 #     and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events"
+
+
+def get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEventType], platform="web"):
+    defs = {
+        schemas.EventType.CLICK: "CLICK",
+        schemas.EventType.INPUT: "INPUT",
+        schemas.EventType.LOCATION: "LOCATION",
+        schemas.PerformanceEventType.LOCATION_DOM_COMPLETE: "LOCATION",
+        schemas.PerformanceEventType.LOCATION_LARGEST_CONTENTFUL_PAINT_TIME: "LOCATION",
+        schemas.PerformanceEventType.LOCATION_TTFB: "LOCATION",
+        schemas.EventType.CUSTOM: "CUSTOM",
+        schemas.EventType.REQUEST: "REQUEST",
+        schemas.EventType.REQUEST_DETAILS: "REQUEST",
+        schemas.PerformanceEventType.FETCH_FAILED: "REQUEST",
+        schemas.EventType.STATE_ACTION: "STATEACTION",
+        schemas.EventType.ERROR: "ERROR",
+        schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD: 'PERFORMANCE',
+        schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE: 'PERFORMANCE',
+        schemas.FetchFilterType.FETCH_URL: 'REQUEST'
+    }
+    defs_mobile = {
+        schemas.EventType.CLICK_MOBILE: "TAP",
+        schemas.EventType.INPUT_MOBILE: "INPUT",
+        schemas.EventType.CUSTOM_MOBILE: "CUSTOM",
+        schemas.EventType.REQUEST_MOBILE: "REQUEST",
+        schemas.EventType.ERROR_MOBILE: "CRASH",
+        schemas.EventType.VIEW_MOBILE: "VIEW",
+        schemas.EventType.SWIPE_MOBILE: "SWIPE"
+    }
+    if platform != "web" and event_type in defs_mobile:
+        return defs_mobile.get(event_type)
+    if event_type not in defs:
+        raise Exception(f"unsupported EventType:{event_type}")
+    return defs.get(event_type)
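A brief usage sketch of the relocated helper above (illustrative only; it assumes the chalice app's schemas module is importable in the running environment):

# Illustrative sketch: callers in the ClickHouse session search now resolve the
# event_type string through exp_ch_helper instead of a module-local helper.
import schemas
from chalicelib.utils import exp_ch_helper

assert exp_ch_helper.get_event_type(schemas.EventType.CLICK) == "CLICK"
assert exp_ch_helper.get_event_type(schemas.EventType.CLICK_MOBILE, platform="ios") == "TAP"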


@@ -95,7 +95,6 @@ rm -rf ./orpy.py
 rm -rf ./chalicelib/core/usability_testing/
 rm -rf ./chalicelib/core/db_request_handler.py
 rm -rf ./chalicelib/core/db_request_handler.py
-rm -rf ./routers/subs/spot.py
 rm -rf ./chalicelib/utils/or_cache
 rm -rf ./routers/subs/health.py
 rm -rf ./chalicelib/core/spot.py


@@ -7,6 +7,7 @@ psycopg2-binary==2.9.9
 psycopg[pool,binary]==3.2.1
 elasticsearch==8.15.0
 jira==3.8.0
+cachetools==5.5.0


@@ -7,6 +7,7 @@ psycopg2-binary==2.9.9
 psycopg[pool,binary]==3.2.1
 elasticsearch==8.15.0
 jira==3.8.0
+cachetools==5.5.0


@@ -90,26 +90,35 @@ async def __process_assertion(request: Request, tenant_key=None) -> Response | d
         return {"errors": ["invalid tenantKey, please copy the correct value from Preferences > Account"]}
     existing = users.get_by_email_only(email)
 
     role_names = user_data.get("role", [])
-    if len(role_names) == 0:
-        logger.info("No role specified, setting role to member")
-        role_names = ["member"]
     role = None
-    for r in role_names:
-        if r.lower() == existing["roleName"].lower():
-            role = {"roleId": existing["roleId"], "name": r}
+    if len(role_names) == 0:
+        if existing is None:
+            logger.info("No role specified, setting role to member")
+            role_names = ["member"]
         else:
-            role = roles.get_role_by_name(tenant_id=t['tenantId'], name=r)
+            role_names = [existing["roleName"]]
+            role = {"name": existing["roleName"], "roleId": existing["roleId"]}
+    if role is None:
+        for r in role_names:
+            if r.lower() == existing["roleName"].lower():
+                role = {"roleId": existing["roleId"], "name": r}
+            else:
+                role = roles.get_role_by_name(tenant_id=t['tenantId'], name=r)
             if role is not None:
                 break
     if role is None:
         return {"errors": [f"role '{role_names}' not found, please create it in OpenReplay first"]}
     logger.info(f"received roles:{role_names}; using:{role['name']}")
 
     admin_privileges = user_data.get("adminPrivileges", [])
-    admin_privileges = not (len(admin_privileges) == 0
-                            or admin_privileges[0] is None
-                            or admin_privileges[0].lower() == "false")
+    if len(admin_privileges) == 0:
+        if existing is None:
+            admin_privileges = not (len(admin_privileges) == 0
+                                    or admin_privileges[0] is None
+                                    or admin_privileges[0].lower() == "false")
+        else:
+            admin_privileges = existing["admin"]
     internal_id = next(iter(user_data.get("internalId", [])), None)
     full_name = " ".join(user_data.get("firstName", []) + user_data.get("lastName", []))


@@ -0,0 +1,32 @@
+from fastapi import Depends
+from starlette.responses import JSONResponse, Response
+
+import schemas
+from chalicelib.core import spot, webhook
+from or_dependencies import OR_context
+from routers.base import get_routers
+
+public_app, app, app_apikey = get_routers(prefix="/spot", tags=["spot"])
+COOKIE_PATH = "/api/spot/refresh"
+
+
+@app.get('/logout')
+def logout_spot(response: Response, context: schemas.CurrentContext = Depends(OR_context)):
+    spot.logout(user_id=context.user_id)
+    response.delete_cookie(key="spotRefreshToken", path=COOKIE_PATH)
+    return {"data": "success"}
+
+
+@app.get('/refresh')
+def refresh_spot_login(response: JSONResponse, context: schemas.CurrentContext = Depends(OR_context)):
+    r = spot.refresh(user_id=context.user_id, tenant_id=context.tenant_id)
+    content = {"jwt": r.get("jwt")}
+    response.set_cookie(key="spotRefreshToken", value=r.get("refreshToken"), path=COOKIE_PATH,
+                        max_age=r.pop("refreshTokenMaxAge"), secure=True, httponly=True)
+    return content
+
+
+@app.get('/integrations/slack/channels', tags=["integrations"])
+def get_slack_channels(context: schemas.CurrentContext = Depends(OR_context)):
+    return {"data": webhook.get_by_type(tenant_id=context.tenant_id, webhook_type=schemas.WebhookType.SLACK)}


@@ -23,4 +23,4 @@ MINIO_SECRET_KEY = ''
 
 # APP and TRACKER VERSIONS
 VERSION = 1.20.0
-TRACKER_VERSION = '14.0.6'
+TRACKER_VERSION = '14.0.7'


@@ -119,6 +119,7 @@ interface Props {
 function PrivateRoutes(props: Props) {
   const { onboarding, sites, siteId } = props;
   const hasRecordings = sites.some(s => s.recorded);
+  const redirectToSetup = props.scope === 0;
   const redirectToOnboarding =
     !onboarding && (localStorage.getItem(GLOBAL_HAS_NO_RECORDINGS) === 'true' || !hasRecordings) && props.scope > 0;
   const siteIdList: any = sites.map(({ id }) => id).toJS();
@@ -126,6 +127,13 @@
   return (
     <Suspense fallback={<Loader loading={true} className="flex-1" />}>
       <Switch key="content">
+        <Route
+          exact
+          strict
+          path={SCOPE_SETUP}
+          component={enhancedComponents.ScopeSetup}
+        />
+        {redirectToSetup ? <Redirect to={SCOPE_SETUP} /> : null}
         <Route path={CLIENT_PATH} component={enhancedComponents.Client} />
         <Route
           path={withSiteId(ONBOARDING_PATH, siteIdList)}
@@ -143,12 +151,6 @@
           path={SPOT_PATH}
           component={enhancedComponents.Spot}
         />
-        <Route
-          exact
-          strict
-          path={SCOPE_SETUP}
-          component={enhancedComponents.ScopeSetup}
-        />
         {props.scope === 1 ? <Redirect to={SPOTS_LIST_PATH} /> : null}
         <Route
           path="/integrations/"


@@ -10,21 +10,19 @@ import {
   GLOBAL_DESTINATION_PATH,
   IFRAME,
   JWT_PARAM,
-  SPOT_ONBOARDING
-} from "App/constants/storageKeys";
+  SPOT_ONBOARDING,
+} from 'App/constants/storageKeys';
 import Layout from 'App/layout/Layout';
-import { withStore } from "App/mstore";
-import { checkParam, handleSpotJWT, isTokenExpired } from "App/utils";
+import { withStore } from 'App/mstore';
+import { checkParam, handleSpotJWT, isTokenExpired } from 'App/utils';
 import { ModalProvider } from 'Components/Modal';
 import { ModalProvider as NewModalProvider } from 'Components/ModalContext';
 import { fetchListActive as fetchMetadata } from 'Duck/customField';
 import { setSessionPath } from 'Duck/sessions';
 import { fetchList as fetchSiteList } from 'Duck/site';
 import { init as initSite } from 'Duck/site';
-import { fetchUserInfo, getScope, setJwt, logout } from "Duck/user";
-import { fetchTenants } from 'Duck/user';
+import { fetchUserInfo, getScope, logout, setJwt } from 'Duck/user';
 import { Loader } from 'UI';
-import { spotsList } from "./routes";
 import * as routes from './routes';
@@ -36,7 +34,6 @@ interface RouterProps
   changePassword: boolean;
   isEnterprise: boolean;
   fetchUserInfo: () => any;
-  fetchTenants: () => any;
   setSessionPath: (path: any) => any;
   fetchSiteList: (siteId?: number) => any;
   match: {
@@ -45,7 +42,7 @@
   };
   };
   mstore: any;
-  setJwt: (params: { jwt: string, spotJwt: string | null }) => any;
+  setJwt: (params: { jwt: string; spotJwt: string | null }) => any;
   fetchMetadata: (siteId: string) => void;
   initSite: (site: any) => void;
   scopeSetup: boolean;
@@ -68,15 +65,16 @@ const Router: React.FC<RouterProps> = (props) => {
     logout,
   } = props;
 
-  const params = new URLSearchParams(location.search)
+  const params = new URLSearchParams(location.search);
   const spotCb = params.get('spotCallback');
-  const spotReqSent = React.useRef(false)
+  const spotReqSent = React.useRef(false);
   const [isSpotCb, setIsSpotCb] = React.useState(false);
+  const [isSignup, setIsSignup] = React.useState(false);
   const [isIframe, setIsIframe] = React.useState(false);
   const [isJwt, setIsJwt] = React.useState(false);
 
   const handleJwtFromUrl = () => {
-    const params = new URLSearchParams(location.search)
+    const params = new URLSearchParams(location.search);
     const urlJWT = params.get('jwt');
     const spotJwt = params.get('spotJwt');
     if (spotJwt) {
@@ -92,6 +90,7 @@ const Router: React.FC<RouterProps> = (props) => {
       return;
     } else {
       spotReqSent.current = true;
+      setIsSpotCb(false);
     }
     handleSpotJWT(jwt);
   };
@@ -107,13 +106,17 @@ const Router: React.FC<RouterProps> = (props) => {
   const handleUserLogin = async () => {
     if (isSpotCb) {
-      localStorage.setItem(SPOT_ONBOARDING, 'true')
+      localStorage.setItem(SPOT_ONBOARDING, 'true');
     }
     await fetchUserInfo();
     const siteIdFromPath = parseInt(location.pathname.split('/')[1]);
     await fetchSiteList(siteIdFromPath);
     props.mstore.initClient();
+    if (localSpotJwt && !isTokenExpired(localSpotJwt)) {
+      handleSpotLogin(localSpotJwt);
+    }
+
     const destinationPath = localStorage.getItem(GLOBAL_DESTINATION_PATH);
if ( if (
destinationPath && destinationPath &&
@ -144,7 +147,10 @@ const Router: React.FC<RouterProps> = (props) => {
if (spotCb) { if (spotCb) {
setIsSpotCb(true); setIsSpotCb(true);
} }
}, [spotCb]) if (location.pathname.includes('signup')) {
setIsSignup(true);
}
}, [spotCb]);
useEffect(() => { useEffect(() => {
handleDestinationPath(); handleDestinationPath();
@ -159,22 +165,14 @@ const Router: React.FC<RouterProps> = (props) => {
}, [isLoggedIn]); }, [isLoggedIn]);
useEffect(() => { useEffect(() => {
if (scopeSetup) { if (isLoggedIn && isSpotCb && !isSignup) {
history.push(routes.scopeSetup()) if (localSpotJwt && !isTokenExpired(localSpotJwt)) {
} handleSpotLogin(localSpotJwt);
}, [scopeSetup]) } else {
logout();
useEffect(() => {
if (isLoggedIn && (location.pathname.includes('login') || isSpotCb)) {
if (localSpotJwt) {
if (!isTokenExpired(localSpotJwt)) {
handleSpotLogin(localSpotJwt);
} else {
logout();
}
} }
} }
}, [isSpotCb, location, isLoggedIn, localSpotJwt]) }, [isSpotCb, isLoggedIn, localSpotJwt, isSignup]);
useEffect(() => { useEffect(() => {
if (siteId && siteId !== lastFetchedSiteIdRef.current) { if (siteId && siteId !== lastFetchedSiteIdRef.current) {
@ -204,8 +202,7 @@ const Router: React.FC<RouterProps> = (props) => {
location.pathname.includes('multiview') || location.pathname.includes('multiview') ||
location.pathname.includes('/view-spot/') || location.pathname.includes('/view-spot/') ||
location.pathname.includes('/spots/') || location.pathname.includes('/spots/') ||
location.pathname.includes('/scope-setup') location.pathname.includes('/scope-setup');
if (isIframe) { if (isIframe) {
return ( return (
@ -238,8 +235,11 @@ const mapStateToProps = (state: Map<string, any>) => {
'loading', 'loading',
]); ]);
const sitesLoading = state.getIn(['site', 'fetchListRequest', 'loading']); const sitesLoading = state.getIn(['site', 'fetchListRequest', 'loading']);
const scopeSetup = getScope(state) === 0 const scopeSetup = getScope(state) === 0;
const loading = Boolean(userInfoLoading) || Boolean(sitesLoading) || (!scopeSetup && !siteId); const loading =
Boolean(userInfoLoading) ||
Boolean(sitesLoading) ||
(!scopeSetup && !siteId);
return { return {
siteId, siteId,
changePassword, changePassword,
@ -262,7 +262,6 @@ const mapStateToProps = (state: Map<string, any>) => {
const mapDispatchToProps = { const mapDispatchToProps = {
fetchUserInfo, fetchUserInfo,
fetchTenants,
setSessionPath, setSessionPath,
fetchSiteList, fetchSiteList,
setJwt, setJwt,
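In the Router changes above, a cached Spot JWT is now reused only when the user is logged in, arrived via a Spot callback, and is not mid-signup; if the cached token has expired the user is logged out instead. The `isTokenExpired` helper itself is not part of this diff; a typical exp-claim check looks roughly like this (illustrative sketch, not the project's actual util):

```ts
// Sketch of a JWT expiry check based on the exp claim (seconds since epoch).
function isTokenExpiredSketch(jwt: string, skewMs = 30_000): boolean {
  try {
    const [, payload] = jwt.split('.');
    const json = atob(payload.replace(/-/g, '+').replace(/_/g, '/')); // base64url -> base64
    const { exp } = JSON.parse(json) as { exp?: number };
    if (!exp) return true;
    return exp * 1000 <= Date.now() + skewMs;
  } catch {
    return true; // an unparsable token is treated as expired
  }
}
```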


@ -2,7 +2,7 @@ import { ArrowRightOutlined } from '@ant-design/icons';
import { Button, Card, Radio } from 'antd'; import { Button, Card, Radio } from 'antd';
import React from 'react'; import React from 'react';
import { connect } from 'react-redux'; import { connect } from 'react-redux';
import { upgradeScope, downgradeScope } from "App/duck/user"; import { upgradeScope, downgradeScope, getScope } from 'App/duck/user';
import { useHistory } from 'react-router-dom'; import { useHistory } from 'react-router-dom';
import * as routes from 'App/routes' import * as routes from 'App/routes'
import { SPOT_ONBOARDING } from "../../constants/storageKeys"; import { SPOT_ONBOARDING } from "../../constants/storageKeys";
@ -15,8 +15,18 @@ const Scope = {
function ScopeForm({ function ScopeForm({
upgradeScope, upgradeScope,
downgradeScope, downgradeScope,
scopeState,
}: any) { }: any) {
const [scope, setScope] = React.useState(Scope.FULL); const [scope, setScope] = React.useState(Scope.FULL);
React.useEffect(() => {
if (scopeState !== 0) {
if (scopeState === 2) {
history.replace(routes.onboarding())
} else {
history.replace(routes.spotsList())
}
}
}, [scopeState])
React.useEffect(() => { React.useEffect(() => {
const isSpotSetup = localStorage.getItem(SPOT_ONBOARDING) const isSpotSetup = localStorage.getItem(SPOT_ONBOARDING)
if (isSpotSetup) { if (isSpotSetup) {
@ -36,50 +46,52 @@ function ScopeForm({
}; };
return ( return (
<div className={'flex items-center justify-center w-screen h-screen'}> <div className={'flex items-center justify-center w-screen h-screen'}>
<Card <Card
style={{ width: 540 }} style={{ width: 540 }}
title={'👋 Welcome to OpenReplay'} title={'👋 Welcome to OpenReplay'}
classNames={{ classNames={{
header: 'text-2xl font-semibold text-center', header: 'text-2xl font-semibold text-center',
body: 'flex flex-col gap-2', body: 'flex flex-col gap-2',
}} }}
>
<div className={'font-semibold'}>
How will you primarily use OpenReplay?{' '}
</div>
<div className={'text-disabled-text'}>
<div>
You will have access to all OpenReplay features regardless of your
choice.
</div>
<div>
Your preference will simply help us tailor your onboarding experience.
</div>
</div>
<Radio.Group
value={scope}
onChange={(e) => setScope(e.target.value)}
className={'flex flex-col gap-2 mt-4 '}
> >
<Radio value={'full'}> <div className={'font-semibold'}>
Session Replay & Debugging, Customer Support and more How will you primarily use OpenReplay?{' '}
</Radio> </div>
<Radio value={'spot'}>Report bugs via Spot</Radio> <div className={'text-disabled-text'}>
</Radio.Group> <div>
You will have access to all OpenReplay features regardless of your
<div className={'self-end'}> choice.
<Button </div>
type={'primary'} <div>
onClick={() => onContinue()} Your preference will simply help us tailor your onboarding experience.
icon={<ArrowRightOutlined />} </div>
iconPosition={'end'} </div>
<Radio.Group
value={scope}
onChange={(e) => setScope(e.target.value)}
className={'flex flex-col gap-2 mt-4 '}
> >
Continue <Radio value={'full'}>
</Button> Session Replay & Debugging, Customer Support and more
</div> </Radio>
</Card> <Radio value={'spot'}>Report bugs via Spot</Radio>
</Radio.Group>
<div className={'self-end'}>
<Button
type={'primary'}
onClick={() => onContinue()}
icon={<ArrowRightOutlined />}
iconPosition={'end'}
>
Continue
</Button>
</div>
</Card>
</div> </div>
); );
} }
export default connect(null, { upgradeScope, downgradeScope })(ScopeForm); export default connect((state) => ({
scopeState: getScope(state),
}), { upgradeScope, downgradeScope })(ScopeForm);


@@ -83,9 +83,8 @@ function Player(props: IProps) {
           <div
             onMouseDown={handleResize}
             className={'w-full h-2 cursor-ns-resize absolute top-0 left-0 z-20'}
-          >
-            <ConsolePanel isLive />
-          </div>
+          />
+          <ConsolePanel isLive />
         </div>
       ) : null}
       {!fullView && !isMultiview ? <LiveControls jump={playerContext.player.jump} /> : null}


@ -6,6 +6,7 @@ import {
} from '@ant-design/icons'; } from '@ant-design/icons';
import { Button, InputNumber, Popover } from 'antd'; import { Button, InputNumber, Popover } from 'antd';
import { Slider } from 'antd'; import { Slider } from 'antd';
import cn from 'classnames';
import { observer } from 'mobx-react-lite'; import { observer } from 'mobx-react-lite';
import React, { useContext, useEffect, useRef, useState } from 'react'; import React, { useContext, useEffect, useRef, useState } from 'react';
@ -24,17 +25,38 @@ function DropdownAudioPlayer({
const [isMuted, setIsMuted] = useState(false); const [isMuted, setIsMuted] = useState(false);
const lastPlayerTime = useRef(0); const lastPlayerTime = useRef(0);
const audioRefs = useRef<Record<string, HTMLAudioElement | null>>({}); const audioRefs = useRef<Record<string, HTMLAudioElement | null>>({});
const fileLengths = useRef<Record<string, number>>({});
const { time = 0, speed = 1, playing, sessionStart } = store?.get() ?? {}; const { time = 0, speed = 1, playing, sessionStart } = store?.get() ?? {};
const files = audioEvents.map((pa) => { const files = React.useMemo(
const data = pa.payload; () =>
return { audioEvents.map((pa) => {
url: data.url, const data = pa.payload;
timestamp: data.timestamp, const nativeTs = data.timestamp;
start: pa.timestamp - sessionStart, const startTs = nativeTs
}; ? nativeTs > sessionStart
}); ? nativeTs - sessionStart
: nativeTs
: pa.timestamp - sessionStart;
return {
url: data.url,
timestamp: data.timestamp,
start: startTs,
};
}),
[audioEvents.length, sessionStart]
);
React.useEffect(() => {
Object.entries(audioRefs.current).forEach(([url, audio]) => {
if (audio) {
audio.loop = false;
audio.addEventListener('loadedmetadata', () => {
fileLengths.current[url] = audio.duration;
});
}
});
}, [audioRefs.current]);
const toggleMute = () => { const toggleMute = () => {
Object.values(audioRefs.current).forEach((audio) => { Object.values(audioRefs.current).forEach((audio) => {
@ -89,10 +111,15 @@ function DropdownAudioPlayer({
if (audio) { if (audio) {
const file = files.find((f) => f.url === key); const file = files.find((f) => f.url === key);
if (file) { if (file) {
audio.currentTime = Math.max( const targetTime = (timeMs + delta * 1000 - file.start) / 1000;
(timeMs + delta * 1000 - file.start) / 1000, const fileLength = fileLengths.current[key];
0 if (targetTime < 0 || (fileLength && targetTime > fileLength)) {
); audio.pause();
audio.currentTime = 0;
return;
} else {
audio.currentTime = targetTime;
}
} }
} }
}); });
@ -108,27 +135,39 @@ function DropdownAudioPlayer({
useEffect(() => { useEffect(() => {
const deltaMs = delta * 1000; const deltaMs = delta * 1000;
if (Math.abs(lastPlayerTime.current - time - deltaMs) >= 250) { const deltaTime = Math.abs(lastPlayerTime.current - time - deltaMs);
if (deltaTime >= 250) {
handleSeek(time); handleSeek(time);
} }
Object.entries(audioRefs.current).forEach(([url, audio]) => { Object.entries(audioRefs.current).forEach(([url, audio]) => {
if (audio) { if (audio) {
const file = files.find((f) => f.url === url); const file = files.find((f) => f.url === url);
if (file && time >= file.start) { const fileLength = fileLengths.current[url];
if (audio.paused && playing) { if (file) {
audio.play(); if (fileLength && fileLength * 1000 + file.start < time) {
return;
}
if (time >= file.start) {
if (audio.paused && playing) {
audio.play();
}
} else {
audio.pause();
} }
} else {
audio.pause();
}
if (audio.muted !== isMuted) {
audio.muted = isMuted;
} }
} }
}); });
lastPlayerTime.current = time + deltaMs; lastPlayerTime.current = time + deltaMs;
}, [time, delta]); }, [time, delta]);
useEffect(() => {
Object.values(audioRefs.current).forEach((audio) => {
if (audio) {
audio.muted = isMuted;
}
});
}, [isMuted]);
useEffect(() => { useEffect(() => {
changePlaybackSpeed(speed); changePlaybackSpeed(speed);
}, [speed]); }, [speed]);
@ -137,22 +176,30 @@ function DropdownAudioPlayer({
Object.entries(audioRefs.current).forEach(([url, audio]) => { Object.entries(audioRefs.current).forEach(([url, audio]) => {
if (audio) { if (audio) {
const file = files.find((f) => f.url === url); const file = files.find((f) => f.url === url);
if (file && playing && time >= file.start) { const fileLength = fileLengths.current[url];
audio.play(); if (file) {
} else { if (fileLength && fileLength * 1000 + file.start < time) {
audio.pause(); audio.pause();
return;
}
if (playing && time >= file.start) {
audio.play();
} else {
audio.pause();
}
} }
} }
}); });
setVolume(isMuted ? 0 : volume); setVolume(isMuted ? 0 : volume);
}, [playing]); }, [playing]);
const buttonIcon =
'px-2 cursor-pointer border border-gray-light hover:border-main hover:text-main hover:z-10 h-fit';
return ( return (
<div className={'relative'}> <div className={'relative'}>
<div className={'flex items-center'} style={{ height: 24 }}> <div className={'flex items-center'} style={{ height: 24 }}>
<Popover <Popover
trigger={'click'} trigger={'click'}
className={'h-full'}
content={ content={
<div <div
className={'flex flex-col gap-2 rounded'} className={'flex flex-col gap-2 rounded'}
@ -169,20 +216,14 @@ function DropdownAudioPlayer({
</div> </div>
} }
> >
<div <div className={cn(buttonIcon, 'rounded-l')}>
className={
'px-2 h-full cursor-pointer border rounded-l border-gray-light hover:border-main hover:text-main hover:z-10'
}
>
{isMuted ? <MutedOutlined /> : <SoundOutlined />} {isMuted ? <MutedOutlined /> : <SoundOutlined />}
</div> </div>
</Popover> </Popover>
<div <div
onClick={toggleVisible} onClick={toggleVisible}
style={{ marginLeft: -1 }} style={{ marginLeft: -1 }}
className={ className={cn(buttonIcon, 'rounded-r')}
'px-2 h-full border rounded-r border-gray-light cursor-pointer hover:border-main hover:text-main hover:z-10'
}
> >
<CaretDownOutlined /> <CaretDownOutlined />
</div> </div>
@ -236,6 +277,7 @@ function DropdownAudioPlayer({
<div style={{ display: 'none' }}> <div style={{ display: 'none' }}>
{files.map((file) => ( {files.map((file) => (
<audio <audio
loop={false}
key={file.url} key={file.url}
ref={(el) => (audioRefs.current[file.url] = el)} ref={(el) => (audioRefs.current[file.url] = el)}
controls controls
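The audio-player changes above revolve around one idea: each clip's real duration is captured on `loadedmetadata`, looping is disabled, and both seeking and play/pause are clamped against the clip's start offset and duration so a short recording no longer loops or keeps "playing" past its end. A compact sketch of that clamp under the same assumptions (start offset in ms relative to session start, duration in seconds):

```ts
// Sketch of the clamp logic: given the player time (ms since session start),
// a clip's start offset (ms) and its measured duration (s), decide what the
// <audio> element should do.
function clampAudioPosition(
  playerTimeMs: number,
  clipStartMs: number,
  clipDurationSec?: number
): { currentTime: number; shouldPlay: boolean } {
  const target = (playerTimeMs - clipStartMs) / 1000; // seconds into the clip
  if (target < 0) return { currentTime: 0, shouldPlay: false }; // clip not started yet
  if (clipDurationSec !== undefined && target > clipDurationSec) {
    return { currentTime: 0, shouldPlay: false }; // clip already finished
  }
  return { currentTime: target, shouldPlay: true };
}
```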


@@ -19,7 +19,7 @@ function ConsoleRow(props: Props) {
   return (
     <div
-      className={cn(stl.line, 'flex py-2 px-4 overflow-hidden group relative select-none', {
+      className={cn(stl.line, 'flex py-2 px-4 overflow-hidden group relative', {
         info: !log.isYellow && !log.isRed,
         warn: log.isYellow,
         error: log.isRed,


@@ -10,7 +10,7 @@ const SpotsListHeader = observer(
     onDelete,
     selectedCount,
     onClearSelection,
-    isEmpty,
+    tenantHasSpots,
     onRefresh,
   }: {
     onDelete: () => void;
@@ -18,6 +18,7 @@ const SpotsListHeader = observer(
     onClearSelection: () => void;
     onRefresh: () => void;
     isEmpty?: boolean;
+    tenantHasSpots: boolean;
   }) => {
     const { spotStore } = useStore();
@@ -52,7 +53,7 @@ const SpotsListHeader = observer(
         <ReloadButton buttonSize={'small'} onClick={onRefresh} iconSize={16} />
       </div>
-      {isEmpty ? null : (
+      {tenantHasSpots ? (
        <div className="flex gap-2 items-center">
          <div className={'ml-auto'}>
            {selectedCount > 0 && (
@@ -90,7 +91,7 @@ const SpotsListHeader = observer(
             />
           </div>
         </div>
-      )}
+      ) : null}
     </div>
   );
 }


@@ -89,6 +89,7 @@ function SpotsList() {
           selectedCount={selectedSpots.length}
           onClearSelection={clearSelection}
           isEmpty={isEmpty}
+          tenantHasSpots={spotStore.tenantHasSpots}
         />
       </div>


@@ -117,7 +117,7 @@ function ConsolePanel({
     exceptionsList = [],
     logListNow = [],
     exceptionsListNow = [],
-  } = tabStates[currentTab];
+  } = tabStates[currentTab] ?? {};
   const list = isLive
     ? (useMemo(


@@ -45,7 +45,7 @@ function ConsoleRow(props: Props) {
     <div
       style={style}
       className={cn(
-        'border-b flex items-start py-1 px-4 pe-8 overflow-hidden group relative select-none',
+        'border-b flex items-start py-1 px-4 pe-8 overflow-hidden group relative',
         {
           info: !log.isYellow && !log.isRed,
           warn: log.isYellow,


@@ -25,7 +25,7 @@ const ALL = 'ALL';
 const TAB_KEYS = [ALL, ...typeList] as const;
 const TABS = TAB_KEYS.map((tab) => ({ text: tab, key: tab }));
-type EventsList = Array<Timed & { name: string; source: string; key: string }>;
+type EventsList = Array<Timed & { name: string; source: string; key: string; payload?: string[] }>;
 const WebStackEventPanelComp = observer(
   ({
@@ -95,7 +95,7 @@ export const MobileStackEventPanel = connect((state: Record<string, any>) => ({
   zoomEndTs: state.getIn(['components', 'player']).timelineZoom.endTs,
 }))(MobileStackEventPanelComp);
-function EventsPanel({
+const EventsPanel = observer(({
   list,
   listNow,
   jump,
@@ -109,7 +109,7 @@ function EventsPanel({
   zoomEnabled: boolean;
   zoomStartTs: number;
   zoomEndTs: number;
-}) {
+}) => {
   const {
     sessionStore: { devTools },
   } = useStore();
@@ -126,13 +126,19 @@ function EventsPanel({
     zoomEnabled ? zoomStartTs <= time && time <= zoomEndTs : true
   );
-  let filteredList = useRegExListFilterMemo(inZoomRangeList, (it) => it.name, filter);
+  let filteredList = useRegExListFilterMemo(inZoomRangeList, (it) => {
+    const searchBy = [it.name]
+    if (it.payload) {
+      const payload = Array.isArray(it.payload) ? it.payload.join(',') : JSON.stringify(it.payload);
+      searchBy.push(payload);
+    }
+    return searchBy
+  }, filter);
   filteredList = useTabListFilterMemo(filteredList, (it) => it.source, ALL, activeTab);
   const onTabClick = (activeTab: (typeof TAB_KEYS)[number]) =>
     devTools.update(INDEX_KEY, { activeTab });
-  const onFilterChange = ({ target: { value } }: React.ChangeEvent<HTMLInputElement>) =>
-    devTools.update(INDEX_KEY, { filter: value });
+  const onFilterChange = ({ target: { value } }: React.ChangeEvent<HTMLInputElement>) => devTools.update(INDEX_KEY, { filter: value });
   const tabs = useMemo(
     () => TABS.filter(({ key }) => key === ALL || inZoomRangeList.some(({ source }) => key === source)),
     [inZoomRangeList.length]
@@ -229,4 +235,4 @@ function EventsPanel({
       </BottomBlock.Content>
     </BottomBlock>
   );
-}
+});
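The filter change above makes the regex search match an event's payload as well as its name: payload arrays are joined with commas, other shapes are JSON-stringified. A standalone version of that predicate, without the `useRegExListFilterMemo` hook wrapper (names here are illustrative):

```ts
// Standalone sketch of the payload-aware search predicate.
interface StackEvent {
  name: string;
  payload?: string[] | Record<string, unknown>;
}

function matchesFilter(event: StackEvent, filter: string): boolean {
  if (!filter) return true;
  const haystack = [event.name];
  if (event.payload) {
    haystack.push(
      Array.isArray(event.payload) ? event.payload.join(',') : JSON.stringify(event.payload)
    );
  }
  try {
    const re = new RegExp(filter, 'i');
    return haystack.some((s) => re.test(s));
  } catch {
    // invalid regex input: fall back to a plain substring match
    return haystack.some((s) => s.toLowerCase().includes(filter.toLowerCase()));
  }
}
```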


@@ -1,10 +1,15 @@
 import { makeAutoObservable } from 'mobx';
 import { spotService } from 'App/services';
 import { UpdateSpotRequest } from 'App/services/spotService';
 import { Spot } from './types/spot';
 export default class SpotStore {
   isLoading: boolean = false;
   spots: Spot[] = [];
@@ -18,6 +23,7 @@ export default class SpotStore {
   pubKey: { value: string; expiration: number } | null = null;
   readonly order = 'desc';
   accessError = false;
+  tenantHasSpots = false;
   constructor() {
     makeAutoObservable(this);
@@ -81,13 +87,18 @@ export default class SpotStore {
       limit: this.limit,
     } as const;
-    const response = await this.withLoader(() =>
+    const { spots, tenantHasSpots, total } = await this.withLoader(() =>
       spotService.fetchSpots(filters)
     );
-    this.setSpots(response.spots.map((spot: any) => new Spot(spot)));
-    this.setTotal(response.total);
+    this.setSpots(spots.map((spot: any) => new Spot(spot)));
+    this.setTotal(total);
+    this.setTenantHasSpots(tenantHasSpots);
   };
+  setTenantHasSpots(hasSpots: boolean) {
+    this.tenantHasSpots = hasSpots;
+  }
   async fetchSpotById(id: string) {
     try {
       const response = await this.withLoader(() =>
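`tenantHasSpots` comes back from the API alongside the current page of spots, which lets the UI distinguish "this query matched nothing" from "this tenant has never recorded a Spot". A hypothetical consumer of the new store flag:

```ts
// Hypothetical usage: keep the toolbar visible when the tenant has ever
// recorded a Spot, even if the current query returns an empty page.
async function loadSpotsView(spotStore: {
  fetchSpots: () => Promise<void>;
  spots: unknown[];
  tenantHasSpots: boolean;
}) {
  await spotStore.fetchSpots();
  return {
    showToolbar: spotStore.tenantHasSpots,        // header controls (filter, refresh, delete)
    showEmptyState: spotStore.spots.length === 0, // per-query empty message
  };
}
```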


@@ -1,5 +1,4 @@
 import logger from 'App/logger';
-import { resolveURL } from "../../messages/rewriter/urlResolve";
 import type Screen from '../../Screen/Screen';
 import type { Message, SetNodeScroll } from '../../messages';
@@ -32,6 +31,8 @@ export default class DOMManager extends ListWalker<Message> {
   private readonly vTexts: Map<number, VText> = new Map() // map vs object here?
   private readonly vElements: Map<number, VElement> = new Map()
   private readonly olVRoots: Map<number, OnloadVRoot> = new Map()
+  /** required to keep track of iframes, frameId : vnodeId */
+  private readonly iframeRoots: Record<number, number> = {}
   /** Constructed StyleSheets https://developer.mozilla.org/en-US/docs/Web/API/Document/adoptedStyleSheets
    * as well as <style> tag owned StyleSheets
    */
@@ -219,6 +220,10 @@ export default class DOMManager extends ListWalker<Message> {
     if (['STYLE', 'style', 'LINK'].includes(msg.tag)) {
       vElem.prioritized = true
     }
+    if (this.vElements.has(msg.id)) {
+      logger.error("CreateElementNode: Node already exists", msg)
+      return
+    }
     this.vElements.set(msg.id, vElem)
     this.insertNode(msg)
     this.removeBodyScroll(msg.id, vElem)
@@ -316,6 +321,10 @@ export default class DOMManager extends ListWalker<Message> {
       case MType.CreateIFrameDocument: {
         const vElem = this.vElements.get(msg.frameID)
         if (!vElem) { logger.error("CreateIFrameDocument: Node not found", msg); return }
+        if (this.iframeRoots[msg.frameID] && !this.olVRoots.has(msg.id)) {
+          this.olVRoots.delete(this.iframeRoots[msg.frameID])
+        }
+        this.iframeRoots[msg.frameID] = msg.id
         const vRoot = OnloadVRoot.fromVElement(vElem)
         vRoot.catch(e => logger.warn(e, msg))
         this.olVRoots.set(msg.id, vRoot)
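`iframeRoots` records which virtual root belongs to which iframe node, so when a frame's document is re-created the stale `OnloadVRoot` is dropped instead of lingering; the new `CreateElementNode` guard likewise skips messages whose node id already exists. A self-contained sketch of that bookkeeping (the class and method names here are illustrative, not the player's actual API):

```ts
// Isolated sketch of the frameId -> vRootId bookkeeping added above.
class IframeRootRegistry<VRoot> {
  private readonly roots = new Map<number, VRoot>(); // vRootId -> virtual root
  private readonly byFrame: Record<number, number> = {}; // frameId -> vRootId

  register(frameId: number, vRootId: number, root: VRoot): void {
    const previous = this.byFrame[frameId];
    if (previous !== undefined && previous !== vRootId) {
      this.roots.delete(previous); // the frame got a new document: drop the stale root
    }
    this.byFrame[frameId] = vRootId;
    this.roots.set(vRootId, root);
  }
}
```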


@@ -33,6 +33,7 @@ interface AddCommentRequest {
 interface GetSpotsResponse {
   spots: SpotInfo[];
   total: number;
+  tenantHasSpots: boolean;
 }

 interface GetSpotsRequest {


@@ -504,7 +504,6 @@ export function truncateStringToFit(string: string, screenWidth: number, charWid
 let sendingRequest = false;
 export const handleSpotJWT = (jwt: string) => {
-  console.log(jwt, sendingRequest)
   let tries = 0;
   if (!jwt || sendingRequest) {
     return;


@@ -30,7 +30,7 @@
     "@floating-ui/react-dom-interactions": "^0.10.3",
     "@medv/finder": "^3.1.0",
     "@reduxjs/toolkit": "^2.2.2",
-    "@sentry/browser": "^5.21.1",
+    "@sentry/browser": "^8.34.0",
     "@svg-maps/world": "^1.0.1",
     "@svgr/webpack": "^6.2.1",
     "@wojtekmaj/react-daterange-picker": "^6.0.0",


@ -3188,67 +3188,90 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/browser@npm:^5.21.1": "@sentry-internal/browser-utils@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/browser@npm:5.30.0" resolution: "@sentry-internal/browser-utils@npm:8.34.0"
dependencies: dependencies:
"@sentry/core": 5.30.0 "@sentry/core": 8.34.0
"@sentry/types": 5.30.0 "@sentry/types": 8.34.0
"@sentry/utils": 5.30.0 "@sentry/utils": 8.34.0
tslib: ^1.9.3 checksum: fb764f52a989307bb6369a2ae24bb83ef9880c108d2cc2aba94106c846dae743ba379291c84539306d26c3f35f16b6cd2341fa32e85cb942057f2905a33e82bf
checksum: 6793e1b49a8cdb1f025115bcc591bf67c97b6515f62a33ffcbb7b1ab66e459ebc471797d02e471be1ebf14092b56eb25ed914f043962388cc224bc961e334a17
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/core@npm:5.30.0": "@sentry-internal/feedback@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/core@npm:5.30.0" resolution: "@sentry-internal/feedback@npm:8.34.0"
dependencies: dependencies:
"@sentry/hub": 5.30.0 "@sentry/core": 8.34.0
"@sentry/minimal": 5.30.0 "@sentry/types": 8.34.0
"@sentry/types": 5.30.0 "@sentry/utils": 8.34.0
"@sentry/utils": 5.30.0 checksum: 7137a6b589cb56b541df52abd75a73280d3f8fd09f1983f298e29647c5dd941d5fc404d32599d4c1fe2bdcb7693d0e7886f2c08c10ad1eb7c8e17cad650e4cb3
tslib: ^1.9.3
checksum: 6407b9c2a6a56f90c198f5714b3257df24d89d1b4ca6726bd44760d0adabc25798b69fef2c88ccea461c7e79e3c78861aaebfd51fd3cb892aee656c3f7e11801
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/hub@npm:5.30.0": "@sentry-internal/replay-canvas@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/hub@npm:5.30.0" resolution: "@sentry-internal/replay-canvas@npm:8.34.0"
dependencies: dependencies:
"@sentry/types": 5.30.0 "@sentry-internal/replay": 8.34.0
"@sentry/utils": 5.30.0 "@sentry/core": 8.34.0
tslib: ^1.9.3 "@sentry/types": 8.34.0
checksum: 386c91d06aa44be0465fc11330d748a113e464d41cd562a9e1d222a682cbcb14e697a3e640953e7a0239997ad8a02b223a0df3d9e1d8816cb823fd3613be3e2f "@sentry/utils": 8.34.0
checksum: 55c53be37e0c06706e099a96d1485636b8d3f11b72078c279fda6e7992205d217d27dae9e609db2c0466db0755bd038087e76cfe746eaff9ce39bbfd1f1571a5
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/minimal@npm:5.30.0": "@sentry-internal/replay@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/minimal@npm:5.30.0" resolution: "@sentry-internal/replay@npm:8.34.0"
dependencies: dependencies:
"@sentry/hub": 5.30.0 "@sentry-internal/browser-utils": 8.34.0
"@sentry/types": 5.30.0 "@sentry/core": 8.34.0
tslib: ^1.9.3 "@sentry/types": 8.34.0
checksum: 34ec05503de46d01f98c94701475d5d89cc044892c86ccce30e01f62f28344eb23b718e7cf573815e46f30a4ac9da3129bed9b3d20c822938acfb40cbe72437b "@sentry/utils": 8.34.0
checksum: 8a4b6f1f169584ddd62c372760168ea2d63ca0d6ebd6433e45d760fcbb2610418a2bf6546bbda49ecd619deddf39b4ac268b87a15adbb56efc0b86edf4c40dd9
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/types@npm:5.30.0": "@sentry/browser@npm:^8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/types@npm:5.30.0" resolution: "@sentry/browser@npm:8.34.0"
checksum: 99c6e55c0a82c8ca95be2e9dbb35f581b29e4ff7af74b23bc62b690de4e35febfa15868184a2303480ef86babd4fea5273cf3b5ddf4a27685b841a72f13a0c88 dependencies:
"@sentry-internal/browser-utils": 8.34.0
"@sentry-internal/feedback": 8.34.0
"@sentry-internal/replay": 8.34.0
"@sentry-internal/replay-canvas": 8.34.0
"@sentry/core": 8.34.0
"@sentry/types": 8.34.0
"@sentry/utils": 8.34.0
checksum: 8a08033fce2908018cc3fc81cf1110a93a338c0d370628a2e9aaa9f43703041824462474037e59b2b166141835b7e94b437325bd6a46bb8371e37b659b216d10
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/utils@npm:5.30.0": "@sentry/core@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/utils@npm:5.30.0" resolution: "@sentry/core@npm:8.34.0"
dependencies: dependencies:
"@sentry/types": 5.30.0 "@sentry/types": 8.34.0
tslib: ^1.9.3 "@sentry/utils": 8.34.0
checksum: ca8eebfea7ac7db6d16f6c0b8a66ac62587df12a79ce9d0d8393f4d69880bb8d40d438f9810f7fb107a9880fe0d68bbf797b89cbafd113e89a0829eb06b205f8 checksum: 0ab7e11bd382cb47ade38f3c9615e6fb876bad43eba4b376a51e44b1c57e00efe2e74f3cc0790a8da6c0be16093086bc65d89cf5387453f93ae96e10a41a0d60
languageName: node
linkType: hard
"@sentry/types@npm:8.34.0":
version: 8.34.0
resolution: "@sentry/types@npm:8.34.0"
checksum: d35bf72129f621af2f7916b0805c6948d210791757bee690fc6b68f2412bbe80c8ec704a0f8eb8ee45eb78deeadbd3c69830469b62fba4827506ea30c235f4e8
languageName: node
linkType: hard
"@sentry/utils@npm:8.34.0":
version: 8.34.0
resolution: "@sentry/utils@npm:8.34.0"
dependencies:
"@sentry/types": 8.34.0
checksum: 60612dba8320c736f9559ba2fb4efe2927fd9d4a1f29bff36f116ad30c9ce210f6677013052a69cb7e16c5d28f1d8d7465d9278e72f7384b84e924cf3ed2790c
languageName: node languageName: node
linkType: hard linkType: hard
@ -18149,7 +18172,7 @@ __metadata:
"@medv/finder": ^3.1.0 "@medv/finder": ^3.1.0
"@openreplay/sourcemap-uploader": ^3.0.8 "@openreplay/sourcemap-uploader": ^3.0.8
"@reduxjs/toolkit": ^2.2.2 "@reduxjs/toolkit": ^2.2.2
"@sentry/browser": ^5.21.1 "@sentry/browser": ^8.34.0
"@storybook/addon-actions": ^6.5.12 "@storybook/addon-actions": ^6.5.12
"@storybook/addon-docs": ^6.5.12 "@storybook/addon-docs": ^6.5.12
"@storybook/addon-essentials": ^6.5.12 "@storybook/addon-essentials": ^6.5.12


@@ -11,6 +11,7 @@ docker rmi alpine || true
 # Signing image
 # cosign sign --key awskms:///alias/openreplay-container-sign image_url:tag
 export SIGN_IMAGE=1
+export ARCH=${ARCH:-"amd64"}
 export PUSH_IMAGE=0
 export AWS_DEFAULT_REGION="eu-central-1"
 export SIGN_KEY="awskms:///alias/openreplay-container-sign"
@@ -21,17 +22,17 @@ echo $DOCKER_REPO
 } || {
     # docker login $DOCKER_REPO
     # tmux set-option remain-on-exit on
-    tmux split-window "cd ../../backend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
-    tmux split-window "cd ../../assist && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
+    tmux split-window "cd ../../backend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
+    tmux split-window "cd ../../assist && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
     tmux select-layout tiled
-    tmux split-window "cd ../../peers && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
-    tmux split-window "cd ../../frontend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
+    tmux split-window "cd ../../peers && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
+    tmux split-window "cd ../../frontend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
     tmux select-layout tiled
-    tmux split-window "cd ../../sourcemapreader && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
-    tmux split-window "cd ../../api && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@ \
-        && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_alerts.sh $@ \
-        && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_crons.sh $@ \
-        && cd ../assist-stats && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
+    tmux split-window "cd ../../sourcemapreader && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
+    tmux split-window "cd ../../api && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@ \
+        && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_alerts.sh $@ \
+        && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_crons.sh $@ \
+        && cd ../assist-stats && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
     tmux select-layout tiled
 }


@@ -119,7 +119,7 @@ function install_openreplay_actions() {
         sudo rm -rf $openreplay_code_dir
     fi
     sudo cp -rfb ./vars.yaml $openreplay_home_dir
-    sudo cp -rf "$(cd ../.. && pwd)" $openreplay_code_dir
+    sudo cp -rf "$(cd ../.. && pwd)" $openreplay_home_dir
 }

 function main() {


@@ -203,6 +203,7 @@ function status() {
     return
 }

+# Create OR version patch with gith sha
 function patch_version() {
     # Patching config version for console
     version=$(/var/lib/openreplay/yq '.fromVersion' vars.yaml)-$(sudo git rev-parse --short HEAD)
@@ -385,7 +386,7 @@ function upgrade() {
     time_now=$(date +%m-%d-%Y-%I%M%S)
     # Creating backup dir of current installation
-    [[ -d "$OR_DIR/openreplay" ]] && sudo mv "$OR_DIR/openreplay" "$OR_DIR/openreplay_${or_version//\"/}_${time_now}"
+    [[ -d "$OR_DIR/openreplay" ]] && sudo cp -rf "$OR_DIR/openreplay" "$OR_DIR/openreplay_${or_version//\"/}_${time_now}"
     clone_repo
     err_cd openreplay/scripts/helmcharts
@@ -406,7 +407,8 @@ function upgrade() {
     sudo mv ./openreplay-cli /bin/openreplay
     sudo chmod +x /bin/openreplay
     sudo mv ./vars.yaml "$OR_DIR"
-    sudo cp -rf ../../../openreplay "$OR_DIR/"
+    sudo rm -rf "$OR_DIR/openreplay" || true
+    sudo cp -rf "${tmp_dir}/openreplay" "$OR_DIR/"
     log info "Configuration file is saved in /var/lib/openreplay/vars.yaml"
     log info "Run ${BWHITE}openreplay -h${GREEN} to see the cli information to manage OpenReplay."


@@ -18,4 +18,4 @@ version: 0.1.1
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.20.1"
+AppVersion: "v1.20.5"


@@ -18,4 +18,4 @@ version: 0.1.7
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.20.0"
+AppVersion: "v1.20.7"


@@ -18,4 +18,4 @@ version: 0.1.10
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.20.4"
+AppVersion: "v1.20.13"


@@ -1,7 +1,6 @@
 apiVersion: v2
 name: spot
 description: A Helm chart for Kubernetes
-
 # A chart can be either an 'application' or a 'library' chart.
 #
 # Application charts are a collection of templates that can be packaged into versioned archives
@@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
 # a dependency of application charts to inject those utilities and functions into the rendering
 # pipeline. Library charts do not define any templates and therefore cannot be deployed.
 type: application
-
 # This is the chart version. This version number should be incremented each time you make changes
 # to the chart and its templates, including the app version.
 # Versions are expected to follow Semantic Versioning (https://semver.org/)
 version: 0.1.1
-
 # This is the version number of the application being deployed. This version number should be
 # incremented each time you make changes to the application. Versions are not expected to
 # follow Semantic Versioning. They should reflect the version the application is using.
 # It is recommended to use it with quotes.
-AppVersion: "v1.20.0"
+AppVersion: "v1.20.1"


@@ -74,7 +74,23 @@ spec:
             - |
               set -x
               mkdir -p /opt/openreplay/openreplay && cd /opt/openreplay/openreplay
-              git clone {{ .Values.global.dbMigrationUpstreamRepoURL | default "https://github.com/openreplay/openreplay" }} .
+
+              # Function to check if GitHub is available
+              check_github() {
+                for i in {1..10}; do
+                  if ping -c 1 github.com &> /dev/null || wget -q --spider https://github.com; then
+                    echo "GitHub is available."
+                    git clone {{ .Values.global.dbMigrationUpstreamRepoURL | default "https://github.com/openreplay/openreplay" }} .
+                    break
+                  else
+                    echo "GitHub is not available. Retrying in 3 seconds..."
+                    sleep 3
+                  fi
+                done
+              }
+              check_github
+
               ls /opt/openreplay/openreplay
               git checkout {{ default .Chart.AppVersion .Values.dbMigrationUpstreamBranch }} || exit 10
               git log -1


@@ -2,7 +2,7 @@
   "name": "wxt-starter",
   "description": "manifest.json description",
   "private": true,
-  "version": "1.0.5",
+  "version": "1.0.6",
   "type": "module",
   "scripts": {
     "dev": "wxt",

(One file's diff is suppressed because it contains overly long lines.)

(Binary image updated: 567 KiB before, 519 KiB after.)


@@ -52,6 +52,12 @@ export interface StartOptions {
   forceNew?: boolean
   sessionHash?: string
   assistOnly?: boolean
+  /**
+   * @deprecated We strongly advise to use .start().then instead.
+   *
+   * This method is kept for snippet compatibility only
+   * */
+  startCallback?: (result: StartPromiseReturn) => void
 }

 interface OnStartInfo {
@@ -161,6 +167,12 @@ type AppOptions = {
   }
   network?: NetworkOptions
+  /**
+   * use this flag if you're using Angular
+   * basically goes around window.Zone api changes to mutation observer
+   * and event listeners
+   * */
+  angularMode?: boolean
 } & WebworkerOptions &
   SessOptions
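`angularMode` tells the tracker to sidestep Zone.js patching of MutationObserver and event listeners (it is passed into `Nodes` further down), and `startCallback` is now explicitly deprecated in favour of `.start().then(...)`. A short sketch of enabling the new flag when starting the tracker in an Angular app (the constructor shape follows the public `@openreplay/tracker` API; the project key is a placeholder and `angularMode` is the option added in this diff):

```ts
// Sketch: starting the tracker with the new Angular workaround enabled.
import OpenReplay from '@openreplay/tracker';

const tracker = new OpenReplay({
  projectKey: 'YOUR_PROJECT_KEY', // placeholder
  angularMode: true, // avoid Zone.js-patched MutationObserver / listeners
});

void tracker.start().then((startResult) => {
  // preferred over the deprecated startCallback option
  console.log('tracker started', startResult);
});
```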
@ -185,12 +197,14 @@ const proto = {
resp: 'never-gonna-let-you-down', resp: 'never-gonna-let-you-down',
// regenerating id (copied other tab) // regenerating id (copied other tab)
reg: 'never-gonna-run-around-and-desert-you', reg: 'never-gonna-run-around-and-desert-you',
// tracker inside a child iframe iframeSignal: 'tracker inside a child iframe',
iframeSignal: 'never-gonna-make-you-cry', iframeId: 'getting node id for child iframe',
// getting node id for child iframe iframeBatch: 'batch of messages from an iframe window',
iframeId: 'never-gonna-say-goodbye', parentAlive: 'signal that parent is live',
// batch of messages from an iframe window killIframe: 'stop tracker inside frame',
iframeBatch: 'never-gonna-tell-a-lie-and-hurt-you', startIframe: 'start tracker inside frame',
// checking updates
polling: 'hello-how-are-you-im-under-the-water-please-help-me',
} as const } as const
export default class App { export default class App {
@ -237,7 +251,6 @@ export default class App {
private rootId: number | null = null private rootId: number | null = null
private pageFrames: HTMLIFrameElement[] = [] private pageFrames: HTMLIFrameElement[] = []
private frameOderNumber = 0 private frameOderNumber = 0
private readonly initialHostName = location.hostname
private features = { private features = {
'feature-flags': true, 'feature-flags': true,
'usability-test': true, 'usability-test': true,
@ -248,7 +261,7 @@ export default class App {
sessionToken: string | undefined, sessionToken: string | undefined,
options: Partial<Options>, options: Partial<Options>,
private readonly signalError: (error: string, apis: string[]) => void, private readonly signalError: (error: string, apis: string[]) => void,
private readonly insideIframe: boolean, public readonly insideIframe: boolean,
) { ) {
this.contextId = Math.random().toString(36).slice(2) this.contextId = Math.random().toString(36).slice(2)
this.projectKey = projectKey this.projectKey = projectKey
@ -305,6 +318,7 @@ export default class App {
__save_canvas_locally: false, __save_canvas_locally: false,
useAnimationFrame: false, useAnimationFrame: false,
}, },
angularMode: false,
} }
this.options = simpleMerge(defaultOptions, options) this.options = simpleMerge(defaultOptions, options)
@ -322,7 +336,7 @@ export default class App {
this.localStorage = this.options.localStorage ?? window.localStorage this.localStorage = this.options.localStorage ?? window.localStorage
this.sessionStorage = this.options.sessionStorage ?? window.sessionStorage this.sessionStorage = this.options.sessionStorage ?? window.sessionStorage
this.sanitizer = new Sanitizer(this, options) this.sanitizer = new Sanitizer(this, options)
this.nodes = new Nodes(this.options.node_id) this.nodes = new Nodes(this.options.node_id, Boolean(options.angularMode))
this.observer = new Observer(this, options) this.observer = new Observer(this, options)
this.ticker = new Ticker(this) this.ticker = new Ticker(this)
this.ticker.attach(() => this.commit()) this.ticker.attach(() => this.commit())
@ -348,136 +362,31 @@ export default class App {
this.session.applySessionHash(sessionToken) this.session.applySessionHash(sessionToken)
} }
this.initWorker()
const thisTab = this.session.getTabId() const thisTab = this.session.getTabId()
if (this.insideIframe) {
/**
* listen for messages from parent window, so we can signal that we're alive
* */
window.addEventListener('message', this.parentCrossDomainFrameListener)
setInterval(() => {
window.parent.postMessage(
{
line: proto.polling,
context: this.contextId,
},
'*',
)
}, 250)
} else {
this.initWorker()
}
if (!this.insideIframe) { if (!this.insideIframe) {
/** /**
* if we get a signal from child iframes, we check for their node_id and send it back, * if we get a signal from child iframes, we check for their node_id and send it back,
* so they can act as if it was just a same-domain iframe * so they can act as if it was just a same-domain iframe
* */ * */
let crossdomainFrameCount = 0 window.addEventListener('message', this.crossDomainIframeListener)
const catchIframeMessage = (event: MessageEvent) => {
const { data } = event
if (data.line === proto.iframeSignal) {
const childIframeDomain = data.domain
const pageIframes = Array.from(document.querySelectorAll('iframe'))
this.pageFrames = pageIframes
const signalId = async () => {
let tries = 0
while (tries < 10) {
const id = this.checkNodeId(pageIframes, childIframeDomain)
if (id) {
this.waitStarted()
.then(() => {
crossdomainFrameCount++
const token = this.session.getSessionToken()
const iframeData = {
line: proto.iframeId,
context: this.contextId,
domain: childIframeDomain,
id,
token,
frameOrderNumber: crossdomainFrameCount,
}
this.debug.log('iframe_data', iframeData)
// @ts-ignore
event.source?.postMessage(iframeData, '*')
})
.catch(console.error)
tries = 10
break
}
tries++
await delay(100)
}
}
void signalId()
}
/**
* proxying messages from iframe to main body, so they can be in one batch (same indexes, etc)
* plus we rewrite some of the messages to be relative to the main context/window
* */
if (data.line === proto.iframeBatch) {
const msgBatch = data.messages
const mappedMessages: Message[] = msgBatch.map((msg: Message) => {
if (msg[0] === MType.MouseMove) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.dataset.domain === event.data.domain) {
const [type, x, y] = msg
const { left, top } = frame.getBoundingClientRect()
fixedMessage = [type, x + left, y + top]
}
})
return fixedMessage
}
if (msg[0] === MType.MouseClick) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.dataset.domain === event.data.domain) {
const [type, id, hesitationTime, label, selector, normX, normY] = msg
const { left, top, width, height } = frame.getBoundingClientRect()
const contentWidth = document.documentElement.scrollWidth
const contentHeight = document.documentElement.scrollHeight
// (normalizedX * frameWidth + frameLeftOffset)/docSize
const fullX = (normX / 100) * width + left
const fullY = (normY / 100) * height + top
const fixedX = fullX / contentWidth
const fixedY = fullY / contentHeight
fixedMessage = [
type,
id,
hesitationTime,
label,
selector,
Math.round(fixedX * 1e3) / 1e1,
Math.round(fixedY * 1e3) / 1e1,
]
}
})
return fixedMessage
}
return msg
})
this.messages.push(...mappedMessages)
}
}
window.addEventListener('message', catchIframeMessage)
this.attachStopCallback(() => {
window.removeEventListener('message', catchIframeMessage)
})
} else {
const catchParentMessage = (event: MessageEvent) => {
const { data } = event
if (data.line !== proto.iframeId) {
return
}
this.rootId = data.id
this.session.setSessionToken(data.token as string)
this.frameOderNumber = data.frameOrderNumber
this.debug.log('starting iframe tracking', data)
this.allowAppStart()
}
window.addEventListener('message', catchParentMessage)
this.attachStopCallback(() => {
window.removeEventListener('message', catchParentMessage)
})
// communicating with parent window,
// even if its crossdomain is possible via postMessage api
const domain = this.initialHostName
window.parent.postMessage(
{
line: proto.iframeSignal,
source: thisTab,
context: this.contextId,
domain,
},
'*',
)
} }
if (this.bc !== null) { if (this.bc !== null) {
@ -488,7 +397,7 @@ export default class App {
}) })
this.startTimeout = setTimeout(() => { this.startTimeout = setTimeout(() => {
this.allowAppStart() this.allowAppStart()
}, 500) }, 250)
this.bc.onmessage = (ev: MessageEvent<RickRoll>) => { this.bc.onmessage = (ev: MessageEvent<RickRoll>) => {
if (ev.data.context === this.contextId) { if (ev.data.context === this.contextId) {
return return
@ -519,8 +428,204 @@ export default class App {
} }
} }
/** used by child iframes for crossdomain only */
/** used by child iframes for crossdomain only */
parentActive = false
checkStatus = () => {
return this.parentActive
}
parentCrossDomainFrameListener = (event: MessageEvent) => {
const { data } = event
if (!data || event.source === window) return
if (data.line === proto.startIframe) {
if (this.active()) return
try {
this.allowAppStart()
void this.start()
} catch (e) {
console.error('children frame restart failed:', e)
}
}
if (data.line === proto.parentAlive) {
this.parentActive = true
}
if (data.line === proto.iframeId) {
this.parentActive = true
this.rootId = data.id
this.session.setSessionToken(data.token as string)
this.frameOderNumber = data.frameOrderNumber
this.debug.log('starting iframe tracking', data)
this.allowAppStart()
}
if (data.line === proto.killIframe) {
if (this.active()) {
this.stop()
}
}
}
/**
* context ids for iframes,
* order is not so important as long as its consistent
* */
trackedFrames: string[] = []
crossDomainIframeListener = (event: MessageEvent) => {
if (!this.active() || event.source === window) return
const { data } = event
if (!data) return
if (data.line === proto.iframeSignal) {
// @ts-ignore
event.source?.postMessage({ ping: true, line: proto.parentAlive }, '*')
const pageIframes = Array.from(document.querySelectorAll('iframe'))
this.pageFrames = pageIframes
const signalId = async () => {
if (event.source === null) {
return console.error('Couldnt connect to event.source for child iframe tracking')
}
const id = await this.checkNodeId(pageIframes, event.source)
if (id && !this.trackedFrames.includes(data.context)) {
try {
this.trackedFrames.push(data.context)
await this.waitStarted()
const token = this.session.getSessionToken()
const order = this.trackedFrames.findIndex((f) => f === data.context) + 1
if (order === 0) {
this.debug.error(
'Couldnt get order number for iframe',
data.context,
this.trackedFrames,
)
}
const iframeData = {
line: proto.iframeId,
id,
token,
// since indexes go from 0 we +1
frameOrderNumber: order,
}
this.debug.log('Got child frame signal; nodeId', id, event.source, iframeData)
// @ts-ignore
event.source?.postMessage(iframeData, '*')
} catch (e) {
console.error(e)
}
} else {
this.debug.log('Couldnt get node id for iframe', event.source, pageIframes)
}
}
void signalId()
}
/**
* proxying messages from iframe to main body, so they can be in one batch (same indexes, etc)
* plus we rewrite some of the messages to be relative to the main context/window
* */
if (data.line === proto.iframeBatch) {
const msgBatch = data.messages
const mappedMessages: Message[] = msgBatch.map((msg: Message) => {
if (msg[0] === MType.MouseMove) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.contentWindow === event.source) {
const [type, x, y] = msg
const { left, top } = frame.getBoundingClientRect()
fixedMessage = [type, x + left, y + top]
}
})
return fixedMessage
}
if (msg[0] === MType.MouseClick) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.contentWindow === event.source) {
const [type, id, hesitationTime, label, selector, normX, normY] = msg
const { left, top, width, height } = frame.getBoundingClientRect()
const contentWidth = document.documentElement.scrollWidth
const contentHeight = document.documentElement.scrollHeight
// (normalizedX * frameWidth + frameLeftOffset)/docSize
const fullX = (normX / 100) * width + left
const fullY = (normY / 100) * height + top
const fixedX = fullX / contentWidth
const fixedY = fullY / contentHeight
fixedMessage = [
type,
id,
hesitationTime,
label,
selector,
Math.round(fixedX * 1e3) / 1e1,
Math.round(fixedY * 1e3) / 1e1,
]
}
})
return fixedMessage
}
return msg
})
this.messages.push(...mappedMessages)
}
if (data.line === proto.polling) {
if (!this.pollingQueue.order.length) {
return
}
const nextCommand = this.pollingQueue.order[0]
if (this.pollingQueue[nextCommand].includes(data.context)) {
this.pollingQueue[nextCommand] = this.pollingQueue[nextCommand].filter(
(c: string) => c !== data.context,
)
// @ts-ignore
event.source?.postMessage({ line: nextCommand }, '*')
if (this.pollingQueue[nextCommand].length === 0) {
this.pollingQueue.order.shift()
}
}
}
}
/**
* { command : [remaining iframes] }
* + order of commands
**/
pollingQueue: Record<string, any> = {
order: [],
}
private readonly addCommand = (cmd: string) => {
this.pollingQueue.order.push(cmd)
this.pollingQueue[cmd] = [...this.trackedFrames]
}
public bootChildrenFrames = async () => {
await this.waitStarted()
this.addCommand(proto.startIframe)
}
public killChildrenFrames = () => {
this.addCommand(proto.killIframe)
}
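// Aside (hedged, simplified model of the polling queue above, not part of the tracker source):
// a command is queued for every currently tracked frame; each child frame polls the parent,
// receives the oldest pending command once, and the command is dropped after the last frame
// has picked it up.
class CommandQueueSketch {
private order: string[] = []
private pending: Record<string, string[]> = {}
enqueue(cmd: string, frames: string[]) {
this.order.push(cmd)
this.pending[cmd] = [...frames]
}
// called when a frame polls; returns the command that should be posted back to it, if any
poll(frameId: string): string | null {
const next = this.order[0]
if (!next || !this.pending[next].includes(frameId)) return null
this.pending[next] = this.pending[next].filter((f) => f !== frameId)
if (this.pending[next].length === 0) this.order.shift()
return next
}
}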
signalIframeTracker = () => {
const thisTab = this.session.getTabId()
const signalToParent = (n: number) => {
window.parent.postMessage(
{
line: proto.iframeSignal,
source: thisTab,
context: this.contextId,
},
this.options.crossdomain?.parentDomain ?? '*',
)
setTimeout(() => {
if (!this.checkStatus() && n < 100) {
void signalToParent(n + 1)
}
}, 250)
}
void signalToParent(1)
}
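// Aside (hedged usage sketch): based on the options referenced in this changeset
// (options.crossdomain?.enabled, options.crossdomain?.parentDomain), cross-domain recording
// appears to require a tracker instance on both sides; the exact option shape below is an
// assumption inferred from this diff, not documented API.
//
// import Tracker from '@openreplay/tracker'
// // parent page
// const parentTracker = new Tracker({ projectKey: '<key>', crossdomain: { enabled: true } })
// // page loaded inside the cross-origin iframe
// const childTracker = new Tracker({
//   projectKey: '<key>',
//   crossdomain: { enabled: true, parentDomain: 'https://parent.example.com' },
// })
// The child keeps re-sending proto.iframeSignal every 250ms (up to 100 attempts) until the
// parent answers, as implemented in signalIframeTracker above.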
 startTimeout: ReturnType<typeof setTimeout> | null = null
-private allowAppStart() {
+public allowAppStart() {
 this.canStart = true
 if (this.startTimeout) {
 clearTimeout(this.startTimeout)
@@ -528,15 +633,38 @@ export default class App {
 }
 }
-private checkNodeId(iframes: HTMLIFrameElement[], domain: string) {
+private async checkNodeId(
+iframes: HTMLIFrameElement[],
+source: MessageEventSource,
+): Promise<number | null> {
 for (const iframe of iframes) {
-if (iframe.dataset.domain === domain) {
-// @ts-ignore
-return iframe[this.options.node_id] as number | undefined
+if (iframe.contentWindow && iframe.contentWindow === source) {
+/**
+ * Here we're trying to get node id from the iframe (which is kept in observer)
+ * because of async nature of dom initialization, we give 100 retries with 100ms delay each
+ * which equals to 10 seconds. This way we have a period where we give app some time to load
+ * and tracker some time to parse the initial DOM tree even on slower devices
+ * */
+let tries = 0
+while (tries < 100) {
+// @ts-ignore
+const potentialId = iframe[this.options.node_id]
+if (potentialId !== undefined) {
+tries = 100
+return potentialId
+} else {
+tries++
+await delay(100)
+}
+}
+return null
 }
 }
 return null
 }
 private initWorker() {
 try {
 this.worker = new Worker(
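// Aside (hedged sketch, not part of the diff): `delay` used in checkNodeId above is assumed to be
// the usual promise-based timeout helper, giving the parent's observer roughly 100 x 100ms = 10s
// to assign an id to the iframe element.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms))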
@@ -647,28 +775,28 @@ export default class App {
 this.messages.length = 0
 return
 }
-if (this.worker === undefined || !this.messages.length) {
-return
-}
 if (this.insideIframe) {
 window.parent.postMessage(
 {
 line: proto.iframeBatch,
 messages: this.messages,
-domain: this.initialHostName,
 },
-'*',
+this.options.crossdomain?.parentDomain ?? '*',
 )
 this.commitCallbacks.forEach((cb) => cb(this.messages))
 this.messages.length = 0
 return
 }
+if (this.worker === undefined || !this.messages.length) {
+return
+}
 try {
 requestIdleCb(() => {
 this.messages.unshift(TabData(this.session.getTabId()))
 this.messages.unshift(Timestamp(this.timestamp()))
-// why I need to add opt chaining?
 this.worker?.postMessage(this.messages)
 this.commitCallbacks.forEach((cb) => cb(this.messages))
 this.messages.length = 0
@@ -740,36 +868,39 @@ export default class App {
 this.commitCallbacks.push(cb)
 }
-attachStartCallback(cb: StartCallback, useSafe = false): void {
+attachStartCallback = (cb: StartCallback, useSafe = false): void => {
 if (useSafe) {
 cb = this.safe(cb)
 }
 this.startCallbacks.push(cb)
 }
-attachStopCallback(cb: () => any, useSafe = false): void {
+attachStopCallback = (cb: () => any, useSafe = false): void => {
 if (useSafe) {
 cb = this.safe(cb)
 }
 this.stopCallbacks.push(cb)
 }
-// Use app.nodes.attachNodeListener for registered nodes instead
-attachEventListener(
+attachEventListener = (
 target: EventTarget,
 type: string,
 listener: EventListener,
 useSafe = true,
 useCapture = true,
-): void {
+): void => {
 if (useSafe) {
 listener = this.safe(listener)
 }
 const createListener = () =>
-target ? createEventListener(target, type, listener, useCapture) : null
+target
+? createEventListener(target, type, listener, useCapture, this.options.angularMode)
+: null
 const deleteListener = () =>
-target ? deleteEventListener(target, type, listener, useCapture) : null
+target
+? deleteEventListener(target, type, listener, useCapture, this.options.angularMode)
+: null
 this.attachStartCallback(createListener, useSafe)
 this.attachStopCallback(deleteListener, useSafe)
@@ -1157,7 +1288,7 @@ export default class App {
 if (isColdStart && this.coldInterval) {
 clearInterval(this.coldInterval)
 }
-if (!this.worker) {
+if (!this.worker && !this.insideIframe) {
 const reason = 'No worker found: perhaps, CSP is not set.'
 this.signalError(reason, [])
 return Promise.resolve(UnsuccessfulStart(reason))
@@ -1189,7 +1320,7 @@ export default class App {
 })
 const timestamp = now()
-this.worker.postMessage({
+this.worker?.postMessage({
 type: 'start',
 pageNo: this.session.incPageNo(),
 ingestPoint: this.options.ingestPoint,
@@ -1237,7 +1368,7 @@ export default class App {
 const reason = error === CANCELED ? CANCELED : `Server error: ${r.status}. ${error}`
 return UnsuccessfulStart(reason)
 }
-if (!this.worker) {
+if (!this.worker && !this.insideIframe) {
 const reason = 'no worker found after start request (this should not happen in real world)'
 this.signalError(reason, [])
 return UnsuccessfulStart(reason)
@@ -1295,9 +1426,9 @@ export default class App {
 if (socketOnly) {
 this.socketMode = true
-this.worker.postMessage('stop')
+this.worker?.postMessage('stop')
 } else {
-this.worker.postMessage({
+this.worker?.postMessage({
 type: 'auth',
 token,
 beaconSizeLimit,
@@ -1320,11 +1451,17 @@ export default class App {
 // TODO: start as early as possible (before receiving the token)
 /** after start */
 this.startCallbacks.forEach((cb) => cb(onStartInfo)) // MBTODO: callbacks after DOM "mounted" (observed)
+if (startOpts.startCallback) {
+startOpts.startCallback(SuccessfulStart(onStartInfo))
+}
 if (this.features['feature-flags']) {
 void this.featureFlags.reloadFlags()
 }
 await this.tagWatcher.fetchTags(this.options.ingestPoint, token)
 this.activityState = ActivityState.Active
+if (this.options.crossdomain?.enabled && !this.insideIframe) {
+void this.bootChildrenFrames()
+}
 if (canvasEnabled && !this.options.canvas.disableCanvas) {
 this.canvasRecorder =
@@ -1336,7 +1473,6 @@ export default class App {
 fixedScaling: this.options.canvas.fixedCanvasScaling,
 useAnimationFrame: this.options.canvas.useAnimationFrame,
 })
-this.canvasRecorder.startTracking()
 }
 /** --------------- COLD START BUFFER ------------------*/
@@ -1359,9 +1495,12 @@ export default class App {
 }
 this.ticker.start()
 }
+this.canvasRecorder?.startTracking()
 if (this.features['usability-test']) {
-this.uxtManager = this.uxtManager ? this.uxtManager : new UserTestManager(this, uxtStorageKey)
+this.uxtManager = this.uxtManager
+? this.uxtManager
+: new UserTestManager(this, uxtStorageKey)
 let uxtId: number | undefined
 const savedUxtTag = this.localStorage.getItem(uxtStorageKey)
 if (savedUxtTag) {
@@ -1394,6 +1533,11 @@ export default class App {
 } catch (reason) {
 this.stop()
 this.session.reset()
+if (!reason) {
+console.error('Unknown error during start')
+this.signalError('Unknown error', [])
+return UnsuccessfulStart('Unknown error')
+}
 if (reason === CANCELED) {
 this.signalError(CANCELED, [])
 return UnsuccessfulStart(CANCELED)
@@ -1452,9 +1596,13 @@ export default class App {
 }
 async waitStarted() {
+return this.waitStatus(ActivityState.Active)
+}
+async waitStatus(status: ActivityState) {
 return new Promise((resolve) => {
 const check = () => {
-if (this.activityState === ActivityState.Active) {
+if (this.activityState === status) {
 resolve(true)
 } else {
 setTimeout(check, 25)
@@ -1478,6 +1626,10 @@ export default class App {
 return Promise.resolve(UnsuccessfulStart(reason))
 }
+if (this.insideIframe) {
+this.signalIframeTracker()
+}
 if (!document.hidden) {
 await this.waitStart()
 return this._start(...args)
@@ -1533,20 +1685,28 @@ export default class App {
 stop(stopWorker = true): void {
 if (this.activityState !== ActivityState.NotActive) {
 try {
+if (!this.insideIframe && this.options.crossdomain?.enabled) {
+this.killChildrenFrames()
+}
 this.attributeSender.clear()
 this.sanitizer.clear()
 this.observer.disconnect()
 this.nodes.clear()
 this.ticker.stop()
 this.stopCallbacks.forEach((cb) => cb())
-this.debug.log('OpenReplay tracking stopped.')
 this.tagWatcher.clear()
 if (this.worker && stopWorker) {
 this.worker.postMessage('stop')
 }
 this.canvasRecorder?.clear()
+this.messages.length = 0
+this.trackedFrames = []
+this.parentActive = false
+this.canStart = false
+this.pollingQueue = { order: [] }
 } finally {
 this.activityState = ActivityState.NotActive
+this.debug.log('OpenReplay tracking stopped.')
 }
 }
 }

@@ -10,10 +10,13 @@ export default class Nodes {
 private readonly elementListeners: Map<number, Array<ElementListener>> = new Map()
 private nextNodeId = 0
-constructor(private readonly node_id: string) {}
+constructor(
+private readonly node_id: string,
+private readonly angularMode: boolean,
+) {}
 syntheticMode(frameOrder: number) {
-const maxSafeNumber = 9007199254740900
+const maxSafeNumber = Number.MAX_SAFE_INTEGER
 const placeholderSize = 99999999
 const nextFrameId = placeholderSize * frameOrder
 // I highly doubt that this will ever happen,
@@ -25,7 +28,7 @@ export default class Nodes {
 }
 // Attached once per Tracker instance
-attachNodeCallback(nodeCallback: NodeCallback): void {
+attachNodeCallback = (nodeCallback: NodeCallback): void => {
 this.nodeCallbacks.push(nodeCallback)
 }
@@ -33,12 +36,12 @@ export default class Nodes {
 this.nodes.forEach((node) => cb(node))
 }
-attachNodeListener(node: Node, type: string, listener: EventListener, useCapture = true): void {
+attachNodeListener = (node: Node, type: string, listener: EventListener, useCapture = true): void => {
 const id = this.getID(node)
 if (id === undefined) {
 return
 }
-createEventListener(node, type, listener, useCapture)
+createEventListener(node, type, listener, useCapture, this.angularMode)
 let listeners = this.elementListeners.get(id)
 if (listeners === undefined) {
 listeners = []
@@ -70,7 +73,7 @@ export default class Nodes {
 if (listeners !== undefined) {
 this.elementListeners.delete(id)
 listeners.forEach((listener) =>
-deleteEventListener(node, listener[0], listener[1], listener[2]),
+deleteEventListener(node, listener[0], listener[1], listener[2], this.angularMode),
 )
 }
 this.totalNodeAmount--
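// Aside (worked example, not part of the diff): syntheticMode above starts a cross-domain
// frame's node numbering at placeholderSize * frameOrderNumber, so each frame owns a disjoint
// block of roughly 1e8 ids and cannot collide with the parent page or sibling frames.
const placeholderSize = 99999999
const firstSyntheticId = (frameOrderNumber: number) => placeholderSize * frameOrderNumber
// firstSyntheticId(1) === 99_999_999, firstSyntheticId(2) === 199_999_998, ...
// the Number.MAX_SAFE_INTEGER guard would only matter after roughly 90 million frames.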

@@ -19,13 +19,13 @@ export default class IFrameObserver extends Observer {
 })
 }
-syntheticObserve(selfId: number, doc: Document) {
+syntheticObserve(rootNodeId: number, doc: Document) {
 this.observeRoot(doc, (docID) => {
 if (docID === undefined) {
 this.app.debug.log('OpenReplay: Iframe document not bound')
 return
 }
-this.app.send(CreateIFrameDocument(selfId, docID))
+this.app.send(CreateIFrameDocument(rootNodeId, docID))
 })
 }
 }

@@ -1,4 +1,4 @@
-import { createMutationObserver, ngSafeBrowserMethod } from '../../utils.js'
+import { createMutationObserver } from '../../utils.js'
 import {
 RemoveNodeAttribute,
 SetNodeAttributeURLBased,
@@ -105,6 +105,9 @@ export default abstract class Observer {
 if (name === null) {
 continue
 }
+if (target instanceof HTMLIFrameElement && name === 'src') {
+this.handleIframeSrcChange(target)
+}
 let attr = this.attributesMap.get(id)
 if (attr === undefined) {
 this.attributesMap.set(id, (attr = new Set()))
@@ -119,6 +122,7 @@ export default abstract class Observer {
 }
 this.commitNodes()
 }) as MutationCallback,
+this.app.options.angularMode,
 )
 }
 private clear(): void {
@@ -129,10 +133,49 @@ export default abstract class Observer {
 this.textSet.clear()
 }
+/**
+ * Unbinds the removed nodes in case of iframe src change.
+ */
+private handleIframeSrcChange(iframe: HTMLIFrameElement): void {
+const oldContentDocument = iframe.contentDocument
+if (oldContentDocument) {
+const id = this.app.nodes.getID(oldContentDocument)
+if (id !== undefined) {
+const walker = document.createTreeWalker(
+oldContentDocument,
+NodeFilter.SHOW_ELEMENT + NodeFilter.SHOW_TEXT,
+{
+acceptNode: (node) =>
+isIgnored(node) || this.app.nodes.getID(node) === undefined
+? NodeFilter.FILTER_REJECT
+: NodeFilter.FILTER_ACCEPT,
+},
+// @ts-ignore
+false,
+)
+let removed = 0
+const totalBeforeRemove = this.app.nodes.getNodeCount()
+while (walker.nextNode()) {
+if (!iframe.contentDocument.contains(walker.currentNode)) {
+removed += 1
+this.app.nodes.unregisterNode(walker.currentNode)
+}
+}
+const removedPercent = Math.floor((removed / totalBeforeRemove) * 100)
+if (removedPercent > 30) {
+this.app.send(UnbindNodes(removedPercent))
+}
+}
+}
+}
 private sendNodeAttribute(id: number, node: Element, name: string, value: string | null): void {
 if (isSVGElement(node)) {
-if (name.substr(0, 6) === 'xlink:') {
-name = name.substr(6)
+if (name.substring(0, 6) === 'xlink:') {
+name = name.substring(6)
 }
 if (value === null) {
 this.app.send(RemoveNodeAttribute(id, name))
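// Aside (worked example, not part of the diff): the `removedPercent > 30` check in
// handleIframeSrcChange above only reports bulk unbinds. For instance:
const removedPercentExample = Math.floor((450 / 1000) * 100) // 450 of 1000 tracked nodes -> 45
// 45 > 30, so UnbindNodes(45) would be sent; smaller cleanups stay in the normal mutation stream.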
@@ -152,7 +195,7 @@ export default abstract class Observer {
 name === 'integrity' ||
 name === 'crossorigin' ||
 name === 'autocomplete' ||
-name.substr(0, 2) === 'on'
+name.substring(0, 2) === 'on'
 ) {
 return
 }

@@ -140,7 +140,7 @@ export default class TopObserver extends Observer {
 )
 }
-crossdomainObserve(selfId: number, frameOder: number) {
+crossdomainObserve(rootNodeId: number, frameOder: number) {
 const observer = this
 Element.prototype.attachShadow = function () {
 // eslint-disable-next-line
@@ -152,7 +152,7 @@ export default class TopObserver extends Observer {
 this.app.nodes.syntheticMode(frameOder)
 const iframeObserver = new IFrameObserver(this.app)
 this.iframeObservers.push(iframeObserver)
-iframeObserver.syntheticObserve(selfId, window.document)
+iframeObserver.syntheticObserve(rootNodeId, window.document)
 }
 disconnect() {

@@ -99,6 +99,7 @@ export default function (app: App): void {
 }
 }
 }) as MutationCallback,
+app.options.angularMode,
 )
 app.attachStopCallback(() => {

@@ -132,9 +132,13 @@ export function ngSafeBrowserMethod(method: string): string {
 : method
 }
-export function createMutationObserver(cb: MutationCallback) {
-const mObserver = ngSafeBrowserMethod('MutationObserver') as 'MutationObserver'
-return new window[mObserver](cb)
+export function createMutationObserver(cb: MutationCallback, angularMode?: boolean) {
+if (angularMode) {
+const mObserver = ngSafeBrowserMethod('MutationObserver') as 'MutationObserver'
+return new window[mObserver](cb)
+} else {
+return new MutationObserver(cb)
+}
 }
 export function createEventListener(
@@ -142,15 +146,23 @@ export function createEventListener(
 event: string,
 cb: EventListenerOrEventListenerObject,
 capture?: boolean,
+angularMode?: boolean,
 ) {
-const safeAddEventListener = ngSafeBrowserMethod('addEventListener') as 'addEventListener'
+let safeAddEventListener: 'addEventListener'
+if (angularMode) {
+safeAddEventListener = ngSafeBrowserMethod('addEventListener') as 'addEventListener'
+} else {
+safeAddEventListener = 'addEventListener'
+}
 try {
 target[safeAddEventListener](event, cb, capture)
 } catch (e) {
 const msg = e.message
-console.debug(
+console.error(
 // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
 `Openreplay: ${msg}; if this error is caused by an IframeObserver, ignore it`,
+event,
+target,
 )
 }
 }
@@ -160,17 +172,23 @@ export function deleteEventListener(
 event: string,
 cb: EventListenerOrEventListenerObject,
 capture?: boolean,
+angularMode?: boolean,
 ) {
-const safeRemoveEventListener = ngSafeBrowserMethod(
-'removeEventListener',
-) as 'removeEventListener'
+let safeRemoveEventListener: 'removeEventListener'
+if (angularMode) {
+safeRemoveEventListener = ngSafeBrowserMethod('removeEventListener') as 'removeEventListener'
+} else {
+safeRemoveEventListener = 'removeEventListener'
+}
 try {
 target[safeRemoveEventListener](event, cb, capture)
 } catch (e) {
 const msg = e.message
-console.debug(
+console.error(
 // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
 `Openreplay: ${msg}; if this error is caused by an IframeObserver, ignore it`,
+event,
+target,
 )
 }
 }
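// Aside (hedged sketch, not part of the diff): ngSafeBrowserMethod is assumed to resolve the
// zone.js-free variant of a browser API. zone.js (used by Angular) patches methods such as
// addEventListener and keeps the originals under a __zone_symbol__ prefix, so in angularMode
// the tracker can attach its listeners without triggering Angular change detection.
// A hypothetical re-implementation:
function ngSafeBrowserMethodSketch(method: string): string {
// if zone.js is present, prefer the unpatched original it stashed away
return 'Zone' in window && `__zone_symbol__${method}` in window
? `__zone_symbol__${method}`
: method
}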

@@ -7,7 +7,7 @@ describe('Nodes', () => {
 const mockCallback = jest.fn()
 beforeEach(() => {
-nodes = new Nodes(nodeId)
+nodes = new Nodes(nodeId, false)
 mockCallback.mockClear()
 })