Compare commits


21 commits

Author SHA1 Message Date
Mehdi Osman
dda7affd5d
Added mobile and canvas related topics 2024-07-24 14:19:55 -04:00
Atef Ben Ali
6e2a772e7f
docs: update README_AR.md file (#2421) 2024-07-24 09:44:21 -04:00
Mehdi Osman
dcce3569fb
Increment chalice chart version (#2417)
Co-authored-by: GitHub Action <action@github.com>
2024-07-23 13:17:48 +02:00
Kraiem Taha Yassine
867247dbc0
fix(chalice): fixed insights with filter steps (#2416) 2024-07-23 13:01:48 +02:00
PiR
754293e29d
Tracker GraphQL: update doc and tracker initialization + add option to pass sanitizer function (#2402)
* fix(graphQL): update doc and tracker initialization + add option to pass sanitizer function

* improvement(graphQL): improve sanitizer type & apollo operation name
2024-07-22 16:07:12 +02:00
Mehdi Osman
ddd037ce79
Increment chalice chart version (#2411)
Co-authored-by: GitHub Action <action@github.com>
2024-07-19 15:09:47 +02:00
Kraiem Taha Yassine
66f4c5c93b
fix(chalice): stop SA from logout (#2410) 2024-07-19 15:03:59 +02:00
Mehdi Osman
66e4d133ad
Increment chalice chart version (#2407)
Co-authored-by: GitHub Action <action@github.com>
2024-07-18 13:17:27 +02:00
Kraiem Taha Yassine
f9f8853ab0
fix(chalice): fixed search mobile sessions (#2406)
fix(chalice): fixed autocomplete mobile sessions
2024-07-18 13:08:05 +02:00
Mehdi Osman
e0bb6fea9d
Updated patch build from main 4e7efaecde (#2405)
* Increment frontend chart version

* Increment db chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-07-18 12:46:42 +02:00
Alexander
4e7efaecde
Heatmaps fix (float coordinates) (#2403) (#2404)
* feat(spot): use float click coordinates instead of ints in PG

* feat(db): added support for float clicks in CH

* feat(db): fix float instead of uint8

* feat(mobile): new naming for mobile autocomplete types
2024-07-18 12:38:47 +02:00
Delirium
54a9624332
Heatmaps patch 2 (#2400)
* fix ui: move clickmap overlay inside replay vdom, refactor renderer scaling

* fix ui: fix first event calculation
2024-07-17 18:57:21 +02:00
Mehdi Osman
1ddffca572
Increment frontend chart version (#2395)
Co-authored-by: GitHub Action <action@github.com>
2024-07-16 17:38:25 +02:00
Delirium
c91881413a
fix manager event reads for mobile (#2394) 2024-07-16 17:34:12 +02:00
Mehdi Osman
ba2d9eb81c
Increment chalice chart version (#2393)
Co-authored-by: GitHub Action <action@github.com>
2024-07-16 17:20:59 +02:00
Kraiem Taha Yassine
c845415e1e
Patch/api v1.19.0 (#2392)
* fix(chalice): reversed count&total for card-tables to confuse devs

* fix(DB): changed normalized_x&y col-type
2024-07-16 17:15:38 +02:00
Mehdi Osman
ee0ede8478
Increment chalice chart version (#2391)
Co-authored-by: GitHub Action <action@github.com>
2024-07-16 14:26:47 +02:00
Kraiem Taha Yassine
72afae226b
fix(chalice): fixed missing totalSessions in card-tables in EE (#2390)
* fix(chalice): fixed missing totalSessions in card-tables in EE

* fix(chalice): fixed missing totalSessions in card-tables in EE
2024-07-16 14:16:26 +02:00
Shekar Siri
b3f545849a
fix(ui): use count instead of totalSessions (#2387) 2024-07-12 17:38:44 +02:00
Mehdi Osman
cd2966fb9f
Increment chalice chart version (#2384)
Co-authored-by: GitHub Action <action@github.com>
2024-07-11 11:50:42 +02:00
Kraiem Taha Yassine
4b91dcded0
Patch/api v1.19.0 (#2383)
* fix(chalice): fixed create heatmap card EE

* fix(chalice): fixed click_rage-heatmap card EE

* fix(chalice): fixed click_rage-heatmap ambiguous alias EE
2024-07-11 11:36:07 +02:00
41 changed files with 457 additions and 245 deletions


@ -55,17 +55,17 @@ OpenReplay is the session replay suite that can
## Features
- **Session Replay:** Session replay lets you relive your users' experience, see where they struggle and how it affects their behavior. Every session replay is automatically analyzed based on heuristics, for easy triage.
- **DevTools:** It's like debugging in your own browser. OpenReplay gives you the full context (network activity, JavaScript errors, store actions/state and 40+ metrics) so you can instantly reproduce bugs and understand performance issues.
- **DevTools:** It's like a debugger in your own browser. OpenReplay gives you the full context (network activity, JavaScript errors, store actions/state and 40+ metrics) so you can instantly reproduce bugs and understand performance issues.
- **Assist:** Helps you support your users by seeing their live screen and instantly joining a call (WebRTC) with them, without requiring third-party screen-sharing software.
- **Omni-search:** Search and filter by almost any user action/criteria, session attribute or technical event, so you can answer any question. No instrumentation required.
- **Omni-search:** Search and filter by almost any user action/criteria, session attribute or technical event, so you can answer any question. No instrumentation required.
- **Funnels:** For surfacing the most impactful issues causing conversion and revenue loss.
- **Fine-grained privacy controls:** Choose what to capture, what to obscure or ignore, so that user data never even reaches your servers.
- **Plugins oriented:** Get to the root cause faster by tracking application state (Redux, VueX, MobX, NgRx, Pinia and Zustand) and logging GraphQL queries (Apollo, Relay) and Fetch/Axios requests.
- **Fine-grained privacy controls:** Choose what to capture, what to obscure or ignore, so that user data never even reaches your servers.
- **Plugins oriented:** You can get to the root cause faster by tracking application state (Redux, VueX, MobX, NgRx, Pinia and Zustand) and logging GraphQL queries (Apollo, Relay) and Fetch/Axios requests.
- **Integrations:** Sync your backend logs with your session replays and see what happened front-to-back. OpenReplay supports Sentry, Datadog, CloudWatch, Stackdriver, Elastic and more.
## Deployment Options
OpenReplay can be deployed anywhere. Follow our step-by-step guide for deploying it on the major public clouds:
OpenReplay can be deployed anywhere. Follow our step-by-step guide for deploying it on the major public clouds:
- [AWS](https://docs.openreplay.com/deployment/deploy-aws)
- [Google Cloud](https://docs.openreplay.com/deployment/deploy-gcp)


@ -319,13 +319,14 @@ def create_card(project_id, user_id, data: schemas.CardSchema, dashboard=False):
session_data = None
if data.metric_type == schemas.MetricType.heat_map:
if data.session_id is not None:
session_data = json.dumps({"sessionId": data.session_id})
session_data = {"sessionId": data.session_id}
else:
session_data = __get_heat_map_chart(project_id=project_id, user_id=user_id,
data=data, include_mobs=False)
if session_data is not None:
session_data = json.dumps({"sessionId": session_data["sessionId"]})
_data = {"session_data": session_data}
session_data = {"sessionId": session_data["sessionId"]}
_data = {"session_data": json.dumps(session_data) if session_data is not None else None}
for i, s in enumerate(data.series):
for k in s.model_dump().keys():
_data[f"{k}_{i}"] = s.__getattribute__(k)
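The hunk above consolidates the JSON serialization of `session_data` into a single place. A minimal sketch of the resulting pattern, with hypothetical stand-ins for the schema and the `__get_heat_map_chart` lookup:

```python
import json

def build_card_data(metric_type, session_id, chart_lookup):
    """Sketch of the consolidated serialization in create_card.

    chart_lookup stands in for __get_heat_map_chart and may return None;
    session_data is kept as a dict in every branch and json.dumps is
    applied exactly once, at the end.
    """
    session_data = None
    if metric_type == "heat_map":
        if session_id is not None:
            session_data = {"sessionId": session_id}
        else:
            chart = chart_lookup()
            if chart is not None:
                session_data = {"sessionId": chart["sessionId"]}
    # Serialize once; a missing heatmap session stays None instead of "null".
    return {"session_data": json.dumps(session_data) if session_data is not None else None}
```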


@ -359,12 +359,12 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
distinct_on += ",path"
if metric_format == schemas.MetricExtendedFormatType.session_count:
main_query = f"""SELECT COUNT(*) AS count,
COALESCE(SUM(users_sessions.session_count),0) AS total_sessions,
COALESCE(SUM(users_sessions.session_count),0) AS count,
COALESCE(JSONB_AGG(users_sessions)
FILTER ( WHERE rn > %(limit_s)s
AND rn <= %(limit_e)s ), '[]'::JSONB) AS values
FROM (SELECT {main_col} AS name,
count(DISTINCT session_id) AS session_count,
count(DISTINCT session_id) AS total,
ROW_NUMBER() OVER (ORDER BY count(full_sessions) DESC) AS rn
FROM (SELECT *
FROM (SELECT DISTINCT ON({distinct_on}) s.session_id, s.user_uuid,
@ -379,7 +379,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
ORDER BY session_count DESC) AS users_sessions;"""
else:
main_query = f"""SELECT COUNT(*) AS count,
COALESCE(SUM(users_sessions.user_count),0) AS total_users,
COALESCE(SUM(users_sessions.user_count),0) AS count,
COALESCE(JSONB_AGG(users_sessions) FILTER ( WHERE rn <= 200 ), '[]'::JSONB) AS values
FROM (SELECT {main_col} AS name,
count(DISTINCT user_id) AS user_count,
@ -420,12 +420,12 @@ def search_table_of_individual_issues(data: schemas.SessionsSearchPayloadSchema,
full_args["issues_limit_s"] = (data.page - 1) * data.limit
full_args["issues_limit_e"] = data.page * data.limit
main_query = cur.mogrify(f"""SELECT COUNT(1) AS count,
COALESCE(SUM(session_count), 0) AS total_sessions,
COALESCE(SUM(session_count), 0) AS count,
COALESCE(JSONB_AGG(ranked_issues)
FILTER ( WHERE rn > %(issues_limit_s)s
AND rn <= %(issues_limit_e)s ), '[]'::JSONB) AS values
FROM (SELECT *, ROW_NUMBER() OVER (ORDER BY session_count DESC) AS rn
FROM (SELECT type AS name, context_string AS value, COUNT(DISTINCT session_id) AS session_count
FROM (SELECT type AS name, context_string AS value, COUNT(DISTINCT session_id) AS total
FROM (SELECT session_id
{query_part}) AS filtered_sessions
INNER JOIN events_common.issues USING (session_id)
@ -814,12 +814,6 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
event_where.append(
sh.multi_conditions(f"main.{events.EventType.VIEW_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform == "ios":
event_from = event_from % f"{events.EventType.SWIPE_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.{events.EventType.SWIPE_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == events.EventType.CUSTOM.ui_type:
event_from = event_from % f"{events.EventType.CUSTOM.table} AS main "
if not is_any:
@ -855,7 +849,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
event_where.append(sh.multi_conditions(f"main1.source = %({s_k})s", event.source, value_key=s_k))
# ----- IOS
# ----- Mobile
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.CLICK_MOBILE.table} AS main "
if not is_any:
@ -897,6 +891,13 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
event_where.append(
sh.multi_conditions(f"(main1.reason {op} %({e_k})s OR main1.name {op} %({e_k})s)",
event.value, value_key=e_k))
elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform != "web":
event_from = event_from % f"{events.EventType.SWIPE_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.{events.EventType.SWIPE_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.PerformanceEventType.fetch_failed:
event_from = event_from % f"{events.EventType.REQUEST.table} AS main "
if not is_any:


@ -19,6 +19,7 @@ from routers.base import get_routers
public_app, app, app_apikey = get_routers()
@app.get('/{projectId}/autocomplete', tags=["events"])
@app.get('/{projectId}/events/search', tags=["events"])
def events_search(projectId: int, q: str,
type: Union[schemas.FilterType, schemas.EventType,


@ -471,13 +471,13 @@ class EventType(str, Enum):
state_action = "stateAction"
error = "error"
tag = "tag"
click_mobile = "tapIos"
input_mobile = "inputIos"
view_mobile = "viewIos"
custom_mobile = "customIos"
request_mobile = "requestIos"
error_mobile = "errorIos"
swipe_mobile = "swipeIos"
click_mobile = "click_mobile"
input_mobile = "input_mobile"
view_mobile = "view_mobile"
custom_mobile = "custom_mobile"
request_mobile = "request_mobile"
error_mobile = "error_mobile"
swipe_mobile = "swipe_mobile"
class PerformanceEventType(str, Enum):


@ -81,13 +81,13 @@ func (s *saverImpl) handleMobileMessage(msg Message) error {
if err = s.sessions.UpdateUserID(session.SessionID, m.ID); err != nil {
return err
}
s.pg.InsertAutocompleteValue(session.SessionID, session.ProjectID, "USERID_Mobile", m.ID)
s.pg.InsertAutocompleteValue(session.SessionID, session.ProjectID, "USERID_MOBILE", m.ID)
return nil
case *MobileUserAnonymousID:
if err = s.sessions.UpdateAnonymousID(session.SessionID, m.ID); err != nil {
return err
}
s.pg.InsertAutocompleteValue(session.SessionID, session.ProjectID, "USERANONYMOUSID_Mobile", m.ID)
s.pg.InsertAutocompleteValue(session.SessionID, session.ProjectID, "USERANONYMOUSID_MOBILE", m.ID)
return nil
case *MobileMetadata:
return s.sessions.UpdateMetadata(m.SessionID(), m.Key, m.Value)


@ -132,8 +132,15 @@ func (conn *Conn) InsertWebClickEvent(sess *sessions.Session, e *messages.MouseC
}
var host, path string
host, path, _, _ = url.GetURLParts(e.Url)
if e.NormalizedX <= 100 && e.NormalizedY <= 100 {
if err := conn.bulks.Get("webClickXYEvents").Append(sess.SessionID, truncSqIdx(e.MsgID()), e.Timestamp, e.Label, e.Selector, host+path, path, e.HesitationTime, e.NormalizedX, e.NormalizedY); err != nil {
if e.NormalizedX != 101 && e.NormalizedY != 101 {
// To support previous versions of tracker
if e.NormalizedX <= 100 && e.NormalizedY <= 100 {
e.NormalizedX *= 100
e.NormalizedY *= 100
}
normalizedX := float32(e.NormalizedX) / 100.0
normalizedY := float32(e.NormalizedY) / 100.0
if err := conn.bulks.Get("webClickXYEvents").Append(sess.SessionID, truncSqIdx(e.MsgID()), e.Timestamp, e.Label, e.Selector, host+path, path, e.HesitationTime, normalizedX, normalizedY); err != nil {
sessCtx := context.WithValue(context.Background(), "sessionID", sess.SessionID)
conn.log.Error(sessCtx, "insert web click event in bulk err: %s", err)
}
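The Go change above replaces integer click coordinates with floats while staying compatible with older trackers. A Python sketch of the same conversion (reading from the `!= 101` guard, 101 appears to be the tracker's sentinel for an unavailable coordinate):

```python
def normalize_click(x: int, y: int):
    """Convert raw tracker click coordinates to float percentages.

    Newer trackers send percent * 100 (0..10000); older trackers send
    whole percents (0..100) and are scaled up first; a value of 101
    marks "no coordinate".
    """
    if x == 101 or y == 101:
        return None  # sentinel: coordinate unavailable
    if x <= 100 and y <= 100:
        # Legacy tracker payload: scale to the new percent*100 encoding.
        x *= 100
        y *= 100
    return x / 100.0, y / 100.0
```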


@ -13,14 +13,14 @@ func (conn *Conn) InsertMobileEvent(session *sessions.Session, e *messages.Mobil
if err := conn.InsertCustomEvent(session.SessionID, e.Timestamp, truncSqIdx(e.Index), e.Name, e.Payload); err != nil {
return err
}
conn.InsertAutocompleteValue(session.SessionID, session.ProjectID, "CUSTOM_Mobile", e.Name)
conn.InsertAutocompleteValue(session.SessionID, session.ProjectID, "CUSTOM_MOBILE", e.Name)
return nil
}
func (conn *Conn) InsertMobileNetworkCall(sess *sessions.Session, e *messages.MobileNetworkCall) error {
err := conn.InsertRequest(sess.SessionID, e.Timestamp, truncSqIdx(e.Index), e.URL, e.Duration, e.Status < 400)
if err == nil {
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "REQUEST_Mobile", url.DiscardURLQuery(e.URL))
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "REQUEST_MOBILE", url.DiscardURLQuery(e.URL))
}
return err
}
@ -36,7 +36,7 @@ func (conn *Conn) InsertMobileClickEvent(sess *sessions.Session, clickEvent *mes
); err != nil {
return err
}
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "CLICK_Mobile", clickEvent.Label)
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "CLICK_MOBILE", clickEvent.Label)
return nil
}
@ -51,7 +51,7 @@ func (conn *Conn) InsertMobileSwipeEvent(sess *sessions.Session, swipeEvent *mes
); err != nil {
return err
}
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "SWIPE_Mobile", swipeEvent.Label)
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "SWIPE_MOBILE", swipeEvent.Label)
return nil
}
@ -66,7 +66,7 @@ func (conn *Conn) InsertMobileInputEvent(sess *sessions.Session, inputEvent *mes
); err != nil {
return err
}
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "INPUT_Mobile", inputEvent.Label)
conn.InsertAutocompleteValue(sess.SessionID, sess.ProjectID, "INPUT_MOBILE", inputEvent.Label)
return nil
}


@ -29,6 +29,11 @@ def _get_current_auth_context(request: Request, jwt_payload: dict) -> schemas.Cu
return request.state.currentContext
def _allow_access_to_endpoint(request: Request, current_context: schemas.CurrentContext) -> bool:
return not current_context.service_account \
or request.url.path not in ["/logout", "/api/logout", "/refresh", "/api/refresh"]
class JWTAuth(HTTPBearer):
def __init__(self, auto_error: bool = True):
super(JWTAuth, self).__init__(auto_error=auto_error)
@ -68,7 +73,10 @@ class JWTAuth(HTTPBearer):
or old_jwt_payload.get("userId") != jwt_payload.get("userId"):
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Invalid token or expired token.")
return _get_current_auth_context(request=request, jwt_payload=jwt_payload)
ctx = _get_current_auth_context(request=request, jwt_payload=jwt_payload)
if not _allow_access_to_endpoint(request=request, current_context=ctx):
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Unauthorized endpoint.")
return ctx
else:
credentials: HTTPAuthorizationCredentials = await super(JWTAuth, self).__call__(request)
@ -95,7 +103,10 @@ class JWTAuth(HTTPBearer):
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Invalid token or expired token.")
return _get_current_auth_context(request=request, jwt_payload=jwt_payload)
ctx = _get_current_auth_context(request=request, jwt_payload=jwt_payload)
if not _allow_access_to_endpoint(request=request, current_context=ctx):
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Unauthorized endpoint.")
return ctx
logger.warning("Invalid authorization code.")
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Invalid authorization code.")
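The auth change above (#2410) stops service accounts from calling logout/refresh. The predicate boils down to the following self-contained sketch, with a simplified stand-in for the request context:

```python
from dataclasses import dataclass

# Endpoints a service account (SA) must not reach, per the diff.
BLOCKED_FOR_SERVICE_ACCOUNTS = ("/logout", "/api/logout", "/refresh", "/api/refresh")

@dataclass
class Context:
    service_account: bool = False

def allow_access_to_endpoint(path: str, ctx: Context) -> bool:
    # Regular users pass unconditionally; service accounts are denied
    # only on the logout/refresh endpoints.
    return not ctx.service_account or path not in BLOCKED_FOR_SERVICE_ACCOUNTS
```

In the diff this check runs right after the auth context is built, raising a 403 when it returns false.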


@ -339,10 +339,13 @@ def create_card(project_id, user_id, data: schemas.CardSchema, dashboard=False):
session_data = None
if data.metric_type == schemas.MetricType.heat_map:
if data.session_id is not None:
session_data = json.dumps({"sessionId": data.session_id})
session_data = {"sessionId": data.session_id}
else:
session_data = __get_heat_map_chart(project_id=project_id, user_id=user_id,
data=data, include_mobs=False)
if session_data is not None:
session_data = {"sessionId": session_data["sessionId"]}
if session_data is not None:
# for EE only
keys = sessions_mobs. \
@ -356,8 +359,8 @@ def create_card(project_id, user_id, data: schemas.CardSchema, dashboard=False):
except Exception as e:
logger.warning(f"!!!Error while tagging: {k} to {tag} for heatMap")
logger.error(str(e))
session_data = json.dumps(session_data)
_data = {"session_data": session_data}
_data = {"session_data": json.dumps(session_data) if session_data is not None else None}
for i, s in enumerate(data.series):
for k in s.model_dump().keys():
_data[f"{k}_{i}"] = s.__getattribute__(k)


@ -57,16 +57,16 @@ def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
# f.value, value_key=f_k))
if data.click_rage and not has_click_rage_filter:
constraints.append("""(issues.session_id IS NULL
OR (issues.datetime >= toDateTime(%(startDate)s/1000)
AND issues.datetime <= toDateTime(%(endDate)s/1000)
AND issues.project_id = toUInt16(%(project_id)s)
AND issues.event_type = 'ISSUE'
AND issues.project_id = toUInt16(%(project_id)s
AND mis.project_id = toUInt16(%(project_id)s
AND mis.type='click_rage'))))""")
query_from += """ LEFT JOIN experimental.events AS issues ON (main_events.session_id=issues.session_id)
LEFT JOIN experimental.issues AS mis ON (issues.issue_id=mis.issue_id)"""
constraints.append("""(issues_t.session_id IS NULL
OR (issues_t.datetime >= toDateTime(%(startDate)s/1000)
AND issues_t.datetime <= toDateTime(%(endDate)s/1000)
AND issues_t.project_id = toUInt16(%(project_id)s)
AND issues_t.event_type = 'ISSUE'
AND issues_t.project_id = toUInt16(%(project_id)s)
AND mis.project_id = toUInt16(%(project_id)s)
AND mis.type='click_rage'))""")
query_from += """ LEFT JOIN experimental.events AS issues_t ON (main_events.session_id=issues_t.session_id)
LEFT JOIN experimental.issues AS mis ON (issues_t.issue_id=mis.issue_id)"""
with ch_client.ClickHouseClient() as cur:
query = cur.format(f"""SELECT main_events.normalized_x AS normalized_x,
main_events.normalized_y AS normalized_y


@ -442,7 +442,8 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
if metric_format == schemas.MetricExtendedFormatType.session_count:
main_query = f"""SELECT COUNT(DISTINCT {main_col}) OVER () AS main_count,
{main_col} AS name,
count(DISTINCT session_id) AS session_count
count(DISTINCT session_id) AS session_count,
COALESCE(SUM(count(DISTINCT session_id)) OVER (), 0) AS total_sessions
FROM (SELECT s.session_id AS session_id,
{extra_col}
{query_part}) AS filtred_sessions
@ -470,11 +471,14 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
logging.debug("--------------------")
sessions = cur.execute(main_query)
count = 0
total_sessions = 0
if len(sessions) > 0:
count = sessions[0]["main_count"]
total_sessions = sessions[0]["total_sessions"]
for s in sessions:
s.pop("main_count")
sessions = {"count": count, "values": helper.list_to_camel_case(sessions)}
s.pop("total_sessions")
sessions = {"total": count, "count": total_sessions, "values": helper.list_to_camel_case(sessions)}
return sessions
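The reshaped card-table payload above can be sketched as follows (rows are plain dicts standing in for the ClickHouse result; camel-casing of values is omitted):

```python
def build_table_response(rows):
    """Sketch of the reshaped card-table payload.

    main_count (window COUNT of distinct names) and total_sessions
    (window SUM of per-row session counts) repeat on every row; they
    are read once, then stripped from the values list.
    """
    total = rows[0]["main_count"] if rows else 0
    count = rows[0]["total_sessions"] if rows else 0
    for r in rows:
        r.pop("main_count")
        r.pop("total_sessions")
    return {"total": total, "count": count, "values": rows}
```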
@ -520,7 +524,7 @@ def search_table_of_individual_issues(data: schemas.SessionsSearchPayloadSchema,
total_sessions = 0
issues_count = 0
return {"count": issues_count, "totalSessions": total_sessions, "values": issues}
return {"total": issues_count, "count": total_sessions, "values": issues}
def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema2):
@ -563,7 +567,7 @@ def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEve
schemas.PerformanceEventType.fetch_failed: "REQUEST",
schemas.EventType.error: "CRASH",
}
if platform == "ios" and event_type in defs_mobile:
if platform != "web" and event_type in defs_mobile:
return defs_mobile.get(event_type)
if event_type not in defs:
raise Exception(f"unsupported EventType:{event_type}")
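The `platform != "web"` change above widens the mobile event mapping from iOS-only to every non-web platform. A minimal sketch of the lookup:

```python
def resolve_event_type(event_type, platform, defs, defs_mobile):
    # Previously this checked platform == "ios", which silently skipped
    # Android; now any non-web platform resolves through the mobile map.
    if platform != "web" and event_type in defs_mobile:
        return defs_mobile[event_type]
    if event_type not in defs:
        raise Exception(f"unsupported EventType:{event_type}")
    return defs[event_type]
```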
@ -964,7 +968,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
value_key=f"custom{i}"))
full_args = {**full_args, **_multiple_values(event.source, value_key=f"custom{i}")}
else:
_column = events.EventType.INPUT_IOS.column
_column = events.EventType.INPUT_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
@ -997,7 +1001,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event.value, value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
else:
_column = events.EventType.VIEW_IOS.column
_column = events.EventType.VIEW_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
@ -1089,6 +1093,114 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = " AND ".join(events_conditions[-1]["condition"])
# ----- Mobile
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
_column = events.EventType.CLICK_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
_column = events.EventType.INPUT_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
_column = events.EventType.VIEW_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
event.value, value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
_column = events.EventType.CUSTOM_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
event.value, value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.REQUEST_MOBILE.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.CRASH_MOBILE.ui_type:
_column = events.EventType.CRASH_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
event.value, value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform != "web":
_column = events.EventType.SWIPE_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
event.value, value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.PerformanceEventType.fetch_failed:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'


@ -185,8 +185,9 @@ def __filter_subquery(project_id: int, filters: Optional[schemas.SessionsSearchP
errors_only=True, favorite_only=None,
issue=None, user_id=None)
params = {**params, **qp_params}
# TODO: test if this line impacts other cards beside insights
# sub_query = f"INNER JOIN {sub_query} USING(session_id)"
# This line was added because insights is failing when you add filter steps,
# for example when you add a LOCATION filter
sub_query = f"INNER JOIN {sub_query} USING(session_id)"
return params, sub_query
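The re-enabled join above restricts insights to sessions matching the filter steps. As a sketch (the wrapper function and subquery string are hypothetical stand-ins for how the fragment is consumed):

```python
from typing import Optional

def attach_filter_subquery(base_from: str, sub_query: Optional[str]) -> str:
    # Fix for #2416: when filter steps (e.g. a LOCATION filter) produce
    # a session subquery, insights must INNER JOIN on session_id so that
    # only matching sessions are counted.
    if sub_query is None:
        return base_from
    return f"{base_from} INNER JOIN {sub_query} USING(session_id)"
```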


@ -397,12 +397,19 @@ func (c *connectorImpl) InsertWebClickEvent(session *sessions.Session, msg *mess
if msg.Label == "" {
return nil
}
var nX *uint8 = nil
var nY *uint8 = nil
if msg.NormalizedX <= 100 && msg.NormalizedY <= 100 {
nXVal := uint8(msg.NormalizedX)
var nX *float32 = nil
var nY *float32 = nil
if msg.NormalizedX != 101 && msg.NormalizedY != 101 {
// To support previous versions of tracker
if msg.NormalizedX <= 100 && msg.NormalizedY <= 100 {
msg.NormalizedX *= 100
msg.NormalizedY *= 100
}
normalizedX := float32(msg.NormalizedX) / 100.0
normalizedY := float32(msg.NormalizedY) / 100.0
nXVal := normalizedX
nX = &nXVal
nYVal := uint8(msg.NormalizedY)
nYVal := normalizedY
nY = &nYVal
}
if err := c.batches["clicks"].Append(


@ -3,8 +3,8 @@ CREATE OR REPLACE FUNCTION openreplay_version AS() -> 'v1.19.0-ee';
DROP TABLE IF EXISTS experimental.events_l7d_mv;
ALTER TABLE experimental.events
ADD COLUMN IF NOT EXISTS normalized_x Nullable(UInt8),
ADD COLUMN IF NOT EXISTS normalized_y Nullable(UInt8),
ADD COLUMN IF NOT EXISTS normalized_x Nullable(Float32),
ADD COLUMN IF NOT EXISTS normalized_y Nullable(Float32),
DROP COLUMN IF EXISTS coordinate;
CREATE MATERIALIZED VIEW IF NOT EXISTS experimental.events_l7d_mv


@ -81,8 +81,8 @@ CREATE TABLE IF NOT EXISTS experimental.events
error_tags_values Array(Nullable(String)),
transfer_size Nullable(UInt32),
selector Nullable(String),
normalized_x Nullable(UInt8),
normalized_y Nullable(UInt8),
normalized_x Nullable(Float32),
normalized_y Nullable(Float32),
message_id UInt64 DEFAULT 0,
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)


@ -19,8 +19,8 @@ $fn_def$, :'next_version')
--
ALTER TABLE IF EXISTS events.clicks
ADD COLUMN IF NOT EXISTS normalized_x smallint NULL,
ADD COLUMN IF NOT EXISTS normalized_y smallint NULL,
ADD COLUMN IF NOT EXISTS normalized_x decimal NULL,
ADD COLUMN IF NOT EXISTS normalized_y decimal NULL,
DROP COLUMN IF EXISTS x,
DROP COLUMN IF EXISTS y;


@ -659,16 +659,16 @@ CREATE INDEX pages_query_nn_gin_idx ON events.pages USING GIN (query gin_trgm_op
CREATE TABLE events.clicks
(
session_id bigint NOT NULL REFERENCES public.sessions (session_id) ON DELETE CASCADE,
message_id bigint NOT NULL,
timestamp bigint NOT NULL,
label text DEFAULT NULL,
url text DEFAULT '' NOT NULL,
session_id bigint NOT NULL REFERENCES public.sessions (session_id) ON DELETE CASCADE,
message_id bigint NOT NULL,
timestamp bigint NOT NULL,
label text DEFAULT NULL,
url text DEFAULT '' NOT NULL,
path text,
selector text DEFAULT '' NOT NULL,
hesitation integer DEFAULT NULL,
normalized_x smallint DEFAULT NULL,
normalized_y smallint DEFAULT NULL,
selector text DEFAULT '' NOT NULL,
hesitation integer DEFAULT NULL,
normalized_x decimal DEFAULT NULL,
normalized_y decimal DEFAULT NULL,
PRIMARY KEY (session_id, message_id)
);
CREATE INDEX clicks_session_id_idx ON events.clicks (session_id);


@ -62,9 +62,9 @@ function ClickMapCard({
if (mapUrl) return evt.path.includes(mapUrl)
return evt
}) || { timestamp: metricStore.instance.data.startTs }
const jumpTimestamp = (jumpToEvent.timestamp - metricStore.instance.data.startTs) + jumpToEvent.domBuildingTime + 99 // 99ms safety margin to give some time for the DOM to load
const ts = jumpToEvent.timestamp ?? metricStore.instance.data.startTs
const domTime = jumpToEvent.domBuildingTime ?? 0
const jumpTimestamp = (ts - metricStore.instance.data.startTs) + domTime + 99 // 99ms safety margin to give some time for the DOM to load
return (
<div id="clickmap-render">
<ClickMapRenderer
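The ClickMapCard fix above guards against matched events that lack `timestamp` or `domBuildingTime`. A Python rendering of the same defaulting logic (the event is a hypothetical dict stand-in for the store object):

```python
def jump_timestamp(event: dict, start_ts: int, safety_ms: int = 99) -> int:
    # Fall back to the session start when the matched event has no
    # timestamp, and to 0 when domBuildingTime is missing, so the sum
    # can no longer be NaN. 99ms safety margin for the DOM to load.
    ts = event.get("timestamp", start_ts)
    dom = event.get("domBuildingTime") or 0
    return (ts - start_ts) + dom + safety_ms
```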


@ -51,8 +51,8 @@ function WebPlayer(props: any) {
const isPlayerReady = contextValue.store?.get().ready
React.useEffect(() => {
contextValue.player && contextValue.player.play()
if (isPlayerReady && insights.size > 0) {
contextValue.player && contextValue.player.play()
if (isPlayerReady && insights.size > 0 && jumpTimestamp) {
setTimeout(() => {
contextValue.player.pause()
contextValue.player.jump(jumpTimestamp)

@@ -9,7 +9,7 @@
height: 100%;
/* border: solid thin $gray-light; */
/* border-radius: 3px; */
overflow: hidden;
overflow-y: scroll;
}
.checkers {

@@ -302,8 +302,8 @@ export default class Widget {
} else if (this.metricType === FUNNEL) {
_data.funnel = new Funnel().fromJSON(_data);
} else if (this.metricType === TABLE) {
const totalSessions = data[0]['totalSessions'];
_data[0]['values'] = data[0]['values'].map((s: any) => new SessionsByRow().fromJson(s, totalSessions, this.metricOf));
const count = data[0]['count'];
_data[0]['values'] = data[0]['values'].map((s: any) => new SessionsByRow().fromJson(s, count, this.metricOf));
} else {
if (data.hasOwnProperty('chart')) {
_data['value'] = data.value;

@@ -78,10 +78,10 @@ export interface State extends ScreenState, ListsState {
}
const userEvents = [
MType.IosSwipeEvent,
MType.IosClickEvent,
MType.IosInputEvent,
MType.IosScreenChanges,
MType.MobileSwipeEvent,
MType.MobileClickEvent,
MType.MobileInputEvent,
MType.MobileScreenChanges,
];
export default class IOSMessageManager implements IMessageManager {
@@ -233,7 +233,7 @@ export default class IOSMessageManager implements IMessageManager {
}
switch (msg.tp) {
case MType.IosPerformanceEvent:
case MType.MobilePerformanceEvent:
const performanceStats = ['background', 'memoryUsage', 'mainThreadCPU'];
if (performanceStats.includes(msg.name)) {
this.performanceManager.append(msg);
@@ -253,21 +253,21 @@ export default class IOSMessageManager implements IMessageManager {
// case MType.IosInputEvent:
// console.log('input', msg)
// break;
case MType.IosNetworkCall:
case MType.MobileNetworkCall:
this.lists.lists.fetch.insert(getResourceFromNetworkRequest(msg, this.sessionStart));
break;
case MType.WsChannel:
this.lists.lists.websocket.insert(msg);
break;
case MType.IosEvent:
case MType.MobileEvent:
// @ts-ignore
this.lists.lists.event.insert({ ...msg, source: 'openreplay' });
break;
case MType.IosSwipeEvent:
case MType.IosClickEvent:
case MType.MobileSwipeEvent:
case MType.MobileClickEvent:
this.touchManager.append(msg);
break;
case MType.IosLog:
case MType.MobileLog:
const log = { ...msg, level: msg.severity };
// @ts-ignore
this.lists.lists.log.append(Log(log));

@@ -31,7 +31,7 @@ export default class TouchManager extends ListWalker<IosClickEvent | IosSwipeEve
public move(t: number) {
const lastTouch = this.moveGetLast(t)
if (!!lastTouch) {
if (lastTouch.tp === MType.IosSwipeEvent) {
if (lastTouch.tp === MType.MobileSwipeEvent) {
return
// not using swipe rn
// this.touchTrail?.createSwipeTrail({

@@ -233,10 +233,10 @@ export default class Screen {
break;
case ScaleMode.AdjustParentHeight:
// we want to scale the document with true height so the clickmap will be scrollable
const usedHeight =
this.document?.body.scrollHeight && this.document?.body.scrollHeight > height
? this.document.body.scrollHeight + 'px'
: height + 'px';
const usedHeight = height + 'px';
// this.document?.body.scrollHeight && this.document?.body.scrollHeight > height
// ? this.document.body.scrollHeight + 'px'
// : height + 'px';
this.scaleRatio = offsetWidth / width;
translate = 'translate(-50%, 0)';
posStyles = { top: 0, height: usedHeight };

@@ -146,39 +146,37 @@ export default class TargetMarker {
if (clicks && this.screen.document) {
this.clickMapOverlay?.remove();
const overlay = document.createElement('canvas');
const iframeSize = this.screen.iframeStylesRef;
const scrollHeight = this.screen.document?.documentElement.scrollHeight || 0;
const scrollWidth = this.screen.document?.documentElement.scrollWidth || 0;
const scaleRatio = this.screen.getScale();
Object.assign(
overlay.style,
clickmapStyles.overlayStyle({
height: iframeSize.height,
width: iframeSize.width,
scale: scaleRatio,
height: scrollHeight + 'px',
width: scrollWidth + 'px',
})
);
this.clickMapOverlay = overlay;
this.screen.getParentElement()?.appendChild(overlay);
this.screen.document.body.appendChild(overlay);
const pointMap: Record<string, { times: number; data: number[], original: any }> = {};
const ovWidth = parseInt(iframeSize.width);
const ovHeight = parseInt(iframeSize.height);
overlay.width = ovWidth;
overlay.height = ovHeight;
overlay.width = scrollWidth;
overlay.height = scrollHeight;
let maxIntensity = 0;
clicks.forEach((point) => {
const key = `${point.normalizedY}-${point.normalizedX}`;
const y = roundToSecond(point.normalizedY);
const x = roundToSecond(point.normalizedX);
const key = `${y}-${x}`;
if (pointMap[key]) {
const times = pointMap[key].times + 1;
maxIntensity = Math.max(maxIntensity, times);
pointMap[key].times = times;
} else {
const clickData = [
(point.normalizedX / 100) * scrollWidth,
(point.normalizedY / 100) * scrollHeight,
(x / 100) * scrollWidth,
(y / 100) * scrollHeight,
];
pointMap[key] = { times: 1, data: clickData, original: point };
}
@@ -204,3 +202,7 @@ export default class TargetMarker {
}
}
}
function roundToSecond(num: number) {
return Math.round(num * 100) / 100;
}
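As context for the hunk above: clicks are bucketed by their normalized (percentage) coordinates, rounded to two decimals via `roundToSecond`, and each bucket's coordinates are converted to pixel positions on the scrollable document. A standalone sketch of that bucketing (names mirror the diff; `scrollWidth`/`scrollHeight` are assumed document dimensions, and the intensity bookkeeping is simplified):

```typescript
type ClickPoint = { normalizedX: number; normalizedY: number };

function roundToSecond(num: number): number {
  // Round to two decimal places, e.g. 33.333 -> 33.33.
  return Math.round(num * 100) / 100;
}

// Group clicks that land on (nearly) the same spot and track the hottest bucket.
function bucketClicks(
  clicks: ClickPoint[],
  scrollWidth: number,
  scrollHeight: number,
) {
  const pointMap: Record<string, { times: number; data: [number, number] }> = {};
  let maxIntensity = 0;
  for (const point of clicks) {
    const y = roundToSecond(point.normalizedY);
    const x = roundToSecond(point.normalizedX);
    const key = `${y}-${x}`;
    if (pointMap[key]) {
      pointMap[key].times += 1;
    } else {
      // Normalized coordinates are percentages of the full document size.
      pointMap[key] = {
        times: 1,
        data: [(x / 100) * scrollWidth, (y / 100) * scrollHeight],
      };
    }
    maxIntensity = Math.max(maxIntensity, pointMap[key].times);
  }
  return { pointMap, maxIntensity };
}
```

Rounding before keying means two clicks at 33.333% and 33.334% fall into the same heat bucket instead of producing two single-click points.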

@@ -1,14 +1,12 @@
export const clickmapStyles = {
overlayStyle: ({ height, width, scale }: { height: string, width: string, scale: number }) => ({
transform: `scale(${scale}) translate(-50%, 0)`,
overlayStyle: ({ height, width }: { height: string, width: string }) => ({
position: 'absolute',
top: '0px',
left: '50%',
left: 0,
width,
height,
background: 'rgba(0,0,0, 0.15)',
zIndex: 9 * 10e3,
transformOrigin: 'left top',
}),
totalClicks: {
fontSize: '16px',

@@ -1,7 +1,6 @@
apiVersion: v2
name: chalice
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
@@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.7
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
AppVersion: "v1.19.0"
AppVersion: "v1.19.6"

@@ -1,7 +1,6 @@
apiVersion: v2
name: db
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
@@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.1
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
AppVersion: "v1.19.0"
AppVersion: "v1.19.1"

@@ -1,7 +1,6 @@
apiVersion: v2
name: frontend
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
@@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.10
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
AppVersion: "v1.19.0"
AppVersion: "v1.19.2"

@@ -8,7 +8,11 @@ RETENTION_TIME=${RETENTION_TIME:-345600000}
topics=(
"raw"
"raw-ios"
"raw-images"
"canvas-images"
"trigger"
"canvas-trigger"
"mobile-trigger"
"cache"
"analytics"
"storage-failover"

@@ -19,8 +19,8 @@ $fn_def$, :'next_version')
--
ALTER TABLE IF EXISTS events.clicks
ADD COLUMN IF NOT EXISTS normalized_x smallint NULL,
ADD COLUMN IF NOT EXISTS normalized_y smallint NULL,
ADD COLUMN IF NOT EXISTS normalized_x decimal NULL,
ADD COLUMN IF NOT EXISTS normalized_y decimal NULL,
DROP COLUMN IF EXISTS x,
DROP COLUMN IF EXISTS y;

@@ -620,16 +620,16 @@ CREATE INDEX pages_query_nn_gin_idx ON events.pages USING GIN (query gin_trgm_op
CREATE TABLE events.clicks
(
session_id bigint NOT NULL REFERENCES public.sessions (session_id) ON DELETE CASCADE,
message_id bigint NOT NULL,
timestamp bigint NOT NULL,
label text DEFAULT NULL,
url text DEFAULT '' NOT NULL,
session_id bigint NOT NULL REFERENCES public.sessions (session_id) ON DELETE CASCADE,
message_id bigint NOT NULL,
timestamp bigint NOT NULL,
label text DEFAULT NULL,
url text DEFAULT '' NOT NULL,
path text,
selector text DEFAULT '' NOT NULL,
hesitation integer DEFAULT NULL,
normalized_x smallint DEFAULT NULL,
normalized_y smallint DEFAULT NULL,
selector text DEFAULT '' NOT NULL,
hesitation integer DEFAULT NULL,
normalized_x decimal DEFAULT NULL,
normalized_y decimal DEFAULT NULL,
PRIMARY KEY (session_id, message_id)
);
CREATE INDEX clicks_session_id_idx ON events.clicks (session_id);

@@ -18,13 +18,13 @@ returns `result` without changes.
```js
import Tracker from '@openreplay/tracker';
import trackerGraphQL from '@openreplay/tracker-graphql';
import { createGraphqlMiddleware } from '@openreplay/tracker-graphql';
const tracker = new Tracker({
projectKey: YOUR_PROJECT_KEY,
});
export const recordGraphQL = tracker.plugin(trackerGraphQL());
export const recordGraphQL = tracker.use(createGraphqlMiddleware());
```
### Relay
@@ -33,15 +33,28 @@ If you're using [Relay network tools](https://github.com/relay-tools/react-relay
you can simply [create a middleware](https://github.com/relay-tools/react-relay-network-modern/tree/master?tab=readme-ov-file#example-of-injecting-networklayer-with-middlewares-on-the-client-side)
```js
import { createRelayMiddleware } from '@openreplay/tracker-graphql'
import { createRelayMiddleware } from '@openreplay/tracker-graphql';
const trackerMiddleware = createRelayMiddleware(tracker)
const trackerMiddleware = tracker.use(createRelayMiddleware());
const network = new RelayNetworkLayer([
// your middleware
// ,
trackerMiddleware
])
trackerMiddleware,
]);
```
You can pass a Sanitizer function to `createRelayMiddleware` to sanitize the variables and data before sending them to OpenReplay.
```js
const trackerLink = tracker.use(
createRelayMiddleware((variables) => {
return {
...variables,
password: '***',
};
}),
);
```
Or you can invoke the `recordGraphQL` handler manually
@@ -52,22 +65,22 @@ then you should do something like below
import { createGraphqlMiddleware } from '@openreplay/tracker-graphql'; // see above for recordGraphQL definition
import { Environment } from 'relay-runtime';
const handler = createGraphqlMiddleware(tracker)
const handler = tracker.use(createGraphqlMiddleware());
function fetchQuery(operation, variables, cacheConfig, uploadables) {
return fetch('www.myapi.com/resource', {
// ...
})
.then(response => response.json())
.then(result =>
handler(
// op kind, name, variables, response, duration (default 0)
operation.operationKind,
operation.name,
variables,
result,
duration,
),
.then((response) => response.json())
.then((result) =>
handler(
// op kind, name, variables, response, duration (default 0)
operation.operationKind,
operation.name,
variables,
result,
duration,
),
);
}
@@ -81,10 +94,23 @@ See [Relay Network Layer](https://relay.dev/docs/en/network-layer) for details.
For [Apollo](https://www.apollographql.com/) you should create a new `ApolloLink`
```js
import { createTrackerLink } from '@openreplay/tracker-graphql'
import { createTrackerLink } from '@openreplay/tracker-graphql';
const trackerLink = createTrackerLink(tracker);
const yourLink = new ApolloLink(trackerLink)
const trackerLink = tracker.use(createTrackerLink());
const yourLink = new ApolloLink(trackerLink);
```
You can pass a Sanitizer function to `createTrackerLink` to sanitize the variables and data before sending them to OpenReplay.
```js
const trackerLink = tracker.use(
createTrackerLink((variables) => {
return {
...variables,
password: '***',
};
}),
);
```
Alternatively, you can use the generic GraphQL handler:
@@ -93,18 +119,21 @@ Alternatively you can use generic graphql handler:
import { createGraphqlMiddleware } from '@openreplay/tracker-graphql'; // see above for recordGraphQL definition
import { ApolloLink } from 'apollo-link';
const handler = createGraphqlMiddleware(tracker)
const handler = tracker.use(createGraphqlMiddleware());
const trackerApolloLink = new ApolloLink((operation, forward) => {
return forward(operation).map(result =>
handler(
operation.setContext({ start: performance.now() });
return forward(operation).map((result) => {
const time = performance.now() - operation.getContext().start;
return handler(
// op kind, name, variables, response, duration (default 0)
operation.query.definitions[0].operation,
operation.operationName,
operation.variables,
result,
),
);
time,
);
});
});
const link = ApolloLink.from([
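The updated Apollo link above times each operation by stashing a start timestamp in the operation's context before forwarding it, then reading it back when the result arrives. The pattern in isolation (a sketch; `Ctx` is a stand-in for Apollo's context bag, and the synchronous `run` stands in for `forward(operation)`):

```typescript
// Stash a start time on the context when the operation is forwarded,
// then derive the duration when the result comes back.
type Ctx = { start?: number };

function timeOperation<T>(run: () => T, ctx: Ctx = {}): { result: T; duration: number } {
  ctx.start = performance.now();                          // operation.setContext({ start })
  const result = run();                                   // forward(operation)
  const duration = performance.now() - (ctx.start ?? 0);  // operation.getContext().start
  return { result, duration };
}
```

Storing the start time on the context (rather than in a closure shared across operations) keeps concurrent operations from clobbering each other's timers.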

@@ -1,5 +1,6 @@
import { App, Messages } from '@openreplay/tracker';
import Observable from 'zen-observable';
import { Sanitizer } from './types';
type Operation = {
query: Record<string, any>;
@@ -9,48 +10,63 @@ };
};
type NextLink = (operation: Operation) => Observable<Record<string, any>>;
export const createTrackerLink = (app: App | null) => {
if (!app) {
return (operation: Operation, forward: NextLink) => forward(operation);
}
return (operation: Operation, forward: NextLink) => {
return new Observable((observer) => {
const start = app.timestamp();
const observable = forward(operation);
const subscription = observable.subscribe({
next(value) {
const end = app.timestamp();
app.send(
Messages.GraphQL(
operation.query.definitions[0].kind,
operation.operationName,
JSON.stringify(operation.variables),
JSON.stringify(value.data),
end - start,
),
);
observer.next(value);
},
error(error) {
const end = app.timestamp();
app.send(
Messages.GraphQL(
operation.query.definitions[0].kind,
operation.operationName,
JSON.stringify(operation.variables),
JSON.stringify(error),
end - start,
),
);
observer.error(error);
},
complete() {
observer.complete();
},
});
export const createTrackerLink = (
sanitizer?: Sanitizer<Record<string, any> | undefined | null>,
) => {
return (app: App | null) => {
if (!app) {
return (operation: Operation, forward: NextLink) => forward(operation);
}
return (operation: Operation, forward: NextLink) => {
return new Observable((observer) => {
const start = app.timestamp();
const observable = forward(operation);
const subscription = observable.subscribe({
next(value) {
const end = app.timestamp();
const operationDefinition = operation.query.definitions[0];
app.send(
Messages.GraphQL(
operationDefinition.kind === 'OperationDefinition'
? operationDefinition.operation
: 'unknown?',
operation.operationName,
JSON.stringify(
sanitizer
? sanitizer(operation.variables)
: operation.variables,
),
JSON.stringify(sanitizer ? sanitizer(value.data) : value.data),
end - start,
),
);
observer.next(value);
},
error(error) {
const end = app.timestamp();
app.send(
Messages.GraphQL(
operation.query.definitions[0].kind,
operation.operationName,
JSON.stringify(
sanitizer
? sanitizer(operation.variables)
: operation.variables,
),
JSON.stringify(error),
end - start,
),
);
observer.error(error);
},
complete() {
observer.complete();
},
});
return () => subscription.unsubscribe();
});
return () => subscription.unsubscribe();
});
};
};
};

@@ -1,4 +1,4 @@
import { App, Messages } from "@openreplay/tracker";
import { App, Messages } from '@openreplay/tracker';
function createGraphqlMiddleware() {
return (app: App | null) => {
@@ -10,7 +10,7 @@ function createGraphqlMiddleware() {
operationName: string,
variables: any,
result: any,
duration = 0
duration = 0,
) => {
try {
app.send(
@@ -30,4 +30,4 @@ function createGraphqlMiddleware() {
};
}
export default createGraphqlMiddleware
export default createGraphqlMiddleware;

@@ -1,9 +1,11 @@
import createTrackerLink from './apolloMiddleware.js';
import createRelayMiddleware from './relayMiddleware.js';
import createGraphqlMiddleware from './graphqlMiddleware.js';
import { Sanitizer } from './types.js';
export {
createTrackerLink,
createRelayMiddleware,
createGraphqlMiddleware,
}
Sanitizer,
};

@@ -1,37 +1,55 @@
import { App, Messages } from '@openreplay/tracker';
import type { Middleware, RelayRequest } from './relaytypes';
import { Sanitizer } from './types';
const createRelayMiddleware = (app: App | null): Middleware => {
if (!app) {
return (next) => async (req) => await next(req);
}
return (next) => async (req) => {
const start = app.timestamp();
const resp = await next(req)
const end = app.timestamp();
if ('requests' in req) {
req.requests.forEach((request) => {
app.send(getMessage(request, resp.json as Record<string, any>, end - start))
})
} else {
app.send(getMessage(req, resp.json as Record<string, any>, end - start))
const createRelayMiddleware = (sanitizer?: Sanitizer<Record<string, any>>) => {
return (app: App | null): Middleware => {
if (!app) {
return (next) => async (req) => await next(req);
}
return resp;
}
return (next) => async (req) => {
const start = app.timestamp();
const resp = await next(req);
const end = app.timestamp();
if ('requests' in req) {
req.requests.forEach((request) => {
app.send(
getMessage(
request,
resp.json as Record<string, any>,
end - start,
sanitizer,
),
);
});
} else {
app.send(
getMessage(
req,
resp.json as Record<string, any>,
end - start,
sanitizer,
),
);
}
return resp;
};
};
};
function getMessage(request: RelayRequest, json: Record<string, any>, duration: number) {
function getMessage(
request: RelayRequest,
json: Record<string, any>,
duration: number,
sanitizer?: Sanitizer<Record<string, any>>,
) {
const opKind = request.operation.kind;
const opName = request.operation.name;
const vars = JSON.stringify(request.variables)
const opResp = JSON.stringify(json)
return Messages.GraphQL(
opKind,
opName,
vars,
opResp,
duration
)
const vars = JSON.stringify(
sanitizer ? sanitizer(request.variables) : request.variables,
);
const opResp = JSON.stringify(sanitizer ? sanitizer(json) : json);
return Messages.GraphQL(opKind, opName, vars, opResp, duration);
}
export default createRelayMiddleware
export default createRelayMiddleware;

@@ -1,4 +1,3 @@
type ConcreteBatch = {
kind: 'Batch';
fragment: any;
@@ -9,7 +8,7 @@ type ConcreteBatch = {
text: string | null;
operationKind: string;
};
type Variables = { [name: string]: any };
export type Variables = { [name: string]: any };
interface FetchOpts {
url?: string;
method: 'POST' | 'GET';
@@ -17,7 +16,13 @@ interface FetchOpts {
body: string | FormData;
credentials?: 'same-origin' | 'include' | 'omit';
mode?: 'cors' | 'websocket' | 'navigate' | 'no-cors' | 'same-origin';
cache?: 'default' | 'no-store' | 'reload' | 'no-cache' | 'force-cache' | 'only-if-cached';
cache?:
| 'default'
| 'no-store'
| 'reload'
| 'no-cache'
| 'force-cache'
| 'only-if-cached';
redirect?: 'follow' | 'error' | 'manual';
signal?: AbortSignal;
[name: string]: any;

@@ -0,0 +1 @@
export type Sanitizer<T> = (values: T) => Partial<T>;
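The new `Sanitizer<T>` type is just a function from the raw values to a redacted copy. A minimal illustration of a conforming sanitizer (the `password` field name is hypothetical, chosen to match the README examples):

```typescript
// Sanitizer<T> receives the raw values and returns a redacted copy.
type Sanitizer<T> = (values: T) => Partial<T>;

// Hypothetical sanitizer: mask a password field before it is recorded.
const maskSecrets: Sanitizer<Record<string, any>> = (values) => ({
  ...values,
  password: values.password ? '***' : values.password,
});

// maskSecrets({ user: 'alice', password: 'hunter2' })
// -> { user: 'alice', password: '***' }
```

Returning `Partial<T>` lets a sanitizer drop fields entirely rather than only masking them.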

@@ -963,8 +963,8 @@ export default class App {
deviceMemory,
jsHeapSizeLimit,
timezone: getTimezone(),
width: window.innerWidth,
height: window.innerHeight,
width: window.screen.width,
height: window.screen.height,
}),
})
const {
@@ -1220,7 +1220,9 @@ export default class App {
timezone: getTimezone(),
condition: conditionName,
assistOnly: startOpts.assistOnly ?? this.socketMode,
}),
width: window.screen.width,
height: window.screen.height
}),
})
if (r.status !== 200) {
const error = await r.text()