* change(ui) - redirect to the landing url on SSO login
* fix(ui): fix share popup styles
* change(ui) - non admin user preference restrictions
* fix(ui) - redirect fix
* change(ui) - show installation btn without mouse hover
* feat(api): api-v1 handle wrong projectKey
feat(api): api-v1 get live sessions
* change(ui) - show role edit on hover
* change(ui) - audit trail count with comma
* fix(ui) - audit trail date range custom picker alignment
* change(ui) - show a message when mob file not found
* feat(api): api-v1 fixed search live sessions
* feat(api): api-v1 handle wrong projectKey
* feat(api): fixed assist error response
* fix(tracker): check node scrolls only on start
* fixup! fix(tracker): check node scrolls only on start
* feat(ui/player): scroll view in click map
* feat(ui/player): rm unused check
* New configuration module (#558)
* ci(dbmigrate): Create db migrate when there is change
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix login error/button margins
* fix(ui) - checkbox click
* fix(ui) - search rename and save fixes
* change(ui) - text changes
* fix(ui) - button text nowrap
* fix(ui): fix slowestdomains widget height
* change(ui) - ignore clicks while annotating
* change(ui) - if block with braces
* change(ui) - capitalize first letter in breadcrumb
* feat(db): remove errors from permissions
feat(api): remove errors from permissions
* feat(api): changed reset password response
* fix(ui) - assist active tab list, broken after new API changes (pagination)
* fix(ui) - assist active tab list, broken after new API changes (pagination)
* change(ui) - search compare
* fix(ui): last fixes for 1.7
* fix(ui): fix timeline
* fix(ui): small code fixes
* fix(ui): remove unused
* feat(frontend/assist): show when client tab is inactive + fix reconnection status update
* fix(ui) - visibility settings
* feat(assist): refactored extractSessionInfo
feat(assist): hardcoded session's attributes
* Added snabbdom (JS)
* fix(tracker): version check works with x.x.x-beta versions
* fix(backend): keep the highest user's timestamp instead of the latest message timestamp for correct session duration value
* feat(backend/s3): added file tag RETENTION (#561)
* change(ui) - search optimization and autocomplete improvements
* feat(backend/assets): added new metrics assets_downloaded
* change(ui) - show back the date range in bookmarks since the api is filtering by daterange
* feat(backend-assets): custom headers for cacher requests
* chore(backend): no tidy in dockerfile (local build speed up)
* feat(backend/assets): added proxy support for cacher module
* feat(backend/storage): set retention env variable as not required
* fix(ui): fix jira issues
* ci(helm): use kubectl for deployment
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(tracker):3.5.13: performance improvements for a case of extensive dom
* fix(backend): added missed err var and continue statement
* ci(helm): forcing namespace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): fixed slowest_domains query
* ci(helm): update helm deployment method
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - filter dropdown colors
* fix(ui) - speed index location avg attribute changed to value
* ci(api): enable kubectl apply
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - widget y axis label
* feat(api): fixed slowest_domains query
* chore(helm): Adding namespaces to all templates (#565)
* feat(api): assist type-autocomplete
* feat(api): assist global-autocomplete
* feat(sourcemaps): include wasm file in build
* feat(sourcemaps-reader): refactored
* fix(ui): fix data for funnels
* fix(ui): fix all sessions section margin
* fix(ui) - assist loader flag
* fix(ui) - assist loader flag
* fix(ui): fix weird check
* feat(api): autocomplete accept unsupported types
* feat(ui): migrate to yarn v3
* feat(ui): minor fixes for installment
* feat(ui): add typescript plugin to yarn
* chore(helm): Ability to override image registry
* chore(helm): Overriding openreplay docker registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix control arrows on firefox
* feat(crons): EE crons
* feat(api): fixed build script
* feat(alerts): fixed build script
* feat(crons): fixed build script
* chore(helm): Updating cron version
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(crons): changes
* chore(helm): optional minio ingress
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(crons): fix build script
feat(alerts): fix build script
* Revert "chore(helm): Updating cron version"
This reverts commit 3ca190ea2f.
* feat(crons): fix build script
* feat(crons): fix Dockerfile
* feat(api): fixed metadata change-case
* change(ui) - remove capitalize for the meta value
* change(ui) - autocomplete improvements with custom textfield
* fix(tracker):3.5.13+:reuse metadata on internal-caused restarts
* fix(tracker-assist):3.5.13:send active:true on start; scroll behavior fix
* change(ui) - filters autocomplete blur on pressing Enter key
* fix(tracker): fix node v to lower
* fix(tracker): fix deps
* fix(tracker): fix deps
* fix(ui) - dashboard modal width
* change(ui) - filter dropdown overflow
* chore(helm): clickhouse reclaim policy to retain
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(tracker): fix engine max v
* fix(ui): load metadata in assist tab for sorting
* fix(ui): rm unneeded api call
* fix(tracker): build script to cjs
* change(ui) - removed sample data
* chore(tracker): remove upper node version limit
* Updating Beacon size
Beacon size should be <= QUEUE_MESSAGE_SIZE_LIMIT
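The constraint above (beacon size must not exceed the queue's per-message limit) can be sketched as follows. This is a hypothetical illustration only: the function name and the 1 MiB value are assumptions, not the project's actual configuration.

```python
# Hypothetical sketch of the beacon-size constraint described above:
# a beacon may never exceed the queue's per-message size limit, or the
# message would be rejected downstream.
QUEUE_MESSAGE_SIZE_LIMIT = 1 << 20  # assumed 1 MiB for illustration

def clamp_beacon_size(requested_bytes: int) -> int:
    """Clamp a requested beacon size so it stays <= QUEUE_MESSAGE_SIZE_LIMIT."""
    if requested_bytes <= 0:
        raise ValueError("beacon size must be positive")
    return min(requested_bytes, QUEUE_MESSAGE_SIZE_LIMIT)
```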
* feat(crons): run 24/7
feat(alerts): support env-file override
* feat(api): changed EE env handler
* fix(ui): fix sessions search modal
* change(ui) - margin for error message
* change(ui) - disable assist sort when there are no meta options to choose
* chore(helm): Adding utilities service namespace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - dashboard date range selection reload, metric not found message
* change(ui) - disable clearsearch in assist when there are no filters
* feat(api): fixed EE env handler
* chore(helm): Adding migration namespaces
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - report logo path
* chore(helm): Removing unnecessary SA
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(crons): changed crons
* feat(api): accept wrong metric_id
* feat(crons): changed env handler
feat(api): changed env handler
feat(alerts): changed env handler
* feat(utilities): support old version of nodejs
* feat(crons): changed env handler
feat(api): changed env handler
feat(alerts): changed env handler
* fix(tracker): fix srcset tracking
* chore(build): Adding frontend
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(assist): changed general helper
* feat(assist): changed general helper
* fix(ui): fix widget pagination (#570)
* feat(crons): changed entrypoint
* feat(player): dev-log on skipping message
* fix(tracker): removeNode mutation priority over attributes
* fix(tracker): capture relative img timing;use startsWith instead of substr; codestyle fix
* chore(build): fixing api build script
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* chore(ci): faster deployment
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - assist list show active status
* chore(actions): option to build all/specific services in GH
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - slowest domain metric data as per the api changes
* ci(helm): updated variable name
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(backend): cherrypick changes to ee
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): disabled pprof in http service
* fix(ui) - TimeToRender avg value as per the API change
* fix(ui) - ResponseTimeDistribution avg value as per the API change
* fix(ui) - MemoryConsumption avg value as per the API change
* fix(ui) - ResponseTime avg value as per the API change
* fix(ui) - DomBuildTime avg value as per the API change
* fix(ui) - FrameRate avg value as per the API change
* chore(helm): proper default tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): removed sensitive information from http logs
* ci(backend): adding default parameter value for workflow dispatch
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(backend): deleted empty file
* fix(actions): creating image source file prior
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(helm): variable substitution
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - project list item installation button text change
* fix(ui) - project create validation
* fix(backend): removed unsafe string logs in http service
* chore(kafka): Adding new topic
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(efs-cron): variable name
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - developer tools - hint links
* fix(ui) - session filters - country and platform dropdown values
* chore(helm): updating version
* chore(kafka): Update kafka default message size while provisioning
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(tracker): fix dependency security
* change(ui) - webhook delete confirmation
* change(ui) - assist url to handle when empty
* feat(api): autocomplete replace console with errors
feat(DB): clean extra files
* chore(helm): Adding cron jobs
* change(ui) - set changed flag to false after the metric delete to avoid prompt
* chore(helm): enabling cron only for ee
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): autocomplete remove console
* change(ui) - removed Console filter type
* fix(ui) - timeline position
* fix(helm): RFC naming
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): let user change project in dashboards and select default dashboard
* chore(helm): update registry url
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(DB): return pages_count to DB
* fix(ui) - account settings opt out checkbox
* fix(ui): fix modal width
* fix(ui) - explore circle bg
* fix(ui) - user name overlap
* fix(ui) - empty dashboards create button
* fix(ui): fix timeline position cursor for safari
* fix(ui) - custom metrics errors modal url reset on close
* fix(ui) - onboarding check for siteId
* change(ui) - tracker version
* Update local_deploy.sh
* fix(ui) - drilldown timestamp
* fix(tracker): fix deps for assist
* fix(tracker): update peerjs library
* fix(tracker): update assist v
* fix(tracker): fix type error
* fix(backend): no missing resource relying on resource zero-timing
* Update tracker to v3.5.15
* chore(helm): Adding CSP override variable.
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): added pem file support for kafka ssl setup
* feat(backend): added useBatch setup for kafka producer
* ci(backend): set verbose logging
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(backend): using setKey instead of direct writes
* ci(backend): fix error code
* ci(deploy): Updating the image registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): changed get user id alias
* ci(frontend): removing deprecated steps
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(fix): variable replace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(helm): creating image_override
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix timezone settings
* Added failover mechanism for storage service (#576)
* fix(ui): fix typescript config to remove array iterator error
* fix(ui): refactor timezone settings store/comp
* feat(snippet): opensource snippet
* feat(assist): support multiple IPs
* fix(ui): fix type errors in select /timezones fix
* feat(backend): set size of first part of sessions at 500kb
* change(ui) - removed logs
* fix(ui) - custom metric errors reset url on modal close
* feat(DB): no funnel migration
* fix(ui): fix screensize bug
* feat(DB): migrate super old funnels support
* changed db-migration workflow
Co-authored-by: Shekar Siri <sshekarsiri@gmail.com>
Co-authored-by: sylenien <nikita@openreplay.com>
Co-authored-by: Alex Kaminskii <alex@openreplay.com>
Co-authored-by: Alexander <zavorotynskiy@pm.me>
Co-authored-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: Mehdi Osman <estradino@users.noreply.github.com>
Co-authored-by: Alexander <alexander@openreplay.com>
Co-authored-by: Rajesh Rajendran <rjshrjndrn@users.noreply.github.com>
Co-authored-by: Delirium <sylenien@gmail.com>
372 lines
17 KiB
Python
import json
from typing import List

import chalicelib.utils.helper
import schemas
from chalicelib.core import significance, sessions
from chalicelib.utils import dev
from chalicelib.utils import helper, pg_client
from chalicelib.utils.TimeUTC import TimeUTC

REMOVE_KEYS = ["key", "_key", "startDate", "endDate"]

ALLOW_UPDATE_FOR = ["name", "filter"]


def filter_stages(stages: List[schemas._SessionSearchEventSchema]):
    ALLOW_TYPES = [schemas.EventType.click, schemas.EventType.input,
                   schemas.EventType.location, schemas.EventType.custom,
                   schemas.EventType.click_ios, schemas.EventType.input_ios,
                   schemas.EventType.view_ios, schemas.EventType.custom_ios]
    return [s for s in stages if s.type in ALLOW_TYPES and s.value is not None]


def __parse_events(f_events: List[dict]):
    return [schemas._SessionSearchEventSchema.parse_obj(e) for e in f_events]


def __unparse_events(f_events: List[schemas._SessionSearchEventSchema]):
    return [e.dict() for e in f_events]


def __fix_stages(f_events: List[schemas._SessionSearchEventSchema]):
    if f_events is None:
        return
    events = []
    for e in f_events:
        if e.operator is None:
            e.operator = schemas.SearchEventOperator._is
        if not isinstance(e.value, list):
            e.value = [e.value]
        is_any = sessions._isAny_opreator(e.operator)
        if not is_any and isinstance(e.value, list) and len(e.value) == 0:
            continue
        events.append(e)
    return events


def __transform_old_funnels(events):
    for e in events:
        if not isinstance(e.get("value"), list):
            e["value"] = [e["value"]]
    return events


def create(project_id, user_id, name, filter: schemas.FunnelSearchPayloadSchema, is_public):
    helper.delete_keys_from_dict(filter, REMOVE_KEYS)
    filter.events = filter_stages(stages=filter.events)
    with pg_client.PostgresClient() as cur:
        query = cur.mogrify("""\
            INSERT INTO public.funnels (project_id, user_id, name, filter, is_public)
            VALUES (%(project_id)s, %(user_id)s, %(name)s, %(filter)s::jsonb, %(is_public)s)
            RETURNING *;""",
                            {"user_id": user_id, "project_id": project_id, "name": name,
                             "filter": json.dumps(filter.dict()),
                             "is_public": is_public})
        cur.execute(query)
        r = cur.fetchone()
        r["created_at"] = TimeUTC.datetime_to_timestamp(r["created_at"])
        r = helper.dict_to_camel_case(r)
        r["filter"]["startDate"], r["filter"]["endDate"] = TimeUTC.get_start_end_from_range(r["filter"]["rangeValue"])
    return {"data": r}


def update(funnel_id, user_id, project_id, name=None, filter=None, is_public=None):
    s_query = []
    if filter is not None:
        helper.delete_keys_from_dict(filter, REMOVE_KEYS)
        s_query.append("filter = %(filter)s::jsonb")
    if name is not None and len(name) > 0:
        s_query.append("name = %(name)s")
    if is_public is not None:
        s_query.append("is_public = %(is_public)s")
    if len(s_query) == 0:
        return {"errors": ["Nothing to update"]}
    with pg_client.PostgresClient() as cur:
        query = cur.mogrify(f"""\
            UPDATE public.funnels
            SET {" , ".join(s_query)}
            WHERE funnel_id = %(funnel_id)s
              AND project_id = %(project_id)s
              AND (user_id = %(user_id)s OR is_public)
            RETURNING *;""",
                            {"user_id": user_id, "funnel_id": funnel_id, "name": name,
                             "filter": json.dumps(filter) if filter is not None else None,
                             "is_public": is_public, "project_id": project_id})
        cur.execute(query)
        r = cur.fetchone()
        if r is None:
            return {"errors": ["funnel not found"]}
        r["created_at"] = TimeUTC.datetime_to_timestamp(r["created_at"])
        r = helper.dict_to_camel_case(r)
        r["filter"]["startDate"], r["filter"]["endDate"] = TimeUTC.get_start_end_from_range(r["filter"]["rangeValue"])
        r["filter"] = helper.old_search_payload_to_flat(r["filter"])
    return {"data": r}


def get_by_user(project_id, user_id, range_value=None, start_date=None, end_date=None, details=False):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify(
                f"""\
                SELECT funnel_id, project_id, user_id, name, created_at, deleted_at, is_public
                       {",filter" if details else ""}
                FROM public.funnels
                WHERE project_id = %(project_id)s
                  AND funnels.deleted_at IS NULL
                  AND (funnels.user_id = %(user_id)s OR funnels.is_public);""",
                {"project_id": project_id, "user_id": user_id}
            )
        )
        rows = cur.fetchall()
        rows = helper.list_to_camel_case(rows)
        for row in rows:
            row["createdAt"] = TimeUTC.datetime_to_timestamp(row["createdAt"])
            if details:
                row["filter"]["events"] = filter_stages(__parse_events(row["filter"]["events"]))
                if row.get("filter") is not None and row["filter"].get("events") is not None:
                    row["filter"]["events"] = __transform_old_funnels(__unparse_events(row["filter"]["events"]))

                get_start_end_time(filter_d=row["filter"], range_value=range_value, start_date=start_date,
                                   end_date=end_date)
                counts = sessions.search2_pg(data=schemas.SessionsSearchPayloadSchema.parse_obj(row["filter"]),
                                             project_id=project_id, user_id=None, count_only=True)
                row["sessionsCount"] = counts["countSessions"]
                row["usersCount"] = counts["countUsers"]
                filter_clone = dict(row["filter"])
                overview = significance.get_overview(filter_d=row["filter"], project_id=project_id)
                row["stages"] = overview["stages"]
                row.pop("filter")
                row["stagesCount"] = len(row["stages"])
                # TODO: ask david to count it alone
                row["criticalIssuesCount"] = overview["criticalIssuesCount"]
                row["missedConversions"] = 0 if len(row["stages"]) < 2 \
                    else row["stages"][0]["sessionsCount"] - row["stages"][-1]["sessionsCount"]
                row["filter"] = helper.old_search_payload_to_flat(filter_clone)
    return rows


def get_possible_issue_types(project_id):
    return [{"type": t, "title": chalicelib.utils.helper.get_issue_title(t)} for t in
            ['click_rage', 'dead_click', 'excessive_scrolling',
             'bad_request', 'missing_resource', 'memory', 'cpu',
             'slow_resource', 'slow_page_load', 'crash', 'custom_event_error',
             'js_error']]


def get_start_end_time(filter_d, range_value, start_date, end_date):
    if start_date is not None and end_date is not None:
        filter_d["startDate"], filter_d["endDate"] = start_date, end_date
    elif range_value is not None and len(range_value) > 0:
        filter_d["rangeValue"] = range_value
        filter_d["startDate"], filter_d["endDate"] = TimeUTC.get_start_end_from_range(range_value)
    else:
        filter_d["startDate"], filter_d["endDate"] = TimeUTC.get_start_end_from_range(filter_d["rangeValue"])


def delete(project_id, funnel_id, user_id):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify("""\
                UPDATE public.funnels
                SET deleted_at = timezone('utc'::text, now())
                WHERE project_id = %(project_id)s
                  AND funnel_id = %(funnel_id)s
                  AND (user_id = %(user_id)s OR is_public);""",
                        {"funnel_id": funnel_id, "project_id": project_id, "user_id": user_id})
        )

    return {"data": {"state": "success"}}


def get_sessions(project_id, funnel_id, user_id, range_value=None, start_date=None, end_date=None):
    f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
    if f is None:
        return {"errors": ["funnel not found"]}
    get_start_end_time(filter_d=f["filter"], range_value=range_value, start_date=start_date, end_date=end_date)
    return sessions.search2_pg(data=schemas.SessionsSearchPayloadSchema.parse_obj(f["filter"]),
                               project_id=project_id, user_id=user_id)


def get_sessions_on_the_fly(funnel_id, project_id, user_id, data: schemas.FunnelSearchPayloadSchema):
    data.events = filter_stages(data.events)
    data.events = __fix_stages(data.events)
    if len(data.events) == 0:
        f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
        if f is None:
            return {"errors": ["funnel not found"]}
        get_start_end_time(filter_d=f["filter"], range_value=data.range_value,
                           start_date=data.startDate, end_date=data.endDate)
        data = schemas.FunnelSearchPayloadSchema.parse_obj(f["filter"])
    return sessions.search2_pg(data=data, project_id=project_id, user_id=user_id)


def get_top_insights(project_id, user_id, funnel_id, range_value=None, start_date=None, end_date=None):
    f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
    if f is None:
        return {"errors": ["funnel not found"]}
    get_start_end_time(filter_d=f["filter"], range_value=range_value, start_date=start_date, end_date=end_date)
    insights, total_drop_due_to_issues = significance.get_top_insights(filter_d=f["filter"], project_id=project_id)
    insights = helper.list_to_camel_case(insights)
    if len(insights) > 0:
        # fix: cap the drop count so it cannot exceed the first stage's session count
        if total_drop_due_to_issues > insights[0]["sessionsCount"]:
            total_drop_due_to_issues = insights[0]["sessionsCount"]
        insights[-1]["dropDueToIssues"] = total_drop_due_to_issues
    return {"data": {"stages": insights,
                     "totalDropDueToIssues": total_drop_due_to_issues}}


def get_top_insights_on_the_fly(funnel_id, user_id, project_id, data: schemas.FunnelInsightsPayloadSchema):
    data.events = filter_stages(__parse_events(data.events))
    if len(data.events) == 0:
        f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
        if f is None:
            return {"errors": ["funnel not found"]}
        get_start_end_time(filter_d=f["filter"], range_value=data.rangeValue,
                           start_date=data.startDate,
                           end_date=data.endDate)
        data = schemas.FunnelInsightsPayloadSchema.parse_obj(f["filter"])
    data.events = __fix_stages(data.events)
    insights, total_drop_due_to_issues = significance.get_top_insights(filter_d=data.dict(), project_id=project_id)
    insights = helper.list_to_camel_case(insights)
    if len(insights) > 0:
        # fix: cap the drop count so it cannot exceed the first stage's session count
        if total_drop_due_to_issues > insights[0]["sessionsCount"]:
            total_drop_due_to_issues = insights[0]["sessionsCount"]
        insights[-1]["dropDueToIssues"] = total_drop_due_to_issues
    return {"data": {"stages": insights,
                     "totalDropDueToIssues": total_drop_due_to_issues}}


def get_top_insights_on_the_fly_widget(project_id, data: schemas.CustomMetricSeriesFilterSchema):
    data.events = filter_stages(__parse_events(data.events))
    data.events = __fix_stages(data.events)
    if len(data.events) == 0:
        return {"stages": [], "totalDropDueToIssues": 0}
    insights, total_drop_due_to_issues = significance.get_top_insights(filter_d=data.dict(), project_id=project_id)
    insights = helper.list_to_camel_case(insights)
    if len(insights) > 0:
        # TODO: check if this is correct
        if total_drop_due_to_issues > insights[0]["sessionsCount"]:
            # note: the original inner `if len(insights) == 0` branch was unreachable here
            # (this block only runs when len(insights) > 0), so only its else-path is kept
            total_drop_due_to_issues = insights[0]["sessionsCount"] - insights[-1]["sessionsCount"]
        insights[-1]["dropDueToIssues"] = total_drop_due_to_issues
    return {"stages": insights,
            "totalDropDueToIssues": total_drop_due_to_issues}


def get_issues(project_id, user_id, funnel_id, range_value=None, start_date=None, end_date=None):
    f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
    if f is None:
        return {"errors": ["funnel not found"]}
    get_start_end_time(filter_d=f["filter"], range_value=range_value, start_date=start_date, end_date=end_date)
    return {"data": {
        "issues": helper.dict_to_camel_case(significance.get_issues_list(filter_d=f["filter"], project_id=project_id))
    }}


def get_issues_on_the_fly(funnel_id, user_id, project_id, data: schemas.FunnelSearchPayloadSchema):
    data.events = filter_stages(data.events)
    data.events = __fix_stages(data.events)
    if len(data.events) == 0:
        f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
        if f is None:
            return {"errors": ["funnel not found"]}
        get_start_end_time(filter_d=f["filter"], range_value=data.rangeValue,
                           start_date=data.startDate,
                           end_date=data.endDate)
        data = schemas.FunnelSearchPayloadSchema.parse_obj(f["filter"])
    if len(data.events) < 2:
        return {"issues": []}
    return {
        "issues": helper.dict_to_camel_case(
            significance.get_issues_list(filter_d=data.dict(), project_id=project_id, first_stage=1,
                                         last_stage=len(data.events)))}


def get_issues_on_the_fly_widget(project_id, data: schemas.CustomMetricSeriesFilterSchema):
    data.events = filter_stages(data.events)
    data.events = __fix_stages(data.events)
    # bug fix: the original guard was `len(data.events) < 0`, which can never be true;
    # `< 2` matches get_issues_on_the_fly above (a funnel needs at least two stages)
    if len(data.events) < 2:
        return {"issues": []}

    return {
        "issues": helper.dict_to_camel_case(
            significance.get_issues_list(filter_d=data.dict(), project_id=project_id, first_stage=1,
                                         last_stage=len(data.events)))}


def get(funnel_id, project_id, user_id, flatten=True, fix_stages=True):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify(
                """\
                SELECT *
                FROM public.funnels
                WHERE project_id = %(project_id)s
                  AND deleted_at IS NULL
                  AND funnel_id = %(funnel_id)s
                  AND (user_id = %(user_id)s OR is_public);""",
                {"funnel_id": funnel_id, "project_id": project_id, "user_id": user_id}
            )
        )
        f = helper.dict_to_camel_case(cur.fetchone())
    if f is None:
        return None
    if f.get("filter") is not None and f["filter"].get("events") is not None:
        f["filter"]["events"] = __transform_old_funnels(f["filter"]["events"])
    f["createdAt"] = TimeUTC.datetime_to_timestamp(f["createdAt"])
    f["filter"]["events"] = __parse_events(f["filter"]["events"])
    f["filter"]["events"] = filter_stages(stages=f["filter"]["events"])
    if fix_stages:
        f["filter"]["events"] = __fix_stages(f["filter"]["events"])
    f["filter"]["events"] = [e.dict() for e in f["filter"]["events"]]
    if flatten:
        f["filter"] = helper.old_search_payload_to_flat(f["filter"])
    return f


def search_by_issue(user_id, project_id, funnel_id, issue_id, data: schemas.FunnelSearchPayloadSchema,
                    range_value=None, start_date=None, end_date=None):
    if len(data.events) == 0:
        f = get(funnel_id=funnel_id, project_id=project_id, user_id=user_id, flatten=False)
        if f is None:
            return {"errors": ["funnel not found"]}
        data.startDate = data.startDate if data.startDate is not None else start_date
        data.endDate = data.endDate if data.endDate is not None else end_date
        get_start_end_time(filter_d=f["filter"], range_value=range_value, start_date=data.startDate,
                           end_date=data.endDate)
        data = schemas.FunnelSearchPayloadSchema.parse_obj(f["filter"])

    issues = get_issues_on_the_fly(funnel_id=funnel_id, user_id=user_id, project_id=project_id, data=data) \
        .get("issues", {})
    issues = issues.get("significant", []) + issues.get("insignificant", [])
    issue = None
    for i in issues:
        if i.get("issueId", "") == issue_id:
            issue = i
            break
    return {"sessions": sessions.search2_pg(user_id=user_id, project_id=project_id, issue=issue,
                                            data=data) if issue is not None else {"total": 0, "sessions": []},
            "issue": issue}
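For illustration, the old-funnel migration performed by `__transform_old_funnels` above can be exercised standalone. This is a minimal sketch: `transform_old_funnels` is a self-contained copy of the module's private helper, and the stage dicts are made-up examples.

```python
# Standalone copy of the module's __transform_old_funnels helper: funnels
# saved by older versions stored scalar "value" fields, while the current
# search code expects every stage value to be a list.
def transform_old_funnels(events):
    for e in events:
        if not isinstance(e.get("value"), list):
            e["value"] = [e["value"]]
    return events

# A scalar value gets wrapped; an already-migrated list is left untouched.
stages = [{"type": "CLICK", "value": "#buy-button"},
          {"type": "LOCATION", "value": ["/checkout"]}]
print(transform_old_funnels(stages))
# → [{'type': 'CLICK', 'value': ['#buy-button']}, {'type': 'LOCATION', 'value': ['/checkout']}]
```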