* change(ui) - redirect to the landing url on SSO login
* fix(ui): fix share popup styles
* change(ui) - non-admin user preference restrictions
* fix(ui) - redirect fix
* change(ui) - show installation btn without mouse hover
* feat(api): api-v1 handle wrong projectKey
feat(api): api-v1 get live sessions
* change(ui) - show role edit on hover
* change(ui) - audit trail count with comma
* fix(ui) - audit trail date range custom picker alignment
* change(ui) - show a message when mob file not found
* feat(api): api-v1 fixed search live sessions
* feat(api): api-v1 handle wrong projectKey
* feat(api): fixed assist error response
* fix(tracker): check node scrolls only on start
* fixup! fix(tracker): check node scrolls only on start
* feat(ui/player): scroll view in click map
* feat(ui/player): rm unused check
* New configuration module (#558)
* ci(dbmigrate): Create db migrate when there is change
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix login error/button margins
* fix(ui) - checkbox click
* fix(ui) - search rename and save fixes
* change(ui) - text changes
* fix(ui) - button text nowrap
* fix(ui): fix slowestdomains widget height
* change(ui) - ignore clicks while annotating
* change(ui) - if block with braces
* change(ui) - capitalize first letter in breadcrumb
* feat(db): remove errors from permissions
feat(api): remove errors from permissions
* feat(api): changed reset password response
* fix(ui) - assist active tab list, broken after the new api changes (pagination)
* fix(ui) - assist active tab list, broken after the new api changes (pagination)
* change(ui) - search compare
* fix(ui): last fixes for 1.7
* fix(ui): fix timeline
* fix(ui): small code fixes
* fix(ui): remove unused
* feat(frontend/assist): show when client tab is inactive + fix reconnection status update
* fix(ui) - visibility settings
* feat(assist): refactored extractSessionInfo
feat(assist): hardcoded session's attributes
* Added snabbdom (JS)
* fix(tracker): version check works with x.x.x-beta versions
* fix(backend): keep the highest user's timestamp instead of the latest message timestamp for correct session duration value
* feat(backend/s3): added file tag RETENTION (#561)
* change(ui) - search optimization and autocomplete improvements
* feat(backend/assets): added new metrics assets_downloaded
* change(ui) - show back the date range in bookmarks since the api is filtering by daterange
* feat(backend-assets): custom headers for cacher requests
* chore(backend): no tidy in dockerfile (local build speed up)
* feat(backend/assets): added proxy support for cacher module
* feat(backend/storage): set retention env variable as not required
* fix(ui): fix jira issues
* ci(helm): use kubectl for deployment
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(tracker):3.5.13: performance improvements for a case of extensive dom
* fix(backend): added missed err var and continue statement
* ci(helm): forcing namespace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): fixed slowest_domains query
* ci(helm): update helm deployment method
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - filter dropdown colors
* fix(ui) - speed index location avg attribute changed to value
* ci(api): enable kubectl apply
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - widget y axis label
* feat(api): fixed slowest_domains query
* chore(helm): Adding namespaces to all templates (#565)
* feat(api): assist type-autocomplete
* feat(api): assist global-autocomplete
* feat(sourcemaps): include wasm file in build
* feat(sourcemaps-reader): refactored
* fix(ui): fix data for funnels
* fix(ui): fix all sessions section margin
* fix(ui) - assist loader flag
* fix(ui) - assist loader flag
* fix(ui): fix weird check
* feat(api): autocomplete accept unsupported types
* feat(ui): migrate to yarn v3
* feat(ui): minor fixes for installation
* feat(ui): add typescript plugin to yarn
* chore(helm): Ability to override image registry
* chore(helm): Overriding openreplay docker registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix control arrows on firefox
* feat(crons): EE crons
* feat(api): fixed build script
* feat(alerts): fixed build script
* feat(crons): fixed build script
* chore(helm): Updating cron version
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(crons): changes
* chore(helm): optional minio ingress
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(crons): fix build script
feat(alerts): fix build script
* Revert "chore(helm): Updating cron version"
This reverts commit 3ca190ea2f.
* feat(crons): fix build script
* feat(crons): fix Dockerfile
* feat(api): fixed metadata change-case
* change(ui) - remove capitalize for the meta value
* change(ui) - autocomplete improvements with custom textfield
* fix(tracker):3.5.13+:reuse metadata on internal-caused restarts
* fix(tracker-assist):3.5.13:send active:true on start; scroll behavior fix
* change(ui) - filters autocomplete blur on pressing Enter key
* fix(tracker): fix node v to lower
* fix(tracker): fix deps
* fix(tracker): fix deps
* fix(ui) - dashboard modal width
* change(ui) - filter dropdown overflow
* chore(helm): clickhouse reclaim policy to retain
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(tracker): fix engine max v
* fix(ui): load metadata in assist tab for sorting
* fix(ui): rm unneeded api call
* fix(tracker): build script to cjs
* change(ui) - removed sample data
* chore(tracker): remove upper node version limit
* Updating Beacon size
Beacon size should be <= QUEUE_MESSAGE_SIZE_LIMIT
* feat(crons): run 24/7
feat(alerts): support env-file override
* feat(api): changed EE env handler
* fix(ui): fix sessions search modal
* change(ui) - margin for error message
* change(ui) - disable assist sort when there are no meta options to choose
* chore(helm): Adding utilities service namespace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - dashboard date range selection reload, metric not found message
* change(ui) - disable clearsearch in assist when there are no filters
* feat(api): fixed EE env handler
* chore(helm): Adding migration namespaces
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - report logo path
* chore(helm): Removing unnecessary SA
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(crons): changed crons
* feat(api): accept wrong metric_id
* feat(crons): changed env handler
feat(api): changed env handler
feat(alerts): changed env handler
* feat(utilities): support old version of nodejs
* feat(crons): changed env handler
feat(api): changed env handler
feat(alerts): changed env handler
* fix(tracker): fix srcset tracking
* chore(build): Adding frontend
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(assist): changed general helper
* feat(assist): changed general helper
* fix(ui): fix widget pagination (#570)
* feat(crons): changed entrypoint
* feat(player): dev-log on skipping message
* fix(tracker): removeNode mutation priority over attributes
* fix(tracker): capture relative img timing;use startsWith instead of substr; codestyle fix
* chore(build): fixing api build script
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* chore(ci): faster deployment
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - assist list show active status
* chore(actions): option to build all/specific services in GH
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - slowest domain metric data as per the api changes
* ci(helm): updated variable name
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(backend): cherrypick changes to ee
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): disabled pprof in http service
* fix(ui) - TimeToRender avg value as per the API change
* fix(ui) - ResponseTimeDistribution avg value as per the API change
* fix(ui) - MemoryConsumption avg value as per the API change
* fix(ui) - ResponseTime avg value as per the API change
* fix(ui) - DomBuildTime avg value as per the API change
* fix(ui) - FrameRate avg value as per the API change
* chore(helm): proper default tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): removed sensitive information from http logs
* ci(backend): adding default parameter value for workflow dispatch
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(backend): deleted empty file
* fix(actions): creating image source file prior
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(helm): variable substitution
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - project list item installation button text change
* fix(ui) - project create validation
* fix(backend): removed unsafe string logs in http service
* chore(kafka): Adding new topic
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(efs-cron): variable name
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - developer tools - hint links
* fix(ui) - session filters - country and platform dropdown values
* chore(helm): updating version
* chore(kafka): Update kafka default message size while provisioning
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(tracker): fix dependency security
* change(ui) - webhook delete confirmation
* change(ui) - assist url to handle when empty
* feat(api): autocomplete replace console with errors
feat(DB): clean extra files
* chore(helm): Adding cron jobs
* change(ui) - set changed flag to false after the metric delete to avoid prompt
* chore(helm): enabling cron only for ee
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): autocomplete remove console
* change(ui) - removed Console filter type
* fix(ui) - timeline position
* fix(helm): RFC naming
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): let user change project in dashboards and select default dashboard
* chore(helm): update registry url
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(DB): return pages_count to DB
* fix(ui) - account settings opt out checkbox
* fix(ui): fix modal width
* fix(ui) - explore circle bg
* fix(ui) - user name overlap
* fix(ui) - empty dashboards create button
* fix(ui): fix timeline position cursor for safari
* fix(ui) - custom metrics errors modal url reset on close
* fix(ui) - onboarding check for siteId
* change(ui) - tracker version
* Update local_deploy.sh
* fix(ui) - drilldown timestamp
* fix(tracker): fix deps for assist
* fix(tracker): update peerjs library
* fix(tracker): update assist v
* fix(tracker): fix type error
* fix(backend): no missing resource relying on resource zero-timing
* Update tracker to v3.5.15
* chore(helm): Adding CSP override variable.
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): added pem file support for kafka ssl setup
* feat(backend): added useBatch setup for kafka producer
* ci(backend): set verbose logging
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(backend): using setKey instead of direct writes
* ci(backend): fix error code
* ci(deploy): Updating the image registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): changed get user id alias
* ci(frontend): removing deprecated steps
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(fix): variable replace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(helm): creating image_override
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix timezone settings
* Added failover mechanism for storage service (#576)
* fix(ui): fix typescript config to remove array iterator error
* fix(ui): refactor timezone settings store/comp
* feat(snippet): opensource snippet
* feat(assist): support multiple IPs
* fix(ui): fix type errors in select /timezones fix
* feat(backend): set size of first part of sessions at 500kb
* change(ui) - removed logs
* fix(ui) - custom metric errors reset url on modal close
* feat(DB): no funnel migration
* fix(ui): fix screensize bug
* feat(DB): migrate super old funnels support
* changed db-migration workflow
Co-authored-by: Shekar Siri <sshekarsiri@gmail.com>
Co-authored-by: sylenien <nikita@openreplay.com>
Co-authored-by: Alex Kaminskii <alex@openreplay.com>
Co-authored-by: Alexander <zavorotynskiy@pm.me>
Co-authored-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: Mehdi Osman <estradino@users.noreply.github.com>
Co-authored-by: Alexander <alexander@openreplay.com>
Co-authored-by: Rajesh Rajendran <rjshrjndrn@users.noreply.github.com>
Co-authored-by: Delirium <sylenien@gmail.com>
import json

from typing import Union

import schemas
from chalicelib.core import sessions, funnels, errors, issues
from chalicelib.utils import helper, pg_client
from chalicelib.utils.TimeUTC import TimeUTC

PIE_CHART_GROUP = 5

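# Builds the raw chart data for a custom-metric preview: one search2_series call per
# series, with extra handling for the progress view (previous-period comparison) and
# the pie-chart view (top-N values plus an "Others" bucket).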
def __try_live(project_id, data: schemas.TryCustomMetricsPayloadSchema):
    results = []
    for i, s in enumerate(data.series):
        s.filter.startDate = data.startTimestamp
        s.filter.endDate = data.endTimestamp
        results.append(sessions.search2_series(data=s.filter, project_id=project_id, density=data.density,
                                               view_type=data.view_type, metric_type=data.metric_type,
                                               metric_of=data.metric_of, metric_value=data.metric_value))
        if data.view_type == schemas.MetricTimeseriesViewType.progress:
            r = {"count": results[-1]}
            diff = s.filter.endDate - s.filter.startDate
            s.filter.endDate = s.filter.startDate
            s.filter.startDate = s.filter.endDate - diff
            r["previousCount"] = sessions.search2_series(data=s.filter, project_id=project_id, density=data.density,
                                                         view_type=data.view_type, metric_type=data.metric_type,
                                                         metric_of=data.metric_of, metric_value=data.metric_value)
            r["countProgress"] = helper.__progress(old_val=r["previousCount"], new_val=r["count"])
            # r["countProgress"] = ((r["count"] - r["previousCount"]) / r["previousCount"]) * 100 \
            #     if r["previousCount"] > 0 else 0
            r["seriesName"] = s.name if s.name else i + 1
            r["seriesId"] = s.series_id if s.series_id else None
            results[-1] = r
        elif data.view_type == schemas.MetricTableViewType.pie_chart:
            if len(results[i].get("values", [])) > PIE_CHART_GROUP:
                results[i]["values"] = results[i]["values"][:PIE_CHART_GROUP] \
                                       + [{
                                           "name": "Others", "group": True,
                                           "sessionCount": sum(r["sessionCount"] for r in results[i]["values"][PIE_CHART_GROUP:])
                                       }]

    return results

def __is_funnel_chart(data: schemas.TryCustomMetricsPayloadSchema):
    return data.metric_type == schemas.MetricType.funnel


def __get_funnel_chart(project_id, data: schemas.TryCustomMetricsPayloadSchema):
    if len(data.series) == 0:
        return {
            "stages": [],
            "totalDropDueToIssues": 0
        }
    data.series[0].filter.startDate = data.startTimestamp
    data.series[0].filter.endDate = data.endTimestamp
    return funnels.get_top_insights_on_the_fly_widget(project_id=project_id, data=data.series[0].filter)


def __is_errors_list(data):
    return data.metric_type == schemas.MetricType.table \
           and data.metric_of == schemas.TableMetricOfType.errors


def __get_errors_list(project_id, user_id, data):
    if len(data.series) == 0:
        return {
            "total": 0,
            "errors": []
        }
    data.series[0].filter.startDate = data.startTimestamp
    data.series[0].filter.endDate = data.endTimestamp
    data.series[0].filter.page = data.page
    data.series[0].filter.limit = data.limit
    return errors.search(data.series[0].filter, project_id=project_id, user_id=user_id)


def __is_sessions_list(data):
    return data.metric_type == schemas.MetricType.table \
           and data.metric_of == schemas.TableMetricOfType.sessions


def __get_sessions_list(project_id, user_id, data):
    if len(data.series) == 0:
        print("empty series")
        return {
            "total": 0,
            "sessions": []
        }
    data.series[0].filter.startDate = data.startTimestamp
    data.series[0].filter.endDate = data.endTimestamp
    data.series[0].filter.page = data.page
    data.series[0].filter.limit = data.limit
    return sessions.search2_pg(data=data.series[0].filter, project_id=project_id, user_id=user_id)

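# Entry point for rendering a metric preview: funnel, errors-table and sessions-table
# metrics are delegated to their dedicated helpers; everything else goes through
# __try_live, and time-series results are merged into one row per timestamp keyed by
# series name (or series position when no name is set).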
def merged_live(project_id, data: schemas.TryCustomMetricsPayloadSchema, user_id=None):
    if __is_funnel_chart(data):
        return __get_funnel_chart(project_id=project_id, data=data)
    elif __is_errors_list(data):
        return __get_errors_list(project_id=project_id, user_id=user_id, data=data)
    elif __is_sessions_list(data):
        return __get_sessions_list(project_id=project_id, user_id=user_id, data=data)

    series_charts = __try_live(project_id=project_id, data=data)
    if data.view_type == schemas.MetricTimeseriesViewType.progress or data.metric_type == schemas.MetricType.table:
        return series_charts
    results = [{}] * len(series_charts[0])
    for i in range(len(results)):
        for j, series_chart in enumerate(series_charts):
            results[i] = {**results[i], "timestamp": series_chart[i]["timestamp"],
                          data.series[j].name if data.series[j].name else j + 1: series_chart[i]["count"]}
    return results

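# Overlays the request payload on top of a stored metric definition: the payload's
# series (if any) replace the stored ones, and any extra filters/events from the
# payload are appended to every series filter before the merged schema is returned.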
def __merge_metric_with_data(metric, data: Union[schemas.CustomMetricChartPayloadSchema,
                                                 schemas.CustomMetricSessionsPayloadSchema]) \
        -> Union[schemas.CreateCustomMetricsSchema, None]:
    if data.series is not None and len(data.series) > 0:
        metric["series"] = data.series
    metric: schemas.CreateCustomMetricsSchema = schemas.CreateCustomMetricsSchema.parse_obj({**data.dict(), **metric})
    if len(data.filters) > 0 or len(data.events) > 0:
        for s in metric.series:
            if len(data.filters) > 0:
                s.filter.filters += data.filters
            if len(data.events) > 0:
                s.filter.events += data.events
    return metric

def make_chart(project_id, user_id, metric_id, data: schemas.CustomMetricChartPayloadSchema, metric=None):
    if metric is None:
        metric = get(metric_id=metric_id, project_id=project_id, user_id=user_id, flatten=False)
    if metric is None:
        return None
    metric: schemas.CreateCustomMetricsSchema = __merge_metric_with_data(metric=metric, data=data)

    return merged_live(project_id=project_id, data=metric, user_id=user_id)
    # if __is_funnel_chart(metric):
    #     return __get_funnel_chart(project_id=project_id, data=metric)
    # elif __is_errors_list(metric):
    #     return __get_errors_list(project_id=project_id, user_id=user_id, data=metric)
    #
    # series_charts = __try_live(project_id=project_id, data=metric)
    # if metric.view_type == schemas.MetricTimeseriesViewType.progress or metric.metric_type == schemas.MetricType.table:
    #     return series_charts
    # results = [{}] * len(series_charts[0])
    # for i in range(len(results)):
    #     for j, series_chart in enumerate(series_charts):
    #         results[i] = {**results[i], "timestamp": series_chart[i]["timestamp"],
    #                       metric.series[j].name: series_chart[i]["count"]}
    # return results

def get_sessions(project_id, user_id, metric_id, data: schemas.CustomMetricSessionsPayloadSchema):
    metric = get(metric_id=metric_id, project_id=project_id, user_id=user_id, flatten=False)
    if metric is None:
        return None
    metric: schemas.CreateCustomMetricsSchema = __merge_metric_with_data(metric=metric, data=data)
    if metric is None:
        return None
    results = []
    for s in metric.series:
        s.filter.startDate = data.startTimestamp
        s.filter.endDate = data.endTimestamp
        s.filter.limit = data.limit
        s.filter.page = data.page
        results.append({"seriesId": s.series_id, "seriesName": s.name,
                        **sessions.search2_pg(data=s.filter, project_id=project_id, user_id=user_id)})

    return results

def get_funnel_issues(project_id, user_id, metric_id, data: schemas.CustomMetricSessionsPayloadSchema):
    metric = get(metric_id=metric_id, project_id=project_id, user_id=user_id, flatten=False)
    if metric is None:
        return None
    metric: schemas.CreateCustomMetricsSchema = __merge_metric_with_data(metric=metric, data=data)
    if metric is None:
        return None
    for s in metric.series:
        s.filter.startDate = data.startTimestamp
        s.filter.endDate = data.endTimestamp
        s.filter.limit = data.limit
        s.filter.page = data.page
        return {"seriesId": s.series_id, "seriesName": s.name,
                **funnels.get_issues_on_the_fly_widget(project_id=project_id, data=s.filter)}


def get_errors_list(project_id, user_id, metric_id, data: schemas.CustomMetricSessionsPayloadSchema):
    metric = get(metric_id=metric_id, project_id=project_id, user_id=user_id, flatten=False)
    if metric is None:
        return None
    metric: schemas.CreateCustomMetricsSchema = __merge_metric_with_data(metric=metric, data=data)
    if metric is None:
        return None
    for s in metric.series:
        s.filter.startDate = data.startTimestamp
        s.filter.endDate = data.endTimestamp
        s.filter.limit = data.limit
        s.filter.page = data.page
        return {"seriesId": s.series_id, "seriesName": s.name,
                **errors.search(data=s.filter, project_id=project_id, user_id=user_id)}


def try_sessions(project_id, user_id, data: schemas.CustomMetricSessionsPayloadSchema):
    results = []
    if data.series is None:
        return results
    for s in data.series:
        s.filter.startDate = data.startTimestamp
        s.filter.endDate = data.endTimestamp
        s.filter.limit = data.limit
        s.filter.page = data.page
        results.append({"seriesId": None, "seriesName": s.name,
                        **sessions.search2_pg(data=s.filter, project_id=project_id, user_id=user_id)})

    return results

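# Persists a new metric and its series in a single round trip: a CTE inserts the
# metrics row, then metric_series rows are inserted with one VALUES tuple per series.
# For two series the generated clause expands roughly to
#   ((SELECT metric_id FROM m), %(index_0)s, %(name_0)s, %(filter_0)s::jsonb),
#   ((SELECT metric_id FROM m), %(index_1)s, %(name_1)s, %(filter_1)s::jsonb)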
def create(project_id, user_id, data: schemas.CreateCustomMetricsSchema, dashboard=False):
    with pg_client.PostgresClient() as cur:
        _data = {}
        for i, s in enumerate(data.series):
            for k in s.dict().keys():
                _data[f"{k}_{i}"] = s.__getattribute__(k)
            _data[f"index_{i}"] = i
            _data[f"filter_{i}"] = s.filter.json()
        series_len = len(data.series)
        data.series = None
        params = {"user_id": user_id, "project_id": project_id,
                  "default_config": json.dumps(data.config.dict()),
                  **data.dict(), **_data}
        query = cur.mogrify(f"""\
            WITH m AS (INSERT INTO metrics (project_id, user_id, name, is_public,
                                            view_type, metric_type, metric_of, metric_value,
                                            metric_format, default_config)
                       VALUES (%(project_id)s, %(user_id)s, %(name)s, %(is_public)s,
                               %(view_type)s, %(metric_type)s, %(metric_of)s, %(metric_value)s,
                               %(metric_format)s, %(default_config)s)
                       RETURNING *)
            INSERT
            INTO metric_series(metric_id, index, name, filter)
            VALUES {",".join([f"((SELECT metric_id FROM m), %(index_{i})s, %(name_{i})s, %(filter_{i})s::jsonb)"
                              for i in range(series_len)])}
            RETURNING metric_id;""", params)

        cur.execute(query)
        r = cur.fetchone()
    if dashboard:
        return r["metric_id"]
    return {"data": get(metric_id=r["metric_id"], project_id=project_id, user_id=user_id)}

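# Updates a metric and reconciles its series in one statement: incoming series are
# split into new (n_*), updated (u_*) and deleted (d_series_ids) sets, each becoming
# an optional CTE (INSERT / UPDATE ... FROM VALUES / DELETE) ahead of the metrics UPDATE.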
def update(metric_id, user_id, project_id, data: schemas.UpdateCustomMetricsSchema):
    metric = get(metric_id=metric_id, project_id=project_id, user_id=user_id, flatten=False)
    if metric is None:
        return None
    series_ids = [r["seriesId"] for r in metric["series"]]
    n_series = []
    d_series_ids = []
    u_series = []
    u_series_ids = []
    params = {"metric_id": metric_id, "is_public": data.is_public, "name": data.name,
              "user_id": user_id, "project_id": project_id, "view_type": data.view_type,
              "metric_type": data.metric_type, "metric_of": data.metric_of,
              "metric_value": data.metric_value, "metric_format": data.metric_format}
    for i, s in enumerate(data.series):
        prefix = "u_"
        if s.index is None:
            s.index = i
        if s.series_id is None or s.series_id not in series_ids:
            n_series.append({"i": i, "s": s})
            prefix = "n_"
        else:
            u_series.append({"i": i, "s": s})
            u_series_ids.append(s.series_id)
        ns = s.dict()
        for k in ns.keys():
            if k == "filter":
                ns[k] = json.dumps(ns[k])
            params[f"{prefix}{k}_{i}"] = ns[k]
    for i in series_ids:
        if i not in u_series_ids:
            d_series_ids.append(i)
    params["d_series_ids"] = tuple(d_series_ids)

    with pg_client.PostgresClient() as cur:
        sub_queries = []
        if len(n_series) > 0:
            sub_queries.append(f"""\
                n AS (INSERT INTO metric_series (metric_id, index, name, filter)
                      VALUES {",".join([f"(%(metric_id)s, %(n_index_{s['i']})s, %(n_name_{s['i']})s, %(n_filter_{s['i']})s::jsonb)"
                                        for s in n_series])}
                      RETURNING 1)""")
        if len(u_series) > 0:
            sub_queries.append(f"""\
                u AS (UPDATE metric_series
                      SET name=series.name,
                          filter=series.filter,
                          index=series.index
                      FROM (VALUES {",".join([f"(%(u_series_id_{s['i']})s,%(u_index_{s['i']})s,%(u_name_{s['i']})s,%(u_filter_{s['i']})s::jsonb)"
                                              for s in u_series])}) AS series(series_id, index, name, filter)
                      WHERE metric_series.metric_id =%(metric_id)s AND metric_series.series_id=series.series_id
                      RETURNING 1)""")
        if len(d_series_ids) > 0:
            sub_queries.append("""\
                d AS (DELETE FROM metric_series WHERE metric_id =%(metric_id)s AND series_id IN %(d_series_ids)s
                      RETURNING 1)""")
        query = cur.mogrify(f"""\
            {"WITH " if len(sub_queries) > 0 else ""}{",".join(sub_queries)}
            UPDATE metrics
            SET name = %(name)s, is_public = %(is_public)s,
                view_type = %(view_type)s, metric_type = %(metric_type)s,
                metric_of = %(metric_of)s, metric_value = %(metric_value)s,
                metric_format = %(metric_format)s,
                edited_at = timezone('utc'::text, now())
            WHERE metric_id = %(metric_id)s
              AND project_id = %(project_id)s
              AND (user_id = %(user_id)s OR is_public)
            RETURNING metric_id;""", params)
        cur.execute(query)
    return get(metric_id=metric_id, project_id=project_id, user_id=user_id)

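# Lists all metrics visible to the user (own + public), optionally embedding their
# series, plus the dashboards each metric is attached to and the owner's email.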
def get_all(project_id, user_id, include_series=False):
    with pg_client.PostgresClient() as cur:
        sub_join = ""
        if include_series:
            sub_join = """LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(metric_series.* ORDER BY index),'[]'::jsonb) AS series
                                             FROM metric_series
                                             WHERE metric_series.metric_id = metrics.metric_id
                                               AND metric_series.deleted_at ISNULL
                                             ) AS metric_series ON (TRUE)"""
        cur.execute(
            cur.mogrify(
                f"""SELECT *
                    FROM metrics
                         {sub_join}
                         LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(connected_dashboards.* ORDER BY is_public,name),'[]'::jsonb) AS dashboards
                                            FROM (SELECT DISTINCT dashboard_id, name, is_public
                                                  FROM dashboards INNER JOIN dashboard_widgets USING (dashboard_id)
                                                  WHERE deleted_at ISNULL
                                                    AND dashboard_widgets.metric_id = metrics.metric_id
                                                    AND project_id = %(project_id)s
                                                    AND ((dashboards.user_id = %(user_id)s OR is_public))) AS connected_dashboards
                                            ) AS connected_dashboards ON (TRUE)
                         LEFT JOIN LATERAL (SELECT email AS owner_email
                                            FROM users
                                            WHERE deleted_at ISNULL
                                              AND users.user_id = metrics.user_id
                                            ) AS owner ON (TRUE)
                    WHERE metrics.project_id = %(project_id)s
                      AND metrics.deleted_at ISNULL
                      AND (user_id = %(user_id)s OR metrics.is_public)
                    ORDER BY metrics.edited_at DESC, metrics.created_at DESC;""",
                {"project_id": project_id, "user_id": user_id}
            )
        )
        rows = cur.fetchall()
        if include_series:
            for r in rows:
                # r["created_at"] = TimeUTC.datetime_to_timestamp(r["created_at"])
                for s in r["series"]:
                    s["filter"] = helper.old_search_payload_to_flat(s["filter"])
        else:
            for r in rows:
                r["created_at"] = TimeUTC.datetime_to_timestamp(r["created_at"])
                r["edited_at"] = TimeUTC.datetime_to_timestamp(r["edited_at"])
        rows = helper.list_to_camel_case(rows)
    return rows

def delete(project_id, metric_id, user_id):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify("""\
                UPDATE public.metrics
                SET deleted_at = timezone('utc'::text, now()), edited_at = timezone('utc'::text, now())
                WHERE project_id = %(project_id)s
                  AND metric_id = %(metric_id)s
                  AND (user_id = %(user_id)s OR is_public);""",
                        {"metric_id": metric_id, "project_id": project_id, "user_id": user_id})
        )

    return {"state": "success"}

def get(metric_id, project_id, user_id, flatten=True):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify(
                """SELECT *
                   FROM metrics
                        LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(metric_series.* ORDER BY index),'[]'::jsonb) AS series
                                           FROM metric_series
                                           WHERE metric_series.metric_id = metrics.metric_id
                                             AND metric_series.deleted_at ISNULL
                                           ) AS metric_series ON (TRUE)
                        LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(connected_dashboards.* ORDER BY is_public,name),'[]'::jsonb) AS dashboards
                                           FROM (SELECT dashboard_id, name, is_public
                                                 FROM dashboards
                                                 WHERE deleted_at ISNULL
                                                   AND project_id = %(project_id)s
                                                   AND ((user_id = %(user_id)s OR is_public))) AS connected_dashboards
                                           ) AS connected_dashboards ON (TRUE)
                        LEFT JOIN LATERAL (SELECT email AS owner_email
                                           FROM users
                                           WHERE deleted_at ISNULL
                                             AND users.user_id = metrics.user_id
                                           ) AS owner ON (TRUE)
                   WHERE metrics.project_id = %(project_id)s
                     AND metrics.deleted_at ISNULL
                     AND (metrics.user_id = %(user_id)s OR metrics.is_public)
                     AND metrics.metric_id = %(metric_id)s
                   ORDER BY created_at;""",
                {"metric_id": metric_id, "project_id": project_id, "user_id": user_id}
            )
        )
        row = cur.fetchone()
        if row is None:
            return None
        row["created_at"] = TimeUTC.datetime_to_timestamp(row["created_at"])
        row["edited_at"] = TimeUTC.datetime_to_timestamp(row["edited_at"])
        if flatten:
            for s in row["series"]:
                s["filter"] = helper.old_search_payload_to_flat(s["filter"])
    return helper.dict_to_camel_case(row)

def get_with_template(metric_id, project_id, user_id, include_dashboard=True):
    with pg_client.PostgresClient() as cur:
        sub_query = ""
        if include_dashboard:
            sub_query = """LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(connected_dashboards.* ORDER BY is_public,name),'[]'::jsonb) AS dashboards
                                              FROM (SELECT dashboard_id, name, is_public
                                                    FROM dashboards
                                                    WHERE deleted_at ISNULL
                                                      AND project_id = %(project_id)s
                                                      AND ((user_id = %(user_id)s OR is_public))) AS connected_dashboards
                                              ) AS connected_dashboards ON (TRUE)"""
        cur.execute(
            cur.mogrify(
                f"""SELECT *
                    FROM metrics
                         LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(metric_series.* ORDER BY index),'[]'::jsonb) AS series
                                            FROM metric_series
                                            WHERE metric_series.metric_id = metrics.metric_id
                                              AND metric_series.deleted_at ISNULL
                                            ) AS metric_series ON (TRUE)
                         {sub_query}
                    WHERE (metrics.project_id = %(project_id)s OR metrics.project_id ISNULL)
                      AND metrics.deleted_at ISNULL
                      AND (metrics.user_id = %(user_id)s OR metrics.is_public)
                      AND metrics.metric_id = %(metric_id)s
                    ORDER BY created_at;""",
                {"metric_id": metric_id, "project_id": project_id, "user_id": user_id}
            )
        )
        row = cur.fetchone()
    return helper.dict_to_camel_case(row)

def get_series_for_alert(project_id, user_id):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify(
                """SELECT series_id AS value,
                          metrics.name || '.' || (COALESCE(metric_series.name, 'series ' || index)) || '.count' AS name,
                          'count' AS unit,
                          FALSE AS predefined,
                          metric_id,
                          series_id
                   FROM metric_series
                        INNER JOIN metrics USING (metric_id)
                   WHERE metrics.deleted_at ISNULL
                     AND metrics.project_id = %(project_id)s
                     AND metrics.metric_type = 'timeseries'
                     AND (user_id = %(user_id)s OR is_public)
                   ORDER BY name;""",
                {"project_id": project_id, "user_id": user_id}
            )
        )
        rows = cur.fetchall()
    return helper.list_to_camel_case(rows)

def change_state(project_id, metric_id, user_id, status):
    with pg_client.PostgresClient() as cur:
        cur.execute(
            cur.mogrify("""\
                UPDATE public.metrics
                SET active = %(status)s
                WHERE metric_id = %(metric_id)s
                  AND (user_id = %(user_id)s OR is_public);""",
                        {"metric_id": metric_id, "status": status, "user_id": user_id})
        )
    return get(metric_id=metric_id, project_id=project_id, user_id=user_id)

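# For a funnel metric, returns the sessions affected by one specific issue: the issue
# is first looked up in the on-the-fly funnel issues (significant + insignificant);
# if not found there, it falls back to issues.get with zeroed impact counters.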
def get_funnel_sessions_by_issue(user_id, project_id, metric_id, issue_id,
                                 data: schemas.CustomMetricSessionsPayloadSchema
                                 # , range_value=None, start_date=None, end_date=None
                                 ):
    metric = get(metric_id=metric_id, project_id=project_id, user_id=user_id, flatten=False)
    if metric is None:
        return None
    metric: schemas.CreateCustomMetricsSchema = __merge_metric_with_data(metric=metric, data=data)
    if metric is None:
        return None
    for s in metric.series:
        s.filter.startDate = data.startTimestamp
        s.filter.endDate = data.endTimestamp
        s.filter.limit = data.limit
        s.filter.page = data.page
        issues_list = funnels.get_issues_on_the_fly_widget(project_id=project_id, data=s.filter).get("issues", {})
        issues_list = issues_list.get("significant", []) + issues_list.get("insignificant", [])
        issue = None
        for i in issues_list:
            if i.get("issueId", "") == issue_id:
                issue = i
                break
        if issue is None:
            issue = issues.get(project_id=project_id, issue_id=issue_id)
            if issue is not None:
                issue = {**issue,
                         "affectedSessions": 0,
                         "affectedUsers": 0,
                         "conversionImpact": 0,
                         "lostConversions": 0,
                         "unaffectedSessions": 0}
        return {"seriesId": s.series_id, "seriesName": s.name,
                "sessions": sessions.search2_pg(user_id=user_id, project_id=project_id,
                                                issue=issue, data=s.filter)
                if issue is not None else {"total": 0, "sessions": []},
                "issue": issue}