* change(ui) - redirect to the landing url on SSO login
* fix(ui): fix share popup styles
* change(ui) - non admin user preference restrictions
* fix(ui) - redirect fix
* change(ui) - show installation btn without mouse hover
* feat(api): api-v1 handle wrong projectKey
feat(api): api-v1 get live sessions
* change(ui) - show role edit on hover
* change(ui) - audit trail count with comma
* fix(ui) - audit trail date range custom picker alignment
* change(ui) - show a message when mob file not found
* feat(api): api-v1 fixed search live sessions
* feat(api): api-v1 handle wrong projectKey
* feat(api): fixed assist error response
* fix(tracker): check node scrolls only on start
* fixup! fix(tracker): check node scrolls only on start
* feat(ui/player): scroll view in click map
* feat(ui/player): rm unused check
* New configuration module (#558)
* ci(dbmigrate): Create db migrate when there is change
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix login error/button margins
* fix(ui) - checkbox click
* fix(ui) - search rename and save fixes
* change(ui) - text changes
* fix(ui) - button text nowrap
* fix(ui): fix slowestdomains widget height
* change(ui) - ignore clicks while annotating
* change(ui) - if block with braces
* change(ui) - capitalize first letter in breadcrumb
* feat(db): remove errors from permissions
feat(api): remove errors from permissions
* feat(api): changed reset password response
* fix(ui) - assist active tab list, broken with the new api changes (pagination)
* fix(ui) - assist active tab list, broken with the new api changes (pagination)
* change(ui) - search compare
* fix(ui): last fixes for 1.7
* fix(ui): fix timeline
* fix(ui): small code fixes
* fix(ui): remove unused
* feat(frontend/assist): show when client tab is inactive + fix reconnection status update
* fix(ui) - visibility settings
* feat(assist): refactored extractSessionInfo
feat(assist): hardcoded session's attributes
* Added snabbdom (JS)
* fix(tracker): version check works with x.x.x-beta versions
* fix(backend): keep the highest user's timestamp instead of the latest message timestamp for correct session duration value
* feat(backend/s3): added file tag RETENTION (#561)
* change(ui) - search optimization and autocomplete improvements
* feat(backend/assets): added new metrics assets_downloaded
* change(ui) - show back the date range in bookmarks since the api is filtering by daterange
* feat(backend-assets): custom headers for cacher requests
* chore(backend): no tidy in dockerfile (local build speed up)
* feat(backend/assets): added proxy support for cacher module
* feat(backend/storage): set retention env variable as not required
* fix(ui): fix jira issues
* ci(helm): use kubectl for deployment
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(tracker):3.5.13: performance improvements for a case of extensive dom
* fix(backend): added missed err var and continue statement
* ci(helm): forcing namespace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): fixed slowest_domains query
* ci(helm): update helm deployment method
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - filter dropdown colors
* fix(ui) - speed index location avg attribute changed to value
* ci(api): enable kubectl apply
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - widget y axis label
* feat(api): fixed slowest_domains query
* chore(helm): Adding namespaces to all templates (#565)
* feat(api): assist type-autocomplete
* feat(api): assist global-autocomplete
* feat(sourcemaps): include wasm file in build
* feat(sourcemaps-reader): refactored
* fix(ui): fix data for funnels
* fix(ui): fix all sessions section margin
* fix(ui) - assist loader flag
* fix(ui) - assist loader flag
* fix(ui): fix weird check
* feat(api): autocomplete accept unsupported types
* feat(ui): migrate to yarn v3
* feat(ui): minor fixes for installation
* feat(ui): add typescript plugin to yarn
* chore(helm): Ability to override image registry
* chore(helm): Overriding openreplay docker registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix control arrows on firefox
* feat(crons): EE crons
* feat(api): fixed build script
* feat(alerts): fixed build script
* feat(crons): fixed build script
* chore(helm): Updating cron version
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(crons): changes
* chore(helm): optional minio ingress
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(crons): fix build script
feat(alerts): fix build script
* Revert "chore(helm): Updating cron version"
This reverts commit 3ca190ea2f.
* feat(crons): fix build script
* feat(crons): fix Dockerfile
* feat(api): fixed metadata change-case
* change(ui) - remove capitalize for the meta value
* change(ui) - autocomplete improvements with custom textfield
* fix(tracker):3.5.13+:reuse metadata on internal-caused restarts
* fix(tracker-assist):3.5.13:send active:true on start; scroll behavior fix
* change(ui) - filters autocomplete blur on pressing Enter key
* fix(tracker): fix node v to lower
* fix(tracker): fix deps
* fix(tracker): fix deps
* fix(ui) - dashboard modal width
* change(ui) - filter dropdown overflow
* chore(helm): clickhouse reclaim policy to retain
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(tracker): fix engine max v
* fix(ui): load metadata in assist tab for sorting
* fix(ui): rm unneeded api call
* fix(tracker): build script to cjs
* change(ui) - removed sample data
* chore(tracker): remove upper node version limit
* Updating Beacon size
Beacon size should be <= QUEUE_MESSAGE_SIZE_LIMIT
* feat(crons): run 24/7
feat(alerts): support env-file override
* feat(api): changed EE env handler
* fix(ui): fix sessions search modal
* change(ui) - margin for error message
* change(ui) - disable assist sort when there are no meta options to choose
* chore(helm): Adding utilities service namespace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - dashboard date range selection reload, metric not found message
* change(ui) - disable clearsearch in assist when there are no filters
* feat(api): fixed EE env handler
* chore(helm): Adding migration namespaces
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - report logo path
* chore(helm): Removing unnecessary SA
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(api): changed EE env handler
* feat(crons): changed crons
* feat(api): accept wrong metric_id
* feat(crons): changed env handler
feat(api): changed env handler
feat(alerts): changed env handler
* feat(utilities): support old version of nodejs
* feat(crons): changed env handler
feat(api): changed env handler
feat(alerts): changed env handler
* fix(tracker): fix srcset tracking
* chore(build): Adding frontend
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(assist): changed general helper
* feat(assist): changed general helper
* fix(ui): fix widget pagination (#570)
* feat(crons): changed entrypoint
* feat(player): dev-log on skipping message
* fix(tracker): removeNode mutation priority over attributes
* fix(tracker): capture relative img timing;use startsWith instead of substr; codestyle fix
* chore(build): fixing api build script
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* chore(ci): faster deployment
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - assist list show active status
* chore(actions): option to build all/specific services in GH
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - slowest domain metric data as per the api changes
* ci(helm): updated variable name
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(backend): cherrypick changes to ee
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): disabled pprof in http service
* fix(ui) - TimeToRender avg value as per the API change
* fix(ui) - ResponseTimeDistribution avg value as per the API change
* fix(ui) - MemoryConsumption avg value as per the API change
* fix(ui) - ResponseTime avg value as per the API change
* fix(ui) - DomBuildTime avg value as per the API change
* fix(ui) - FrameRate avg value as per the API change
* chore(helm): proper default tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): removed sensitive information from http logs
* ci(backend): adding default parameter value for workflow dispatch
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(backend): deleted empty file
* fix(actions): creating image source file prior
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(helm): variable substitution
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* change(ui) - project list item installation button text change
* fix(ui) - project create validation
* fix(backend): removed unsafe string logs in http service
* chore(kafka): Adding new topic
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(efs-cron): variable name
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui) - developer tools - hint links
* fix(ui) - session filters - country and platform dropdown values
* chore(helm): updating version
* chore(kafka): Update kafka default message size while provisioning
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(tracker): fix dependency security
* change(ui) - webhook delete confirmation
* change(ui) - assist url to handle when empty
* feat(api): autocomplete replace console with errors
feat(DB): clean extra files
* chore(helm): Adding cron jobs
* change(ui) - set changed flag to false after the metric delete to avoid prompt
* chore(helm): enabling cron only for ee
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): autocomplete remove console
* change(ui) - removed Console filter type
* fix(ui) - timeline position
* fix(helm): RFC naming
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): let user change project in dashboards and select default dashboard
* chore(helm): update registry url
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(DB): return pages_count to DB
* fix(ui) - account settings opt out checkbox
* fix(ui): fix modal width
* fix(ui) - explore circle bg
* fix(ui) - user name overlap
* fix(ui) - empty dashboards create button
* fix(ui): fix timeline position cursor for safari
* fix(ui) - custom metrics errors modal url reset on close
* fix(ui) - onboarding check for siteId
* change(ui) - tracker version
* Update local_deploy.sh
* fix(ui) - drilldown timestamp
* fix(tracker): fix deps for assist
* fix(tracker): update peerjs library
* fix(tracker): update assist v
* fix(tracker): fix type error
* fix(backend): no missing resource relying on resource zero-timing
* Update tracker to v3.5.15
* chore(helm): Adding CSP override variable.
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(backend): added pem file support for kafka ssl setup
* feat(backend): added useBatch setup for kafka producer
* ci(backend): set verbose logging
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(backend): using setKey instead of direct writes
* ci(backend): fix error code
* ci(deploy): Updating the image registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* feat(api): changed get user id alias
* ci(frontend): removing deprecated steps
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(fix): variable replace
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* ci(helm): creating image_override
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
* fix(ui): fix timezone settings
* Added failover mechanism for storage service (#576)
* fix(ui): fix typescript config to remove array iterator error
* fix(ui): refactor timezone settings store/comp
* feat(snippet): opensource snippet
* feat(assist): support multiple IPs
* fix(ui): fix type errors in select /timezones fix
* feat(backend): set size of first part of sessions at 500kb
* change(ui) - removed logs
* fix(ui) - custom metric errors reset url on modal close
* feat(DB): no funnel migration
* fix(ui): fix screensize bug
* feat(DB): migrate super old funnels support
* changed db-migration workflow
Co-authored-by: Shekar Siri <sshekarsiri@gmail.com>
Co-authored-by: sylenien <nikita@openreplay.com>
Co-authored-by: Alex Kaminskii <alex@openreplay.com>
Co-authored-by: Alexander <zavorotynskiy@pm.me>
Co-authored-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: Mehdi Osman <estradino@users.noreply.github.com>
Co-authored-by: Alexander <alexander@openreplay.com>
Co-authored-by: Rajesh Rajendran <rjshrjndrn@users.noreply.github.com>
Co-authored-by: Delirium <sylenien@gmail.com>

import json

import schemas
from chalicelib.core import custom_metrics, metrics
from chalicelib.utils import helper
from chalicelib.utils import pg_client
from chalicelib.utils.TimeUTC import TimeUTC

# category name should be lower cased
CATEGORY_DESCRIPTION = {
    'web vitals': 'A set of metrics that assess app performance on criteria such as load time, load performance, and stability.',
    'custom': 'Previously created custom metrics by me and my team.',
    'errors': 'Keep a closer eye on errors and track their type, origin and domain.',
    'performance': 'Optimize your app’s performance by tracking slow domains, page response times, memory consumption, CPU usage and more.',
    'resources': 'Find out which resources are missing and those that may be slowing your web app.'
}


def get_templates(project_id, user_id):
    with pg_client.PostgresClient() as cur:
        pg_query = cur.mogrify(f"""SELECT category, jsonb_agg(metrics ORDER BY name) AS widgets
                                   FROM (SELECT *, default_config AS config
                                         FROM metrics LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(metric_series.* ORDER BY index), '[]'::jsonb) AS series
                                                                         FROM metric_series
                                                                         WHERE metric_series.metric_id = metrics.metric_id
                                                                           AND metric_series.deleted_at ISNULL
                                                          ) AS metric_series ON (TRUE)
                                         WHERE deleted_at IS NULL
                                           AND (project_id ISNULL OR (project_id = %(project_id)s AND (is_public OR user_id = %(userId)s)))
                                        ) AS metrics
                                   GROUP BY category
                                   ORDER BY ARRAY_POSITION(ARRAY ['custom','overview','errors','performance','resources'], category);""",
                               {"project_id": project_id, "userId": user_id})
        cur.execute(pg_query)
        rows = cur.fetchall()
        for r in rows:
            r["description"] = CATEGORY_DESCRIPTION.get(r["category"].lower(), "")
            for w in r["widgets"]:
                w["created_at"] = TimeUTC.datetime_to_timestamp(w["created_at"])
                w["edited_at"] = TimeUTC.datetime_to_timestamp(w["edited_at"])
                for s in w["series"]:
                    s["filter"] = helper.old_search_payload_to_flat(s["filter"])

        return helper.list_to_camel_case(rows)


def create_dashboard(project_id, user_id, data: schemas.CreateDashboardSchema):
    with pg_client.PostgresClient() as cur:
        pg_query = f"""INSERT INTO dashboards(project_id, user_id, name, is_public, is_pinned, description)
                       VALUES (%(projectId)s, %(userId)s, %(name)s, %(is_public)s, %(is_pinned)s, %(description)s)
                       RETURNING *"""
        params = {"userId": user_id, "projectId": project_id, **data.dict()}
        if data.metrics is not None and len(data.metrics) > 0:
            pg_query = f"""WITH dash AS ({pg_query})
                           INSERT INTO dashboard_widgets(dashboard_id, metric_id, user_id, config)
                           VALUES {",".join([f"((SELECT dashboard_id FROM dash),%(metric_id_{i})s, %(userId)s, (SELECT default_config FROM metrics WHERE metric_id=%(metric_id_{i})s)||%(config_{i})s)" for i in range(len(data.metrics))])}
                           RETURNING (SELECT dashboard_id FROM dash)"""
            for i, m in enumerate(data.metrics):
                params[f"metric_id_{i}"] = m
                # params[f"config_{i}"] = schemas.AddWidgetToDashboardPayloadSchema.schema() \
                #     .get("properties", {}).get("config", {}).get("default", {})
                # params[f"config_{i}"]["position"] = i
                # params[f"config_{i}"] = json.dumps(params[f"config_{i}"])
                params[f"config_{i}"] = json.dumps({"position": i})
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        if row is None:
            return {"errors": ["something went wrong while creating the dashboard"]}
        return {"data": get_dashboard(project_id=project_id, user_id=user_id, dashboard_id=row["dashboard_id"])}
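

# --- Illustrative sketch (not part of the original module) ---------------------------
# A minimal, hypothetical call to create_dashboard. The CreateDashboardSchema field names
# below are assumptions inferred from the INSERT placeholders above (name, description,
# is_public, is_pinned) and from data.metrics; the real schema may differ.
def _example_create_dashboard():
    payload = schemas.CreateDashboardSchema(name="Web Vitals overview",
                                            description="Load time and stability",
                                            is_public=True,
                                            is_pinned=False,
                                            metrics=[12, 34])  # existing metric_ids to attach as widgets
    # on success the new dashboard is re-read via get_dashboard and returned under "data",
    # otherwise an "errors" list is returned
    return create_dashboard(project_id=1, user_id=7, data=payload)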


def get_dashboards(project_id, user_id):
    with pg_client.PostgresClient() as cur:
        pg_query = f"""SELECT *
                       FROM dashboards
                       WHERE deleted_at ISNULL
                         AND project_id = %(projectId)s
                         AND (user_id = %(userId)s OR is_public);"""
        params = {"userId": user_id, "projectId": project_id}
        cur.execute(cur.mogrify(pg_query, params))
        rows = cur.fetchall()
        return helper.list_to_camel_case(rows)


def get_dashboard(project_id, user_id, dashboard_id):
    with pg_client.PostgresClient() as cur:
        pg_query = """SELECT dashboards.*, all_metric_widgets.widgets AS widgets
                      FROM dashboards
                           LEFT JOIN LATERAL (SELECT COALESCE(JSONB_AGG(raw_metrics), '[]') AS widgets
                                              FROM (SELECT dashboard_widgets.*, metrics.*, metric_series.series
                                                    FROM metrics
                                                         INNER JOIN dashboard_widgets USING (metric_id)
                                                         LEFT JOIN LATERAL (SELECT COALESCE(JSONB_AGG(metric_series.* ORDER BY index), '[]') AS series
                                                                            FROM metric_series
                                                                            WHERE metric_series.metric_id = metrics.metric_id
                                                                              AND metric_series.deleted_at ISNULL
                                                             ) AS metric_series ON (TRUE)
                                                    WHERE dashboard_widgets.dashboard_id = dashboards.dashboard_id
                                                      AND metrics.deleted_at ISNULL
                                                      AND (metrics.project_id = %(projectId)s OR metrics.project_id ISNULL)) AS raw_metrics
                               ) AS all_metric_widgets ON (TRUE)
                      WHERE dashboards.deleted_at ISNULL
                        AND dashboards.project_id = %(projectId)s
                        AND dashboard_id = %(dashboard_id)s
                        AND (dashboards.user_id = %(userId)s OR is_public);"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id}
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        if row is not None:
            row["created_at"] = TimeUTC.datetime_to_timestamp(row["created_at"])
            for w in row["widgets"]:
                w["created_at"] = TimeUTC.datetime_to_timestamp(w["created_at"])
                w["edited_at"] = TimeUTC.datetime_to_timestamp(w["edited_at"])
                for s in w["series"]:
                    s["created_at"] = TimeUTC.datetime_to_timestamp(s["created_at"])
        return helper.dict_to_camel_case(row)


def delete_dashboard(project_id, user_id, dashboard_id):
    with pg_client.PostgresClient() as cur:
        pg_query = """UPDATE dashboards
                      SET deleted_at = timezone('utc'::text, now())
                      WHERE dashboards.project_id = %(projectId)s
                        AND dashboard_id = %(dashboard_id)s
                        AND (dashboards.user_id = %(userId)s OR is_public);"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id}
        cur.execute(cur.mogrify(pg_query, params))
    return {"data": {"success": True}}


def update_dashboard(project_id, user_id, dashboard_id, data: schemas.EditDashboardSchema):
    with pg_client.PostgresClient() as cur:
        pg_query = """SELECT COALESCE(COUNT(*), 0) AS count
                      FROM dashboard_widgets
                      WHERE dashboard_id = %(dashboard_id)s;"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id, **data.dict()}
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        offset = row["count"]
        pg_query = f"""UPDATE dashboards
                       SET name = %(name)s,
                           description = %(description)s
                           {", is_public = %(is_public)s" if data.is_public is not None else ""}
                           {", is_pinned = %(is_pinned)s" if data.is_pinned is not None else ""}
                       WHERE dashboards.project_id = %(projectId)s
                         AND dashboard_id = %(dashboard_id)s
                         AND (dashboards.user_id = %(userId)s OR is_public)"""
        if data.metrics is not None and len(data.metrics) > 0:
            pg_query = f"""WITH dash AS ({pg_query})
                           INSERT INTO dashboard_widgets(dashboard_id, metric_id, user_id, config)
                           VALUES {",".join([f"(%(dashboard_id)s, %(metric_id_{i})s, %(userId)s, (SELECT default_config FROM metrics WHERE metric_id=%(metric_id_{i})s)||%(config_{i})s)" for i in range(len(data.metrics))])};"""
            for i, m in enumerate(data.metrics):
                params[f"metric_id_{i}"] = m
                # params[f"config_{i}"] = schemas.AddWidgetToDashboardPayloadSchema.schema() \
                #     .get("properties", {}).get("config", {}).get("default", {})
                # params[f"config_{i}"]["position"] = i
                # params[f"config_{i}"] = json.dumps(params[f"config_{i}"])
                params[f"config_{i}"] = json.dumps({"position": i + offset})

        cur.execute(cur.mogrify(pg_query, params))

    return get_dashboard(project_id=project_id, user_id=user_id, dashboard_id=dashboard_id)
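

# --- Illustrative note (not part of the original module) -----------------------------
# update_dashboard appends new widgets after the existing ones: the position stored in each
# widget's config continues from the current widget count (offset). With 3 existing widgets
# and two new metric ids, the generated configs look like this:
def _example_widget_positions():
    offset = 3
    new_metrics = [101, 102]
    return [json.dumps({"position": i + offset}) for i, _ in enumerate(new_metrics)]
    # -> ['{"position": 3}', '{"position": 4}']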


def get_widget(project_id, user_id, dashboard_id, widget_id):
    with pg_client.PostgresClient() as cur:
        pg_query = """SELECT metrics.*, metric_series.series
                      FROM dashboard_widgets
                           INNER JOIN dashboards USING (dashboard_id)
                           INNER JOIN metrics USING (metric_id)
                           LEFT JOIN LATERAL (SELECT COALESCE(jsonb_agg(metric_series.* ORDER BY index), '[]'::jsonb) AS series
                                              FROM metric_series
                                              WHERE metric_series.metric_id = metrics.metric_id
                                                AND metric_series.deleted_at ISNULL
                               ) AS metric_series ON (TRUE)
                      WHERE dashboard_id = %(dashboard_id)s
                        AND widget_id = %(widget_id)s
                        AND (dashboards.is_public OR dashboards.user_id = %(userId)s)
                        AND dashboards.deleted_at IS NULL
                        AND metrics.deleted_at ISNULL
                        AND (metrics.project_id = %(projectId)s OR metrics.project_id ISNULL)
                        AND (metrics.is_public OR metrics.user_id = %(userId)s);"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id, "widget_id": widget_id}
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        return helper.dict_to_camel_case(row)


def add_widget(project_id, user_id, dashboard_id, data: schemas.AddWidgetToDashboardPayloadSchema):
    with pg_client.PostgresClient() as cur:
        pg_query = """INSERT INTO dashboard_widgets(dashboard_id, metric_id, user_id, config)
                      SELECT %(dashboard_id)s AS dashboard_id, %(metric_id)s AS metric_id,
                             %(userId)s AS user_id, (SELECT default_config FROM metrics WHERE metric_id=%(metric_id)s)||%(config)s::jsonb AS config
                      WHERE EXISTS(SELECT 1 FROM dashboards
                                   WHERE dashboards.deleted_at ISNULL AND dashboards.project_id = %(projectId)s
                                     AND dashboard_id = %(dashboard_id)s
                                     AND (dashboards.user_id = %(userId)s OR is_public))
                      RETURNING *;"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id, **data.dict()}
        params["config"] = json.dumps(data.config)
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        return helper.dict_to_camel_case(row)


def update_widget(project_id, user_id, dashboard_id, widget_id, data: schemas.UpdateWidgetPayloadSchema):
    with pg_client.PostgresClient() as cur:
        pg_query = """UPDATE dashboard_widgets
                      SET config = %(config)s
                      WHERE dashboard_id = %(dashboard_id)s AND widget_id = %(widget_id)s
                      RETURNING *;"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id,
                  "widget_id": widget_id, **data.dict()}
        params["config"] = json.dumps(data.config)
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        return helper.dict_to_camel_case(row)


def remove_widget(project_id, user_id, dashboard_id, widget_id):
    with pg_client.PostgresClient() as cur:
        pg_query = """DELETE FROM dashboard_widgets
                      WHERE dashboard_id = %(dashboard_id)s AND widget_id = %(widget_id)s;"""
        params = {"userId": user_id, "projectId": project_id, "dashboard_id": dashboard_id, "widget_id": widget_id}
        cur.execute(cur.mogrify(pg_query, params))
    return {"data": {"success": True}}


def pin_dashboard(project_id, user_id, dashboard_id):
    with pg_client.PostgresClient() as cur:
        pg_query = """UPDATE dashboards
                      SET is_pinned = FALSE
                      WHERE project_id = %(project_id)s;
                      UPDATE dashboards
                      SET is_pinned = True
                      WHERE dashboard_id = %(dashboard_id)s AND project_id = %(project_id)s AND deleted_at ISNULL
                      RETURNING *;"""
        params = {"userId": user_id, "project_id": project_id, "dashboard_id": dashboard_id}
        cur.execute(cur.mogrify(pg_query, params))
        row = cur.fetchone()
        return helper.dict_to_camel_case(row)


def create_metric_add_widget(project_id, user_id, dashboard_id, data: schemas.CreateCustomMetricsSchema):
    metric_id = custom_metrics.create(project_id=project_id, user_id=user_id, data=data, dashboard=True)
    return add_widget(project_id=project_id, user_id=user_id, dashboard_id=dashboard_id,
                      data=schemas.AddWidgetToDashboardPayloadSchema(metricId=metric_id))


PREDEFINED = {schemas.TemplatePredefinedKeys.count_sessions: metrics.get_processed_sessions,
              schemas.TemplatePredefinedKeys.avg_image_load_time: metrics.get_application_activity_avg_image_load_time,
              schemas.TemplatePredefinedKeys.avg_page_load_time: metrics.get_application_activity_avg_page_load_time,
              schemas.TemplatePredefinedKeys.avg_request_load_time: metrics.get_application_activity_avg_request_load_time,
              schemas.TemplatePredefinedKeys.avg_dom_content_load_start: metrics.get_page_metrics_avg_dom_content_load_start,
              schemas.TemplatePredefinedKeys.avg_first_contentful_pixel: metrics.get_page_metrics_avg_first_contentful_pixel,
              schemas.TemplatePredefinedKeys.avg_visited_pages: metrics.get_user_activity_avg_visited_pages,
              schemas.TemplatePredefinedKeys.avg_session_duration: metrics.get_user_activity_avg_session_duration,
              schemas.TemplatePredefinedKeys.avg_pages_dom_buildtime: metrics.get_pages_dom_build_time,
              schemas.TemplatePredefinedKeys.avg_pages_response_time: metrics.get_pages_response_time,
              schemas.TemplatePredefinedKeys.avg_response_time: metrics.get_top_metrics_avg_response_time,
              schemas.TemplatePredefinedKeys.avg_first_paint: metrics.get_top_metrics_avg_first_paint,
              schemas.TemplatePredefinedKeys.avg_dom_content_loaded: metrics.get_top_metrics_avg_dom_content_loaded,
              schemas.TemplatePredefinedKeys.avg_till_first_bit: metrics.get_top_metrics_avg_till_first_bit,
              schemas.TemplatePredefinedKeys.avg_time_to_interactive: metrics.get_top_metrics_avg_time_to_interactive,
              schemas.TemplatePredefinedKeys.count_requests: metrics.get_top_metrics_count_requests,
              schemas.TemplatePredefinedKeys.avg_time_to_render: metrics.get_time_to_render,
              schemas.TemplatePredefinedKeys.avg_used_js_heap_size: metrics.get_memory_consumption,
              schemas.TemplatePredefinedKeys.avg_cpu: metrics.get_avg_cpu,
              schemas.TemplatePredefinedKeys.avg_fps: metrics.get_avg_fps,
              schemas.TemplatePredefinedKeys.impacted_sessions_by_js_errors: metrics.get_impacted_sessions_by_js_errors,
              schemas.TemplatePredefinedKeys.domains_errors_4xx: metrics.get_domains_errors_4xx,
              schemas.TemplatePredefinedKeys.domains_errors_5xx: metrics.get_domains_errors_5xx,
              schemas.TemplatePredefinedKeys.errors_per_domains: metrics.get_errors_per_domains,
              schemas.TemplatePredefinedKeys.calls_errors: metrics.get_calls_errors,
              schemas.TemplatePredefinedKeys.errors_by_type: metrics.get_errors_per_type,
              schemas.TemplatePredefinedKeys.errors_by_origin: metrics.get_resources_by_party,
              schemas.TemplatePredefinedKeys.speed_index_by_location: metrics.get_speed_index_location,
              schemas.TemplatePredefinedKeys.slowest_domains: metrics.get_slowest_domains,
              schemas.TemplatePredefinedKeys.sessions_per_browser: metrics.get_sessions_per_browser,
              schemas.TemplatePredefinedKeys.time_to_render: metrics.get_time_to_render,
              schemas.TemplatePredefinedKeys.impacted_sessions_by_slow_pages: metrics.get_impacted_sessions_by_slow_pages,
              schemas.TemplatePredefinedKeys.memory_consumption: metrics.get_memory_consumption,
              schemas.TemplatePredefinedKeys.cpu_load: metrics.get_avg_cpu,
              schemas.TemplatePredefinedKeys.frame_rate: metrics.get_avg_fps,
              schemas.TemplatePredefinedKeys.crashes: metrics.get_crashes,
              schemas.TemplatePredefinedKeys.resources_vs_visually_complete: metrics.get_resources_vs_visually_complete,
              schemas.TemplatePredefinedKeys.pages_dom_buildtime: metrics.get_pages_dom_build_time,
              schemas.TemplatePredefinedKeys.pages_response_time: metrics.get_pages_response_time,
              schemas.TemplatePredefinedKeys.pages_response_time_distribution: metrics.get_pages_response_time_distribution,
              schemas.TemplatePredefinedKeys.missing_resources: metrics.get_missing_resources_trend,
              schemas.TemplatePredefinedKeys.slowest_resources: metrics.get_slowest_resources,
              schemas.TemplatePredefinedKeys.resources_fetch_time: metrics.get_resources_loading_time,
              schemas.TemplatePredefinedKeys.resource_type_vs_response_end: metrics.resource_type_vs_response_end,
              schemas.TemplatePredefinedKeys.resources_count_by_type: metrics.get_resources_count_by_type,
              }


def get_predefined_metric(key: schemas.TemplatePredefinedKeys, project_id: int, data: dict):
    # the fallback must accept keyword arguments, otherwise an unknown key would raise a
    # TypeError here instead of returning None
    return PREDEFINED.get(key, lambda *args, **kwargs: None)(project_id=project_id, **data)
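

# --- Illustrative sketch (not part of the original module) ---------------------------
# get_predefined_metric is a dictionary dispatch with a no-op fallback: known template keys
# resolve to a metrics.* function, unknown keys return None instead of raising. The key used
# here is one entry of the PREDEFINED mapping above; in the real flow, data is
# CustomMetricChartPayloadSchema.dict() (the chart time range), see make_chart_metrics below.
def _example_predefined_dispatch(project_id: int):
    return get_predefined_metric(key=schemas.TemplatePredefinedKeys.count_sessions,
                                 project_id=project_id,
                                 data={})  # placeholder; real calls pass the chart payload dict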


def make_chart_metrics(project_id, user_id, metric_id, data: schemas.CustomMetricChartPayloadSchema):
    raw_metric = custom_metrics.get_with_template(metric_id=metric_id, project_id=project_id, user_id=user_id,
                                                  include_dashboard=False)
    if raw_metric is None:
        return None
    metric: schemas.CustomMetricAndTemplate = schemas.CustomMetricAndTemplate.parse_obj(raw_metric)
    if metric.is_template:
        return get_predefined_metric(key=metric.predefined_key, project_id=project_id, data=data.dict())
    else:
        return custom_metrics.make_chart(project_id=project_id, user_id=user_id, metric_id=metric_id, data=data,
                                         metric=raw_metric)


def make_chart_widget(dashboard_id, project_id, user_id, widget_id, data: schemas.CustomMetricChartPayloadSchema):
    raw_metric = get_widget(widget_id=widget_id, project_id=project_id, user_id=user_id, dashboard_id=dashboard_id)
    if raw_metric is None:
        return None
    metric: schemas.CustomMetricAndTemplate = schemas.CustomMetricAndTemplate.parse_obj(raw_metric)
    if metric.is_template:
        return get_predefined_metric(key=metric.predefined_key, project_id=project_id, data=data.dict())
    else:
        return custom_metrics.make_chart(project_id=project_id, user_id=user_id, metric_id=raw_metric["metricId"],
                                         data=data, metric=raw_metric)
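

# --- Illustrative sketch (not part of the original module) ---------------------------
# Rough call path for rendering a dashboard widget's chart. The payload construction is
# hypothetical: CustomMetricChartPayloadSchema is assumed to default its time-range fields,
# which may not hold for the real schema.
def _example_make_chart_widget(project_id: int, user_id: int, dashboard_id: int, widget_id: int):
    payload = schemas.CustomMetricChartPayloadSchema()
    chart = make_chart_widget(dashboard_id=dashboard_id, project_id=project_id,
                              user_id=user_id, widget_id=widget_id, data=payload)
    if chart is None:
        # widget not found, not visible to this user, or its metric was deleted
        return {"errors": ["widget not found"]}
    return {"data": chart}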