Compare commits


156 commits

Author SHA1 Message Date
nick-delirium
90510aa33b ui: fix double metric selection in list 2025-06-06 16:19:54 +02:00
GitHub Action
96a70f5d41 Increment frontend chart version to v1.22.42 2025-06-04 11:41:56 +02:00
rjshrjndrn
d4a13edcf0 fix(actions): frontend image with proper tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-06-04 11:33:19 +02:00
GitHub Action
51fad91a22 Increment frontend chart version to v1.22.41 2025-06-04 10:48:50 +02:00
nick-delirium
36abcda1e1 ui: fix audioplayer start point 2025-06-04 10:39:08 +02:00
Mehdi Osman
dd5f464f73
Increment frontend chart version to v1.22.40 (#3479)
Co-authored-by: GitHub Action <action@github.com>
2025-06-03 16:22:12 +02:00
Delirium
f9ada41272
ui: recreate period on db visit (#3478) 2025-06-03 16:05:52 +02:00
rjshrjndrn
9e24a3583e feat(nginx): add integrations endpoint with CORS support
Add new /integrations/ location block that proxies requests to
integrations-openreplay:8080 service. Includes proper CORS headers
for cross-origin requests and WebSocket upgrade support.

- Rewrite /integrations/ path to root
- Configure proxy headers for forwarding
- Set connection timeouts for stability
- Add CORS headers for API access

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-06-02 10:55:50 +02:00
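A minimal sketch of the location block this commit message describes, using standard nginx directives (the exact header set and timeout values are assumptions, not shown in the commit):

    location /integrations/ {
        rewrite ^/integrations/(.*)$ /$1 break;           # rewrite /integrations/ path to root
        proxy_pass http://integrations-openreplay:8080;   # proxy to the integrations service
        proxy_set_header Host $host;                      # proxy headers for forwarding
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_http_version 1.1;                           # WebSocket upgrade support
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_connect_timeout 60s;                        # connection timeouts for stability
        proxy_read_timeout 60s;
        add_header Access-Control-Allow-Origin "*" always;                   # CORS headers for API access
        add_header Access-Control-Allow-Methods "GET, POST, OPTIONS" always;
    }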
Taha Yassine Kraiem
0a3129d3cd fix(chalice): fixed JIRA integration 2025-05-30 15:25:41 +02:00
Mehdi Osman
99d61db9d9
Increment frontend chart version to v1.22.39 (#3460)
Co-authored-by: GitHub Action <action@github.com>
2025-05-30 15:07:29 +02:00
Delirium
133958622e
ui: fix alert create button (#3459) 2025-05-30 14:56:21 +02:00
GitHub Action
fb021f606f Increment frontend chart version to v1.22.38 2025-05-29 12:21:04 +02:00
rjshrjndrn
a2905fa8ed fix: move cd - command after git operations in patch workflow
Move the directory restoration command after the git operations to
ensure all git commands execute in the correct working directory
before returning to the previous directory.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-29 12:16:28 +02:00
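The pattern being fixed, as a hypothetical before/after sketch ($repo_dir and the git commands are placeholders):

    # Before: `cd -` restores the previous directory too early,
    # so the git commands run in the wrong working directory.
    cd "$repo_dir"
    cd -
    git add -A && git commit -m "patch" && git push

    # After: run the git operations first, then restore the directory.
    cd "$repo_dir"
    git add -A && git commit -m "patch" && git push
    cd -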
rjshrjndrn
beec2283fd refactor(ci): restructure patch-build workflow script
- Extract inline bash script into structured functions
- Add proper error handling with set -euo pipefail
- Improve variable scoping with readonly and local declarations
- Add descriptive function names and comments
- Fix shell quoting and parameter expansion
- Consolidate build logic into reusable functions
- Add proper cleanup of temporary files
- Improve readability and maintainability of the CI script

The refactored script maintains the same functionality while being
more robust and easier to understand.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-29 12:16:28 +02:00
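A sketch of the structure this refactor describes; the function and variable names here are illustrative, not the workflow's actual ones:

    #!/usr/bin/env bash
    set -euo pipefail                      # proper error handling

    readonly BUILD_DIR="/tmp/patch-build"  # readonly for script-wide constants

    cleanup() {
        rm -rf "${BUILD_DIR}"              # cleanup of temporary files on exit
    }
    trap cleanup EXIT

    build_image() {
        local service="$1"                 # local scoping inside functions
        docker build -t "registry.example.com/${service}:patch" "services/${service}"
    }

    main() {
        mkdir -p "${BUILD_DIR}"
        build_image "frontend"
    }

    main "$@"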
GitHub Action
6c8b55019e Increment frontend chart version 2025-05-29 10:29:46 +02:00
rjshrjndrn
e3e3e11227 fix(action): proper registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-29 10:18:55 +02:00
Shekar Siri
c6f7de04cc Revert "fix(ui): new card data state is not updating"
This reverts commit 2921c17cbf.
2025-05-28 22:16:00 +02:00
Shekar Siri
2921c17cbf fix(ui): new card data state is not updating 2025-05-28 19:49:01 +02:00
Mehdi Osman
7eb3f5c4c8
Increment frontend chart version (#3436)
Co-authored-by: GitHub Action <action@github.com>
2025-05-26 16:10:35 +02:00
Rajesh Rajendran
5a9a8e588a
chore(actions): rebase only if not main (#3435)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-26 16:04:50 +02:00
Rajesh Rajendran
4b14258266
fix(action): clone repo (#3433)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-26 15:50:13 +02:00
Rajesh Rajendran
744d2d4311
actions fix or 2070 (#3432)
* chore(build): Better error handling

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(build): remove fetch depth, as it might cause issue in rebase

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(build): proper platform

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-26 15:45:48 +02:00
Taha Yassine Kraiem
64242a5dc0 refactor(DB): changed supported platforms in CH 2025-05-26 11:51:49 +02:00
Rajesh Rajendran
cae3002697
feat(ci): Support building from branch for old patch (#3419)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-20 15:19:04 +02:00
GitHub Action
3d3c62196b Increment frontend chart version 2025-05-20 11:44:16 +02:00
nick-delirium
e810958a5d ui: fix ant imports 2025-05-20 11:26:20 +02:00
nick-delirium
39fa9787d1 ui: prevent network row modal from changing replayer time 2025-05-20 11:21:50 +02:00
nick-delirium
c9c1ad4dde ui: comments etc 2025-05-20 11:21:50 +02:00
nick-delirium
d9868928be ui: improve network panel row mapping 2025-05-20 11:21:50 +02:00
GitHub Action
a460d8c9a2 Increment frontend chart version 2025-05-15 15:18:19 +02:00
nick-delirium
930417aab4 ui: fix session search on url change 2025-05-15 15:12:30 +02:00
GitHub Action
07bc184f4d Increment chalice chart version 2025-05-14 18:59:43 +02:00
Rajesh Rajendran
71b7cca569
Patch/api v1.22.0 (#3401)
* fix(chalice): fixed duplicate autocomplete values

* ci(actions): possible fix for pull --rebase

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: Taha Yassine Kraiem <tahayk2@gmail.com>
2025-05-14 18:42:25 +02:00
Mehdi Osman
355d27eaa0
Increment frontend chart version (#3397)
Co-authored-by: GitHub Action <action@github.com>
2025-05-13 13:38:15 +02:00
Mehdi Osman
66b485cccf
Increment db chart version (#3396)
Co-authored-by: GitHub Action <action@github.com>
2025-05-13 10:34:28 +02:00
Alexander
de33a42151
feat(db): custom event's ts (#3395) 2025-05-12 17:52:24 +02:00
Rajesh Rajendran
f12bdebf82
ci(actions): fix push denied (#3392) (#3393) (#3394)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 17:19:41 +02:00
Rajesh Rajendran
bbfa20c693
ci(actions): fix push denied (#3392) (#3393)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 16:58:19 +02:00
Rajesh Rajendran
f264ba043d
ci(actions): fix push denied (#3392)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 16:55:23 +02:00
Rajesh Rajendran
a05dce8125
main (#3391)
* ci(actions): Update pr description

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* ci(actions): run only on pull request merge

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 16:50:20 +02:00
Mehdi Osman
3a1635d81f
Increment frontend chart version (#3389)
Co-authored-by: GitHub Action <action@github.com>
2025-05-12 16:12:43 +02:00
Delirium
ccb332c636
ui: change <slot> check (#3388) 2025-05-12 16:02:26 +02:00
Rajesh Rajendran
80ffa15959
ci(actions): Auto update tag for patch build (#3387)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 15:54:10 +02:00
Rajesh Rajendran
b2e961d621
ci(actions): Auto update tag for patch build (#3386)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 15:49:19 +02:00
Mehdi Osman
b4d0598f23
Increment frontend chart version (#3385)
Co-authored-by: GitHub Action <action@github.com>
2025-05-12 15:46:29 +02:00
Delirium
e77f083f10
ui: fixup toggler closing (#3384) 2025-05-12 15:40:30 +02:00
Delirium
58da1d3f64
fix litjs support, fix autocomplete modal options reset, fix dashboard chart density (#3382)
* Litjs fixes2 (#3381)

* ui: fixes for litjs capture

* ui: introduce vmode for lwc light dom

* ui: fixup the mode toggle and remover

* ui: fix filter options reset, fix dashboard chart density
2025-05-12 15:27:44 +02:00
GitHub Action
447fc26a2a Increment frontend chart version 2025-05-12 10:46:33 +02:00
nick-delirium
9bdf6e4f92 ui: fix heatmaps crash 2025-05-12 10:37:48 +02:00
GitHub Action
01f403e12d Increment chalice chart version 2025-05-07 12:28:44 +02:00
Taha Yassine Kraiem
39eb943b86 fix(chalice): fixed get error's details 2025-05-07 12:15:33 +02:00
GitHub Action
366b0d38b0 Increment frontend chart version 2025-05-06 16:28:28 +02:00
nick-delirium
f4d5b3c06e ui: fix max meta length, add horizontal layout for player 2025-05-06 16:23:47 +02:00
Mehdi Osman
93ae18133e
Increment frontend chart version (#3366)
Co-authored-by: GitHub Action <action@github.com>
2025-05-06 13:16:57 +02:00
Andrey Babushkin
fbe5d78270
Revert update (#3365)
* Revert "Increment chalice chart version"

This reverts commit 5e0e5730ba.

* revert updates

* changed chalice version
2025-05-06 13:08:08 +02:00
Mehdi Osman
b803eed1d4
Increment frontend chart version (#3362)
Co-authored-by: GitHub Action <action@github.com>
2025-05-05 17:49:39 +02:00
Andrey Babushkin
9ed3cb1b7e
Add searched events (#3361)
* add filtered events to search

* removed consoles

* changed styles to tailwind

* changed styles to tailwind

* fixed errors
2025-05-05 17:40:10 +02:00
GitHub Action
5e0e5730ba Increment chalice chart version 2025-05-05 17:04:29 +02:00
Taha Yassine Kraiem
d78b33dcd2 refactor(DB): remove TTL for CH tables 2025-05-05 16:49:37 +02:00
Taha Yassine Kraiem
4b1ca200b4 fix(chalice): fixed empty error_id for table of errors 2025-05-05 16:49:37 +02:00
rjshrjndrn
08d930f9ff fix(docker-compose): proper volume path #3279
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-28 17:28:40 +02:00
Mehdi Osman
da37809bc8
Increment frontend chart version (#3345)
Co-authored-by: GitHub Action <action@github.com>
2025-04-28 11:38:04 +02:00
Andrey Babushkin
d922fc7ad5
Patch frontend inline css (#3344)
* add inlineCss enum

* updated changelog
2025-04-28 11:29:53 +02:00
GitHub Action
796360fdd2 Increment frontend chart version 2025-04-28 11:01:55 +02:00
nick-delirium
13dbb60d8b ui: fix velement applychanges 2025-04-28 10:40:11 +02:00
Андрей Бабушкин
9e20a49128 add slot tag to custom elements 2025-04-28 10:34:43 +02:00
nick-delirium
91f8cc1399 ui: move debouncecall 2025-04-28 10:34:43 +02:00
Andrey Babushkin
f8ba3f6d89 Css batching (#3326)
* tracker: initial css inlining functionality

* tracker: add tests, adjust sheet id, stagger rule sending

* ui: reroute custom html component fragments

* removed sorting

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2025-04-28 10:34:43 +02:00
Delirium
85e30b3692 tracker css batching/inlining (#3334)
* tracker: initial css inlining functionality

* tracker: add tests, adjust sheet id, stagger rule sending

* removed sorting

* upgrade css inliner

* ui: better logging for counter

* tracker: force-fetch mode for cssInliner

* tracker: fix ts warns

* tracker: use debug opts

* tracker: 16.2.0 changelogs, inliner opts

* tracker: remove debug options

---------

Co-authored-by: Андрей Бабушкин <andreybabushkin2000@gmail.com>
2025-04-28 10:34:43 +02:00
nick-delirium
0360e3726e ui: fixup autoplay on inactive tabs 2025-04-28 10:34:43 +02:00
nick-delirium
77bbb5af36 tracker: update css inject 2025-04-28 10:34:43 +02:00
Andrey Babushkin
ab0d4cfb62 Css inliner tuning (#3337)
* tracker: don't send double sheets

* tracker: don't send double sheets

* tracker: slot checker

* add slot tag to custom elements

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2025-04-28 10:34:43 +02:00
Andrey Babushkin
3fd506a812 Css batching (#3326)
* tracker: initial css inlining functionality

* tracker: add tests, adjust sheet id, stagger rule sending

* ui: reroute custom html component fragments

* removed sorting

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2025-04-28 10:34:43 +02:00
Shekar Siri
e8432e2dec change(ui): force the table cards events order to use 'and' instead of the default 'then' 2025-04-24 10:09:19 +02:00
GitHub Action
5c76a8524c Increment frontend chart version 2025-04-23 18:41:46 +02:00
rjshrjndrn
3ba40a4811 feat(cli): Add support for image versions
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
rjshrjndrn
f9a3f24590 fix(docker-compose): clickhouse migration
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
rjshrjndrn
85d6d0abac fix(docker-compose): remove shell interpolation
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
Rajesh Rajendran
b3594136ce or 1940 upstream docker release with the existing installation (#3316)
* chore(docker): Adding dynamic env generator
* ci(make): Create deployment yamls
* ci(make): Generating docker envs
* change env name structure
* proper env names
* chore(docker): clickhouse
* chore(docker-compose): generate env file format
* chore(docker-compose): Adding docker-compose
* chore(docker-compose): format make
* chore(docker-compose): Update version
* chore(docker-compose): adding new secrets
* ci(make): default target
* ci(Makefile): Update common protocol
* chore(docker-compose): refactor folder structure
* ci(make): rename to docker-envs
* feat(docker): add clickhouse volume definition
Add clickhouse persistent volume to the docker-compose configuration
to ensure data is preserved between container restarts.
* refactor: move env files to docker-envs directory
Updates all environment file references in docker-compose.yaml to use a
consistent directory structure, placing them under the docker-envs/
directory for better organization.
* fix(docker): rename imagestorage to images
 The `imagestorage` service and related environment file
 have been renamed to `images` for clarity and consistency.
 This change reflects the service's purpose of handling
 images.
* feat(docker): introduce docker-compose template
 A new docker-compose template
 to generate docker-compose files from a list of services.
 The template uses helm syntax.
* fix: Properly set FILES variable in Makefile
 The FILES variable was not being set correctly in the
 Makefile due to subshell issues. This commit fixes the
 variable assignment and ensures that the variable is
 accessible in subsequent commands.
* feat: Refactor docker-compose template for local development
 This commit introduces a complete overhaul of the
 docker-compose template, switching from a helm-based
 template to a native docker-compose.yml file. This
 change simplifies local development and makes it easier
 to manage the OpenReplay stack.
 The new template includes services for:
 - PostgreSQL
 - ClickHouse
 - Redis
 - MinIO
 - Nginx
 - Caddy
 It also includes migration jobs for setting up the
 database and MinIO.
* fix(docker-compose): Add fallback empty environment
 Add an empty environment to the docker-compose template to prevent
 errors when the env_file is missing. This ensures that the
 container can start even if the environment file is not present.
* feat(docker): Add domainname and aliases to services
 This change adds the `domainname` and `aliases` attributes to each
 service in the docker-compose.yaml file. This is to ensure that
 the services can communicate with each other using their fully
 qualified domain names. Also adds shared volume and empty
 environment variables.
* update version
* chore(docker): don't pull parallel
* chore(docker-compose): proper pull
* chore(docker-compose): Update db service urls
* fix(docker-compose): clickhouse url
* chore(clickhouse): Adding clickhouse db migration
* chore(docker-compose): Adding clickhouse
* fix(tpl): variable injection
* chore(fix): compose tpl variable rendering
* chore(docker-compose): Allow override pg variable
* chore(helm): remove assist-server
* chore(helm): pg integrations
* chore(nginx): removed services
* chore(docker-compose): Multiple aliases
* chore(docker-compose): Adding more env vars
* feat(install): Dynamically generate passwords
 Implements dynamic password generation by
 identifying `change_me_*` entries in `common.env` and
 replacing them with random passwords. This enhances
 security and simplifies initial setup.
 The changes include:
 - Replacing hardcoded password replacements with a loop
   that iterates through all `change_me_*` entries.
 - Using `grep` to find all `change_me_*` tokens.
 - Generating a random password for each token.
 - Updating the `common.env` file with the generated
   passwords. (See the sketch after this commit entry.)
* chore(docker-compose): disable clickhouse password
* fix(docker-compose): clickhouse-migration
* compose: chalice env
* chore(docker-compose): overlay vars
* chore(docker): Adding ch port
* chore(docker-compose): disable clickhouse password
* fix(docker-compose): migration name
* feat(docker): skip specific values
* chore(docker-compose): define namespace
---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
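A minimal sketch of the password-generation loop described in the feat(install) item above (the change_me_* token convention is from the message; the grep/sed/openssl usage is an assumption):

    # Replace every change_me_* token in common.env with a random password.
    for token in $(grep -o 'change_me_[A-Za-z0-9_]*' common.env | sort -u); do
        password="$(openssl rand -hex 16)"            # random password per token
        sed -i "s/${token}/${password}/g" common.env  # GNU sed; macOS needs sed -i ''
    done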
GitHub Action
8f67edde8d Increment chalice chart version 2025-04-23 12:26:20 +02:00
Taha Yassine Kraiem
74ed29915b fix(chalice): enforce AND operator for table of requests and table of pages 2025-04-23 11:51:38 +02:00
GitHub Action
3ca71ec211 Increment chalice chart version 2025-04-22 19:23:11 +02:00
Taha Yassine Kraiem
0e469fd056 fix(chalice): fixes for table of requests 2025-04-22 19:03:35 +02:00
KRAIEM Taha Yassine
a8cb0e1643 fix(chalice): fixes for table of requests 2025-04-22 19:03:35 +02:00
GitHub Action
e171f0d8d5 Increment frontend chart version 2025-04-22 17:56:00 +02:00
nick-delirium
68ea291444 ui: fix timepicker and timezone interactions 2025-04-22 17:42:56 +02:00
GitHub Action
05cbb831c7 Increment frontend chart version 2025-04-22 10:32:00 +02:00
nick-delirium
5070ded1f4 ui: fix empty Sankey sessions fetch 2025-04-22 10:27:16 +02:00
GitHub Action
77610a4924 Increment frontend chart version 2025-04-16 17:45:25 +02:00
nick-delirium
7c34e4a0f6 ui: virtualizer for filter options list 2025-04-16 17:36:34 +02:00
GitHub Action
330e21183f Increment frontend chart version 2025-04-15 18:25:49 +02:00
Shekar Siri
30ce37896c feat(widget-sessions): improve session filtering logic
- Refactored session filtering logic to handle nested filters properly.
- Enhanced `fetchSessions` to ensure null checks and avoid errors.
- Updated `loadData` to handle `USER_PATH` and `HEATMAP` metric types.
- Improved UI consistency by adjusting spacing and formatting.
- Replaced redundant code with cleaner, more maintainable patterns.

This change improves the reliability and readability of the session
filtering and loading logic in the WidgetSessions component.
2025-04-15 18:15:03 +02:00
Andrey Babushkin
80a7817e7d
removed sorting by id (#3305) 2025-04-15 13:32:53 +02:00
Jorgen Evens
1b9c568cb1 fix(helm): fix broken volumeMounts indentation 2025-04-14 15:51:41 +02:00
GitHub Action
3759771ae9 Increment frontend chart version 2025-04-14 12:06:09 +02:00
Shekar Siri
f6ae5aba88 feat(SessionsBy): add specific filter for FETCH metric
Added a conditional check to handle the FETCH metric in the SessionsBy
component. When the metric is FETCH, a specific filter with key
FETCH_URL, operator is, and value derived from data.name is applied.
This ensures proper filtering behavior for FETCH-related metrics.
2025-04-14 12:01:51 +02:00
Mehdi Osman
5190dc512a
Increment frontend chart version (#3297)
Co-authored-by: GitHub Action <action@github.com>
2025-04-14 11:54:25 +02:00
Andrey Babushkin
3fcccb51e8
Patch assist (#3296)
* add global method support

* fix errors

* remove wrong updates

* remove wrong updates

* add onDrag as option

* fix wrong updates
2025-04-14 11:33:06 +02:00
GitHub Action
26077d5689 Increment frontend chart version 2025-04-11 14:56:11 +02:00
Shekar Siri
00c57348fd feat(search): enhance filter value handling
- Added `checkFilterValue` function to validate and update filter values
  in `SearchStoreLive`.
- Updated `FilterItem` to handle undefined `value` gracefully by providing
  a default empty array.

These changes improve robustness in filter value processing.
2025-04-11 14:36:25 +02:00
Shekar Siri
1f9bc5520a feat(search): add rounding to next minutes for date ranges
- Introduced `roundToNextMinutes` utility function to round timestamps
  to the next specified minute interval.
- Updated `Search` class to use the rounding function for non-custom
  date ranges.
- Modified `getRange` in `period.js` to align LAST_24_HOURS with
  15-minute intervals.
- Added `roundToNextMinutes` implementation in `utils/index.ts`.
2025-04-11 12:01:15 +02:00
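The rounding rule described above, sketched in shell arithmetic (the shipped utility lives in the frontend's utils/index.ts; millisecond units are an assumption):

    # Round a millisecond timestamp up to the next N-minute boundary.
    round_to_next_minutes() {
        local ts_ms=$1 minutes=$2
        local step=$(( minutes * 60 * 1000 ))
        echo $(( (ts_ms + step - 1) / step * step ))   # ceiling to the interval
    }

    round_to_next_minutes "$(date +%s)000" 15   # e.g. align LAST_24_HOURS to 15-minute steps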
Shekar Siri
aef94618f6 Revert "Increment frontend chart version"
This reverts commit 2a330318c7.
2025-04-11 11:03:01 +02:00
GitHub Action
2a330318c7 Increment frontend chart version 2025-04-11 11:01:53 +02:00
Shekar Siri
6777d5ce2a feat(dashboard): set initial drill down period
Change default drill down period from LAST_7_DAYS to LAST_24_HOURS
and preserve current period when drilling down on chart click
2025-04-11 10:49:17 +02:00
rjshrjndrn
8a6f8fe91f chore(action): cloning specific tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-10 15:45:50 +02:00
Mehdi Osman
7b078fed4c
Increment frontend chart version (#3278)
Co-authored-by: GitHub Action <action@github.com>
2025-04-07 15:24:32 +02:00
Andrey Babushkin
894d4c84b3
Patch assist canvas (#3277)
* resolved conflict

* removed comments
2025-04-07 15:13:36 +02:00
Alexander
46390a3ba9
feat(assist-server): added the github action (#3275) 2025-04-07 10:43:48 +02:00
rjshrjndrn
621667f5ce ci(action): Build and patch github tags
feat(workflow): update commit timestamp for patching

Add a step to set the commit timestamp of the HEAD commit to be 1
second newer than the oldest of the last 3 commits. This ensures
proper chronological order while preserving the commit content.

- Fetch deeper history to access commit history
- Get oldest timestamp from recent commits
- Set new commit date with BSD-compatible date command
- Verify timestamp change with git log

The workflow was previously checking out 'main' branch with a
comment indicating it needed to be fixed. This change makes it
properly checkout the tag specified by the workflow input.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-04 16:09:05 +02:00
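One way the timestamp adjustment described above could look (a sketch; the workflow's actual commands are not shown, BSD-compatible date per the message):

    git fetch --deepen=3                                       # fetch deeper history
    oldest_ts=$(git log -3 --format=%ct | sort -n | head -n1)  # oldest of the last 3 commits
    new_ts=$(( oldest_ts + 1 ))                                # 1 second newer
    new_date=$(date -u -r "${new_ts}" +"%Y-%m-%dT%H:%M:%SZ")   # BSD date syntax

    # Rewrite only HEAD's dates, preserving the commit content.
    GIT_COMMITTER_DATE="${new_date}" git commit --amend --no-edit --date="${new_date}"
    git log -1 --format='%ad %cd'                              # verify the timestamp change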
rjshrjndrn
a72f476f1c chore(ci): tag patching
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-04 13:15:56 +02:00
Mehdi Osman
623946ce4e
Increment assist chart version (#3267)
Co-authored-by: GitHub Action <action@github.com>
2025-04-03 13:29:02 -04:00
Mehdi Osman
2d099214fc
Increment frontend chart version (#3266)
Co-authored-by: GitHub Action <action@github.com>
2025-04-03 18:27:05 +02:00
Andrey Babushkin
b0e7054f89
Assist patch canvas (#3265)
* add agent info to assist and tracker

* removed AGENTS_CONNECTED event
2025-04-03 18:22:08 +02:00
Mehdi Osman
a9097270af
Increment chalice chart version (#3260)
Co-authored-by: GitHub Action <action@github.com>
2025-04-02 16:43:46 +02:00
Alexander
5d514ddaf2
feat(chalice): added for_spot=True for authenticate_sso (#3259) 2025-04-02 16:35:19 +02:00
Mehdi Osman
43688bb03b
Increment assist chart version (#3256)
Co-authored-by: GitHub Action <action@github.com>
2025-04-01 16:04:41 +02:00
Mehdi Osman
e050cee7bb
Increment frontend chart version (#3255)
Co-authored-by: GitHub Action <action@github.com>
2025-03-31 18:19:52 +02:00
Andrey Babushkin
6b35df7125
pulled updates (#3254) 2025-03-31 18:13:51 +02:00
GitHub Action
8e099b6dc3 Increment frontend chart version 2025-03-31 17:25:58 +02:00
nick-delirium
c0a4734054 ui: fix double fetches for sessions 2025-03-31 17:19:33 +02:00
GitHub Action
7de1efb5fe Increment frontend chart version 2025-03-31 12:08:45 +02:00
nick-delirium
d4ff28ddbe ui: fix modules label 2025-03-31 11:54:13 +02:00
nick-delirium
b2256f72d0 ui: fix modules mapper 2025-03-31 11:48:14 +02:00
GitHub Action
a63bda1c79 Increment frontend chart version 2025-03-31 11:17:34 +02:00
nick-delirium
3a0176789e ui: filter keys 2025-03-31 10:34:02 +02:00
nick-delirium
f2b7271fca ui: add old devtool filters 2025-03-31 10:31:06 +02:00
GitHub Action
d50f89662b Increment frontend chart version 2025-03-28 21:37:59 +01:00
GitHub Action
35051d201c Increment assist chart version 2025-03-28 21:37:59 +01:00
rjshrjndrn
214be95ecc fix(init): remove duplicate clone
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-03-28 21:25:24 +01:00
Delirium
dbc142c114
UI patches (28.03) (#3231)
* ui: force getting url for location in tabmanagers

* Assist add turn servers (#3229)

* fixed conflicts

* add offers

* add config to socket query

* add config to socket query

* add config init

* removed console logs

* removed wrong updates

* fixed conflicts

* add offers

* add config to socket query

* add config to socket query

* add config init

* removed console logs

* removed wrong updates

* ui: fix chat draggable, fix default params

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>

* ui: fix spritemap generation for assist sessions

* ui: fix yarnlock

* fix errors

* updated widget link

* resolved conflicts

* updated widget url

---------

Co-authored-by: Andrey Babushkin <55714097+reyand43@users.noreply.github.com>
Co-authored-by: Андрей Бабушкин <andreybabushkin2000@gmail.com>
2025-03-28 17:32:12 +01:00
GitHub Action
443f5e8f08 Increment frontend chart version 2025-03-27 12:36:54 +01:00
Shekar Siri
9f693f220d refactor(auth): separate SSO support from enterprise edition
Add dedicated isSSOSupported property to correctly identify when SSO
authentication is available, properly handling the 'msaas' edition
case separately from enterprise edition checks. This fixes SSO
visibility in the login interface.
2025-03-27 12:28:10 +01:00
GitHub Action
5ab30380b0 Increment chalice chart version 2025-03-26 17:48:08 +01:00
Taha Yassine Kraiem
fc86555644 refactor(chalice): changed user-journey 2025-03-26 17:18:17 +01:00
GitHub Action
2a3c611a27 Increment frontend chart version 2025-03-26 16:48:29 +01:00
Delirium
1d6fb0ae9e ui: shrink icons when no space, adjust player area for events export … (#3217)
* ui: shrink icons when no space, adjust player area for events export panel, fix panel size

* ui: rm log
2025-03-26 16:38:48 +01:00
GitHub Action
bef91a6136 Increment frontend chart version 2025-03-25 18:15:34 +01:00
Shekar Siri
1e2bd19d32 fix(dashboard): update filter condition in MetricsList
Change the filter type comparison from checking against 'all' to
checking against an empty string. This ensures proper filtering
behavior when filtering metrics in the dashboard component.
2025-03-25 18:10:13 +01:00
rjshrjndrn
3b58cb347e chore(http): remove default token_string
scripts/helmcharts/openreplay/charts/http/scripts/entrypoint.sh

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-03-24 19:31:01 +01:00
GitHub Action
ca4590501a Increment frontend chart version 2025-03-24 17:45:24 +01:00
Andrey Babushkin
fd12cc7585
fix(GraphQL): remove unused useTranslation hook (#3200) (#3206)
Co-authored-by: PiRDub <pirddeveloppeur@gmail.com>
2025-03-24 17:38:45 +01:00
rjshrjndrn
6abded53e0 feat(helm): add TOKEN_SECRET environment variable
Add TOKEN_SECRET environment variable to HTTP service deployment and
generate a random value for it in vars.yaml.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-03-24 16:55:35 +01:00
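A sketch of generating such a random value (the actual key name and layout in vars.yaml are assumptions here):

    token_secret="$(openssl rand -hex 20)"   # random TOKEN_SECRET value
    # Hypothetical key path; the real vars.yaml structure is not shown in this commit.
    sed -i "s|^tokenSecret:.*|tokenSecret: \"${token_secret}\"|" vars.yaml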
GitHub Action
82c5e5e59d Increment frontend chart version 2025-03-24 14:34:51 +01:00
nick-delirium
c77b0cc4de ui: fixes for onboarding ui 2025-03-24 14:30:22 +01:00
nick-delirium
de344e62ef ui: onboarding fixes 2025-03-24 14:30:22 +01:00
Mehdi Osman
deb78a62c0
Increment frontend chart version (#3189)
Co-authored-by: GitHub Action <action@github.com>
2025-03-21 11:00:14 +01:00
Shekar Siri
0724cf05f0
fix(auth): remove unnecessary captcha token validation (#3188)
The token validation checks were redundant as the validation is already
handled by the captcha wrapper component. This change simplifies the
password reset flow while maintaining security.
2025-03-21 10:55:39 +01:00
GitHub Action
cc704f1bc3 Increment frontend chart version 2025-03-20 16:18:42 +01:00
nick-delirium
4c159b2d26 ui: fix table column export 2025-03-20 16:08:58 +01:00
Mehdi Osman
42df33bc01
Increment assist chart version (#3181)
Co-authored-by: GitHub Action <action@github.com>
2025-03-19 14:58:26 +01:00
Alexander
ae95b48760
feat(assist): improved caching mechanism for cluster mode (#3180) 2025-03-19 14:53:58 +01:00
Mehdi Osman
4be3050e61
Increment frontend chart version (#3179)
Co-authored-by: GitHub Action <action@github.com>
2025-03-19 14:47:37 +01:00
Shekar Siri
8eec6e983b
feat(auth): implement withCaptcha HOC for consistent reCAPTCHA (#3177)
* feat(auth): implement withCaptcha HOC for consistent reCAPTCHA

This commit refactors the reCAPTCHA implementation across the application
by introducing a Higher Order Component (withCaptcha) that encapsulates
captcha verification logic. The changes:

- Create a reusable withCaptcha HOC in withRecaptcha.tsx
- Refactor Login, ResetPasswordRequest, and CreatePassword components
- Extract SSOLogin into a separate component
- Improve error handling and user feedback
- Standardize loading and verification states across forms
- Make captcha implementation more maintainable and consistent

* feat(auth): support msaas edition for enterprise features

Add msaas to the isEnterprise check alongside ee edition to properly
display enterprise features. Use userStore.isEnterprise in SSOLogin
component instead of directly checking authDetails.edition for
consistent
enterprise status detection.
2025-03-19 14:36:56 +01:00
Taha Yassine Kraiem
5fec615044 refactor(chalice): cleaned code
fix(chalice): fixed session-search-pg sortKey issue
fix(chalice): fixed CH-query-formatter to handle special chars
fix(chalice): fixed /ids response
2025-03-18 13:51:10 +01:00
Mehdi Osman
f77568a01c
Increment frontend chart version (#3167)
Co-authored-by: GitHub Action <action@github.com>
2025-03-18 13:45:09 +01:00
Shekar Siri
618e4dc59f
refactor(searchStore): reformat filterMap function parameters (#3166)
- Reformat the parameters of the filterMap function for better readability.
- Comment out the fetchSessions call in clearSearch method to avoid unnecessary session fetch.
2025-03-15 11:42:14 +01:00
568 changed files with 10644 additions and 30075 deletions

@@ -47,7 +47,6 @@ runs:
"JWT_SECRET:.global.jwtSecret"
"JWT_SPOT_REFRESH_SECRET:.chalice.env.JWT_SPOT_REFRESH_SECRET"
"JWT_SPOT_SECRET:.global.jwtSpotSecret"
-"JWT_SECRET:.global.tokenSecret"
"LICENSE_KEY:.global.enterpriseEditionLicense"
"MINIO_ACCESS_KEY:.global.s3.accessKey"
"MINIO_SECRET_KEY:.global.s3.secretKey"

@@ -130,7 +130,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,alerts,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,alerts,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -

@@ -130,7 +130,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,alerts,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,alerts,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -

@@ -127,7 +127,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,chalice,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,chalice,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -

@@ -120,7 +120,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,chalice,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,chalice,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -

@@ -113,7 +113,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,assist,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,assist,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -

@@ -111,7 +111,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,assist-server,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,assist-server,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -

@@ -130,7 +130,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,assist-stats,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,assist-stats,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -f -

@@ -112,7 +112,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,assist,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,assist,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -

@@ -129,7 +129,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,utilities,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,utilities,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -

@@ -76,7 +76,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,frontend,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,frontend,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -

@@ -1,33 +0,0 @@
name: Frontend tests
on:
  pull_request:
    paths:
      - 'frontend/**'
      - '.github/workflows/frontend-test.yaml'

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: 20
      - name: Install dependencies
        working-directory: frontend
        run: yarn
      - name: Run tests
        working-directory: frontend
        run: yarn test:ci
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v3
        with:
          directory: frontend/coverage/

@@ -89,7 +89,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,frontend,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,frontend,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -
@@ -138,7 +138,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,frontend,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,frontend,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -

@@ -119,7 +119,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,sourcemapreader,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,sourcemapreader,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -

@@ -118,7 +118,7 @@ jobs:
cat /tmp/image_override.yaml
# Deploy command
mkdir -p /tmp/charts
-mv openreplay/charts/{ingress-nginx,sourcemapreader,quickwit,connector,assist-api} /tmp/charts/
+mv openreplay/charts/{ingress-nginx,sourcemapreader,quickwit,connector} /tmp/charts/
rm -rf openreplay/charts/*
mv /tmp/charts/* openreplay/charts/
helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks | kubectl apply -n app -f -

@@ -22,14 +22,22 @@ jobs:
      - name: Cache tracker modules
        uses: actions/cache@v3
        with:
-          path: tracker/node_modules
-          key: ${{ runner.OS }}-test_tracker_build-${{ hashFiles('**/bun.lock') }}
+          path: tracker/tracker/node_modules
+          key: ${{ runner.OS }}-test_tracker_build-${{ hashFiles('**/bun.lockb') }}
          restore-keys: |
            test_tracker_build{{ runner.OS }}-build-
            test_tracker_build{{ runner.OS }}-
+      - name: Cache tracker-assist modules
+        uses: actions/cache@v3
+        with:
+          path: tracker/tracker-assist/node_modules
+          key: ${{ runner.OS }}-test_tracker_build-${{ hashFiles('**/bun.lockb') }}
+          restore-keys: |
+            test_tracker_build{{ runner.OS }}-build-
+            test_tracker_build{{ runner.OS }}-
      - name: Setup Testing packages
        run: |
-          cd tracker
+          cd tracker/tracker
          bun install
      - name: Jest tests
        run: |
@@ -39,6 +47,10 @@ jobs:
        run: |
          cd tracker/tracker
          bun run build
+      - name: (TA) Setup Testing packages
+        run: |
+          cd tracker/tracker-assist
+          bun install
      - name: (TA) Jest tests
        run: |
          cd tracker/tracker-assist

@@ -33,11 +33,10 @@ jobs:
      - name: Set Remote with GITHUB_TOKEN
        run: |
          git config --unset http.https://github.com/.extraheader
-          git remote set-url origin https://x-access-token:${{ secrets.ACTIONS_COMMMIT_TOKEN }}@github.com/${{ github.repository }}.git
+          git remote set-url origin https://x-access-token:${{ secrets.ACTIONS_COMMMIT_TOKEN }}@github.com/${{ github.repository }}
      - name: Push main branch to tag
        run: |
          git fetch --tags
-          git checkout main
          echo "Updating tag ${{ env.LATEST_TAG }} to point to latest commit on main"
          git push origin HEAD:refs/tags/${{ env.LATEST_TAG }} --force

@@ -148,7 +148,9 @@ jobs:
set -x
echo > /tmp/image_override.yaml
mkdir /tmp/helmcharts
-mv openreplay/charts/{ingress-nginx,quickwit,connector,assist-api} /tmp/helmcharts/
+mv openreplay/charts/ingress-nginx /tmp/helmcharts/
+mv openreplay/charts/quickwit /tmp/helmcharts/
+mv openreplay/charts/connector /tmp/helmcharts/
## Update images
for image in $(cat /tmp/images_to_build.txt);
do

@@ -141,7 +141,9 @@ jobs:
set -x
echo > /tmp/image_override.yaml
mkdir /tmp/helmcharts
-mv openreplay/charts/{ingress-nginx,quickwit,connector,assist-api} /tmp/helmcharts/
+mv openreplay/charts/ingress-nginx /tmp/helmcharts/
+mv openreplay/charts/quickwit /tmp/helmcharts/
+mv openreplay/charts/connector /tmp/helmcharts/
## Update images
for image in $(cat /tmp/images_to_build.txt);
do

.gitignore vendored (2 changes)

@@ -7,5 +7,3 @@ node_modules
**/*.envrc
.idea
*.mob*
-install-state.gz
-frontend/tests/playwright/auth-state.json

@@ -1,7 +1,7 @@
repos:
  - repo: https://github.com/gitguardian/ggshield
-    rev: v1.38.0
+    rev: v1.14.5
    hooks:
      - id: ggshield
        language_version: python3
-        stages: [pre-commit]
+        stages: [commit]

@@ -4,24 +4,26 @@ verify_ssl = true
name = "pypi"

[packages]
-urllib3 = "==2.4.0"
+urllib3 = "==2.3.0"
requests = "==2.32.3"
-boto3 = "==1.38.16"
+boto3 = "==1.36.12"
pyjwt = "==2.10.1"
psycopg2-binary = "==2.9.10"
-psycopg = {extras = ["binary", "pool"], version = "==3.2.9"}
-clickhouse-connect = "==0.8.17"
-elasticsearch = "==9.0.1"
+psycopg = {extras = ["pool", "binary"], version = "==3.2.4"}
+clickhouse-driver = {extras = ["lz4"], version = "==0.2.9"}
+clickhouse-connect = "==0.8.15"
+elasticsearch = "==8.17.1"
jira = "==3.8.0"
-cachetools = "==5.5.2"
-fastapi = "==0.115.12"
-uvicorn = {extras = ["standard"], version = "==0.34.2"}
+cachetools = "==5.5.1"
+fastapi = "==0.115.8"
+uvicorn = {extras = ["standard"], version = "==0.34.0"}
python-decouple = "==3.8"
-pydantic = {extras = ["email"], version = "==2.11.4"}
+pydantic = {extras = ["email"], version = "==2.10.6"}
apscheduler = "==3.11.0"
-redis = "==6.1.0"
+redis = "==5.2.1"

[dev-packages]

[requires]
-python_version = "3.12"
+python_full_version = "3.12.8"

@@ -16,7 +16,7 @@ from chalicelib.utils import helper
from chalicelib.utils import pg_client, ch_client
from crons import core_crons, core_dynamic_crons
from routers import core, core_dynamic
-from routers.subs import insights, metrics, v1_api, health, usability_tests, spot, product_analytics
+from routers.subs import insights, metrics, v1_api, health, usability_tests, spot, product_anaytics

loglevel = config("LOGLEVEL", default=logging.WARNING)
print(f">Loglevel set to: {loglevel}")
@@ -129,6 +129,6 @@ app.include_router(spot.public_app)
app.include_router(spot.app)
app.include_router(spot.app_apikey)
-app.include_router(product_analytics.public_app, prefix="/pa")
-app.include_router(product_analytics.app, prefix="/pa")
-app.include_router(product_analytics.app_apikey, prefix="/pa")
+app.include_router(product_anaytics.public_app)
+app.include_router(product_anaytics.app)
+app.include_router(product_anaytics.app_apikey)

@@ -1,11 +0,0 @@
import logging

from decouple import config

logging.basicConfig(level=config("LOGLEVEL", default=logging.INFO))

if config("EXP_AUTOCOMPLETE", cast=bool, default=False):
    logging.info(">>> Using experimental autocomplete")
    from . import autocomplete_ch as autocomplete
else:
    from . import autocomplete

@@ -1,9 +1,10 @@
import logging
import schemas
-from chalicelib.core import countries, metadata
+from chalicelib.core import countries, events, metadata
from chalicelib.utils import helper
from chalicelib.utils import pg_client
from chalicelib.utils.event_filter_definition import Event
+from chalicelib.utils.or_cache import CachedResponse
logger = logging.getLogger(__name__)
TABLE = "public.autocomplete"
@@ -112,10 +113,10 @@ def __generic_query(typename, value_length=None):
LIMIT 10;"""
-def __generic_autocomplete(event: str):
+def __generic_autocomplete(event: Event):
def f(project_id, value, key=None, source=None):
with pg_client.PostgresClient() as cur:
-query = __generic_query(event, value_length=len(value))
+query = __generic_query(event.ui_type, value_length=len(value))
params = {"project_id": project_id, "value": helper.string_to_sql_like(value),
"svalue": helper.string_to_sql_like("^" + value)}
cur.execute(cur.mogrify(query, params))
@@ -148,8 +149,8 @@ def __errors_query(source=None, value_length=None):
return f"""((SELECT DISTINCT ON(lg.message)
lg.message AS value,
source,
-'{schemas.EventType.ERROR}' AS type
-FROM events.errors INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.ERROR.ui_type}' AS type
+FROM {events.EventType.ERROR.table} INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.message ILIKE %(svalue)s
@@ -160,8 +161,8 @@ def __errors_query(source=None, value_length=None):
(SELECT DISTINCT ON(lg.name)
lg.name AS value,
source,
-'{schemas.EventType.ERROR}' AS type
-FROM events.errors INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.ERROR.ui_type}' AS type
+FROM {events.EventType.ERROR.table} INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.name ILIKE %(svalue)s
@@ -172,8 +173,8 @@ def __errors_query(source=None, value_length=None):
(SELECT DISTINCT ON(lg.message)
lg.message AS value,
source,
-'{schemas.EventType.ERROR}' AS type
-FROM events.errors INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.ERROR.ui_type}' AS type
+FROM {events.EventType.ERROR.table} INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.message ILIKE %(value)s
@@ -184,8 +185,8 @@ def __errors_query(source=None, value_length=None):
(SELECT DISTINCT ON(lg.name)
lg.name AS value,
source,
-'{schemas.EventType.ERROR}' AS type
-FROM events.errors INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.ERROR.ui_type}' AS type
+FROM {events.EventType.ERROR.table} INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.name ILIKE %(value)s
@@ -195,8 +196,8 @@ def __errors_query(source=None, value_length=None):
return f"""((SELECT DISTINCT ON(lg.message)
lg.message AS value,
source,
-'{schemas.EventType.ERROR}' AS type
-FROM events.errors INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.ERROR.ui_type}' AS type
+FROM {events.EventType.ERROR.table} INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.message ILIKE %(svalue)s
@@ -207,8 +208,8 @@ def __errors_query(source=None, value_length=None):
(SELECT DISTINCT ON(lg.name)
lg.name AS value,
source,
-'{schemas.EventType.ERROR}' AS type
-FROM events.errors INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.ERROR.ui_type}' AS type
+FROM {events.EventType.ERROR.table} INNER JOIN public.errors AS lg USING (error_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.name ILIKE %(svalue)s
@@ -233,8 +234,8 @@ def __search_errors_mobile(project_id, value, key=None, source=None):
if len(value) > 2:
query = f"""(SELECT DISTINCT ON(lg.reason)
lg.reason AS value,
-'{schemas.EventType.ERROR_MOBILE}' AS type
-FROM events_common.crashes INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.CRASH_MOBILE.ui_type}' AS type
+FROM {events.EventType.CRASH_MOBILE.table} INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.project_id = %(project_id)s
@@ -243,8 +244,8 @@ def __search_errors_mobile(project_id, value, key=None, source=None):
UNION ALL
(SELECT DISTINCT ON(lg.name)
lg.name AS value,
-'{schemas.EventType.ERROR_MOBILE}' AS type
-FROM events_common.crashes INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.CRASH_MOBILE.ui_type}' AS type
+FROM {events.EventType.CRASH_MOBILE.table} INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.project_id = %(project_id)s
@@ -253,8 +254,8 @@ def __search_errors_mobile(project_id, value, key=None, source=None):
UNION ALL
(SELECT DISTINCT ON(lg.reason)
lg.reason AS value,
-'{schemas.EventType.ERROR_MOBILE}' AS type
-FROM events_common.crashes INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.CRASH_MOBILE.ui_type}' AS type
+FROM {events.EventType.CRASH_MOBILE.table} INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.project_id = %(project_id)s
@@ -263,8 +264,8 @@ def __search_errors_mobile(project_id, value, key=None, source=None):
UNION ALL
(SELECT DISTINCT ON(lg.name)
lg.name AS value,
-'{schemas.EventType.ERROR_MOBILE}' AS type
-FROM events_common.crashes INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.CRASH_MOBILE.ui_type}' AS type
+FROM {events.EventType.CRASH_MOBILE.table} INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.project_id = %(project_id)s
@@ -273,8 +274,8 @@ def __search_errors_mobile(project_id, value, key=None, source=None):
else:
query = f"""(SELECT DISTINCT ON(lg.reason)
lg.reason AS value,
-'{schemas.EventType.ERROR_MOBILE}' AS type
-FROM events_common.crashes INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.CRASH_MOBILE.ui_type}' AS type
+FROM {events.EventType.CRASH_MOBILE.table} INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.project_id = %(project_id)s
@@ -283,8 +284,8 @@ def __search_errors_mobile(project_id, value, key=None, source=None):
UNION ALL
(SELECT DISTINCT ON(lg.name)
lg.name AS value,
-'{schemas.EventType.ERROR_MOBILE}' AS type
-FROM events_common.crashes INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
+'{events.EventType.CRASH_MOBILE.ui_type}' AS type
+FROM {events.EventType.CRASH_MOBILE.table} INNER JOIN public.crashes_ios AS lg USING (crash_ios_id) LEFT JOIN public.sessions AS s USING(session_id)
WHERE
s.project_id = %(project_id)s
AND lg.project_id = %(project_id)s
@@ -376,6 +377,7 @@ def is_top_supported(event_type):
return TYPE_TO_COLUMN.get(event_type, False)
+@CachedResponse(table="or_cache.autocomplete_top_values", ttl=5 * 60)
def get_top_values(project_id, event_type, event_key=None):
with pg_client.PostgresClient() as cur:
if schemas.FilterType.has_value(event_type):

@@ -1,5 +1,3 @@
-import logging
-
import schemas
from chalicelib.core import metadata
from chalicelib.core.errors import errors_legacy
@@ -9,8 +7,6 @@ from chalicelib.utils import ch_client, exp_ch_helper
from chalicelib.utils import helper, metrics_helper
from chalicelib.utils.TimeUTC import TimeUTC
-
-logger = logging.getLogger(__name__)
def _multiple_values(values, value_key="value"):
query_values = {}
@@ -382,9 +378,9 @@ def search(data: schemas.SearchErrorsSchema, project: schemas.ProjectContext, us
ORDER BY timestamp) AS sub_table
GROUP BY error_id) AS chart_details ON details.error_id=chart_details.error_id;"""
-logger.debug("------------")
-logger.debug(ch.format(main_ch_query, params))
-logger.debug("------------")
+# print("------------")
+# print(ch.format(main_ch_query, params))
+# print("------------")
query = ch.format(query=main_ch_query, parameters=params)
rows = ch.execute(query=query)
@ -0,0 +1,226 @@
from functools import cache
from typing import Optional
import schemas
from chalicelib.core import issues
from chalicelib.core.autocomplete import autocomplete
from chalicelib.core.sessions import sessions_metas
from chalicelib.utils import pg_client, helper
from chalicelib.utils.TimeUTC import TimeUTC
from chalicelib.utils.event_filter_definition import SupportedFilter, Event
def get_customs_by_session_id(session_id, project_id):
with pg_client.PostgresClient() as cur:
cur.execute(cur.mogrify("""\
SELECT
c.*,
'CUSTOM' AS type
FROM events_common.customs AS c
WHERE
c.session_id = %(session_id)s
ORDER BY c.timestamp;""",
{"project_id": project_id, "session_id": session_id})
)
rows = cur.fetchall()
return helper.dict_to_camel_case(rows)
def __merge_cells(rows, start, count, replacement):
rows[start] = replacement
rows = rows[:start + 1] + rows[start + count:]
return rows
def __get_grouped_clickrage(rows, session_id, project_id):
click_rage_issues = issues.get_by_session_id(session_id=session_id, issue_type="click_rage", project_id=project_id)
if len(click_rage_issues) == 0:
return rows
for c in click_rage_issues:
merge_count = c.get("payload")
if merge_count is not None:
merge_count = merge_count.get("Count", 3)
else:
merge_count = 3
for i in range(len(rows)):
if rows[i]["timestamp"] == c["timestamp"]:
rows = __merge_cells(rows=rows,
start=i,
count=merge_count,
replacement={**rows[i], "type": "CLICKRAGE", "count": merge_count})
break
return rows
def get_by_session_id(session_id, project_id, group_clickrage=False, event_type: Optional[schemas.EventType] = None):
with pg_client.PostgresClient() as cur:
rows = []
if event_type is None or event_type == schemas.EventType.CLICK:
cur.execute(cur.mogrify("""\
SELECT
c.*,
'CLICK' AS type
FROM events.clicks AS c
WHERE
c.session_id = %(session_id)s
ORDER BY c.timestamp;""",
{"project_id": project_id, "session_id": session_id})
)
rows += cur.fetchall()
if group_clickrage:
rows = __get_grouped_clickrage(rows=rows, session_id=session_id, project_id=project_id)
if event_type is None or event_type == schemas.EventType.INPUT:
cur.execute(cur.mogrify("""
SELECT
i.*,
'INPUT' AS type
FROM events.inputs AS i
WHERE
i.session_id = %(session_id)s
ORDER BY i.timestamp;""",
{"project_id": project_id, "session_id": session_id})
)
rows += cur.fetchall()
if event_type is None or event_type == schemas.EventType.LOCATION:
cur.execute(cur.mogrify("""\
SELECT
l.*,
l.path AS value,
l.path AS url,
'LOCATION' AS type
FROM events.pages AS l
WHERE
l.session_id = %(session_id)s
ORDER BY l.timestamp;""", {"project_id": project_id, "session_id": session_id}))
rows += cur.fetchall()
rows = helper.list_to_camel_case(rows)
rows = sorted(rows, key=lambda k: (k["timestamp"], k["messageId"]))
return rows
def _search_tags(project_id, value, key=None, source=None):
with pg_client.PostgresClient() as cur:
query = f"""
SELECT public.tags.name
'TAG' AS type
FROM public.tags
WHERE public.tags.project_id = %(project_id)s
ORDER BY SIMILARITY(public.tags.name, %(value)s) DESC
LIMIT 10
"""
query = cur.mogrify(query, {'project_id': project_id, 'value': value})
cur.execute(query)
results = helper.list_to_camel_case(cur.fetchall())
return results
class EventType:
CLICK = Event(ui_type=schemas.EventType.CLICK, table="events.clicks", column="label")
INPUT = Event(ui_type=schemas.EventType.INPUT, table="events.inputs", column="label")
LOCATION = Event(ui_type=schemas.EventType.LOCATION, table="events.pages", column="path")
CUSTOM = Event(ui_type=schemas.EventType.CUSTOM, table="events_common.customs", column="name")
REQUEST = Event(ui_type=schemas.EventType.REQUEST, table="events_common.requests", column="path")
GRAPHQL = Event(ui_type=schemas.EventType.GRAPHQL, table="events.graphql", column="name")
STATEACTION = Event(ui_type=schemas.EventType.STATE_ACTION, table="events.state_actions", column="name")
TAG = Event(ui_type=schemas.EventType.TAG, table="events.tags", column="tag_id")
ERROR = Event(ui_type=schemas.EventType.ERROR, table="events.errors",
column=None) # column=None because errors are searched by name or message
METADATA = Event(ui_type=schemas.FilterType.METADATA, table="public.sessions", column=None)
# MOBILE
CLICK_MOBILE = Event(ui_type=schemas.EventType.CLICK_MOBILE, table="events_ios.taps", column="label")
INPUT_MOBILE = Event(ui_type=schemas.EventType.INPUT_MOBILE, table="events_ios.inputs", column="label")
VIEW_MOBILE = Event(ui_type=schemas.EventType.VIEW_MOBILE, table="events_ios.views", column="name")
SWIPE_MOBILE = Event(ui_type=schemas.EventType.SWIPE_MOBILE, table="events_ios.swipes", column="label")
CUSTOM_MOBILE = Event(ui_type=schemas.EventType.CUSTOM_MOBILE, table="events_common.customs", column="name")
REQUEST_MOBILE = Event(ui_type=schemas.EventType.REQUEST_MOBILE, table="events_common.requests", column="path")
CRASH_MOBILE = Event(ui_type=schemas.EventType.ERROR_MOBILE, table="events_common.crashes",
column=None) # column=None because errors are searched by name or message
@cache
def supported_types():
return {
EventType.CLICK.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.CLICK),
query=autocomplete.__generic_query(typename=EventType.CLICK.ui_type)),
EventType.INPUT.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.INPUT),
query=autocomplete.__generic_query(typename=EventType.INPUT.ui_type)),
EventType.LOCATION.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.LOCATION),
query=autocomplete.__generic_query(
typename=EventType.LOCATION.ui_type)),
EventType.CUSTOM.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.CUSTOM),
query=autocomplete.__generic_query(
typename=EventType.CUSTOM.ui_type)),
EventType.REQUEST.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.REQUEST),
query=autocomplete.__generic_query(
typename=EventType.REQUEST.ui_type)),
EventType.GRAPHQL.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.GRAPHQL),
query=autocomplete.__generic_query(
typename=EventType.GRAPHQL.ui_type)),
EventType.STATEACTION.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.STATEACTION),
query=autocomplete.__generic_query(
typename=EventType.STATEACTION.ui_type)),
EventType.TAG.ui_type: SupportedFilter(get=_search_tags, query=None),
EventType.ERROR.ui_type: SupportedFilter(get=autocomplete.__search_errors,
query=None),
EventType.METADATA.ui_type: SupportedFilter(get=autocomplete.__search_metadata,
query=None),
# MOBILE
EventType.CLICK_MOBILE.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.CLICK_MOBILE),
query=autocomplete.__generic_query(
typename=EventType.CLICK_MOBILE.ui_type)),
EventType.SWIPE_MOBILE.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.SWIPE_MOBILE),
query=autocomplete.__generic_query(
typename=EventType.SWIPE_MOBILE.ui_type)),
EventType.INPUT_MOBILE.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.INPUT_MOBILE),
query=autocomplete.__generic_query(
typename=EventType.INPUT_MOBILE.ui_type)),
EventType.VIEW_MOBILE.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.VIEW_MOBILE),
query=autocomplete.__generic_query(
typename=EventType.VIEW_MOBILE.ui_type)),
EventType.CUSTOM_MOBILE.ui_type: SupportedFilter(
get=autocomplete.__generic_autocomplete(EventType.CUSTOM_MOBILE),
query=autocomplete.__generic_query(
typename=EventType.CUSTOM_MOBILE.ui_type)),
EventType.REQUEST_MOBILE.ui_type: SupportedFilter(
get=autocomplete.__generic_autocomplete(EventType.REQUEST_MOBILE),
query=autocomplete.__generic_query(
typename=EventType.REQUEST_MOBILE.ui_type)),
EventType.CRASH_MOBILE.ui_type: SupportedFilter(get=autocomplete.__search_errors_mobile,
query=None),
}
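A minimal sketch of how this registry is consumed (values hypothetical):
handler = supported_types()[EventType.CLICK.ui_type]
rows = handler.get(project_id=1, value="buy", key=None, source=None)
# handler.query, when not None, is the reusable query builder for that event type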
def get_errors_by_session_id(session_id, project_id):
with pg_client.PostgresClient() as cur:
cur.execute(cur.mogrify(f"""\
SELECT er.*,ur.*, er.timestamp - s.start_ts AS time
FROM {EventType.ERROR.table} AS er INNER JOIN public.errors AS ur USING (error_id) INNER JOIN public.sessions AS s USING (session_id)
WHERE er.session_id = %(session_id)s AND s.project_id=%(project_id)s
ORDER BY timestamp;""", {"session_id": session_id, "project_id": project_id}))
errors = cur.fetchall()
for e in errors:
e["stacktrace_parsed_at"] = TimeUTC.datetime_to_timestamp(e["stacktrace_parsed_at"])
return helper.list_to_camel_case(errors)
def search(text, event_type, project_id, source, key):
if not event_type:
return {"data": autocomplete.__get_autocomplete_table(text, project_id)}
if event_type in supported_types().keys():
rows = supported_types()[event_type].get(project_id=project_id, value=text, key=key, source=source)
elif event_type + "_MOBILE" in supported_types().keys():
rows = supported_types()[event_type + "_MOBILE"].get(project_id=project_id, value=text, key=key, source=source)
elif event_type in sessions_metas.supported_types().keys():
return sessions_metas.search(text, event_type, project_id)
elif event_type.endswith("_IOS") \
and event_type[:-len("_IOS")] in sessions_metas.supported_types().keys():
return sessions_metas.search(text, event_type, project_id)
elif event_type.endswith("_MOBILE") \
and event_type[:-len("_MOBILE")] in sessions_metas.supported_types().keys():
return sessions_metas.search(text, event_type, project_id)
else:
return {"errors": ["unsupported event"]}
return {"data": rows}

View file

@ -1,11 +0,0 @@
import logging
from decouple import config
logger = logging.getLogger(__name__)
if config("EXP_EVENTS", cast=bool, default=False):
logger.info(">>> Using experimental events replay")
from . import events_ch as events
else:
from . import events_pg as events

View file

@ -1,96 +0,0 @@
from chalicelib.utils import ch_client
from .events_pg import *
from chalicelib.utils.exp_ch_helper import explode_dproperties, add_timestamp
def get_customs_by_session_id(session_id, project_id):
with ch_client.ClickHouseClient() as cur:
rows = cur.execute(""" \
SELECT `$properties`,
properties,
created_at,
'CUSTOM' AS type,
`$event_name` AS name
FROM product_analytics.events
WHERE session_id = %(session_id)s
AND NOT `$auto_captured`
AND `$event_name`!='INCIDENT'
ORDER BY created_at;""",
{"project_id": project_id, "session_id": session_id})
rows = helper.list_to_camel_case(rows, ignore_keys=["properties"])
rows = explode_dproperties(rows)
rows = add_timestamp(rows)
return rows
def __merge_cells(rows, start, count, replacement):
rows[start] = replacement
rows = rows[:start + 1] + rows[start + count:]
return rows
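A self-contained illustration of the merge semantics above: the row at start is replaced and the following count - 1 rows are dropped.
rows = [{"type": "CLICK", "ts": t} for t in (0, 10, 20, 30)]
merged = __merge_cells(rows=rows, start=1, count=3, replacement={"type": "CLICKRAGE", "count": 3})
assert [r["type"] for r in merged] == ["CLICK", "CLICKRAGE"]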
def __get_grouped_clickrage(rows, session_id, project_id):
click_rage_issues = issues.get_by_session_id(session_id=session_id, issue_type="click_rage", project_id=project_id)
if len(click_rage_issues) == 0:
return rows
for c in click_rage_issues:
merge_count = c.get("payload")
if merge_count is not None:
merge_count = merge_count.get("Count", 3)
else:
merge_count = 3
for i in range(len(rows)):
if rows[i]["created_at"] == c["createdAt"]:
rows = __merge_cells(rows=rows,
start=i,
count=merge_count,
replacement={**rows[i], "type": "CLICKRAGE", "count": merge_count})
break
return rows
def get_by_session_id(session_id, project_id, group_clickrage=False, event_type: Optional[schemas.EventType] = None):
with ch_client.ClickHouseClient() as cur:
select_events = ('CLICK', 'INPUT', 'LOCATION')
if event_type is not None:
select_events = (event_type,)
query = cur.format(query=""" \
SELECT created_at,
`$properties`,
`$event_name` AS type
FROM product_analytics.events
WHERE session_id = %(session_id)s
AND `$event_name` IN %(select_events)s
AND `$auto_captured`
ORDER BY created_at;""",
parameters={"project_id": project_id, "session_id": session_id,
"select_events": select_events})
rows = cur.execute(query)
rows = explode_dproperties(rows)
if group_clickrage and 'CLICK' in select_events:
rows = __get_grouped_clickrage(rows=rows, session_id=session_id, project_id=project_id)
rows = helper.list_to_camel_case(rows)
rows = sorted(rows, key=lambda k: k["createdAt"])
rows = add_timestamp(rows)
return rows
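Sketch: select_events is passed as a tuple so the client can expand it into the IN (...) clause; narrowing to a single type yields a one-element tuple. A hypothetical call (assumes a configured ClickHouse client):
rows = get_by_session_id(session_id=123, project_id=1, event_type=schemas.EventType.LOCATION)
# only auto-captured LOCATION events, sorted by createdAt, with a
# millisecond timestamp added by add_timestamp()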
def get_incidents_by_session_id(session_id, project_id):
with ch_client.ClickHouseClient() as cur:
query = cur.format(query=""" \
SELECT created_at,
`$properties`,
`$event_name` AS type
FROM product_analytics.events
WHERE session_id = %(session_id)s
AND `$event_name` = 'INCIDENT'
AND `$auto_captured`
ORDER BY created_at;""",
parameters={"project_id": project_id, "session_id": session_id})
rows = cur.execute(query)
rows = explode_dproperties(rows)
rows = helper.list_to_camel_case(rows)
rows = sorted(rows, key=lambda k: k["createdAt"])
return rows

View file

@ -1,209 +0,0 @@
import logging
from functools import cache
from typing import Optional
import schemas
from chalicelib.core.autocomplete import autocomplete
from chalicelib.core.issues import issues
from chalicelib.core.sessions import sessions_metas
from chalicelib.utils import pg_client, helper
from chalicelib.utils.TimeUTC import TimeUTC
from chalicelib.utils.event_filter_definition import SupportedFilter
logger = logging.getLogger(__name__)
def get_customs_by_session_id(session_id, project_id):
with pg_client.PostgresClient() as cur:
cur.execute(cur.mogrify(""" \
SELECT c.*,
'CUSTOM' AS type
FROM events_common.customs AS c
WHERE c.session_id = %(session_id)s
ORDER BY c.timestamp;""",
{"project_id": project_id, "session_id": session_id})
)
rows = cur.fetchall()
return helper.list_to_camel_case(rows)
def __merge_cells(rows, start, count, replacement):
rows[start] = replacement
rows = rows[:start + 1] + rows[start + count:]
return rows
def __get_grouped_clickrage(rows, session_id, project_id):
click_rage_issues = issues.get_by_session_id(session_id=session_id, issue_type="click_rage", project_id=project_id)
if len(click_rage_issues) == 0:
return rows
for c in click_rage_issues:
merge_count = c.get("payload")
if merge_count is not None:
merge_count = merge_count.get("Count", 3)
else:
merge_count = 3
for i in range(len(rows)):
if rows[i]["timestamp"] == c["timestamp"]:
rows = __merge_cells(rows=rows,
start=i,
count=merge_count,
replacement={**rows[i], "type": "CLICKRAGE", "count": merge_count})
break
return rows
def get_by_session_id(session_id, project_id, group_clickrage=False, event_type: Optional[schemas.EventType] = None):
with pg_client.PostgresClient() as cur:
rows = []
if event_type is None or event_type == schemas.EventType.CLICK:
cur.execute(cur.mogrify(""" \
SELECT c.*,
'CLICK' AS type
FROM events.clicks AS c
WHERE c.session_id = %(session_id)s
ORDER BY c.timestamp;""",
{"project_id": project_id, "session_id": session_id})
)
rows += cur.fetchall()
if group_clickrage:
rows = __get_grouped_clickrage(rows=rows, session_id=session_id, project_id=project_id)
if event_type is None or event_type == schemas.EventType.INPUT:
cur.execute(cur.mogrify("""
SELECT i.*,
'INPUT' AS type
FROM events.inputs AS i
WHERE i.session_id = %(session_id)s
ORDER BY i.timestamp;""",
{"project_id": project_id, "session_id": session_id})
)
rows += cur.fetchall()
if event_type is None or event_type == schemas.EventType.LOCATION:
cur.execute(cur.mogrify(""" \
SELECT l.*,
l.path AS value,
l.path AS url,
'LOCATION' AS type
FROM events.pages AS l
WHERE
l.session_id = %(session_id)s
ORDER BY l.timestamp;""", {"project_id": project_id, "session_id": session_id}))
rows += cur.fetchall()
rows = helper.list_to_camel_case(rows)
rows = sorted(rows, key=lambda k: (k["timestamp"], k["messageId"]))
return rows
def _search_tags(project_id, value, key=None, source=None):
with pg_client.PostgresClient() as cur:
query = f"""
SELECT public.tags.name,
'TAG' AS type
FROM public.tags
WHERE public.tags.project_id = %(project_id)s
ORDER BY SIMILARITY(public.tags.name, %(value)s) DESC
LIMIT 10
"""
query = cur.mogrify(query, {'project_id': project_id, 'value': value})
cur.execute(query)
results = helper.list_to_camel_case(cur.fetchall())
return results
@cache
def supported_types():
return {
schemas.EventType.CLICK: SupportedFilter(get=autocomplete.__generic_autocomplete(schemas.EventType.CLICK),
query=autocomplete.__generic_query(typename=schemas.EventType.CLICK)),
schemas.EventType.INPUT: SupportedFilter(get=autocomplete.__generic_autocomplete(schemas.EventType.INPUT),
query=autocomplete.__generic_query(typename=schemas.EventType.INPUT)),
schemas.EventType.LOCATION: SupportedFilter(get=autocomplete.__generic_autocomplete(schemas.EventType.LOCATION),
query=autocomplete.__generic_query(
typename=schemas.EventType.LOCATION)),
schemas.EventType.CUSTOM: SupportedFilter(get=autocomplete.__generic_autocomplete(schemas.EventType.CUSTOM),
query=autocomplete.__generic_query(
typename=schemas.EventType.CUSTOM)),
schemas.EventType.REQUEST: SupportedFilter(get=autocomplete.__generic_autocomplete(schemas.EventType.REQUEST),
query=autocomplete.__generic_query(
typename=schemas.EventType.REQUEST)),
schemas.EventType.GRAPHQL: SupportedFilter(get=autocomplete.__generic_autocomplete(schemas.EventType.GRAPHQL),
query=autocomplete.__generic_query(
typename=schemas.EventType.GRAPHQL)),
schemas.EventType.STATE_ACTION: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.STATE_ACTION),
query=autocomplete.__generic_query(
typename=schemas.EventType.STATE_ACTION)),
schemas.EventType.TAG: SupportedFilter(get=_search_tags, query=None),
schemas.EventType.ERROR: SupportedFilter(get=autocomplete.__search_errors,
query=None),
schemas.FilterType.METADATA: SupportedFilter(get=autocomplete.__search_metadata,
query=None),
# MOBILE
schemas.EventType.CLICK_MOBILE: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.CLICK_MOBILE),
query=autocomplete.__generic_query(
typename=schemas.EventType.CLICK_MOBILE)),
schemas.EventType.SWIPE_MOBILE: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.SWIPE_MOBILE),
query=autocomplete.__generic_query(
typename=schemas.EventType.SWIPE_MOBILE)),
schemas.EventType.INPUT_MOBILE: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.INPUT_MOBILE),
query=autocomplete.__generic_query(
typename=schemas.EventType.INPUT_MOBILE)),
schemas.EventType.VIEW_MOBILE: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.VIEW_MOBILE),
query=autocomplete.__generic_query(
typename=schemas.EventType.VIEW_MOBILE)),
schemas.EventType.CUSTOM_MOBILE: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.CUSTOM_MOBILE),
query=autocomplete.__generic_query(
typename=schemas.EventType.CUSTOM_MOBILE)),
schemas.EventType.REQUEST_MOBILE: SupportedFilter(
get=autocomplete.__generic_autocomplete(schemas.EventType.REQUEST_MOBILE),
query=autocomplete.__generic_query(
typename=schemas.EventType.REQUEST_MOBILE)),
schemas.EventType.ERROR_MOBILE: SupportedFilter(get=autocomplete.__search_errors_mobile,
query=None),
}
def get_errors_by_session_id(session_id, project_id):
with pg_client.PostgresClient() as cur:
cur.execute(cur.mogrify(f"""\
SELECT er.*,ur.*, er.timestamp - s.start_ts AS time
FROM events.errors AS er INNER JOIN public.errors AS ur USING (error_id) INNER JOIN public.sessions AS s USING (session_id)
WHERE er.session_id = %(session_id)s AND s.project_id=%(project_id)s
ORDER BY timestamp;""", {"session_id": session_id, "project_id": project_id}))
errors = cur.fetchall()
for e in errors:
e["stacktrace_parsed_at"] = TimeUTC.datetime_to_timestamp(e["stacktrace_parsed_at"])
return helper.list_to_camel_case(errors)
def get_incidents_by_session_id(session_id, project_id):
logger.warning("INCIDENTS not supported in PG")
return []
def search(text, event_type, project_id, source, key):
if not event_type:
return {"data": autocomplete.__get_autocomplete_table(text, project_id)}
if event_type in supported_types().keys():
rows = supported_types()[event_type].get(project_id=project_id, value=text, key=key, source=source)
elif event_type + "_MOBILE" in supported_types().keys():
rows = supported_types()[event_type + "_MOBILE"].get(project_id=project_id, value=text, key=key, source=source)
elif event_type in sessions_metas.supported_types().keys():
return sessions_metas.search(text, event_type, project_id)
elif event_type.endswith("_IOS") \
and event_type[:-len("_IOS")] in sessions_metas.supported_types().keys():
return sessions_metas.search(text, event_type, project_id)
elif event_type.endswith("_MOBILE") \
and event_type[:-len("_MOBILE")] in sessions_metas.supported_types().keys():
return sessions_metas.search(text, event_type, project_id)
else:
return {"errors": ["unsupported event"]}
return {"data": rows}

View file

@ -1,5 +1,5 @@
from chalicelib.utils import pg_client, helper
from . import events
from chalicelib.core import events
def get_customs_by_session_id(session_id, project_id):
@ -58,7 +58,7 @@ def get_crashes_by_session_id(session_id):
with pg_client.PostgresClient() as cur:
cur.execute(cur.mogrify(f"""
SELECT cr.*,uc.*, cr.timestamp - s.start_ts AS time
FROM events_common.crashes AS cr
FROM {events.EventType.CRASH_MOBILE.table} AS cr
INNER JOIN public.crashes_ios AS uc USING (crash_ios_id)
INNER JOIN public.sessions AS s USING (session_id)
WHERE

View file

@ -4,8 +4,9 @@ from chalicelib.utils import pg_client, helper
def get(project_id, issue_id):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(
""" \
SELECT *
"""\
SELECT
*
FROM public.issues
WHERE project_id = %(project_id)s
AND issue_id = %(issue_id)s;""",
@ -34,29 +35,6 @@ def get_by_session_id(session_id, project_id, issue_type=None):
return helper.list_to_camel_case(cur.fetchall())
# To reduce the number of issues in the replay;
# will be removed once we agree on how to show issues
def reduce_issues(issues_list):
if issues_list is None:
return None
i = 0
# remove same-type issues if the time between them is <2s
while i < len(issues_list) - 1:
for j in range(i + 1, len(issues_list)):
if issues_list[i]["type"] == issues_list[j]["type"]:
break
else:
i += 1
break
if issues_list[i]["timestamp"] - issues_list[j]["timestamp"] < 2000:
issues_list.pop(j)
else:
i += 1
return issues_list
def get_all_types():
return [
{

View file

@ -1,11 +0,0 @@
import logging
from decouple import config
logger = logging.getLogger(__name__)
if config("EXP_EVENTS", cast=bool, default=False):
logger.info(">>> Using experimental issues")
from . import issues_ch as issues
else:
from . import issues_pg as issues

View file

@ -1,59 +0,0 @@
from chalicelib.utils import ch_client, helper
import datetime
from chalicelib.utils.exp_ch_helper import explode_dproperties, add_timestamp
def get(project_id, issue_id):
with ch_client.ClickHouseClient() as cur:
query = cur.format(query=""" \
SELECT *
FROM product_analytics.events
WHERE project_id = %(project_id)s
AND issue_id = %(issue_id)s;""",
parameters={"project_id": project_id, "issue_id": issue_id})
data = cur.execute(query=query)
if data is not None and len(data) > 0:
data = data[0]
data["title"] = helper.get_issue_title(data["type"])
return helper.dict_to_camel_case(data)
def get_by_session_id(session_id, project_id, issue_type=None):
with ch_client.ClickHouseClient() as cur:
query = cur.format(query=f"""\
SELECT created_at, `$properties`
FROM product_analytics.events
WHERE session_id = %(session_id)s
AND project_id= %(project_id)s
AND `$event_name`='ISSUE'
{"AND issue_type = %(type)s" if issue_type is not None else ""}
ORDER BY created_at;""",
parameters={"session_id": session_id, "project_id": project_id, "type": issue_type})
rows = cur.execute(query)
rows = explode_dproperties(rows)
rows = helper.list_to_camel_case(rows)
rows = add_timestamp(rows)
return rows
# To reduce the number of issues in the replay;
# will be removed once we agree on how to show issues
def reduce_issues(issues_list):
if issues_list is None:
return None
i = 0
# remove same-type issues if the time between them is <2s
while i < len(issues_list) - 1:
for j in range(i + 1, len(issues_list)):
if issues_list[i]["issueType"] == issues_list[j]["issueType"]:
break
else:
i += 1
break
if issues_list[i]["createdAt"] - issues_list[j]["createdAt"] < datetime.timedelta(seconds=2):
issues_list.pop(j)
else:
i += 1
return issues_list

View file

@ -241,24 +241,3 @@ def get_colname_by_key(project_id, key):
return None
return index_to_colname(meta_keys[key])
def get_for_filters(project_id):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(f"""SELECT {",".join(column_names())}
FROM public.projects
WHERE project_id = %(project_id)s
AND deleted_at ISNULL
LIMIT 1;""", {"project_id": project_id})
cur.execute(query=query)
metas = cur.fetchone()
results = []
if metas is not None:
for i, k in enumerate(metas.keys()):
if metas[k] is not None:
results.append({"id": f"meta_{i}",
"name": k,
"displayName": metas[k],
"possibleTypes": ["String"],
"autoCaptured": False})
return {"total": len(results), "list": results}

View file

@ -4,7 +4,7 @@ import logging
from fastapi import HTTPException, status
import schemas
from chalicelib.core.issues import issues
from chalicelib.core import issues
from chalicelib.core.errors import errors
from chalicelib.core.metrics import heatmaps, product_analytics, funnels
from chalicelib.core.sessions import sessions, sessions_search
@ -61,9 +61,6 @@ def get_heat_map_chart(project: schemas.ProjectContext, user_id, data: schemas.C
return None
data.series[0].filter.filters += data.series[0].filter.events
data.series[0].filter.events = []
print(">>>>>>>>>>>>>>>>>>>>>>>>><")
print(data.series[0].filter.model_dump())
print(">>>>>>>>>>>>>>>>>>>>>>>>><")
return heatmaps.search_short_session(project_id=project.project_id, user_id=user_id,
data=schemas.HeatMapSessionsSearch(
**data.series[0].filter.model_dump()),
@ -172,8 +169,7 @@ def get_sessions_by_card_id(project: schemas.ProjectContext, user_id, metric_id,
results = []
for s in data.series:
results.append({"seriesId": s.series_id, "seriesName": s.name,
**sessions_search.search_sessions(data=s.filter, project=project, user_id=user_id,
metric_of=data.metric_of)})
**sessions_search.search_sessions(data=s.filter, project=project, user_id=user_id)})
return results
@ -188,8 +184,7 @@ def get_sessions(project: schemas.ProjectContext, user_id, data: schemas.CardSes
s.filter = schemas.SessionsSearchPayloadSchema(**s.filter.model_dump(by_alias=True))
results.append({"seriesId": None, "seriesName": s.name,
**sessions_search.search_sessions(data=s.filter, project=project, user_id=user_id,
metric_of=data.metric_of)})
**sessions_search.search_sessions(data=s.filter, project=project, user_id=user_id)})
return results
@ -252,7 +247,8 @@ def create_card(project: schemas.ProjectContext, user_id, data: schemas.CardSche
VALUES (%(project_id)s, %(user_id)s, %(name)s, %(is_public)s,
%(view_type)s, %(metric_type)s, %(metric_of)s, %(metric_value)s,
%(metric_format)s, %(default_config)s, %(thumbnail)s, %(session_data)s,
%(card_info)s) RETURNING metric_id"""
%(card_info)s)
RETURNING metric_id"""
if len(data.series) > 0:
query = f"""WITH m AS ({query})
INSERT INTO metric_series(metric_id, index, name, filter)
@ -529,13 +525,13 @@ def get_all(project_id, user_id):
def delete_card(project_id, metric_id, user_id):
with pg_client.PostgresClient() as cur:
cur.execute(
cur.mogrify(""" \
cur.mogrify("""\
UPDATE public.metrics
SET deleted_at = timezone('utc'::text, now()),
edited_at = timezone('utc'::text, now())
SET deleted_at = timezone('utc'::text, now()), edited_at = timezone('utc'::text, now())
WHERE project_id = %(project_id)s
AND metric_id = %(metric_id)s
AND (user_id = %(user_id)s OR is_public) RETURNING data;""",
AND (user_id = %(user_id)s OR is_public)
RETURNING data;""",
{"metric_id": metric_id, "project_id": project_id, "user_id": user_id})
)
@ -624,8 +620,7 @@ def get_series_for_alert(project_id, user_id):
WHERE metrics.deleted_at ISNULL
AND metrics.project_id = %(project_id)s
AND metrics.metric_type = 'timeseries'
AND (user_id = %(user_id)s
OR is_public)
AND (user_id = %(user_id)s OR is_public)
ORDER BY name;""",
{"project_id": project_id, "user_id": user_id}
)
@ -637,7 +632,7 @@ def get_series_for_alert(project_id, user_id):
def change_state(project_id, metric_id, user_id, status):
with pg_client.PostgresClient() as cur:
cur.execute(
cur.mogrify(""" \
cur.mogrify("""\
UPDATE public.metrics
SET active = %(status)s
WHERE metric_id = %(metric_id)s
@ -679,8 +674,7 @@ def get_funnel_sessions_by_issue(user_id, project_id, metric_id, issue_id,
"issue": issue}
def make_chart_from_card(project: schemas.ProjectContext, user_id, metric_id,
data: schemas.CardSessionsSchema, for_dashboard: bool = False):
def make_chart_from_card(project: schemas.ProjectContext, user_id, metric_id, data: schemas.CardSessionsSchema):
raw_metric: dict = get_card(metric_id=metric_id, project_id=project.project_id, user_id=user_id, include_data=True)
if raw_metric is None:
@ -699,8 +693,7 @@ def make_chart_from_card(project: schemas.ProjectContext, user_id, metric_id,
return heatmaps.search_short_session(project_id=project.project_id,
data=schemas.HeatMapSessionsSearch(**metric.model_dump()),
user_id=user_id)
elif metric.metric_type == schemas.MetricType.PATH_ANALYSIS and for_dashboard:
metric.hide_excess = True
return get_chart(project=project, data=metric, user_id=user_id)

View file

@ -6,7 +6,7 @@ from chalicelib.utils import helper
from chalicelib.utils import sql_helper as sh
def filter_stages(stages: List[schemas.SessionSearchEventSchema]):
def filter_stages(stages: List[schemas.SessionSearchEventSchema2]):
ALLOW_TYPES = [schemas.EventType.CLICK, schemas.EventType.INPUT,
schemas.EventType.LOCATION, schemas.EventType.CUSTOM,
schemas.EventType.CLICK_MOBILE, schemas.EventType.INPUT_MOBILE,
@ -15,10 +15,10 @@ def filter_stages(stages: List[schemas.SessionSearchEventSchema]):
def __parse_events(f_events: List[dict]):
return [schemas.SessionSearchEventSchema.parse_obj(e) for e in f_events]
return [schemas.SessionSearchEventSchema2.parse_obj(e) for e in f_events]
def __fix_stages(f_events: List[schemas.SessionSearchEventSchema]):
def __fix_stages(f_events: List[schemas.SessionSearchEventSchema2]):
if f_events is None:
return
events = []

View file

@ -160,7 +160,7 @@ s.start_ts,
s.duration"""
def __get_1_url(location_condition: schemas.SessionSearchEventSchema | None, session_id: str, project_id: int,
def __get_1_url(location_condition: schemas.SessionSearchEventSchema2 | None, session_id: str, project_id: int,
start_time: int,
end_time: int) -> str | None:
full_args = {
@ -240,11 +240,11 @@ def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_i
value=[schemas.PlatformType.DESKTOP],
operator=schemas.SearchEventOperator.IS))
if not location_condition:
data.events.append(schemas.SessionSearchEventSchema(type=schemas.EventType.LOCATION,
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.LOCATION,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))
if no_click:
data.events.append(schemas.SessionSearchEventSchema(type=schemas.EventType.CLICK,
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.CLICK,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))

View file

@ -3,7 +3,7 @@ import logging
from decouple import config
import schemas
from chalicelib.core.events import events
from chalicelib.core import events
from chalicelib.core.metrics.modules import sessions, sessions_mobs
from chalicelib.utils import sql_helper as sh
@ -24,9 +24,8 @@ def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
"main_events.`$event_name` = 'CLICK'",
"isNotNull(JSON_VALUE(CAST(main_events.`$properties` AS String), '$.normalized_x'))"
]
if data.operator == schemas.SearchEventOperator.PATTERN:
constraints.append("match(main_events.`$properties`.url_path'.:String,%(url)s)")
elif data.operator == schemas.SearchEventOperator.IS:
if data.operator == schemas.SearchEventOperator.IS:
constraints.append("JSON_VALUE(CAST(main_events.`$properties` AS String), '$.url_path') = %(url)s")
else:
constraints.append("JSON_VALUE(CAST(main_events.`$properties` AS String), '$.url_path') ILIKE %(url)s")
@ -180,7 +179,7 @@ toUnixTimestamp(s.datetime)*1000 AS start_ts,
s.duration AS duration"""
def __get_1_url(location_condition: schemas.SessionSearchEventSchema | None, session_id: str, project_id: int,
def __get_1_url(location_condition: schemas.SessionSearchEventSchema2 | None, session_id: str, project_id: int,
start_time: int,
end_time: int) -> str | None:
full_args = {
@ -263,11 +262,11 @@ def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_i
value=[schemas.PlatformType.DESKTOP],
operator=schemas.SearchEventOperator.IS))
if not location_condition:
data.events.append(schemas.SessionSearchEventSchema(type=schemas.EventType.LOCATION,
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.LOCATION,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))
if no_click:
data.events.append(schemas.SessionSearchEventSchema(type=schemas.EventType.CLICK,
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.CLICK,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))

View file

@ -7,8 +7,7 @@ from typing import List
from psycopg2.extras import RealDictRow
import schemas
from chalicelib.core import metadata
from chalicelib.core.events import events
from chalicelib.core import events, metadata
from chalicelib.utils import pg_client, helper
from chalicelib.utils import sql_helper as sh
@ -77,10 +76,10 @@ def get_stages_and_events(filter_d: schemas.CardSeriesFilterSchema, project_id)
values["maxDuration"] = f.value[1]
elif filter_type == schemas.FilterType.REFERRER:
# events_query_part = events_query_part + f"INNER JOIN events.pages AS p USING(session_id)"
filter_extra_from = [f"INNER JOIN {"events.pages"} AS p USING(session_id)"]
filter_extra_from = [f"INNER JOIN {events.EventType.LOCATION.table} AS p USING(session_id)"]
first_stage_extra_constraints.append(
sh.multi_conditions(f"p.base_referrer {op} %({f_k})s", f.value, is_not=is_not, value_key=f_k))
elif filter_type == schemas.FilterType.METADATA:
elif filter_type == events.EventType.METADATA.ui_type:
if meta_keys is None:
meta_keys = metadata.get(project_id=project_id)
meta_keys = {m["key"]: m["index"] for m in meta_keys}
@ -122,31 +121,31 @@ def get_stages_and_events(filter_d: schemas.CardSeriesFilterSchema, project_id)
op = sh.get_sql_operator(s.operator)
# event_type = s["type"].upper()
event_type = s.type
if event_type == schemas.EventType.CLICK:
next_table = "events.clicks"
next_col_name = "label"
elif event_type == schemas.EventType.INPUT:
next_table = "events.inputs"
next_col_name = "label"
elif event_type == schemas.EventType.LOCATION:
next_table = "events.pages"
next_col_name = "path"
elif event_type == schemas.EventType.CUSTOM:
next_table = "events_common.customs"
next_col_name = "name"
if event_type == events.EventType.CLICK.ui_type:
next_table = events.EventType.CLICK.table
next_col_name = events.EventType.CLICK.column
elif event_type == events.EventType.INPUT.ui_type:
next_table = events.EventType.INPUT.table
next_col_name = events.EventType.INPUT.column
elif event_type == events.EventType.LOCATION.ui_type:
next_table = events.EventType.LOCATION.table
next_col_name = events.EventType.LOCATION.column
elif event_type == events.EventType.CUSTOM.ui_type:
next_table = events.EventType.CUSTOM.table
next_col_name = events.EventType.CUSTOM.column
# IOS --------------
elif event_type == schemas.EventType.CLICK_MOBILE:
next_table = "events_ios.taps"
next_col_name = "label"
elif event_type == schemas.EventType.INPUT_MOBILE:
next_table = "events_ios.inputs"
next_col_name = "label"
elif event_type == schemas.EventType.VIEW_MOBILE:
next_table = "events_ios.views"
next_col_name = "name"
elif event_type == schemas.EventType.CUSTOM_MOBILE:
next_table = "events_common.customs"
next_col_name = "name"
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
next_table = events.EventType.CLICK_MOBILE.table
next_col_name = events.EventType.CLICK_MOBILE.column
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
next_table = events.EventType.INPUT_MOBILE.table
next_col_name = events.EventType.INPUT_MOBILE.column
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
next_table = events.EventType.VIEW_MOBILE.table
next_col_name = events.EventType.VIEW_MOBILE.column
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
next_table = events.EventType.CUSTOM_MOBILE.table
next_col_name = events.EventType.CUSTOM_MOBILE.column
else:
logger.warning(f"=================UNDEFINED:{event_type}")
continue
@ -242,7 +241,7 @@ def get_simple_funnel(filter_d: schemas.CardSeriesFilterSchema, project: schemas
:return:
"""
stages: List[schemas.SessionSearchEventSchema] = filter_d.events
stages: List[schemas.SessionSearchEventSchema2] = filter_d.events
filters: List[schemas.SessionSearchFilterSchema] = filter_d.filters
stage_constraints = ["main.timestamp <= %(endTimestamp)s"]
@ -298,10 +297,10 @@ def get_simple_funnel(filter_d: schemas.CardSeriesFilterSchema, project: schemas
values["maxDuration"] = f.value[1]
elif filter_type == schemas.FilterType.REFERRER:
# events_query_part = events_query_part + f"INNER JOIN events.pages AS p USING(session_id)"
filter_extra_from = [f"INNER JOIN {"events.pages"} AS p USING(session_id)"]
filter_extra_from = [f"INNER JOIN {events.EventType.LOCATION.table} AS p USING(session_id)"]
first_stage_extra_constraints.append(
sh.multi_conditions(f"p.base_referrer {op} %({f_k})s", f.value, is_not=is_not, value_key=f_k))
elif filter_type == schemas.FilterType.METADATA:
elif filter_type == events.EventType.METADATA.ui_type:
if meta_keys is None:
meta_keys = metadata.get(project_id=project.project_id)
meta_keys = {m["key"]: m["index"] for m in meta_keys}
@ -343,31 +342,31 @@ def get_simple_funnel(filter_d: schemas.CardSeriesFilterSchema, project: schemas
op = sh.get_sql_operator(s.operator)
# event_type = s["type"].upper()
event_type = s.type
if event_type == schemas.EventType.CLICK:
next_table = "events.clicks"
next_col_name = "label"
elif event_type == schemas.EventType.INPUT:
next_table = "events.inputs"
next_col_name = "label"
elif event_type == schemas.EventType.LOCATION:
next_table = "events.pages"
next_col_name = "path"
elif event_type == schemas.EventType.CUSTOM:
next_table = "events_common.customs"
next_col_name = "name"
if event_type == events.EventType.CLICK.ui_type:
next_table = events.EventType.CLICK.table
next_col_name = events.EventType.CLICK.column
elif event_type == events.EventType.INPUT.ui_type:
next_table = events.EventType.INPUT.table
next_col_name = events.EventType.INPUT.column
elif event_type == events.EventType.LOCATION.ui_type:
next_table = events.EventType.LOCATION.table
next_col_name = events.EventType.LOCATION.column
elif event_type == events.EventType.CUSTOM.ui_type:
next_table = events.EventType.CUSTOM.table
next_col_name = events.EventType.CUSTOM.column
# IOS --------------
elif event_type == schemas.EventType.CLICK_MOBILE:
next_table = "events_ios.taps"
next_col_name = "label"
elif event_type == schemas.EventType.INPUT_MOBILE:
next_table = "events_ios.inputs"
next_col_name = "label"
elif event_type == schemas.EventType.VIEW_MOBILE:
next_table = "events_ios.views"
next_col_name = "name"
elif event_type == schemas.EventType.CUSTOM_MOBILE:
next_table = "events_common.customs"
next_col_name = "name"
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
next_table = events.EventType.CLICK_MOBILE.table
next_col_name = events.EventType.CLICK_MOBILE.column
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
next_table = events.EventType.INPUT_MOBILE.table
next_col_name = events.EventType.INPUT_MOBILE.column
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
next_table = events.EventType.VIEW_MOBILE.table
next_col_name = events.EventType.VIEW_MOBILE.column
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
next_table = events.EventType.CUSTOM_MOBILE.table
next_col_name = events.EventType.CUSTOM_MOBILE.column
else:
logger.warning(f"=================UNDEFINED:{event_type}")
continue

View file

@ -8,14 +8,14 @@ from chalicelib.utils import ch_client
from chalicelib.utils import exp_ch_helper
from chalicelib.utils import helper
from chalicelib.utils import sql_helper as sh
from chalicelib.core.events import events
from chalicelib.core import events
logger = logging.getLogger(__name__)
def get_simple_funnel(filter_d: schemas.CardSeriesFilterSchema, project: schemas.ProjectContext,
metric_format: schemas.MetricExtendedFormatType) -> List[RealDictRow]:
stages: List[schemas.SessionSearchEventSchema] = filter_d.events
stages: List[schemas.SessionSearchEventSchema2] = filter_d.events
filters: List[schemas.SessionSearchFilterSchema] = filter_d.filters
platform = project.platform
constraints = ["e.project_id = %(project_id)s",
@ -82,7 +82,7 @@ def get_simple_funnel(filter_d: schemas.CardSeriesFilterSchema, project: schemas
elif filter_type == schemas.FilterType.REFERRER:
constraints.append(
sh.multi_conditions(f"s.base_referrer {op} %({f_k})s", f.value, is_not=is_not, value_key=f_k))
elif filter_type == schemas.FilterType.METADATA:
elif filter_type == events.EventType.METADATA.ui_type:
if meta_keys is None:
meta_keys = metadata.get(project_id=project.project_id)
meta_keys = {m["key"]: m["index"] for m in meta_keys}
@ -125,29 +125,29 @@ def get_simple_funnel(filter_d: schemas.CardSeriesFilterSchema, project: schemas
e_k = f"e_value{i}"
event_type = s.type
next_event_type = exp_ch_helper.get_event_type(event_type, platform=platform)
if event_type == schemas.EventType.CLICK:
if event_type == events.EventType.CLICK.ui_type:
if platform == "web":
next_col_name = "label"
next_col_name = events.EventType.CLICK.column
if not is_any:
if schemas.ClickEventExtraOperator.has_value(s.operator):
specific_condition = sh.multi_conditions(f"selector {op} %({e_k})s", s.value, value_key=e_k)
else:
next_col_name = "label"
elif event_type == schemas.EventType.INPUT:
next_col_name = "label"
elif event_type == schemas.EventType.LOCATION:
next_col_name = events.EventType.CLICK_MOBILE.column
elif event_type == events.EventType.INPUT.ui_type:
next_col_name = events.EventType.INPUT.column
elif event_type == events.EventType.LOCATION.ui_type:
next_col_name = 'url_path'
elif event_type == schemas.EventType.CUSTOM:
next_col_name = "name"
elif event_type == events.EventType.CUSTOM.ui_type:
next_col_name = events.EventType.CUSTOM.column
# IOS --------------
elif event_type == schemas.EventType.CLICK_MOBILE:
next_col_name = "label"
elif event_type == schemas.EventType.INPUT_MOBILE:
next_col_name = "label"
elif event_type == schemas.EventType.VIEW_MOBILE:
next_col_name = "name"
elif event_type == schemas.EventType.CUSTOM_MOBILE:
next_col_name = "name"
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
next_col_name = events.EventType.CLICK_MOBILE.column
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
next_col_name = events.EventType.INPUT_MOBILE.column
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
next_col_name = events.EventType.VIEW_MOBILE.column
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
next_col_name = events.EventType.CUSTOM_MOBILE.column
else:
logger.warning(f"=================UNDEFINED:{event_type}")
continue

View file

@ -0,0 +1,14 @@
from chalicelib.utils.ch_client import ClickHouseClient
def search_events(project_id: int, data: dict):
with ClickHouseClient() as ch_client:
r = ch_client.format(
"""SELECT *
FROM taha.events
WHERE project_id=%(project_id)s
ORDER BY created_at;""",
params={"project_id": project_id})
x = ch_client.execute(r)
return x

View file

@ -1,59 +0,0 @@
from typing import Optional
from chalicelib.utils import helper
from chalicelib.utils.ch_client import ClickHouseClient
def search_events(project_id: int, q: Optional[str] = None):
with ClickHouseClient() as ch_client:
full_args = {"project_id": project_id, "limit": 20}
constraints = ["project_id = %(project_id)s",
"_timestamp >= now()-INTERVAL 1 MONTH"]
if q:
constraints += ["value ILIKE %(q)s"]
full_args["q"] = helper.string_to_sql_like(q)
query = ch_client.format(
f"""SELECT value,data_count
FROM product_analytics.autocomplete_events_grouped
WHERE {" AND ".join(constraints)}
ORDER BY data_count DESC
LIMIT %(limit)s;""",
parameters=full_args)
rows = ch_client.execute(query)
return {"values": helper.list_to_camel_case(rows), "_src": 2}
def search_properties(project_id: int, property_name: Optional[str] = None, event_name: Optional[str] = None,
q: Optional[str] = None):
with ClickHouseClient() as ch_client:
select = "value, data_count"
grouping = ""
full_args = {"project_id": project_id, "limit": 20,
"event_name": event_name, "property_name": property_name,
"q_l": helper.string_to_sql_like(q)}
constraints = ["project_id = %(project_id)s",
"_timestamp >= now()-INTERVAL 1 MONTH",
"property_name = %(property_name)s"]
if event_name:
constraints += ["event_name = %(event_name)s"]
else:
select = "value, sum(aepg.data_count) AS data_count"
grouping = "GROUP BY 1"
if q:
constraints += ["value ILIKE %(q_l)s"]
query = ch_client.format(
f"""SELECT {select}
FROM product_analytics.autocomplete_event_properties_grouped AS aepg
WHERE {" AND ".join(constraints)}
{grouping}
ORDER BY data_count DESC
LIMIT %(limit)s;""",
parameters=full_args)
rows = ch_client.execute(query)
return {"events": helper.list_to_camel_case(rows), "_src": 2}

View file

@ -1,180 +0,0 @@
import logging
import schemas
from chalicelib.utils import helper
from chalicelib.utils import sql_helper as sh
from chalicelib.utils.ch_client import ClickHouseClient
from chalicelib.utils.exp_ch_helper import get_sub_condition, get_col_cast
logger = logging.getLogger(__name__)
PREDEFINED_EVENTS = [
"CLICK",
"INPUT",
"LOCATION",
"ERROR",
"REQUEST"
]
def get_events(project_id: int):
with ClickHouseClient() as ch_client:
r = ch_client.format(
""" \
SELECT DISTINCT ON (event_name, auto_captured)
COUNT(1) OVER () AS total,
event_name AS name, display_name, description,
auto_captured
FROM product_analytics.all_events
WHERE project_id=%(project_id)s
ORDER BY auto_captured, display_name, event_name;""",
parameters={"project_id": project_id})
rows = ch_client.execute(r)
if len(rows) == 0:
return {"total": len(PREDEFINED_EVENTS), "list": [{
"name": e,
"displayName": "",
"description": "",
"autoCaptured": True,
"id": "event_0",
"dataType": "string",
"possibleTypes": [
"string"
],
"_foundInPredefinedList": False
} for e in PREDEFINED_EVENTS]}
total = rows[0]["total"]
rows = helper.list_to_camel_case(rows)
for i, row in enumerate(rows):
row["id"] = f"event_{i}"
row["dataType"] = "string"
row["possibleTypes"] = ["string"]
row["_foundInPredefinedList"] = True
row.pop("total")
keys = [r["name"] for r in rows]
for e in PREDEFINED_EVENTS:
if e not in keys:
total += 1
rows.append({
"name": e,
"displayName": "",
"description": "",
"autoCaptured": True,
"id": "event_0",
"dataType": "string",
"possibleTypes": [
"string"
],
"_foundInPredefinedList": False
})
return {"total": total, "list": rows}
def search_events(project_id: int, data: schemas.EventsSearchPayloadSchema):
with ClickHouseClient() as ch_client:
full_args = {"project_id": project_id, "startDate": data.startTimestamp, "endDate": data.endTimestamp,
"projectId": project_id, "limit": data.limit, "offset": (data.page - 1) * data.limit}
constraints = ["project_id = %(projectId)s",
"created_at >= toDateTime(%(startDate)s/1000)",
"created_at <= toDateTime(%(endDate)s/1000)"]
ev_constraints = []
for i, f in enumerate(data.filters):
if not f.is_event:
f.value = helper.values_for_operator(value=f.value, op=f.operator)
f_k = f"f_value{i}"
full_args = {**full_args, f_k: sh.single_value(f.value), **sh.multi_values(f.value, value_key=f_k)}
is_any = sh.isAny_opreator(f.operator)
is_undefined = sh.isUndefined_operator(f.operator)
if f.is_predefined:
column = f.name
else:
column = f"properties.{f.name}"
if is_any:
condition = f"notEmpty{column})"
elif is_undefined:
condition = f"empty({column})"
else:
condition = sh.multi_conditions(
get_sub_condition(col_name=column, val_name=f_k, operator=f.operator),
values=f.value, value_key=f_k)
constraints.append(condition)
else:
e_k = f"e_value{i}"
full_args = {**full_args, e_k: f.name}
condition = f"`$event_name` = %({e_k})s"
sub_conditions = []
for j, ef in enumerate(f.properties.filters):
p_k = f"e_{i}_p_{j}"
full_args = {**full_args, **sh.multi_values(ef.value, value_key=p_k, data_type=ef.data_type)}
cast = get_col_cast(data_type=ef.data_type, value=ef.value)
if ef.is_predefined:
sub_condition = get_sub_condition(col_name=f"accurateCastOrNull(`{ef.name}`,'{cast}')",
val_name=p_k, operator=ef.operator)
else:
sub_condition = get_sub_condition(col_name=f"accurateCastOrNull(properties.`{ef.name}`,{cast})",
val_name=p_k, operator=ef.operator)
sub_conditions.append(sh.multi_conditions(sub_condition, ef.value, value_key=p_k))
if len(sub_conditions) > 0:
condition += " AND (" + (" " + f.properties.operator + " ").join(sub_conditions) + ")"
ev_constraints.append(condition)
constraints.append("(" + " OR ".join(ev_constraints) + ")")
query = ch_client.format(
f"""SELECT COUNT(1) OVER () AS total,
event_id,
`$event_name`,
created_at,
`distinct_id`,
`$browser`,
`$import`,
`$os`,
`$country`,
`$state`,
`$city`,
`$screen_height`,
`$screen_width`,
`$source`,
`$user_id`,
`$device`
FROM product_analytics.events
WHERE {" AND ".join(constraints)}
ORDER BY created_at
LIMIT %(limit)s OFFSET %(offset)s;""",
parameters=full_args)
rows = ch_client.execute(query)
if len(rows) == 0:
return {"total": 0, "rows": [], "_src": 2}
total = rows[0]["total"]
for r in rows:
r.pop("total")
return {"total": total, "rows": rows, "_src": 2}
def get_lexicon(project_id: int, page: schemas.PaginatedSchema):
with ClickHouseClient() as ch_client:
r = ch_client.format(
"""SELECT COUNT(1) OVER () AS total, all_events.event_name AS name,
*
FROM product_analytics.all_events
WHERE project_id = %(project_id)s
ORDER BY display_name
LIMIT %(limit)s
OFFSET %(offset)s;""",
parameters={"project_id": project_id, "limit": page.limit, "offset": (page.page - 1) * page.limit})
rows = ch_client.execute(r)
if len(rows) == 0:
return {"total": 0, "list": []}
total = rows[0]["total"]
rows = helper.list_to_camel_case(rows)
for i, row in enumerate(rows):
row["id"] = f"event_{i}"
row["dataType"] = "string"
row["possibleTypes"] = ["string"]
row["_foundInPredefinedList"] = True
row.pop("total")
return {"total": total, "list": rows}

View file

@ -1,156 +0,0 @@
import schemas
def get_sessions_filters(project_id: int):
return {"total": 13,
"list": [
{
"id": "sf_1",
"name": schemas.FilterType.REFERRER,
"displayName": "Referrer",
"possibleTypes": [
"String"
],
"autoCaptured": True
},
{
"id": "sf_2",
"name": schemas.FilterType.DURATION,
"displayName": "Duration",
"possibleTypes": [
"int"
],
"autoCaptured": True
},
{
"id": "sf_3",
"name": schemas.FilterType.UTM_SOURCE,
"displayName": "UTM Source",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_4",
"name": schemas.FilterType.UTM_MEDIUM,
"displayName": "UTM Medium",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_5",
"name": schemas.FilterType.UTM_CAMPAIGN,
"displayName": "UTM Campaign",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_6",
"name": schemas.FilterType.USER_COUNTRY,
"displayName": "Country",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_7",
"name": schemas.FilterType.USER_CITY,
"displayName": "City",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_8",
"name": schemas.FilterType.USER_STATE,
"displayName": "State / Province",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_9",
"name": schemas.FilterType.USER_OS,
"displayName": "OS",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_10",
"name": schemas.FilterType.USER_BROWSER,
"displayName": "Browser",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_11",
"name": schemas.FilterType.USER_DEVICE,
"displayName": "Device",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_12",
"name": schemas.FilterType.PLATFORM,
"displayName": "Platform",
"possibleTypes": [
"string"
],
"autoCaptured": True
},
{
"id": "sf_13",
"name": schemas.FilterType.REV_ID,
"displayName": "Version ID",
"possibleTypes": [
"string"
],
"autoCaptured": True
}
]}
def get_users_filters(project_id: int):
return {"total": 2,
"list": [
{
"id": "uf_1",
"name": schemas.FilterType.USER_ID,
"displayName": "User ID",
"possibleTypes": [
"string"
],
"autoCaptured": False
},
{
"id": "uf_2",
"name": schemas.FilterType.USER_ANONYMOUS_ID,
"displayName": "User Anonymous ID",
"possibleTypes": [
"string"
],
"autoCaptured": False
}
]}
def get_global_filters(project_id: int):
r = get_sessions_filters(project_id)
r = r["list"]
for f in r:
f["defaultProperty"] = False
return r
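Sketch: global filters are simply the session filters with defaultProperty forced to False.
g = get_global_filters(project_id=1)
assert all(f["defaultProperty"] is False for f in g)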

View file

@ -1,174 +0,0 @@
import schemas
from chalicelib.utils import helper, exp_ch_helper
from chalicelib.utils.ch_client import ClickHouseClient
PREDEFINED_PROPERTIES = {
"label": "String",
"hesitation_time": "UInt32",
"name": "String",
"payload": "String",
"level": "Enum8",
"source": "Enum8",
"message": "String",
"error_id": "String",
"duration": "UInt16",
"context": "Enum8",
"url_host": "String",
"url_path": "String",
"url_hostpath": "String",
"request_start": "UInt16",
"response_start": "UInt16",
"response_end": "UInt16",
"dom_content_loaded_event_start": "UInt16",
"dom_content_loaded_event_end": "UInt16",
"load_event_start": "UInt16",
"load_event_end": "UInt16",
"first_paint": "UInt16",
"first_contentful_paint_time": "UInt16",
"speed_index": "UInt16",
"visually_complete": "UInt16",
"time_to_interactive": "UInt16",
"ttfb": "UInt16",
"ttlb": "UInt16",
"response_time": "UInt16",
"dom_building_time": "UInt16",
"dom_content_loaded_event_time": "UInt16",
"load_event_time": "UInt16",
"min_fps": "UInt8",
"avg_fps": "UInt8",
"max_fps": "UInt8",
"min_cpu": "UInt8",
"avg_cpu": "UInt8",
"max_cpu": "UInt8",
"min_total_js_heap_size": "UInt64",
"avg_total_js_heap_size": "UInt64",
"max_total_js_heap_size": "UInt64",
"min_used_js_heap_size": "UInt64",
"avg_used_js_heap_size": "UInt64",
"max_used_js_heap_size": "UInt64",
"method": "Enum8",
"status": "UInt16",
"success": "UInt8",
"request_body": "String",
"response_body": "String",
"transfer_size": "UInt32",
"selector": "String",
"normalized_x": "Float32",
"normalized_y": "Float32",
"message_id": "UInt64"
}
EVENT_DEFAULT_PROPERTIES = {
"CLICK": "label",
"INPUT": "label",
"LOCATION": "url_path",
"ERROR": "name",
"REQUEST": "url_path"
}
def get_all_properties(project_id: int):
with ClickHouseClient() as ch_client:
r = ch_client.format(
"""SELECT COUNT(1) OVER () AS total, property_name AS name,
display_name,
array_agg(DISTINCT event_properties.value_type) AS possible_types
FROM product_analytics.all_properties
LEFT JOIN product_analytics.event_properties USING (project_id, property_name)
WHERE all_properties.project_id = %(project_id)s
GROUP BY property_name, display_name
ORDER BY display_name, property_name;""",
parameters={"project_id": project_id})
properties = ch_client.execute(r)
if len(properties) == 0:
return {"total": 0, "list": []}
total = properties[0]["total"]
properties = helper.list_to_camel_case(properties)
for i, p in enumerate(properties):
p["id"] = f"prop_{i}"
p["_foundInPredefinedList"] = False
if p["name"] in PREDEFINED_PROPERTIES:
p["dataType"] = exp_ch_helper.simplify_clickhouse_type(PREDEFINED_PROPERTIES[p["name"]])
p["_foundInPredefinedList"] = True
p["possibleTypes"] = list(set(exp_ch_helper.simplify_clickhouse_types(p["possibleTypes"])))
p.pop("total")
keys = [p["name"] for p in properties]
for p in PREDEFINED_PROPERTIES:
if p not in keys:
total += 1
properties.append({
"name": p,
"displayName": "",
"possibleTypes": [
],
"id": f"prop_{len(properties) + 1}",
"_foundInPredefinedList": False,
"dataType": PREDEFINED_PROPERTIES[p]
})
return {"total": total, "list": properties}
def get_event_properties(project_id: int, event_name: str, auto_captured: bool):
with ClickHouseClient() as ch_client:
r = ch_client.format(
"""SELECT all_properties.property_name AS name,
all_properties.display_name,
array_agg(DISTINCT event_properties.value_type) AS possible_types
FROM product_analytics.event_properties
INNER JOIN product_analytics.all_properties USING (property_name)
WHERE event_properties.project_id = %(project_id)s
AND all_properties.project_id = %(project_id)s
AND event_properties.event_name = %(event_name)s
AND event_properties.auto_captured = %(auto_captured)s
GROUP BY ALL
ORDER BY 1;""",
parameters={"project_id": project_id, "event_name": event_name, "auto_captured": auto_captured})
properties = ch_client.execute(r)
properties = helper.list_to_camel_case(properties)
for i, p in enumerate(properties):
p["id"] = f"prop_{i}"
p["_foundInPredefinedList"] = False
if p["name"] in PREDEFINED_PROPERTIES:
p["dataType"] = exp_ch_helper.simplify_clickhouse_type(PREDEFINED_PROPERTIES[p["name"]])
p["_foundInPredefinedList"] = True
p["possibleTypes"] = list(set(exp_ch_helper.simplify_clickhouse_types(p["possibleTypes"])))
p["defaultProperty"] = auto_captured and event_name in EVENT_DEFAULT_PROPERTIES \
and p["name"] == EVENT_DEFAULT_PROPERTIES[event_name]
return properties
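Sketch of the defaultProperty rule above: for auto-captured events, the property named in EVENT_DEFAULT_PROPERTIES is flagged as the default (e.g. "label" for CLICK).
props = get_event_properties(project_id=1, event_name="CLICK", auto_captured=True)
# any returned row with name == "label" has defaultProperty == True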
def get_lexicon(project_id: int, page: schemas.PaginatedSchema):
with ClickHouseClient() as ch_client:
r = ch_client.format(
"""SELECT COUNT(1) OVER () AS total, all_properties.property_name AS name,
all_properties.*,
possible_types.values AS possible_types,
possible_values.values AS sample_values
FROM product_analytics.all_properties
LEFT JOIN (SELECT project_id, property_name, array_agg(DISTINCT value_type) AS
values
FROM product_analytics.event_properties
WHERE project_id=%(project_id)s
GROUP BY 1, 2) AS possible_types
USING (project_id, property_name)
LEFT JOIN (SELECT project_id, property_name, array_agg(DISTINCT value) AS
values
FROM product_analytics.property_values_samples
WHERE project_id=%(project_id)s
GROUP BY 1, 2) AS possible_values USING (project_id, property_name)
WHERE project_id = %(project_id)s
ORDER BY display_name
LIMIT %(limit)s
OFFSET %(offset)s;""",
parameters={"project_id": project_id,
"limit": page.limit,
"offset": (page.page - 1) * page.limit})
properties = ch_client.execute(r)
if len(properties) == 0:
return {"total": 0, "list": []}
total = properties[0]["total"]
for i, p in enumerate(properties):
p["id"] = f"prop_{i}"
p.pop("total")
return {"total": total, "list": helper.list_to_camel_case(properties)}

View file

@ -6,18 +6,8 @@ logger = logging.getLogger(__name__)
from . import sessions_pg
from . import sessions_pg as sessions_legacy
from . import sessions_ch
from . import sessions_search_pg
from . import sessions_search_pg as sessions_search_legacy
if config("EXP_SESSIONS_SEARCH", cast=bool, default=False):
logger.info(">>> Using experimental sessions search")
if config("EXP_METRICS", cast=bool, default=False):
from . import sessions_ch as sessions
from . import sessions_search_ch as sessions_search
else:
from . import sessions_pg as sessions
from . import sessions_search_pg as sessions_search
# if config("EXP_METRICS", cast=bool, default=False):
# from . import sessions_ch as sessions
# else:
# from . import sessions_pg as sessions

View file

@ -2,12 +2,10 @@ import logging
from typing import List, Union
import schemas
from chalicelib.core import metadata
from chalicelib.core.events import events
from chalicelib.core import events, metadata
from . import performance_event, sessions_legacy
from chalicelib.utils import pg_client, helper, metrics_helper, ch_client, exp_ch_helper
from chalicelib.utils import sql_helper as sh
from chalicelib.utils.exp_ch_helper import get_sub_condition, get_col_cast
logger = logging.getLogger(__name__)
@ -50,8 +48,8 @@ def search2_series(data: schemas.SessionsSearchPayloadSchema, project_id: int, d
query = f"""SELECT gs.generate_series AS timestamp,
COALESCE(COUNT(DISTINCT processed_sessions.user_id),0) AS count
FROM generate_series(%(startDate)s, %(endDate)s, %(step_size)s) AS gs
LEFT JOIN (SELECT multiIf(isNotNull(s.user_id) AND notEmpty(s.user_id), s.user_id,
isNotNull(s.user_anonymous_id) AND notEmpty(s.user_anonymous_id),
LEFT JOIN (SELECT multiIf(s.user_id IS NOT NULL AND s.user_id != '', s.user_id,
s.user_anonymous_id IS NOT NULL AND s.user_anonymous_id != '',
s.user_anonymous_id, toString(s.user_uuid)) AS user_id,
s.datetime AS datetime
{query_part}) AS processed_sessions ON(TRUE)
@ -150,7 +148,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
for e in data.events:
if e.type == schemas.EventType.LOCATION:
if e.operator not in extra_conditions:
extra_conditions[e.operator] = schemas.SessionSearchEventSchema(**{
extra_conditions[e.operator] = schemas.SessionSearchEventSchema2.model_validate({
"type": e.type,
"isEvent": True,
"value": [],
@ -175,7 +173,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
for e in data.events:
if e.type == schemas.EventType.REQUEST_DETAILS:
if e.operator not in extra_conditions:
extra_conditions[e.operator] = schemas.SessionSearchEventSchema(**{
extra_conditions[e.operator] = schemas.SessionSearchEventSchema2.model_validate({
"type": e.type,
"isEvent": True,
"value": [],
@ -240,10 +238,8 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
main_query = f"""SELECT COUNT(DISTINCT {main_col}) OVER () AS main_count,
{main_col} AS name,
count(DISTINCT session_id) AS total,
any(total_count) as total_count
FROM (SELECT s.session_id AS session_id,
count(DISTINCT s.session_id) OVER () AS total_count
{extra_col}
COALESCE(SUM(count(DISTINCT session_id)) OVER (), 0) AS total_count
FROM (SELECT s.session_id AS session_id {extra_col}
{query_part}) AS filtred_sessions
{extra_where}
GROUP BY {main_col}
@ -253,13 +249,11 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
main_query = f"""SELECT COUNT(DISTINCT {main_col}) OVER () AS main_count,
{main_col} AS name,
count(DISTINCT user_id) AS total,
any(total_count) AS total_count
FROM (SELECT s.user_id AS user_id,
count(DISTINCT s.user_id) OVER () AS total_count
{extra_col}
COALESCE(SUM(count(DISTINCT user_id)) OVER (), 0) AS total_count
FROM (SELECT s.user_id AS user_id {extra_col}
{query_part}
WHERE isNotNull(user_id)
AND notEmpty(user_id)) AS filtred_sessions
AND user_id != '') AS filtred_sessions
{extra_where}
GROUP BY {main_col}
ORDER BY total DESC
@ -283,7 +277,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
return sessions
def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema):
def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema2):
return not (not is_any and len(event.value) == 0 and event.type not in [schemas.EventType.REQUEST_DETAILS,
schemas.EventType.GRAPHQL] \
or event.type in [schemas.PerformanceEventType.LOCATION_DOM_COMPLETE,
@ -336,11 +330,7 @@ def json_condition(table_alias, json_column, json_key, op, values, value_key, ch
extract_func = "JSONExtractFloat" if numeric_type == "float" else "JSONExtractInt"
condition = f"{extract_func}(toString({table_alias}.`{json_column}`), '{json_key}') {op} %({value_key})s"
else:
# condition = f"JSONExtractString(toString({table_alias}.`{json_column}`), '{json_key}') {op} %({value_key})s"
condition = get_sub_condition(
col_name=f"JSONExtractString(toString({table_alias}.`{json_column}`), '{json_key}')",
val_name=value_key, operator=op
)
condition = f"JSONExtractString(toString({table_alias}.`{json_column}`), '{json_key}') {op} %({value_key})s"
conditions.append(sh.multi_conditions(condition, values, value_key=value_key))
@ -383,34 +373,6 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions_where = ["main.project_id = %(projectId)s",
"main.created_at >= toDateTime(%(startDate)s/1000)",
"main.created_at <= toDateTime(%(endDate)s/1000)"]
any_incident = False
for i, e in enumerate(data.events):
if e.type == schemas.EventType.INCIDENT and e.operator == schemas.SearchEventOperator.IS_ANY:
any_incident = True
data.events.pop(i)
# don't stop here because we could have multiple filters looking for any incident
if any_incident:
any_incident = False
for f in data.filters:
if f.type == schemas.FilterType.ISSUE:
any_incident = True
if f.value.index(schemas.IssueType.INCIDENT) < 0:
f.value.append(schemas.IssueType.INCIDENT)
if f.operator == schemas.SearchEventOperator.IS_ANY:
f.operator = schemas.SearchEventOperator.IS
break
if not any_incident:
data.filters.append(schemas.SessionSearchFilterSchema(**{
"type": "issue",
"isEvent": False,
"value": [
"incident"
],
"operator": "is"
}))
if len(data.filters) > 0:
meta_keys = None
        # include a sub-query of sessions inside the events query, in order to reduce the selected data
@ -554,7 +516,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
ss_constraints.append(
sh.multi_conditions(f"ms.base_referrer {op} toString(%({f_k})s)", f.value, is_not=is_not,
value_key=f_k))
elif filter_type == schemas.FilterType.METADATA:
elif filter_type == events.EventType.METADATA.ui_type:
# get metadata list only if you need it
if meta_keys is None:
meta_keys = metadata.get(project_id=project_id)
@ -698,60 +660,39 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event.value = helper.values_for_operator(value=event.value, op=event.operator)
full_args = {**full_args,
**sh.multi_values(event.value, value_key=e_k),
**sh.multi_values(event.source, value_key=s_k),
e_k: event.value[0] if len(event.value) > 0 else event.value}
**sh.multi_values(event.source, value_key=s_k)}
if event_type == schemas.EventType.CLICK:
if event_type == events.EventType.CLICK.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
if platform == "web":
_column = "label"
_column = events.EventType.CLICK.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if schemas.ClickEventExtraOperator.has_value(event.operator):
# event_where.append(json_condition(
# "main",
# "$properties",
# "selector", op, event.value, e_k)
# )
event_where.append(
sh.multi_conditions(
get_sub_condition(col_name=f"main.`$properties`.selector",
val_name=e_k, operator=event.operator),
event.value, value_key=e_k)
event_where.append(json_condition(
"main",
"$properties",
"selector", op, event.value, e_k)
)
events_conditions[-1]["condition"] = event_where[-1]
else:
if is_not:
# event_where.append(json_condition(
# "sub", "$properties", _column, op, event.value, e_k
# ))
event_where.append(
sh.multi_conditions(
get_sub_condition(col_name=f"sub.`$properties`.{_column}",
val_name=e_k, operator=event.operator),
event.value, value_key=e_k)
)
event_where.append(json_condition(
"sub", "$properties", _column, op, event.value, e_k
))
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"
}
)
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
# event_where.append(
# json_condition("main", "$properties", _column, op, event.value, e_k)
# )
event_where.append(
sh.multi_conditions(
get_sub_condition(col_name=f"main.`$properties`.{_column}",
val_name=e_k, operator=event.operator),
event.value, value_key=e_k)
json_condition("main", "$properties", _column, op, event.value, e_k)
)
events_conditions[-1]["condition"] = event_where[-1]
else:
_column = "label"
_column = events.EventType.CLICK_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -770,10 +711,10 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
)
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.INPUT:
elif event_type == events.EventType.INPUT.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
if platform == "web":
_column = "label"
_column = events.EventType.INPUT.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -798,7 +739,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
full_args = {**full_args, **sh.multi_values(event.source, value_key=f"custom{i}")}
else:
_column = "label"
_column = events.EventType.INPUT_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -818,7 +759,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.LOCATION:
elif event_type == events.EventType.LOCATION.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
if platform == "web":
_column = 'url_path'
@ -840,7 +781,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
)
events_conditions[-1]["condition"] = event_where[-1]
else:
_column = "name"
_column = events.EventType.VIEW_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -857,9 +798,9 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event_where.append(sh.multi_conditions(f"main.{_column} {op} %({e_k})s",
event.value, value_key=e_k))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.CUSTOM:
elif event_type == events.EventType.CUSTOM.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = "name"
_column = events.EventType.CUSTOM.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -877,7 +818,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
"main", "$properties", _column, op, event.value, e_k
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.REQUEST:
elif event_type == events.EventType.REQUEST.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'
event_where.append(
@ -898,9 +839,9 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.STATE_ACTION:
elif event_type == events.EventType.STATEACTION.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = "name"
_column = events.EventType.STATEACTION.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -919,7 +860,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
))
events_conditions[-1]["condition"] = event_where[-1]
# TODO: isNot for ERROR
elif event_type == schemas.EventType.ERROR:
elif event_type == events.EventType.ERROR.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main"
events_extra_join = f"SELECT * FROM {MAIN_EVENTS_TABLE} AS main1 WHERE main1.project_id=%(project_id)s"
event_where.append(
@ -929,23 +870,20 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = []
if not is_any and event.value not in [None, "*", ""]:
event_where.append(
sh.multi_conditions(
f"(toString(main1.`$properties`.message) {op} %({e_k})s OR toString(main1.`$properties`.name) {op} %({e_k})s)",
sh.multi_conditions(f"(toString(main1.`$properties`.message) {op} %({e_k})s OR toString(main1.`$properties`.name) {op} %({e_k})s)",
event.value, value_key=e_k))
events_conditions[-1]["condition"].append(event_where[-1])
events_extra_join += f" AND {event_where[-1]}"
if len(event.source) > 0 and event.source[0] not in [None, "*", ""]:
event_where.append(
sh.multi_conditions(f"toString(main1.`$properties`.source) = %({s_k})s", event.source,
value_key=s_k))
event_where.append(sh.multi_conditions(f"toString(main1.`$properties`.source) = %({s_k})s", event.source, value_key=s_k))
events_conditions[-1]["condition"].append(event_where[-1])
events_extra_join += f" AND {event_where[-1]}"
events_conditions[-1]["condition"] = " AND ".join(events_conditions[-1]["condition"])
# ----- Mobile
elif event_type == schemas.EventType.CLICK_MOBILE:
_column = "label"
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
_column = events.EventType.CLICK_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -963,8 +901,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
"main", "$properties", _column, op, event.value, e_k
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.INPUT_MOBILE:
_column = "label"
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
_column = events.EventType.INPUT_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -982,8 +920,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
"main", "$properties", _column, op, event.value, e_k
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.VIEW_MOBILE:
_column = "name"
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
_column = events.EventType.VIEW_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -1001,8 +939,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
"main", "$properties", _column, op, event.value, e_k
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.CUSTOM_MOBILE:
_column = "name"
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
_column = events.EventType.CUSTOM_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -1021,7 +959,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.REQUEST_MOBILE:
elif event_type == events.EventType.REQUEST_MOBILE.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'
event_where.append(
@ -1041,8 +979,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
"main", "$properties", _column, op, event.value, e_k
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.ERROR_MOBILE:
_column = "name"
elif event_type == events.EventType.CRASH_MOBILE.ui_type:
_column = events.EventType.CRASH_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -1061,8 +999,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
"main", "$properties", _column, op, event.value, e_k
))
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.SWIPE_MOBILE and platform != "web":
_column = "label"
elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform != "web":
_column = events.EventType.SWIPE_MOBILE.column
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
@ -1263,7 +1201,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
full_args = {**full_args, **sh.multi_values(f.value, value_key=e_k_f)}
if f.type == schemas.GraphqlFilterType.GRAPHQL_NAME:
event_where.append(json_condition(
"main", "$properties", "name", op, f.value, e_k_f
"main", "$properties", events.EventType.GRAPHQL.column, op, f.value, e_k_f
))
events_conditions[-1]["condition"].append(event_where[-1])
elif f.type == schemas.GraphqlFilterType.GRAPHQL_METHOD:
@ -1284,92 +1222,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
else:
logging.warning(f"undefined GRAPHQL filter: {f.type}")
events_conditions[-1]["condition"] = " AND ".join(events_conditions[-1]["condition"])
elif event_type == schemas.EventType.EVENT:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = "label"
event_where.append(f"main.`$event_name`=%({e_k})s AND main.session_id>0")
events_conditions.append({"type": event_where[-1], "condition": ""})
elif event_type == schemas.EventType.INCIDENT:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = "label"
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if is_not:
event_where.append(
sh.multi_conditions(
get_sub_condition(col_name=f"sub.`$properties`.{_column}",
val_name=e_k, operator=event.operator),
event.value, value_key=e_k)
)
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"
}
)
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(
sh.multi_conditions(
get_sub_condition(col_name=f"main.`$properties`.{_column}",
val_name=e_k, operator=event.operator),
event.value, value_key=e_k)
)
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == schemas.EventType.CLICK_COORDINATES:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
event_where.append(
f"main.`$event_name`='{exp_ch_helper.get_event_type(schemas.EventType.CLICK, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if is_not:
event_where.append(
sh.coordinate_conditions(
condition_x=f"sub.`$properties`.normalized_x",
condition_y=f"sub.`$properties`.normalized_y",
values=event.value, value_key=e_k, is_not=True)
)
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(schemas.EventType.CLICK, platform=platform)}'"
}
)
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(
sh.coordinate_conditions(
condition_x=f"main.`$properties`.normalized_x",
condition_y=f"main.`$properties`.normalized_y",
values=event.value, value_key=e_k, is_not=True)
)
events_conditions[-1]["condition"] = event_where[-1]
else:
continue
if event.properties is not None and len(event.properties.filters) > 0:
sub_conditions = []
for l, property in enumerate(event.properties.filters):
a_k = f"{e_k}_att_{l}"
full_args = {**full_args,
**sh.multi_values(property.value, value_key=a_k, data_type=property.data_type)}
cast = get_col_cast(data_type=property.data_type, value=property.value)
if property.is_predefined:
condition = get_sub_condition(col_name=f"accurateCastOrNull(main.`{property.name}`,'{cast}')",
val_name=a_k, operator=property.operator)
else:
condition = get_sub_condition(
col_name=f"accurateCastOrNull(main.properties.`{property.name}`,'{cast}')",
val_name=a_k, operator=property.operator)
event_where.append(
sh.multi_conditions(condition, property.value, value_key=a_k)
)
sub_conditions.append(event_where[-1])
if len(sub_conditions) > 0:
sub_conditions = (" " + event.properties.operator + " ").join(sub_conditions)
events_conditions[-1]["condition"] += " AND " if len(events_conditions[-1]["condition"]) > 0 else ""
events_conditions[-1]["condition"] += "(" + sub_conditions + ")"
if event_index == 0 or or_events:
event_where += ss_constraints
if is_not:
@ -1667,13 +1521,16 @@ def get_user_sessions(project_id, user_id, start_date, end_date):
def get_session_user(project_id, user_id):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(
""" \
SELECT user_id,
"""\
SELECT
user_id,
count(*) as session_count,
max(start_ts) as last_seen,
min(start_ts) as first_seen
FROM "public".sessions
WHERE project_id = %(project_id)s
FROM
"public".sessions
WHERE
project_id = %(project_id)s
AND user_id = %(userId)s
AND duration is not null
GROUP BY user_id;
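
Note: two hunks above swap SessionSearchEventSchema(**{...}) for SessionSearchEventSchema2.model_validate({...}), the Pydantic v2 construction style. A hedged sketch of the pattern; the field set here is assumed for illustration and is not the project's actual schema:

from typing import List
from pydantic import BaseModel

class SessionSearchEventSchema2(BaseModel):  # assumed minimal shape
    type: str
    isEvent: bool
    value: List[str]
    operator: str

# v2 style: validate a plain dict instead of unpacking it as kwargs;
# model_validate also raises clearer errors for nested payloads.
event = SessionSearchEventSchema2.model_validate({
    "type": "location",
    "isEvent": True,
    "value": [],
    "operator": "is",
})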

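Note: the rewritten total_count above replaces a per-row window count in the inner query with a window over the grouped aggregate. A hedged illustration of the SQL shape, with a placeholder table name:

# SUM(...) OVER () re-aggregates the per-group counts into a grand total
# on every output row; COALESCE yields 0 instead of NULL when nothing matches.
query = """
SELECT name,
       count(DISTINCT session_id)                            AS total,
       COALESCE(SUM(count(DISTINCT session_id)) OVER (), 0)  AS total_count
FROM sessions_sample
GROUP BY name
ORDER BY total DESC;
"""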
View file

@ -0,0 +1,269 @@
import logging
from urllib.parse import urljoin
from decouple import config
import schemas
from chalicelib.core.collaborations.collaboration_msteams import MSTeams
from chalicelib.core.collaborations.collaboration_slack import Slack
from chalicelib.utils import pg_client, helper
from chalicelib.utils import sql_helper as sh
from chalicelib.utils.TimeUTC import TimeUTC
logger = logging.getLogger(__name__)
def get_note(tenant_id, project_id, user_id, note_id, share=None):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(f"""SELECT sessions_notes.*, users.name AS user_name
{",(SELECT name FROM users WHERE user_id=%(share)s AND deleted_at ISNULL) AS share_name" if share else ""}
FROM sessions_notes INNER JOIN users USING (user_id)
WHERE sessions_notes.project_id = %(project_id)s
AND sessions_notes.note_id = %(note_id)s
AND sessions_notes.deleted_at IS NULL
AND (sessions_notes.user_id = %(user_id)s OR sessions_notes.is_public);""",
{"project_id": project_id, "user_id": user_id, "tenant_id": tenant_id,
"note_id": note_id, "share": share})
cur.execute(query=query)
row = cur.fetchone()
row = helper.dict_to_camel_case(row)
if row:
row["createdAt"] = TimeUTC.datetime_to_timestamp(row["createdAt"])
row["updatedAt"] = TimeUTC.datetime_to_timestamp(row["updatedAt"])
return row
def get_session_notes(tenant_id, project_id, session_id, user_id):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(f"""SELECT sessions_notes.*, users.name AS user_name
FROM sessions_notes INNER JOIN users USING (user_id)
WHERE sessions_notes.project_id = %(project_id)s
AND sessions_notes.deleted_at IS NULL
AND sessions_notes.session_id = %(session_id)s
AND (sessions_notes.user_id = %(user_id)s
OR sessions_notes.is_public)
ORDER BY created_at DESC;""",
{"project_id": project_id, "user_id": user_id,
"tenant_id": tenant_id, "session_id": session_id})
cur.execute(query=query)
rows = cur.fetchall()
rows = helper.list_to_camel_case(rows)
for row in rows:
row["createdAt"] = TimeUTC.datetime_to_timestamp(row["createdAt"])
return rows
def get_all_notes_by_project_id(tenant_id, project_id, user_id, data: schemas.SearchNoteSchema):
with pg_client.PostgresClient() as cur:
# base conditions
conditions = [
"sessions_notes.project_id = %(project_id)s",
"sessions_notes.deleted_at IS NULL"
]
params = {"project_id": project_id, "user_id": user_id, "tenant_id": tenant_id}
# tag conditions
if data.tags:
tag_key = "tag_value"
conditions.append(
sh.multi_conditions(f"%({tag_key})s = sessions_notes.tag", data.tags, value_key=tag_key)
)
params.update(sh.multi_values(data.tags, value_key=tag_key))
# filter by ownership or shared status
if data.shared_only:
conditions.append("sessions_notes.is_public IS TRUE")
elif data.mine_only:
conditions.append("sessions_notes.user_id = %(user_id)s")
else:
conditions.append("(sessions_notes.user_id = %(user_id)s OR sessions_notes.is_public)")
# search condition
if data.search:
conditions.append("sessions_notes.message ILIKE %(search)s")
params["search"] = f"%{data.search}%"
query = f"""
SELECT
COUNT(1) OVER () AS full_count,
sessions_notes.*,
users.name AS user_name
FROM
sessions_notes
INNER JOIN
users USING (user_id)
WHERE
{" AND ".join(conditions)}
ORDER BY
created_at {data.order}
LIMIT
%(limit)s OFFSET %(offset)s;
"""
params.update({
"limit": data.limit,
"offset": data.limit * (data.page - 1)
})
query = cur.mogrify(query, params)
logger.debug(query)
cur.execute(query)
rows = cur.fetchall()
result = {"count": 0, "notes": helper.list_to_camel_case(rows)}
if rows:
result["count"] = rows[0]["fullCount"]
for row in rows:
row["createdAt"] = TimeUTC.datetime_to_timestamp(row["createdAt"])
row.pop("fullCount")
return result
def create(tenant_id, user_id, project_id, session_id, data: schemas.SessionNoteSchema):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(f"""INSERT INTO public.sessions_notes (message, user_id, tag, session_id, project_id, timestamp, is_public, thumbnail, start_at, end_at)
VALUES (%(message)s, %(user_id)s, %(tag)s, %(session_id)s, %(project_id)s, %(timestamp)s, %(is_public)s, %(thumbnail)s, %(start_at)s, %(end_at)s)
RETURNING *,(SELECT name FROM users WHERE users.user_id=%(user_id)s) AS user_name;""",
{"user_id": user_id, "project_id": project_id, "session_id": session_id,
**data.model_dump()})
cur.execute(query)
result = helper.dict_to_camel_case(cur.fetchone())
if result:
result["createdAt"] = TimeUTC.datetime_to_timestamp(result["createdAt"])
return result
def edit(tenant_id, user_id, project_id, note_id, data: schemas.SessionUpdateNoteSchema):
sub_query = []
if data.message is not None:
sub_query.append("message = %(message)s")
if data.tag is not None and len(data.tag) > 0:
sub_query.append("tag = %(tag)s")
if data.is_public is not None:
sub_query.append("is_public = %(is_public)s")
if data.timestamp is not None:
sub_query.append("timestamp = %(timestamp)s")
sub_query.append("updated_at = timezone('utc'::text, now())")
with pg_client.PostgresClient() as cur:
cur.execute(
cur.mogrify(f"""UPDATE public.sessions_notes
SET
{" ,".join(sub_query)}
WHERE
project_id = %(project_id)s
AND user_id = %(user_id)s
AND note_id = %(note_id)s
AND deleted_at ISNULL
RETURNING *,(SELECT name FROM users WHERE users.user_id=%(user_id)s) AS user_name;""",
{"project_id": project_id, "user_id": user_id, "note_id": note_id, **data.model_dump()})
)
row = helper.dict_to_camel_case(cur.fetchone())
if row:
row["createdAt"] = TimeUTC.datetime_to_timestamp(row["createdAt"])
return row
return {"errors": ["Note not found"]}
def delete(project_id, note_id):
with pg_client.PostgresClient() as cur:
cur.execute(
cur.mogrify(""" UPDATE public.sessions_notes
SET deleted_at = timezone('utc'::text, now())
WHERE note_id = %(note_id)s
AND project_id = %(project_id)s
AND deleted_at ISNULL;""",
{"project_id": project_id, "note_id": note_id})
)
return {"data": {"state": "success"}}
def share_to_slack(tenant_id, user_id, project_id, note_id, webhook_id):
note = get_note(tenant_id=tenant_id, project_id=project_id, user_id=user_id, note_id=note_id, share=user_id)
if note is None:
return {"errors": ["Note not found"]}
session_url = urljoin(config('SITE_URL'), f"{note['projectId']}/session/{note['sessionId']}?note={note['noteId']}")
if note["timestamp"] > 0:
session_url += f"&jumpto={note['timestamp']}"
title = f"<{session_url}|Note for session {note['sessionId']}>"
blocks = [{"type": "section",
"fields": [{"type": "mrkdwn",
"text": title}]},
{"type": "section",
"fields": [{"type": "plain_text",
"text": note["message"]}]}]
if note["tag"]:
blocks.append({"type": "context",
"elements": [{"type": "plain_text",
"text": f"Tag: *{note['tag']}*"}]})
bottom = f"Created by {note['userName'].capitalize()}"
if user_id != note["userId"]:
bottom += f"\nSent by {note['shareName']}: "
blocks.append({"type": "context",
"elements": [{"type": "plain_text",
"text": bottom}]})
return Slack.send_raw(
tenant_id=tenant_id,
webhook_id=webhook_id,
body={"blocks": blocks}
)
def share_to_msteams(tenant_id, user_id, project_id, note_id, webhook_id):
note = get_note(tenant_id=tenant_id, project_id=project_id, user_id=user_id, note_id=note_id, share=user_id)
if note is None:
return {"errors": ["Note not found"]}
session_url = urljoin(config('SITE_URL'), f"{note['projectId']}/session/{note['sessionId']}?note={note['noteId']}")
if note["timestamp"] > 0:
session_url += f"&jumpto={note['timestamp']}"
title = f"[Note for session {note['sessionId']}]({session_url})"
blocks = [{
"type": "TextBlock",
"text": title,
"style": "heading",
"size": "Large"
},
{
"type": "TextBlock",
"spacing": "Small",
"text": note["message"]
}
]
if note["tag"]:
blocks.append({"type": "TextBlock",
"spacing": "Small",
"text": f"Tag: *{note['tag']}*",
"size": "Small"})
bottom = f"Created by {note['userName'].capitalize()}"
if user_id != note["userId"]:
bottom += f"\nSent by {note['shareName']}: "
blocks.append({"type": "TextBlock",
"spacing": "Default",
"text": bottom,
"size": "Small",
"fontType": "Monospace"})
return MSTeams.send_raw(
tenant_id=tenant_id,
webhook_id=webhook_id,
body={"type": "message",
"attachments": [
{"contentType": "application/vnd.microsoft.card.adaptive",
"contentUrl": None,
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.5",
"body": [{
"type": "ColumnSet",
"style": "emphasis",
"separator": True,
"bleed": True,
"columns": [{"width": "stretch",
"items": blocks,
"type": "Column"}]
}]}}
]})
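
Note: get_all_notes_by_project_id above pairs limit/offset paging with COUNT(1) OVER () so a single query returns both the requested page and the full match count. A small sketch of the paging math, assuming the 1-based page semantics of schemas.SearchNoteSchema:

def page_params(page: int, limit: int) -> dict:
    # page is 1-based, matching the offset computation above
    return {"limit": limit, "offset": limit * (page - 1)}

# page 1 -> rows 0..9, page 3 -> rows 20..29
assert page_params(1, 10) == {"limit": 10, "offset": 0}
assert page_params(3, 10) == {"limit": 10, "offset": 20}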

View file

@ -2,8 +2,7 @@ import logging
from typing import List, Union
import schemas
from chalicelib.core.events import events
from chalicelib.core import metadata
from chalicelib.core import events, metadata
from . import performance_event
from chalicelib.utils import pg_client, helper, metrics_helper
from chalicelib.utils import sql_helper as sh
@ -144,12 +143,12 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
for e in data.events:
if e.type == schemas.EventType.LOCATION:
if e.operator not in extra_conditions:
extra_conditions[e.operator] = schemas.SessionSearchEventSchema.model_validate({
extra_conditions[e.operator] = schemas.SessionSearchEventSchema2.model_validate({
"type": e.type,
"isEvent": True,
"value": [],
"operator": e.operator,
"filters": []
"filters": e.filters
})
for v in e.value:
if v not in extra_conditions[e.operator].value:
@ -161,12 +160,12 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
for e in data.events:
if e.type == schemas.EventType.REQUEST_DETAILS:
if e.operator not in extra_conditions:
extra_conditions[e.operator] = schemas.SessionSearchEventSchema.model_validate({
extra_conditions[e.operator] = schemas.SessionSearchEventSchema2.model_validate({
"type": e.type,
"isEvent": True,
"value": [],
"operator": e.operator,
"filters": []
"filters": e.filters
})
for v in e.value:
if v not in extra_conditions[e.operator].value:
@ -274,7 +273,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
return sessions
def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema):
def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema2):
return not (not is_any and len(event.value) == 0 and event.type not in [schemas.EventType.REQUEST_DETAILS,
schemas.EventType.GRAPHQL] \
or event.type in [schemas.PerformanceEventType.LOCATION_DOM_COMPLETE,
@ -440,7 +439,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
extra_constraints.append(
sh.multi_conditions(f"s.base_referrer {op} %({f_k})s", f.value, is_not=is_not,
value_key=f_k))
elif filter_type == schemas.FilterType.METADATA:
elif filter_type == events.EventType.METADATA.ui_type:
# get metadata list only if you need it
if meta_keys is None:
meta_keys = metadata.get(project_id=project_id)
@ -581,36 +580,36 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
**sh.multi_values(event.value, value_key=e_k),
**sh.multi_values(event.source, value_key=s_k)}
if event_type == schemas.EventType.CLICK:
if event_type == events.EventType.CLICK.ui_type:
if platform == "web":
event_from = event_from % f"events.clicks AS main "
event_from = event_from % f"{events.EventType.CLICK.table} AS main "
if not is_any:
if schemas.ClickEventExtraOperator.has_value(event.operator):
event_where.append(
sh.multi_conditions(f"main.selector {op} %({e_k})s", event.value, value_key=e_k))
else:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s", event.value,
sh.multi_conditions(f"main.{events.EventType.CLICK.column} {op} %({e_k})s", event.value,
value_key=e_k))
else:
event_from = event_from % f"events_ios.taps AS main "
event_from = event_from % f"{events.EventType.CLICK_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.CLICK_MOBILE.column} {op} %({e_k})s",
event.value,
value_key=e_k))
elif event_type == schemas.EventType.TAG:
event_from = event_from % f"events.tags AS main "
elif event_type == events.EventType.TAG.ui_type:
event_from = event_from % f"{events.EventType.TAG.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.tag_id = %({e_k})s", event.value, value_key=e_k))
elif event_type == schemas.EventType.INPUT:
elif event_type == events.EventType.INPUT.ui_type:
if platform == "web":
event_from = event_from % f"events.inputs AS main "
event_from = event_from % f"{events.EventType.INPUT.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s", event.value,
sh.multi_conditions(f"main.{events.EventType.INPUT.column} {op} %({e_k})s", event.value,
value_key=e_k))
if event.source is not None and len(event.source) > 0:
event_where.append(sh.multi_conditions(f"main.value ILIKE %(custom{i})s", event.source,
@ -618,53 +617,53 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
full_args = {**full_args, **sh.multi_values(event.source, value_key=f"custom{i}")}
else:
event_from = event_from % f"events_ios.inputs AS main "
event_from = event_from % f"{events.EventType.INPUT_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.INPUT_MOBILE.column} {op} %({e_k})s",
event.value,
value_key=e_k))
elif event_type == schemas.EventType.LOCATION:
elif event_type == events.EventType.LOCATION.ui_type:
if platform == "web":
event_from = event_from % f"events.pages AS main "
event_from = event_from % f"{events.EventType.LOCATION.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.path {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.LOCATION.column} {op} %({e_k})s",
event.value, value_key=e_k))
else:
event_from = event_from % f"events_ios.views AS main "
event_from = event_from % f"{events.EventType.VIEW_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.name {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.VIEW_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.EventType.CUSTOM:
event_from = event_from % f"events_common.customs AS main "
elif event_type == events.EventType.CUSTOM.ui_type:
event_from = event_from % f"{events.EventType.CUSTOM.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.name {op} %({e_k})s", event.value,
sh.multi_conditions(f"main.{events.EventType.CUSTOM.column} {op} %({e_k})s", event.value,
value_key=e_k))
elif event_type == schemas.EventType.REQUEST:
event_from = event_from % f"events_common.requests AS main "
elif event_type == events.EventType.REQUEST.ui_type:
event_from = event_from % f"{events.EventType.REQUEST.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.path {op} %({e_k})s", event.value,
sh.multi_conditions(f"main.{events.EventType.REQUEST.column} {op} %({e_k})s", event.value,
value_key=e_k))
# elif event_type == schemas.event_type.GRAPHQL:
# elif event_type == events.event_type.GRAPHQL.ui_type:
# event_from = event_from % f"{events.event_type.GRAPHQL.table} AS main "
# if not is_any:
# event_where.append(
# _multiple_conditions(f"main.{events.event_type.GRAPHQL.column} {op} %({e_k})s", event.value,
# value_key=e_k))
elif event_type == schemas.EventType.STATE_ACTION:
event_from = event_from % f"events.state_actions AS main "
elif event_type == events.EventType.STATEACTION.ui_type:
event_from = event_from % f"{events.EventType.STATEACTION.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.name {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.STATEACTION.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.EventType.ERROR:
event_from = event_from % f"events.errors AS main INNER JOIN public.errors AS main1 USING(error_id)"
elif event_type == events.EventType.ERROR.ui_type:
event_from = event_from % f"{events.EventType.ERROR.table} AS main INNER JOIN public.errors AS main1 USING(error_id)"
event.source = list(set(event.source))
if not is_any and event.value not in [None, "*", ""]:
event_where.append(
@ -675,59 +674,59 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
# ----- Mobile
elif event_type == schemas.EventType.CLICK_MOBILE:
event_from = event_from % f"events_ios.taps AS main "
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.CLICK_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.CLICK_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.EventType.INPUT_MOBILE:
event_from = event_from % f"events_ios.inputs AS main "
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.INPUT_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.INPUT_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
if event.source is not None and len(event.source) > 0:
event_where.append(sh.multi_conditions(f"main.value ILIKE %(custom{i})s", event.source,
value_key="custom{i}"))
full_args = {**full_args, **sh.multi_values(event.source, f"custom{i}")}
elif event_type == schemas.EventType.VIEW_MOBILE:
event_from = event_from % f"events_ios.views AS main "
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.VIEW_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.name {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.VIEW_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.EventType.CUSTOM_MOBILE:
event_from = event_from % f"events_common.customs AS main "
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.CUSTOM_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.name {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.CUSTOM_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.EventType.REQUEST_MOBILE:
event_from = event_from % f"events_common.requests AS main "
elif event_type == events.EventType.REQUEST_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.REQUEST_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.path {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.REQUEST_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.EventType.ERROR_MOBILE:
event_from = event_from % f"events_common.crashes AS main INNER JOIN public.crashes_ios AS main1 USING(crash_ios_id)"
elif event_type == events.EventType.CRASH_MOBILE.ui_type:
event_from = event_from % f"{events.EventType.CRASH_MOBILE.table} AS main INNER JOIN public.crashes_ios AS main1 USING(crash_ios_id)"
if not is_any and event.value not in [None, "*", ""]:
event_where.append(
sh.multi_conditions(f"(main1.reason {op} %({e_k})s OR main1.name {op} %({e_k})s)",
event.value, value_key=e_k))
elif event_type == schemas.EventType.SWIPE_MOBILE and platform != "web":
event_from = event_from % f"events_ios.swipes AS main "
elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform != "web":
event_from = event_from % f"{events.EventType.SWIPE_MOBILE.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.label {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.SWIPE_MOBILE.column} {op} %({e_k})s",
event.value, value_key=e_k))
elif event_type == schemas.PerformanceEventType.FETCH_FAILED:
event_from = event_from % f"events_common.requests AS main "
event_from = event_from % f"{events.EventType.REQUEST.table} AS main "
if not is_any:
event_where.append(
sh.multi_conditions(f"main.path {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.REQUEST.column} {op} %({e_k})s",
event.value, value_key=e_k))
col = performance_event.get_col(event_type)
colname = col["column"]
@ -752,7 +751,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD,
schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE
]:
event_from = event_from % f"events.pages AS main "
event_from = event_from % f"{events.EventType.LOCATION.table} AS main "
col = performance_event.get_col(event_type)
colname = col["column"]
tname = "main"
@ -763,7 +762,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
f"{tname}.timestamp <= %(endDate)s"]
if not is_any:
event_where.append(
sh.multi_conditions(f"main.path {op} %({e_k})s",
sh.multi_conditions(f"main.{events.EventType.LOCATION.column} {op} %({e_k})s",
event.value, value_key=e_k))
e_k += "_custom"
full_args = {**full_args, **sh.multi_values(event.source, value_key=e_k)}
@ -773,7 +772,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
event.source, value_key=e_k))
elif event_type == schemas.EventType.REQUEST_DETAILS:
event_from = event_from % f"events_common.requests AS main "
event_from = event_from % f"{events.EventType.REQUEST.table} AS main "
apply = False
for j, f in enumerate(event.filters):
is_any = sh.isAny_opreator(f.operator)
@ -785,7 +784,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
full_args = {**full_args, **sh.multi_values(f.value, value_key=e_k_f)}
if f.type == schemas.FetchFilterType.FETCH_URL:
event_where.append(
sh.multi_conditions(f"main.path {op} %({e_k_f})s::text",
sh.multi_conditions(f"main.{events.EventType.REQUEST.column} {op} %({e_k_f})s::text",
f.value, value_key=e_k_f))
apply = True
elif f.type == schemas.FetchFilterType.FETCH_STATUS_CODE:
@ -817,7 +816,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
if not apply:
continue
elif event_type == schemas.EventType.GRAPHQL:
event_from = event_from % f"events.graphql AS main "
event_from = event_from % f"{events.EventType.GRAPHQL.table} AS main "
for j, f in enumerate(event.filters):
is_any = sh.isAny_opreator(f.operator)
if is_any or len(f.value) == 0:
@ -828,7 +827,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
full_args = {**full_args, **sh.multi_values(f.value, value_key=e_k_f)}
if f.type == schemas.GraphqlFilterType.GRAPHQL_NAME:
event_where.append(
sh.multi_conditions(f"main.name {op} %({e_k_f})s", f.value,
sh.multi_conditions(f"main.{events.EventType.GRAPHQL.column} {op} %({e_k_f})s", f.value,
value_key=e_k_f))
elif f.type == schemas.GraphqlFilterType.GRAPHQL_METHOD:
event_where.append(
@ -909,7 +908,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
# b"s.user_os in ('Chrome OS','Fedora','Firefox OS','Linux','Mac OS X','Ubuntu','Windows')")
if errors_only:
extra_from += f" INNER JOIN events.errors AS er USING (session_id) INNER JOIN public.errors AS ser USING (error_id)"
extra_from += f" INNER JOIN {events.EventType.ERROR.table} AS er USING (session_id) INNER JOIN public.errors AS ser USING (error_id)"
extra_constraints.append("ser.source = 'js_exception'")
extra_constraints.append("ser.project_id = %(project_id)s")
# if error_status != schemas.ErrorStatus.all:
@ -985,12 +984,12 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
c.value = helper.values_for_operator(value=c.value, op=c.operator)
full_args = {**full_args,
**sh.multi_values(c.value, value_key=e_k)}
if c.type == schemas.EventType.LOCATION:
if c.type == events.EventType.LOCATION.ui_type:
_extra_or_condition.append(
sh.multi_conditions(f"ev.path {op} %({e_k})s",
sh.multi_conditions(f"ev.{events.EventType.LOCATION.column} {op} %({e_k})s",
c.value, value_key=e_k))
else:
logger.warning(f"unsupported extra_event type:${c.type}")
logger.warning(f"unsupported extra_event type: {c.type}")
if len(_extra_or_condition) > 0:
extra_constraints.append("(" + " OR ".join(_extra_or_condition) + ")")
query_part = f"""\
@ -1045,13 +1044,16 @@ def get_user_sessions(project_id, user_id, start_date, end_date):
def get_session_user(project_id, user_id):
with pg_client.PostgresClient() as cur:
query = cur.mogrify(
""" \
SELECT user_id,
"""\
SELECT
user_id,
count(*) as session_count,
max(start_ts) as last_seen,
min(start_ts) as first_seen
FROM "public".sessions
WHERE project_id = %(project_id)s
FROM
"public".sessions
WHERE
project_id = %(project_id)s
AND user_id = %(userId)s
AND duration is not null
GROUP BY user_id;
@ -1074,8 +1076,9 @@ def session_exists(project_id, session_id):
with pg_client.PostgresClient() as cur:
query = cur.mogrify("""SELECT 1
FROM public.sessions
WHERE session_id = %(session_id)s
AND project_id = %(project_id)s LIMIT 1;""",
WHERE session_id=%(session_id)s
AND project_id=%(project_id)s
LIMIT 1;""",
{"project_id": project_id, "session_id": session_id})
cur.execute(query)
row = cur.fetchone()
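
Note: almost every condition above is built through sh.multi_conditions and sh.multi_values. Those helpers are not part of this diff; the sketch below is an assumption about their plausible shape, not the project's actual implementation:

def multi_conditions(condition: str, values: list, value_key: str = "value",
                     is_not: bool = False) -> str:
    # Expand one parameterized condition into an OR-group with one
    # numbered placeholder per value, e.g.
    # "(main.label = %(e_0)s OR main.label = %(e_1)s)"
    parts = [condition.replace(f"%({value_key})s", f"%({value_key}_{i})s")
             for i in range(len(values))]
    joined = "(" + " OR ".join(parts) + ")"
    return f"NOT {joined}" if is_not else joined

def multi_values(values: list, value_key: str = "value") -> dict:
    # The matching parameter dict for the placeholders generated above.
    return {f"{value_key}_{i}": v for i, v in enumerate(values)}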

View file

@ -1,7 +1,6 @@
import schemas
from chalicelib.core import metadata, assist, canvas, user_testing
from chalicelib.core.issues import issues
from chalicelib.core.events import events, events_mobile
from chalicelib.core import events, metadata, events_mobile, \
issues, assist, canvas, user_testing
from . import sessions_mobs, sessions_devtool
from chalicelib.core.errors.modules import errors_helper
from chalicelib.utils import pg_client, helper
@ -129,8 +128,30 @@ def get_events(project_id, session_id):
data['userTesting'] = user_testing.get_test_signals(session_id=session_id, project_id=project_id)
data['issues'] = issues.get_by_session_id(session_id=session_id, project_id=project_id)
data['issues'] = issues.reduce_issues(data['issues'])
data['incidents'] = events.get_incidents_by_session_id(session_id=session_id, project_id=project_id)
data['issues'] = reduce_issues(data['issues'])
return data
else:
return None
# To reduce the number of issues in the replay;
# will be removed once we agree on how to show issues
def reduce_issues(issues_list):
if issues_list is None:
return None
i = 0
# remove same-type issues if the time between them is <2s
while i < len(issues_list) - 1:
for j in range(i + 1, len(issues_list)):
if issues_list[i]["type"] == issues_list[j]["type"]:
break
else:
i += 1
break
if issues_list[i]["timestamp"] - issues_list[j]["timestamp"] < 2000:
issues_list.pop(j)
else:
i += 1
return issues_list
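
Note: reduce_issues above drops a same-type issue when it lands within 2 s of the previous one. A small usage sketch with made-up issue dicts (only type and timestamp matter; timestamps are assumed newest-first, as the subtraction implies):

issues = [
    {"type": "click_rage", "timestamp": 10_000},
    {"type": "click_rage", "timestamp": 9_000},   # <2s after the first -> dropped
    {"type": "dead_click", "timestamp": 5_000},
]
assert reduce_issues(issues) == [
    {"type": "click_rage", "timestamp": 10_000},
    {"type": "dead_click", "timestamp": 5_000},
]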

View file

@ -40,8 +40,7 @@ COALESCE((SELECT TRUE
# This function executes the query and return result
def search_sessions(data: schemas.SessionsSearchPayloadSchema, project: schemas.ProjectContext,
user_id, errors_only=False, error_status=schemas.ErrorStatus.ALL,
count_only=False, issue=None, ids_only=False, metric_of: schemas.MetricOfTable = None):
platform = project.platform
count_only=False, issue=None, ids_only=False, platform="web"):
if data.bookmarked:
data.startTimestamp, data.endTimestamp = sessions_favorite.get_start_end_timestamp(project.project_id, user_id)
if data.startTimestamp is None:
@ -49,7 +48,7 @@ def search_sessions(data: schemas.SessionsSearchPayloadSchema, project: schemas.
return {
'total': 0,
'sessions': [],
'_src': 1
'src': 1
}
full_args, query_part = sessions_legacy.search_query_parts(data=data, error_status=error_status,
errors_only=errors_only,
@ -177,7 +176,7 @@ def search_sessions(data: schemas.SessionsSearchPayloadSchema, project: schemas.
return {
'total': total,
'sessions': helper.list_to_camel_case(sessions),
'_src': 1
'src': 1
}
@ -240,7 +239,6 @@ def search_by_metadata(tenant_id, user_id, m_key, m_value, project_id=None):
cur.execute("\nUNION\n".join(sub_queries))
rows = cur.fetchall()
for i in rows:
i["_src"] = 1
results[str(i["project_id"])]["sessions"].append(helper.dict_to_camel_case(i))
return results
@ -248,7 +246,7 @@ def search_by_metadata(tenant_id, user_id, m_key, m_value, project_id=None):
def search_sessions_by_ids(project_id: int, session_ids: list, sort_by: str = 'session_id',
ascending: bool = False) -> dict:
if session_ids is None or len(session_ids) == 0:
return {"total": 0, "sessions": [], "_src": 1}
return {"total": 0, "sessions": []}
with pg_client.PostgresClient() as cur:
meta_keys = metadata.get(project_id=project_id)
params = {"project_id": project_id, "session_ids": tuple(session_ids)}
@ -267,4 +265,4 @@ def search_sessions_by_ids(project_id: int, session_ids: list, sort_by: str = 's
s["metadata"] = {}
for m in meta_keys:
s["metadata"][m["key"]] = s.pop(f'metadata_{m["index"]}')
return {"total": len(rows), "sessions": helper.list_to_camel_case(rows), "_src": 1}
return {"total": len(rows), "sessions": helper.list_to_camel_case(rows)}

View file

@ -1,2 +1 @@
from .sessions_viewed import *
from .sessions_viewed_ch import *

View file

@ -87,7 +87,7 @@ async def create_tenant(data: schemas.UserSignupSchema):
"spotRefreshToken": r.pop("spotRefreshToken"),
"spotRefreshTokenMaxAge": r.pop("spotRefreshTokenMaxAge"),
'data': {
"scopeState": 2,
"scopeState": 0,
"user": r
}
}

View file

@ -11,3 +11,9 @@ if smtp.has_smtp():
logger.info("valid SMTP configuration found")
else:
logger.info("no SMTP configuration found or SMTP validation failed")
if config("EXP_CH_DRIVER", cast=bool, default=True):
logging.info(">>> Using new CH driver")
from . import ch_client_exp as ch_client
else:
from . import ch_client

View file

@ -1,185 +1,73 @@
import logging
import threading
import time
from functools import wraps
from queue import Queue, Empty
import clickhouse_connect
from clickhouse_connect.driver.query import QueryContext
import clickhouse_driver
from decouple import config
logger = logging.getLogger(__name__)
_CH_CONFIG = {"host": config("ch_host"),
"user": config("ch_user", default="default"),
"password": config("ch_password", default=""),
"port": config("ch_port_http", cast=int),
"client_name": config("APP_NAME", default="PY")}
CH_CONFIG = dict(_CH_CONFIG)
settings = {}
if config('ch_timeout', cast=int, default=-1) > 0:
logging.info(f"CH-max_execution_time set to {config('ch_timeout')}s")
logger.info(f"CH-max_execution_time set to {config('ch_timeout')}s")
settings = {**settings, "max_execution_time": config('ch_timeout', cast=int)}
if config('ch_receive_timeout', cast=int, default=-1) > 0:
logging.info(f"CH-receive_timeout set to {config('ch_receive_timeout')}s")
logger.info(f"CH-receive_timeout set to {config('ch_receive_timeout')}s")
settings = {**settings, "receive_timeout": config('ch_receive_timeout', cast=int)}
extra_args = {}
if config("CH_COMPRESSION", cast=bool, default=True):
extra_args["compression"] = "lz4"
def transform_result(self, original_function):
@wraps(original_function)
def wrapper(*args, **kwargs):
if kwargs.get("parameters"):
if config("LOCAL_DEV", cast=bool, default=False):
logger.debug(self.format(query=kwargs.get("query", ""), parameters=kwargs.get("parameters")))
else:
logger.debug(
str.encode(self.format(query=kwargs.get("query", ""), parameters=kwargs.get("parameters"))))
elif len(args) > 0:
if config("LOCAL_DEV", cast=bool, default=False):
logger.debug(args[0])
else:
logger.debug(str.encode(args[0]))
result = original_function(*args, **kwargs)
if isinstance(result, clickhouse_connect.driver.query.QueryResult):
column_names = result.column_names
result = result.result_rows
result = [dict(zip(column_names, row)) for row in result]
return result
return wrapper
class ClickHouseConnectionPool:
def __init__(self, min_size, max_size):
self.min_size = min_size
self.max_size = max_size
self.pool = Queue()
self.lock = threading.Lock()
self.total_connections = 0
# Initialize the pool with min_size connections
for _ in range(self.min_size):
client = clickhouse_connect.get_client(**CH_CONFIG,
database=config("ch_database", default="default"),
settings=settings,
**extra_args)
self.pool.put(client)
self.total_connections += 1
def get_connection(self):
try:
# Try to get a connection without blocking
client = self.pool.get_nowait()
return client
except Empty:
with self.lock:
if self.total_connections < self.max_size:
client = clickhouse_connect.get_client(**CH_CONFIG,
database=config("ch_database", default="default"),
settings=settings,
**extra_args)
self.total_connections += 1
return client
# If max_size reached, wait until a connection is available
client = self.pool.get()
return client
def release_connection(self, client):
self.pool.put(client)
def close_all(self):
with self.lock:
while not self.pool.empty():
client = self.pool.get()
client.close()
self.total_connections = 0
CH_pool: ClickHouseConnectionPool = None
RETRY_MAX = config("CH_RETRY_MAX", cast=int, default=50)
RETRY_INTERVAL = config("CH_RETRY_INTERVAL", cast=int, default=2)
RETRY = 0
def make_pool():
if not config('CH_POOL', cast=bool, default=True):
return
global CH_pool
global RETRY
if CH_pool is not None:
try:
CH_pool.close_all()
except Exception as error:
logger.error("Error while closing all connexions to CH", exc_info=error)
try:
CH_pool = ClickHouseConnectionPool(min_size=config("CH_MINCONN", cast=int, default=4),
max_size=config("CH_MAXCONN", cast=int, default=8))
if CH_pool is not None:
logger.info("Connection pool created successfully for CH")
except ConnectionError as error:
logger.error("Error while connecting to CH", exc_info=error)
if RETRY < RETRY_MAX:
RETRY += 1
logger.info(f"waiting for {RETRY_INTERVAL}s before retry n°{RETRY}")
time.sleep(RETRY_INTERVAL)
make_pool()
else:
raise error
class ClickHouseClient:
__client = None
def __init__(self, database=None):
if self.__client is None:
if database is not None or not config('CH_POOL', cast=bool, default=True):
self.__client = clickhouse_connect.get_client(**CH_CONFIG,
extra_args = {}
if config("CH_COMPRESSION", cast=bool, default=True):
extra_args["compression"] = "lz4"
self.__client = clickhouse_driver.Client(host=config("ch_host"),
database=database if database else config("ch_database",
default="default"),
user=config("ch_user", default="default"),
password=config("ch_password", default=""),
port=config("ch_port", cast=int),
settings=settings,
**extra_args)
else:
self.__client = CH_pool.get_connection()
self.__client.execute = transform_result(self, self.__client.query)
self.__client.format = self.format
**extra_args) \
if self.__client is None else self.__client
def __enter__(self):
return self
def execute(self, query, parameters=None, **args):
try:
results = self.__client.execute(query=query, params=parameters, with_column_types=True, **args)
keys = tuple(x for x, y in results[1])
return [dict(zip(keys, i)) for i in results[0]]
except Exception as err:
logger.error("--------- CH EXCEPTION -----------", exc_info=err)
logger.error("--------- CH QUERY EXCEPTION -----------")
logger.error(self.format(query=query, parameters=parameters)
.replace('\n', '\\n')
.replace(' ', ' ')
.replace(' ', ' '))
logger.error("--------------------")
raise err
def insert(self, query, params=None, **args):
return self.__client.execute(query=query, params=params, **args)
def client(self):
return self.__client
def format(self, query, parameters=None):
if parameters:
ctx = QueryContext(query=query, parameters=parameters)
return ctx.final_query
def format(self, query, parameters):
if parameters is None:
return query
return self.__client.substitute_params(query, parameters, self.__client.connection.context)
def __exit__(self, *args):
if config('CH_POOL', cast=bool, default=True):
CH_pool.release_connection(self.__client)
else:
self.__client.close()
pass
async def init():
logger.info(f">use CH_POOL:{config('CH_POOL', default=True)}")
if config('CH_POOL', cast=bool, default=True):
make_pool()
logger.info(f">CH_POOL:not defined")
async def terminate():
global CH_pool
if CH_pool is not None:
try:
CH_pool.close_all()
logger.info("Closed all connexions to CH")
except Exception as error:
logger.error("Error while closing all connexions to CH", exc_info=error)
pass
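
Note: after this change, ch_client wraps clickhouse_driver, and execute() zips column names with row tuples via with_column_types=True, so callers receive a list of dicts. A hedged usage sketch (the table name and env configuration are assumptions):

from chalicelib.utils import ch_client

with ch_client.ClickHouseClient() as ch:
    rows = ch.execute(
        "SELECT session_id FROM sessions WHERE project_id = %(project_id)s LIMIT 5",
        parameters={"project_id": 1},
    )
# rows -> [{"session_id": 123}, ...]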

View file

@ -0,0 +1,178 @@
import logging
import threading
import time
from functools import wraps
from queue import Queue, Empty
import clickhouse_connect
from clickhouse_connect.driver.query import QueryContext
from decouple import config
logger = logging.getLogger(__name__)
_CH_CONFIG = {"host": config("ch_host"),
"user": config("ch_user", default="default"),
"password": config("ch_password", default=""),
"port": config("ch_port_http", cast=int),
"client_name": config("APP_NAME", default="PY")}
CH_CONFIG = dict(_CH_CONFIG)
settings = {}
if config('ch_timeout', cast=int, default=-1) > 0:
logging.info(f"CH-max_execution_time set to {config('ch_timeout')}s")
settings = {**settings, "max_execution_time": config('ch_timeout', cast=int)}
if config('ch_receive_timeout', cast=int, default=-1) > 0:
logging.info(f"CH-receive_timeout set to {config('ch_receive_timeout')}s")
settings = {**settings, "receive_timeout": config('ch_receive_timeout', cast=int)}
extra_args = {}
if config("CH_COMPRESSION", cast=bool, default=True):
extra_args["compression"] = "lz4"
def transform_result(self, original_function):
@wraps(original_function)
def wrapper(*args, **kwargs):
if kwargs.get("parameters"):
logger.debug(str.encode(self.format(query=kwargs.get("query", ""), parameters=kwargs.get("parameters"))))
elif len(args) > 0:
logger.debug(str.encode(args[0]))
result = original_function(*args, **kwargs)
if isinstance(result, clickhouse_connect.driver.query.QueryResult):
column_names = result.column_names
result = result.result_rows
result = [dict(zip(column_names, row)) for row in result]
return result
return wrapper
class ClickHouseConnectionPool:
def __init__(self, min_size, max_size):
self.min_size = min_size
self.max_size = max_size
self.pool = Queue()
self.lock = threading.Lock()
self.total_connections = 0
# Initialize the pool with min_size connections
for _ in range(self.min_size):
client = clickhouse_connect.get_client(**CH_CONFIG,
database=config("ch_database", default="default"),
settings=settings,
**extra_args)
self.pool.put(client)
self.total_connections += 1
def get_connection(self):
try:
# Try to get a connection without blocking
client = self.pool.get_nowait()
return client
except Empty:
with self.lock:
if self.total_connections < self.max_size:
client = clickhouse_connect.get_client(**CH_CONFIG,
database=config("ch_database", default="default"),
settings=settings,
**extra_args)
self.total_connections += 1
return client
# If max_size reached, wait until a connection is available
client = self.pool.get()
return client
def release_connection(self, client):
self.pool.put(client)
def close_all(self):
with self.lock:
while not self.pool.empty():
client = self.pool.get()
client.close()
self.total_connections = 0
CH_pool: ClickHouseConnectionPool | None = None
RETRY_MAX = config("CH_RETRY_MAX", cast=int, default=50)
RETRY_INTERVAL = config("CH_RETRY_INTERVAL", cast=int, default=2)
RETRY = 0
def make_pool():
if not config('CH_POOL', cast=bool, default=True):
return
global CH_pool
global RETRY
if CH_pool is not None:
try:
CH_pool.close_all()
except Exception as error:
logger.error("Error while closing all connexions to CH", exc_info=error)
try:
CH_pool = ClickHouseConnectionPool(min_size=config("CH_MINCONN", cast=int, default=4),
max_size=config("CH_MAXCONN", cast=int, default=8))
if CH_pool is not None:
logger.info("Connection pool created successfully for CH")
except ConnectionError as error:
logger.error("Error while connecting to CH", exc_info=error)
if RETRY < RETRY_MAX:
RETRY += 1
logger.info(f"waiting for {RETRY_INTERVAL}s before retry n°{RETRY}")
time.sleep(RETRY_INTERVAL)
make_pool()
else:
raise error
class ClickHouseClient:
__client = None
def __init__(self, database=None):
if self.__client is None:
if database is not None or not config('CH_POOL', cast=bool, default=True):
self.__client = clickhouse_connect.get_client(**CH_CONFIG,
database=database if database else config("ch_database",
default="default"),
settings=settings,
**extra_args)
else:
self.__client = CH_pool.get_connection()
self.__client.execute = transform_result(self, self.__client.query)
self.__client.format = self.format
def __enter__(self):
return self.__client
def format(self, query, parameters=None):
if parameters:
ctx = QueryContext(query=query, parameters=parameters)
return ctx.final_query
return query
def __exit__(self, *args):
if config('CH_POOL', cast=bool, default=True):
CH_pool.release_connection(self.__client)
else:
self.__client.close()
async def init():
logger.info(f">use CH_POOL:{config('CH_POOL', default=True)}")
if config('CH_POOL', cast=bool, default=True):
make_pool()
async def terminate():
global CH_pool
if CH_pool is not None:
try:
CH_pool.close_all()
logger.info("Closed all connexions to CH")
except Exception as error:
logger.error("Error while closing all connexions to CH", exc_info=error)


@ -1,14 +1,7 @@
import logging
import math
import re
import struct
from decimal import Decimal
from typing import Union, Any
from typing import Union
import schemas
from chalicelib.utils import sql_helper as sh
from chalicelib.utils.TimeUTC import TimeUTC
from schemas import SearchEventOperator
import logging
logger = logging.getLogger(__name__)
@ -57,8 +50,7 @@ def get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEvent
schemas.EventType.ERROR: "ERROR",
schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD: 'PERFORMANCE',
schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE: 'PERFORMANCE',
schemas.FetchFilterType.FETCH_URL: 'REQUEST',
schemas.EventType.INCIDENT: "INCIDENT",
schemas.FetchFilterType.FETCH_URL: 'REQUEST'
}
defs_mobile = {
schemas.EventType.CLICK_MOBILE: "TAP",
@ -67,183 +59,10 @@ def get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEvent
schemas.EventType.REQUEST_MOBILE: "REQUEST",
schemas.EventType.ERROR_MOBILE: "CRASH",
schemas.EventType.VIEW_MOBILE: "VIEW",
schemas.EventType.SWIPE_MOBILE: "SWIPE",
schemas.EventType.INCIDENT: "INCIDENT"
schemas.EventType.SWIPE_MOBILE: "SWIPE"
}
if platform != "web" and event_type in defs_mobile:
return defs_mobile.get(event_type)
if event_type not in defs:
raise Exception(f"unsupported EventType:{event_type}")
return defs.get(event_type)
# AI generated
def simplify_clickhouse_type(ch_type: str) -> str:
"""
Simplify a ClickHouse data type name to a broader category like:
int, float, decimal, datetime, string, uuid, enum, array, tuple, map, nested, etc.
"""
# 1) Strip out common wrappers like Nullable(...) or LowCardinality(...)
# Possibly multiple wrappers: e.g. "LowCardinality(Nullable(Int32))"
pattern_wrappers = re.compile(r'(Nullable|LowCardinality)\((.*)\)')
while True:
match = pattern_wrappers.match(ch_type)
if match:
ch_type = match.group(2)
else:
break
# 2) Normalize (lowercase) for easier checks
normalized_type = ch_type.lower()
# 3) Use pattern matching or direct checks for known categories
# (You can adapt this as you see fit for your environment.)
# Integers: Int8, Int16, Int32, Int64, Int128, Int256, UInt8, UInt16, ...
if re.match(r'^(u?int)(8|16|32|64|128|256)$', normalized_type):
return "int"
# Floats: Float32, Float64
if re.match(r'^(float(32|64)|double)$', normalized_type):
return "float"
# Decimal: Decimal(P, S)
if normalized_type.startswith("decimal"):
# return "decimal"
return "float"
# Date/DateTime: Date, Date32, DateTime, DateTime64 all start with "date"
if normalized_type.startswith("date"):
return "datetime"
# Strings: String, FixedString(N)
if normalized_type.startswith("string"):
return "string"
if normalized_type.startswith("fixedstring"):
return "string"
# UUID
if normalized_type.startswith("uuid"):
# return "uuid"
return "string"
# Enums: Enum8(...) or Enum16(...)
if normalized_type.startswith("enum8") or normalized_type.startswith("enum16"):
# return "enum"
return "string"
# Arrays: Array(T)
if normalized_type.startswith("array"):
return "array"
# Tuples: Tuple(T1, T2, ...)
if normalized_type.startswith("tuple"):
return "tuple"
# Map(K, V)
if normalized_type.startswith("map"):
return "map"
# Nested(...)
if normalized_type.startswith("nested"):
return "nested"
# If we didn't match above, just return the original type in lowercase
return normalized_type
def simplify_clickhouse_types(ch_types: list[str]) -> list[str]:
"""
Takes a list of ClickHouse types and returns a list of simplified types
by calling `simplify_clickhouse_type` on each.
"""
return list(set([simplify_clickhouse_type(t) for t in ch_types]))
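# Illustrative behavior of the helpers above (dedup goes through a set, so order is not guaranteed):
#   simplify_clickhouse_type("LowCardinality(Nullable(Int32))")  -> "int"
#   simplify_clickhouse_type("Decimal(18, 4)")                   -> "float"
#   simplify_clickhouse_types(["UInt8", "Int64", "String"])      -> ["int", "string"]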
def get_sub_condition(col_name: str, val_name: str,
operator: Union[schemas.SearchEventOperator, schemas.MathOperator]) -> str:
if operator == SearchEventOperator.PATTERN:
return f"match({col_name}, %({val_name})s)"
op = sh.get_sql_operator(operator)
return f"{col_name} {op} %({val_name})s"
def get_col_cast(data_type: schemas.PropertyType, value: Any) -> str:
if value is None or len(value) == 0:
return ""
if isinstance(value, list):
value = value[0]
if data_type in (schemas.PropertyType.INT, schemas.PropertyType.FLOAT):
return best_clickhouse_type(value)
return data_type.capitalize()
# (type_name, minimum, maximum) ordered by increasing size
_INT_RANGES = [
("Int8", -128, 127),
("UInt8", 0, 255),
("Int16", -32_768, 32_767),
("UInt16", 0, 65_535),
("Int32", -2_147_483_648, 2_147_483_647),
("UInt32", 0, 4_294_967_295),
("Int64", -9_223_372_036_854_775_808, 9_223_372_036_854_775_807),
("UInt64", 0, 18_446_744_073_709_551_615),
]
def best_clickhouse_type(value):
"""
Return the most compact ClickHouse numeric type that can store *value* loss-lessly.
"""
# Treat bool like tiny int
if isinstance(value, bool):
value = int(value)
# --- Integers ---
if isinstance(value, int):
for name, lo, hi in _INT_RANGES:
if lo <= value <= hi:
return name
# Beyond UInt64: ClickHouse offers Int128 / Int256 or Decimal
return "Int128"
# --- Decimal.Decimal (exact) ---
if isinstance(value, Decimal):
# ClickHouse Decimal32/64/128 have 9 / 18 / 38 significant digits.
digits = len(value.as_tuple().digits)
if digits <= 9:
return "Decimal32"
elif digits <= 18:
return "Decimal64"
else:
return "Decimal128"
# --- Floats ---
if isinstance(value, float):
if not math.isfinite(value):
return "Float64" # inf / nan → always Float64
# Check if a round-trip through 32-bit float preserves the bit pattern
packed = struct.pack("f", value)
if struct.unpack("f", packed)[0] == value:
return "Float32"
return "Float64"
raise TypeError(f"Unsupported type: {type(value).__name__}")
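# Illustrative results of best_clickhouse_type (Decimal from the decimal module):
#   best_clickhouse_type(True)                      -> "Int8"      (bool treated as tiny int)
#   best_clickhouse_type(200)                       -> "UInt8"
#   best_clickhouse_type(300)                       -> "Int16"
#   best_clickhouse_type(1.5)                       -> "Float32"   (exact in 32-bit)
#   best_clickhouse_type(Decimal("1234567890123"))  -> "Decimal64" (13 significant digits)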
def explode_dproperties(rows):
for i in range(len(rows)):
rows[i] = {**rows[i], **rows[i]["$properties"]}
rows[i].pop("$properties")
return rows
def add_timestamp(rows):
for row in rows:
row["timestamp"] = TimeUTC.datetime_to_timestamp(row["createdAt"])
return rows


@ -15,11 +15,11 @@ def random_string(length=36):
return "".join(random.choices(string.hexdigits, k=length))
def list_to_camel_case(items: list[dict], flatten: bool = False, ignore_keys=[]) -> list[dict]:
def list_to_camel_case(items: list[dict], flatten: bool = False) -> list[dict]:
for i in range(len(items)):
if flatten:
items[i] = flatten_nested_dicts(items[i])
items[i] = dict_to_camel_case(items[i], ignore_keys=ignore_keys)
items[i] = dict_to_camel_case(items[i])
return items
@ -99,8 +99,6 @@ def allow_captcha():
def string_to_sql_like(value):
if value is None:
return None
value = re.sub(' +', ' ', value)
value = value.replace("*", "%")
if value.startswith("^"):
@ -336,3 +334,5 @@ def cast_session_id_to_string(data):
for key in keys:
data[key] = cast_session_id_to_string(data[key])
return data


@ -0,0 +1 @@
from .or_cache import CachedResponse


@ -0,0 +1,83 @@
import functools
import inspect
import json
import logging
from chalicelib.utils import pg_client
import time
from fastapi.encoders import jsonable_encoder
logger = logging.getLogger(__name__)
class CachedResponse:
def __init__(self, table, ttl):
self.table = table
self.ttl = ttl
def __call__(self, func):
self.param_names = {i: param for i, param in enumerate(inspect.signature(func).parameters)}
@functools.wraps(func)
def wrapper(*args, **kwargs):
values = dict()
for i, param in self.param_names.items():
if i < len(args):
values[param] = args[i]
elif param in kwargs:
values[param] = kwargs[param]
else:
values[param] = None
result = self.__get(values)
if result is None or result["expired"] \
or result["result"] is None or len(result["result"]) == 0:
now = time.time()
result = func(*args, **kwargs)
now = time.time() - now
if result is not None and len(result) > 0:
self.__add(values, result, now)
result[0]["cached"] = False
else:
logger.info(f"using cached response for "
f"{func.__name__}({','.join([f'{key}={val}' for key, val in enumerate(values)])})")
result = result["result"]
result[0]["cached"] = True
return result
return wrapper
def __get(self, values):
with pg_client.PostgresClient() as cur:
sub_constraints = []
for key, value in values.items():
if value is not None:
sub_constraints.append(f"{key}=%({key})s")
else:
sub_constraints.append(f"{key} IS NULL")
query = f"""SELECT result,
(%(ttl)s>0
AND EXTRACT(EPOCH FROM (timezone('utc'::text, now()) - created_at - INTERVAL %(interval)s)) > 0) AS expired
FROM {self.table}
WHERE {" AND ".join(sub_constraints)}"""
query = cur.mogrify(query, {**values, 'ttl': self.ttl, 'interval': f'{self.ttl} seconds'})
logger.debug("------")
logger.debug(query)
logger.debug("------")
cur.execute(query)
result = cur.fetchone()
return result
def __add(self, values, result, execution_time):
with pg_client.PostgresClient() as cur:
query = f"""INSERT INTO {self.table} ({",".join(values.keys())},result,execution_time)
VALUES ({",".join([f"%({param})s" for param in values.keys()])},%(result)s,%(execution_time)s)
ON CONFLICT ({",".join(values.keys())}) DO UPDATE SET result=%(result)s,
execution_time=%(execution_time)s,
created_at=timezone('utc'::text, now());"""
query = cur.mogrify(query, {**values,
"result": json.dumps(jsonable_encoder(result)),
"execution_time": execution_time})
logger.debug("------")
logger.debug(query)
logger.debug("------")
cur.execute(query)
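# Hedged usage sketch of the decorator above (table name and function are illustrative,
# not from the diff): the target table must already exist with one column per wrapped
# parameter plus result, execution_time and created_at, and a UNIQUE constraint over the
# parameter columns for ON CONFLICT to work.
#
#   @CachedResponse(table="public.cached_heavy_report", ttl=300)
#   def heavy_report(project_id, start_ts=None):
#       ...  # expensive query returning a non-empty list of dicts
#
# The first call computes and upserts the rows (marked cached=False); calls within the
# next 300 seconds are served from the table and marked cached=True.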


@ -4,47 +4,48 @@ import schemas
def get_sql_operator(op: Union[schemas.SearchEventOperator, schemas.ClickEventExtraOperator, schemas.MathOperator]):
if isinstance(op, Enum):
op = op.value
return {
schemas.SearchEventOperator.IS: "=",
schemas.SearchEventOperator.ON: "=",
schemas.SearchEventOperator.ON_ANY: "IN",
schemas.SearchEventOperator.IS_NOT: "!=",
schemas.SearchEventOperator.NOT_ON: "!=",
schemas.SearchEventOperator.CONTAINS: "ILIKE",
schemas.SearchEventOperator.NOT_CONTAINS: "NOT ILIKE",
schemas.SearchEventOperator.STARTS_WITH: "ILIKE",
schemas.SearchEventOperator.ENDS_WITH: "ILIKE",
# this is not used as an operator, it is used in order to maintain a valid value for conditions
schemas.SearchEventOperator.PATTERN: "regex",
schemas.SearchEventOperator.IS.value: "=",
schemas.SearchEventOperator.ON.value: "=",
schemas.SearchEventOperator.ON_ANY.value: "IN",
schemas.SearchEventOperator.IS_NOT.value: "!=",
schemas.SearchEventOperator.NOT_ON.value: "!=",
schemas.SearchEventOperator.CONTAINS.value: "ILIKE",
schemas.SearchEventOperator.NOT_CONTAINS.value: "NOT ILIKE",
schemas.SearchEventOperator.STARTS_WITH.value: "ILIKE",
schemas.SearchEventOperator.ENDS_WITH.value: "ILIKE",
# Selector operators:
schemas.ClickEventExtraOperator.IS: "=",
schemas.ClickEventExtraOperator.IS_NOT: "!=",
schemas.ClickEventExtraOperator.CONTAINS: "ILIKE",
schemas.ClickEventExtraOperator.NOT_CONTAINS: "NOT ILIKE",
schemas.ClickEventExtraOperator.STARTS_WITH: "ILIKE",
schemas.ClickEventExtraOperator.ENDS_WITH: "ILIKE",
schemas.ClickEventExtraOperator.IS.value: "=",
schemas.ClickEventExtraOperator.IS_NOT.value: "!=",
schemas.ClickEventExtraOperator.CONTAINS.value: "ILIKE",
schemas.ClickEventExtraOperator.NOT_CONTAINS.value: "NOT ILIKE",
schemas.ClickEventExtraOperator.STARTS_WITH.value: "ILIKE",
schemas.ClickEventExtraOperator.ENDS_WITH.value: "ILIKE",
schemas.MathOperator.GREATER: ">",
schemas.MathOperator.GREATER_EQ: ">=",
schemas.MathOperator.LESS: "<",
schemas.MathOperator.LESS_EQ: "<=",
schemas.MathOperator.GREATER.value: ">",
schemas.MathOperator.GREATER_EQ.value: ">=",
schemas.MathOperator.LESS.value: "<",
schemas.MathOperator.LESS_EQ.value: "<=",
}.get(op, "=")
def is_negation_operator(op: schemas.SearchEventOperator):
return op in [schemas.SearchEventOperator.IS_NOT,
schemas.SearchEventOperator.NOT_ON,
schemas.SearchEventOperator.NOT_CONTAINS,
schemas.ClickEventExtraOperator.IS_NOT,
schemas.ClickEventExtraOperator.NOT_CONTAINS]
if isinstance(op, Enum):
op = op.value
return op in [schemas.SearchEventOperator.IS_NOT.value,
schemas.SearchEventOperator.NOT_ON.value,
schemas.SearchEventOperator.NOT_CONTAINS.value,
schemas.ClickEventExtraOperator.IS_NOT.value,
schemas.ClickEventExtraOperator.NOT_CONTAINS.value]
def reverse_sql_operator(op):
return "=" if op == "!=" else "!=" if op == "=" else "ILIKE" if op == "NOT ILIKE" else "NOT ILIKE"
def multi_conditions(condition, values, value_key="value", is_not=False) -> str:
def multi_conditions(condition, values, value_key="value", is_not=False):
query = []
for i in range(len(values)):
k = f"{value_key}_{i}"
@ -52,16 +53,12 @@ def multi_conditions(condition, values, value_key="value", is_not=False) -> str:
return "(" + (" AND " if is_not else " OR ").join(query) + ")"
def multi_values(values, value_key="value", data_type: schemas.PropertyType | None = None):
def multi_values(values, value_key="value"):
query_values = {}
if values is not None and isinstance(values, list):
for i in range(len(values)):
k = f"{value_key}_{i}"
query_values[k] = values[i].value if isinstance(values[i], Enum) else values[i]
if data_type:
if data_type == schemas.PropertyType.STRING:
query_values[k] = str(query_values[k])
return query_values
@ -80,29 +77,3 @@ def single_value(values):
values[i] = v.value
return values
def coordinate_conditions(condition_x, condition_y, values, value_key="value", is_not=False):
query = []
if len(values) == 2:
# if 2 values are provided, it means x=v[0] and y=v[1]
for i in range(len(values)):
k = f"{value_key}_{i}"
if i == 0:
query.append(f"{condition_x}=%({k})s")
elif i == 1:
query.append(f"{condition_y}=%({k})s")
elif len(values) == 4:
# if 4 values are provided, it means v[0]<=x<=v[1] and v[2]<=y<=v[3]
for i in range(len(values)):
k = f"{value_key}_{i}"
if i == 0:
query.append(f"{condition_x}>=%({k})s")
elif i == 1:
query.append(f"{condition_x}<=%({k})s")
elif i == 2:
query.append(f"{condition_y}>=%({k})s")
elif i == 3:
query.append(f"{condition_y}<=%({k})s")
return "(" + (" AND " if is_not else " OR ").join(query) + ")"


@ -75,5 +75,3 @@ EXP_AUTOCOMPLETE=true
EXP_ALERTS=true
EXP_ERRORS_SEARCH=true
EXP_METRICS=true
EXP_SESSIONS_SEARCH=true
EXP_EVENTS=true


@ -69,4 +69,3 @@ EXP_AUTOCOMPLETE=true
EXP_ALERTS=true
EXP_ERRORS_SEARCH=true
EXP_METRICS=true
EXP_EVENTS=true


@ -1,16 +1,17 @@
urllib3==2.4.0
urllib3==2.3.0
requests==2.32.3
boto3==1.38.16
boto3==1.36.12
pyjwt==2.10.1
psycopg2-binary==2.9.10
psycopg[pool,binary]==3.2.9
clickhouse-connect==0.8.17
elasticsearch==9.0.1
psycopg[pool,binary]==3.2.4
clickhouse-driver[lz4]==0.2.9
clickhouse-connect==0.8.15
elasticsearch==8.17.1
jira==3.8.0
cachetools==5.5.2
cachetools==5.5.1
fastapi==0.115.12
uvicorn[standard]==0.34.2
fastapi==0.115.8
uvicorn[standard]==0.34.0
python-decouple==3.8
pydantic[email]==2.11.4
pydantic[email]==2.10.6
apscheduler==3.11.0


@ -1,18 +1,19 @@
urllib3==2.4.0
urllib3==2.3.0
requests==2.32.3
boto3==1.38.16
boto3==1.36.12
pyjwt==2.10.1
psycopg2-binary==2.9.10
psycopg[pool,binary]==3.2.9
clickhouse-connect==0.8.17
elasticsearch==9.0.1
psycopg[pool,binary]==3.2.4
clickhouse-driver[lz4]==0.2.9
clickhouse-connect==0.8.15
elasticsearch==8.17.1
jira==3.8.0
cachetools==5.5.2
cachetools==5.5.1
fastapi==0.115.12
uvicorn[standard]==0.34.2
fastapi==0.115.8
uvicorn[standard]==0.34.0
python-decouple==3.8
pydantic[email]==2.11.4
pydantic[email]==2.10.6
apscheduler==3.11.0
redis==6.1.0
redis==5.2.1


@ -4,10 +4,8 @@ from decouple import config
from fastapi import Depends, Body, BackgroundTasks
import schemas
from chalicelib.core import projects, metadata, reset_password, log_tools, \
from chalicelib.core import events, projects, issues, metadata, reset_password, log_tools, \
announcements, weekly_report, assist, mobile, tenants, boarding, notifications, webhook, users, saved_search, tags
from chalicelib.core.events import events
from chalicelib.core.issues import issues
from chalicelib.core.sourcemaps import sourcemaps
from chalicelib.core.metrics import custom_metrics
from chalicelib.core.alerts import alerts


@ -8,14 +8,13 @@ from starlette.responses import RedirectResponse, FileResponse, JSONResponse, Re
import schemas
from chalicelib.core import assist, signup, feature_flags
from chalicelib.core import notes
from chalicelib.core import scope
from chalicelib.core import tenants, users, projects, license
from chalicelib.core import webhook
from chalicelib.core.collaborations.collaboration_slack import Slack
from chalicelib.core.errors import errors, errors_details
from chalicelib.core.metrics import heatmaps
from chalicelib.core.sessions import sessions, sessions_replay, sessions_favorite, sessions_viewed, \
from chalicelib.core.sessions import sessions, sessions_notes, sessions_replay, sessions_favorite, sessions_viewed, \
sessions_assignments, unprocessed_sessions, sessions_search
from chalicelib.utils import captcha, smtp
from chalicelib.utils import contextual_validators
@ -260,7 +259,8 @@ def get_projects(context: schemas.CurrentContext = Depends(OR_context)):
def search_sessions(projectId: int, data: schemas.SessionsSearchPayloadSchema = \
Depends(contextual_validators.validate_contextual_payload),
context: schemas.CurrentContext = Depends(OR_context)):
data = sessions_search.search_sessions(data=data, project=context.project, user_id=context.user_id)
data = sessions_search.search_sessions(data=data, project=context.project, user_id=context.user_id,
platform=context.project.platform)
return {'data': data}
@ -268,7 +268,8 @@ def search_sessions(projectId: int, data: schemas.SessionsSearchPayloadSchema =
def session_ids_search(projectId: int, data: schemas.SessionsSearchPayloadSchema = \
Depends(contextual_validators.validate_contextual_payload),
context: schemas.CurrentContext = Depends(OR_context)):
data = sessions_search.search_sessions(data=data, project=context.project, user_id=context.user_id, ids_only=True)
data = sessions_search.search_sessions(data=data, project=context.project, user_id=context.user_id, ids_only=True,
platform=context.project.platform)
return {'data': data}
@ -474,7 +475,7 @@ def comment_assignment(projectId: int, sessionId: int, issueId: str,
@app.get('/{projectId}/notes/{noteId}', tags=["sessions", "notes"])
def get_note_by_id(projectId: int, noteId: int, context: schemas.CurrentContext = Depends(OR_context)):
data = notes.get_note(tenant_id=context.tenant_id, project_id=projectId, note_id=noteId,
data = sessions_notes.get_note(tenant_id=context.tenant_id, project_id=projectId, note_id=noteId,
user_id=context.user_id)
if "errors" in data:
return data
@ -488,7 +489,7 @@ def create_note(projectId: int, sessionId: int, data: schemas.SessionNoteSchema
context: schemas.CurrentContext = Depends(OR_context)):
if not sessions.session_exists(project_id=projectId, session_id=sessionId):
return {"errors": ["Session not found"]}
data = notes.create(tenant_id=context.tenant_id, project_id=projectId,
data = sessions_notes.create(tenant_id=context.tenant_id, project_id=projectId,
session_id=sessionId, user_id=context.user_id, data=data)
if "errors" in data.keys():
return data
@ -499,7 +500,7 @@ def create_note(projectId: int, sessionId: int, data: schemas.SessionNoteSchema
@app.get('/{projectId}/sessions/{sessionId}/notes', tags=["sessions", "notes"])
def get_session_notes(projectId: int, sessionId: int, context: schemas.CurrentContext = Depends(OR_context)):
data = notes.get_session_notes(tenant_id=context.tenant_id, project_id=projectId,
data = sessions_notes.get_session_notes(tenant_id=context.tenant_id, project_id=projectId,
session_id=sessionId, user_id=context.user_id)
if "errors" in data:
return data
@ -511,7 +512,7 @@ def get_session_notes(projectId: int, sessionId: int, context: schemas.CurrentCo
@app.post('/{projectId}/notes/{noteId}', tags=["sessions", "notes"])
def edit_note(projectId: int, noteId: int, data: schemas.SessionUpdateNoteSchema = Body(...),
context: schemas.CurrentContext = Depends(OR_context)):
data = notes.edit(tenant_id=context.tenant_id, project_id=projectId, user_id=context.user_id,
data = sessions_notes.edit(tenant_id=context.tenant_id, project_id=projectId, user_id=context.user_id,
note_id=noteId, data=data)
if "errors" in data.keys():
return data
@ -522,28 +523,28 @@ def edit_note(projectId: int, noteId: int, data: schemas.SessionUpdateNoteSchema
@app.delete('/{projectId}/notes/{noteId}', tags=["sessions", "notes"])
def delete_note(projectId: int, noteId: int, _=Body(None), context: schemas.CurrentContext = Depends(OR_context)):
data = notes.delete(project_id=projectId, note_id=noteId)
data = sessions_notes.delete(project_id=projectId, note_id=noteId)
return data
@app.get('/{projectId}/notes/{noteId}/slack/{webhookId}', tags=["sessions", "notes"])
def share_note_to_slack(projectId: int, noteId: int, webhookId: int,
context: schemas.CurrentContext = Depends(OR_context)):
return notes.share_to_slack(tenant_id=context.tenant_id, project_id=projectId, user_id=context.user_id,
return sessions_notes.share_to_slack(tenant_id=context.tenant_id, project_id=projectId, user_id=context.user_id,
note_id=noteId, webhook_id=webhookId)
@app.get('/{projectId}/notes/{noteId}/msteams/{webhookId}', tags=["sessions", "notes"])
def share_note_to_msteams(projectId: int, noteId: int, webhookId: int,
context: schemas.CurrentContext = Depends(OR_context)):
return notes.share_to_msteams(tenant_id=context.tenant_id, project_id=projectId, user_id=context.user_id,
return sessions_notes.share_to_msteams(tenant_id=context.tenant_id, project_id=projectId, user_id=context.user_id,
note_id=noteId, webhook_id=webhookId)
@app.post('/{projectId}/notes', tags=["sessions", "notes"])
def get_all_notes(projectId: int, data: schemas.SearchNoteSchema = Body(...),
context: schemas.CurrentContext = Depends(OR_context)):
data = notes.get_all_notes_by_project_id(tenant_id=context.tenant_id, project_id=projectId,
data = sessions_notes.get_all_notes_by_project_id(tenant_id=context.tenant_id, project_id=projectId,
user_id=context.user_id, data=data)
if "errors" in data:
return data


@ -219,17 +219,6 @@ def get_card_chart(projectId: int, metric_id: int, data: schemas.CardSessionsSch
return {"data": data}
@app.post("/{projectId}/dashboards/{dashboardId}/cards/{metric_id}/chart", tags=["card"])
# @app.post("/{projectId}/dashboards/{dashboardId}/cards/{metric_id}", tags=["card"])
def get_card_chart_for_dashboard(projectId: int, dashboardId: int, metric_id: int,
data: schemas.SavedCardSchema = Body(...),
context: schemas.CurrentContext = Depends(OR_context)):
data = custom_metrics.make_chart_from_card(
project=context.project, user_id=context.user_id, metric_id=metric_id, data=data, for_dashboard=True
)
return {"data": data}
@app.post("/{projectId}/cards/{metric_id}", tags=["dashboard"])
def update_card(projectId: int, metric_id: int, data: schemas.CardSchema = Body(...),
context: schemas.CurrentContext = Depends(OR_context)):


@ -1,77 +0,0 @@
from typing import Annotated
from fastapi import Body, Depends, Query
import schemas
from chalicelib.core import metadata
from chalicelib.core.product_analytics import events, properties, autocomplete, filters
from or_dependencies import OR_context
from routers.base import get_routers
from typing import Optional
public_app, app, app_apikey = get_routers()
@app.get('/{projectId}/filters', tags=["product_analytics"])
def get_all_filters(projectId: int, context: schemas.CurrentContext = Depends(OR_context)):
return {
"data": {
"events": events.get_events(project_id=projectId),
"event": properties.get_all_properties(project_id=projectId),
"session": filters.get_sessions_filters(project_id=projectId),
"user": filters.get_users_filters(project_id=projectId),
"metadata": metadata.get_for_filters(project_id=projectId)
}
}
@app.get('/{projectId}/events/names', tags=["product_analytics"])
def get_all_events(projectId: int, filter_query: Annotated[schemas.PaginatedSchema, Query()],
context: schemas.CurrentContext = Depends(OR_context)):
return {"data": events.get_events(project_id=projectId)}
@app.get('/{projectId}/properties/search', tags=["product_analytics"])
def get_event_properties(projectId: int, en: str = Query(default=None, description="event name"),
ac: bool = Query(description="auto captured"),
context: schemas.CurrentContext = Depends(OR_context)):
if not en or len(en) == 0:
return {"data": []}
return {"data": properties.get_event_properties(project_id=projectId, event_name=en, auto_captured=ac) \
+ filters.get_global_filters(project_id=projectId)}
@app.post('/{projectId}/events/search', tags=["product_analytics"])
def search_events(projectId: int, data: schemas.EventsSearchPayloadSchema = Body(...),
context: schemas.CurrentContext = Depends(OR_context)):
return {"data": events.search_events(project_id=projectId, data=data)}
@app.get('/{projectId}/lexicon/events', tags=["product_analytics", "lexicon"])
def get_all_lexicon_events(projectId: int, filter_query: Annotated[schemas.PaginatedSchema, Query()],
context: schemas.CurrentContext = Depends(OR_context)):
return {"data": events.get_lexicon(project_id=projectId, page=filter_query)}
@app.get('/{projectId}/lexicon/properties', tags=["product_analytics", "lexicon"])
def get_all_lexicon_properties(projectId: int, filter_query: Annotated[schemas.PaginatedSchema, Query()],
context: schemas.CurrentContext = Depends(OR_context)):
return {"data": properties.get_lexicon(project_id=projectId, page=filter_query)}
@app.get('/{projectId}/events/autocomplete', tags=["autocomplete"])
def autocomplete_events(projectId: int, q: Optional[str] = None,
context: schemas.CurrentContext = Depends(OR_context)):
return {"data": autocomplete.search_events(project_id=projectId, q=None if not q or len(q) == 0 else q)}
@app.get('/{projectId}/properties/autocomplete', tags=["autocomplete"])
def autocomplete_properties(projectId: int, propertyName: str, eventName: Optional[str] = None,
q: Optional[str] = None, context: schemas.CurrentContext = Depends(OR_context)):
# Specify propertyName to get top values of that property
# Specify eventName&propertyName to get top values of that property for the selected event
return {"data": autocomplete.search_properties(project_id=projectId,
event_name=None if not eventName \
or len(eventName) == 0 else eventName,
property_name=propertyName,
q=None if not q or len(q) == 0 else q)}


@ -0,0 +1,15 @@
import schemas
from chalicelib.core.metrics import product_anaytics2
from fastapi import Depends
from or_dependencies import OR_context
from routers.base import get_routers
public_app, app, app_apikey = get_routers()
@app.post('/{projectId}/events/search', tags=["dashboard"])
def search_events(projectId: int,
# data: schemas.CreateDashboardSchema = Body(...),
context: schemas.CurrentContext = Depends(OR_context)):
return product_anaytics2.search_events(project_id=projectId, data={})


@ -1,12 +1,10 @@
from typing import Annotated
from fastapi import Body, Depends
from fastapi import Body, Depends, Query
import schemas
from chalicelib.core.usability_testing import service
from chalicelib.core.usability_testing.schema import UTTestCreate, UTTestUpdate, UTTestSearch
from or_dependencies import OR_context
from routers.base import get_routers
from schemas import schemas
public_app, app, app_apikey = get_routers()
tags = ["usability-tests"]
@ -79,7 +77,9 @@ async def update_ut_test(projectId: int, test_id: int, test_update: UTTestUpdate
@app.get('/{projectId}/usability-tests/{test_id}/sessions', tags=tags)
async def get_sessions(projectId: int, test_id: int, filter_query: Annotated[schemas.UsabilityTestQuery, Query()]):
async def get_sessions(projectId: int, test_id: int, page: int = 1, limit: int = 10,
live: bool = False,
user_id: str = None):
"""
Get sessions related to a specific UT test.
@ -87,23 +87,21 @@ async def get_sessions(projectId: int, test_id: int, filter_query: Annotated[sch
- **test_id**: The unique identifier of the UT test.
"""
if filter_query.live:
return service.ut_tests_sessions_live(projectId, test_id, filter_query.page, filter_query.limit)
if live:
return service.ut_tests_sessions_live(projectId, test_id, page, limit)
else:
return service.ut_tests_sessions(projectId, test_id, filter_query.page, filter_query.limit,
filter_query.user_id, filter_query.live)
return service.ut_tests_sessions(projectId, test_id, page, limit, user_id, live)
@app.get('/{projectId}/usability-tests/{test_id}/responses/{task_id}', tags=tags)
async def get_responses(projectId: int, test_id: int, task_id: int,
filter_query: Annotated[schemas.PaginatedSchema, Query()], query: str = None):
async def get_responses(projectId: int, test_id: int, task_id: int, page: int = 1, limit: int = 10, query: str = None):
"""
Get responses related to a specific UT test.
- **project_id**: The unique identifier of the project.
- **test_id**: The unique identifier of the UT test.
"""
return service.get_responses(test_id, task_id, filter_query.page, filter_query.limit, query)
return service.get_responses(test_id, task_id, page, limit, query)
@app.get('/{projectId}/usability-tests/{test_id}/statistics', tags=tags)


@ -1,4 +1,2 @@
from .schemas import *
from .product_analytics import *
from . import overrides as _overrides
from .schemas import _PaginatedSchema as PaginatedSchema


@ -3,7 +3,6 @@ from enum import Enum as _Enum
from pydantic import BaseModel as _BaseModel
from pydantic import ConfigDict, TypeAdapter, Field
from pydantic.types import AnyType
from decouple import config, Choices
def attribute_to_camel_case(snake_str: str) -> str:
@ -22,9 +21,7 @@ def schema_extra(schema: dict, _):
class BaseModel(_BaseModel):
model_config = ConfigDict(alias_generator=attribute_to_camel_case,
use_enum_values=True,
json_schema_extra=schema_extra,
extra=config("EXTRA_PAYLOAD_ATTRIBUTES", default="ignore",
cast=Choices(["ignore", "forbid", "allow"])))
json_schema_extra=schema_extra)
class Enum(_Enum):


@ -1,22 +0,0 @@
from typing import Optional, List, Literal, Union, Annotated
from pydantic import Field
from .overrides import BaseModel
from .schemas import EventPropertiesSchema, SortOrderType, _TimedSchema, \
_PaginatedSchema, PropertyFilterSchema
class EventSearchSchema(BaseModel):
is_event: Literal[True] = True
name: str = Field(...)
properties: Optional[EventPropertiesSchema] = Field(default=None)
ProductAnalyticsGroupedFilter = Annotated[Union[EventSearchSchema, PropertyFilterSchema], \
Field(discriminator='is_event')]
class EventsSearchPayloadSchema(_TimedSchema, _PaginatedSchema):
filters: List[ProductAnalyticsGroupedFilter] = Field(...)
sort: str = Field(default="startTs")
order: SortOrderType = Field(default=SortOrderType.DESC)


@ -3,13 +3,12 @@ from typing import Optional, List, Union, Literal
from pydantic import Field, EmailStr, HttpUrl, SecretStr, AnyHttpUrl
from pydantic import field_validator, model_validator, computed_field
from pydantic import AfterValidator
from pydantic.functional_validators import BeforeValidator
from chalicelib.utils.TimeUTC import TimeUTC
from .overrides import BaseModel, Enum, ORUnion
from .transformers_validators import transform_email, remove_whitespace, remove_duplicate_values, single_to_list, \
force_is_event, NAME_PATTERN, int_to_string, check_alphanumeric, check_regex
force_is_event, NAME_PATTERN, int_to_string, check_alphanumeric
class _GRecaptcha(BaseModel):
@ -405,9 +404,6 @@ class EventType(str, Enum):
REQUEST_MOBILE = "requestMobile"
ERROR_MOBILE = "errorMobile"
SWIPE_MOBILE = "swipeMobile"
EVENT = "event"
INCIDENT = "incident"
CLICK_COORDINATES = "clickCoordinates"
class PerformanceEventType(str, Enum):
@ -468,7 +464,6 @@ class SearchEventOperator(str, Enum):
NOT_CONTAINS = "notContains"
STARTS_WITH = "startsWith"
ENDS_WITH = "endsWith"
PATTERN = "regex"
class ClickEventExtraOperator(str, Enum):
@ -508,8 +503,8 @@ class IssueType(str, Enum):
CUSTOM = 'custom'
JS_EXCEPTION = 'js_exception'
MOUSE_THRASHING = 'mouse_thrashing'
TAP_RAGE = 'tap_rage' # IOS
INCIDENT = 'incident'
# IOS
TAP_RAGE = 'tap_rage'
class MetricFormatType(str, Enum):
@ -540,7 +535,7 @@ class GraphqlFilterType(str, Enum):
class RequestGraphqlFilterSchema(BaseModel):
type: Union[FetchFilterType, GraphqlFilterType] = Field(...)
value: List[Union[int, str]] = Field(...)
operator: Annotated[Union[SearchEventOperator, MathOperator], AfterValidator(check_regex)] = Field(...)
operator: Union[SearchEventOperator, MathOperator] = Field(...)
@model_validator(mode="before")
@classmethod
@ -550,85 +545,7 @@ class RequestGraphqlFilterSchema(BaseModel):
return values
class EventPredefinedPropertyType(str, Enum):
TIME = "$time"
SOURCE = "$source"
DURATION_S = "$duration_s"
DESCRIPTION = "description"
AUTO_CAPTURED = "$auto_captured"
SDK_EDITION = "$sdk_edition"
SDK_VERSION = "$sdk_version"
DEVICE_ID = "$device_id"
OS = "$os"
OS_VERSION = "$os_version"
BROWSER = "$browser"
BROWSER_VERSION = "$browser_version"
DEVICE = "$device"
SCREEN_HEIGHT = "$screen_height"
SCREEN_WIDTH = "$screen_width"
CURRENT_URL = "$current_url"
INITIAL_REFERRER = "$initial_referrer"
REFERRING_DOMAIN = "$referring_domain"
REFERRER = "$referrer"
INITIAL_REFERRING_DOMAIN = "$initial_referring_domain"
SEARCH_ENGINE = "$search_engine"
SEARCH_ENGINE_KEYWORD = "$search_engine_keyword"
UTM_SOURCE = "utm_source"
UTM_MEDIUM = "utm_medium"
UTM_CAMPAIGN = "utm_campaign"
COUNTRY = "$country"
STATE = "$state"
CITY = "$city"
ISSUE_TYPE = "issue_type"
TAGS = "$tags"
IMPORT = "$import"
class PropertyType(str, Enum):
INT = "int"
FLOAT = "float"
DATETIME = "datetime"
STRING = "string"
ARRAY = "array"
TUPLE = "tuple"
MAP = "map"
NESTED = "nested"
class PropertyFilterSchema(BaseModel):
is_event: Literal[False] = False
name: Union[EventPredefinedPropertyType, str] = Field(...)
operator: Union[SearchEventOperator, MathOperator] = Field(...)
value: List[Union[int, str]] = Field(...)
data_type: PropertyType = Field(default=PropertyType.STRING.value)
# property_type: Optional[Literal["string", "number", "date"]] = Field(default=None)
@computed_field
@property
def is_predefined(self) -> bool:
return EventPredefinedPropertyType.has_value(self.name)
@model_validator(mode="after")
def transform_name(self):
if isinstance(self.name, Enum):
self.name = self.name.value
return self
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
class EventPropertiesSchema(BaseModel):
operator: Literal["and", "or"] = Field(...)
filters: List[PropertyFilterSchema] = Field(...)
class SessionSearchEventSchema(BaseModel):
class SessionSearchEventSchema2(BaseModel):
is_event: Literal[True] = True
value: List[Union[str, int]] = Field(...)
type: Union[EventType, PerformanceEventType] = Field(...)
@ -636,7 +553,6 @@ class SessionSearchEventSchema(BaseModel):
source: Optional[List[Union[ErrorSource, int, str]]] = Field(default=None)
sourceOperator: Optional[MathOperator] = Field(default=None)
filters: Optional[List[RequestGraphqlFilterSchema]] = Field(default_factory=list)
properties: Optional[EventPropertiesSchema] = Field(default=None)
_remove_duplicate_values = field_validator('value', mode='before')(remove_duplicate_values)
_single_to_list_values = field_validator('value', mode='before')(single_to_list)
@ -661,23 +577,12 @@ class SessionSearchEventSchema(BaseModel):
elif self.type == EventType.GRAPHQL:
assert isinstance(self.filters, List) and len(self.filters) > 0, \
f"filters should be defined for {EventType.GRAPHQL}"
elif self.type == EventType.CLICK_COORDINATES:
assert isinstance(self.value, List) \
and (len(self.value) == 0 or len(self.value) == 2 or len(self.value) == 4), \
f"value should be [x,y] or [x1,x2,y1,y2] for {EventType.CLICK_COORDINATES}"
if isinstance(self.operator, ClickEventExtraOperator):
assert self.type == EventType.CLICK, \
f"operator:{self.operator} is only available for event-type: {EventType.CLICK}"
return self
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
class SessionSearchFilterSchema(BaseModel):
is_event: Literal[False] = False
@ -735,13 +640,6 @@ class SessionSearchFilterSchema(BaseModel):
return self
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
class _PaginatedSchema(BaseModel):
limit: int = Field(default=200, gt=0, le=200)
@ -762,12 +660,12 @@ def add_missing_is_event(values: dict):
# this type is created to allow mixing events&filters and specifying a discriminator
GroupedFilterType = Annotated[Union[SessionSearchFilterSchema, SessionSearchEventSchema],
GroupedFilterType = Annotated[Union[SessionSearchFilterSchema, SessionSearchEventSchema2],
Field(discriminator='is_event'), BeforeValidator(add_missing_is_event)]
class SessionsSearchPayloadSchema(_TimedSchema, _PaginatedSchema):
events: List[SessionSearchEventSchema] = Field(default_factory=list, doc_hidden=True)
events: List[SessionSearchEventSchema2] = Field(default_factory=list, doc_hidden=True)
filters: List[GroupedFilterType] = Field(default_factory=list)
sort: str = Field(default="startTs")
order: SortOrderType = Field(default=SortOrderType.DESC)
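# Illustrative payload fragment for the GroupedFilterType union above; isEvent is the
# discriminator and add_missing_is_event fills it in when absent (the concrete type and
# operator strings are assumptions drawn from the enums, not from this diff):
#   {"filters": [
#       {"isEvent": true,  "type": "click",       "value": ["Sign up"], "operator": "on"},
#       {"isEvent": false, "type": "userBrowser", "value": ["Chrome"],  "operator": "is"}
#   ]}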
@ -792,8 +690,6 @@ class SessionsSearchPayloadSchema(_TimedSchema, _PaginatedSchema):
def add_missing_attributes(cls, values):
# in case isEvent is wrong:
for f in values.get("filters") or []:
if f.get("type") is None:
continue
if EventType.has_value(f["type"]) and not f.get("isEvent"):
f["isEvent"] = True
elif FilterType.has_value(f["type"]) and f.get("isEvent"):
@ -819,15 +715,6 @@ class SessionsSearchPayloadSchema(_TimedSchema, _PaginatedSchema):
f["value"] = vals
return values
@model_validator(mode="after")
def check_pa_event_filter(self):
for v in self.filters + self.events:
if v.type == EventType.EVENT:
assert v.operator in (SearchEventOperator.IS, MathOperator.EQUAL), \
"operator must be {SearchEventOperator.IS} or {MathOperator.EQUAL} for EVENT type"
assert len(v.value) == 1, "value must have 1 single value for EVENT type"
return self
@model_validator(mode="after")
def split_filters_events(self):
n_filters = []
@ -908,13 +795,6 @@ class PathAnalysisSubFilterSchema(BaseModel):
values["isEvent"] = True
return values
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
class _ProductAnalyticsFilter(BaseModel):
is_event: Literal[False] = False
@ -925,13 +805,6 @@ class _ProductAnalyticsFilter(BaseModel):
_remove_duplicate_values = field_validator('value', mode='before')(remove_duplicate_values)
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
class _ProductAnalyticsEventFilter(BaseModel):
is_event: Literal[True] = True
@ -942,13 +815,6 @@ class _ProductAnalyticsEventFilter(BaseModel):
_remove_duplicate_values = field_validator('value', mode='before')(remove_duplicate_values)
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
# this type is created to allow mixing events&filters and specifying a discriminator for PathAnalysis series filter
ProductAnalyticsFilter = Annotated[Union[_ProductAnalyticsFilter, _ProductAnalyticsEventFilter],
@ -1043,16 +909,11 @@ class MetricOfPathAnalysis(str, Enum):
session_count = MetricOfTimeseries.SESSION_COUNT.value
# class CardSessionsSchema(SessionsSearchPayloadSchema):
class CardSessionsSchema(_TimedSchema, _PaginatedSchema):
startTimestamp: int = Field(default=TimeUTC.now(-7))
endTimestamp: int = Field(default=TimeUTC.now())
density: int = Field(default=7, ge=1, le=200)
# we need metric_type&metric_of in the payload of sessions search
# because the API will return all sessions if the card is not identified
# example: table of requests contains only sessions that have a request,
# but drill-down doesn't take that into consideration
metric_type: MetricType = Field(...)
metric_of: Any
series: List[CardSeriesSchema] = Field(default_factory=list)
# events: List[SessionSearchEventSchema2] = Field(default_factory=list, doc_hidden=True)
@ -1117,11 +978,6 @@ class CardSessionsSchema(_TimedSchema, _PaginatedSchema):
return self
class SavedCardSchema(CardSessionsSchema):
metric_type: Optional[MetricType] = Field(default=None)
metric_of: Optional[Any] = Field(default=None)
class CardConfigSchema(BaseModel):
col: Optional[int] = Field(default=None)
row: Optional[int] = Field(default=2)
@ -1135,6 +991,8 @@ class __CardSchema(CardSessionsSchema):
thumbnail: Optional[str] = Field(default=None)
metric_format: Optional[MetricFormatType] = Field(default=None)
view_type: Any
metric_type: MetricType = Field(...)
metric_of: Any
metric_value: List[IssueType] = Field(default_factory=list)
# This is used to save the selected session for heatmaps
session_id: Optional[int] = Field(default=None)
@ -1401,13 +1259,6 @@ class LiveSessionSearchFilterSchema(BaseModel):
assert len(self.source) > 0, "source should not be empty for METADATA type"
return self
@model_validator(mode='after')
def _check_regex_value(self):
if self.operator == SearchEventOperator.PATTERN:
for v in self.value:
check_regex(v)
return self
class LiveSessionsSearchPayloadSchema(_PaginatedSchema):
filters: List[LiveSessionSearchFilterSchema] = Field([])
@ -1533,8 +1384,8 @@ class MetricSearchSchema(_PaginatedSchema):
mine_only: bool = Field(default=False)
class _HeatMapSearchEventRaw(SessionSearchEventSchema):
type: Literal[EventType.LOCATION, EventType.CLICK_COORDINATES] = Field(...)
class _HeatMapSearchEventRaw(SessionSearchEventSchema2):
type: Literal[EventType.LOCATION] = Field(...)
class HeatMapSessionsSearch(SessionsSearchPayloadSchema):
@ -1658,34 +1509,3 @@ class TagCreate(TagUpdate):
class ScopeSchema(BaseModel):
scope: int = Field(default=1, ge=1, le=2)
class SessionModel(BaseModel):
duration: int
errorsCount: int
eventsCount: int
issueScore: int
issueTypes: List[IssueType] = Field(default=[])
metadata: dict = Field(default={})
pagesCount: int
platform: str
projectId: int
sessionId: str
startTs: int
timezone: Optional[str]
userAnonymousId: Optional[str]
userBrowser: str
userCity: str
userCountry: str
userDevice: Optional[str]
userDeviceType: str
userId: Optional[str]
userOs: str
userState: str
userUuid: str
viewed: bool = Field(default=False)
class UsabilityTestQuery(_PaginatedSchema):
live: bool = Field(default=False)
user_id: Optional[str] = Field(default=None)


@ -1,11 +1,10 @@
import re
from typing import Union, Any, Type
from pydantic import ValidationInfo
from .overrides import Enum
NAME_PATTERN = r"^[a-z,A-Z,0-9,\-,é,è,à,ç, ,|,&,\/,\\,_,.,#,']*$"
NAME_PATTERN = r"^[a-z,A-Z,0-9,\-,é,è,à,ç, ,|,&,\/,\\,_,.,#]*$"
def transform_email(email: str) -> str:
@ -58,17 +57,3 @@ def check_alphanumeric(v: str, info: ValidationInfo) -> str:
is_alphanumeric = v.replace(' ', '').isalnum()
assert is_alphanumeric, f'{info.field_name} must be alphanumeric'
return v
def check_regex(v: str) -> str:
assert v is not None, "Regex is null"
assert isinstance(v, str), "Regex value must be a string"
assert len(v) > 0, "Regex is empty"
is_valid = None
try:
re.compile(v)
except re.error as exc:
is_valid = f"Invalid regex: {exc} (at position {exc.pos})"
assert is_valid is None, is_valid
return v


@ -1,61 +0,0 @@
#!/bin/bash
# Usage: IMAGE_TAG=latest DOCKER_REPO=myDockerHubID bash build.sh <ee>
ARCH=${ARCH:-amd64}
git_sha=$(git rev-parse --short HEAD)
image_tag=${IMAGE_TAG:-$git_sha}
check_prereq() {
which docker || {
echo "Docker not installed, please install docker."
exit 1
}
}
source ../scripts/lib/_docker.sh
[[ $PATCH -eq 1 ]] && {
image_tag="$(grep -ER ^.ppVersion ../scripts/helmcharts/openreplay/charts/$chart | xargs | awk '{print $2}' | awk -F. -v OFS=. '{$NF += 1 ; print}')"
image_tag="${image_tag}-ee"
}
update_helm_release() {
chart=$1
HELM_TAG="$(grep -iER ^version ../scripts/helmcharts/openreplay/charts/$chart | awk '{print $2}' | awk -F. -v OFS=. '{$NF += 1 ; print}')"
# Update the chart version
sed -i "s#^version.*#version: $HELM_TAG# g" ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
# Update image tags
sed -i "s#ppVersion.*#ppVersion: \"$image_tag\"#g" ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
# Commit the changes
git add ../scripts/helmcharts/openreplay/charts/$chart/Chart.yaml
git commit -m "chore(helm): Updating $chart image release"
}
function build_api() {
destination="_assist-server_ee"
[[ -d ../${destination} ]] && {
echo "Removing previous build cache"
rm -rf ../${destination}
}
cp -R ../assist-server ../${destination}
cd ../${destination} || exit 1
cp -rf ../ee/assist-server/* ./
docker build -f ./Dockerfile --build-arg GIT_SHA=$git_sha -t ${DOCKER_REPO:-'local'}/assist-server:${image_tag} .
cd ../assist-server || exit 1
rm -rf ../${destination}
[[ $PUSH_IMAGE -eq 1 ]] && {
docker push ${DOCKER_REPO:-'local'}/assist-server:${image_tag}
docker tag ${DOCKER_REPO:-'local'}/assist-server:${image_tag} ${DOCKER_REPO:-'local'}/assist-server:latest
docker push ${DOCKER_REPO:-'local'}/assist-server:latest
}
[[ $SIGN_IMAGE -eq 1 ]] && {
cosign sign --key $SIGN_KEY ${DOCKER_REPO:-'local'}/assist-server:${image_tag}
}
echo "build completed for assist-server"
}
check_prereq
build_api $1
if [[ $PATCH -eq 1 ]]; then
update_helm_release assist-server
fi


@ -1,30 +0,0 @@
ee ?= "false" # true to build ee
app ?= "" # app name, default all
arch ?= "amd64" # default amd64
docker_runtime ?= "docker" # default docker runtime
.PHONY: help
help: ## Prints help for targets with comments
@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n make \033[36m<target>\033[0m\n"} /^[a-zA-Z_0-9-]+:.*?##/ { printf " \033[36m%-25s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)
##@ Docker
.PHONY: build
build: ## Build the backend. ee=true for ee build. app=app name for only one app. Default build all apps.
ARCH=$(arch) DOCKER_RUNTIME=$(docker_runtime) bash build.sh $(ee) $(app)
##@ Local Dev
.PHONY: scan
scan: ## Scan the backend
@trivy fs -q .
.PHONY: update
update: ## Update the backend dependencies
@echo Updating dependencies
@go get -u -v ./...
@go mod tidy
run: ## Run the backend. app=app name for app to run
@if [ "$(app)" = "" ]; then echo "Error: app parameter is required. Usage: make run app=<app_name>"; exit 1; fi
@go run "cmd/$(app)/main.go"


@ -2,71 +2,44 @@ package main
import (
"context"
"os"
"os/signal"
"syscall"
analyticsConfig "openreplay/backend/internal/config/analytics"
"openreplay/backend/pkg/analytics"
"openreplay/backend/pkg/db/postgres/pool"
"openreplay/backend/pkg/logger"
"openreplay/backend/pkg/metrics"
"openreplay/backend/pkg/metrics/database"
"openreplay/backend/pkg/metrics/web"
"openreplay/backend/pkg/server"
"openreplay/backend/pkg/server/api"
)
func main() {
ctx := context.Background()
log := logger.New()
log.Info(ctx, "Cacher service started")
cfg := analyticsConfig.New(log)
// Observability
webMetrics := web.New("analytics")
dbMetrics := database.New("analytics")
metrics.New(log, append(webMetrics.List(), dbMetrics.List()...))
sigchan := make(chan os.Signal, 1)
signal.Notify(sigchan, syscall.SIGINT, syscall.SIGTERM)
pgConn, err := pool.New(dbMetrics, cfg.Postgres.String())
if err != nil {
log.Fatal(ctx, "can't init postgres connection: %s", err)
}
defer pgConn.Close()
for {
select {
case sig := <-sigchan:
log.Error(ctx, "Caught signal %v: terminating", sig)
os.Exit(0)
builder, err := analytics.NewServiceBuilder(log, cfg, webMetrics, dbMetrics, pgConn)
if err != nil {
log.Fatal(ctx, "can't init services: %s", err)
}
router, err := api.NewRouter(&cfg.HTTP, log)
if err != nil {
log.Fatal(ctx, "failed while creating router: %s", err)
}
router.AddHandlers(api.NoPrefix, builder.CardsAPI, builder.DashboardsAPI, builder.ChartsAPI)
router.AddMiddlewares(builder.Auth.Middleware, builder.RateLimiter.Middleware, builder.AuditTrail.Middleware)
server.Run(ctx, log, &cfg.HTTP, router)
}
//
//import (
// "context"
//
// analyticsConfig "openreplay/backend/internal/config/analytics"
// "openreplay/backend/pkg/analytics"
// "openreplay/backend/pkg/db/postgres/pool"
// "openreplay/backend/pkg/logger"
// "openreplay/backend/pkg/metrics"
// "openreplay/backend/pkg/metrics/database"
// "openreplay/backend/pkg/metrics/web"
// "openreplay/backend/pkg/server"
// "openreplay/backend/pkg/server/api"
//)
//
//func main() {
// ctx := context.Background()
// log := logger.New()
// cfg := analyticsConfig.New(log)
// // Observability
// webMetrics := web.New("analytics")
// dbMetrics := database.New("analytics")
// metrics.New(log, append(webMetrics.List(), dbMetrics.List()...))
//
// pgConn, err := pool.New(dbMetrics, cfg.Postgres.String())
// if err != nil {
// log.Fatal(ctx, "can't init postgres connection: %s", err)
// }
// defer pgConn.Close()
//
// builder, err := analytics.NewServiceBuilder(log, cfg, webMetrics, dbMetrics, pgConn)
// if err != nil {
// log.Fatal(ctx, "can't init services: %s", err)
// }
//
// router, err := api.NewRouter(&cfg.HTTP, log)
// if err != nil {
// log.Fatal(ctx, "failed while creating router: %s", err)
// }
// router.AddHandlers(api.NoPrefix, builder.CardsAPI, builder.DashboardsAPI, builder.ChartsAPI)
// router.AddMiddlewares(builder.Auth.Middleware, builder.RateLimiter.Middleware, builder.AuditTrail.Middleware)
//
// server.Run(ctx, log, &cfg.HTTP, router)
//}


@ -66,11 +66,11 @@ func main() {
messages.MsgMetadata, messages.MsgIssueEvent, messages.MsgSessionStart, messages.MsgSessionEnd,
messages.MsgUserID, messages.MsgUserAnonymousID, messages.MsgIntegrationEvent, messages.MsgPerformanceTrackAggr,
messages.MsgJSException, messages.MsgResourceTiming, messages.MsgCustomEvent, messages.MsgCustomIssue,
messages.MsgFetch, messages.MsgNetworkRequest, messages.MsgGraphQL, messages.MsgStateAction, messages.MsgMouseClick,
messages.MsgNetworkRequest, messages.MsgGraphQL, messages.MsgStateAction, messages.MsgMouseClick,
messages.MsgMouseClickDeprecated, messages.MsgSetPageLocation, messages.MsgSetPageLocationDeprecated,
messages.MsgPageLoadTiming, messages.MsgPageRenderTiming,
messages.MsgPageEvent, messages.MsgPageEventDeprecated, messages.MsgMouseThrashing, messages.MsgInputChange,
messages.MsgUnbindNodes, messages.MsgCanvasNode, messages.MsgTagTrigger, messages.MsgIncident,
messages.MsgUnbindNodes, messages.MsgCanvasNode, messages.MsgTagTrigger,
// Mobile messages
messages.MsgMobileSessionStart, messages.MsgMobileSessionEnd, messages.MsgMobileUserID, messages.MsgMobileUserAnonymousID,
messages.MsgMobileMetadata, messages.MsgMobileEvent, messages.MsgMobileNetworkCall,


@ -100,7 +100,6 @@ func main() {
// Process assets
if msg.TypeID() == messages.MsgSetNodeAttributeURLBased ||
msg.TypeID() == messages.MsgSetCSSDataURLBased ||
msg.TypeID() == messages.MsgCSSInsertRuleURLBased ||
msg.TypeID() == messages.MsgAdoptedSSReplaceURLBased ||
msg.TypeID() == messages.MsgAdoptedSSInsertRuleURLBased {
m := msg.Decode()


@ -1,54 +1,52 @@
module openreplay/backend
go 1.23.0
toolchain go1.23.1
go 1.23
require (
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.18.0
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.17.0
github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.6.0
github.com/ClickHouse/clickhouse-go/v2 v2.34.0
github.com/DataDog/datadog-api-client-go/v2 v2.37.1
github.com/ClickHouse/clickhouse-go/v2 v2.32.1
github.com/DataDog/datadog-api-client-go/v2 v2.34.0
github.com/Masterminds/semver v1.5.0
github.com/andybalholm/brotli v1.1.1
github.com/aws/aws-sdk-go v1.55.6
github.com/btcsuite/btcutil v1.0.2
github.com/confluentinc/confluent-kafka-go/v2 v2.10.0
github.com/confluentinc/confluent-kafka-go/v2 v2.8.0
github.com/docker/distribution v2.8.3+incompatible
github.com/elastic/go-elasticsearch/v7 v7.17.10
github.com/elastic/go-elasticsearch/v8 v8.18.0
github.com/getsentry/sentry-go v0.32.0
github.com/go-playground/validator/v10 v10.26.0
github.com/elastic/go-elasticsearch/v8 v8.17.0
github.com/getsentry/sentry-go v0.31.1
github.com/go-playground/validator/v10 v10.24.0
github.com/go-redis/redis v6.15.9+incompatible
github.com/golang-jwt/jwt/v5 v5.2.2
github.com/golang-jwt/jwt/v5 v5.2.1
github.com/google/uuid v1.6.0
github.com/gorilla/mux v1.8.1
github.com/jackc/pgconn v1.14.3
github.com/jackc/pgerrcode v0.0.0-20240316143900-6e2875d9b438
github.com/jackc/pgtype v1.14.4
github.com/jackc/pgx/v4 v4.18.3
github.com/klauspost/compress v1.18.0
github.com/klauspost/compress v1.17.11
github.com/klauspost/pgzip v1.2.6
github.com/lib/pq v1.10.9
github.com/oschwald/maxminddb-golang v1.13.1
github.com/pkg/errors v0.9.1
github.com/prometheus/client_golang v1.22.0
github.com/prometheus/client_golang v1.20.5
github.com/rs/xid v1.6.0
github.com/sethvargo/go-envconfig v1.2.0
github.com/sethvargo/go-envconfig v1.1.0
github.com/tomasen/realip v0.0.0-20180522021738-f0c99a92ddce
github.com/ua-parser/uap-go v0.0.0-20250326155420-f7f5a2f9f5bc
github.com/ua-parser/uap-go v0.0.0-20250126222208-a52596c19dff
go.uber.org/zap v1.27.0
golang.org/x/net v0.39.0
golang.org/x/net v0.35.0
)
require (
github.com/Azure/azure-sdk-for-go/sdk/internal v1.11.1 // indirect
github.com/ClickHouse/ch-go v0.65.1 // indirect
github.com/DataDog/zstd v1.5.7 // indirect
github.com/Azure/azure-sdk-for-go/sdk/internal v1.10.0 // indirect
github.com/ClickHouse/ch-go v0.65.0 // indirect
github.com/DataDog/zstd v1.5.6 // indirect
github.com/beorn7/perks v1.0.1 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/elastic/elastic-transport-go/v8 v8.7.0 // indirect
github.com/gabriel-vasile/mimetype v1.4.9 // indirect
github.com/elastic/elastic-transport-go/v8 v8.6.0 // indirect
github.com/gabriel-vasile/mimetype v1.4.8 // indirect
github.com/go-faster/city v1.0.1 // indirect
github.com/go-faster/errors v0.7.1 // indirect
github.com/go-logr/logr v1.4.2 // indirect
@ -68,23 +66,23 @@ require (
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/paulmach/orb v0.11.1 // indirect
github.com/pierrec/lz4/v4 v4.1.22 // indirect
github.com/prometheus/client_model v0.6.2 // indirect
github.com/prometheus/common v0.63.0 // indirect
github.com/prometheus/procfs v0.16.0 // indirect
github.com/prometheus/client_model v0.6.1 // indirect
github.com/prometheus/common v0.62.0 // indirect
github.com/prometheus/procfs v0.15.1 // indirect
github.com/segmentio/asm v1.2.0 // indirect
github.com/shopspring/decimal v1.4.0 // indirect
github.com/sirupsen/logrus v1.9.3 // indirect
go.opentelemetry.io/auto/sdk v1.1.0 // indirect
go.opentelemetry.io/otel v1.35.0 // indirect
go.opentelemetry.io/otel/metric v1.35.0 // indirect
go.opentelemetry.io/otel/trace v1.35.0 // indirect
go.opentelemetry.io/otel v1.34.0 // indirect
go.opentelemetry.io/otel/metric v1.34.0 // indirect
go.opentelemetry.io/otel/trace v1.34.0 // indirect
go.uber.org/multierr v1.11.0 // indirect
golang.org/x/crypto v0.37.0 // indirect
golang.org/x/oauth2 v0.29.0 // indirect
golang.org/x/sys v0.32.0 // indirect
golang.org/x/text v0.24.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250414145226-207652e42e2e // indirect
google.golang.org/protobuf v1.36.6 // indirect
golang.org/x/crypto v0.33.0 // indirect
golang.org/x/oauth2 v0.25.0 // indirect
golang.org/x/sys v0.30.0 // indirect
golang.org/x/text v0.22.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250127172529-29210b9bc287 // indirect
google.golang.org/protobuf v1.36.4 // indirect
gopkg.in/yaml.v2 v2.4.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)
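
Note: this go.mod hunk trades the pinned go 1.23.0 + toolchain go1.23.1 pair for a bare go 1.23. Since Go 1.21, the first form selects an exact toolchain for every build of the module, while the second only declares a minimum language version and lets the installed toolchain float. A minimal, hedged way to observe the difference from inside a built binary (not part of this repo):

package main

import (
    "fmt"
    "runtime"
    "runtime/debug"
)

func main() {
    // The toolchain that actually compiled the binary, e.g. go1.23.1.
    fmt.Println("compiled with:", runtime.Version())
    // The module's declared Go version from go.mod, e.g. go1.23.
    if info, ok := debug.ReadBuildInfo(); ok {
        fmt.Println("module requires:", info.GoVersion)
    }
}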

View file

@ -6,17 +6,10 @@ github.com/AlecAivazis/survey/v2 v2.3.7 h1:6I/u8FvytdGsgonrYsVn2t8t4QiRnh6QSTqkk
github.com/AlecAivazis/survey/v2 v2.3.7/go.mod h1:xUTIdE4KCOIjsBAE1JYsUPoCqYdZ1reCfTwbto0Fduo=
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.17.0 h1:g0EZJwz7xkXQiZAI5xi9f3WWFYBlX1CPTrR+NDToRkQ=
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.17.0/go.mod h1:XCW7KnZet0Opnr7HccfUw1PLc4CjHqpcaxW8DHklNkQ=
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.18.0 h1:Gt0j3wceWMwPmiazCa8MzMA0MfhmPIz0Qp0FJ6qcM0U=
github.com/Azure/azure-sdk-for-go/sdk/azcore v1.18.0/go.mod h1:Ot/6aikWnKWi4l9QB7qVSwa8iMphQNqkWALMoNT3rzM=
github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.8.0 h1:B/dfvscEQtew9dVuoxqxrUKKv8Ih2f55PydknDamU+g=
github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.8.0/go.mod h1:fiPSssYvltE08HJchL04dOy+RD4hgrjph0cwGGMntdI=
github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.8.2 h1:F0gBpfdPLGsw+nsgk6aqqkZS1jiixa5WwFe3fk/T3Ys=
github.com/Azure/azure-sdk-for-go/sdk/internal v1.10.0 h1:ywEEhmNahHBihViHepv3xPBn1663uRv2t2q/ESv9seY=
github.com/Azure/azure-sdk-for-go/sdk/internal v1.10.0/go.mod h1:iZDifYGJTIgIIkYRNWPENUnqx6bJ2xnSDFI2tjwZNuY=
github.com/Azure/azure-sdk-for-go/sdk/internal v1.11.0 h1:Bg8m3nq/X1DeePkAbCfb6ml6F3F0IunEhE8TMh+lY48=
github.com/Azure/azure-sdk-for-go/sdk/internal v1.11.0/go.mod h1:j2chePtV91HrC22tGoRX3sGY42uF13WzmmV80/OdVAA=
github.com/Azure/azure-sdk-for-go/sdk/internal v1.11.1 h1:FPKJS1T+clwv+OLGt13a8UjqeRuh0O4SJ3lUriThc+4=
github.com/Azure/azure-sdk-for-go/sdk/internal v1.11.1/go.mod h1:j2chePtV91HrC22tGoRX3sGY42uF13WzmmV80/OdVAA=
github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/storage/armstorage v1.6.0 h1:PiSrjRPpkQNjrM8H0WwKMnZUdu1RGMtd/LdGKUrOo+c=
github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/storage/armstorage v1.6.0/go.mod h1:oDrbWx4ewMylP7xHivfgixbfGBT6APAwsSoHRKotnIc=
github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.6.0 h1:UXT0o77lXQrikd1kgwIPQOUect7EoR/+sbP4wQKdzxM=
@ -25,28 +18,19 @@ github.com/Azure/go-ansiterm v0.0.0-20210617225240-d185dfc1b5a1 h1:UQHMgLO+TxOEl
github.com/Azure/go-ansiterm v0.0.0-20210617225240-d185dfc1b5a1/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=
github.com/AzureAD/microsoft-authentication-library-for-go v1.3.2 h1:kYRSnvJju5gYVyhkij+RTJ/VR6QIUaCfWeaFm2ycsjQ=
github.com/AzureAD/microsoft-authentication-library-for-go v1.3.2/go.mod h1:wP83P5OoQ5p6ip3ScPr0BAq0BvuPAvacpEuSzyouqAI=
github.com/AzureAD/microsoft-authentication-library-for-go v1.4.2 h1:oygO0locgZJe7PpYPXT5A29ZkwJaPqcva7BVeemZOZs=
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
github.com/ClickHouse/ch-go v0.63.1 h1:s2JyZvWLTCSAGdtjMBBmAgQQHMco6pawLJMOXi0FODM=
github.com/ClickHouse/ch-go v0.63.1/go.mod h1:I1kJJCL3WJcBMGe1m+HVK0+nREaG+JOYYBWjrDrF3R0=
github.com/ClickHouse/ch-go v0.65.0 h1:vZAXfTQliuNNefqkPDewX3kgRxN6Q4vUENnnY+ynTRY=
github.com/ClickHouse/ch-go v0.65.0/go.mod h1:tCM0XEH5oWngoi9Iu/8+tjPBo04I/FxNIffpdjtwx3k=
github.com/ClickHouse/ch-go v0.65.1 h1:SLuxmLl5Mjj44/XbINsK2HFvzqup0s6rwKLFH347ZhU=
github.com/ClickHouse/ch-go v0.65.1/go.mod h1:bsodgURwmrkvkBe5jw1qnGDgyITsYErfONKAHn05nv4=
github.com/ClickHouse/clickhouse-go/v2 v2.30.1 h1:Dy0n0l+cMbPXs8hFkeeWGaPKrB+MDByUNQBSmRO3W6k=
github.com/ClickHouse/clickhouse-go/v2 v2.30.1/go.mod h1:szk8BMoQV/NgHXZ20ZbwDyvPWmpfhRKjFkc6wzASGxM=
github.com/ClickHouse/clickhouse-go/v2 v2.32.1 h1:RLhkxA6iH/bLTXeDtEj/u4yUx9Q03Y95P+cjHScQK78=
github.com/ClickHouse/clickhouse-go/v2 v2.32.1/go.mod h1:YtaiIFlHCGNPbOpAvFGYobtcVnmgYvD/WmzitixxWYc=
github.com/ClickHouse/clickhouse-go/v2 v2.34.0 h1:Y4rqkdrRHgExvC4o/NTbLdY5LFQ3LHS77/RNFxFX3Co=
github.com/ClickHouse/clickhouse-go/v2 v2.34.0/go.mod h1:yioSINoRLVZkLyDzdMXPLRIqhDvel8iLBlwh6Iefso8=
github.com/DataDog/datadog-api-client-go/v2 v2.34.0 h1:0VVmv8uZg8vdBuEpiF2nBGUezl2QITrxdEsLgh38j8M=
github.com/DataDog/datadog-api-client-go/v2 v2.34.0/go.mod h1:d3tOEgUd2kfsr9uuHQdY+nXrWp4uikgTgVCPdKNK30U=
github.com/DataDog/datadog-api-client-go/v2 v2.37.1 h1:weZhrGMO//sMEoSKWngoSQwMp4zBSlEX4p3/YWy9ltw=
github.com/DataDog/datadog-api-client-go/v2 v2.37.1/go.mod h1:d3tOEgUd2kfsr9uuHQdY+nXrWp4uikgTgVCPdKNK30U=
github.com/DataDog/zstd v1.5.6 h1:LbEglqepa/ipmmQJUDnSsfvA8e8IStVcGaFWDuxvGOY=
github.com/DataDog/zstd v1.5.6/go.mod h1:g4AWEaM3yOg3HYfnJ3YIawPnVdXJh9QME85blwSAmyw=
github.com/DataDog/zstd v1.5.7 h1:ybO8RBeh29qrxIhCA9E8gKY6xfONU9T6G6aP9DTKfLE=
github.com/DataDog/zstd v1.5.7/go.mod h1:g4AWEaM3yOg3HYfnJ3YIawPnVdXJh9QME85blwSAmyw=
github.com/Masterminds/semver v1.5.0 h1:H65muMkzWKEuNDnfl9d70GUjFniHKHRbFPGBuZ3QEww=
github.com/Masterminds/semver v1.5.0/go.mod h1:MB6lktGJrhw8PrUyiEoblNEGEQ+RzHPF078ddwwvV3Y=
github.com/Masterminds/semver/v3 v3.1.1/go.mod h1:VPu/7SZ7ePZ3QOrcuXROw5FAcLl4a0cBrbBpGY/8hQs=
@ -113,8 +97,6 @@ github.com/compose-spec/compose-go/v2 v2.1.3 h1:bD67uqLuL/XgkAK6ir3xZvNLFPxPScEi
github.com/compose-spec/compose-go/v2 v2.1.3/go.mod h1:lFN0DrMxIncJGYAXTfWuajfwj5haBJqrBkarHcnjJKc=
github.com/confluentinc/confluent-kafka-go/v2 v2.8.0 h1:0HlcSNWg4LpLA9nIjzUMIqWHI+w0S68UN7alXAc3TeA=
github.com/confluentinc/confluent-kafka-go/v2 v2.8.0/go.mod h1:hScqtFIGUI1wqHIgM3mjoqEou4VweGGGX7dMpcUKves=
github.com/confluentinc/confluent-kafka-go/v2 v2.10.0 h1:TK5CH5RbIj/aVfmJFEsDUT6vD2izac2zmA5BUfAOxC0=
github.com/confluentinc/confluent-kafka-go/v2 v2.10.0/go.mod h1:hScqtFIGUI1wqHIgM3mjoqEou4VweGGGX7dMpcUKves=
github.com/containerd/console v1.0.4 h1:F2g4+oChYvBTsASRTz8NP6iIAi97J3TtSAsLbIFn4ro=
github.com/containerd/console v1.0.4/go.mod h1:YynlIjWYF8myEu6sdkwKIvGQq+cOckRm6So2avqoYAk=
github.com/containerd/containerd v1.7.18 h1:jqjZTQNfXGoEaZdW1WwPU0RqSn1Bm2Ay/KJPUuO8nao=
@ -166,14 +148,10 @@ github.com/eiannone/keyboard v0.0.0-20220611211555-0d226195f203 h1:XBBHcIb256gUJ
github.com/eiannone/keyboard v0.0.0-20220611211555-0d226195f203/go.mod h1:E1jcSv8FaEny+OP/5k9UxZVw9YFWGj7eI4KR/iOBqCg=
github.com/elastic/elastic-transport-go/v8 v8.6.0 h1:Y2S/FBjx1LlCv5m6pWAF2kDJAHoSjSRSJCApolgfthA=
github.com/elastic/elastic-transport-go/v8 v8.6.0/go.mod h1:YLHer5cj0csTzNFXoNQ8qhtGY1GTvSqPnKWKaqQE3Hk=
github.com/elastic/elastic-transport-go/v8 v8.7.0 h1:OgTneVuXP2uip4BA658Xi6Hfw+PeIOod2rY3GVMGoVE=
github.com/elastic/elastic-transport-go/v8 v8.7.0/go.mod h1:YLHer5cj0csTzNFXoNQ8qhtGY1GTvSqPnKWKaqQE3Hk=
github.com/elastic/go-elasticsearch/v7 v7.17.10 h1:TCQ8i4PmIJuBunvBS6bwT2ybzVFxxUhhltAs3Gyu1yo=
github.com/elastic/go-elasticsearch/v7 v7.17.10/go.mod h1:OJ4wdbtDNk5g503kvlHLyErCgQwwzmDtaFC4XyOxXA4=
github.com/elastic/go-elasticsearch/v8 v8.17.0 h1:e9cWksE/Fr7urDRmGPGp47Nsp4/mvNOrU8As1l2HQQ0=
github.com/elastic/go-elasticsearch/v8 v8.17.0/go.mod h1:lGMlgKIbYoRvay3xWBeKahAiJOgmFDsjZC39nmO3H64=
github.com/elastic/go-elasticsearch/v8 v8.18.0 h1:ANNq1h7DEiPUaALb8+5w3baQzaS08WfHV0DNzp0VG4M=
github.com/elastic/go-elasticsearch/v8 v8.18.0/go.mod h1:WLqwXsJmQoYkoA9JBFeEwPkQhCfAZuUvfpdU/NvSSf0=
github.com/emicklei/go-restful/v3 v3.11.0 h1:rAQeMHw1c7zTmncogyy8VvRZwtkmkZ4FxERmMY4rD+g=
github.com/emicklei/go-restful/v3 v3.11.0/go.mod h1:6n3XBCmQQb25CM2LCACGz8ukIrRry+4bhvbpWn3mrbc=
github.com/felixge/httpsnoop v1.0.4 h1:NFTV2Zj1bL4mc9sqWACXbQFVBBg2W3GPvqp8/ESS2Wg=
@ -185,12 +163,8 @@ github.com/fvbommel/sortorder v1.0.2 h1:mV4o8B2hKboCdkJm+a7uX/SIpZob4JzUpc5GGnM4
github.com/fvbommel/sortorder v1.0.2/go.mod h1:uk88iVf1ovNn1iLfgUVU2F9o5eO30ui720w+kxuqRs0=
github.com/gabriel-vasile/mimetype v1.4.8 h1:FfZ3gj38NjllZIeJAmMhr+qKL8Wu+nOoI3GqacKw1NM=
github.com/gabriel-vasile/mimetype v1.4.8/go.mod h1:ByKUIKGjh1ODkGM1asKUbQZOLGrPjydw3hYPU2YU9t8=
github.com/gabriel-vasile/mimetype v1.4.9 h1:5k+WDwEsD9eTLL8Tz3L0VnmVh9QxGjRmjBvAG7U/oYY=
github.com/gabriel-vasile/mimetype v1.4.9/go.mod h1:WnSQhFKJuBlRyLiKohA/2DtIlPFAbguNaG7QCHcyGok=
github.com/getsentry/sentry-go v0.31.1 h1:ELVc0h7gwyhnXHDouXkhqTFSO5oslsRDk0++eyE0KJ4=
github.com/getsentry/sentry-go v0.31.1/go.mod h1:CYNcMMz73YigoHljQRG+qPF+eMq8gG72XcGN/p71BAY=
github.com/getsentry/sentry-go v0.32.0 h1:YKs+//QmwE3DcYtfKRH8/KyOOF/I6Qnx7qYGNHCGmCY=
github.com/getsentry/sentry-go v0.32.0/go.mod h1:CYNcMMz73YigoHljQRG+qPF+eMq8gG72XcGN/p71BAY=
github.com/go-errors/errors v1.4.2 h1:J6MZopCL4uSllY1OfXM374weqZFFItUbrImctkmUxIA=
github.com/go-errors/errors v1.4.2/go.mod h1:sIVyrIiJhuEF+Pj9Ebtd6P/rEYROXFi3BopGUQ5a5Og=
github.com/go-faster/city v1.0.1 h1:4WAxSZ3V2Ws4QRDrscLEDcibJY8uf41H6AhXDrNDcGw=
@ -220,8 +194,6 @@ github.com/go-playground/universal-translator v0.18.1 h1:Bcnm0ZwsGyWbCzImXv+pAJn
github.com/go-playground/universal-translator v0.18.1/go.mod h1:xekY+UJKNuX9WP91TpwSH2VMlDf28Uj24BCp08ZFTUY=
github.com/go-playground/validator/v10 v10.24.0 h1:KHQckvo8G6hlWnrPX4NJJ+aBfWNAE/HH+qdL2cBpCmg=
github.com/go-playground/validator/v10 v10.24.0/go.mod h1:GGzBIJMuE98Ic/kJsBXbz1x/7cByt++cQ+YOuDM5wus=
github.com/go-playground/validator/v10 v10.26.0 h1:SP05Nqhjcvz81uJaRfEV0YBSSSGMc/iMaVtFbr3Sw2k=
github.com/go-playground/validator/v10 v10.26.0/go.mod h1:I5QpIEbmr8On7W0TktmJAumgzX4CA1XNl4ZmDuVHKKo=
github.com/go-redis/redis v6.15.9+incompatible h1:K0pv1D7EQUjfyoMql+r/jZqCLizCGKFlFgcHWWmHQjg=
github.com/go-redis/redis v6.15.9+incompatible/go.mod h1:NAIEuMOZ/fxfXJIrKDQDz8wamY7mA7PouImQ2Jvg6kA=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
@ -239,8 +211,6 @@ github.com/gogo/protobuf v1.3.2 h1:Ov1cvc58UF3b5XjBnZv7+opcTcQFZebYjWzi34vdm4Q=
github.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q=
github.com/golang-jwt/jwt/v5 v5.2.1 h1:OuVbFODueb089Lh128TAcimifWaLhJwVflnrgM17wHk=
github.com/golang-jwt/jwt/v5 v5.2.1/go.mod h1:pqrtFR0X4osieyHYxtmOUWsAWrfe1Q5UVIyoH402zdk=
github.com/golang-jwt/jwt/v5 v5.2.2 h1:Rl4B7itRWVtYIHFrSNd7vhTiz9UpLdi6gZhZ3wEeDy8=
github.com/golang-jwt/jwt/v5 v5.2.2/go.mod h1:pqrtFR0X4osieyHYxtmOUWsAWrfe1Q5UVIyoH402zdk=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk=
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
@ -252,7 +222,6 @@ github.com/google/go-cmp v0.5.2/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/gofuzz v1.2.0 h1:xRy4A+RhZaiKjJ1bPfwQ8sedCA+YS2YcCHW6ec7JMi0=
github.com/google/gofuzz v1.2.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/renameio v0.1.0/go.mod h1:KWCgfxg9yswjAJkECMjeO8J8rahYeXnNhOm40UhjYkI=
@ -359,8 +328,6 @@ github.com/kkdai/bstream v0.0.0-20161212061736-f391b8402d23/go.mod h1:J+Gs4SYgM6
github.com/klauspost/compress v1.13.6/go.mod h1:/3/Vjq9QcHkK5uEr5lBEmyoZ1iFhe47etQ6QUkpK6sk=
github.com/klauspost/compress v1.17.11 h1:In6xLpyWOi1+C7tXUUWv2ot1QvBjxevKAaI6IXrJmUc=
github.com/klauspost/compress v1.17.11/go.mod h1:pMDklpSncoRMuLFrf1W9Ss9KT+0rH90U12bZKk7uwG0=
github.com/klauspost/compress v1.18.0 h1:c/Cqfb0r+Yi+JtIEq73FWXVkRonBlf0CRNYc8Zttxdo=
github.com/klauspost/compress v1.18.0/go.mod h1:2Pp+KzxcywXVXMr50+X0Q/Lsb43OQHYWRCY2AiWywWQ=
github.com/klauspost/pgzip v1.2.6 h1:8RXeL5crjEUFnR2/Sn6GJNWtSQ3Dk8pq4CL3jvdDyjU=
github.com/klauspost/pgzip v1.2.6/go.mod h1:Ch1tH69qFZu15pkjo5kYi6mth2Zzwzt50oCQKQE9RUs=
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
@ -474,20 +441,12 @@ github.com/power-devops/perfstat v0.0.0-20210106213030-5aafc221ea8c h1:ncq/mPwQF
github.com/power-devops/perfstat v0.0.0-20210106213030-5aafc221ea8c/go.mod h1:OmDBASR4679mdNQnz2pUhc2G8CO2JrUAVFDRBDP/hJE=
github.com/prometheus/client_golang v1.20.5 h1:cxppBPuYhUnsO6yo/aoRol4L7q7UFfdm+bR9r+8l63Y=
github.com/prometheus/client_golang v1.20.5/go.mod h1:PIEt8X02hGcP8JWbeHyeZ53Y/jReSnHgO035n//V5WE=
github.com/prometheus/client_golang v1.22.0 h1:rb93p9lokFEsctTys46VnV1kLCDpVZ0a/Y92Vm0Zc6Q=
github.com/prometheus/client_golang v1.22.0/go.mod h1:R7ljNsLXhuQXYZYtw6GAE9AZg8Y7vEW5scdCXrWRXC0=
github.com/prometheus/client_model v0.6.1 h1:ZKSh/rekM+n3CeS952MLRAdFwIKqeY8b62p8ais2e9E=
github.com/prometheus/client_model v0.6.1/go.mod h1:OrxVMOVHjw3lKMa8+x6HeMGkHMQyHDk9E3jmP2AmGiY=
github.com/prometheus/client_model v0.6.2 h1:oBsgwpGs7iVziMvrGhE53c/GrLUsZdHnqNwqPLxwZyk=
github.com/prometheus/client_model v0.6.2/go.mod h1:y3m2F6Gdpfy6Ut/GBsUqTWZqCUvMVzSfMLjcu6wAwpE=
github.com/prometheus/common v0.62.0 h1:xasJaQlnWAeyHdUBeGjXmutelfJHWMRr+Fg4QszZ2Io=
github.com/prometheus/common v0.62.0/go.mod h1:vyBcEuLSvWos9B1+CyL7JZ2up+uFzXhkqml0W5zIY1I=
github.com/prometheus/common v0.63.0 h1:YR/EIY1o3mEFP/kZCD7iDMnLPlGyuU2Gb3HIcXnA98k=
github.com/prometheus/common v0.63.0/go.mod h1:VVFF/fBIoToEnWRVkYoXEkq3R3paCoxG9PXP74SnV18=
github.com/prometheus/procfs v0.15.1 h1:YagwOFzUgYfKKHX6Dr+sHT7km/hxC76UB0learggepc=
github.com/prometheus/procfs v0.15.1/go.mod h1:fB45yRUv8NstnjriLhBQLuOUt+WW4BsoGhij/e3PBqk=
github.com/prometheus/procfs v0.16.0 h1:xh6oHhKwnOJKMYiYBDWmkHqQPyiY40sny36Cmx2bbsM=
github.com/prometheus/procfs v0.16.0/go.mod h1:8veyXUu3nGP7oaCxhX6yeaM5u4stL2FeMXnCqhDthZg=
github.com/r3labs/sse v0.0.0-20210224172625-26fe804710bc h1:zAsgcP8MhzAbhMnB1QQ2O7ZhWYVGYSR2iVcjzQuPV+o=
github.com/r3labs/sse v0.0.0-20210224172625-26fe804710bc/go.mod h1:S8xSOnV3CgpNrWd0GQ/OoQfMtlg2uPRSuTzcSGrzwK8=
github.com/rivo/uniseg v0.2.0 h1:S1pD9weZBuJdFmowNwbpi7BJ8TNftyUImj/0WQi72jY=
@ -509,8 +468,6 @@ github.com/serialx/hashring v0.0.0-20200727003509-22c0c7ab6b1b h1:h+3JX2VoWTFuyQ
github.com/serialx/hashring v0.0.0-20200727003509-22c0c7ab6b1b/go.mod h1:/yeG0My1xr/u+HZrFQ1tOQQQQrOawfyMUH13ai5brBc=
github.com/sethvargo/go-envconfig v1.1.0 h1:cWZiJxeTm7AlCvzGXrEXaSTCNgip5oJepekh/BOQuog=
github.com/sethvargo/go-envconfig v1.1.0/go.mod h1:JLd0KFWQYzyENqnEPWWZ49i4vzZo/6nRidxI8YvGiHw=
github.com/sethvargo/go-envconfig v1.2.0 h1:q3XkOZWkC+G1sMLCrw9oPGTjYexygLOXDmGUit1ti8Q=
github.com/sethvargo/go-envconfig v1.2.0/go.mod h1:JLd0KFWQYzyENqnEPWWZ49i4vzZo/6nRidxI8YvGiHw=
github.com/shibumi/go-pathspec v1.3.0 h1:QUyMZhFo0Md5B8zV8x2tesohbb5kfbpTi9rBnKh5dkI=
github.com/shibumi/go-pathspec v1.3.0/go.mod h1:Xutfslp817l2I1cZvgcfeMQJG5QnU2lh5tVaaMCl3jE=
github.com/shirou/gopsutil v3.21.11+incompatible h1:+1+c1VGhc88SSonWP6foOcLhvnKlUeu/erjjvaPEYiI=
@ -571,8 +528,6 @@ github.com/tonistiigi/vt100 v0.0.0-20240514184818-90bafcd6abab h1:H6aJ0yKQ0gF49Q
github.com/tonistiigi/vt100 v0.0.0-20240514184818-90bafcd6abab/go.mod h1:ulncasL3N9uLrVann0m+CDlJKWsIAP34MPcOJF6VRvc=
github.com/ua-parser/uap-go v0.0.0-20250126222208-a52596c19dff h1:NwMEGwb7JJ8wPjT8OPKP5hO1Xz6AQ7Z00+GLSJfW21s=
github.com/ua-parser/uap-go v0.0.0-20250126222208-a52596c19dff/go.mod h1:BUbeWZiieNxAuuADTBNb3/aeje6on3DhU3rpWsQSB1E=
github.com/ua-parser/uap-go v0.0.0-20250326155420-f7f5a2f9f5bc h1:reH9QQKGFOq39MYOvU9+SYrB8uzXtWNo51fWK3g0gGc=
github.com/ua-parser/uap-go v0.0.0-20250326155420-f7f5a2f9f5bc/go.mod h1:gwANdYmo9R8LLwGnyDFWK2PMsaXXX2HhAvCnb/UhZsM=
github.com/xdg-go/pbkdf2 v1.0.0/go.mod h1:jrpuAogTd400dnrH08LKmI/xc1MbPOebTwRqcT5RDeI=
github.com/xdg-go/scram v1.1.1/go.mod h1:RaEWvsqvNKKvBPvcKeFjrG2cJqOkHTiyTpzz23ni57g=
github.com/xdg-go/stringprep v1.0.3/go.mod h1:W3f5j4i+9rC0kuIEJL0ky1VpHXQU3ocBgklLGvcBnW8=
@ -602,8 +557,6 @@ go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0 h1:jq9TW8u
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0/go.mod h1:p8pYQP+m5XfbZm9fxtSKAbM6oIllS7s2AfxrChvc7iw=
go.opentelemetry.io/otel v1.34.0 h1:zRLXxLCgL1WyKsPVrgbSdMN4c0FMkDAskSTQP+0hdUY=
go.opentelemetry.io/otel v1.34.0/go.mod h1:OWFPOQ+h4G8xpyjgqo4SxJYdDQ/qmRH+wivy7zzx9oI=
go.opentelemetry.io/otel v1.35.0 h1:xKWKPxrxB6OtMCbmMY021CqC45J+3Onta9MqjhnusiQ=
go.opentelemetry.io/otel v1.35.0/go.mod h1:UEqy8Zp11hpkUrL73gSlELM0DupHoiq72dR+Zqel/+Y=
go.opentelemetry.io/otel/exporters/otlp/otlpmetric v0.42.0 h1:ZtfnDL+tUrs1F0Pzfwbg2d59Gru9NCH3bgSHBM6LDwU=
go.opentelemetry.io/otel/exporters/otlp/otlpmetric v0.42.0/go.mod h1:hG4Fj/y8TR/tlEDREo8tWstl9fO9gcFkn4xrx0Io8xU=
go.opentelemetry.io/otel/exporters/otlp/otlpmetric/otlpmetricgrpc v0.42.0 h1:NmnYCiR0qNufkldjVvyQfZTHSdzeHoZ41zggMsdMcLM=
@ -618,16 +571,12 @@ go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.21.0 h1:digkE
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.21.0/go.mod h1:/OpE/y70qVkndM0TrxT4KBoN3RsFZP0QaofcfYrj76I=
go.opentelemetry.io/otel/metric v1.34.0 h1:+eTR3U0MyfWjRDhmFMxe2SsW64QrZ84AOhvqS7Y+PoQ=
go.opentelemetry.io/otel/metric v1.34.0/go.mod h1:CEDrp0fy2D0MvkXE+dPV7cMi8tWZwX3dmaIhwPOaqHE=
go.opentelemetry.io/otel/metric v1.35.0 h1:0znxYu2SNyuMSQT4Y9WDWej0VpcsxkuklLa4/siN90M=
go.opentelemetry.io/otel/metric v1.35.0/go.mod h1:nKVFgxBZ2fReX6IlyW28MgZojkoAkJGaE8CpgeAU3oE=
go.opentelemetry.io/otel/sdk v1.24.0 h1:YMPPDNymmQN3ZgczicBY3B6sf9n62Dlj9pWD3ucgoDw=
go.opentelemetry.io/otel/sdk v1.24.0/go.mod h1:KVrIYw6tEubO9E96HQpcmpTKDVn9gdv35HoYiQWGDFg=
go.opentelemetry.io/otel/sdk/metric v1.21.0 h1:smhI5oD714d6jHE6Tie36fPx4WDFIg+Y6RfAY4ICcR0=
go.opentelemetry.io/otel/sdk/metric v1.21.0/go.mod h1:FJ8RAsoPGv/wYMgBdUJXOm+6pzFY3YdljnXtv1SBE8Q=
go.opentelemetry.io/otel/trace v1.34.0 h1:+ouXS2V8Rd4hp4580a8q23bg0azF2nI8cqLYnC8mh/k=
go.opentelemetry.io/otel/trace v1.34.0/go.mod h1:Svm7lSjQD7kG7KJ/MUHPVXSDGz2OX4h0M2jHBhmSfRE=
go.opentelemetry.io/otel/trace v1.35.0 h1:dPpEfJu1sDIqruz7BHFG3c7528f6ddfSWfFDVt/xgMs=
go.opentelemetry.io/otel/trace v1.35.0/go.mod h1:WUk7DtFp1Aw2MkvqGdwiXYDZZNvA/1J8o6xRXLrIkyc=
go.opentelemetry.io/proto/otlp v1.0.0 h1:T0TX0tmXU8a3CbNXzEKGeU5mIVOdf0oykP+u2lIVU/I=
go.opentelemetry.io/proto/otlp v1.0.0/go.mod h1:Sy6pihPLfYHkr3NkUbEhGHFhINUSI/v80hjKIs5JXpM=
go.uber.org/atomic v1.3.2/go.mod h1:gD2HeocX3+yG+ygLZcrzQJaqmWj9AIm7n08wl/qW/PE=
@ -668,10 +617,6 @@ golang.org/x/crypto v0.32.0 h1:euUpcYgM8WcP71gNpTqQCn6rC2t6ULUPiOzfWaXVVfc=
golang.org/x/crypto v0.32.0/go.mod h1:ZnnJkOaASj8g0AjIduWNlq2NRxL0PlBrbKVyZ6V/Ugc=
golang.org/x/crypto v0.33.0 h1:IOBPskki6Lysi0lo9qQvbxiQ+FvsCC/YWOecCHAixus=
golang.org/x/crypto v0.33.0/go.mod h1:bVdXmD7IV/4GdElGPozy6U7lWdRXA4qyRVGJV57uQ5M=
golang.org/x/crypto v0.36.0 h1:AnAEvhDddvBdpY+uR+MyHmuZzzNqXSe/GvuDeob5L34=
golang.org/x/crypto v0.36.0/go.mod h1:Y4J0ReaxCR1IMaabaSMugxJES1EpwhBHhv2bDHklZvc=
golang.org/x/crypto v0.37.0 h1:kJNSjF/Xp7kU0iB2Z+9viTPMW4EqqsrywMXLJOOsXSE=
golang.org/x/crypto v0.37.0/go.mod h1:vg+k43peMZ0pUMhYmVAWysMK35e6ioLh3wB8ZCAfbVc=
golang.org/x/exp v0.0.0-20240112132812-db7319d0e0e3 h1:hNQpMuAJe5CtcUqCXaWga3FHu+kQvCqcsoVaQgSV60o=
golang.org/x/exp v0.0.0-20240112132812-db7319d0e0e3/go.mod h1:idGWGoKP1toJGkd5/ig9ZLuPcZBC3ewk7SzmH0uou08=
golang.org/x/lint v0.0.0-20190930215403-16217165b5de/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=
@ -698,14 +643,8 @@ golang.org/x/net v0.34.0 h1:Mb7Mrk043xzHgnRM88suvJFwzVrRfHEHJEl5/71CKw0=
golang.org/x/net v0.34.0/go.mod h1:di0qlW3YNM5oh6GqDGQr92MyTozJPmybPK4Ev/Gm31k=
golang.org/x/net v0.35.0 h1:T5GQRQb2y08kTAByq9L4/bz8cipCdA8FbRTXewonqY8=
golang.org/x/net v0.35.0/go.mod h1:EglIi67kWsHKlRzzVMUD93VMSWGFOMSZgxFjparz1Qk=
golang.org/x/net v0.38.0 h1:vRMAPTMaeGqVhG5QyLJHqNDwecKTomGeqbnfZyKlBI8=
golang.org/x/net v0.38.0/go.mod h1:ivrbrMbzFq5J41QOQh0siUuly180yBYtLp+CKbEaFx8=
golang.org/x/net v0.39.0 h1:ZCu7HMWDxpXpaiKdhzIfaltL9Lp31x/3fCP11bc6/fY=
golang.org/x/net v0.39.0/go.mod h1:X7NRbYVEA+ewNkCNyJ513WmMdQ3BineSwVtN2zD/d+E=
golang.org/x/oauth2 v0.25.0 h1:CY4y7XT9v0cRI9oupztF8AgiIu99L/ksR/Xp/6jrZ70=
golang.org/x/oauth2 v0.25.0/go.mod h1:XYTD2NtWslqkgxebSiOHnXEap4TF09sJSc7H1sXbhtI=
golang.org/x/oauth2 v0.29.0 h1:WdYw2tdTK1S8olAzWHdgeqfy+Mtm9XNhv/xJsY65d98=
golang.org/x/oauth2 v0.29.0/go.mod h1:onh5ek6nERTohokkhCD/y2cV4Do3fxFHFuAejCkRWT8=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20190911185100-cd5d95a43a6e/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@ -740,10 +679,6 @@ golang.org/x/sys v0.29.0 h1:TPYlXGxvx1MGTn2GiZDhnjPA9wZzZeGKHHmKhHYvgaU=
golang.org/x/sys v0.29.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.30.0 h1:QjkSwP/36a20jFYWkSue1YwXzLmsV5Gfq7Eiy72C1uc=
golang.org/x/sys v0.30.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.31.0 h1:ioabZlmFYtWhL+TRYpcnNlLwhyxaM9kWTDEmfnprqik=
golang.org/x/sys v0.31.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/sys v0.32.0 h1:s77OFDvIQeibCmezSnk/q6iAfkdiQaJi4VzroCFrN20=
golang.org/x/sys v0.32.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/term v0.0.0-20201117132131-f5c789dd3221/go.mod h1:Nr5EML6q2oocZ2LXRh80K7BxOlk5/8JxuGnuhpl+muw=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
@ -765,10 +700,6 @@ golang.org/x/text v0.21.0 h1:zyQAAkrwaneQ066sspRyJaG9VNi/YJ1NfzcGB3hZ/qo=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
golang.org/x/text v0.22.0 h1:bofq7m3/HAFvbF51jz3Q9wLg3jkvSPuiZu/pD1XwgtM=
golang.org/x/text v0.22.0/go.mod h1:YRoo4H8PVmsu+E3Ou7cqLVH8oXWIHVoX0jqUWALQhfY=
golang.org/x/text v0.23.0 h1:D71I7dUrlY+VX0gQShAThNGHFxZ13dGLBHQLVl1mJlY=
golang.org/x/text v0.23.0/go.mod h1:/BLNzu4aZCJ1+kcD0DNRotWKage4q2rGVAg4o22unh4=
golang.org/x/text v0.24.0 h1:dd5Bzh4yt5KYA8f9CJHCP4FB4D51c2c6JvN37xJJkJ0=
golang.org/x/text v0.24.0/go.mod h1:L8rBsPeo2pSS+xqN0d5u2ikmjtmoJbDBT1b7nHvFCdU=
golang.org/x/time v0.6.0 h1:eTDhh4ZXt5Qf0augr54TN6suAUudPcawVZeIAPU7D4U=
golang.org/x/time v0.6.0/go.mod h1:3BpzKBy/shNhVucY/MWOyx10tF3SFh9QdLuxbVysPQM=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
@ -796,18 +727,12 @@ google.golang.org/genproto/googleapis/api v0.0.0-20240318140521-94a12d6c2237 h1:
google.golang.org/genproto/googleapis/api v0.0.0-20240318140521-94a12d6c2237/go.mod h1:Z5Iiy3jtmioajWHDGFk7CeugTyHtPvMHA4UTmUkyalE=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250127172529-29210b9bc287 h1:J1H9f+LEdWAfHcez/4cvaVBox7cOYT+IU6rgqj5x++8=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250127172529-29210b9bc287/go.mod h1:8BS3B93F/U1juMFq9+EDk+qOT5CO1R9IzXxG3PTqiRk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250414145226-207652e42e2e h1:ztQaXfzEXTmCBvbtWYRhJxW+0iJcz2qXfd38/e9l7bA=
google.golang.org/genproto/googleapis/rpc v0.0.0-20250414145226-207652e42e2e/go.mod h1:qQ0YXyHHx3XkvlzUtpXDkS29lDSafHMZBAZDc03LQ3A=
google.golang.org/grpc v1.64.1 h1:LKtvyfbX3UGVPFcGqJ9ItpVWW6oN/2XqTxfAnwRRXiA=
google.golang.org/grpc v1.64.1/go.mod h1:hiQF4LFZelK2WKaP6W0L92zGHtiQdZxk8CrSdvyjeP0=
google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
google.golang.org/protobuf v1.27.1/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
google.golang.org/protobuf v1.36.4 h1:6A3ZDJHn/eNqc1i+IdefRzy/9PokBTPvcqMySR7NNIM=
google.golang.org/protobuf v1.36.4/go.mod h1:9fA7Ob0pmnwhb644+1+CVWFRbNajQ6iRojtC/QF5bRE=
google.golang.org/protobuf v1.36.5 h1:tPhr+woSbjfYvY6/GPufUoYizxw1cF/yFoxJ2fmpwlM=
google.golang.org/protobuf v1.36.5/go.mod h1:9fA7Ob0pmnwhb644+1+CVWFRbNajQ6iRojtC/QF5bRE=
google.golang.org/protobuf v1.36.6 h1:z1NpPI8ku2WgiWnf+t9wTPsn6eP1L7ksHUlkfLvd9xY=
google.golang.org/protobuf v1.36.6/go.mod h1:jduwjTPXsFjZGTmRluh+L6NjiWu7pchiJ2/5YcXBHnY=
gopkg.in/cenkalti/backoff.v1 v1.1.0 h1:Arh75ttbsvlpVA7WtVpH4u9h6Zl46xuptxqLxPiSo4Y=
gopkg.in/cenkalti/backoff.v1 v1.1.0/go.mod h1:J6Vskwqd+OMVJl8C33mmtxTBs2gyzfv7UDAkHu8BrjI=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=

View file

@ -3,10 +3,11 @@ package datasaver
import (
"context"
"encoding/json"
"openreplay/backend/pkg/db/types"
"openreplay/backend/internal/config/db"
"openreplay/backend/pkg/db/clickhouse"
"openreplay/backend/pkg/db/postgres"
"openreplay/backend/pkg/db/types"
"openreplay/backend/pkg/logger"
. "openreplay/backend/pkg/messages"
queue "openreplay/backend/pkg/queue/types"

View file

@ -2,6 +2,7 @@ package datasaver
import (
"context"
"openreplay/backend/pkg/db/postgres"
"openreplay/backend/pkg/db/types"
"openreplay/backend/pkg/messages"
@ -140,11 +141,6 @@ func (s *saverImpl) handleWebMessage(sessCtx context.Context, session *sessions.
return err
}
return s.ch.InsertWebPerformanceTrackAggr(session, m)
case *messages.Incident:
if err := s.pg.InsertIncident(session, m); err != nil {
return err
}
return s.ch.InsertIncident(session, m)
}
return nil
}
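
Note: handleWebMessage fans each event out to Postgres first and only mirrors it into ClickHouse when that write succeeds; the dropped Incident case followed the same pattern, matching the InsertIncident removals from the Connector interface and the Postgres layer further down. A toy sketch of that write ordering (dualWrite is a hypothetical helper, not in the codebase):

package main

import (
    "errors"
    "fmt"
)

// dualWrite mirrors the handler's ordering: persist to Postgres first,
// and only on success mirror the event into ClickHouse.
func dualWrite(pgInsert, chInsert func() error) error {
    if err := pgInsert(); err != nil {
        return err
    }
    return chInsert()
}

func main() {
    err := dualWrite(
        func() error { return errors.New("pg down") },
        func() error { return nil },
    )
    fmt.Println(err) // pg down; the ClickHouse write is never attempted
}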

View file

@ -133,17 +133,6 @@ func (e *AssetsCache) ParseAssets(msg messages.Message) messages.Message {
}
newMsg.SetMeta(msg.Meta())
return newMsg
case *messages.CSSInsertRuleURLBased:
if e.shouldSkipAsset(m.BaseURL) {
return msg
}
newMsg := &messages.CSSInsertRule{
ID: m.ID,
Index: m.Index,
Rule: e.handleCSS(m.SessionID(), m.BaseURL, m.Rule),
}
newMsg.SetMeta(msg.Meta())
return newMsg
case *messages.AdoptedSSReplaceURLBased:
if e.shouldSkipAsset(m.BaseURL) {
return msg

View file

@ -39,7 +39,6 @@ type Connector interface {
InsertIssue(session *sessions.Session, msg *messages.IssueEvent) error
InsertWebInputDuration(session *sessions.Session, msg *messages.InputChange) error
InsertMouseThrashing(session *sessions.Session, msg *messages.MouseThrashing) error
InsertIncident(session *sessions.Session, msg *messages.Incident) error
InsertMobileSession(session *sessions.Session) error
InsertMobileCustom(session *sessions.Session, msg *messages.MobileEvent) error
InsertMobileClick(session *sessions.Session, msg *messages.MobileClickEvent) error
@ -107,17 +106,17 @@ func (c *connectorImpl) newBatch(name, query string) error {
}
var batches = map[string]string{
"sessions": "INSERT INTO experimental.sessions (session_id, project_id, user_id, user_uuid, user_os, user_os_version, user_device, user_device_type, user_country, user_state, user_city, datetime, duration, pages_count, events_count, errors_count, issue_score, referrer, issue_types, tracker_version, user_browser, user_browser_version, metadata_1, metadata_2, metadata_3, metadata_4, metadata_5, metadata_6, metadata_7, metadata_8, metadata_9, metadata_10, platform, timezone, utm_source, utm_medium, utm_campaign, screen_width, screen_height) VALUES (?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?)",
"sessions": "INSERT INTO experimental.sessions (session_id, project_id, user_id, user_uuid, user_os, user_os_version, user_device, user_device_type, user_country, user_state, user_city, datetime, duration, pages_count, events_count, errors_count, issue_score, referrer, issue_types, tracker_version, user_browser, user_browser_version, metadata_1, metadata_2, metadata_3, metadata_4, metadata_5, metadata_6, metadata_7, metadata_8, metadata_9, metadata_10, platform, timezone, utm_source, utm_medium, utm_campaign) VALUES (?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?)",
"autocompletes": "INSERT INTO experimental.autocomplete (project_id, type, value) VALUES (?, ?, SUBSTR(?, 1, 8000))",
"pages": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"clicks": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"inputs": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$duration_s", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"errors": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", error_id, "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"performance": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"requests": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$duration_s", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"custom": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$properties", properties) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"graphql": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"issuesEvents": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$os", "$browser", "$referrer", "$country", "$state", "$city", "$current_url", issue_type, issue_id, "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"pages": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$current_url", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"clicks": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$current_url", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"inputs": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$duration_s", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"errors": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", error_id, "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"performance": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"requests": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$duration_s", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"custom": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"graphql": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"issuesEvents": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", issue_type, issue_id, "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
"issues": "INSERT INTO experimental.issues (project_id, issue_id, type, context_string) VALUES (?, ?, ?, ?)",
"mobile_sessions": "INSERT INTO experimental.sessions (session_id, project_id, user_id, user_uuid, user_os, user_os_version, user_device, user_device_type, user_country, user_state, user_city, datetime, duration, pages_count, events_count, errors_count, issue_score, referrer, issue_types, tracker_version, user_browser, user_browser_version, metadata_1, metadata_2, metadata_3, metadata_4, metadata_5, metadata_6, metadata_7, metadata_8, metadata_9, metadata_10, platform, timezone) VALUES (?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000), ?, ?)",
"mobile_custom": `INSERT INTO product_analytics.events (session_id, project_id, event_id, "$event_name", created_at, "$time", distinct_id, "$auto_captured", "$device", "$os_version", "$properties") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
@ -221,8 +220,6 @@ func (c *connectorImpl) InsertWebSession(session *sessions.Session) error {
session.UtmSource,
session.UtmMedium,
session.UtmCampaign,
session.ScreenWidth,
session.ScreenHeight,
); err != nil {
c.checkError("sessions", err)
return fmt.Errorf("can't append to sessions batch: %s", err)
@ -254,7 +251,6 @@ func (c *connectorImpl) InsertWebInputDuration(session *sessions.Session, msg *m
"hesitation_time": nullableUint32(uint32(msg.HesitationTime)),
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal input event: %s", err)
@ -271,13 +267,6 @@ func (c *connectorImpl) InsertWebInputDuration(session *sessions.Session, msg *m
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
nullableUint16(uint16(msg.InputDuration)),
jsonString,
); err != nil {
@ -302,7 +291,6 @@ func (c *connectorImpl) InsertMouseThrashing(session *sessions.Session, msg *mes
"url_hostpath": hostpath,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal issue event: %s", err)
@ -319,13 +307,6 @@ func (c *connectorImpl) InsertMouseThrashing(session *sessions.Session, msg *mes
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
"mouse_thrashing",
issueID,
jsonString,
@ -366,7 +347,6 @@ func (c *connectorImpl) InsertIssue(session *sessions.Session, msg *messages.Iss
"url_hostpath": hostpath,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal issue event: %s", err)
@ -383,13 +363,6 @@ func (c *connectorImpl) InsertIssue(session *sessions.Session, msg *messages.Iss
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
msg.Type,
issueID,
jsonString,
@ -463,7 +436,6 @@ func (c *connectorImpl) InsertWebPageEvent(session *sessions.Session, msg *messa
"load_event_time": loadEventTime,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal page event: %s", err)
@ -480,12 +452,6 @@ func (c *connectorImpl) InsertWebPageEvent(session *sessions.Session, msg *messa
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.URL),
jsonString,
); err != nil {
@ -530,7 +496,6 @@ func (c *connectorImpl) InsertWebClickEvent(session *sessions.Session, msg *mess
"url_hostpath": hostpath,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal click event: %s", err)
@ -547,12 +512,6 @@ func (c *connectorImpl) InsertWebClickEvent(session *sessions.Session, msg *mess
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
jsonString,
); err != nil {
@ -576,7 +535,6 @@ func (c *connectorImpl) InsertWebErrorEvent(session *sessions.Session, msg *type
"message": msg.Message,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal error event: %s", err)
@ -593,13 +551,6 @@ func (c *connectorImpl) InsertWebErrorEvent(session *sessions.Session, msg *type
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
msgID,
jsonString,
); err != nil {
@ -634,7 +585,6 @@ func (c *connectorImpl) InsertWebPerformanceTrackAggr(session *sessions.Session,
"max_used_js_heap_size": msg.MaxUsedJSHeapSize,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal performance event: %s", err)
@ -651,13 +601,6 @@ func (c *connectorImpl) InsertWebPerformanceTrackAggr(session *sessions.Session,
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
jsonString,
); err != nil {
c.checkError("performance", err)
@ -693,7 +636,6 @@ func (c *connectorImpl) InsertRequest(session *sessions.Session, msg *messages.N
"url_hostpath": hostpath,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal request event: %s", err)
@ -710,13 +652,6 @@ func (c *connectorImpl) InsertRequest(session *sessions.Session, msg *messages.N
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.URL),
nullableUint16(uint16(msg.Duration)),
jsonString,
); err != nil {
@ -728,42 +663,27 @@ func (c *connectorImpl) InsertRequest(session *sessions.Session, msg *messages.N
func (c *connectorImpl) InsertCustom(session *sessions.Session, msg *messages.CustomEvent) error {
jsonString, err := json.Marshal(map[string]interface{}{
"name": msg.Name,
"payload": msg.Payload,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal custom event: %s", err)
}
customPayload := make(map[string]interface{})
if err := json.Unmarshal([]byte(msg.Payload), &customPayload); err != nil {
log.Printf("can't unmarshal custom event payload into object: %s", err)
customPayload = map[string]interface{}{
"payload": msg.Payload,
}
}
eventTime := datetime(msg.Timestamp)
if err := c.batches["custom"].Append(
session.SessionID,
uint16(session.ProjectID),
getUUID(msg),
msg.Name,
"CUSTOM",
eventTime,
eventTime.Unix(),
session.UserUUID,
false,
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
jsonString, // $properties
customPayload, // properties
jsonString,
); err != nil {
c.checkError("custom", err)
return fmt.Errorf("can't append to custom batch: %s", err)
@ -778,7 +698,6 @@ func (c *connectorImpl) InsertGraphQL(session *sessions.Session, msg *messages.G
"response_body": nullableString(msg.Response),
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal graphql event: %s", err)
@ -795,13 +714,6 @@ func (c *connectorImpl) InsertGraphQL(session *sessions.Session, msg *messages.G
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
jsonString,
); err != nil {
c.checkError("graphql", err)
@ -810,45 +722,6 @@ func (c *connectorImpl) InsertGraphQL(session *sessions.Session, msg *messages.G
return nil
}
func (c *connectorImpl) InsertIncident(session *sessions.Session, msg *messages.Incident) error {
jsonString, err := json.Marshal(map[string]interface{}{
"label": msg.Label,
"start_time": msg.StartTime,
"end_time": msg.EndTime,
"user_device": session.UserDevice,
"user_device_type": session.UserDeviceType,
"page_title ": msg.PageTitle,
})
if err != nil {
return fmt.Errorf("can't marshal custom event: %s", err)
}
eventTime := datetime(msg.Timestamp)
if err := c.batches["custom"].Append(
session.SessionID,
uint16(session.ProjectID),
getUUID(msg),
"INCIDENT",
eventTime,
eventTime.Unix(),
session.UserUUID,
true,
session.Platform,
session.UserOSVersion,
session.UserOS,
session.UserBrowser,
session.Referrer,
session.UserCountry,
session.UserState,
session.UserCity,
cropString(msg.Url),
jsonString,
); err != nil {
c.checkError("custom", err)
return fmt.Errorf("can't append to custom batch: %s", err)
}
return nil
}
// Mobile events
func (c *connectorImpl) InsertMobileSession(session *sessions.Session) error {

View file

@ -270,15 +270,3 @@ func (conn *Conn) InsertWebStatsPerformance(p *messages.PerformanceTrackAggr) er
)
return nil
}
func (conn *Conn) InsertIncident(sess *sessions.Session, e *messages.Incident) error {
sessCtx := context.WithValue(context.Background(), "sessionID", sess.SessionID)
issueID := hashid.MobileIncidentID(sess.ProjectID, sess.SessionID, e.Timestamp)
if err := conn.bulks.Get("webIssues").Append(sess.ProjectID, issueID, "incident", e.Url); err != nil {
conn.log.Error(sessCtx, "insert incident issue err: %s", err)
}
if err := conn.bulks.Get("webIssueEvents").Append(sess.SessionID, issueID, e.Timestamp, truncSqIdx(e.MsgID()), nil); err != nil {
conn.log.Error(sessCtx, "insert incident issue event err: %s", err)
}
return nil
}

View file

@ -84,10 +84,7 @@ func (p *poolImpl) Begin() (*Tx, error) {
tx, err := p.conn.Begin(context.Background())
p.metrics.RecordRequestDuration(float64(time.Now().Sub(start).Milliseconds()), "begin", "")
p.metrics.IncreaseTotalRequests("begin", "")
return &Tx{
origTx: tx,
metrics: p.metrics,
}, err
return &Tx{tx, p.metrics}, err
}
func (p *poolImpl) Close() {
@ -97,13 +94,13 @@ func (p *poolImpl) Close() {
// TX - start
type Tx struct {
origTx pgx.Tx
pgx.Tx
metrics database.Database
}
func (tx *Tx) TxExec(sql string, args ...interface{}) error {
start := time.Now()
_, err := tx.origTx.Exec(context.Background(), sql, args...)
_, err := tx.Exec(context.Background(), sql, args...)
method, table := methodName(sql)
tx.metrics.RecordRequestDuration(float64(time.Now().Sub(start).Milliseconds()), method, table)
tx.metrics.IncreaseTotalRequests(method, table)
@ -112,7 +109,7 @@ func (tx *Tx) TxExec(sql string, args ...interface{}) error {
func (tx *Tx) TxQueryRow(sql string, args ...interface{}) pgx.Row {
start := time.Now()
res := tx.origTx.QueryRow(context.Background(), sql, args...)
res := tx.QueryRow(context.Background(), sql, args...)
method, table := methodName(sql)
tx.metrics.RecordRequestDuration(float64(time.Now().Sub(start).Milliseconds()), method, table)
tx.metrics.IncreaseTotalRequests(method, table)
@ -121,7 +118,7 @@ func (tx *Tx) TxQueryRow(sql string, args ...interface{}) pgx.Row {
func (tx *Tx) TxRollback() error {
start := time.Now()
err := tx.origTx.Rollback(context.Background())
err := tx.Rollback(context.Background())
tx.metrics.RecordRequestDuration(float64(time.Now().Sub(start).Milliseconds()), "rollback", "")
tx.metrics.IncreaseTotalRequests("rollback", "")
return err
@ -129,7 +126,7 @@ func (tx *Tx) TxRollback() error {
func (tx *Tx) TxCommit() error {
start := time.Now()
err := tx.origTx.Commit(context.Background())
err := tx.Commit(context.Background())
tx.metrics.RecordRequestDuration(float64(time.Now().Sub(start).Milliseconds()), "commit", "")
tx.metrics.IncreaseTotalRequests("commit", "")
return err
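
Note: this refactor swaps the named origTx pgx.Tx field for an embedded pgx.Tx, so Exec, QueryRow, Rollback, and Commit are promoted onto *Tx and the positional literal &Tx{tx, p.metrics} fills the embedded field first. A simplified sketch of the mechanics with toy types (not the actual pgx interfaces):

package main

import "fmt"

type Inner struct{}

func (Inner) Exec(sql string) { fmt.Println("exec:", sql) }

type Metrics struct{}

// Tx embeds Inner, so Inner's methods are promoted: tx.Exec(...)
// resolves to tx.Inner.Exec(...) without a named field.
type Tx struct {
    Inner
    metrics Metrics
}

func main() {
    tx := &Tx{Inner{}, Metrics{}} // positional literal: embedded field first
    tx.Exec("SELECT 1")
}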

View file

@ -5,11 +5,10 @@ import (
"encoding/hex"
"encoding/json"
"fmt"
"github.com/google/uuid"
"hash/fnv"
"strconv"
"github.com/google/uuid"
. "openreplay/backend/pkg/messages"
)
@ -24,8 +23,41 @@ type ErrorEvent struct {
Payload string
Tags map[string]*string
OriginType int
Url string
PageTitle string
}
func unquote(s string) string {
if s[0] == '"' {
return s[1 : len(s)-1]
}
return s
}
func parseTags(tagsJSON string) (tags map[string]*string, err error) {
if len(tagsJSON) == 0 {
return nil, fmt.Errorf("empty tags")
}
if tagsJSON[0] == '[' {
var tagsArr []json.RawMessage
if err = json.Unmarshal([]byte(tagsJSON), &tagsArr); err != nil {
return
}
tags = make(map[string]*string)
for _, keyBts := range tagsArr {
tags[unquote(string(keyBts))] = nil
}
} else if tagsJSON[0] == '{' {
var tagsObj map[string]json.RawMessage
if err = json.Unmarshal([]byte(tagsJSON), &tagsObj); err != nil {
return
}
tags = make(map[string]*string)
for key, valBts := range tagsObj {
val := unquote(string(valBts))
tags[key] = &val
}
}
return
}
func WrapJSException(m *JSException) (*ErrorEvent, error) {
@ -37,8 +69,6 @@ func WrapJSException(m *JSException) (*ErrorEvent, error) {
Message: m.Message,
Payload: m.Payload,
OriginType: m.TypeID(),
Url: m.Url,
PageTitle: m.PageTitle,
}, nil
}
@ -51,8 +81,6 @@ func WrapIntegrationEvent(m *IntegrationEvent) *ErrorEvent {
Message: m.Message,
Payload: m.Payload,
OriginType: m.TypeID(),
Url: m.Url,
PageTitle: m.PageTitle,
}
}
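
Note: the new parseTags helper accepts tags in two JSON shapes: a bare array of names, whose map values become nil, and an object of name/value pairs, whose values are stored as string pointers; unquote assumes a non-empty input that may carry surrounding quotes. An illustration of the two accepted shapes, mirroring the contract rather than calling the unexported helper:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // Array form: tag names only, values left unset (nil in parseTags).
    var keys []string
    _ = json.Unmarshal([]byte(`["beta","internal"]`), &keys)
    fmt.Println(keys) // [beta internal]

    // Object form: tag names with values (pointers in parseTags).
    var kv map[string]string
    _ = json.Unmarshal([]byte(`{"plan":"pro"}`), &kv)
    fmt.Println(kv) // map[plan:pro]
}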

View file

@ -77,8 +77,6 @@ func (d *DeadClickDetector) Handle(message Message, timestamp uint64) Message {
*MoveNode,
*RemoveNode,
*SetCSSData,
*CSSInsertRule,
*CSSDeleteRule,
*SetInputValue,
*SetInputChecked:
return d.Build()

View file

@ -38,11 +38,3 @@ func MouseThrashingID(projectID uint32, sessID, ts uint64) string {
hash.Write([]byte(strconv.FormatUint(ts, 10)))
return strconv.FormatUint(uint64(projectID), 16) + hex.EncodeToString(hash.Sum(nil))
}
func MobileIncidentID(projectID uint32, sessID, ts uint64) string {
hash := fnv.New128a()
hash.Write([]byte("mobile_incident"))
hash.Write([]byte(strconv.FormatUint(sessID, 10)))
hash.Write([]byte(strconv.FormatUint(ts, 10)))
return strconv.FormatUint(uint64(projectID), 16) + hex.EncodeToString(hash.Sum(nil))
}
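
Note: the deleted MobileIncidentID followed the same recipe as the surviving helpers above: a 128-bit FNV-1a digest over a label, the session ID, and the timestamp, prefixed with the project ID in hex. A standalone copy, in case historical incident IDs ever need to be reproduced:

package main

import (
    "encoding/hex"
    "fmt"
    "hash/fnv"
    "strconv"
)

// incidentStyleID reproduces the deleted MobileIncidentID construction.
func incidentStyleID(projectID uint32, sessID, ts uint64) string {
    h := fnv.New128a()
    h.Write([]byte("mobile_incident"))
    h.Write([]byte(strconv.FormatUint(sessID, 10)))
    h.Write([]byte(strconv.FormatUint(ts, 10)))
    return strconv.FormatUint(uint64(projectID), 16) + hex.EncodeToString(h.Sum(nil))
}

func main() {
    fmt.Println(incidentStyleID(1, 42, 1700000000000))
}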

View file

@ -2,7 +2,7 @@
package messages
func IsReplayerType(id int) bool {
return 1 != id && 3 != id && 17 != id && 23 != id && 24 != id && 25 != id && 26 != id && 27 != id && 28 != id && 29 != id && 30 != id && 31 != id && 32 != id && 33 != id && 42 != id && 56 != id && 62 != id && 63 != id && 64 != id && 66 != id && 78 != id && 80 != id && 81 != id && 82 != id && 112 != id && 115 != id && 124 != id && 125 != id && 126 != id && 127 != id && 90 != id && 91 != id && 92 != id && 94 != id && 95 != id && 97 != id && 98 != id && 107 != id && 110 != id
return 1 != id && 17 != id && 23 != id && 24 != id && 26 != id && 27 != id && 28 != id && 29 != id && 30 != id && 31 != id && 32 != id && 33 != id && 42 != id && 56 != id && 63 != id && 64 != id && 66 != id && 78 != id && 81 != id && 82 != id && 112 != id && 115 != id && 124 != id && 125 != id && 126 != id && 127 != id && 90 != id && 91 != id && 92 != id && 94 != id && 95 != id && 97 != id && 98 != id && 107 != id && 110 != id
}
func IsMobileType(id int) bool {
@ -10,5 +10,5 @@ func IsMobileType(id int) bool {
}
func IsDOMType(id int) bool {
return 0 == id || 4 == id || 5 == id || 6 == id || 7 == id || 8 == id || 9 == id || 10 == id || 11 == id || 12 == id || 13 == id || 14 == id || 15 == id || 16 == id || 18 == id || 19 == id || 20 == id || 34 == id || 35 == id || 36 == id || 37 == id || 38 == id || 49 == id || 50 == id || 51 == id || 43 == id || 52 == id || 54 == id || 55 == id || 57 == id || 58 == id || 59 == id || 60 == id || 61 == id || 65 == id || 67 == id || 68 == id || 69 == id || 70 == id || 71 == id || 72 == id || 73 == id || 74 == id || 75 == id || 76 == id || 77 == id || 113 == id || 114 == id || 117 == id || 118 == id || 119 == id || 122 == id || 93 == id || 96 == id || 100 == id || 101 == id || 102 == id || 103 == id || 104 == id || 105 == id || 106 == id || 111 == id
return 0 == id || 4 == id || 5 == id || 6 == id || 7 == id || 8 == id || 9 == id || 10 == id || 11 == id || 12 == id || 13 == id || 14 == id || 15 == id || 16 == id || 18 == id || 19 == id || 20 == id || 34 == id || 35 == id || 49 == id || 50 == id || 51 == id || 43 == id || 52 == id || 54 == id || 55 == id || 57 == id || 58 == id || 60 == id || 61 == id || 68 == id || 69 == id || 70 == id || 71 == id || 72 == id || 73 == id || 74 == id || 75 == id || 76 == id || 77 == id || 113 == id || 114 == id || 117 == id || 118 == id || 119 == id || 122 == id || 93 == id || 96 == id || 100 == id || 101 == id || 102 == id || 103 == id || 104 == id || 105 == id || 106 == id || 111 == id
}
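
Note: both predicates inline dozens of literal comparisons; this revision drops the IDs of retired message types (3, 25, 62, 80 from the replayer exclusions, and the CSS-rule, animation, and long-task IDs from the DOM set) in step with the constants removed below. A hypothetical set-based rewrite of IsReplayerType, easier to keep in sync though not what the codebase uses:

// nonReplayerIDs lists the same exclusions as the updated
// IsReplayerType; a membership test replaces the chain of !=.
var nonReplayerIDs = map[int]struct{}{
    1: {}, 17: {}, 23: {}, 24: {}, 26: {}, 27: {}, 28: {}, 29: {},
    30: {}, 31: {}, 32: {}, 33: {}, 42: {}, 56: {}, 63: {}, 64: {},
    66: {}, 78: {}, 81: {}, 82: {}, 90: {}, 91: {}, 92: {}, 94: {},
    95: {}, 97: {}, 98: {}, 107: {}, 110: {}, 112: {}, 115: {},
    124: {}, 125: {}, 126: {}, 127: {},
}

func IsReplayerTypeSet(id int) bool {
    _, excluded := nonReplayerIDs[id]
    return !excluded
}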

View file

@ -44,9 +44,8 @@ func NewMessageIterator(log logger.Logger, messageHandler MessageHandler, messag
iter.filter = filter
}
iter.preFilter = map[int]struct{}{
MsgBatchMetadata: {}, MsgBatchMeta: {}, MsgTimestamp: {},
MsgSessionStart: {}, MsgSessionEnd: {}, MsgSetPageLocation: {},
MsgMobileBatchMeta: {},
MsgBatchMetadata: {}, MsgTimestamp: {}, MsgSessionStart: {},
MsgSessionEnd: {}, MsgSetPageLocation: {}, MsgMobileBatchMeta: {},
}
return iter
}
@ -152,20 +151,6 @@ func (i *messageIteratorImpl) preprocessing(msg Message) error {
i.version = m.Version
i.batchInfo.version = m.Version
case *BatchMeta: // Is not required to be present in batch since Mobile doesn't have it (though we might change it)
if i.messageInfo.Index > 1 { // Might be several 0-0 BatchMeta in a row without an error though
return fmt.Errorf("batchMeta found at the end of the batch, info: %s", i.batchInfo.Info())
}
i.messageInfo.Index = m.PageNo<<32 + m.FirstIndex // 2^32 is the maximum count of messages per page (ha-ha)
i.messageInfo.Timestamp = uint64(m.Timestamp)
if m.Timestamp == 0 {
i.zeroTsLog("BatchMeta")
}
// Try to get saved session's page url
if savedURL := i.urls.Get(i.messageInfo.batch.sessionID); savedURL != "" {
i.messageInfo.Url = savedURL
}
case *Timestamp:
i.messageInfo.Timestamp = m.Timestamp
if m.Timestamp == 0 {
@ -191,7 +176,6 @@ func (i *messageIteratorImpl) preprocessing(msg Message) error {
case *SetPageLocation:
i.messageInfo.Url = m.URL
i.messageInfo.PageTitle = m.DocumentTitle
// Save session page url in cache for using in next batches
i.urls.Set(i.messageInfo.batch.sessionID, m.URL)
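
Note: the removed BatchMeta branch packed two 32-bit counters into the single message index, page number in the high half and first in-page index in the low half. The packing and its inverse:

package main

import "fmt"

// pack stores the page number in the upper 32 bits and the first
// message index of that page in the lower 32 bits, as the removed
// PageNo<<32 + FirstIndex expression did.
func pack(pageNo, firstIndex uint64) uint64 { return pageNo<<32 + firstIndex }

func unpack(index uint64) (pageNo, firstIndex uint64) {
    return index >> 32, index & 0xFFFFFFFF
}

func main() {
    fmt.Println(unpack(pack(3, 17))) // 3 17
}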

View file

@ -2,34 +2,6 @@ package messages
func transformDeprecated(msg Message) Message {
switch m := msg.(type) {
case *JSExceptionDeprecated:
return &JSException{
Name: m.Name,
Message: m.Message,
Payload: m.Payload,
Metadata: "{}",
}
case *Fetch:
return &NetworkRequest{
Type: "fetch",
Method: m.Method,
URL: m.URL,
Request: m.Request,
Response: m.Response,
Status: m.Status,
Timestamp: m.Timestamp,
Duration: m.Duration,
}
case *IssueEventDeprecated:
return &IssueEvent{
MessageID: m.MessageID,
Timestamp: m.Timestamp,
Type: m.Type,
ContextString: m.ContextString,
Context: m.Context,
Payload: m.Payload,
URL: "",
}
case *ResourceTimingDeprecated:
return &ResourceTiming{
Timestamp: m.Timestamp,

View file

@ -54,7 +54,6 @@ type message struct {
Timestamp uint64
Index uint64
Url string
PageTitle string
batch *BatchInfo
}
@ -71,7 +70,6 @@ func (m *message) SetMeta(origin *message) {
m.Timestamp = origin.Timestamp
m.Index = origin.Index
m.Url = origin.Url
m.PageTitle = origin.PageTitle
}
func (m *message) SessionID() uint64 {

View file

@ -4,7 +4,6 @@ package messages
const (
MsgTimestamp = 0
MsgSessionStart = 1
MsgSessionEndDeprecated = 3
MsgSetPageLocationDeprecated = 4
MsgSetViewportSize = 5
MsgSetViewportScroll = 6
@ -26,7 +25,6 @@ const (
MsgConsoleLog = 22
MsgPageLoadTiming = 23
MsgPageRenderTiming = 24
MsgJSExceptionDeprecated = 25
MsgIntegrationEvent = 26
MsgCustomEvent = 27
MsgUserID = 28
@ -37,10 +35,6 @@ const (
MsgPageEvent = 33
MsgStringDictGlobal = 34
MsgSetNodeAttributeDictGlobal = 35
MsgNodeAnimationResult = 36
MsgCSSInsertRule = 37
MsgCSSDeleteRule = 38
MsgFetch = 39
MsgProfiler = 40
MsgOTable = 41
MsgStateAction = 42
@ -54,21 +48,17 @@ const (
MsgSetNodeAttributeDictDeprecated = 51
MsgStringDict = 43
MsgSetNodeAttributeDict = 52
MsgResourceTimingDeprecatedDeprecated = 53
MsgResourceTimingDeprecated = 53
MsgConnectionInformation = 54
MsgSetPageVisibility = 55
MsgPerformanceTrackAggr = 56
MsgLoadFontFace = 57
MsgSetNodeFocus = 58
MsgLongTask = 59
MsgSetNodeAttributeURLBased = 60
MsgSetCSSDataURLBased = 61
MsgIssueEventDeprecated = 62
MsgTechnicalInfo = 63
MsgCustomIssue = 64
MsgSetNodeSlot = 65
MsgAssetCache = 66
MsgCSSInsertRuleURLBased = 67
MsgMouseClick = 68
MsgMouseClickDeprecated = 69
MsgCreateIFrameDocument = 70
@ -81,19 +71,15 @@ const (
MsgAdoptedSSRemoveOwner = 77
MsgJSException = 78
MsgZustand = 79
MsgBatchMeta = 80
MsgBatchMetadata = 81
MsgPartitionedMessage = 82
MsgNetworkRequest = 83
MsgWSChannel = 84
MsgResourceTiming = 85
MsgIncident = 87
MsgLongAnimationTask = 89
MsgInputChange = 112
MsgSelectionChange = 113
MsgMouseThrashing = 114
MsgUnbindNodes = 115
MsgResourceTimingDeprecated = 116
MsgResourceTiming = 116
MsgTabChange = 117
MsgTabData = 118
MsgCanvasNode = 119
@ -198,27 +184,6 @@ func (msg *SessionStart) TypeID() int {
return 1
}
type SessionEndDeprecated struct {
message
Timestamp uint64
}
func (msg *SessionEndDeprecated) Encode() []byte {
buf := make([]byte, 11)
buf[0] = 3
p := 1
p = WriteUint(msg.Timestamp, buf, p)
return buf[:p]
}
func (msg *SessionEndDeprecated) Decode() Message {
return msg
}
func (msg *SessionEndDeprecated) TypeID() int {
return 3
}
type SetPageLocationDeprecated struct {
message
URL string
@ -743,31 +708,6 @@ func (msg *PageRenderTiming) TypeID() int {
return 24
}
type JSExceptionDeprecated struct {
message
Name string
Message string
Payload string
}
func (msg *JSExceptionDeprecated) Encode() []byte {
buf := make([]byte, 31+len(msg.Name)+len(msg.Message)+len(msg.Payload))
buf[0] = 25
p := 1
p = WriteString(msg.Name, buf, p)
p = WriteString(msg.Message, buf, p)
p = WriteString(msg.Payload, buf, p)
return buf[:p]
}
func (msg *JSExceptionDeprecated) Decode() Message {
return msg
}
func (msg *JSExceptionDeprecated) TypeID() int {
return 25
}
type IntegrationEvent struct {
message
Timestamp uint64
@ -1070,110 +1010,6 @@ func (msg *SetNodeAttributeDictGlobal) TypeID() int {
return 35
}
type NodeAnimationResult struct {
message
ID uint64
Styles string
}
func (msg *NodeAnimationResult) Encode() []byte {
buf := make([]byte, 21+len(msg.Styles))
buf[0] = 36
p := 1
p = WriteUint(msg.ID, buf, p)
p = WriteString(msg.Styles, buf, p)
return buf[:p]
}
func (msg *NodeAnimationResult) Decode() Message {
return msg
}
func (msg *NodeAnimationResult) TypeID() int {
return 36
}
type CSSInsertRule struct {
message
ID uint64
Rule string
Index uint64
}
func (msg *CSSInsertRule) Encode() []byte {
buf := make([]byte, 31+len(msg.Rule))
buf[0] = 37
p := 1
p = WriteUint(msg.ID, buf, p)
p = WriteString(msg.Rule, buf, p)
p = WriteUint(msg.Index, buf, p)
return buf[:p]
}
func (msg *CSSInsertRule) Decode() Message {
return msg
}
func (msg *CSSInsertRule) TypeID() int {
return 37
}
type CSSDeleteRule struct {
message
ID uint64
Index uint64
}
func (msg *CSSDeleteRule) Encode() []byte {
buf := make([]byte, 21)
buf[0] = 38
p := 1
p = WriteUint(msg.ID, buf, p)
p = WriteUint(msg.Index, buf, p)
return buf[:p]
}
func (msg *CSSDeleteRule) Decode() Message {
return msg
}
func (msg *CSSDeleteRule) TypeID() int {
return 38
}
type Fetch struct {
message
Method string
URL string
Request string
Response string
Status uint64
Timestamp uint64
Duration uint64
}
func (msg *Fetch) Encode() []byte {
buf := make([]byte, 71+len(msg.Method)+len(msg.URL)+len(msg.Request)+len(msg.Response))
buf[0] = 39
p := 1
p = WriteString(msg.Method, buf, p)
p = WriteString(msg.URL, buf, p)
p = WriteString(msg.Request, buf, p)
p = WriteString(msg.Response, buf, p)
p = WriteUint(msg.Status, buf, p)
p = WriteUint(msg.Timestamp, buf, p)
p = WriteUint(msg.Duration, buf, p)
return buf[:p]
}
func (msg *Fetch) Decode() Message {
return msg
}
func (msg *Fetch) TypeID() int {
return 39
}
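Each string-bearing Encode above sizes its buffer as a fixed constant plus the raw byte lengths of its strings, which points to a length-prefixed layout. Under the same varint assumption as the sketch above, WriteString would reduce to something like this (again an illustration, not the package's code):

// writeVarString is a hypothetical stand-in for WriteString: a varint
// byte length followed by the raw bytes of the string.
func writeVarString(s string, buf []byte, p int) int {
	p = writeVarUint(uint64(len(s)), buf, p)
	return p + copy(buf[p:], s)
}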
type Profiler struct {
message
Name string
@@ -1493,7 +1329,7 @@ func (msg *SetNodeAttributeDict) TypeID() int {
return 52
}
-type ResourceTimingDeprecatedDeprecated struct {
+type ResourceTimingDeprecated struct {
message
Timestamp uint64
Duration uint64
@@ -1505,7 +1341,7 @@ type ResourceTimingDeprecatedDeprecated struct {
Initiator string
}
-func (msg *ResourceTimingDeprecatedDeprecated) Encode() []byte {
+func (msg *ResourceTimingDeprecated) Encode() []byte {
buf := make([]byte, 81+len(msg.URL)+len(msg.Initiator))
buf[0] = 53
p := 1
@@ -1520,11 +1356,11 @@ func (msg *ResourceTimingDeprecatedDeprecated) Encode() []byte {
return buf[:p]
}
-func (msg *ResourceTimingDeprecatedDeprecated) Decode() Message {
+func (msg *ResourceTimingDeprecated) Decode() Message {
return msg
}
-func (msg *ResourceTimingDeprecatedDeprecated) TypeID() int {
+func (msg *ResourceTimingDeprecated) TypeID() int {
return 53
}
@@ -1667,39 +1503,6 @@ func (msg *SetNodeFocus) TypeID() int {
return 58
}
type LongTask struct {
message
Timestamp uint64
Duration uint64
Context uint64
ContainerType uint64
ContainerSrc string
ContainerId string
ContainerName string
}
func (msg *LongTask) Encode() []byte {
buf := make([]byte, 71+len(msg.ContainerSrc)+len(msg.ContainerId)+len(msg.ContainerName))
buf[0] = 59
p := 1
p = WriteUint(msg.Timestamp, buf, p)
p = WriteUint(msg.Duration, buf, p)
p = WriteUint(msg.Context, buf, p)
p = WriteUint(msg.ContainerType, buf, p)
p = WriteString(msg.ContainerSrc, buf, p)
p = WriteString(msg.ContainerId, buf, p)
p = WriteString(msg.ContainerName, buf, p)
return buf[:p]
}
func (msg *LongTask) Decode() Message {
return msg
}
func (msg *LongTask) TypeID() int {
return 59
}
type SetNodeAttributeURLBased struct {
message
ID uint64
@@ -1752,37 +1555,6 @@ func (msg *SetCSSDataURLBased) TypeID() int {
return 61
}
type IssueEventDeprecated struct {
message
MessageID uint64
Timestamp uint64
Type string
ContextString string
Context string
Payload string
}
func (msg *IssueEventDeprecated) Encode() []byte {
buf := make([]byte, 61+len(msg.Type)+len(msg.ContextString)+len(msg.Context)+len(msg.Payload))
buf[0] = 62
p := 1
p = WriteUint(msg.MessageID, buf, p)
p = WriteUint(msg.Timestamp, buf, p)
p = WriteString(msg.Type, buf, p)
p = WriteString(msg.ContextString, buf, p)
p = WriteString(msg.Context, buf, p)
p = WriteString(msg.Payload, buf, p)
return buf[:p]
}
func (msg *IssueEventDeprecated) Decode() Message {
return msg
}
func (msg *IssueEventDeprecated) TypeID() int {
return 62
}
type TechnicalInfo struct {
message
Type string
@@ -1829,29 +1601,6 @@ func (msg *CustomIssue) TypeID() int {
return 64
}
type SetNodeSlot struct {
message
ID uint64
SlotID uint64
}
func (msg *SetNodeSlot) Encode() []byte {
buf := make([]byte, 21)
buf[0] = 65
p := 1
p = WriteUint(msg.ID, buf, p)
p = WriteUint(msg.SlotID, buf, p)
return buf[:p]
}
func (msg *SetNodeSlot) Decode() Message {
return msg
}
func (msg *SetNodeSlot) TypeID() int {
return 65
}
type AssetCache struct {
message
URL string
@@ -1873,33 +1622,6 @@ func (msg *AssetCache) TypeID() int {
return 66
}
type CSSInsertRuleURLBased struct {
message
ID uint64
Rule string
Index uint64
BaseURL string
}
func (msg *CSSInsertRuleURLBased) Encode() []byte {
buf := make([]byte, 41+len(msg.Rule)+len(msg.BaseURL))
buf[0] = 67
p := 1
p = WriteUint(msg.ID, buf, p)
p = WriteString(msg.Rule, buf, p)
p = WriteUint(msg.Index, buf, p)
p = WriteString(msg.BaseURL, buf, p)
return buf[:p]
}
func (msg *CSSInsertRuleURLBased) Decode() Message {
return msg
}
func (msg *CSSInsertRuleURLBased) TypeID() int {
return 67
}
type MouseClick struct {
message
ID uint64
@@ -2200,31 +1922,6 @@ func (msg *Zustand) TypeID() int {
return 79
}
type BatchMeta struct {
message
PageNo uint64
FirstIndex uint64
Timestamp int64
}
func (msg *BatchMeta) Encode() []byte {
buf := make([]byte, 31)
buf[0] = 80
p := 1
p = WriteUint(msg.PageNo, buf, p)
p = WriteUint(msg.FirstIndex, buf, p)
p = WriteInt(msg.Timestamp, buf, p)
return buf[:p]
}
func (msg *BatchMeta) Decode() Message {
return msg
}
func (msg *BatchMeta) TypeID() int {
return 80
}
type BatchMetadata struct {
message
Version uint64
@@ -2345,115 +2042,6 @@ func (msg *WSChannel) TypeID() int {
return 84
}
type ResourceTiming struct {
message
Timestamp uint64
Duration uint64
TTFB uint64
HeaderSize uint64
EncodedBodySize uint64
DecodedBodySize uint64
URL string
Initiator string
TransferredSize uint64
Cached bool
Queueing uint64
DnsLookup uint64
InitialConnection uint64
SSL uint64
ContentDownload uint64
Total uint64
Stalled uint64
}
func (msg *ResourceTiming) Encode() []byte {
buf := make([]byte, 171+len(msg.URL)+len(msg.Initiator))
buf[0] = 85
p := 1
p = WriteUint(msg.Timestamp, buf, p)
p = WriteUint(msg.Duration, buf, p)
p = WriteUint(msg.TTFB, buf, p)
p = WriteUint(msg.HeaderSize, buf, p)
p = WriteUint(msg.EncodedBodySize, buf, p)
p = WriteUint(msg.DecodedBodySize, buf, p)
p = WriteString(msg.URL, buf, p)
p = WriteString(msg.Initiator, buf, p)
p = WriteUint(msg.TransferredSize, buf, p)
p = WriteBoolean(msg.Cached, buf, p)
p = WriteUint(msg.Queueing, buf, p)
p = WriteUint(msg.DnsLookup, buf, p)
p = WriteUint(msg.InitialConnection, buf, p)
p = WriteUint(msg.SSL, buf, p)
p = WriteUint(msg.ContentDownload, buf, p)
p = WriteUint(msg.Total, buf, p)
p = WriteUint(msg.Stalled, buf, p)
return buf[:p]
}
func (msg *ResourceTiming) Decode() Message {
return msg
}
func (msg *ResourceTiming) TypeID() int {
return 85
}
type Incident struct {
message
Label string
StartTime int64
EndTime int64
}
func (msg *Incident) Encode() []byte {
buf := make([]byte, 31+len(msg.Label))
buf[0] = 87
p := 1
p = WriteString(msg.Label, buf, p)
p = WriteInt(msg.StartTime, buf, p)
p = WriteInt(msg.EndTime, buf, p)
return buf[:p]
}
func (msg *Incident) Decode() Message {
return msg
}
func (msg *Incident) TypeID() int {
return 87
}
type LongAnimationTask struct {
message
Name string
Duration int64
BlockingDuration int64
FirstUIEventTimestamp int64
StartTime int64
Scripts string
}
func (msg *LongAnimationTask) Encode() []byte {
buf := make([]byte, 61+len(msg.Name)+len(msg.Scripts))
buf[0] = 89
p := 1
p = WriteString(msg.Name, buf, p)
p = WriteInt(msg.Duration, buf, p)
p = WriteInt(msg.BlockingDuration, buf, p)
p = WriteInt(msg.FirstUIEventTimestamp, buf, p)
p = WriteInt(msg.StartTime, buf, p)
p = WriteString(msg.Scripts, buf, p)
return buf[:p]
}
func (msg *LongAnimationTask) Decode() Message {
return msg
}
func (msg *LongAnimationTask) TypeID() int {
return 89
}
type InputChange struct {
message
ID uint64
@@ -2552,7 +2140,7 @@ func (msg *UnbindNodes) TypeID() int {
return 115
}
-type ResourceTimingDeprecated struct {
+type ResourceTiming struct {
message
Timestamp uint64
Duration uint64
@@ -2566,7 +2154,7 @@ type ResourceTimingDeprecated struct {
Cached bool
}
-func (msg *ResourceTimingDeprecated) Encode() []byte {
+func (msg *ResourceTiming) Encode() []byte {
buf := make([]byte, 101+len(msg.URL)+len(msg.Initiator))
buf[0] = 116
p := 1
@@ -2583,11 +2171,11 @@ func (msg *ResourceTimingDeprecated) Encode() []byte {
return buf[:p]
}
-func (msg *ResourceTimingDeprecated) Decode() Message {
+func (msg *ResourceTiming) Decode() Message {
return msg
}
-func (msg *ResourceTimingDeprecated) TypeID() int {
+func (msg *ResourceTiming) TypeID() int {
return 116
}
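This is the tail of the rename chain: the struct formerly called ResourceTimingDeprecated (wire ID 116) is now the canonical ResourceTiming, ID 53 takes over the ResourceTimingDeprecated name, and the richer ID-85 variant is deleted outright. A small test sketch pinning the new mapping (assuming it lives in the same messages package, with "testing" imported):

func TestResourceTimingTypeIDs(t *testing.T) {
	// After the renames, ID 53 is the deprecated shape and ID 116 the
	// current one; the old ID-85 ResourceTiming no longer exists.
	if got := (&ResourceTimingDeprecated{}).TypeID(); got != 53 {
		t.Errorf("ResourceTimingDeprecated.TypeID() = %d, want 53", got)
	}
	if got := (&ResourceTiming{}).TypeID(); got != 116 {
		t.Errorf("ResourceTiming.TypeID() = %d, want 116", got)
	}
}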


@@ -68,15 +68,6 @@ func DecodeSessionStart(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeSessionEndDeprecated(reader BytesReader) (Message, error) {
var err error = nil
msg := &SessionEndDeprecated{}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
return msg, err
}
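Every decoder in this file follows the same shape: allocate the message struct, then fill each field in declaration order from a BytesReader, returning on the first error. BytesReader itself is not part of this diff; judging only by the calls made here, its minimal surface would be:

// BytesReader as inferred from the decoder bodies in this file; the real
// interface is defined elsewhere in the package and may carry more methods.
type BytesReader interface {
	ReadUint() (uint64, error)
	ReadInt() (int64, error)
	ReadString() (string, error)
	ReadBoolean() (bool, error)
}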
func DecodeSetPageLocationDeprecated(reader BytesReader) (Message, error) {
var err error = nil
msg := &SetPageLocationDeprecated{}
@@ -390,21 +381,6 @@ func DecodePageRenderTiming(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeJSExceptionDeprecated(reader BytesReader) (Message, error) {
var err error = nil
msg := &JSExceptionDeprecated{}
if msg.Name, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Message, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Payload, err = reader.ReadString(); err != nil {
return nil, err
}
return msg, err
}
func DecodeIntegrationEvent(reader BytesReader) (Message, error) {
var err error = nil
msg := &IntegrationEvent{}
@@ -633,72 +609,6 @@ func DecodeSetNodeAttributeDictGlobal(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeNodeAnimationResult(reader BytesReader) (Message, error) {
var err error = nil
msg := &NodeAnimationResult{}
if msg.ID, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Styles, err = reader.ReadString(); err != nil {
return nil, err
}
return msg, err
}
func DecodeCSSInsertRule(reader BytesReader) (Message, error) {
var err error = nil
msg := &CSSInsertRule{}
if msg.ID, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Rule, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Index, err = reader.ReadUint(); err != nil {
return nil, err
}
return msg, err
}
func DecodeCSSDeleteRule(reader BytesReader) (Message, error) {
var err error = nil
msg := &CSSDeleteRule{}
if msg.ID, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Index, err = reader.ReadUint(); err != nil {
return nil, err
}
return msg, err
}
func DecodeFetch(reader BytesReader) (Message, error) {
var err error = nil
msg := &Fetch{}
if msg.Method, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.URL, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Request, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Response, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Status, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Duration, err = reader.ReadUint(); err != nil {
return nil, err
}
return msg, err
}
func DecodeProfiler(reader BytesReader) (Message, error) {
var err error = nil
msg := &Profiler{}
@@ -885,9 +795,9 @@ func DecodeSetNodeAttributeDict(reader BytesReader) (Message, error) {
return msg, err
}
-func DecodeResourceTimingDeprecatedDeprecated(reader BytesReader) (Message, error) {
+func DecodeResourceTimingDeprecated(reader BytesReader) (Message, error) {
var err error = nil
-msg := &ResourceTimingDeprecatedDeprecated{}
+msg := &ResourceTimingDeprecated{}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
@@ -1011,33 +921,6 @@ func DecodeSetNodeFocus(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeLongTask(reader BytesReader) (Message, error) {
var err error = nil
msg := &LongTask{}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Duration, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Context, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.ContainerType, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.ContainerSrc, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.ContainerId, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.ContainerName, err = reader.ReadString(); err != nil {
return nil, err
}
return msg, err
}
func DecodeSetNodeAttributeURLBased(reader BytesReader) (Message, error) {
var err error = nil
msg := &SetNodeAttributeURLBased{}
@@ -1071,30 +954,6 @@ func DecodeSetCSSDataURLBased(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeIssueEventDeprecated(reader BytesReader) (Message, error) {
var err error = nil
msg := &IssueEventDeprecated{}
if msg.MessageID, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Type, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.ContextString, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Context, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Payload, err = reader.ReadString(); err != nil {
return nil, err
}
return msg, err
}
func DecodeTechnicalInfo(reader BytesReader) (Message, error) {
var err error = nil
msg := &TechnicalInfo{}
@@ -1119,18 +978,6 @@ func DecodeCustomIssue(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeSetNodeSlot(reader BytesReader) (Message, error) {
var err error = nil
msg := &SetNodeSlot{}
if msg.ID, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.SlotID, err = reader.ReadUint(); err != nil {
return nil, err
}
return msg, err
}
func DecodeAssetCache(reader BytesReader) (Message, error) {
var err error = nil
msg := &AssetCache{}
@@ -1140,24 +987,6 @@ func DecodeAssetCache(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeCSSInsertRuleURLBased(reader BytesReader) (Message, error) {
var err error = nil
msg := &CSSInsertRuleURLBased{}
if msg.ID, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Rule, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Index, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.BaseURL, err = reader.ReadString(); err != nil {
return nil, err
}
return msg, err
}
func DecodeMouseClick(reader BytesReader) (Message, error) {
var err error = nil
msg := &MouseClick{}
@@ -1338,21 +1167,6 @@ func DecodeZustand(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeBatchMeta(reader BytesReader) (Message, error) {
var err error = nil
msg := &BatchMeta{}
if msg.PageNo, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.FirstIndex, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Timestamp, err = reader.ReadInt(); err != nil {
return nil, err
}
return msg, err
}
func DecodeBatchMetadata(reader BytesReader) (Message, error) {
var err error = nil
msg := &BatchMetadata{}
@@ -1443,102 +1257,6 @@ func DecodeWSChannel(reader BytesReader) (Message, error) {
return msg, err
}
func DecodeResourceTiming(reader BytesReader) (Message, error) {
var err error = nil
msg := &ResourceTiming{}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Duration, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.TTFB, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.HeaderSize, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.EncodedBodySize, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.DecodedBodySize, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.URL, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Initiator, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.TransferredSize, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Cached, err = reader.ReadBoolean(); err != nil {
return nil, err
}
if msg.Queueing, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.DnsLookup, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.InitialConnection, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.SSL, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.ContentDownload, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Total, err = reader.ReadUint(); err != nil {
return nil, err
}
if msg.Stalled, err = reader.ReadUint(); err != nil {
return nil, err
}
return msg, err
}
func DecodeIncident(reader BytesReader) (Message, error) {
var err error = nil
msg := &Incident{}
if msg.Label, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.StartTime, err = reader.ReadInt(); err != nil {
return nil, err
}
if msg.EndTime, err = reader.ReadInt(); err != nil {
return nil, err
}
return msg, err
}
func DecodeLongAnimationTask(reader BytesReader) (Message, error) {
var err error = nil
msg := &LongAnimationTask{}
if msg.Name, err = reader.ReadString(); err != nil {
return nil, err
}
if msg.Duration, err = reader.ReadInt(); err != nil {
return nil, err
}
if msg.BlockingDuration, err = reader.ReadInt(); err != nil {
return nil, err
}
if msg.FirstUIEventTimestamp, err = reader.ReadInt(); err != nil {
return nil, err
}
if msg.StartTime, err = reader.ReadInt(); err != nil {
return nil, err
}
if msg.Scripts, err = reader.ReadString(); err != nil {
return nil, err
}
return msg, err
}
func DecodeInputChange(reader BytesReader) (Message, error) {
var err error = nil
msg := &InputChange{}
@@ -1596,9 +1314,9 @@ func DecodeUnbindNodes(reader BytesReader) (Message, error) {
return msg, err
}
-func DecodeResourceTimingDeprecated(reader BytesReader) (Message, error) {
+func DecodeResourceTiming(reader BytesReader) (Message, error) {
var err error = nil
-msg := &ResourceTimingDeprecated{}
+msg := &ResourceTiming{}
if msg.Timestamp, err = reader.ReadUint(); err != nil {
return nil, err
}
@@ -2208,8 +1926,6 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
return DecodeTimestamp(reader)
case 1:
return DecodeSessionStart(reader)
case 3:
return DecodeSessionEndDeprecated(reader)
case 4:
return DecodeSetPageLocationDeprecated(reader)
case 5:
@@ -2252,8 +1968,6 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
return DecodePageLoadTiming(reader)
case 24:
return DecodePageRenderTiming(reader)
case 25:
return DecodeJSExceptionDeprecated(reader)
case 26:
return DecodeIntegrationEvent(reader)
case 27:
@@ -2274,14 +1988,6 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
return DecodeStringDictGlobal(reader)
case 35:
return DecodeSetNodeAttributeDictGlobal(reader)
case 36:
return DecodeNodeAnimationResult(reader)
case 37:
return DecodeCSSInsertRule(reader)
case 38:
return DecodeCSSDeleteRule(reader)
case 39:
return DecodeFetch(reader)
case 40:
return DecodeProfiler(reader)
case 41:
@@ -2309,7 +2015,7 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
case 52:
return DecodeSetNodeAttributeDict(reader)
case 53:
-return DecodeResourceTimingDeprecatedDeprecated(reader)
+return DecodeResourceTimingDeprecated(reader)
case 54:
return DecodeConnectionInformation(reader)
case 55:
@@ -2320,24 +2026,16 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
return DecodeLoadFontFace(reader)
case 58:
return DecodeSetNodeFocus(reader)
case 59:
return DecodeLongTask(reader)
case 60:
return DecodeSetNodeAttributeURLBased(reader)
case 61:
return DecodeSetCSSDataURLBased(reader)
case 62:
return DecodeIssueEventDeprecated(reader)
case 63:
return DecodeTechnicalInfo(reader)
case 64:
return DecodeCustomIssue(reader)
case 65:
return DecodeSetNodeSlot(reader)
case 66:
return DecodeAssetCache(reader)
case 67:
return DecodeCSSInsertRuleURLBased(reader)
case 68:
return DecodeMouseClick(reader)
case 69:
@@ -2362,8 +2060,6 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
return DecodeJSException(reader)
case 79:
return DecodeZustand(reader)
case 80:
return DecodeBatchMeta(reader)
case 81:
return DecodeBatchMetadata(reader)
case 82:
@@ -2372,12 +2068,6 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
return DecodeNetworkRequest(reader)
case 84:
return DecodeWSChannel(reader)
case 85:
return DecodeResourceTiming(reader)
case 87:
return DecodeIncident(reader)
case 89:
return DecodeLongAnimationTask(reader)
case 112:
return DecodeInputChange(reader)
case 113:
@@ -2387,7 +2077,7 @@ func ReadMessage(t uint64, reader BytesReader) (Message, error) {
case 115:
return DecodeUnbindNodes(reader)
case 116:
-return DecodeResourceTimingDeprecated(reader)
+return DecodeResourceTiming(reader)
case 117:
return DecodeTabChange(reader)
case 118:

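ReadMessage is the single dispatch point: given a type ID already read off the wire, it selects the matching decoder, so every case deleted above turns the corresponding wire ID into a decode error. A hedged sketch of a consumer loop (how the type ID is framed on the wire is an assumption, since the framing is not shown in this diff):

// drainMessages decodes messages until the reader is exhausted; handle
// receives each decoded message in stream order.
func drainMessages(reader BytesReader, handle func(Message)) error {
	for {
		t, err := reader.ReadUint() // assumed: each payload is preceded by its type ID
		if err != nil {
			return err // e.g. io.EOF at end of stream
		}
		msg, err := ReadMessage(t, reader)
		if err != nil {
			return err
		}
		handle(msg)
	}
}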
Some files were not shown because too many files have changed in this diff.