Compare commits


110 commits

Author SHA1 Message Date
nick-delirium
90510aa33b ui: fix double metric selection in list 2025-06-06 16:19:54 +02:00
GitHub Action
96a70f5d41 Increment frontend chart version to v1.22.42 2025-06-04 11:41:56 +02:00
rjshrjndrn
d4a13edcf0 fix(actions): frontend image with proper tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-06-04 11:33:19 +02:00
GitHub Action
51fad91a22 Increment frontend chart version to v1.22.41 2025-06-04 10:48:50 +02:00
nick-delirium
36abcda1e1 ui: fix audioplayer start point 2025-06-04 10:39:08 +02:00
Mehdi Osman
dd5f464f73
Increment frontend chart version to v1.22.40 (#3479)
Co-authored-by: GitHub Action <action@github.com>
2025-06-03 16:22:12 +02:00
Delirium
f9ada41272
ui: recreate period on db visit (#3478) 2025-06-03 16:05:52 +02:00
rjshrjndrn
9e24a3583e feat(nginx): add integrations endpoint with CORS support
Add new /integrations/ location block that proxies requests to
integrations-openreplay:8080 service. Includes proper CORS headers
for cross-origin requests and WebSocket upgrade support.

- Rewrite /integrations/ path to root
- Configure proxy headers for forwarding
- Set connection timeouts for stability
- Add CORS headers for API access

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-06-02 10:55:50 +02:00
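
For context, the location block this commit describes would look roughly like the sketch below. Only the upstream (integrations-openreplay:8080), the /integrations/ rewrite, the CORS headers, WebSocket upgrade support, and the timeouts are confirmed by the commit message; the exact directives and values are assumptions.

    location /integrations/ {
        # Rewrite /integrations/ prefix to root before proxying
        rewrite ^/integrations/(.*)$ /$1 break;
        proxy_pass http://integrations-openreplay:8080;
        # Proxy headers for forwarding
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # WebSocket upgrade support
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        # Connection timeouts for stability (values assumed)
        proxy_connect_timeout 60s;
        proxy_read_timeout 60s;
        # CORS headers for cross-origin API access
        add_header Access-Control-Allow-Origin $http_origin always;
        add_header Access-Control-Allow-Methods "GET, POST, OPTIONS" always;
        add_header Access-Control-Allow-Headers "Content-Type, Authorization" always;
    }
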
Taha Yassine Kraiem
0a3129d3cd fix(chalice): fixed JIRA integration 2025-05-30 15:25:41 +02:00
Mehdi Osman
99d61db9d9
Increment frontend chart version to v1.22.39 (#3460)
Co-authored-by: GitHub Action <action@github.com>
2025-05-30 15:07:29 +02:00
Delirium
133958622e
ui: fix alert create button (#3459) 2025-05-30 14:56:21 +02:00
GitHub Action
fb021f606f Increment frontend chart version to v1.22.38 2025-05-29 12:21:04 +02:00
rjshrjndrn
a2905fa8ed fix: move cd - command after git operations in patch workflow
Move the directory restoration command after the git operations to
ensure all git commands execute in the correct working directory
before returning to the previous directory.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-29 12:16:28 +02:00
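
The ordering bug this commit fixes is easiest to see in a reduced form; the paths here are illustrative, not the actual workflow contents:

    # Broken: `cd -` restores the previous directory too early,
    # so the git commands run in the wrong working tree.
    cd scripts/helmcharts
    cd -
    git add Chart.yaml          # executes outside scripts/helmcharts

    # Fixed: restore the directory only after the git operations finish.
    cd scripts/helmcharts
    git add Chart.yaml
    git commit -m "Increment chart version"
    cd -
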
rjshrjndrn
beec2283fd refactor(ci): restructure patch-build workflow script
- Extract inline bash script into structured functions
- Add proper error handling with set -euo pipefail
- Improve variable scoping with readonly and local declarations
- Add descriptive function names and comments
- Fix shell quoting and parameter expansion
- Consolidate build logic into reusable functions
- Add proper cleanup of temporary files
- Improve readability and maintainability of the CI script

The refactored script maintains the same functionality while being
more robust and easier to understand.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-29 12:16:28 +02:00
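
The full refactored script appears in the diff further down this page; distilled, the pattern the commit adopts looks like this minimal sketch (service names and versions are placeholders):

    #!/bin/bash
    set -euo pipefail                 # abort on errors, unset variables, pipe failures

    readonly WORKING_DIR=$(pwd)       # constants declared readonly

    build_service() {
        local service=$1 version=$2   # locals keep variables function-scoped
        echo "Building $service:$version"
    }

    main() {
        local version="v0.0.1"
        for service in chalice frontend; do
            build_service "$service" "$version"
        done
    }

    main "$@"
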
GitHub Action
6c8b55019e Increment frontend chart version 2025-05-29 10:29:46 +02:00
rjshrjndrn
e3e3e11227 fix(action): proper registry
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-29 10:18:55 +02:00
Shekar Siri
c6f7de04cc Revert "fix(ui): new card data state is not updating"
This reverts commit 2921c17cbf.
2025-05-28 22:16:00 +02:00
Shekar Siri
2921c17cbf fix(ui): new card data state is not updating 2025-05-28 19:49:01 +02:00
Mehdi Osman
7eb3f5c4c8
Increment frontend chart version (#3436)
Co-authored-by: GitHub Action <action@github.com>
2025-05-26 16:10:35 +02:00
Rajesh Rajendran
5a9a8e588a
chore(actions): rebase only if not main (#3435)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-26 16:04:50 +02:00
Rajesh Rajendran
4b14258266
fix(action): clone repo (#3433)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-26 15:50:13 +02:00
Rajesh Rajendran
744d2d4311
actions fix or 2070 (#3432)
* chore(build): Better error handling

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(build): remove fetch depth, as it might cause issue in rebase

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(build): proper platform

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-26 15:45:48 +02:00
Taha Yassine Kraiem
64242a5dc0 refactor(DB): changed supported platforms in CH 2025-05-26 11:51:49 +02:00
Rajesh Rajendran
cae3002697
feat(ci): Support building from branch for old patch (#3419)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-20 15:19:04 +02:00
GitHub Action
3d3c62196b Increment frontend chart version 2025-05-20 11:44:16 +02:00
nick-delirium
e810958a5d ui: fix ant imports 2025-05-20 11:26:20 +02:00
nick-delirium
39fa9787d1 ui: prevent network row modal from changing replayer time 2025-05-20 11:21:50 +02:00
nick-delirium
c9c1ad4dde ui: comments etc 2025-05-20 11:21:50 +02:00
nick-delirium
d9868928be ui: improve network panel row mapping 2025-05-20 11:21:50 +02:00
GitHub Action
a460d8c9a2 Increment frontend chart version 2025-05-15 15:18:19 +02:00
nick-delirium
930417aab4 ui: fix session search on url change 2025-05-15 15:12:30 +02:00
GitHub Action
07bc184f4d Increment chalice chart version 2025-05-14 18:59:43 +02:00
Rajesh Rajendran
71b7cca569
Patch/api v1.22.0 (#3401)
* fix(chalice): fixed duplicate autocomplete values

* ci(actions): possible fix for pull --rebase

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: Taha Yassine Kraiem <tahayk2@gmail.com>
2025-05-14 18:42:25 +02:00
Mehdi Osman
355d27eaa0
Increment frontend chart version (#3397)
Co-authored-by: GitHub Action <action@github.com>
2025-05-13 13:38:15 +02:00
Mehdi Osman
66b485cccf
Increment db chart version (#3396)
Co-authored-by: GitHub Action <action@github.com>
2025-05-13 10:34:28 +02:00
Alexander
de33a42151
feat(db): custom event's ts (#3395) 2025-05-12 17:52:24 +02:00
Rajesh Rajendran
f12bdebf82
ci(actions): fix push denied (#3392) (#3393) (#3394)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 17:19:41 +02:00
Rajesh Rajendran
bbfa20c693
ci(actions): fix push denied (#3392) (#3393)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 16:58:19 +02:00
Rajesh Rajendran
f264ba043d
ci(actions): fix push denied (#3392)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 16:55:23 +02:00
Rajesh Rajendran
a05dce8125
main (#3391)
* ci(actions): Update pr description

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* ci(actions): run only on pull request merge

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 16:50:20 +02:00
Mehdi Osman
3a1635d81f
Increment frontend chart version (#3389)
Co-authored-by: GitHub Action <action@github.com>
2025-05-12 16:12:43 +02:00
Delirium
ccb332c636
ui: change <slot> check (#3388) 2025-05-12 16:02:26 +02:00
Rajesh Rajendran
80ffa15959
ci(actions): Auto update tag for patch build (#3387)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 15:54:10 +02:00
Rajesh Rajendran
b2e961d621
ci(actions): Auto update tag for patch build (#3386)
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-05-12 15:49:19 +02:00
Mehdi Osman
b4d0598f23
Increment frontend chart version (#3385)
Co-authored-by: GitHub Action <action@github.com>
2025-05-12 15:46:29 +02:00
Delirium
e77f083f10
ui: fixup toggler closing (#3384) 2025-05-12 15:40:30 +02:00
Delirium
58da1d3f64
fix litjs support, fix autocomplete modal options reset, fix dashboard chart density (#3382)
* Litjs fixes2 (#3381)

* ui: fixes for litjs capture

* ui: introduce vmode for lwc light dom

* ui: fixup the mode toggle and remover

* ui: fix filter options reset, fix dashboard chart density
2025-05-12 15:27:44 +02:00
GitHub Action
447fc26a2a Increment frontend chart version 2025-05-12 10:46:33 +02:00
nick-delirium
9bdf6e4f92 ui: fix heatmaps crash 2025-05-12 10:37:48 +02:00
GitHub Action
01f403e12d Increment chalice chart version 2025-05-07 12:28:44 +02:00
Taha Yassine Kraiem
39eb943b86 fix(chalice): fixed get error's details 2025-05-07 12:15:33 +02:00
GitHub Action
366b0d38b0 Increment frontend chart version 2025-05-06 16:28:28 +02:00
nick-delirium
f4d5b3c06e ui: fix max meta length, add horizontal layout for player 2025-05-06 16:23:47 +02:00
Mehdi Osman
93ae18133e
Increment frontend chart version (#3366)
Co-authored-by: GitHub Action <action@github.com>
2025-05-06 13:16:57 +02:00
Andrey Babushkin
fbe5d78270
Revert update (#3365)
* Revert "Increment chalice chart version"

This reverts commit 5e0e5730ba.

* revert updates

* changed chalice version
2025-05-06 13:08:08 +02:00
Mehdi Osman
b803eed1d4
Increment frontend chart version (#3362)
Co-authored-by: GitHub Action <action@github.com>
2025-05-05 17:49:39 +02:00
Andrey Babushkin
9ed3cb1b7e
Add searched events (#3361)
* add filtered events to search

* removed consoles

* changed styles to tailwind

* changed styles to tailwind

* fixed errors
2025-05-05 17:40:10 +02:00
GitHub Action
5e0e5730ba Increment chalice chart version 2025-05-05 17:04:29 +02:00
Taha Yassine Kraiem
d78b33dcd2 refactor(DB): remove TTL for CH tables 2025-05-05 16:49:37 +02:00
Taha Yassine Kraiem
4b1ca200b4 fix(chalice): fixed empty error_id for table of errors 2025-05-05 16:49:37 +02:00
rjshrjndrn
08d930f9ff fix(docker-compose): proper volume path #3279
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-28 17:28:40 +02:00
Mehdi Osman
da37809bc8
Increment frontend chart version (#3345)
Co-authored-by: GitHub Action <action@github.com>
2025-04-28 11:38:04 +02:00
Andrey Babushkin
d922fc7ad5
Patch frontend inline css (#3344)
* add inlineCss enum

* updated changelog
2025-04-28 11:29:53 +02:00
GitHub Action
796360fdd2 Increment frontend chart version 2025-04-28 11:01:55 +02:00
nick-delirium
13dbb60d8b ui: fix velement applychanges 2025-04-28 10:40:11 +02:00
Андрей Бабушкин
9e20a49128 add slot tag to custom elements 2025-04-28 10:34:43 +02:00
nick-delirium
91f8cc1399 ui: move debouncecall 2025-04-28 10:34:43 +02:00
Andrey Babushkin
f8ba3f6d89 Css batching (#3326)
* tracker: initial css inlining functionality

* tracker: add tests, adjust sheet id, stagger rule sending

* ui: reroute custom html component fragments

* removed sorting

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2025-04-28 10:34:43 +02:00
Delirium
85e30b3692 tracker css batching/inlining (#3334)
* tracker: initial css inlining functionality

* tracker: add tests, adjust sheet id, stagger rule sending

* removed sorting

* upgrade css inliner

* ui: better logging for counter

* tracker: force-fetch mode for cssInliner

* tracker: fix ts warns

* tracker: use debug opts

* tracker: 16.2.0 changelogs, inliner opts

* tracker: remove debug options

---------

Co-authored-by: Андрей Бабушкин <andreybabushkin2000@gmail.com>
2025-04-28 10:34:43 +02:00
nick-delirium
0360e3726e ui: fixup autoplay on inactive tabs 2025-04-28 10:34:43 +02:00
nick-delirium
77bbb5af36 tracker: update css inject 2025-04-28 10:34:43 +02:00
Andrey Babushkin
ab0d4cfb62 Css inliner tuning (#3337)
* tracker: don't send double sheets

* tracker: don't send double sheets

* tracker: slot checker

* add slot tag to custom elements

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2025-04-28 10:34:43 +02:00
Andrey Babushkin
3fd506a812 Css batching (#3326)
* tracker: initial css inlining functionality

* tracker: add tests, adjust sheet id, stagger rule sending

* ui: reroute custom html component fragments

* removed sorting

---------

Co-authored-by: nick-delirium <nikita@openreplay.com>
2025-04-28 10:34:43 +02:00
Shekar Siri
e8432e2dec change(ui): force the table cards events order to use AND instead of the default THEN 2025-04-24 10:09:19 +02:00
GitHub Action
5c76a8524c Increment frontend chart version 2025-04-23 18:41:46 +02:00
rjshrjndrn
3ba40a4811 feat(cli): Add support for image versions
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
rjshrjndrn
f9a3f24590 fix(docker-compose): clickhouse migration
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
rjshrjndrn
85d6d0abac fix(docker-compose): remove shell interpolation
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
Rajesh Rajendran
b3594136ce or 1940 upstream docker release with the existing installation (#3316)
* chore(docker): Adding dynamic env generator
* ci(make): Create deployment yamls
* ci(make): Generating docker envs
* change env name structure
* proper env names
* chore(docker): clickhouse
* chore(docker-compose): generate env file format
* chore(docker-compose): Adding docker-compose
* chore(docker-compose): format make
* chore(docker-compose): Update version
* chore(docker-compose): adding new secrets
* ci(make): default target
* ci(Makefile): Update common protocol
* chore(docker-compose): refactor folder structure
* ci(make): rename to docker-envs
* feat(docker): add clickhouse volume definition
Add clickhouse persistent volume to the docker-compose configuration
to ensure data is preserved between container restarts.
* refactor: move env files to docker-envs directory
Updates all environment file references in docker-compose.yaml to use a
consistent directory structure, placing them under the docker-envs/
directory for better organization.
* fix(docker): rename imagestorage to images
 The `imagestorage` service and related environment file
 have been renamed to `images` for clarity and consistency.
 This change reflects the service's purpose of handling
 images.
* feat(docker): introduce docker-compose template
 A new docker-compose template
 to generate docker-compose files from a list of services.
 The template uses helm syntax.
* fix: Properly set FILES variable in Makefile
 The FILES variable was not being set correctly in the
 Makefile due to subshell issues. This commit fixes the
 variable assignment and ensures that the variable is
 accessible in subsequent commands.
* feat: Refactor docker-compose template for local development
 This commit introduces a complete overhaul of the
 docker-compose template, switching from a helm-based
 template to a native docker-compose.yml file. This
 change simplifies local development and makes it easier
 to manage the OpenReplay stack.
 The new template includes services for:
 - PostgreSQL
 - ClickHouse
 - Redis
 - MinIO
 - Nginx
 - Caddy
 It also includes migration jobs for setting up the
 database and MinIO.
* fix(docker-compose): Add fallback empty environment
 Add an empty environment to the docker-compose template to prevent
 errors when the env_file is missing. This ensures that the
 container can start even if the environment file is not present.
* feat(docker): Add domainname and aliases to services
 This change adds the `domainname` and `aliases` attributes to each
 service in the docker-compose.yaml file. This is to ensure that
 the services can communicate with each other using their fully
 qualified domain names. Also adds shared volume and empty
 environment variables.
* update version
* chore(docker): don't pull parallel
* chore(docker-compose): proper pull
* chore(docker-compose): Update db service urls
* fix(docker-compose): clickhouse url
* chore(clickhouse): Adding clickhouse db migration
* chore(docker-compose): Adding clickhouse
* fix(tpl): variable injection
* chore(fix): compose tpl variable rendering
* chore(docker-compose): Allow override pg variable
* chore(helm): remove assist-server
* chore(helm): pg integrations
* chore(nginx): removed services
* chore(docker-compose): Mulitple aliases
* chore(docker-compose): Adding more env vars
* feat(install): Dynamically generate passwords
 dynamic password generation by
 identifying `change_me_*` entries in `common.env` and
 replacing them with random passwords. This enhances
 security and simplifies initial setup.
 The changes include:
 - Replacing hardcoded password replacements with a loop
   that iterates through all `change_me_*` entries.
 - Using `grep` to find all `change_me_*` tokens.
 - Generating a random password for each token.
 - Updating the `common.env` file with the generated
   passwords.
* chore(docker-compose): disable clickhouse password
* fix(docker-compose): clickhouse-migration
* compose: chalice env
* chore(docker-compose): overlay vars
* chore(docker): Adding ch port
* chore(docker-compose): disable clickhouse password
* fix(docker-compose): migration name
* feat(docker): skip specific values
* chore(docker-compose): define namespace
---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-23 17:52:50 +02:00
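
A minimal sketch of the dynamic password generation described in this commit, assuming change_me_* placeholders in common.env as the message states; the random-password command itself is an assumption:

    # Replace every change_me_* token in common.env with a random password.
    for token in $(grep -o 'change_me_[A-Za-z0-9_]*' common.env | sort -u); do
        password=$(openssl rand -hex 16)
        sed -i "s/${token}/${password}/g" common.env
    done
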
GitHub Action
8f67edde8d Increment chalice chart version 2025-04-23 12:26:20 +02:00
Taha Yassine Kraiem
74ed29915b fix(chalice): enforce AND operator for table of requests and table of pages 2025-04-23 11:51:38 +02:00
GitHub Action
3ca71ec211 Increment chalice chart version 2025-04-22 19:23:11 +02:00
Taha Yassine Kraiem
0e469fd056 fix(chalice): fixes for table of requests 2025-04-22 19:03:35 +02:00
KRAIEM Taha Yassine
a8cb0e1643 fix(chalice): fixes for table of requests 2025-04-22 19:03:35 +02:00
GitHub Action
e171f0d8d5 Increment frontend chart version 2025-04-22 17:56:00 +02:00
nick-delirium
68ea291444 ui: fix timepicker and timezone interactions 2025-04-22 17:42:56 +02:00
GitHub Action
05cbb831c7 Increment frontend chart version 2025-04-22 10:32:00 +02:00
nick-delirium
5070ded1f4 ui: fix empty sank sessions fetch 2025-04-22 10:27:16 +02:00
GitHub Action
77610a4924 Increment frontend chart version 2025-04-16 17:45:25 +02:00
nick-delirium
7c34e4a0f6 ui: virtualizer for filter options list 2025-04-16 17:36:34 +02:00
GitHub Action
330e21183f Increment frontend chart version 2025-04-15 18:25:49 +02:00
Shekar Siri
30ce37896c feat(widget-sessions): improve session filtering logic
- Refactored session filtering logic to handle nested filters properly.
- Enhanced `fetchSessions` to ensure null checks and avoid errors.
- Updated `loadData` to handle `USER_PATH` and `HEATMAP` metric types.
- Improved UI consistency by adjusting spacing and formatting.
- Replaced redundant code with cleaner, more maintainable patterns.

This change improves the reliability and readability of the session
filtering and loading logic in the WidgetSessions component.
2025-04-15 18:15:03 +02:00
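
A rough TypeScript sketch of the nested-filter handling and null checks this commit describes; the types and names are hypothetical, not the actual WidgetSessions code:

    interface Filter {
      value?: string[];
      filters?: Filter[];   // nested filters
    }

    // Flatten nested filters, tolerating missing arrays instead of throwing.
    function collectFilters(filters: Filter[] = []): Filter[] {
      return filters.flatMap((f) => [f, ...collectFilters(f.filters ?? [])]);
    }
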
Andrey Babushkin
80a7817e7d
removed sorting by id (#3305) 2025-04-15 13:32:53 +02:00
Jorgen Evens
1b9c568cb1 fix(helm): fix broken volumeMounts indentation 2025-04-14 15:51:41 +02:00
GitHub Action
3759771ae9 Increment frontend chart version 2025-04-14 12:06:09 +02:00
Shekar Siri
f6ae5aba88 feat(SessionsBy): add specific filter for FETCH metric
Added a conditional check to handle the FETCH metric in the SessionsBy
component. When the metric is FETCH, a specific filter with key
FETCH_URL, operator is, and value derived from data.name is applied.
This ensures proper filtering behavior for FETCH-related metrics.
2025-04-14 12:01:51 +02:00
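
In sketch form (TypeScript, with assumed names for the filter shape and constants), the conditional this commit adds:

    // When the metric is FETCH, apply a FETCH_URL "is" filter built from data.name.
    if (metric === 'FETCH') {
      filters.push({ key: 'FETCH_URL', operator: 'is', value: [data.name] });
    }
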
Mehdi Osman
5190dc512a
Increment frontend chart version (#3297)
Co-authored-by: GitHub Action <action@github.com>
2025-04-14 11:54:25 +02:00
Andrey Babushkin
3fcccb51e8
Patch assist (#3296)
* add global method support

* fix errors

* remove wrong updates

* remove wrong updates

* add onDrag as option

* fix wrong updates
2025-04-14 11:33:06 +02:00
GitHub Action
26077d5689 Increment frontend chart version 2025-04-11 14:56:11 +02:00
Shekar Siri
00c57348fd feat(search): enhance filter value handling
- Added `checkFilterValue` function to validate and update filter values
  in `SearchStoreLive`.
- Updated `FilterItem` to handle undefined `value` gracefully by providing
  a default empty array.

These changes improve robustness in filter value processing.
2025-04-11 14:36:25 +02:00
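
Sketched in TypeScript (signatures are assumptions based on the commit message):

    // Validate a filter value, falling back to an empty array when undefined.
    function checkFilterValue(value?: string[]): string[] {
      return Array.isArray(value) ? value : [];
    }

    // In FilterItem, the same guard as a default:
    const safeValue = filter.value ?? [];
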
Shekar Siri
1f9bc5520a feat(search): add rounding to next minutes for date ranges
- Introduced `roundToNextMinutes` utility function to round timestamps
  to the next specified minute interval.
- Updated `Search` class to use the rounding function for non-custom
  date ranges.
- Modified `getRange` in `period.js` to align LAST_24_HOURS with
  15-minute intervals.
- Added `roundToNextMinutes` implementation in `utils/index.ts`.
2025-04-11 12:01:15 +02:00
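
A plausible implementation of the utility described above; the exact signature is an assumption, the rounding arithmetic is standard:

    // Round a millisecond timestamp up to the next N-minute boundary.
    function roundToNextMinutes(timestamp: number, minutes: number = 15): number {
      const interval = minutes * 60 * 1000;
      return Math.ceil(timestamp / interval) * interval;
    }

    // e.g. aligning the end of a LAST_24_HOURS range to 15-minute intervals:
    const rangeEnd = roundToNextMinutes(Date.now(), 15);
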
Shekar Siri
aef94618f6 Revert "Increment frontend chart version"
This reverts commit 2a330318c7.
2025-04-11 11:03:01 +02:00
GitHub Action
2a330318c7 Increment frontend chart version 2025-04-11 11:01:53 +02:00
Shekar Siri
6777d5ce2a feat(dashboard): set initial drill down period
Change default drill down period from LAST_7_DAYS to LAST_24_HOURS
and preserve current period when drilling down on chart click
2025-04-11 10:49:17 +02:00
rjshrjndrn
8a6f8fe91f chore(action): cloning specific tag
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-10 15:45:50 +02:00
Mehdi Osman
7b078fed4c
Increment frontend chart version (#3278)
Co-authored-by: GitHub Action <action@github.com>
2025-04-07 15:24:32 +02:00
Andrey Babushkin
894d4c84b3
Patch assist canvas (#3277)
* resolved conflict

* removed comments
2025-04-07 15:13:36 +02:00
Alexander
46390a3ba9
feat(assist-server): added the github action (#3275) 2025-04-07 10:43:48 +02:00
rjshrjndrn
621667f5ce ci(action): Build and patch github tags
feat(workflow): update commit timestamp for patching

Add a step to set the commit timestamp of the HEAD commit to be 1
second newer than the oldest of the last 3 commits. This ensures
proper chronological order while preserving the commit content.

- Fetch deeper history to access commit history
- Get oldest timestamp from recent commits
- Set new commit date with BSD-compatible date command
- Verify timestamp change with git log

The workflow was previously checking out 'main' branch with a
comment indicating it needed to be fixed. This change makes it
properly checkout the tag specified by the workflow input.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-04 16:09:05 +02:00
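
Condensed from the workflow steps shown in the diffs below, the timestamp rewrite amounts to:

    # Date HEAD one second after the oldest of the last 3 commits,
    # then force-update and push the tag.
    OLDEST_TS=$(git log -3 --pretty=format:"%at" | tail -1)
    NEW_TS=$((OLDEST_TS + 1))
    NEW_DATE=$(perl -le 'print scalar gmtime($ARGV[0])." +0000"' "$NEW_TS")
    GIT_COMMITTER_DATE="$NEW_DATE" git commit --amend --no-edit --date="$NEW_DATE"
    git tag "$INPUT_TAG" -f
    git push origin "$INPUT_TAG" -f
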
rjshrjndrn
a72f476f1c chore(ci): tag patching
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-04-04 13:15:56 +02:00
142 changed files with 3776 additions and 1419 deletions

.github/workflows/assist-server-ee.yaml (new file, 122 lines)

@@ -0,0 +1,122 @@
# This action will push the assist changes to aws
on:
  workflow_dispatch:
    inputs:
      skip_security_checks:
        description: "Skip Security checks if there is a unfixable vuln or error. Value: true/false"
        required: false
        default: "false"
  push:
    branches:
      - dev
    paths:
      - "ee/assist-server/**"
name: Build and Deploy Assist-Server EE
jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          # We need to diff with old commit
          # to see which workers got changed.
          fetch-depth: 2
      - uses: ./.github/composite-actions/update-keys
        with:
          assist_jwt_secret: ${{ secrets.ASSIST_JWT_SECRET }}
          assist_key: ${{ secrets.ASSIST_KEY }}
          domain_name: ${{ secrets.EE_DOMAIN_NAME }}
          jwt_refresh_secret: ${{ secrets.JWT_REFRESH_SECRET }}
          jwt_secret: ${{ secrets.EE_JWT_SECRET }}
          jwt_spot_refresh_secret: ${{ secrets.JWT_SPOT_REFRESH_SECRET }}
          jwt_spot_secret: ${{ secrets.JWT_SPOT_SECRET }}
          license_key: ${{ secrets.EE_LICENSE_KEY }}
          minio_access_key: ${{ secrets.EE_MINIO_ACCESS_KEY }}
          minio_secret_key: ${{ secrets.EE_MINIO_SECRET_KEY }}
          pg_password: ${{ secrets.EE_PG_PASSWORD }}
          registry_url: ${{ secrets.OSS_REGISTRY_URL }}
        name: Update Keys
      - name: Docker login
        run: |
          docker login ${{ secrets.EE_REGISTRY_URL }} -u ${{ secrets.EE_DOCKER_USERNAME }} -p "${{ secrets.EE_REGISTRY_TOKEN }}"
      - uses: azure/k8s-set-context@v1
        with:
          method: kubeconfig
          kubeconfig: ${{ secrets.EE_KUBECONFIG }} # Use content of kubeconfig in secret.
        id: setcontext
      - name: Building and Pushing Assist-Server image
        id: build-image
        env:
          DOCKER_REPO: ${{ secrets.EE_REGISTRY_URL }}
          IMAGE_TAG: ${{ github.ref_name }}_${{ github.sha }}-ee
          ENVIRONMENT: staging
        run: |
          skip_security_checks=${{ github.event.inputs.skip_security_checks }}
          cd assist-server
          PUSH_IMAGE=0 bash -x ./build.sh ee
          [[ "x$skip_security_checks" == "xtrue" ]] || {
            curl -L https://github.com/aquasecurity/trivy/releases/download/v0.56.2/trivy_0.56.2_Linux-64bit.tar.gz | tar -xzf - -C ./
            images=("assist-server")
            for image in ${images[*]};do
              ./trivy image --db-repository ghcr.io/aquasecurity/trivy-db:2 --db-repository public.ecr.aws/aquasecurity/trivy-db:2 --exit-code 1 --security-checks vuln --vuln-type os,library --severity "HIGH,CRITICAL" --ignore-unfixed $DOCKER_REPO/$image:$IMAGE_TAG
            done
            err_code=$?
            [[ $err_code -ne 0 ]] && {
              exit $err_code
            }
          } && {
            echo "Skipping Security Checks"
          }
          images=("assist-server")
          for image in ${images[*]};do
            docker push $DOCKER_REPO/$image:$IMAGE_TAG
          done
      - name: Creating old image input
        run: |
          #
          # Create yaml with existing image tags
          #
          kubectl get pods -n app -o jsonpath="{.items[*].spec.containers[*].image}" |\
            tr -s '[[:space:]]' '\n' | sort | uniq -c | grep '/foss/' | cut -d '/' -f3 > /tmp/image_tag.txt
          echo > /tmp/image_override.yaml
          for line in `cat /tmp/image_tag.txt`;
          do
            image_array=($(echo "$line" | tr ':' '\n'))
            cat <<EOF >> /tmp/image_override.yaml
          ${image_array[0]}:
            image:
              # We've to strip off the -ee, as helm will append it.
              tag: `echo ${image_array[1]} | cut -d '-' -f 1`
          EOF
          done
      - name: Deploy to kubernetes
        run: |
          pwd
          cd scripts/helmcharts/
          # Update changed image tag
          sed -i "/assist-server/{n;n;n;s/.*/    tag: ${IMAGE_TAG}/}" /tmp/image_override.yaml
          cat /tmp/image_override.yaml
          # Deploy command
          mkdir -p /tmp/charts
          mv openreplay/charts/{ingress-nginx,assist-server,quickwit,connector} /tmp/charts/
          rm -rf openreplay/charts/*
          mv /tmp/charts/* openreplay/charts/
          helm template openreplay -n app openreplay -f vars.yaml -f /tmp/image_override.yaml --set ingress-nginx.enabled=false --set skipMigration=true --no-hooks --kube-version=$k_version | kubectl apply -f -
        env:
          DOCKER_REPO: ${{ secrets.EE_REGISTRY_URL }}
          # We're not passing -ee flag, because helm will add that.
          IMAGE_TAG: ${{ github.ref_name }}_${{ github.sha }}
          ENVIRONMENT: staging

.github/workflows/patch-build-old.yaml (new file, 189 lines)

@@ -0,0 +1,189 @@
# Ref: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions
on:
  workflow_dispatch:
    inputs:
      services:
        description: 'Comma separated names of services to build(in small letters).'
        required: true
        default: 'chalice,frontend'
      tag:
        description: 'Tag to update.'
        required: true
        type: string
      branch:
        description: 'Branch to build patches from. Make sure the branch is uptodate with tag. Else itll cause missing commits.'
        required: true
        type: string
name: Build patches from tag, rewrite commit HEAD to older timestamp, and Push the tag
jobs:
  deploy:
    name: Build Patch from old tag
    runs-on: ubuntu-latest
    env:
      DEPOT_TOKEN: ${{ secrets.DEPOT_TOKEN }}
      DEPOT_PROJECT_ID: ${{ secrets.DEPOT_PROJECT_ID }}
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          fetch-depth: 4
          ref: ${{ github.event.inputs.tag }}
      - name: Set Remote with GITHUB_TOKEN
        run: |
          git config --unset http.https://github.com/.extraheader
          git remote set-url origin https://x-access-token:${{ secrets.ACTIONS_COMMMIT_TOKEN }}@github.com/${{ github.repository }}.git
      - name: Create backup tag with timestamp
        run: |
          set -e # Exit immediately if a command exits with a non-zero status
          TIMESTAMP=$(date +%Y%m%d%H%M%S)
          BACKUP_TAG="${{ github.event.inputs.tag }}-backup-${TIMESTAMP}"
          echo "BACKUP_TAG=${BACKUP_TAG}" >> $GITHUB_ENV
          echo "INPUT_TAG=${{ github.event.inputs.tag }}" >> $GITHUB_ENV
          git tag $BACKUP_TAG || { echo "Failed to create backup tag"; exit 1; }
          git push origin $BACKUP_TAG || { echo "Failed to push backup tag"; exit 1; }
          echo "Created backup tag: $BACKUP_TAG"
          # Get the oldest commit date from the last 3 commits in raw format
          OLDEST_COMMIT_TIMESTAMP=$(git log -3 --pretty=format:"%at" | tail -1)
          echo "Oldest commit timestamp: $OLDEST_COMMIT_TIMESTAMP"
          # Add 1 second to the timestamp
          NEW_TIMESTAMP=$((OLDEST_COMMIT_TIMESTAMP + 1))
          echo "NEW_TIMESTAMP=$NEW_TIMESTAMP" >> $GITHUB_ENV
      - name: Setup yq
        uses: mikefarah/yq@master
      # Configure AWS credentials for the first registry
      - name: Configure AWS credentials for RELEASE_ARM_REGISTRY
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_DEPOT_ACCESS_KEY }}
          aws-secret-access-key: ${{ secrets.AWS_DEPOT_SECRET_KEY }}
          aws-region: ${{ secrets.AWS_DEPOT_DEFAULT_REGION }}
      - name: Login to Amazon ECR for RELEASE_ARM_REGISTRY
        id: login-ecr-arm
        run: |
          aws ecr get-login-password --region ${{ secrets.AWS_DEPOT_DEFAULT_REGION }} | docker login --username AWS --password-stdin ${{ secrets.RELEASE_ARM_REGISTRY }}
          aws ecr-public get-login-password --region us-east-1 | docker login --username AWS --password-stdin ${{ secrets.RELEASE_OSS_REGISTRY }}
      - uses: depot/setup-action@v1
      - name: Get HEAD Commit ID
        run: echo "HEAD_COMMIT_ID=$(git rev-parse HEAD)" >> $GITHUB_ENV
      - name: Define Branch Name
        run: echo "BRANCH_NAME=${{inputs.branch}}" >> $GITHUB_ENV
      - name: Build
        id: build-image
        env:
          DOCKER_REPO_ARM: ${{ secrets.RELEASE_ARM_REGISTRY }}
          DOCKER_REPO_OSS: ${{ secrets.RELEASE_OSS_REGISTRY }}
          MSAAS_REPO_CLONE_TOKEN: ${{ secrets.MSAAS_REPO_CLONE_TOKEN }}
          MSAAS_REPO_URL: ${{ secrets.MSAAS_REPO_URL }}
          MSAAS_REPO_FOLDER: /tmp/msaas
        run: |
          set -exo pipefail
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action"
          git checkout -b $BRANCH_NAME
          working_dir=$(pwd)
          function image_version(){
            local service=$1
            chart_path="$working_dir/scripts/helmcharts/openreplay/charts/$service/Chart.yaml"
            current_version=$(yq eval '.AppVersion' $chart_path)
            new_version=$(echo $current_version | awk -F. '{$NF += 1 ; print $1"."$2"."$3}')
            echo $new_version
            # yq eval ".AppVersion = \"$new_version\"" -i $chart_path
          }
          function clone_msaas() {
            [ -d $MSAAS_REPO_FOLDER ] || {
              git clone -b $INPUT_TAG --recursive https://x-access-token:$MSAAS_REPO_CLONE_TOKEN@$MSAAS_REPO_URL $MSAAS_REPO_FOLDER
              cd $MSAAS_REPO_FOLDER
              cd openreplay && git fetch origin && git checkout $INPUT_TAG
              git log -1
              cd $MSAAS_REPO_FOLDER
              bash git-init.sh
              git checkout
            }
          }
          function build_managed() {
            local service=$1
            local version=$2
            echo building managed
            clone_msaas
            if [[ $service == 'chalice' ]]; then
              cd $MSAAS_REPO_FOLDER/openreplay/api
            else
              cd $MSAAS_REPO_FOLDER/openreplay/$service
            fi
            IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash build.sh >> /tmp/arm.txt
          }
          # Checking for backend images
          ls backend/cmd >> /tmp/backend.txt
          echo Services: "${{ github.event.inputs.services }}"
          IFS=',' read -ra SERVICES <<< "${{ github.event.inputs.services }}"
          BUILD_SCRIPT_NAME="build.sh"
          # Build FOSS
          for SERVICE in "${SERVICES[@]}"; do
            # Check if service is backend
            if grep -q $SERVICE /tmp/backend.txt; then
              cd backend
              foss_build_args="nil $SERVICE"
              ee_build_args="ee $SERVICE"
            else
              [[ $SERVICE == 'chalice' || $SERVICE == 'alerts' || $SERVICE == 'crons' ]] && cd $working_dir/api || cd $SERVICE
              [[ $SERVICE == 'alerts' || $SERVICE == 'crons' ]] && BUILD_SCRIPT_NAME="build_${SERVICE}.sh"
              ee_build_args="ee"
            fi
            version=$(image_version $SERVICE)
            echo IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
            IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
            echo IMAGE_TAG=$version-ee DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $ee_build_args
            IMAGE_TAG=$version-ee DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $ee_build_args
            if [[ "$SERVICE" != "chalice" && "$SERVICE" != "frontend" ]]; then
              IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
              echo IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
            else
              build_managed $SERVICE $version
            fi
            cd $working_dir
            chart_path="$working_dir/scripts/helmcharts/openreplay/charts/$SERVICE/Chart.yaml"
            yq eval ".AppVersion = \"$version\"" -i $chart_path
            git add $chart_path
            git commit -m "Increment $SERVICE chart version"
          done
      - name: Change commit timestamp
        run: |
          # Convert the timestamp to a date format git can understand
          NEW_DATE=$(perl -le 'print scalar gmtime($ARGV[0])." +0000"' $NEW_TIMESTAMP)
          echo "Setting commit date to: $NEW_DATE"
          # Amend the commit with the new date
          GIT_COMMITTER_DATE="$NEW_DATE" git commit --amend --no-edit --date="$NEW_DATE"
          # Verify the change
          git log -1 --pretty=format:"Commit now dated: %cD"
          # git tag and push
          git tag $INPUT_TAG -f
          git push origin $INPUT_TAG -f
      # - name: Debug Job
      #   if: ${{ failure() }}
      #   uses: mxschmitt/action-tmate@v3
      #   env:
      #     DOCKER_REPO_ARM: ${{ secrets.RELEASE_ARM_REGISTRY }}
      #     DOCKER_REPO_OSS: ${{ secrets.RELEASE_OSS_REGISTRY }}
      #     MSAAS_REPO_CLONE_TOKEN: ${{ secrets.MSAAS_REPO_CLONE_TOKEN }}
      #     MSAAS_REPO_URL: ${{ secrets.MSAAS_REPO_URL }}
      #     MSAAS_REPO_FOLDER: /tmp/msaas
      #   with:
      #     limit-access-to-actor: true

@@ -2,7 +2,6 @@
 on:
   workflow_dispatch:
-    description: 'This workflow will build for patches for latest tag, and will Always use commit from main branch.'
     inputs:
       services:
         description: 'Comma separated names of services to build(in small letters).'
@@ -20,12 +19,20 @@ jobs:
       DEPOT_PROJECT_ID: ${{ secrets.DEPOT_PROJECT_ID }}
     steps:
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v4
         with:
-          fetch-depth: 1
+          fetch-depth: 0
+          token: ${{ secrets.GITHUB_TOKEN }}
       - name: Rebase with main branch, to make sure the code has latest main changes
+        if: github.ref != 'refs/heads/main'
         run: |
-          git pull --rebase origin main
+          git remote -v
+          git config --global user.email "action@github.com"
+          git config --global user.name "GitHub Action"
+          git config --global rebase.autoStash true
+          git fetch origin main:main
+          git rebase main
+          git log -3
       - name: Downloading yq
         run: |
@@ -48,6 +55,8 @@ jobs:
           aws ecr-public get-login-password --region us-east-1 | docker login --username AWS --password-stdin ${{ secrets.RELEASE_OSS_REGISTRY }}
       - uses: depot/setup-action@v1
+        env:
+          DEPOT_TOKEN: ${{ secrets.DEPOT_TOKEN }}
       - name: Get HEAD Commit ID
         run: echo "HEAD_COMMIT_ID=$(git rev-parse HEAD)" >> $GITHUB_ENV
       - name: Define Branch Name
@@ -65,78 +74,168 @@ jobs:
           MSAAS_REPO_CLONE_TOKEN: ${{ secrets.MSAAS_REPO_CLONE_TOKEN }}
           MSAAS_REPO_URL: ${{ secrets.MSAAS_REPO_URL }}
           MSAAS_REPO_FOLDER: /tmp/msaas
+          SERVICES_INPUT: ${{ github.event.inputs.services }}
         run: |
-          set -exo pipefail
-          git config --local user.email "action@github.com"
-          git config --local user.name "GitHub Action"
-          git checkout -b $BRANCH_NAME
-          working_dir=$(pwd)
-          function image_version(){
-            local service=$1
-            chart_path="$working_dir/scripts/helmcharts/openreplay/charts/$service/Chart.yaml"
-            current_version=$(yq eval '.AppVersion' $chart_path)
-            new_version=$(echo $current_version | awk -F. '{$NF += 1 ; print $1"."$2"."$3}')
-            echo $new_version
-            # yq eval ".AppVersion = \"$new_version\"" -i $chart_path
-          }
-          function clone_msaas() {
-            [ -d $MSAAS_REPO_FOLDER ] || {
-              git clone -b dev --recursive https://x-access-token:$MSAAS_REPO_CLONE_TOKEN@$MSAAS_REPO_URL $MSAAS_REPO_FOLDER
-              cd $MSAAS_REPO_FOLDER
-              cd openreplay && git fetch origin && git checkout main # This have to be changed to specific tag
-              git log -1
-              cd $MSAAS_REPO_FOLDER
-              bash git-init.sh
-              git checkout
-            }
-          }
-          function build_managed() {
-            local service=$1
-            local version=$2
-            echo building managed
-            clone_msaas
-            if [[ $service == 'chalice' ]]; then
-              cd $MSAAS_REPO_FOLDER/openreplay/api
-            else
-              cd $MSAAS_REPO_FOLDER/openreplay/$service
-            fi
-            IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash build.sh >> /tmp/arm.txt
-          }
-          # Checking for backend images
-          ls backend/cmd >> /tmp/backend.txt
-          echo Services: "${{ github.event.inputs.services }}"
-          IFS=',' read -ra SERVICES <<< "${{ github.event.inputs.services }}"
-          BUILD_SCRIPT_NAME="build.sh"
-          # Build FOSS
-          for SERVICE in "${SERVICES[@]}"; do
-            # Check if service is backend
-            if grep -q $SERVICE /tmp/backend.txt; then
-              cd backend
-              foss_build_args="nil $SERVICE"
-              ee_build_args="ee $SERVICE"
-            else
-              [[ $SERVICE == 'chalice' || $SERVICE == 'alerts' || $SERVICE == 'crons' ]] && cd $working_dir/api || cd $SERVICE
-              [[ $SERVICE == 'alerts' || $SERVICE == 'crons' ]] && BUILD_SCRIPT_NAME="build_${SERVICE}.sh"
-              ee_build_args="ee"
-            fi
-            version=$(image_version $SERVICE)
-            echo IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
-            IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
-            echo IMAGE_TAG=$version-ee DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $ee_build_args
-            IMAGE_TAG=$version-ee DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $ee_build_args
-            if [[ "$SERVICE" != "chalice" && "$SERVICE" != "frontend" ]]; then
-              IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
-              echo IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash ${BUILD_SCRIPT_NAME} $foss_build_args
-            else
-              build_managed $SERVICE $version
-            fi
-            cd $working_dir
-            chart_path="$working_dir/scripts/helmcharts/openreplay/charts/$SERVICE/Chart.yaml"
-            yq eval ".AppVersion = \"$version\"" -i $chart_path
-            git add $chart_path
-            git commit -m "Increment $SERVICE chart version"
-            git push --set-upstream origin $BRANCH_NAME
-          done
+          #!/bin/bash
+          set -euo pipefail
+
+          # Configuration
+          readonly WORKING_DIR=$(pwd)
+          readonly BUILD_SCRIPT_NAME="build.sh"
+          readonly BACKEND_SERVICES_FILE="/tmp/backend.txt"
+
+          # Initialize git configuration
+          setup_git() {
+            git config --local user.email "action@github.com"
+            git config --local user.name "GitHub Action"
+            git checkout -b "$BRANCH_NAME"
+          }
+
+          # Get and increment image version
+          image_version() {
+            local service=$1
+            local chart_path="$WORKING_DIR/scripts/helmcharts/openreplay/charts/$service/Chart.yaml"
+            local current_version new_version
+            current_version=$(yq eval '.AppVersion' "$chart_path")
+            new_version=$(echo "$current_version" | awk -F. '{$NF += 1; print $1"."$2"."$3}')
+            echo "$new_version"
+          }
+
+          # Clone MSAAS repository if not exists
+          clone_msaas() {
+            if [[ ! -d "$MSAAS_REPO_FOLDER" ]]; then
+              git clone -b dev --recursive "https://x-access-token:${MSAAS_REPO_CLONE_TOKEN}@${MSAAS_REPO_URL}" "$MSAAS_REPO_FOLDER"
+              cd "$MSAAS_REPO_FOLDER"
+              cd openreplay && git fetch origin && git checkout main
+              git log -1
+              cd "$MSAAS_REPO_FOLDER"
+              bash git-init.sh
+              git checkout
+            fi
+          }
+
+          # Build managed services
+          build_managed() {
+            local service=$1
+            local version=$2
+
+            echo "Building managed service: $service"
+            clone_msaas
+
+            if [[ $service == 'chalice' ]]; then
+              cd "$MSAAS_REPO_FOLDER/openreplay/api"
+            else
+              cd "$MSAAS_REPO_FOLDER/openreplay/$service"
+            fi
+
+            local build_cmd="IMAGE_TAG=$version DOCKER_RUNTIME=depot DOCKER_BUILD_ARGS=--push ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash build.sh"
+
+            echo "Executing: $build_cmd"
+            if ! eval "$build_cmd" 2>&1; then
+              echo "Build failed for $service"
+              exit 1
+            fi
+          }
+
+          # Build service with given arguments
+          build_service() {
+            local service=$1
+            local version=$2
+            local build_args=$3
+            local build_script=${4:-$BUILD_SCRIPT_NAME}
+
+            local command="IMAGE_TAG=$version DOCKER_RUNTIME=depot DOCKER_BUILD_ARGS=--push ARCH=amd64 DOCKER_REPO=$DOCKER_REPO_OSS PUSH_IMAGE=0 bash $build_script $build_args"
+            echo "Executing: $command"
+            eval "$command"
+          }
+
+          # Update chart version and commit changes
+          update_chart_version() {
+            local service=$1
+            local version=$2
+            local chart_path="$WORKING_DIR/scripts/helmcharts/openreplay/charts/$service/Chart.yaml"
+
+            # Ensure we're in the original working directory/repository
+            cd "$WORKING_DIR"
+            yq eval ".AppVersion = \"$version\"" -i "$chart_path"
+            git add "$chart_path"
+            git commit -m "Increment $service chart version to $version"
+            git push --set-upstream origin "$BRANCH_NAME"
+            cd -
+          }
+
+          # Main execution
+          main() {
+            setup_git
+
+            # Get backend services list
+            ls backend/cmd >"$BACKEND_SERVICES_FILE"
+
+            # Parse services input (fix for GitHub Actions syntax)
+            echo "Services: ${SERVICES_INPUT:-$1}"
+            IFS=',' read -ra services <<<"${SERVICES_INPUT:-$1}"
+
+            # Process each service
+            for service in "${services[@]}"; do
+              echo "Processing service: $service"
+              cd "$WORKING_DIR"
+
+              local foss_build_args="" ee_build_args="" build_script="$BUILD_SCRIPT_NAME"
+
+              # Determine build configuration based on service type
+              if grep -q "$service" "$BACKEND_SERVICES_FILE"; then
+                # Backend service
+                cd backend
+                foss_build_args="nil $service"
+                ee_build_args="ee $service"
+              else
+                # Non-backend service
+                case "$service" in
+                  chalice | alerts | crons)
+                    cd "$WORKING_DIR/api"
+                    ;;
+                  *)
+                    cd "$service"
+                    ;;
+                esac
+
+                # Special build scripts for alerts/crons
+                if [[ $service == 'alerts' || $service == 'crons' ]]; then
+                  build_script="build_${service}.sh"
+                fi
+                ee_build_args="ee"
+              fi
+
+              # Get version and build
+              local version
+              version=$(image_version "$service")
+
+              # Build FOSS and EE versions
+              build_service "$service" "$version" "$foss_build_args"
+              build_service "$service" "${version}-ee" "$ee_build_args"
+
+              # Build managed version for specific services
+              if [[ "$service" != "chalice" && "$service" != "frontend" ]]; then
+                echo "Nothing to build in managed for service $service"
+              else
+                build_managed "$service" "$version"
+              fi
+
+              # Update chart and commit
+              update_chart_version "$service" "$version"
+            done
+
+            cd "$WORKING_DIR"
+
+            # Cleanup
+            rm -f "$BACKEND_SERVICES_FILE"
+          }
+
+          echo "Working directory: $WORKING_DIR"
+
+          # Run main function with all arguments
+          main "$SERVICES_INPUT"
       - name: Create Pull Request
         uses: repo-sync/pull-request@v2
@@ -147,8 +246,7 @@ jobs:
           pr_title: "Updated patch build from main ${{ env.HEAD_COMMIT_ID }}"
           pr_body: |
             This PR updates the Helm chart version after building the patch from $HEAD_COMMIT_ID.
-            Once this PR is merged, To update the latest tag, run the following workflow.
-            https://github.com/openreplay/openreplay/actions/workflows/update-tag.yaml
+            Once this PR is merged, tag update job will run automatically.
       # - name: Debug Job
       #   if: ${{ failure() }}

@@ -1,35 +1,42 @@
 on:
-  workflow_dispatch:
-    description: "This workflow will build for patches for latest tag, and will Always use commit from main branch."
-    inputs:
-      services:
-        description: "This action will update the latest tag with current main branch HEAD. Should I proceed ? true/false"
-        required: true
-        default: "false"
-name: Force Push tag with main branch HEAD
+  pull_request:
+    types: [closed]
+    branches:
+      - main
+name: Release tag update --force
 jobs:
   deploy:
     name: Build Patch from main
     runs-on: ubuntu-latest
-    env:
-      DEPOT_TOKEN: ${{ secrets.DEPOT_TOKEN }}
-      DEPOT_PROJECT_ID: ${{ secrets.DEPOT_PROJECT_ID }}
+    if: ${{ (github.event_name == 'pull_request' && github.event.pull_request.merged == true) || github.event.inputs.services == 'true' }}
     steps:
       - name: Checkout
        uses: actions/checkout@v2
+      - name: Get latest release tag using GitHub API
+        id: get-latest-tag
+        run: |
+          LATEST_TAG=$(curl -s -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
+            "https://api.github.com/repos/${{ github.repository }}/releases/latest" \
+            | jq -r .tag_name)
+          # Fallback to git command if API doesn't return a tag
+          if [ "$LATEST_TAG" == "null" ] || [ -z "$LATEST_TAG" ]; then
+            echo "Not found latest tag"
+            exit 100
+          fi
+          echo "LATEST_TAG=$LATEST_TAG" >> $GITHUB_ENV
+          echo "Latest tag: $LATEST_TAG"
       - name: Set Remote with GITHUB_TOKEN
         run: |
           git config --unset http.https://github.com/.extraheader
-          git remote set-url origin https://x-access-token:${{ secrets.ACTIONS_COMMMIT_TOKEN }}@github.com/${{ github.repository }}.git
+          git remote set-url origin https://x-access-token:${{ secrets.ACTIONS_COMMMIT_TOKEN }}@github.com/${{ github.repository }}
       - name: Push main branch to tag
         run: |
+          git fetch --tags
           git checkout main
-          git push origin HEAD:refs/tags/$(git tag --list 'v[0-9]*' --sort=-v:refname | head -n 1) --force
+          echo "Updating tag ${{ env.LATEST_TAG }} to point to latest commit on main"
+          git push origin HEAD:refs/tags/${{ env.LATEST_TAG }} --force
-      # - name: Debug Job
-      #   if: ${{ failure() }}
-      #   uses: mxschmitt/action-tmate@v3
-      #   with:
-      #     limit-access-to-actor: true

@@ -85,7 +85,8 @@ def __generic_query(typename, value_length=None):
                     ORDER BY value"""
     if value_length is None or value_length > 2:
-        return f"""(SELECT DISTINCT value, type
+        return f"""SELECT DISTINCT ON(value,type) value, type
+                   FROM ((SELECT DISTINCT value, type
                      FROM {TABLE}
                      WHERE
                        project_id = %(project_id)s
@@ -101,7 +102,7 @@ def __generic_query(typename, value_length=None):
                        AND type='{typename.upper()}'
                        AND value ILIKE %(value)s
                      ORDER BY value
-                     LIMIT 5);"""
+                     LIMIT 5)) AS raw;"""
     return f"""SELECT DISTINCT value, type
                FROM {TABLE}
                WHERE
@@ -326,7 +327,7 @@ def __search_metadata(project_id, value, key=None, source=None):
                                 AND {colname} ILIKE %(svalue)s LIMIT 5)""")
     with pg_client.PostgresClient() as cur:
         cur.execute(cur.mogrify(f"""\
-                SELECT key, value, 'METADATA' AS TYPE
+                SELECT DISTINCT ON(key, value) key, value, 'METADATA' AS TYPE
                 FROM({" UNION ALL ".join(sub_from)}) AS all_metas
                 LIMIT 5;""", {"project_id": project_id, "value": helper.string_to_sql_like(value),
                               "svalue": helper.string_to_sql_like("^" + value)}))

@@ -338,14 +338,14 @@ def search(data: schemas.SearchErrorsSchema, project: schemas.ProjectContext, us
                 SELECT details.error_id as error_id,
                        name, message, users, total,
                        sessions, last_occurrence, first_occurrence, chart
-                FROM (SELECT JSONExtractString(toString(`$properties`), 'error_id') AS error_id,
+                FROM (SELECT error_id,
                              JSONExtractString(toString(`$properties`), 'name') AS name,
                              JSONExtractString(toString(`$properties`), 'message') AS message,
                              COUNT(DISTINCT user_id) AS users,
                              COUNT(DISTINCT events.session_id) AS sessions,
                              MAX(created_at) AS max_datetime,
                              MIN(created_at) AS min_datetime,
-                             COUNT(DISTINCT JSONExtractString(toString(`$properties`), 'error_id'))
+                             COUNT(DISTINCT error_id)
                                    OVER() AS total
                       FROM {MAIN_EVENTS_TABLE} AS events
                       INNER JOIN (SELECT session_id, coalesce(user_id,toString(user_uuid)) AS user_id
@@ -357,7 +357,7 @@ def search(data: schemas.SearchErrorsSchema, project: schemas.ProjectContext, us
                       GROUP BY error_id, name, message
                       ORDER BY {sort} {order}
                       LIMIT %(errors_limit)s OFFSET %(errors_offset)s) AS details
-                INNER JOIN (SELECT JSONExtractString(toString(`$properties`), 'error_id') AS error_id,
+                INNER JOIN (SELECT error_id,
                                    toUnixTimestamp(MAX(created_at))*1000 AS last_occurrence,
                                    toUnixTimestamp(MIN(created_at))*1000 AS first_occurrence
                             FROM {MAIN_EVENTS_TABLE}
@@ -366,7 +366,7 @@ def search(data: schemas.SearchErrorsSchema, project: schemas.ProjectContext, us
                             GROUP BY error_id) AS time_details
                 ON details.error_id=time_details.error_id
                 INNER JOIN (SELECT error_id, groupArray([timestamp, count]) AS chart
-                            FROM (SELECT JSONExtractString(toString(`$properties`), 'error_id') AS error_id,
+                            FROM (SELECT error_id,
                                          gs.generate_series AS timestamp,
                                          COUNT(DISTINCT session_id) AS count
                                   FROM generate_series(%(startDate)s, %(endDate)s, %(step_size)s) AS gs

@@ -50,8 +50,8 @@ class JIRAIntegration(base.BaseIntegration):
             cur.execute(
                 cur.mogrify(
                     """SELECT username, token, url
-                       FROM public.jira_cloud
-                       WHERE user_id=%(user_id)s;""",
+                       FROM public.jira_cloud
+                       WHERE user_id = %(user_id)s;""",
                     {"user_id": self._user_id})
             )
             data = helper.dict_to_camel_case(cur.fetchone())
@@ -95,10 +95,9 @@ class JIRAIntegration(base.BaseIntegration):
     def add(self, username, token, url, obfuscate=False):
         with pg_client.PostgresClient() as cur:
             cur.execute(
-                cur.mogrify("""\
-                        INSERT INTO public.jira_cloud(username, token, user_id,url)
-                        VALUES (%(username)s, %(token)s, %(user_id)s,%(url)s)
-                        RETURNING username, token, url;""",
+                cur.mogrify(""" \
+                        INSERT INTO public.jira_cloud(username, token, user_id, url)
+                        VALUES (%(username)s, %(token)s, %(user_id)s, %(url)s) RETURNING username, token, url;""",
                             {"user_id": self._user_id, "username": username,
                              "token": token, "url": url})
             )
@@ -112,9 +111,10 @@ class JIRAIntegration(base.BaseIntegration):
     def delete(self):
         with pg_client.PostgresClient() as cur:
             cur.execute(
-                cur.mogrify("""\
-                        DELETE FROM public.jira_cloud
-                        WHERE user_id=%(user_id)s;""",
+                cur.mogrify(""" \
+                        DELETE
+                        FROM public.jira_cloud
+                        WHERE user_id = %(user_id)s;""",
                             {"user_id": self._user_id})
             )
         return {"state": "success"}
@@ -125,7 +125,7 @@ class JIRAIntegration(base.BaseIntegration):
             changes={
                 "username": data.username,
                 "token": data.token if len(data.token) > 0 and data.token.find("***") == -1 \
-                    else self.integration.token,
+                    else self.integration["token"],
                 "url": str(data.url)
             },
             obfuscate=True

@ -153,7 +153,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
"isEvent": True, "isEvent": True,
"value": [], "value": [],
"operator": e.operator, "operator": e.operator,
"filters": [] "filters": e.filters
}) })
for v in e.value: for v in e.value:
if v not in extra_conditions[e.operator].value: if v not in extra_conditions[e.operator].value:
@ -178,7 +178,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
"isEvent": True, "isEvent": True,
"value": [], "value": [],
"operator": e.operator, "operator": e.operator,
"filters": [] "filters": e.filters
}) })
for v in e.value: for v in e.value:
if v not in extra_conditions[e.operator].value: if v not in extra_conditions[e.operator].value:
@ -1108,8 +1108,12 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
is_any = sh.isAny_opreator(f.operator) is_any = sh.isAny_opreator(f.operator)
if is_any or len(f.value) == 0: if is_any or len(f.value) == 0:
continue continue
is_negative_operator = sh.is_negation_operator(f.operator)
f.value = helper.values_for_operator(value=f.value, op=f.operator) f.value = helper.values_for_operator(value=f.value, op=f.operator)
op = sh.get_sql_operator(f.operator) op = sh.get_sql_operator(f.operator)
r_op = ""
if is_negative_operator:
r_op = sh.reverse_sql_operator(op)
e_k_f = e_k + f"_fetch{j}" e_k_f = e_k + f"_fetch{j}"
full_args = {**full_args, **sh.multi_values(f.value, value_key=e_k_f)} full_args = {**full_args, **sh.multi_values(f.value, value_key=e_k_f)}
if f.type == schemas.FetchFilterType.FETCH_URL: if f.type == schemas.FetchFilterType.FETCH_URL:
@ -1118,6 +1122,12 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
)) ))
events_conditions[-1]["condition"].append(event_where[-1]) events_conditions[-1]["condition"].append(event_where[-1])
apply = True apply = True
if is_negative_operator:
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = sh.multi_conditions(
f"sub.`$properties`.url_path {r_op} %({e_k_f})s", f.value, value_key=e_k_f)
elif f.type == schemas.FetchFilterType.FETCH_STATUS_CODE: elif f.type == schemas.FetchFilterType.FETCH_STATUS_CODE:
event_where.append(json_condition( event_where.append(json_condition(
"main", "$properties", 'status', op, f.value, e_k_f, True, True "main", "$properties", 'status', op, f.value, e_k_f, True, True
@ -1130,6 +1140,13 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
)) ))
events_conditions[-1]["condition"].append(event_where[-1]) events_conditions[-1]["condition"].append(event_where[-1])
apply = True apply = True
if is_negative_operator:
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = sh.multi_conditions(
f"sub.`$properties`.method {r_op} %({e_k_f})s", f.value,
value_key=e_k_f)
elif f.type == schemas.FetchFilterType.FETCH_DURATION: elif f.type == schemas.FetchFilterType.FETCH_DURATION:
event_where.append( event_where.append(
sh.multi_conditions(f"main.`$duration_s` {f.operator} %({e_k_f})s/1000", f.value, sh.multi_conditions(f"main.`$duration_s` {f.operator} %({e_k_f})s/1000", f.value,
@ -1142,12 +1159,26 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
)) ))
events_conditions[-1]["condition"].append(event_where[-1]) events_conditions[-1]["condition"].append(event_where[-1])
apply = True apply = True
if is_negative_operator:
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = sh.multi_conditions(
f"sub.`$properties`.request_body {r_op} %({e_k_f})s", f.value,
value_key=e_k_f)
elif f.type == schemas.FetchFilterType.FETCH_RESPONSE_BODY: elif f.type == schemas.FetchFilterType.FETCH_RESPONSE_BODY:
event_where.append(json_condition( event_where.append(json_condition(
"main", "$properties", 'response_body', op, f.value, e_k_f "main", "$properties", 'response_body', op, f.value, e_k_f
)) ))
events_conditions[-1]["condition"].append(event_where[-1]) events_conditions[-1]["condition"].append(event_where[-1])
apply = True apply = True
if is_negative_operator:
events_conditions_not.append(
{
"type": f"sub.`$event_name`='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = sh.multi_conditions(
f"sub.`$properties`.response_body {r_op} %({e_k_f})s", f.value,
value_key=e_k_f)
else:
logging.warning(f"undefined FETCH filter: {f.type}")
if not apply:
@ -1395,17 +1426,30 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
if extra_conditions and len(extra_conditions) > 0:
_extra_or_condition = []
for i, c in enumerate(extra_conditions):
if sh.isAny_opreator(c.operator) and c.type != schemas.EventType.REQUEST_DETAILS.value:
continue
e_k = f"ec_value{i}"
op = sh.get_sql_operator(c.operator)
c.value = helper.values_for_operator(value=c.value, op=c.operator)
full_args = {**full_args,
**sh.multi_values(c.value, value_key=e_k)}
if c.type in (schemas.EventType.LOCATION.value, schemas.EventType.REQUEST.value):
_extra_or_condition.append(
sh.multi_conditions(f"extra_event.url_path {op} %({e_k})s",
c.value, value_key=e_k))
elif c.type == schemas.EventType.REQUEST_DETAILS.value:
for j, c_f in enumerate(c.filters):
if sh.isAny_opreator(c_f.operator) or len(c_f.value) == 0:
continue
e_k += f"_{j}"
op = sh.get_sql_operator(c_f.operator)
c_f.value = helper.values_for_operator(value=c_f.value, op=c_f.operator)
full_args = {**full_args,
**sh.multi_values(c_f.value, value_key=e_k)}
if c_f.type == schemas.FetchFilterType.FETCH_URL.value:
_extra_or_condition.append(
sh.multi_conditions(f"extra_event.url_path {op} %({e_k})s",
c_f.value, value_key=e_k))
else:
logging.warning(f"unsupported extra_event type:${c.type}")
if len(_extra_or_condition) > 0:

View file
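
The hunks above mirror every negated fetch filter (URL, method, request body, response body) into events_conditions_not, using the reversed SQL operator so the rows that must not exist can be matched positively in a subquery. A minimal Python sketch of that reversal idea, with hypothetical stand-ins for the sh helpers (not the project's code):

OPERATORS = {"is": "=", "isNot": "!=", "contains": "ILIKE", "notContains": "NOT ILIKE"}
REVERSED = {"!=": "=", "NOT ILIKE": "ILIKE"}

def build_conditions(operator: str, column: str, key: str):
    op = OPERATORS[operator]
    positive = f"main.{column} {op} %({key})s"
    negative = None
    if operator in ("isNot", "notContains"):
        # For a negative operator, the subquery side matches the rows that
        # must NOT exist, so it uses the reversed (positive) operator.
        negative = f"sub.{column} {REVERSED[op]} %({key})s"
    return positive, negative

print(build_conditions("notContains", "url_path", "e_0_fetch0"))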

@ -148,7 +148,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
"isEvent": True, "isEvent": True,
"value": [], "value": [],
"operator": e.operator, "operator": e.operator,
"filters": [] "filters": e.filters
}) })
for v in e.value:
if v not in extra_conditions[e.operator].value:
@ -165,7 +165,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
"isEvent": True, "isEvent": True,
"value": [], "value": [],
"operator": e.operator, "operator": e.operator,
"filters": [] "filters": e.filters
}) })
for v in e.value: for v in e.value:
if v not in extra_conditions[e.operator].value: if v not in extra_conditions[e.operator].value:
@ -989,7 +989,7 @@ def search_query_parts(data: schemas.SessionsSearchPayloadSchema, error_status,
sh.multi_conditions(f"ev.{events.EventType.LOCATION.column} {op} %({e_k})s", sh.multi_conditions(f"ev.{events.EventType.LOCATION.column} {op} %({e_k})s",
c.value, value_key=e_k)) c.value, value_key=e_k))
else: else:
logger.warning(f"unsupported extra_event type:${c.type}") logger.warning(f"unsupported extra_event type: {c.type}")
if len(_extra_or_condition) > 0: if len(_extra_or_condition) > 0:
extra_constraints.append("(" + " OR ".join(_extra_or_condition) + ")") extra_constraints.append("(" + " OR ".join(_extra_or_condition) + ")")
query_part = f"""\ query_part = f"""\

View file

@ -4,37 +4,41 @@ import schemas
def get_sql_operator(op: Union[schemas.SearchEventOperator, schemas.ClickEventExtraOperator, schemas.MathOperator]):
    if isinstance(op, Enum):
        op = op.value
    return {
        schemas.SearchEventOperator.IS.value: "=",
        schemas.SearchEventOperator.ON.value: "=",
        schemas.SearchEventOperator.ON_ANY.value: "IN",
        schemas.SearchEventOperator.IS_NOT.value: "!=",
        schemas.SearchEventOperator.NOT_ON.value: "!=",
        schemas.SearchEventOperator.CONTAINS.value: "ILIKE",
        schemas.SearchEventOperator.NOT_CONTAINS.value: "NOT ILIKE",
        schemas.SearchEventOperator.STARTS_WITH.value: "ILIKE",
        schemas.SearchEventOperator.ENDS_WITH.value: "ILIKE",
        # Selector operators:
        schemas.ClickEventExtraOperator.IS.value: "=",
        schemas.ClickEventExtraOperator.IS_NOT.value: "!=",
        schemas.ClickEventExtraOperator.CONTAINS.value: "ILIKE",
        schemas.ClickEventExtraOperator.NOT_CONTAINS.value: "NOT ILIKE",
        schemas.ClickEventExtraOperator.STARTS_WITH.value: "ILIKE",
        schemas.ClickEventExtraOperator.ENDS_WITH.value: "ILIKE",
        schemas.MathOperator.GREATER.value: ">",
        schemas.MathOperator.GREATER_EQ.value: ">=",
        schemas.MathOperator.LESS.value: "<",
        schemas.MathOperator.LESS_EQ.value: "<=",
    }.get(op, "=")
def is_negation_operator(op: schemas.SearchEventOperator):
    if isinstance(op, Enum):
        op = op.value
    return op in [schemas.SearchEventOperator.IS_NOT.value,
                  schemas.SearchEventOperator.NOT_ON.value,
                  schemas.SearchEventOperator.NOT_CONTAINS.value,
                  schemas.ClickEventExtraOperator.IS_NOT.value,
                  schemas.ClickEventExtraOperator.NOT_CONTAINS.value]
def reverse_sql_operator(op):

View file
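
The rewritten helpers key the operator map on .value and coerce incoming enum members first. The pitfall this avoids: with a plain (non-str) Enum, a raw string no longer hashes to the member key, so the lookup silently falls back to the default. A small illustration with a hypothetical enum, not the project's schemas:

from enum import Enum

class MathOperator(Enum):
    GREATER = ">"

by_member = {MathOperator.GREATER: "gt"}
print(by_member.get(">"))            # None: the raw string misses the member key

by_value = {MathOperator.GREATER.value: "gt"}

def get_op(op):
    if isinstance(op, Enum):         # coerce members to their plain value first
        op = op.value
    return by_value.get(op, "=")

print(get_op(MathOperator.GREATER))  # gt
print(get_op(">"))                   # gt — plain strings now resolve too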

@ -960,36 +960,6 @@ class CardSessionsSchema(_TimedSchema, _PaginatedSchema):
return self
# We don't need this as the UI is expecting filters to override the full series' filters
# @model_validator(mode="after")
# def __merge_out_filters_with_series(self):
# for f in self.filters:
# for s in self.series:
# found = False
#
# if f.is_event:
# sub = s.filter.events
# else:
# sub = s.filter.filters
#
# for e in sub:
# if f.type == e.type and f.operator == e.operator:
# found = True
# if f.is_event:
# # If extra event: append value
# for v in f.value:
# if v not in e.value:
# e.value.append(v)
# else:
# # If extra filter: override value
# e.value = f.value
# if not found:
# sub.append(f)
#
# self.filters = []
#
# return self
# UI is expecting filters to override the full series' filters
@model_validator(mode="after")
def __override_series_filters_with_outer_filters(self):
@ -1060,6 +1030,16 @@ class CardTable(__CardSchema):
values["metricValue"] = [] values["metricValue"] = []
return values return values
@model_validator(mode="after")
def __enforce_AND_operator(self):
self.metric_of = MetricOfTable(self.metric_of)
if self.metric_of in (MetricOfTable.VISITED_URL, MetricOfTable.FETCH, \
MetricOfTable.VISITED_URL.value, MetricOfTable.FETCH.value):
for s in self.series:
if s.filter is not None:
s.filter.events_order = SearchEventOrder.AND
return self
@model_validator(mode="after")
def __transform(self):
self.metric_of = MetricOfTable(self.metric_of)

View file
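
The new __enforce_AND_operator validator forces AND ordering for table cards grouped by visited URL or fetch. A minimal sketch of the pattern, assuming pydantic v2 and illustrative models rather than the project's schemas:

from pydantic import BaseModel, model_validator

class SeriesFilter(BaseModel):
    events_order: str = "or"

class Series(BaseModel):
    filter: SeriesFilter | None = None

class CardTable(BaseModel):
    metric_of: str
    series: list[Series] = []

    @model_validator(mode="after")
    def enforce_and(self):
        # Grouping rows by URL/fetch only makes sense when every event in a
        # series must match, so the ordering is overridden to AND.
        if self.metric_of in ("visitedUrl", "fetch"):
            for s in self.series:
                if s.filter is not None:
                    s.filter.events_order = "and"
        return self

card = CardTable(metric_of="fetch", series=[Series(filter=SeriesFilter())])
print(card.series[0].filter.events_order)  # and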

@ -2,11 +2,12 @@ package datasaver
import (
"context"
"encoding/json"
"openreplay/backend/pkg/db/types"
"openreplay/backend/internal/config/db" "openreplay/backend/internal/config/db"
"openreplay/backend/pkg/db/clickhouse" "openreplay/backend/pkg/db/clickhouse"
"openreplay/backend/pkg/db/postgres" "openreplay/backend/pkg/db/postgres"
"openreplay/backend/pkg/db/types"
"openreplay/backend/pkg/logger" "openreplay/backend/pkg/logger"
. "openreplay/backend/pkg/messages" . "openreplay/backend/pkg/messages"
queue "openreplay/backend/pkg/queue/types" queue "openreplay/backend/pkg/queue/types"
@ -50,10 +51,6 @@ func New(log logger.Logger, cfg *db.Config, pg *postgres.Conn, ch clickhouse.Con
}

func (s *saverImpl) Handle(msg Message) {
var (
sessCtx = context.WithValue(context.Background(), "sessionID", msg.SessionID())
session *sessions.Session
@ -69,6 +66,23 @@ func (s *saverImpl) Handle(msg Message) {
return
}
if msg.TypeID() == MsgCustomEvent {
m := msg.(*CustomEvent)
// Try to parse the custom event payload as JSON and extract the or_timestamp field
type CustomEventPayload struct {
CustomTimestamp uint64 `json:"or_timestamp"`
}
customPayload := &CustomEventPayload{}
if err := json.Unmarshal([]byte(m.Payload), customPayload); err == nil {
if customPayload.CustomTimestamp >= session.Timestamp {
s.log.Info(sessCtx, "custom event timestamp received: %v", m.Timestamp)
msg.Meta().Timestamp = customPayload.CustomTimestamp
s.log.Info(sessCtx, "custom event timestamp updated: %v", m.Timestamp)
}
}
defer s.Handle(types.WrapCustomEvent(m))
}
if IsMobileType(msg.TypeID()) {
if err := s.handleMobileMessage(sessCtx, session, msg); err != nil {
if !postgres.IsPkeyViolation(err) {

View file
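
The relocated custom-event block now runs after the session lookup because the override rule needs session.Timestamp: an or_timestamp from the payload is only accepted if it is not earlier than the session start. The same rule in a short Python sketch (shapes are illustrative, not the Go implementation):

import json

def effective_timestamp(payload: str, msg_ts: int, session_start: int) -> int:
    try:
        custom = json.loads(payload).get("or_timestamp")
    except (ValueError, AttributeError):
        return msg_ts                       # unparseable payload: keep original
    if isinstance(custom, int) and custom >= session_start:
        return custom                       # plausible custom timestamp wins
    return msg_ts

print(effective_timestamp('{"or_timestamp": 1700000500}', 1700000100, 1700000000))
print(effective_timestamp('not json', 1700000100, 1700000000))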

@ -86,7 +86,8 @@ def __generic_query(typename, value_length=None):
ORDER BY value""" ORDER BY value"""
if value_length is None or value_length > 2: if value_length is None or value_length > 2:
return f"""(SELECT DISTINCT value, type return f"""SELECT DISTINCT ON(value, type) value, type
FROM ((SELECT DISTINCT value, type
FROM {TABLE} FROM {TABLE}
WHERE WHERE
project_id = %(project_id)s project_id = %(project_id)s
@ -102,7 +103,7 @@ def __generic_query(typename, value_length=None):
AND type='{typename.upper()}'
AND value ILIKE %(value)s
ORDER BY value
LIMIT 5)) AS raw;"""
return f"""SELECT DISTINCT value, type
FROM {TABLE}
WHERE
@ -257,7 +258,7 @@ def __search_metadata(project_id, value, key=None, source=None):
WHERE project_id = %(project_id)s
AND {colname} ILIKE %(svalue)s LIMIT 5)""")
with ch_client.ClickHouseClient() as cur:
query = cur.format(query=f"""SELECT DISTINCT ON(key, value) key, value, 'METADATA' AS TYPE
FROM({" UNION ALL ".join(sub_from)}) AS all_metas
LIMIT 5;""", parameters={"project_id": project_id, "value": helper.string_to_sql_like(value),
"svalue": helper.string_to_sql_like("^" + value)})

View file
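
The query change wraps the two UNIONed branches (prefix match and substring match) in an outer SELECT DISTINCT ON (value, type), so a value matched by both branches is returned once. The equivalent in-memory dedup, as a sketch:

def distinct_on(rows, keys=("value", "type")):
    seen, out = set(), []
    for row in rows:
        k = tuple(row[key] for key in keys)
        if k not in seen:               # keep only the first row per key pair
            seen.add(k)
            out.append(row)
    return out

rows = [
    {"value": "/checkout", "type": "LOCATION"},   # from the prefix branch
    {"value": "/checkout", "type": "LOCATION"},   # same row, substring branch
    {"value": "/cart", "type": "LOCATION"},
]
print(distinct_on(rows))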

@ -71,7 +71,7 @@ def get_details(project_id, error_id, user_id, **data):
MAIN_EVENTS_TABLE = exp_ch_helper.get_main_events_table(0)
ch_basic_query = errors_helper.__get_basic_constraints_ch(time_constraint=False)
ch_basic_query.append("error_id = %(error_id)s")
with ch_client.ClickHouseClient() as ch:
data["startDate24"] = TimeUTC.now(-1)
@ -95,7 +95,7 @@ def get_details(project_id, error_id, user_id, **data):
"error_id": error_id} "error_id": error_id}
main_ch_query = f"""\ main_ch_query = f"""\
WITH pre_processed AS (SELECT toString(`$properties`.error_id) AS error_id, WITH pre_processed AS (SELECT error_id,
toString(`$properties`.name) AS name, toString(`$properties`.name) AS name,
toString(`$properties`.message) AS message, toString(`$properties`.message) AS message,
session_id, session_id,
@ -183,7 +183,7 @@ def get_details(project_id, error_id, user_id, **data):
AND `$event_name` = 'ERROR'
AND events.created_at >= toDateTime(timestamp / 1000)
AND events.created_at < toDateTime((timestamp + %(step_size24)s) / 1000)
AND error_id = %(error_id)s
GROUP BY timestamp
ORDER BY timestamp) AS chart_details
) AS chart_details24 ON TRUE
@ -196,7 +196,7 @@ def get_details(project_id, error_id, user_id, **data):
AND `$event_name` = 'ERROR'
AND events.created_at >= toDateTime(timestamp / 1000)
AND events.created_at < toDateTime((timestamp + %(step_size30)s) / 1000)
AND error_id = %(error_id)s
GROUP BY timestamp
ORDER BY timestamp) AS chart_details
) AS chart_details30 ON TRUE;"""

View file

@ -1,3 +1,16 @@
SELECT 1
FROM (SELECT throwIf(platform = 'ios', 'IOS sessions found')
FROM experimental.sessions) AS raw
LIMIT 1;
SELECT 1
FROM (SELECT throwIf(platform = 'android', 'Android sessions found')
FROM experimental.sessions) AS raw
LIMIT 1;
ALTER TABLE experimental.sessions
MODIFY COLUMN platform Enum8('web'=1,'mobile'=2) DEFAULT 'web';
CREATE OR REPLACE FUNCTION openreplay_version AS() -> 'v1.22.0-ee';
SET allow_experimental_json_type = 1;
@ -151,8 +164,7 @@ CREATE TABLE IF NOT EXISTS product_analytics.events
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
ORDER BY (project_id, "$event_name", created_at, session_id)
TTL _deleted_at + INTERVAL 1 DAY DELETE WHERE _deleted_at != '1970-01-01 00:00:00';
-- The list of events that should not be ingested,
-- according to a specific event_name and optional properties

View file
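
The migration prepends two guard statements built on throwIf: if any 'ios' or 'android' session still exists, the query raises inside ClickHouse and the script aborts before the destructive ALTER narrows the platform enum to ('web','mobile'). A sketch of the same fail-fast pattern from application code, assuming a hypothetical run_query callable (not a specific client API):

def guard_no_rows(run_query, predicate_sql: str, message: str) -> None:
    # throwIf raises server-side when any row matches the predicate,
    # so the migration stops before the schema change runs.
    run_query(
        f"SELECT 1 FROM (SELECT throwIf({predicate_sql}, '{message}') "
        f"FROM experimental.sessions) AS raw LIMIT 1;"
    )

# guard_no_rows(client.execute, "platform = 'ios'", 'IOS sessions found')  # hypothetical client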

@ -9,8 +9,7 @@ CREATE TABLE IF NOT EXISTS experimental.autocomplete
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(_timestamp)
ORDER BY (project_id, type, value);
CREATE TABLE IF NOT EXISTS experimental.events
(
@ -87,8 +86,7 @@ CREATE TABLE IF NOT EXISTS experimental.events
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(datetime)
ORDER BY (project_id, datetime, event_type, session_id, message_id);
@ -108,7 +106,7 @@ CREATE TABLE IF NOT EXISTS experimental.sessions
user_country Enum8('UN'=-128, 'RW'=-127, 'SO'=-126, 'YE'=-125, 'IQ'=-124, 'SA'=-123, 'IR'=-122, 'CY'=-121, 'TZ'=-120, 'SY'=-119, 'AM'=-118, 'KE'=-117, 'CD'=-116, 'DJ'=-115, 'UG'=-114, 'CF'=-113, 'SC'=-112, 'JO'=-111, 'LB'=-110, 'KW'=-109, 'OM'=-108, 'QA'=-107, 'BH'=-106, 'AE'=-105, 'IL'=-104, 'TR'=-103, 'ET'=-102, 'ER'=-101, 'EG'=-100, 'SD'=-99, 'GR'=-98, 'BI'=-97, 'EE'=-96, 'LV'=-95, 'AZ'=-94, 'LT'=-93, 'SJ'=-92, 'GE'=-91, 'MD'=-90, 'BY'=-89, 'FI'=-88, 'AX'=-87, 'UA'=-86, 'MK'=-85, 'HU'=-84, 'BG'=-83, 'AL'=-82, 'PL'=-81, 'RO'=-80, 'XK'=-79, 'ZW'=-78, 'ZM'=-77, 'KM'=-76, 'MW'=-75, 'LS'=-74, 'BW'=-73, 'MU'=-72, 'SZ'=-71, 'RE'=-70, 'ZA'=-69, 'YT'=-68, 'MZ'=-67, 'MG'=-66, 'AF'=-65, 'PK'=-64, 'BD'=-63, 'TM'=-62, 'TJ'=-61, 'LK'=-60, 'BT'=-59, 'IN'=-58, 'MV'=-57, 'IO'=-56, 'NP'=-55, 'MM'=-54, 'UZ'=-53, 'KZ'=-52, 'KG'=-51, 'TF'=-50, 'HM'=-49, 'CC'=-48, 'PW'=-47, 'VN'=-46, 'TH'=-45, 'ID'=-44, 'LA'=-43, 'TW'=-42, 'PH'=-41, 'MY'=-40, 'CN'=-39, 'HK'=-38, 'BN'=-37, 'MO'=-36, 'KH'=-35, 'KR'=-34, 'JP'=-33, 'KP'=-32, 'SG'=-31, 'CK'=-30, 'TL'=-29, 'RU'=-28, 'MN'=-27, 'AU'=-26, 'CX'=-25, 'MH'=-24, 'FM'=-23, 'PG'=-22, 'SB'=-21, 'TV'=-20, 'NR'=-19, 'VU'=-18, 'NC'=-17, 'NF'=-16, 'NZ'=-15, 'FJ'=-14, 'LY'=-13, 'CM'=-12, 'SN'=-11, 'CG'=-10, 'PT'=-9, 'LR'=-8, 'CI'=-7, 'GH'=-6, 'GQ'=-5, 'NG'=-4, 'BF'=-3, 'TG'=-2, 'GW'=-1, 'MR'=0, 'BJ'=1, 'GA'=2, 'SL'=3, 'ST'=4, 'GI'=5, 'GM'=6, 'GN'=7, 'TD'=8, 'NE'=9, 'ML'=10, 'EH'=11, 'TN'=12, 'ES'=13, 'MA'=14, 'MT'=15, 'DZ'=16, 'FO'=17, 'DK'=18, 'IS'=19, 'GB'=20, 'CH'=21, 'SE'=22, 'NL'=23, 'AT'=24, 'BE'=25, 'DE'=26, 'LU'=27, 'IE'=28, 'MC'=29, 'FR'=30, 'AD'=31, 'LI'=32, 'JE'=33, 'IM'=34, 'GG'=35, 'SK'=36, 'CZ'=37, 'NO'=38, 'VA'=39, 'SM'=40, 'IT'=41, 'SI'=42, 'ME'=43, 'HR'=44, 'BA'=45, 'AO'=46, 'NA'=47, 'SH'=48, 'BV'=49, 'BB'=50, 'CV'=51, 'GY'=52, 'GF'=53, 'SR'=54, 'PM'=55, 'GL'=56, 'PY'=57, 'UY'=58, 'BR'=59, 'FK'=60, 'GS'=61, 'JM'=62, 'DO'=63, 'CU'=64, 'MQ'=65, 'BS'=66, 'BM'=67, 'AI'=68, 'TT'=69, 'KN'=70, 'DM'=71, 'AG'=72, 'LC'=73, 'TC'=74, 'AW'=75, 'VG'=76, 'VC'=77, 'MS'=78, 'MF'=79, 'BL'=80, 'GP'=81, 'GD'=82, 'KY'=83, 'BZ'=84, 'SV'=85, 'GT'=86, 'HN'=87, 'NI'=88, 'CR'=89, 'VE'=90, 'EC'=91, 'CO'=92, 'PA'=93, 'HT'=94, 'AR'=95, 'CL'=96, 'BO'=97, 'PE'=98, 'MX'=99, 'PF'=100, 'PN'=101, 'KI'=102, 'TK'=103, 'TO'=104, 'WF'=105, 'WS'=106, 'NU'=107, 'MP'=108, 'GU'=109, 'PR'=110, 'VI'=111, 'UM'=112, 'AS'=113, 'CA'=114, 'US'=115, 'PS'=116, 'RS'=117, 'AQ'=118, 'SX'=119, 'CW'=120, 'BQ'=121, 'SS'=122,'BU'=123, 'VD'=124, 'YD'=125, 'DD'=126),
user_city LowCardinality(String),
user_state LowCardinality(String),
platform Enum8('web'=1,'mobile'=2) DEFAULT 'web',
datetime DateTime,
timezone LowCardinality(Nullable(String)),
duration UInt32,
@ -140,7 +138,6 @@ CREATE TABLE IF NOT EXISTS experimental.sessions
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMMDD(datetime)
ORDER BY (project_id, datetime, session_id)
SETTINGS index_granularity = 512;
CREATE TABLE IF NOT EXISTS experimental.user_favorite_sessions
@ -152,8 +149,7 @@ CREATE TABLE IF NOT EXISTS experimental.user_favorite_sessions
sign Int8
) ENGINE = CollapsingMergeTree(sign)
PARTITION BY toYYYYMM(_timestamp)
ORDER BY (project_id, user_id, session_id);
CREATE TABLE IF NOT EXISTS experimental.user_viewed_sessions
(
@ -163,8 +159,7 @@ CREATE TABLE IF NOT EXISTS experimental.user_viewed_sessions
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(_timestamp)
ORDER BY (project_id, user_id, session_id);
CREATE TABLE IF NOT EXISTS experimental.user_viewed_errors
(
@ -174,8 +169,7 @@ CREATE TABLE IF NOT EXISTS experimental.user_viewed_errors
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(_timestamp)
ORDER BY (project_id, user_id, error_id);
CREATE TABLE IF NOT EXISTS experimental.issues
(
@ -188,8 +182,7 @@ CREATE TABLE IF NOT EXISTS experimental.issues
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(_timestamp)
ORDER BY (project_id, issue_id, type);
@ -292,8 +285,7 @@ CREATE TABLE IF NOT EXISTS experimental.sessions_feature_flags
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(datetime)
ORDER BY (project_id, datetime, session_id, feature_flag_id, condition_id);
CREATE TABLE IF NOT EXISTS experimental.ios_events
(
@ -329,8 +321,7 @@ CREATE TABLE IF NOT EXISTS experimental.ios_events
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
PARTITION BY toYYYYMM(datetime)
ORDER BY (project_id, datetime, event_type, session_id, message_id);
SET allow_experimental_json_type = 1;
@ -484,8 +475,7 @@ CREATE TABLE IF NOT EXISTS product_analytics.events
_timestamp DateTime DEFAULT now()
) ENGINE = ReplacingMergeTree(_timestamp)
ORDER BY (project_id, "$event_name", created_at, session_id)
TTL _deleted_at + INTERVAL 1 DAY DELETE WHERE _deleted_at != '1970-01-01 00:00:00';
-- The list of events that should not be ingested,
-- according to a specific event_name and optional properties

View file

@ -1,5 +1,4 @@
import withSiteIdUpdater from 'HOCs/withSiteIdUpdater';
import React, { Suspense, lazy } from 'react';
import { Redirect, Route, Switch } from 'react-router-dom';
import { observer } from 'mobx-react-lite';
@ -10,7 +9,7 @@ import { Loader } from 'UI';
import APIClient from './api_client';
import * as routes from './routes';
import { debounceCall } from '@/utils';
const components: any = {
SessionPure: lazy(() => import('Components/Session/Session')),
@ -88,7 +87,6 @@ const ASSIST_PATH = routes.assist();
const LIVE_SESSION_PATH = routes.liveSession();
const MULTIVIEW_PATH = routes.multiview();
const MULTIVIEW_INDEX_PATH = routes.multiviewIndex();
const USABILITY_TESTING_PATH = routes.usabilityTesting();
const USABILITY_TESTING_EDIT_PATH = routes.usabilityTestingEdit();
@ -99,7 +97,6 @@ const SPOT_PATH = routes.spot();
const SCOPE_SETUP = routes.scopeSetup();
const HIGHLIGHTS_PATH = routes.highlights();
function PrivateRoutes() {
const { projectsStore, userStore, integrationsStore, searchStore } = useStore();
@ -124,13 +121,9 @@ function PrivateRoutes() {
}
}, [siteId]);
React.useEffect(() => {
if (!searchStore.urlParsed) return;
debounceCall(() => searchStore.fetchSessions(true), 250)()
}, [searchStore.urlParsed, searchStore.instance.filters, searchStore.instance.eventsOrder]);
return ( return (

View file
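
The routes change drops the mutable module-level debounceSearch in favour of building the debounced call inline with debounceCall. For readers unfamiliar with the pattern, a timer-based debounce in Python — a generic sketch, not the frontend's implementation:

import threading

def debounce(fn, wait_ms: int):
    timer = None
    def wrapper(*args, **kwargs):
        nonlocal timer
        if timer is not None:
            timer.cancel()          # every call restarts the countdown
        timer = threading.Timer(wait_ms / 1000, fn, args, kwargs)
        timer.start()
    return wrapper

fetch = debounce(lambda: print("fetch sessions"), 250)
fetch(); fetch(); fetch()           # only the last call fires, ~250 ms later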

@ -6,6 +6,7 @@ import DefaultPlaying from 'Shared/SessionSettings/components/DefaultPlaying';
import DefaultTimezone from 'Shared/SessionSettings/components/DefaultTimezone';
import ListingVisibility from 'Shared/SessionSettings/components/ListingVisibility';
import MouseTrailSettings from 'Shared/SessionSettings/components/MouseTrailSettings';
import VirtualModeSettings from '../shared/SessionSettings/components/VirtualMode';
import DebugLog from './DebugLog';
import { useTranslation } from 'react-i18next';
@ -35,6 +36,7 @@ function SessionsListingSettings() {
<div className="flex flex-col gap-2"> <div className="flex flex-col gap-2">
<MouseTrailSettings /> <MouseTrailSettings />
<DebugLog /> <DebugLog />
<VirtualModeSettings />
</div>
</div>
</div>

View file

@ -6,6 +6,7 @@ import CardSessionsByList from 'Components/Dashboard/Widgets/CardSessionsByList'
import { useModal } from 'Components/ModalContext';
import Widget from '@/mstore/types/widget';
import { useTranslation } from 'react-i18next';
import { FilterKey } from 'Types/filter/filterType';
interface Props {
metric?: any;
@ -35,20 +36,20 @@ function SessionsBy(props: Props) {
...filtersMap[metric.metricOf],
value: [data.name],
type: filtersMap[metric.metricOf].key,
filters: [],
};
if (metric.metricOf === FilterKey.FETCH) {
baseFilter.filters = [
{
key: FilterKey.FETCH_URL,
operator: 'is',
value: [data.name],
type: FilterKey.FETCH_URL,
}
];
}
const {
key,
operatorOptions,

View file
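
The SessionsBy change stops cloning the generic filter map for drill-down and instead attaches a single nested FETCH_URL sub-filter when the card groups by fetch. A sketch of the resulting shape in Python, with illustrative string keys rather than the real FilterKey constants:

def session_filter_for(metric_of: str, name: str) -> dict:
    base = {"value": [name], "type": metric_of, "filters": []}
    if metric_of == "fetch":
        # Fetch rows are matched through a nested FETCH_URL sub-filter
        # rather than through the top-level value.
        base["filters"] = [
            {"type": "fetchUrl", "operator": "is", "value": [name]}
        ]
    return base

print(session_filter_for("fetch", "/api/orders"))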

@ -23,6 +23,7 @@ function BottomButtons({
<Button
loading={loading}
type="primary"
htmlType="submit"
disabled={loading || !instance.validate()}
id="submit-button"
>

View file

@ -43,7 +43,7 @@ function ClickMapRagePicker() {
<Checkbox onChange={onToggle} label={t('Include rage clicks')} />
<Button size="small" onClick={refreshHeatmapSession}>
{t('Get new image')}
</Button>
</div>
);

View file

@ -64,6 +64,7 @@ function DashboardView(props: Props) {
};
useEffect(() => {
dashboardStore.resetPeriod();
if (queryParams.has('modal')) {
onAddWidgets();
trimQuery();

View file

@ -117,8 +117,6 @@ const ListView: React.FC<Props> = ({
if (disableSelection) {
const path = withSiteId(`/metrics/${metric.metricId}`, siteId);
history.push(path);
} else {
toggleSelection?.(metric.metricId);
} }
}; };

View file

@ -181,9 +181,10 @@ function WidgetChart(props: Props) {
}
prevMetricRef.current = _metric;
const timestmaps = drillDownPeriod.toTimestamps();
const density = props.isPreview ? metric.density : dashboardStore.selectedDensity
const payload = isSaved
? { ...metricParams, density }
: { ...params, ...timestmaps, ..._metric.toJson(), density };
debounceRequest(
_metric,
payload,

View file

@ -55,7 +55,7 @@ function RangeGranularity({
}
const PAST_24_HR_MS = 24 * 60 * 60 * 1000;
export function calculateGranularities(periodDurationMs: number) {
const granularities = [
{ label: 'Hourly', durationMs: 60 * 60 * 1000 },
{ label: 'Daily', durationMs: 24 * 60 * 60 * 1000 },

View file

@ -1,376 +1,395 @@
import React, {useEffect, useState} from 'react';
import {NoContent, Loader, Pagination} from 'UI';
import {Button, Tag, Tooltip, Dropdown, message} from 'antd';
import {UndoOutlined, DownOutlined} from '@ant-design/icons';
import cn from 'classnames';
import {useStore} from 'App/mstore';
import SessionItem from 'Shared/SessionItem';
import {observer} from 'mobx-react-lite';
import {DateTime} from 'luxon';
import {debounce, numberWithCommas} from 'App/utils';
import useIsMounted from 'App/hooks/useIsMounted';
import AnimatedSVG, {ICONS} from 'Shared/AnimatedSVG/AnimatedSVG';
import {HEATMAP, USER_PATH, FUNNEL} from 'App/constants/card';
import {useTranslation} from 'react-i18next';

interface Props {
  className?: string;
}

function WidgetSessions(props: Props) {
  const {t} = useTranslation();
  const listRef = React.useRef<HTMLDivElement>(null);
  const {className = ''} = props;
  const [activeSeries, setActiveSeries] = useState('all');
  const [data, setData] = useState<any>([]);
  const isMounted = useIsMounted();
  const [loading, setLoading] = useState(false);
  // all filtering done through series now
  const filteredSessions = getListSessionsBySeries(data, 'all');
  const {dashboardStore, metricStore, sessionStore, customFieldStore} =
    useStore();
  const focusedSeries = metricStore.focusedSeriesName;
  const filter = dashboardStore.drillDownFilter;
  const widget = metricStore.instance;
  const startTime = DateTime.fromMillis(filter.startTimestamp).toFormat(
    'LLL dd, yyyy HH:mm',
  );
  const endTime = DateTime.fromMillis(filter.endTimestamp).toFormat(
    'LLL dd, yyyy HH:mm',
  );
  const [seriesOptions, setSeriesOptions] = useState([
    {label: t('All'), value: 'all'},
  ]);
  const hasFilters =
    filter.filters.length > 0 ||
    filter.startTimestamp !== dashboardStore.drillDownPeriod.start ||
    filter.endTimestamp !== dashboardStore.drillDownPeriod.end;
  const filterText = filter.filters.length > 0 ? filter.filters[0].value : '';
  const metaList = customFieldStore.list.map((i: any) => i.key);
  const seriesDropdownItems = seriesOptions.map((option) => ({
    key: option.value,
    label: (
      <div onClick={() => setActiveSeries(option.value)}>{option.label}</div>
    ),
  }));

  useEffect(() => {
    if (!widget.series) return;
    const seriesOptions = widget.series.map((item: any) => ({
      label: item.name,
      value: item.seriesId ?? item.name,
    }));
    setSeriesOptions([{label: t('All'), value: 'all'}, ...seriesOptions]);
  }, [widget.series.length]);

  const fetchSessions = (metricId: any, filter: any) => {
    if (!isMounted()) return;

    if (widget.metricType === FUNNEL) {
      if (filter.series[0].filter.filters.length === 0) {
        setLoading(false);
        return setData([]);
      }
    }

    setLoading(true);
    const filterCopy = {...filter};
    delete filterCopy.eventsOrderSupport;

    try {
      // Handle filters properly with null checks
      if (filterCopy.filters && filterCopy.filters.length > 0) {
        // Ensure the nested path exists before pushing
        if (filterCopy.series?.[0]?.filter) {
          if (!filterCopy.series[0].filter.filters) {
            filterCopy.series[0].filter.filters = [];
          }
          filterCopy.series[0].filter.filters.push(...filterCopy.filters);
        }
        filterCopy.filters = [];
      }
    } catch (e) {
      // do nothing
    }
    widget
      .fetchSessions(metricId, filterCopy)
      .then((res: any) => {
        setData(res);
        if (metricStore.drillDown) {
          setTimeout(() => {
            message.info(t('Sessions Refreshed!'));
            listRef.current?.scrollIntoView({behavior: 'smooth'});
            metricStore.setDrillDown(false);
          }, 0);
        }
      })
      .finally(() => {
        setLoading(false);
      });
  };
  const fetchClickmapSessions = (customFilters: Record<string, any>) => {
    sessionStore.getSessions(customFilters).then((data) => {
      setData([{...data, seriesId: 1, seriesName: 'Clicks'}]);
    });
  };
  const debounceRequest: any = React.useCallback(
    debounce(fetchSessions, 1000),
    [],
  );
  const debounceClickMapSearch = React.useCallback(
    debounce(fetchClickmapSessions, 1000),
    [],
  );

  const depsString = JSON.stringify(widget.series);

  const loadData = () => {
    if (widget.metricType === HEATMAP && metricStore.clickMapSearch) {
      const clickFilter = {
        value: [metricStore.clickMapSearch],
        type: 'CLICK',
        operator: 'onSelector',
        isEvent: true,
        // @ts-ignore
        filters: [],
      };
      const timeRange = {
        rangeValue: dashboardStore.drillDownPeriod.rangeValue,
        startDate: dashboardStore.drillDownPeriod.start,
        endDate: dashboardStore.drillDownPeriod.end,
      };
      const customFilter = {
        ...filter,
        ...timeRange,
        filters: [...sessionStore.userFilter.filters, clickFilter],
      };
      debounceClickMapSearch(customFilter);
    } else {
      const hasStartPoint =
        !!widget.startPoint && widget.metricType === USER_PATH;
      const onlyFocused = focusedSeries
        ? widget.series.filter((s) => s.name === focusedSeries)
        : widget.series;
      const activeSeries = metricStore.disabledSeries.length
        ? onlyFocused.filter(
            (s) => !metricStore.disabledSeries.includes(s.name),
          )
        : onlyFocused;
      const seriesJson = activeSeries.map((s) => s.toJson());
      if (hasStartPoint) {
        seriesJson[0].filter.filters.push(widget.startPoint.toJson());
      }
      if (widget.metricType === USER_PATH) {
        if (
          seriesJson[0].filter.filters[0].value[0] === '' &&
          widget.data.nodes?.length
        ) {
          seriesJson[0].filter.filters[0].value = widget.data.nodes[0].name;
        } else if (
          seriesJson[0].filter.filters[0].value[0] === '' &&
          !widget.data.nodes?.length
        ) {
          // no point requesting if we don't have starting point picked by api
          return;
        }
      }
      debounceRequest(widget.metricId, {
        ...filter,
        series: seriesJson,
        page: metricStore.sessionsPage,
        limit: metricStore.sessionsPageSize,
      });
    }
  };
  useEffect(() => {
    metricStore.updateKey('sessionsPage', 1);
    loadData();
  }, [
    filter.startTimestamp,
    filter.endTimestamp,
    filter.filters,
    depsString,
    metricStore.clickMapSearch,
    focusedSeries,
    widget.startPoint,
    widget.data.nodes,
    metricStore.disabledSeries.length,
  ]);
  useEffect(loadData, [metricStore.sessionsPage]);
  useEffect(() => {
    if (activeSeries === 'all') {
      metricStore.setFocusedSeriesName(null);
    } else {
      metricStore.setFocusedSeriesName(
        seriesOptions.find((option) => option.value === activeSeries)?.label,
        false,
      );
    }
  }, [activeSeries]);
  useEffect(() => {
    if (focusedSeries) {
      setActiveSeries(
        seriesOptions.find((option) => option.label === focusedSeries)?.value ||
        'all',
      );
    } else {
      setActiveSeries('all');
    }
  }, [focusedSeries]);

  const clearFilters = () => {
    metricStore.updateKey('sessionsPage', 1);
    dashboardStore.resetDrillDownFilter();
  };

  return (
    <div
      className={cn(
        className,
        'bg-white p-3 pb-0 rounded-xl shadow-sm border mt-3',
      )}
    >
      <div className="flex items-center justify-between">
        <div>
          <div className="flex items-baseline gap-2">
            <h2 className="text-xl">
              {metricStore.clickMapSearch ? t('Clicks') : t('Sessions')}
            </h2>
            <div className="ml-2 color-gray-medium">
              {metricStore.clickMapLabel
                ? `on "${metricStore.clickMapLabel}" `
                : null}
              {t('between')}{' '}
              <span className="font-medium color-gray-darkest">
                {startTime}
              </span>{' '}
              {t('and')}{' '}
              <span className="font-medium color-gray-darkest">
                {endTime}
              </span>{' '}
            </div>
            {hasFilters && (
              <Tooltip title={t('Clear Drilldown')} placement="top">
                <Button type="text" size="small" onClick={clearFilters}>
                  <UndoOutlined/>
                </Button>
              </Tooltip>
            )}
          </div>
          {hasFilters && widget.metricType === 'table' && (
            <div className="py-2">
              <Tag
                closable
                onClose={clearFilters}
                className="truncate max-w-44 rounded-lg"
              >
                {filterText}
              </Tag>
            </div>
          )}
        </div>
        <div className="flex items-center gap-4">
          {widget.metricType !== 'table' && widget.metricType !== HEATMAP && (
            <div className="flex items-center ml-6">
              <span className="mr-2 color-gray-medium">
                {t('Filter by Series')}
              </span>
              <Dropdown
                menu={{
                  items: seriesDropdownItems,
                  selectable: true,
                  selectedKeys: [activeSeries],
                }}
                trigger={['click']}
              >
                <Button type="text" size="small">
                  {seriesOptions.find((option) => option.value === activeSeries)
                    ?.label || t('Select Series')}
                  <DownOutlined/>
                </Button>
              </Dropdown>
            </div>
          )}
        </div>
      </div>
      <div className="mt-3">
        <Loader loading={loading}>
          <NoContent
            title={
              <div className="flex items-center justify-center flex-col">
                <AnimatedSVG name={ICONS.NO_SESSIONS} size={60}/>
                <div className="mt-4"/>
                <div className="text-center">
                  {t('No relevant sessions found for the selected time period')}
                </div>
              </div>
            }
            show={filteredSessions.sessions.length === 0}
          >
            {filteredSessions.sessions.map((session: any) => (
              <React.Fragment key={session.sessionId}>
                <SessionItem
                  disableUser
                  session={session}
                  metaList={metaList}
                />
                <div className="border-b"/>
              </React.Fragment>
            ))}
            <div
              className="flex items-center justify-between p-5"
              ref={listRef}
            >
              <div>
                {t('Showing')}{' '}
                <span className="font-medium">
                  {(metricStore.sessionsPage - 1) *
                    metricStore.sessionsPageSize +
                    1}
                </span>{' '}
                {t('to')}{' '}
                <span className="font-medium">
                  {(metricStore.sessionsPage - 1) *
                    metricStore.sessionsPageSize +
                    filteredSessions.sessions.length}
                </span>{' '}
                {t('of')}{' '}
                <span className="font-medium">
                  {numberWithCommas(filteredSessions.total)}
                </span>{' '}
                {t('sessions.')}
              </div>
              <Pagination
                page={metricStore.sessionsPage}
                total={filteredSessions.total}
                onPageChange={(page: any) =>
                  metricStore.updateKey('sessionsPage', page)
                }
                limit={metricStore.sessionsPageSize}
                debounceRequest={500}
              />
            </div>
          </NoContent>
        </Loader>
      </div>
    </div>
  );
}

const getListSessionsBySeries = (data: any, seriesId: any) => {
  const arr = data.reduce(
    (arr: any, element: any) => {
      if (seriesId === 'all') {
        const sessionIds = arr.sessions.map((i: any) => i.sessionId);
        const sessions = element.sessions.filter(
          (i: any) => !sessionIds.includes(i.sessionId),
        );
        arr.sessions.push(...sessions);
      } else if (element.seriesId === seriesId) {
        const sessionIds = arr.sessions.map((i: any) => i.sessionId);
        const sessions = element.sessions.filter(
          (i: any) => !sessionIds.includes(i.sessionId),
        );
        const duplicates = element.sessions.length - sessions.length;
        arr.sessions.push(...sessions);
        arr.total = element.total - duplicates;
      }
      return arr;
    },
    {sessions: []},
  );
  arr.total =
    seriesId === 'all'
      ? Math.max(...data.map((i: any) => i.total))
      : data.find((i: any) => i.seriesId === seriesId).total;
  return arr;
};

export default observer(WidgetSessions);

View file
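
The reworked fetchSessions above no longer mutates the incoming filter: it copies the payload, folds any outer drill-down filters into the first series' own filter list, then clears the outer list so the backend receives a single combined filter tree. The merge step, sketched in Python with illustrative shapes:

def fold_drilldown(payload: dict) -> dict:
    copy = {**payload}
    copy.pop("eventsOrderSupport", None)
    drill = copy.get("filters") or []
    series = copy.get("series") or []
    if drill and series and series[0].get("filter") is not None:
        series[0]["filter"].setdefault("filters", [])
        series[0]["filter"]["filters"].extend(drill)   # series filter absorbs drill-down
        copy["filters"] = []                           # outer filters are now redundant
    return copy

payload = {"filters": [{"type": "location", "value": ["/cart"]}],
           "series": [{"filter": {"filters": []}}]}
print(fold_drilldown(payload)["series"][0]["filter"]["filters"])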

@ -92,6 +92,9 @@ function WidgetView({
filter: { filters: selectedCard.filters },
}),
];
} else if (selectedCard.cardType === TABLE) {
cardData.series = [new FilterSeries()];
cardData.series[0].filter.eventsOrder = 'and';
}
if (selectedCard.cardType === FUNNEL) {
cardData.series = [new FilterSeries()];

View file

@ -83,6 +83,7 @@ function WidgetWrapperNew(props: Props & RouteComponentProps) {
});
const onChartClick = () => {
dashboardStore.setDrillDownPeriod(dashboardStore.period);
// if (!isWidget || isPredefined) return;
props.history.push(
withSiteId(

View file

@ -8,7 +8,7 @@ import {
LikeFilled,
LikeOutlined,
} from '@ant-design/icons';
import { Tour, TourProps } from 'antd';
import { useTranslation } from 'react-i18next';
interface Props { interface Props {

View file

@ -42,7 +42,7 @@ function DropdownAudioPlayer({
return {
url: data.url,
timestamp: data.timestamp,
start: Math.max(0, startTs),
};
}),
[audioEvents.length, sessionStart],

View file

@ -114,13 +114,11 @@ function PlayerBlockHeader(props: any) {
)}
{_metaList.length > 0 && (
<SessionMetaList
horizontal
metaList={_metaList}
maxLength={2}
/>
)}
</div>
</div>

View file

@ -38,8 +38,8 @@ function WebPlayer(props: any) {
uxtestingStore,
uiPlayerStore,
integrationsStore,
} = useStore();
const devTools = sessionStore.devTools
const session = sessionStore.current;
const { prefetched } = sessionStore;
const startedAt = sessionStore.current.startedAt || 0;
@ -57,14 +57,17 @@ function WebPlayer(props: any) {
const [fullView, setFullView] = useState(false);

React.useEffect(() => {
const handleActivation = () => {
if (!document.hidden) {
setWindowActive(true);
document.removeEventListener('visibilitychange', handleActivation);
}
};
document.addEventListener('visibilitychange', handleActivation);

return () => {
devTools.update('network', { activeTab: 'ALL' });
document.removeEventListener('visibilitychange', handleActivation);
}
}, []);

View file

@ -169,6 +169,6 @@ function TabChange({ from, to, activeUrl, onClick }) {
</div>
</div>
);
};
export default observer(EventGroupWrapper);

View file

@ -4,17 +4,17 @@ import cn from 'classnames';
import { observer } from 'mobx-react-lite';
import React from 'react';
import { VList, VListHandle } from 'virtua';
import { Button } from 'antd';
import { PlayerContext } from 'App/components/Session/playerContext';
import { useStore } from 'App/mstore';
import { Icon } from 'UI';
import { Search } from 'lucide-react';
import EventGroupWrapper from './EventGroupWrapper';
import EventSearch from './EventSearch/EventSearch';
import styles from './eventsBlock.module.css';
import { useTranslation } from 'react-i18next';
import { CloseOutlined } from "@ant-design/icons";
import { Tooltip } from "antd";
import { getDefaultFramework, frameworkIcons } from "../UnitStepsModal";

interface IProps {
@ -25,7 +25,7 @@ const MODES = {
SELECT: 'select', SELECT: 'select',
SEARCH: 'search', SEARCH: 'search',
EXPORT: 'export', EXPORT: 'export',
} };
function EventsBlock(props: IProps) { function EventsBlock(props: IProps) {
const defaultFramework = getDefaultFramework(); const defaultFramework = getDefaultFramework();
@ -95,7 +95,7 @@ function EventsBlock(props: IProps) {
? e.time >= zoomStartTs && e.time <= zoomEndTs ? e.time >= zoomStartTs && e.time <= zoomEndTs
: false : false
: true, : true,
); );
}, [ }, [
filteredLength, filteredLength,
notesWithEvtsLength, notesWithEvtsLength,
@ -126,6 +126,7 @@ function EventsBlock(props: IProps) {
}, },
[usedEvents, time, endTime], [usedEvents, time, endTime],
); );
const currentTimeEventIndex = findLastFitting(time); const currentTimeEventIndex = findLastFitting(time);
const write = ({ const write = ({
@ -182,6 +183,7 @@ function EventsBlock(props: IProps) {
const isTabChange = 'type' in event && event.type === 'TABCHANGE'; const isTabChange = 'type' in event && event.type === 'TABCHANGE';
const isCurrent = index === currentTimeEventIndex; const isCurrent = index === currentTimeEventIndex;
const isPrev = index < currentTimeEventIndex; const isPrev = index < currentTimeEventIndex;
return ( return (
<EventGroupWrapper <EventGroupWrapper
query={query} query={query}
@ -249,12 +251,14 @@ function EventsBlock(props: IProps) {
onClick={() => setMode(MODES.SEARCH)} onClick={() => setMode(MODES.SEARCH)}
> >
<Search size={14} /> <Search size={14} />
<div>{t('Search')}&nbsp;{usedEvents.length}&nbsp;{t('events')}</div> <div>
{t('Search')}&nbsp;{usedEvents.length}&nbsp;{t('events')}
</div>
</Button> </Button>
<Tooltip title={t('Close Panel')} placement='bottom' > <Tooltip title={t('Close Panel')} placement="bottom">
<Button <Button
className="ml-auto" className="ml-auto"
type='text' type="text"
onClick={() => { onClick={() => {
setActiveTab(''); setActiveTab('');
}} }}
@ -263,19 +267,23 @@ function EventsBlock(props: IProps) {
</Tooltip> </Tooltip>
</div> </div>
) : null} ) : null}
{mode === MODES.SEARCH ? {mode === MODES.SEARCH ? (
<div className={'flex items-center gap-2'}> <div className={'flex items-center gap-2'}>
<EventSearch <EventSearch
onChange={write} onChange={write}
setActiveTab={setActiveTab} setActiveTab={setActiveTab}
value={query} value={query}
eventsText={ eventsText={
usedEvents.length ? `${usedEvents.length} ${t('Events')}` : `0 ${t('Events')}` usedEvents.length
? `${usedEvents.length} ${t('Events')}`
: `0 ${t('Events')}`
} }
/> />
<Button type={'text'} onClick={() => setMode(MODES.SELECT)}>{t('Cancel')}</Button> <Button type={'text'} onClick={() => setMode(MODES.SELECT)}>
{t('Cancel')}
</Button>
</div> </div>
: null} ) : null}
</div> </div>
<div <div
className={cn('flex-1 pb-4', styles.eventsList)} className={cn('flex-1 pb-4', styles.eventsList)}

View file

@@ -6,9 +6,11 @@ import {
 import { observer } from 'mobx-react-lite';
 import stl from './timeline.module.css';
 import { getTimelinePosition } from './getTimelinePosition';
+import { useStore } from '@/mstore';

 function EventsList() {
   const { store } = useContext(PlayerContext);
+  const { uiPlayerStore } = useStore();
   const { eventCount, endTime } = store.get();
   const { tabStates } = store.get();
@@ -17,7 +19,6 @@ function EventsList() {
     () => Object.values(tabStates)[0]?.eventList.filter((e) => e.time) || [],
     [eventCount],
   );
-
   React.useEffect(() => {
     const hasDuplicates = events.some(
       (e, i) =>

View file

@@ -49,7 +49,6 @@
   z-index: 2;
 }
-
 .event {
   position: absolute;
   width: 2px;

View file

@@ -38,6 +38,7 @@ function SubHeader(props) {
     projectsStore,
     userStore,
     issueReportingStore,
+    settingsStore
   } = useStore();
   const { t } = useTranslation();
   const { favorite } = sessionStore.current;
@@ -45,7 +46,7 @@ function SubHeader(props) {
   const currentSession = sessionStore.current;
   const projectId = projectsStore.siteId;
   const integrations = integrationsStore.issues.list;
-  const { store } = React.useContext(PlayerContext);
+  const { player, store } = React.useContext(PlayerContext);
   const { location: currentLocation = 'loading...' } = store.get();
   const hasIframe = localStorage.getItem(IFRAME) === 'true';
   const [hideTools, setHideTools] = React.useState(false);
@@ -127,6 +128,13 @@ function SubHeader(props) {
     });
   };
+
+  const showVModeBadge = store.get().vModeBadge;
+  const onVMode = () => {
+    settingsStore.sessionSettings.updateKey('virtualMode', true);
+    player.enableVMode?.();
+    location.reload();
+  }
   return (
     <>
       <div
@@ -143,6 +151,8 @@ function SubHeader(props) {
           siteId={projectId!}
           currentLocation={currentLocation}
           version={currentSession?.trackerVersion ?? ''}
+          virtualElsFailed={showVModeBadge}
+          onVMode={onVMode}
         />
         <SessionTabs />

View file

@@ -34,38 +34,46 @@ const WarnBadge = React.memo(
     currentLocation,
     version,
     siteId,
+    virtualElsFailed,
+    onVMode,
   }: {
     currentLocation: string;
     version: string;
     siteId: string;
+    virtualElsFailed: boolean;
+    onVMode: () => void;
   }) => {
     const { t } = useTranslation();
     const localhostWarnSiteKey = localhostWarn(siteId);
     const defaultLocalhostWarn =
       localStorage.getItem(localhostWarnSiteKey) !== '1';
-    const localhostWarnActive =
-      currentLocation &&
-      defaultLocalhostWarn &&
-      /(localhost)|(127.0.0.1)|(0.0.0.0)/.test(currentLocation);
+    const localhostWarnActive = Boolean(
+      currentLocation &&
+        defaultLocalhostWarn &&
+        /(localhost)|(127.0.0.1)|(0.0.0.0)/.test(currentLocation)
+    )
     const trackerVersion = window.env.TRACKER_VERSION ?? undefined;
     const trackerVerDiff = compareVersions(version, trackerVersion);
     const trackerWarnActive = trackerVerDiff !== VersionComparison.Same;
-    const [showLocalhostWarn, setLocalhostWarn] =
-      React.useState(localhostWarnActive);
-    const [showTrackerWarn, setTrackerWarn] = React.useState(trackerWarnActive);
+    const [warnings, setWarnings] = React.useState<[localhostWarn: boolean, trackerWarn: boolean, virtualElsFailWarn: boolean]>([localhostWarnActive, trackerWarnActive, virtualElsFailed])
+
+    React.useEffect(() => {
+      setWarnings([localhostWarnActive, trackerWarnActive, virtualElsFailed])
+    }, [localhostWarnActive, trackerWarnActive, virtualElsFailed])

-    const closeWarning = (type: 1 | 2) => {
+    const closeWarning = (type: 0 | 1 | 2) => {
       if (type === 1) {
         localStorage.setItem(localhostWarnSiteKey, '1');
-        setLocalhostWarn(false);
-      }
-      if (type === 2) {
-        setTrackerWarn(false);
       }
+      setWarnings((prev) => {
+        const newWarnings = [...prev];
+        newWarnings[type] = false;
+        return newWarnings;
+      });
     };
-    if (!showLocalhostWarn && !showTrackerWarn) return null;
+    if (!warnings.some(el => el === true)) return null;
     return (
       <div
@@ -79,7 +87,7 @@ const WarnBadge = React.memo(
           fontWeight: 500,
         }}
       >
-        {showLocalhostWarn ? (
+        {warnings[0] ? (
           <div className="px-3 py-1 border border-gray-lighter drop-shadow-md rounded bg-active-blue flex items-center justify-between">
             <div>
               <span>{t('Some assets may load incorrectly on localhost.')}</span>
@@ -101,7 +109,7 @@ const WarnBadge = React.memo(
             </div>
           </div>
         ) : null}
-        {showTrackerWarn ? (
+        {warnings[1] ? (
           <div className="px-3 py-1 border border-gray-lighter drop-shadow-md rounded bg-active-blue flex items-center justify-between">
             <div>
               <div>
@@ -125,6 +133,21 @@ const WarnBadge = React.memo(
               </div>
             </div>
+            <div
+              className="py-1 ml-3 cursor-pointer"
+              onClick={() => closeWarning(1)}
+            >
+              <Icon name="close" size={16} color="black" />
+            </div>
+          </div>
+        ) : null}
+        {warnings[2] ? (
+          <div className="px-3 py-1 border border-gray-lighter drop-shadow-md rounded bg-active-blue flex items-center justify-between">
+            <div className="flex flex-col">
+              <div>{t('If you have issues displaying custom HTML elements (i.e when using LWC), consider turning on Virtual Mode.')}</div>
+              <div className='link' onClick={onVMode}>{t('Enable')}</div>
+            </div>
             <div
               className="py-1 ml-3 cursor-pointer"
               onClick={() => closeWarning(2)}
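The badge now keeps its three dismissible warnings in a single labeled boolean tuple and clears any of them by index. A reduced sketch of that state shape (the hook and its names are hypothetical, not from the repo):

import React from 'react';

type Warnings = [localhost: boolean, tracker: boolean, virtualEls: boolean];

function useWarnings(initial: Warnings) {
  const [warnings, setWarnings] = React.useState<Warnings>(initial);

  // Close one warning by its index; copying the tuple keeps the update immutable.
  const close = (index: 0 | 1 | 2) =>
    setWarnings((prev) => {
      const next = [...prev] as Warnings;
      next[index] = false;
      return next;
    });

  const anyOpen = warnings.some(Boolean);
  return { warnings, close, anyOpen };
}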

View file

@@ -12,60 +12,123 @@ import {
   getDateRangeFromValue,
   getDateRangeLabel,
 } from 'App/dateRange';
-import { DateTime, Interval } from 'luxon';
+import { DateTime, Interval, Settings } from 'luxon';
 import styles from './dateRangePopup.module.css';
 import { useTranslation } from 'react-i18next';

 function DateRangePopup(props: any) {
   const { t } = useTranslation();
+  const [displayDates, setDisplayDates] = React.useState<[Date, Date]>([new Date(), new Date()]);
   const [range, setRange] = React.useState(
     props.selectedDateRange ||
       Interval.fromDateTimes(DateTime.now(), DateTime.now()),
   );
   const [value, setValue] = React.useState<string | null>(null);

-  const selectCustomRange = (range) => {
-    let newRange;
-    if (props.singleDay) {
-      newRange = Interval.fromDateTimes(
-        DateTime.fromJSDate(range),
-        DateTime.fromJSDate(range),
-      );
-    } else {
-      newRange = Interval.fromDateTimes(
-        DateTime.fromJSDate(range[0]),
-        DateTime.fromJSDate(range[1]),
-      );
-    }
-    setRange(newRange);
+  React.useEffect(() => {
+    if (props.selectedDateRange) {
+      const start = new Date(
+        props.selectedDateRange.start.year,
+        props.selectedDateRange.start.month - 1, // JS months are 0-based
+        props.selectedDateRange.start.day
+      );
+      const end = new Date(
+        props.selectedDateRange.end.year,
+        props.selectedDateRange.end.month - 1,
+        props.selectedDateRange.end.day
+      );
+      setDisplayDates([start, end]);
+    }
+  }, [props.selectedDateRange]);
+
+  const createNaiveTime = (dateTime: DateTime) => {
+    if (!dateTime) return null;
+    return DateTime.fromObject({
+      hour: dateTime.hour,
+      minute: dateTime.minute
+    });
+  };
+
+  const selectCustomRange = (newDates: [Date, Date]) => {
+    if (!newDates || !newDates[0] || !newDates[1]) return;
+    setDisplayDates(newDates);
+    const selectedTzStart = DateTime.fromObject({
+      year: newDates[0].getFullYear(),
+      month: newDates[0].getMonth() + 1,
+      day: newDates[0].getDate(),
+      hour: 0,
+      minute: 0
+    }).setZone(Settings.defaultZone);
+    const selectedTzEnd = DateTime.fromObject({
+      year: newDates[1].getFullYear(),
+      month: newDates[1].getMonth() + 1,
+      day: newDates[1].getDate(),
+      hour: 23,
+      minute: 59
+    }).setZone(Settings.defaultZone);
+    const updatedRange = Interval.fromDateTimes(selectedTzStart, selectedTzEnd);
+    setRange(updatedRange);
     setValue(CUSTOM_RANGE);
   };

-  const setRangeTimeStart = (value: DateTime) => {
-    if (!range.end || value > range.end) {
-      return;
-    }
-    const newRange = range.start.set({
-      hour: value.hour,
-      minute: value.minute,
-    });
-    setRange(Interval.fromDateTimes(newRange, range.end));
+  const setRangeTimeStart = (naiveTime: DateTime) => {
+    if (!range.end || !naiveTime) return;
+
+    const newStart = range.start.set({
+      hour: naiveTime.hour,
+      minute: naiveTime.minute
+    });
+
+    if (newStart > range.end) return;
+    setRange(Interval.fromDateTimes(newStart, range.end));
     setValue(CUSTOM_RANGE);
   };

-  const setRangeTimeEnd = (value: DateTime) => {
-    if (!range.start || (value && value < range.start)) {
-      return;
-    }
-    const newRange = range.end.set({ hour: value.hour, minute: value.minute });
-    setRange(Interval.fromDateTimes(range.start, newRange));
+  const setRangeTimeEnd = (naiveTime: DateTime) => {
+    if (!range.start || !naiveTime) return;
+
+    const newEnd = range.end.set({
+      hour: naiveTime.hour,
+      minute: naiveTime.minute
+    });
+
+    if (newEnd < range.start) return;
+    setRange(Interval.fromDateTimes(range.start, newEnd));
     setValue(CUSTOM_RANGE);
   };

   const selectValue = (value: string) => {
-    const range = getDateRangeFromValue(value);
-    setRange(range);
+    const newRange = getDateRangeFromValue(value);
+
+    if (!newRange.start || !newRange.end) {
+      setRange(Interval.fromDateTimes(DateTime.now(), DateTime.now()));
+      setDisplayDates([new Date(), new Date()]);
+      setValue(null);
+      return;
+    }
+
+    const zonedStart = newRange.start.setZone(Settings.defaultZone);
+    const zonedEnd = newRange.end.setZone(Settings.defaultZone);
+    setRange(Interval.fromDateTimes(zonedStart, zonedEnd));
+
+    const start = new Date(
+      zonedStart.year,
+      zonedStart.month - 1,
+      zonedStart.day
+    );
+    const end = new Date(
+      zonedEnd.year,
+      zonedEnd.month - 1,
+      zonedEnd.day
+    );
+    setDisplayDates([start, end]);
     setValue(value);
   };
@@ -77,9 +140,9 @@ function DateRangePopup(props: any) {
   const isUSLocale =
     navigator.language === 'en-US' || navigator.language.startsWith('en-US');

-  const rangeForDisplay = props.singleDay
-    ? range.start.ts
-    : [range.start!.startOf('day').ts, range.end!.startOf('day').ts];
+  const naiveStartTime = createNaiveTime(range.start);
+  const naiveEndTime = createNaiveTime(range.end);

   return (
     <div className={styles.wrapper}>
       <div className={`${styles.body} h-fit`}>
@@ -103,7 +166,7 @@ function DateRangePopup(props: any) {
           shouldCloseCalendar={() => false}
           isOpen
           maxDate={new Date()}
-          value={rangeForDisplay}
+          value={displayDates}
           calendarProps={{
             tileDisabled: props.isTileDisabled,
             selectRange: !props.singleDay,
@@ -122,7 +185,7 @@ function DateRangePopup(props: any) {
           <span>{range.start.toFormat(isUSLocale ? 'MM/dd' : 'dd/MM')} </span>
           <TimePicker
             format={isUSLocale ? 'hh:mm a' : 'HH:mm'}
-            value={range.start}
+            value={naiveStartTime}
             onChange={setRangeTimeStart}
             needConfirm={false}
             showNow={false}
@@ -132,7 +195,7 @@ function DateRangePopup(props: any) {
           <span>{range.end.toFormat(isUSLocale ? 'MM/dd' : 'dd/MM')} </span>
           <TimePicker
             format={isUSLocale ? 'hh:mm a' : 'HH:mm'}
-            value={range.end}
+            value={naiveEndTime}
             onChange={setRangeTimeEnd}
             needConfirm={false}
             showNow={false}
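The popup now maintains two parallel representations: plain Date values for the calendar widget, and a Luxon Interval pinned to Settings.defaultZone for the actual query range, with zone-naive hour/minute values feeding the time pickers. A sketch of the calendar-date-to-zoned-interval step, assuming Luxon's documented fromObject zone option (the diff instead sets hour 0/23 explicitly and calls setZone):

import { DateTime, Interval, Settings } from 'luxon';

// Build a zone-aware interval from two calendar dates picked in the UI.
// Day boundaries are interpreted in the app's configured default zone,
// not the machine's local zone.
function intervalFromDates(start: Date, end: Date): Interval {
  const zone = Settings.defaultZone;
  const zonedStart = DateTime.fromObject(
    { year: start.getFullYear(), month: start.getMonth() + 1, day: start.getDate() },
    { zone },
  ).startOf('day');
  const zonedEnd = DateTime.fromObject(
    { year: end.getFullYear(), month: end.getMonth() + 1, day: end.getDate() },
    { zone },
  ).endOf('day');
  return Interval.fromDateTimes(zonedStart, zonedEnd);
}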

View file

@@ -1,9 +1,17 @@
 /* eslint-disable i18next/no-literal-string */
 import { ResourceType, Timed } from 'Player';
+import { WsChannel } from 'Player/web/messages';
 import MobilePlayer from 'Player/mobile/IOSPlayer';
 import WebPlayer from 'Player/web/WebPlayer';
 import { observer } from 'mobx-react-lite';
-import React, { useMemo, useState } from 'react';
+import React, {
+  useMemo,
+  useState,
+  useEffect,
+  useCallback,
+  useRef,
+} from 'react';
+import i18n from 'App/i18n'
 import { useModal } from 'App/components/Modal';
 import {
@@ -12,25 +20,27 @@
 } from 'App/components/Session/playerContext';
 import { formatMs } from 'App/date';
 import { useStore } from 'App/mstore';
-import { formatBytes } from 'App/utils';
+import { formatBytes, debounceCall } from 'App/utils';
 import { Icon, NoContent, Tabs } from 'UI';
 import { Tooltip, Input, Switch, Form } from 'antd';
-import { SearchOutlined, InfoCircleOutlined } from '@ant-design/icons';
+import {
+  SearchOutlined,
+  InfoCircleOutlined,
+} from '@ant-design/icons';
 import FetchDetailsModal from 'Shared/FetchDetailsModal';
-import { WsChannel } from 'App/player/web/messages';
 import BottomBlock from '../BottomBlock';
 import InfoLine from '../BottomBlock/InfoLine';
 import TabSelector from '../TabSelector';
 import TimeTable from '../TimeTable';
 import useAutoscroll, { getLastItemTime } from '../useAutoscroll';
-import { useRegExListFilterMemo, useTabListFilterMemo } from '../useListFilter';
 import WSPanel from './WSPanel';
 import { useTranslation } from 'react-i18next';
+import { mergeListsWithZoom, processInChunks } from './utils'

+// Constants remain the same
 const INDEX_KEY = 'network';
 const ALL = 'ALL';
 const XHR = 'xhr';
 const JS = 'js';
@@ -62,6 +72,9 @@ export const NETWORK_TABS = TAP_KEYS.map((tab) => ({
 const DOM_LOADED_TIME_COLOR = 'teal';
 const LOAD_TIME_COLOR = 'red';
+const BATCH_SIZE = 2500;
+const INITIAL_LOAD_SIZE = 5000;
 export function renderType(r: any) {
   return (
     <Tooltip style={{ width: '100%' }} title={<div>{r.type}</div>}>
@@ -79,13 +92,17 @@ export function renderName(r: any) {
 }
 function renderSize(r: any) {
-  const { t } = useTranslation();
-  if (r.responseBodySize) return formatBytes(r.responseBodySize);
+  const t = i18n.t;
+  const notCaptured = t('Not captured');
+  const resSizeStr = t('Resource size')
   let triggerText;
   let content;
-  if (r.decodedBodySize == null || r.decodedBodySize === 0) {
+  if (r.responseBodySize) {
+    triggerText = formatBytes(r.responseBodySize);
+    content = undefined;
+  } else if (r.decodedBodySize == null || r.decodedBodySize === 0) {
     triggerText = 'x';
-    content = t('Not captured');
+    content = notCaptured;
   } else {
     const headerSize = r.headerSize || 0;
     const showTransferred = r.headerSize != null;
@@ -100,7 +117,7 @@ function renderSize(r: any) {
           )} transferred over network`}
         </li>
       )}
-      <li>{`${t('Resource size')}: ${formatBytes(r.decodedBodySize)} `}</li>
+      <li>{`${resSizeStr}: ${formatBytes(r.decodedBodySize)} `}</li>
     </ul>
   );
 }
@@ -168,6 +185,8 @@ function renderStatus({
   );
 }
+
+// Main component for Network Panel
 function NetworkPanelCont({ panelHeight }: { panelHeight: number }) {
   const { player, store } = React.useContext(PlayerContext);
   const { sessionStore, uiPlayerStore } = useStore();
@@ -216,6 +235,7 @@ function NetworkPanelCont({ panelHeight }: { panelHeight: number }) {
   const getTabNum = (tab: string) => tabsArr.findIndex((t) => t === tab) + 1;
   const getTabName = (tabId: string) => tabNames[tabId];
+
   return (
     <NetworkPanelComp
       loadTime={loadTime}
@@ -228,8 +248,8 @@ function NetworkPanelCont({ panelHeight }: { panelHeight: number }) {
       resourceListNow={resourceListNow}
       player={player}
       startedAt={startedAt}
-      websocketList={websocketList as WSMessage[]}
-      websocketListNow={websocketListNow as WSMessage[]}
+      websocketList={websocketList}
+      websocketListNow={websocketListNow}
       getTabNum={getTabNum}
       getTabName={getTabName}
       showSingleTab={showSingleTab}
@@ -269,9 +289,7 @@ function MobileNetworkPanelCont({ panelHeight }: { panelHeight: number }) {
       resourceListNow={resourceListNow}
       player={player}
      startedAt={startedAt}
-      // @ts-ignore
       websocketList={websocketList}
-      // @ts-ignore
       websocketListNow={websocketListNow}
       zoomEnabled={zoomEnabled}
       zoomStartTs={zoomStartTs}
@@ -280,12 +298,35 @@ function MobileNetworkPanelCont({ panelHeight }: { panelHeight: number }) {
   );
 }
-type WSMessage = Timed & {
-  channelName: string;
-  data: string;
-  timestamp: number;
-  dir: 'up' | 'down';
-  messageType: string;
-};
+const useInfiniteScroll = (loadMoreCallback: () => void, hasMore: boolean) => {
+  const observerRef = useRef<IntersectionObserver>(null);
+  const loadingRef = useRef<HTMLDivElement>(null);
+
+  useEffect(() => {
+    const observer = new IntersectionObserver(
+      (entries) => {
+        if (entries[0]?.isIntersecting && hasMore) {
+          loadMoreCallback();
+        }
+      },
+      { threshold: 0.1 },
+    );
+    if (loadingRef.current) {
+      observer.observe(loadingRef.current);
+    }
+    // @ts-ignore
+    observerRef.current = observer;
+    return () => {
+      if (observerRef.current) {
+        observerRef.current.disconnect();
+      }
+    };
+  }, [loadMoreCallback, hasMore, loadingRef]);
+
+  return loadingRef;
+};

 interface Props {
@@ -302,8 +343,8 @@ interface Props {
   resourceList: Timed[];
   fetchListNow: Timed[];
   resourceListNow: Timed[];
-  websocketList: Array<WSMessage>;
-  websocketListNow: Array<WSMessage>;
+  websocketList: Array<WsChannel>;
+  websocketListNow: Array<WsChannel>;
   player: WebPlayer | MobilePlayer;
   startedAt: number;
   isMobile?: boolean;
@@ -349,107 +390,189 @@ export const NetworkPanelComp = observer(
     >(null);
     const { showModal } = useModal();
     const [showOnlyErrors, setShowOnlyErrors] = useState(false);
     const [isDetailsModalActive, setIsDetailsModalActive] = useState(false);
+    const [isLoading, setIsLoading] = useState(true);
+    const [isProcessing, setIsProcessing] = useState(false);
+    const [displayedItems, setDisplayedItems] = useState([]);
+    const [totalItems, setTotalItems] = useState(0);
+    const [summaryStats, setSummaryStats] = useState({
+      resourcesSize: 0,
+      transferredSize: 0,
+    });
+    const originalListRef = useRef([]);
+    const socketListRef = useRef([]);
     const {
       sessionStore: { devTools },
     } = useStore();
     const { filter } = devTools[INDEX_KEY];
     const { activeTab } = devTools[INDEX_KEY];
     const activeIndex = activeOutsideIndex ?? devTools[INDEX_KEY].index;
+    const [inputFilterValue, setInputFilterValue] = useState(filter);
+
+    const debouncedFilter = useCallback(
+      debounceCall((filterValue) => {
+        devTools.update(INDEX_KEY, { filter: filterValue });
+      }, 300),
+      [],
+    );
-    const socketList = useMemo(
-      () =>
-        websocketList.filter(
-          (ws, i, arr) =>
-            arr.findIndex((it) => it.channelName === ws.channelName) === i,
-        ),
-      [websocketList],
-    );
+    // Process socket lists once
+    useEffect(() => {
+      const uniqueSocketList = websocketList.filter(
+        (ws, i, arr) =>
+          arr.findIndex((it) => it.channelName === ws.channelName) === i,
+      );
+      socketListRef.current = uniqueSocketList;
+    }, [websocketList.length]);
-    const list = useMemo(
-      () =>
-        // TODO: better merge (with body size info) - do it in player
-        resourceList
-          .filter(
-            (res) =>
-              !fetchList.some((ft) => {
-                // res.url !== ft.url doesn't work on relative URLs appearing within fetchList (to-fix in player)
-                if (res.name === ft.name) {
-                  if (res.time === ft.time) return true;
-                  if (res.url.includes(ft.url)) {
-                    return (
-                      Math.abs(res.time - ft.time) < 350 ||
-                      Math.abs(res.timestamp - ft.timestamp) < 350
-                    );
-                  }
-                }
-                if (res.name !== ft.name) {
-                  return false;
-                }
-                if (Math.abs(res.time - ft.time) > 250) {
-                  return false;
-                } // TODO: find good epsilons
-                if (Math.abs(res.duration - ft.duration) > 200) {
-                  return false;
-                }
-                return true;
-              }),
-          )
-          .concat(fetchList)
-          .concat(
-            socketList.map((ws) => ({
-              ...ws,
-              type: 'websocket',
-              method: 'ws',
-              url: ws.channelName,
-              name: ws.channelName,
-              status: '101',
-              duration: 0,
-              transferredBodySize: 0,
-            })),
-          )
-          .filter((req) =>
-            zoomEnabled
-              ? req.time >= zoomStartTs! && req.time <= zoomEndTs!
-              : true,
-          )
-          .sort((a, b) => a.time - b.time),
-      [resourceList.length, fetchList.length, socketList.length],
-    );
-    let filteredList = useMemo(() => {
-      if (!showOnlyErrors) {
-        return list;
-      }
-      return list.filter(
-        (it) => parseInt(it.status) >= 400 || !it.success || it.error,
-      );
-    }, [showOnlyErrors, list]);
-    filteredList = useRegExListFilterMemo(
-      filteredList,
-      (it) => [it.status, it.name, it.type, it.method],
-      filter,
-    );
-    filteredList = useTabListFilterMemo(
-      filteredList,
-      (it) => TYPE_TO_TAB[it.type],
-      ALL,
-      activeTab,
-    );
+    // Initial data processing - do this only once when data changes
+    useEffect(() => {
+      setIsLoading(true);
+      // Heaviest operation here, will create a final merged network list
+      const processData = async () => {
+        const fetchUrls = new Set(
+          fetchList.map((ft) => {
+            return `${ft.name}-${Math.floor(ft.time / 100)}-${Math.floor(ft.duration / 100)}`;
+          }),
+        );
+        // We want to get resources that aren't in fetch list
+        const filteredResources = await processInChunks(resourceList, (chunk) =>
+          chunk.filter((res: any) => {
+            const key = `${res.name}-${Math.floor(res.time / 100)}-${Math.floor(res.duration / 100)}`;
+            return !fetchUrls.has(key);
+          }),
+          BATCH_SIZE,
+          25,
+        );
+        const processedSockets = socketListRef.current.map((ws: any) => ({
+          ...ws,
+          type: 'websocket',
+          method: 'ws',
+          url: ws.channelName,
+          name: ws.channelName,
+          status: '101',
+          duration: 0,
+          transferredBodySize: 0,
+        }));
+        const mergedList: Timed[] = mergeListsWithZoom(
+          filteredResources as Timed[],
+          fetchList,
+          processedSockets as Timed[],
+          { enabled: Boolean(zoomEnabled), start: zoomStartTs ?? 0, end: zoomEndTs ?? 0 }
+        )
+        originalListRef.current = mergedList;
+        setTotalItems(mergedList.length);
+        calculateResourceStats(resourceList);
+        // Only display initial chunk
+        setDisplayedItems(mergedList.slice(0, INITIAL_LOAD_SIZE));
+        setIsLoading(false);
+      };
+      void processData();
+    }, [
+      resourceList.length,
+      fetchList.length,
+      socketListRef.current.length,
+      zoomEnabled,
+      zoomStartTs,
+      zoomEndTs,
+    ]);
+
+    const calculateResourceStats = (resourceList: Record<string, any>) => {
+      setTimeout(() => {
+        let resourcesSize = 0
+        let transferredSize = 0
+        resourceList.forEach(({ decodedBodySize, headerSize, encodedBodySize }: any) => {
+          resourcesSize += decodedBodySize || 0
+          transferredSize += (headerSize || 0) + (encodedBodySize || 0)
+        })
+        setSummaryStats({
+          resourcesSize,
+          transferredSize,
+        });
+      }, 0);
+    }
+
+    useEffect(() => {
+      if (originalListRef.current.length === 0) return;
+      setIsProcessing(true);
+      const applyFilters = async () => {
+        let filteredItems: any[] = originalListRef.current;
+        filteredItems = await processInChunks(filteredItems, (chunk) =>
+          chunk.filter(
+            (it) => {
+              let valid = true;
+              if (showOnlyErrors) {
+                valid = parseInt(it.status) >= 400 || !it.success || it.error
+              }
+              if (filter) {
+                try {
+                  const regex = new RegExp(filter, 'i');
+                  valid = valid && regex.test(it.status) || regex.test(it.name) || regex.test(it.type) || regex.test(it.method);
+                } catch (e) {
+                  valid = valid && String(it.status).includes(filter) || it.name.includes(filter) || it.type.includes(filter) || (it.method && it.method.includes(filter));
+                }
+              }
+              if (activeTab !== ALL) {
+                valid = valid && TYPE_TO_TAB[it.type] === activeTab;
+              }
+              return valid;
+            },
+          ),
+        );
+        // Update displayed items
+        setDisplayedItems(filteredItems.slice(0, INITIAL_LOAD_SIZE));
+        setTotalItems(filteredItems.length);
+        setIsProcessing(false);
+      };
+      void applyFilters();
+    }, [filter, activeTab, showOnlyErrors]);
+
+    const loadMoreItems = useCallback(() => {
+      if (isProcessing) return;
+      setIsProcessing(true);
+      setTimeout(() => {
+        setDisplayedItems((prevItems) => {
+          const currentLength = prevItems.length;
+          const newItems = originalListRef.current.slice(
+            currentLength,
+            currentLength + BATCH_SIZE,
+          );
+          return [...prevItems, ...newItems];
+        });
+        setIsProcessing(false);
+      }, 10);
+    }, [isProcessing]);
+
+    const hasMoreItems = displayedItems.length < totalItems;
+    const loadingRef = useInfiniteScroll(loadMoreItems, hasMoreItems);
-    const onTabClick = (activeTab: (typeof TAP_KEYS)[number]) =>
-      devTools.update(INDEX_KEY, { activeTab });
-    const onFilterChange = ({
-      target: { value },
-    }: React.ChangeEvent<HTMLInputElement>) =>
-      devTools.update(INDEX_KEY, { filter: value });
+    const onTabClick = (activeTab) => {
+      devTools.update(INDEX_KEY, { activeTab });
+    };
+
+    const onFilterChange = ({ target: { value } }) => {
+      setInputFilterValue(value)
+      debouncedFilter(value);
+    };
+
+    // AutoScroll
     const [timeoutStartAutoscroll, stopAutoscroll] = useAutoscroll(
-      filteredList,
+      displayedItems,
      getLastItemTime(fetchListNow, resourceListNow),
      activeIndex,
      (index) => devTools.update(INDEX_KEY, { index }),
@@ -462,24 +585,6 @@ export const NetworkPanelComp = observer(
      timeoutStartAutoscroll();
    };
-    const resourcesSize = useMemo(
-      () =>
-        resourceList.reduce(
-          (sum, { decodedBodySize }) => sum + (decodedBodySize || 0),
-          0,
-        ),
-      [resourceList.length],
-    );
-
-    const transferredSize = useMemo(
-      () =>
-        resourceList.reduce(
-          (sum, { headerSize, encodedBodySize }) =>
-            sum + (headerSize || 0) + (encodedBodySize || 0),
-          0,
-        ),
-      [resourceList.length],
-    );
     const referenceLines = useMemo(() => {
       const arr = [];
@@ -513,7 +618,7 @@ export const NetworkPanelComp = observer(
          isSpot={isSpot}
          time={item.time + startedAt}
          resource={item}
-          rows={filteredList}
+          rows={displayedItems}
          fetchPresented={fetchList.length > 0}
        />,
        {
@@ -525,12 +630,10 @@ export const NetworkPanelComp = observer(
          },
        },
      );
-      devTools.update(INDEX_KEY, { index: filteredList.indexOf(item) });
-      stopAutoscroll();
    };
-    const tableCols = React.useMemo(() => {
-      const cols: any[] = [
+    const tableCols = useMemo(() => {
+      const cols = [
        {
          label: t('Status'),
          dataKey: 'status',
@@ -585,7 +688,7 @@ export const NetworkPanelComp = observer(
        });
      }
      return cols;
-    }, [showSingleTab]);
+    }, [showSingleTab, activeTab, t, getTabName, getTabNum, isSpot]);

    return (
      <BottomBlock
@@ -617,7 +720,7 @@ export const NetworkPanelComp = observer(
            name="filter"
            onChange={onFilterChange}
            width={280}
-            value={filter}
+            value={inputFilterValue}
            size="small"
            prefix={<SearchOutlined className="text-neutral-400" />}
          />
@@ -625,7 +728,7 @@ export const NetworkPanelComp = observer(
        </BottomBlock.Header>
        <BottomBlock.Content>
          <div className="flex items-center justify-between px-4 border-b bg-teal/5 h-8">
-            <div>
+            <div className="flex items-center">
              <Form.Item name="show-errors-only" className="mb-0">
                <label
                  style={{
@@ -642,21 +745,29 @@ export const NetworkPanelComp = observer(
                  <span className="text-sm ms-2">4xx-5xx Only</span>
                </label>
              </Form.Item>
+              {isProcessing && (
+                <span className="text-xs text-gray-500 ml-4">
+                  Processing data...
+                </span>
+              )}
            </div>
            <InfoLine>
+              <InfoLine.Point label={`${totalItems}`} value="requests" />
              <InfoLine.Point
-                label={`${filteredList.length}`}
-                value=" requests"
+                label={`${displayedItems.length}/${totalItems}`}
+                value="displayed"
+                display={displayedItems.length < totalItems}
              />
              <InfoLine.Point
-                label={formatBytes(transferredSize)}
+                label={formatBytes(summaryStats.transferredSize)}
                value="transferred"
-                display={transferredSize > 0}
+                display={summaryStats.transferredSize > 0}
              />
              <InfoLine.Point
-                label={formatBytes(resourcesSize)}
+                label={formatBytes(summaryStats.resourcesSize)}
                value="resources"
-                display={resourcesSize > 0}
+                display={summaryStats.resourcesSize > 0}
              />
              <InfoLine.Point
                label={formatMs(domBuildingTime)}
@@ -679,42 +790,67 @@ export const NetworkPanelComp = observer(
              />
            </InfoLine>
          </div>
-          <NoContent
-            title={
-              <div className="capitalize flex items-center gap-2">
-                <InfoCircleOutlined size={18} />
-                {t('No Data')}
-              </div>
-            }
-            size="small"
-            show={filteredList.length === 0}
-          >
-            {/* @ts-ignore */}
-            <TimeTable
-              rows={filteredList}
-              tableHeight={panelHeight - 102}
-              referenceLines={referenceLines}
-              renderPopup
-              onRowClick={showDetailsModal}
-              sortBy="time"
-              sortAscending
-              onJump={(row: any) => {
-                devTools.update(INDEX_KEY, {
-                  index: filteredList.indexOf(row),
-                });
-                player.jump(row.time);
-              }}
-              activeIndex={activeIndex}
-            >
-              {tableCols}
-            </TimeTable>
-            {selectedWsChannel ? (
-              <WSPanel
-                socketMsgList={selectedWsChannel}
-                onClose={() => setSelectedWsChannel(null)}
-              />
-            ) : null}
-          </NoContent>
+          {isLoading ? (
+            <div className="flex items-center justify-center h-full">
+              <div className="text-center">
+                <div className="animate-spin rounded-full h-8 w-8 border-b-2 border-gray-900 mx-auto mb-2"></div>
+                <p>Processing initial network data...</p>
+              </div>
+            </div>
+          ) : (
+            <NoContent
+              title={
+                <div className="capitalize flex items-center gap-2">
+                  <InfoCircleOutlined size={18} />
+                  {t('No Data')}
+                </div>
+              }
+              size="small"
+              show={displayedItems.length === 0}
+            >
+              <div>
+                <TimeTable
+                  rows={displayedItems}
+                  tableHeight={panelHeight - 102 - (hasMoreItems ? 30 : 0)}
+                  referenceLines={referenceLines}
+                  renderPopup
+                  onRowClick={showDetailsModal}
+                  sortBy="time"
+                  sortAscending
+                  onJump={(row) => {
+                    devTools.update(INDEX_KEY, {
+                      index: displayedItems.indexOf(row),
+                    });
+                    player.jump(row.time);
+                  }}
+                  activeIndex={activeIndex}
+                >
+                  {tableCols}
+                </TimeTable>
+                {hasMoreItems && (
+                  <div
+                    ref={loadingRef}
+                    className="flex justify-center items-center text-xs text-gray-500"
+                  >
+                    <div className="flex items-center">
+                      <div className="animate-spin rounded-full h-4 w-4 border-b-2 border-gray-600 mr-2"></div>
+                      Loading more data ({totalItems - displayedItems.length}{' '}
+                      remaining)
+                    </div>
+                  </div>
+                )}
+              </div>
+              {selectedWsChannel ? (
+                <WSPanel
+                  socketMsgList={selectedWsChannel}
+                  onClose={() => setSelectedWsChannel(null)}
+                />
+              ) : null}
+            </NoContent>
+          )}
        </BottomBlock.Content>
      </BottomBlock>
    );
@@ -722,7 +858,6 @@ export const NetworkPanelComp = observer(
 );
 const WebNetworkPanel = observer(NetworkPanelCont);
 const MobileNetworkPanel = observer(MobileNetworkPanelCont);
-
 export { WebNetworkPanel, MobileNetworkPanel };
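Two patterns in this file are worth calling out: the filter input echoes keystrokes through local state while the expensive devTools update is debounced, and pagination is driven by the IntersectionObserver sentinel returned from useInfiniteScroll. A reduced sketch of the debounced-input half, assuming a debounce helper equivalent to the project's debounceCall (the inline debounce below is a stand-in, not the repo's implementation):

import React, { useCallback, useRef, useState } from 'react';

// Stand-in for debounceCall: delays fn until ms of inactivity.
function debounce<T extends (...args: any[]) => void>(fn: T, ms: number) {
  let timer: ReturnType<typeof setTimeout>;
  return (...args: Parameters<T>) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}

// The input echoes keystrokes instantly via local state, while the
// expensive store update (which re-filters thousands of rows) fires
// at most once per 300 ms of typing.
function useDebouncedFilter(apply: (value: string) => void) {
  const [inputValue, setInputValue] = useState('');
  const debouncedApply = useRef(debounce(apply, 300)).current;

  const onChange = useCallback(
    (e: React.ChangeEvent<HTMLInputElement>) => {
      setInputValue(e.target.value); // instant echo in the UI
      debouncedApply(e.target.value); // deferred store update
    },
    [debouncedApply],
  );

  return { inputValue, onChange };
}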

View file

@@ -0,0 +1,178 @@
export function mergeListsWithZoom<
T extends Record<string, any>,
Y extends Record<string, any>,
Z extends Record<string, any>,
>(
arr1: T[],
arr2: Y[],
arr3: Z[],
zoom?: { enabled: boolean; start: number; end: number },
): Array<T | Y | Z> {
// Early return for empty arrays
if (arr1.length === 0 && arr2.length === 0 && arr3.length === 0) {
return [];
}
// Optimized for common case - no zoom
if (!zoom?.enabled) {
return mergeThreeSortedArrays(arr1, arr2, arr3);
}
// Binary search for start indexes (faster than linear search for large arrays)
const index1 = binarySearchStartIndex(arr1, zoom.start);
const index2 = binarySearchStartIndex(arr2, zoom.start);
const index3 = binarySearchStartIndex(arr3, zoom.start);
// Merge arrays within zoom range
return mergeThreeSortedArraysWithinRange(
arr1,
arr2,
arr3,
index1,
index2,
index3,
zoom.start,
zoom.end,
);
}
function binarySearchStartIndex<T extends Record<string, any>>(
arr: T[],
threshold: number,
): number {
if (arr.length === 0) return 0;
let low = 0;
let high = arr.length - 1;
// Handle edge cases first for better performance
if (arr[high].time < threshold) return arr.length;
if (arr[low].time >= threshold) return 0;
while (low <= high) {
const mid = Math.floor((low + high) / 2);
if (arr[mid].time < threshold) {
low = mid + 1;
} else {
high = mid - 1;
}
}
return low;
}
function mergeThreeSortedArrays<
T extends Record<string, any>,
Y extends Record<string, any>,
Z extends Record<string, any>,
>(arr1: T[], arr2: Y[], arr3: Z[]): Array<T | Y | Z> {
const totalLength = arr1.length + arr2.length + arr3.length;
// prealloc array size
const result = new Array(totalLength);
let i = 0,
j = 0,
k = 0,
index = 0;
while (i < arr1.length || j < arr2.length || k < arr3.length) {
const val1 = i < arr1.length ? arr1[i].time : Infinity;
const val2 = j < arr2.length ? arr2[j].time : Infinity;
const val3 = k < arr3.length ? arr3[k].time : Infinity;
if (val1 <= val2 && val1 <= val3) {
result[index++] = arr1[i++];
} else if (val2 <= val1 && val2 <= val3) {
result[index++] = arr2[j++];
} else {
result[index++] = arr3[k++];
}
}
return result;
}
// same as above, just with zoom stuff
function mergeThreeSortedArraysWithinRange<
T extends Record<string, any>,
Y extends Record<string, any>,
Z extends Record<string, any>,
>(
arr1: T[],
arr2: Y[],
arr3: Z[],
startIdx1: number,
startIdx2: number,
startIdx3: number,
start: number,
end: number,
): Array<T | Y | Z> {
// we don't know beforehand how many items will be there
const result = [];
let i = startIdx1;
let j = startIdx2;
let k = startIdx3;
while (i < arr1.length || j < arr2.length || k < arr3.length) {
const val1 = i < arr1.length ? arr1[i].time : Infinity;
const val2 = j < arr2.length ? arr2[j].time : Infinity;
const val3 = k < arr3.length ? arr3[k].time : Infinity;
// Early termination: if all remaining values exceed end time
if (Math.min(val1, val2, val3) > end) {
break;
}
if (val1 <= val2 && val1 <= val3) {
if (val1 <= end) {
result.push(arr1[i]);
}
i++;
} else if (val2 <= val1 && val2 <= val3) {
if (val2 <= end) {
result.push(arr2[j]);
}
j++;
} else {
if (val3 <= end) {
result.push(arr3[k]);
}
k++;
}
}
return result;
}
export function processInChunks(
items: any[],
processFn: (item: any) => any,
chunkSize = 1000,
overscan = 0,
) {
return new Promise((resolve) => {
if (items.length === 0) {
resolve([]);
return;
}
let result: any[] = [];
let index = 0;
const processNextChunk = () => {
const chunk = items.slice(index, index + chunkSize + overscan);
result = result.concat(processFn(chunk));
index += chunkSize;
if (index < items.length) {
setTimeout(processNextChunk, 0);
} else {
resolve(result);
}
};
processNextChunk();
});
}
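mergeListsWithZoom is a three-way merge of already time-sorted arrays, with the binary search used only to skip ahead to the zoom window; processInChunks splits heavy array work across setTimeout-separated batches so the main thread stays responsive. A small usage sketch with invented sample data:

import { mergeListsWithZoom, processInChunks } from './utils';

// Three time-sorted lists, e.g. resources, fetches and websocket rows.
const resources = [{ time: 10 }, { time: 40 }];
const fetches = [{ time: 20 }];
const sockets = [{ time: 30 }, { time: 90 }];

// Full merge, ordered by time: 10, 20, 30, 40, 90.
const all = mergeListsWithZoom(resources, fetches, sockets);

// Zoomed merge keeps only entries with 15 <= time <= 50: 20, 30, 40.
const zoomed = mergeListsWithZoom(resources, fetches, sockets, {
  enabled: true,
  start: 15,
  end: 50,
});

// Chunked filtering: the callback receives a slice of the input array,
// and slices are processed across macrotasks to avoid blocking the UI.
processInChunks(all, (chunk: any[]) => chunk.filter((r) => r.time > 25), 2).then(
  (filtered) => console.log(filtered), // [{ time: 30 }, { time: 40 }, { time: 90 }]
);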

View file

@@ -5,6 +5,7 @@ import cn from 'classnames';
 import { Loader } from 'UI';
 import OutsideClickDetectingDiv from 'Shared/OutsideClickDetectingDiv';
 import { useTranslation } from 'react-i18next';
+import { VList } from 'virtua';

 function TruncatedText({
   text,
@@ -124,7 +125,7 @@ export function AutocompleteModal({
   if (index === blocksAmount - 1 && blocksAmount > 1) {
     str += ' and ';
   }
-  str += `"${block.trim()}"`;
+  str += block.trim();
   if (index < blocksAmount - 2) {
     str += ', ';
   }
@@ -170,25 +171,27 @@ export function AutocompleteModal({
   <>
     <div
       className="flex flex-col gap-2 overflow-y-auto py-2 overflow-x-hidden text-ellipsis"
-      style={{ maxHeight: 200 }}
+      style={{ height: Math.min(sortedOptions.length * 32, 240) }}
     >
-      {sortedOptions.map((item) => (
-        <div
-          key={item.value}
-          onClick={() => onSelectOption(item)}
-          className="cursor-pointer w-full py-1 hover:bg-active-blue rounded px-2"
-        >
-          <Checkbox checked={isSelected(item)} /> {item.label}
-        </div>
-      ))}
+      <VList count={sortedOptions.length} itemSize={18}>
+        {sortedOptions.map((item) => (
+          <div
+            key={item.value}
+            onClick={() => onSelectOption(item)}
+            className="cursor-pointer w-full py-1 hover:bg-active-blue rounded px-2"
+          >
+            <Checkbox checked={isSelected(item)} /> {item.label}
+          </div>
+        ))}
+      </VList>
     </div>
     {query.length ? (
       <div className="border-y border-y-gray-light py-2">
         <div
-          className="whitespace-normal rounded cursor-pointer text-teal hover:bg-active-blue px-2 py-1"
+          className="whitespace-nowrap truncate w-full rounded cursor-pointer text-teal hover:bg-active-blue px-2 py-1"
           onClick={applyQuery}
         >
-          {t('Apply')}&nbsp;{queryStr}
+          {t('Apply')}&nbsp;<span className='font-semibold'>{queryStr}</span>
         </div>
       </div>
     ) : null}

View file

@@ -128,8 +128,10 @@ const FilterAutoComplete = observer(
   };

   const handleFocus = () => {
+    if (!initialFocus) {
+      setOptions(topValues.map((i) => ({ value: i.value, label: i.value })));
+    }
     setInitialFocus(true);
-    setOptions(topValues.map((i) => ({ value: i.value, label: i.value })));
   };

   return (
return ( return (

View file

@@ -19,11 +19,13 @@ export default function MetaItem(props: Props) {
   <TextEllipsis
     text={label}
     className="p-0"
+    maxWidth={'300px'}
     popupProps={{ size: 'small', disabled: true }}
   />
   <span className="bg-neutral-200 inline-block w-[1px] min-h-[17px]"></span>
   <TextEllipsis
     text={value}
+    maxWidth={'350px'}
     className="p-0 text-neutral-500"
     popupProps={{ size: 'small', disabled: true }}
   />

View file

@@ -7,13 +7,15 @@ interface Props {
   className?: string;
   metaList: any[];
   maxLength?: number;
+  onMetaClick?: (meta: { name: string, value: string }) => void;
+  horizontal?: boolean;
 }
 export default function SessionMetaList(props: Props) {
-  const { className = '', metaList, maxLength = 14 } = props;
+  const { className = '', metaList, maxLength = 14, horizontal = false } = props;
   return (
-    <div className={cn('flex items-center flex-wrap gap-1', className)}>
+    <div className={cn('flex items-center gap-1', horizontal ? '' : 'flex-wrap', className)}>
       {metaList.slice(0, maxLength).map(({ label, value }, index) => (
         <React.Fragment key={index}>
           <MetaItem label={label} value={`${value}`} />

View file

@@ -5,6 +5,7 @@ import ListingVisibility from './components/ListingVisibility';
 import DefaultPlaying from './components/DefaultPlaying';
 import DefaultTimezone from './components/DefaultTimezone';
 import CaptureRate from './components/CaptureRate';
 import { useTranslation } from 'react-i18next';

 function SessionSettings() {

View file

@@ -0,0 +1,30 @@
import React from 'react';
import { useStore } from 'App/mstore';
import { observer } from 'mobx-react-lite';
import { Switch } from 'UI';
import { useTranslation } from 'react-i18next';
function VirtualModeSettings() {
const { settingsStore } = useStore();
const { sessionSettings } = settingsStore;
const { virtualMode } = sessionSettings;
const { t } = useTranslation();
const updateSettings = (checked: boolean) => {
settingsStore.sessionSettings.updateKey('virtualMode', !virtualMode);
};
return (
<div>
<h3 className="text-lg">{t('Virtual Mode')}</h3>
<div className="my-1">
{t('Change this setting if you have issues with recordings containing Lightning Web Components (or similar custom HTML Element libraries).')}
</div>
<div className="mt-2">
<Switch onChange={updateSettings} checked={virtualMode} />
</div>
</div>
);
}
export default observer(VirtualModeSettings);

View file

@@ -9,6 +9,7 @@ export const GLOBAL_HAS_NO_RECORDINGS = '__$global-hasNoRecordings$__';
 export const SITE_ID_STORAGE_KEY = '__$user-siteId$__';
 export const GETTING_STARTED = '__$user-gettingStarted$__';
 export const MOUSE_TRAIL = '__$session-mouseTrail$__';
+export const VIRTUAL_MODE_KEY = '__$session-virtualMode$__'
 export const IFRAME = '__$session-iframe$__';
 export const JWT_PARAM = '__$session-jwt-param$__';
 export const MENU_COLLAPSED = '__$global-menuCollapsed$__';

View file

@@ -503,7 +503,7 @@
   "Returning users between": "Returning users between",
   "Sessions": "Sessions",
   "No recordings found.": "No recordings found.",
-  "Get new session": "Get new session",
+  "Get new image": "Get new image",
   "The number of cards in one dashboard is limited to 30.": "The number of cards in one dashboard is limited to 30.",
   "Add Card": "Add Card",
   "Create Dashboard": "Create Dashboard",

View file

@@ -503,7 +503,7 @@
   "Returning users between": "Usuarios recurrentes entre",
   "Sessions": "Sesiones",
   "No recordings found.": "No se encontraron grabaciones.",
-  "Get new session": "Obtener nueva sesión",
+  "Get new image": "Obtener nueva sesión",
   "The number of cards in one dashboard is limited to 30.": "El número de tarjetas en un panel está limitado a 30.",
   "Add Card": "Agregar tarjeta",
   "Create Dashboard": "Crear panel",

View file

@@ -503,7 +503,7 @@
   "Returning users between": "Utilisateurs récurrents entre",
   "Sessions": "Sessions",
   "No recordings found.": "Aucun enregistrement trouvé.",
-  "Get new session": "Obtenir une nouvelle session",
+  "Get new image": "Obtenir une nouvelle session",
   "The number of cards in one dashboard is limited to 30.": "Le nombre de cartes dans un tableau de bord est limité à 30.",
   "Add Card": "Ajouter une carte",
   "Create Dashboard": "Créer un tableau de bord",

View file

@@ -504,7 +504,7 @@
   "Returning users between": "Возвращающиеся пользователи за период",
   "Sessions": "Сессии",
   "No recordings found.": "Записей не найдено.",
-  "Get new session": "Получить новую сессию",
+  "Get new image": "Получить новую сессию",
   "The number of cards in one dashboard is limited to 30.": "Количество карточек в одном дашборде ограничено 30.",
   "Add Card": "Добавить карточку",
   "Create Dashboard": "Создать дашборд",
@@ -1498,5 +1498,8 @@
   "More attribute": "Еще атрибут",
   "More attributes": "Еще атрибуты",
   "Account settings updated successfully": "Настройки аккаунта успешно обновлены",
-  "Include rage clicks": "Включить невыносимые клики"
-}
+  "Include rage clicks": "Включить невыносимые клики",
+  "Interface Language": "Язык интерфейса",
+  "Select the language in which OpenReplay will appear.": "Выберите язык, на котором будет отображаться OpenReplay.",
+  "Language": "Язык"
+}

View file

@@ -503,7 +503,7 @@
   "Returning users between": "回访用户区间",
   "Sessions": "会话",
   "No recordings found.": "未找到录制。",
-  "Get new session": "获取新会话",
+  "Get new image": "获取新会话",
   "The number of cards in one dashboard is limited to 30.": "一个仪表板最多可包含30个卡片。",
   "Add Card": "添加卡片",
   "Create Dashboard": "创建仪表板",

View file

@@ -1,11 +1,13 @@
-import { makeAutoObservable, runInAction } from 'mobx';
+import { makeAutoObservable, runInAction, reaction } from 'mobx';
 import { dashboardService, metricService } from 'App/services';
 import { toast } from 'react-toastify';
-import Period, { LAST_24_HOURS, LAST_7_DAYS } from 'Types/app/period';
+import Period, { LAST_24_HOURS } from 'Types/app/period';
 import { getRE } from 'App/utils';
 import Filter from './types/filter';
 import Widget from './types/widget';
 import Dashboard from './types/dashboard';
+import { calculateGranularities } from '@/components/Dashboard/components/WidgetDateRange/RangeGranularity';
+import { CUSTOM_RANGE } from '@/dateRange';

 interface DashboardFilter {
   query?: string;
@@ -34,9 +36,9 @@ export default class DashboardStore {
   comparisonFilter: Filter = new Filter();

-  drillDownPeriod: Record<string, any> = Period({ rangeName: LAST_7_DAYS });
+  drillDownPeriod: Record<string, any> = Period({ rangeName: LAST_24_HOURS });

-  selectedDensity: number = 7; // depends on default drilldown, 7 points here!!!;
+  selectedDensity: number = 7;

   comparisonPeriods: Record<string, any> = {};
@@ -83,10 +85,29 @@ export default class DashboardStore {
     makeAutoObservable(this);
     this.resetDrillDownFilter();
+    this.createDensity(this.period.getDuration());
+
+    reaction(
+      () => this.period,
+      (period) => {
+        this.createDensity(period.getDuration());
+      },
+    );
   }

-  setDensity = (density: any) => {
-    this.selectedDensity = parseInt(density, 10);
+  resetDensity = () => {
+    this.createDensity(this.period.getDuration());
+  };
+
+  createDensity = (duration: number) => {
+    const densityOpts = calculateGranularities(duration);
+    const defaultOption = densityOpts[densityOpts.length - 2];
+    this.setDensity(defaultOption.key);
+  };
+
+  setDensity = (density: number) => {
+    this.selectedDensity = density;
   };

   get sortedDashboards() {
@@ -446,7 +467,7 @@ export default class DashboardStore {
     this.isSaving = true;
     try {
       try {
-        const response = await dashboardService.addWidget(dashboard, metricIds);
+        await dashboardService.addWidget(dashboard, metricIds);
         toast.success('Card added to dashboard.');
       } catch {
         toast.error('Card could not be added.');
@@ -456,6 +477,17 @@ export default class DashboardStore {
     }
   }

+  resetPeriod = () => {
+    if (this.period) {
+      const range = this.period.rangeName;
+      if (range !== CUSTOM_RANGE) {
+        this.period = Period({ rangeName: this.period.rangeName });
+      } else {
+        this.period = Period({ rangeName: LAST_24_HOURS });
+      }
+    }
+  };
+
   setPeriod(period: any) {
     this.period = Period({
       start: period.start,
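The store now derives chart density from the selected period instead of hard-coding 7 points: a MobX reaction watches this.period and recomputes the default granularity on every change. A minimal sketch of the observe-and-derive pattern, with the granularity math simplified to a stand-in for calculateGranularities:

import { makeAutoObservable, reaction } from 'mobx';

class PeriodDensityStore {
  periodDurationMs = 24 * 60 * 60 * 1000;
  selectedDensity = 7;

  constructor() {
    makeAutoObservable(this);
    // Recompute the derived density every time the observed period changes.
    reaction(
      () => this.periodDurationMs,
      (duration) => this.createDensity(duration),
    );
    this.createDensity(this.periodDurationMs);
  }

  setPeriodDuration(ms: number) {
    this.periodDurationMs = ms; // triggers the reaction above
  }

  // Simplified stand-in for calculateGranularities(): roughly one point
  // per hour, clamped to a sane range.
  createDensity(duration: number) {
    const points = Math.round(duration / (60 * 60 * 1000));
    this.selectedDensity = Math.min(Math.max(points, 7), 90);
  }
}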

View file

@@ -1,6 +1,5 @@
 import { makeAutoObservable } from 'mobx';
 import { issueReportsService } from 'App/services';
-import { makePersistable } from '.store/mobx-persist-store-virtual-858ce4d906/package';
 import ReportedIssue from '../types/session/assignment';

 export default class IssueReportingStore {

View file

@@ -4,7 +4,6 @@ import {
   SITE_ID_STORAGE_KEY,
 } from 'App/constants/storageKeys';
 import { projectsService } from 'App/services';
-import { toast } from '.store/react-toastify-virtual-9dd0f3eae1/package';
 import GDPR from './types/gdpr';
 import Project from './types/project';

View file

@ -390,10 +390,11 @@ class SearchStore {
 // TODO
 }

-async fetchSessions(
+fetchSessions = async (
   force: boolean = false,
   bookmarked: boolean = false,
-): Promise<void> {
+): Promise<void> => {
   if (this.searchInProgress) return;
   const filter = this.instance.toSearch();

View file

@ -220,6 +220,7 @@ class SearchStoreLive {
 updateFilter = (index: number, search: Partial<IFilter>) => {
   const newFilters = this.instance.filters.map((_filter: any, i: any) => {
     if (i === index) {
+      search.value = checkFilterValue(search.value);
       return search;
     }
     return _filter;

View file

@ -15,9 +15,7 @@ import { loadFile } from 'App/player/web/network/loadFiles';
 import { LAST_7_DAYS } from 'Types/app/period';
 import { filterMap } from 'App/mstore/searchStore';
 import { getDateRangeFromValue } from 'App/dateRange';
-import { clean as cleanParams } from '../api_client';
 import { searchStore, searchStoreLive } from './index';

 const range = getDateRangeFromValue(LAST_7_DAYS);
 const defaultDateFilters = {

View file

@ -157,7 +157,7 @@ export default class FilterItem {
 const json = {
   type: isMetadata ? FilterKey.METADATA : this.key,
   isEvent: Boolean(this.isEvent),
-  value: this.value.map((i: any) => (i ? i.toString() : '')),
+  value: this.value?.map((i: any) => (i ? i.toString() : '')) || [],
   operator: this.operator,
   source: isMetadata ? this.key.replace(/^_/, '') : this.source,
   sourceOperator: this.sourceOperator,

View file

@ -7,6 +7,7 @@ import Filter, { IFilter } from 'App/mstore/types/filter';
 import FilterItem from 'App/mstore/types/filterItem';
 import { makeAutoObservable, observable } from 'mobx';
 import { LAST_24_HOURS, LAST_30_DAYS, LAST_7_DAYS } from 'Types/app/period';
+import { roundToNextMinutes } from '@/utils';

 // @ts-ignore
 const rangeValue = DATE_RANGE_VALUES.LAST_24_HOURS;
@ -177,6 +178,7 @@ export default class Search {
   js.rangeValue,
   js.startDate,
   js.endDate,
+  15,
 );
 js.startDate = startDate;
 js.endDate = endDate;
@ -190,12 +192,11 @@ export default class Search {
   rangeName: string,
   customStartDate: number,
   customEndDate: number,
-): {
-  startDate: number;
-  endDate: number;
-} {
+  roundMinutes?: number,
+): { startDate: number; endDate: number } {
   let endDate = new Date().getTime();
   let startDate: number;
+  const minutes = roundMinutes || 15;

   switch (rangeName) {
     case LAST_7_DAYS:
@ -206,9 +207,7 @@ export default class Search {
       break;
     case CUSTOM_RANGE:
       if (!customStartDate || !customEndDate) {
-        throw new Error(
-          'Start date and end date must be provided for CUSTOM_RANGE.',
-        );
+        throw new Error('Start date and end date must be provided for CUSTOM_RANGE.');
       }
       startDate = customStartDate;
       endDate = customEndDate;
@ -218,10 +217,12 @@ export default class Search {
       startDate = endDate - 24 * 60 * 60 * 1000;
   }

-  return {
-    startDate,
-    endDate,
-  };
+  if (rangeName !== CUSTOM_RANGE) {
+    startDate = roundToNextMinutes(startDate, minutes);
+    endDate = roundToNextMinutes(endDate, minutes);
+  }
+  return { startDate, endDate };
 }

 fromJS({ eventsOrder, filters, events, custom, ...filterData }: any) {

View file

@ -6,6 +6,7 @@ import {
 SHOWN_TIMEZONE,
 DURATION_FILTER,
 MOUSE_TRAIL,
+VIRTUAL_MODE_KEY,
 } from 'App/constants/storageKeys';
 import { DateTime, Settings } from 'luxon';
@ -71,27 +72,19 @@ export const generateGMTZones = (): Timezone[] => {
 export default class SessionSettings {
   defaultTimezones = [...generateGMTZones()];
   skipToIssue: boolean = localStorage.getItem(SKIP_TO_ISSUE) === 'true';
   timezone: Timezone;
   durationFilter: any = JSON.parse(
     localStorage.getItem(DURATION_FILTER) ||
       JSON.stringify(defaultDurationFilter),
   );
   captureRate: string = '0';
   conditionalCapture: boolean = false;
   captureConditions: { name: string; captureRate: number; filters: any[] }[] =
     [];
   mouseTrail: boolean = localStorage.getItem(MOUSE_TRAIL) !== 'false';
   shownTimezone: 'user' | 'local';
+  virtualMode: boolean = localStorage.getItem(VIRTUAL_MODE_KEY) === 'true';
   usingLocal: boolean = false;

   constructor() {

View file

@ -163,6 +163,7 @@ export default class Widget {
 fromJson(json: any, period?: any) {
   json.config = json.config || {};
   runInAction(() => {
+    this.dashboardId = json.dashboardId;
     this.metricId = json.metricId;
     this.widgetId = json.widgetId;
     this.metricValue = this.metricValueFromArray(

View file

@ -43,6 +43,7 @@ export default class MessageLoader {
   this.session = session;
 }

+rawMessages: any[] = [];

 createNewParser(
   shouldDecrypt = true,
   onMessagesDone: (msgs: PlayerMsg[], file?: string) => void,
@ -69,6 +70,7 @@ export default class MessageLoader {
 while (!finished) {
   const msg = fileReader.readNext();
   if (msg) {
+    this.rawMessages.push(msg);
     msgs.push(msg);
   } else {
     finished = true;
@ -78,7 +80,6 @@ export default class MessageLoader {
 let artificialStartTime = Infinity;
 let startTimeSet = false;
 msgs.forEach((msg, i) => {
   if (msg.tp === MType.Redux || msg.tp === MType.ReduxDeprecated) {
     if ('actionTime' in msg && msg.actionTime) {
@ -343,27 +344,32 @@ const DOMMessages = [
   MType.CreateElementNode,
   MType.CreateTextNode,
   MType.MoveNode,
+  MType.RemoveNode,
   MType.CreateIFrameDocument,
 ];
+// fixed times: 3

 function brokenDomSorter(m1: PlayerMsg, m2: PlayerMsg) {
   if (m1.time !== m2.time) return m1.time - m2.time;
-  if (m1.tp === MType.CreateDocument && m2.tp !== MType.CreateDocument)
-    return -1;
-  if (m1.tp !== MType.CreateDocument && m2.tp === MType.CreateDocument)
-    return 1;
-  const m1IsDOM = DOMMessages.includes(m1.tp);
-  const m2IsDOM = DOMMessages.includes(m2.tp);
-  if (m1IsDOM && m2IsDOM) {
-    // @ts-ignore DOM msg has id but checking for 'id' in m is expensive
-    return m1.id - m2.id;
-  }
-  if (m1IsDOM && !m2IsDOM) return -1;
-  if (!m1IsDOM && m2IsDOM) return 1;
+  // if (m1.tp === MType.CreateDocument && m2.tp !== MType.CreateDocument)
+  //   return -1;
+  // if (m1.tp !== MType.CreateDocument && m2.tp === MType.CreateDocument)
+  //   return 1;
+  // if (m1.tp === MType.RemoveNode)
+  //   return 1;
+  // if (m2.tp === MType.RemoveNode)
+  //   return -1;
+  // const m1IsDOM = DOMMessages.includes(m1.tp);
+  // const m2IsDOM = DOMMessages.includes(m2.tp);
+  // if (m1IsDOM && m2IsDOM) {
+  //   // @ts-ignore DOM msg has id but checking for 'id' in m is expensive
+  //   return m1.id - m2.id;
+  // }
+  // if (m1IsDOM && !m2IsDOM) return -1;
+  // if (!m1IsDOM && m2IsDOM) return 1;
   return 0;
 }

View file

@ -1,7 +1,7 @@
 // @ts-ignore
 import { Decoder } from 'syncod';
 import logger from 'App/logger';
+import { VIRTUAL_MODE_KEY } from '@/constants/storageKeys';
 import type { Store, ILog, SessionFilesInfo } from 'Player';
 import TabSessionManager, { TabState } from 'Player/web/TabManager';
 import ActiveTabManager from 'Player/web/managers/ActiveTabManager';
@ -69,6 +69,7 @@ export interface State extends ScreenState {
   tabChangeEvents: TabChangeEvent[];
   closedTabs: string[];
   sessionStart: number;
+  vModeBadge: boolean;
 }

 export const visualChanges = [
@ -99,6 +100,7 @@ export default class MessageManager {
   closedTabs: [],
   sessionStart: 0,
   tabNames: {},
+  vModeBadge: false,
 };

 private clickManager: ListWalker<MouseClick> = new ListWalker();
@ -126,7 +128,6 @@ export default class MessageManager {
 private tabsAmount = 0;
 private tabChangeEvents: TabChangeEvent[] = [];
 private activeTab = '';

 constructor(
@ -142,8 +143,19 @@ export default class MessageManager {
   this.activityManager = new ActivityManager(
     this.session.duration.milliseconds,
   ); // only if not-live
+  const vMode = localStorage.getItem(VIRTUAL_MODE_KEY);
+  if (vMode === 'true') {
+    this.setVirtualMode(true);
+  }
 }
private virtualMode = false;
public setVirtualMode = (virtualMode: boolean) => {
this.virtualMode = virtualMode;
Object.values(this.tabs).forEach((tab) => tab.setVirtualMode(virtualMode));
};
 public getListsFullState = () => {
   const fullState: Record<string, any> = {};
   for (const tab in Object.keys(this.tabs)) {
@ -394,6 +406,9 @@ export default class MessageManager {
     this.sessionStart,
     this.initialLists,
   );
+  if (this.virtualMode) {
+    this.tabs[msg.tabId].setVirtualMode(this.virtualMode);
+  }
 }
 const lastMessageTime = Math.max(msg.time, this.lastMessageTime);

View file

@ -99,6 +99,7 @@ export default class TabSessionManager {
   tabStates: { [tabId: string]: TabState };
   tabNames: { [tabId: string]: string };
   location?: string;
+  vModeBadge?: boolean;
 }>,
 private readonly screen: Screen,
 private readonly id: string,
@ -116,6 +117,13 @@ export default class TabSessionManager {
   screen,
   this.session.isMobile,
   this.setCSSLoading,
+  () => {
+    setTimeout(() => {
+      this.state.update({
+        vModeBadge: true,
+      });
+    }, 0);
+  },
 );
 this.lists = new Lists(initialLists);
 initialLists?.event?.forEach((e: Record<string, string>) => {
@ -126,6 +134,10 @@ export default class TabSessionManager {
   });
 }

+public setVirtualMode = (virtualMode: boolean) => {
+  this.pagesManager.setVirtualMode(virtualMode);
+};

 setSession = (session: any) => {
   this.session = session;
 };
@ -348,19 +360,19 @@ export default class TabSessionManager {
       break;
     case MType.CreateTextNode:
     case MType.CreateElementNode:
-      this.windowNodeCounter.addNode(msg.id, msg.parentID);
+      this.windowNodeCounter.addNode(msg);
       this.performanceTrackManager.setCurrentNodesCount(
         this.windowNodeCounter.count,
       );
       break;
     case MType.MoveNode:
-      this.windowNodeCounter.moveNode(msg.id, msg.parentID);
+      this.windowNodeCounter.moveNode(msg);
       this.performanceTrackManager.setCurrentNodesCount(
         this.windowNodeCounter.count,
       );
       break;
     case MType.RemoveNode:
-      this.windowNodeCounter.removeNode(msg.id);
+      this.windowNodeCounter.removeNode(msg);
       this.performanceTrackManager.setCurrentNodesCount(
         this.windowNodeCounter.count,
       );

View file

@ -21,15 +21,10 @@ export default class WebPlayer extends Player {
   inspectorMode: false,
   mobsFetched: false,
 };

 private inspectorController: InspectorController;
 protected screen: Screen;
 protected readonly messageManager: MessageManager;
 protected readonly messageLoader: MessageLoader;
 private targetMarker: TargetMarker;

 constructor(
@ -104,6 +99,10 @@ export default class WebPlayer extends Player {
   window.__OPENREPLAY_DEV_TOOLS__.player = this;
 }

+enableVMode = () => {
+  this.messageManager.setVirtualMode(true);
+};

 preloadFirstFile(data: Uint8Array, fileKey?: string) {
   void this.messageLoader.preloadFirstFile(data, fileKey);
 }
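Piecing the virtual-mode wiring together: the flag can arrive persisted through localStorage (read in the MessageManager constructor above) or be toggled live through the player. A sketch, with player standing in for a WebPlayer instance:

// Persisted: read via VIRTUAL_MODE_KEY when the next session loads.
localStorage.setItem(VIRTUAL_MODE_KEY, 'true');

// Live toggle: propagates MessageManager -> every TabSessionManager ->
// PagesManager, so DOMManagers created from then on get virtualMode: true.
player.enableVMode();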

View file

@ -140,11 +140,16 @@ class SimpleHeatmap {
   ctx.drawImage(this.circle, p[0] - this.r, p[1] - this.r);
 });

-const colored = ctx.getImageData(0, 0, this.width, this.height);
-this.colorize(colored.data, this.grad);
-ctx.putImageData(colored, 0, 0);
-
-return this;
+try {
+  const colored = ctx.getImageData(0, 0, this.width, this.height);
+  this.colorize(colored.data, this.grad);
+  ctx.putImageData(colored, 0, 0);
+} catch (e) {
+  // usually happens if the session data is corrupted
+  console.error('Error while colorizing heatmap:', e);
+} finally {
+  return this;
+}
 }
private colorize( private colorize(

View file

@ -2,21 +2,7 @@ import logger from '@/logger';
 import { VElement } from 'Player/web/managers/DOM/VirtualDOM';
 import MessageManager from 'Player/web/MessageManager';
 import { Socket } from 'socket.io-client';
+import { toast } from 'react-toastify';

-let frameCounter = 0;
-function draw(
-  video: HTMLVideoElement,
-  canvas: HTMLCanvasElement,
-  canvasCtx: CanvasRenderingContext2D,
-) {
-  if (frameCounter % 4 === 0) {
-    canvasCtx.drawImage(video, 0, 0, canvas.width, canvas.height);
-  }
-  frameCounter++;
-  requestAnimationFrame(() => draw(video, canvas, canvasCtx));
-}
 export default class CanvasReceiver {
   private streams: Map<string, MediaStream> = new Map();
@ -25,6 +11,16 @@ export default class CanvasReceiver {
 private cId: string;
private frameCounter = 0;
private canvasesData = new Map<
string,
{
video: HTMLVideoElement;
canvas: HTMLCanvasElement;
canvasCtx: CanvasRenderingContext2D;
}
>(new Map());
 // sendSignal for sending signals (offer/answer/ICE)
 constructor(
   private readonly peerIdPrefix: string,
@ -56,6 +52,14 @@ export default class CanvasReceiver {
   },
 );
this.socket.on('webrtc_canvas_stop', (data: { id: string }) => {
const { id } = data;
const canvasId = getCanvasId(id);
this.connections.delete(id);
this.streams.delete(id);
this.canvasesData.delete(canvasId);
});
 this.socket.on('webrtc_canvas_restart', () => {
   this.clear();
 });
@ -85,7 +89,7 @@ export default class CanvasReceiver {
 const stream = event.streams[0];
 if (stream) {
   // Detect canvasId from remote peer id
-  const canvasId = id.split('-')[4];
+  const canvasId = getCanvasId(id);
   this.streams.set(canvasId, stream);
   setTimeout(() => {
     const node = this.getNode(parseInt(canvasId, 10));
@ -93,14 +97,15 @@ export default class CanvasReceiver {
       stream.clone() as MediaStream,
       node as VElement,
     );
-    if (node) {
-      draw(
-        videoEl,
-        node.node as HTMLCanvasElement,
-        (node.node as HTMLCanvasElement).getContext(
-          '2d',
-        ) as CanvasRenderingContext2D,
-      );
+    if (node && videoEl) {
+      this.canvasesData.set(canvasId, {
+        video: videoEl,
+        canvas: node.node as HTMLCanvasElement,
+        canvasCtx: (node.node as HTMLCanvasElement)?.getContext(
+          '2d',
+        ) as CanvasRenderingContext2D,
+      });
+      this.draw();
     } else {
       logger.log('NODE', canvasId, 'IS NOT FOUND');
     }
@ -136,7 +141,27 @@ export default class CanvasReceiver {
   });
   this.connections.clear();
   this.streams.clear();
+  this.canvasesData.clear();
 }
draw = () => {
if (this.frameCounter % 4 === 0) {
if (this.canvasesData.size === 0) {
return;
}
this.canvasesData.forEach((canvasData, id) => {
const { video, canvas, canvasCtx } = canvasData;
const node = this.getNode(parseInt(id, 10));
if (node) {
canvasCtx.drawImage(video, 0, 0, canvas.width, canvas.height);
} else {
this.canvasesData.delete(id);
}
});
}
this.frameCounter++;
requestAnimationFrame(() => this.draw());
};
} }
 function spawnVideo(stream: MediaStream, node: VElement) {
@ -152,6 +177,10 @@ function spawnVideo(stream: MediaStream, node: VElement) {
     .play()
     .then(() => true)
     .catch(() => {
+      toast.error('Click to unpause canvas stream', {
+        autoClose: false,
+        toastId: 'canvas-stream',
+      });
       // we allow that if user just reloaded the page
     });
@ -164,6 +193,10 @@ function spawnVideo(stream: MediaStream, node: VElement) {
 const startStream = () => {
   videoEl
     .play()
+    .then(() => {
+      toast.dismiss('canvas-stream');
+      clearListeners();
+    })
     .then(() => console.log('unpaused'))
     .catch(() => {
       // we allow that if user just reloaded the page
@ -179,6 +212,10 @@ function checkId(id: string, cId: string): boolean {
   return id.includes(cId);
 }
function getCanvasId(id: string): string {
return id.split('-')[4];
}
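The helper assumes the remote peer id carries the canvas node id as its fifth dash-separated segment; an illustrative call (the id layout shown is an assumption, not a documented format):

// e.g. '<prefix>-<project>-<session>-<tab>-<nodeId>'
getCanvasId('6-0-abc123-0-4521'); // -> '4521'
// parseInt('4521', 10) is then used to look up the canvas VElement.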
 /** simple peer example
  * // @ts-ignore
  * const peer = new SLPeer({ initiator: false })

View file

@ -17,6 +17,9 @@ export interface State {
 export default class RemoteControl {
   private assistVersion = 1;

+  private isDragging = false;
+
+  private dragStart: any | null = null;
+
+  private readonly dragThreshold = 3;

   static readonly INITIAL_STATE: Readonly<State> = {
     remoteControl: RemoteControlStatus.Disabled,
@ -81,6 +84,7 @@ export default class RemoteControl {
 }

 private onMouseMove = (e: MouseEvent): void => {
+  if (this.isDragging) return;
   const data = this.screen.getInternalCoordinates(e);
   this.emitData('move', [data.x, data.y]);
 };
@ -154,16 +158,61 @@ export default class RemoteControl {
   this.emitData('click', [data.x, data.y]);
 };
private onMouseDown = (e: MouseEvent): void => {
if (this.store.get().annotating) return;
const { x, y } = this.screen.getInternalViewportCoordinates(e);
this.dragStart = [x, y];
this.isDragging = false;
const handleMove = (moveEvent: MouseEvent) => {
const { x: mx, y: my } =
this.screen.getInternalViewportCoordinates(moveEvent);
const [sx, sy] = this.dragStart!;
const dx = Math.abs(mx - sx);
const dy = Math.abs(my - sy);
if (
!this.isDragging &&
(dx > this.dragThreshold || dy > this.dragThreshold)
) {
this.emitData('startDrag', [sx, sy]);
this.isDragging = true;
}
if (this.isDragging) {
this.emitData('drag', [mx, my, mx - sx, my - sy]);
}
};
const handleUp = () => {
if (this.isDragging) {
this.emitData('stopDrag');
}
this.dragStart = null;
this.isDragging = false;
window.removeEventListener('mousemove', handleMove);
window.removeEventListener('mouseup', handleUp);
};
window.addEventListener('mousemove', handleMove);
window.addEventListener('mouseup', handleUp);
};
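In effect, a press only becomes a drag once the pointer moves more than dragThreshold pixels on either axis; a small stand-alone sketch of that check (threshold value mirrored from above):

const DRAG_THRESHOLD = 3;

function isDrag(sx: number, sy: number, mx: number, my: number): boolean {
  // Per-axis comparison, matching the dx/dy checks in onMouseDown.
  return Math.abs(mx - sx) > DRAG_THRESHOLD || Math.abs(my - sy) > DRAG_THRESHOLD;
}

isDrag(10, 10, 12, 12); // false -> still a click
isDrag(10, 10, 15, 18); // true  -> startDrag, then drag [x, y, dx, dy]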
 private toggleRemoteControl(enable: boolean) {
   if (enable) {
     this.screen.overlay.addEventListener('mousemove', this.onMouseMove);
     this.screen.overlay.addEventListener('click', this.onMouseClick);
     this.screen.overlay.addEventListener('wheel', this.onWheel);
+    this.screen.overlay.addEventListener('mousedown', this.onMouseDown);
     this.store.update({ remoteControl: RemoteControlStatus.Enabled });
   } else {
     this.screen.overlay.removeEventListener('mousemove', this.onMouseMove);
     this.screen.overlay.removeEventListener('click', this.onMouseClick);
     this.screen.overlay.removeEventListener('wheel', this.onWheel);
+    this.screen.overlay.removeEventListener('mousedown', this.onMouseDown);
     this.store.update({ remoteControl: RemoteControlStatus.Disabled });
     this.toggleAnnotation(false);
   }

View file

@ -44,45 +44,34 @@ const ATTR_NAME_REGEXP = /([^\t\n\f \/>"'=]+)/;
 export default class DOMManager extends ListWalker<Message> {
   private readonly vTexts: Map<number, VText> = new Map(); // map vs object here?
   private readonly vElements: Map<number, VElement> = new Map();
   private readonly olVRoots: Map<number, OnloadVRoot> = new Map();
   /** required to keep track of iframes, frameId : vnodeId */
   private readonly iframeRoots: Record<number, number> = {};
+  private shadowRootParentMap: Map<number, number> = new Map();
   /** Constructed StyleSheets https://developer.mozilla.org/en-US/docs/Web/API/Document/adoptedStyleSheets
    * as well as <style> tag owned StyleSheets
    */
   private olStyleSheets: Map<number, OnloadStyleSheet> = new Map();
   /** @deprecated since tracker 4.0.2 Mapping by nodeID */
   private olStyleSheetsDeprecated: Map<number, OnloadStyleSheet> = new Map();
   private upperBodyId: number = -1;
   private nodeScrollManagers: Map<number, ListWalker<SetNodeScroll>> =
     new Map();
   private stylesManager: StylesManager;
   private focusManager: FocusManager = new FocusManager(this.vElements);
   private selectionManager: SelectionManager;
   private readonly screen: Screen;
   private readonly isMobile: boolean;
   private readonly stringDict: Record<number, string>;
   private readonly globalDict: {
     get: (key: string) => string | undefined;
     all: () => Record<string, string>;
   };
   public readonly time: number;
+  private virtualMode = false;
+  private hasSlots = false;
+  private showVModeBadge?: () => void;
 constructor(params: {
   screen: Screen;
@ -94,6 +83,8 @@ export default class DOMManager extends ListWalker<Message> {
     get: (key: string) => string | undefined;
     all: () => Record<string, string>;
   };
+  virtualMode?: boolean;
+  showVModeBadge?: () => void;
 }) {
   super();
   this.screen = params.screen;
@ -103,6 +94,8 @@ export default class DOMManager extends ListWalker<Message> {
   this.globalDict = params.globalDict;
   this.selectionManager = new SelectionManager(this.vElements, params.screen);
   this.stylesManager = new StylesManager(params.screen, params.setCssLoading);
+  this.virtualMode = params.virtualMode || false;
+  this.showVModeBadge = params.showVModeBadge;
   setupWindowLogging(this.vTexts, this.vElements, this.olVRoots);
 }
@ -163,6 +156,11 @@ export default class DOMManager extends ListWalker<Message> {
 }

 public getNode(id: number) {
+  const mappedId = this.shadowRootParentMap.get(id);
+  if (mappedId !== undefined) {
+    // If this is a shadow root ID, return the parent element instead
+    return this.vElements.get(mappedId);
+  }
   return this.vElements.get(id) || this.vTexts.get(id);
 }
@ -171,24 +169,21 @@ export default class DOMManager extends ListWalker<Message> {
   id: number;
   index: number;
 }): void {
-  const { parentID, id, index } = msg;
+  let { parentID, id, index } = msg;
+  // Check if parentID is a shadow root, and get the real parent element if so
+  const actualParentID = this.shadowRootParentMap.get(parentID);
+  if (actualParentID !== undefined) {
+    parentID = actualParentID;
+  }
   const child = this.vElements.get(id) || this.vTexts.get(id);
   if (!child) {
     logger.error('Insert error. Node not found', id);
     return;
   }
   const parent = this.vElements.get(parentID) || this.olVRoots.get(parentID);
-  if ('tagName' in child && child.tagName === 'BODY') {
-    const spriteMap = new VSpriteMap(
-      'svg',
-      true,
-      Number.MAX_SAFE_INTEGER - 100,
-      Number.MAX_SAFE_INTEGER - 100,
-    );
-    spriteMap.node.setAttribute('id', 'OPENREPLAY_SPRITES_MAP');
-    spriteMap.node.setAttribute('style', 'display: none;');
-    child.insertChildAt(spriteMap, Number.MAX_SAFE_INTEGER - 100);
-  }
   if (!parent) {
     logger.error(
       `${id} Insert error. Parent vNode ${parentID} not found`,
@ -303,11 +298,19 @@ export default class DOMManager extends ListWalker<Message> {
       this.insertNode(msg);
       this.removeBodyScroll(msg.id, vElem);
       this.removeAutocomplete(vElem);
+      if (msg.tag === 'SLOT') {
+        this.hasSlots = true;
+      }
       return;
     }
-    case MType.MoveNode:
+    case MType.MoveNode: {
+      // if the parent ID is in shadow root map -> custom elements case
+      if (this.shadowRootParentMap.has(msg.parentID)) {
+        msg.parentID = this.shadowRootParentMap.get(msg.parentID)!;
+      }
       this.insertNode(msg);
       return;
+    }
     case MType.RemoveNode: {
       const vChild = this.vElements.get(msg.id) || this.vTexts.get(msg.id);
       if (!vChild) {
@ -440,6 +443,21 @@ export default class DOMManager extends ListWalker<Message> {
   logger.error('CreateIFrameDocument: Node not found', msg);
   return;
 }
// shadow DOM for a custom element + SALESFORCE (<slot>)
const isCustomElement =
vElem.tagName.includes('-') || vElem.tagName === 'SLOT';
if (isCustomElement) {
if (this.virtualMode) {
// Store the mapping but don't create the actual shadow root
this.shadowRootParentMap.set(msg.id, msg.frameID);
return;
} else if (this.hasSlots) {
this.showVModeBadge?.();
}
}
// Real iframes
 if (this.iframeRoots[msg.frameID] && !this.olVRoots.has(msg.id)) {
   this.olVRoots.delete(this.iframeRoots[msg.frameID]);
 }
@ -452,7 +470,11 @@ export default class DOMManager extends ListWalker<Message> {
 case MType.AdoptedSsInsertRule: {
   const styleSheet = this.olStyleSheets.get(msg.sheetID);
   if (!styleSheet) {
-    logger.warn('No stylesheet was created for ', msg);
+    logger.warn(
+      'No stylesheet was created for ',
+      msg,
+      this.olStyleSheets,
+    );
     return;
   }
   insertRule(styleSheet, msg);

View file

@ -22,6 +22,7 @@ export default class PagesManager extends ListWalker<DOMManager> {
   private screen: Screen,
   private isMobile: boolean,
   private setCssLoading: (flag: boolean) => void,
+  private showVModeBadge: () => void,
 ) {
   super();
 }
@ -30,6 +31,10 @@ export default class PagesManager extends ListWalker<DOMManager> {
 Assumed that messages added in a correct time sequence.
 */
 falseOrder = false;
virtualMode = false;
setVirtualMode = (virtualMode: boolean) => {
this.virtualMode = virtualMode;
};
 appendMessage(m: Message): void {
   if ([MType.StringDict, MType.StringDictGlobal].includes(m.tp)) {
@ -62,6 +67,8 @@ export default class PagesManager extends ListWalker<DOMManager> {
     get: (key: string) => this.globalDictionary.get(key),
     all: () => Object.fromEntries(this.globalDictionary),
   },
+  virtualMode: this.virtualMode,
+  showVModeBadge: this.showVModeBadge,
 }),
 );
 this.falseOrder = false;

View file

@ -54,40 +54,45 @@ export default class WindowNodeCounter {
   this.nodes = [this.root];
 }

-addNode(id: number, parentID: number) {
+addNode(msg: { id: number; parentID: number; time: number }): boolean {
+  const { id, parentID } = msg;
   if (!this.nodes[parentID]) {
     // TODO: iframe case
     // console.error(`Wrong! Node with id ${ parentID } (parentId) not found.`);
-    return;
+    return false;
   }
   if (this.nodes[id]) {
     // console.error(`Wrong! Node with id ${ id } already exists.`);
-    return;
+    return false;
   }
   this.nodes[id] = this.nodes[parentID].newChild();
+  return true;
 }

-removeNode(id: number) {
+removeNode({ id }: { id: number }) {
   if (!this.nodes[id]) {
     // Might be text node
     // console.error(`Wrong! Node with id ${ id } not found.`);
-    return;
+    return false;
   }
   this.nodes[id].removeNode();
+  return true;
 }

-moveNode(id: number, parentId: number) {
+moveNode(msg: { id: number; parentID: number; time: number }) {
+  const { id, parentID, time } = msg;
   if (!this.nodes[id]) {
-    console.warn(`Node Counter: Node with id ${id} not found.`);
-    return;
+    console.warn(
+      `Node Counter: Node with id ${id} (parent: ${parentID}) not found. time: ${time}`,
+    );
+    return false;
   }
-  if (!this.nodes[parentId]) {
+  if (!this.nodes[parentID]) {
     console.warn(
-      `Node Counter: Node with id ${parentId} (parentId) not found.`,
+      `Node Counter: Node with id ${parentID} (parentId) not found. time: ${time}`,
     );
-    return;
+    return false;
   }
-  this.nodes[id].moveNode(this.nodes[parentId]);
+  this.nodes[id].moveNode(this.nodes[parentID]);
+  return true;
 }
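Because the counter methods now report success instead of silently returning, a caller could tally mutations that failed to apply; a hedged sketch (counter and domMessages are illustrative names):

let failedOps = 0;
for (const msg of domMessages) {
  const ok =
    msg.tp === MType.RemoveNode
      ? counter.removeNode(msg)
      : counter.addNode(msg);
  if (!ok) failedOps += 1; // hints at orphaned or out-of-order DOM messages
}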
 get count() {

View file

@ -1,5 +1,6 @@
 import { DateTime, Interval, Settings } from 'luxon';
 import Record from 'Types/Record';
+import { roundToNextMinutes } from '@/utils';
 export const LAST_30_MINUTES = 'LAST_30_MINUTES';
 export const TODAY = 'TODAY';
@ -30,7 +31,9 @@ function getRange(rangeName, offset) {
       now.startOf('day'),
     );
-    case LAST_24_HOURS:
-      return Interval.fromDateTimes(now.minus({ hours: 24 }), now);
+    case LAST_24_HOURS: {
+      const mod = now.minute % 15;
+      const next = now
+        .plus({ minutes: mod === 0 ? 15 : 15 - mod })
+        .startOf('minute');
+      return Interval.fromDateTimes(next.minus({ hours: 24 }), next);
+    }
     case LAST_30_MINUTES:
       return Interval.fromDateTimes(
         now.minus({ minutes: 30 }).startOf('minute'),

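A worked example of the new LAST_24_HOURS snapping, using luxon as above: the window end is rounded up to the next 15-minute boundary, and an already-aligned minute (minute % 15 === 0) still advances a full step:

import { DateTime, Interval } from 'luxon';

const now = DateTime.fromISO('2025-06-04T10:07:30');
const mod = now.minute % 15; // 7
const next = now.plus({ minutes: mod === 0 ? 15 : 15 - mod }).startOf('minute');
// next == 10:15:00; the interval runs 2025-06-03T10:15 -> 2025-06-04T10:15
const range = Interval.fromDateTimes(next.minus({ hours: 24 }), next);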
View file

@ -178,6 +178,8 @@ export class Click extends Event {
selector: string; selector: string;
isHighlighted: boolean | undefined = false;
 constructor(evt: ClickEvent, isClickRage?: boolean) {
   super(evt);
   this.targetContent = evt.targetContent;

View file

@ -29,6 +29,15 @@ export function debounce(callback, wait, context = this) {
   };
 }
export function debounceCall(func, wait) {
let timeout;
return function (...args) {
const context = this;
clearTimeout(timeout);
timeout = setTimeout(() => func.apply(context, args), wait);
};
}
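A usage sketch for the new helper; unlike debounce above it takes no explicit context argument and instead preserves the caller's this via apply:

const onResize = debounceCall(() => {
  console.log('resized to', window.innerWidth, window.innerHeight);
}, 200);

window.addEventListener('resize', onResize);
// Rapid resize events collapse into one call, 200 ms after the last event.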
 export function randomInt(a, b) {
   const min = (b ? a : 0) - 0.5;
   const max = b || a || Number.MAX_SAFE_INTEGER;
@ -613,3 +622,14 @@ export function exportAntCsv(tableColumns, tableData, filename = 'table.csv') {
   const blob = new Blob([csvContent], { type: 'text/csv;charset=utf-8;' });
   saveAsFile(blob, filename);
 }
export function roundToNextMinutes(timestamp: number, minutes: number): number {
const date = new Date(timestamp);
date.setSeconds(0, 0);
const currentMinutes = date.getMinutes();
const remainder = currentMinutes % minutes;
if (remainder !== 0) {
date.setMinutes(currentMinutes + (minutes - remainder));
}
return date.getTime();
}
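A worked example; note the mild asymmetry with the luxon variant in period.ts: this helper leaves already-aligned timestamps untouched, while getRange always advances a full step when the minute is on a boundary:

const t = new Date(2025, 5, 4, 10, 7, 42).getTime();
new Date(roundToNextMinutes(t, 15)); // 2025-06-04 10:15:00.000

const aligned = new Date(2025, 5, 4, 10, 15, 0).getTime();
roundToNextMinutes(aligned, 15) === aligned; // true (remainder is 0)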

scripts/docker-compose/.gitignore vendored Normal file
View file

@ -0,0 +1 @@
hacks/yamls

View file

@ -1,28 +0,0 @@
ASSIST_JWT_SECRET=${COMMON_JWT_SECRET}
ASSIST_KEY=${COMMON_JWT_SECRET}
ASSIST_RECORDS_BUCKET=records
ASSIST_URL="http://assist-openreplay:9001/assist/%s"
AWS_DEFAULT_REGION="us-east-1"
CH_COMPRESSION="false"
PYTHONUNBUFFERED="0"
REDIS_STRING="redis://redis:6379"
S3_HOST="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
S3_KEY="${COMMON_S3_KEY}"
S3_SECRET="${COMMON_S3_SECRET}"
SITE_URL="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
ch_host="clickhouse"
ch_port="9000"
ch_port_http="8123"
ch_username="default"
js_cache_bucket=sessions-assets
jwt_secret="${COMMON_JWT_SECRET}"
pg_dbname="postgres"
pg_host="postgresql"
pg_password="${COMMON_PG_PASSWORD}"
sessions_bucket=mobs
sessions_region="us-east-1"
sourcemaps_bucket=sourcemaps
sourcemaps_reader="http://sourcemapreader-openreplay:9000/sourcemaps/%s/sourcemaps"
version_number="${COMMON_VERSION}"
CLUSTER_URL=""
POD_NAMESPACE=""

View file

@ -1,10 +0,0 @@
AWS_ACCESS_KEY_ID=${COMMON_S3_KEY}
AWS_SECRET_ACCESS_KEY=${COMMON_S3_SECRET}
BUCKET_NAME=sessions-assets
LICENSE_KEY=''
AWS_ENDPOINT='http://minio:9000'
AWS_REGION='us-east-1'
KAFKA_SERVERS='kafka.db.svc.cluster.local:9092'
KAFKA_USE_SSL='false'
ASSETS_ORIGIN='https://${COMMON_DOMAIN_NAME}:443/sessions-assets'
REDIS_STRING='redis://redis:6379'

View file

@ -1,11 +0,0 @@
ASSIST_JWT_SECRET=${COMMON_JWT_SECRET}
ASSIST_KEY=${COMMON_JWT_SECRET}
AWS_DEFAULT_REGION="us-east-1"
S3_HOST="https://${COMMON_DOMAIN_NAME}:443"
S3_KEY=changeMeMinioAccessKey
S3_SECRET=changeMeMinioPassword
REDIS_URL=redis
CLEAR_SOCKET_TIME='720'
debug='0'
redis='false'
uws='false'

View file

@ -1,31 +0,0 @@
ASSIST_JWT_SECRET=${COMMON_JWT_SECRET}
ASSIST_KEY=${COMMON_JWT_SECRET}
ASSIST_RECORDS_BUCKET=records
ASSIST_URL="http://assist-openreplay:9001/assist/%s"
AWS_DEFAULT_REGION="us-east-1"
CH_COMPRESSION="false"
PYTHONUNBUFFERED="0"
REDIS_STRING="redis://redis:6379"
S3_HOST="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
S3_KEY="${COMMON_S3_KEY}"
S3_SECRET="${COMMON_S3_SECRET}"
SITE_URL="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
ch_host="clickhouse"
ch_port="9000"
ch_port_http="8123"
ch_username="default"
js_cache_bucket=sessions-assets
jwt_secret="${COMMON_JWT_SECRET}"
pg_dbname="postgres"
pg_host="postgresql"
pg_password="${COMMON_PG_PASSWORD}"
sessions_bucket=mobs
sessions_region="us-east-1"
sourcemaps_bucket=sourcemaps
sourcemaps_reader="http://sourcemapreader-openreplay:9000/sourcemaps/%s/sourcemaps"
version_number="${COMMON_VERSION}"
CLUSTER_URL=""
POD_NAMESPACE=""
JWT_REFRESH_SECRET=${COMMON_JWT_REFRESH_SECRET}
JWT_SPOT_REFRESH_SECRET=${COMMON_JWT_REFRESH_SECRET}
JWT_SPOT_SECRET=${COMMON_JWT_SPOT_SECRET}

View file

@ -1,15 +1,20 @@
+COMMON_VERSION="v1.22.0"
 COMMON_PROTOCOL="https"
 COMMON_DOMAIN_NAME="change_me_domain"
 COMMON_JWT_SECRET="change_me_jwt"
 COMMON_JWT_SPOT_SECRET="change_me_jwt"
-COMMON_JWT_REFRESH_SECRET="change_me_jwt_refresh"
 COMMON_S3_KEY="change_me_s3_key"
 COMMON_S3_SECRET="change_me_s3_secret"
 COMMON_PG_PASSWORD="change_me_pg_password"
-COMMON_VERSION="v1.21.0"
+COMMON_JWT_REFRESH_SECRET="change_me_jwt_refresh"
+COMMON_JWT_SPOT_REFRESH_SECRET="change_me_jwt_spot_refresh"
+COMMON_ASSIST_JWT_SECRET="change_me_assist_jwt_secret"
+COMMON_ASSIST_KEY="change_me_assist_key"

 ## DB versions
 ######################################
-POSTGRES_VERSION="14.5.0"
+POSTGRES_VERSION="17.2.0"
 REDIS_VERSION="6.0.12-debian-10-r33"
 MINIO_VERSION="2023.2.10-debian-11-r1"
+CLICKHOUSE_VERSION="25.1-alpine"
 ######################################

View file

@ -1,11 +0,0 @@
CH_USERNAME='default'
CH_PASSWORD=''
CLICKHOUSE_STRING='clickhouse-openreplay-clickhouse.db.svc.cluster.local:9000/default'
LICENSE_KEY=''
KAFKA_SERVERS='kafka.db.svc.cluster.local:9092'
KAFKA_USE_SSL='false'
pg_password="${COMMON_PG_PASSWORD}"
QUICKWIT_ENABLED='false'
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql:5432/postgres"
REDIS_STRING='redis://redis:6379'
ch_db='default'

View file

@ -1,15 +1,34 @@
+# vim: ft=yaml
 version: '3'

 services:
   postgresql:
     image: bitnami/postgresql:${POSTGRES_VERSION}
     container_name: postgres
     volumes:
-      - pgdata:/var/lib/postgresql/data
+      - pgdata:/bitnami/postgresql
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - postgresql.db.svc.cluster.local
     environment:
-      POSTGRESQL_PASSWORD: ${COMMON_PG_PASSWORD}
+      POSTGRESQL_PASSWORD: "${COMMON_PG_PASSWORD}"
+
+  clickhouse:
+    image: clickhouse/clickhouse-server:${CLICKHOUSE_VERSION}
+    container_name: clickhouse
+    volumes:
+      - clickhouse:/var/lib/clickhouse
+    networks:
+      openreplay-net:
+        aliases:
+          - clickhouse-openreplay-clickhouse.db.svc.cluster.local
+    environment:
+      CLICKHOUSE_USER: "default"
+      CLICKHOUSE_PASSWORD: ""
+      CLICKHOUSE_DEFAULT_ACCESS_MANAGEMENT: "1"

   redis:
     image: bitnami/redis:${REDIS_VERSION}
@ -17,7 +36,9 @@ services:
     volumes:
       - redisdata:/bitnami/redis/data
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - redis-master.db.svc.cluster.local
     environment:
       ALLOW_EMPTY_PASSWORD: "yes"
@ -27,7 +48,9 @@ services:
     volumes:
       - miniodata:/bitnami/minio/data
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - minio.db.svc.cluster.local
     ports:
       - 9001:9001
     environment:
@ -63,7 +86,7 @@ services:
     volumes:
       - ../helmcharts/openreplay/files/minio.sh:/tmp/minio.sh
     environment:
-      MINIO_HOST: http://minio:9000
+      MINIO_HOST: http://minio.db.svc.cluster.local:9000
       MINIO_ACCESS_KEY: ${COMMON_S3_KEY}
       MINIO_SECRET_KEY: ${COMMON_S3_SECRET}
     user: root
@ -80,7 +103,7 @@ services:
         bash /tmp/minio.sh init || exit 100

   db-migration:
     image: bitnami/postgresql:14.5.0
     container_name: db-migration
     profiles:
       - "migration"
@ -101,65 +124,317 @@ services:
       - /bin/bash
       - -c
       - |
-        until PGPASSWORD=${COMMON_PG_PASSWORD} psql -h postgresql -U postgres -d postgres -c '\q'; do
+        until psql -c '\q'; do
           echo "PostgreSQL is unavailable - sleeping"
           sleep 1
         done
         echo "PostgreSQL is up - executing command"
         psql -v ON_ERROR_STOP=1 -f /tmp/init_schema.sql
-  frontend-openreplay:
-    image: public.ecr.aws/p1t3u8a3/frontend:${COMMON_VERSION}
-    container_name: frontend
-    networks:
-      - openreplay-net
-    restart: unless-stopped
+  clickhouse-migration:
+    image: clickhouse/clickhouse-server:${CLICKHOUSE_VERSION}
+    container_name: clickhouse-migration
+    profiles:
+      - "migration"
+    depends_on:
+      - clickhouse
+      - minio-migration
+    networks:
+      - openreplay-net
+    volumes:
+      - ../schema/db/init_dbs/clickhouse/create/init_schema.sql:/tmp/init_schema.sql
+    environment:
+      CH_HOST: "clickhouse-openreplay-clickhouse.db.svc.cluster.local"
+      CH_PORT: "9000"
+      CH_PORT_HTTP: "8123"
+      CH_USERNAME: "default"
+      CH_PASSWORD: ""
+    entrypoint:
+      - /bin/bash
+      - -c
+      - |
+        # Wait for ClickHouse to be ready
+        until nc -z -v -w30 clickhouse-openreplay-clickhouse.db.svc.cluster.local 9000; do
+          echo "Waiting for ClickHouse server to be ready..."
+          sleep 1
+        done
+        echo "ClickHouse is up - executing command"
+        clickhouse-client -h clickhouse-openreplay-clickhouse.db.svc.cluster.local --user default --port 9000 --multiquery < /tmp/init_schema.sql || true
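Both migration jobs are gated behind the migration profile, so they only run on demand; a usage sketch with the Docker Compose v2 CLI (service and profile names as defined above):

# One-off schema setup for PostgreSQL and ClickHouse:
docker compose --profile migration up db-migration clickhouse-migration
# Then the regular stack:
docker compose up -d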

   alerts-openreplay:
     image: public.ecr.aws/p1t3u8a3/alerts:${COMMON_VERSION}
+    domainname: app.svc.cluster.local
     container_name: alerts
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - alerts-openreplay
+          - alerts-openreplay.app.svc.cluster.local
+    volumes:
+      - shared-volume:/mnt/efs
     env_file:
-      - alerts.env
+      - docker-envs/alerts.env
+    environment: {} # Fallback empty environment if env_file is missing
     restart: unless-stopped

  analytics-openreplay:
    image: public.ecr.aws/p1t3u8a3/analytics:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: analytics
    networks:
      openreplay-net:
        aliases:
          - analytics-openreplay
          - analytics-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/analytics.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  http-openreplay:
    image: public.ecr.aws/p1t3u8a3/http:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: http
    networks:
      openreplay-net:
        aliases:
          - http-openreplay
          - http-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/http.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  images-openreplay:
    image: public.ecr.aws/p1t3u8a3/images:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: images
    networks:
      openreplay-net:
        aliases:
          - images-openreplay
          - images-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/images.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  integrations-openreplay:
    image: public.ecr.aws/p1t3u8a3/integrations:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: integrations
    networks:
      openreplay-net:
        aliases:
          - integrations-openreplay
          - integrations-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/integrations.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  sink-openreplay:
    image: public.ecr.aws/p1t3u8a3/sink:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: sink
    networks:
      openreplay-net:
        aliases:
          - sink-openreplay
          - sink-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/sink.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  sourcemapreader-openreplay:
    image: public.ecr.aws/p1t3u8a3/sourcemapreader:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: sourcemapreader
    networks:
      openreplay-net:
        aliases:
          - sourcemapreader-openreplay
          - sourcemapreader-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/sourcemapreader.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  spot-openreplay:
    image: public.ecr.aws/p1t3u8a3/spot:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: spot
    networks:
      openreplay-net:
        aliases:
          - spot-openreplay
          - spot-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/spot.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  storage-openreplay:
    image: public.ecr.aws/p1t3u8a3/storage:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: storage
    networks:
      openreplay-net:
        aliases:
          - storage-openreplay
          - storage-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/storage.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

   assets-openreplay:
     image: public.ecr.aws/p1t3u8a3/assets:${COMMON_VERSION}
+    domainname: app.svc.cluster.local
     container_name: assets
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - assets-openreplay
+          - assets-openreplay.app.svc.cluster.local
+    volumes:
+      - shared-volume:/mnt/efs
     env_file:
-      - assets.env
+      - docker-envs/assets.env
+    environment: {} # Fallback empty environment if env_file is missing
     restart: unless-stopped

   assist-openreplay:
     image: public.ecr.aws/p1t3u8a3/assist:${COMMON_VERSION}
+    domainname: app.svc.cluster.local
     container_name: assist
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - assist-openreplay
+          - assist-openreplay.app.svc.cluster.local
+    volumes:
+      - shared-volume:/mnt/efs
     env_file:
-      - assist.env
+      - docker-envs/assist.env
+    environment: {} # Fallback empty environment if env_file is missing
     restart: unless-stopped

  canvases-openreplay:
    image: public.ecr.aws/p1t3u8a3/canvases:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: canvases
    networks:
      openreplay-net:
        aliases:
          - canvases-openreplay
          - canvases-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/canvases.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

  chalice-openreplay:
    image: public.ecr.aws/p1t3u8a3/chalice:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: chalice
    networks:
      openreplay-net:
        aliases:
          - chalice-openreplay
          - chalice-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/chalice.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

   db-openreplay:
     image: public.ecr.aws/p1t3u8a3/db:${COMMON_VERSION}
+    domainname: app.svc.cluster.local
     container_name: db
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - db-openreplay
+          - db-openreplay.app.svc.cluster.local
+    volumes:
+      - shared-volume:/mnt/efs
     env_file:
-      - db.env
+      - docker-envs/db.env
+    environment: {} # Fallback empty environment if env_file is missing
     restart: unless-stopped

   ender-openreplay:
     image: public.ecr.aws/p1t3u8a3/ender:${COMMON_VERSION}
+    domainname: app.svc.cluster.local
     container_name: ender
     networks:
-      - openreplay-net
+      openreplay-net:
+        aliases:
+          - ender-openreplay
+          - ender-openreplay.app.svc.cluster.local
+    volumes:
+      - shared-volume:/mnt/efs
     env_file:
-      - ender.env
+      - docker-envs/ender.env
+    environment: {} # Fallback empty environment if env_file is missing
     restart: unless-stopped

  frontend-openreplay:
    image: public.ecr.aws/p1t3u8a3/frontend:${COMMON_VERSION}
    domainname: app.svc.cluster.local
    container_name: frontend
    networks:
      openreplay-net:
        aliases:
          - frontend-openreplay
          - frontend-openreplay.app.svc.cluster.local
    volumes:
      - shared-volume:/mnt/efs
    env_file:
      - docker-envs/frontend.env
    environment: {} # Fallback empty environment if env_file is missing
    restart: unless-stopped

   heuristics-openreplay:
     image: public.ecr.aws/p1t3u8a3/heuristics:${COMMON_VERSION}
     domainname: app.svc.cluster.local
@ -167,88 +442,15 @@ services:
     networks:
       openreplay-net:
         aliases:
+          - heuristics-openreplay
           - heuristics-openreplay.app.svc.cluster.local
-    env_file:
-      - heuristics.env
-    restart: unless-stopped
-
-  imagestorage-openreplay:
-    image: public.ecr.aws/p1t3u8a3/imagestorage:${COMMON_VERSION}
-    container_name: imagestorage
-    env_file:
-      - imagestorage.env
-    networks:
-      - openreplay-net
-    restart: unless-stopped
-
-  integrations-openreplay:
-    image: public.ecr.aws/p1t3u8a3/integrations:${COMMON_VERSION}
-    container_name: integrations
-    networks:
-      - openreplay-net
-    env_file:
-      - integrations.env
-    restart: unless-stopped
-
-  peers-openreplay:
-    image: public.ecr.aws/p1t3u8a3/peers:${COMMON_VERSION}
-    container_name: peers
-    networks:
-      - openreplay-net
-    env_file:
-      - peers.env
-    restart: unless-stopped
-
-  sourcemapreader-openreplay:
-    image: public.ecr.aws/p1t3u8a3/sourcemapreader:${COMMON_VERSION}
-    container_name: sourcemapreader
-    networks:
-      - openreplay-net
-    env_file:
-      - sourcemapreader.env
-    restart: unless-stopped
-
-  http-openreplay:
-    image: public.ecr.aws/p1t3u8a3/http:${COMMON_VERSION}
-    container_name: http
-    networks:
-      - openreplay-net
-    env_file:
-      - http.env
-    restart: unless-stopped
-
-  chalice-openreplay:
-    image: public.ecr.aws/p1t3u8a3/chalice:${COMMON_VERSION}
-    container_name: chalice
     volumes:
       - shared-volume:/mnt/efs
-    networks:
-      - openreplay-net
     env_file:
-      - chalice.env
+      - docker-envs/heuristics.env
+    environment: {} # Fallback empty environment if env_file is missing
     restart: unless-stopped
-
-  sink-openreplay:
-    image: public.ecr.aws/p1t3u8a3/sink:${COMMON_VERSION}
-    container_name: sink
-    volumes:
-      - shared-volume:/mnt/efs
-    networks:
-      - openreplay-net
-    env_file:
-      - sink.env
-    restart: unless-stopped
-
-  storage-openreplay:
-    image: public.ecr.aws/p1t3u8a3/storage:${COMMON_VERSION}
-    container_name: storage
-    volumes:
-      - shared-volume:/mnt/efs
-    networks:
-      - openreplay-net
-    env_file:
-      - storage.env
-    restart: unless-stopped
   nginx-openreplay:
     image: nginx:latest
@ -280,6 +482,7 @@ services:
 volumes:
   pgdata:
+  clickhouse:
   redisdata:
   miniodata:
   shared-volume:

View file

@ -0,0 +1,27 @@
version_number="v1.22.0"
pg_host="postgresql.db.svc.cluster.local"
pg_port="5432"
pg_dbname="postgres"
ch_host="clickhouse-openreplay-clickhouse.db.svc.cluster.local"
ch_port="9000"
ch_port_http="8123"
ch_username="default"
ch_password=""
pg_user="postgres"
pg_password="${COMMON_PG_PASSWORD}"
SITE_URL="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
S3_HOST="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
S3_KEY="${COMMON_S3_KEY}"
S3_SECRET="${COMMON_S3_SECRET}"
AWS_DEFAULT_REGION="us-east-1"
EMAIL_HOST=""
EMAIL_PORT="587"
EMAIL_USER=""
EMAIL_PASSWORD=""
EMAIL_USE_TLS="true"
EMAIL_USE_SSL="false"
EMAIL_SSL_KEY=""
EMAIL_SSL_CERT=""
EMAIL_FROM="OpenReplay<do-not-reply@openreplay.com>"
LOGLEVEL="INFO"
PYTHONUNBUFFERED="0"

View file

@ -0,0 +1,11 @@
TOKEN_SECRET="secret_token_string"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
JWT_SECRET="${COMMON_JWT_SECRET}"
CH_USERNAME="default"
CH_PASSWORD=""
CLICKHOUSE_STRING="clickhouse-openreplay-clickhouse.db.svc.cluster.local:9000/"
pg_password="${COMMON_PG_PASSWORD}"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres?sslmode=disable"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
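Note the three connection-string shapes: POSTGRES_STRING is a standard libpq URI (with sslmode=disable), REDIS_STRING a redis:// URI, and CLICKHOUSE_STRING a bare host:port/database pair. A sketch of verifying the Redis endpoint, assuming redis-cli is available in the network:

    redis-cli -u "redis://redis-master.db.svc.cluster.local:6379" ping   # expect PONG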

@@ -0,0 +1,10 @@
AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}"
AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}"
BUCKET_NAME="sessions-assets"
LICENSE_KEY=""
AWS_ENDPOINT="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
AWS_REGION="us-east-1"
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
ASSETS_ORIGIN="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}/sessions-assets"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"

@@ -0,0 +1,11 @@
ASSIST_JWT_SECRET="${COMMON_ASSIST_JWT_SECRET}"
ASSIST_KEY="${COMMON_ASSIST_KEY}"
AWS_DEFAULT_REGION="us-east-1"
S3_HOST="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}:80"
S3_KEY="${COMMON_S3_KEY}"
S3_SECRET="${COMMON_S3_SECRET}"
REDIS_URL="redis-master.db.svc.cluster.local"
CLEAR_SOCKET_TIME="720"
debug="0"
redis="false"
uws="false"
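Note that debug, redis and uws are string toggles ("0"/"false"), not booleans. To confirm what the running container actually received, one option is to dump its environment; this assumes the service is named assist-openreplay in the compose file, which is not shown in this diff:

    docker compose exec assist-openreplay env | grep -E 'ASSIST|REDIS|CLEAR_SOCKET' | sort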

@@ -0,0 +1,10 @@
AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}"
AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}"
AWS_ENDPOINT="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
AWS_REGION="us-east-1"
BUCKET_NAME="mobs"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
FS_CLEAN_HRS="24"

@@ -0,0 +1,61 @@
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
KAFKA_SERVERS="kafka.db.svc.cluster.local"
ch_username="default"
ch_password=""
ch_host="clickhouse-openreplay-clickhouse.db.svc.cluster.local"
ch_port="9000"
ch_port_http="8123"
sourcemaps_reader="http://sourcemapreader-openreplay.app.svc.cluster.local:9000/%s/sourcemaps"
ASSIST_URL="http://assist-openreplay.app.svc.cluster.local:9001/assist/%s"
ASSIST_JWT_SECRET="${COMMON_ASSIST_JWT_SECRET}"
JWT_SECRET="${COMMON_JWT_SECRET}"
JWT_SPOT_SECRET="${COMMON_JWT_SPOT_SECRET}"
ASSIST_KEY="${COMMON_ASSIST_KEY}"
LICENSE_KEY=""
version_number="v1.22.0"
pg_host="postgresql.db.svc.cluster.local"
pg_port="5432"
pg_dbname="postgres"
pg_user="postgres"
pg_password="${COMMON_PG_PASSWORD}"
SITE_URL="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
S3_HOST="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
S3_KEY="${COMMON_S3_KEY}"
S3_SECRET="${COMMON_S3_SECRET}"
AWS_DEFAULT_REGION="us-east-1"
sessions_region="us-east-1"
ASSIST_RECORDS_BUCKET="records"
sessions_bucket="mobs"
IOS_VIDEO_BUCKET="mobs"
sourcemaps_bucket="sourcemaps"
js_cache_bucket="sessions-assets"
EMAIL_HOST=""
EMAIL_PORT="587"
EMAIL_USER=""
EMAIL_PASSWORD=""
EMAIL_USE_TLS="true"
EMAIL_USE_SSL="false"
EMAIL_SSL_KEY=""
EMAIL_SSL_CERT=""
EMAIL_FROM="OpenReplay<do-not-reply@openreplay.com>"
CH_COMPRESSION="false"
CLUSTER_URL="svc.cluster.local"
JWT_EXPIRATION="86400"
JWT_REFRESH_SECRET="${COMMON_JWT_REFRESH_SECRET}"
JWT_SPOT_REFRESH_SECRET="${COMMON_JWT_SPOT_REFRESH_SECRET}"
LOGLEVEL="INFO"
PYTHONUNBUFFERED="0"
SAML2_MD_URL=""
announcement_url=""
assist_secret=""
async_Token=""
captcha_key=""
captcha_server=""
iceServers=""
idp_entityId=""
idp_name=""
idp_sls_url=""
idp_sso_url=""
idp_tenantKey=""
idp_x509cert=""
jwt_algorithm="HS512"
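The values above still contain ${COMMON_*} placeholders; how they get resolved before reaching the container is not shown in this diff. If the deployment tooling does not substitute them itself, a pre-render step with envsubst (from gettext) is one option, again assuming the shared values live in a hypothetical common.env:

    set -a; . ./common.env; set +a
    envsubst < docker-envs/chalice.env > /tmp/chalice.resolved.env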

@@ -0,0 +1,11 @@
CH_USERNAME="default"
CH_PASSWORD=""
CLICKHOUSE_STRING="clickhouse-openreplay-clickhouse.db.svc.cluster.local:9000/default"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
pg_password="${COMMON_PG_PASSWORD}"
QUICKWIT_ENABLED="false"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres?sslmode=disable"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
ch_db="default"
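A sketch of checking the ClickHouse native endpoint this file points at, assuming clickhouse-client is installed somewhere on the cluster network:

    clickhouse-client --host clickhouse-openreplay-clickhouse.db.svc.cluster.local \
        --port 9000 --user default --query 'SELECT version()'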

@@ -0,0 +1,6 @@
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
pg_password="${COMMON_PG_PASSWORD}"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres?sslmode=disable"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"

@@ -0,0 +1,2 @@
TRACKER_HOST="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}/script"
HTTP_PORT="80"
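TRACKER_HOST suggests the tracker bundle is served under /script on the public domain. A hedged smoke test (the exact path handling depends on the nginx routing, which is only partially shown in this diff):

    curl -sI "${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}/script" | head -n 1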

@@ -0,0 +1,4 @@
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
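This service only needs Kafka and Redis. A sketch of listing broker metadata to confirm the Kafka endpoint, assuming kcat (formerly kafkacat) is available:

    kcat -b kafka.db.svc.cluster.local:9092 -L | head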

@@ -0,0 +1,15 @@
BUCKET_NAME="uxtesting-records"
CACHE_ASSETS="true"
AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}"
AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}"
AWS_REGION="us-east-1"
AWS_ENDPOINT="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
pg_password="${COMMON_PG_PASSWORD}"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres?sslmode=disable"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
JWT_SECRET="${COMMON_JWT_SECRET}"
JWT_SPOT_SECRET="${COMMON_JWT_SPOT_SECRET}"
TOKEN_SECRET="${COMMON_TOKEN_SECRET}"

@@ -0,0 +1,10 @@
AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}"
AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}"
AWS_ENDPOINT="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
AWS_REGION="us-east-1"
BUCKET_NAME="mobs"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
FS_CLEAN_HRS="24"

@@ -0,0 +1,13 @@
AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}"
AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}"
AWS_ENDPOINT="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
AWS_REGION="us-east-1"
BUCKET_NAME="mobs"
JWT_SECRET="${COMMON_JWT_SECRET}"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
pg_password="${COMMON_PG_PASSWORD}"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres?sslmode=disable"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
TOKEN_SECRET="secret_token_string"

@@ -0,0 +1,5 @@
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
ASSETS_ORIGIN="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}/sessions-assets"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"

@@ -0,0 +1,11 @@
SMR_HOST="0.0.0.0"
S3_HOST="http://minio.db.svc.cluster.local:9000"
S3_KEY="${COMMON_S3_KEY}"
S3_SECRET="${COMMON_S3_SECRET}"
AWS_REGION="us-east-1"
LICENSE_KEY=""
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres"
ASSETS_ORIGIN="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}/sessions-assets"
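Unlike most files here, S3_HOST points straight at the in-cluster MinIO rather than the public domain. A sketch of listing buckets through that endpoint with the AWS CLI, assuming the COMMON_S3_* credentials are exported:

    AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}" AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}" \
        aws --endpoint-url http://minio.db.svc.cluster.local:9000 s3 ls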

@@ -0,0 +1,16 @@
CACHE_ASSETS="true"
FS_CLEAN_HRS="24"
TOKEN_SECRET="secret_token_string"
AWS_ACCESS_KEY_ID="${COMMON_S3_KEY}"
AWS_SECRET_ACCESS_KEY="${COMMON_S3_SECRET}"
BUCKET_NAME="spots"
AWS_REGION="us-east-1"
AWS_ENDPOINT="${COMMON_PROTOCOL}://${COMMON_DOMAIN_NAME}"
LICENSE_KEY=""
KAFKA_SERVERS="kafka.db.svc.cluster.local:9092"
KAFKA_USE_SSL="false"
JWT_SECRET="${COMMON_JWT_SECRET}"
JWT_SPOT_SECRET="${COMMON_JWT_SPOT_SECRET}"
pg_password="${COMMON_PG_PASSWORD}"
POSTGRES_STRING="postgres://postgres:${COMMON_PG_PASSWORD}@postgresql.db.svc.cluster.local:5432/postgres?sslmode=disable"
REDIS_STRING="redis://redis-master.db.svc.cluster.local:6379"

Some files were not shown because too many files have changed in this diff.