Compare commits


95 commits

Author SHA1 Message Date
nick-delirium
c59114188c
ui: fix audioplayer start point 2025-06-04 10:57:08 +02:00
rjshrjndrn
6457e383bb feat(helm): add configurable assets origin
Add a helper template to allow customizing the assets origin URL.
This gives users the ability to override the default S3 endpoint
construction when needed, while maintaining backward compatibility.
This can be used when proxying the bucket through something like CloudFront or
a custom domain.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-11-19 16:42:49 +01:00
rjshrjndrn
c22898bce2 fix(helm): variable value
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-11-19 16:41:49 +01:00
rjshrjndrn
9807e066a7 chore(helm): Adding a secret with the DB secrets
All the DB jobs use the secret from this.

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2025-02-06 11:31:09 +01:00
Kraiem Taha Yassine
4d8947c805
fix(DB): fixed version (#2763) 2024-11-19 16:41:49 +01:00
rjshrjndrn
3a39ca8f4e fix(helm): version change check
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-11-18 12:31:46 +01:00
Mehdi Osman
86f0baa30a
Increment http chart version (#2749)
Co-authored-by: GitHub Action <action@github.com>
2024-11-15 11:53:18 +01:00
Alexander
b2cb874a2a
Save the last batch (#2748)
* feat(backend): fix to save the latest message tracker just after the token has been expired

* feat(http): return 401 even after successfully saved batch for JustExpired case
2024-11-15 11:49:37 +01:00
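For context on what the "JustExpired" case means here, below is a minimal, hedged Go sketch of the intended flow, distilled from the http router and token changes in the diff further down this page; parseToken and saveBatch are illustrative stand-ins, not the actual OpenReplay APIs. In short: a batch carried by a token that expired moments ago is still persisted, but the tracker is told to re-authorize.

package main

import (
	"errors"
	"fmt"
	"net/http"
)

// Sentinel errors mirroring the tokenizer change shown later on this page.
var (
	ErrExpired     = errors.New("token expired")
	ErrJustExpired = errors.New("token just expired") // expired less than ~30s ago
)

// handleBatch sketches the intended flow: a batch arriving with a token that
// just expired is still saved, but the client still gets a 401 so it refreshes.
func handleBatch(parseToken func() error, saveBatch func() error) int {
	err := parseToken()
	justExpired := false
	switch {
	case err == nil:
		// token still valid
	case errors.Is(err, ErrJustExpired):
		justExpired = true // keep the batch, report 401 afterwards
	default:
		return http.StatusUnauthorized // hard-expired or invalid: reject immediately
	}

	if err := saveBatch(); err != nil {
		return http.StatusInternalServerError
	}
	if justExpired {
		return http.StatusUnauthorized // batch saved, but the tracker must re-authorize
	}
	return http.StatusOK
}

func main() {
	ok := func() error { return nil }
	status := handleBatch(func() error { return ErrJustExpired }, ok)
	fmt.Println(status) // 401, even though the batch was saved
}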
rjshrjndrn
fdc281a406 chore(helm): Adding an openreplay config map for
installation-agnostic version access. This is useful for DB migration,
especially when we install using Argo or other means; it takes
precedence over the autogenerated previous version.
Set migration to true if it is an Argo deployment.
Fix the forceMigration override.
2024-11-06 14:54:44 +01:00
rjshrjndrn
7f5c342a64 fix(cli): error log
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-11-06 14:55:07 +01:00
Mehdi Osman
e31f8d0ab2
Increment chalice chart version (#2716)
Co-authored-by: GitHub Action <action@github.com>
2024-10-30 17:41:34 +01:00
Kraiem Taha Yassine
9ce67efb26
fix(chalice): fixed SSO (#2715) 2024-10-30 17:37:54 +01:00
Mehdi Osman
56ba1770f7
Increment chalice chart version (#2711)
Co-authored-by: GitHub Action <action@github.com>
2024-10-29 17:36:07 +01:00
Kraiem Taha Yassine
8a99bcf7c8
fix(chalice): fixed heatmap empty value (#2710) 2024-10-29 17:30:01 +01:00
Mehdi Osman
a5c236d648
Increment chalice chart version (#2709)
Co-authored-by: GitHub Action <action@github.com>
2024-10-29 17:12:38 +01:00
Kraiem Taha Yassine
16656d5618
Patch/api v1.20.0 (#2708)
* fix(chalice): heatmap support operators

* fix(chalice): heatmap click-rage
2024-10-29 17:08:59 +01:00
Mehdi Osman
ef482d35a8
Increment chalice chart version (#2707)
Co-authored-by: GitHub Action <action@github.com>
2024-10-29 16:08:22 +01:00
Kraiem Taha Yassine
ca4c568883
fix(chalice): heatmap support operators (#2706) 2024-10-29 16:05:20 +01:00
Mehdi Osman
8cecd5f4d5
Increment chalice chart version (#2705)
Co-authored-by: GitHub Action <action@github.com>
2024-10-29 15:24:18 +01:00
Kraiem Taha Yassine
e729a6adac
fix(chalice): heatmap support operators (#2704) 2024-10-29 15:20:34 +01:00
Mehdi Osman
da083dd277
Increment chalice chart version (#2703)
Co-authored-by: GitHub Action <action@github.com>
2024-10-29 14:47:03 +01:00
Kraiem Taha Yassine
5f144636e6
fix(chalice): heatmap support operators (#2702) 2024-10-29 14:40:59 +01:00
Mehdi Osman
97a6cf9a52
Increment frontend chart version (#2701)
Co-authored-by: GitHub Action <action@github.com>
2024-10-29 12:02:52 +01:00
Delirium
0cec551fa0
ui: add operator to clickmap card (#2700) 2024-10-29 11:48:38 +01:00
Mehdi Osman
0c66686e15
Increment db chart version (#2699)
Co-authored-by: GitHub Action <action@github.com>
2024-10-28 18:49:58 +01:00
Alexander
e84bdb5eef
Patch/patch ch clicks with url (#2698)
* added url and url_path to click events

* added app_crash support

* fixed an url in click event
2024-10-28 18:47:31 +01:00
Alexander
d9fe2b5bb8
added url and url_path to click events (#2697)
* added url and url_path to click events

* added app_crash support
2024-10-28 18:36:22 +01:00
Mehdi Osman
9940316ce3
Increment chalice chart version (#2696)
Co-authored-by: GitHub Action <action@github.com>
2024-10-28 16:51:02 +01:00
Kraiem Taha Yassine
9eeaeaf4eb
fix(chalice): heatmap handles null replay (#2695)
refactor(chalice): heatmap use path only
2024-10-28 16:47:51 +01:00
Mehdi Osman
77e6f9aa03
Increment db chart version (#2694)
Co-authored-by: GitHub Action <action@github.com>
2024-10-28 16:42:30 +01:00
Alexander
90d13c69ab
Revert "feat(clickhouse): added host/url to click events (#2680)" (#2693)
This reverts commit a56b94ba92.
2024-10-28 16:38:56 +01:00
rjshrjndrn
396aefaf90 fix(cli): cleanup
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-10-28 14:25:09 +01:00
rjshrjndrn
201b74350c fix(cli): cleanup resolve anchors
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-10-28 14:10:00 +01:00
Mehdi Osman
03c3dc4d6f
Updated patch build from main e2556ea76e (#2692)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-10-28 12:13:49 +01:00
Kraiem Taha Yassine
e2556ea76e
fix(chalice): heatmap handles empty/null url (#2691) 2024-10-28 12:08:17 +01:00
Mehdi Osman
b353c818c3
Increment chalice chart version (#2686)
Co-authored-by: GitHub Action <action@github.com>
2024-10-25 16:44:02 +02:00
Kraiem Taha Yassine
6802ddcd93
fix(chalice): fixed permissions for EE SA (#2685) 2024-10-25 16:35:12 +02:00
Kraiem Taha Yassine
63f8b176f6
fix(chalice): fixed heatmap for EE (#2684) 2024-10-25 15:33:59 +02:00
Mehdi Osman
ca5c3fa836
Increment frontend chart version (#2683)
Co-authored-by: GitHub Action <action@github.com>
2024-10-25 14:51:55 +02:00
Delirium
ffedeb4910
ui: use additional param for heatmaps url (#2682) 2024-10-25 14:45:46 +02:00
Mehdi Osman
1c7ce95a3c
Increment db chart version (#2681)
Co-authored-by: GitHub Action <action@github.com>
2024-10-25 14:29:03 +02:00
Alexander
a56b94ba92
feat(clickhouse): added host/url to click events (#2680) 2024-10-25 14:25:20 +02:00
Mehdi Osman
cbafc09bf7
Increment chalice chart version (#2678)
Co-authored-by: GitHub Action <action@github.com>
2024-10-24 19:19:27 +02:00
Kraiem Taha Yassine
9e89c661c5
refactor(chalice): changed heatmaps (#2677) 2024-10-24 19:05:56 +02:00
Mehdi Osman
96e3db1450
Increment chalice chart version (#2676)
Co-authored-by: GitHub Action <action@github.com>
2024-10-24 19:03:30 +02:00
Kraiem Taha Yassine
4401cf930f
refactor(chalice): changed heatmaps (#2675) 2024-10-24 18:55:25 +02:00
Mehdi Osman
8c5a5e165e
Increment chalice chart version (#2674)
Co-authored-by: GitHub Action <action@github.com>
2024-10-23 18:25:26 +02:00
Kraiem Taha Yassine
ccc407137c
fix(chalice): fixed tenant_key for SSO (#2673) 2024-10-23 18:10:16 +02:00
dependabot[bot]
b7bd14a3aa
chore(deps): bump the npm_and_yarn group across 4 directories with 3 updates (#2662)
Bumps the npm_and_yarn group with 3 updates in the /assist directory: [cookie](https://github.com/jshttp/cookie), [express](https://github.com/expressjs/express) and [socket.io](https://github.com/socketio/socket.io).
Bumps the npm_and_yarn group with 3 updates in the /ee/assist directory: [cookie](https://github.com/jshttp/cookie), [express](https://github.com/expressjs/express) and [socket.io](https://github.com/socketio/socket.io).
Bumps the npm_and_yarn group with 2 updates in the /peers directory: [cookie](https://github.com/jshttp/cookie) and [express](https://github.com/expressjs/express).
Bumps the npm_and_yarn group with 2 updates in the /sourcemapreader directory: [cookie](https://github.com/jshttp/cookie) and [express](https://github.com/expressjs/express).


Updates `cookie` from 0.4.2 to 0.7.1
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v0.4.2...v0.7.1)

Updates `express` from 4.21.0 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.0...4.21.1)

Updates `socket.io` from 4.7.5 to 4.8.0
- [Release notes](https://github.com/socketio/socket.io/releases)
- [Changelog](https://github.com/socketio/socket.io/blob/main/CHANGELOG.md)
- [Commits](https://github.com/socketio/socket.io/compare/socket.io@4.7.5...socket.io@4.8.0)

Updates `cookie` from 0.4.2 to 0.7.1
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v0.4.2...v0.7.1)

Updates `express` from 4.21.0 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.0...4.21.1)

Updates `socket.io` from 4.7.5 to 4.8.0
- [Release notes](https://github.com/socketio/socket.io/releases)
- [Changelog](https://github.com/socketio/socket.io/blob/main/CHANGELOG.md)
- [Commits](https://github.com/socketio/socket.io/compare/socket.io@4.7.5...socket.io@4.8.0)

Updates `cookie` from 0.6.0 to 0.7.1
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v0.4.2...v0.7.1)

Updates `express` from 4.21.0 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.0...4.21.1)

Updates `cookie` from 0.6.0 to 0.7.1
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v0.4.2...v0.7.1)

Updates `express` from 4.21.0 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.0...4.21.1)

---
updated-dependencies:
- dependency-name: cookie
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: express
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: socket.io
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: cookie
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: express
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: socket.io
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: cookie
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: express
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: cookie
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: express
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-22 10:23:13 +02:00
Mehdi Osman
49dd17ebe6
Increment frontend chart version (#2671)
Co-authored-by: GitHub Action <action@github.com>
2024-10-21 16:32:46 +02:00
Delirium
57b3044800
ui: fix cursor position (#2670) 2024-10-21 16:26:37 +02:00
dependabot[bot]
72325c6991
chore(deps): bump the npm_and_yarn group across 4 directories with 12 updates (#2657)
Bumps the npm_and_yarn group with 6 updates in the /frontend directory:

| Package | From | To |
| --- | --- | --- |
| [postcss](https://github.com/postcss/postcss) | `8.4.38` | `8.4.39` |
| [webpack](https://github.com/webpack/webpack) | `5.92.1` | `5.94.0` |
| [dompurify](https://github.com/cure53/DOMPurify) | `2.5.0` | `2.5.7` |
| [elliptic](https://github.com/indutny/elliptic) | `6.5.5` | `6.5.7` |
| [express](https://github.com/expressjs/express) | `4.19.2` | `4.21.1` |
| [fast-xml-parser](https://github.com/NaturalIntelligence/fast-xml-parser) | `4.3.6` | `4.5.0` |

Bumps the npm_and_yarn group with 3 updates in the /spot directory: [postcss](https://github.com/postcss/postcss), [rollup](https://github.com/rollup/rollup) and [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite).
Bumps the npm_and_yarn group with 1 update in the /tracker/tracker-axios directory: [axios](https://github.com/axios/axios).
Bumps the npm_and_yarn group with 1 update in the /tracker/tracker-testing-playground directory: [axios](https://github.com/axios/axios).


Updates `postcss` from 8.4.38 to 8.4.39
- [Release notes](https://github.com/postcss/postcss/releases)
- [Changelog](https://github.com/postcss/postcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/postcss/postcss/compare/8.4.38...8.4.39)

Updates `webpack` from 5.92.1 to 5.94.0
- [Release notes](https://github.com/webpack/webpack/releases)
- [Commits](https://github.com/webpack/webpack/compare/v5.92.1...v5.94.0)

Updates `dompurify` from 2.5.0 to 2.5.7
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/2.5.0...2.5.7)

Updates `elliptic` from 6.5.5 to 6.5.7
- [Commits](https://github.com/indutny/elliptic/compare/v6.5.5...v6.5.7)

Updates `express` from 4.19.2 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.19.2...4.21.1)

Updates `fast-xml-parser` from 4.3.6 to 4.5.0
- [Release notes](https://github.com/NaturalIntelligence/fast-xml-parser/releases)
- [Changelog](https://github.com/NaturalIntelligence/fast-xml-parser/blob/master/CHANGELOG.md)
- [Commits](https://github.com/NaturalIntelligence/fast-xml-parser/compare/v4.3.6...v4.5.0)

Updates `path-to-regexp` from 0.1.7 to 0.1.10
- [Release notes](https://github.com/pillarjs/path-to-regexp/releases)
- [Changelog](https://github.com/pillarjs/path-to-regexp/blob/master/History.md)
- [Commits](https://github.com/pillarjs/path-to-regexp/compare/v0.1.7...v0.1.10)

Updates `send` from 0.18.0 to 0.19.0
- [Release notes](https://github.com/pillarjs/send/releases)
- [Changelog](https://github.com/pillarjs/send/blob/master/HISTORY.md)
- [Commits](https://github.com/pillarjs/send/compare/0.18.0...0.19.0)

Updates `serve-static` from 1.15.0 to 1.16.2
- [Release notes](https://github.com/expressjs/serve-static/releases)
- [Changelog](https://github.com/expressjs/serve-static/blob/v1.16.2/HISTORY.md)
- [Commits](https://github.com/expressjs/serve-static/compare/v1.15.0...v1.16.2)

Updates `postcss` from 8.4.41 to 8.4.47
- [Release notes](https://github.com/postcss/postcss/releases)
- [Changelog](https://github.com/postcss/postcss/blob/main/CHANGELOG.md)
- [Commits](https://github.com/postcss/postcss/compare/8.4.38...8.4.39)

Updates `rollup` from 4.21.0 to 4.24.0
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.21.0...v4.24.0)

Updates `vite` from 5.4.2 to 5.4.9
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/v5.4.9/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v5.4.9/packages/vite)

Updates `axios` from 0.26.1 to 1.7.7
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v0.26.1...v1.7.7)

Updates `axios` from 0.27.2 to 1.7.7
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v0.26.1...v1.7.7)

---
updated-dependencies:
- dependency-name: postcss
  dependency-type: direct:development
  dependency-group: npm_and_yarn
- dependency-name: webpack
  dependency-type: direct:development
  dependency-group: npm_and_yarn
- dependency-name: dompurify
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: elliptic
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: express
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: fast-xml-parser
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: path-to-regexp
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: send
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: serve-static
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: postcss
  dependency-type: direct:production
  dependency-group: npm_and_yarn
- dependency-name: rollup
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: vite
  dependency-type: indirect
  dependency-group: npm_and_yarn
- dependency-name: axios
  dependency-type: direct:development
  dependency-group: npm_and_yarn
- dependency-name: axios
  dependency-type: direct:production
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-18 13:51:43 +02:00
Mehdi Osman
a7adf4ad54
Increment frontend chart version (#2656)
Co-authored-by: GitHub Action <action@github.com>
2024-10-15 16:36:36 +02:00
Shekar Siri
54abbe58a2
change(ui): sentry dep update (#2655) 2024-10-15 16:25:53 +02:00
Mehdi Osman
b43a35e458
Increment frontend chart version (#2646)
Co-authored-by: GitHub Action <action@github.com>
2024-10-10 14:28:25 +02:00
Delirium
28a9b53d05
port tracker-14 fixes to latest (#2645) 2024-10-10 14:21:56 +02:00
Mehdi Osman
111e9c6474
Increment chalice chart version (#2642)
Co-authored-by: GitHub Action <action@github.com>
2024-10-08 15:54:17 +02:00
Kraiem Taha Yassine
f8d8cc5150
fix(chalice): use existing user attributes for SSO if they are missing in the list of claims (#2641) 2024-10-08 15:31:14 +02:00
Mehdi Osman
aa25b0e882
Increment frontend chart version (#2639)
Co-authored-by: GitHub Action <action@github.com>
2024-10-07 16:58:34 +02:00
Delirium
b53b14ae5f
rm console line (#2637) 2024-10-07 16:45:17 +02:00
Delirium
e3f6a8fadc
ui: fix audioplayer time comp (#2636) 2024-10-07 16:43:00 +02:00
Chris Weaver
e95611c1a6
fix #2360 Check ping or Wget to confirm Github is up in job.yaml (#2631) 2024-10-03 16:39:57 +02:00
Mehdi Osman
46aebe9a8c
Updated patch build from main e9a9d2ff2a (#2619)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-27 15:10:07 +02:00
Kraiem Taha Yassine
e9a9d2ff2a
Patch/api v1.20.0 (#2618)
* chore(actions): show patch diff

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>

* fix(chalice): fixed session's search ignore injected durations

---------

Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
Co-authored-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-27 14:58:36 +02:00
rjshrjndrn
1f7d587796 chore(actions): show patch diff
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-27 10:49:31 +02:00
Mehdi Osman
7c20b608c5
Increment frontend chart version (#2615) 2024-09-26 14:39:11 -04:00
Mehdi Osman
88a82acb8b
Update .env.sample 2024-09-26 12:37:28 -04:00
rjshrjndrn
36c9b5e234 chore(actions): git clone should be from the specific tag for submodule
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-26 10:20:59 +02:00
Mehdi Osman
4cfdee28c3
Updated patch build from main 62ef3ca2dd (#2611)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-25 17:30:24 +02:00
Kraiem Taha Yassine
62ef3ca2dd
Patch/api v1.20.0 (#2610)
* fix(chalice): remove null referrer from table of referrers

* fix(chalice): fixed add MSTeams integration with wrong URL

* fix(chalice): session's search ignore injected durations
2024-09-25 17:25:18 +02:00
Mehdi Osman
9d0f3b34ae
Increment frontend chart version (#2609)
Co-authored-by: GitHub Action <action@github.com>
2024-09-25 16:16:20 +02:00
Delirium
93c605a28e
UI path evs cons (#2608)
* ui: support payload for events search

* ui: assist console size and init fixes
2024-09-25 16:11:03 +02:00
rjshrjndrn
872263624d chore(build): Support for multi arch
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-24 16:47:54 +02:00
Mehdi Osman
1dee5853a5
Increment frontend chart version (#2607)
Co-authored-by: GitHub Action <action@github.com>
2024-09-24 12:16:21 +02:00
Delirium
5cf584e8e1
UI patch 1337 (#2606)
* ui: debugging audio

* ui: debugging audio pt2

* ui: remove select-none from console rows

* ui: fix audioplayer file length calculation and checks
2024-09-24 12:12:50 +02:00
rjshrjndrn
cfc1f807ec chore(cli): proper cleanup
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-20 19:03:52 +02:00
Mehdi Osman
de19f0397d
Increment frontend chart version (#2599)
Co-authored-by: GitHub Action <action@github.com>
2024-09-20 17:12:18 +02:00
Delirium
a11c683baf
fix ui: prevent audioplayer from looping after playing once unless scrolled backwards (#2598) 2024-09-20 16:47:47 +02:00
rjshrjndrn
f5949cc08e chore(helm): check github availability before clone
Signed-off-by: rjshrjndrn <rjshrjndrn@gmail.com>
2024-09-20 15:30:59 +02:00
Sudheer Salavadi
d7cb49d490
New Git hero 2024-09-19 19:20:05 +05:30
Mehdi Osman
6e5d92ed79
Increment chalice chart version (#2596)
Co-authored-by: GitHub Action <action@github.com>
2024-09-17 20:06:18 +02:00
Kraiem Taha Yassine
018bf9c0be
fix(chalice): fixed spot refresh logic for EE (#2595) 2024-09-17 20:03:44 +02:00
Mehdi Osman
c56a2c2d25
Increment chalice chart version (#2594)
Co-authored-by: GitHub Action <action@github.com>
2024-09-17 12:46:22 +02:00
Kraiem Taha Yassine
5d786bde56
fix(chalice): fixed issues-tracking error handler (#2593) 2024-09-17 12:42:34 +02:00
Mehdi Osman
c7e6f31941
Updated patch build from main ad0ef00842 (#2591)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-16 16:36:39 +02:00
Kraiem Taha Yassine
ad0ef00842
fix(alerts): fixed missing dependency for EE (#2590)
fix(crons): fixed missing dependency for EE
2024-09-16 16:24:23 +02:00
Kraiem Taha Yassine
2ffec26d02
fix(chalice): fixed wrong default logging level (#2589) 2024-09-16 16:11:12 +02:00
Mehdi Osman
b63962b51a
Increment frontend chart version (#2588)
Co-authored-by: GitHub Action <action@github.com>
2024-09-16 16:05:36 +02:00
Delirium
abe440f729
fix ui: revert spots check (#2587) 2024-09-16 15:59:25 +02:00
Mehdi Osman
71e7552899
Updated patch build from main 7906384fe7 (#2586)
* Increment chalice chart version

* Increment alerts chart version

---------

Co-authored-by: GitHub Action <action@github.com>
2024-09-16 14:10:07 +02:00
Kraiem Taha Yassine
7906384fe7
Patch/api v1.20.0 (#2585)
* fix(chalice): fixed top fetchUrl values for EE-exp
* fix(alerts): fixed missing logger
* fix(chalice): JIRA integration support expired credentials
2024-09-16 13:45:51 +02:00
Mehdi Osman
bdd564f49c
Increment spot chart version (#2579)
Co-authored-by: GitHub Action <action@github.com>
2024-09-14 12:24:00 +05:30
Mehdi Osman
b89248067a
Increment frontend chart version (#2578)
Co-authored-by: GitHub Action <action@github.com>
2024-09-13 12:16:54 -04:00
Delirium
9ed207abb1
Dev (#2577)
* ui: use enum state for spot ready checker

* ui: force worker for hls

* ui: fix spot list header behavior, change spot login flow?

* ui: bump spot v

* ui: spot signup fixes
2024-09-13 18:13:15 +02:00
Mehdi Osman
cbe2d62def
Increment frontend chart version (#2576)
Co-authored-by: GitHub Action <action@github.com>
2024-09-13 12:03:40 -04:00
88 changed files with 2206 additions and 1311 deletions


@@ -83,8 +83,12 @@ jobs:
             [ -d $MSAAS_REPO_FOLDER ] || {
               git clone -b dev --recursive https://x-access-token:$MSAAS_REPO_CLONE_TOKEN@$MSAAS_REPO_URL $MSAAS_REPO_FOLDER
               cd $MSAAS_REPO_FOLDER
+              cd openreplay && git fetch origin && git checkout main # This have to be changed to specific tag
+              git log -1
+              cd $MSAAS_REPO_FOLDER
               bash git-init.sh
               git checkout
+              git --git-dir=./openreplay/.git status
             }
           }
           function build_managed() {
@@ -97,7 +101,7 @@ jobs:
           else
             cd $MSAAS_REPO_FOLDER/openreplay/$service
           fi
-          IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash build.sh >> /tmp/arm.txt
+          IMAGE_TAG=$version DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=arm64 DOCKER_REPO=$DOCKER_REPO_ARM PUSH_IMAGE=0 bash -x build.sh >> /tmp/arm.txt
         }
         # Checking for backend images
         ls backend/cmd >> /tmp/backend.txt


@@ -12,6 +12,8 @@ from chalicelib.core.collaboration_slack import Slack
 from chalicelib.utils import pg_client, helper, email_helper, smtp
 from chalicelib.utils.TimeUTC import TimeUTC
 
+logger = logging.getLogger(__name__)
+
 
 def get(id):
     with pg_client.PostgresClient() as cur:


@@ -26,17 +26,23 @@ class MSTeams(BaseCollaboration):
     @classmethod
     def say_hello(cls, url):
-        r = requests.post(
-            url=url,
-            json={
-                "@type": "MessageCard",
-                "@context": "https://schema.org/extensions",
-                "summary": "Welcome to OpenReplay",
-                "title": "Welcome to OpenReplay"
-            })
-        if r.status_code != 200:
-            logger.warning("MSTeams integration failed")
-            logger.warning(r.text)
+        try:
+            r = requests.post(
+                url=url,
+                json={
+                    "@type": "MessageCard",
+                    "@context": "https://schema.org/extensions",
+                    "summary": "Welcome to OpenReplay",
+                    "title": "Welcome to OpenReplay"
+                },
+                timeout=3)
+            if r.status_code != 200:
+                logger.warning("MSTeams integration failed")
+                logger.warning(r.text)
+                return False
+        except Exception as e:
+            logger.warning("!!! MSTeams integration failed")
+            logger.exception(e)
             return False
         return True


@@ -1,28 +1,34 @@
 import logging
 
 import schemas
-from chalicelib.core import sessions_mobs, sessions, events
+from chalicelib.core import sessions_mobs, sessions
 from chalicelib.utils import pg_client, helper
-# from chalicelib.utils import sql_helper as sh
+from chalicelib.utils import sql_helper as sh
 
 logger = logging.getLogger(__name__)
 
 
 def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
+    if data.url is None or data.url == "":
+        return []
     args = {"startDate": data.startTimestamp, "endDate": data.endTimestamp,
             "project_id": project_id, "url": data.url}
     constraints = ["sessions.project_id = %(project_id)s",
-                   "(url = %(url)s OR path= %(url)s)",
                    "clicks.timestamp >= %(startDate)s",
                    "clicks.timestamp <= %(endDate)s",
                    "start_ts >= %(startDate)s",
                    "start_ts <= %(endDate)s",
                    "duration IS NOT NULL",
                    "normalized_x IS NOT NULL"]
+    if data.operator == schemas.SearchEventOperator.IS:
+        constraints.append("path= %(url)s")
+    else:
+        constraints.append("path ILIKE %(url)s")
+    args["url"] = helper.values_for_operator(data.url, data.operator)
     query_from = "events.clicks INNER JOIN sessions USING (session_id)"
-    has_click_rage_filter = False
     # TODO: is this used ?
+    # has_click_rage_filter = False
     # if len(data.filters) > 0:
     #     for i, f in enumerate(data.filters):
     #         if f.type == schemas.FilterType.issue and len(f.value) > 0:
@@ -49,15 +55,15 @@ def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
     #                                                        f.value, value_key=f_k))
     #             constraints.append(sh.multi_conditions(f"mis.type = %({f_k})s",
     #                                                    f.value, value_key=f_k))
-
-    if data.click_rage and not has_click_rage_filter:
-        constraints.append("""(issues.session_id IS NULL
-                                OR (issues.timestamp >= %(startDate)s
-                                    AND issues.timestamp <= %(endDate)s
-                                    AND mis.project_id = %(project_id)s
-                                    AND mis.type='click_rage'))""")
-        query_from += """LEFT JOIN events_common.issues USING (timestamp, session_id)
-                         LEFT JOIN issues AS mis USING (issue_id)"""
+    # TODO: change this once click-rage is fixed
+    # if data.click_rage and not has_click_rage_filter:
+    #     constraints.append("""(issues.session_id IS NULL
+    #                             OR (issues.timestamp >= %(startDate)s
+    #                                 AND issues.timestamp <= %(endDate)s
+    #                                 AND mis.project_id = %(project_id)s
+    #                                 AND mis.type='click_rage'))""")
+    #     query_from += """LEFT JOIN events_common.issues USING (timestamp, session_id)
+    #                      LEFT JOIN issues AS mis USING (issue_id)"""
     with pg_client.PostgresClient() as cur:
         query = cur.mogrify(f"""SELECT normalized_x, normalized_y
                                 FROM {query_from}
@@ -83,8 +89,13 @@ def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
 def get_x_y_by_url_and_session_id(project_id, session_id, data: schemas.GetHeatMapPayloadSchema):
     args = {"session_id": session_id, "url": data.url}
     constraints = ["session_id = %(session_id)s",
-                   "(url = %(url)s OR path= %(url)s)",
                    "normalized_x IS NOT NULL"]
+    if data.operator == schemas.SearchEventOperator.IS:
+        constraints.append("path= %(url)s")
+    else:
+        constraints.append("path ILIKE %(url)s")
+    args["url"] = helper.values_for_operator(data.url, data.operator)
     query_from = "events.clicks"
 
     with pg_client.PostgresClient() as cur:
@@ -110,8 +121,13 @@ def get_x_y_by_url_and_session_id(project_id, session_id, data: schemas.GetHeatM
 def get_selectors_by_url_and_session_id(project_id, session_id, data: schemas.GetHeatMapPayloadSchema):
     args = {"session_id": session_id, "url": data.url}
-    constraints = ["session_id = %(session_id)s",
-                   "(url = %(url)s OR path= %(url)s)"]
+    constraints = ["session_id = %(session_id)s"]
+    if data.operator == schemas.SearchEventOperator.IS:
+        constraints.append("path= %(url)s")
+    else:
+        constraints.append("path ILIKE %(url)s")
+    args["url"] = helper.values_for_operator(data.url, data.operator)
     query_from = "events.clicks"
 
     with pg_client.PostgresClient() as cur:
@@ -143,29 +159,93 @@ s.start_ts,
 s.duration"""
 
 
+def __get_1_url(location_condition: schemas.SessionSearchEventSchema2 | None, session_id: str, project_id: int,
+                start_time: int,
+                end_time: int) -> str | None:
+    full_args = {
+        "sessionId": session_id,
+        "projectId": project_id,
+        "start_time": start_time,
+        "end_time": end_time,
+    }
+    sub_condition = ["session_id = %(sessionId)s"]
+    if location_condition and len(location_condition.value) > 0:
+        f_k = "LOC"
+        op = sh.get_sql_operator(location_condition.operator)
+        full_args = {**full_args, **sh.multi_values(location_condition.value, value_key=f_k)}
+        sub_condition.append(
+            sh.multi_conditions(f'path {op} %({f_k})s', location_condition.value, is_not=False,
+                                value_key=f_k))
+    with pg_client.PostgresClient() as cur:
+        main_query = cur.mogrify(f"""WITH paths AS (SELECT DISTINCT path
+                                                    FROM events.clicks
+                                                    WHERE {" AND ".join(sub_condition)})
+                                     SELECT path, COUNT(1) AS count
+                                     FROM events.clicks
+                                              INNER JOIN public.sessions USING (session_id)
+                                              INNER JOIN paths USING (path)
+                                     WHERE sessions.project_id = %(projectId)s
+                                       AND clicks.timestamp >= %(start_time)s
+                                       AND clicks.timestamp <= %(end_time)s
+                                       AND start_ts >= %(start_time)s
+                                       AND start_ts <= %(end_time)s
+                                       AND duration IS NOT NULL
+                                     GROUP BY path
+                                     ORDER BY count DESC
+                                     LIMIT 1;""", full_args)
+        logger.debug("--------------------")
+        logger.debug(main_query)
+        logger.debug("--------------------")
+        try:
+            cur.execute(main_query)
+        except Exception as err:
+            logger.warning("--------- CLICK MAP BEST URL SEARCH QUERY EXCEPTION -----------")
+            logger.warning(main_query.decode('UTF-8'))
+            logger.warning("--------- PAYLOAD -----------")
+            logger.warning(full_args)
+            logger.warning("--------------------")
+            raise err
+        url = cur.fetchone()
+    if url is None:
+        return None
+    return url["path"]
+
+
 def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_id,
                          include_mobs: bool = True, exclude_sessions: list[str] = [],
                          _depth: int = 3):
     no_platform = True
-    no_location = True
+    location_condition = None
+    no_click = True
     for f in data.filters:
         if f.type == schemas.FilterType.PLATFORM:
             no_platform = False
             break
     for f in data.events:
         if f.type == schemas.EventType.LOCATION:
-            no_location = False
             if len(f.value) == 0:
                 f.operator = schemas.SearchEventOperator.IS_ANY
+            location_condition = f.model_copy()
+        elif f.type == schemas.EventType.CLICK:
+            no_click = False
+            if len(f.value) == 0:
+                f.operator = schemas.SearchEventOperator.IS_ANY
+        if location_condition and not no_click:
             break
     if no_platform:
         data.filters.append(schemas.SessionSearchFilterSchema(type=schemas.FilterType.PLATFORM,
                                                               value=[schemas.PlatformType.DESKTOP],
                                                               operator=schemas.SearchEventOperator.IS))
-    if no_location:
+    if not location_condition:
         data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.LOCATION,
                                                              value=[],
                                                              operator=schemas.SearchEventOperator.IS_ANY))
+    if no_click:
+        data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.CLICK,
+                                                             value=[],
+                                                             operator=schemas.SearchEventOperator.IS_ANY))
 
     data.filters.append(schemas.SessionSearchFilterSchema(type=schemas.FilterType.EVENTS_COUNT,
                                                           value=[0],
@@ -183,7 +263,8 @@ def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_i
             main_query = cur.mogrify(f"""SELECT *
                                          FROM (SELECT {SESSION_PROJECTION_COLS}
                                          {query_part}
-                                         ORDER BY {data.sort} {data.order.value}
+                                         --ignoring the sort made the query faster (from 6s to 100ms)
+                                         --ORDER BY {data.sort} {data.order.value}
                                          LIMIT 20) AS raw
                                          ORDER BY random()
                                          LIMIT 1;""", full_args)
@@ -202,6 +283,13 @@ def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_i
         session = cur.fetchone()
         if session:
+            if not location_condition or location_condition.operator == schemas.SearchEventOperator.IS_ANY:
+                session["path"] = __get_1_url(project_id=project_id, session_id=session["session_id"],
+                                              location_condition=location_condition,
+                                              start_time=data.startTimestamp, end_time=data.endTimestamp)
+            else:
+                session["path"] = location_condition.value[0]
+
             if include_mobs:
                 session['domURL'] = sessions_mobs.get_urls(session_id=session["session_id"], project_id=project_id)
                 session['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session["session_id"])


@@ -41,6 +41,7 @@ class JIRAIntegration(integration_base.BaseIntegration):
         except Exception as e:
             self._issue_handler = None
             self.integration["valid"] = False
+            return {"errors": ["Something went wrong, please check your JIRA credentials."]}
         return self._issue_handler
 
     # TODO: remove this once jira-oauth is done


@@ -336,10 +336,13 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
                     if v not in extra_conditions[e.operator].value:
                         extra_conditions[e.operator].value.append(v)
             extra_conditions = list(extra_conditions.values())
     elif metric_of == schemas.MetricOfTable.ISSUES and len(metric_value) > 0:
         data.filters.append(schemas.SessionSearchFilterSchema(value=metric_value, type=schemas.FilterType.ISSUE,
                                                               operator=schemas.SearchEventOperator.IS))
+    elif metric_of == schemas.MetricOfTable.REFERRER:
+        data.filters.append(schemas.SessionSearchFilterSchema(value=metric_value, type=schemas.FilterType.REFERRER,
+                                                              operator=schemas.SearchEventOperator.IS_ANY))
     full_args, query_part = search_query_parts(data=data, error_status=None, errors_only=False,
                                                favorite_only=False, issue=None, project_id=project_id,
                                                user_id=None, extra_event=extra_event, extra_conditions=extra_conditions)


@@ -5,7 +5,7 @@ from decouple import config
 from . import smtp
 
 logger = logging.getLogger(__name__)
-logging.basicConfig(level=config("LOGLEVEL", default=logging.info))
+logging.basicConfig(level=config("LOGLEVEL", default=logging.INFO))
 
 if smtp.has_smtp():
     logger.info("valid SMTP configuration found")


@@ -394,8 +394,11 @@ def get_all_issue_tracking_projects(context: schemas.CurrentContext = Depends(OR
                                                  user_id=context.user_id)
     if error is not None:
         return error
-    data = integration.issue_handler.get_projects()
-    if "errors" in data:
+    data = integration.issue_handler
+    if isinstance(data, dict) and "errors" in data:
+        return data
+    data = data.get_projects()
+    if isinstance(data, dict) and "errors" in data:
         return data
     return {"data": data}
@@ -406,8 +409,11 @@ def get_integration_metadata(integrationProjectId: int, context: schemas.Current
                                                  user_id=context.user_id)
     if error is not None:
         return error
-    data = integration.issue_handler.get_metas(integrationProjectId)
-    if "errors" in data.keys():
+    data = integration
+    if isinstance(data, dict) and "errors" in data:
+        return data
+    data = data.issue_handler.get_metas(integrationProjectId)
+    if isinstance(data, dict) and "errors" in data:
         return data
     return {"data": data}


@@ -777,7 +777,8 @@ class SessionsSearchPayloadSchema(_TimedSchema, _PaginatedSchema):
         for f in values.get("filters", []):
             vals = []
             for v in f.get("value", []):
-                if v is not None:
+                if v is not None and (f.get("type", "") != FilterType.DURATION.value
+                                      or str(v).isnumeric()):
                     vals.append(v)
             f["value"] = vals
         return values
@@ -1594,9 +1595,11 @@ class HeatMapFilterSchema(BaseModel):
 
 class GetHeatMapPayloadSchema(_TimedSchema):
-    url: str = Field(...)
+    url: Optional[str] = Field(default=None)
     filters: List[HeatMapFilterSchema] = Field(default=[])
     click_rage: bool = Field(default=False)
+    operator: Literal[SearchEventOperator.IS, SearchEventOperator.STARTS_WITH,
+    SearchEventOperator.CONTAINS, SearchEventOperator.ENDS_WITH] = Field(default=SearchEventOperator.STARTS_WITH)
 
 
 class GetClickMapPayloadSchema(GetHeatMapPayloadSchema):


@@ -10,10 +10,10 @@
   "license": "Elastic License 2.0 (ELv2)",
   "dependencies": {
     "@maxmind/geoip2-node": "^4.2.0",
-    "express": "^4.18.2",
+    "express": "^4.21.1",
     "jsonwebtoken": "^9.0.2",
     "prom-client": "^15.0.0",
-    "socket.io": "^4.7.2",
+    "socket.io": "^4.8.0",
     "ua-parser-js": "^1.0.37",
     "winston": "^3.13.0"
   }
@@ -72,9 +72,9 @@
       }
     },
     "node_modules/@types/node": {
-      "version": "22.5.4",
-      "resolved": "https://registry.npmjs.org/@types/node/-/node-22.5.4.tgz",
-      "integrity": "sha512-FDuKUJQm/ju9fT/SeX/6+gBzoPzlVCzfzmGkwKvRHQVxi4BntVbyIwf6a4Xn62mrvndLiml6z/UBXIdEVjQLXg==",
+      "version": "22.7.6",
+      "resolved": "https://registry.npmjs.org/@types/node/-/node-22.7.6.tgz",
+      "integrity": "sha512-/d7Rnj0/ExXDMcioS78/kf1lMzYk4BZV8MZGTBKzTGZ6/406ukkbYlIsZmMPhcR5KlkunDHQLrtAVmSq7r+mSw==",
       "dependencies": {
         "undici-types": "~6.19.2"
       }
@@ -241,9 +241,9 @@
      }
    },
    "node_modules/cookie": {
-      "version": "0.6.0",
-      "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz",
-      "integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==",
+      "version": "0.7.1",
+      "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
+      "integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
      "engines": {
        "node": ">= 0.6"
      }
@@ -338,16 +338,16 @@
      }
    },
    "node_modules/engine.io": {
-      "version": "6.5.5",
-      "resolved": "https://registry.npmjs.org/engine.io/-/engine.io-6.5.5.tgz",
-      "integrity": "sha512-C5Pn8Wk+1vKBoHghJODM63yk8MvrO9EWZUfkAt5HAqIgPE4/8FF0PEGHXtEd40l223+cE5ABWuPzm38PHFXfMA==",
+      "version": "6.6.2",
+      "resolved": "https://registry.npmjs.org/engine.io/-/engine.io-6.6.2.tgz",
+      "integrity": "sha512-gmNvsYi9C8iErnZdVcJnvCpSKbWTt1E8+JZo8b+daLninywUWi5NQ5STSHZ9rFjFO7imNcvb8Pc5pe/wMR5xEw==",
      "dependencies": {
        "@types/cookie": "^0.4.1",
        "@types/cors": "^2.8.12",
        "@types/node": ">=10.0.0",
        "accepts": "~1.3.4",
        "base64id": "2.0.0",
-        "cookie": "~0.4.1",
+        "cookie": "~0.7.2",
        "cors": "~2.8.5",
        "debug": "~4.3.1",
        "engine.io-parser": "~5.2.1",
@@ -366,9 +366,9 @@
      }
    },
    "node_modules/engine.io/node_modules/cookie": {
-      "version": "0.4.2",
-      "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.2.tgz",
-      "integrity": "sha512-aSWTXFzaKWkvHO1Ny/s+ePFpvKsPnjc551iI41v3ny/ow6tBG5Vd+FuqGNhh1LxOmVzOlGUriIlOaokOvhaStA==",
+      "version": "0.7.2",
+      "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.2.tgz",
+      "integrity": "sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==",
      "engines": {
        "node": ">= 0.6"
      }
@@ -427,16 +427,16 @@
      }
    },
    "node_modules/express": {
-      "version": "4.21.0",
-      "resolved": "https://registry.npmjs.org/express/-/express-4.21.0.tgz",
-      "integrity": "sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==",
+      "version": "4.21.1",
+      "resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz",
+      "integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==",
      "dependencies": {
        "accepts": "~1.3.8",
        "array-flatten": "1.1.1",
        "body-parser": "1.20.3",
        "content-disposition": "0.5.4",
        "content-type": "~1.0.4",
-        "cookie": "0.6.0",
+        "cookie": "0.7.1",
        "cookie-signature": "1.0.6",
        "debug": "2.6.9",
        "depd": "2.0.0",
@@ -1141,15 +1141,15 @@
      }
    },
    "node_modules/socket.io": {
-      "version": "4.7.5",
-      "resolved": "https://registry.npmjs.org/socket.io/-/socket.io-4.7.5.tgz",
-      "integrity": "sha512-DmeAkF6cwM9jSfmp6Dr/5/mfMwb5Z5qRrSXLpo3Fq5SqyU8CMF15jIN4ZhfSwu35ksM1qmHZDQ/DK5XTccSTvA==",
+      "version": "4.8.0",
+      "resolved": "https://registry.npmjs.org/socket.io/-/socket.io-4.8.0.tgz",
+      "integrity": "sha512-8U6BEgGjQOfGz3HHTYaC/L1GaxDCJ/KM0XTkJly0EhZ5U/du9uNEZy4ZgYzEzIqlx2CMm25CrCqr1ck899eLNA==",
      "dependencies": {
        "accepts": "~1.3.4",
        "base64id": "~2.0.0",
        "cors": "~2.8.5",
        "debug": "~4.3.2",
-        "engine.io": "~6.5.2",
+        "engine.io": "~6.6.0",
        "socket.io-adapter": "~2.5.2",
        "socket.io-parser": "~4.2.4"
      },


@@ -19,10 +19,10 @@
   "homepage": "https://github.com/openreplay/openreplay#readme",
   "dependencies": {
     "@maxmind/geoip2-node": "^4.2.0",
-    "express": "^4.18.2",
+    "express": "^4.21.1",
     "jsonwebtoken": "^9.0.2",
     "prom-client": "^15.0.0",
-    "socket.io": "^4.7.2",
+    "socket.io": "^4.8.0",
     "ua-parser-js": "^1.0.37",
     "winston": "^3.13.0"
   }


@@ -302,9 +302,14 @@ func (e *Router) pushMessagesHandlerWeb(w http.ResponseWriter, r *http.Request)
 	if sessionData != nil {
 		r = r.WithContext(context.WithValue(r.Context(), "sessionID", fmt.Sprintf("%d", sessionData.ID)))
 	}
+	tokenJustExpired := false
 	if err != nil {
-		e.ResponseWithError(r.Context(), w, http.StatusUnauthorized, err, startTime, r.URL.Path, bodySize)
-		return
+		if errors.Is(err, token.JUST_EXPIRED) {
+			tokenJustExpired = true
+		} else {
+			e.ResponseWithError(r.Context(), w, http.StatusUnauthorized, err, startTime, r.URL.Path, bodySize)
+			return
+		}
 	}
 
 	// Add sessionID and projectID to context
@@ -314,13 +319,21 @@ func (e *Router) pushMessagesHandlerWeb(w http.ResponseWriter, r *http.Request)
 	// Check request body
 	if r.Body == nil {
-		e.ResponseWithError(r.Context(), w, http.StatusBadRequest, errors.New("request body is empty"), startTime, r.URL.Path, bodySize)
+		errCode := http.StatusBadRequest
+		if tokenJustExpired {
+			errCode = http.StatusUnauthorized
+		}
+		e.ResponseWithError(r.Context(), w, errCode, errors.New("request body is empty"), startTime, r.URL.Path, bodySize)
 		return
 	}
 	bodyBytes, err := e.readBody(w, r, e.getBeaconSize(sessionData.ID))
 	if err != nil {
-		e.ResponseWithError(r.Context(), w, http.StatusRequestEntityTooLarge, err, startTime, r.URL.Path, bodySize)
+		errCode := http.StatusRequestEntityTooLarge
+		if tokenJustExpired {
+			errCode = http.StatusUnauthorized
+		}
+		e.ResponseWithError(r.Context(), w, errCode, err, startTime, r.URL.Path, bodySize)
 		return
 	}
 	bodySize = len(bodyBytes)
@@ -329,10 +342,18 @@ func (e *Router) pushMessagesHandlerWeb(w http.ResponseWriter, r *http.Request)
 	err = e.services.Producer.Produce(e.cfg.TopicRawWeb, sessionData.ID, bodyBytes)
 	if err != nil {
 		e.log.Error(r.Context(), "can't send messages batch to queue: %s", err)
-		e.ResponseWithError(r.Context(), w, http.StatusInternalServerError, errors.New("can't save message, try again"), startTime, r.URL.Path, bodySize)
+		errCode := http.StatusInternalServerError
+		if tokenJustExpired {
+			errCode = http.StatusUnauthorized
+		}
+		e.ResponseWithError(r.Context(), w, errCode, errors.New("can't save message, try again"), startTime, r.URL.Path, bodySize)
 		return
 	}
+	if tokenJustExpired {
+		e.ResponseWithError(r.Context(), w, http.StatusUnauthorized, errors.New("token expired"), startTime, r.URL.Path, bodySize)
+		return
+	}
 	e.ResponseOK(r.Context(), w, startTime, r.URL.Path, bodySize)
 }


@@ -11,7 +11,7 @@ const BEARER_SCHEMA = "Bearer "
 func (tokenizer *Tokenizer) ParseFromHTTPRequest(r *http.Request) (*TokenData, error) {
 	header := r.Header.Get("Authorization")
 	if !strings.HasPrefix(header, BEARER_SCHEMA) {
-		return nil, errors.New("Missing token")
+		return nil, errors.New("missing token")
 	}
 	token := header[len(BEARER_SCHEMA):]
 	return tokenizer.Parse(token)


@@ -11,7 +11,10 @@ import (
 	"github.com/btcsuite/btcutil/base58"
 )
 
-var EXPIRED = errors.New("token expired")
+var (
+	EXPIRED      = errors.New("token expired")
+	JUST_EXPIRED = errors.New("token just expired")
+)
 
 type Tokenizer struct {
 	secret []byte
@@ -64,8 +67,13 @@ func (tokenizer *Tokenizer) Parse(token string) (*TokenData, error) {
 	if err != nil {
 		return nil, err
 	}
+	res := &TokenData{id, delay, expTime}
 	if expTime <= time.Now().UnixMilli() {
-		return &TokenData{id, delay, expTime}, EXPIRED
+		// If token is expired less than 30 seconds ago, we still consider it semi-valid
+		if expTime+30000 > time.Now().UnixMilli() {
+			return res, JUST_EXPIRED
+		}
+		return res, EXPIRED
 	}
-	return &TokenData{id, delay, expTime}, nil
+	return res, nil
 }

ee/api/.gitignore (vendored)

@@ -273,7 +273,6 @@ Pipfile.lock
 /chalicelib/core/usability_testing/
 /NOTES.md
 /chalicelib/core/db_request_handler.py
-/routers/subs/spot.py
 /chalicelib/utils/or_cache/
 /routers/subs/health.py
 /chalicelib/core/spot.py


@@ -1,3 +1,4 @@
+import chalicelib.utils.exp_ch_helper
 import schemas
 from chalicelib.core import countries, events, metadata
 from chalicelib.utils import ch_client
@@ -325,12 +326,13 @@ def get_top_values(project_id, event_type, event_key=None):
                                  FROM raw;"""
     else:
         colname = TYPE_TO_COLUMN.get(event_type)
+        event_type = exp_ch_helper.get_event_type(event_type)
         query = f"""WITH raw AS (SELECT DISTINCT {colname} AS c_value,
                                                  COUNT(1) OVER (PARTITION BY c_value) AS row_count,
                                                  COUNT(1) OVER () AS total_count
                                  FROM experimental.events
                                  WHERE project_id = %(project_id)s
-                                   AND event_type = '{event_type.upper()}'
+                                   AND event_type = '{event_type}'
                                    AND isNotNull(c_value)
                                    AND notEmpty(c_value)
                                  ORDER BY row_count DESC


@ -4,8 +4,7 @@ from decouple import config
import schemas import schemas
from chalicelib.core import sessions_mobs, events from chalicelib.core import sessions_mobs, events
from chalicelib.utils import sql_helper as sh
# from chalicelib.utils import sql_helper as sh
if config("EXP_SESSIONS_SEARCH", cast=bool, default=False): if config("EXP_SESSIONS_SEARCH", cast=bool, default=False):
from chalicelib.core import sessions_exp as sessions from chalicelib.core import sessions_exp as sessions
@ -18,17 +17,24 @@ logger = logging.getLogger(__name__)
def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema): def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
if data.url is None or data.url == "":
return []
args = {"startDate": data.startTimestamp, "endDate": data.endTimestamp, args = {"startDate": data.startTimestamp, "endDate": data.endTimestamp,
"project_id": project_id, "url": data.url} "project_id": project_id, "url": data.url}
constraints = ["main_events.project_id = toUInt16(%(project_id)s)", constraints = ["main_events.project_id = toUInt16(%(project_id)s)",
"(main_events.url_hostpath = %(url)s OR main_events.url_path = %(url)s)",
"main_events.datetime >= toDateTime(%(startDate)s/1000)", "main_events.datetime >= toDateTime(%(startDate)s/1000)",
"main_events.datetime <= toDateTime(%(endDate)s/1000)", "main_events.datetime <= toDateTime(%(endDate)s/1000)",
"main_events.event_type='CLICK'", "main_events.event_type='CLICK'",
"isNotNull(main_events.normalized_x)"] "isNotNull(main_events.normalized_x)"]
if data.operator == schemas.SearchEventOperator.IS:
constraints.append("url_path= %(url)s")
else:
constraints.append("url_path ILIKE %(url)s")
args["url"] = helper.values_for_operator(data.url, data.operator)
query_from = f"{exp_ch_helper.get_main_events_table(data.startTimestamp)} AS main_events" query_from = f"{exp_ch_helper.get_main_events_table(data.startTimestamp)} AS main_events"
has_click_rage_filter = False
# TODO: is this used ? # TODO: is this used ?
# has_click_rage_filter = False
# if len(data.filters) > 0: # if len(data.filters) > 0:
# for i, f in enumerate(data.filters): # for i, f in enumerate(data.filters):
# if f.type == schemas.FilterType.issue and len(f.value) > 0: # if f.type == schemas.FilterType.issue and len(f.value) > 0:
@ -55,18 +61,18 @@ def get_by_url(project_id, data: schemas.GetHeatMapPayloadSchema):
# f.value, value_key=f_k)) # f.value, value_key=f_k))
# constraints.append(sh.multi_conditions(f"mis.type = %({f_k})s", # constraints.append(sh.multi_conditions(f"mis.type = %({f_k})s",
# f.value, value_key=f_k)) # f.value, value_key=f_k))
# TODO: change this once click-rage is fixed
if data.click_rage and not has_click_rage_filter: # if data.click_rage and not has_click_rage_filter:
constraints.append("""(issues_t.session_id IS NULL # constraints.append("""(issues_t.session_id IS NULL
OR (issues_t.datetime >= toDateTime(%(startDate)s/1000) # OR (issues_t.datetime >= toDateTime(%(startDate)s/1000)
AND issues_t.datetime <= toDateTime(%(endDate)s/1000) # AND issues_t.datetime <= toDateTime(%(endDate)s/1000)
AND issues_t.project_id = toUInt16(%(project_id)s) # AND issues_t.project_id = toUInt16(%(project_id)s)
AND issues_t.event_type = 'ISSUE' # AND issues_t.event_type = 'ISSUE'
AND issues_t.project_id = toUInt16(%(project_id)s) # AND issues_t.project_id = toUInt16(%(project_id)s)
AND mis.project_id = toUInt16(%(project_id)s) # AND mis.project_id = toUInt16(%(project_id)s)
AND mis.type='click_rage'))""") # AND mis.type='click_rage'))""")
query_from += """ LEFT JOIN experimental.events AS issues_t ON (main_events.session_id=issues_t.session_id) # query_from += """ LEFT JOIN experimental.events AS issues_t ON (main_events.session_id=issues_t.session_id)
LEFT JOIN experimental.issues AS mis ON (issues_t.issue_id=mis.issue_id)""" # LEFT JOIN experimental.issues AS mis ON (issues_t.issue_id=mis.issue_id)"""
with ch_client.ClickHouseClient() as cur: with ch_client.ClickHouseClient() as cur:
query = cur.format(f"""SELECT main_events.normalized_x AS normalized_x, query = cur.format(f"""SELECT main_events.normalized_x AS normalized_x,
main_events.normalized_y AS normalized_y main_events.normalized_y AS normalized_y
@ -93,16 +99,21 @@ def get_x_y_by_url_and_session_id(project_id, session_id, data: schemas.GetHeatM
args = {"project_id": project_id, "session_id": session_id, "url": data.url} args = {"project_id": project_id, "session_id": session_id, "url": data.url}
constraints = ["main_events.project_id = toUInt16(%(project_id)s)", constraints = ["main_events.project_id = toUInt16(%(project_id)s)",
"main_events.session_id = %(session_id)s", "main_events.session_id = %(session_id)s",
"(main_events.url_hostpath = %(url)s OR main_events.url_path = %(url)s)",
"main_events.event_type='CLICK'", "main_events.event_type='CLICK'",
"isNotNull(main_events.normalized_x)"] "isNotNull(main_events.normalized_x)"]
if data.operator == schemas.SearchEventOperator.IS:
constraints.append("main_events.url_path = %(url)s")
else:
constraints.append("main_events.url_path ILIKE %(url)s")
args["url"] = helper.values_for_operator(data.url, data.operator)
query_from = f"{exp_ch_helper.get_main_events_table(0)} AS main_events" query_from = f"{exp_ch_helper.get_main_events_table(0)} AS main_events"
with ch_client.ClickHouseClient() as cur: with ch_client.ClickHouseClient() as cur:
query = cur.format(f"""SELECT main_events.normalized_x AS normalized_x, query = cur.format(f"""SELECT main_events.normalized_x AS normalized_x,
main_events.normalized_y AS normalized_y main_events.normalized_y AS normalized_y
FROM {query_from} FROM {query_from}
WHERE {" AND ".join(constraints)};""", args) WHERE {" AND ".join(constraints)};""", args)
logger.debug("---------") logger.debug("---------")
logger.debug(query) logger.debug(query)
logger.debug("---------") logger.debug("---------")
@ -123,17 +134,22 @@ def get_selectors_by_url_and_session_id(project_id, session_id, data: schemas.Ge
args = {"project_id": project_id, "session_id": session_id, "url": data.url} args = {"project_id": project_id, "session_id": session_id, "url": data.url}
constraints = ["main_events.project_id = toUInt16(%(project_id)s)", constraints = ["main_events.project_id = toUInt16(%(project_id)s)",
"main_events.session_id = %(session_id)s", "main_events.session_id = %(session_id)s",
"(main_events.url_hostpath = %(url)s OR main_events.url_path = %(url)s)",
"main_events.event_type='CLICK'"] "main_events.event_type='CLICK'"]
if data.operator == schemas.SearchEventOperator.IS:
constraints.append("main_events.url_path = %(url)s")
else:
constraints.append("main_events.url_path ILIKE %(url)s")
args["url"] = helper.values_for_operator(data.url, data.operator)
query_from = f"{exp_ch_helper.get_main_events_table(0)} AS main_events" query_from = f"{exp_ch_helper.get_main_events_table(0)} AS main_events"
with ch_client.ClickHouseClient() as cur: with ch_client.ClickHouseClient() as cur:
query = cur.format(f"""SELECT main_events.selector AS selector, query = cur.format(f"""SELECT main_events.selector AS selector,
COUNT(1) AS count COUNT(1) AS count
FROM {query_from} FROM {query_from}
WHERE {" AND ".join(constraints)} WHERE {" AND ".join(constraints)}
GROUP BY 1 GROUP BY 1
ORDER BY count DESC;""", args) ORDER BY count DESC;""", args)
logger.debug("---------") logger.debug("---------")
logger.debug(query) logger.debug(query)
logger.debug("---------") logger.debug("---------")
@ -158,29 +174,93 @@ if not config("EXP_SESSIONS_SEARCH", cast=bool, default=False):
s.duration""" s.duration"""
def __get_1_url(location_condition: schemas.SessionSearchEventSchema2 | None, session_id: str, project_id: int,
start_time: int,
end_time: int) -> str | None:
full_args = {
"sessionId": session_id,
"projectId": project_id,
"start_time": start_time,
"end_time": end_time,
}
sub_condition = ["session_id = %(sessionId)s"]
if location_condition and len(location_condition.value) > 0:
f_k = "LOC"
op = sh.get_sql_operator(location_condition.operator)
full_args = {**full_args, **sh.multi_values(location_condition.value, value_key=f_k)}
sub_condition.append(
sh.multi_conditions(f'path {op} %({f_k})s', location_condition.value, is_not=False,
value_key=f_k))
with pg_client.PostgresClient() as cur:
main_query = cur.mogrify(f"""WITH paths AS (SELECT DISTINCT path
FROM events.clicks
WHERE {" AND ".join(sub_condition)})
SELECT path, COUNT(1) AS count
FROM events.clicks
INNER JOIN public.sessions USING (session_id)
INNER JOIN paths USING (path)
WHERE sessions.project_id = %(projectId)s
AND clicks.timestamp >= %(start_time)s
AND clicks.timestamp <= %(end_time)s
AND start_ts >= %(start_time)s
AND start_ts <= %(end_time)s
AND duration IS NOT NULL
GROUP BY path
ORDER BY count DESC
LIMIT 1;""", full_args)
logger.debug("--------------------")
logger.debug(main_query)
logger.debug("--------------------")
try:
cur.execute(main_query)
except Exception as err:
logger.warning("--------- CLICK MAP BEST URL SEARCH QUERY EXCEPTION -----------")
logger.warning(main_query.decode('UTF-8'))
logger.warning("--------- PAYLOAD -----------")
logger.warning(full_args)
logger.warning("--------------------")
raise err
url = cur.fetchone()
if url is None:
return None
return url["path"]
def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_id,
include_mobs: bool = True, exclude_sessions: list[str] = [],
_depth: int = 3):
no_platform = True
no_location = True
location_condition = None
no_click = True
for f in data.filters:
if f.type == schemas.FilterType.PLATFORM:
no_platform = False
break
for f in data.events:
if f.type == schemas.EventType.LOCATION:
no_location = False
if len(f.value) == 0:
f.operator = schemas.SearchEventOperator.IS_ANY
location_condition = f.model_copy()
elif f.type == schemas.EventType.CLICK:
no_click = False
if len(f.value) == 0:
f.operator = schemas.SearchEventOperator.IS_ANY
if location_condition and not no_click:
break
if no_platform:
data.filters.append(schemas.SessionSearchFilterSchema(type=schemas.FilterType.PLATFORM,
value=[schemas.PlatformType.DESKTOP],
operator=schemas.SearchEventOperator.IS))
if no_location:
if not location_condition:
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.LOCATION,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))
if no_click:
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.CLICK,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))
data.filters.append(schemas.SessionSearchFilterSchema(type=schemas.FilterType.EVENTS_COUNT,
value=[0],
@ -198,7 +278,8 @@ if not config("EXP_SESSIONS_SEARCH", cast=bool, default=False):
main_query = cur.mogrify(f"""SELECT *
FROM (SELECT {SESSION_PROJECTION_COLS}
{query_part}
ORDER BY {data.sort} {data.order.value}
--ignoring the sort made the query faster (from 6s to 100ms)
--ORDER BY {data.sort} {data.order.value}
LIMIT 20) AS raw
ORDER BY random()
LIMIT 1;""", full_args)
@ -217,6 +298,13 @@ if not config("EXP_SESSIONS_SEARCH", cast=bool, default=False):
session = cur.fetchone()
if session:
if not location_condition or location_condition.operator == schemas.SearchEventOperator.IS_ANY:
session["path"] = __get_1_url(project_id=project_id, session_id=session["session_id"],
location_condition=location_condition,
start_time=data.startTimestamp, end_time=data.endTimestamp)
else:
session["path"] = location_condition.value[0]
if include_mobs:
session['domURL'] = sessions_mobs.get_urls(session_id=session["session_id"], project_id=project_id)
session['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session["session_id"])
@ -288,29 +376,89 @@ else:
s.duration AS duration"""
def __get_1_url(location_condition: schemas.SessionSearchEventSchema2 | None, session_id: str, project_id: int,
start_time: int,
end_time: int) -> str | None:
full_args = {
"sessionId": session_id,
"projectId": project_id,
"start_time": start_time,
"end_time": end_time,
}
sub_condition = ["session_id = %(sessionId)s", "event_type = 'CLICK'", "project_id = %(projectId)s"]
if location_condition and len(location_condition.value) > 0:
f_k = "LOC"
op = sh.get_sql_operator(location_condition.operator)
full_args = {**full_args, **sh.multi_values(location_condition.value, value_key=f_k)}
sub_condition.append(
sh.multi_conditions(f'url_path {op} %({f_k})s', location_condition.value, is_not=False,
value_key=f_k))
with ch_client.ClickHouseClient() as cur:
main_query = cur.format(f"""WITH paths AS (SELECT DISTINCT url_path
FROM experimental.events
WHERE {" AND ".join(sub_condition)})
SELECT url_path, COUNT(1) AS count
FROM experimental.events
INNER JOIN paths USING (url_path)
WHERE event_type = 'CLICK'
AND project_id = %(projectId)s
AND datetime >= toDateTime(%(start_time)s / 1000)
AND datetime <= toDateTime(%(end_time)s / 1000)
GROUP BY url_path
ORDER BY count DESC
LIMIT 1;""", full_args)
logger.debug("--------------------")
logger.debug(main_query)
logger.debug("--------------------")
try:
url = cur.execute(main_query)
except Exception as err:
logger.warning("--------- CLICK MAP BEST URL SEARCH QUERY EXCEPTION CH-----------")
logger.warning(main_query)
logger.warning("--------- PAYLOAD -----------")
logger.warning(full_args)
logger.warning("--------------------")
raise err
if url is None or len(url) == 0:
return None
return url[0]["path"]
def search_short_session(data: schemas.HeatMapSessionsSearch, project_id, user_id,
include_mobs: bool = True, exclude_sessions: list[str] = [],
_depth: int = 3):
no_platform = True
no_location = True
location_condition = None
no_click = True
for f in data.filters:
if f.type == schemas.FilterType.PLATFORM:
no_platform = False
break
for f in data.events:
if f.type == schemas.EventType.LOCATION:
no_location = False
if len(f.value) == 0:
f.operator = schemas.SearchEventOperator.IS_ANY
location_condition = f.model_copy()
elif f.type == schemas.EventType.CLICK:
no_click = False
if len(f.value) == 0:
f.operator = schemas.SearchEventOperator.IS_ANY
if location_condition and not no_click:
break
if no_platform:
data.filters.append(schemas.SessionSearchFilterSchema(type=schemas.FilterType.PLATFORM,
value=[schemas.PlatformType.DESKTOP],
operator=schemas.SearchEventOperator.IS))
if no_location:
if not location_condition:
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.LOCATION,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))
if no_click:
data.events.append(schemas.SessionSearchEventSchema2(type=schemas.EventType.CLICK,
value=[],
operator=schemas.SearchEventOperator.IS_ANY))
data.filters.append(schemas.SessionSearchFilterSchema(type=schemas.FilterType.EVENTS_COUNT,
value=[0],
@ -328,7 +476,7 @@ else:
main_query = cur.format(f"""SELECT *
FROM (SELECT {SESSION_PROJECTION_COLS}
{query_part}
-- ORDER BY {data.sort} {data.order.value}
LIMIT 20) AS raw
ORDER BY rand()
LIMIT 1;""", full_args)
@ -347,6 +495,13 @@ else:
if len(session) > 0:
session = session[0]
if not location_condition or location_condition.operator == schemas.SearchEventOperator.IS_ANY:
session["path"] = __get_1_url(project_id=project_id, session_id=session["session_id"],
location_condition=location_condition,
start_time=data.startTimestamp, end_time=data.endTimestamp)
else:
session["path"] = location_condition.value[0]
if include_mobs:
session['domURL'] = sessions_mobs.get_urls(session_id=session["session_id"], project_id=project_id)
session['mobsUrl'] = sessions_mobs.get_urls_depercated(session_id=session["session_id"])
@ -369,16 +524,16 @@ else:
def get_selected_session(project_id, session_id):
with ch_client.ClickHouseClient() as cur:
main_query = cur.format(f"""SELECT {SESSION_PROJECTION_COLS}
FROM experimental.sessions AS s
WHERE session_id=%(session_id)s;""", {"session_id": session_id})
logger.debug("--------------------")
logger.debug(main_query)
logger.debug("--------------------")
try:
session = cur.execute(main_query)
except Exception as err:
logger.warning("--------- CLICK MAP GET SELECTED SESSION QUERY EXCEPTION -----------")
logger.warning("--------- CLICK MAP GET SELECTED SESSION QUERY EXCEPTION CH-----------")
logger.warning(main_query.decode('UTF-8'))
logger.warning(main_query)
raise err
if len(session) > 0:
session = session[0]
@ -411,6 +566,6 @@ else:
WHERE session_id = %(session_id)s
AND event_type='LOCATION'
AND project_id= %(project_id)s
ORDER BY datetime,message_id;""", {"session_id": session_id, "project_id": project_id})
rows = helper.list_to_camel_case(rows)
return rows
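The heatmap queries above replace the fixed url_hostpath/url_path equality with an operator-aware filter: SearchEventOperator.IS becomes an exact match on url_path, while any other operator falls back to a case-insensitive ILIKE, with the bound value pre-processed by helper.values_for_operator. A minimal, self-contained sketch of that idea follows; values_for_operator is simplified here and is an assumption, not the repo's exact helper.

def values_for_operator(value: str, operator: str) -> str:
    # Assumed simplification: exact matches keep the raw value, everything else
    # is wrapped in wildcards so ILIKE behaves like a "contains" search.
    return value if operator == "is" else f"%{value}%"

def url_constraint(operator: str) -> str:
    # IS -> strict equality on url_path, otherwise case-insensitive pattern match.
    if operator == "is":
        return "main_events.url_path = %(url)s"
    return "main_events.url_path ILIKE %(url)s"

constraints = ["main_events.project_id = toUInt16(%(project_id)s)", url_constraint("contains")]
args = {"project_id": 1, "url": values_for_operator("/checkout", "contains")}
print(" AND ".join(constraints), args)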


@ -450,6 +450,7 @@ def search2_table(data: schemas.SessionsSearchPayloadSchema, project_id: int, de
elif metric_of == schemas.MetricOfTable.REFERRER:
main_col = "referrer"
extra_col = ", referrer"
extra_where = "WHERE isNotNull(referrer)"
elif metric_of == schemas.MetricOfTable.FETCH:
main_col = "url_path"
extra_col = ", s.url_path"
@ -554,39 +555,6 @@ def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema2):
event.filters is None or len(event.filters) == 0))
def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEventType], platform="web"):
defs = {
schemas.EventType.CLICK: "CLICK",
schemas.EventType.INPUT: "INPUT",
schemas.EventType.LOCATION: "LOCATION",
schemas.PerformanceEventType.LOCATION_DOM_COMPLETE: "LOCATION",
schemas.PerformanceEventType.LOCATION_LARGEST_CONTENTFUL_PAINT_TIME: "LOCATION",
schemas.PerformanceEventType.LOCATION_TTFB: "LOCATION",
schemas.EventType.CUSTOM: "CUSTOM",
schemas.EventType.REQUEST: "REQUEST",
schemas.EventType.REQUEST_DETAILS: "REQUEST",
schemas.PerformanceEventType.FETCH_FAILED: "REQUEST",
schemas.EventType.STATE_ACTION: "STATEACTION",
schemas.EventType.ERROR: "ERROR",
schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD: 'PERFORMANCE',
schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE: 'PERFORMANCE'
}
defs_mobile = {
schemas.EventType.CLICK_MOBILE: "TAP",
schemas.EventType.INPUT_MOBILE: "INPUT",
schemas.EventType.CUSTOM_MOBILE: "CUSTOM",
schemas.EventType.REQUEST_MOBILE: "REQUEST",
schemas.EventType.ERROR_MOBILE: "CRASH",
schemas.EventType.VIEW_MOBILE: "VIEW",
schemas.EventType.SWIPE_MOBILE: "SWIPE"
}
if platform != "web" and event_type in defs_mobile:
return defs_mobile.get(event_type)
if event_type not in defs:
raise Exception(f"unsupported EventType:{event_type}")
return defs.get(event_type)
# this function generates the query and return the generated-query with the dict of query arguments
def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_status, errors_only, favorite_only, issue,
project_id, user_id, platform="web", extra_event=None, extra_deduplication=[],
@ -925,7 +893,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
if platform == "web":
_column = events.EventType.CLICK.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(
f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if schemas.ClickEventExtraOperator.has_value(event.operator):
@ -937,7 +906,8 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value, event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k)) value_key=e_k))
events_conditions_not.append( events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"}) {
"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1] events_conditions_not[-1]["condition"] = event_where[-1]
else: else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value, event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -945,14 +915,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
else:
_column = events.EventType.CLICK_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(
f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{
"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -963,14 +935,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
if platform == "web":
_column = events.EventType.INPUT.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(
f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{
"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -982,14 +956,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
full_args = {**full_args, **_multiple_values(event.source, value_key=f"custom{i}")}
else:
_column = events.EventType.INPUT_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(
f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{
"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -1000,14 +976,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
if platform == "web":
_column = 'url_path'
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(
f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{
"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1015,14 +993,16 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
else:
_column = events.EventType.VIEW_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(
f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{
"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1031,14 +1011,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type == events.EventType.CUSTOM.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = events.EventType.CUSTOM.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -1047,14 +1027,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type == events.EventType.REQUEST.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -1072,14 +1052,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type == events.EventType.STATEACTION.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = events.EventType.STATEACTION.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1089,7 +1069,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type == events.EventType.ERROR.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main"
events_extra_join = f"SELECT * FROM {MAIN_EVENTS_TABLE} AS main1 WHERE main1.project_id=%(project_id)s"
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
event.source = tuple(event.source)
events_conditions[-1]["condition"] = []
@ -1109,14 +1089,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
# ----- Mobile
elif event_type == events.EventType.CLICK_MOBILE.ui_type:
_column = events.EventType.CLICK_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -1124,14 +1104,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.INPUT_MOBILE.ui_type:
_column = events.EventType.INPUT_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -1139,14 +1119,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.VIEW_MOBILE.ui_type:
_column = events.EventType.VIEW_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1154,14 +1134,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.CUSTOM_MOBILE.ui_type:
_column = events.EventType.CUSTOM_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1170,14 +1150,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type == events.EventType.REQUEST_MOBILE.ui_type:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -1185,14 +1165,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.CRASH_MOBILE.ui_type:
_column = events.EventType.CRASH_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1200,14 +1180,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
events_conditions[-1]["condition"] = event_where[-1]
elif event_type == events.EventType.SWIPE_MOBILE.ui_type and platform != "web":
_column = events.EventType.SWIPE_MOBILE.column
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
if not is_any:
if is_not:
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k))
events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
{"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1]
else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1217,7 +1197,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type == schemas.PerformanceEventType.FETCH_FAILED:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
_column = 'url_path'
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
events_conditions[-1]["condition"] = []
if not is_any:
@ -1225,7 +1205,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value, event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
value_key=e_k)) value_key=e_k))
events_conditions_not.append( events_conditions_not.append(
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"}) {"type": f"sub.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'"})
events_conditions_not[-1]["condition"] = event_where[-1] events_conditions_not[-1]["condition"] = event_where[-1]
else: else:
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@ -1256,7 +1236,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
schemas.PerformanceEventType.LOCATION_LARGEST_CONTENTFUL_PAINT_TIME,
schemas.PerformanceEventType.LOCATION_TTFB]:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
events_conditions[-1]["condition"] = []
col = performance_event.get_col(event_type)
@ -1279,7 +1259,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
elif event_type in [schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD,
schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE]:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
events_conditions[-1]["condition"] = []
col = performance_event.get_col(event_type)
@ -1302,9 +1282,9 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
# elif event_type == schemas.PerformanceEventType.time_between_events:
# event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
# # event_from = event_from % f"{getattr(events.event_type, event.value[0].type).table} AS main INNER JOIN {getattr(events.event_type, event.value[1].type).table} AS main2 USING(session_id) "
# event_where.append(f"main.event_type='{__get_event_type(event.value[0].type, platform=platform)}'")
# event_where.append(f"main.event_type='{__exp_ch_helper.get_event_type(event.value[0].type, platform=platform)}'")
# events_conditions.append({"type": event_where[-1]})
# event_where.append(f"main.event_type='{__get_event_type(event.value[0].type, platform=platform)}'")
# event_where.append(f"main.event_type='{__exp_ch_helper.get_event_type(event.value[0].type, platform=platform)}'")
# events_conditions.append({"type": event_where[-1]})
#
# if not isinstance(event.value[0].value, list):
@ -1352,7 +1332,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
# TODO: no isNot for RequestDetails
elif event_type == schemas.EventType.REQUEST_DETAILS:
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
event_where.append(f"main.event_type='{exp_ch_helper.get_event_type(event_type, platform=platform)}'")
events_conditions.append({"type": event_where[-1]})
apply = False
events_conditions[-1]["condition"] = []


@ -1,3 +1,6 @@
from typing import Union
import schemas
from chalicelib.utils.TimeUTC import TimeUTC
from decouple import config
import logging
@ -51,3 +54,37 @@ def get_main_js_errors_sessions_table(timestamp=0):
# return "experimental.js_errors_sessions_mv" # \ # return "experimental.js_errors_sessions_mv" # \
# if config("EXP_7D_MV", cast=bool, default=True) \ # if config("EXP_7D_MV", cast=bool, default=True) \
# and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events" # and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events"
def get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEventType], platform="web"):
defs = {
schemas.EventType.CLICK: "CLICK",
schemas.EventType.INPUT: "INPUT",
schemas.EventType.LOCATION: "LOCATION",
schemas.PerformanceEventType.LOCATION_DOM_COMPLETE: "LOCATION",
schemas.PerformanceEventType.LOCATION_LARGEST_CONTENTFUL_PAINT_TIME: "LOCATION",
schemas.PerformanceEventType.LOCATION_TTFB: "LOCATION",
schemas.EventType.CUSTOM: "CUSTOM",
schemas.EventType.REQUEST: "REQUEST",
schemas.EventType.REQUEST_DETAILS: "REQUEST",
schemas.PerformanceEventType.FETCH_FAILED: "REQUEST",
schemas.EventType.STATE_ACTION: "STATEACTION",
schemas.EventType.ERROR: "ERROR",
schemas.PerformanceEventType.LOCATION_AVG_CPU_LOAD: 'PERFORMANCE',
schemas.PerformanceEventType.LOCATION_AVG_MEMORY_USAGE: 'PERFORMANCE',
schemas.FetchFilterType.FETCH_URL: 'REQUEST'
}
defs_mobile = {
schemas.EventType.CLICK_MOBILE: "TAP",
schemas.EventType.INPUT_MOBILE: "INPUT",
schemas.EventType.CUSTOM_MOBILE: "CUSTOM",
schemas.EventType.REQUEST_MOBILE: "REQUEST",
schemas.EventType.ERROR_MOBILE: "CRASH",
schemas.EventType.VIEW_MOBILE: "VIEW",
schemas.EventType.SWIPE_MOBILE: "SWIPE"
}
if platform != "web" and event_type in defs_mobile:
return defs_mobile.get(event_type)
if event_type not in defs:
raise Exception(f"unsupported EventType:{event_type}")
return defs.get(event_type)
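The diff above removes the module-private __get_event_type from the sessions search code and re-homes it here as exp_ch_helper.get_event_type, so every ClickHouse query builder shares one mapping from schema event types to the event_type values stored in experimental.events. A hedged usage sketch; the import path is assumed from how the sessions code calls the helper.

# Hedged usage sketch; module path assumed from the callers shown above.
from chalicelib.utils import exp_ch_helper
import schemas

# Web events: LOCATION-derived performance events all collapse to 'LOCATION'.
print(exp_ch_helper.get_event_type(schemas.EventType.CLICK))                     # CLICK
print(exp_ch_helper.get_event_type(schemas.PerformanceEventType.LOCATION_TTFB))  # LOCATION

# Non-web platforms resolve against the mobile table first: CLICK_MOBILE -> 'TAP'.
print(exp_ch_helper.get_event_type(schemas.EventType.CLICK_MOBILE, platform="ios"))  # TAP

# Unknown types raise, so a bad filter fails fast instead of producing an empty WHERE clause.
try:
    exp_ch_helper.get_event_type("not-an-event")
except Exception as err:
    print(err)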


@ -95,7 +95,6 @@ rm -rf ./orpy.py
rm -rf ./chalicelib/core/usability_testing/
rm -rf ./chalicelib/core/db_request_handler.py
rm -rf ./chalicelib/core/db_request_handler.py
rm -rf ./routers/subs/spot.py
rm -rf ./chalicelib/utils/or_cache
rm -rf ./routers/subs/health.py
rm -rf ./chalicelib/core/spot.py


@ -7,6 +7,7 @@ psycopg2-binary==2.9.9
psycopg[pool,binary]==3.2.1
elasticsearch==8.15.0
jira==3.8.0
cachetools==5.5.0


@ -7,6 +7,7 @@ psycopg2-binary==2.9.9
psycopg[pool,binary]==3.2.1
elasticsearch==8.15.0
jira==3.8.0
cachetools==5.5.0


@ -346,7 +346,8 @@ def get_error_trace(projectId: int, sessionId: int, errorId: str,
}
@app.get('/{projectId}/errors/{errorId}', tags=['errors'], dependencies=[OR_scope(Permissions.DEV_TOOLS)])
@app.get('/{projectId}/errors/{errorId}', tags=['errors'],
dependencies=[OR_scope(Permissions.DEV_TOOLS, ServicePermissions.DEV_TOOLS)])
def errors_get_details(projectId: int, errorId: str, background_tasks: BackgroundTasks, density24: int = 24,
density30: int = 30, context: schemas.CurrentContext = Depends(OR_context)):
data = errors.get_details(project_id=projectId, user_id=context.user_id, error_id=errorId,
@ -357,7 +358,8 @@ def errors_get_details(projectId: int, errorId: str, background_tasks: Backgroun
return data
@app.get('/{projectId}/errors/{errorId}/sourcemaps', tags=['errors'], dependencies=[OR_scope(Permissions.DEV_TOOLS)])
@app.get('/{projectId}/errors/{errorId}/sourcemaps', tags=['errors'],
dependencies=[OR_scope(Permissions.DEV_TOOLS, ServicePermissions.DEV_TOOLS)])
def errors_get_details_sourcemaps(projectId: int, errorId: str,
context: schemas.CurrentContext = Depends(OR_context)):
data = errors.get_trace(project_id=projectId, error_id=errorId)
@ -523,7 +525,7 @@ def create_note(projectId: int, sessionId: int, data: schemas.SessionNoteSchema
@app.get('/{projectId}/sessions/{sessionId}/notes', tags=["sessions", "notes"],
dependencies=[OR_scope(Permissions.SESSION_REPLAY)])
dependencies=[OR_scope(Permissions.SESSION_REPLAY, ServicePermissions.READ_NOTES)])
def get_session_notes(projectId: int, sessionId: int, context: schemas.CurrentContext = Depends(OR_context)):
data = sessions_notes.get_session_notes(tenant_id=context.tenant_id, project_id=projectId,
session_id=sessionId, user_id=context.user_id)


@ -76,6 +76,7 @@ async def __process_assertion(request: Request, tenant_key=None) -> Response | d
tenant_key = user_data.get("tenantKey", []) tenant_key = user_data.get("tenantKey", [])
else: else:
logger.info("Using tenant key from ACS-URL") logger.info("Using tenant key from ACS-URL")
tenant_key = [tenant_key]
logger.debug(f"received nameId: {email} tenant_key: {tenant_key}") logger.debug(f"received nameId: {email} tenant_key: {tenant_key}")
logger.debug(">user_data:") logger.debug(">user_data:")
@ -90,26 +91,35 @@ async def __process_assertion(request: Request, tenant_key=None) -> Response | d
return {"errors": ["invalid tenantKey, please copy the correct value from Preferences > Account"]} return {"errors": ["invalid tenantKey, please copy the correct value from Preferences > Account"]}
existing = users.get_by_email_only(email) existing = users.get_by_email_only(email)
role_names = user_data.get("role", []) role_names = user_data.get("role", [])
if len(role_names) == 0:
logger.info("No role specified, setting role to member")
role_names = ["member"]
role = None
if len(role_names) == 0:
if existing is None:
logger.info("No role specified, setting role to member")
role_names = ["member"]
else:
role_names = [existing["roleName"]]
role = {"name": existing["roleName"], "roleId": existing["roleId"]}
if role is None:
for r in role_names:
if existing and r.lower() == existing["roleName"].lower():
role = {"roleId": existing["roleId"], "name": r}
else:
role = roles.get_role_by_name(tenant_id=t['tenantId'], name=r)
if role is not None:
break
if role is None:
return {"errors": [f"role '{role_names}' not found, please create it in OpenReplay first"]}
logger.info(f"received roles:{role_names}; using:{role['name']}")
admin_privileges = user_data.get("adminPrivileges", [])
if len(admin_privileges) == 0:
if existing is None:
admin_privileges = not (len(admin_privileges) == 0
or admin_privileges[0] is None
or admin_privileges[0].lower() == "false")
else:
admin_privileges = existing["admin"]
internal_id = next(iter(user_data.get("internalId", [])), None) internal_id = next(iter(user_data.get("internalId", [])), None)
full_name = " ".join(user_data.get("firstName", []) + user_data.get("lastName", [])) full_name = " ".join(user_data.get("firstName", []) + user_data.get("lastName", []))


@@ -0,0 +1,32 @@
+from fastapi import Depends
+from starlette.responses import JSONResponse, Response
+
+import schemas
+from chalicelib.core import spot, webhook
+from or_dependencies import OR_context
+from routers.base import get_routers
+
+public_app, app, app_apikey = get_routers(prefix="/spot", tags=["spot"])
+
+COOKIE_PATH = "/api/spot/refresh"
+
+
+@app.get('/logout')
+def logout_spot(response: Response, context: schemas.CurrentContext = Depends(OR_context)):
+    spot.logout(user_id=context.user_id)
+    response.delete_cookie(key="spotRefreshToken", path=COOKIE_PATH)
+    return {"data": "success"}
+
+
+@app.get('/refresh')
+def refresh_spot_login(response: JSONResponse, context: schemas.CurrentContext = Depends(OR_context)):
+    r = spot.refresh(user_id=context.user_id, tenant_id=context.tenant_id)
+    content = {"jwt": r.get("jwt")}
+    response.set_cookie(key="spotRefreshToken", value=r.get("refreshToken"), path=COOKIE_PATH,
+                        max_age=r.pop("refreshTokenMaxAge"), secure=True, httponly=True)
+    return content
+
+
+@app.get('/integrations/slack/channels', tags=["integrations"])
+def get_slack_channels(context: schemas.CurrentContext = Depends(OR_context)):
+    return {"data": webhook.get_by_type(tenant_id=context.tenant_id, webhook_type=schemas.WebhookType.SLACK)}


@@ -25,6 +25,7 @@ class ServicePermissions(str, Enum):
     DEV_TOOLS = "SERVICE_DEV_TOOLS"
     ASSIST_LIVE = "SERVICE_ASSIST_LIVE"
     ASSIST_CALL = "SERVICE_ASSIST_CALL"
+    READ_NOTES = "SERVICE_READ_NOTES"


 class CurrentContext(schemas.CurrentContext):


@@ -11,11 +11,11 @@
   "dependencies": {
     "@maxmind/geoip2-node": "^4.2.0",
     "@socket.io/redis-adapter": "^8.2.1",
-    "express": "^4.18.2",
+    "express": "^4.21.1",
     "jsonwebtoken": "^9.0.2",
     "prom-client": "^15.0.0",
     "redis": "^4.6.10",
-    "socket.io": "^4.7.2",
+    "socket.io": "^4.8.0",
     "ua-parser-js": "^1.0.37",
     "uWebSockets.js": "github:uNetworking/uWebSockets.js#v20.34.0",
     "winston": "^3.13.0"
@ -144,9 +144,9 @@
} }
}, },
"node_modules/@types/node": { "node_modules/@types/node": {
"version": "22.5.4", "version": "22.7.6",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.5.4.tgz", "resolved": "https://registry.npmjs.org/@types/node/-/node-22.7.6.tgz",
"integrity": "sha512-FDuKUJQm/ju9fT/SeX/6+gBzoPzlVCzfzmGkwKvRHQVxi4BntVbyIwf6a4Xn62mrvndLiml6z/UBXIdEVjQLXg==", "integrity": "sha512-/d7Rnj0/ExXDMcioS78/kf1lMzYk4BZV8MZGTBKzTGZ6/406ukkbYlIsZmMPhcR5KlkunDHQLrtAVmSq7r+mSw==",
"dependencies": { "dependencies": {
"undici-types": "~6.19.2" "undici-types": "~6.19.2"
} }
@ -334,9 +334,9 @@
} }
}, },
"node_modules/cookie": { "node_modules/cookie": {
"version": "0.6.0", "version": "0.7.1",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz", "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
"integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==", "integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
"engines": { "engines": {
"node": ">= 0.6" "node": ">= 0.6"
} }
@ -439,16 +439,16 @@
} }
}, },
"node_modules/engine.io": { "node_modules/engine.io": {
"version": "6.5.5", "version": "6.6.2",
"resolved": "https://registry.npmjs.org/engine.io/-/engine.io-6.5.5.tgz", "resolved": "https://registry.npmjs.org/engine.io/-/engine.io-6.6.2.tgz",
"integrity": "sha512-C5Pn8Wk+1vKBoHghJODM63yk8MvrO9EWZUfkAt5HAqIgPE4/8FF0PEGHXtEd40l223+cE5ABWuPzm38PHFXfMA==", "integrity": "sha512-gmNvsYi9C8iErnZdVcJnvCpSKbWTt1E8+JZo8b+daLninywUWi5NQ5STSHZ9rFjFO7imNcvb8Pc5pe/wMR5xEw==",
"dependencies": { "dependencies": {
"@types/cookie": "^0.4.1", "@types/cookie": "^0.4.1",
"@types/cors": "^2.8.12", "@types/cors": "^2.8.12",
"@types/node": ">=10.0.0", "@types/node": ">=10.0.0",
"accepts": "~1.3.4", "accepts": "~1.3.4",
"base64id": "2.0.0", "base64id": "2.0.0",
"cookie": "~0.4.1", "cookie": "~0.7.2",
"cors": "~2.8.5", "cors": "~2.8.5",
"debug": "~4.3.1", "debug": "~4.3.1",
"engine.io-parser": "~5.2.1", "engine.io-parser": "~5.2.1",
@ -467,9 +467,9 @@
} }
}, },
"node_modules/engine.io/node_modules/cookie": { "node_modules/engine.io/node_modules/cookie": {
"version": "0.4.2", "version": "0.7.2",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.2.tgz", "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.2.tgz",
"integrity": "sha512-aSWTXFzaKWkvHO1Ny/s+ePFpvKsPnjc551iI41v3ny/ow6tBG5Vd+FuqGNhh1LxOmVzOlGUriIlOaokOvhaStA==", "integrity": "sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==",
"engines": { "engines": {
"node": ">= 0.6" "node": ">= 0.6"
} }
@ -507,16 +507,16 @@
} }
}, },
"node_modules/express": { "node_modules/express": {
"version": "4.21.0", "version": "4.21.1",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.0.tgz", "resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz",
"integrity": "sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==", "integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==",
"dependencies": { "dependencies": {
"accepts": "~1.3.8", "accepts": "~1.3.8",
"array-flatten": "1.1.1", "array-flatten": "1.1.1",
"body-parser": "1.20.3", "body-parser": "1.20.3",
"content-disposition": "0.5.4", "content-disposition": "0.5.4",
"content-type": "~1.0.4", "content-type": "~1.0.4",
"cookie": "0.6.0", "cookie": "0.7.1",
"cookie-signature": "1.0.6", "cookie-signature": "1.0.6",
"debug": "2.6.9", "debug": "2.6.9",
"depd": "2.0.0", "depd": "2.0.0",
@ -1271,15 +1271,15 @@
} }
}, },
"node_modules/socket.io": { "node_modules/socket.io": {
"version": "4.7.5", "version": "4.8.0",
"resolved": "https://registry.npmjs.org/socket.io/-/socket.io-4.7.5.tgz", "resolved": "https://registry.npmjs.org/socket.io/-/socket.io-4.8.0.tgz",
"integrity": "sha512-DmeAkF6cwM9jSfmp6Dr/5/mfMwb5Z5qRrSXLpo3Fq5SqyU8CMF15jIN4ZhfSwu35ksM1qmHZDQ/DK5XTccSTvA==", "integrity": "sha512-8U6BEgGjQOfGz3HHTYaC/L1GaxDCJ/KM0XTkJly0EhZ5U/du9uNEZy4ZgYzEzIqlx2CMm25CrCqr1ck899eLNA==",
"dependencies": { "dependencies": {
"accepts": "~1.3.4", "accepts": "~1.3.4",
"base64id": "~2.0.0", "base64id": "~2.0.0",
"cors": "~2.8.5", "cors": "~2.8.5",
"debug": "~4.3.2", "debug": "~4.3.2",
"engine.io": "~6.5.2", "engine.io": "~6.6.0",
"socket.io-adapter": "~2.5.2", "socket.io-adapter": "~2.5.2",
"socket.io-parser": "~4.2.4" "socket.io-parser": "~4.2.4"
}, },


@@ -20,11 +20,11 @@
   "dependencies": {
     "@maxmind/geoip2-node": "^4.2.0",
     "@socket.io/redis-adapter": "^8.2.1",
-    "express": "^4.18.2",
+    "express": "^4.21.1",
     "jsonwebtoken": "^9.0.2",
     "prom-client": "^15.0.0",
     "redis": "^4.6.10",
-    "socket.io": "^4.7.2",
+    "socket.io": "^4.8.0",
     "ua-parser-js": "^1.0.37",
     "uWebSockets.js": "github:uNetworking/uWebSockets.js#v20.34.0",
     "winston": "^3.13.0"


@@ -119,7 +119,7 @@ var batches = map[string]string{
    "resources":     "INSERT INTO experimental.resources (session_id, project_id, message_id, datetime, url, type, duration, ttfb, header_size, encoded_body_size, decoded_body_size, success, url_path) VALUES (?, ?, ?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?, SUBSTR(?, 1, 8000))",
    "autocompletes": "INSERT INTO experimental.autocomplete (project_id, type, value) VALUES (?, ?, SUBSTR(?, 1, 8000))",
    "pages":         "INSERT INTO experimental.events (session_id, project_id, message_id, datetime, url, request_start, response_start, response_end, dom_content_loaded_event_start, dom_content_loaded_event_end, load_event_start, load_event_end, first_paint, first_contentful_paint_time, speed_index, visually_complete, time_to_interactive, url_path, event_type) VALUES (?, ?, ?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, SUBSTR(?, 1, 8000), ?)",
-   "clicks":        "INSERT INTO experimental.events (session_id, project_id, message_id, datetime, label, hesitation_time, event_type, selector, normalized_x, normalized_y) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
+   "clicks":        "INSERT INTO experimental.events (session_id, project_id, message_id, datetime, label, hesitation_time, event_type, selector, normalized_x, normalized_y, url, url_path) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, SUBSTR(?, 1, 8000), SUBSTR(?, 1, 8000))",
    "inputs":        "INSERT INTO experimental.events (session_id, project_id, message_id, datetime, label, event_type, duration, hesitation_time) VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    "errors":        "INSERT INTO experimental.events (session_id, project_id, message_id, datetime, source, name, message, error_id, event_type, error_tags_keys, error_tags_values) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
    "performance":   "INSERT INTO experimental.events (session_id, project_id, message_id, datetime, url, min_fps, avg_fps, max_fps, min_cpu, avg_cpu, max_cpu, min_total_js_heap_size, avg_total_js_heap_size, max_total_js_heap_size, min_used_js_heap_size, avg_used_js_heap_size, max_used_js_heap_size, event_type) VALUES (?, ?, ?, ?, SUBSTR(?, 1, 8000), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
@@ -247,7 +247,7 @@ func (c *connectorImpl) InsertIssue(session *sessions.Session, msg *messages.Iss
    issueID := hashid.IssueID(session.ProjectID, msg)
    // Check issue type before insert to avoid panic from clickhouse lib
    switch msg.Type {
-   case "click_rage", "dead_click", "excessive_scrolling", "bad_request", "missing_resource", "memory", "cpu", "slow_resource", "slow_page_load", "crash", "ml_cpu", "ml_memory", "ml_dead_click", "ml_click_rage", "ml_mouse_thrashing", "ml_excessive_scrolling", "ml_slow_resources", "custom", "js_exception", "mouse_thrashing":
+   case "click_rage", "dead_click", "excessive_scrolling", "bad_request", "missing_resource", "memory", "cpu", "slow_resource", "slow_page_load", "crash", "ml_cpu", "ml_memory", "ml_dead_click", "ml_click_rage", "ml_mouse_thrashing", "ml_excessive_scrolling", "ml_slow_resources", "custom", "js_exception", "mouse_thrashing", "app_crash":
    default:
        return fmt.Errorf("unknown issueType: %s", msg.Type)
    }
@@ -423,6 +423,8 @@ func (c *connectorImpl) InsertWebClickEvent(session *sessions.Session, msg *mess
        msg.Selector,
        nX,
        nY,
+       msg.Url,
+       extractUrlPath(msg.Url),
    ); err != nil {
        c.checkError("clicks", err)
        return fmt.Errorf("can't append to clicks batch: %s", err)


@@ -50,7 +50,7 @@ SET metric_type='heatMap',
 WHERE metric_type = 'clickMap';

 UPDATE public.roles
-SET permissions='{SERVICE_SESSION_REPLAY,SERVICE_DEV_TOOLS,SERVICE_ASSIST_LIVE,SERVICE_ASSIST_CALL}'
+SET permissions='{SERVICE_SESSION_REPLAY,SERVICE_DEV_TOOLS,SERVICE_ASSIST_LIVE,SERVICE_ASSIST_CALL,SERVICE_READ_NOTES}'
 WHERE service_role;

 UPDATE public.users


@@ -33,6 +33,10 @@ WHERE NOT permissions @> '{SPOT_PUBLIC}'
   AND NOT service_role;
 -- AND name ILIKE 'owner';

+UPDATE public.roles
+SET permissions='{SERVICE_SESSION_REPLAY,SERVICE_DEV_TOOLS,SERVICE_ASSIST_LIVE,SERVICE_ASSIST_CALL,SERVICE_READ_NOTES}'
+WHERE service_role;
+
 ALTER TABLE IF EXISTS public.users
     ADD COLUMN IF NOT EXISTS spot_jwt_iat timestamp without time zone NULL DEFAULT NULL,
     ADD COLUMN IF NOT EXISTS spot_jwt_refresh_jti integer NULL DEFAULT NULL,


@@ -1,4 +1,4 @@
-\set or_version 'v1.19.0-ee'
+\set or_version 'v1.20.0-ee'
 SET client_min_messages TO NOTICE;
 \set ON_ERROR_STOP true
 SELECT EXISTS (SELECT 1


@@ -23,4 +23,4 @@ MINIO_SECRET_KEY = ''

 # APP and TRACKER VERSIONS
 VERSION = 1.20.0
-TRACKER_VERSION = '14.0.6'
+TRACKER_VERSION = '14.0.7'


@@ -119,6 +119,7 @@ interface Props {
 function PrivateRoutes(props: Props) {
   const { onboarding, sites, siteId } = props;
   const hasRecordings = sites.some(s => s.recorded);
+  const redirectToSetup = props.scope === 0;
   const redirectToOnboarding =
     !onboarding && (localStorage.getItem(GLOBAL_HAS_NO_RECORDINGS) === 'true' || !hasRecordings) && props.scope > 0;
   const siteIdList: any = sites.map(({ id }) => id).toJS();
@@ -126,6 +127,13 @@ function PrivateRoutes(props: Props) {
   return (
     <Suspense fallback={<Loader loading={true} className="flex-1" />}>
       <Switch key="content">
+        <Route
+          exact
+          strict
+          path={SCOPE_SETUP}
+          component={enhancedComponents.ScopeSetup}
+        />
+        {redirectToSetup ? <Redirect to={SCOPE_SETUP} /> : null}
         <Route path={CLIENT_PATH} component={enhancedComponents.Client} />
         <Route
           path={withSiteId(ONBOARDING_PATH, siteIdList)}
@@ -143,12 +151,6 @@ function PrivateRoutes(props: Props) {
           path={SPOT_PATH}
           component={enhancedComponents.Spot}
         />
-        <Route
-          exact
-          strict
-          path={SCOPE_SETUP}
-          component={enhancedComponents.ScopeSetup}
-        />
         {props.scope === 1 ? <Redirect to={SPOTS_LIST_PATH} /> : null}
         <Route
           path="/integrations/"


@ -10,21 +10,19 @@ import {
GLOBAL_DESTINATION_PATH, GLOBAL_DESTINATION_PATH,
IFRAME, IFRAME,
JWT_PARAM, JWT_PARAM,
SPOT_ONBOARDING SPOT_ONBOARDING,
} from "App/constants/storageKeys"; } from 'App/constants/storageKeys';
import Layout from 'App/layout/Layout'; import Layout from 'App/layout/Layout';
import { withStore } from "App/mstore"; import { withStore } from 'App/mstore';
import { checkParam, handleSpotJWT, isTokenExpired } from "App/utils"; import { checkParam, handleSpotJWT, isTokenExpired } from 'App/utils';
import { ModalProvider } from 'Components/Modal'; import { ModalProvider } from 'Components/Modal';
import { ModalProvider as NewModalProvider } from 'Components/ModalContext'; import { ModalProvider as NewModalProvider } from 'Components/ModalContext';
import { fetchListActive as fetchMetadata } from 'Duck/customField'; import { fetchListActive as fetchMetadata } from 'Duck/customField';
import { setSessionPath } from 'Duck/sessions'; import { setSessionPath } from 'Duck/sessions';
import { fetchList as fetchSiteList } from 'Duck/site'; import { fetchList as fetchSiteList } from 'Duck/site';
import { init as initSite } from 'Duck/site'; import { init as initSite } from 'Duck/site';
import { fetchUserInfo, getScope, setJwt, logout } from "Duck/user"; import { fetchUserInfo, getScope, logout, setJwt } from 'Duck/user';
import { fetchTenants } from 'Duck/user';
import { Loader } from 'UI'; import { Loader } from 'UI';
import { spotsList } from "./routes";
import * as routes from './routes'; import * as routes from './routes';
interface RouterProps interface RouterProps
@ -36,7 +34,6 @@ interface RouterProps
changePassword: boolean; changePassword: boolean;
isEnterprise: boolean; isEnterprise: boolean;
fetchUserInfo: () => any; fetchUserInfo: () => any;
fetchTenants: () => any;
setSessionPath: (path: any) => any; setSessionPath: (path: any) => any;
fetchSiteList: (siteId?: number) => any; fetchSiteList: (siteId?: number) => any;
match: { match: {
@ -45,7 +42,7 @@ interface RouterProps
}; };
}; };
mstore: any; mstore: any;
setJwt: (params: { jwt: string, spotJwt: string | null }) => any; setJwt: (params: { jwt: string; spotJwt: string | null }) => any;
fetchMetadata: (siteId: string) => void; fetchMetadata: (siteId: string) => void;
initSite: (site: any) => void; initSite: (site: any) => void;
scopeSetup: boolean; scopeSetup: boolean;
@ -68,15 +65,16 @@ const Router: React.FC<RouterProps> = (props) => {
logout, logout,
} = props; } = props;
const params = new URLSearchParams(location.search) const params = new URLSearchParams(location.search);
const spotCb = params.get('spotCallback'); const spotCb = params.get('spotCallback');
const spotReqSent = React.useRef(false) const spotReqSent = React.useRef(false);
const [isSpotCb, setIsSpotCb] = React.useState(false); const [isSpotCb, setIsSpotCb] = React.useState(false);
const [isSignup, setIsSignup] = React.useState(false);
const [isIframe, setIsIframe] = React.useState(false); const [isIframe, setIsIframe] = React.useState(false);
const [isJwt, setIsJwt] = React.useState(false); const [isJwt, setIsJwt] = React.useState(false);
const handleJwtFromUrl = () => { const handleJwtFromUrl = () => {
const params = new URLSearchParams(location.search) const params = new URLSearchParams(location.search);
const urlJWT = params.get('jwt'); const urlJWT = params.get('jwt');
const spotJwt = params.get('spotJwt'); const spotJwt = params.get('spotJwt');
if (spotJwt) { if (spotJwt) {
@ -92,6 +90,7 @@ const Router: React.FC<RouterProps> = (props) => {
return; return;
} else { } else {
spotReqSent.current = true; spotReqSent.current = true;
setIsSpotCb(false);
} }
handleSpotJWT(jwt); handleSpotJWT(jwt);
}; };
@ -107,13 +106,17 @@ const Router: React.FC<RouterProps> = (props) => {
const handleUserLogin = async () => { const handleUserLogin = async () => {
if (isSpotCb) { if (isSpotCb) {
localStorage.setItem(SPOT_ONBOARDING, 'true') localStorage.setItem(SPOT_ONBOARDING, 'true');
} }
await fetchUserInfo(); await fetchUserInfo();
const siteIdFromPath = parseInt(location.pathname.split('/')[1]); const siteIdFromPath = parseInt(location.pathname.split('/')[1]);
await fetchSiteList(siteIdFromPath); await fetchSiteList(siteIdFromPath);
props.mstore.initClient(); props.mstore.initClient();
if (localSpotJwt && !isTokenExpired(localSpotJwt)) {
handleSpotLogin(localSpotJwt);
}
const destinationPath = localStorage.getItem(GLOBAL_DESTINATION_PATH); const destinationPath = localStorage.getItem(GLOBAL_DESTINATION_PATH);
if ( if (
destinationPath && destinationPath &&
@ -144,7 +147,10 @@ const Router: React.FC<RouterProps> = (props) => {
if (spotCb) { if (spotCb) {
setIsSpotCb(true); setIsSpotCb(true);
} }
}, [spotCb]) if (location.pathname.includes('signup')) {
setIsSignup(true);
}
}, [spotCb]);
useEffect(() => { useEffect(() => {
handleDestinationPath(); handleDestinationPath();
@ -159,22 +165,14 @@ const Router: React.FC<RouterProps> = (props) => {
}, [isLoggedIn]); }, [isLoggedIn]);
useEffect(() => { useEffect(() => {
if (scopeSetup) { if (isLoggedIn && isSpotCb && !isSignup) {
history.push(routes.scopeSetup()) if (localSpotJwt && !isTokenExpired(localSpotJwt)) {
} handleSpotLogin(localSpotJwt);
}, [scopeSetup]) } else {
logout();
useEffect(() => {
if (isLoggedIn && (location.pathname.includes('login') || isSpotCb)) {
if (localSpotJwt) {
if (!isTokenExpired(localSpotJwt)) {
handleSpotLogin(localSpotJwt);
} else {
logout();
}
} }
} }
}, [isSpotCb, location, isLoggedIn, localSpotJwt]) }, [isSpotCb, isLoggedIn, localSpotJwt, isSignup]);
useEffect(() => { useEffect(() => {
if (siteId && siteId !== lastFetchedSiteIdRef.current) { if (siteId && siteId !== lastFetchedSiteIdRef.current) {
@ -204,8 +202,7 @@ const Router: React.FC<RouterProps> = (props) => {
location.pathname.includes('multiview') || location.pathname.includes('multiview') ||
location.pathname.includes('/view-spot/') || location.pathname.includes('/view-spot/') ||
location.pathname.includes('/spots/') || location.pathname.includes('/spots/') ||
location.pathname.includes('/scope-setup') location.pathname.includes('/scope-setup');
if (isIframe) { if (isIframe) {
return ( return (
@ -238,8 +235,11 @@ const mapStateToProps = (state: Map<string, any>) => {
'loading', 'loading',
]); ]);
const sitesLoading = state.getIn(['site', 'fetchListRequest', 'loading']); const sitesLoading = state.getIn(['site', 'fetchListRequest', 'loading']);
const scopeSetup = getScope(state) === 0 const scopeSetup = getScope(state) === 0;
const loading = Boolean(userInfoLoading) || Boolean(sitesLoading) || (!scopeSetup && !siteId); const loading =
Boolean(userInfoLoading) ||
Boolean(sitesLoading) ||
(!scopeSetup && !siteId);
return { return {
siteId, siteId,
changePassword, changePassword,
@ -262,7 +262,6 @@ const mapStateToProps = (state: Map<string, any>) => {
const mapDispatchToProps = { const mapDispatchToProps = {
fetchUserInfo, fetchUserInfo,
fetchTenants,
setSessionPath, setSessionPath,
fetchSiteList, fetchSiteList,
setJwt, setJwt,


@@ -16,8 +16,10 @@ function ClickMapCard({
   const onMarkerClick = (s: string, innerText: string) => {
     metricStore.changeClickMapSearch(s, innerText)
   }
-  const mapUrl = metricStore.instance.series[0].filter.filters[0].value[0]
   const sessionId = metricStore.instance.data.sessionId
+  const url = metricStore.instance.data.path;
+  const operator = metricStore.instance.series[0].filter.filters[0].operator

   React.useEffect(() => {
     return () => setCustomSession(null)
@@ -36,8 +38,16 @@ function ClickMapCard({
     const rangeValue = dashboardStore.drillDownPeriod.rangeValue
     const startDate = dashboardStore.drillDownPeriod.start
     const endDate = dashboardStore.drillDownPeriod.end
-    fetchInsights({ ...insightsFilters, url: mapUrl || '/', startDate, endDate, rangeValue, clickRage: metricStore.clickMapFilter })
-  }, [dashboardStore.drillDownPeriod.start, dashboardStore.drillDownPeriod.end, dashboardStore.drillDownPeriod.rangeValue, metricStore.clickMapFilter])
+    fetchInsights({
+      ...insightsFilters,
+      url,
+      startDate,
+      endDate,
+      rangeValue,
+      clickRage: metricStore.clickMapFilter,
+      operator,
+    })
+  }, [dashboardStore.drillDownPeriod.start, operator, url, dashboardStore.drillDownPeriod.end, dashboardStore.drillDownPeriod.rangeValue, metricStore.clickMapFilter])

   if (!metricStore.instance.data.domURL || insights.size === 0) {
     return (
@@ -59,7 +69,7 @@ function ClickMapCard({
   }

   const jumpToEvent = metricStore.instance.data.events.find((evt: Record<string, any>) => {
-    if (mapUrl) return evt.path.includes(mapUrl)
+    if (url) return evt.path.includes(url)
     return evt
   }) || { timestamp: metricStore.instance.data.startTs }
   const ts = jumpToEvent.timestamp ?? metricStore.instance.data.startTs


@@ -2,7 +2,7 @@ import { ArrowRightOutlined } from '@ant-design/icons';
 import { Button, Card, Radio } from 'antd';
 import React from 'react';
 import { connect } from 'react-redux';
-import { upgradeScope, downgradeScope } from "App/duck/user";
+import { upgradeScope, downgradeScope, getScope } from 'App/duck/user';
 import { useHistory } from 'react-router-dom';
 import * as routes from 'App/routes'
 import { SPOT_ONBOARDING } from "../../constants/storageKeys";
@@ -15,8 +15,18 @@ const Scope = {
 function ScopeForm({
   upgradeScope,
   downgradeScope,
+  scopeState,
 }: any) {
   const [scope, setScope] = React.useState(Scope.FULL);
+  React.useEffect(() => {
+    if (scopeState !== 0) {
+      if (scopeState === 2) {
+        history.replace(routes.onboarding())
+      } else {
+        history.replace(routes.spotsList())
+      }
+    }
+  }, [scopeState])
   React.useEffect(() => {
     const isSpotSetup = localStorage.getItem(SPOT_ONBOARDING)
     if (isSpotSetup) {
@ -36,50 +46,52 @@ function ScopeForm({
}; };
return ( return (
<div className={'flex items-center justify-center w-screen h-screen'}> <div className={'flex items-center justify-center w-screen h-screen'}>
<Card <Card
style={{ width: 540 }} style={{ width: 540 }}
title={'👋 Welcome to OpenReplay'} title={'👋 Welcome to OpenReplay'}
classNames={{ classNames={{
header: 'text-2xl font-semibold text-center', header: 'text-2xl font-semibold text-center',
body: 'flex flex-col gap-2', body: 'flex flex-col gap-2',
}} }}
>
<div className={'font-semibold'}>
How will you primarily use OpenReplay?{' '}
</div>
<div className={'text-disabled-text'}>
<div>
You will have access to all OpenReplay features regardless of your
choice.
</div>
<div>
Your preference will simply help us tailor your onboarding experience.
</div>
</div>
<Radio.Group
value={scope}
onChange={(e) => setScope(e.target.value)}
className={'flex flex-col gap-2 mt-4 '}
> >
<Radio value={'full'}> <div className={'font-semibold'}>
Session Replay & Debugging, Customer Support and more How will you primarily use OpenReplay?{' '}
</Radio> </div>
<Radio value={'spot'}>Report bugs via Spot</Radio> <div className={'text-disabled-text'}>
</Radio.Group> <div>
You will have access to all OpenReplay features regardless of your
<div className={'self-end'}> choice.
<Button </div>
type={'primary'} <div>
onClick={() => onContinue()} Your preference will simply help us tailor your onboarding experience.
icon={<ArrowRightOutlined />} </div>
iconPosition={'end'} </div>
<Radio.Group
value={scope}
onChange={(e) => setScope(e.target.value)}
className={'flex flex-col gap-2 mt-4 '}
> >
Continue <Radio value={'full'}>
</Button> Session Replay & Debugging, Customer Support and more
</div> </Radio>
</Card> <Radio value={'spot'}>Report bugs via Spot</Radio>
</Radio.Group>
<div className={'self-end'}>
<Button
type={'primary'}
onClick={() => onContinue()}
icon={<ArrowRightOutlined />}
iconPosition={'end'}
>
Continue
</Button>
</div>
</Card>
</div> </div>
); );
} }
-export default connect(null, { upgradeScope, downgradeScope })(ScopeForm);
+export default connect((state) => ({
+  scopeState: getScope(state),
+}), { upgradeScope, downgradeScope })(ScopeForm);


@@ -83,9 +83,8 @@ function Player(props: IProps) {
             <div
               onMouseDown={handleResize}
               className={'w-full h-2 cursor-ns-resize absolute top-0 left-0 z-20'}
-            >
-              <ConsolePanel isLive />
-            </div>
+            />
+            <ConsolePanel isLive />
           </div>
         ) : null}
         {!fullView && !isMultiview ? <LiveControls jump={playerContext.player.jump} /> : null}


@@ -6,6 +6,7 @@ import {
 } from '@ant-design/icons';
 import { Button, InputNumber, Popover } from 'antd';
 import { Slider } from 'antd';
+import cn from 'classnames';
 import { observer } from 'mobx-react-lite';
 import React, { useContext, useEffect, useRef, useState } from 'react';
@ -24,17 +25,38 @@ function DropdownAudioPlayer({
const [isMuted, setIsMuted] = useState(false); const [isMuted, setIsMuted] = useState(false);
const lastPlayerTime = useRef(0); const lastPlayerTime = useRef(0);
const audioRefs = useRef<Record<string, HTMLAudioElement | null>>({}); const audioRefs = useRef<Record<string, HTMLAudioElement | null>>({});
const fileLengths = useRef<Record<string, number>>({});
const { time = 0, speed = 1, playing, sessionStart } = store?.get() ?? {}; const { time = 0, speed = 1, playing, sessionStart } = store?.get() ?? {};
const files = audioEvents.map((pa) => { const files = React.useMemo(
const data = pa.payload; () =>
return { audioEvents.map((pa) => {
url: data.url, const data = pa.payload;
timestamp: data.timestamp, const nativeTs = data.timestamp;
start: pa.timestamp - sessionStart, const startTs = nativeTs
}; ? nativeTs > sessionStart
}); ? nativeTs - sessionStart
: nativeTs
: pa.timestamp - sessionStart;
return {
url: data.url,
timestamp: data.timestamp,
start: Math.max(0, startTs),
};
}),
[audioEvents.length, sessionStart]
);
React.useEffect(() => {
Object.entries(audioRefs.current).forEach(([url, audio]) => {
if (audio) {
audio.loop = false;
audio.addEventListener('loadedmetadata', () => {
fileLengths.current[url] = audio.duration;
});
}
});
}, [audioRefs.current]);
const toggleMute = () => { const toggleMute = () => {
Object.values(audioRefs.current).forEach((audio) => { Object.values(audioRefs.current).forEach((audio) => {
@@ -89,10 +111,15 @@ function DropdownAudioPlayer({
       if (audio) {
         const file = files.find((f) => f.url === key);
         if (file) {
-          audio.currentTime = Math.max(
-            (timeMs + delta * 1000 - file.start) / 1000,
-            0
-          );
+          const targetTime = (timeMs + delta * 1000 - file.start) / 1000;
+          const fileLength = fileLengths.current[key];
+          if (targetTime < 0 || (fileLength && targetTime > fileLength)) {
+            audio.pause();
+            audio.currentTime = 0;
+            return;
+          } else {
+            audio.currentTime = targetTime;
+          }
         }
       }
     });
@ -108,27 +135,39 @@ function DropdownAudioPlayer({
useEffect(() => { useEffect(() => {
const deltaMs = delta * 1000; const deltaMs = delta * 1000;
if (Math.abs(lastPlayerTime.current - time - deltaMs) >= 250) { const deltaTime = Math.abs(lastPlayerTime.current - time - deltaMs);
if (deltaTime >= 250) {
handleSeek(time); handleSeek(time);
} }
Object.entries(audioRefs.current).forEach(([url, audio]) => { Object.entries(audioRefs.current).forEach(([url, audio]) => {
if (audio) { if (audio) {
const file = files.find((f) => f.url === url); const file = files.find((f) => f.url === url);
if (file && time >= file.start) { const fileLength = fileLengths.current[url];
if (audio.paused && playing) { if (file) {
audio.play(); if (fileLength && fileLength * 1000 + file.start < time) {
return;
}
if (time >= file.start) {
if (audio.paused && playing) {
audio.play();
}
} else {
audio.pause();
} }
} else {
audio.pause();
}
if (audio.muted !== isMuted) {
audio.muted = isMuted;
} }
} }
}); });
lastPlayerTime.current = time + deltaMs; lastPlayerTime.current = time + deltaMs;
}, [time, delta]); }, [time, delta]);
useEffect(() => {
Object.values(audioRefs.current).forEach((audio) => {
if (audio) {
audio.muted = isMuted;
}
});
}, [isMuted]);
useEffect(() => { useEffect(() => {
changePlaybackSpeed(speed); changePlaybackSpeed(speed);
}, [speed]); }, [speed]);
@ -137,22 +176,30 @@ function DropdownAudioPlayer({
Object.entries(audioRefs.current).forEach(([url, audio]) => { Object.entries(audioRefs.current).forEach(([url, audio]) => {
if (audio) { if (audio) {
const file = files.find((f) => f.url === url); const file = files.find((f) => f.url === url);
if (file && playing && time >= file.start) { const fileLength = fileLengths.current[url];
audio.play(); if (file) {
} else { if (fileLength && fileLength * 1000 + file.start < time) {
audio.pause(); audio.pause();
return;
}
if (playing && time >= file.start) {
audio.play();
} else {
audio.pause();
}
} }
} }
}); });
setVolume(isMuted ? 0 : volume); setVolume(isMuted ? 0 : volume);
}, [playing]); }, [playing]);
const buttonIcon =
'px-2 cursor-pointer border border-gray-light hover:border-main hover:text-main hover:z-10 h-fit';
return ( return (
<div className={'relative'}> <div className={'relative'}>
<div className={'flex items-center'} style={{ height: 24 }}> <div className={'flex items-center'} style={{ height: 24 }}>
<Popover <Popover
trigger={'click'} trigger={'click'}
className={'h-full'}
content={ content={
<div <div
className={'flex flex-col gap-2 rounded'} className={'flex flex-col gap-2 rounded'}
@ -169,20 +216,14 @@ function DropdownAudioPlayer({
</div> </div>
} }
> >
<div <div className={cn(buttonIcon, 'rounded-l')}>
className={
'px-2 h-full cursor-pointer border rounded-l border-gray-light hover:border-main hover:text-main hover:z-10'
}
>
{isMuted ? <MutedOutlined /> : <SoundOutlined />} {isMuted ? <MutedOutlined /> : <SoundOutlined />}
</div> </div>
</Popover> </Popover>
<div <div
onClick={toggleVisible} onClick={toggleVisible}
style={{ marginLeft: -1 }} style={{ marginLeft: -1 }}
className={ className={cn(buttonIcon, 'rounded-r')}
'px-2 h-full border rounded-r border-gray-light cursor-pointer hover:border-main hover:text-main hover:z-10'
}
> >
<CaretDownOutlined /> <CaretDownOutlined />
</div> </div>
@ -236,6 +277,7 @@ function DropdownAudioPlayer({
<div style={{ display: 'none' }}> <div style={{ display: 'none' }}>
{files.map((file) => ( {files.map((file) => (
<audio <audio
loop={false}
key={file.url} key={file.url}
ref={(el) => (audioRefs.current[file.url] = el)} ref={(el) => (audioRefs.current[file.url] = el)}
controls controls
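
The reworked `files` memo in this component derives each clip's start offset from the payload's own timestamp when one is present (treating values larger than sessionStart as absolute), falls back to the message timestamp otherwise, and clamps the result at zero. The same decision as a standalone helper; the event shape is simplified and the player's delta correction is omitted:

```typescript
interface AudioEvent {
  timestamp: number; // message timestamp (absolute)
  payload: { url: string; timestamp?: number };
}

// Start offset of an audio clip relative to the session, in milliseconds.
function clipStartMs(event: AudioEvent, sessionStart: number): number {
  const nativeTs = event.payload.timestamp;
  const start = nativeTs
    ? nativeTs > sessionStart
      ? nativeTs - sessionStart // absolute timestamp: convert to an offset
      : nativeTs // already an offset from session start
    : event.timestamp - sessionStart; // no payload timestamp: fall back to the message time
  return Math.max(0, start);
}
```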


@@ -19,7 +19,7 @@ function ConsoleRow(props: Props) {
   return (
     <div
-      className={cn(stl.line, 'flex py-2 px-4 overflow-hidden group relative select-none', {
+      className={cn(stl.line, 'flex py-2 px-4 overflow-hidden group relative', {
         info: !log.isYellow && !log.isRed,
         warn: log.isYellow,
         error: log.isRed,


@@ -10,7 +10,7 @@ const SpotsListHeader = observer(
     onDelete,
     selectedCount,
     onClearSelection,
-    isEmpty,
+    tenantHasSpots,
     onRefresh,
   }: {
     onDelete: () => void;
@@ -18,6 +18,7 @@ const SpotsListHeader = observer(
     onClearSelection: () => void;
     onRefresh: () => void;
     isEmpty?: boolean;
+    tenantHasSpots: boolean;
   }) => {
     const { spotStore } = useStore();
@@ -52,7 +53,7 @@ const SpotsListHeader = observer(
           <ReloadButton buttonSize={'small'} onClick={onRefresh} iconSize={16} />
         </div>
-        {isEmpty ? null : (
+        {tenantHasSpots ? (
           <div className="flex gap-2 items-center">
             <div className={'ml-auto'}>
               {selectedCount > 0 && (
@@ -90,7 +91,7 @@ const SpotsListHeader = observer(
               />
             </div>
           </div>
-        )}
+        ) : null}
       </div>
     );
   }

View file

@@ -89,6 +89,7 @@ function SpotsList() {
           selectedCount={selectedSpots.length}
           onClearSelection={clearSelection}
           isEmpty={isEmpty}
+          tenantHasSpots={spotStore.tenantHasSpots}
         />
       </div>


@@ -117,7 +117,7 @@ function ConsolePanel({
     exceptionsList = [],
     logListNow = [],
     exceptionsListNow = [],
-  } = tabStates[currentTab];
+  } = tabStates[currentTab] ?? {};
   const list = isLive
     ? (useMemo(


@@ -45,7 +45,7 @@ function ConsoleRow(props: Props) {
     <div
       style={style}
       className={cn(
-        'border-b flex items-start py-1 px-4 pe-8 overflow-hidden group relative select-none',
+        'border-b flex items-start py-1 px-4 pe-8 overflow-hidden group relative',
         {
           info: !log.isYellow && !log.isRed,
           warn: log.isYellow,


@@ -25,7 +25,7 @@ const ALL = 'ALL';
 const TAB_KEYS = [ALL, ...typeList] as const;
 const TABS = TAB_KEYS.map((tab) => ({ text: tab, key: tab }));

-type EventsList = Array<Timed & { name: string; source: string; key: string }>;
+type EventsList = Array<Timed & { name: string; source: string; key: string; payload?: string[] }>;

 const WebStackEventPanelComp = observer(
   ({
@@ -95,7 +95,7 @@ export const MobileStackEventPanel = connect((state: Record<string, any>) => ({
   zoomEndTs: state.getIn(['components', 'player']).timelineZoom.endTs,
 }))(MobileStackEventPanelComp);

-function EventsPanel({
+const EventsPanel = observer(({
   list,
   listNow,
   jump,
@@ -109,7 +109,7 @@
   zoomEnabled: boolean;
   zoomStartTs: number;
   zoomEndTs: number;
-}) {
+}) => {
   const {
     sessionStore: { devTools },
   } = useStore();
@@ -126,13 +126,19 @@
     zoomEnabled ? zoomStartTs <= time && time <= zoomEndTs : true
   );

-  let filteredList = useRegExListFilterMemo(inZoomRangeList, (it) => it.name, filter);
+  let filteredList = useRegExListFilterMemo(inZoomRangeList, (it) => {
+    const searchBy = [it.name]
+    if (it.payload) {
+      const payload = Array.isArray(it.payload) ? it.payload.join(',') : JSON.stringify(it.payload);
+      searchBy.push(payload);
+    }
+    return searchBy
+  }, filter);
   filteredList = useTabListFilterMemo(filteredList, (it) => it.source, ALL, activeTab);
   const onTabClick = (activeTab: (typeof TAB_KEYS)[number]) =>
     devTools.update(INDEX_KEY, { activeTab });
-  const onFilterChange = ({ target: { value } }: React.ChangeEvent<HTMLInputElement>) =>
-    devTools.update(INDEX_KEY, { filter: value });
+  const onFilterChange = ({ target: { value } }: React.ChangeEvent<HTMLInputElement>) => devTools.update(INDEX_KEY, { filter: value });

   const tabs = useMemo(
     () => TABS.filter(({ key }) => key === ALL || inZoomRangeList.some(({ source }) => key === source)),
     [inZoomRangeList.length]
@@ -229,4 +235,4 @@
       </BottomBlock.Content>
     </BottomBlock>
   );
-}
+});
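
The filter getter above now matches the search text against both the event name and a serialized form of its payload. The same idea as a standalone predicate; the event shape and helper name are simplified assumptions, not the panel's actual hooks:

```typescript
interface StackEvent {
  name: string;
  payload?: string[] | Record<string, unknown>;
}

// True when the regex-style filter matches the event name or its payload.
function matchesStackEventFilter(event: StackEvent, filter: string): boolean {
  if (!filter) return true;
  let re: RegExp;
  try {
    re = new RegExp(filter, 'i');
  } catch {
    return true; // an invalid pattern should not hide everything
  }
  const haystack = [event.name];
  if (event.payload) {
    haystack.push(
      Array.isArray(event.payload) ? event.payload.join(',') : JSON.stringify(event.payload)
    );
  }
  return haystack.some((value) => re.test(value));
}
```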


@@ -1,10 +1,15 @@
 import { makeAutoObservable } from 'mobx';

 import { spotService } from 'App/services';
 import { UpdateSpotRequest } from 'App/services/spotService';

 import { Spot } from './types/spot';

 export default class SpotStore {
   isLoading: boolean = false;
   spots: Spot[] = [];
@@ -18,6 +23,7 @@ export default class SpotStore {
   pubKey: { value: string; expiration: number } | null = null;
   readonly order = 'desc';
   accessError = false;
+  tenantHasSpots = false;

   constructor() {
     makeAutoObservable(this);
@@ -81,13 +87,18 @@
       limit: this.limit,
     } as const;
-    const response = await this.withLoader(() =>
+    const { spots, tenantHasSpots, total } = await this.withLoader(() =>
       spotService.fetchSpots(filters)
     );
-    this.setSpots(response.spots.map((spot: any) => new Spot(spot)));
-    this.setTotal(response.total);
+    this.setSpots(spots.map((spot: any) => new Spot(spot)));
+    this.setTotal(total);
+    this.setTenantHasSpots(tenantHasSpots);
   };

+  setTenantHasSpots(hasSpots: boolean) {
+    this.tenantHasSpots = hasSpots;
+  }
+
   async fetchSpotById(id: string) {
     try {
       const response = await this.withLoader(() =>
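
The tenantHasSpots flag added to the store lets the list tell "this tenant has never recorded a Spot" apart from "the current search just matched nothing", which is why SpotsListHeader now receives it instead of relying on isEmpty alone. A small sketch of that decision; the function and mode names are illustrative:

```typescript
// Illustrative only: maps the two signals onto the three states the list can render.
type SpotsListMode = 'onboarding' | 'empty-results' | 'list';

function spotsListMode(tenantHasSpots: boolean, visibleCount: number): SpotsListMode {
  if (!tenantHasSpots) return 'onboarding'; // never recorded a Spot: hide bulk actions
  return visibleCount === 0 ? 'empty-results' : 'list'; // has Spots, but the filter matched none
}
```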


@ -3,7 +3,7 @@
position: absolute; position: absolute;
width: 36px; width: 36px;
height: 60px; height: 60px;
background-image: url("data:image/svg+xml,%3C%3Fxml version='1.0' encoding='UTF-8'%3F%3E%3Csvg enable-background='new 0 0 28 28' version='1.1' viewBox='0 0 28 28' xml:space='preserve' xmlns='http://www.w3.org/2000/svg'%3E%3Cpolygon points='8.2 20.9 8.2 4.9 19.8 16.5 13 16.5 12.6 16.6' fill='%23fff'/%3E%3Cpolygon points='17.3 21.6 13.7 23.1 9 12 12.7 10.5' fill='%23fff'/%3E%3Crect transform='matrix(.9221 -.3871 .3871 .9221 -5.7605 6.5909)' x='12.5' y='13.6' width='2' height='8'/%3E%3Cpolygon points='9.2 7.3 9.2 18.5 12.2 15.6 12.6 15.5 17.4 15.5'/%3E%3C/svg%3E"); background-image: url("data:image/svg+xml,%3Csvg enable-background='new 0 0 20 20' version='1.1' viewBox='8.5 5.2 28 28' xml:space='preserve' xmlns='http://www.w3.org/2000/svg'%3E%3Cpolygon points='8.2 20.9 8.2 4.9 19.8 16.5 13 16.5 12.6 16.6' fill='%23fff'/%3E%3Cpolygon points='17.3 21.6 13.7 23.1 9 12 12.7 10.5' fill='%23fff'/%3E%3Crect transform='matrix(.9221 -.3871 .3871 .9221 -5.7605 6.5909)' x='12.5' y='13.6' width='2' height='8'/%3E%3Cpolygon points='9.2 7.3 9.2 18.5 12.2 15.6 12.6 15.5 17.4 15.5'/%3E%3C/svg%3E%0A");
background-repeat: no-repeat; background-repeat: no-repeat;
transition: top .15s ease-out, left .15s ease-out; transition: top .15s ease-out, left .15s ease-out;


@@ -1,5 +1,4 @@
 import logger from 'App/logger';
-import { resolveURL } from "../../messages/rewriter/urlResolve";

 import type Screen from '../../Screen/Screen';
 import type { Message, SetNodeScroll } from '../../messages';
@@ -32,6 +31,8 @@ export default class DOMManager extends ListWalker<Message> {
   private readonly vTexts: Map<number, VText> = new Map() // map vs object here?
   private readonly vElements: Map<number, VElement> = new Map()
   private readonly olVRoots: Map<number, OnloadVRoot> = new Map()
+  /** required to keep track of iframes, frameId : vnodeId */
+  private readonly iframeRoots: Record<number, number> = {}
   /** Constructed StyleSheets https://developer.mozilla.org/en-US/docs/Web/API/Document/adoptedStyleSheets
    * as well as <style> tag owned StyleSheets
    */
@@ -219,6 +220,10 @@ export default class DOMManager extends ListWalker<Message> {
     if (['STYLE', 'style', 'LINK'].includes(msg.tag)) {
       vElem.prioritized = true
     }
+    if (this.vElements.has(msg.id)) {
+      logger.error("CreateElementNode: Node already exists", msg)
+      return
+    }
     this.vElements.set(msg.id, vElem)
     this.insertNode(msg)
     this.removeBodyScroll(msg.id, vElem)
@@ -316,6 +321,10 @@ export default class DOMManager extends ListWalker<Message> {
       case MType.CreateIFrameDocument: {
         const vElem = this.vElements.get(msg.frameID)
         if (!vElem) { logger.error("CreateIFrameDocument: Node not found", msg); return }
+        if (this.iframeRoots[msg.frameID] && !this.olVRoots.has(msg.id)) {
+          this.olVRoots.delete(this.iframeRoots[msg.frameID])
+        }
+        this.iframeRoots[msg.frameID] = msg.id
         const vRoot = OnloadVRoot.fromVElement(vElem)
         vRoot.catch(e => logger.warn(e, msg))
         this.olVRoots.set(msg.id, vRoot)
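
The iframeRoots map added above records which virtual root currently backs each iframe (frameId to vnodeId), so that when an iframe document is re-created the stale root can be evicted before the new one is registered. A standalone sketch of that bookkeeping with the types simplified; VRoot stands in for the player's OnloadVRoot:

```typescript
type VRoot = { id: number }; // placeholder for OnloadVRoot

const iframeRoots: Record<number, number> = {}; // frameId -> current vnodeId
const olVRoots = new Map<number, VRoot>();

function registerIframeDocument(frameId: number, rootId: number) {
  const previous = iframeRoots[frameId];
  // A new document for an already-tracked iframe: drop the old root so it cannot leak.
  if (previous !== undefined && !olVRoots.has(rootId)) {
    olVRoots.delete(previous);
  }
  iframeRoots[frameId] = rootId;
  olVRoots.set(rootId, { id: rootId });
}
```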


@@ -33,6 +33,7 @@ interface AddCommentRequest {
 interface GetSpotsResponse {
   spots: SpotInfo[];
   total: number;
+  tenantHasSpots: boolean;
 }

 interface GetSpotsRequest {


@@ -504,7 +504,6 @@ export function truncateStringToFit(string: string, screenWidth: number, charWid
 let sendingRequest = false;
 export const handleSpotJWT = (jwt: string) => {
-  console.log(jwt, sendingRequest)
   let tries = 0;
   if (!jwt || sendingRequest) {
     return;


@@ -30,7 +30,7 @@
     "@floating-ui/react-dom-interactions": "^0.10.3",
     "@medv/finder": "^3.1.0",
     "@reduxjs/toolkit": "^2.2.2",
-    "@sentry/browser": "^5.21.1",
+    "@sentry/browser": "^8.34.0",
     "@svg-maps/world": "^1.0.1",
     "@svgr/webpack": "^6.2.1",
     "@wojtekmaj/react-daterange-picker": "^6.0.0",
@@ -142,7 +142,7 @@
     "mini-css-extract-plugin": "^2.6.0",
     "minio": "^7.1.3",
     "node-gyp": "^9.0.0",
-    "postcss": "^8.4.14",
+    "postcss": "^8.4.39",
     "postcss-import": "^14.1.0",
     "postcss-loader": "^7.0.0",
     "postcss-mixins": "^9.0.2",
@@ -159,7 +159,7 @@
     "ts-jest": "^29.0.5",
     "ts-node": "^10.7.0",
     "typescript": "^4.6.4",
-    "webpack": "^5.92.1",
+    "webpack": "^5.94.0",
     "webpack-cli": "^5.1.4",
     "webpack-dev-server": "^5.0.4"
   },


@ -3188,67 +3188,90 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/browser@npm:^5.21.1": "@sentry-internal/browser-utils@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/browser@npm:5.30.0" resolution: "@sentry-internal/browser-utils@npm:8.34.0"
dependencies: dependencies:
"@sentry/core": 5.30.0 "@sentry/core": 8.34.0
"@sentry/types": 5.30.0 "@sentry/types": 8.34.0
"@sentry/utils": 5.30.0 "@sentry/utils": 8.34.0
tslib: ^1.9.3 checksum: fb764f52a989307bb6369a2ae24bb83ef9880c108d2cc2aba94106c846dae743ba379291c84539306d26c3f35f16b6cd2341fa32e85cb942057f2905a33e82bf
checksum: 6793e1b49a8cdb1f025115bcc591bf67c97b6515f62a33ffcbb7b1ab66e459ebc471797d02e471be1ebf14092b56eb25ed914f043962388cc224bc961e334a17
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/core@npm:5.30.0": "@sentry-internal/feedback@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/core@npm:5.30.0" resolution: "@sentry-internal/feedback@npm:8.34.0"
dependencies: dependencies:
"@sentry/hub": 5.30.0 "@sentry/core": 8.34.0
"@sentry/minimal": 5.30.0 "@sentry/types": 8.34.0
"@sentry/types": 5.30.0 "@sentry/utils": 8.34.0
"@sentry/utils": 5.30.0 checksum: 7137a6b589cb56b541df52abd75a73280d3f8fd09f1983f298e29647c5dd941d5fc404d32599d4c1fe2bdcb7693d0e7886f2c08c10ad1eb7c8e17cad650e4cb3
tslib: ^1.9.3
checksum: 6407b9c2a6a56f90c198f5714b3257df24d89d1b4ca6726bd44760d0adabc25798b69fef2c88ccea461c7e79e3c78861aaebfd51fd3cb892aee656c3f7e11801
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/hub@npm:5.30.0": "@sentry-internal/replay-canvas@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/hub@npm:5.30.0" resolution: "@sentry-internal/replay-canvas@npm:8.34.0"
dependencies: dependencies:
"@sentry/types": 5.30.0 "@sentry-internal/replay": 8.34.0
"@sentry/utils": 5.30.0 "@sentry/core": 8.34.0
tslib: ^1.9.3 "@sentry/types": 8.34.0
checksum: 386c91d06aa44be0465fc11330d748a113e464d41cd562a9e1d222a682cbcb14e697a3e640953e7a0239997ad8a02b223a0df3d9e1d8816cb823fd3613be3e2f "@sentry/utils": 8.34.0
checksum: 55c53be37e0c06706e099a96d1485636b8d3f11b72078c279fda6e7992205d217d27dae9e609db2c0466db0755bd038087e76cfe746eaff9ce39bbfd1f1571a5
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/minimal@npm:5.30.0": "@sentry-internal/replay@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/minimal@npm:5.30.0" resolution: "@sentry-internal/replay@npm:8.34.0"
dependencies: dependencies:
"@sentry/hub": 5.30.0 "@sentry-internal/browser-utils": 8.34.0
"@sentry/types": 5.30.0 "@sentry/core": 8.34.0
tslib: ^1.9.3 "@sentry/types": 8.34.0
checksum: 34ec05503de46d01f98c94701475d5d89cc044892c86ccce30e01f62f28344eb23b718e7cf573815e46f30a4ac9da3129bed9b3d20c822938acfb40cbe72437b "@sentry/utils": 8.34.0
checksum: 8a4b6f1f169584ddd62c372760168ea2d63ca0d6ebd6433e45d760fcbb2610418a2bf6546bbda49ecd619deddf39b4ac268b87a15adbb56efc0b86edf4c40dd9
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/types@npm:5.30.0": "@sentry/browser@npm:^8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/types@npm:5.30.0" resolution: "@sentry/browser@npm:8.34.0"
checksum: 99c6e55c0a82c8ca95be2e9dbb35f581b29e4ff7af74b23bc62b690de4e35febfa15868184a2303480ef86babd4fea5273cf3b5ddf4a27685b841a72f13a0c88 dependencies:
"@sentry-internal/browser-utils": 8.34.0
"@sentry-internal/feedback": 8.34.0
"@sentry-internal/replay": 8.34.0
"@sentry-internal/replay-canvas": 8.34.0
"@sentry/core": 8.34.0
"@sentry/types": 8.34.0
"@sentry/utils": 8.34.0
checksum: 8a08033fce2908018cc3fc81cf1110a93a338c0d370628a2e9aaa9f43703041824462474037e59b2b166141835b7e94b437325bd6a46bb8371e37b659b216d10
languageName: node languageName: node
linkType: hard linkType: hard
"@sentry/utils@npm:5.30.0": "@sentry/core@npm:8.34.0":
version: 5.30.0 version: 8.34.0
resolution: "@sentry/utils@npm:5.30.0" resolution: "@sentry/core@npm:8.34.0"
dependencies: dependencies:
"@sentry/types": 5.30.0 "@sentry/types": 8.34.0
tslib: ^1.9.3 "@sentry/utils": 8.34.0
checksum: ca8eebfea7ac7db6d16f6c0b8a66ac62587df12a79ce9d0d8393f4d69880bb8d40d438f9810f7fb107a9880fe0d68bbf797b89cbafd113e89a0829eb06b205f8 checksum: 0ab7e11bd382cb47ade38f3c9615e6fb876bad43eba4b376a51e44b1c57e00efe2e74f3cc0790a8da6c0be16093086bc65d89cf5387453f93ae96e10a41a0d60
languageName: node
linkType: hard
"@sentry/types@npm:8.34.0":
version: 8.34.0
resolution: "@sentry/types@npm:8.34.0"
checksum: d35bf72129f621af2f7916b0805c6948d210791757bee690fc6b68f2412bbe80c8ec704a0f8eb8ee45eb78deeadbd3c69830469b62fba4827506ea30c235f4e8
languageName: node
linkType: hard
"@sentry/utils@npm:8.34.0":
version: 8.34.0
resolution: "@sentry/utils@npm:8.34.0"
dependencies:
"@sentry/types": 8.34.0
checksum: 60612dba8320c736f9559ba2fb4efe2927fd9d4a1f29bff36f116ad30c9ce210f6677013052a69cb7e16c5d28f1d8d7465d9278e72f7384b84e924cf3ed2790c
languageName: node languageName: node
linkType: hard linkType: hard
@ -7667,9 +7690,9 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"body-parser@npm:1.20.2": "body-parser@npm:1.20.3":
version: 1.20.2 version: 1.20.3
resolution: "body-parser@npm:1.20.2" resolution: "body-parser@npm:1.20.3"
dependencies: dependencies:
bytes: 3.1.2 bytes: 3.1.2
content-type: ~1.0.5 content-type: ~1.0.5
@ -7679,11 +7702,11 @@ __metadata:
http-errors: 2.0.0 http-errors: 2.0.0
iconv-lite: 0.4.24 iconv-lite: 0.4.24
on-finished: 2.4.1 on-finished: 2.4.1
qs: 6.11.0 qs: 6.13.0
raw-body: 2.5.2 raw-body: 2.5.2
type-is: ~1.6.18 type-is: ~1.6.18
unpipe: 1.0.0 unpipe: 1.0.0
checksum: 06f1438fff388a2e2354c96aa3ea8147b79bfcb1262dfcc2aae68ec13723d01d5781680657b74e9f83c808266d5baf52804032fbde2b7382b89bd8cdb273ace9 checksum: 0a9a93b7518f222885498dcecaad528cf010dd109b071bf471c93def4bfe30958b83e03496eb9c1ad4896db543d999bb62be1a3087294162a88cfa1b42c16310
languageName: node languageName: node
linkType: hard linkType: hard
@ -9099,10 +9122,10 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"cookie@npm:0.6.0": "cookie@npm:0.7.1":
version: 0.6.0 version: 0.7.1
resolution: "cookie@npm:0.6.0" resolution: "cookie@npm:0.7.1"
checksum: f2318b31af7a31b4ddb4a678d024514df5e705f9be5909a192d7f116cfb6d45cbacf96a473fa733faa95050e7cff26e7832bb3ef94751592f1387b71c8956686 checksum: 5de60c67a410e7c8dc8a46a4b72eb0fe925871d057c9a5d2c0e8145c4270a4f81076de83410c4d397179744b478e33cd80ccbcc457abf40a9409ad27dcd21dde
languageName: node languageName: node
linkType: hard linkType: hard
@ -10507,9 +10530,9 @@ __metadata:
linkType: hard linkType: hard
"dompurify@npm:^2.2.0": "dompurify@npm:^2.2.0":
version: 2.5.0 version: 2.5.7
resolution: "dompurify@npm:2.5.0" resolution: "dompurify@npm:2.5.7"
checksum: 637dcf3430f3fedf66b58f84fd59ea9b3615a19a6db5efe444c635b2473a77a345b31d7328b56dbc80f692791915ffd6049d69041ff013e33692fdb8b0d84e48 checksum: 23c4f737182fcf3e731e458c3930ef4d2916191e4180e1e345f153124dfa7ec117d2810af1754e8854c581131fc75dac914a8391183d1511852ef32b4055f711
languageName: node languageName: node
linkType: hard linkType: hard
@ -10639,8 +10662,8 @@ __metadata:
linkType: hard linkType: hard
"elliptic@npm:^6.5.3, elliptic@npm:^6.5.5": "elliptic@npm:^6.5.3, elliptic@npm:^6.5.5":
version: 6.5.5 version: 6.5.7
resolution: "elliptic@npm:6.5.5" resolution: "elliptic@npm:6.5.7"
dependencies: dependencies:
bn.js: ^4.11.9 bn.js: ^4.11.9
brorand: ^1.1.0 brorand: ^1.1.0
@ -10649,7 +10672,7 @@ __metadata:
inherits: ^2.0.4 inherits: ^2.0.4
minimalistic-assert: ^1.0.1 minimalistic-assert: ^1.0.1
minimalistic-crypto-utils: ^1.0.1 minimalistic-crypto-utils: ^1.0.1
checksum: 3e591e93783a1b66f234ebf5bd3a8a9a8e063a75073a35a671e03e3b25253b6e33ac121f7efe9b8808890fffb17b40596cc19d01e6e8d1fa13b9a56ff65597c8 checksum: 799959b6c54ea3564e8961f35abdf8c77e37617f3051614b05ab1fb6a04ddb65bd1caa75ed1bae375b15dda312a0f79fed26ebe76ecf05c5a7af244152a601b8
languageName: node languageName: node
linkType: hard linkType: hard
@ -10695,6 +10718,13 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"encodeurl@npm:~2.0.0":
version: 2.0.0
resolution: "encodeurl@npm:2.0.0"
checksum: 5d317306acb13e6590e28e27924c754163946a2480de11865c991a3a7eed4315cd3fba378b543ca145829569eefe9b899f3d84bb09870f675ae60bc924b01ceb
languageName: node
linkType: hard
"encoding@npm:^0.1.11, encoding@npm:^0.1.13": "encoding@npm:^0.1.11, encoding@npm:^0.1.13":
version: 0.1.13 version: 0.1.13
resolution: "encoding@npm:0.1.13" resolution: "encoding@npm:0.1.13"
@ -10765,13 +10795,13 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"enhanced-resolve@npm:^5.17.0": "enhanced-resolve@npm:^5.17.1":
version: 5.17.0 version: 5.17.1
resolution: "enhanced-resolve@npm:5.17.0" resolution: "enhanced-resolve@npm:5.17.1"
dependencies: dependencies:
graceful-fs: ^4.2.4 graceful-fs: ^4.2.4
tapable: ^2.2.0 tapable: ^2.2.0
checksum: 90065e58e4fd08e77ba47f827eaa17d60c335e01e4859f6e644bb3b8d0e32b203d33894aee92adfa5121fa262f912b48bdf0d0475e98b4a0a1132eea1169ad37 checksum: 81a0515675eca17efdba2cf5bad87abc91a528fc1191aad50e275e74f045b41506167d420099022da7181c8d787170ea41e4a11a0b10b7a16f6237daecb15370
languageName: node languageName: node
linkType: hard linkType: hard
@ -11464,41 +11494,41 @@ __metadata:
linkType: hard linkType: hard
"express@npm:^4.17.1, express@npm:^4.17.3": "express@npm:^4.17.1, express@npm:^4.17.3":
version: 4.19.2 version: 4.21.1
resolution: "express@npm:4.19.2" resolution: "express@npm:4.21.1"
dependencies: dependencies:
accepts: ~1.3.8 accepts: ~1.3.8
array-flatten: 1.1.1 array-flatten: 1.1.1
body-parser: 1.20.2 body-parser: 1.20.3
content-disposition: 0.5.4 content-disposition: 0.5.4
content-type: ~1.0.4 content-type: ~1.0.4
cookie: 0.6.0 cookie: 0.7.1
cookie-signature: 1.0.6 cookie-signature: 1.0.6
debug: 2.6.9 debug: 2.6.9
depd: 2.0.0 depd: 2.0.0
encodeurl: ~1.0.2 encodeurl: ~2.0.0
escape-html: ~1.0.3 escape-html: ~1.0.3
etag: ~1.8.1 etag: ~1.8.1
finalhandler: 1.2.0 finalhandler: 1.3.1
fresh: 0.5.2 fresh: 0.5.2
http-errors: 2.0.0 http-errors: 2.0.0
merge-descriptors: 1.0.1 merge-descriptors: 1.0.3
methods: ~1.1.2 methods: ~1.1.2
on-finished: 2.4.1 on-finished: 2.4.1
parseurl: ~1.3.3 parseurl: ~1.3.3
path-to-regexp: 0.1.7 path-to-regexp: 0.1.10
proxy-addr: ~2.0.7 proxy-addr: ~2.0.7
qs: 6.11.0 qs: 6.13.0
range-parser: ~1.2.1 range-parser: ~1.2.1
safe-buffer: 5.2.1 safe-buffer: 5.2.1
send: 0.18.0 send: 0.19.0
serve-static: 1.15.0 serve-static: 1.16.2
setprototypeof: 1.2.0 setprototypeof: 1.2.0
statuses: 2.0.1 statuses: 2.0.1
type-is: ~1.6.18 type-is: ~1.6.18
utils-merge: 1.0.1 utils-merge: 1.0.1
vary: ~1.1.2 vary: ~1.1.2
checksum: e82e2662ea9971c1407aea9fc3c16d6b963e55e3830cd0ef5e00b533feda8b770af4e3be630488ef8a752d7c75c4fcefb15892868eeaafe7353cb9e3e269fdcb checksum: 0c287867e5f6129d3def1edd9b63103a53c40d4dc8628839d4b6827e35eb8f0de5a4656f9d85f4457eba584f9871ebb2ad26c750b36bd75d9bbb8bcebdc4892c
languageName: node languageName: node
linkType: hard linkType: hard
@ -11638,13 +11668,13 @@ __metadata:
linkType: hard linkType: hard
"fast-xml-parser@npm:^4.2.2": "fast-xml-parser@npm:^4.2.2":
version: 4.3.6 version: 4.5.0
resolution: "fast-xml-parser@npm:4.3.6" resolution: "fast-xml-parser@npm:4.5.0"
dependencies: dependencies:
strnum: ^1.0.5 strnum: ^1.0.5
bin: bin:
fxparser: src/cli/cli.js fxparser: src/cli/cli.js
checksum: 9ebe2ac142c6978cae423c39c2a9b561edb76be584317d578768ed4a006a61fc0e83abf8c6fe31029139c4ad15ea1f2e7b6720ba9e6eda0e5266d7f2770fb079 checksum: 71d206c9e137f5c26af88d27dde0108068a5d074401901d643c500c36e95dfd828333a98bda020846c41f5b9b364e2b0e9be5b19b0bdcab5cf31559c07b80a95
languageName: node languageName: node
linkType: hard linkType: hard
@ -11794,18 +11824,18 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"finalhandler@npm:1.2.0": "finalhandler@npm:1.3.1":
version: 1.2.0 version: 1.3.1
resolution: "finalhandler@npm:1.2.0" resolution: "finalhandler@npm:1.3.1"
dependencies: dependencies:
debug: 2.6.9 debug: 2.6.9
encodeurl: ~1.0.2 encodeurl: ~2.0.0
escape-html: ~1.0.3 escape-html: ~1.0.3
on-finished: 2.4.1 on-finished: 2.4.1
parseurl: ~1.3.3 parseurl: ~1.3.3
statuses: 2.0.1 statuses: 2.0.1
unpipe: ~1.0.0 unpipe: ~1.0.0
checksum: 64b7e5ff2ad1fcb14931cd012651631b721ce657da24aedb5650ddde9378bf8e95daa451da43398123f5de161a81e79ff5affe4f9f2a6d2df4a813d6d3e254b7 checksum: d38035831865a49b5610206a3a9a9aae4e8523cbbcd01175d0480ffbf1278c47f11d89be3ca7f617ae6d94f29cf797546a4619cd84dd109009ef33f12f69019f
languageName: node languageName: node
linkType: hard linkType: hard
@ -16660,10 +16690,10 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"merge-descriptors@npm:1.0.1": "merge-descriptors@npm:1.0.3":
version: 1.0.1 version: 1.0.3
resolution: "merge-descriptors@npm:1.0.1" resolution: "merge-descriptors@npm:1.0.3"
checksum: b67d07bd44cfc45cebdec349bb6e1f7b077ee2fd5beb15d1f7af073849208cb6f144fe403e29a36571baf3f4e86469ac39acf13c318381e958e186b2766f54ec checksum: 866b7094afd9293b5ea5dcd82d71f80e51514bed33b4c4e9f516795dc366612a4cbb4dc94356e943a8a6914889a914530badff27f397191b9b75cda20b6bae93
languageName: node languageName: node
linkType: hard linkType: hard
@ -18149,7 +18179,7 @@ __metadata:
"@medv/finder": ^3.1.0 "@medv/finder": ^3.1.0
"@openreplay/sourcemap-uploader": ^3.0.8 "@openreplay/sourcemap-uploader": ^3.0.8
"@reduxjs/toolkit": ^2.2.2 "@reduxjs/toolkit": ^2.2.2
"@sentry/browser": ^5.21.1 "@sentry/browser": ^8.34.0
"@storybook/addon-actions": ^6.5.12 "@storybook/addon-actions": ^6.5.12
"@storybook/addon-docs": ^6.5.12 "@storybook/addon-docs": ^6.5.12
"@storybook/addon-essentials": ^6.5.12 "@storybook/addon-essentials": ^6.5.12
@ -18217,7 +18247,7 @@ __metadata:
mobx-react-lite: ^3.1.6 mobx-react-lite: ^3.1.6
node-gyp: ^9.0.0 node-gyp: ^9.0.0
peerjs: 1.3.2 peerjs: 1.3.2
postcss: ^8.4.14 postcss: ^8.4.39
postcss-import: ^14.1.0 postcss-import: ^14.1.0
postcss-loader: ^7.0.0 postcss-loader: ^7.0.0
postcss-mixins: ^9.0.2 postcss-mixins: ^9.0.2
@ -18260,7 +18290,7 @@ __metadata:
ts-node: ^10.7.0 ts-node: ^10.7.0
typescript: ^4.6.4 typescript: ^4.6.4
virtua: ^0.33.4 virtua: ^0.33.4
webpack: ^5.92.1 webpack: ^5.94.0
webpack-cli: ^5.1.4 webpack-cli: ^5.1.4
webpack-dev-server: ^5.0.4 webpack-dev-server: ^5.0.4
languageName: unknown languageName: unknown
@ -18770,10 +18800,10 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"path-to-regexp@npm:0.1.7": "path-to-regexp@npm:0.1.10":
version: 0.1.7 version: 0.1.10
resolution: "path-to-regexp@npm:0.1.7" resolution: "path-to-regexp@npm:0.1.10"
checksum: 50a1ddb1af41a9e68bd67ca8e331a705899d16fb720a1ea3a41e310480948387daf603abb14d7b0826c58f10146d49050a1291ba6a82b78a382d1c02c0b8f905 checksum: 34196775b9113ca6df88e94c8d83ba82c0e1a2063dd33bfe2803a980da8d49b91db8104f49d5191b44ea780d46b8670ce2b7f4a5e349b0c48c6779b653f1afe4
languageName: node languageName: node
linkType: hard linkType: hard
@ -18873,6 +18903,13 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"picocolors@npm:^1.1.0":
version: 1.1.0
resolution: "picocolors@npm:1.1.0"
checksum: 86946f6032148801ef09c051c6fb13b5cf942eaf147e30ea79edb91dd32d700934edebe782a1078ff859fb2b816792e97ef4dab03d7f0b804f6b01a0df35e023
languageName: node
linkType: hard
"picomatch@npm:^2.0.4, picomatch@npm:^2.2.1, picomatch@npm:^2.2.3, picomatch@npm:^2.3.0, picomatch@npm:^2.3.1": "picomatch@npm:^2.0.4, picomatch@npm:^2.2.1, picomatch@npm:^2.2.3, picomatch@npm:^2.3.0, picomatch@npm:^2.3.1":
version: 2.3.1 version: 2.3.1
resolution: "picomatch@npm:2.3.1" resolution: "picomatch@npm:2.3.1"
@ -19594,7 +19631,7 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"postcss@npm:^8.2.15, postcss@npm:^8.4.14, postcss@npm:^8.4.23, postcss@npm:^8.4.33": "postcss@npm:^8.2.15, postcss@npm:^8.4.23, postcss@npm:^8.4.33":
version: 8.4.38 version: 8.4.38
resolution: "postcss@npm:8.4.38" resolution: "postcss@npm:8.4.38"
dependencies: dependencies:
@ -19605,6 +19642,17 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"postcss@npm:^8.4.39":
version: 8.4.47
resolution: "postcss@npm:8.4.47"
dependencies:
nanoid: ^3.3.7
picocolors: ^1.1.0
source-map-js: ^1.2.1
checksum: 929f68b5081b7202709456532cee2a145c1843d391508c5a09de2517e8c4791638f71dd63b1898dba6712f8839d7a6da046c72a5e44c162e908f5911f57b5f44
languageName: node
linkType: hard
"prelude-ls@npm:^1.2.1": "prelude-ls@npm:^1.2.1":
version: 1.2.1 version: 1.2.1
resolution: "prelude-ls@npm:1.2.1" resolution: "prelude-ls@npm:1.2.1"
@ -19961,12 +20009,12 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"qs@npm:6.11.0": "qs@npm:6.13.0":
version: 6.11.0 version: 6.13.0
resolution: "qs@npm:6.11.0" resolution: "qs@npm:6.13.0"
dependencies: dependencies:
side-channel: ^1.0.4 side-channel: ^1.0.6
checksum: 4e4875e4d7c7c31c233d07a448e7e4650f456178b9dd3766b7cfa13158fdb24ecb8c4f059fa91e820dc6ab9f2d243721d071c9c0378892dcdad86e9e9a27c68f checksum: 62372cdeec24dc83a9fb240b7533c0fdcf0c5f7e0b83343edd7310f0ab4c8205a5e7c56406531f2e47e1b4878a3821d652be4192c841de5b032ca83619d8f860
languageName: node languageName: node
linkType: hard linkType: hard
@ -22349,9 +22397,9 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"send@npm:0.18.0": "send@npm:0.19.0":
version: 0.18.0 version: 0.19.0
resolution: "send@npm:0.18.0" resolution: "send@npm:0.19.0"
dependencies: dependencies:
debug: 2.6.9 debug: 2.6.9
depd: 2.0.0 depd: 2.0.0
@ -22366,7 +22414,7 @@ __metadata:
on-finished: 2.4.1 on-finished: 2.4.1
range-parser: ~1.2.1 range-parser: ~1.2.1
statuses: 2.0.1 statuses: 2.0.1
checksum: 0eb134d6a51fc13bbcb976a1f4214ea1e33f242fae046efc311e80aff66c7a43603e26a79d9d06670283a13000e51be6e0a2cb80ff0942eaf9f1cd30b7ae736a checksum: ea3f8a67a8f0be3d6bf9080f0baed6d2c51d11d4f7b4470de96a5029c598a7011c497511ccc28968b70ef05508675cebff27da9151dd2ceadd60be4e6cf845e3
languageName: node languageName: node
linkType: hard linkType: hard
@ -22425,15 +22473,15 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"serve-static@npm:1.15.0": "serve-static@npm:1.16.2":
version: 1.15.0 version: 1.16.2
resolution: "serve-static@npm:1.15.0" resolution: "serve-static@npm:1.16.2"
dependencies: dependencies:
encodeurl: ~1.0.2 encodeurl: ~2.0.0
escape-html: ~1.0.3 escape-html: ~1.0.3
parseurl: ~1.3.3 parseurl: ~1.3.3
send: 0.18.0 send: 0.19.0
checksum: fa9f0e21a540a28f301258dfe1e57bb4f81cd460d28f0e973860477dd4acef946a1f41748b5bd41c73b621bea2029569c935faa38578fd34cd42a9b4947088ba checksum: 528fff6f5e12d0c5a391229ad893910709bc51b5705962b09404a1d813857578149b8815f35d3ee5752f44cd378d0f31669d4b1d7e2d11f41e08283d5134bd1f
languageName: node languageName: node
linkType: hard linkType: hard
@ -22828,6 +22876,13 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"source-map-js@npm:^1.2.1":
version: 1.2.1
resolution: "source-map-js@npm:1.2.1"
checksum: 7bda1fc4c197e3c6ff17de1b8b2c20e60af81b63a52cb32ec5a5d67a20a7d42651e2cb34ebe93833c5a2a084377e17455854fee3e21e7925c64a51b6a52b0faf
languageName: node
linkType: hard
"source-map-resolve@npm:^0.5.0": "source-map-resolve@npm:^0.5.0":
version: 0.5.3 version: 0.5.3
resolution: "source-map-resolve@npm:0.5.3" resolution: "source-map-resolve@npm:0.5.3"
@ -25619,11 +25674,10 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"webpack@npm:^5.92.1": "webpack@npm:^5.94.0":
version: 5.92.1 version: 5.95.0
resolution: "webpack@npm:5.92.1" resolution: "webpack@npm:5.95.0"
dependencies: dependencies:
"@types/eslint-scope": ^3.7.3
"@types/estree": ^1.0.5 "@types/estree": ^1.0.5
"@webassemblyjs/ast": ^1.12.1 "@webassemblyjs/ast": ^1.12.1
"@webassemblyjs/wasm-edit": ^1.12.1 "@webassemblyjs/wasm-edit": ^1.12.1
@ -25632,7 +25686,7 @@ __metadata:
acorn-import-attributes: ^1.9.5 acorn-import-attributes: ^1.9.5
browserslist: ^4.21.10 browserslist: ^4.21.10
chrome-trace-event: ^1.0.2 chrome-trace-event: ^1.0.2
enhanced-resolve: ^5.17.0 enhanced-resolve: ^5.17.1
es-module-lexer: ^1.2.1 es-module-lexer: ^1.2.1
eslint-scope: 5.1.1 eslint-scope: 5.1.1
events: ^3.2.0 events: ^3.2.0
@ -25652,7 +25706,7 @@ __metadata:
optional: true optional: true
bin: bin:
webpack: bin/webpack.js webpack: bin/webpack.js
checksum: 43ca7c76b9c1005bd85f05303d048f918bac10276a209e3ef5e359353fbfef4e5fcee876265e6bc305bf5ef326576e02df63bc7e5af878fb7f06d7e1795b811a checksum: b9e6d0f8ebcbf0632494ac0b90fe4acb8f4a9b83f7ace4a67a15545a36fe58599c912ab58e625e1bf58ab3b0916c75fe99da6196d412ee0cab0b5065edd84238
languageName: node languageName: node
linkType: hard linkType: hard


@ -9,7 +9,7 @@
"version": "v1.12.0", "version": "v1.12.0",
"license": "Elastic License 2.0 (ELv2)", "license": "Elastic License 2.0 (ELv2)",
"dependencies": { "dependencies": {
"express": "^4.18.2", "express": "^4.21.1",
"peer": "^v1.0.1", "peer": "^v1.0.1",
"winston": "^3.13.0" "winston": "^3.13.0"
} }
@ -313,9 +313,9 @@
} }
}, },
"node_modules/cookie": { "node_modules/cookie": {
"version": "0.6.0", "version": "0.7.1",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz", "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
"integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==", "integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
"engines": { "engines": {
"node": ">= 0.6" "node": ">= 0.6"
} }
@ -450,16 +450,16 @@
} }
}, },
"node_modules/express": { "node_modules/express": {
"version": "4.21.0", "version": "4.21.1",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.0.tgz", "resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz",
"integrity": "sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==", "integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==",
"dependencies": { "dependencies": {
"accepts": "~1.3.8", "accepts": "~1.3.8",
"array-flatten": "1.1.1", "array-flatten": "1.1.1",
"body-parser": "1.20.3", "body-parser": "1.20.3",
"content-disposition": "0.5.4", "content-disposition": "0.5.4",
"content-type": "~1.0.4", "content-type": "~1.0.4",
"cookie": "0.6.0", "cookie": "0.7.1",
"cookie-signature": "1.0.6", "cookie-signature": "1.0.6",
"debug": "2.6.9", "debug": "2.6.9",
"depd": "2.0.0", "depd": "2.0.0",


@ -18,7 +18,7 @@
}, },
"homepage": "https://github.com/openreplay/openreplay#readme", "homepage": "https://github.com/openreplay/openreplay#readme",
"dependencies": { "dependencies": {
"express": "^4.18.2", "express": "^4.21.1",
"peer": "^v1.0.1", "peer": "^v1.0.1",
"winston": "^3.13.0" "winston": "^3.13.0"
} }


@ -11,6 +11,7 @@ docker rmi alpine || true
# Signing image # Signing image
# cosign sign --key awskms:///alias/openreplay-container-sign image_url:tag # cosign sign --key awskms:///alias/openreplay-container-sign image_url:tag
export SIGN_IMAGE=1 export SIGN_IMAGE=1
export ARCH=${ARCH:-"amd64"}
export PUSH_IMAGE=0 export PUSH_IMAGE=0
export AWS_DEFAULT_REGION="eu-central-1" export AWS_DEFAULT_REGION="eu-central-1"
export SIGN_KEY="awskms:///alias/openreplay-container-sign" export SIGN_KEY="awskms:///alias/openreplay-container-sign"
@ -21,17 +22,17 @@ echo $DOCKER_REPO
} || { } || {
# docker login $DOCKER_REPO # docker login $DOCKER_REPO
# tmux set-option remain-on-exit on # tmux set-option remain-on-exit on
tmux split-window "cd ../../backend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read" tmux split-window "cd ../../backend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
tmux split-window "cd ../../assist && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read" tmux split-window "cd ../../assist && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
tmux select-layout tiled tmux select-layout tiled
tmux split-window "cd ../../peers && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read" tmux split-window "cd ../../peers && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
tmux split-window "cd ../../frontend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read" tmux split-window "cd ../../frontend && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
tmux select-layout tiled tmux select-layout tiled
tmux split-window "cd ../../sourcemapreader && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read" tmux split-window "cd ../../sourcemapreader && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
tmux split-window "cd ../../api && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@ \ tmux split-window "cd ../../api && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@ \
&& DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_alerts.sh $@ \ && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_alerts.sh $@ \
&& DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_crons.sh $@ \ && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build_crons.sh $@ \
&& cd ../assist-stats && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=amd64 IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read" && cd ../assist-stats && DOCKER_RUNTIME="depot" DOCKER_BUILD_ARGS="--push" ARCH=$ARCH IMAGE_TAG=$IMAGE_TAG DOCKER_REPO=$DOCKER_REPO PUSH_IMAGE=0 bash build.sh $@; read"
tmux select-layout tiled tmux select-layout tiled
} }


@ -119,7 +119,7 @@ function install_openreplay_actions() {
sudo rm -rf $openreplay_code_dir sudo rm -rf $openreplay_code_dir
fi fi
sudo cp -rfb ./vars.yaml $openreplay_home_dir sudo cp -rfb ./vars.yaml $openreplay_home_dir
sudo cp -rf "$(cd ../.. && pwd)" $openreplay_code_dir sudo cp -rf "$(cd ../.. && pwd)" $openreplay_home_dir
} }
function main() { function main() {


@ -203,6 +203,7 @@ function status() {
return return
} }
# Create OR version patch with git sha
function patch_version() { function patch_version() {
# Patching config version for console # Patching config version for console
version=$(/var/lib/openreplay/yq '.fromVersion' vars.yaml)-$(sudo git rev-parse --short HEAD) version=$(/var/lib/openreplay/yq '.fromVersion' vars.yaml)-$(sudo git rev-parse --short HEAD)
@ -327,11 +328,11 @@ function cleanup() {
fi fi
# Run pg cleanup # Run pg cleanup
pguser=$(awk '/postgresqlUser/{print $2}' <"${OR_DIR}/vars.yaml" | xargs) pguser=$(yq 'explode(.) | .global.postgresql.postgresqlUser' ${OR_DIR}/vars.yaml)
pgpassword=$(awk '/postgresqlPassword/{print $2}' <"${OR_DIR}/vars.yaml" | xargs) pgpassword=$(yq 'explode(.) | .global.postgresql.postgresqlPassword' ${OR_DIR}/vars.yaml)
pghost=$(awk '/postgresqlHost/{print $2}' <"${OR_DIR}/vars.yaml" | xargs) pghost=$(yq 'explode(.) | .global.postgresql.postgresqlHost' ${OR_DIR}/vars.yaml)
pgport=$(awk '/postgresqlPort/{print $2}' <"${OR_DIR}/vars.yaml" | xargs) pgport=$(yq 'explode(.) | .global.postgresql.postgresqlPort' ${OR_DIR}/vars.yaml)
pgdatabase=$(awk '/postgresqlDatabase/{print $2}' <"${OR_DIR}/vars.yaml" | xargs) pgdatabase=$(yq 'explode(.) | .global.postgresql.postgresqlDatabase' ${OR_DIR}/vars.yaml)
cleanup_query="DELETE FROM public.sessions WHERE start_ts < extract(epoch from '${delete_from_date}'::date) * 1000;" cleanup_query="DELETE FROM public.sessions WHERE start_ts < extract(epoch from '${delete_from_date}'::date) * 1000;"
[[ $EE ]] && cleanup_query="DELETE FROM public.sessions WHERE start_ts < extract(epoch from '${delete_from_date}'::date) * 1000 AND session_id NOT IN (SELECT session_id FROM user_favorite_sessions);" [[ $EE ]] && cleanup_query="DELETE FROM public.sessions WHERE start_ts < extract(epoch from '${delete_from_date}'::date) * 1000 AND session_id NOT IN (SELECT session_id FROM user_favorite_sessions);"
kubectl delete po -n "${APP_NS}" pg-cleanup &>/dev/null || true kubectl delete po -n "${APP_NS}" pg-cleanup &>/dev/null || true
@ -344,13 +345,13 @@ function cleanup() {
--env PGPORT="$pgport" \ --env PGPORT="$pgport" \
--image bitnami/postgresql -- psql -c "$cleanup_query" --image bitnami/postgresql -- psql -c "$cleanup_query"
# Run minio cleanup # Run minio cleanup
MINIO_ACCESS_KEY=$(awk '/accessKey/{print $NF}' <"${OR_DIR}/vars.yaml" | tail -n1 | xargs) MINIO_ACCESS_KEY=$(yq 'explode(.) | .global.s3.accessKey' ${OR_DIR}/vars.yaml)
MINIO_SECRET_KEY=$(awk '/secretKey/{print $NF}' <"${OR_DIR}/vars.yaml" | tail -n1 | xargs) MINIO_SECRET_KEY=$(yq 'explode(.) | .global.s3.secretKey' ${OR_DIR}/vars.yaml)
MINIO_HOST=$(awk '/endpoint/{print $NF}' <"${OR_DIR}/vars.yaml" | tail -n1 | xargs) MINIO_HOST=$(yq 'explode(.) | .global.s3.endpoint' ${OR_DIR}/vars.yaml)
kubectl delete po -n "${APP_NS}" minio-cleanup &>/dev/null || true kubectl delete po -n "${APP_NS}" minio-cleanup &>/dev/null || true
kubectl run minio-cleanup -n "${APP_NS}" \ kubectl run minio-cleanup -n "${APP_NS}" \
--restart=Never \ --restart=Never \
--env MINIO_HOST="$pghost" \ --env MINIO_HOST="$MINIO_HOST" \
--image bitnami/minio:2020.10.9-debian-10-r6 -- /bin/sh -c " --image bitnami/minio:2020.10.9-debian-10-r6 -- /bin/sh -c "
mc alias set minio $MINIO_HOST $MINIO_ACCESS_KEY $MINIO_SECRET_KEY && mc alias set minio $MINIO_HOST $MINIO_ACCESS_KEY $MINIO_SECRET_KEY &&
mc rm --recursive --dangerous --force --older-than ${delete_from_number_days}d minio/mobs mc rm --recursive --dangerous --force --older-than ${delete_from_number_days}d minio/mobs
@ -385,7 +386,7 @@ function upgrade() {
time_now=$(date +%m-%d-%Y-%I%M%S) time_now=$(date +%m-%d-%Y-%I%M%S)
# Creating backup dir of current installation # Creating backup dir of current installation
[[ -d "$OR_DIR/openreplay" ]] && sudo mv "$OR_DIR/openreplay" "$OR_DIR/openreplay_${or_version//\"/}_${time_now}" [[ -d "$OR_DIR/openreplay" ]] && sudo cp -rf "$OR_DIR/openreplay" "$OR_DIR/openreplay_${or_version//\"/}_${time_now}"
clone_repo clone_repo
err_cd openreplay/scripts/helmcharts err_cd openreplay/scripts/helmcharts
@ -406,7 +407,8 @@ function upgrade() {
sudo mv ./openreplay-cli /bin/openreplay sudo mv ./openreplay-cli /bin/openreplay
sudo chmod +x /bin/openreplay sudo chmod +x /bin/openreplay
sudo mv ./vars.yaml "$OR_DIR" sudo mv ./vars.yaml "$OR_DIR"
sudo cp -rf ../../../openreplay "$OR_DIR/" sudo rm -rf "$OR_DIR/openreplay" || true
sudo cp -rf "${tmp_dir}/openreplay" "$OR_DIR/"
log info "Configuration file is saved in /var/lib/openreplay/vars.yaml" log info "Configuration file is saved in /var/lib/openreplay/vars.yaml"
log info "Run ${BWHITE}openreplay -h${GREEN} to see the cli information to manage OpenReplay." log info "Run ${BWHITE}openreplay -h${GREEN} to see the cli information to manage OpenReplay."


@ -18,4 +18,4 @@ version: 0.1.1
# incremented each time you make changes to the application. Versions are not expected to # incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using. # follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes. # It is recommended to use it with quotes.
AppVersion: "v1.20.1" AppVersion: "v1.20.6"


@ -89,7 +89,7 @@ spec:
# 4. Using AWS itself. # 4. Using AWS itself.
# AWS uses bucketname.endpoint/object while others use endpoint/bucketname/object # AWS uses bucketname.endpoint/object while others use endpoint/bucketname/object
- name: ASSETS_ORIGIN - name: ASSETS_ORIGIN
value: "{{ include "openreplay.s3Endpoint" . }}/{{.Values.global.s3.assetsBucket}}" value: "{{ include "openreplay.assets_origin" . }}"
{{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }} {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
ports: ports:
{{- range $key, $val := .Values.service.ports }} {{- range $key, $val := .Values.service.ports }}


@ -18,4 +18,4 @@ version: 0.1.7
# incremented each time you make changes to the application. Versions are not expected to # incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using. # follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes. # It is recommended to use it with quotes.
AppVersion: "v1.20.0" AppVersion: "v1.20.19"


@ -1,7 +1,6 @@
apiVersion: v2 apiVersion: v2
name: db name: db
description: A Helm chart for Kubernetes description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart. # A chart can be either an 'application' or a 'library' chart.
# #
# Application charts are a collection of templates that can be packaged into versioned archives # Application charts are a collection of templates that can be packaged into versioned archives
@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
# a dependency of application charts to inject those utilities and functions into the rendering # a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed. # pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application type: application
# This is the chart version. This version number should be incremented each time you make changes # This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version. # to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/) # Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.1 version: 0.1.1
# This is the version number of the application being deployed. This version number should be # This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to # incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using. # follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes. # It is recommended to use it with quotes.
AppVersion: "v1.20.0" AppVersion: "v1.20.3"


@ -18,4 +18,4 @@ version: 0.1.10
# incremented each time you make changes to the application. Versions are not expected to # incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using. # follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes. # It is recommended to use it with quotes.
AppVersion: "v1.20.4" AppVersion: "v1.20.17"


@ -1,7 +1,6 @@
apiVersion: v2 apiVersion: v2
name: http name: http
description: A Helm chart for Kubernetes description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart. # A chart can be either an 'application' or a 'library' chart.
# #
# Application charts are a collection of templates that can be packaged into versioned archives # Application charts are a collection of templates that can be packaged into versioned archives
@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
# a dependency of application charts to inject those utilities and functions into the rendering # a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed. # pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application type: application
# This is the chart version. This version number should be incremented each time you make changes # This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version. # to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/) # Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.1 version: 0.1.1
# This is the version number of the application being deployed. This version number should be # This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to # incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using. # follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes. # It is recommended to use it with quotes.
AppVersion: "v1.20.0" AppVersion: "v1.20.1"


@ -65,7 +65,7 @@ spec:
# 4. Using AWS itself. # 4. Using AWS itself.
# AWS uses bucketname.endpoint/object while others use endpoint/bucketname/object # AWS uses bucketname.endpoint/object while others use endpoint/bucketname/object
- name: ASSETS_ORIGIN - name: ASSETS_ORIGIN
value: "{{ include "openreplay.s3Endpoint" . }}/{{.Values.global.s3.assetsBucket}}" value: {{ include "openreplay.assets_origin" . }}
{{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }} {{- include "openreplay.env.redis_string" .Values.global.redis | nindent 12 }}
ports: ports:
{{- range $key, $val := .Values.service.ports }} {{- range $key, $val := .Values.service.ports }}


@ -76,7 +76,7 @@ spec:
# 4. Using AWS itself. # 4. Using AWS itself.
# AWS uses bucketname.endpoint/object while others use endpoint/bucketname/object # AWS uses bucketname.endpoint/object while others use endpoint/bucketname/object
- name: ASSETS_ORIGIN - name: ASSETS_ORIGIN
value: "{{ include "openreplay.s3Endpoint" . }}/{{.Values.global.s3.assetsBucket}}" value: {{ include "openreplay.assets_origin" . }}
ports: ports:
{{- range $key, $val := .Values.service.ports }} {{- range $key, $val := .Values.service.ports }}
- name: {{ $key }} - name: {{ $key }}


@ -1,7 +1,6 @@
apiVersion: v2 apiVersion: v2
name: spot name: spot
description: A Helm chart for Kubernetes description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart. # A chart can be either an 'application' or a 'library' chart.
# #
# Application charts are a collection of templates that can be packaged into versioned archives # Application charts are a collection of templates that can be packaged into versioned archives
@ -11,14 +10,12 @@ description: A Helm chart for Kubernetes
# a dependency of application charts to inject those utilities and functions into the rendering # a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed. # pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application type: application
# This is the chart version. This version number should be incremented each time you make changes # This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version. # to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/) # Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.1 version: 0.1.1
# This is the version number of the application being deployed. This version number should be # This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to # incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using. # follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes. # It is recommended to use it with quotes.
AppVersion: "v1.20.0" AppVersion: "v1.20.1"


@ -5,9 +5,25 @@ cd $(dirname $0)
is_migrate=$1 is_migrate=$1
# Check if the openreplay version is set.
# This will take precedence over the .Values.fromVersion variable
# because it is created programmatically by the installation.
if [[ -n $OPENREPLAY_VERSION ]]; then
is_migrate=true
PREVIOUS_APP_VERSION=$OPENREPLAY_VERSION
echo "$OPENREPLAY_VERSION set"
fi
if [[ $FORCE_MIGRATION == "true" ]]; then
is_migrate=true
fi
# Passed from env # Passed from env
# PREVIOUS_APP_VERSION # PREVIOUS_APP_VERSION
# CHART_APP_VERSION # CHART_APP_VERSION
# Strip the leading "v" from the version strings (e.g. v1.20.0 -> 1.20.0).
PREVIOUS_APP_VERSION=$(echo $PREVIOUS_APP_VERSION | cut -d "v" -f2)
CHART_APP_VERSION=$(echo $CHART_APP_VERSION | cut -d "v" -f2)
function migration() { function migration() {
ls -la /opt/openreplay/openreplay ls -la /opt/openreplay/openreplay


@ -138,3 +138,11 @@ Create the volume mount config for redis TLS certificates
subPath: {{ .tls.certCAFilename }} subPath: {{ .tls.certCAFilename }}
{{- end }} {{- end }}
{{- end }} {{- end }}
{{- define "openreplay.assets_origin"}}
{{- if .Values.global.assetsOrigin }}
{{- .Values.global.assetsOrigin }}
{{- else }}
{{- include "openreplay.s3Endpoint" . }}/{{.Values.global.s3.assetsBucket}}
{{- end }}
{{- end }}
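
The helper above returns .Values.global.assetsOrigin when it is set and otherwise falls back to the S3 endpoint plus the assets bucket. A minimal sketch of the corresponding values override (the domain is illustrative, not a real endpoint):

global:
  # Illustrative value: when set, ASSETS_ORIGIN uses this instead of "<s3Endpoint>/<assetsBucket>",
  # e.g. a CloudFront or reverse-proxy domain placed in front of the bucket.
  assetsOrigin: "https://assets.example.com"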


@ -0,0 +1,33 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: openreplay-version
namespace: "{{ .Release.Namespace }}"
annotations:
"helm.sh/hook": post-install, post-upgrade
"helm.sh/hook-weight": "-6" # Higher precidence, so the first the config map will get created.
data:
version: {{ .Chart.AppVersion }}
---
# If any jobs or crons perform DB operations or need credentials,
# they should fetch them from this secret.
apiVersion: v1
kind: Secret
metadata:
name: openreplay-secrets
namespace: "{{ .Release.Namespace }}"
annotations:
"helm.sh/hook": pre-install, pre-upgrade
"helm.sh/hook-weight": "-6" # Higher precidence, so the first the config map will get created.
"helm.sh/hook-delete-policy": "before-hook-creation"
data:
PGHOST: "{{ .Values.global.postgresql.postgresqlHost | b64enc }}"
PGPORT: "{{ .Values.global.postgresql.postgresqlPort | b64enc }}"
PGDATABASE: "{{ .Values.global.postgresql.postgresqlDatabase | b64enc }}"
PGUSER: "{{ .Values.global.postgresql.postgresqlUser | b64enc }}"
PGPASSWORD: "{{ .Values.global.postgresql.postgresqlPassword | b64enc }}"
CLICKHOUSE_USER: "{{ .Values.global.clickhouse.username | b64enc }}"
CLICKHOUSE_PASSWORD: "{{ .Values.global.clickhouse.password | b64enc }}"
MINIO_HOST: "{{ .Values.global.s3.endpoint | b64enc }}"
MINIO_ACCESS_KEY: "{{ .Values.global.s3.accessKey | b64enc }}"
MINIO_SECRET_KEY: "{{ .Values.global.s3.secretKey | b64enc }}"
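
Because the secret is created by a pre-install/pre-upgrade hook, any job or cron that needs these credentials can consume it wholesale instead of templating each value. A minimal sketch of a hypothetical job container (the name and image choice are illustrative) pulling the whole secret in via envFrom:

containers:
  - name: db-maintenance              # hypothetical job container
    image: bitnami/postgresql
    envFrom:
      - secretRef:
          name: openreplay-secrets    # exposes PGHOST, PGUSER, PGPASSWORD, MINIO_* etc. as env vars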


@ -2,9 +2,15 @@
Don't have to trigger migration if there is no version change Don't have to trigger migration if there is no version change
Don't have to trigger migration if skipMigration is set Don't have to trigger migration if skipMigration is set
Have to trigger migration if forceMigration is set Have to trigger migration if forceMigration is set
versionChange is true when:
Release.IsUpgrade is false.
Or .Values.deployment.argo is set.
Or Release.IsUpgrade is true and .Values.fromVersion is not equal to .Chart.AppVersion.
*/}} */}}
{{- $versionChange := and (eq .Values.fromVersion .Chart.AppVersion) (.Release.IsUpgrade) }}
{{- if or (not (or .Values.skipMigration $versionChange)) .Values.forceMigration }} {{- $versionChange := (or (not .Release.IsUpgrade) .Values.deployment.argo (and .Release.IsUpgrade (not (eq .Values.fromVersion .Chart.AppVersion)))) }}
{{- if or .Values.forceMigration (and (not .Values.skipMigration) $versionChange) }}
--- ---
apiVersion: v1 apiVersion: v1
kind: ConfigMap kind: ConfigMap
@ -74,7 +80,23 @@ spec:
- | - |
set -x set -x
mkdir -p /opt/openreplay/openreplay && cd /opt/openreplay/openreplay mkdir -p /opt/openreplay/openreplay && cd /opt/openreplay/openreplay
git clone {{ .Values.global.dbMigrationUpstreamRepoURL | default "https://github.com/openreplay/openreplay" }} .
# Wait until GitHub is reachable (up to 10 attempts), then clone the repository
check_github() {
for i in {1..10}; do
if ping -c 1 github.com &> /dev/null || wget -q --spider https://github.com; then
echo "GitHub is available."
git clone {{ .Values.global.dbMigrationUpstreamRepoURL | default "https://github.com/openreplay/openreplay" }} .
break
else
echo "GitHub is not available. Retrying in 3 seconds..."
sleep 3
fi
done
}
check_github
ls /opt/openreplay/openreplay ls /opt/openreplay/openreplay
git checkout {{ default .Chart.AppVersion .Values.dbMigrationUpstreamBranch }} || exit 10 git checkout {{ default .Chart.AppVersion .Values.dbMigrationUpstreamBranch }} || exit 10
git log -1 git log -1
@ -159,8 +181,8 @@ spec:
- | - |
pg_version=$(psql -c "SHOW server_version;" | xargs | grep -oP '\d+(?=\.)') pg_version=$(psql -c "SHOW server_version;" | xargs | grep -oP '\d+(?=\.)')
if [[ $pg_version -le 14 ]]; then if [[ $pg_version -le 14 ]]; then
echo "[error] postgresql version is $pg_version which is < 16. Exiting." echo "[error] postgresql version is $pg_version which is < 16. Exiting.
For upgrade steps, refer: https://docs.openreplay.com/en/deployment/openreplay-admin/#upgrade-postgresql For upgrade steps, refer: https://docs.openreplay.com/en/deployment/openreplay-admin/#upgrade-postgresql"
exit 101 exit 101
fi fi
volumeMounts: volumeMounts:
@ -171,6 +193,12 @@ spec:
containers: containers:
- name: postgres - name: postgres
env: env:
- name: OPENREPLAY_VERSION
valueFrom:
configMapKeyRef:
name: openreplay-version
key: version
optional: true
- name: FORCE_MIGRATION - name: FORCE_MIGRATION
value: "{{ .Values.forceMigration }}" value: "{{ .Values.forceMigration }}"
- name: PREVIOUS_APP_VERSION - name: PREVIOUS_APP_VERSION
@ -217,6 +245,12 @@ spec:
- name: minio - name: minio
image: bitnami/minio:2023.11.20 image: bitnami/minio:2023.11.20
env: env:
- name: OPENREPLAY_VERSION
valueFrom:
configMapKeyRef:
name: openreplay-version
key: version
optional: true
{{- range $key, $val := .Values.global.env }} {{- range $key, $val := .Values.global.env }}
- name: {{ $key }} - name: {{ $key }}
value: '{{ $val }}' value: '{{ $val }}'
@ -340,6 +374,12 @@ spec:
- name: clickhouse - name: clickhouse
image: clickhouse/clickhouse-server:22.12-alpine image: clickhouse/clickhouse-server:22.12-alpine
env: env:
- name: OPENREPLAY_VERSION
valueFrom:
configMapKeyRef:
name: openreplay-version
key: version
optional: true
{{- range $key, $val := .Values.global.env }} {{- range $key, $val := .Values.global.env }}
- name: {{ $key }} - name: {{ $key }}
value: '{{ $val }}' value: '{{ $val }}'
@ -375,6 +415,12 @@ spec:
- name: kafka - name: kafka
image: bitnami/kafka:2.6.0-debian-10-r30 image: bitnami/kafka:2.6.0-debian-10-r30
env: env:
- name: OPENREPLAY_VERSION
valueFrom:
configMapKeyRef:
name: openreplay-version
key: version
optional: true
{{- range $key, $val := .Values.global.env }} {{- range $key, $val := .Values.global.env }}
- name: {{ $key }} - name: {{ $key }}
value: '{{ $val }}' value: '{{ $val }}'


@ -5,6 +5,11 @@ migrationJob:
migration: migration:
env: {} env: {}
deployment:
argo: false
forceMigration: false
skipMigration: false
redis: &redis redis: &redis
tls: tls:
enabled: false enabled: false


@ -1,4 +1,4 @@
\set or_version 'v1.19.0' \set or_version 'v1.20.0'
SET client_min_messages TO NOTICE; SET client_min_messages TO NOTICE;
\set ON_ERROR_STOP true \set ON_ERROR_STOP true
SELECT EXISTS (SELECT 1 SELECT EXISTS (SELECT 1


@ -11,7 +11,7 @@
"dependencies": { "dependencies": {
"@azure/storage-blob": "^12.17.0", "@azure/storage-blob": "^12.17.0",
"aws-sdk": "^2.1606.0", "aws-sdk": "^2.1606.0",
"express": "^4.19.2", "express": "^4.21.1",
"source-map": "^0.7.4", "source-map": "^0.7.4",
"winston": "^3.13.0" "winston": "^3.13.0"
} }
@ -506,9 +506,9 @@
} }
}, },
"node_modules/cookie": { "node_modules/cookie": {
"version": "0.6.0", "version": "0.7.1",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz", "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
"integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==", "integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
"engines": { "engines": {
"node": ">= 0.6" "node": ">= 0.6"
} }
@ -618,16 +618,16 @@
} }
}, },
"node_modules/express": { "node_modules/express": {
"version": "4.21.0", "version": "4.21.1",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.0.tgz", "resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz",
"integrity": "sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==", "integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==",
"dependencies": { "dependencies": {
"accepts": "~1.3.8", "accepts": "~1.3.8",
"array-flatten": "1.1.1", "array-flatten": "1.1.1",
"body-parser": "1.20.3", "body-parser": "1.20.3",
"content-disposition": "0.5.4", "content-disposition": "0.5.4",
"content-type": "~1.0.4", "content-type": "~1.0.4",
"cookie": "0.6.0", "cookie": "0.7.1",
"cookie-signature": "1.0.6", "cookie-signature": "1.0.6",
"debug": "2.6.9", "debug": "2.6.9",
"depd": "2.0.0", "depd": "2.0.0",


@ -20,7 +20,7 @@
"dependencies": { "dependencies": {
"@azure/storage-blob": "^12.17.0", "@azure/storage-blob": "^12.17.0",
"aws-sdk": "^2.1606.0", "aws-sdk": "^2.1606.0",
"express": "^4.19.2", "express": "^4.21.1",
"source-map": "^0.7.4", "source-map": "^0.7.4",
"winston": "^3.13.0" "winston": "^3.13.0"
} }

spot/package-lock.json (generated)

@ -1,21 +1,20 @@
{ {
"name": "wxt-starter", "name": "wxt-starter",
"version": "1.0.0", "version": "1.0.6",
"lockfileVersion": 3, "lockfileVersion": 3,
"requires": true, "requires": true,
"packages": { "packages": {
"": { "": {
"name": "wxt-starter", "name": "wxt-starter",
"version": "1.0.0", "version": "1.0.6",
"hasInstallScript": true, "hasInstallScript": true,
"dependencies": { "dependencies": {
"@neodrag/solid": "^2.0.4", "@neodrag/solid": "^2.0.4",
"@thedutchcoder/postcss-rem-to-px": "^0.0.2", "@thedutchcoder/postcss-rem-to-px": "^0.0.2",
"autoprefixer": "^10.4.19", "autoprefixer": "^10.4.19",
"install": "^0.13.0", "install": "^0.13.0",
"lucide-solid": "^0.408.0",
"npm": "^10.8.1", "npm": "^10.8.1",
"postcss": "^8.4.38", "postcss": "^8.4.47",
"prettier": "^3.3.2", "prettier": "^3.3.2",
"solid-js": "^1.8.17", "solid-js": "^1.8.17",
"tailwindcss": "^3.4.4", "tailwindcss": "^3.4.4",
@ -25,7 +24,7 @@
"@wxt-dev/module-solid": "^1.1.2", "@wxt-dev/module-solid": "^1.1.2",
"daisyui": "^4.12.10", "daisyui": "^4.12.10",
"typescript": "^5.4.5", "typescript": "^5.4.5",
"wxt": "0.19.7" "wxt": "0.19.9"
} }
}, },
"node_modules/@aklinker1/rollup-plugin-visualizer": { "node_modules/@aklinker1/rollup-plugin-visualizer": {
@ -1109,224 +1108,208 @@
"license": "MIT" "license": "MIT"
}, },
"node_modules/@rollup/rollup-android-arm-eabi": { "node_modules/@rollup/rollup-android-arm-eabi": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.24.0.tgz",
"integrity": "sha512-WTWD8PfoSAJ+qL87lE7votj3syLavxunWhzCnx3XFxFiI/BA/r3X7MUM8dVrH8rb2r4AiO8jJsr3ZjdaftmnfA==", "integrity": "sha512-Q6HJd7Y6xdB48x8ZNVDOqsbh2uByBhgK8PiQgPhwkIw/HC/YX5Ghq2mQY5sRMZWHb3VsFkWooUVOZHKr7DmDIA==",
"cpu": [ "cpu": [
"arm" "arm"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"android" "android"
] ]
}, },
"node_modules/@rollup/rollup-android-arm64": { "node_modules/@rollup/rollup-android-arm64": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.24.0.tgz",
"integrity": "sha512-a1sR2zSK1B4eYkiZu17ZUZhmUQcKjk2/j9Me2IDjk1GHW7LB5Z35LEzj9iJch6gtUfsnvZs1ZNyDW2oZSThrkA==", "integrity": "sha512-ijLnS1qFId8xhKjT81uBHuuJp2lU4x2yxa4ctFPtG+MqEE6+C5f/+X/bStmxapgmwLwiL3ih122xv8kVARNAZA==",
"cpu": [ "cpu": [
"arm64" "arm64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"android" "android"
] ]
}, },
"node_modules/@rollup/rollup-darwin-arm64": { "node_modules/@rollup/rollup-darwin-arm64": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.24.0.tgz",
"integrity": "sha512-zOnKWLgDld/svhKO5PD9ozmL6roy5OQ5T4ThvdYZLpiOhEGY+dp2NwUmxK0Ld91LrbjrvtNAE0ERBwjqhZTRAA==", "integrity": "sha512-bIv+X9xeSs1XCk6DVvkO+S/z8/2AMt/2lMqdQbMrmVpgFvXlmde9mLcbQpztXm1tajC3raFDqegsH18HQPMYtA==",
"cpu": [ "cpu": [
"arm64" "arm64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"darwin" "darwin"
] ]
}, },
"node_modules/@rollup/rollup-darwin-x64": { "node_modules/@rollup/rollup-darwin-x64": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.24.0.tgz",
"integrity": "sha512-7doS8br0xAkg48SKE2QNtMSFPFUlRdw9+votl27MvT46vo44ATBmdZdGysOevNELmZlfd+NEa0UYOA8f01WSrg==", "integrity": "sha512-X6/nOwoFN7RT2svEQWUsW/5C/fYMBe4fnLK9DQk4SX4mgVBiTA9h64kjUYPvGQ0F/9xwJ5U5UfTbl6BEjaQdBQ==",
"cpu": [ "cpu": [
"x64" "x64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"darwin" "darwin"
] ]
}, },
"node_modules/@rollup/rollup-linux-arm-gnueabihf": { "node_modules/@rollup/rollup-linux-arm-gnueabihf": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.24.0.tgz",
"integrity": "sha512-pWJsfQjNWNGsoCq53KjMtwdJDmh/6NubwQcz52aEwLEuvx08bzcy6tOUuawAOncPnxz/3siRtd8hiQ32G1y8VA==", "integrity": "sha512-0KXvIJQMOImLCVCz9uvvdPgfyWo93aHHp8ui3FrtOP57svqrF/roSSR5pjqL2hcMp0ljeGlU4q9o/rQaAQ3AYA==",
"cpu": [ "cpu": [
"arm" "arm"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-arm-musleabihf": { "node_modules/@rollup/rollup-linux-arm-musleabihf": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.24.0.tgz",
"integrity": "sha512-efRIANsz3UHZrnZXuEvxS9LoCOWMGD1rweciD6uJQIx2myN3a8Im1FafZBzh7zk1RJ6oKcR16dU3UPldaKd83w==", "integrity": "sha512-it2BW6kKFVh8xk/BnHfakEeoLPv8STIISekpoF+nBgWM4d55CZKc7T4Dx1pEbTnYm/xEKMgy1MNtYuoA8RFIWw==",
"cpu": [ "cpu": [
"arm" "arm"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-arm64-gnu": { "node_modules/@rollup/rollup-linux-arm64-gnu": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.24.0.tgz",
"integrity": "sha512-ZrPhydkTVhyeGTW94WJ8pnl1uroqVHM3j3hjdquwAcWnmivjAwOYjTEAuEDeJvGX7xv3Z9GAvrBkEzCgHq9U1w==", "integrity": "sha512-i0xTLXjqap2eRfulFVlSnM5dEbTVque/3Pi4g2y7cxrs7+a9De42z4XxKLYJ7+OhE3IgxvfQM7vQc43bwTgPwA==",
"cpu": [ "cpu": [
"arm64" "arm64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-arm64-musl": { "node_modules/@rollup/rollup-linux-arm64-musl": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.24.0.tgz",
"integrity": "sha512-cfaupqd+UEFeURmqNP2eEvXqgbSox/LHOyN9/d2pSdV8xTrjdg3NgOFJCtc1vQ/jEke1qD0IejbBfxleBPHnPw==", "integrity": "sha512-9E6MKUJhDuDh604Qco5yP/3qn3y7SLXYuiC0Rpr89aMScS2UAmK1wHP2b7KAa1nSjWJc/f/Lc0Wl1L47qjiyQw==",
"cpu": [ "cpu": [
"arm64" "arm64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-powerpc64le-gnu": { "node_modules/@rollup/rollup-linux-powerpc64le-gnu": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-powerpc64le-gnu/-/rollup-linux-powerpc64le-gnu-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-powerpc64le-gnu/-/rollup-linux-powerpc64le-gnu-4.24.0.tgz",
"integrity": "sha512-ZKPan1/RvAhrUylwBXC9t7B2hXdpb/ufeu22pG2psV7RN8roOfGurEghw1ySmX/CmDDHNTDDjY3lo9hRlgtaHg==", "integrity": "sha512-2XFFPJ2XMEiF5Zi2EBf4h73oR1V/lycirxZxHZNc93SqDN/IWhYYSYj8I9381ikUFXZrz2v7r2tOVk2NBwxrWw==",
"cpu": [ "cpu": [
"ppc64" "ppc64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-riscv64-gnu": { "node_modules/@rollup/rollup-linux-riscv64-gnu": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.24.0.tgz",
"integrity": "sha512-H1eRaCwd5E8eS8leiS+o/NqMdljkcb1d6r2h4fKSsCXQilLKArq6WS7XBLDu80Yz+nMqHVFDquwcVrQmGr28rg==", "integrity": "sha512-M3Dg4hlwuntUCdzU7KjYqbbd+BLq3JMAOhCKdBE3TcMGMZbKkDdJ5ivNdehOssMCIokNHFOsv7DO4rlEOfyKpg==",
"cpu": [ "cpu": [
"riscv64" "riscv64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-s390x-gnu": { "node_modules/@rollup/rollup-linux-s390x-gnu": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.24.0.tgz",
"integrity": "sha512-zJ4hA+3b5tu8u7L58CCSI0A9N1vkfwPhWd/puGXwtZlsB5bTkwDNW/+JCU84+3QYmKpLi+XvHdmrlwUwDA6kqw==", "integrity": "sha512-mjBaoo4ocxJppTorZVKWFpy1bfFj9FeCMJqzlMQGjpNPY9JwQi7OuS1axzNIk0nMX6jSgy6ZURDZ2w0QW6D56g==",
"cpu": [ "cpu": [
"s390x" "s390x"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-x64-gnu": { "node_modules/@rollup/rollup-linux-x64-gnu": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.24.0.tgz",
"integrity": "sha512-e2hrvElFIh6kW/UNBQK/kzqMNY5mO+67YtEh9OA65RM5IJXYTWiXjX6fjIiPaqOkBthYF1EqgiZ6OXKcQsM0hg==", "integrity": "sha512-ZXFk7M72R0YYFN5q13niV0B7G8/5dcQ9JDp8keJSfr3GoZeXEoMHP/HlvqROA3OMbMdfr19IjCeNAnPUG93b6A==",
"cpu": [ "cpu": [
"x64" "x64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-linux-x64-musl": { "node_modules/@rollup/rollup-linux-x64-musl": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.24.0.tgz",
"integrity": "sha512-1vvmgDdUSebVGXWX2lIcgRebqfQSff0hMEkLJyakQ9JQUbLDkEaMsPTLOmyccyC6IJ/l3FZuJbmrBw/u0A0uCQ==", "integrity": "sha512-w1i+L7kAXZNdYl+vFvzSZy8Y1arS7vMgIy8wusXJzRrPyof5LAb02KGr1PD2EkRcl73kHulIID0M501lN+vobQ==",
"cpu": [ "cpu": [
"x64" "x64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"linux" "linux"
] ]
}, },
"node_modules/@rollup/rollup-win32-arm64-msvc": { "node_modules/@rollup/rollup-win32-arm64-msvc": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.24.0.tgz",
"integrity": "sha512-s5oFkZ/hFcrlAyBTONFY1TWndfyre1wOMwU+6KCpm/iatybvrRgmZVM+vCFwxmC5ZhdlgfE0N4XorsDpi7/4XQ==", "integrity": "sha512-VXBrnPWgBpVDCVY6XF3LEW0pOU51KbaHhccHw6AS6vBWIC60eqsH19DAeeObl+g8nKAz04QFdl/Cefta0xQtUQ==",
"cpu": [ "cpu": [
"arm64" "arm64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"win32" "win32"
] ]
}, },
"node_modules/@rollup/rollup-win32-ia32-msvc": { "node_modules/@rollup/rollup-win32-ia32-msvc": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.24.0.tgz",
"integrity": "sha512-G9+TEqRnAA6nbpqyUqgTiopmnfgnMkR3kMukFBDsiyy23LZvUCpiUwjTRx6ezYCjJODXrh52rBR9oXvm+Fp5wg==", "integrity": "sha512-xrNcGDU0OxVcPTH/8n/ShH4UevZxKIO6HJFK0e15XItZP2UcaiLFd5kiX7hJnqCbSztUF8Qot+JWBC/QXRPYWQ==",
"cpu": [ "cpu": [
"ia32" "ia32"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"win32" "win32"
] ]
}, },
"node_modules/@rollup/rollup-win32-x64-msvc": { "node_modules/@rollup/rollup-win32-x64-msvc": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.21.0.tgz", "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.24.0.tgz",
"integrity": "sha512-2jsCDZwtQvRhejHLfZ1JY6w6kEuEtfF9nzYsZxzSlNVKDX+DpsDJ+Rbjkm74nvg2rdx0gwBS+IMdvwJuq3S9pQ==", "integrity": "sha512-fbMkAF7fufku0N2dE5TBXcNlg0pt0cJue4xBRE2Qc5Vqikxr4VCgKj/ht6SMdFcOacVA9rqF70APJ8RN/4vMJw==",
"cpu": [ "cpu": [
"x64" "x64"
], ],
"dev": true, "dev": true,
"license": "MIT",
"optional": true, "optional": true,
"os": [ "os": [
"win32" "win32"
@ -1435,12 +1418,42 @@
"@babel/types": "^7.20.7" "@babel/types": "^7.20.7"
} }
}, },
"node_modules/@types/estree": { "node_modules/@types/chrome": {
"version": "1.0.5", "version": "0.0.269",
"resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.5.tgz", "resolved": "https://registry.npmjs.org/@types/chrome/-/chrome-0.0.269.tgz",
"integrity": "sha512-/kYRxGDLWzHOB7q+wtSUQlFrtcdUccpfy+X+9iMBpHK8QLLhx2wIPYuS5DYtR9Wa/YlZAbIovy7qVdB1Aq6Lyw==", "integrity": "sha512-vF7x8YywnhXX2F06njQ/OE7a3Qeful43C5GUOsUksXWk89WoSFUU3iLeZW8lDpVO9atm8iZIEiLQTRC3H7NOXQ==",
"dev": true, "dev": true,
"license": "MIT" "dependencies": {
"@types/filesystem": "*",
"@types/har-format": "*"
}
},
"node_modules/@types/estree": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.6.tgz",
"integrity": "sha512-AYnb1nQyY49te+VRAVgmzfcgjYS91mY5P0TKUDCLEM+gNnA+3T6rWITXRLYCpahpqSQbN5cE+gHpnPyXjHWxcw==",
"dev": true
},
"node_modules/@types/filesystem": {
"version": "0.0.36",
"resolved": "https://registry.npmjs.org/@types/filesystem/-/filesystem-0.0.36.tgz",
"integrity": "sha512-vPDXOZuannb9FZdxgHnqSwAG/jvdGM8Wq+6N4D/d80z+D4HWH+bItqsZaVRQykAn6WEVeEkLm2oQigyHtgb0RA==",
"dev": true,
"dependencies": {
"@types/filewriter": "*"
}
},
"node_modules/@types/filewriter": {
"version": "0.0.33",
"resolved": "https://registry.npmjs.org/@types/filewriter/-/filewriter-0.0.33.tgz",
"integrity": "sha512-xFU8ZXTw4gd358lb2jw25nxY9QAgqn2+bKKjKOYfNCzN4DKCFetK7sPtrlpg66Ywe3vWY9FNxprZawAh9wfJ3g==",
"dev": true
},
"node_modules/@types/har-format": {
"version": "1.2.16",
"resolved": "https://registry.npmjs.org/@types/har-format/-/har-format-1.2.16.tgz",
"integrity": "sha512-fluxdy7ryD3MV6h8pTfTYpy/xQzCFC7m89nOH9y94cNqJ1mDIDPut7MnRHI3F6qRmh/cT2fUjG1MLdCNb4hE9A==",
"dev": true
}, },
"node_modules/@types/http-cache-semantics": { "node_modules/@types/http-cache-semantics": {
"version": "4.0.4", "version": "4.0.4",
@ -5085,15 +5098,6 @@
"integrity": "sha512-JNAzZcXrCt42VGLuYz0zfAzDfAvJWW6AfYlDBQyDV5DClI2m5sAmK+OIO7s59XfsRsWHp02jAJrRadPRGTt6SQ==", "integrity": "sha512-JNAzZcXrCt42VGLuYz0zfAzDfAvJWW6AfYlDBQyDV5DClI2m5sAmK+OIO7s59XfsRsWHp02jAJrRadPRGTt6SQ==",
"license": "ISC" "license": "ISC"
}, },
"node_modules/lucide-solid": {
"version": "0.408.0",
"resolved": "https://registry.npmjs.org/lucide-solid/-/lucide-solid-0.408.0.tgz",
"integrity": "sha512-YJslzmGotW/s69Zygp1W+hIYnZSLihylufxZWXTARSh+ruILyRnXZlTxQiTiYhPLqyd0YWOanxou0HukRhjUng==",
"license": "ISC",
"peerDependencies": {
"solid-js": "^1.4.7"
}
},
"node_modules/magic-string": { "node_modules/magic-string": {
"version": "0.30.11", "version": "0.30.11",
"resolved": "https://registry.npmjs.org/magic-string/-/magic-string-0.30.11.tgz", "resolved": "https://registry.npmjs.org/magic-string/-/magic-string-0.30.11.tgz",
@ -8793,10 +8797,9 @@
"license": "MIT" "license": "MIT"
}, },
"node_modules/picocolors": { "node_modules/picocolors": {
"version": "1.0.1", "version": "1.1.0",
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.1.tgz", "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.0.tgz",
"integrity": "sha512-anP1Z8qwhkbmu7MFP5iTt+wQKXgwzf7zTyGlcdzabySa9vd0Xt392U0rVmz9poOaBj0uHJKyyo9/upk0HrEQew==", "integrity": "sha512-TQ92mBOW0l3LeMeyLV6mzy/kWr8lkd/hp3mTg7wYK7zJhuBStmGMBG0BdeDZS/dZx1IukaX6Bk11zcln25o1Aw=="
"license": "ISC"
}, },
"node_modules/picomatch": { "node_modules/picomatch": {
"version": "2.3.1", "version": "2.3.1",
@ -8841,9 +8844,9 @@
} }
}, },
"node_modules/postcss": { "node_modules/postcss": {
"version": "8.4.41", "version": "8.4.47",
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.4.41.tgz", "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.4.47.tgz",
"integrity": "sha512-TesUflQ0WKZqAvg52PWL6kHgLKP6xB6heTOdoYM0Wt2UHyxNa4K25EZZMgKns3BH1RLVbZCREPpLY0rhnNoHVQ==", "integrity": "sha512-56rxCq7G/XfB4EkXq9Egn5GCqugWvDFjafDOThIdMBsI15iqPqR5r15TfSr1YPYeEI19YeaXMCbY6u88Y76GLQ==",
"funding": [ "funding": [
{ {
"type": "opencollective", "type": "opencollective",
@ -8858,11 +8861,10 @@
"url": "https://github.com/sponsors/ai" "url": "https://github.com/sponsors/ai"
} }
], ],
"license": "MIT",
"dependencies": { "dependencies": {
"nanoid": "^3.3.7", "nanoid": "^3.3.7",
"picocolors": "^1.0.1", "picocolors": "^1.1.0",
"source-map-js": "^1.2.0" "source-map-js": "^1.2.1"
}, },
"engines": { "engines": {
"node": "^10 || ^12 || >=14" "node": "^10 || ^12 || >=14"
@ -9869,13 +9871,12 @@
} }
}, },
"node_modules/rollup": { "node_modules/rollup": {
"version": "4.21.0", "version": "4.24.0",
"resolved": "https://registry.npmjs.org/rollup/-/rollup-4.21.0.tgz", "resolved": "https://registry.npmjs.org/rollup/-/rollup-4.24.0.tgz",
"integrity": "sha512-vo+S/lfA2lMS7rZ2Qoubi6I5hwZwzXeUIctILZLbHI+laNtvhhOIon2S1JksA5UEDQ7l3vberd0fxK44lTYjbQ==", "integrity": "sha512-DOmrlGSXNk1DM0ljiQA+i+o0rSLhtii1je5wgk60j49d1jHT5YYttBv1iWOnYSTG+fZZESUOSNiAl89SIet+Cg==",
"dev": true, "dev": true,
"license": "MIT",
"dependencies": { "dependencies": {
"@types/estree": "1.0.5" "@types/estree": "1.0.6"
}, },
"bin": { "bin": {
"rollup": "dist/bin/rollup" "rollup": "dist/bin/rollup"
@ -9885,22 +9886,22 @@
"npm": ">=8.0.0" "npm": ">=8.0.0"
}, },
"optionalDependencies": { "optionalDependencies": {
"@rollup/rollup-android-arm-eabi": "4.21.0", "@rollup/rollup-android-arm-eabi": "4.24.0",
"@rollup/rollup-android-arm64": "4.21.0", "@rollup/rollup-android-arm64": "4.24.0",
"@rollup/rollup-darwin-arm64": "4.21.0", "@rollup/rollup-darwin-arm64": "4.24.0",
"@rollup/rollup-darwin-x64": "4.21.0", "@rollup/rollup-darwin-x64": "4.24.0",
"@rollup/rollup-linux-arm-gnueabihf": "4.21.0", "@rollup/rollup-linux-arm-gnueabihf": "4.24.0",
"@rollup/rollup-linux-arm-musleabihf": "4.21.0", "@rollup/rollup-linux-arm-musleabihf": "4.24.0",
"@rollup/rollup-linux-arm64-gnu": "4.21.0", "@rollup/rollup-linux-arm64-gnu": "4.24.0",
"@rollup/rollup-linux-arm64-musl": "4.21.0", "@rollup/rollup-linux-arm64-musl": "4.24.0",
"@rollup/rollup-linux-powerpc64le-gnu": "4.21.0", "@rollup/rollup-linux-powerpc64le-gnu": "4.24.0",
"@rollup/rollup-linux-riscv64-gnu": "4.21.0", "@rollup/rollup-linux-riscv64-gnu": "4.24.0",
"@rollup/rollup-linux-s390x-gnu": "4.21.0", "@rollup/rollup-linux-s390x-gnu": "4.24.0",
"@rollup/rollup-linux-x64-gnu": "4.21.0", "@rollup/rollup-linux-x64-gnu": "4.24.0",
"@rollup/rollup-linux-x64-musl": "4.21.0", "@rollup/rollup-linux-x64-musl": "4.24.0",
"@rollup/rollup-win32-arm64-msvc": "4.21.0", "@rollup/rollup-win32-arm64-msvc": "4.24.0",
"@rollup/rollup-win32-ia32-msvc": "4.21.0", "@rollup/rollup-win32-ia32-msvc": "4.24.0",
"@rollup/rollup-win32-x64-msvc": "4.21.0", "@rollup/rollup-win32-x64-msvc": "4.24.0",
"fsevents": "~2.3.2" "fsevents": "~2.3.2"
} }
}, },
@ -10189,10 +10190,9 @@
} }
}, },
"node_modules/source-map-js": { "node_modules/source-map-js": {
"version": "1.2.0", "version": "1.2.1",
"resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.0.tgz", "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz",
"integrity": "sha512-itJW8lvSA0TXEphiRoawsCksnlf8SyvmFzIhltqAHluXd88pkCd+cXJVHTDwdCr0IzwptSm035IHQktUu1QUMg==", "integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==",
"license": "BSD-3-Clause",
"engines": { "engines": {
"node": ">=0.10.0" "node": ">=0.10.0"
} }
@ -10890,14 +10890,13 @@
"license": "ISC" "license": "ISC"
}, },
"node_modules/vite": { "node_modules/vite": {
"version": "5.4.2", "version": "5.4.9",
"resolved": "https://registry.npmjs.org/vite/-/vite-5.4.2.tgz", "resolved": "https://registry.npmjs.org/vite/-/vite-5.4.9.tgz",
"integrity": "sha512-dDrQTRHp5C1fTFzcSaMxjk6vdpKvT+2/mIdE07Gw2ykehT49O0z/VHS3zZ8iV/Gh8BJJKHWOe5RjaNrW5xf/GA==", "integrity": "sha512-20OVpJHh0PAM0oSOELa5GaZNWeDjcAvQjGXy2Uyr+Tp+/D2/Hdz6NLgpJLsarPTA2QJ6v8mX2P1ZfbsSKvdMkg==",
"dev": true, "dev": true,
"license": "MIT",
"dependencies": { "dependencies": {
"esbuild": "^0.21.3", "esbuild": "^0.21.3",
"postcss": "^8.4.41", "postcss": "^8.4.43",
"rollup": "^4.20.0" "rollup": "^4.20.0"
}, },
"bin": { "bin": {
@ -11335,13 +11334,13 @@
} }
}, },
"node_modules/wxt": { "node_modules/wxt": {
"version": "0.19.7", "version": "0.19.9",
"resolved": "https://registry.npmjs.org/wxt/-/wxt-0.19.7.tgz", "resolved": "https://registry.npmjs.org/wxt/-/wxt-0.19.9.tgz",
"integrity": "sha512-nAoEYodA6Tgc93m0C4H64rUZe3WpR8aIL04L3BbmEnLfAfaLARKIELHBCOTr0m+6maMsAlppKVvX9O7n0Lg2/Q==", "integrity": "sha512-XUbF4JNyx2jTDpXwx2c/esaJcUD2Dr482C2GGenkGRMH2UnerzOIchGCtaa1hb2U8eAed7Akda0yRoMJU0uxUw==",
"dev": true, "dev": true,
"license": "MIT",
"dependencies": { "dependencies": {
"@aklinker1/rollup-plugin-visualizer": "5.12.0", "@aklinker1/rollup-plugin-visualizer": "5.12.0",
"@types/chrome": "^0.0.269",
"@types/webextension-polyfill": "^0.10.7", "@types/webextension-polyfill": "^0.10.7",
"@webext-core/fake-browser": "^1.3.1", "@webext-core/fake-browser": "^1.3.1",
"@webext-core/isolated-element": "^1.1.2", "@webext-core/isolated-element": "^1.1.2",
@ -11355,7 +11354,7 @@
"defu": "^6.1.4", "defu": "^6.1.4",
"dequal": "^2.0.3", "dequal": "^2.0.3",
"esbuild": "^0.23.0", "esbuild": "^0.23.0",
"execa": "^9.3.0", "execa": "^9.3.1",
"fast-glob": "^3.3.2", "fast-glob": "^3.3.2",
"filesize": "^10.1.4", "filesize": "^10.1.4",
"fs-extra": "^11.2.0", "fs-extra": "^11.2.0",
@ -11374,11 +11373,12 @@
"nypm": "^0.3.9", "nypm": "^0.3.9",
"ohash": "^1.1.3", "ohash": "^1.1.3",
"open": "^10.1.0", "open": "^10.1.0",
"ora": "^8.0.1", "ora": "^8.1.0",
"picocolors": "^1.0.1", "picocolors": "^1.0.1",
"prompts": "^2.4.2", "prompts": "^2.4.2",
"publish-browser-extension": "^2.1.3", "publish-browser-extension": "^2.1.3",
"unimport": "^3.9.1", "scule": "^1.3.0",
"unimport": "^3.11.1",
"vite": "^5.3.5", "vite": "^5.3.5",
"vite-node": "^2.0.4", "vite-node": "^2.0.4",
"web-ext-run": "^0.2.1", "web-ext-run": "^0.2.1",
@ -11388,9 +11388,6 @@
"wxt": "bin/wxt.mjs", "wxt": "bin/wxt.mjs",
"wxt-publish-extension": "bin/wxt-publish-extension.cjs" "wxt-publish-extension": "bin/wxt-publish-extension.cjs"
}, },
"peerDependencies": {
"@types/chrome": "*"
},
"peerDependenciesMeta": { "peerDependenciesMeta": {
"@types/chrome": { "@types/chrome": {
"optional": true "optional": true


@ -2,7 +2,7 @@
"name": "wxt-starter", "name": "wxt-starter",
"description": "manifest.json description", "description": "manifest.json description",
"private": true, "private": true,
"version": "1.0.5", "version": "1.0.6",
"type": "module", "type": "module",
"scripts": { "scripts": {
"dev": "wxt", "dev": "wxt",
@ -21,7 +21,7 @@
"autoprefixer": "^10.4.19", "autoprefixer": "^10.4.19",
"install": "^0.13.0", "install": "^0.13.0",
"npm": "^10.8.1", "npm": "^10.8.1",
"postcss": "^8.4.38", "postcss": "^8.4.47",
"prettier": "^3.3.2", "prettier": "^3.3.2",
"solid-js": "^1.8.17", "solid-js": "^1.8.17",
"tailwindcss": "^3.4.4", "tailwindcss": "^3.4.4",


@ -742,85 +742,85 @@
estree-walker "^2.0.2" estree-walker "^2.0.2"
picomatch "^2.3.1" picomatch "^2.3.1"
"@rollup/rollup-android-arm-eabi@4.21.0": "@rollup/rollup-android-arm-eabi@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.21.0.tgz#d941173f82f9b041c61b0dc1a2a91dcd06e4b31e" resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.24.0.tgz#1661ff5ea9beb362795304cb916049aba7ac9c54"
integrity sha512-WTWD8PfoSAJ+qL87lE7votj3syLavxunWhzCnx3XFxFiI/BA/r3X7MUM8dVrH8rb2r4AiO8jJsr3ZjdaftmnfA== integrity sha512-Q6HJd7Y6xdB48x8ZNVDOqsbh2uByBhgK8PiQgPhwkIw/HC/YX5Ghq2mQY5sRMZWHb3VsFkWooUVOZHKr7DmDIA==
"@rollup/rollup-android-arm64@4.21.0": "@rollup/rollup-android-arm64@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.21.0.tgz#7e7157c8543215245ceffc445134d9e843ba51c0" resolved "https://registry.yarnpkg.com/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.24.0.tgz#2ffaa91f1b55a0082b8a722525741aadcbd3971e"
integrity sha512-a1sR2zSK1B4eYkiZu17ZUZhmUQcKjk2/j9Me2IDjk1GHW7LB5Z35LEzj9iJch6gtUfsnvZs1ZNyDW2oZSThrkA== integrity sha512-ijLnS1qFId8xhKjT81uBHuuJp2lU4x2yxa4ctFPtG+MqEE6+C5f/+X/bStmxapgmwLwiL3ih122xv8kVARNAZA==
"@rollup/rollup-darwin-arm64@4.21.0": "@rollup/rollup-darwin-arm64@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.21.0.tgz" resolved "https://registry.yarnpkg.com/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.24.0.tgz#627007221b24b8cc3063703eee0b9177edf49c1f"
integrity sha512-zOnKWLgDld/svhKO5PD9ozmL6roy5OQ5T4ThvdYZLpiOhEGY+dp2NwUmxK0Ld91LrbjrvtNAE0ERBwjqhZTRAA== integrity sha512-bIv+X9xeSs1XCk6DVvkO+S/z8/2AMt/2lMqdQbMrmVpgFvXlmde9mLcbQpztXm1tajC3raFDqegsH18HQPMYtA==
"@rollup/rollup-darwin-x64@4.21.0": "@rollup/rollup-darwin-x64@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.21.0.tgz#34b7867613e5cc42d2b85ddc0424228cc33b43f0" resolved "https://registry.yarnpkg.com/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.24.0.tgz#0605506142b9e796c370d59c5984ae95b9758724"
integrity sha512-7doS8br0xAkg48SKE2QNtMSFPFUlRdw9+votl27MvT46vo44ATBmdZdGysOevNELmZlfd+NEa0UYOA8f01WSrg== integrity sha512-X6/nOwoFN7RT2svEQWUsW/5C/fYMBe4fnLK9DQk4SX4mgVBiTA9h64kjUYPvGQ0F/9xwJ5U5UfTbl6BEjaQdBQ==
"@rollup/rollup-linux-arm-gnueabihf@4.21.0": "@rollup/rollup-linux-arm-gnueabihf@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.21.0.tgz#422b19ff9ae02b05d3395183d1d43b38c7c8be0b" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.24.0.tgz#62dfd196d4b10c0c2db833897164d2d319ee0cbb"
integrity sha512-pWJsfQjNWNGsoCq53KjMtwdJDmh/6NubwQcz52aEwLEuvx08bzcy6tOUuawAOncPnxz/3siRtd8hiQ32G1y8VA== integrity sha512-0KXvIJQMOImLCVCz9uvvdPgfyWo93aHHp8ui3FrtOP57svqrF/roSSR5pjqL2hcMp0ljeGlU4q9o/rQaAQ3AYA==
"@rollup/rollup-linux-arm-musleabihf@4.21.0": "@rollup/rollup-linux-arm-musleabihf@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.21.0.tgz#568aa29195ef6fc57ec6ed3f518923764406a8ee" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.24.0.tgz#53ce72aeb982f1f34b58b380baafaf6a240fddb3"
integrity sha512-efRIANsz3UHZrnZXuEvxS9LoCOWMGD1rweciD6uJQIx2myN3a8Im1FafZBzh7zk1RJ6oKcR16dU3UPldaKd83w== integrity sha512-it2BW6kKFVh8xk/BnHfakEeoLPv8STIISekpoF+nBgWM4d55CZKc7T4Dx1pEbTnYm/xEKMgy1MNtYuoA8RFIWw==
"@rollup/rollup-linux-arm64-gnu@4.21.0": "@rollup/rollup-linux-arm64-gnu@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.21.0.tgz#22309c8bcba9a73114f69165c72bc94b2fbec085" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.24.0.tgz#1632990f62a75c74f43e4b14ab3597d7ed416496"
integrity sha512-ZrPhydkTVhyeGTW94WJ8pnl1uroqVHM3j3hjdquwAcWnmivjAwOYjTEAuEDeJvGX7xv3Z9GAvrBkEzCgHq9U1w== integrity sha512-i0xTLXjqap2eRfulFVlSnM5dEbTVque/3Pi4g2y7cxrs7+a9De42z4XxKLYJ7+OhE3IgxvfQM7vQc43bwTgPwA==
"@rollup/rollup-linux-arm64-musl@4.21.0": "@rollup/rollup-linux-arm64-musl@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.21.0.tgz#c93c388af6d33f082894b8a60839d7265b2b9bc5" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.24.0.tgz#8c03a996efb41e257b414b2e0560b7a21f2d9065"
integrity sha512-cfaupqd+UEFeURmqNP2eEvXqgbSox/LHOyN9/d2pSdV8xTrjdg3NgOFJCtc1vQ/jEke1qD0IejbBfxleBPHnPw== integrity sha512-9E6MKUJhDuDh604Qco5yP/3qn3y7SLXYuiC0Rpr89aMScS2UAmK1wHP2b7KAa1nSjWJc/f/Lc0Wl1L47qjiyQw==
"@rollup/rollup-linux-powerpc64le-gnu@4.21.0": "@rollup/rollup-linux-powerpc64le-gnu@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-powerpc64le-gnu/-/rollup-linux-powerpc64le-gnu-4.21.0.tgz#493c5e19e395cf3c6bd860c7139c8a903dea72b4" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-powerpc64le-gnu/-/rollup-linux-powerpc64le-gnu-4.24.0.tgz#5b98729628d5bcc8f7f37b58b04d6845f85c7b5d"
integrity sha512-ZKPan1/RvAhrUylwBXC9t7B2hXdpb/ufeu22pG2psV7RN8roOfGurEghw1ySmX/CmDDHNTDDjY3lo9hRlgtaHg== integrity sha512-2XFFPJ2XMEiF5Zi2EBf4h73oR1V/lycirxZxHZNc93SqDN/IWhYYSYj8I9381ikUFXZrz2v7r2tOVk2NBwxrWw==
"@rollup/rollup-linux-riscv64-gnu@4.21.0": "@rollup/rollup-linux-riscv64-gnu@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.21.0.tgz#a2eab4346fbe5909165ce99adb935ba30c9fb444" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.24.0.tgz#48e42e41f4cabf3573cfefcb448599c512e22983"
integrity sha512-H1eRaCwd5E8eS8leiS+o/NqMdljkcb1d6r2h4fKSsCXQilLKArq6WS7XBLDu80Yz+nMqHVFDquwcVrQmGr28rg== integrity sha512-M3Dg4hlwuntUCdzU7KjYqbbd+BLq3JMAOhCKdBE3TcMGMZbKkDdJ5ivNdehOssMCIokNHFOsv7DO4rlEOfyKpg==
"@rollup/rollup-linux-s390x-gnu@4.21.0": "@rollup/rollup-linux-s390x-gnu@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.21.0.tgz#0bc49a79db4345d78d757bb1b05e73a1b42fa5c3" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.24.0.tgz#e0b4f9a966872cb7d3e21b9e412a4b7efd7f0b58"
integrity sha512-zJ4hA+3b5tu8u7L58CCSI0A9N1vkfwPhWd/puGXwtZlsB5bTkwDNW/+JCU84+3QYmKpLi+XvHdmrlwUwDA6kqw== integrity sha512-mjBaoo4ocxJppTorZVKWFpy1bfFj9FeCMJqzlMQGjpNPY9JwQi7OuS1axzNIk0nMX6jSgy6ZURDZ2w0QW6D56g==
"@rollup/rollup-linux-x64-gnu@4.21.0": "@rollup/rollup-linux-x64-gnu@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.21.0.tgz#4fd36a6a41f3406d8693321b13d4f9b7658dd4b9" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.24.0.tgz#78144741993100f47bd3da72fce215e077ae036b"
integrity sha512-e2hrvElFIh6kW/UNBQK/kzqMNY5mO+67YtEh9OA65RM5IJXYTWiXjX6fjIiPaqOkBthYF1EqgiZ6OXKcQsM0hg== integrity sha512-ZXFk7M72R0YYFN5q13niV0B7G8/5dcQ9JDp8keJSfr3GoZeXEoMHP/HlvqROA3OMbMdfr19IjCeNAnPUG93b6A==
"@rollup/rollup-linux-x64-musl@4.21.0": "@rollup/rollup-linux-x64-musl@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.21.0.tgz#10ebb13bd4469cbad1a5d9b073bd27ec8a886200" resolved "https://registry.yarnpkg.com/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.24.0.tgz#d9fe32971883cd1bd858336bd33a1c3ca6146127"
integrity sha512-1vvmgDdUSebVGXWX2lIcgRebqfQSff0hMEkLJyakQ9JQUbLDkEaMsPTLOmyccyC6IJ/l3FZuJbmrBw/u0A0uCQ== integrity sha512-w1i+L7kAXZNdYl+vFvzSZy8Y1arS7vMgIy8wusXJzRrPyof5LAb02KGr1PD2EkRcl73kHulIID0M501lN+vobQ==
"@rollup/rollup-win32-arm64-msvc@4.21.0": "@rollup/rollup-win32-arm64-msvc@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.21.0.tgz#2fef1a90f1402258ef915ae5a94cc91a5a1d5bfc" resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.24.0.tgz#71fa3ea369316db703a909c790743972e98afae5"
integrity sha512-s5oFkZ/hFcrlAyBTONFY1TWndfyre1wOMwU+6KCpm/iatybvrRgmZVM+vCFwxmC5ZhdlgfE0N4XorsDpi7/4XQ== integrity sha512-VXBrnPWgBpVDCVY6XF3LEW0pOU51KbaHhccHw6AS6vBWIC60eqsH19DAeeObl+g8nKAz04QFdl/Cefta0xQtUQ==
"@rollup/rollup-win32-ia32-msvc@4.21.0": "@rollup/rollup-win32-ia32-msvc@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.21.0.tgz#a18ad47a95c5f264defb60acdd8c27569f816fc1" resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.24.0.tgz#653f5989a60658e17d7576a3996deb3902e342e2"
integrity sha512-G9+TEqRnAA6nbpqyUqgTiopmnfgnMkR3kMukFBDsiyy23LZvUCpiUwjTRx6ezYCjJODXrh52rBR9oXvm+Fp5wg== integrity sha512-xrNcGDU0OxVcPTH/8n/ShH4UevZxKIO6HJFK0e15XItZP2UcaiLFd5kiX7hJnqCbSztUF8Qot+JWBC/QXRPYWQ==
"@rollup/rollup-win32-x64-msvc@4.21.0": "@rollup/rollup-win32-x64-msvc@4.24.0":
version "4.21.0" version "4.24.0"
resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.21.0.tgz#20c09cf44dcb082140cc7f439dd679fe4bba3375" resolved "https://registry.yarnpkg.com/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.24.0.tgz#0574d7e87b44ee8511d08cc7f914bcb802b70818"
integrity sha512-2jsCDZwtQvRhejHLfZ1JY6w6kEuEtfF9nzYsZxzSlNVKDX+DpsDJ+Rbjkm74nvg2rdx0gwBS+IMdvwJuq3S9pQ== integrity sha512-fbMkAF7fufku0N2dE5TBXcNlg0pt0cJue4xBRE2Qc5Vqikxr4VCgKj/ht6SMdFcOacVA9rqF70APJ8RN/4vMJw==
"@sec-ant/readable-stream@^0.4.1": "@sec-ant/readable-stream@^0.4.1":
version "0.4.1" version "0.4.1"
@ -949,7 +949,12 @@
"@types/filesystem" "*" "@types/filesystem" "*"
"@types/har-format" "*" "@types/har-format" "*"
"@types/estree@1.0.5", "@types/estree@^1.0.0": "@types/estree@1.0.6":
version "1.0.6"
resolved "https://registry.yarnpkg.com/@types/estree/-/estree-1.0.6.tgz#628effeeae2064a1b4e79f78e81d87b7e5fc7b50"
integrity sha512-AYnb1nQyY49te+VRAVgmzfcgjYS91mY5P0TKUDCLEM+gNnA+3T6rWITXRLYCpahpqSQbN5cE+gHpnPyXjHWxcw==
"@types/estree@^1.0.0":
version "1.0.5" version "1.0.5"
resolved "https://registry.npmjs.org/@types/estree/-/estree-1.0.5.tgz" resolved "https://registry.npmjs.org/@types/estree/-/estree-1.0.5.tgz"
integrity sha512-/kYRxGDLWzHOB7q+wtSUQlFrtcdUccpfy+X+9iMBpHK8QLLhx2wIPYuS5DYtR9Wa/YlZAbIovy7qVdB1Aq6Lyw== integrity sha512-/kYRxGDLWzHOB7q+wtSUQlFrtcdUccpfy+X+9iMBpHK8QLLhx2wIPYuS5DYtR9Wa/YlZAbIovy7qVdB1Aq6Lyw==
@ -4040,10 +4045,10 @@ perfect-debounce@^1.0.0:
resolved "https://registry.npmjs.org/perfect-debounce/-/perfect-debounce-1.0.0.tgz" resolved "https://registry.npmjs.org/perfect-debounce/-/perfect-debounce-1.0.0.tgz"
integrity sha512-xCy9V055GLEqoFaHoC1SoLIaLmWctgCUaBaWxDZ7/Zx4CTyX7cJQLJOok/orfjZAh9kEYpjJa4d0KcJmCbctZA== integrity sha512-xCy9V055GLEqoFaHoC1SoLIaLmWctgCUaBaWxDZ7/Zx4CTyX7cJQLJOok/orfjZAh9kEYpjJa4d0KcJmCbctZA==
picocolors@^1, picocolors@^1.0.0, picocolors@^1.0.1: picocolors@^1, picocolors@^1.0.0, picocolors@^1.0.1, picocolors@^1.1.0:
version "1.0.1" version "1.1.0"
resolved "https://registry.npmjs.org/picocolors/-/picocolors-1.0.1.tgz" resolved "https://registry.yarnpkg.com/picocolors/-/picocolors-1.1.0.tgz#5358b76a78cde483ba5cef6a9dc9671440b27d59"
integrity sha512-anP1Z8qwhkbmu7MFP5iTt+wQKXgwzf7zTyGlcdzabySa9vd0Xt392U0rVmz9poOaBj0uHJKyyo9/upk0HrEQew== integrity sha512-TQ92mBOW0l3LeMeyLV6mzy/kWr8lkd/hp3mTg7wYK7zJhuBStmGMBG0BdeDZS/dZx1IukaX6Bk11zcln25o1Aw==
picomatch@^2.0.4, picomatch@^2.2.1, picomatch@^2.3.1: picomatch@^2.0.4, picomatch@^2.2.1, picomatch@^2.3.1:
version "2.3.1" version "2.3.1"
@ -4113,14 +4118,14 @@ postcss-value-parser@^4.0.0, postcss-value-parser@^4.2.0:
resolved "https://registry.npmjs.org/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz" resolved "https://registry.npmjs.org/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz"
integrity sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ== integrity sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ==
postcss@^8.4.23, postcss@^8.4.38, postcss@^8.4.41: postcss@^8.4.23, postcss@^8.4.43, postcss@^8.4.47:
version "8.4.41" version "8.4.47"
resolved "https://registry.npmjs.org/postcss/-/postcss-8.4.41.tgz" resolved "https://registry.yarnpkg.com/postcss/-/postcss-8.4.47.tgz#5bf6c9a010f3e724c503bf03ef7947dcb0fea365"
integrity sha512-TesUflQ0WKZqAvg52PWL6kHgLKP6xB6heTOdoYM0Wt2UHyxNa4K25EZZMgKns3BH1RLVbZCREPpLY0rhnNoHVQ== integrity sha512-56rxCq7G/XfB4EkXq9Egn5GCqugWvDFjafDOThIdMBsI15iqPqR5r15TfSr1YPYeEI19YeaXMCbY6u88Y76GLQ==
dependencies: dependencies:
nanoid "^3.3.7" nanoid "^3.3.7"
picocolors "^1.0.1" picocolors "^1.1.0"
source-map-js "^1.2.0" source-map-js "^1.2.1"
prettier@^3.3.2: prettier@^3.3.2:
version "3.3.3" version "3.3.3"
@ -4408,28 +4413,28 @@ rimraf@~2.4.0:
glob "^6.0.1" glob "^6.0.1"
rollup@^4.20.0: rollup@^4.20.0:
version "4.21.0" version "4.24.0"
resolved "https://registry.npmjs.org/rollup/-/rollup-4.21.0.tgz" resolved "https://registry.yarnpkg.com/rollup/-/rollup-4.24.0.tgz#c14a3576f20622ea6a5c9cad7caca5e6e9555d05"
integrity sha512-vo+S/lfA2lMS7rZ2Qoubi6I5hwZwzXeUIctILZLbHI+laNtvhhOIon2S1JksA5UEDQ7l3vberd0fxK44lTYjbQ== integrity sha512-DOmrlGSXNk1DM0ljiQA+i+o0rSLhtii1je5wgk60j49d1jHT5YYttBv1iWOnYSTG+fZZESUOSNiAl89SIet+Cg==
dependencies: dependencies:
"@types/estree" "1.0.5" "@types/estree" "1.0.6"
optionalDependencies: optionalDependencies:
"@rollup/rollup-android-arm-eabi" "4.21.0" "@rollup/rollup-android-arm-eabi" "4.24.0"
"@rollup/rollup-android-arm64" "4.21.0" "@rollup/rollup-android-arm64" "4.24.0"
"@rollup/rollup-darwin-arm64" "4.21.0" "@rollup/rollup-darwin-arm64" "4.24.0"
"@rollup/rollup-darwin-x64" "4.21.0" "@rollup/rollup-darwin-x64" "4.24.0"
"@rollup/rollup-linux-arm-gnueabihf" "4.21.0" "@rollup/rollup-linux-arm-gnueabihf" "4.24.0"
"@rollup/rollup-linux-arm-musleabihf" "4.21.0" "@rollup/rollup-linux-arm-musleabihf" "4.24.0"
"@rollup/rollup-linux-arm64-gnu" "4.21.0" "@rollup/rollup-linux-arm64-gnu" "4.24.0"
"@rollup/rollup-linux-arm64-musl" "4.21.0" "@rollup/rollup-linux-arm64-musl" "4.24.0"
"@rollup/rollup-linux-powerpc64le-gnu" "4.21.0" "@rollup/rollup-linux-powerpc64le-gnu" "4.24.0"
"@rollup/rollup-linux-riscv64-gnu" "4.21.0" "@rollup/rollup-linux-riscv64-gnu" "4.24.0"
"@rollup/rollup-linux-s390x-gnu" "4.21.0" "@rollup/rollup-linux-s390x-gnu" "4.24.0"
"@rollup/rollup-linux-x64-gnu" "4.21.0" "@rollup/rollup-linux-x64-gnu" "4.24.0"
"@rollup/rollup-linux-x64-musl" "4.21.0" "@rollup/rollup-linux-x64-musl" "4.24.0"
"@rollup/rollup-win32-arm64-msvc" "4.21.0" "@rollup/rollup-win32-arm64-msvc" "4.24.0"
"@rollup/rollup-win32-ia32-msvc" "4.21.0" "@rollup/rollup-win32-ia32-msvc" "4.24.0"
"@rollup/rollup-win32-x64-msvc" "4.21.0" "@rollup/rollup-win32-x64-msvc" "4.24.0"
fsevents "~2.3.2" fsevents "~2.3.2"
run-applescript@^5.0.0: run-applescript@^5.0.0:
@ -4621,10 +4626,10 @@ solid-refresh@^0.6.3:
"@babel/helper-module-imports" "^7.22.15" "@babel/helper-module-imports" "^7.22.15"
"@babel/types" "^7.23.6" "@babel/types" "^7.23.6"
source-map-js@^1.2.0: source-map-js@^1.2.0, source-map-js@^1.2.1:
version "1.2.0" version "1.2.1"
resolved "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.0.tgz" resolved "https://registry.yarnpkg.com/source-map-js/-/source-map-js-1.2.1.tgz#1ce5650fddd87abc099eda37dcff024c2667ae46"
integrity sha512-itJW8lvSA0TXEphiRoawsCksnlf8SyvmFzIhltqAHluXd88pkCd+cXJVHTDwdCr0IzwptSm035IHQktUu1QUMg== integrity sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==
source-map-support@0.5.21: source-map-support@0.5.21:
version "0.5.21" version "0.5.21"
@ -4717,7 +4722,16 @@ stdin-discarder@^0.2.2:
resolved "https://registry.npmjs.org/stdin-discarder/-/stdin-discarder-0.2.2.tgz" resolved "https://registry.npmjs.org/stdin-discarder/-/stdin-discarder-0.2.2.tgz"
integrity sha512-UhDfHmA92YAlNnCfhmq0VeNL5bDbiZGg7sZ2IvPsXubGkiNa9EC+tUTsjBRsYUAz87btI6/1wf4XoVvQ3uRnmQ== integrity sha512-UhDfHmA92YAlNnCfhmq0VeNL5bDbiZGg7sZ2IvPsXubGkiNa9EC+tUTsjBRsYUAz87btI6/1wf4XoVvQ3uRnmQ==
"string-width-cjs@npm:string-width@^4.2.0", string-width@^4.1.0, string-width@^4.2.0, string-width@^4.2.3: "string-width-cjs@npm:string-width@^4.2.0":
version "4.2.3"
resolved "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz"
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
dependencies:
emoji-regex "^8.0.0"
is-fullwidth-code-point "^3.0.0"
strip-ansi "^6.0.1"
string-width@^4.1.0, string-width@^4.2.0, string-width@^4.2.3:
version "4.2.3" version "4.2.3"
resolved "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz" resolved "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz"
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g== integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
@ -4751,7 +4765,14 @@ string_decoder@^1.1.1, string_decoder@~1.1.1:
dependencies: dependencies:
safe-buffer "~5.1.0" safe-buffer "~5.1.0"
"strip-ansi-cjs@npm:strip-ansi@^6.0.1", strip-ansi@^6.0.0, strip-ansi@^6.0.1: "strip-ansi-cjs@npm:strip-ansi@^6.0.1":
version "6.0.1"
resolved "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz"
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
dependencies:
ansi-regex "^5.0.1"
strip-ansi@^6.0.0, strip-ansi@^6.0.1:
version "6.0.1" version "6.0.1"
resolved "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz" resolved "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz"
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A== integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
@ -5151,12 +5172,12 @@ vite-plugin-solid@^2.10.2:
vitefu "^0.2.5" vitefu "^0.2.5"
vite@^5.0.0, vite@^5.3.5: vite@^5.0.0, vite@^5.3.5:
version "5.4.2" version "5.4.9"
resolved "https://registry.npmjs.org/vite/-/vite-5.4.2.tgz" resolved "https://registry.yarnpkg.com/vite/-/vite-5.4.9.tgz#215c80cbebfd09ccbb9ceb8c0621391c9abdc19c"
integrity sha512-dDrQTRHp5C1fTFzcSaMxjk6vdpKvT+2/mIdE07Gw2ykehT49O0z/VHS3zZ8iV/Gh8BJJKHWOe5RjaNrW5xf/GA== integrity sha512-20OVpJHh0PAM0oSOELa5GaZNWeDjcAvQjGXy2Uyr+Tp+/D2/Hdz6NLgpJLsarPTA2QJ6v8mX2P1ZfbsSKvdMkg==
dependencies: dependencies:
esbuild "^0.21.3" esbuild "^0.21.3"
postcss "^8.4.41" postcss "^8.4.43"
rollup "^4.20.0" rollup "^4.20.0"
optionalDependencies: optionalDependencies:
fsevents "~2.3.3" fsevents "~2.3.3"
@ -5275,8 +5296,16 @@ winreg@0.0.12:
resolved "https://registry.npmjs.org/winreg/-/winreg-0.0.12.tgz" resolved "https://registry.npmjs.org/winreg/-/winreg-0.0.12.tgz"
integrity sha512-typ/+JRmi7RqP1NanzFULK36vczznSNN8kWVA9vIqXyv8GhghUlwhGp1Xj3Nms1FsPcNnsQrJOR10N58/nQ9hQ== integrity sha512-typ/+JRmi7RqP1NanzFULK36vczznSNN8kWVA9vIqXyv8GhghUlwhGp1Xj3Nms1FsPcNnsQrJOR10N58/nQ9hQ==
"wrap-ansi-cjs@npm:wrap-ansi@^7.0.0", wrap-ansi@^7.0.0: "wrap-ansi-cjs@npm:wrap-ansi@^7.0.0":
name wrap-ansi-cjs version "7.0.0"
resolved "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz"
integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==
dependencies:
ansi-styles "^4.0.0"
string-width "^4.1.0"
strip-ansi "^6.0.0"
wrap-ansi@^7.0.0:
version "7.0.0" version "7.0.0"
resolved "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz" resolved "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz"
integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q== integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==

(One file's diff is suppressed because its lines are too long; a binary image asset changed from 567 KiB before to 519 KiB after.)


@ -25,7 +25,7 @@
}, },
"devDependencies": { "devDependencies": {
"@openreplay/tracker": "file:../tracker", "@openreplay/tracker": "file:../tracker",
"axios": "^0.26.0", "axios": "^1.7.7",
"prettier": "^1.18.2", "prettier": "^1.18.2",
"replace-in-files-cli": "^1.0.0", "replace-in-files-cli": "^1.0.0",
"typescript": "^4.6.0-dev.20211126" "typescript": "^4.6.0-dev.20211126"


@ -10,7 +10,7 @@
"@types/react": "^17.0.0", "@types/react": "^17.0.0",
"@types/react-dom": "^18.3.0", "@types/react-dom": "^18.3.0",
"@types/uuid": "^8.3.3", "@types/uuid": "^8.3.3",
"axios": "^0.27.2", "axios": "^1.7.7",
"react": "^18.3.1", "react": "^18.3.1",
"react-bootstrap": "^2.10.2", "react-bootstrap": "^2.10.2",
"react-dom": "^18.3.1", "react-dom": "^18.3.1",


@ -52,6 +52,12 @@ export interface StartOptions {
forceNew?: boolean forceNew?: boolean
sessionHash?: string sessionHash?: string
assistOnly?: boolean assistOnly?: boolean
/**
* @deprecated We strongly advise to use .start().then instead.
*
* This method is kept for snippet compatibility only
* */
startCallback?: (result: StartPromiseReturn) => void
} }
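// A minimal sketch of the promise-based pattern the deprecation note above points to.
// Only start() and the StartPromiseReturn type are taken from this file; the `tracker`
// instance and what you do with `result` are illustrative:
//   tracker.start().then((result: StartPromiseReturn) => {
//     // inspect `result` here instead of passing startCallback
//   })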
interface OnStartInfo { interface OnStartInfo {
@ -161,6 +167,12 @@ type AppOptions = {
} }
network?: NetworkOptions network?: NetworkOptions
/**
* use this flag if you're using Angular
* basically goes around window.Zone api changes to mutation observer
* and event listeners
* */
angularMode?: boolean
} & WebworkerOptions & } & WebworkerOptions &
SessOptions SessOptions
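For orientation, a sketch of how the new options surface when the tracker is constructed. Only angularMode and the crossdomain.enabled / crossdomain.parentDomain fields are taken from this diff (together with projectKey and ingestPoint, which appear elsewhere in the file); the package import and constructor-with-options usage follow the tracker's usual pattern, and the concrete values are placeholders, not an authoritative configuration.

import Tracker from '@openreplay/tracker'

const tracker = new Tracker({
  projectKey: 'PROJECT_KEY_PLACEHOLDER',
  ingestPoint: 'https://openreplay.example.com/ingest',
  // work around window.Zone patching of MutationObserver and event listeners
  angularMode: true,
  // boot and control trackers inside cross-origin child iframes
  crossdomain: {
    enabled: true,
    parentDomain: 'https://parent.example.com',
  },
})

void tracker.start()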
@ -185,12 +197,14 @@ const proto = {
resp: 'never-gonna-let-you-down', resp: 'never-gonna-let-you-down',
// regenerating id (copied other tab) // regenerating id (copied other tab)
reg: 'never-gonna-run-around-and-desert-you', reg: 'never-gonna-run-around-and-desert-you',
// tracker inside a child iframe iframeSignal: 'tracker inside a child iframe',
iframeSignal: 'never-gonna-make-you-cry', iframeId: 'getting node id for child iframe',
// getting node id for child iframe iframeBatch: 'batch of messages from an iframe window',
iframeId: 'never-gonna-say-goodbye', parentAlive: 'signal that parent is live',
// batch of messages from an iframe window killIframe: 'stop tracker inside frame',
iframeBatch: 'never-gonna-tell-a-lie-and-hurt-you', startIframe: 'start tracker inside frame',
// checking updates
polling: 'hello-how-are-you-im-under-the-water-please-help-me',
} as const } as const
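// These renamed constants are the `line` discriminator carried in every postMessage
// payload exchanged between parent and child frames; for example, a child frame
// announces itself roughly like this (mirroring signalIframeTracker below, with
// tabId / contextId / parentDomain as stand-in identifiers):
//   window.parent.postMessage(
//     { line: proto.iframeSignal, source: tabId, context: contextId },
//     parentDomain ?? '*',
//   )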
export default class App { export default class App {
@ -237,7 +251,6 @@ export default class App {
private rootId: number | null = null private rootId: number | null = null
private pageFrames: HTMLIFrameElement[] = [] private pageFrames: HTMLIFrameElement[] = []
private frameOderNumber = 0 private frameOderNumber = 0
private readonly initialHostName = location.hostname
private features = { private features = {
'feature-flags': true, 'feature-flags': true,
'usability-test': true, 'usability-test': true,
@ -248,7 +261,7 @@ export default class App {
sessionToken: string | undefined, sessionToken: string | undefined,
options: Partial<Options>, options: Partial<Options>,
private readonly signalError: (error: string, apis: string[]) => void, private readonly signalError: (error: string, apis: string[]) => void,
private readonly insideIframe: boolean, public readonly insideIframe: boolean,
) { ) {
this.contextId = Math.random().toString(36).slice(2) this.contextId = Math.random().toString(36).slice(2)
this.projectKey = projectKey this.projectKey = projectKey
@ -305,6 +318,7 @@ export default class App {
__save_canvas_locally: false, __save_canvas_locally: false,
useAnimationFrame: false, useAnimationFrame: false,
}, },
angularMode: false,
} }
this.options = simpleMerge(defaultOptions, options) this.options = simpleMerge(defaultOptions, options)
@ -322,7 +336,7 @@ export default class App {
this.localStorage = this.options.localStorage ?? window.localStorage this.localStorage = this.options.localStorage ?? window.localStorage
this.sessionStorage = this.options.sessionStorage ?? window.sessionStorage this.sessionStorage = this.options.sessionStorage ?? window.sessionStorage
this.sanitizer = new Sanitizer(this, options) this.sanitizer = new Sanitizer(this, options)
this.nodes = new Nodes(this.options.node_id) this.nodes = new Nodes(this.options.node_id, Boolean(options.angularMode))
this.observer = new Observer(this, options) this.observer = new Observer(this, options)
this.ticker = new Ticker(this) this.ticker = new Ticker(this)
this.ticker.attach(() => this.commit()) this.ticker.attach(() => this.commit())
@ -348,136 +362,31 @@ export default class App {
this.session.applySessionHash(sessionToken) this.session.applySessionHash(sessionToken)
} }
this.initWorker()
const thisTab = this.session.getTabId() const thisTab = this.session.getTabId()
if (this.insideIframe) {
/**
* listen for messages from parent window, so we can signal that we're alive
* */
window.addEventListener('message', this.parentCrossDomainFrameListener)
setInterval(() => {
window.parent.postMessage(
{
line: proto.polling,
context: this.contextId,
},
'*',
)
}, 250)
} else {
this.initWorker()
}
if (!this.insideIframe) { if (!this.insideIframe) {
/** /**
* if we get a signal from child iframes, we check for their node_id and send it back, * if we get a signal from child iframes, we check for their node_id and send it back,
* so they can act as if it was just a same-domain iframe * so they can act as if it was just a same-domain iframe
* */ * */
let crossdomainFrameCount = 0 window.addEventListener('message', this.crossDomainIframeListener)
const catchIframeMessage = (event: MessageEvent) => {
const { data } = event
if (data.line === proto.iframeSignal) {
const childIframeDomain = data.domain
const pageIframes = Array.from(document.querySelectorAll('iframe'))
this.pageFrames = pageIframes
const signalId = async () => {
let tries = 0
while (tries < 10) {
const id = this.checkNodeId(pageIframes, childIframeDomain)
if (id) {
this.waitStarted()
.then(() => {
crossdomainFrameCount++
const token = this.session.getSessionToken()
const iframeData = {
line: proto.iframeId,
context: this.contextId,
domain: childIframeDomain,
id,
token,
frameOrderNumber: crossdomainFrameCount,
}
this.debug.log('iframe_data', iframeData)
// @ts-ignore
event.source?.postMessage(iframeData, '*')
})
.catch(console.error)
tries = 10
break
}
tries++
await delay(100)
}
}
void signalId()
}
/**
* proxying messages from iframe to main body, so they can be in one batch (same indexes, etc)
* plus we rewrite some of the messages to be relative to the main context/window
* */
if (data.line === proto.iframeBatch) {
const msgBatch = data.messages
const mappedMessages: Message[] = msgBatch.map((msg: Message) => {
if (msg[0] === MType.MouseMove) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.dataset.domain === event.data.domain) {
const [type, x, y] = msg
const { left, top } = frame.getBoundingClientRect()
fixedMessage = [type, x + left, y + top]
}
})
return fixedMessage
}
if (msg[0] === MType.MouseClick) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.dataset.domain === event.data.domain) {
const [type, id, hesitationTime, label, selector, normX, normY] = msg
const { left, top, width, height } = frame.getBoundingClientRect()
const contentWidth = document.documentElement.scrollWidth
const contentHeight = document.documentElement.scrollHeight
// (normalizedX * frameWidth + frameLeftOffset)/docSize
const fullX = (normX / 100) * width + left
const fullY = (normY / 100) * height + top
const fixedX = fullX / contentWidth
const fixedY = fullY / contentHeight
fixedMessage = [
type,
id,
hesitationTime,
label,
selector,
Math.round(fixedX * 1e3) / 1e1,
Math.round(fixedY * 1e3) / 1e1,
]
}
})
return fixedMessage
}
return msg
})
this.messages.push(...mappedMessages)
}
}
window.addEventListener('message', catchIframeMessage)
this.attachStopCallback(() => {
window.removeEventListener('message', catchIframeMessage)
})
} else {
const catchParentMessage = (event: MessageEvent) => {
const { data } = event
if (data.line !== proto.iframeId) {
return
}
this.rootId = data.id
this.session.setSessionToken(data.token as string)
this.frameOderNumber = data.frameOrderNumber
this.debug.log('starting iframe tracking', data)
this.allowAppStart()
}
window.addEventListener('message', catchParentMessage)
this.attachStopCallback(() => {
window.removeEventListener('message', catchParentMessage)
})
// communicating with parent window,
// even if its crossdomain is possible via postMessage api
const domain = this.initialHostName
window.parent.postMessage(
{
line: proto.iframeSignal,
source: thisTab,
context: this.contextId,
domain,
},
'*',
)
} }
if (this.bc !== null) { if (this.bc !== null) {
@ -488,7 +397,7 @@ export default class App {
}) })
this.startTimeout = setTimeout(() => { this.startTimeout = setTimeout(() => {
this.allowAppStart() this.allowAppStart()
}, 500) }, 250)
this.bc.onmessage = (ev: MessageEvent<RickRoll>) => { this.bc.onmessage = (ev: MessageEvent<RickRoll>) => {
if (ev.data.context === this.contextId) { if (ev.data.context === this.contextId) {
return return
@ -519,8 +428,204 @@ export default class App {
} }
} }
/** used by child iframes for crossdomain only */
/** used by child iframes for crossdomain only */
parentActive = false
checkStatus = () => {
return this.parentActive
}
parentCrossDomainFrameListener = (event: MessageEvent) => {
const { data } = event
if (!data || event.source === window) return
if (data.line === proto.startIframe) {
if (this.active()) return
try {
this.allowAppStart()
void this.start()
} catch (e) {
console.error('children frame restart failed:', e)
}
}
if (data.line === proto.parentAlive) {
this.parentActive = true
}
if (data.line === proto.iframeId) {
this.parentActive = true
this.rootId = data.id
this.session.setSessionToken(data.token as string)
this.frameOderNumber = data.frameOrderNumber
this.debug.log('starting iframe tracking', data)
this.allowAppStart()
}
if (data.line === proto.killIframe) {
if (this.active()) {
this.stop()
}
}
}
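// Taken together, the two listeners sketch the following cross-domain handshake
// (reconstructed from the handlers in this diff, not an authoritative spec):
//   1. child  -> parent: { line: proto.iframeSignal, source, context }         (signalIframeTracker, retried every 250 ms)
//   2. parent -> child : { line: proto.parentAlive }                           (child stops re-signalling)
//   3. parent -> child : { line: proto.iframeId, id, token, frameOrderNumber } (after checkNodeId + waitStarted; child calls allowAppStart)
//   4. child  -> parent: { line: proto.iframeBatch, messages }                 (batches proxied into the parent's message stream)
//   5. commands queued via bootChildrenFrames / killChildrenFrames are drained
//      one frame at a time in response to proto.polling messages (see below).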
/**
* context ids for iframes,
* order is not so important as long as its consistent
* */
trackedFrames: string[] = []
crossDomainIframeListener = (event: MessageEvent) => {
if (!this.active() || event.source === window) return
const { data } = event
if (!data) return
if (data.line === proto.iframeSignal) {
// @ts-ignore
event.source?.postMessage({ ping: true, line: proto.parentAlive }, '*')
const pageIframes = Array.from(document.querySelectorAll('iframe'))
this.pageFrames = pageIframes
const signalId = async () => {
if (event.source === null) {
return console.error('Couldnt connect to event.source for child iframe tracking')
}
const id = await this.checkNodeId(pageIframes, event.source)
if (id && !this.trackedFrames.includes(data.context)) {
try {
this.trackedFrames.push(data.context)
await this.waitStarted()
const token = this.session.getSessionToken()
const order = this.trackedFrames.findIndex((f) => f === data.context) + 1
if (order === 0) {
this.debug.error(
'Couldnt get order number for iframe',
data.context,
this.trackedFrames,
)
}
const iframeData = {
line: proto.iframeId,
id,
token,
// since indexes go from 0 we +1
frameOrderNumber: order,
}
this.debug.log('Got child frame signal; nodeId', id, event.source, iframeData)
// @ts-ignore
event.source?.postMessage(iframeData, '*')
} catch (e) {
console.error(e)
}
} else {
this.debug.log('Couldnt get node id for iframe', event.source, pageIframes)
}
}
void signalId()
}
/**
* proxying messages from iframe to main body, so they can be in one batch (same indexes, etc)
* plus we rewrite some of the messages to be relative to the main context/window
* */
if (data.line === proto.iframeBatch) {
const msgBatch = data.messages
const mappedMessages: Message[] = msgBatch.map((msg: Message) => {
if (msg[0] === MType.MouseMove) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.contentWindow === event.source) {
const [type, x, y] = msg
const { left, top } = frame.getBoundingClientRect()
fixedMessage = [type, x + left, y + top]
}
})
return fixedMessage
}
if (msg[0] === MType.MouseClick) {
let fixedMessage = msg
this.pageFrames.forEach((frame) => {
if (frame.contentWindow === event.source) {
const [type, id, hesitationTime, label, selector, normX, normY] = msg
const { left, top, width, height } = frame.getBoundingClientRect()
const contentWidth = document.documentElement.scrollWidth
const contentHeight = document.documentElement.scrollHeight
// (normalizedX * frameWidth + frameLeftOffset)/docSize
const fullX = (normX / 100) * width + left
const fullY = (normY / 100) * height + top
const fixedX = fullX / contentWidth
const fixedY = fullY / contentHeight
fixedMessage = [
type,
id,
hesitationTime,
label,
selector,
Math.round(fixedX * 1e3) / 1e1,
Math.round(fixedY * 1e3) / 1e1,
]
}
})
return fixedMessage
}
return msg
})
this.messages.push(...mappedMessages)
}
if (data.line === proto.polling) {
if (!this.pollingQueue.order.length) {
return
}
const nextCommand = this.pollingQueue.order[0]
if (this.pollingQueue[nextCommand].includes(data.context)) {
this.pollingQueue[nextCommand] = this.pollingQueue[nextCommand].filter(
(c: string) => c !== data.context,
)
// @ts-ignore
event.source?.postMessage({ line: nextCommand }, '*')
if (this.pollingQueue[nextCommand].length === 0) {
this.pollingQueue.order.shift()
}
}
}
}
/**
* { command : [remaining iframes] }
* + order of commands
**/
pollingQueue: Record<string, any> = {
order: [],
}
private readonly addCommand = (cmd: string) => {
this.pollingQueue.order.push(cmd)
this.pollingQueue[cmd] = [...this.trackedFrames]
}
public bootChildrenFrames = async () => {
await this.waitStarted()
this.addCommand(proto.startIframe)
}
public killChildrenFrames = () => {
this.addCommand(proto.killIframe)
}
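// Illustrative queue state (method and proto names from above, frame contexts hypothetical):
// with trackedFrames = ['ctxA', 'ctxB'], calling bootChildrenFrames() produces
//   pollingQueue = {
//     order: ['start tracker inside frame'],
//     'start tracker inside frame': ['ctxA', 'ctxB'],
//   }
// Each proto.polling message removes the sender's context from the head command's
// list and replies with that command; once the list is empty the command is
// shifted off `order`.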
signalIframeTracker = () => {
const thisTab = this.session.getTabId()
const signalToParent = (n: number) => {
window.parent.postMessage(
{
line: proto.iframeSignal,
source: thisTab,
context: this.contextId,
},
this.options.crossdomain?.parentDomain ?? '*',
)
setTimeout(() => {
if (!this.checkStatus() && n < 100) {
void signalToParent(n + 1)
}
}, 250)
}
void signalToParent(1)
}
startTimeout: ReturnType<typeof setTimeout> | null = null startTimeout: ReturnType<typeof setTimeout> | null = null
private allowAppStart() { public allowAppStart() {
this.canStart = true this.canStart = true
if (this.startTimeout) { if (this.startTimeout) {
clearTimeout(this.startTimeout) clearTimeout(this.startTimeout)
@ -528,15 +633,38 @@ export default class App {
} }
} }
private checkNodeId(iframes: HTMLIFrameElement[], domain: string) { private async checkNodeId(
iframes: HTMLIFrameElement[],
source: MessageEventSource,
): Promise<number | null> {
for (const iframe of iframes) { for (const iframe of iframes) {
if (iframe.dataset.domain === domain) { if (iframe.contentWindow && iframe.contentWindow === source) {
// @ts-ignore /**
return iframe[this.options.node_id] as number | undefined * Here we're trying to get node id from the iframe (which is kept in observer)
* because of async nature of dom initialization, we give 100 retries with 100ms delay each
* which equals to 10 seconds. This way we have a period where we give app some time to load
* and tracker some time to parse the initial DOM tree even on slower devices
* */
let tries = 0
while (tries < 100) {
// @ts-ignore
const potentialId = iframe[this.options.node_id]
if (potentialId !== undefined) {
tries = 100
return potentialId
} else {
tries++
await delay(100)
}
}
return null
} }
} }
return null return null
} }
private initWorker() { private initWorker() {
try { try {
this.worker = new Worker( this.worker = new Worker(
@ -647,28 +775,28 @@ export default class App {
this.messages.length = 0 this.messages.length = 0
return return
} }
if (this.worker === undefined || !this.messages.length) {
return
}
if (this.insideIframe) { if (this.insideIframe) {
window.parent.postMessage( window.parent.postMessage(
{ {
line: proto.iframeBatch, line: proto.iframeBatch,
messages: this.messages, messages: this.messages,
domain: this.initialHostName,
}, },
'*', this.options.crossdomain?.parentDomain ?? '*',
) )
this.commitCallbacks.forEach((cb) => cb(this.messages)) this.commitCallbacks.forEach((cb) => cb(this.messages))
this.messages.length = 0 this.messages.length = 0
return return
} }
if (this.worker === undefined || !this.messages.length) {
return
}
try { try {
requestIdleCb(() => { requestIdleCb(() => {
this.messages.unshift(TabData(this.session.getTabId())) this.messages.unshift(TabData(this.session.getTabId()))
this.messages.unshift(Timestamp(this.timestamp())) this.messages.unshift(Timestamp(this.timestamp()))
// why I need to add opt chaining?
this.worker?.postMessage(this.messages) this.worker?.postMessage(this.messages)
this.commitCallbacks.forEach((cb) => cb(this.messages)) this.commitCallbacks.forEach((cb) => cb(this.messages))
this.messages.length = 0 this.messages.length = 0
@ -740,36 +868,39 @@ export default class App {
this.commitCallbacks.push(cb) this.commitCallbacks.push(cb)
} }
attachStartCallback(cb: StartCallback, useSafe = false): void { attachStartCallback = (cb: StartCallback, useSafe = false): void => {
if (useSafe) { if (useSafe) {
cb = this.safe(cb) cb = this.safe(cb)
} }
this.startCallbacks.push(cb) this.startCallbacks.push(cb)
} }
attachStopCallback(cb: () => any, useSafe = false): void { attachStopCallback = (cb: () => any, useSafe = false): void => {
if (useSafe) { if (useSafe) {
cb = this.safe(cb) cb = this.safe(cb)
} }
this.stopCallbacks.push(cb) this.stopCallbacks.push(cb)
} }
// Use app.nodes.attachNodeListener for registered nodes instead attachEventListener = (
attachEventListener(
target: EventTarget, target: EventTarget,
type: string, type: string,
listener: EventListener, listener: EventListener,
useSafe = true, useSafe = true,
useCapture = true, useCapture = true,
): void { ): void => {
if (useSafe) { if (useSafe) {
listener = this.safe(listener) listener = this.safe(listener)
} }
const createListener = () => const createListener = () =>
target ? createEventListener(target, type, listener, useCapture) : null target
? createEventListener(target, type, listener, useCapture, this.options.angularMode)
: null
const deleteListener = () => const deleteListener = () =>
target ? deleteEventListener(target, type, listener, useCapture) : null target
? deleteEventListener(target, type, listener, useCapture, this.options.angularMode)
: null
this.attachStartCallback(createListener, useSafe) this.attachStartCallback(createListener, useSafe)
this.attachStopCallback(deleteListener, useSafe) this.attachStopCallback(deleteListener, useSafe)
@ -1157,7 +1288,7 @@ export default class App {
if (isColdStart && this.coldInterval) { if (isColdStart && this.coldInterval) {
clearInterval(this.coldInterval) clearInterval(this.coldInterval)
} }
if (!this.worker) { if (!this.worker && !this.insideIframe) {
const reason = 'No worker found: perhaps, CSP is not set.' const reason = 'No worker found: perhaps, CSP is not set.'
this.signalError(reason, []) this.signalError(reason, [])
return Promise.resolve(UnsuccessfulStart(reason)) return Promise.resolve(UnsuccessfulStart(reason))
@ -1189,7 +1320,7 @@ export default class App {
}) })
const timestamp = now() const timestamp = now()
this.worker.postMessage({ this.worker?.postMessage({
type: 'start', type: 'start',
pageNo: this.session.incPageNo(), pageNo: this.session.incPageNo(),
ingestPoint: this.options.ingestPoint, ingestPoint: this.options.ingestPoint,
@ -1237,7 +1368,7 @@ export default class App {
const reason = error === CANCELED ? CANCELED : `Server error: ${r.status}. ${error}` const reason = error === CANCELED ? CANCELED : `Server error: ${r.status}. ${error}`
return UnsuccessfulStart(reason) return UnsuccessfulStart(reason)
} }
if (!this.worker) { if (!this.worker && !this.insideIframe) {
const reason = 'no worker found after start request (this should not happen in real world)' const reason = 'no worker found after start request (this should not happen in real world)'
this.signalError(reason, []) this.signalError(reason, [])
return UnsuccessfulStart(reason) return UnsuccessfulStart(reason)
@ -1295,9 +1426,9 @@ export default class App {
if (socketOnly) { if (socketOnly) {
this.socketMode = true this.socketMode = true
this.worker.postMessage('stop') this.worker?.postMessage('stop')
} else { } else {
this.worker.postMessage({ this.worker?.postMessage({
type: 'auth', type: 'auth',
token, token,
beaconSizeLimit, beaconSizeLimit,
@ -1320,11 +1451,17 @@ export default class App {
// TODO: start as early as possible (before receiving the token) // TODO: start as early as possible (before receiving the token)
/** after start */ /** after start */
this.startCallbacks.forEach((cb) => cb(onStartInfo)) // MBTODO: callbacks after DOM "mounted" (observed) this.startCallbacks.forEach((cb) => cb(onStartInfo)) // MBTODO: callbacks after DOM "mounted" (observed)
if (startOpts.startCallback) {
startOpts.startCallback(SuccessfulStart(onStartInfo))
}
if (this.features['feature-flags']) { if (this.features['feature-flags']) {
void this.featureFlags.reloadFlags() void this.featureFlags.reloadFlags()
} }
await this.tagWatcher.fetchTags(this.options.ingestPoint, token) await this.tagWatcher.fetchTags(this.options.ingestPoint, token)
this.activityState = ActivityState.Active this.activityState = ActivityState.Active
if (this.options.crossdomain?.enabled && !this.insideIframe) {
void this.bootChildrenFrames()
}
if (canvasEnabled && !this.options.canvas.disableCanvas) { if (canvasEnabled && !this.options.canvas.disableCanvas) {
this.canvasRecorder = this.canvasRecorder =
@ -1336,7 +1473,6 @@ export default class App {
fixedScaling: this.options.canvas.fixedCanvasScaling, fixedScaling: this.options.canvas.fixedCanvasScaling,
useAnimationFrame: this.options.canvas.useAnimationFrame, useAnimationFrame: this.options.canvas.useAnimationFrame,
}) })
this.canvasRecorder.startTracking()
} }
/** --------------- COLD START BUFFER ------------------*/ /** --------------- COLD START BUFFER ------------------*/
@ -1359,9 +1495,12 @@ export default class App {
} }
this.ticker.start() this.ticker.start()
} }
this.canvasRecorder?.startTracking()
if (this.features['usability-test']) { if (this.features['usability-test']) {
this.uxtManager = this.uxtManager ? this.uxtManager : new UserTestManager(this, uxtStorageKey) this.uxtManager = this.uxtManager
? this.uxtManager
: new UserTestManager(this, uxtStorageKey)
let uxtId: number | undefined let uxtId: number | undefined
const savedUxtTag = this.localStorage.getItem(uxtStorageKey) const savedUxtTag = this.localStorage.getItem(uxtStorageKey)
if (savedUxtTag) { if (savedUxtTag) {
@ -1394,6 +1533,11 @@ export default class App {
} catch (reason) { } catch (reason) {
this.stop() this.stop()
this.session.reset() this.session.reset()
if (!reason) {
console.error('Unknown error during start')
this.signalError('Unknown error', [])
return UnsuccessfulStart('Unknown error')
}
if (reason === CANCELED) { if (reason === CANCELED) {
this.signalError(CANCELED, []) this.signalError(CANCELED, [])
return UnsuccessfulStart(CANCELED) return UnsuccessfulStart(CANCELED)
@ -1452,9 +1596,13 @@ export default class App {
} }
async waitStarted() { async waitStarted() {
return this.waitStatus(ActivityState.Active)
}
async waitStatus(status: ActivityState) {
return new Promise((resolve) => { return new Promise((resolve) => {
const check = () => { const check = () => {
if (this.activityState === ActivityState.Active) { if (this.activityState === status) {
resolve(true) resolve(true)
} else { } else {
setTimeout(check, 25) setTimeout(check, 25)
@ -1478,6 +1626,10 @@ export default class App {
return Promise.resolve(UnsuccessfulStart(reason)) return Promise.resolve(UnsuccessfulStart(reason))
} }
if (this.insideIframe) {
this.signalIframeTracker()
}
if (!document.hidden) { if (!document.hidden) {
await this.waitStart() await this.waitStart()
return this._start(...args) return this._start(...args)
@ -1533,20 +1685,28 @@ export default class App {
stop(stopWorker = true): void { stop(stopWorker = true): void {
if (this.activityState !== ActivityState.NotActive) { if (this.activityState !== ActivityState.NotActive) {
try { try {
if (!this.insideIframe && this.options.crossdomain?.enabled) {
this.killChildrenFrames()
}
this.attributeSender.clear() this.attributeSender.clear()
this.sanitizer.clear() this.sanitizer.clear()
this.observer.disconnect() this.observer.disconnect()
this.nodes.clear() this.nodes.clear()
this.ticker.stop() this.ticker.stop()
this.stopCallbacks.forEach((cb) => cb()) this.stopCallbacks.forEach((cb) => cb())
this.debug.log('OpenReplay tracking stopped.')
this.tagWatcher.clear() this.tagWatcher.clear()
if (this.worker && stopWorker) { if (this.worker && stopWorker) {
this.worker.postMessage('stop') this.worker.postMessage('stop')
} }
this.canvasRecorder?.clear() this.canvasRecorder?.clear()
this.messages.length = 0
this.trackedFrames = []
this.parentActive = false
this.canStart = false
this.pollingQueue = { order: [] }
} finally { } finally {
this.activityState = ActivityState.NotActive this.activityState = ActivityState.NotActive
this.debug.log('OpenReplay tracking stopped.')
} }
} }
} }


@@ -10,10 +10,13 @@ export default class Nodes {
   private readonly elementListeners: Map<number, Array<ElementListener>> = new Map()
   private nextNodeId = 0

-  constructor(private readonly node_id: string) {}
+  constructor(
+    private readonly node_id: string,
+    private readonly angularMode: boolean,
+  ) {}

   syntheticMode(frameOrder: number) {
-    const maxSafeNumber = 9007199254740900
+    const maxSafeNumber = Number.MAX_SAFE_INTEGER
     const placeholderSize = 99999999
     const nextFrameId = placeholderSize * frameOrder
     // I highly doubt that this will ever happen,
@@ -25,7 +28,7 @@ export default class Nodes {
   }

   // Attached once per Tracker instance
-  attachNodeCallback(nodeCallback: NodeCallback): void {
+  attachNodeCallback = (nodeCallback: NodeCallback): void => {
     this.nodeCallbacks.push(nodeCallback)
   }
@@ -33,12 +36,12 @@ export default class Nodes {
     this.nodes.forEach((node) => cb(node))
   }

-  attachNodeListener(node: Node, type: string, listener: EventListener, useCapture = true): void {
+  attachNodeListener = (node: Node, type: string, listener: EventListener, useCapture = true): void => {
     const id = this.getID(node)
     if (id === undefined) {
       return
     }
-    createEventListener(node, type, listener, useCapture)
+    createEventListener(node, type, listener, useCapture, this.angularMode)
     let listeners = this.elementListeners.get(id)
     if (listeners === undefined) {
       listeners = []
@@ -70,7 +73,7 @@ export default class Nodes {
     if (listeners !== undefined) {
       this.elementListeners.delete(id)
       listeners.forEach((listener) =>
-        deleteEventListener(node, listener[0], listener[1], listener[2]),
+        deleteEventListener(node, listener[0], listener[1], listener[2], this.angularMode),
       )
     }
     this.totalNodeAmount--
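The `Nodes` changes above add an `angularMode` flag to the constructor and thread it into every `createEventListener`/`deleteEventListener` call; `attachNodeCallback` and `attachNodeListener` also become arrow-function properties, presumably so they can be passed around as callbacks without rebinding `this`. A reduced sketch of the flag-threading shape (the class body is heavily trimmed and the id string is made up):

```ts
// Reduced sketch; the real Nodes class also manages node IDs, callbacks and unregistration.
type StoredListener = [type: string, cb: EventListener, capture: boolean]

class NodesSketch {
  private readonly listeners: StoredListener[] = []

  constructor(
    private readonly node_id: string,
    private readonly angularMode: boolean, // decided once, reused for every attach/detach
  ) {}

  attachNodeListener = (node: Node, type: string, cb: EventListener, capture = true): void => {
    // The real code routes through createEventListener(node, type, cb, capture, this.angularMode)
    // so the zone-aware path is only taken when angularMode is on.
    node.addEventListener(type, cb, capture)
    this.listeners.push([type, cb, capture])
  }
}

// Mirrors the updated unit test: the second constructor argument is now required.
const nodes = new NodesSketch('openreplay-node-id', false)
nodes.attachNodeListener(document.body, 'click', () => {})
```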


@@ -19,13 +19,13 @@ export default class IFrameObserver extends Observer {
     })
   }

-  syntheticObserve(selfId: number, doc: Document) {
+  syntheticObserve(rootNodeId: number, doc: Document) {
     this.observeRoot(doc, (docID) => {
       if (docID === undefined) {
         this.app.debug.log('OpenReplay: Iframe document not bound')
         return
       }
-      this.app.send(CreateIFrameDocument(selfId, docID))
+      this.app.send(CreateIFrameDocument(rootNodeId, docID))
     })
   }
 }


@@ -1,4 +1,4 @@
-import { createMutationObserver, ngSafeBrowserMethod } from '../../utils.js'
+import { createMutationObserver } from '../../utils.js'
 import {
   RemoveNodeAttribute,
   SetNodeAttributeURLBased,
@@ -105,6 +105,9 @@ export default abstract class Observer {
           if (name === null) {
             continue
           }
+          if (target instanceof HTMLIFrameElement && name === 'src') {
+            this.handleIframeSrcChange(target)
+          }
           let attr = this.attributesMap.get(id)
           if (attr === undefined) {
             this.attributesMap.set(id, (attr = new Set()))
@@ -119,6 +122,7 @@ export default abstract class Observer {
         }
         this.commitNodes()
       }) as MutationCallback,
+      this.app.options.angularMode,
     )
   }
   private clear(): void {
@@ -129,10 +133,49 @@ export default abstract class Observer {
     this.textSet.clear()
   }

+  /**
+   * Unbinds the removed nodes in case of iframe src change.
+   */
+  private handleIframeSrcChange(iframe: HTMLIFrameElement): void {
+    const oldContentDocument = iframe.contentDocument
+    if (oldContentDocument) {
+      const id = this.app.nodes.getID(oldContentDocument)
+      if (id !== undefined) {
+        const walker = document.createTreeWalker(
+          oldContentDocument,
+          NodeFilter.SHOW_ELEMENT + NodeFilter.SHOW_TEXT,
+          {
+            acceptNode: (node) =>
+              isIgnored(node) || this.app.nodes.getID(node) === undefined
+                ? NodeFilter.FILTER_REJECT
+                : NodeFilter.FILTER_ACCEPT,
+          },
+          // @ts-ignore
+          false,
+        )
+        let removed = 0
+        const totalBeforeRemove = this.app.nodes.getNodeCount()
+        while (walker.nextNode()) {
+          if (!iframe.contentDocument.contains(walker.currentNode)) {
+            removed += 1
+            this.app.nodes.unregisterNode(walker.currentNode)
+          }
+        }
+        const removedPercent = Math.floor((removed / totalBeforeRemove) * 100)
+        if (removedPercent > 30) {
+          this.app.send(UnbindNodes(removedPercent))
+        }
+      }
+    }
+  }
+
   private sendNodeAttribute(id: number, node: Element, name: string, value: string | null): void {
     if (isSVGElement(node)) {
-      if (name.substr(0, 6) === 'xlink:') {
-        name = name.substr(6)
+      if (name.substring(0, 6) === 'xlink:') {
+        name = name.substring(6)
       }
       if (value === null) {
         this.app.send(RemoveNodeAttribute(id, name))
@@ -152,7 +195,7 @@ export default abstract class Observer {
       name === 'integrity' ||
       name === 'crossorigin' ||
       name === 'autocomplete' ||
-      name.substr(0, 2) === 'on'
+      name.substring(0, 2) === 'on'
     ) {
       return
     }
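The new `handleIframeSrcChange` above walks the iframe's previous document with a `TreeWalker`, unregisters every tracked node that is no longer attached, and emits `UnbindNodes` once more than 30% of the tracked nodes have gone away. A standalone sketch of that walk, with the tracker's `isIgnored`/`getID` check, document membership test, `getNodeCount` and `unregisterNode` abstracted into plain callbacks (the parameter names are illustrative):

```ts
// Standalone sketch. In the diff, isTracked corresponds to
// "!isIgnored(node) && app.nodes.getID(node) !== undefined",
// totalTracked to app.nodes.getNodeCount(), and unregister to app.nodes.unregisterNode.
function unbindDetachedNodes(
  oldDoc: Document,
  isTracked: (n: Node) => boolean,
  stillAttached: (n: Node) => boolean,
  totalTracked: number,
  unregister: (n: Node) => void,
): number {
  const walker = document.createTreeWalker(
    oldDoc,
    NodeFilter.SHOW_ELEMENT | NodeFilter.SHOW_TEXT,
    {
      // Rejecting untracked nodes keeps the walk inside the part of the tree the tracker knows about.
      acceptNode: (node) =>
        isTracked(node) ? NodeFilter.FILTER_ACCEPT : NodeFilter.FILTER_REJECT,
    },
  )

  let removed = 0
  while (walker.nextNode()) {
    if (!stillAttached(walker.currentNode)) {
      removed += 1
      unregister(walker.currentNode)
    }
  }

  // Same 30% heuristic as the diff: only report a bulk unbind when a large
  // share of the tracked tree disappeared with the old document.
  return totalTracked === 0 ? 0 : Math.floor((removed / totalTracked) * 100)
}
```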


@@ -140,7 +140,7 @@ export default class TopObserver extends Observer {
     )
   }

-  crossdomainObserve(selfId: number, frameOder: number) {
+  crossdomainObserve(rootNodeId: number, frameOder: number) {
     const observer = this
     Element.prototype.attachShadow = function () {
       // eslint-disable-next-line
@@ -152,7 +152,7 @@ export default class TopObserver extends Observer {
     this.app.nodes.syntheticMode(frameOder)
     const iframeObserver = new IFrameObserver(this.app)
     this.iframeObservers.push(iframeObserver)
-    iframeObserver.syntheticObserve(selfId, window.document)
+    iframeObserver.syntheticObserve(rootNodeId, window.document)
   }

   disconnect() {


@@ -99,6 +99,7 @@ export default function (app: App): void {
         }
       }
     }) as MutationCallback,
+    app.options.angularMode,
   )

   app.attachStopCallback(() => {


@@ -132,9 +132,13 @@ export function ngSafeBrowserMethod(method: string): string {
     : method
 }

-export function createMutationObserver(cb: MutationCallback) {
-  const mObserver = ngSafeBrowserMethod('MutationObserver') as 'MutationObserver'
-  return new window[mObserver](cb)
+export function createMutationObserver(cb: MutationCallback, angularMode?: boolean) {
+  if (angularMode) {
+    const mObserver = ngSafeBrowserMethod('MutationObserver') as 'MutationObserver'
+    return new window[mObserver](cb)
+  } else {
+    return new MutationObserver(cb)
+  }
 }

 export function createEventListener(
@@ -142,15 +146,23 @@ export function createEventListener(
   event: string,
   cb: EventListenerOrEventListenerObject,
   capture?: boolean,
+  angularMode?: boolean,
 ) {
-  const safeAddEventListener = ngSafeBrowserMethod('addEventListener') as 'addEventListener'
+  let safeAddEventListener: 'addEventListener'
+  if (angularMode) {
+    safeAddEventListener = ngSafeBrowserMethod('addEventListener') as 'addEventListener'
+  } else {
+    safeAddEventListener = 'addEventListener'
+  }
   try {
     target[safeAddEventListener](event, cb, capture)
   } catch (e) {
     const msg = e.message
-    console.debug(
+    console.error(
       // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
       `Openreplay: ${msg}; if this error is caused by an IframeObserver, ignore it`,
+      event,
+      target,
     )
   }
 }
@@ -160,17 +172,23 @@ export function deleteEventListener(
   event: string,
   cb: EventListenerOrEventListenerObject,
   capture?: boolean,
+  angularMode?: boolean,
 ) {
-  const safeRemoveEventListener = ngSafeBrowserMethod(
-    'removeEventListener',
-  ) as 'removeEventListener'
+  let safeRemoveEventListener: 'removeEventListener'
+  if (angularMode) {
+    safeRemoveEventListener = ngSafeBrowserMethod('removeEventListener') as 'removeEventListener'
+  } else {
+    safeRemoveEventListener = 'removeEventListener'
+  }
   try {
     target[safeRemoveEventListener](event, cb, capture)
   } catch (e) {
     const msg = e.message
-    console.debug(
+    console.error(
       // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
       `Openreplay: ${msg}; if this error is caused by an IframeObserver, ignore it`,
+      event,
+      target,
     )
   }
 }
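The utils changes above make the zone-aware path opt-in: `createMutationObserver`, `createEventListener` and `deleteEventListener` only go through `ngSafeBrowserMethod` when `angularMode` is set and otherwise use the plain browser APIs, and the catch blocks now log with `console.error` plus the offending `event` and `target`. A sketch of what the listener side of the switch amounts to, assuming `ngSafeBrowserMethod` resolves zone.js's unpatched native method under a `__zone_symbol__` prefix (an assumption; its implementation is not shown in this hunk):

```ts
// Illustrative sketch; zoneSafeName stands in for ngSafeBrowserMethod.
// Assumption: zone.js keeps the unpatched native methods under __zone_symbol__<name>.
function zoneSafeName(method: string, target: EventTarget): string {
  const prefixed = `__zone_symbol__${method}`
  return prefixed in target ? prefixed : method
}

function createEventListenerSketch(
  target: EventTarget,
  event: string,
  cb: EventListener,
  capture?: boolean,
  angularMode?: boolean,
): void {
  // Non-Angular apps (the default) take the ordinary addEventListener path;
  // Angular apps opt in so tracker listeners stay outside zone.js patching.
  const method = angularMode ? zoneSafeName('addEventListener', target) : 'addEventListener'
  try {
    (target as any)[method](event, cb, capture)
  } catch (e) {
    console.error(`Openreplay: ${(e as Error).message}`, event, target)
  }
}

// Usage: the extra flag is optional, so existing call sites keep working.
createEventListenerSketch(window, 'resize', () => {}, true)
createEventListenerSketch(window, 'resize', () => {}, true, true) // angularMode on
```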


@@ -7,7 +7,7 @@ describe('Nodes', () => {
   const mockCallback = jest.fn()

   beforeEach(() => {
-    nodes = new Nodes(nodeId)
+    nodes = new Nodes(nodeId, false)
     mockCallback.mockClear()
   })
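The spec change above simply follows the new `Nodes(node_id, angularMode)` constructor signature. Any other suite that constructs `Nodes` directly needs the same second argument, for example (Jest-style; the import path and id value are illustrative):

```ts
import Nodes from '../main/app/nodes' // illustrative path

describe('Nodes angularMode flag', () => {
  it('constructs with and without angular mode', () => {
    expect(new Nodes('openreplay_id', false)).toBeInstanceOf(Nodes)
    expect(new Nodes('openreplay_id', true)).toBeInstanceOf(Nodes)
  })
})
```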