# openreplay/ee/api/chalicelib/core/sourcemaps.py
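"""Sourcemap handling for the chalice API: presigned S3 URLs for uploading and
sharing sourcemaps, normalization of tracker stack frames, and resolution of
original source positions via `sourcemaps_parser`."""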

import hashlib
from urllib.parse import urlparse

from chalicelib.core import sourcemaps_parser
from chalicelib.utils import s3
from chalicelib.utils.helper import environ


def __get_key(project_id, url):
    u = urlparse(url)
    new_url = u.scheme + "://" + u.netloc + u.path
    return f"{project_id}/{hashlib.md5(new_url.encode()).hexdigest()}"


def presign_share_urls(project_id, urls):
    results = []
    for u in urls:
        results.append(s3.get_presigned_url_for_sharing(bucket=environ['sourcemaps_bucket'], expires_in=120,
                                                        key=__get_key(project_id, u),
                                                        check_exists=True))
    return results


def presign_upload_urls(project_id, urls):
    results = []
    for u in urls:
        results.append(s3.get_presigned_url_for_upload(bucket=environ['sourcemaps_bucket'],
                                                       expires_in=1800,
                                                       key=__get_key(project_id, u)))
    return results

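# A hypothetical illustration of the key scheme used above: __get_key drops the
# query string and fragment before hashing, so differently-versioned URLs of
# the same asset map to the same S3 object key (URL values are made up):
#
#     __get_key(1, "https://app.example.com/static/bundle.js?v=2")
#     __get_key(1, "https://app.example.com/static/bundle.js")
#     # both -> f"1/{hashlib.md5(b'https://app.example.com/static/bundle.js').hexdigest()}"
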
def __format_frame_old(f):
    if f.get("context") is None:
        f["context"] = []
    else:
        f["context"] = [[f["line"], f["context"]]]
    url = f.pop("url")
    f["absPath"] = url
    f["filename"] = urlparse(url).path
    f["lineNo"] = f.pop("line")
    f["colNo"] = f.pop("column")
    f["function"] = f.pop("func")
    return f


def __frame_is_valid(f):
    return "columnNumber" in f and \
           "lineNumber" in f and \
           "fileName" in f


def __format_frame(f):
    f["context"] = []  # no context by default
    if "source" in f:
        f.pop("source")
    url = f.pop("fileName")
    f["absPath"] = url
    f["filename"] = urlparse(url).path
    f["lineNo"] = f.pop("lineNumber")
    f["colNo"] = f.pop("columnNumber")
    f["function"] = f.pop("functionName") if "functionName" in f else None
    return f


def format_payload(p, truncate_to_first=False):
    if type(p) is list:
        return [__format_frame(f) for f in (p[:1] if truncate_to_first else p) if __frame_is_valid(f)]
    if type(p) is dict:
        stack = p.get("stack", [])
        return [__format_frame_old(f) for f in (stack[:1] if truncate_to_first else stack)]
    return []

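# Sketch of the two payload shapes format_payload accepts (all field values are
# hypothetical). The list form carries browser-style frames; the legacy dict
# form carries a "stack" key with the older field names:
#
#     new_style = [{"fileName": "https://x.test/a.js", "lineNumber": 10,
#                   "columnNumber": 5, "functionName": "f"}]
#     old_style = {"stack": [{"url": "https://x.test/a.js", "line": 10,
#                             "column": 5, "func": "f", "context": None}]}
#
# Both normalize to frames with absPath/filename/lineNo/colNo/function keys.
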
def get_traces_group(project_id, payload):
    frames = format_payload(payload)
    results = [{}] * len(frames)
    # group frames by sourcemap key so each sourcemap is fetched and parsed once
    payloads = {}
    all_exists = True
    for i, u in enumerate(frames):
        print("===============================")
        print(u["absPath"])
        print("converted to:")
        key = __get_key(project_id, u["absPath"])  # use filename instead?
        print(key)
        print("===============================")
        if key not in payloads:
            file_exists = s3.exists(environ['sourcemaps_bucket'], key)
            all_exists = all_exists and file_exists
            if not file_exists:
                print(f"{u['absPath']} sourcemap (key '{key}') doesn't exist in S3")
                payloads[key] = None
            else:
                payloads[key] = []
        results[i] = dict(u)
        results[i]["frame"] = dict(u)
        if payloads[key] is not None:
            payloads[key].append({"resultIndex": i,
                                  "position": {"line": u["lineNo"], "column": u["colNo"]},
                                  "frame": dict(u)})
    for key in payloads.keys():
        if payloads[key] is None:
            continue
        key_results = sourcemaps_parser.get_original_trace(key=key, positions=[o["position"] for o in payloads[key]])
        for i, r in enumerate(key_results):
            res_index = payloads[key][i]["resultIndex"]
            # function name search by frontend lib is better than sourcemaps' one in most cases
            if results[res_index].get("function") is not None:
                r["function"] = results[res_index]["function"]
            r["frame"] = payloads[key][i]["frame"]
            results[res_index] = r
    return fetch_missed_contexts(results), all_exists

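# Illustrative call (a sketch; the project id and payload values below are
# hypothetical, and a real call hits S3 and the sourcemaps parser):
#
#     traces, all_sourcemaps_found = get_traces_group(
#         project_id=1,
#         payload=[{"fileName": "https://app.example.com/static/bundle.min.js",
#                   "lineNumber": 1, "columnNumber": 43521, "functionName": "onClick"}])
#     # traces: frames rewritten to original positions where a sourcemap was found;
#     # all_sourcemaps_found: False if any referenced sourcemap is missing from S3.
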
def get_js_cache_path(fullURL):
    p = urlparse(fullURL)
    return p.scheme + '/' + p.netloc + p.path  # TODO (Also in go assets library): What if URL with query? (like versions)


MAX_COLUMN_OFFSET = 60

def fetch_missed_contexts(frames):
    source_cache = {}
    for i in range(len(frames)):
        if len(frames[i]["context"]) != 0:
            continue
        if frames[i]["frame"]["absPath"] in source_cache:
            file = source_cache[frames[i]["frame"]["absPath"]]
        else:
            file = s3.get_file(environ['js_cache_bucket'], get_js_cache_path(frames[i]["frame"]["absPath"]))
            if file is None:
                print(
                    f"File {get_js_cache_path(frames[i]['frame']['absPath'])} not found in {environ['js_cache_bucket']}")
            source_cache[frames[i]["frame"]["absPath"]] = file
        if file is None:
            continue
        lines = file.split("\n")
        if frames[i]["lineNo"] is None:
            # sourcemap lookup gave no original position: fall back to the raw
            # frame, whose lineNo/colNo are always set
            print("no original-source found for frame in sourcemap results")
            frames[i] = frames[i]["frame"]
            frames[i]["originalMapping"] = False
            frames[i]["frame"] = dict(frames[i])  # keep the raw coordinates addressable below
        l = frames[i]["lineNo"] - 1  # starts from 1
        c = frames[i]["colNo"] - 1  # starts from 1
        if len(lines) == 1:
            print("minified asset")
            l = frames[i]["frame"]["lineNo"] - 1  # starts from 1
            c = frames[i]["frame"]["colNo"] - 1  # starts from 1
        elif l >= len(lines):
            print(f"line number {l} greater than file length {len(lines)}")
            continue
        line = lines[l]
        offset = c - MAX_COLUMN_OFFSET
        if offset < 0:  # if the line is short
            offset = 0
        frames[i]["context"].append([frames[i]["lineNo"], line[offset: c + MAX_COLUMN_OFFSET + 1]])
    return frames
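
# Context-window arithmetic above, with hypothetical numbers: for an error at
# column 200 (1-based) of a sufficiently long line, c = 199, offset = 139, and
# the stored snippet is line[139:260] -- at most MAX_COLUMN_OFFSET characters
# on each side of the error column (121 characters total).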