* fix: changed sessions bucket
* fix: text changes in login and signup forms
* change: version number
* change: config changes
* fix: alerts image name
* fix: alerts image name
* Update README.md
* chore(actions): pushing internalized to script. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* feat(nginx): No redirection to HTTPS by default. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* chore(deploy): optional nginx https redirect Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix: review fixes and other changes
* fix: events modal openreplay logo
* fix: stack event icon
* Changes: - debugging - smtp status - session's issues - session's issue_types as array - changed Slack error message
* Changes: - set chalice pull policy to always
* fix(openreplay-cli): path issues. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(openreplay-cli): fix path Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* change: onboarding explore text changes
* change: timeline issue pointers and static issue types
* change: removed issues_types api call
* connectors
* Update README.md
* Update README.md
* Update README.md
* Updating services
* Update README.md
* Updated alert-notification-string to chalice
* Delete issues.md
* Changes: - fixed connexion pool exhausted using Semaphores - fixed session-replay-url signing
* Changes: - fixed connexion pool exhausted using Semaphores - fixed session-replay-url signing
* Change pullPolicy to IfNotPresent
* Fixed typo
* Fixed typo
* Fixed typos
* Fixed typo
* Fixed typo
* Fixed typos
* Fixed typos
* Fixed typo
* Fixed typo
* Removed /ws
* Update README.md
* feat(nginx): increase minio upload size to 50M Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(deploy): nginx custom changes are overriden in install Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(nginx): deployment indentation issue Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix: revid filter crash
* fix: onboarding links
* fix: update password store new token
* fix: report issue icon jira/github
* fix: onboarding redirect on signup
* Changes: - hardcoded S3_HOST
* Changes: - changed "sourcemaps" env var to "sourcemaps_reader" - set "sourcemaps_reader" env var value
* chore(script): remove logo Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Making domain_name mandatory Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Changes: - un-ignore *.js
* feat(install): auto create jwt_secret for chalice.
* docs(script): Adding Banner Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* chore(script): Remove verbose logging Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Change: - use boto3-resource instead of boto3-client to check if file exists - changed .gitignore to allow *.js files - changed sourcemaps_reader env-var & env-var-value
* fix (baxkend-ender): skip inputs with no label (technical)
* Change: - changed DB structure
* change: removed /flows api call
* fix: skipping errorOnFetch check
* Change: - changed sourcemaps_reader-nodejs script
* Change: - changed sourcemaps_reader-nodejs script
* fix (backend-postgres): correct autocomplete type-value
* fix: slack webhooks PUT call
* change: added external icon for integration doc links
* fix: updated the sourcemap upload doc link
* fix: link color of no sessions message
* fix (frontend-player): show original domContentLoaded text values, while adjusted on timeline
* fix (frontend-player): syntax
* Changes: - changed requirements - changed slack add integration - added slack edit integration - removed sourcemaps_reader extra payload
* Changes: - fixed sentry-issue-reporter - fixed telemetry reporter - fixed DB schema
* fix(cli): fix logs flag Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* ci(deploy): Injecting domain_name Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* feat(nginx): Get real client ip Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* chore(nginx): restart on helm installation. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(deployment): respect image tags.
* Changes: - changed sentry tags - changed asayer_session_id to openReplaySessionToken - EE full merge
* fix: close the issue modal after creating
* fix: show description in issue details modal
* fix: integrate slack button redirect, and doc link
* fix: code snippet conflict set back
* fix: slack share channel selection
* Changes: - fixed DB structure
* Changes: - return full integration body on add slack
* fix (integrations): ignore token expired + some logs
* feat (sourcemaps-uploader): v.3.0.2 filename fix + logging arg
* fix (tracker): 3.0.3 version: start before auth
* fix: funnel calendar position
* fix: fetch issue types
* fix: missing icon blocking the session to play
* change: sessions per browser widget bar height reduced
* fix: github colored circles
* Changes: - changed session-assignment-jira response
* chore(nginx): pass x-forward-for Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* feat(chalice): included sourcemaps_reader It's not advised to run multiple processes in a single docker container. In Kubernetes we can run this as sidecar, but other platforms such as Heroku, and vanilla docker doesn't support such feature. So till we figure out better solution, this is the workaround.
* chore(install): Remove sqs
* feat(deployment): restart pods on installations. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Changes: - changed DB-oauth-unique constraint

Co-authored-by: Shekar Siri <sshekarsiri@gmail.com>
Co-authored-by: Mehdi Osman <estradino@users.noreply.github.com>
Co-authored-by: KRAIEM Taha Yassine <tahayk2@gmail.com>
Co-authored-by: ourvakan <hi-psi@yandex.com>
Co-authored-by: ShiKhu <alex.kaminsky.11@gmail.com>
157 lines
6.6 KiB
Python
import json

from chalicelib.core import integrations_manager, integration_base_issue
from chalicelib.utils import helper, pg_client
from chalicelib.utils.TimeUTC import TimeUTC
from chalicelib.utils.helper import environ as env


def __get_saved_data(project_id, session_id, issue_id, tool):
    with pg_client.PostgresClient() as cur:
        query = cur.mogrify("""\
                SELECT *
                FROM public.assigned_sessions
                WHERE session_id = %(session_id)s
                  AND issue_id = %(issue_id)s
                  AND provider = %(provider)s;""",
                            {"session_id": session_id, "issue_id": issue_id, "provider": tool.lower()})
        cur.execute(query)
        return helper.dict_to_camel_case(cur.fetchone())
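The row comes back from Postgres with snake_case column names, and `helper.dict_to_camel_case` converts the keys before the row reaches the API layer. A minimal sketch of what such a conversion does (this stand-in is an assumption for illustration; the real helper lives in `chalicelib.utils.helper` and may differ):

```python
def dict_to_camel_case(row):
    """Illustrative stand-in: turn snake_case DB column names into camelCase keys."""
    if row is None:
        return None

    def camel(key):
        head, *rest = key.split("_")
        return head + "".join(part.capitalize() for part in rest)

    # only top-level keys are converted; nested values pass through untouched
    return {camel(k): v for k, v in row.items()}


row = {"session_id": 1, "issue_id": "J-7", "provider_data": {"integrationProjectId": 3}}
print(dict_to_camel_case(row))  # keys become sessionId, issueId, providerData
```

Returning `None` for a missing row matters here: callers such as `get` check the converted result against `None` to report "issue not found".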


def create_new_assignment(tenant_id, project_id, session_id, creator_id, assignee, description, title, issue_type,
                          integration_project_id):
    error, integration = integrations_manager.get_integration(tenant_id=tenant_id, user_id=creator_id)
    if error is not None:
        return error

    i = integration.get()
    if i is None:
        return {"errors": ["integration not found"]}
    link = env["SITE_URL"] + f"/{project_id}/session/{session_id}"
    description += f"\n> {link}"
    try:
        issue = integration.issue_handler.create_new_assignment(title=title, assignee=assignee,
                                                                description=description,
                                                                issue_type=issue_type,
                                                                integration_project_id=integration_project_id)
    except integration_base_issue.RequestException as e:
        return integration_base_issue.proxy_issues_handler(e)
    # bail out before the INSERT if the tracker returned nothing usable
    if issue is None or "id" not in issue:
        return {"errors": ["something went wrong while creating the issue"]}
    with pg_client.PostgresClient() as cur:
        query = cur.mogrify("""\
                INSERT INTO public.assigned_sessions(session_id, issue_id, created_by, provider, provider_data)
                VALUES (%(session_id)s, %(issue_id)s, %(creator_id)s, %(provider)s, %(provider_data)s);""",
                            {"session_id": session_id, "creator_id": creator_id,
                             "issue_id": issue["id"], "provider": integration.provider.lower(),
                             "provider_data": json.dumps({"integrationProjectId": integration_project_id})})
        cur.execute(query)
    issue["provider"] = integration.provider.lower()
    return issue


def get_all(project_id, user_id):
    available_integrations = integrations_manager.get_available_integrations(user_id=user_id)
    no_integration = not any(available_integrations.values())
    if no_integration:
        return []
    all_integrations = all(available_integrations.values())
    extra_query = ["sessions.project_id = %(project_id)s"]
    if not all_integrations:
        extra_query.append("provider IN %(providers)s")
    with pg_client.PostgresClient() as cur:
        query = cur.mogrify(f"""\
                SELECT assigned_sessions.*
                FROM public.assigned_sessions
                         INNER JOIN public.sessions USING (session_id)
                WHERE {" AND ".join(extra_query)};""",
                            {"project_id": project_id,
                             "providers": tuple(d for d in available_integrations if available_integrations[d])})
        cur.execute(query)
        assignments = helper.list_to_camel_case(cur.fetchall())
    for a in assignments:
        a["createdAt"] = TimeUTC.datetime_to_timestamp(a["createdAt"])
    return assignments
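`get_all` assembles its WHERE clause from a list of predicate fragments and adds the provider restriction only when some integrations are disabled; psycopg2 then expands the tuple bound to `IN %(providers)s` into a value list. The assembly can be exercised without a database. `build_filter` below is a hypothetical helper written for illustration, not part of the module:

```python
def build_filter(project_id, available_integrations):
    """Mirror get_all's dynamic WHERE-clause assembly (illustrative only)."""
    # the project predicate always applies
    extra_query = ["sessions.project_id = %(project_id)s"]
    params = {"project_id": project_id}
    # restrict by provider only when not every integration is enabled
    if not all(available_integrations.values()):
        extra_query.append("provider IN %(providers)s")
        # psycopg2 adapts a Python tuple into a SQL value list for IN
        params["providers"] = tuple(k for k, v in available_integrations.items() if v)
    return " AND ".join(extra_query), params


where, params = build_filter(42, {"jira": True, "github": False})
print(where)                # sessions.project_id = %(project_id)s AND provider IN %(providers)s
print(params["providers"])  # ('jira',)
```

Keeping the fragments in a list and joining with `" AND "` lets new predicates be appended conditionally without string surgery, while all values still travel through the parameter dict rather than being interpolated into the SQL.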


def get_by_session(tenant_id, user_id, project_id, session_id):
    available_integrations = integrations_manager.get_available_integrations(user_id=user_id)
    if not any(available_integrations.values()):
        return []
    extra_query = ["session_id = %(session_id)s", "provider IN %(providers)s"]
    with pg_client.PostgresClient() as cur:
        query = cur.mogrify(f"""\
                SELECT *
                FROM public.assigned_sessions
                WHERE {" AND ".join(extra_query)};""",
                            {"session_id": session_id,
                             "providers": tuple(k for k in available_integrations if available_integrations[k])})
        cur.execute(query)
        results = cur.fetchall()
    # group the saved issue references by provider
    issues = {}
    for i in results:
        if i["provider"] not in issues:
            issues[i["provider"]] = []
        issues[i["provider"]].append({"integrationProjectId": i["provider_data"]["integrationProjectId"],
                                      "id": i["issue_id"]})
    results = []
    for tool in issues:
        error, integration = integrations_manager.get_integration(tool=tool, tenant_id=tenant_id, user_id=user_id)
        if error is not None:
            return error

        i = integration.get()
        if i is None:
            print("integration not found")
            continue

        r = integration.issue_handler.get_by_ids(saved_issues=issues[tool])
        for i in r["issues"]:
            i["provider"] = tool
        # extend once per provider, after every issue has been tagged
        results += r["issues"]
    return results
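Before calling each tracker, `get_by_session` buckets the saved rows by provider so a single `get_by_ids` round trip serves all of that tracker's issues. The grouping step in isolation, using `setdefault` in place of the explicit membership check (the sample rows are made up for illustration):

```python
def group_issues_by_provider(rows):
    """Bucket saved assignment rows by provider, keeping only the fields the trackers need."""
    issues = {}
    for row in rows:
        # setdefault creates the provider's list on first sight, then appends
        issues.setdefault(row["provider"], []).append(
            {"integrationProjectId": row["provider_data"]["integrationProjectId"],
             "id": row["issue_id"]})
    return issues


rows = [
    {"provider": "jira", "provider_data": {"integrationProjectId": 7}, "issue_id": "J-1"},
    {"provider": "github", "provider_data": {"integrationProjectId": 9}, "issue_id": 12},
    {"provider": "jira", "provider_data": {"integrationProjectId": 7}, "issue_id": "J-2"},
]
grouped = group_issues_by_provider(rows)
print(sorted(grouped))       # ['github', 'jira']
print(len(grouped["jira"]))  # 2
```

Grouping first keeps the number of outbound API calls proportional to the number of providers, not the number of saved issues.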


def get(tenant_id, user_id, project_id, session_id, assignment_id):
    error, integration = integrations_manager.get_integration(tenant_id=tenant_id, user_id=user_id)
    if error is not None:
        return error
    saved = __get_saved_data(project_id, session_id, assignment_id, tool=integration.provider)
    if saved is None:
        return {"errors": ["issue not found"]}
    i = integration.get()
    if i is None:
        return {"errors": ["integration not found"]}
    r = integration.issue_handler.get(integration_project_id=saved["providerData"]["integrationProjectId"],
                                      assignment_id=assignment_id)
    r["provider"] = integration.provider.lower()
    return r


def comment(tenant_id, user_id, project_id, session_id, assignment_id, message):
    error, integration = integrations_manager.get_integration(tenant_id=tenant_id, user_id=user_id)
    if error is not None:
        return error
    i = integration.get()
    if i is None:
        return {"errors": ["integration not found"]}
    saved = __get_saved_data(project_id, session_id, assignment_id, tool=integration.provider)
    if saved is None:
        return {"errors": ["issue not found"]}
    return integration.issue_handler.comment(integration_project_id=saved["providerData"]["integrationProjectId"],
                                             assignment_id=assignment_id,
                                             comment=message)