* fix: changed sessions bucket
* fix: text changes in login and signup forms
* change: version number
* change: config changes
* fix: alerts image name
* fix: alerts image name
* Update README.md
* chore(actions): pushing internalized to script. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* feat(nginx): No redirection to HTTPS by default. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* chore(deploy): optional nginx https redirect. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix: review fixes and other changes
* fix: events modal openreplay logo
* fix: stack event icon
* Changes:
  - debugging
  - smtp status
  - session's issues
  - session's issue_types as array
  - changed Slack error message
* Changes:
  - set chalice pull policy to always
* fix(openreplay-cli): path issues. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(openreplay-cli): fix path. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* change: onboarding explore text changes
* change: timeline issue pointers and static issue types
* change: removed issues_types api call
* connectors
* Update README.md
* Update README.md
* Update README.md
* Updating services
* Update README.md
* Updated alert-notification-string to chalice
* Delete issues.md
* Changes:
  - fixed connection-pool exhaustion using semaphores
  - fixed session-replay-url signing
* Changes:
  - fixed connection-pool exhaustion using semaphores
  - fixed session-replay-url signing
* Change pullPolicy to IfNotPresent
* Fixed typo
* Fixed typo
* Fixed typos
* Fixed typo
* Fixed typo
* Fixed typos
* Fixed typos
* Fixed typo
* Fixed typo
* Removed /ws
* Update README.md
* feat(nginx): increase minio upload size to 50M. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(deploy): nginx custom changes are overridden on install. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(nginx): deployment indentation issue. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix: onboarding links
* fix: revid filter crash
* fix: update password store new token
* fix: report issue icon jira/github
* fix: onboarding redirect on signup
* Changes:
  - hardcoded S3_HOST
* Changes:
  - changed "sourcemaps" env var to "sourcemaps_reader"
  - set "sourcemaps_reader" env var value
* chore(script): remove logo. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Making domain_name mandatory. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Changes:
  - un-ignore *.js
* feat(install): auto create jwt_secret for chalice.
* docs(script): Adding Banner. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* chore(script): Remove verbose logging. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Change:
  - use boto3-resource instead of boto3-client to check if a file exists
  - changed .gitignore to allow *.js files
  - changed sourcemaps_reader env-var & env-var-value
* fix(backend-ender): skip inputs with no label (technical)
* Change:
  - changed DB structure
* change: removed /flows api call
* fix: skipping errorOnFetch check
* Change:
  - changed sourcemaps_reader-nodejs script
* Change:
  - changed sourcemaps_reader-nodejs script
* fix(backend-postgres): correct autocomplete type-value
* fix: slack webhooks PUT call
* change: added external icon for integration doc links
* fix: updated the sourcemap upload doc link
* fix: link color of no sessions message
* fix(frontend-player): show original domContentLoaded text values, while adjusted on timeline
* fix(frontend-player): syntax
* Changes:
  - changed requirements
  - changed slack add integration
  - added slack edit integration
  - removed sourcemaps_reader extra payload
* Changes:
  - fixed sentry-issue-reporter
  - fixed telemetry reporter
  - fixed DB schema
* fix(cli): fix logs flag. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* ci(deploy): Injecting domain_name. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* feat(nginx): Get real client IP. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* chore(nginx): restart on helm installation. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* fix(deployment): respect image tags.
* Changes:
  - changed sentry tags
  - changed asayer_session_id to openReplaySessionToken
  - EE full merge
* fix: close the issue modal after creating
* fix: show description in issue details modal
* fix: integrate slack button redirect, and doc link
* fix: code snippet conflict set back
* fix: slack share channel selection
* Changes:
  - fixed DB structure
* Changes:
  - return full integration body on add slack
* fix(integrations): ignore token expired + some logs
* feat(sourcemaps-uploader): v3.0.2 filename fix + logging arg
* fix(tracker): 3.0.3 version: start before auth
* fix: funnel calendar position
* fix: fetch issue types
* fix: missing icon blocking the session from playing
* change: sessions per browser widget bar height reduced
* fix: github colored circles
* Changes:
  - changed session-assignment-jira response
* chore(nginx): pass x-forwarded-for. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* feat(chalice): included sourcemaps_reader. It's not advised to run multiple processes in a single Docker container. In Kubernetes we can run this as a sidecar, but other platforms such as Heroku and vanilla Docker don't support such a feature, so until we figure out a better solution this is the workaround.
* chore(install): Remove sqs
* feat(deployment): restart pods on installations. Signed-off-by: Rajesh Rajendran <rjshrjndrn@gmail.com>
* Changes:
  - changed DB-oauth-unique constraint

Co-authored-by: Shekar Siri <sshekarsiri@gmail.com>
Co-authored-by: Mehdi Osman <estradino@users.noreply.github.com>
Co-authored-by: KRAIEM Taha Yassine <tahayk2@gmail.com>
Co-authored-by: ourvakan <hi-psi@yandex.com>
Co-authored-by: ShiKhu <alex.kaminsky.11@gmail.com>
328 lines
12 KiB
Python
from jira import JIRA
from jira.exceptions import JIRAError
import time
from datetime import datetime
import requests
from requests.auth import HTTPBasicAuth

fields = "id, summary, description, creator, reporter, created, assignee, status, updated, comment, issuetype, labels"

class JiraManager:
    # retries = 5
    retries = 0  # extra attempts on 4xx responses; 0 disables retrying

    def __init__(self, url, username, password, project_id=None):
        self._config = {"JIRA_PROJECT_ID": project_id, "JIRA_URL": url, "JIRA_USERNAME": username,
                        "JIRA_PASSWORD": password}
        self._jira = JIRA({'server': url}, basic_auth=(username, password), logging=True)
    def set_jira_project_id(self, project_id):
        self._config["JIRA_PROJECT_ID"] = project_id
    def get_projects(self):
        try:
            projects = self._jira.projects()
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_projects()
            print(f"=>Error {e.text}")
            raise e
        projects_dict_list = []
        for project in projects:
            projects_dict_list.append(self.__parser_project_info(project))

        return projects_dict_list
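Every method in this class repeats the same decrement/sleep/recurse retry block. As a hedged sketch only, that pattern could be factored into a decorator; `retry_on_error` is a hypothetical name, not part of the class, and unlike the original it retries any exception instead of checking for a 4xx status:

```python
import time
from functools import wraps


def retry_on_error(attempts=3, delay=0):
    # Retry the wrapped call up to `attempts` times, sleeping `delay`
    # seconds between tries, and re-raise once attempts are exhausted.
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts - 1:
                        raise
                    time.sleep(delay)
        return wrapper
    return decorator
```

Compared with the class's approach, a decorator keeps the retry budget per call instead of sharing a class-level counter that is decremented but never reset.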
    def get_project(self):
        try:
            project = self._jira.project(self._config['JIRA_PROJECT_ID'])
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_project()
            print(f"=>Error {e.text}")
            raise e
        return self.__parser_project_info(project)
    def get_issues(self, sql: str, offset: int = 0):
        jql = "project = " + self._config['JIRA_PROJECT_ID'] \
              + ((" AND " + sql) if sql is not None and len(sql) > 0 else "") \
              + " ORDER BY createdDate DESC"

        try:
            issues = self._jira.search_issues(jql, maxResults=1000, startAt=offset, fields=fields)
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_issues(sql, offset)
            print(f"=>Error {e.text}")
            raise e

        issue_dict_list = []
        for issue in issues:
            # print(issue.raw)
            issue_dict_list.append(self.__parser_issue_info(issue, include_comments=False))

        # return {"total": issues.total, "issues": issue_dict_list}
        return issue_dict_list
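The JQL string assembled in `get_issues` can be sketched as a standalone helper; `build_jql` is an illustrative name, not part of the class:

```python
def build_jql(project_id, jql_filter=None):
    # Mirrors get_issues: start from the project clause, AND on an
    # optional caller-supplied filter, and order newest-first.
    jql = f"project = {project_id}"
    if jql_filter:
        jql += f" AND {jql_filter}"
    return jql + " ORDER BY createdDate DESC"
```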
    def get_issue(self, issue_id: str):
        try:
            # issue = self._jira.issue(issue_id)
            issue = self._jira.issue(issue_id, fields=fields)
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_issue(issue_id)
            print(f"=>Error {e.text}")
            raise e
        return self.__parser_issue_info(issue)
    def get_issue_v3(self, issue_id: str):
        try:
            url = f"{self._config['JIRA_URL']}/rest/api/3/issue/{issue_id}?fields={fields}"
            auth = HTTPBasicAuth(self._config['JIRA_USERNAME'], self._config['JIRA_PASSWORD'])
            issue = requests.get(url, headers={"Accept": "application/json"}, auth=auth)
        except Exception as e:
            self.retries -= 1
            if self.retries > 0:
                time.sleep(1)
                return self.get_issue_v3(issue_id)
            print(f"=>Error {e}")
            raise e
        return self.__parser_issue_info(issue.json())
    def create_issue(self, issue_dict):
        issue_dict["project"] = {"id": self._config['JIRA_PROJECT_ID']}
        try:
            issue = self._jira.create_issue(fields=issue_dict)
            return self.__parser_issue_info(issue)
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.create_issue(issue_dict)
            print(f"=>Error {e.text}")
            raise e
    def close_issue(self, issue):
        try:
            # jira.transition_issue(issue, '5', assignee={'name': 'pm_user'}, resolution={'id': '3'})
            self._jira.transition_issue(issue, 'Close')
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.close_issue(issue)
            print(f"=>Error {e.text}")
            raise e
    def assign_issue(self, issue_id, account_id) -> bool:
        try:
            return self._jira.assign_issue(issue_id, account_id)
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.assign_issue(issue_id, account_id)
            print(f"=>Error {e.text}")
            raise e
    def add_comment(self, issue_id: str, comment: str):
        try:
            comment = self._jira.add_comment(issue_id, comment)
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.add_comment(issue_id, comment)
            print(f"=>Error {e.text}")
            raise e
        return self.__parser_comment_info(comment)
    def add_comment_v3(self, issue_id: str, comment: str):
        try:
            url = f"{self._config['JIRA_URL']}/rest/api/3/issue/{issue_id}/comment"
            auth = HTTPBasicAuth(self._config['JIRA_USERNAME'], self._config['JIRA_PASSWORD'])
            comment_response = requests.post(
                url,
                headers={"Accept": "application/json"},
                auth=auth,
                json={
                    "body": {
                        "type": "doc",
                        "version": 1,
                        "content": [
                            {
                                "type": "paragraph",
                                "content": [
                                    {
                                        "text": comment,
                                        "type": "text"
                                    }
                                ]
                            }
                        ]
                    }
                }
            )
        except Exception as e:
            self.retries -= 1
            if self.retries > 0:
                time.sleep(1)
                return self.add_comment_v3(issue_id, comment)
            print(f"=>Error {e}")
            raise e
        return self.__parser_comment_info(comment_response.json())
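The nested JSON posted by `add_comment_v3` is an Atlassian Document Format (ADF) body. The structure can be built by a small helper; `adf_comment` is an illustrative name, not part of the class:

```python
def adf_comment(text):
    # Wrap plain text in a minimal ADF document: one paragraph node
    # containing one text node, matching the add_comment_v3 payload.
    return {
        "body": {
            "type": "doc",
            "version": 1,
            "content": [
                {
                    "type": "paragraph",
                    "content": [{"text": text, "type": "text"}],
                }
            ],
        }
    }
```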
    def get_comments(self, issueKey):
        try:
            comments = self._jira.comments(issueKey)
            results = []
            for c in comments:
                results.append(self.__parser_comment_info(c.raw))
            return results
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_comments(issueKey)
            print(f"=>Error {e.text}")
            raise e
    def get_meta(self):
        meta = {}
        meta['issueTypes'] = self.get_issue_types()
        meta['users'] = self.get_assignable_users()
        return meta
    def get_assignable_users(self):
        try:
            users = self._jira.search_assignable_users_for_issues('', project=self._config['JIRA_PROJECT_ID'])
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_assignable_users()
            print(f"=>Error {e.text}")
            raise e
        users_dict = []
        for user in users:
            users_dict.append({
                'name': user.displayName,
                'email': user.emailAddress,
                'id': user.accountId,
                'avatarUrls': user.raw["avatarUrls"]
            })

        return users_dict
    def get_issue_types(self):
        try:
            types = self._jira.issue_types()
        except JIRAError as e:
            self.retries -= 1
            if (e.status_code // 100) == 4 and self.retries > 0:
                time.sleep(1)
                return self.get_issue_types()
            print(f"=>Error {e.text}")
            raise e
        types_dict = []
        for issue_type in types:  # `issue_type` avoids shadowing the `type` builtin
            if not issue_type.subtask and not issue_type.name.lower() == "epic":
                types_dict.append({
                    'id': issue_type.id,
                    'name': issue_type.name,
                    'iconUrl': issue_type.iconUrl,
                    'description': issue_type.description
                })
        return types_dict
    def __parser_comment_info(self, comment):
        if not isinstance(comment, dict):
            comment = comment.raw

        pattern = '%Y-%m-%dT%H:%M:%S.%f%z'
        creation = datetime.strptime(comment['created'], pattern)
        # update = datetime.strptime(comment['updated'], pattern)

        return {
            'id': comment['id'],
            'author': comment['author']['accountId'],
            'message': comment['body'],
            # 'created': comment['created'],
            'createdAt': int((creation - creation.utcoffset()).timestamp() * 1000),
            # 'updated': comment['updated'],
            # 'updatedAt': int((update - update.utcoffset()).timestamp() * 1000)
        }
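The created-timestamp conversion in `__parser_comment_info` can be exercised in isolation (the sample timestamp is illustrative). Note that `.timestamp()` on a timezone-aware datetime already yields a UTC epoch, so the `utcoffset()` subtraction shifts the result by the offset; the sketch reproduces the original arithmetic as-is:

```python
from datetime import datetime

PATTERN = '%Y-%m-%dT%H:%M:%S.%f%z'


def created_at_ms(created):
    # Same conversion as __parser_comment_info: parse Jira's ISO-8601
    # timestamp (e.g. "2021-05-10T12:00:00.000+0200") and emit epoch ms,
    # shifted back by the zone offset as in the original code.
    creation = datetime.strptime(created, PATTERN)
    return int((creation - creation.utcoffset()).timestamp() * 1000)
```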
    @staticmethod
    def __get_closed_status(status):
        return status.lower() in ("done", "close", "closed", "finish", "finished")
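The closed-status check above is a simple case-insensitive membership test against a fixed set of terminal status names. A standalone re-implementation for illustration (`is_closed` is a hypothetical name):

```python
def is_closed(status):
    # Case-insensitive match against the terminal Jira status names
    # recognized by __get_closed_status.
    return status.lower() in ("done", "close", "closed", "finish", "finished")
```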
    def __parser_issue_info(self, issue, include_comments=True):
        results_dict = {}
        if not isinstance(issue, dict):
            raw_info = issue.raw
        else:
            raw_info = issue

        fields = raw_info['fields']
        results_dict["id"] = raw_info["id"]
        results_dict["key"] = raw_info["key"]
        # results_dict["ticketNumber"] = raw_info["key"]
        results_dict["title"] = fields["summary"]
        results_dict["description"] = fields["description"]
        results_dict["issueType"] = [fields["issuetype"]["id"]]

        # results_dict["assignee"] = None
        # results_dict["reporter"] = None

        if isinstance(fields["assignee"], dict):
            results_dict["assignees"] = [fields["assignee"]["accountId"]]
        # if isinstance(fields["reporter"], dict):
        #     results_dict["reporter"] = fields["reporter"]["accountId"]
        if isinstance(fields["creator"], dict):
            results_dict["creator"] = fields["creator"]["accountId"]

        if "comment" in fields:
            if include_comments:
                comments_dict = []
                for comment in fields["comment"]["comments"]:
                    comments_dict.append(self.__parser_comment_info(comment))

                results_dict['comments'] = comments_dict
            results_dict['commentsCount'] = fields["comment"]["total"]

        results_dict["status"] = fields["status"]['name']
        results_dict["createdAt"] = fields["created"]
        # results_dict["updated"] = fields["updated"]
        results_dict["labels"] = fields["labels"]
        results_dict["closed"] = self.__get_closed_status(fields["status"]['name'])

        return results_dict
    @staticmethod
    def __parser_project_info(project):
        results_dict = {}
        raw_info = project.raw
        results_dict["id"] = raw_info["id"]
        results_dict["name"] = raw_info["name"]
        results_dict["avatarUrls"] = raw_info["avatarUrls"]
        results_dict["description"] = raw_info["description"] if "description" in raw_info else ""

        return results_dict