Api v1.15.0 (#1558)
* feat(chalice): upgraded dependencies
* feat(chalice): changed path analysis schema
* feat(DB): click coordinate support
* feat(chalice): changed path analysis issues schema
* feat(chalice): upgraded dependencies
* fix(chalice): fixed pydantic issue
* refactor(chalice): refresh token validator
* feat(chalice): role restrictions
* feat(chalice): EE path analysis changes
* refactor(DB): changed creation queries
* refactor(DB): changed delete queries
* feat(DB): support new path analysis payload
* feat(chalice): save path analysis card
* feat(chalice): restrict access
* feat(chalice): restrict access
* feat(chalice): EE save new path analysis card
* refactor(chalice): path analysis
* feat(chalice): path analysis new query
* fix(chalice): configurable CH config
* fix(chalice): assist autocomplete
* refactor(chalice): refactored permissions
* refactor(chalice): changed log level
* refactor(chalice): upgraded dependencies
* refactor(chalice): changed path analysis query
* refactor(chalice): changed path analysis query
* refactor(chalice): upgraded dependencies
* refactor(alerts): upgraded dependencies
* refactor(crons): upgraded dependencies
* feat(chalice): path analysis ignore start point
* feat(chalice): path analysis in progress
* refactor(chalice): path analysis changed link sort
* refactor(chalice): path analysis changed link sort
* refactor(chalice): path analysis changed link sort
* refactor(chalice): path analysis new query
* refactor(chalice): authorizers
* refactor(chalice): refactored authorizer
* fix(chalice): fixed create card of PathAnalysis
* refactor(chalice): compute link-percentage for Path Analysis
* refactor(chalice): remove null starting point from Path Analysis
* feat(chalice): path analysis CH query
* refactor(chalice): changed Path Analysis links-value
* fix(chalice): fixed search notes for EE
* feat(chalice): path analysis enhanced query results
* feat(chalice): include timezone in search sessions response
* refactor(chalice): refactored logs
* refactor(chalice): refactored logs
* feat(chalice): get path analysis issues
* fix(chalice): fixed path analysis issues pagination
* fix(chalice): sessions-search handle null values
* feat(chalice): PathAnalysis start event support middle-event matching
* feat(chalice): PathAnalysis start event support middle-event matching
* feat(chalice): PathAnalysis support mixed events with start-point
* fix(chalice): PathAnalysis fixed eventType value when metricValue is missing
* fix(chalice): PathAnalysis fixed wrong super-class model for update card
* fix(chalice): PathAnalysis fixed search issues
* refactor(chalice): upgraded dependencies
* fix(chalice): enforce isEvent if missing
* fix(chalice): enforce isEvent if missing
* refactor(chalice): refactored custom-metrics
* refactor(chalice): small changes
* feat(chalice): path analysis EE new query
* fix(chalice): fixed hide-excess state for Path Analysis
* fix(chalice): fixed update start point and excludes for Path Analysis
* fix(chalice): fix payload validation
* fix(chalice): fix update widget endpoint
* fix(chalice): fix payload validation
* fix(chalice): fix update widget endpoint
* fix(chalice): fix add member
* refactor(chalice): upgraded dependencies
* refactor!(chalice): upgraded SAML dependencies
* feat(chalice): ios-project support 1/5
* refactor(chalice): changed logs handling
* fix(chalice): fix path analysis issues list
* Api v1.15.0 (#1542)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
* Api v1.15.0 no merge (#1546)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
  * refactor(chalice): changes
  * feature(chalice): mobile sessions search
* Api v1.15.0 no merge (#1549)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
  * refactor(chalice): changes
  * feature(chalice): mobile sessions search
  * refactor(chalice): fix EE refactored schema
* Api v1.15.0 no merge (#1552)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
  * refactor(chalice): changes
  * feature(chalice): mobile sessions search
  * refactor(chalice): fix EE refactored schema
  * fix(chalice): fix missing platform for EE
* Api v1.15.0 no merge (#1554)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
  * refactor(chalice): changes
  * feature(chalice): mobile sessions search
  * refactor(chalice): fix EE refactored schema
  * fix(chalice): fix missing platform for EE
  * fix(DB): fixed init_schema
* Api v1.15.0 no merge (#1554) (#1555)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
  * refactor(chalice): changes
  * feature(chalice): mobile sessions search
  * refactor(chalice): fix EE refactored schema
  * fix(chalice): fix missing platform for EE
  * fix(DB): fixed init_schema
* Api v1.15.0 no merge (#1557)
  * refactor(chalice): changed default dev env vars
  * refactor(chalice): changes
  * refactor(chalice): changed payload fixer
  * refactor(chalice): changed payload fixer
  * refactor(chalice): support duplicate filters
  * refactor(chalice): changes
  * feature(chalice): mobile sessions search
  * refactor(chalice): fix EE refactored schema
  * fix(chalice): fix missing platform for EE
  * fix(DB): fixed init_schema
  * feat(DB): changes to support mobile sessions
  * feat(chalice): mobile sessions search support
* fix(chalice): fixed signup
This commit is contained in:
parent 37bb900620
commit 5461c19246
13 changed files with 243 additions and 96 deletions
@@ -1,4 +1,5 @@
 import json
+import logging
 
 import schemas
 from chalicelib.core import users, telemetry, tenants
@@ -7,15 +8,17 @@ from chalicelib.utils import helper
 from chalicelib.utils import pg_client
 from chalicelib.utils.TimeUTC import TimeUTC
 
+logger = logging.getLogger(__name__)
 
 
 def create_tenant(data: schemas.UserSignupSchema):
-    print(f"===================== SIGNUP STEP 1 AT {TimeUTC.to_human_readable(TimeUTC.now())} UTC")
+    logger.info(f"==== Signup started at {TimeUTC.to_human_readable(TimeUTC.now())} UTC")
     errors = []
     if tenants.tenants_exists():
         return {"errors": ["tenants already registered"]}
 
     email = data.email
-    print(f"=====================> {email}")
+    logger.debug(f"email: {email}")
     password = data.password.get_secret_value()
 
     if email is None or len(email) < 5:
@@ -41,8 +44,9 @@ def create_tenant(data: schemas.UserSignupSchema):
         errors.append("Invalid organization name.")
 
     if len(errors) > 0:
-        print(f"==> error for email:{data.email}, fullname:{data.fullname}, organizationName:{data.organizationName}")
-        print(errors)
+        logger.warning(
+            f"==> signup error for:\n email:{data.email}, fullname:{data.fullname}, organizationName:{data.organizationName}")
+        logger.warning(errors)
         return {"errors": errors}
 
     project_name = "my first project"
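The hunks above replace bare `print` calls with a module-level logger, so verbosity is controlled by log level instead of being unconditional. A minimal self-contained sketch of the same pattern (the `signup` function and its checks are illustrative, not from the diff):

```python
import logging

# Module-level logger named after the module, as in the diff.
logger = logging.getLogger(__name__)


def signup(email: str) -> dict:
    # Low-level detail goes to DEBUG; hidden unless explicitly enabled.
    logger.debug("email: %s", email)
    errors = []
    if "@" not in email:
        errors.append("invalid email")
    if errors:
        # Validation failures are warnings, not hard errors.
        logger.warning("signup errors: %s", errors)
        return {"errors": errors}
    return {"data": {"email": email}}
```

Unlike `print`, these calls inherit handlers, formatting, and level filtering from the application's logging configuration.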
@@ -242,7 +242,8 @@ def sessions_search(projectId: int, data: schemas.SessionsSearchPayloadSchema =
 @app.post('/{projectId}/sessions/search/ids', tags=["sessions"])
 def session_ids_search(projectId: int, data: schemas.SessionsSearchPayloadSchema = Body(...),
                        context: schemas.CurrentContext = Depends(OR_context)):
-    data = sessions.search_sessions(data=data, project_id=projectId, user_id=context.user_id, ids_only=True)
+    data = sessions.search_sessions(data=data, project_id=projectId, user_id=context.user_id, ids_only=True,
+                                    platform=context.project.platform)
     return {'data': data}
@@ -75,8 +75,8 @@ class UserLoginSchema(_GRecaptcha):
 
 
 class UserSignupSchema(UserLoginSchema):
-    fullname: str = Field(..., le=0)
-    organizationName: str = Field(..., le=0)
+    fullname: str = Field(..., min_length=1)
+    organizationName: str = Field(..., min_length=1)
 
     _transform_fullname = field_validator('fullname', mode='before')(remove_whitespace)
     _transform_organizationName = field_validator('organizationName', mode='before')(remove_whitespace)
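The removed `le=0` constraint was a bug: `le` is a numeric upper bound and does not apply to strings, which is what `min_length=1` fixes. A runnable sketch of the corrected pattern, assuming Pydantic v2 (the class name and validator body are illustrative):

```python
from pydantic import BaseModel, Field, ValidationError, field_validator


def remove_whitespace(value):
    # Trim surrounding whitespace before length validation runs.
    return value.strip() if isinstance(value, str) else value


class SignupSchema(BaseModel):
    fullname: str = Field(..., min_length=1)
    organizationName: str = Field(..., min_length=1)

    # mode='before' applies the trim prior to the min_length check,
    # so a whitespace-only name fails validation.
    _transform_fullname = field_validator('fullname', mode='before')(remove_whitespace)
    _transform_organizationName = field_validator('organizationName', mode='before')(remove_whitespace)
```

Because the validator runs before the constraint, `"   "` is rejected while `"  John "` is accepted and normalized to `"John"`.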
ee/api/.gitignore (vendored): 1 change

@@ -276,3 +276,4 @@ Pipfile.lock
 /schemas/schemas.py
 /chalicelib/core/authorizers.py
+/schemas/transformers_validators.py
 /test/
@@ -17,11 +17,10 @@ python-decouple = "==3.8"
 apscheduler = "==3.10.4"
 python-multipart = "==0.0.6"
 redis = "==5.0.1"
-azure-storage-blob = "==12.18.3"
 uvicorn = {extras = ["standard"], version = "==0.23.2"}
 pydantic = {extras = ["email"], version = "==2.3.0"}
 clickhouse-driver = {extras = ["lz4"], version = "==0.2.6"}
 python3-saml = "==1.16.0"
 azure-storage-blob = "==12.18.3"
 
 [dev-packages]
@@ -14,6 +14,7 @@ if config("EXP_AUTOCOMPLETE", cast=bool, default=False):
 else:
     from . import autocomplete as autocomplete
 
+
 def get_customs_by_session_id(session_id, project_id):
     with pg_client.PostgresClient() as cur:
         cur.execute(cur.mogrify("""\
@@ -115,12 +116,13 @@ class EventType:
                       column=None)  # column=None because errors are searched by name or message
     METADATA = Event(ui_type=schemas.FilterType.metadata, table="public.sessions", column=None)
     # IOS
-    CLICK_IOS = Event(ui_type=schemas.EventType.click_ios, table="events_ios.clicks", column="label")
+    CLICK_IOS = Event(ui_type=schemas.EventType.click_ios, table="events_ios.taps", column="label")
     INPUT_IOS = Event(ui_type=schemas.EventType.input_ios, table="events_ios.inputs", column="label")
     VIEW_IOS = Event(ui_type=schemas.EventType.view_ios, table="events_ios.views", column="name")
+    SWIPE_IOS = Event(ui_type=schemas.EventType.swipe_ios, table="events_ios.swipes", column="label")
     CUSTOM_IOS = Event(ui_type=schemas.EventType.custom_ios, table="events_common.customs", column="name")
-    REQUEST_IOS = Event(ui_type=schemas.EventType.request_ios, table="events_common.requests", column="url")
-    ERROR_IOS = Event(ui_type=schemas.EventType.error_ios, table="events_ios.crashes",
+    REQUEST_IOS = Event(ui_type=schemas.EventType.request_ios, table="events_common.requests", column="path")
+    CRASH_IOS = Event(ui_type=schemas.EventType.error_ios, table="events_common.crashes",
                       column=None)  # column=None because errors are searched by name or message
@@ -163,7 +165,7 @@ SUPPORTED_TYPES = {
     EventType.REQUEST_IOS.ui_type: SupportedFilter(get=autocomplete.__generic_autocomplete(EventType.REQUEST_IOS),
                                                    query=autocomplete.__generic_query(
                                                        typename=EventType.REQUEST_IOS.ui_type)),
-    EventType.ERROR_IOS.ui_type: SupportedFilter(get=autocomplete.__search_errors_ios,
+    EventType.CRASH_IOS.ui_type: SupportedFilter(get=autocomplete.__search_errors_ios,
                                                  query=None),
 }
@@ -108,27 +108,25 @@ def _isUndefined_operator(op: schemas.SearchEventOperator):
 
 # This function executes the query and return result
 def search_sessions(data: schemas.SessionsSearchPayloadSchema, project_id, user_id, errors_only=False,
-                    error_status=schemas.ErrorStatus.all, count_only=False, issue=None, ids_only=False):
+                    error_status=schemas.ErrorStatus.all, count_only=False, issue=None, ids_only=False,
+                    platform="web"):
     full_args, query_part = search_query_parts_ch(data=data, error_status=error_status, errors_only=errors_only,
                                                   favorite_only=data.bookmarked, issue=issue, project_id=project_id,
-                                                  user_id=user_id)
+                                                  user_id=user_id, platform=platform)
     if data.sort == "startTs":
         data.sort = "datetime"
     if data.limit is not None and data.page is not None:
-        full_args["sessions_limit"] = data.limit
         full_args["sessions_limit_s"] = (data.page - 1) * data.limit
         full_args["sessions_limit_e"] = data.page * data.limit
+        full_args["sessions_limit"] = data.limit
     else:
-        full_args["sessions_limit"] = 200
         full_args["sessions_limit_s"] = 0
         full_args["sessions_limit_e"] = 200
+        full_args["sessions_limit"] = 200
 
     meta_keys = []
     with ch_client.ClickHouseClient() as cur:
         if errors_only:
             logging.debug("--------------------QP")
             logging.debug(cur.format(query_part, full_args))
             logging.debug("--------------------")
             main_query = cur.format(f"""SELECT DISTINCT er.error_id,
                                                COALESCE((SELECT TRUE
                                                          FROM {exp_ch_helper.get_user_viewed_errors_table()} AS ve
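The pagination arguments above define a 1-based page window: `sessions_limit_s`/`sessions_limit_e` bound the row numbers, with a default window of the first 200 sessions. The arithmetic can be sketched as (the helper function itself is illustrative; the key names follow the diff):

```python
from typing import Optional


def pagination_args(limit: Optional[int], page: Optional[int]) -> dict:
    # Mirrors the diff: explicit page/limit if both given, else a 200-session default window.
    if limit is not None and page is not None:
        return {
            "sessions_limit_s": (page - 1) * limit,  # start row of the window
            "sessions_limit_e": page * limit,        # end row of the window
            "sessions_limit": limit,
        }
    return {"sessions_limit_s": 0, "sessions_limit_e": 200, "sessions_limit": 200}
```

For example, `pagination_args(50, 3)` selects rows 100 through 150, i.e. the third page of 50 sessions.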
@@ -145,7 +143,7 @@ def search_sessions(data: schemas.SessionsSearchPayloadSchema, project_id, user_
     if data.order is None:
         data.order = schemas.SortOrderType.desc.value
     else:
-        data.order = data.order.value
+        data.order = data.order
     if data.sort is not None and data.sort != 'sessionsCount':
         sort = helper.key_to_snake_case(data.sort)
         g_sort = f"{'MIN' if data.order == schemas.SortOrderType.desc else 'MAX'}({sort})"
@@ -181,7 +179,7 @@ def search_sessions(data: schemas.SessionsSearchPayloadSchema, project_id, user_
     if data.order is None:
         data.order = schemas.SortOrderType.desc.value
     else:
-        data.order = data.order.value
+        data.order = data.order
     sort = 'session_id'
     if data.sort is not None and data.sort != "session_id":
         # sort += " " + data.order + "," + helper.key_to_snake_case(data.sort)
@@ -214,7 +212,7 @@ def search_sessions(data: schemas.SessionsSearchPayloadSchema, project_id, user_
             logging.warning("--------- SESSIONS-CH SEARCH QUERY EXCEPTION -----------")
             logging.warning(main_query)
             logging.warning("--------- PAYLOAD -----------")
-            logging.warning(data.json())
+            logging.warning(data.model_dump_json())
             logging.warning("--------------------")
             raise err
     if errors_only or ids_only:
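`data.json()` is the Pydantic v1 serializer; v2 renamed it to `model_dump_json()` (and `.dict()` to `model_dump()`), which this hunk adopts alongside the `pydantic ==2.3.0` pin. A minimal sketch of the rename, assuming Pydantic v2 (the model is illustrative):

```python
from pydantic import BaseModel


class SearchPayload(BaseModel):
    limit: int = 200
    sort: str = "startTs"


payload = SearchPayload()
# Pydantic v2: model_dump_json() -> JSON string (v1 equivalent: .json()).
print(payload.model_dump_json())
```

In v2 the old `.json()` still exists but emits a deprecation warning, so logging call sites like the one above are migrated explicitly.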
@@ -363,7 +361,7 @@ def __is_valid_event(is_any: bool, event: schemas.SessionSearchEventSchema2):
                                  event.filters is None or len(event.filters) == 0))
 
 
-def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEventType]):
+def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEventType], platform="web"):
     defs = {
         schemas.EventType.click: "CLICK",
         schemas.EventType.input: "INPUT",
@@ -378,9 +376,20 @@ def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEve
         schemas.EventType.state_action: "STATEACTION",
         schemas.EventType.error: "ERROR",
         schemas.PerformanceEventType.location_avg_cpu_load: 'PERFORMANCE',
-        schemas.PerformanceEventType.location_avg_memory_usage: 'PERFORMANCE',
+        schemas.PerformanceEventType.location_avg_memory_usage: 'PERFORMANCE'
     }
+
+    defs_ios = {
+        schemas.EventType.click: "TAP",
+        schemas.EventType.input: "INPUT",
+        schemas.EventType.location: "VIEW",
+        schemas.EventType.custom: "CUSTOM",
+        schemas.EventType.request: "REQUEST",
+        schemas.EventType.request_details: "REQUEST",
+        schemas.PerformanceEventType.fetch_failed: "REQUEST",
+        schemas.EventType.error: "CRASH",
+    }
+    if platform == "ios" and event_type in defs_ios:
+        return defs_ios.get(event_type)
     if event_type not in defs:
         raise Exception(f"unsupported EventType:{event_type}")
     return defs.get(event_type)
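The hunk above routes one logical UI event to a different stored event-type tag per platform: web clicks become iOS taps, locations become views, errors become crashes, with web as the fallback. The dispatch idea can be sketched standalone (string keys and values are simplified stand-ins for the schema enums):

```python
# Platform-aware event-type mapping, mirroring the __get_event_type change.
DEFS_WEB = {"click": "CLICK", "input": "INPUT", "location": "LOCATION", "error": "ERROR"}
DEFS_IOS = {"click": "TAP", "location": "VIEW", "error": "CRASH"}


def get_event_type(event_type: str, platform: str = "web") -> str:
    # iOS overrides take priority; anything not overridden falls back to the web map.
    if platform == "ios" and event_type in DEFS_IOS:
        return DEFS_IOS[event_type]
    if event_type not in DEFS_WEB:
        raise ValueError(f"unsupported EventType:{event_type}")
    return DEFS_WEB[event_type]
```

Keeping the iOS map partial means event types with identical semantics on both platforms (like `input`) need no duplicate entry.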
@@ -388,12 +397,12 @@ def __get_event_type(event_type: Union[schemas.EventType, schemas.PerformanceEve
 
 # this function generates the query and return the generated-query with the dict of query arguments
 def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_status, errors_only, favorite_only, issue,
-                          project_id, user_id, extra_event=None):
+                          project_id, user_id, platform="web", extra_event=None):
     ss_constraints = []
     full_args = {"project_id": project_id, "startDate": data.startTimestamp, "endDate": data.endTimestamp,
                  "projectId": project_id, "userId": user_id}
 
-    MAIN_EVENTS_TABLE = exp_ch_helper.get_main_events_table(data.startTimestamp)
+    MAIN_EVENTS_TABLE = exp_ch_helper.get_main_events_table(timestamp=data.startTimestamp, platform=platform)
     MAIN_SESSIONS_TABLE = exp_ch_helper.get_main_sessions_table(data.startTimestamp)
 
     full_args["MAIN_EVENTS_TABLE"] = MAIN_EVENTS_TABLE
@@ -704,7 +713,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
             # event_where.append(f"event_{event_index - 1}.datetime <= main.datetime")
             e_k = f"e_value{i}"
             s_k = e_k + "_source"
-            if event.type != schemas.PerformanceEventType.time_between_events:
+            if True or event.type != schemas.PerformanceEventType.time_between_events:
                 event.value = helper.values_for_operator(value=event.value, op=event.operator)
                 full_args = {**full_args,
                              **_multiple_values(event.value, value_key=e_k),
@@ -712,19 +721,36 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 
         if event_type == events.EventType.CLICK.ui_type:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-            _column = events.EventType.CLICK.column
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
-            events_conditions.append({"type": event_where[-1]})
-            if not is_any:
-                if event.operator == schemas.ClickEventExtraOperator._on_selector:
-                    event_where.append(
-                        _multiple_conditions(f"main.selector = %({e_k})s", event.value, value_key=e_k))
-                    events_conditions[-1]["condition"] = event_where[-1]
-                else:
+            if platform == "web":
+                _column = events.EventType.CLICK.column
+                event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+                events_conditions.append({"type": event_where[-1]})
+                if not is_any:
+                    if event.operator == schemas.ClickEventExtraOperator._on_selector:
+                        event_where.append(
+                            _multiple_conditions(f"main.selector = %({e_k})s", event.value, value_key=e_k))
+                        events_conditions[-1]["condition"] = event_where[-1]
+                    else:
+                        if is_not:
+                            event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
+                                                                    value_key=e_k))
+                            events_conditions_not.append(
+                                {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+                            events_conditions_not[-1]["condition"] = event_where[-1]
+                        else:
+                            event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
+                                                                    value_key=e_k))
+                            events_conditions[-1]["condition"] = event_where[-1]
+            else:
+                _column = events.EventType.CLICK_IOS.column
+                event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+                events_conditions.append({"type": event_where[-1]})
+                if not is_any:
                     if is_not:
                         event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
                                                                 value_key=e_k))
-                        events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
+                        events_conditions_not.append(
+                            {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
                         events_conditions_not[-1]["condition"] = event_where[-1]
                     else:
                         event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -733,49 +759,84 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
 
         elif event_type == events.EventType.INPUT.ui_type:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-            _column = events.EventType.INPUT.column
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
-            events_conditions.append({"type": event_where[-1]})
-            if not is_any:
-                if is_not:
-                    event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
-                                                            value_key=e_k))
-                    events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
-                    events_conditions_not[-1]["condition"] = event_where[-1]
-                else:
-                    event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
-                                                            value_key=e_k))
-                    events_conditions[-1]["condition"] = event_where[-1]
-            if event.source is not None and len(event.source) > 0:
-                event_where.append(_multiple_conditions(f"main.value ILIKE %(custom{i})s", event.source,
-                                                        value_key=f"custom{i}"))
-                full_args = {**full_args, **_multiple_values(event.source, value_key=f"custom{i}")}
+            if platform == "web":
+                _column = events.EventType.INPUT.column
+                event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+                events_conditions.append({"type": event_where[-1]})
+                if not is_any:
+                    if is_not:
+                        event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
+                                                                value_key=e_k))
+                        events_conditions_not.append(
+                            {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+                        events_conditions_not[-1]["condition"] = event_where[-1]
+                    else:
+                        event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
+                                                                value_key=e_k))
+                        events_conditions[-1]["condition"] = event_where[-1]
+                if event.source is not None and len(event.source) > 0:
+                    event_where.append(_multiple_conditions(f"main.value ILIKE %(custom{i})s", event.source,
+                                                            value_key=f"custom{i}"))
+                    full_args = {**full_args, **_multiple_values(event.source, value_key=f"custom{i}")}
+            else:
+                _column = events.EventType.INPUT_IOS.column
+                event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+                events_conditions.append({"type": event_where[-1]})
+                if not is_any:
+                    if is_not:
+                        event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
+                                                                value_key=e_k))
+                        events_conditions_not.append(
+                            {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+                        events_conditions_not[-1]["condition"] = event_where[-1]
+                    else:
+                        event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
+                                                                value_key=e_k))
+                        events_conditions[-1]["condition"] = event_where[-1]
 
         elif event_type == events.EventType.LOCATION.ui_type:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-            _column = 'url_path'
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
-            events_conditions.append({"type": event_where[-1]})
-            if not is_any:
-                if is_not:
-                    event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
-                                                            value_key=e_k))
-                    events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
-                    events_conditions_not[-1]["condition"] = event_where[-1]
-                else:
-                    event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
-                                                            event.value, value_key=e_k))
-                    events_conditions[-1]["condition"] = event_where[-1]
+            if platform == "web":
+                _column = 'url_path'
+                event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+                events_conditions.append({"type": event_where[-1]})
+                if not is_any:
+                    if is_not:
+                        event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
+                                                                value_key=e_k))
+                        events_conditions_not.append(
+                            {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+                        events_conditions_not[-1]["condition"] = event_where[-1]
+                    else:
+                        event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
+                                                                event.value, value_key=e_k))
+                        events_conditions[-1]["condition"] = event_where[-1]
+            else:
+                _column = events.EventType.VIEW_IOS.column
+                event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
+                events_conditions.append({"type": event_where[-1]})
+                if not is_any:
+                    if is_not:
+                        event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
+                                                                value_key=e_k))
+                        events_conditions_not.append(
+                            {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
+                        events_conditions_not[-1]["condition"] = event_where[-1]
+                    else:
+                        event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
+                                                                event.value, value_key=e_k))
+                        events_conditions[-1]["condition"] = event_where[-1]
         elif event_type == events.EventType.CUSTOM.ui_type:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
             _column = events.EventType.CUSTOM.column
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
             if not is_any:
                 if is_not:
                     event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
                                                             value_key=e_k))
-                    events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
+                    events_conditions_not.append(
+                        {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
                     events_conditions_not[-1]["condition"] = event_where[-1]
                 else:
                     event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@@ -784,13 +845,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
         elif event_type == events.EventType.REQUEST.ui_type:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
             _column = 'url_path'
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
             if not is_any:
                 if is_not:
                     event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
                                                             value_key=e_k))
-                    events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
+                    events_conditions_not.append(
+                        {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
                     events_conditions_not[-1]["condition"] = event_where[-1]
                 else:
                     event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s", event.value,
@ -808,13 +870,14 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
|
|||
elif event_type == events.EventType.STATEACTION.ui_type:
|
||||
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
|
||||
_column = events.EventType.STATEACTION.column
|
||||
event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
|
||||
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
|
||||
events_conditions.append({"type": event_where[-1]})
|
||||
if not is_any:
|
||||
if is_not:
|
||||
event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
|
||||
value_key=e_k))
|
||||
events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
|
||||
events_conditions_not.append(
|
||||
{"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
|
||||
events_conditions_not[-1]["condition"] = event_where[-1]
|
||||
else:
|
||||
event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
|
||||
|
|
@ -824,7 +887,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
|
|||
elif event_type == events.EventType.ERROR.ui_type:
|
||||
event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main"
|
||||
events_extra_join = f"SELECT * FROM {MAIN_EVENTS_TABLE} AS main1 WHERE main1.project_id=%(project_id)s"
|
||||
event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
|
||||
event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
|
||||
events_conditions.append({"type": event_where[-1]})
|
||||
event.source = tuple(event.source)
|
||||
events_conditions[-1]["condition"] = []
|
||||
|
|
@@ -844,14 +907,15 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
         elif event_type == schemas.PerformanceEventType.fetch_failed:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
             _column = 'url_path'
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
             events_conditions[-1]["condition"] = []
             if not is_any:
                 if is_not:
                     event_where.append(_multiple_conditions(f"sub.{_column} {op} %({e_k})s", event.value,
                                                             value_key=e_k))
-                    events_conditions_not.append({"type": f"sub.event_type='{__get_event_type(event_type)}'"})
+                    events_conditions_not.append(
+                        {"type": f"sub.event_type='{__get_event_type(event_type, platform=platform)}'"})
                     events_conditions_not[-1]["condition"] = event_where[-1]
                 else:
                     event_where.append(_multiple_conditions(f"main.{_column} {op} %({e_k})s",
@@ -882,7 +946,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
                             schemas.PerformanceEventType.location_largest_contentful_paint_time,
                             schemas.PerformanceEventType.location_ttfb]:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
             events_conditions[-1]["condition"] = []
             col = performance_event.get_col(event_type)
@@ -905,7 +969,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
         elif event_type in [schemas.PerformanceEventType.location_avg_cpu_load,
                             schemas.PerformanceEventType.location_avg_memory_usage]:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
             events_conditions[-1]["condition"] = []
             col = performance_event.get_col(event_type)
@@ -928,9 +992,9 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
         elif event_type == schemas.PerformanceEventType.time_between_events:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
             # event_from = event_from % f"{getattr(events.event_type, event.value[0].type).table} AS main INNER JOIN {getattr(events.event_type, event.value[1].type).table} AS main2 USING(session_id) "
-            event_where.append(f"main.event_type='{__get_event_type(event.value[0].type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event.value[0].type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
-            event_where.append(f"main.event_type='{__get_event_type(event.value[0].type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event.value[0].type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})

             if not isinstance(event.value[0].value, list):
@@ -978,7 +1042,7 @@ def search_query_parts_ch(data: schemas.SessionsSearchPayloadSchema, error_statu
         # TODO: no isNot for RequestDetails
         elif event_type == schemas.EventType.request_details:
             event_from = event_from % f"{MAIN_EVENTS_TABLE} AS main "
-            event_where.append(f"main.event_type='{__get_event_type(event_type)}'")
+            event_where.append(f"main.event_type='{__get_event_type(event_type, platform=platform)}'")
             events_conditions.append({"type": event_where[-1]})
             apply = False
             events_conditions[-1]["condition"] = []
@@ -1,4 +1,5 @@
 import json
+import logging

 from decouple import config
@@ -9,15 +10,17 @@ from chalicelib.utils import helper
 from chalicelib.utils import pg_client
 from chalicelib.utils.TimeUTC import TimeUTC

+logger = logging.getLogger(__name__)
+

 def create_tenant(data: schemas.UserSignupSchema):
-    print(f"===================== SIGNUP STEP 1 AT {TimeUTC.to_human_readable(TimeUTC.now())} UTC")
+    logger.info(f"==== Signup started at {TimeUTC.to_human_readable(TimeUTC.now())} UTC")
     errors = []
     if not config("MULTI_TENANTS", cast=bool, default=False) and tenants.tenants_exists():
         return {"errors": ["tenants already registered"]}

     email = data.email
-    print(f"=====================> {email}")
+    logger.debug(f"email: {email}")
     password = data.password.get_secret_value()

     if email is None or len(email) < 5:
@@ -43,8 +46,9 @@ def create_tenant(data: schemas.UserSignupSchema):
         errors.append("Invalid organization name.")

     if len(errors) > 0:
-        print(f"==> error for email:{data.email}, fullname:{data.fullname}, organizationName:{data.organizationName}")
-        print(errors)
+        logger.warning(
+            f"==> signup error for:\n email:{data.email}, fullname:{data.fullname}, organizationName:{data.organizationName}")
+        logger.warning(errors)
         return {"errors": errors}

     project_name = "my first project"
@@ -8,10 +8,13 @@ if config("EXP_7D_MV", cast=bool, default=True):
     print(">>> Using experimental last 7 days materialized views")


-def get_main_events_table(timestamp=0):
-    return "experimental.events_l7d_mv" \
-        if config("EXP_7D_MV", cast=bool, default=True) \
-        and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events"
+def get_main_events_table(timestamp=0, platform="web"):
+    if platform == "web":
+        return "experimental.events_l7d_mv" \
+            if config("EXP_7D_MV", cast=bool, default=True) \
+            and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events"
+    else:
+        return "experimental.ios_events"


 def get_main_sessions_table(timestamp=0):
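The new table-selection logic above can be sketched standalone. Here `config("EXP_7D_MV", ...)` and `TimeUTC.now(delta_days=...)` are replaced with illustrative stubs (a module flag and an epoch-milliseconds helper), so this is a sketch of the routing decision, not the real module:

```python
import time

# Stand-in for config("EXP_7D_MV", cast=bool, default=True)
EXP_7D_MV = True

def now_ms(delta_days=0):
    # Stand-in for TimeUTC.now: current epoch time in milliseconds,
    # shifted by delta_days (negative means "days ago")
    return int((time.time() + delta_days * 86400) * 1000)

def get_main_events_table(timestamp=0, platform="web"):
    # non-web platforms always read the dedicated iOS events table
    if platform != "web":
        return "experimental.ios_events"
    # recent web data can be served from the last-7-days materialized view
    if EXP_7D_MV and timestamp >= now_ms(delta_days=-7):
        return "experimental.events_l7d_mv"
    return "experimental.events"

print(get_main_events_table(timestamp=now_ms(), platform="web"))  # experimental.events_l7d_mv
print(get_main_events_table(timestamp=0, platform="web"))         # experimental.events
print(get_main_events_table(platform="ios"))                      # experimental.ios_events
```

The key behavioral change is that the `platform` check happens before the materialized-view optimization, so mobile queries never touch the web tables.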
@@ -47,4 +50,4 @@ def get_main_js_errors_sessions_table(timestamp=0):
     # enable this when js_errors_sessions_mv is fixed
     # return "experimental.js_errors_sessions_mv"  # \
     #     if config("EXP_7D_MV", cast=bool, default=True) \
-    #     and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events"
+    #     and timestamp >= TimeUTC.now(delta_days=-7) else "experimental.events"
@@ -250,7 +250,8 @@ def get_session(projectId: int, sessionId: Union[int, str], background_tasks: Ba
              dependencies=[OR_scope(Permissions.session_replay)])
 def sessions_search(projectId: int, data: schemas.SessionsSearchPayloadSchema = Body(...),
                     context: schemas.CurrentContext = Depends(OR_context)):
-    data = sessions.search_sessions(data=data, project_id=projectId, user_id=context.user_id)
+    data = sessions.search_sessions(data=data, project_id=projectId, user_id=context.user_id,
+                                    platform=context.project.platform)
     return {'data': data}
@@ -258,7 +259,8 @@ def sessions_search(projectId: int, data: schemas.SessionsSearchPayloadSchema =
              dependencies=[OR_scope(Permissions.session_replay)])
 def session_ids_search(projectId: int, data: schemas.SessionsSearchPayloadSchema = Body(...),
                        context: schemas.CurrentContext = Depends(OR_context)):
-    data = sessions.search_sessions(data=data, project_id=projectId, user_id=context.user_id, ids_only=True)
+    data = sessions.search_sessions(data=data, project_id=projectId, user_id=context.user_id, ids_only=True,
+                                    platform=context.project.platform)
     return {'data': data}
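Both endpoints now forward `context.project.platform` into the search call. A minimal sketch of the new call shape, using a hypothetical stub in place of `sessions.search_sessions` (the real function builds and runs the ClickHouse query):

```python
# Hypothetical stub of sessions.search_sessions, for illustration only:
# the new platform kwarg decides which events table the generated query targets.
def search_sessions(data, project_id, user_id, ids_only=False, platform="web"):
    table = "experimental.ios_events" if platform != "web" else "experimental.events"
    return {"table": table, "ids_only": ids_only}

# Both endpoints forward the project's platform explicitly:
result = search_sessions(data={}, project_id=1, user_id=7, platform="ios")
print(result["table"])  # experimental.ios_events
```

Making `platform` a keyword argument with a `"web"` default keeps any other callers of `search_sessions` working unchanged.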

@@ -10,4 +10,38 @@ ALTER TABLE experimental.events
     ADD COLUMN IF NOT EXISTS coordinate Tuple(x Nullable(UInt16), y Nullable(UInt16));

+ALTER TABLE experimental.sessions
+    ADD COLUMN IF NOT EXISTS timezone LowCardinality(Nullable(String));
+
+
+CREATE TABLE IF NOT EXISTS experimental.ios_events
+(
+    session_id     UInt64,
+    project_id     UInt16,
+    event_type     Enum8('TAP'=0, 'INPUT'=1, 'SWIP'=2, 'VIEW'=3, 'REQUEST'=4, 'CRASH'=5, 'CUSTOM'=6, 'STATEACTION'=8, 'ISSUE'=9),
+    datetime       DateTime,
+    label          Nullable(String),
+    name           Nullable(String),
+    payload        Nullable(String),
+    level          Nullable(Enum8('info'=0, 'error'=1)) DEFAULT if(event_type == 'CUSTOM', 'info', null),
+    context        Nullable(Enum8('unknown'=0, 'self'=1, 'same-origin-ancestor'=2, 'same-origin-descendant'=3, 'same-origin'=4, 'cross-origin-ancestor'=5, 'cross-origin-descendant'=6, 'cross-origin-unreachable'=7, 'multiple-contexts'=8)),
+    url            Nullable(String),
+    url_host       Nullable(String) MATERIALIZED lower(domain(url)),
+    url_path       Nullable(String) MATERIALIZED lower(pathFull(url)),
+    url_hostpath   Nullable(String) MATERIALIZED concat(url_host, url_path),
+    request_start  Nullable(UInt16),
+    response_start Nullable(UInt16),
+    response_end   Nullable(UInt16),
+    method         Nullable(Enum8('GET'=0, 'HEAD'=1, 'POST'=2, 'PUT'=3, 'DELETE'=4, 'CONNECT'=5, 'OPTIONS'=6, 'TRACE'=7, 'PATCH'=8)),
+    status         Nullable(UInt16),
+    success        Nullable(UInt8),
+    request_body   Nullable(String),
+    response_body  Nullable(String),
+    issue_type     Nullable(Enum8('tap_rage'=1, 'dead_click'=2, 'excessive_scrolling'=3, 'bad_request'=4, 'missing_resource'=5, 'memory'=6, 'cpu'=7, 'slow_resource'=8, 'slow_page_load'=9, 'crash'=10, 'ml_cpu'=11, 'ml_memory'=12, 'ml_dead_click'=13, 'ml_click_rage'=14, 'ml_mouse_thrashing'=15, 'ml_excessive_scrolling'=16, 'ml_slow_resources'=17, 'custom'=18, 'js_exception'=19, 'mouse_thrashing'=20, 'app_crash'=21)),
+    issue_id       Nullable(String),
+    transfer_size  Nullable(UInt32),
+    message_id     UInt64 DEFAULT 0,
+    _timestamp     DateTime DEFAULT now()
+) ENGINE = ReplacingMergeTree(_timestamp)
+      PARTITION BY toYYYYMM(datetime)
+      ORDER BY (project_id, datetime, event_type, session_id, message_id)
+      TTL datetime + INTERVAL 3 MONTH;
@@ -410,4 +410,37 @@ CREATE TABLE IF NOT EXISTS experimental.sessions_feature_flags
 ) ENGINE = ReplacingMergeTree(_timestamp)
       PARTITION BY toYYYYMM(datetime)
       ORDER BY (project_id, datetime, session_id, feature_flag_id, condition_id)
       TTL datetime + INTERVAL 3 MONTH;

+CREATE TABLE IF NOT EXISTS experimental.ios_events
+(
+    session_id     UInt64,
+    project_id     UInt16,
+    event_type     Enum8('TAP'=0, 'INPUT'=1, 'SWIP'=2, 'VIEW'=3, 'REQUEST'=4, 'CRASH'=5, 'CUSTOM'=6, 'STATEACTION'=8, 'ISSUE'=9),
+    datetime       DateTime,
+    label          Nullable(String),
+    name           Nullable(String),
+    payload        Nullable(String),
+    level          Nullable(Enum8('info'=0, 'error'=1)) DEFAULT if(event_type == 'CUSTOM', 'info', null),
+    context        Nullable(Enum8('unknown'=0, 'self'=1, 'same-origin-ancestor'=2, 'same-origin-descendant'=3, 'same-origin'=4, 'cross-origin-ancestor'=5, 'cross-origin-descendant'=6, 'cross-origin-unreachable'=7, 'multiple-contexts'=8)),
+    url            Nullable(String),
+    url_host       Nullable(String) MATERIALIZED lower(domain(url)),
+    url_path       Nullable(String) MATERIALIZED lower(pathFull(url)),
+    url_hostpath   Nullable(String) MATERIALIZED concat(url_host, url_path),
+    request_start  Nullable(UInt16),
+    response_start Nullable(UInt16),
+    response_end   Nullable(UInt16),
+    method         Nullable(Enum8('GET'=0, 'HEAD'=1, 'POST'=2, 'PUT'=3, 'DELETE'=4, 'CONNECT'=5, 'OPTIONS'=6, 'TRACE'=7, 'PATCH'=8)),
+    status         Nullable(UInt16),
+    success        Nullable(UInt8),
+    request_body   Nullable(String),
+    response_body  Nullable(String),
+    issue_type     Nullable(Enum8('tap_rage'=1, 'dead_click'=2, 'excessive_scrolling'=3, 'bad_request'=4, 'missing_resource'=5, 'memory'=6, 'cpu'=7, 'slow_resource'=8, 'slow_page_load'=9, 'crash'=10, 'ml_cpu'=11, 'ml_memory'=12, 'ml_dead_click'=13, 'ml_click_rage'=14, 'ml_mouse_thrashing'=15, 'ml_excessive_scrolling'=16, 'ml_slow_resources'=17, 'custom'=18, 'js_exception'=19, 'mouse_thrashing'=20, 'app_crash'=21)),
+    issue_id       Nullable(String),
+    transfer_size  Nullable(UInt32),
+    message_id     UInt64 DEFAULT 0,
+    _timestamp     DateTime DEFAULT now()
+) ENGINE = ReplacingMergeTree(_timestamp)
+      PARTITION BY toYYYYMM(datetime)
+      ORDER BY (project_id, datetime, event_type, session_id, message_id)
+      TTL datetime + INTERVAL 3 MONTH;
@@ -19,7 +19,7 @@ $fn_def$, :'next_version')

 --
 ALTER TABLE IF EXISTS events_common.requests
-    ADD COLUMN transfer_size bigint NULL;
+    ADD COLUMN IF NOT EXISTS transfer_size bigint NULL;

 ALTER TABLE IF EXISTS public.sessions
     ADD COLUMN IF NOT EXISTS timezone text NULL;
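Switching to `ADD COLUMN IF NOT EXISTS` makes the migration safe to re-run. Where that clause is unavailable, the same idempotency can be emulated by checking the catalog first; the SQLite snippet below is purely illustrative (SQLite lacks the clause), not part of the migration:

```python
import sqlite3

def add_column_if_not_exists(conn, table, column, decl):
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk);
    # only run the ALTER when the column is genuinely missing
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in cols:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE requests (id INTEGER)")
add_column_if_not_exists(conn, "requests", "transfer_size", "BIGINT")
add_column_if_not_exists(conn, "requests", "transfer_size", "BIGINT")  # safe to re-run
```

Without the guard, re-applying the script to an already-migrated database would abort on a duplicate-column error, which is exactly what the `transfer_size` fix above addresses.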