Merge branch 'master' into fork_master

This commit is contained in:
Dniel97 2024-03-12 18:33:05 +01:00
commit 70dae79058
Signed by: Dniel97
GPG Key ID: 6180B3C768FB2E08
439 changed files with 82009 additions and 6153 deletions

3
.gitattributes vendored Normal file

@ -0,0 +1,3 @@
*.csv binary
*.txt binary
*.json binary

1
.gitignore vendored

@ -158,5 +158,6 @@ cert/*
!cert/server.pem
config/*
deliver/*
*.gz
dbdump-*.json

21
Dockerfile Normal file

@ -0,0 +1,21 @@
FROM python:3.9.15-slim-bullseye
RUN apt update && apt install default-libmysqlclient-dev build-essential libtk nodejs npm pkg-config -y
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
RUN npm i -g nodemon
COPY entrypoint.sh entrypoint.sh
RUN chmod +x entrypoint.sh
COPY index.py index.py
COPY dbutils.py dbutils.py
COPY read.py read.py
ADD core core
ADD titles titles
ADD logs logs
ADD cert cert
ENTRYPOINT [ "/app/entrypoint.sh" ]

194
changelog.md Normal file

@ -0,0 +1,194 @@
# Changelog
Documenting updates to ARTEMiS, to be updated every time the master branch is pushed to.
## 20240109
### System
+ Removed `ADD config config` from dockerfile
### Aimedb
+ Fixed an error that resulted from trying to scan a banned or locked card
## 20240108
### System
+ Change how the underlying system handles URLs
+ This can now allow for things like version-specific, or even keychip-specific URLs
+ Specific changes to games are noted below
+ Fix docker files [#60](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/60) (Thanks Rylie!)
+ Fix support for python 3.8 - 3.10
### Aimedb
+ Add support for SegaAuth key in games that support it (for now only Chunithm)
+ This is a JWT, issued to games by Aimedb, that the games then send to their game server so the server can verify the access code was obtained via Aimedb.
+ Requires a base64-encoded secret to be set in the `core.yaml`
### Chunithm
+ Fix Air support
+ Add saving for userRecentPlayerList
+ Add support for SegaAuthKey
+ Fix a bug arising if a user set their name to be 'true' or 'false'
+ Add support for Sun+ [#78](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/78) (Thanks EmmyHeart!)
+ Add `matching` section to `chuni.yaml`
+ ~~Change `udpHolePunchUri` and `reflectorUri` to be STUN and TURN servers~~ Reverted
+ Improve `GetGameSetting` request handling for different versions
+ Fix issue where songs would not always return all scores [#92](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/92) (Thanks Kumubou!)
### maimai DX
+ Fix user charges failing to save
### maimai
+ Made it functional
### CXB
+ Improvements to request dispatching
+ Add support for non-omnimix music lists
### IDZ
+ Fix news urls in accordance with the system change to URLs
### Initial D THE ARCADE
+ Added support for Initial D THE ARCADE S2
+ Story mode progress added
+ Bunta Challenge/Touhou Project modes added
+ Time Trials added
+ Leaderboards added, but they don't always refresh
+ Theory of Street mode added (with CPUs)
+ Play Stamp/Timetrial events added
+ Frontend to download profile added
+ Importer to import profiles added
### ONGEKI
+ Now supports HTTPS on a per-version basis
+ Merge PR [#61](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/61) (Thanks phantomlan!)
+ Add Ranking Event Support
+ Add reward list support
+ Add version segregation to Event Ranking, Tech Challenge, and Music Ranking
+ Now stores ClientTestmode and ClientSetting data
+ Fix mission points not adding correctly [#68](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/68) (Thanks phantomlan!)
+ Fix tech challenge [#70](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls/70) (Thanks phantomlan!)
### SAO
+ Change endpoint in accordance with the system change to URLs
+ Update request header class to be more accurate
+ Encrypted requests are now supported
+ Change to using handler classes instead of raw structs for simplicity
### Wacca
+ Fix a server error that triggered a separate error and caused further issues
+ Add better error printing
+ Add better request validation
+ Fix HousingStartV2
+ Fix Lily's housing/get handler
## 20231107
### CXB
+ Hotfix `render_POST` sometimes failing to read the request body on large requests
## 20231106
### CXB
+ Hotfix `render_POST` function signature
+ Hotfix `handle_action_addenergy_request` hard failing if `get_energy` returns None
## 20231015
### maimai DX
+ Added support for FESTiVAL PLUS
### Card Maker
+ Added support for maimai DX FESTiVAL PLUS
## 20230716
### General
+ Docker files added (#19)
+ Added support for threading
+ This comes with the caveat that enabling it will prevent Ctrl + C from stopping the server.
### Webui
+ Small improvements
+ Add card display
### Allnet
+ Billing format validation
+ Fix naomitest.html endpoint
+ Add event logging for auths and billing
+ LoaderStateRecorder endpoint handler added
### Mucha
+ Fixed log level always being "Info"
+ Add stub handler for DownloadState
### Sword Art Online
+ Support added
### Crossbeats
+ Added threading to profile loading
+ This should cause a noticeable speed-up
### Card Maker
+ DX Passes fixed
+ Various improvements
### Diva
+ Added clear status calculation
+ Various minor fixes and improvements
### Maimai
+ Added support for memorial photo uploads
+ Added support for the following versions
+ Festival
+ FiNALE
+ Various bug fixes and improvements
### Wacca
+ Fixed an error that sometimes occurred when trying to unlock songs (#22)
### Pokken
+ Profile saving added (loading TBA)
+ Use external STUN server for matching by default
+ Matching still not working
## 2023042300
### Wacca
+ Time free now works properly
+ Fix reverse gate mission causing a fatal error
+ Other misc. fixes
+ Latest DB: 5
### Pokken
+ Added preliminary support
+ Nothing saves currently, but the game will boot and function properly.
### Initial D Zero
+ Added preliminary support
+ Nothing saves currently, but the game will boot and function for the most part.
### Mai2
+ Added support for Festival
+ Latest DB Version: 4
### Ongeki
+ Misc fixes
+ Latest DB Version: 4
### Diva
+ Misc fixes
+ Latest DB Version: 4
### Chuni
+ Fix network encryption
+ Add `handle_remove_token_api_request` for event mode
### Allnet
+ Added download order support
+ It is up to the sysop to provide the INI file, and host the files.
+ ONLY for use with cabs. This is not currently enforced, which is why it is disabled by default
+ YMMV, use at your own risk
+ When running in develop mode, games that are not recognised will still be able to authenticate.
### Database
+ Add autoupgrade command
+ Invoke to automatically upgrade all schemas to their latest versions
+ `version` arg no longer required, leave it blank to update the game schema to latest if it isn't already
### Misc
+ Update example nginx config file

8
contributing.md Normal file

@ -0,0 +1,8 @@
# Contributing to ARTEMiS
If you would like to contribute to artemis, either by adding features, games, or fixing bugs, you can do so by forking the repo and submitting a pull request [here](https://gitea.tendokyu.moe/Hay1tsme/artemis/pulls). Please make sure, if you're submitting a PR for a game or game version, that you're following the n-0/y-1 guidelines, or it will be rejected.
## Adding games
Guide WIP
## Adding game versions
Guide WIP


@ -4,3 +4,4 @@ from core.aimedb import AimedbFactory
from core.title import TitleServlet
from core.utils import Utils
from core.mucha import MuchaServlet
from core.frontend import FrontendServlet


@ -0,0 +1,6 @@
from .base import ADBBaseRequest, ADBBaseResponse, ADBHeader, ADBHeaderException, PortalRegStatus, LogStatus, ADBStatus
from .base import CompanyCodes, ReaderFwVer, CMD_CODE_GOODBYE, HEADER_SIZE
from .lookup import ADBLookupRequest, ADBLookupResponse, ADBLookupExResponse
from .campaign import ADBCampaignClearRequest, ADBCampaignClearResponse, ADBCampaignResponse, ADBOldCampaignRequest, ADBOldCampaignResponse
from .felica import ADBFelicaLookupRequest, ADBFelicaLookupResponse, ADBFelicaLookup2Request, ADBFelicaLookup2Response
from .log import ADBLogExRequest, ADBLogRequest, ADBStatusLogRequest, ADBLogExResponse

170
core/adb_handlers/base.py Normal file

@ -0,0 +1,170 @@
import struct
from construct import Struct, Int16ul, Int32ul, PaddedString
from enum import Enum
import re
from typing import Union, Final
class LogStatus(Enum):
NONE = 0
START = 1
CONTINUE = 2
END = 3
OTHER = 4
class PortalRegStatus(Enum):
NO_REG = 0
PORTAL = 1
SEGA_ID = 2
class ADBStatus(Enum):
UNKNOWN = 0
GOOD = 1
BAD_AMIE_ID = 2
ALREADY_REG = 3
BAN_SYS_USER = 4
BAN_SYS = 5
BAN_USER = 6
BAN_GEN = 7
LOCK_SYS_USER = 8
LOCK_SYS = 9
LOCK_USER = 10
class CompanyCodes(Enum):
NONE = 0
SEGA = 1
BAMCO = 2
KONAMI = 3
TAITO = 4
class ReaderFwVer(Enum): # Newer readers use a single-byte value
NONE = 0
TN32_10 = 1
TN32_12 = 2
OTHER = 9
def __str__(self) -> str:
if self == self.TN32_10:
return "TN32MSEC003S F/W Ver1.0"
elif self == self.TN32_12:
return "TN32MSEC003S F/W Ver1.2"
elif self == self.NONE:
return "Not Specified"
elif self == self.OTHER:
return "Unknown/Other"
else:
raise ValueError(f"Bad ReaderFwVer value {self.value}")
@classmethod
def from_byte(self, byte: bytes) -> Union["ReaderFwVer", int]:
try:
i = int.from_bytes(byte, 'little')
try:
return ReaderFwVer(i)
except ValueError:
return i
except TypeError:
return 0
class ADBHeaderException(Exception):
pass
HEADER_SIZE: Final[int] = 0x20
CMD_CODE_GOODBYE: Final[int] = 0x66
# everything is LE
class ADBHeader:
def __init__(self, magic: int, protocol_ver: int, cmd: int, length: int, status: int, game_id: Union[str, bytes], store_id: int, keychip_id: Union[str, bytes]) -> None:
self.magic = magic # u16
self.protocol_ver = protocol_ver # u16
self.cmd = cmd # u16
self.length = length # u16
try:
self.status = ADBStatus(status) # u16
except ValueError as e:
raise ADBHeaderException(f"Status is incorrect! {e}")
self.game_id = game_id # 4 char + \x00
self.store_id = store_id # u32
self.keychip_id = keychip_id# 11 char + \x00
if type(self.game_id) == bytes:
self.game_id = self.game_id.decode()
if type(self.keychip_id) == bytes:
self.keychip_id = self.keychip_id.decode()
self.game_id = self.game_id.replace("\0", "")
self.keychip_id = self.keychip_id.replace("\0", "")
if self.cmd != CMD_CODE_GOODBYE: # Games for some reason send no data with goodbye
self.validate()
@classmethod
def from_data(cls, data: bytes) -> "ADBHeader":
magic, protocol_ver, cmd, length, status, game_id, store_id, keychip_id = struct.unpack_from("<5H6sI12s", data)
head = cls(magic, protocol_ver, cmd, length, status, game_id, store_id, keychip_id)
if head.length != len(data):
raise ADBHeaderException(f"Length is incorrect! Expect {head.length}, got {len(data)}")
return head
def validate(self) -> bool:
if self.magic != 0xa13e:
raise ADBHeaderException(f"Magic {self.magic} != 0xa13e")
if self.protocol_ver < 0x1000:
raise ADBHeaderException(f"Protocol version {hex(self.protocol_ver)} is invalid!")
if re.fullmatch(r"^S[0-9A-Z]{3}[P]?$", self.game_id) is None:
raise ADBHeaderException(f"Game ID {self.game_id} is invalid!")
if self.store_id == 0:
raise ADBHeaderException(f"Store ID cannot be 0!")
if re.fullmatch(r"^A[0-9]{2}[E|X][0-9]{2}[A-HJ-NP-Z][0-9]{4}$", self.keychip_id) is None:
raise ADBHeaderException(f"Keychip ID {self.keychip_id} is invalid!")
return True
def make(self) -> bytes:
resp_struct = Struct(
"magic" / Int16ul,
"unknown" / Int16ul,
"response_code" / Int16ul,
"length" / Int16ul,
"status" / Int16ul,
"game_id" / PaddedString(6, 'utf_8'),
"store_id" / Int32ul,
"keychip_id" / PaddedString(12, 'utf_8'),
)
return resp_struct.build(dict(
magic=self.magic,
unknown=self.protocol_ver,
response_code=self.cmd,
length=self.length,
status=self.status.value,
game_id = self.game_id,
store_id = self.store_id,
keychip_id = self.keychip_id,
))
class ADBBaseRequest:
def __init__(self, data: bytes) -> None:
self.head = ADBHeader.from_data(data)
class ADBBaseResponse:
def __init__(self, code: int = 0, length: int = 0x20, status: int = 1, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", protocol_ver: int = 0x3087) -> None:
self.head = ADBHeader(0xa13e, protocol_ver, code, length, status, game_id, store_id, keychip_id)
@classmethod
def from_req(cls, req: ADBHeader, cmd: int, length: int = 0x20, status: int = 1) -> "ADBBaseResponse":
return cls(cmd, length, status, req.game_id, req.store_id, req.keychip_id, req.protocol_ver)
def append_padding(self, data: bytes):
"""Appends 0s to the end of the data until it's at the correct size"""
padding_size = self.head.length - len(data)
data += bytes(padding_size)
return data
def make(self) -> bytes:
return self.head.make()
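
For reference, a minimal standalone sketch of the 0x20-byte ADB header that `ADBHeader.from_data` parses above. The field values are the illustrative defaults used by `ADBBaseResponse` (game ID `SXXX`, keychip `A69E01A8888`), not real traffic.

import struct

ADB_HEADER_FMT = "<5H6sI12s"  # magic, protocol_ver, cmd, length, status, game_id, store_id, keychip_id

# Pack an example header; 0x0B is the campaign request command in the Aimedb dispatch table.
example = struct.pack(
    ADB_HEADER_FMT,
    0xa13e,          # magic
    0x3087,          # protocol version
    0x0B,            # command code
    0x20,            # total packet length (header only here)
    1,               # status: ADBStatus.GOOD
    b"SXXX",         # 4-char game ID, NUL-padded to 6 bytes by struct
    1,               # store ID
    b"A69E01A8888",  # 11-char keychip ID, NUL-padded to 12 bytes
)
assert len(example) == 0x20  # HEADER_SIZE

magic, ver, cmd, length, status, game_id, store_id, keychip_id = struct.unpack_from(ADB_HEADER_FMT, example)
assert magic == 0xa13e and game_id.rstrip(b"\x00").decode() == "SXXX"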


@ -0,0 +1,132 @@
from construct import Struct, Int16ul, Padding, Bytes, Int32ul, Int32sl
from .base import *
class Campaign:
def __init__(self) -> None:
self.id = 0
self.name = ""
self.announce_date = 0
self.start_date = 0
self.end_date = 0
self.distrib_start_date = 0
self.distrib_end_date = 0
def make(self) -> bytes:
name_padding = bytes(128 - len(self.name))
return Struct(
"id" / Int32ul,
"name" / Bytes(128),
"announce_date" / Int32ul,
"start_date" / Int32ul,
"end_date" / Int32ul,
"distrib_start_date" / Int32ul,
"distrib_end_date" / Int32ul,
Padding(8),
).build(dict(
id = self.id,
name = self.name.encode() + name_padding,
announce_date = self.announce_date,
start_date = self.start_date,
end_date = self.end_date,
distrib_start_date = self.distrib_start_date,
distrib_end_date = self.distrib_end_date,
))
class CampaignClear:
def __init__(self) -> None:
self.id = 0
self.entry_flag = 0
self.clear_flag = 0
def make(self) -> bytes:
return Struct(
"id" / Int32ul,
"entry_flag" / Int32ul,
"clear_flag" / Int32ul,
Padding(4),
).build(dict(
id = self.id,
entry_flag = self.entry_flag,
clear_flag = self.clear_flag,
))
class ADBCampaignResponse(ADBBaseResponse):
def __init__(self, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x0C, length: int = 0x200, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.campaigns = [Campaign(), Campaign(), Campaign()]
@classmethod
def from_req(cls, req: ADBHeader) -> "ADBCampaignResponse":
c = cls(req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self) -> bytes:
body = b""
for c in self.campaigns:
body += c.make()
self.head.length = HEADER_SIZE + len(body)
return self.head.make() + body
class ADBOldCampaignRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.campaign_id = struct.unpack_from("<I", data, 0x20)
class ADBOldCampaignResponse(ADBBaseResponse):
def __init__(self, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x0C, length: int = 0x30, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.info0 = 0
self.info1 = 0
self.info2 = 0
self.info3 = 0
@classmethod
def from_req(cls, req: ADBHeader) -> "ADBCampaignResponse":
c = cls(req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self) -> bytes:
resp_struct = Struct(
"info0" / Int32sl,
"info1" / Int32sl,
"info2" / Int32sl,
"info3" / Int32sl,
).build(
info0 = self.info0,
info1 = self.info1,
info2 = self.info2,
info3 = self.info3,
)
self.head.length = HEADER_SIZE + len(resp_struct)
return self.head.make() + resp_struct
class ADBCampaignClearRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.aime_id = struct.unpack_from("<i", data, 0x20)
class ADBCampaignClearResponse(ADBBaseResponse):
def __init__(self, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x0E, length: int = 0x50, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.campaign_clear_status = [CampaignClear(), CampaignClear(), CampaignClear()]
@classmethod
def from_req(cls, req: ADBHeader) -> "ADBCampaignResponse":
c = cls(req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self) -> bytes:
body = b""
for c in self.campaign_clear_status:
body += c.make()
self.head.length = HEADER_SIZE + len(body)
return self.head.make() + body
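
As a size check on the structures above (a sketch, not part of the diff): each record built by `Campaign.make` is 0xA0 bytes, so three campaigns plus the 0x20-byte header account exactly for the 0x200 length declared by `ADBCampaignResponse`.

from construct import Struct, Int32ul, Bytes, Padding

campaign_struct = Struct(
    "id" / Int32ul,
    "name" / Bytes(128),            # NUL-padded display name
    "announce_date" / Int32ul,
    "start_date" / Int32ul,
    "end_date" / Int32ul,
    "distrib_start_date" / Int32ul,
    "distrib_end_date" / Int32ul,
    Padding(8),
)

record = campaign_struct.build(dict(
    id=0,
    name=b"\x00" * 128,
    announce_date=0,
    start_date=0,
    end_date=0,
    distrib_start_date=0,
    distrib_end_date=0,
))
assert len(record) == 0xA0
assert 0x20 + 3 * len(record) == 0x200  # header + three empty campaigns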


@ -0,0 +1,85 @@
from construct import Struct, Int32sl, Padding, Int8ub, Int16sl
from typing import Union
from .base import *
class ADBFelicaLookupRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
idm, pmm = struct.unpack_from(">QQ", data, 0x20)
self.idm = hex(idm)[2:].upper()
self.pmm = hex(pmm)[2:].upper()
class ADBFelicaLookupResponse(ADBBaseResponse):
def __init__(self, access_code: str = None, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x03, length: int = 0x30, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.access_code = access_code if access_code is not None else "00000000000000000000"
@classmethod
def from_req(cls, req: ADBHeader, access_code: str = None) -> "ADBFelicaLookupResponse":
c = cls(access_code, req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self) -> bytes:
resp_struct = Struct(
"felica_idx" / Int32ul,
"access_code" / Int8ub[10],
Padding(2)
).build(dict(
felica_idx = 0,
access_code = bytes.fromhex(self.access_code)
))
self.head.length = HEADER_SIZE + len(resp_struct)
return self.head.make() + resp_struct
class ADBFelicaLookup2Request(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.random = struct.unpack_from("<16s", data, 0x20)[0]
idm, pmm = struct.unpack_from(">QQ", data, 0x30)
self.card_key_ver, self.write_ct, self.maca, company, fw_ver, self.dfc = struct.unpack_from("<16s16sQccH", data, 0x40)
self.idm = hex(idm)[2:].upper()
self.pmm = hex(pmm)[2:].upper()
self.company = CompanyCodes(int.from_bytes(company, 'little'))
self.fw_ver = ReaderFwVer.from_byte(fw_ver)
class ADBFelicaLookup2Response(ADBBaseResponse):
def __init__(self, user_id: Union[int, None] = None, access_code: Union[str, None] = None, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x12, length: int = 0x130, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.user_id = user_id if user_id is not None else -1
self.access_code = access_code if access_code is not None else "00000000000000000000"
self.company = CompanyCodes.SEGA
self.portal_status = PortalRegStatus.NO_REG
self.auth_key = [0] * 256
@classmethod
def from_req(cls, req: ADBHeader, user_id: Union[int, None] = None, access_code: Union[str, None] = None) -> "ADBFelicaLookup2Response":
c = cls(user_id, access_code, req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self) -> bytes:
resp_struct = Struct(
"user_id" / Int32sl,
"relation1" / Int32sl,
"relation2" / Int32sl,
"access_code" / Int8ub[10],
"portal_status" / Int8ub,
"company_code" / Int8ub,
Padding(8),
"auth_key" / Int8ub[256],
).build(dict(
user_id = self.user_id,
relation1 = -1, # Unsupported
relation2 = -1, # Unsupported
access_code = bytes.fromhex(self.access_code),
portal_status = self.portal_status.value,
company_code = self.company.value,
auth_key = self.auth_key
))
self.head.length = HEADER_SIZE + len(resp_struct)
return self.head.make() + resp_struct

56
core/adb_handlers/log.py Normal file

@ -0,0 +1,56 @@
from construct import Struct, Padding, Int8sl
from typing import Final, List
from .base import *
NUM_LOGS: Final[int] = 20
NUM_LEN_LOG_EX: Final[int] = 48
class AmLogEx:
def __init__(self, data: bytes) -> None:
self.aime_id, status, self.user_id, self.credit_ct, self.bet_ct, self.won_ct, self.local_time, \
self.tseq, self.place_id = struct.unpack("<IIQiii4xQiI", data)
self.status = LogStatus(status)
class ADBStatusLogRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.aime_id, status = struct.unpack_from("<II", data, 0x20)
self.status = LogStatus(status)
class ADBLogRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.aime_id, status, self.user_id, self.credit_ct, self.bet_ct, self.won_ct = struct.unpack_from("<IIQiii", data, 0x20)
self.status = LogStatus(status)
class ADBLogExRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.logs: List[AmLogEx] = []
for x in range(NUM_LOGS):
self.logs.append(AmLogEx(data[0x20 + (NUM_LEN_LOG_EX * x): 0x50 + (NUM_LEN_LOG_EX * x)]))
self.num_logs = struct.unpack_from("<I", data, 0x03E0)[0]
class ADBLogExResponse(ADBBaseResponse):
def __init__(self, game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", protocol_ver: int = 12423, code: int = 20, length: int = 64, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id, protocol_ver)
@classmethod
def from_req(cls, req: ADBHeader) -> "ADBLogExResponse":
c = cls(req.game_id, req.store_id, req.keychip_id, req.protocol_ver)
return c
def make(self) -> bytes:
resp_struct = Struct(
"log_result" / Int8sl[NUM_LOGS],
Padding(12)
)
body = resp_struct.build(dict(
log_result = [1] * NUM_LOGS
))
self.head.length = HEADER_SIZE + len(body)
return self.head.make() + body
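
A quick sketch of the aime_log_ex payload layout implied above (values are made up): each record is NUM_LEN_LOG_EX = 48 bytes, NUM_LOGS = 20 slots follow the 0x20-byte header, and the record count therefore lands at offset 0x3E0.

import struct

LOG_EX_FMT = "<IIQiii4xQiI"  # same format AmLogEx unpacks
assert struct.calcsize(LOG_EX_FMT) == 48

# aime_id, status, user_id, credit_ct, bet_ct, won_ct, local_time, tseq, place_id
record = struct.pack(LOG_EX_FMT, 12345, 1, 10001, 1, 0, 0, 1700000000, 1, 1)
assert len(record) == 48

# 0x20-byte header + 20 fixed-size slots puts the log count right at 0x3E0
assert 0x20 + 20 * 48 == 0x03E0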


@ -0,0 +1,82 @@
from construct import Struct, Int32sl, Padding, Int8sl
from typing import Union
from .base import *
class ADBLookupException(Exception):
pass
class ADBLookupRequest(ADBBaseRequest):
def __init__(self, data: bytes) -> None:
super().__init__(data)
self.access_code = data[0x20:0x2A].hex()
company_code, fw_version, self.serial_number = struct.unpack_from("<bbI", data, 0x2A)
try:
self.company_code = CompanyCodes(company_code)
except ValueError as e:
raise ADBLookupException(f"Invalid company code - {e}")
self.fw_version = ReaderFwVer.from_byte(fw_version)
class ADBLookupResponse(ADBBaseResponse):
def __init__(self, user_id: Union[int, None], game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888", code: int = 0x06, length: int = 0x30, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.user_id = user_id if user_id is not None else -1
self.portal_reg = PortalRegStatus.NO_REG
@classmethod
def from_req(cls, req: ADBHeader, user_id: Union[int, None]) -> "ADBLookupResponse":
c = cls(user_id, req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self):
resp_struct = Struct(
"user_id" / Int32sl,
"portal_reg" / Int8sl,
Padding(11)
)
body = resp_struct.build(dict(
user_id = self.user_id,
portal_reg = self.portal_reg.value
))
self.head.length = HEADER_SIZE + len(body)
return self.head.make() + body
class ADBLookupExResponse(ADBBaseResponse):
def __init__(self, user_id: Union[int, None], game_id: str = "SXXX", store_id: int = 1, keychip_id: str = "A69E01A8888",
code: int = 0x10, length: int = 0x130, status: int = 1) -> None:
super().__init__(code, length, status, game_id, store_id, keychip_id)
self.user_id = user_id if user_id is not None else -1
self.portal_reg = PortalRegStatus.NO_REG
self.auth_key = [0] * 256
@classmethod
def from_req(cls, req: ADBHeader, user_id: Union[int, None]) -> "ADBLookupExResponse":
c = cls(user_id, req.game_id, req.store_id, req.keychip_id)
c.head.protocol_ver = req.protocol_ver
return c
def make(self):
resp_struct = Struct(
"user_id" / Int32sl,
"portal_reg" / Int8sl,
Padding(3),
"auth_key" / Int8sl[256],
"relation1" / Int32sl,
"relation2" / Int32sl,
)
body = resp_struct.build(dict(
user_id = self.user_id,
portal_reg = self.portal_reg.value,
auth_key = self.auth_key,
relation1 = -1,
relation2 = -1
))
self.head.length = HEADER_SIZE + len(body)
return self.head.make() + body


@ -2,26 +2,18 @@ from twisted.internet.protocol import Factory, Protocol
import logging, coloredlogs
from Crypto.Cipher import AES
import struct
from typing import Dict, Any
from typing import Dict, Tuple, Callable, Union
from typing_extensions import Final
from logging.handlers import TimedRotatingFileHandler
from core.config import CoreConfig
from core.utils import create_sega_auth_key
from core.data import Data
from .adb_handlers import *
class AimedbProtocol(Protocol):
AIMEDB_RESPONSE_CODES = {
"felica_lookup": 0x03,
"lookup": 0x06,
"log": 0x0a,
"campaign": 0x0c,
"touch": 0x0e,
"lookup2": 0x10,
"felica_lookup2": 0x12,
"log2": 0x14,
"hello": 0x65
}
request_list: Dict[int, Any] = {}
request_list: Dict[int, Tuple[Callable[[bytes, int], Union[ADBBaseResponse, bytes]], int, str]] = {}
def __init__(self, core_cfg: CoreConfig) -> None:
self.logger = logging.getLogger("aimedb")
@ -30,17 +22,28 @@ class AimedbProtocol(Protocol):
if core_cfg.aimedb.key == "":
self.logger.error("!!!KEY NOT SET!!!")
exit(1)
self.request_list[0x01] = self.handle_felica_lookup
self.request_list[0x04] = self.handle_lookup
self.request_list[0x05] = self.handle_register
self.request_list[0x09] = self.handle_log
self.request_list[0x0b] = self.handle_campaign
self.request_list[0x0d] = self.handle_touch
self.request_list[0x0f] = self.handle_lookup2
self.request_list[0x11] = self.handle_felica_lookup2
self.request_list[0x13] = self.handle_log2
self.request_list[0x64] = self.handle_hello
self.register_handler(0x01, 0x03, self.handle_felica_lookup, 'felica_lookup')
self.register_handler(0x02, 0x03, self.handle_felica_register, 'felica_register')
self.register_handler(0x04, 0x06, self.handle_lookup, 'lookup')
self.register_handler(0x05, 0x06, self.handle_register, 'register')
self.register_handler(0x07, 0x08, self.handle_status_log, 'status_log')
self.register_handler(0x09, 0x0A, self.handle_log, 'aime_log')
self.register_handler(0x0B, 0x0C, self.handle_campaign, 'campaign')
self.register_handler(0x0D, 0x0E, self.handle_campaign_clear, 'campaign_clear')
self.register_handler(0x0F, 0x10, self.handle_lookup_ex, 'lookup_ex')
self.register_handler(0x11, 0x12, self.handle_felica_lookup_ex, 'felica_lookup_ex')
self.register_handler(0x13, 0x14, self.handle_log_ex, 'aime_log_ex')
self.register_handler(0x64, 0x65, self.handle_hello, 'hello')
self.register_handler(0x66, 0, self.handle_goodbye, 'goodbye')
def register_handler(self, cmd: int, resp:int, handler: Callable[[bytes, int], Union[ADBBaseResponse, bytes]], name: str) -> None:
self.request_list[cmd] = (handler, resp, name)
def append_padding(self, data: bytes):
"""Appends 0s to the end of the data until it's at the correct size"""
@ -53,182 +56,311 @@ class AimedbProtocol(Protocol):
self.logger.debug(f"{self.transport.getPeer().host} Connected")
def connectionLost(self, reason) -> None:
self.logger.debug(f"{self.transport.getPeer().host} Disconnected - {reason.value}")
self.logger.debug(
f"{self.transport.getPeer().host} Disconnected - {reason.value}"
)
def dataReceived(self, data: bytes) -> None:
cipher = AES.new(self.config.aimedb.key.encode(), AES.MODE_ECB)
try:
decrypted = cipher.decrypt(data)
except:
self.logger.error(f"Failed to decrypt {data.hex()}")
except Exception as e:
self.logger.error(f"Failed to decrypt {data.hex()} because {e}")
return None
self.logger.debug(f"{self.transport.getPeer().host} wrote {decrypted.hex()}")
if not decrypted[1] == 0xa1 and not decrypted[0] == 0x3e:
self.logger.error(f"Bad magic")
return None
try:
head = ADBHeader.from_data(decrypted)
except ADBHeaderException as e:
self.logger.error(f"Error parsing ADB header: {e}")
try:
encrypted = cipher.encrypt(ADBBaseResponse().make())
self.transport.write(encrypted)
req_code = decrypted[4]
if req_code == 0x66:
self.logger.info(f"goodbye from {self.transport.getPeer().host}")
self.transport.loseConnection()
except Exception as e:
self.logger.error(f"Failed to encrypt default response because {e}")
return
try:
resp = self.request_list[req_code](decrypted)
encrypted = cipher.encrypt(resp)
self.logger.debug(f"Response {resp.hex()}")
self.transport.write(encrypted)
if head.keychip_id == "ABCD1234567" or head.store_id == 0xfff0:
self.logger.warning(f"Request from uninitialized AMLib: {vars(head)}")
except KeyError:
self.logger.error(f"Unknown command code {hex(req_code)}")
return None
handler, resp_code, name = self.request_list.get(head.cmd, (self.handle_default, None, 'default'))
except ValueError as e:
self.logger.error(f"Failed to encrypt {resp.hex()} because {e}")
return None
def handle_campaign(self, data: bytes) -> bytes:
self.logger.info(f"campaign from {self.transport.getPeer().host}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["campaign"], 0x0200, 0x0001)
return self.append_padding(ret)
def handle_hello(self, data: bytes) -> bytes:
self.logger.info(f"hello from {self.transport.getPeer().host}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["hello"], 0x0020, 0x0001)
return self.append_padding(ret)
def handle_lookup(self, data: bytes) -> bytes:
luid = data[0x20: 0x2a].hex()
user_id = self.data.card.get_user_id_from_card(access_code=luid)
if user_id is None: user_id = -1
self.logger.info(f"lookup from {self.transport.getPeer().host}: luid {luid} -> user_id {user_id}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["lookup"], 0x0130, 0x0001)
ret += bytes(0x20 - len(ret))
if user_id is None: ret += struct.pack("<iH", -1, 0)
else: ret += struct.pack("<l", user_id)
return self.append_padding(ret)
def handle_lookup2(self, data: bytes) -> bytes:
self.logger.info(f"lookup2")
ret = bytearray(self.handle_lookup(data))
ret[4] = self.AIMEDB_RESPONSE_CODES["lookup2"]
return bytes(ret)
def handle_felica_lookup(self, data: bytes) -> bytes:
idm = data[0x20: 0x28].hex()
pmm = data[0x28: 0x30].hex()
access_code = self.data.card.to_access_code(idm)
self.logger.info(f"felica_lookup from {self.transport.getPeer().host}: idm {idm} pmm {pmm} -> access_code {access_code}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["felica_lookup"], 0x0030, 0x0001)
ret += bytes(26)
ret += bytes.fromhex(access_code)
return self.append_padding(ret)
def handle_felica_lookup2(self, data: bytes) -> bytes:
idm = data[0x30: 0x38].hex()
pmm = data[0x38: 0x40].hex()
access_code = self.data.card.to_access_code(idm)
user_id = self.data.card.get_user_id_from_card(access_code=access_code)
if user_id is None: user_id = -1
self.logger.info(f"felica_lookup2 from {self.transport.getPeer().host}: idm {idm} ipm {pmm} -> access_code {access_code} user_id {user_id}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["felica_lookup2"], 0x0140, 0x0001)
ret += bytes(22)
ret += struct.pack("<lq", user_id, -1) # first -1 is ext_id, 3rd is access code
ret += bytes.fromhex(access_code)
ret += struct.pack("<l", 1)
if resp_code is None:
self.logger.warning(f"No handler for cmd {hex(head.cmd)}")
return self.append_padding(ret)
def handle_touch(self, data: bytes) -> bytes:
self.logger.info(f"touch from {self.transport.getPeer().host}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["touch"], 0x0050, 0x0001)
ret += bytes(5)
ret += struct.pack("<3H", 0x6f, 0, 1)
elif resp_code > 0:
self.logger.info(f"{name} from {head.keychip_id} ({head.game_id}) @ {self.transport.getPeer().host}")
resp = handler(decrypted, resp_code)
return self.append_padding(ret)
if type(resp) == ADBBaseResponse or issubclass(type(resp), ADBBaseResponse):
resp_bytes = resp.make()
if len(resp_bytes) != resp.head.length:
resp_bytes = self.append_padding(resp_bytes)
def handle_register(self, data: bytes) -> bytes:
luid = data[0x20: 0x2a].hex()
if self.config.server.allow_registration:
user_id = self.data.user.create_user()
if user_id is None:
user_id = -1
self.logger.error("Failed to register user!")
else:
card_id = self.data.card.create_card(user_id, luid)
if card_id is None:
user_id = -1
self.logger.error("Failed to register card!")
self.logger.info(f"register from {self.transport.getPeer().host}: luid {luid} -> user_id {user_id}")
elif type(resp) == bytes:
resp_bytes = resp
elif resp is None: # Nothing to send, probably a goodbye
return
else:
self.logger.info(f"register from {self.transport.getPeer().host} blocked!: luid {luid}")
raise TypeError(f"Unsupported type returned by ADB handler for {name}: {type(resp)}")
try:
encrypted = cipher.encrypt(resp_bytes)
self.logger.debug(f"Response {resp_bytes.hex()}")
self.transport.write(encrypted)
except Exception as e:
self.logger.error(f"Failed to encrypt {resp_bytes.hex()} because {e}")
def handle_default(self, data: bytes, resp_code: int, length: int = 0x20) -> ADBBaseResponse:
req = ADBHeader.from_data(data)
return ADBBaseResponse(resp_code, length, 1, req.game_id, req.store_id, req.keychip_id, req.protocol_ver)
def handle_hello(self, data: bytes, resp_code: int) -> ADBBaseResponse:
return self.handle_default(data, resp_code)
def handle_campaign(self, data: bytes, resp_code: int) -> ADBBaseResponse:
h = ADBHeader.from_data(data)
if h.protocol_ver >= 0x3030:
req = h
resp = ADBCampaignResponse.from_req(req)
else:
req = ADBOldCampaignRequest(data)
self.logger.info(f"Legacy campaign request for campaign {req.campaign_id} (protocol version {hex(h.protocol_ver)})")
resp = ADBOldCampaignResponse.from_req(req.head)
# We don't currently support campaigns
return resp
def handle_lookup(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBLookupRequest(data)
user_id = self.data.card.get_user_id_from_card(req.access_code)
is_banned = self.data.card.get_card_banned(req.access_code)
is_locked = self.data.card.get_card_locked(req.access_code)
ret = ADBLookupResponse.from_req(req.head, user_id)
if is_banned and is_locked:
ret.head.status = ADBStatus.BAN_SYS_USER
elif is_banned:
ret.head.status = ADBStatus.BAN_SYS
elif is_locked:
ret.head.status = ADBStatus.LOCK_USER
self.logger.info(
f"access_code {req.access_code} -> user_id {ret.user_id}"
)
return ret
def handle_lookup_ex(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBLookupRequest(data)
user_id = self.data.card.get_user_id_from_card(req.access_code)
is_banned = self.data.card.get_card_banned(req.access_code)
is_locked = self.data.card.get_card_locked(req.access_code)
ret = ADBLookupExResponse.from_req(req.head, user_id)
if is_banned and is_locked:
ret.head.status = ADBStatus.BAN_SYS_USER
elif is_banned:
ret.head.status = ADBStatus.BAN_SYS
elif is_locked:
ret.head.status = ADBStatus.LOCK_USER
self.logger.info(
f"access_code {req.access_code} -> user_id {ret.user_id}"
)
if user_id and user_id > 0 and self.config.aimedb.id_secret:
auth_key = create_sega_auth_key(user_id, req.head.game_id, req.head.store_id, req.head.keychip_id, self.config.aimedb.id_secret, self.config.aimedb.id_lifetime_seconds)
if auth_key is not None:
auth_key_extra_len = 256 - len(auth_key)
auth_key_full = auth_key.encode() + (b"\0" * auth_key_extra_len)
self.logger.debug(f"Generated auth token {auth_key}")
ret.auth_key = auth_key_full
return ret
def handle_felica_lookup(self, data: bytes, resp_code: int) -> bytes:
"""
On official, I think a card has to be registered for this to actually work, but
I'm making the executive decision to not implement that and just kick back our
faux generated access code. The real felica IDm -> access code conversion is done
on the ADB server, which we do not and will not ever have access to. Because we can
ensure that all IDms will be unique, this basic 0-padded hex -> int conversion will
be fine.
"""
req = ADBFelicaLookupRequest(data)
ac = self.data.card.to_access_code(req.idm)
self.logger.info(
f"idm {req.idm} ipm {req.pmm} -> access_code {ac}"
)
return ADBFelicaLookupResponse.from_req(req.head, ac)
def handle_felica_register(self, data: bytes, resp_code: int) -> bytes:
"""
I've never seen this used.
"""
req = ADBFelicaLookupRequest(data)
ac = self.data.card.to_access_code(req.idm)
if self.config.server.allow_user_registration:
user_id = self.data.user.create_user()
if user_id is None:
self.logger.error("Failed to register user!")
user_id = -1
else:
card_id = self.data.card.create_card(user_id, ac)
if card_id is None:
self.logger.error("Failed to register card!")
user_id = -1
self.logger.info(
f"Register access code {ac} (IDm: {req.idm} PMm: {req.pmm}) -> user_id {user_id}"
)
else:
self.logger.info(
f"Registration blocked!: access code {ac} (IDm: {req.idm} PMm: {req.pmm})"
)
return ADBFelicaLookupResponse.from_req(req.head, ac)
def handle_felica_lookup_ex(self, data: bytes, resp_code: int) -> bytes:
req = ADBFelicaLookup2Request(data)
access_code = self.data.card.to_access_code(req.idm)
user_id = self.data.card.get_user_id_from_card(access_code=access_code)
if user_id is None:
user_id = -1
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["lookup"], 0x0030, 0x0001 if user_id > -1 else 0)
ret += bytes(0x20 - len(ret))
ret += struct.pack("<l", user_id)
self.logger.info(
f"idm {req.idm} ipm {req.pmm} -> access_code {access_code} user_id {user_id}"
)
return self.append_padding(ret)
resp = ADBFelicaLookup2Response.from_req(req.head, user_id, access_code)
def handle_log(self, data: bytes) -> bytes:
# TODO: Save aimedb logs
self.logger.info(f"log from {self.transport.getPeer().host}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["log"], 0x0020, 0x0001)
return self.append_padding(ret)
if user_id and user_id > 0 and self.config.aimedb.id_secret:
auth_key = create_sega_auth_key(user_id, req.head.game_id, req.head.store_id, req.head.keychip_id, self.config.aimedb.id_secret, self.config.aimedb.id_lifetime_seconds)
if auth_key is not None:
auth_key_extra_len = 256 - len(auth_key)
auth_key_full = auth_key.encode() + (b"\0" * auth_key_extra_len)
self.logger.debug(f"Generated auth token {auth_key}")
resp.auth_key = auth_key_full
def handle_log2(self, data: bytes) -> bytes:
self.logger.info(f"log2 from {self.transport.getPeer().host}")
ret = struct.pack("<5H", 0xa13e, 0x3087, self.AIMEDB_RESPONSE_CODES["log2"], 0x0040, 0x0001)
ret += bytes(22)
ret += struct.pack("H", 1)
return resp
return self.append_padding(ret)
def handle_campaign_clear(self, data: bytes, resp_code: int) -> ADBBaseResponse:
req = ADBCampaignClearRequest(data)
resp = ADBCampaignClearResponse.from_req(req.head)
# We don't support campaign stuff
return resp
def handle_register(self, data: bytes, resp_code: int) -> bytes:
req = ADBLookupRequest(data)
user_id = -1
if self.config.server.allow_user_registration:
user_id = self.data.user.create_user()
if user_id is None:
self.logger.error("Failed to register user!")
user_id = -1
else:
card_id = self.data.card.create_card(user_id, req.access_code)
if card_id is None:
self.logger.error("Failed to register card!")
user_id = -1
self.logger.info(
f"Register access code {req.access_code} -> user_id {user_id}"
)
else:
self.logger.info(
f"Registration blocked!: access code {req.access_code}"
)
resp = ADBLookupResponse.from_req(req.head, user_id)
if resp.user_id <= 0:
resp.head.status = ADBStatus.BAN_SYS # Closest we can get to a "You cannot register"
return resp
# TODO: Save these in some capacity, as deemed relevant
def handle_status_log(self, data: bytes, resp_code: int) -> bytes:
req = ADBStatusLogRequest(data)
self.logger.info(f"User {req.aime_id} logged {req.status.name} event")
return ADBBaseResponse(resp_code, 0x20, 1, req.head.game_id, req.head.store_id, req.head.keychip_id, req.head.protocol_ver)
def handle_log(self, data: bytes, resp_code: int) -> bytes:
req = ADBLogRequest(data)
self.logger.info(f"User {req.aime_id} logged {req.status.name} event, credit_ct: {req.credit_ct} bet_ct: {req.bet_ct} won_ct: {req.won_ct}")
return ADBBaseResponse(resp_code, 0x20, 1, req.head.game_id, req.head.store_id, req.head.keychip_id, req.head.protocol_ver)
def handle_log_ex(self, data: bytes, resp_code: int) -> bytes:
req = ADBLogExRequest(data)
strs = []
self.logger.info(f"Recieved {req.num_logs} or {len(req.logs)} logs")
for x in range(req.num_logs):
self.logger.debug(f"User {req.logs[x].aime_id} logged {req.logs[x].status.name} event, credit_ct: {req.logs[x].credit_ct} bet_ct: {req.logs[x].bet_ct} won_ct: {req.logs[x].won_ct}")
return ADBLogExResponse.from_req(req.head)
def handle_goodbye(self, data: bytes, resp_code: int) -> None:
self.logger.info(f"goodbye from {self.transport.getPeer().host}")
self.transport.loseConnection()
return
class AimedbFactory(Factory):
protocol = AimedbProtocol
def __init__(self, cfg: CoreConfig) -> None:
self.config = cfg
log_fmt_str = "[%(asctime)s] Aimedb | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
self.logger = logging.getLogger("aimedb")
fileHandler = TimedRotatingFileHandler("{0}/{1}.log".format(self.config.server.log_dir, "aimedb"), when="d", backupCount=10)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "aimedb"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(self.config.aimedb.loglevel)
coloredlogs.install(level=cfg.aimedb.loglevel, logger=self.logger, fmt=log_fmt_str)
coloredlogs.install(
level=cfg.aimedb.loglevel, logger=self.logger, fmt=log_fmt_str
)
if self.config.aimedb.key == "":
self.logger.error("Please set 'key' field in your config file.")
exit(1)
self.logger.info(f"Ready on port {self.config.aimedb.port}")
def buildProtocol(self, addr):
return AimedbProtocol(self.config)
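
The auth_key handling above relies on create_sega_auth_key from core.utils, whose body is not part of this diff; the changelog only states that it produces a JWT from the base64-encoded aimedb.id_secret with a lifetime of aimedb.id_lifetime_seconds. The following is a speculative sketch of such a helper using PyJWT; the claim names, algorithm, and function name are assumptions, not ARTEMiS's actual implementation.

import base64
import time
import jwt  # PyJWT

def sega_auth_key_sketch(user_id: int, game_id: str, store_id: int,
                         keychip_id: str, b64_secret: str, lifetime: int) -> str:
    now = int(time.time())
    payload = {          # assumed claim set
        "usr": user_id,
        "gid": game_id,
        "sid": store_id,
        "kid": keychip_id,
        "iat": now,
        "exp": now + lifetime,
    }
    secret = base64.b64decode(b64_secret)
    return jwt.encode(payload, secret, algorithm="HS256")

token = sega_auth_key_sketch(10001, "SXXX", 1, "A69E01A8888",
                             base64.b64encode(b"example-secret").decode(), 86400)
# The lookup/felica responses carry a fixed 256-byte field, so the token is NUL-padded, as above.
auth_key_full = token.encode() + b"\x00" * (256 - len(token))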

File diff suppressed because it is too large


@ -1,33 +1,71 @@
import logging, os
from typing import Any
class ServerConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def listen_address(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'server', 'listen_address', default='127.0.0.1')
return CoreConfig.get_config_field(
self.__config, "core", "server", "listen_address", default="127.0.0.1"
)
@property
def allow_user_registration(self) -> bool:
return CoreConfig.get_config_field(self.__config, 'core', 'server', 'allow_user_registration', default=True)
return CoreConfig.get_config_field(
self.__config, "core", "server", "allow_user_registration", default=True
)
@property
def allow_unregistered_serials(self) -> bool:
return CoreConfig.get_config_field(self.__config, 'core', 'server', 'allow_unregistered_serials', default=True)
return CoreConfig.get_config_field(
self.__config, "core", "server", "allow_unregistered_serials", default=True
)
@property
def name(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'server', 'name', default="ARTEMiS")
return CoreConfig.get_config_field(
self.__config, "core", "server", "name", default="ARTEMiS"
)
@property
def is_develop(self) -> bool:
return CoreConfig.get_config_field(self.__config, 'core', 'server', 'is_develop', default=True)
return CoreConfig.get_config_field(
self.__config, "core", "server", "is_develop", default=True
)
@property
def is_using_proxy(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "server", "is_using_proxy", default=False
)
@property
def threading(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "server", "threading", default=False
)
@property
def log_dir(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'server', 'log_dir', default='logs')
return CoreConfig.get_config_field(
self.__config, "core", "server", "log_dir", default="logs"
)
@property
def check_arcade_ip(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "server", "check_arcade_ip", default=False
)
@property
def strict_ip_checking(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "server", "strict_ip_checking", default=False
)
class TitleConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@ -35,15 +73,54 @@ class TitleConfig:
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(CoreConfig.get_config_field(self.__config, 'core', 'title', 'loglevel', default="info"))
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "title", "loglevel", default="info"
)
)
@property
def hostname(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'title', 'hostname', default="localhost")
return CoreConfig.get_config_field(
self.__config, "core", "title", "hostname", default="localhost"
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'title', 'port', default=8080)
return CoreConfig.get_config_field(
self.__config, "core", "title", "port", default=8080
)
@property
def port_ssl(self) -> int:
return CoreConfig.get_config_field(
self.__config, "core", "title", "port_ssl", default=0
)
@property
def ssl_key(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "ssl_key", default="cert/title.key"
)
@property
def ssl_cert(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "ssl_cert", default="cert/title.pem"
)
@property
def reboot_start_time(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "reboot_start_time", default=""
)
@property
def reboot_end_time(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "title", "reboot_end_time", default=""
)
class DatabaseConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@ -51,43 +128,76 @@ class DatabaseConfig:
@property
def host(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'host', default="localhost")
return CoreConfig.get_config_field(
self.__config, "core", "database", "host", default="localhost"
)
@property
def username(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'username', default='aime')
return CoreConfig.get_config_field(
self.__config, "core", "database", "username", default="aime"
)
@property
def password(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'password', default='aime')
return CoreConfig.get_config_field(
self.__config, "core", "database", "password", default="aime"
)
@property
def name(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'name', default='aime')
return CoreConfig.get_config_field(
self.__config, "core", "database", "name", default="aime"
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'port', default=3306)
return CoreConfig.get_config_field(
self.__config, "core", "database", "port", default=3306
)
@property
def protocol(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'type', default="mysql")
return CoreConfig.get_config_field(
self.__config, "core", "database", "type", default="mysql"
)
@property
def sha2_password(self) -> bool:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'sha2_password', default=False)
return CoreConfig.get_config_field(
self.__config, "core", "database", "sha2_password", default=False
)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(CoreConfig.get_config_field(self.__config, 'core', 'database', 'loglevel', default="info"))
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "database", "loglevel", default="info"
)
)
@property
def user_table_autoincrement_start(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'user_table_autoincrement_start', default=10000)
return CoreConfig.get_config_field(
self.__config,
"core",
"database",
"user_table_autoincrement_start",
default=10000,
)
@property
def enable_memcached(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "database", "enable_memcached", default=True
)
@property
def memcached_host(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'database', 'memcached_host', default="localhost")
return CoreConfig.get_config_field(
self.__config, "core", "database", "memcached_host", default="localhost"
)
class FrontendConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@ -95,15 +205,24 @@ class FrontendConfig:
@property
def enable(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'frontend', 'enable', default=False)
return CoreConfig.get_config_field(
self.__config, "core", "frontend", "enable", default=False
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'frontend', 'port', default=8090)
return CoreConfig.get_config_field(
self.__config, "core", "frontend", "port", default=8090
)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(CoreConfig.get_config_field(self.__config, 'core', 'frontend', 'loglevel', default="info"))
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "frontend", "loglevel", default="info"
)
)
class AllnetConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@ -111,15 +230,36 @@ class AllnetConfig:
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(CoreConfig.get_config_field(self.__config, 'core', 'allnet', 'loglevel', default="info"))
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "allnet", "loglevel", default="info"
)
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'allnet', 'port', default=80)
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "port", default=80
)
@property
def ip_check(self) -> bool:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "ip_check", default=False
)
@property
def allow_online_updates(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'allnet', 'allow_online_updates', default=False)
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "allow_online_updates", default=False
)
@property
def update_cfg_folder(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "allnet", "update_cfg_folder", default=""
)
class BillingConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@ -127,35 +267,65 @@ class BillingConfig:
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'billing', 'port', default=8443)
return CoreConfig.get_config_field(
self.__config, "core", "billing", "port", default=8443
)
@property
def ssl_key(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'billing', 'ssl_key', default="cert/server.key")
return CoreConfig.get_config_field(
self.__config, "core", "billing", "ssl_key", default="cert/server.key"
)
@property
def ssl_cert(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'billing', 'ssl_cert', default="cert/server.pem")
return CoreConfig.get_config_field(
self.__config, "core", "billing", "ssl_cert", default="cert/server.pem"
)
@property
def signing_key(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'billing', 'signing_key', default="cert/billing.key")
return CoreConfig.get_config_field(
self.__config, "core", "billing", "signing_key", default="cert/billing.key"
)
class AimedbConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
self.__config = parent_config
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(CoreConfig.get_config_field(self.__config, 'core', 'aimedb', 'loglevel', default="info"))
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "aimedb", "loglevel", default="info"
)
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'aimedb', 'port', default=22345)
return CoreConfig.get_config_field(
self.__config, "core", "aimedb", "port", default=22345
)
@property
def key(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'aimedb', 'key', default="")
return CoreConfig.get_config_field(
self.__config, "core", "aimedb", "key", default=""
)
@property
def id_secret(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "aimedb", "id_secret", default=""
)
@property
def id_lifetime_seconds(self) -> int:
return CoreConfig.get_config_field(
self.__config, "core", "aimedb", "id_lifetime_seconds", default=86400
)
class MuchaConfig:
def __init__(self, parent_config: "CoreConfig") -> None:
@ -163,27 +333,24 @@ class MuchaConfig:
@property
def enable(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'mucha', 'enable', default=False)
@property
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(CoreConfig.get_config_field(self.__config, 'core', 'mucha', 'loglevel', default="info"))
@property
def hostname(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'mucha', 'hostname', default="localhost")
return CoreConfig.get_config_field(
self.__config, "core", "mucha", "enable", default=False
)
@property
def port(self) -> int:
return CoreConfig.get_config_field(self.__config, 'core', 'mucha', 'port', default=8444)
def loglevel(self) -> int:
return CoreConfig.str_to_loglevel(
CoreConfig.get_config_field(
self.__config, "core", "mucha", "loglevel", default="info"
)
)
@property
def ssl_cert(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'mucha', 'ssl_cert', default="cert/server.pem")
@property
def signing_key(self) -> str:
return CoreConfig.get_config_field(self.__config, 'core', 'mucha', 'signing_key', default="cert/billing.key")
def hostname(self) -> str:
return CoreConfig.get_config_field(
self.__config, "core", "mucha", "hostname", default="localhost"
)
class CoreConfig(dict):
def __init__(self) -> None:
@ -194,25 +361,28 @@ class CoreConfig(dict):
self.allnet = AllnetConfig(self)
self.billing = BillingConfig(self)
self.aimedb = AimedbConfig(self)
self.mucha = MuchaConfig(self)
@classmethod
def str_to_loglevel(cls, level_str: str):
if level_str.lower() == "error":
return logging.ERROR
elif level_str.lower().startswith("warn"): # Fits warn or warning
elif level_str.lower().startswith("warn"): # Fits warn or warning
return logging.WARN
elif level_str.lower() == "debug":
return logging.DEBUG
else:
return logging.INFO
return logging.INFO
@classmethod
def get_config_field(cls, __config: dict, module, *path: str, default: Any = "") -> Any:
envKey = f'CFG_{module}_'
def get_config_field(
cls, __config: dict, module, *path: str, default: Any = ""
) -> Any:
envKey = f"CFG_{module}_"
for arg in path:
envKey += arg + '_'
if envKey.endswith('_'):
envKey += arg + "_"
if envKey.endswith("_"):
envKey = envKey[:-1]
if envKey in os.environ:
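
get_config_field above lets any setting be overridden through an environment variable whose name is built from the lookup path: "CFG_" plus the module plus each path segment, joined with underscores. A short sketch of that override, assuming the config is otherwise empty so defaults would normally apply; how the environment value is parsed is not shown in this hunk, so the returned type is an assumption.

import os
from core.config import CoreConfig

# get_config_field(..., "core", "server", "listen_address") builds the key CFG_core_server_listen_address
os.environ["CFG_core_server_listen_address"] = "0.0.0.0"

cfg = CoreConfig()
# ServerConfig.listen_address delegates to get_config_field, so the env value wins over
# the YAML value and the 127.0.0.1 default.
print(cfg.server.listen_address)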


@ -1,6 +1,7 @@
from enum import Enum
class MainboardPlatformCodes():
class MainboardPlatformCodes:
RINGEDGE = "AALE"
RINGWIDE = "AAML"
NU = "AAVE"
@ -8,7 +9,8 @@ class MainboardPlatformCodes():
ALLS_UX = "ACAE"
ALLS_HX = "ACAX"
class MainboardRevisions():
class MainboardRevisions:
RINGEDGE = 1
RINGEDGE2 = 2
@ -26,11 +28,70 @@ class MainboardRevisions():
ALLS_UX2 = 2
ALLS_HX2 = 12
class KeychipPlatformsCodes():
class KeychipPlatformsCodes:
RING = "A72E"
NU = ("A60E", "A60E", "A60E")
NUSX = ("A61X", "A69X")
ALLS = "A63E"
class RegionIDs(Enum):
pass
class AllnetCountryCode(Enum):
JAPAN = "JPN"
UNITED_STATES = "USA"
HONG_KONG = "HKG"
SINGAPORE = "SGP"
SOUTH_KOREA = "KOR"
TAIWAN = "TWN"
CHINA = "CHN"
class AllnetJapanRegionId(Enum):
NONE = 0
AICHI = 1
AOMORI = 2
AKITA = 3
ISHIKAWA = 4
IBARAKI = 5
IWATE = 6
EHIME = 7
OITA = 8
OSAKA = 9
OKAYAMA = 10
OKINAWA = 11
KAGAWA = 12
KAGOSHIMA = 13
KANAGAWA = 14
GIFU = 15
KYOTO = 16
KUMAMOTO = 17
GUNMA = 18
KOCHI = 19
SAITAMA = 20
SAGA = 21
SHIGA = 22
SHIZUOKA = 23
SHIMANE = 24
CHIBA = 25
TOKYO = 26
TOKUSHIMA = 27
TOCHIGI = 28
TOTTORI = 29
TOYAMA = 30
NAGASAKI = 31
NAGANO = 32
NARA = 33
NIIGATA = 34
HYOGO = 35
HIROSHIMA = 36
FUKUI = 37
FUKUOKA = 38
FUKUSHIMA = 39
HOKKAIDO = 40
MIE = 41
MIYAGI = 42
MIYAZAKI = 43
YAMAGATA = 44
YAMAGUCHI = 45
YAMANASHI = 46
WAKAYAMA = 47


@ -1,2 +1,2 @@
from core.data.database import Data
from core.data.cache import cached
from core.data.cache import cached


@ -1,4 +1,3 @@
from typing import Any, Callable
from functools import wraps
import hashlib
@ -6,27 +5,28 @@ import pickle
import logging
from core.config import CoreConfig
cfg:CoreConfig = None # type: ignore
cfg: CoreConfig = None # type: ignore
# Make memcache optional
try:
import pylibmc # type: ignore
has_mc = True
except ModuleNotFoundError:
has_mc = False
def cached(lifetime: int=10, extra_key: Any=None) -> Callable:
def cached(lifetime: int = 10, extra_key: Any = None) -> Callable:
def _cached(func: Callable) -> Callable:
if has_mc:
if has_mc and (cfg and cfg.database.enable_memcached):
hostname = "127.0.0.1"
if cfg:
hostname = cfg.database.memcached_host
memcache = pylibmc.Client([hostname], binary=True)
memcache.behaviors = {"tcp_nodelay": True, "ketama": True}
@wraps(func)
def wrapper(*args: Any, **kwargs: Any) -> Any:
if lifetime is not None:
# Hash function args
items = kwargs.items()
hashable_args = (args[1:], sorted(list(items)))
@ -41,7 +41,7 @@ def cached(lifetime: int=10, extra_key: Any=None) -> Callable:
except pylibmc.Error as e:
logging.getLogger("database").error(f"Memcache failed: {e}")
result = None
if result is not None:
logging.getLogger("database").debug(f"Cache hit: {result}")
return result
@ -55,7 +55,9 @@ def cached(lifetime: int=10, extra_key: Any=None) -> Callable:
memcache.set(cache_key, result, lifetime)
return result
else:
@wraps(func)
def wrapper(*args: Any, **kwargs: Any) -> Any:
return func(*args, **kwargs)
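Usage of the decorator is unchanged by this refactor: results are only cached when `pylibmc` is importable and `database.enable_memcached` is turned on in `core.yaml`; otherwise the wrapper is a plain pass-through. The cache key is derived from the positional arguments after `self` plus the sorted keyword arguments (and, presumably, `extra_key`). A short usage sketch, with a hypothetical data-access class:

```python
from core.data.cache import cached

class SongData:                      # hypothetical schema class
    @cached(lifetime=30)             # cache each distinct (song_id,) result for 30 seconds
    def get_song(self, song_id: int):
        ...                          # expensive database lookup goes here

    @cached(lifetime=None)           # lifetime=None appears to skip the cache entirely
    def get_live_counter(self):
        ...
```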

View File

@ -1,45 +1,70 @@
import logging, coloredlogs
from typing import Any, Dict, List
from typing import Optional, Dict, List
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy import create_engine
from logging.handlers import TimedRotatingFileHandler
import importlib, os
import secrets, string
import bcrypt
from hashlib import sha256
from core.config import CoreConfig
from core.data.schema import *
from core.utils import Utils
class Data:
current_schema_version = 6
engine = None
session = None
user = None
arcade = None
card = None
base = None
def __init__(self, cfg: CoreConfig) -> None:
self.config = cfg
if self.config.database.sha2_password:
passwd = sha256(self.config.database.password.encode()).digest()
self.__url = f"{self.config.database.protocol}://{self.config.database.username}:{passwd.hex()}@{self.config.database.host}/{self.config.database.name}?charset=utf8mb4"
self.__url = f"{self.config.database.protocol}://{self.config.database.username}:{passwd.hex()}@{self.config.database.host}:{self.config.database.port}/{self.config.database.name}?charset=utf8mb4"
else:
self.__url = f"{self.config.database.protocol}://{self.config.database.username}:{self.config.database.password}@{self.config.database.host}/{self.config.database.name}?charset=utf8mb4"
self.__url = f"{self.config.database.protocol}://{self.config.database.username}:{self.config.database.password}@{self.config.database.host}:{self.config.database.port}/{self.config.database.name}?charset=utf8mb4"
if Data.engine is None:
Data.engine = create_engine(self.__url, pool_recycle=3600)
self.__engine = Data.engine
if Data.session is None:
s = sessionmaker(bind=Data.engine, autoflush=True, autocommit=True)
Data.session = scoped_session(s)
if Data.user is None:
Data.user = UserData(self.config, self.session)
self.__engine = create_engine(self.__url, pool_recycle=3600)
session = sessionmaker(bind=self.__engine, autoflush=True, autocommit=True)
self.session = scoped_session(session)
if Data.arcade is None:
Data.arcade = ArcadeData(self.config, self.session)
if Data.card is None:
Data.card = CardData(self.config, self.session)
if Data.base is None:
Data.base = BaseData(self.config, self.session)
self.user = UserData(self.config, self.session)
self.arcade = ArcadeData(self.config, self.session)
self.card = CardData(self.config, self.session)
self.base = BaseData(self.config, self.session)
self.schema_ver_latest = 1
log_fmt_str = "[%(asctime)s] %(levelname)s | Database | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
self.logger = logging.getLogger("database")
# Prevent the logger from adding handlers multiple times
if not getattr(self.logger, 'handler_set', None):
fileHandler = TimedRotatingFileHandler("{0}/{1}.log".format(self.config.server.log_dir, "db"), encoding="utf-8",
when="d", backupCount=10)
if not getattr(self.logger, "handler_set", None):
log_fmt_str = "[%(asctime)s] %(levelname)s | Database | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "db"),
encoding="utf-8",
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
@ -47,7 +72,286 @@ class Data:
self.logger.addHandler(consoleHandler)
self.logger.setLevel(self.config.database.loglevel)
coloredlogs.install(cfg.database.loglevel, logger=self.logger, fmt=log_fmt_str)
self.logger.handler_set = True # type: ignore
coloredlogs.install(
cfg.database.loglevel, logger=self.logger, fmt=log_fmt_str
)
self.logger.handler_set = True # type: ignore
def create_database(self):
self.logger.info("Creating databases...")
try:
metadata.create_all(self.__engine.connect())
except SQLAlchemyError as e:
self.logger.error(f"Failed to create databases! {e}")
return
games = Utils.get_all_titles()
for game_dir, game_mod in games.items():
try:
if hasattr(game_mod, "database") and hasattr(
game_mod, "current_schema_version"
):
game_mod.database(self.config)
metadata.create_all(self.__engine.connect())
self.base.touch_schema_ver(
game_mod.current_schema_version, game_mod.game_codes[0]
)
except Exception as e:
self.logger.warning(
f"Could not load database schema from {game_dir} - {e}"
)
self.logger.info(f"Setting base_schema_ver to {self.current_schema_version}")
self.base.set_schema_ver(self.current_schema_version)
self.logger.info(
f"Setting user auto_incrememnt to {self.config.database.user_table_autoincrement_start}"
)
self.user.reset_autoincrement(
self.config.database.user_table_autoincrement_start
)
def recreate_database(self):
self.logger.info("Dropping all databases...")
self.base.execute("SET FOREIGN_KEY_CHECKS=0")
try:
metadata.drop_all(self.__engine.connect())
except SQLAlchemyError as e:
self.logger.error(f"Failed to drop databases! {e}")
return
for root, dirs, files in os.walk("./titles"):
for dir in dirs:
if not dir.startswith("__"):
try:
mod = importlib.import_module(f"titles.{dir}")
try:
if hasattr(mod, "database"):
mod.database(self.config)
metadata.drop_all(self.__engine.connect())
except Exception as e:
self.logger.warning(
f"Could not load database schema from {dir} - {e}"
)
except ImportError as e:
self.logger.warning(
f"Failed to load database schema dir {dir} - {e}"
)
break
self.base.execute("SET FOREIGN_KEY_CHECKS=1")
self.create_database()
def migrate_database(self, game: str, version: Optional[int], action: str) -> None:
old_ver = self.base.get_schema_ver(game)
sql = ""
if version is None:
if not game == "CORE":
titles = Utils.get_all_titles()
for folder, mod in titles.items():
if not mod.game_codes[0] == game:
continue
if hasattr(mod, "current_schema_version"):
version = mod.current_schema_version
else:
self.logger.warning(
f"current_schema_version not found for {folder}"
)
else:
version = self.current_schema_version
if version is None:
self.logger.warning(
f"Could not determine latest version for {game}, please specify --version"
)
if old_ver is None:
self.logger.error(
f"Schema for game {game} does not exist, did you run the creation script?"
)
return
if old_ver == version:
self.logger.info(
f"Schema for game {game} is already version {old_ver}, nothing to do"
)
return
if action == "upgrade":
for x in range(old_ver, version):
if not os.path.exists(
f"core/data/schema/versions/{game.upper()}_{x + 1}_{action}.sql"
):
self.logger.error(
f"Could not find {action} script {game.upper()}_{x + 1}_{action}.sql in core/data/schema/versions folder"
)
return
with open(
f"core/data/schema/versions/{game.upper()}_{x + 1}_{action}.sql",
"r",
encoding="utf-8",
) as f:
sql = f.read()
result = self.base.execute(sql)
if result is None:
self.logger.error("Error execuing sql script!")
return None
else:
for x in range(old_ver, version, -1):
if not os.path.exists(
f"core/data/schema/versions/{game.upper()}_{x - 1}_{action}.sql"
):
self.logger.error(
f"Could not find {action} script {game.upper()}_{x - 1}_{action}.sql in core/data/schema/versions folder"
)
return
with open(
f"core/data/schema/versions/{game.upper()}_{x - 1}_{action}.sql",
"r",
encoding="utf-8",
) as f:
sql = f.read()
result = self.base.execute(sql)
if result is None:
self.logger.error("Error execuing sql script!")
return None
result = self.base.set_schema_ver(version, game)
if result is None:
self.logger.error("Error setting version in schema_version table!")
return None
self.logger.info(f"Successfully migrated {game} to schema version {version}")
def create_owner(self, email: Optional[str] = None) -> None:
pw = "".join(
secrets.choice(string.ascii_letters + string.digits) for i in range(20)
)
hash = bcrypt.hashpw(pw.encode(), bcrypt.gensalt())
user_id = self.user.create_user(email=email, permission=255, password=hash)
if user_id is None:
self.logger.error(f"Failed to create owner with email {email}")
return
card_id = self.card.create_card(user_id, "00000000000000000000")
if card_id is None:
self.logger.error(f"Failed to create card for owner with id {user_id}")
return
self.logger.warning(
f"Successfully created owner with email {email}, access code 00000000000000000000, and password {pw} Make sure to change this password and assign a real card ASAP!"
)
def migrate_card(self, old_ac: str, new_ac: str, should_force: bool) -> None:
if old_ac == new_ac:
self.logger.error("Both access codes are the same!")
return
new_card = self.card.get_card_by_access_code(new_ac)
if new_card is None:
self.card.update_access_code(old_ac, new_ac)
return
if not should_force:
self.logger.warning(
f"Card already exists for access code {new_ac} (id {new_card['id']}). If you wish to continue, rerun with the '--force' flag."
f" All exiting data on the target card {new_ac} will be perminently erased and replaced with data from card {old_ac}."
)
return
self.logger.info(
f"All exiting data on the target card {new_ac} will be perminently erased and replaced with data from card {old_ac}."
)
self.card.delete_card(new_card["id"])
self.card.update_access_code(old_ac, new_ac)
hanging_user = self.user.get_user(new_card["user"])
if hanging_user["password"] is None:
self.logger.info(f"Delete hanging user {hanging_user['id']}")
self.user.delete_user(hanging_user["id"])
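A sketch of how the card migration above would be driven (access codes are fabricated, and the config construction is shown only schematically). With `should_force=False` the call refuses to overwrite an existing target card and only logs the warning; rerunning with the force flag deletes the target card, repoints the old access code, and cleans up the now-orphaned user if it never registered a password.

```python
from core.config import CoreConfig
from core.data.database import Data

cfg = CoreConfig()      # in the real tooling this is populated from config/core.yaml
data = Data(cfg)

data.migrate_card(
    old_ac="00000000000000000001",   # fabricated access codes
    new_ac="00000000000000000002",
    should_force=False,              # warn instead of overwriting an existing card
)
```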
def delete_hanging_users(self) -> None:
"""
Finds and deletes users that have not registered for the webui and have no cards associated with them.
"""
unreg_users = self.user.get_unregistered_users()
if unreg_users is None:
self.logger.error("Error occoured finding unregistered users")
for user in unreg_users:
cards = self.card.get_user_cards(user["id"])
if cards is None:
self.logger.error(f"Error getting cards for user {user['id']}")
continue
if not cards:
self.logger.info(f"Delete hanging user {user['id']}")
self.user.delete_user(user["id"])
def autoupgrade(self) -> None:
all_game_versions = self.base.get_all_schema_vers()
if all_game_versions is None:
self.logger.warning("Failed to get schema versions")
return
all_games = Utils.get_all_titles()
all_games_list: Dict[str, int] = {}
for _, mod in all_games.items():
if hasattr(mod, "current_schema_version"):
all_games_list[mod.game_codes[0]] = mod.current_schema_version
for x in all_game_versions:
failed = False
game = x["game"].upper()
update_ver = int(x["version"])
latest_ver = all_games_list.get(game, 1)
if game == "CORE":
latest_ver = self.current_schema_version
if update_ver == latest_ver:
self.logger.info(f"{game} is already latest version")
continue
for y in range(update_ver + 1, latest_ver + 1):
if os.path.exists(f"core/data/schema/versions/{game}_{y}_upgrade.sql"):
with open(
f"core/data/schema/versions/{game}_{y}_upgrade.sql",
"r",
encoding="utf-8",
) as f:
sql = f.read()
result = self.base.execute(sql)
if result is None:
self.logger.error(
f"Error execuing sql script for game {game} v{y}!"
)
failed = True
break
else:
self.logger.warning(f"Could not find script {game}_{y}_upgrade.sql")
failed = True
if not failed:
self.base.set_schema_ver(latest_ver, game)
def show_versions(self) -> None:
all_game_versions = self.base.get_all_schema_vers()
for ver in all_game_versions:
self.logger.info(f"{ver['game']} -> v{ver['version']}")

View File

@ -3,4 +3,4 @@ from core.data.schema.card import CardData
from core.data.schema.base import BaseData, metadata
from core.data.schema.arcade import ArcadeData
__all__ = ["UserData", "CardData", "BaseData", "metadata", "ArcadeData"]
__all__ = ["UserData", "CardData", "BaseData", "metadata", "ArcadeData"]

View File

@ -1,113 +1,232 @@
from typing import Optional, Dict
from sqlalchemy import Table, Column
from typing import Optional, Dict, List
from sqlalchemy import Table, Column, and_, or_
from sqlalchemy.sql.schema import ForeignKey, PrimaryKeyConstraint
from sqlalchemy.types import Integer, String, Boolean
from sqlalchemy.types import Integer, String, Boolean, JSON
from sqlalchemy.sql import func, select
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.engine import Row
import re
from core.data.schema.base import BaseData, metadata
from core.const import *
arcade = Table(
"arcade",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("name", String(255)),
Column("nickname", String(255)),
Column("nickname", String(255)),
Column("country", String(3)),
Column("country_id", Integer),
Column("state", String(255)),
Column("city", String(255)),
Column("region_id", Integer),
Column("timezone", String(255)),
mysql_charset='utf8mb4'
Column("ip", String(39)),
mysql_charset="utf8mb4",
)
machine = Table(
"machine",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("arcade", ForeignKey("arcade.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column(
"arcade",
ForeignKey("arcade.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("serial", String(15), nullable=False),
Column("board", String(15)),
Column("game", String(4)),
Column("country", String(3)), # overwrites if not null
Column("country", String(3)), # overwrites if not null
Column("timezone", String(255)),
Column("ota_enable", Boolean),
Column("memo", String(255)),
Column("is_cab", Boolean),
mysql_charset='utf8mb4'
Column("data", JSON),
mysql_charset="utf8mb4",
)
arcade_owner = Table(
'arcade_owner',
"arcade_owner",
metadata,
Column('user', Integer, ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column('arcade', Integer, ForeignKey("arcade.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column('permissions', Integer, nullable=False),
PrimaryKeyConstraint('user', 'arcade', name='arcade_owner_pk'),
mysql_charset='utf8mb4'
Column(
"user",
Integer,
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column(
"arcade",
Integer,
ForeignKey("arcade.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("permissions", Integer, nullable=False),
PrimaryKeyConstraint("user", "arcade", name="arcade_owner_pk"),
mysql_charset="utf8mb4",
)
class ArcadeData(BaseData):
def get_machine(self, serial: str = None, id: int = None) -> Optional[Dict]:
def get_machine(self, serial: str = None, id: int = None) -> Optional[Row]:
if serial is not None:
sql = machine.select(machine.c.serial == serial)
serial = serial.replace("-", "")
if len(serial) == 11:
sql = machine.select(machine.c.serial.like(f"{serial}%"))
elif len(serial) == 15:
sql = machine.select(machine.c.serial == serial)
else:
self.logger.error(f"{__name__ }: Malformed serial {serial}")
return None
elif id is not None:
sql = machine.select(machine.c.id == id)
else:
self.logger.error(f"{__name__ }: Need either serial or ID to look up!")
return None
result = self.execute(sql)
if result is None: return None
if result is None:
return None
return result.fetchone()
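The serial handling above strips hyphens and then branches on length: an 11-character keychip ID (without the appended four digits) is matched with a LIKE prefix search, while a full 15-character serial is matched exactly. For example (fabricated serial; `data` is assumed to be a constructed `Data` instance as in the earlier sketch):

```python
# Both calls resolve to the same row if the machine was stored with the full
# 15-character serial "A63E01A12344152":
data.arcade.get_machine(serial="A63E-01A-1234")    # 11 chars after stripping -> LIKE "A63E01A1234%"
data.arcade.get_machine(serial="A63E01A12344152")  # 15 chars -> exact match
```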
def put_machine(self, arcade_id: int, serial: str = None, board: str = None, game: str = None, is_cab: bool = False) -> Optional[int]:
def put_machine(
self,
arcade_id: int,
serial: str = "",
board: str = None,
game: str = None,
is_cab: bool = False,
) -> Optional[int]:
if not arcade_id:
self.logger.error(f"{__name__ }: Need arcade id!")
return None
if serial is None:
pass
sql = machine.insert().values(arcade = arcade_id, keychip = serial, board = board, game = game, is_cab = is_cab)
result = self.execute(sql)
if result is None: return None
return result.lastrowid
def get_arcade(self, id: int) -> Optional[Dict]:
sql = arcade.select(arcade.c.id == id)
result = self.execute(sql)
if result is None: return None
return result.fetchone()
def put_arcade(self, name: str, nickname: str = None, country: str = "JPN", country_id: int = 1,
state: str = "", city: str = "", regional_id: int = 1) -> Optional[int]:
if nickname is None: nickname = name
sql = arcade.insert().values(name = name, nickname = nickname, country = country, country_id = country_id,
state = state, city = city, regional_id = regional_id)
result = self.execute(sql)
if result is None: return None
return result.lastrowid
def get_arcade_owners(self, arcade_id: int) -> Optional[Dict]:
sql = select(arcade_owner).where(arcade_owner.c.arcade==arcade_id)
result = self.execute(sql)
if result is None: return None
return result.fetchall()
def add_arcade_owner(self, arcade_id: int, user_id: int) -> None:
sql = insert(arcade_owner).values(
arcade=arcade_id,
user=user_id
sql = machine.insert().values(
arcade=arcade_id, serial=serial, board=board, game=game, is_cab=is_cab
)
result = self.execute(sql)
if result is None: return None
if result is None:
return None
return result.lastrowid
def generate_keychip_serial(self, platform_id: int) -> str:
pass
def set_machine_serial(self, machine_id: int, serial: str) -> None:
result = self.execute(
machine.update(machine.c.id == machine_id).values(serial=serial)
)
if result is None:
self.logger.error(
f"Failed to update serial for machine {machine_id} -> {serial}"
)
return
return result.lastrowid
def set_machine_boardid(self, machine_id: int, boardid: str) -> None:
result = self.execute(
machine.update(machine.c.id == machine_id).values(board=boardid)
)
if result is None:
self.logger.error(
f"Failed to update board id for machine {machine_id} -> {boardid}"
)
def get_arcade(self, id: int) -> Optional[Row]:
sql = arcade.select(arcade.c.id == id)
result = self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_arcade_machines(self, id: int) -> Optional[List[Row]]:
sql = machine.select(machine.c.arcade == id)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
def put_arcade(
self,
name: str,
nickname: str = None,
country: str = "JPN",
country_id: int = 1,
state: str = "",
city: str = "",
regional_id: int = 1,
) -> Optional[int]:
if nickname is None:
nickname = name
sql = arcade.insert().values(
name=name,
nickname=nickname,
country=country,
country_id=country_id,
state=state,
city=city,
regional_id=regional_id,
)
result = self.execute(sql)
if result is None:
return None
return result.lastrowid
def get_arcades_managed_by_user(self, user_id: int) -> Optional[List[Row]]:
sql = select(arcade).join(arcade_owner, arcade_owner.c.arcade == arcade.c.id).where(arcade_owner.c.user == user_id)
result = self.execute(sql)
if result is None:
return False
return result.fetchall()
def get_manager_permissions(self, user_id: int, arcade_id: int) -> Optional[int]:
sql = select(arcade_owner.c.permissions).where(and_(arcade_owner.c.user == user_id, arcade_owner.c.arcade == arcade_id))
result = self.execute(sql)
if result is None:
return False
return result.fetchone()
def get_arcade_owners(self, arcade_id: int) -> Optional[Row]:
sql = select(arcade_owner).where(arcade_owner.c.arcade == arcade_id)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
def add_arcade_owner(self, arcade_id: int, user_id: int) -> None:
sql = insert(arcade_owner).values(arcade=arcade_id, user=user_id)
result = self.execute(sql)
if result is None:
return None
return result.lastrowid
def format_serial(
self, platform_code: str, platform_rev: int, serial_num: int, append: int = 4152
) -> str:
return f"{platform_code}{platform_rev:02d}A{serial_num:04d}{append:04d}" # 0x41 = A, 0x52 = R
def validate_keychip_format(self, serial: str) -> bool:
if re.fullmatch(r"^A[0-9]{2}[E|X][-]?[0-9]{2}[A-HJ-NP-Z][0-9]{4}([0-9]{4})?$", serial) is None:
return False
return True
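A worked example of the two helpers above, inlining the same format string and regex so it runs standalone (the platform code and numbers are arbitrary):

```python
import re

platform_code, platform_rev, serial_num, append = "A63E", 1, 1234, 4152
serial = f"{platform_code}{platform_rev:02d}A{serial_num:04d}{append:04d}"
# serial == "A63E01A12344152" - the default append of 4152 mirrors 0x41 ('A') / 0x52 ('R')

KEYCHIP_RE = r"^A[0-9]{2}[E|X][-]?[0-9]{2}[A-HJ-NP-Z][0-9]{4}([0-9]{4})?$"
bool(re.fullmatch(KEYCHIP_RE, serial))          # True
bool(re.fullmatch(KEYCHIP_RE, "A63E01I1234"))   # False: 'I' and 'O' are not allowed in the letter slot
```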
def get_arcade_by_name(self, name: str) -> Optional[List[Row]]:
sql = arcade.select(or_(arcade.c.name.like(f"%{name}%"), arcade.c.nickname.like(f"%{name}%")))
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_arcades_by_ip(self, ip: str) -> Optional[List[Row]]:
sql = arcade.select().where(arcade.c.ip == ip)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()

View File

@ -2,6 +2,7 @@ import json
import logging
from random import randrange
from typing import Any, Optional, Dict, List
from sqlalchemy.engine import Row
from sqlalchemy.engine.cursor import CursorResult
from sqlalchemy.engine.base import Connection
from sqlalchemy.sql import text, func, select
@ -19,7 +20,7 @@ schema_ver = Table(
metadata,
Column("game", String(4), primary_key=True, nullable=False),
Column("version", Integer, nullable=False, server_default="1"),
mysql_charset='utf8mb4'
mysql_charset="utf8mb4",
)
event_log = Table(
@ -29,96 +30,138 @@ event_log = Table(
Column("system", String(255), nullable=False),
Column("type", String(255), nullable=False),
Column("severity", Integer, nullable=False),
Column("message", String(1000), nullable=False),
Column("details", JSON, nullable=False),
Column("when_logged", TIMESTAMP, nullable=False, server_default=func.now()),
mysql_charset='utf8mb4'
mysql_charset="utf8mb4",
)
class BaseData():
class BaseData:
def __init__(self, cfg: CoreConfig, conn: Connection) -> None:
self.config = cfg
self.conn = conn
self.logger = logging.getLogger("database")
def execute(self, sql: str, opts: Dict[str, Any]={}) -> Optional[CursorResult]:
def execute(self, sql: str, opts: Dict[str, Any] = {}) -> Optional[CursorResult]:
res = None
try:
self.logger.info(f"SQL Execute: {''.join(str(sql).splitlines())} || {opts}")
self.logger.info(f"SQL Execute: {''.join(str(sql).splitlines())}")
res = self.conn.execute(text(sql), opts)
except SQLAlchemyError as e:
self.logger.error(f"SQLAlchemy error {e}")
return None
except UnicodeEncodeError as e:
self.logger.error(f"UnicodeEncodeError error {e}")
return None
except:
except Exception:
try:
res = self.conn.execute(sql, opts)
except SQLAlchemyError as e:
self.logger.error(f"SQLAlchemy error {e}")
return None
except UnicodeEncodeError as e:
self.logger.error(f"UnicodeEncodeError error {e}")
return None
except:
except Exception:
self.logger.error(f"Unknown error")
raise
return res
def generate_id(self) -> int:
"""
Generate a random 5-7 digit id
"""
return randrange(10000, 9999999)
def get_all_schema_vers(self) -> Optional[List[Row]]:
sql = select(schema_ver)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
def get_schema_ver(self, game: str) -> Optional[int]:
sql = select(schema_ver).where(schema_ver.c.game == game)
result = self.execute(sql)
if result is None:
return None
return result.fetchone()["version"]
row = result.fetchone()
if row is None:
return None
return row["version"]
def set_schema_ver(self, ver: int, game: str = "CORE") -> Optional[int]:
sql = insert(schema_ver).values(game = game, version = ver)
conflict = sql.on_duplicate_key_update(version = ver)
def touch_schema_ver(self, ver: int, game: str = "CORE") -> Optional[int]:
sql = insert(schema_ver).values(game=game, version=ver)
conflict = sql.on_duplicate_key_update(version=schema_ver.c.version)
result = self.execute(conflict)
if result is None:
self.logger.error(f"Failed to update schema version for game {game} (v{ver})")
self.logger.error(
f"Failed to update schema version for game {game} (v{ver})"
)
return None
return result.lastrowid
def log_event(self, system: str, type: str, severity: int, details: Dict) -> Optional[int]:
sql = event_log.insert().values(system = system, type = type, severity = severity, details = json.dumps(details))
def set_schema_ver(self, ver: int, game: str = "CORE") -> Optional[int]:
sql = insert(schema_ver).values(game=game, version=ver)
conflict = sql.on_duplicate_key_update(version=ver)
result = self.execute(conflict)
if result is None:
self.logger.error(
f"Failed to update schema version for game {game} (v{ver})"
)
return None
return result.lastrowid
def log_event(
self, system: str, type: str, severity: int, message: str, details: Dict = {}
) -> Optional[int]:
sql = event_log.insert().values(
system=system,
type=type,
severity=severity,
message=message,
details=json.dumps(details),
)
result = self.execute(sql)
if result is None:
self.logger.error(f"{__name__}: Failed to insert event into event log! system = {system}, type = {type}, severity = {severity}, details = {details}")
self.logger.error(
f"{__name__}: Failed to insert event into event log! system = {system}, type = {type}, severity = {severity}, message = {message}"
)
return None
return result.lastrowid
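A hypothetical call to `log_event` above; the system and type strings are made up, and severity is passed as a plain int (the stdlib logging levels are a convenient choice):

```python
import logging

# `data` is assumed to be a constructed core.data.Data instance
data.base.log_event(
    system="aimedb",
    type="card_lookup",
    severity=logging.WARNING,
    message="Unregistered card scanned",
    details={"access_code": "00000000000004660871"},   # fabricated
)
```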
def get_event_log(self, entries: int = 100) -> Optional[List[Dict]]:
sql = event_log.select().limit(entries)
result = self.execute(sql)
if result is None: return None
if result is None:
return None
return result.fetchall()
def fix_bools(self, data: Dict) -> Dict:
for k,v in data.items():
for k, v in data.items():
if k == "userName" or k == "teamName":
continue
if type(v) == str and v.lower() == "true":
data[k] = True
elif type(v) == str and v.lower() == "false":
data[k] = False
return data
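An illustration of `fix_bools` above: stringly-typed booleans coming from game requests are normalised, while `userName`/`teamName` are skipped so a player who literally calls themselves "true" keeps their name (the other field names here are invented):

```python
payload = {"isClear": "true", "isNewRecord": "false", "userName": "false", "playCount": "3"}
# fix_bools(payload) ->
# {"isClear": True, "isNewRecord": False, "userName": "false", "playCount": "3"}
```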

View File

@ -3,55 +3,113 @@ from sqlalchemy import Table, Column, UniqueConstraint
from sqlalchemy.types import Integer, String, Boolean, TIMESTAMP
from sqlalchemy.sql.schema import ForeignKey
from sqlalchemy.sql import func
from sqlalchemy.engine import Row
from core.data.schema.base import BaseData, metadata
aime_card = Table(
'aime_card',
"aime_card",
metadata,
Column("id", Integer, primary_key=True, nullable=False),
Column("user", ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column(
"user",
ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"),
nullable=False,
),
Column("access_code", String(20)),
Column("created_date", TIMESTAMP, server_default=func.now()),
Column("last_login_date", TIMESTAMP, onupdate=func.now()),
Column("is_locked", Boolean, server_default="0"),
Column("is_banned", Boolean, server_default="0"),
UniqueConstraint("user", "access_code", name="aime_card_uk"),
mysql_charset='utf8mb4'
mysql_charset="utf8mb4",
)
class CardData(BaseData):
def get_card_by_access_code(self, access_code: str) -> Optional[Row]:
sql = aime_card.select(aime_card.c.access_code == access_code)
result = self.execute(sql)
if result is None:
return None
return result.fetchone()
def get_card_by_id(self, card_id: int) -> Optional[Row]:
sql = aime_card.select(aime_card.c.id == card_id)
result = self.execute(sql)
if result is None:
return None
return result.fetchone()
def update_access_code(self, old_ac: str, new_ac: str) -> None:
sql = aime_card.update(aime_card.c.access_code == old_ac).values(
access_code=new_ac
)
result = self.execute(sql)
if result is None:
self.logger.error(
f"Failed to change card access code from {old_ac} to {new_ac}"
)
def get_user_id_from_card(self, access_code: str) -> Optional[int]:
"""
Given a 20 digit access code as a string, get the user id associated with that card
"""
sql = aime_card.select(aime_card.c.access_code == access_code)
result = self.execute(sql)
if result is None: return None
card = result.fetchone()
if card is None: return None
card = self.get_card_by_access_code(access_code)
if card is None:
return None
return int(card["user"])
def get_user_cards(self, aime_id: int) -> Optional[List[Dict]]:
def get_card_banned(self, access_code: str) -> Optional[bool]:
"""
Given a 20 digit access code as a string, check if the card is banned
"""
card = self.get_card_by_access_code(access_code)
if card is None:
return None
if card["is_banned"]:
return True
return False
def get_card_locked(self, access_code: str) -> Optional[bool]:
"""
Given a 20 digit access code as a string, check if the card is locked
"""
card = self.get_card_by_access_code(access_code)
if card is None:
return None
if card["is_locked"]:
return True
return False
def delete_card(self, card_id: int) -> None:
sql = aime_card.delete(aime_card.c.id == card_id)
result = self.execute(sql)
if result is None:
self.logger.error(f"Failed to delete card with id {card_id}")
def get_user_cards(self, aime_id: int) -> Optional[List[Row]]:
"""
Returns all cards owned by a user
"""
sql = aime_card.select(aime_card.c.user == aime_id)
result = self.execute(sql)
if result is None: return None
if result is None:
return None
return result.fetchall()
def create_card(self, user_id: int, access_code: str) -> Optional[int]:
"""
Given an aime_user id and a 20 digit access code as a string, create a card and return the ID if successful
"""
sql = aime_card.insert().values(user=user_id, access_code=access_code)
result = self.execute(sql)
if result is None: return None
if result is None:
return None
return result.lastrowid
def to_access_code(self, luid: str) -> str:
@ -64,4 +122,4 @@ class CardData(BaseData):
"""
Given a 20 digit access code as a string, return the 16 hex character luid
"""
return f'{int(access_code):0{16}x}'
return f"{int(access_code):0{16}x}"

View File

@ -1,9 +1,12 @@
from enum import Enum
from typing import Dict, Optional
from typing import Optional, List
from sqlalchemy import Table, Column
from sqlalchemy.types import Integer, String, TIMESTAMP
from sqlalchemy.sql.schema import ForeignKey
from sqlalchemy.sql import func
from sqlalchemy.dialects.mysql import insert
from sqlalchemy.sql import func, select
from sqlalchemy.engine import Row
import bcrypt
from core.data.schema.base import BaseData, metadata
@ -14,44 +17,107 @@ aime_user = Table(
Column("username", String(25), unique=True),
Column("email", String(255), unique=True),
Column("password", String(255)),
Column("permissions", Integer),
Column("permissions", Integer),
Column("created_date", TIMESTAMP, server_default=func.now()),
Column("last_login_date", TIMESTAMP, onupdate=func.now()),
Column("suspend_expire_time", TIMESTAMP),
mysql_charset='utf8mb4'
mysql_charset="utf8mb4",
)
frontend_session = Table(
"frontend_session",
metadata,
Column("id", Integer, primary_key=True, unique=True),
Column("user", ForeignKey("aime_user.id", ondelete="cascade", onupdate="cascade"), nullable=False),
Column('session_cookie', String(32), nullable=False, unique=True),
Column("expires", TIMESTAMP, nullable=False),
mysql_charset='utf8mb4'
)
class PermissionBits(Enum):
PermUser = 1
PermMod = 2
PermSysAdmin = 4
class UserData(BaseData):
def create_user(self, username: str = None, email: str = None, password: str = None) -> Optional[int]:
if email is None:
permission = None
def create_user(
self,
id: int = None,
username: str = None,
email: str = None,
password: str = None,
permission: int = 1,
) -> Optional[int]:
if id is None:
sql = insert(aime_user).values(
username=username,
email=email,
password=password,
permissions=permission,
)
else:
permission = 0
sql = insert(aime_user).values(
id=id,
username=username,
email=email,
password=password,
permissions=permission,
)
sql = aime_user.insert().values(username=username, email=email, password=password, permissions=permission)
result = self.execute(sql)
if result is None: return None
conflict = sql.on_duplicate_key_update(
username=username, email=email, password=password, permissions=permission
)
result = self.execute(conflict)
if result is None:
return None
return result.lastrowid
def get_user(self, user_id: int) -> Optional[Row]:
sql = select(aime_user).where(aime_user.c.id == user_id)
result = self.execute(sql)
if result is None:
return False
return result.fetchone()
def check_password(self, user_id: int, passwd: bytes = None) -> bool:
usr = self.get_user(user_id)
if usr is None:
return False
if usr["password"] is None:
return False
if passwd is None or not passwd:
return False
return bcrypt.checkpw(passwd, usr["password"].encode())
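The password flow implied by `create_owner` and `check_password` above, shown standalone: the stored value is a bcrypt hash, and verification is a plain `bcrypt.checkpw` against that hash (the password here is obviously just an example):

```python
import bcrypt

stored = bcrypt.hashpw(b"hunter2", bcrypt.gensalt()).decode()   # roughly what ends up in aime_user.password
bcrypt.checkpw(b"hunter2", stored.encode())   # True  - exactly what check_password does
bcrypt.checkpw(b"wrong", stored.encode())     # False
```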
def reset_autoincrement(self, ai_value: int) -> None:
# Didn't feel like learning how to do this the right way
# if somebody wants a free PR go nuts I guess
# ALTER TABLE isn't in sqlalchemy so we do this the ugly way
sql = f"ALTER TABLE aime_user AUTO_INCREMENT={ai_value}"
self.execute(sql)
def delete_user(self, user_id: int) -> None:
sql = aime_user.delete(aime_user.c.id == user_id)
result = self.execute(sql)
if result is None:
self.logger.error(f"Failed to delete user with id {user_id}")
def get_unregistered_users(self) -> List[Row]:
"""
Returns a list of users who have not registered with the webui. They may or may not have cards.
"""
sql = select(aime_user).where(aime_user.c.password == None)
result = self.execute(sql)
if result is None:
return None
return result.fetchall()
def find_user_by_email(self, email: str) -> Row:
sql = select(aime_user).where(aime_user.c.email == email)
result = self.execute(sql)
if result is None:
return False
return result.fetchone()
def find_user_by_username(self, username: str) -> List[Row]:
sql = aime_user.select(aime_user.c.username.like(f"%{username}%"))
result = self.execute(sql)
if result is None:
return False
return result.fetchall()

View File

@ -0,0 +1,2 @@
ALTER TABLE `frontend_session`
DROP COLUMN `ip`;

View File

@ -0,0 +1 @@
ALTER TABLE `event_log` DROP COLUMN `message`;

View File

@ -0,0 +1,2 @@
ALTER TABLE `frontend_session`
ADD `ip` CHAR(15);

View File

@ -0,0 +1,12 @@
CREATE TABLE `frontend_session` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`user` int(11) NOT NULL,
`ip` varchar(15) DEFAULT NULL,
`session_cookie` varchar(32) NOT NULL,
`expires` timestamp NOT NULL DEFAULT current_timestamp() ON UPDATE current_timestamp(),
PRIMARY KEY (`id`),
UNIQUE KEY `id` (`id`),
UNIQUE KEY `session_cookie` (`session_cookie`),
KEY `user` (`user`),
CONSTRAINT `frontend_session_ibfk_1` FOREIGN KEY (`user`) REFERENCES `aime_user` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB AUTO_INCREMENT=0 DEFAULT CHARSET=utf8mb4;

View File

@ -0,0 +1 @@
ALTER TABLE `event_log` ADD COLUMN `message` VARCHAR(1000) NOT NULL AFTER `severity`;

View File

@ -0,0 +1,3 @@
ALTER TABLE machine DROP COLUMN memo;
ALTER TABLE machine DROP COLUMN is_blacklisted;
ALTER TABLE machine DROP COLUMN `data`;

View File

@ -0,0 +1 @@
DROP TABLE `frontend_session`;

View File

@ -0,0 +1 @@
ALTER TABLE arcade DROP COLUMN ip;

View File

@ -0,0 +1,3 @@
ALTER TABLE machine ADD memo varchar(255) NULL;
ALTER TABLE machine ADD is_blacklisted tinyint(1) NULL;
ALTER TABLE machine ADD `data` longtext NULL;

View File

@ -0,0 +1 @@
ALTER TABLE arcade ADD ip varchar(39) NULL;

View File

@ -0,0 +1,9 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE diva_score DROP FOREIGN KEY diva_score_ibfk_1;
ALTER TABLE diva_score DROP CONSTRAINT diva_score_uk;
ALTER TABLE diva_score ADD CONSTRAINT diva_score_uk UNIQUE (user, pv_id, difficulty);
ALTER TABLE diva_score ADD CONSTRAINT diva_score_ibfk_1 FOREIGN KEY (user) REFERENCES aime_user(id) ON DELETE CASCADE;
ALTER TABLE diva_score DROP COLUMN edition;
ALTER TABLE diva_playlog DROP COLUMN edition;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,17 @@
ALTER TABLE diva_profile_shop DROP COLUMN c_itm_eqp_ary;
ALTER TABLE diva_profile_shop DROP COLUMN ms_itm_flg_ary;
ALTER TABLE diva_profile DROP COLUMN use_pv_mdl_eqp;
ALTER TABLE diva_profile DROP COLUMN use_mdl_pri;
ALTER TABLE diva_profile DROP COLUMN use_pv_skn_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_btn_se_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_sld_se_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_chn_sld_se_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_sldr_tch_se_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_mdl_eqp VARCHAR(8) NOT NULL DEFAULT "true" AFTER sort_kind;
ALTER TABLE diva_profile ADD COLUMN use_pv_btn_se_eqp VARCHAR(8) NOT NULL DEFAULT "true" AFTER use_pv_mdl_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_sld_se_eqp VARCHAR(8) NOT NULL DEFAULT "false" AFTER use_pv_btn_se_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_chn_sld_se_eqp VARCHAR(8) NOT NULL DEFAULT "false" AFTER use_pv_sld_se_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_sldr_tch_se_eqp VARCHAR(8) NOT NULL DEFAULT "false" AFTER use_pv_chn_sld_se_eqp;
DROP TABLE IF EXISTS `diva_profile_pv_customize`;

View File

@ -0,0 +1,9 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE diva_score ADD COLUMN edition int(11) DEFAULT 0 AFTER difficulty;
ALTER TABLE diva_playlog ADD COLUMN edition int(11) DEFAULT 0 AFTER difficulty;
ALTER TABLE diva_score DROP FOREIGN KEY diva_score_ibfk_1;
ALTER TABLE diva_score DROP CONSTRAINT diva_score_uk;
ALTER TABLE diva_score ADD CONSTRAINT diva_score_uk UNIQUE (user, pv_id, difficulty, edition);
ALTER TABLE diva_score ADD CONSTRAINT diva_score_ibfk_1 FOREIGN KEY (user) REFERENCES aime_user(id) ON DELETE CASCADE;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,3 @@
ALTER TABLE diva_profile DROP COLUMN passwd_stat;
ALTER TABLE diva_profile DROP COLUMN passwd;
ALTER TABLE diva_profile MODIFY player_name VARCHAR(8);

View File

@ -0,0 +1,33 @@
ALTER TABLE diva_profile_shop ADD COLUMN c_itm_eqp_ary varchar(59) DEFAULT "-999,-999,-999,-999,-999,-999,-999,-999,-999,-999,-999,-999";
ALTER TABLE diva_profile_shop ADD COLUMN ms_itm_flg_ary varchar(59) DEFAULT "-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1";
ALTER TABLE diva_profile DROP COLUMN use_pv_mdl_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_btn_se_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_sld_se_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_chn_sld_se_eqp;
ALTER TABLE diva_profile DROP COLUMN use_pv_sldr_tch_se_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_mdl_eqp BOOLEAN NOT NULL DEFAULT true AFTER sort_kind;
ALTER TABLE diva_profile ADD COLUMN use_mdl_pri BOOLEAN NOT NULL DEFAULT false AFTER use_pv_mdl_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_skn_eqp BOOLEAN NOT NULL DEFAULT false AFTER use_mdl_pri;
ALTER TABLE diva_profile ADD COLUMN use_pv_btn_se_eqp BOOLEAN NOT NULL DEFAULT true AFTER use_pv_skn_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_sld_se_eqp BOOLEAN NOT NULL DEFAULT false AFTER use_pv_btn_se_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_chn_sld_se_eqp BOOLEAN NOT NULL DEFAULT false AFTER use_pv_sld_se_eqp;
ALTER TABLE diva_profile ADD COLUMN use_pv_sldr_tch_se_eqp BOOLEAN NOT NULL DEFAULT false AFTER use_pv_chn_sld_se_eqp;
CREATE TABLE diva_profile_pv_customize (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
user INT NOT NULL,
version INT NOT NULL,
pv_id INT NOT NULL,
mdl_eqp_ary VARCHAR(14) DEFAULT '-999,-999,-999',
c_itm_eqp_ary VARCHAR(59) DEFAULT '-999,-999,-999,-999,-999,-999,-999,-999,-999,-999,-999,-999',
ms_itm_flg_ary VARCHAR(59) DEFAULT '-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1',
skin INT DEFAULT '-1',
btn_se INT DEFAULT '-1',
sld_se INT DEFAULT '-1',
chsld_se INT DEFAULT '-1',
sldtch_se INT DEFAULT '-1',
UNIQUE KEY diva_profile_pv_customize_uk (user, version, pv_id),
CONSTRAINT diva_profile_pv_customize_ibfk_1 FOREIGN KEY (user) REFERENCES aime_user (id) ON DELETE CASCADE ON UPDATE CASCADE
);

View File

@ -0,0 +1,9 @@
ALTER TABLE diva_profile
DROP cnp_cid,
DROP cnp_val,
DROP cnp_rr,
DROP cnp_sp,
DROP btn_se_eqp,
DROP sld_se_eqp,
DROP chn_sld_se_eqp,
DROP sldr_tch_se_eqp;

View File

@ -0,0 +1,3 @@
ALTER TABLE diva_profile ADD COLUMN passwd_stat INTEGER NOT NULL DEFAULT 0;
ALTER TABLE diva_profile ADD COLUMN passwd VARCHAR(12) NOT NULL DEFAULT "**********";
ALTER TABLE diva_profile MODIFY player_name VARCHAR(10);

View File

@ -0,0 +1,2 @@
ALTER TABLE diva_profile
DROP skn_eqp;

View File

@ -0,0 +1,9 @@
ALTER TABLE diva_profile
ADD cnp_cid INT NOT NULL DEFAULT -1,
ADD cnp_val INT NOT NULL DEFAULT -1,
ADD cnp_rr INT NOT NULL DEFAULT -1,
ADD cnp_sp VARCHAR(255) NOT NULL DEFAULT "",
ADD btn_se_eqp INT NOT NULL DEFAULT -1,
ADD sld_se_eqp INT NOT NULL DEFAULT -1,
ADD chn_sld_se_eqp INT NOT NULL DEFAULT -1,
ADD sldr_tch_se_eqp INT NOT NULL DEFAULT -1;

View File

@ -0,0 +1,2 @@
ALTER TABLE diva_profile
ADD skn_eqp INT NOT NULL DEFAULT 0;

View File

@ -0,0 +1 @@
ALTER TABLE chuni_static_music CHANGE COLUMN worldsEndTag worldsEndTag VARCHAR(20) NULL DEFAULT NULL ;

View File

@ -0,0 +1 @@
ALTER TABLE chuni_score_course DROP COLUMN theoryCount, DROP COLUMN orderId, DROP COLUMN playerRating;

View File

@ -0,0 +1 @@
ALTER TABLE chuni_static_music CHANGE COLUMN worldsEndTag worldsEndTag VARCHAR(7) NULL DEFAULT NULL ;

View File

@ -0,0 +1,30 @@
SET FOREIGN_KEY_CHECKS = 0;
ALTER TABLE chuni_score_playlog
DROP COLUMN regionId,
DROP COLUMN machineType;
ALTER TABLE chuni_static_events
DROP COLUMN startDate;
ALTER TABLE chuni_profile_data
DROP COLUMN rankUpChallengeResults;
ALTER TABLE chuni_static_login_bonus
DROP FOREIGN KEY chuni_static_login_bonus_ibfk_1;
ALTER TABLE chuni_static_login_bonus_preset
DROP PRIMARY KEY;
ALTER TABLE chuni_static_login_bonus_preset
CHANGE COLUMN presetId id INT NOT NULL;
ALTER TABLE chuni_static_login_bonus_preset
ADD PRIMARY KEY(id);
ALTER TABLE chuni_static_login_bonus_preset
ADD CONSTRAINT chuni_static_login_bonus_preset_uk UNIQUE(id, version);
ALTER TABLE chuni_static_login_bonus
ADD CONSTRAINT chuni_static_login_bonus_ibfk_1 FOREIGN KEY(presetId)
REFERENCES chuni_static_login_bonus_preset(id) ON UPDATE CASCADE ON DELETE CASCADE;
SET FOREIGN_KEY_CHECKS = 1;

View File

@ -0,0 +1 @@
ALTER TABLE chuni_score_course ADD theoryCount int(11), ADD orderId int(11), ADD playerRating int(11);

View File

@ -0,0 +1,12 @@
SET FOREIGN_KEY_CHECKS = 0;
ALTER TABLE chuni_score_playlog
CHANGE COLUMN isClear isClear TINYINT(1) NULL DEFAULT NULL;
ALTER TABLE chuni_score_best
CHANGE COLUMN isSuccess isSuccess TINYINT(1) NULL DEFAULT NULL ;
ALTER TABLE chuni_score_playlog
DROP COLUMN ticketId;
SET FOREIGN_KEY_CHECKS = 1;

View File

@ -0,0 +1,29 @@
SET FOREIGN_KEY_CHECKS = 0;
ALTER TABLE chuni_score_playlog
ADD COLUMN regionId INT,
ADD COLUMN machineType INT;
ALTER TABLE chuni_static_events
ADD COLUMN startDate TIMESTAMP NOT NULL DEFAULT current_timestamp();
ALTER TABLE chuni_profile_data
ADD COLUMN rankUpChallengeResults JSON;
ALTER TABLE chuni_static_login_bonus
DROP FOREIGN KEY chuni_static_login_bonus_ibfk_1;
ALTER TABLE chuni_static_login_bonus_preset
CHANGE COLUMN id presetId INT NOT NULL;
ALTER TABLE chuni_static_login_bonus_preset
DROP PRIMARY KEY;
ALTER TABLE chuni_static_login_bonus_preset
DROP INDEX chuni_static_login_bonus_preset_uk;
ALTER TABLE chuni_static_login_bonus_preset
ADD CONSTRAINT chuni_static_login_bonus_preset_pk PRIMARY KEY (presetId, version);
ALTER TABLE chuni_static_login_bonus
ADD CONSTRAINT chuni_static_login_bonus_ibfk_1 FOREIGN KEY (presetId, version)
REFERENCES chuni_static_login_bonus_preset(presetId, version) ON UPDATE CASCADE ON DELETE CASCADE;
SET FOREIGN_KEY_CHECKS = 1;

View File

@ -0,0 +1,12 @@
SET FOREIGN_KEY_CHECKS = 0;
ALTER TABLE chuni_score_playlog
CHANGE COLUMN isClear isClear TINYINT(6) NULL DEFAULT NULL;
ALTER TABLE chuni_score_best
CHANGE COLUMN isSuccess isSuccess INT(11) NULL DEFAULT NULL ;
ALTER TABLE chuni_score_playlog
ADD COLUMN ticketId INT(11) NULL AFTER machineType;
SET FOREIGN_KEY_CHECKS = 1;

View File

@ -0,0 +1,7 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE ongeki_profile_data DROP COLUMN isDialogWatchedSuggestMemory;
ALTER TABLE ongeki_score_best DROP COLUMN platinumScoreMax;
ALTER TABLE ongeki_score_playlog DROP COLUMN platinumScore;
ALTER TABLE ongeki_score_playlog DROP COLUMN platinumScoreMax;
DROP TABLE IF EXISTS `ongeki_user_memorychapter`;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1 @@
ALTER TABLE ongeki_profile_data DROP COLUMN lastEmoneyCredit;

View File

@ -0,0 +1,27 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE ongeki_profile_data ADD COLUMN isDialogWatchedSuggestMemory BOOLEAN;
ALTER TABLE ongeki_score_best ADD COLUMN platinumScoreMax INTEGER;
ALTER TABLE ongeki_score_playlog ADD COLUMN platinumScore INTEGER;
ALTER TABLE ongeki_score_playlog ADD COLUMN platinumScoreMax INTEGER;
CREATE TABLE ongeki_user_memorychapter (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
user INT NOT NULL,
chapterId INT NOT NULL,
gaugeId INT NOT NULL,
gaugeNum INT NOT NULL,
jewelCount INT NOT NULL,
isStoryWatched BOOLEAN NOT NULL,
isBossWatched BOOLEAN NOT NULL,
isDialogWatched BOOLEAN NOT NULL,
isEndingWatched BOOLEAN NOT NULL,
isClear BOOLEAN NOT NULL,
lastPlayMusicId INT NOT NULL,
lastPlayMusicLevel INT NOT NULL,
lastPlayMusicCategory INT NOT NULL,
UNIQUE KEY ongeki_user_memorychapter_uk (user, chapterId),
CONSTRAINT ongeki_user_memorychapter_ibfk_1 FOREIGN KEY (user) REFERENCES aime_user (id) ON DELETE CASCADE ON UPDATE CASCADE
);
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,2 @@
ALTER TABLE ongeki_static_events
DROP COLUMN startDate;

View File

@ -0,0 +1 @@
ALTER TABLE ongeki_profile_data ADD COLUMN lastEmoneyCredit INTEGER DEFAULT 0;

View File

@ -0,0 +1,22 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE ongeki_user_event_point DROP COLUMN version;
ALTER TABLE ongeki_user_event_point DROP COLUMN rank;
ALTER TABLE ongeki_user_event_point DROP COLUMN type;
ALTER TABLE ongeki_user_event_point DROP COLUMN date;
ALTER TABLE ongeki_user_tech_event DROP COLUMN version;
ALTER TABLE ongeki_user_mission_point DROP COLUMN version;
ALTER TABLE ongeki_static_events DROP COLUMN endDate;
DROP TABLE ongeki_tech_event_ranking;
DROP TABLE ongeki_static_music_ranking_list;
DROP TABLE ongeki_static_rewards;
DROP TABLE ongeki_static_present_list;
DROP TABLE ongeki_static_tech_music;
DROP TABLE ongeki_static_client_testmode;
DROP TABLE ongeki_static_game_point;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,2 @@
ALTER TABLE ongeki_static_events
ADD COLUMN startDate TIMESTAMP NOT NULL DEFAULT current_timestamp();

View File

@ -0,0 +1,98 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE ongeki_user_event_point ADD COLUMN version INTEGER NOT NULL;
ALTER TABLE ongeki_user_event_point ADD COLUMN rank INTEGER;
ALTER TABLE ongeki_user_event_point ADD COLUMN type INTEGER NOT NULL;
ALTER TABLE ongeki_user_event_point ADD COLUMN date VARCHAR(25);
ALTER TABLE ongeki_user_tech_event ADD COLUMN version INTEGER NOT NULL;
ALTER TABLE ongeki_user_mission_point ADD COLUMN version INTEGER NOT NULL;
ALTER TABLE ongeki_static_events ADD COLUMN endDate TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP;
CREATE TABLE ongeki_tech_event_ranking (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
user INT NOT NULL,
version INT NOT NULL,
date VARCHAR(25),
eventId INT NOT NULL,
rank INT,
totalPlatinumScore INT NOT NULL,
totalTechScore INT NOT NULL,
UNIQUE KEY ongeki_tech_event_ranking_uk (user, eventId),
CONSTRAINT ongeki_tech_event_ranking_ibfk1 FOREIGN KEY (user) REFERENCES aime_user(id) ON DELETE CASCADE ON UPDATE CASCADE
);
CREATE TABLE ongeki_static_music_ranking_list (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
musicId INT NOT NULL,
point INT NOT NULL,
userName VARCHAR(255),
UNIQUE KEY ongeki_static_music_ranking_list_uk (version, musicId)
);
CREATE TABLE ongeki_static_rewards (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
rewardId INT NOT NULL,
rewardName VARCHAR(255) NOT NULL,
itemKind INT NOT NULL,
itemId INT NOT NULL,
UNIQUE KEY ongeki_tech_event_ranking_uk (version, rewardId)
);
CREATE TABLE ongeki_static_present_list (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
presentId INT NOT NULL,
presentName VARCHAR(255) NOT NULL,
rewardId INT NOT NULL,
stock INT NOT NULL,
message VARCHAR(255),
startDate VARCHAR(25) NOT NULL,
endDate VARCHAR(25) NOT NULL,
UNIQUE KEY ongeki_static_present_list_uk (version, presentId, rewardId)
);
CREATE TABLE ongeki_static_tech_music (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
version INT NOT NULL,
eventId INT NOT NULL,
musicId INT NOT NULL,
level INT NOT NULL,
UNIQUE KEY ongeki_static_tech_music_uk (version, musicId, eventId)
);
CREATE TABLE ongeki_static_client_testmode (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
regionId INT NOT NULL,
placeId INT NOT NULL,
clientId VARCHAR(11) NOT NULL,
updateDate TIMESTAMP NOT NULL,
isDelivery BOOLEAN NOT NULL,
groupId INT NOT NULL,
groupRole INT NOT NULL,
continueMode INT NOT NULL,
selectMusicTime INT NOT NULL,
advertiseVolume INT NOT NULL,
eventMode INT NOT NULL,
eventMusicNum INT NOT NULL,
patternGp INT NOT NULL,
limitGp INT NOT NULL,
maxLeverMovable INT NOT NULL,
minLeverMovable INT NOT NULL,
UNIQUE KEY ongeki_static_client_testmode_uk (clientId)
);
CREATE TABLE ongeki_static_game_point (
id INT PRIMARY KEY NOT NULL AUTO_INCREMENT,
type INT NOT NULL,
cost INT NOT NULL,
startDate VARCHAR(25) NOT NULL DEFAULT "2000-01-01 05:00:00.0",
endDate VARCHAR(25) NOT NULL DEFAULT "2099-01-01 05:00:00.0",
UNIQUE KEY ongeki_static_game_point_uk (type)
);
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,3 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE mai2_playlog DROP COLUMN trialPlayAchievement;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,21 @@
ALTER TABLE mai2_item_card
CHANGE COLUMN cardId card_id INT NOT NULL AFTER user,
CHANGE COLUMN cardTypeId card_kind INT NOT NULL,
CHANGE COLUMN charaId chara_id INT NOT NULL,
CHANGE COLUMN mapId map_id INT NOT NULL,
CHANGE COLUMN startDate start_date TIMESTAMP NULL DEFAULT '2018-01-01 00:00:00',
CHANGE COLUMN endDate end_date TIMESTAMP NULL DEFAULT '2038-01-01 00:00:00';
ALTER TABLE mai2_item_item
CHANGE COLUMN itemId item_id INT NOT NULL AFTER user,
CHANGE COLUMN itemKind item_kind INT NOT NULL,
CHANGE COLUMN isValid is_valid TINYINT(1) NOT NULL DEFAULT '1';
ALTER TABLE mai2_item_character
CHANGE COLUMN characterId character_id INT NOT NULL,
CHANGE COLUMN useCount use_count INT NOT NULL DEFAULT '0';
ALTER TABLE mai2_item_charge
CHANGE COLUMN chargeId charge_id INT NOT NULL,
CHANGE COLUMN purchaseDate purchase_date TIMESTAMP NOT NULL,
CHANGE COLUMN validDate valid_date TIMESTAMP NOT NULL;

View File

@ -0,0 +1,3 @@
SET FOREIGN_KEY_CHECKS=0;
ALTER TABLE mai2_playlog ADD trialPlayAchievement INT NULL;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1,31 @@
ALTER TABLE mai2_profile_option
DROP COLUMN tapSe;
ALTER TABLE mai2_score_best
DROP COLUMN extNum1;
ALTER TABLE mai2_profile_extend
DROP COLUMN playStatusSetting;
ALTER TABLE mai2_playlog
DROP COLUMN extNum4;
ALTER TABLE mai2_static_event
DROP COLUMN startDate;
ALTER TABLE mai2_item_map
CHANGE COLUMN mapId map_id INT NOT NULL,
CHANGE COLUMN isLock is_lock BOOLEAN NOT NULL DEFAULT 0,
CHANGE COLUMN isClear is_clear BOOLEAN NOT NULL DEFAULT 0,
CHANGE COLUMN isComplete is_complete BOOLEAN NOT NULL DEFAULT 0;
ALTER TABLE mai2_item_friend_season_ranking
CHANGE COLUMN seasonId season_id INT NOT NULL,
CHANGE COLUMN rewardGet reward_get BOOLEAN NOT NULL,
CHANGE COLUMN userName user_name VARCHAR(8) NOT NULL,
CHANGE COLUMN recordDate record_date VARCHAR(255) NOT NULL;
ALTER TABLE mai2_item_login_bonus
CHANGE COLUMN bonusId bonus_id INT NOT NULL,
CHANGE COLUMN isCurrent is_current BOOLEAN NOT NULL DEFAULT 0,
CHANGE COLUMN isComplete is_complete BOOLEAN NOT NULL DEFAULT 0;

View File

@ -0,0 +1,21 @@
ALTER TABLE mai2_item_card
CHANGE COLUMN card_id cardId INT NOT NULL AFTER user,
CHANGE COLUMN card_kind cardTypeId INT NOT NULL,
CHANGE COLUMN chara_id charaId INT NOT NULL,
CHANGE COLUMN map_id mapId INT NOT NULL,
CHANGE COLUMN start_date startDate TIMESTAMP NULL DEFAULT '2018-01-01 00:00:00',
CHANGE COLUMN end_date endDate TIMESTAMP NULL DEFAULT '2038-01-01 00:00:00';
ALTER TABLE mai2_item_item
CHANGE COLUMN item_id itemId INT NOT NULL AFTER user,
CHANGE COLUMN item_kind itemKind INT NOT NULL,
CHANGE COLUMN is_valid isValid TINYINT(1) NOT NULL DEFAULT '1';
ALTER TABLE mai2_item_character
CHANGE COLUMN character_id characterId INT NOT NULL,
CHANGE COLUMN use_count useCount INT NOT NULL DEFAULT '0';
ALTER TABLE mai2_item_charge
CHANGE COLUMN charge_id chargeId INT NOT NULL,
CHANGE COLUMN purchase_date purchaseDate TIMESTAMP NOT NULL,
CHANGE COLUMN valid_date validDate TIMESTAMP NOT NULL;

View File

@ -0,0 +1,3 @@
ALTER TABLE mai2_item_card
CHANGE COLUMN startDate startDate TIMESTAMP DEFAULT "2018-01-01 00:00:00.0",
CHANGE COLUMN endDate endDate TIMESTAMP DEFAULT "2038-01-01 00:00:00.0";

View File

@ -0,0 +1,31 @@
ALTER TABLE mai2_profile_option
ADD COLUMN tapSe INT NOT NULL DEFAULT 0 AFTER tapDesign;
ALTER TABLE mai2_score_best
ADD COLUMN extNum1 INT NOT NULL DEFAULT 0;
ALTER TABLE mai2_profile_extend
ADD COLUMN playStatusSetting INT NOT NULL DEFAULT 0;
ALTER TABLE mai2_playlog
ADD COLUMN extNum4 INT NOT NULL DEFAULT 0;
ALTER TABLE mai2_static_event
ADD COLUMN startDate TIMESTAMP NOT NULL DEFAULT current_timestamp();
ALTER TABLE mai2_item_map
CHANGE COLUMN map_id mapId INT NOT NULL,
CHANGE COLUMN is_lock isLock BOOLEAN NOT NULL DEFAULT 0,
CHANGE COLUMN is_clear isClear BOOLEAN NOT NULL DEFAULT 0,
CHANGE COLUMN is_complete isComplete BOOLEAN NOT NULL DEFAULT 0;
ALTER TABLE mai2_item_friend_season_ranking
CHANGE COLUMN season_id seasonId INT NOT NULL,
CHANGE COLUMN reward_get rewardGet BOOLEAN NOT NULL,
CHANGE COLUMN user_name userName VARCHAR(8) NOT NULL,
CHANGE COLUMN record_date recordDate TIMESTAMP NOT NULL;
ALTER TABLE mai2_item_login_bonus
CHANGE COLUMN bonus_id bonusId INT NOT NULL,
CHANGE COLUMN is_current isCurrent BOOLEAN NOT NULL DEFAULT 0,
CHANGE COLUMN is_complete isComplete BOOLEAN NOT NULL DEFAULT 0;

View File

@ -0,0 +1,78 @@
DELETE FROM mai2_static_event WHERE version < 13;
UPDATE mai2_static_event SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_static_music WHERE version < 13;
UPDATE mai2_static_music SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_static_ticket WHERE version < 13;
UPDATE mai2_static_ticket SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_static_cards WHERE version < 13;
UPDATE mai2_static_cards SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_profile_detail WHERE version < 13;
UPDATE mai2_profile_detail SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_profile_extend WHERE version < 13;
UPDATE mai2_profile_extend SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_profile_option WHERE version < 13;
UPDATE mai2_profile_option SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_profile_ghost WHERE version < 13;
UPDATE mai2_profile_ghost SET version = version - 13 WHERE version >= 13;
DELETE FROM mai2_profile_rating WHERE version < 13;
UPDATE mai2_profile_rating SET version = version - 13 WHERE version >= 13;
DROP TABLE maimai_score_best;
DROP TABLE maimai_playlog;
DROP TABLE maimai_profile_detail;
DROP TABLE maimai_profile_option;
DROP TABLE maimai_profile_web_option;
DROP TABLE maimai_profile_grade_status;
ALTER TABLE mai2_item_character DROP COLUMN point;
ALTER TABLE mai2_item_card MODIFY COLUMN cardId int(11) NOT NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN cardTypeId int(11) NOT NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN charaId int(11) NOT NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN mapId int(11) NOT NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN characterId int(11) NOT NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN level int(11) NOT NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN awakening int(11) NOT NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN useCount int(11) NOT NULL;
ALTER TABLE mai2_item_charge MODIFY COLUMN chargeId int(11) NOT NULL;
ALTER TABLE mai2_item_charge MODIFY COLUMN stock int(11) NOT NULL;
ALTER TABLE mai2_item_favorite MODIFY COLUMN itemKind int(11) NOT NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN seasonId int(11) NOT NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN point int(11) NOT NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN `rank` int(11) NOT NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN rewardGet tinyint(1) NOT NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN userName varchar(8) NOT NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN itemId int(11) NOT NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN itemKind int(11) NOT NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN stock int(11) NOT NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN isValid tinyint(1) NOT NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN bonusId int(11) NOT NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN point int(11) NOT NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN isCurrent tinyint(1) NOT NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN isComplete tinyint(1) NOT NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN mapId int(11) NOT NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN distance int(11) NOT NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN isLock tinyint(1) NOT NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN isClear tinyint(1) NOT NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN isComplete tinyint(1) NOT NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN printDate timestamp DEFAULT current_timestamp() NOT NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN serialId varchar(20) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NOT NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN placeId int(11) NOT NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN clientId varchar(11) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NOT NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN printerSerialId varchar(20) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NOT NULL;

View File

@ -0,0 +1,3 @@
ALTER TABLE mai2_item_card
CHANGE COLUMN startDate startDate TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
CHANGE COLUMN endDate endDate TIMESTAMP NOT NULL;

View File

@ -0,0 +1 @@
DROP TABLE aime.mai2_profile_consec_logins;

View File

@ -0,0 +1,62 @@
UPDATE mai2_static_event SET version = version + 13 WHERE version < 1000;
UPDATE mai2_static_music SET version = version + 13 WHERE version < 1000;
UPDATE mai2_static_ticket SET version = version + 13 WHERE version < 1000;
UPDATE mai2_static_cards SET version = version + 13 WHERE version < 1000;
UPDATE mai2_profile_detail SET version = version + 13 WHERE version < 1000;
UPDATE mai2_profile_extend SET version = version + 13 WHERE version < 1000;
UPDATE mai2_profile_option SET version = version + 13 WHERE version < 1000;
UPDATE mai2_profile_ghost SET version = version + 13 WHERE version < 1000;
UPDATE mai2_profile_rating SET version = version + 13 WHERE version < 1000;
ALTER TABLE mai2_item_character ADD point int(11) NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN cardId int(11) NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN cardTypeId int(11) NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN charaId int(11) NULL;
ALTER TABLE mai2_item_card MODIFY COLUMN mapId int(11) NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN characterId int(11) NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN level int(11) NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN awakening int(11) NULL;
ALTER TABLE mai2_item_character MODIFY COLUMN useCount int(11) NULL;
ALTER TABLE mai2_item_charge MODIFY COLUMN chargeId int(11) NULL;
ALTER TABLE mai2_item_charge MODIFY COLUMN stock int(11) NULL;
ALTER TABLE mai2_item_favorite MODIFY COLUMN itemKind int(11) NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN seasonId int(11) NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN point int(11) NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN `rank` int(11) NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN rewardGet tinyint(1) NULL;
ALTER TABLE mai2_item_friend_season_ranking MODIFY COLUMN userName varchar(8) NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN itemId int(11) NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN itemKind int(11) NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN stock int(11) NULL;
ALTER TABLE mai2_item_item MODIFY COLUMN isValid tinyint(1) NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN bonusId int(11) NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN point int(11) NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN isCurrent tinyint(1) NULL;
ALTER TABLE mai2_item_login_bonus MODIFY COLUMN isComplete tinyint(1) NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN mapId int(11) NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN distance int(11) NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN isLock tinyint(1) NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN isClear tinyint(1) NULL;
ALTER TABLE mai2_item_map MODIFY COLUMN isComplete tinyint(1) NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN printDate timestamp DEFAULT current_timestamp() NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN serialId varchar(20) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN placeId int(11) NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN clientId varchar(11) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL;
ALTER TABLE mai2_item_print_detail MODIFY COLUMN printerSerialId varchar(20) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL;

View File

@ -0,0 +1,10 @@
ALTER TABLE mai2_profile_detail
DROP COLUMN mapStock;
ALTER TABLE mai2_profile_extend
DROP COLUMN selectResultScoreViewType;
ALTER TABLE mai2_profile_option
DROP COLUMN outFrameType,
DROP COLUMN touchVolume,
DROP COLUMN breakSlideVolume;

View File

@ -0,0 +1,9 @@
CREATE TABLE `mai2_profile_consec_logins` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`user` int(11) NOT NULL,
`version` int(11) NOT NULL,
`logins` int(11) DEFAULT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `mai2_profile_consec_logins_uk` (`user`,`version`),
CONSTRAINT `mai2_profile_consec_logins_ibfk_1` FOREIGN KEY (`user`) REFERENCES `aime_user` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_general_ci;

View File

@ -0,0 +1,10 @@
ALTER TABLE mai2_profile_detail
ADD mapStock INT NULL AFTER playCount;
ALTER TABLE mai2_profile_extend
ADD selectResultScoreViewType INT NULL AFTER selectResultDetails;
ALTER TABLE mai2_profile_option
ADD outFrameType INT NULL AFTER dispCenter,
ADD touchVolume INT NULL AFTER slideVolume,
ADD breakSlideVolume INT NULL AFTER slideVolume;

View File

@ -0,0 +1,2 @@
SET FOREIGN_KEY_CHECKS=0;
SET FOREIGN_KEY_CHECKS=1;

View File

@ -0,0 +1 @@
ALTER TABLE wacca_profile DROP COLUMN playcount_time_free;

View File

@ -0,0 +1 @@
DELETE FROM wacca_item WHERE type=17 AND item_id=312002;

View File

@ -0,0 +1 @@
ALTER TABLE wacca_profile ADD playcount_time_free int(11) DEFAULT 0 NULL AFTER playcount_stageup;

460
core/frontend.py Normal file
View File

@ -0,0 +1,460 @@
import logging, coloredlogs
from typing import Any, Dict, List
from twisted.web import resource
from twisted.web.util import redirectTo
from twisted.web.http import Request
from logging.handlers import TimedRotatingFileHandler
from twisted.web.server import Session
from zope.interface import Interface, Attribute, implementer
from twisted.python.components import registerAdapter
import jinja2
import bcrypt
import re
from enum import Enum
from urllib import parse
from core import CoreConfig, Utils
from core.data import Data
class IUserSession(Interface):
userId = Attribute("User's ID")
current_ip = Attribute("User's current ip address")
permissions = Attribute("User's permission level")
ongeki_version = Attribute("User's selected Ongeki Version")
class PermissionOffset(Enum):
USER = 0 # Regular user
USERMOD = 1 # Can moderate other users
ACMOD = 2 # Can add arcades and cabs
SYSADMIN = 3 # Can change settings
# 4 - 6 reserved for future use
OWNER = 7 # Can do anything
@implementer(IUserSession)
class UserSession(object):
def __init__(self, session):
self.userId = 0
self.current_ip = "0.0.0.0"
self.permissions = 0
self.ongeki_version = 7
class FrontendServlet(resource.Resource):
def getChild(self, name: bytes, request: Request):
self.logger.debug(f"{Utils.get_ip_addr(request)} -> {name.decode()}")
if name == b"":
return self
return resource.Resource.getChild(self, name, request)
def __init__(self, cfg: CoreConfig, config_dir: str) -> None:
self.config = cfg
log_fmt_str = "[%(asctime)s] Frontend | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
self.logger = logging.getLogger("frontend")
self.environment = jinja2.Environment(loader=jinja2.FileSystemLoader("."))
self.game_list: List[Dict[str, str]] = []
self.children: Dict[str, Any] = {}
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "frontend"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(cfg.frontend.loglevel)
coloredlogs.install(
level=cfg.frontend.loglevel, logger=self.logger, fmt=log_fmt_str
)
registerAdapter(UserSession, Session, IUserSession)
fe_game = FE_Game(cfg, self.environment)
games = Utils.get_all_titles()
for game_dir, game_mod in games.items():
if hasattr(game_mod, "frontend"):
try:
game_fe = game_mod.frontend(cfg, self.environment, config_dir)
self.game_list.append({"url": game_dir, "name": game_fe.nav_name})
fe_game.putChild(game_dir.encode(), game_fe)
except Exception as e:
self.logger.error(
f"Failed to import frontend from {game_dir} because {e}"
)
self.environment.globals["game_list"] = self.game_list
self.putChild(b"gate", FE_Gate(cfg, self.environment))
self.putChild(b"user", FE_User(cfg, self.environment))
self.putChild(b"sys", FE_System(cfg, self.environment))
self.putChild(b"arcade", FE_Arcade(cfg, self.environment))
self.putChild(b"cab", FE_Machine(cfg, self.environment))
self.putChild(b"game", fe_game)
self.logger.info(
f"Ready on port {self.config.frontend.port} serving {len(fe_game.children)} games"
)
def render_GET(self, request):
self.logger.debug(f"{Utils.get_ip_addr(request)} -> {request.uri.decode()}")
template = self.environment.get_template("core/frontend/index.jinja")
return template.render(
server_name=self.config.server.name,
title=self.config.server.name,
game_list=self.game_list,
sesh=vars(IUserSession(request.getSession())),
).encode("utf-16")
class FE_Base(resource.Resource):
"""
A Generic skeleton class that all frontend handlers should inherit from
Initializes the environment, data, logger, config, and sets isLeaf to true
It is expected that game implementations of this class override many of these
"""
isLeaf = True
def __init__(self, cfg: CoreConfig, environment: jinja2.Environment) -> None:
self.core_config = cfg
self.data = Data(cfg)
self.logger = logging.getLogger("frontend")
self.environment = environment
self.nav_name = "nav_name"
class FE_Gate(FE_Base):
def render_GET(self, request: Request):
self.logger.debug(f"{Utils.get_ip_addr(request)} -> {request.uri.decode()}")
uri: str = request.uri.decode()
sesh = request.getSession()
usr_sesh = IUserSession(sesh)
if usr_sesh.userId > 0:
return redirectTo(b"/user", request)
if uri.startswith("/gate/create"):
return self.create_user(request)
if b"e" in request.args:
try:
err = int(request.args[b"e"][0].decode())
except Exception:
err = 0
else:
err = 0
template = self.environment.get_template("core/frontend/gate/gate.jinja")
return template.render(
title=f"{self.core_config.server.name} | Login Gate",
error=err,
sesh=vars(usr_sesh),
).encode("utf-16")
def render_POST(self, request: Request):
uri = request.uri.decode()
ip = Utils.get_ip_addr(request)
if uri == "/gate/gate.login":
access_code: str = request.args[b"access_code"][0].decode()
passwd: bytes = request.args[b"passwd"][0]
if passwd == b"":
passwd = None
uid = self.data.card.get_user_id_from_card(access_code)
user = self.data.user.get_user(uid)
if uid is None:
return redirectTo(b"/gate?e=1", request)
if passwd is None:
sesh = self.data.user.check_password(uid)
if sesh is not None:
return redirectTo(
f"/gate/create?ac={access_code}".encode(), request
)
return redirectTo(b"/gate?e=1", request)
if not self.data.user.check_password(uid, passwd):
return redirectTo(b"/gate?e=1", request)
self.logger.info(f"Successful login of user {uid} at {ip}")
sesh = request.getSession()
usr_sesh = IUserSession(sesh)
usr_sesh.userId = uid
usr_sesh.current_ip = ip
usr_sesh.permissions = user['permissions']
return redirectTo(b"/user", request)
elif uri == "/gate/gate.create":
access_code: str = request.args[b"access_code"][0].decode()
username: str = request.args[b"username"][0].decode()
email: str = request.args[b"email"][0].decode()
passwd: bytes = request.args[b"passwd"][0]
uid = self.data.card.get_user_id_from_card(access_code)
if uid is None:
return redirectTo(b"/gate?e=1", request)
salt = bcrypt.gensalt()
hashed = bcrypt.hashpw(passwd, salt)
result = self.data.user.create_user(
uid, username, email.lower(), hashed.decode(), 1
)
if result is None:
return redirectTo(b"/gate?e=3", request)
if not self.data.user.check_password(uid, passwd):
return redirectTo(b"/gate", request)
return redirectTo(b"/user", request)
else:
return b""
def create_user(self, request: Request):
if b"ac" not in request.args or len(request.args[b"ac"][0].decode()) != 20:
return redirectTo(b"/gate?e=2", request)
ac = request.args[b"ac"][0].decode()
card = self.data.card.get_card_by_access_code(ac)
if card is None:
return redirectTo(b"/gate?e=1", request)
user = self.data.user.get_user(card['user'])
if user is None:
self.logger.warning(f"Card {ac} exists with no/invalid associated user ID {card['user']}")
return redirectTo(b"/gate?e=0", request)
if user['password'] is not None:
return redirectTo(b"/gate?e=1", request)
template = self.environment.get_template("core/frontend/gate/create.jinja")
return template.render(
title=f"{self.core_config.server.name} | Create User",
code=ac,
sesh={"userId": 0, "permissions": 0},
).encode("utf-16")
class FE_User(FE_Base):
def render_GET(self, request: Request):
uri = request.uri.decode()
template = self.environment.get_template("core/frontend/user/index.jinja")
sesh: Session = request.getSession()
usr_sesh = IUserSession(sesh)
if usr_sesh.userId == 0:
return redirectTo(b"/gate", request)
m = re.match("\/user\/(\d*)", uri)
if m is not None:
usrid = m.group(1)
if usr_sesh.permissions < 1 << PermissionOffset.USERMOD.value and int(usrid) != usr_sesh.userId:
return redirectTo(b"/user", request)
else:
usrid = usr_sesh.userId
user = self.data.user.get_user(usrid)
if user is None:
return redirectTo(b"/user", request)
cards = self.data.card.get_user_cards(usrid)
arcades = self.data.arcade.get_arcades_managed_by_user(usrid)
card_data = []
arcade_data = []
for c in cards:
if c['is_locked']:
status = 'Locked'
elif c['is_banned']:
status = 'Banned'
else:
status = 'Active'
card_data.append({'access_code': c['access_code'], 'status': status})
for a in arcades:
arcade_data.append({'id': a['id'], 'name': a['name']})
return template.render(
title=f"{self.core_config.server.name} | Account",
sesh=vars(usr_sesh),
cards=card_data,
username=user['username'],
arcades=arcade_data
).encode("utf-16")
def render_POST(self, request: Request):
pass
class FE_System(FE_Base):
def render_GET(self, request: Request):
uri = request.uri.decode()
template = self.environment.get_template("core/frontend/sys/index.jinja")
usrlist: List[Dict] = []
aclist: List[Dict] = []
cablist: List[Dict] = []
sesh: Session = request.getSession()
usr_sesh = IUserSession(sesh)
if usr_sesh.userId == 0 or usr_sesh.permissions < 1 << PermissionOffset.USERMOD.value:
return redirectTo(b"/gate", request)
if uri.startswith("/sys/lookup.user?"):
uri_parse = parse.parse_qs(uri.replace("/sys/lookup.user?", "")) # lop off the first bit
uid_search = uri_parse.get("usrId")
email_search = uri_parse.get("usrEmail")
uname_search = uri_parse.get("usrName")
if uid_search is not None:
u = self.data.user.get_user(uid_search[0])
if u is not None:
usrlist.append(u._asdict())
elif email_search is not None:
u = self.data.user.find_user_by_email(email_search[0])
if u is not None:
usrlist.append(u._asdict())
elif uname_search is not None:
ul = self.data.user.find_user_by_username(uname_search[0])
for u in ul:
usrlist.append(u._asdict())
elif uri.startswith("/sys/lookup.arcade?"):
uri_parse = parse.parse_qs(uri.replace("/sys/lookup.arcade?", "")) # lop off the first bit
ac_id_search = uri_parse.get("arcadeId")
ac_name_search = uri_parse.get("arcadeName")
ac_user_search = uri_parse.get("arcadeUser")
ac_ip_search = uri_parse.get("arcadeIp")
if ac_id_search is not None:
u = self.data.arcade.get_arcade(ac_id_search[0])
if u is not None:
aclist.append(u._asdict())
elif ac_name_search is not None:
ul = self.data.arcade.get_arcade_by_name(ac_name_search[0])
if ul is not None:
for u in ul:
aclist.append(u._asdict())
elif ac_user_search is not None:
ul = self.data.arcade.get_arcades_managed_by_user(ac_user_search[0])
if ul is not None:
for u in ul:
aclist.append(u._asdict())
elif ac_ip_search is not None:
ul = self.data.arcade.get_arcades_by_ip(ac_ip_search[0])
if ul is not None:
for u in ul:
aclist.append(u._asdict())
elif uri.startswith("/sys/lookup.cab?"):
uri_parse = parse.parse_qs(uri.replace("/sys/lookup.cab?", "")) # lop off the first bit
cab_id_search = uri_parse.get("cabId")
cab_serial_search = uri_parse.get("cabSerial")
cab_acid_search = uri_parse.get("cabAcId")
if cab_id_search is not None:
u = self.data.arcade.get_machine(id=cab_id_search[0])
if u is not None:
cablist.append(u._asdict())
elif cab_serial_search is not None:
u = self.data.arcade.get_machine(serial=cab_serial_search[0])
if u is not None:
cablist.append(u._asdict())
elif cab_acid_search is not None:
ul = self.data.arcade.get_arcade_machines(cab_acid_search[0])
for u in ul:
cablist.append(u._asdict())
return template.render(
title=f"{self.core_config.server.name} | System",
sesh=vars(usr_sesh),
usrlist=usrlist,
aclist=aclist,
cablist=cablist,
).encode("utf-16")
class FE_Game(FE_Base):
isLeaf = False
children: Dict[str, Any] = {}
def getChild(self, name: bytes, request: Request):
if name == b"":
return self
return resource.Resource.getChild(self, name, request)
def render_GET(self, request: Request) -> bytes:
return redirectTo(b"/user", request)
class FE_Arcade(FE_Base):
def render_GET(self, request: Request):
uri = request.uri.decode()
template = self.environment.get_template("core/frontend/arcade/index.jinja")
managed = []
sesh: Session = request.getSession()
usr_sesh = IUserSession(sesh)
if usr_sesh.userId == 0:
return redirectTo(b"/gate", request)
m = re.match("\/arcade\/(\d*)", uri)
if m is not None:
arcadeid = m.group(1)
perms = self.data.arcade.get_manager_permissions(usr_sesh.userId, arcadeid)
arcade = self.data.arcade.get_arcade(arcadeid)
if perms is None:
perms = 0
else:
return redirectTo(b"/user", request)
return template.render(
title=f"{self.core_config.server.name} | Arcade",
sesh=vars(usr_sesh),
error=0,
perms=perms,
arcade=arcade._asdict()
).encode("utf-16")
class FE_Machine(FE_Base):
def render_GET(self, request: Request):
uri = request.uri.decode()
template = self.environment.get_template("core/frontend/machine/index.jinja")
sesh: Session = request.getSession()
usr_sesh = IUserSession(sesh)
if usr_sesh.userId == 0:
return redirectTo(b"/gate", request)
return template.render(
title=f"{self.core_config.server.name} | Machine",
sesh=vars(usr_sesh),
arcade={},
error=0,
).encode("utf-16")
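The `permissions` value carried on a user session is a plain integer bitmask: each `PermissionOffset` member names a bit position, and the frontend's checks (for example `usr_sesh.permissions < 1 << PermissionOffset.USERMOD.value`) compare the whole mask numerically against a single bit's value. The sketch below is illustrative only and is not part of this diff; it shows how such a mask could be built and tested, reusing the offsets defined in this file:

```
from enum import Enum

class PermissionOffset(Enum):
    USER = 0      # Regular user
    USERMOD = 1   # Can moderate other users
    ACMOD = 2     # Can add arcades and cabs
    SYSADMIN = 3  # Can change settings
    OWNER = 7     # Can do anything

def grant(mask: int, perm: PermissionOffset) -> int:
    # Set the bit for the given permission in the mask.
    return mask | (1 << perm.value)

def has_permission(mask: int, perm: PermissionOffset) -> bool:
    # Strict per-bit test for a single permission.
    return bool(mask & (1 << perm.value))

mask = grant(0, PermissionOffset.USERMOD)            # a user moderator -> mask == 2
print(has_permission(mask, PermissionOffset.USERMOD))      # True
# The threshold-style comparison used by FE_User/FE_System above:
print(mask >= 1 << PermissionOffset.USERMOD.value)         # True
```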

View File

@ -0,0 +1,4 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>{{ arcade.name }}</h1>
{% endblock content %}

View File

@ -0,0 +1,24 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>Create User</h1>
<form id="create" style="max-width: 240px; min-width: 10%;" action="/gate/gate.create" method="post">
<div class="form-group row">
<label for="access_code">Card Access Code</label><br>
<input class="form-control" name="access_code" id="access_code" type="text" placeholder="00000000000000000000" value={{ code }} maxlength="20" readonly>
</div>
<div class="form-group row">
<label for="username">Username</label><br>
<input id="username" class="form-control" name="username" type="text" placeholder="username">
</div>
<div class="form-group row">
<label for="email">Email</label><br>
<input id="email" class="form-control" name="email" type="email" placeholder="example@example.com">
</div>
<div class="form-group row">
<label for="passwd">Password</label><br>
<input id="passwd" class="form-control" name="passwd" type="password" placeholder="password">
</div>
<p></p>
<input id="submit" class="btn btn-primary" style="display: block; margin: 0 auto;" type="submit" value="Create">
</form>
{% endblock content %}

View File

@ -0,0 +1,32 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>Gate</h1>
{% include "core/frontend/widgets/err_banner.jinja" %}
<style>
/* Chrome, Safari, Edge, Opera */
input::-webkit-outer-spin-button,
input::-webkit-inner-spin-button {
-webkit-appearance: none;
margin: 0;
}
/* Firefox */
input[type=number] {
-moz-appearance: textfield;
}
</style>
<form id="login" style="max-width: 240px; min-width: 10%;" action="/gate/gate.login" method="post">
<div class="form-group row">
<label for="access_code">Card Access Code</label><br>
<input form="login" class="form-control" name="access_code" id="access_code" type="number" placeholder="00000000000000000000" maxlength="20" required>
</div>
<div class="form-group row">
<label for="passwd">Password</label><br>
<input id="passwd" class="form-control" name="passwd" type="password" placeholder="password">
</div>
<p></p>
<input id="submit" class="btn btn-primary" style="display: block; margin: 0 auto;" form="login" type="submit" value="Login">
</form>
<h6>*To register for the webui, type in the access code of your card, as shown in a game, and leave the password field blank.</h6>
<h6>*If you have not registered a card with this server, you cannot create a webui account.</h6>
{% endblock content %}

92
core/frontend/index.jinja Normal file
View File

@ -0,0 +1,92 @@
<!DOCTYPE html>
<html>
<head>
<title>{{ title }}</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.2.3/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-rbsA2VBKQhggwzxH7pPCaAqO46MgnOM80zW1RWuH61DGLwZJEdK2Kadq2F9CUG65" crossorigin="anonymous">
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.2.3/dist/js/bootstrap.bundle.min.js" integrity="sha384-kenU1KFdBIe4zVF0s0G1M5b4hcpxyD9F7jL+jjXkk+Q2h455rYXK/7HAuoJl+0I4" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/jquery@3.2.1/dist/jquery.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>
<style>
html {
background-color: #181a1b !important;
margin: 10px;
}
html {
color-scheme: dark !important;
}
html, body, input, textarea, select, button, dialog {
background-color: #181a1b;
}
html, body, input, textarea, select, button {
border-color: #736b5e;
color: #e8e6e3;
}
a {
color: #3391ff;
}
table {
border-color: #545b5e;
}
::placeholder {
color: #b2aba1;
}
input:-webkit-autofill,
textarea:-webkit-autofill,
select:-webkit-autofill {
background-color: #404400 !important;
color: #e8e6e3 !important;
}
::-webkit-scrollbar {
background-color: #202324;
color: #aba499;
}
::-webkit-scrollbar-thumb {
background-color: #454a4d;
}
::-webkit-scrollbar-thumb:hover {
background-color: #575e62;
}
::-webkit-scrollbar-thumb:active {
background-color: #484e51;
}
::-webkit-scrollbar-corner {
background-color: #181a1b;
}
* {
scrollbar-color: #454a4d #202324;
}
::selection {
background-color: #004daa !important;
color: #e8e6e3 !important;
}
::-moz-selection {
background-color: #004daa !important;
color: #e8e6e3 !important;
}
input[type="text"], input[type="text"]:focus, input[type="password"], input[type="password"]:focus, input[type="email"], input[type="email"]:focus {
background-color: #202324 !important;
color: #e8e6e3;
}
form {
outline: 1px solid grey;
padding: 20px;
padding-top: 10px;
padding-bottom: 10px;
}
.err-banner {
background-color: #AA0000;
padding: 20px;
margin-bottom: 10px;
width: 15%;
}
.modal-content {
background-color: #181a1b;
}
</style>
</head>
<body>
{% include "core/frontend/widgets/topbar.jinja" %}
{% block content %}
<h1>{{ server_name }}</h1>
{% endblock content %}
</body>
</html>

View File

@ -0,0 +1,5 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
{% include "core/frontend/widgets/err_banner.jinja" %}
<h1>Machine Management</h1>
{% endblock content %}

View File

@ -0,0 +1,103 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>System Management</h1>
<div class="row" id="rowForm">
{% if sesh.permissions >= 2 %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="usrLookup" name="usrLookup" action="/sys/lookup.user" class="form-inline">
<h3>User Search</h3>
<div class="form-group">
<label for="usrId">User ID</label>
<input type="number" class="form-control" id="usrId" name="usrId">
</div>
OR
<div class="form-group">
<label for="usrName">Username</label>
<input type="text" class="form-control" id="usrName" name="usrName">
</div>
OR
<div class="form-group">
<label for="usrEmail">Email address</label>
<input type="email" class="form-control" id="usrEmail" name="usrEmail" aria-describedby="emailHelp">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
{% endif %}
{% if sesh.permissions >= 4 %}
<div class="col-sm-6" style="max-width: 25%;">
<form id="arcadeLookup" name="arcadeLookup" action="/sys/lookup.arcade" class="form-inline" >
<h3>Arcade Search</h3>
<div class="form-group">
<label for="arcadeId">Arcade ID</label>
<input type="number" class="form-control" id="arcadeId" name="arcadeId">
</div>
OR
<div class="form-group">
<label for="arcadeName">Arcade Name</label>
<input type="text" class="form-control" id="arcadeName" name="arcadeName">
</div>
OR
<div class="form-group">
<label for="arcadeUser">Owner User ID</label>
<input type="number" class="form-control" id="arcadeUser" name="arcadeUser">
</div>
OR
<div class="form-group">
<label for="arcadeIp">Assigned IP Address</label>
<input type="text" class="form-control" id="arcadeIp" name="arcadeIp">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
<div class="col-sm-6" style="max-width: 25%;">
<form id="cabLookup" name="cabLookup" action="/sys/lookup.cab" class="form-inline" >
<h3>Machine Search</h3>
<div class="form-group">
<label for="cabId">Machine ID</label>
<input type="number" class="form-control" id="cabId" name="cabId">
</div>
OR
<div class="form-group">
<label for="cabSerial">Machine Serial</label>
<input type="text" class="form-control" id="cabSerial" name="cabSerial">
</div>
OR
<div class="form-group">
<label for="cabAcId">Arcade ID</label>
<input type="number" class="form-control" id="cabAcId" name="cabAcId">
</div>
<br />
<button type="submit" class="btn btn-primary">Search</button>
</form>
</div>
{% endif %}
</div>
<div class="row" id="rowResult" style="margin: 10px;">
{% if sesh.permissions >= 2 %}
<div id="userSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for usr in usrlist %}
<a href=/user/{{ usr.id }}><pre>{{ usr.id }} | {{ usr.username if usr.username != None else "<i>No Name Set</i>"}}</pre></a>
{% endfor %}
</div>
{% endif %}
{% if sesh.permissions >= 4 %}
<div id="arcadeSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for ac in aclist %}
<a href=/arcade/{{ ac.id }}><pre>{{ ac.id }} | {{ ac.name if ac.name != None else "<i>No Name Set</i>" }} | {{ ac.ip if ac.ip != None else "<i>No IP Assigned</i>"}}</pre></a>
{% endfor %}
</div>
<div id="cabSearchResult" class="col-sm-6" style="max-width: 25%;">
{% for cab in cablist %}
<a href=/cab/{{ cab.id }}><pre>{{ cab.id }} | {{ cab.game if cab.game != None else "<i>ANY </i>" }} | {{ cab.serial }}</pre></a>
{% endfor %}
</div>
{% endif %}
</div>
<div class="row" id="rowAdd">
</div>
{% endblock content %}

View File

@ -0,0 +1,41 @@
{% extends "core/frontend/index.jinja" %}
{% block content %}
<h1>Management for {{ username }}</h1>
<h2>Cards <button class="btn btn-success" data-bs-toggle="modal" data-bs-target="#card_add">Add</button></h2>
<ul style="font-size: 20px;">
{% for c in cards %}
<li>{{ c.access_code }}: {{ c.status }}&nbsp;{% if c.status == 'Active'%}<button class="btn-warning btn">Lock</button>{% elif c.status == 'Locked' %}<button class="btn-warning btn">Unlock</button>{% endif %}&nbsp;<button class="btn-danger btn">Delete</button></li>
{% endfor %}
</ul>
{% if arcades is defined %}
<h2>Arcades</h2>
<ul style="font-size: 20px;">
{% for a in arcades %}
<li><a href=/arcade/{{ a.id }}>{{ a.name }}</a></li>
{% endfor %}
</ul>
{% endif %}
<div class="modal fade" id="card_add" tabindex="-1" aria-labelledby="card_add_label" aria-hidden="true">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h1 class="modal-title fs-5" id="card_add_label">Add Card</h1>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
HOW TO:<br>
Scan your card on any networked game, press the "View Access Code" button (varies by game), then enter the 20-digit code below.<br>
!!FOR AMUSEMENT IC CARDS: DO NOT ENTER THE CODE PRINTED ON THE BACK OF THE CARD ITSELF OR IT WILL NOT WORK!!
<p /><label for="card_add_frm_access_code">Access Code:&nbsp;</label><input id="card_add_frm_access_code" maxlength="20" type="text" required>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-primary">Add</button>
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
</div>
{% endblock content %}

View File

@ -0,0 +1,18 @@
{% if error > 0 %}
<div class="err-banner">
<h3>Error</h3>
{% if error == 1 %}
Card not registered, or wrong password
{% elif error == 2 %}
Missing or malformed access code
{% elif error == 3 %}
Failed to create user
{% elif error == 4 %}
Arcade not found
{% elif error == 5 %}
Machine not found
{% else %}
An unknown error occurred
{% endif %}
</div>
{% endif %}

View File

@ -0,0 +1,21 @@
<div style="background: #333; color: #f9f9f9; width: 10%; height: 50px; line-height: 50px; text-align: center; float: left;">
Navigation
</div>
<div style="background: #333; color: #f9f9f9; width: 80%; height: 50px; line-height: 50px; padding-left: 10px; float: left;">
<a href=/><button class="btn btn-primary">Home</button></a>&nbsp;
{% for game in game_list %}
<a href=/game/{{ game.url }}><button class="btn btn-success">{{ game.name }}</button></a>&nbsp;
{% endfor %}
</div>
</div>
<div style="background: #333; color: #f9f9f9; width: 10%; height: 50px; line-height: 50px; text-align: center; float: left;">
{% if sesh is defined and sesh["permissions"] >= 2 %}
<a href="/sys"><button class="btn btn-primary">System</button></a>
{% endif %}
{% if sesh is defined and sesh["userId"] > 0 %}
<a href="/user"><button class="btn btn-primary">Account</button></a>
{% else %}
<a href="/gate"><button class="btn btn-primary">Gate</button></a>
{% endif %}
</div>

View File

@ -1,115 +1,184 @@
from typing import Dict, Any, Optional
from typing import Dict, Any, Optional, List
import logging, coloredlogs
from logging.handlers import TimedRotatingFileHandler
from twisted.web import resource
from twisted.web.http import Request
from datetime import datetime
from Crypto.Cipher import Blowfish
import pytz
from core.config import CoreConfig
from .config import CoreConfig
from .utils import Utils
from .title import TitleServlet
class MuchaServlet:
def __init__(self, cfg: CoreConfig) -> None:
mucha_registry: List[str] = []
def __init__(self, cfg: CoreConfig, cfg_dir: str) -> None:
self.config = cfg
self.config_dir = cfg_dir
self.logger = logging.getLogger('mucha')
self.logger = logging.getLogger("mucha")
log_fmt_str = "[%(asctime)s] Mucha | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler("{0}/{1}.log".format(self.config.server.log_dir, "mucha"), when="d", backupCount=10)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "mucha"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(logging.INFO)
coloredlogs.install(level=logging.INFO, logger=self.logger, fmt=log_fmt_str)
def handle_boardauth(self, request: Request) -> bytes:
self.logger.setLevel(cfg.mucha.loglevel)
coloredlogs.install(level=cfg.mucha.loglevel, logger=self.logger, fmt=log_fmt_str)
for _, mod in TitleServlet.title_registry.items():
if hasattr(mod, "get_mucha_info"):
enabled, game_cd = mod.get_mucha_info(
self.config, self.config_dir
)
if enabled:
self.mucha_registry.append(game_cd)
self.logger.info(f"Serving {len(self.mucha_registry)} games")
def handle_boardauth(self, request: Request, _: Dict) -> bytes:
req_dict = self.mucha_preprocess(request.content.getvalue())
client_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(f"Error processing mucha request {request.content.getvalue()}")
return b""
self.logger.error(
f"Error processing mucha request {request.content.getvalue()}"
)
return b"RESULTS=000"
req = MuchaAuthRequest(req_dict)
self.logger.info(f"Mucha request {vars(req)}")
resp = MuchaAuthResponse(mucha_url=f"{self.config.mucha.hostname}:{self.config.mucha.port}")
self.logger.info(f"Mucha response {vars(resp)}")
self.logger.info(f"Boardauth request from {client_ip} for {req.gameVer}")
self.logger.debug(f"Mucha request {vars(req)}")
if req.gameCd not in self.mucha_registry:
self.logger.warning(f"Unknown gameCd {req.gameCd}")
return b"RESULTS=000"
# TODO: Decrypt S/N
b_key = b""
for x in range(8):
b_key += req.sendDate[(x - 1) & 7].encode()
cipher = Blowfish.new(b_key, Blowfish.MODE_ECB)
sn_decrypt = cipher.decrypt(bytes.fromhex(req.serialNum))
self.logger.debug(f"Decrypt SN to {sn_decrypt.hex()}")
resp = MuchaAuthResponse(
f"{self.config.mucha.hostname}{':' + str(self.config.allnet.port) if self.config.server.is_develop else ''}"
)
self.logger.debug(f"Mucha response {vars(resp)}")
return self.mucha_postprocess(vars(resp))
def handle_updatecheck(self, request: Request) -> bytes:
def handle_updatecheck(self, request: Request, _: Dict) -> bytes:
req_dict = self.mucha_preprocess(request.content.getvalue())
client_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(f"Error processing mucha request {request.content.getvalue()}")
return b""
self.logger.error(
f"Error processing mucha request {request.content.getvalue()}"
)
return b"RESULTS=000"
req = MuchaUpdateRequest(req_dict)
self.logger.info(f"Mucha request {vars(req)}")
resp = MuchaUpdateResponse(mucha_url=f"{self.config.mucha.hostname}:{self.config.mucha.port}")
self.logger.info(f"Mucha response {vars(resp)}")
self.logger.info(f"Updatecheck request from {client_ip} for {req.gameVer}")
self.logger.debug(f"Mucha request {vars(req)}")
if req.gameCd not in self.mucha_registry:
self.logger.warning(f"Unknown gameCd {req.gameCd}")
return b"RESULTS=000"
resp = MuchaUpdateResponse(req.gameVer, f"{self.config.mucha.hostname}{':' + str(self.config.allnet.port) if self.config.server.is_develop else ''}")
self.logger.debug(f"Mucha response {vars(resp)}")
return self.mucha_postprocess(vars(resp))
def handle_dlstate(self, request: Request, _: Dict) -> bytes:
req_dict = self.mucha_preprocess(request.content.getvalue())
client_ip = Utils.get_ip_addr(request)
if req_dict is None:
self.logger.error(
f"Error processing mucha request {request.content.getvalue()}"
)
return b""
req = MuchaDownloadStateRequest(req_dict)
self.logger.info(f"DownloadState request from {client_ip} for {req.gameCd} -> {req.updateVer}")
self.logger.debug(f"request {vars(req)}")
return b"RESULTS=001"
def mucha_preprocess(self, data: bytes) -> Optional[Dict]:
try:
ret: Dict[str, Any] = {}
for x in data.decode().split('&'):
kvp = x.split('=')
for x in data.decode().split("&"):
kvp = x.split("=")
if len(kvp) == 2:
ret[kvp[0]] = kvp[1]
return ret
except:
except Exception:
self.logger.error(f"Error processing mucha request {data}")
return None
def mucha_postprocess(self, data: dict) -> Optional[bytes]:
try:
urlencode = ""
for k,v in data.items():
urlencode += f"{k}={v}&"
urlencode = "&".join(f"{k}={v}" for k, v in data.items())
return urlencode.encode()
except:
except Exception:
self.logger.error("Error processing mucha response")
return None
class MuchaAuthRequest():
def __init__(self, request: Dict) -> None:
self.gameVer = "" if "gameVer" not in request else request["gameVer"]
self.sendDate = "" if "sendDate" not in request else request["sendDate"]
self.serialNum = "" if "serialNum" not in request else request["serialNum"]
self.gameCd = "" if "gameCd" not in request else request["gameCd"]
self.boardType = "" if "boardType" not in request else request["boardType"]
self.boardId = "" if "boardId" not in request else request["boardId"]
self.placeId = "" if "placeId" not in request else request["placeId"]
self.storeRouterIp = "" if "storeRouterIp" not in request else request["storeRouterIp"]
self.countryCd = "" if "countryCd" not in request else request["countryCd"]
self.useToken = "" if "useToken" not in request else request["useToken"]
self.allToken = "" if "allToken" not in request else request["allToken"]
class MuchaAuthResponse():
def __init__(self, mucha_url: str = "localhost") -> None:
self.RESULTS = "001"
class MuchaAuthRequest:
def __init__(self, request: Dict) -> None:
# gameCd + boardType + countryCd + version
self.gameVer = request.get("gameVer", "")
self.sendDate = request.get("sendDate", "") # %Y%m%d
self.serialNum = request.get("serialNum", "")
self.gameCd = request.get("gameCd", "")
self.boardType = request.get("boardType", "")
self.boardId = request.get("boardId", "")
self.mac = request.get("mac", "")
self.placeId = request.get("placeId", "")
self.storeRouterIp = request.get("storeRouterIp", "")
self.countryCd = request.get("countryCd", "")
self.useToken = request.get("useToken", "")
self.allToken = request.get("allToken", "")
class MuchaAuthResponse:
def __init__(self, mucha_url: str) -> None:
self.RESULTS = "001"
self.AUTH_INTERVAL = "86400"
self.SERVER_TIME = datetime.strftime(datetime.now(), "%Y%m%d%H%M")
self.UTC_SERVER_TIME = datetime.strftime(datetime.now(pytz.UTC), "%Y%m%d%H%M")
self.CHARGE_URL = f"https://{mucha_url}/charge/"
self.CHARGE_URL = f"https://{mucha_url}/charge/"
self.FILE_URL = f"https://{mucha_url}/file/"
self.URL_1 = f"https://{mucha_url}/url1/"
self.URL_2 = f"https://{mucha_url}/url2/"
self.URL_3 = f"https://{mucha_url}/url3/"
self.PLACE_ID = "JPN123"
self.COUNTRY_CD = "JPN"
self.PLACE_ID = "JPN123"
self.COUNTRY_CD = "JPN"
self.SHOP_NAME = "TestShop!"
self.SHOP_NICKNAME = "TestShop"
self.AREA_0 = "008"
@ -120,7 +189,7 @@ class MuchaAuthResponse():
self.AREA_FULL_1 = ""
self.AREA_FULL_2 = ""
self.AREA_FULL_3 = ""
self.SHOP_NAME_EN = "TestShop!"
self.SHOP_NICKNAME_EN = "TestShop"
self.AREA_0_EN = "008"
@ -132,32 +201,141 @@ class MuchaAuthResponse():
self.AREA_FULL_2_EN = ""
self.AREA_FULL_3_EN = ""
self.PREFECTURE_ID = "1"
self.PREFECTURE_ID = "1"
self.EXPIRATION_DATE = "null"
self.USE_TOKEN = "0"
self.CONSUME_TOKEN = "0"
self.DONGLE_FLG = "1"
self.FORCE_BOOT = "0"
class MuchaUpdateRequest():
def __init__(self, request: Dict) -> None:
self.gameVer = "" if "gameVer" not in request else request["gameVer"]
self.gameCd = "" if "gameCd" not in request else request["gameCd"]
self.serialNum = "" if "serialNum" not in request else request["serialNum"]
self.countryCd = "" if "countryCd" not in request else request["countryCd"]
self.placeId = "" if "placeId" not in request else request["placeId"]
self.storeRouterIp = "" if "storeRouterIp" not in request else request["storeRouterIp"]
class MuchaUpdateResponse():
def __init__(self, game_ver: str = "PKFN0JPN01.01", mucha_url: str = "localhost") -> None:
self.RESULTS = "001"
class MuchaUpdateRequest:
def __init__(self, request: Dict) -> None:
self.gameVer = request.get("gameVer", "")
self.gameCd = request.get("gameCd", "")
self.serialNum = request.get("serialNum", "")
self.countryCd = request.get("countryCd", "")
self.placeId = request.get("placeId", "")
self.storeRouterIp = request.get("storeRouterIp", "")
class MuchaUpdateResponse:
def __init__(self, game_ver: str, mucha_url: str) -> None:
self.RESULTS = "001"
self.EXE_VER = game_ver
self.UPDATE_VER_1 = game_ver
self.UPDATE_URL_1 = f"https://{mucha_url}/updUrl1/"
self.UPDATE_SIZE_1 = "0"
self.UPDATE_CRC_1 = "0000000000000000"
self.CHECK_URL_1 = f"https://{mucha_url}/checkUrl/"
self.EXE_VER_1 = game_ver
self.UPDATE_URL_1 = f"http://{mucha_url}/updUrl1/"
self.UPDATE_SIZE_1 = "20"
self.CHECK_CRC_1 = "0000000000000000"
self.CHECK_URL_1 = f"http://{mucha_url}/checkUrl/"
self.CHECK_SIZE_1 = "20"
self.INFO_SIZE_1 = "0"
self.COM_SIZE_1 = "0"
self.COM_TIME_1 = "0"
self.LAN_INFO_SIZE_1 = "0"
self.USER_ID = ""
self.PASSWORD = ""
"""
RESULTS
EXE_VER
UPDATE_VER_%d
UPDATE_URL_%d
UPDATE_SIZE_%d
CHECK_CRC_%d
CHECK_URL_%d
CHECK_SIZE_%d
INFO_SIZE_1
COM_SIZE_1
COM_TIME_1
LAN_INFO_SIZE_1
USER_ID
PASSWORD
"""
class MuchaUpdateResponseStub:
def __init__(self, game_ver: str) -> None:
self.RESULTS = "001"
self.UPDATE_VER_1 = game_ver
class MuchaDownloadStateRequest:
def __init__(self, request: Dict) -> None:
self.gameCd = request.get("gameCd", "")
self.updateVer = request.get("updateVer", "")
self.serialNum = request.get("serialNum", "")
self.fileSize = request.get("fileSize", "")
self.compFileSize = request.get("compFileSize", "")
self.boardId = request.get("boardId", "")
self.placeId = request.get("placeId", "")
self.storeRouterIp = request.get("storeRouterIp", "")
class MuchaDownloadErrorRequest:
def __init__(self, request: Dict) -> None:
self.gameCd = request.get("gameCd", "")
self.updateVer = request.get("updateVer", "")
self.serialNum = request.get("serialNum", "")
self.downloadUrl = request.get("downloadUrl", "")
self.errCd = request.get("errCd", "")
self.errMessage = request.get("errMessage", "")
self.boardId = request.get("boardId", "")
self.placeId = request.get("placeId", "")
self.storeRouterIp = request.get("storeRouterIp", "")
class MuchaRegiAuthRequest:
def __init__(self, request: Dict) -> None:
self.gameCd = request.get("gameCd", "")
self.serialNum = request.get("serialNum", "") # Encrypted
self.countryCd = request.get("countryCd", "")
self.registrationCd = request.get("registrationCd", "")
self.sendDate = request.get("sendDate", "")
self.useToken = request.get("useToken", "")
self.allToken = request.get("allToken", "")
self.placeId = request.get("placeId", "")
self.storeRouterIp = request.get("storeRouterIp", "")
class MuchaRegiAuthResponse:
def __init__(self) -> None:
self.RESULTS = "001" # 001 = success, 099, 098, 097 = fail, others = fail
self.ALL_TOKEN = "0" # Encrypted
self.ADD_TOKEN = "0" # Encrypted
class MuchaTokenStateRequest:
def __init__(self, request: Dict) -> None:
self.gameCd = request.get("gameCd", "")
self.serialNum = request.get("serialNum", "")
self.countryCd = request.get("countryCd", "")
self.useToken = request.get("useToken", "")
self.allToken = request.get("allToken", "")
self.placeId = request.get("placeId", "")
self.storeRouterIp = request.get("storeRouterIp", "")
class MuchaTokenStateResponse:
def __init__(self) -> None:
self.RESULTS = "001"
class MuchaTokenMarginStateRequest:
def __init__(self, request: Dict) -> None:
self.gameCd = request.get("gameCd", "")
self.serialNum = request.get("serialNum", "")
self.countryCd = request.get("countryCd", "")
self.placeId = request.get("placeId", "")
self.limitLowerToken = request.get("limitLowerToken", 0)
self.limitUpperToken = request.get("limitUpperToken", 0)
self.settlementMonth = request.get("settlementMonth", 0)
class MuchaTokenMarginStateResponse:
def __init__(self) -> None:
self.RESULTS = "001"
self.LIMIT_LOWER_TOKEN = 0
self.LIMIT_UPPER_TOKEN = 0
self.LAST_SETTLEMENT_MONTH = 0
self.LAST_LIMIT_LOWER_TOKEN = 0
self.LAST_LIMIT_UPPER_TOKEN = 0
self.SETTLEMENT_MONTH = 0
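The Mucha request and response bodies handled above are simply `&`-separated `key=value` pairs, which `mucha_preprocess` parses into a dict and `mucha_postprocess` serializes back. The standalone sketch below mirrors that round trip; the payload values are made up purely for illustration and are not taken from a real cabinet:

```
from typing import Dict, Optional

def mucha_decode(body: bytes) -> Optional[Dict[str, str]]:
    # Mirrors mucha_preprocess: split on '&', keep well-formed key=value pairs.
    try:
        ret: Dict[str, str] = {}
        for pair in body.decode().split("&"):
            kvp = pair.split("=")
            if len(kvp) == 2:
                ret[kvp[0]] = kvp[1]
        return ret
    except Exception:
        return None

def mucha_encode(data: Dict[str, str]) -> bytes:
    # Mirrors mucha_postprocess: join back into the same wire format.
    return "&".join(f"{k}={v}" for k, v in data.items()).encode()

# Hypothetical boardauth-style payload (field names follow MuchaAuthRequest above):
raw = b"gameCd=PKFN&gameVer=PKFN0JPN01.01&sendDate=20240101&placeId=123"
parsed = mucha_decode(raw)
print(parsed["gameCd"])                 # PKFN
print(mucha_encode({"RESULTS": "001"})) # b'RESULTS=001'
```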

View File

@ -1,4 +1,4 @@
from typing import Dict, Any
from typing import Dict, List, Tuple
import logging, coloredlogs
from logging.handlers import TimedRotatingFileHandler
from twisted.web.http import Request
@ -7,68 +7,172 @@ from core.config import CoreConfig
from core.data import Data
from core.utils import Utils
class TitleServlet():
def __init__(self, core_cfg: CoreConfig, cfg_folder: str):
class BaseServlet:
def __init__(self, core_cfg: CoreConfig, cfg_dir: str) -> None:
self.core_cfg = core_cfg
self.game_cfg = None
self.logger = logging.getLogger("title")
@classmethod
def is_game_enabled(cls, game_code: str, core_cfg: CoreConfig, cfg_dir: str) -> bool:
"""Called during boot to check if a specific game code should load.
Args:
game_code (str): 4 character game code
core_cfg (CoreConfig): CoreConfig class
cfg_dir (str): Config directory
Returns:
bool: True if the game is enabled and set to run, False otherwise
"""
return False
def get_endpoint_matchers(self) -> Tuple[List[Tuple[str, str, Dict]], List[Tuple[str, str, Dict]]]:
"""Called during boot to get all matcher endpoints this title servlet handles
Returns:
Tuple[List[Tuple[str, str, Dict]], List[Tuple[str, str, Dict]]]: A 2-length tuple where offset 0 is GET and offset 1 is POST,
containing a list of 3-length tuples where offset 0 is the name of the function in the handler that should be called, offset 1
is the matching string, and offset 2 is a dict containing rules for the matcher.
"""
return (
[("render_GET", "/{game}/{version}/{endpoint}", {'game': R'S...'})],
[("render_POST", "/{game}/{version}/{endpoint}", {'game': R'S...'})]
)
def setup(self) -> None:
"""Called once during boot, should contain any additional setup the handler must do, such as starting any sub-services
"""
pass
def get_allnet_info(self, game_code: str, game_ver: int, keychip: str) -> Tuple[str, str]:
"""Called any time a request to PowerOn is made to retrieve the url/host strings to be sent back to the game
Args:
game_code (str): 4 character game code
game_ver (int): version, expressed as an integer by multiplying by 100 (1.10 -> 110)
keychip (str): Keychip serial of the requesting machine, can be used to deliver specific URIs to different machines
Returns:
Tuple[str, str]: A tuple where offset 0 is the allnet uri field, and offset 1 is the allnet host field
"""
if not self.core_cfg.server.is_using_proxy and Utils.get_title_port(self.core_cfg) != 80:
return (f"http://{self.core_cfg.title.hostname}:{Utils.get_title_port(self.core_cfg)}/{game_code}/{game_ver}/", "")
return (f"http://{self.core_cfg.title.hostname}/{game_code}/{game_ver}/", "")
def get_mucha_info(self, core_cfg: CoreConfig, cfg_dir: str) -> Tuple[bool, str]:
"""Called once during boot to check if this game is a mucha game
Args:
core_cfg (CoreConfig): CoreConfig class
cfg_dir (str): Config directory
Returns:
Tuple[bool, str]: Tuple where offset 0 is true if the game is enabled, false otherwise, and offset 1 is the game CD
"""
return (False, "")
def render_POST(self, request: Request, game_code: str, matchers: Dict) -> bytes:
self.logger.warn(f"{game_code} Does not dispatch POST")
return None
def render_GET(self, request: Request, game_code: str, matchers: Dict) -> bytes:
self.logger.warn(f"{game_code} Does not dispatch GET")
return None
class TitleServlet:
title_registry: Dict[str, BaseServlet] = {}
def __init__(self, core_cfg: CoreConfig, cfg_folder: str):
super().__init__()
self.config = core_cfg
self.config_folder = cfg_folder
self.data = Data(core_cfg)
self.title_registry: Dict[str, Any] = {}
self.logger = logging.getLogger("title")
if not hasattr(self.logger, "initialized"):
log_fmt_str = "[%(asctime)s] Title | %(levelname)s | %(message)s"
log_fmt = logging.Formatter(log_fmt_str)
log_fmt = logging.Formatter(log_fmt_str)
fileHandler = TimedRotatingFileHandler("{0}/{1}.log".format(self.config.server.log_dir, "title"), when="d", backupCount=10)
fileHandler = TimedRotatingFileHandler(
"{0}/{1}.log".format(self.config.server.log_dir, "title"),
when="d",
backupCount=10,
)
fileHandler.setFormatter(log_fmt)
consoleHandler = logging.StreamHandler()
consoleHandler.setFormatter(log_fmt)
self.logger.addHandler(fileHandler)
self.logger.addHandler(consoleHandler)
self.logger.setLevel(core_cfg.title.loglevel)
coloredlogs.install(level=core_cfg.title.loglevel, logger=self.logger, fmt=log_fmt_str)
coloredlogs.install(
level=core_cfg.title.loglevel, logger=self.logger, fmt=log_fmt_str
)
self.logger.initialized = True
plugins = Utils.get_all_titles()
for folder, mod in plugins.items():
if hasattr(mod, "game_codes") and hasattr(mod, "index"):
handler_cls = mod.index(self.config, self.config_folder)
if hasattr(handler_cls, "setup"):
handler_cls.setup()
if hasattr(mod, "game_codes") and hasattr(mod, "index") and hasattr(mod.index, "is_game_enabled"):
should_call_setup = True
game_servlet: BaseServlet = mod.index
game_codes: List[str] = mod.game_codes
for code in mod.game_codes:
self.title_registry[code] = handler_cls
for code in game_codes:
if game_servlet.is_game_enabled(code, self.config, self.config_folder):
handler_cls = game_servlet(self.config, self.config_folder)
if hasattr(handler_cls, "setup") and should_call_setup:
handler_cls.setup()
should_call_setup = False
self.title_registry[code] = handler_cls
else:
self.logger.error(f"{folder} missing game_code or index in __init__.py")
self.logger.info(f"Serving {len(self.title_registry)} game codes on port {core_cfg.title.port}")
self.logger.error(f"{folder} missing game_code or index in __init__.py, or is_game_enabled in index")
self.logger.info(
f"Serving {len(self.title_registry)} game codes {'on port ' + str(core_cfg.title.port) if core_cfg.title.port > 0 else ''}"
)
def render_GET(self, request: Request, endpoints: dict) -> bytes:
code = endpoints["game"]
if code not in self.title_registry:
self.logger.warn(f"Unknown game code {code}")
code = endpoints["title"]
subaction = endpoints['subaction']
index = self.title_registry[code]
if not hasattr(index, "render_GET"):
self.logger.warn(f"{code} does not dispatch GET")
if code not in self.title_registry:
self.logger.warning(f"Unknown game code {code}")
request.setResponseCode(404)
return b""
return index.render_GET(request, endpoints["version"], endpoints["endpoint"])
index = self.title_registry[code]
handler = getattr(index, f"{subaction}", None)
if handler is None:
self.logger.error(f"{code} does not have handler for GET subaction {subaction}")
request.setResponseCode(500)
return b""
return handler(request, code, endpoints)
def render_POST(self, request: Request, endpoints: dict) -> bytes:
code = endpoints["game"]
code = endpoints["title"]
subaction = endpoints['subaction']
if code not in self.title_registry:
self.logger.warn(f"Unknown game code {code}")
index = self.title_registry[code]
if not hasattr(index, "render_POST"):
self.logger.warn(f"{code} does not dispatch POST")
self.logger.warning(f"Unknown game code {code}")
request.setResponseCode(404)
return b""
return index.render_POST(request, endpoints["version"], endpoints["endpoint"])
index = self.title_registry[code]
handler = getattr(index, f"{subaction}", None)
if handler is None:
self.logger.error(f"{code} does not have handler for POST subaction {subaction}")
request.setResponseCode(500)
return b""
endpoints.pop("title")
endpoints.pop("subaction")
return handler(request, code, endpoints)
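To illustrate how `TitleServlet` dispatches into a `BaseServlet` subclass, here is a hypothetical handler sketch. The class name, routes, and handler names are invented for this example and are not part of the diff: `is_game_enabled` gates loading at boot, `get_endpoint_matchers` maps URL patterns to handler method names, and the dispatcher then calls the named method with the request, the matched game code, and the matcher variables:

```
from typing import Dict, List, Tuple
from twisted.web.http import Request
from core.config import CoreConfig
from core.title import BaseServlet  # assuming this file is core/title.py, as the imports above suggest

class ExampleServlet(BaseServlet):
    @classmethod
    def is_game_enabled(cls, game_code: str, core_cfg: CoreConfig, cfg_dir: str) -> bool:
        # A real handler would read its game yaml from cfg_dir and check its enable flag here.
        return True

    def get_endpoint_matchers(self) -> Tuple[List[Tuple[str, str, Dict]], List[Tuple[str, str, Dict]]]:
        # (handler name, path pattern, matcher rules); {game} restricted to codes starting with 'S'.
        return (
            [("handle_info", "/{game}/{version}/info", {"game": R"S..."})],
            [("handle_score", "/{game}/{version}/score", {"game": R"S..."})],
        )

    def handle_info(self, request: Request, game_code: str, matchers: Dict) -> bytes:
        return f"{game_code} version {matchers.get('version')}".encode()

    def handle_score(self, request: Request, game_code: str, matchers: Dict) -> bytes:
        return b"OK"
```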

View File

@ -1,22 +1,97 @@
from typing import Dict, List, Any, Optional
from typing import Dict, Any, Optional
from types import ModuleType
import zlib, base64
from twisted.web.http import Request
import logging
import importlib
from os import walk
import jwt
from base64 import b64decode
from datetime import datetime, timezone
from .config import CoreConfig
class Utils:
real_title_port = None
real_title_port_ssl = None
@classmethod
def get_all_titles(cls) -> Dict[str, ModuleType]:
ret: Dict[str, Any] = {}
for root, dirs, files in walk("titles"):
for dir in dirs:
for dir in dirs:
if not dir.startswith("__"):
try:
mod = importlib.import_module(f"titles.{dir}")
ret[dir] = mod
if hasattr(mod, "game_codes") and hasattr(
mod, "index"
): # Minimum required to function
ret[dir] = mod
except ImportError as e:
print(f"{dir} - {e}")
logging.getLogger("core").error(f"get_all_titles: {dir} - {e}")
raise
return ret
@classmethod
def get_ip_addr(cls, req: Request) -> str:
return (
req.getAllHeaders()[b"x-forwarded-for"].decode()
if b"x-forwarded-for" in req.getAllHeaders()
else req.getClientAddress().host
)
@classmethod
def get_title_port(cls, cfg: CoreConfig):
if cls.real_title_port is not None: return cls.real_title_port
if cfg.title.port == 0:
cls.real_title_port = cfg.allnet.port
else:
cls.real_title_port = cfg.title.port
return cls.real_title_port
@classmethod
def get_title_port_ssl(cls, cfg: CoreConfig):
if cls.real_title_port_ssl is not None: return cls.real_title_port_ssl
if cfg.title.port_ssl == 0:
cls.real_title_port_ssl = 443
else:
cls.real_title_port_ssl = cfg.title.port_ssl
return cls.real_title_port_ssl
def create_sega_auth_key(aime_id: int, game: str, place_id: int, keychip_id: str, b64_secret: str, exp_seconds: int = 86400, err_logger: str = 'aimedb') -> Optional[str]:
logger = logging.getLogger(err_logger)
try:
return jwt.encode({ "aime_id": aime_id, "game": game, "place_id": place_id, "keychip_id": keychip_id, "exp": int(datetime.now(tz=timezone.utc).timestamp()) + exp_seconds }, b64decode(b64_secret), algorithm="HS256")
except jwt.InvalidKeyError:
logger.error("Failed to encode Sega Auth Key because the secret is invalid!")
return None
except Exception as e:
logger.error(f"Unknown exception occoured when encoding Sega Auth Key! {e}")
return None
def decode_sega_auth_key(token: str, b64_secret: str, err_logger: str = 'aimedb') -> Optional[Dict]:
logger = logging.getLogger(err_logger)
try:
return jwt.decode(token, "secret", b64decode(b64_secret), algorithms=["HS256"], options={"verify_signature": True})
except jwt.ExpiredSignatureError:
logger.error("Sega Auth Key failed to validate due to an expired signature!")
return None
except jwt.InvalidSignatureError:
logger.error("Sega Auth Key failed to validate due to an invalid signature!")
return None
except jwt.DecodeError as e:
logger.error(f"Sega Auth Key failed to decode! {e}")
return None
except jwt.InvalidTokenError as e:
logger.error(f"Sega Auth Key is invalid! {e}")
return None
except Exception as e:
logger.error(f"Unknown exception occoured when decoding Sega Auth Key! {e}")
return None
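A quick round-trip sketch of `create_sega_auth_key` and `decode_sega_auth_key` above; the secret, aime ID, and keychip serial are placeholders, and the import path assumes these helpers live in core/utils.py as in this diff:

```
import base64
from core.utils import create_sega_auth_key, decode_sega_auth_key

# Placeholder secret; a real deployment uses the base64-encoded secret configured for aimedb in core.yaml.
b64_secret = base64.b64encode(b"0123456789abcdef0123456789abcdef").decode()

token = create_sega_auth_key(
    aime_id=12345,
    game="SDFE",               # example game code, as used elsewhere in this changeset
    place_id=123,
    keychip_id="A69E01A0000",  # placeholder keychip serial
    b64_secret=b64_secret,
)

# decode returns the claims dict on success, or None on failure.
claims = decode_sega_auth_key(token, b64_secret)
print(claims["aime_id"], claims["game"])  # 12345 SDFE
```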

View File

@ -1,47 +1,91 @@
import yaml
import argparse
import logging
from core.config import CoreConfig
from core.data import Data
from os import path, mkdir, access, W_OK
if __name__=='__main__':
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Database utilities")
parser.add_argument("--config", "-c", type=str, help="Config folder to use", default="config")
parser.add_argument("--version", "-v", type=str, help="Version of the database to upgrade/rollback to")
parser.add_argument("--game", "-g", type=str, help="Game code of the game who's schema will be updated/rolled back. Ex. SDFE")
parser.add_argument("action", type=str, help="DB Action, create, recreate, upgrade, or rollback")
parser.add_argument(
"--config", "-c", type=str, help="Config folder to use", default="config"
)
parser.add_argument(
"--version",
"-v",
type=str,
help="Version of the database to upgrade/rollback to",
)
parser.add_argument(
"--game",
"-g",
type=str,
help="Game code of the game who's schema will be updated/rolled back. Ex. SDFE",
)
parser.add_argument("--email", "-e", type=str, help="Email for the new user")
parser.add_argument("--old_ac", "-o", type=str, help="Access code to transfer from")
parser.add_argument("--new_ac", "-n", type=str, help="Access code to transfer to")
parser.add_argument("--force", "-f", type=bool, help="Force the action to happen")
parser.add_argument(
"action", type=str, help="DB Action, create, recreate, upgrade, or rollback"
)
args = parser.parse_args()
cfg = CoreConfig()
cfg.update(yaml.safe_load(open(f"{args.config}/core.yaml")))
if path.exists(f"{args.config}/core.yaml"):
cfg_dict = yaml.safe_load(open(f"{args.config}/core.yaml"))
cfg_dict.get("database", {})["loglevel"] = "info"
cfg.update(cfg_dict)
if not path.exists(cfg.server.log_dir):
mkdir(cfg.server.log_dir)
if not access(cfg.server.log_dir, W_OK):
print(
f"Log directory {cfg.server.log_dir} NOT writable, please check permissions"
)
exit(1)
data = Data(cfg)
if args.action == "create":
data.create_database()
elif args.action == "recreate":
data.recreate_database()
elif args.action == "upgrade" or args.action == "rollback":
if args.version is None:
print("Must set game and version to migrate to")
exit(0)
data.logger.warning("No version set, upgrading to latest")
if args.game is None:
print("No game set, upgrading core schema")
data.migrate_database("CORE", int(args.version))
data.logger.warning("No game set, upgrading core schema")
data.migrate_database(
"CORE",
int(args.version) if args.version is not None else None,
args.action,
)
else:
data.migrate_database(args.game, int(args.version), args.action)
data.migrate_database(
args.game,
int(args.version) if args.version is not None else None,
args.action,
)
elif args.action == "autoupgrade":
data.autoupgrade()
elif args.action == "create-owner":
data.create_owner(args.email)
elif args.action == "migrate-card":
data.migrate_card(args.old_ac, args.new_ac, args.force)
elif args.action == "cleanup":
data.delete_hanging_users()
elif args.action == "migrate":
print("Migrating from old schema to new schema")
data.restore_from_old_schema()
elif args.action == "dump":
print("Dumping old schema to migrate to new schema")
data.dump_db()
elif args.action == "generate":
pass
elif args.action == "version":
data.show_versions()
data.logger.info("Done")

66
docker-compose.yml Normal file
View File

@ -0,0 +1,66 @@
version: "3.9"
services:
app:
hostname: ma.app
build: .
volumes:
- ./aime:/app/aime
- ./configs/config:/app/config
environment:
CFG_DEV: 1
CFG_CORE_SERVER_HOSTNAME: 0.0.0.0
CFG_CORE_DATABASE_HOST: ma.db
CFG_CORE_MEMCACHED_HOSTNAME: ma.memcached
CFG_CORE_AIMEDB_KEY: <INSERT AIMEDB KEY HERE>
CFG_CHUNI_SERVER_LOGLEVEL: debug
ports:
- "80:80"
- "8443:8443"
- "22345:22345"
- "8080:8080"
- "8090:8090"
depends_on:
db:
condition: service_healthy
db:
hostname: ma.db
image: yobasystems/alpine-mariadb:10.11.5
environment:
MYSQL_DATABASE: aime
MYSQL_USER: aime
MYSQL_PASSWORD: aime
MYSQL_ROOT_PASSWORD: AimeRootPassword
MYSQL_CHARSET: utf8mb4
MYSQL_COLLATION: utf8mb4_general_ci
##Note: expose port 3306 to allow read.py importer into database, comment out when not needed
#ports:
# - "3306:3306"
##Note: uncomment to allow mysql to create a persistent database, leave commented if you want to rebuild database from scratch often
#volumes:
# - ./AimeDB:/var/lib/mysql
healthcheck:
test: ["CMD", "mysqladmin" ,"ping", "-h", "localhost", "-pAimeRootPassword"]
timeout: 5s
retries: 5
memcached:
hostname: ma.memcached
image: memcached:1.6.22-alpine3.18
command: [ "memcached", "-m", "1024", "-I", "128m" ]
phpmyadmin:
hostname: ma.phpmyadmin
image: phpmyadmin:latest
environment:
PMA_HOSTS: ma.db
PMA_USER: root
PMA_PASSWORD: AimeRootPassword
APACHE_PORT: 8080
ports:
- "9090:8080"

246
docs/INSTALL_DOCKER.md Normal file
View File

@ -0,0 +1,246 @@
# ARTEMiS - Docker Installation Guide
This step-by-step guide will walk you through installing a containerized version of ARTEMiS inside Docker. Some steps can be skipped if you already have the prerequisite components and modules installed.
This guide assumes Debian 12 (bookworm-stable) as the host operating system for most packages and modules.
## Pre-Requisites:
- Linux-Based Operating System (e.g. Debian, Ubuntu)
- Docker (https://get.docker.com)
- Python 3.9+
- (optional) Git
## Install Python3.9+ and Docker
```
(if this is a fresh install of the system)
sudo apt update && sudo apt upgrade
(installs python3 and pip)
sudo apt install python3 python3-pip
(installs docker)
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh
(optionally install git)
sudo apt install git
```
## Get ARTEMiS
If you installed git, clone into your choice of ARTEMiS git repository, e.g.:
```
git clone <ARTEMiS Repo> <folder>
```
If not, download the source package, and unpack it to the folder of your choice.
## Prepare development/home configuration
To build our Docker setup, first we need to create some folders and copy some files around
- Create 'aime', 'configs', 'AimeDB', and 'logs' folders in the ARTEMiS root folder (where all source files exist)
- Inside the configs folder, create a 'config' folder, and copy all .yaml files from example_config to config (that is, all files except nginx_example.conf)
- Edit .yaml files inside configs/config to suit your server needs
- Edit core.yaml inside configs/config:
```
set server.listen_address: to "0.0.0.0"
set title.hostname: to machine's IP address, e.g. "192.168.x.x", depending on your network, or actual hostname if your configuration is already set for dns resolve
set database.host: to "ma.db"
set database.memcached_host: to "ma.memcached"
set aimedb.key: to "<actual AIMEDB key>"
```
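For reference, the edited `configs/config/core.yaml` might end up looking roughly like this (a minimal sketch covering only the keys mentioned above; the section layout follows the files copied from example_config, and every other option keeps its default):
```
server:
  listen_address: "0.0.0.0"
title:
  hostname: "192.168.x.x"        # or your resolvable hostname
database:
  host: "ma.db"
  memcached_host: "ma.memcached"
aimedb:
  key: "<actual AIMEDB key>"
```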
## Running Docker Compose
After configuring, go to ARTEMiS root folder, and execute:
```
docker compose up -d
```
("-d" argument means detached or daemon, meaning you will regain control of your terminal and Containers will run in background)
This will start pulling and building the required images from the network. After it's done, a development server should be running, accessible under the machine's IP, with the frontend on port 8090 and phpMyAdmin on port 9090.
- To turn off the server, from ARTEMiS root folder, execute:
```
docker compose down
```
- If you changed some files around, and don't see your changes applied, execute:
```
(turn off the server)
docker compose down
(rebuild)
docker compose build
(turn on)
docker compose up -d
```
- If you need to see logs from containers running, execute:
```
docker compose logs
```
- add '-f' to the end if you want to follow logs.
## Running commands
If you need to execute python scripts supplied with the application, use `docker compose exec app python3 <script> <command>`, for example `docker compose exec app python3 dbutils.py version`
## Persistent DB
By default, in development mode, the ARTEMiS database is stored only temporarily. If you wish to keep your database between restarts, the database inside the container needs to be bound to actual storage (a folder) on the server. To do this, a few changes are needed:
- First off, edit docker-compose.yml, and uncomment 2 lines:
```
(uncomment these two)
#volumes:
# - ./AimeDB:/var/lib/mysql
```
- After that, start up the server; this time the database will be saved in the AimeDB folder created during the configuration steps.
- If you wish to save it in another folder and/or on another storage device, change the "./AimeDB" target folder to the folder/device of your choice
NOTE (NEEDS FIX): at the moment, running development mode with a persistent DB will always run the database creation script at application start. While it doesn't break the database outright, it might create some issues; a temporary fix can be applied:
- Start up containers with persistent DB already enabled, let application create database
- After startup, `docker compose down` the instance
- Edit entrypoint.sh and remove the `python3 dbutils.py create` line from the Development mode branch (see the sketch after this list)
- Execute `docker compose build` and `docker compose up -d` to rebuild the app and start the containers back up
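For reference, a minimal sketch of what `entrypoint.sh` looks like after removing that line (only the Development mode branch changes):
```
#!/bin/bash
if [[ -z "${CFG_DEV}" ]]; then
  echo Production mode
  python3 index.py
else
  echo Development mode
  # 'python3 dbutils.py create' removed here so the creation script no longer runs on every start
  nodemon -w aime --legacy-watch index.py
fi
```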
## Adding importer data
To add data using the importer, there are a few ways to do it:
### Use importer locally on server
For this, the actual GameData and Options need to be supplied to the server system somehow, be it a WSL2 mounting layer, a pendrive with the data, a network share, or a direct copy to the server storage.
With python3 installed on the system, install requirements.txt either directly to the system or through a python3 virtual environment (python3-venv).
The default mysql/mariadb client development packages will also be required.
- In the system:
```
sudo apt install default-libmysqlclient-dev build-essential pkg-config libmemcached-dev
sudo apt install mysql-client
OR
sudo apt install libmariadb-dev
```
- In the root ARTEMiS folder
```
python3 -m pip install -r requirements.txt
```
- If you wish to layer that with a python3 virtual environment, install the required system packages, then:
```
sudo apt install python3-venv
python3 -m venv /path/to/venv
cd /path/to/venv/bin
./python3 -m pip install -r /path/to/artemis/requirements.txt
```
- Depending on how you installed, now you can run read.py using:
- For direct installation, from root ARTEMiS folder:
```
python3 read.py <args>
```
- Or from python3 virtual environment, from root ARTEMiS folder:
```
/path/to/python3-venv/bin/python3 /path/to/artemis/read.py <args>
```
- The database container port needs to be exposed so that read.py can communicate with the database. Inside docker-compose.yml, uncomment 2 lines in the database container declaration (db):
```
#ports:
# - "3306:3306"
```
- Now, `docker compose down && docker compose build && docker compose up -d` to restart containers
Now to insert the data: by default, Docker doesn't expose container hostnames to the host system, so when trying to run read.py against a container it will error out saying the hostname is not available. To fix that, add the database hostname by hand to /etc/hosts:
```
sudo <editor of your choice> /etc/hosts
add '127.0.0.1 ma.db' to the table
save and close
```
- You can remove the line in /etc/hosts and de-expose the database port after successful import (this assumes you're using Persistent DB, as restarting the container without it will clear imported data).
### Use importer on remote Linux system
Follow the system and python portion of the guide, installing the required packages and python3 modules, then download the ARTEMiS source.
- Edit core.yaml and place it in the config folder:
```
database:
  host: "<hostname of target system>"
```
- Expose port 3306 from the database docker container to the system, and allow port 3306 through the system firewall so that it is reachable from the system you will be importing data from. (Remember to close down the database ports after finishing!)
- Import data using read.py
### Use importer on remote Windows system
Follow the [windows](docs/INSTALL_WINDOWS.md) guide for installing python dependencies, download the ARTEMiS source.
- Edit core.yaml and place it in the config folder:
```
database:
  host: "<hostname of target system>"
```
- Expose port 3306 from the database docker container to the system, and allow port 3306 through the system firewall so that it is reachable from the system you will be importing data from.
- For Windows, also allow port 3306 outside the system so that read.py can communicate with remote database. (Remember to close down the database ports after finishing!)
# Troubleshooting
## Game does not connect to ARTEMiS Allnet Server
Double-check your core.yaml and make sure all addresses are correct and the ports are correctly set and/or opened.
## Game does not connect to Title Server
The title server hostname needs to be the actual hostname of the system on which you set up the containers, or its IP address. You can get the IP by using the command `ip a`, which lists all interfaces; one of them should be your system IP (typically under eth0).
## Unhandled command in AimeDB
Make sure you have a proper AimeDB Key added to configuration.
## Memcached Error in ARTEMiS application causes errors in loading data
Currently, when running ARTEMiS from the master branch, there is a small bug that causes the app to always configure the memcached service to 127.0.0.1. To fix that, locate the cache.py file in core/data and edit:
```
memcache = pylibmc.Client([hostname], binary=True)
```
to:
```
memcache = pylibmc.Client(["ma.memcached"], binary=True)
```
And build the containers again.
This will fix errors loading data from server.
(This is fixed in development branch)
## read.py "Can't connect to local server through socket '/run/mysqld/mysqld.sock'"
The MySQL driver used by sqlalchemy treats `localhost` connections as unix-socket connections, so it tries to connect locally through /run/mysqld/mysqld.sock. Please use a hostname (such as ma.db, as in this guide, and not localhost) to force it to use a network interface.
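For example, in `configs/config/core.yaml` the database host should point at the container hostname rather than a local address (a minimal sketch):
```
database:
  host: "ma.db"   # container hostname, not localhost/127.0.0.1
```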
### TODO:
- Production environment

129
docs/INSTALL_UBUNTU.md Normal file
View File

@ -0,0 +1,129 @@
# ARTEMiS - Ubuntu 20.04 LTS Guide
This step-by-step guide assumes that you are using a fresh install of Ubuntu 20.04 LTS. Some of the steps can be skipped if you already have an installation with MySQL 5.7, or if some of the modules are already present in your environment.
# Setup
## Install memcached module
1. sudo apt-get install memcached
2. Under the file /etc/memcached.conf, please make sure the following parameters are set:
```
# Start with a cap of 64 megs of memory. It's reasonable, and the daemon default
# Note that the daemon will grow to this size, but does not start out holding this much
# memory
-I 128m
-m 1024
```
** This is mandatory to avoid memcached overload caused by Crossbeats or by massive profiles
3. Restart memcached using: sudo systemctl restart memcached
## Install MySQL 5.7
```
sudo apt update
sudo apt install wget -y
wget https://dev.mysql.com/get/mysql-apt-config_0.8.12-1_all.deb
sudo dpkg -i mysql-apt-config_0.8.12-1_all.deb
```
1. During the first prompt, select Ubuntu Bionic
2. Select the default option
3. Select MySQL 5.7
4. Select the last option
```
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 467B942D3A79BD29
sudo apt-get update
sudo apt-cache policy mysql-server
sudo apt install -f mysql-client=5.7* mysql-community-server=5.7* mysql-server=5.7*
```
## Default Configuration for MySQL Server
1. sudo mysql_secure_installation
> Make sure to follow the steps that will be prompted such as changing the mysql root password and such
2. Test your MySQL Server login by doing the following command :
> mysql -u root -p
## Create the default ARTEMiS database and user
1. mysql -u root -p
2. Please change the password indicated in the next line for a custom secure one and continue with the next commands
```
CREATE USER 'aime'@'localhost' IDENTIFIED BY 'MyStrongPass.';
CREATE DATABASE aime;
GRANT Alter,Create,Delete,Drop,Index,Insert,References,Select,Update ON aime.* TO 'aime'@'localhost';
FLUSH PRIVILEGES;
exit;
```
3. sudo systemctl restart mysql
## Install Python modules
```
sudo apt-get install python3-dev default-libmysqlclient-dev build-essential mysql-client libmysqlclient-dev libmemcached-dev
sudo apt install libpython3.8-dev
sudo apt-get install python3-software-properties
sudo apt install python3-pip
sudo pip3 install --upgrade pip testresources
sudo pip3 install --upgrade pip setuptools
sudo apt-get install python3-tk
```
7. Change your work path to the ARTEMiS root folder using 'cd' and install the requirements:
> sudo python3 -m pip install -r requirements.txt
## Copy/Rename the folder example_config to config
## Adjust /config/core.yaml
1. Make sure to change the server listen_address to be set to your local machine IP (ex.: 192.168.1.xxx)
2. Adjust the proper MySQL information you created earlier
3. Add the AimeDB key at the bottom of the file
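Putting the three adjustments above together, the relevant parts of core.yaml might look roughly like this (a minimal sketch with placeholder values; the section layout follows the copied example_config files):
```
server:
  listen_address: 192.168.1.xxx
title:
  hostname: 192.168.1.xxx
database:
  host: "localhost"
  username: "aime"
  password: "MyStrongPass."
  name: "aime"
aimedb:
  key: "<AimeDB key>"
```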
## Create the database tables for ARTEMiS
1. sudo python3 dbutils.py create
2. If you get "No module named Crypto", run the following command:
```
sudo pip uninstall crypto
sudo pip uninstall pycrypto
sudo pip install pycrypto
```
## Firewall Adjustements
```
sudo ufw allow 80
sudo ufw allow 443
sudo ufw allow 8443
sudo ufw allow 22345
sudo ufw allow 8090
sudo ufw allow 8444
sudo ufw allow 8080
```
## Running the ARTEMiS instance
1. sudo python3 index.py
# Troubleshooting
## Game does not connect to ARTEMiS Allnet server
1. Double-check your core.yaml, the listen_address is most likely either not bound to the proper IP or the port is not opened
## Game does not connect to Title Server
1. Verify that your core.yaml is setup properly for both the server listen_address and title hostname
2. Boot your game and verify that an AllNet response does show and if it does, attempt to open the URI that is shown under a browser such as Edge, Chrome & Firefox.
3. If a page is shown, the server is working properly and if it doesn't, double check your port forwarding and also that you have entered the proper local IP under the Title hostname in core.yaml.
## Unhandled command under AimeDB
1. Double-check your AimeDB key under core.yaml, it is most likely incorrect.
## Memcache failed, error 3
1. Make sure memcached is properly installed and running. You can check the status of the service using the following command:
> sudo systemctl status memcached
2. If it is failing, double check the /etc/memcached.conf file, it may have duplicated arguments like the -I and -m
3. If it is still not working afterward, you can proceed with a workaround by manually editing the /core/data/cache.py file.
```
# Make memcache optional
try:
    import pylibmc  # the import can stay as-is; the key change is forcing has_mc to False below
    has_mc = False
except ModuleNotFoundError:
    has_mc = False
```

102
docs/INSTALL_WINDOWS.md Normal file
View File

@ -0,0 +1,102 @@
# ARTEMiS - Windows 10/11 Guide
This step-by-step guide assumes that you are using a fresh install of Windows 10/11 without MySQL installed. Some of the steps can be skipped if you already have an installation with MySQL 8.0, or if some of the modules are already present in your environment.
# Setup
## Install Python 3.9 (recommended) or 3.10
1. Download Python 3.9 : [Link](https://www.python.org/ftp/python/3.9.13/python-3.9.13-amd64.exe)
2. Install python-3.9.13-amd64.exe
1. Select Customize installation
2. Make sure that pip, tcl/tk, and "for all users" are checked and hit Next
3. Make sure that you enable "Create shortcuts for installed applications" and "Add Python to environment variables" and hit Install
## Install MySQL 8.0
1. Download MySQL 8.0 Server : [Link](https://dev.mysql.com/get/Downloads/MySQLInstaller/mysql-installer-community-8.0.34.0.msi)
2. Install mysql-installer-web-community-8.0.34.0.msi
1. Click on "Add ..." on the side
2. Click on the "+" next to MySQL Servers
3. Make sure MySQL Server 8.0.34 - X64 is under the products to be installed.
4. Hit Next and Next once installed
5. Select the configuration type "Development Computer"
6. Hit Next
7. Select "Use Legacy Authentication Method (Retain MySQL 5.x compatibility)" and hit Next
8. Enter a root password and then hit Next >
9. Leave everything under Windows Service as default and hit Next >
10. Click on Execute, wait for it to finish, and hit Next > and then Finish
3. Open the MySQL 8.0 Command Line Client and log in as your root user
4. Change `<Enter Password Here>` to a new password for the user aime, then type these commands to create your user and the database
```sql
CREATE USER 'aime'@'localhost' IDENTIFIED BY '<Enter Password Here>';
CREATE DATABASE aime;
GRANT Alter,Create,Delete,Drop,Index,Insert,References,Select,Update ON aime.* TO 'aime'@'localhost';
FLUSH PRIVILEGES;
exit;
```
## Install Python modules
1. Change your work path to the artemis-master folder using 'cd' and install the requirements:
```shell
pip install -r requirements.txt
```
## Copy/Rename the folder `example_config` to `config`
## Adjust `config/core.yaml`
1. Make sure to change the server `hostname` to be set to your local machine IP (ex.: 192.168.xxx.xxx)
- In case you want to run this only locally, set the following values:
```yaml
server:
  listen_address: 0.0.0.0
title:
  hostname: 192.168.xxx.xxx
```
2. Adjust the proper MySQL information you created earlier
```yaml
database:
  host: "localhost"
  username: "aime"
  password: "<Enter Password Here>"
  name: "aime"
```
3. Add the AimeDB key at the bottom of the file
4. If the webui is needed, change the flag from False to True
## Create the database tables for ARTEMiS
```shell
python dbutils.py create
```
## Firewall Adjustements
Make sure the following ports are open both on your router and local Windows firewall in case you want to use this for public use (NOT recommended):
> Port 80 (TCP), 443 (TCP), 8443 (TCP), 22345 (TCP), 8080 (TCP), 8090 (TCP) **webui, 8444 (TCP) **mucha
## Running the ARTEMiS instance
```shell
python index.py
```
# Troubleshooting
## Game does not connect to ARTEMiS Allnet server
1. Double-check your core.yaml, the listen_address is most likely either not bound to the proper IP or the port is not opened
## Game does not connect to Title Server
1. Verify that your core.yaml is setup properly for both the server listen_address and title hostname
2. Boot your game and verify that an AllNet response does show and if it does, attempt to open the URI that is shown under a browser such as Edge, Chrome & Firefox.
3. If a page is shown, the server is working properly and if it doesn't, double check your port forwarding and also that you have entered the proper local IP under the Title hostname in core.yaml.
## Unhandled command under AimeDB
1. Double-check your AimeDB key under core.yaml, it is most likely incorrect.
## AttributeError: module 'collections' has no attribute 'Hashable'
1. This means the pyYAML module is obsolete, simply rerun pip with the -U (force update) flag, as shown below.
- Change your work path to the artemis-master (or artemis-develop) folder using 'cd' and run the following commands:
```shell
pip install -r requirements.txt -U
```

View File

@ -5,11 +5,19 @@
- `allow_unregistered_serials`: Allows games that do not have registered keychips to connect and authenticate. Disable to restrict who can connect to your server. Recommended to disable for production setups. Default `True`
- `name`: Name for the server, used by some games in their default MOTDs. Default `ARTEMiS`
- `is_develop`: Flags that the server is a development instance without a proxy standing in front of it. Setting to `False` tells the server not to listen for SSL, because the proxy should be handling all SSL-related things, among other things. Default `True`
- `threading`: Flags that `reactor.run` should be called via the `Thread` standard library. May provide a speed boost, but removes the ability to kill the server via `Ctrl + C`. Default: `False`
- `check_arcade_ip`: Checks IPs against the `arcade` table in the database, if one is defined. Default `False`
- `strict_ip_checking`: Rejects clients if there is no IP in the `arcade` table for the respective arcade
- `log_dir`: Directory to store logs. Server MUST have read and write permissions to this directory or you will have issues. Default `logs`
## Title
- `loglevel`: Logging level for the title server. Default `info`
- `hostname`: Hostname that gets sent to clients to tell them where to connect. Games must be able to connect to your server via the hostname or IP you specify here. Note that most games will reject `localhost` or `127.0.0.1`. Default `localhost`
- `port`: Port that the title server will listen for connections on. Set to 0 to use the Allnet handler to reduce the port footprint. Default `8080`
- `port_ssl`: Port that the secure title server will listen for connections on. Set to 0 to use the Allnet handler to reduce the port footprint. Default `0`
- `ssl_key`: Location of the ssl server key for the secure title server. Ignored if `port_ssl` is set to `0` or `is_develop` set to `False`. Default `cert/title.key`
- `ssl_cert`: Location of the ssl server certificate for the secure title server. Must not be a self-signed SSL. Ignored if `port_ssl` is set to `0` or `is_develop` is set to `False`. Default `cert/title.pem`
- `reboot_start_time`: 24 hour JST time that clients will see as the start of maintenance period. Leave blank for no maintenance time. Default: ""
- `reboot_end_time`: 24 hour JST time that clients will see as the end of maintenance period. Leave blank for no maintenance time. Default: ""
## Database
- `host`: Host of the database. Default `localhost`
- `username`: Username of the account the server should connect to the database with. Default `aime`

765
docs/game_specific_info.md Normal file
View File

@ -0,0 +1,765 @@
# ARTEMiS Games Documentation
Below are all supported games with supported version ids in order to use
the corresponding importer and database upgrades.
**Important: The described database upgrades are only required if you are using an old database schema, e.g. still using the megaime database. Clean installations always create the latest database structure!**
To upgrade the core database and the database for every game, execute:
```shell
python dbutils.py autoupgrade
```
# Table of contents
- [Supported Games](#supported-games)
- [CHUNITHM](#chunithm)
- [crossbeats REV.](#crossbeats-rev)
- [maimai DX](#maimai-dx)
- [O.N.G.E.K.I.](#o-n-g-e-k-i)
- [Card Maker](#card-maker)
- [WACCA](#wacca)
- [Sword Art Online Arcade](#sao)
- [Initial D THE ARCADE](#initial-d-the-arcade)
# Supported Games
Games listed below have been tested and confirmed working.
## CHUNITHM
### SDBT
| Version ID | Version Name |
| ---------- | --------------------- |
| 0 | CHUNITHM |
| 1 | CHUNITHM PLUS |
| 2 | CHUNITHM AIR |
| 3 | CHUNITHM AIR PLUS |
| 4 | CHUNITHM STAR |
| 5 | CHUNITHM STAR PLUS |
| 6 | CHUNITHM AMAZON |
| 7 | CHUNITHM AMAZON PLUS |
| 8 | CHUNITHM CRYSTAL |
| 9 | CHUNITHM CRYSTAL PLUS |
| 10 | CHUNITHM PARADISE |
### SDHD/SDBT
| Version ID | Version Name |
| ---------- | ------------------- |
| 11 | CHUNITHM NEW!! |
| 12 | CHUNITHM NEW PLUS!! |
| 13 | CHUNITHM SUN |
| 14 | CHUNITHM SUN PLUS |
### Importer
In order to use the importer locate your game installation folder and execute:
```shell
python read.py --game SDBT --version <version ID> --binfolder /path/to/game/folder --optfolder /path/to/game/option/folder
```
The importer for Chunithm will import: Events, Music, Charge Items and Avatar Accessories.
### Config
Config file is located in `config/chuni.yaml`.
| Option | Info |
|------------------|----------------------------------------------------------------------------------------------------------------|
| `news_msg` | If this is set, the news at the top of the main screen will be displayed (up to Chunithm Paradise Lost) |
| `name` | If this is set, all players that are not on a team will use this one by default. |
| `use_login_bonus`| This is used to enable the login bonuses |
| `crypto` | This option is used to enable the TLS Encryption |
**If you would like to use network encryption, add the following section to the config; the key, IV and hash are all required:**
```yaml
crypto:
  encrypted_only: False
  keys:
    13: ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
```
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDBT upgrade
```
### Online Battle
**Only matchmaking (with your imaginary friends) is supported! Online Battle does not (yet?) work!**
The first person to start the Online Battle (from now on called the host) will create a "matching room" with a given `roomId`; after that, up to 3 other people can join the created room.
Unused slots during the matchmaking will be filled with CPUs after the timer runs out.
As soon as a new member joins the room, the timer will jump back to 60 seconds again.
Sending those 4 messages to all other users is also working properly.
In order to use the Online Battle every user needs the same ICF, the same rom version and the same data version!
If a room is full, a new room will be created when another user starts an Online Battle.
After a failed Online Battle the room will be deleted. The host is used for the timer countdown, so if the connection to the host fails, the timer will stop and could create a "frozen" state.
#### Information/Problems:
- Online Battle uses UDP hole punching and opens port 50201?
- `reflectorUri` seems related to that?
- Timer countdown should be handled globally and not by one user
- Game can freeze or can crash if someone (especially the host) leaves the matchmaking
### Rivals
You can configure up to 4 rivals in Chunithm on a per-user basis. There is no UI to do this currently, so in the database, you can do this:
```sql
INSERT INTO aime.chuni_item_favorite (user, version, favId, favKind) VALUES (<user1>, <version>, <user2>, 2);
INSERT INTO aime.chuni_item_favorite (user, version, favId, favKind) VALUES (<user2>, <version>, <user1>, 2);
```
Note that the version **must match**, otherwise song lookup may not work.
### Teams
You can also configure teams for users to be on. There is no UI to do this currently, so in the database, you can do this:
```sql
INSERT INTO aime.chuni_profile_team (teamName) VALUES (<teamName>);
```
Team names can be regular ASCII, and they will be displayed ingame.
### Favorite songs
You can set the songs that will be in a user's Favorite Songs category using the following SQL entries:
```sql
INSERT INTO aime.chuni_item_favorite (user, version, favId, favKind) VALUES (<user>, <version>, <songId>, 1);
```
The songId is based on the actual ID within your version of Chunithm.
## crossbeats REV.
### SDCA
| Version ID | Version Name |
| ---------- | ---------------------------------- |
| 0 | crossbeats REV. |
| 1 | crossbeats REV. SUNRISE |
| 2 | crossbeats REV. SUNRISE S2 |
| 3 | crossbeats REV. SUNRISE S2 Omnimix |
### Importer
In order to use the importer you need to use the provided `Export.csv` file:
```shell
python read.py --game SDCA --version <version ID> --binfolder titles/cxb/data
```
The importer for crossbeats REV. will import Music.
### Config
Config file is located in `config/cxb.yaml`.
## maimai DX
### Versions
| Game Code | Version ID | Version Name |
| --------- | ---------- | ----------------------- |
| SBXL | 0 | maimai |
| SBXL | 1 | maimai PLUS |
| SBZF | 2 | maimai GreeN |
| SBZF | 3 | maimai GreeN PLUS |
| SDBM | 4 | maimai ORANGE |
| SDBM | 5 | maimai ORANGE PLUS |
| SDCQ | 6 | maimai PiNK |
| SDCQ | 7 | maimai PiNK PLUS |
| SDDK | 8 | maimai MURASAKi |
| SDDK | 9 | maimai MURASAKi PLUS |
| SDDZ | 10 | maimai MiLK |
| SDDZ | 11 | maimai MiLK PLUS |
| SDEY | 12 | maimai FiNALE |
| SDEZ | 13 | maimai DX |
| SDEZ | 14 | maimai DX PLUS |
| SDEZ | 15 | maimai DX Splash |
| SDEZ | 16 | maimai DX Splash PLUS |
| SDEZ | 17 | maimai DX UNiVERSE |
| SDEZ | 18 | maimai DX UNiVERSE PLUS |
| SDEZ | 19 | maimai DX FESTiVAL |
| SDEZ | 20 | maimai DX FESTiVAL PLUS |
### Importer
In order to use the importer locate your game installation folder and execute:
DX:
```shell
python read.py --game <Game Code> --version <Version ID> --binfolder /path/to/StreamingAssets --optfolder /path/to/game/option/folder
```
Pre-DX:
```shell
python read.py --game <Game Code> --version <Version ID> --binfolder /path/to/data --optfolder /path/to/patch/data
```
The importer for maimai DX will import Events, Music and Tickets.
The importer for maimai Pre-DX will import Events and Music. Not all games will have patch data. Milk - Finale have file encryption, and need an AES key. That key is not provided by the developers. For games that do use encryption, provide the key, as a hex string, with the `--extra` flag. Ex `--extra 00112233445566778899AABBCCDDEEFF`
**Important: It is required to use the importer because some games may not function properly or even crash without Events!**
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDEZ upgrade
```
Pre-Dx uses the same database as DX, so only upgrade using the SDEZ game code!
## Hatsune Miku Project Diva
### SBZV
| Version ID | Version Name |
| ---------- | ------------------------------- |
| 0 | Project Diva Arcade |
| 1 | Project Diva Arcade Future Tone |
### Importer
In order to use the importer locate your game installation folder and execute:
```shell
python read.py --game SBZV --version <version ID> --binfolder /path/to/game/data/diva --optfolder /path/to/game/data/diva/mdata
```
The importer for Project Diva Arcade will import all required data in order to use
the Shop, Modules and Customizations.
### Config
Config file is located in `config/diva.yaml`.
| Option | Info |
| -------------------- | ----------------------------------------------------------------------------------------------- |
| `unlock_all_modules` | Unlocks all modules (costumes) by default, if set to `False` all modules need to be purchased |
| `unlock_all_items` | Unlocks all items (customizations) by default, if set to `False` all items need to be purchased |
### Custom PV Lists (databanks)
In order to use custom PV Lists, simply drop in your .dat files inside of /titles/diva/data/ and make sure they are called PvList0.dat, PvList1.dat, PvList2.dat, PvList3.dat and PvList4.dat exactly.
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SBZV upgrade
```
## O.N.G.E.K.I.
### SDDT
| Version ID | Version Name |
| ---------- | -------------------------- |
| 0 | O.N.G.E.K.I. |
| 1 | O.N.G.E.K.I. + |
| 2 | O.N.G.E.K.I. SUMMER |
| 3 | O.N.G.E.K.I. SUMMER + |
| 4 | O.N.G.E.K.I. R.E.D. |
| 5 | O.N.G.E.K.I. R.E.D. + |
| 6 | O.N.G.E.K.I. bright |
| 7 | O.N.G.E.K.I. bright MEMORY |
### Importer
In order to use the importer locate your game installation folder and execute:
```shell
python read.py --game SDDT --version <version ID> --binfolder /path/to/game/folder --optfolder /path/to/game/option/folder
```
The importer for O.N.G.E.K.I. will import all Cards, Music and Events.
**NOTE: The Importer is required for Card Maker.**
### Config
Config file is located in `config/ongeki.yaml`.
| Option | Info |
| ---------------- | -------------------------------------------------------------------------------------------------------------- |
| `enabled_gachas` | Enter all gacha IDs for Card Maker to work, other than default may not work due to missing cards added to them |
| `crypto` | This option is used to enable the TLS Encryption |
Note: 1149 and higher are only for Card Maker 1.35 and higher and will be ignored on lower versions.
**If you would like to use network encryption, add the following section to the config; the key, IV and hash are all required:**
```yaml
crypto:
  encrypted_only: False
  keys:
    7: ["0000000000000000000000000000000000000000000000000000000000000000", "00000000000000000000000000000000", "0000000000000000"]
```
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDDT upgrade
```
### Controlling Events (Ranking Event, Technical Challenge Event, Mission Event)
Events are controlled by 2 types of enabled events:
- RankingEvent (type 6), TechChallengeEvent (type 17)
- AcceptRankingEvent (type 7), AcceptTechChallengeEvent (type 18)
Both the Ranking and the Accept event must be enabled for an event to function properly.
An event will run for the time specified in its startDate and endDate.
AcceptRankingEvent and AcceptTechChallengeEvent are the reward periods for events: they specify from which startDate until which endDate you can collect the rewards for attending the event, so the reward period must start in the future, e.g.:
- RankingEvent startDate 2023-12-01 - endDate 2023-12-30 - period in which the whole event is running
- AcceptRankingEvent startDate 2023-12-23 - endDate 2023-12-30 - period in which you can collect rewards for the event
If the player misses the AcceptRankingEvent period, the ranking will be invalidated and they will receive the lowest reward from the event (typically 500x money).
Technical Challenge Song List:
Songs that are used for the Technical Challenge are not stored anywhere in the data files, so you need to fill the database table yourself. You can gather all songs that should be in Technical Challenges from Japanese ONGEKI wikis, or you can create your own sets:
Database table: `ongeki_static_tech_music`
```
id: Id in table, just increment for each entry
version: version of the game you want the tech challenge to be in (from RED and up)
eventId: Id of the event in ongeki_static_events, insert the Id of the TechChallengeEvent (type 17) you want the song be assigned to
musicId: Id of the song you want to add, use songId from ongeki_static_music table
level: Difficulty of the song you want to track during the event, from 0(basic) to 3(master)
```
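As an illustration, a single Technical Challenge song entry could be added like this (all values are placeholders; use the IDs from your own `ongeki_static_events` and `ongeki_static_music` tables):
```sql
INSERT INTO aime.ongeki_static_tech_music (id, version, eventId, musicId, level)
VALUES (1, <version>, <eventId of the type 17 event>, <songId from ongeki_static_music>, 3);
```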
The current implementation of Ranking and Technical Challenge Events is updated on every profile save to the network, and ranked on each player login; in the official specification, the calculation of the current rank on the network should be done in the maintenance window.
The Mission Event (type 13) is a monthly type of event, used when another event doesn't have its own Ranking or Technical Challenge Event running. Only one Mission Event should be running at a time, so enable only the specific Mission you want to run currently on the network.
If you're often trying fresh cards, registering new profiles etc., you can also consider disabling all Announcement Events (type 1), as that will disable all the banners that pop up on login (they show up only once though, so if you click through them once they won't show again).
Events of type 2 in the database are Advertisement Movies; enable only the one you want to currently play, and disable the others.
Present and Reward List: populate the reward list using read.py.
Create a present for players by adding an entry in `ongeki_static_present_list`:
```
id: unique for each entry
version: game version you want the present be in
presentId: id of the present - starts with 1001 and go up from that, must be unique for each reward(don't set multiple rewardIds with same presentId)
presentName: present name which will be shown on the bottom when received
rewardId: ID of item from ongeki_static_rewards
stock: how many you want to give (like 5 copies of same card, or 10000 money, etc.)
message: no idea, can be left empty
startDate: date when to start giving out
endDate: date when ends
```
After inserting the present into the table, add the presentId into the player's `ongeki_static_item`, where itemKind is 9, itemId is the presentId, stock is set to 1 and isValid to 1.
After that, the present should be received on the next login (or whenever it is supposed to happen).
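A hypothetical entry for `ongeki_static_present_list`, using the columns described above (all values are placeholders):
```sql
INSERT INTO aime.ongeki_static_present_list
  (id, version, presentId, presentName, rewardId, stock, message, startDate, endDate)
VALUES
  (1, <version>, 1001, 'Launch gift', <rewardId from ongeki_static_rewards>, 5, '', '2023-12-01', '2023-12-30');
```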
## Card Maker
### SDED
| Version ID | Version Name |
| ---------- | --------------- |
| 0 | Card Maker 1.30 |
| 1 | Card Maker 1.35 |
### Support status
#### Card Maker 1.30:
* CHUNITHM NEW!!: Yes
* maimai DX UNiVERSE: Yes
* O.N.G.E.K.I. bright: Yes
#### Card Maker 1.35:
* CHUNITHM:
* NEW!!: Yes
* NEW PLUS!!: Yes (added in A028)
* SUN: Yes (added in A032)
* maimai DX:
* UNiVERSE PLUS: Yes
* FESTiVAL: Yes (added in A031)
* FESTiVAL PLUS: Yes (added in A035)
* O.N.G.E.K.I. bright MEMORY: Yes
### Importer
In order to use the importer you need to use the provided `.csv` files (which are required for O.N.G.E.K.I.) and the
option folders:
```shell
python read.py --game SDED --version <version ID> --binfolder titles/cm/cm_data --optfolder /path/to/cardmaker/option/folder
```
**If you haven't already executed the O.N.G.E.K.I. importer, make sure you import all cards!**
```shell
python read.py --game SDDT --version <version ID> --binfolder /path/to/game/folder --optfolder /path/to/game/option/folder
```
Also make sure to import all maimai DX and CHUNITHM data as well:
```shell
python read.py --game SDED --version <version ID> --binfolder /path/to/cardmaker/CardMaker_Data
```
The importer for Card Maker will import all required Gachas (Banners) and cards (for maimai DX/CHUNITHM) and the hardcoded
Cards for each Gacha (O.N.G.E.K.I. only).
**NOTE: Without executing the importer Card Maker WILL NOT work!**
### Config setup
Make sure to update your `config/cardmaker.yaml` with the correct version for each game. To get the current version required to run a specific game, open every opt (Axxx) folder descending until you find all three folders:
- `MU3`: O.N.G.E.K.I.
- `MAI`: maimai DX
- `CHU`: CHUNITHM
Inside each folder is a `DataConfig.xml` file, for example:
`MU3/DataConfig.xml`:
```xml
<cardMakerVersion>
<major>1</major>
<minor>35</minor>
<release>3</release>
</cardMakerVersion>
```
Now update your `config/cardmaker.yaml` with the correct version number, for example:
```yaml
version:
  1: # Card Maker 1.35
    ongeki: 1.35.03
```
For now you also need to update your `config/ongeki.yaml` with the correct version number, for example:
```yaml
version:
  7: # O.N.G.E.K.I. bright MEMORY
    card_maker: 1.35.03
```
### O.N.G.E.K.I.
Gacha "無料ガチャ" can only pull from the free cards with the following probabilities: 94%: R, 5% SR and 1% chance of
getting an SSR card
Gacha "無料ガチャSR確定" can only pull from free SR cards with prob: 92% SR and 8% chance of getting an SSR card
Gacha "レギュラーガチャ" can pull from every card added to ongeki_static_cards with the following prob: 77% R, 20% SR
and 3% chance of getting an SSR card
All other (limited) gachas can pull from every card added to ongeki_static_cards but with the promoted cards
(click on the green button under the banner) having a 10 times higher chance to get pulled
### CHUNITHM
All cards in CHUNITHM (basically just the characters) have the same rarity, so it just pulls randomly from all cards
of a given gacha, but makes sure you cannot pull the same card twice in the same 5-pull gacha roll.
### maimai DX
Printed maimai DX cards: Freedom (`cardTypeId=6`) or Gold Pass (`cardTypeId=4`) can now be selected during the login process. You can only have ONE Freedom and ONE Gold Pass active at a given time. The cards will expire after 15 days.
Thanks GetzeAvenue for the `selectedCardList` rarity hint!
### Notes
Card Maker 1.30-1.34 will only load an O.N.G.E.K.I. Bright profile (1.30). Card Maker 1.35+ will only load an O.N.G.E.K.I.
Bright Memory profile (1.35).
The gachas inside the `config/ongeki.yaml` will make sure only the right gacha ids for the right CM version will be loaded.
Gacha IDs up to 1140 will be loaded for CM 1.34 and all gachas will be loaded for CM 1.35.
## WACCA
### SDFE
| Version ID | Version Name |
| ---------- | ------------- |
| 0 | WACCA |
| 1 | WACCA S |
| 2 | WACCA Lily |
| 3 | WACCA Lily R |
| 4 | WACCA Reverse |
### Importer
In order to use the importer locate your game installation folder and execute:
```shell
python read.py --game SDFE --version <version ID> --binfolder /path/to/game/WindowsNoEditor/Mercury/Content
```
The importer for WACCA will import all Music data.
### Config
Config file is located in `config/wacca.yaml`.
| Option | Info |
| ------------------ | --------------------------------------------------------------------------- |
| `always_vip` | Enables/Disables VIP, if disabled it needs to be purchased manually in game |
| `infinite_tickets` | Always set the "unlock expert" tickets to 5 |
| `infinite_wp` | Sets the user WP to `999999` |
| `enabled_gates` | Enter all gate IDs which should be enabled in game |
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDFE upgrade
```
### VIP Rewards
Below is a list of VIP rewards. Currently, VIP is not implemented, and thus these are not obtainable. These 23 rewards were distributed once per month for VIP users on the real network.
Plates:
211004 リッチ
211018 特盛えりざべす
211025 イースター
211026 特盛りりぃ
311004 ファンシー
311005 インカンテーション
311014 夜明け
311015 ネイビー
311016 特盛るーん
Ring Colors:
203002 Gold Rushイエロー
203009 トロピカル
303005 ネイチャー
Icons:
202020 どらみんぐ
202063 ユニコーン
202086 ゴリラ
302014 ローズ
302015 ファラオ
302045 肉球
302046 WACCA
302047 WACCA Lily
302048 WACCA Reverse
Note Sound Effect:
205002 テニス
205008 シャワー
305003 タンバリンMk-Ⅱ
## SAO
### SDEW
| Version ID | Version Name |
| ---------- | ------------ |
| 0 | SAO |
### Importer
In order to use the importer locate your game installation folder and execute:
```shell
python read.py --game SDEW --version <version ID> --binfolder /path/to/game/extractedassets
```
The importer for SAO will import all items, heroes, support skills and titles data.
### Config
Config file is located in `config/sao.yaml`.
| Option | Info |
| --------------- | ----------------------------------------------------------------- |
| `hostname` | Changes the server listening address for Mucha |
| `port` | Changes the listening port |
| `auto_register` | Allows the game to handle the automatic registration of new cards |
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDEW upgrade
```
### Notes
- Defrag Match will crash at loading
- Co-Op Online is not supported
- The Shop is not functional
- The player title is currently static and cannot be changed in-game
- QR Card Scanning currently only loads a static hero
**Network hashing in GssSite.dll must be disabled**
### Credits for SAO support:
- Midorica - Limited Network Support
- Dniel97 - Helping with network base
- tungnotpunk - Source
## Initial D THE ARCADE
### SDGT
| Version ID | Version Name |
| ---------- | ----------------------------- |
| 0 | Initial D THE ARCADE Season 1 |
| 1 | Initial D THE ARCADE Season 2 |
**Important: Only version 1.50.00 (Season 2) is currently working and actively supported!**
### Profile Importer
In order to use the profile importer, download the `idac_profile.json` file from the frontend
and either directly use the folder path with `idac_profile.json` in it, or specify the complete
path to the `.json` file.
```shell
python read.py --game SDGT --version <Version ID> --optfolder /path/to/game/download/folder
```
The importer for SDGT will import the complete profile data with personal high scores as well.
### Config
Config file is located in `config/idac.yaml`.
| Option | Info |
| ----------------------------- | ----------------------------------------------------------------------------------------------------------- |
| `ssl` | Enables/Disables the use of the `ssl_cert` and `ssl_key` (currently unsupported) |
| `matching_host` | IPv4 address of your PC for the Online Battle (currently unsupported) |
| `port_matching` | Port number for the Online Battle Matching |
| `port_echo1/2` | Port numbers for Echos |
| `port_matching_p2p` | Port number for Online Battle (currently unsupported) |
| `stamp.enable` | Enables/Disables the play stamp events |
| `stamp.enabled_stamps` | Define up to 3 play stamp events (without `.json` extension, which are placed in `titles/idac/data/stamps`) |
| `timetrial.enable` | Enables/Disables the time trial event |
| `timetrial.enabled_timetrial` | Define exactly one time trial event (without `.json` extension, which is placed in `titles/idac/data/timetrial`) |
### Database upgrade
Always make sure your database (tables) are up-to-date:
```shell
python dbutils.py --game SDGT upgrade
```
### Notes
- Online Battle is not supported
- Online Battle Matching is not supported
### Item categories
| Category ID | Category Name |
| ----------- | ------------------------ |
| 1 | D Coin |
| 3 | Car Dressup Token |
| 5 | Avatar Dressup Token |
| 6 | Tachometer |
| 7 | Aura |
| 8 | Aura Color |
| 9 | Avatar Face |
| 10 | Avatar Eye |
| 11 | Avatar Mouth |
| 12 | Avatar Hair |
| 13 | Avatar Glasses |
| 14 | Avatar Face accessories |
| 15 | Avatar Body |
| 18 | Avatar Background |
| 21 | Chat Stamp |
| 22 | Keychain |
| 24 | Title |
| 25 | FullTune Ticket |
| 26 | Paper Cup |
| 27 | BGM |
| 28 | Drifting Text |
| 31 | Start Menu BG |
| 32 | Car Color/Paint |
| 33 | Aura Level |
| 34 | FullTune Ticket Fragment |
| 35 | Underneon Lights |
### TimeRelease Chapter:
1. Story: 1, 2, 3, 4, 5, 6, 7, 8, 9, 19 (Chapter 10), (29 Chapter 11?)
2. MF Ghost: 10, 11, 12, 13, 14, 15
3. Bunta: 15, 16, 17, 18, 19, 20, (21, 21, 22?)
4. Special Event: 23, 24, 25, 26, 27, 28 (Touhou Project)
### TimeRelease Courses:
| Course ID | Course Name | Direction |
| --------- | ------------------------- | ------------------------ |
| 0 | Akina Lake(秋名湖) | CounterClockwise(左周り) |
| 2 | Akina Lake(秋名湖) | Clockwise(右周り) |
| 52 | Hakone(箱根) | Downhill(下り) |
| 54 | Hakone(箱根) | Hillclimb(上り) |
| 36 | Usui(碓氷) | CounterClockwise(左周り) |
| 38 | Usui(碓氷) | Clockwise(右周り) |
| 4 | Myogi(妙義) | Downhill(下り) |
| 6 | Myogi(妙義) | Hillclimb(上り) |
| 8 | Akagi(赤城) | Downhill(下り) |
| 10 | Akagi(赤城) | Hillclimb(上り) |
| 12 | Akina(秋名) | Downhill(下り) |
| 14 | Akina(秋名) | Hillclimb(上り) |
| 16 | Irohazaka(いろは坂) | Downhill(下り) |
| 18 | Irohazaka(いろは坂) | Reverse(逆走) |
| 56 | Momiji Line(もみじライン) | Downhill(下り) |
| 58 | Momiji Line(もみじライン) | Hillclimb(上り) |
| 20 | Tsukuba(筑波) | Outbound(往路) |
| 22 | Tsukuba(筑波) | Inbound(復路) |
| 24 | Happogahara(八方ヶ原) | Outbound(往路) |
| 26 | Happogahara(八方ヶ原) | Inbound(復路) |
| 40 | Sadamine(定峰) | Downhill(下り) |
| 42 | Sadamine(定峰) | Hillclimb(上り) |
| 44 | Tsuchisaka(土坂) | Outbound(往路) |
| 46 | Tsuchisaka(土坂) | Inbound(復路) |
| 48 | Akina Snow(秋名雪) | Downhill(下り) |
| 50 | Akina Snow(秋名雪) | Hillclimb(上り) |
| 68 | Odawara(小田原) | Forward(順走) |
| 70 | Odawara(小田原) | Reverse(逆走) |
### Credits
- Bottersnike: For the HUGE Reverse Engineering help
- Kinako: For helping with the timeRelease unlocking of courses and special mode
A huge thanks to all the people who helped shape this project into what it is now and who don't want to be mentioned here.

41
docs/prod.md Normal file
View File

@ -0,0 +1,41 @@
# ARTEMiS Production mode
Production mode is a configuration option that changes how the server listens to be more friendly to a production environment. This mode assumes that a proxy (for this guide, nginx) is standing in front of the server to handle port mapping and TLS. In order to activate production mode, simply change `is_develop` to `False` in `core.yaml`. Next time you start the server, you should see "Starting server in production mode".
## Nginx Configuration
### Port forwarding
Artemis requires that the following ports be forwarded to allow internet traffic to access the server. This will not change regardless of what you set in the config, as many of these ports are hard-coded in the games.
`tcp:80` all.net, non-ssl titles
`tcp:8443` billing
`tcp:22345` aimedb
`tcp:443` frontend, SSL titles
### A note about external proxy services (cloudflare, etc)
Due to the way that artemis functions, it is currently not possible to put the server behind something like Cloudflare. Cloudflare only proxies web traffic on the standard ports (80, 443) and, as shown above, this does not work with artemis. Server administrators should seek other means to protect their network (VPS hosting, VPN, etc)
### SSL Certificates
You will need to generate SSL certificates for some games. The certificates vary in security and validity requirements. Please see the general guide below
- General Title: The certificate for the general title server should be valid, not self-signed and match the CN that the game will be reaching out to (i.e. if your games are reaching out to titles.hostname.here, your ssl certificate should be valid for titles.hostname.here, or *.hostname.here)
- CXB: Same requirements as the title server. It must not be self-signed, and the CN must match. Recommended to get a wildcard cert if possible, and use it for both Title and CXB
- Pokken: Pokken can be self-signed, and the CN doesn't have to match, but it MUST use 2048-bit RSA. Due to the game's age, anything stronger than that will be rejected.
### Port mappings
An example config is provided in the `config` folder called `nginx_example.conf`. It is set up for the following:
`naominet.jp:tcp:80` -> `localhost:tcp:8000` for allnet
`ib.naominet.jp:ssl:8443` -> `localhost:tcp:8444` for the billing server
`your.hostname.here:ssl:443` -> `localhost:tcp:8080` for the SSL title server
`your.hostname.here:tcp:80` -> `localhost:tcp:8080` for the non-SSL title server
`cxb.hostname.here:ssl:443` -> `localhost:tcp:8080` for crossbeats (appends /SDCA/104/ to the request)
`pokken.hostname.here:ssl:443` -> `localhost:tcp:8080` for pokken
`frontend.hostname.here:ssl:443` -> `localhost:tcp:8090` for the frontend, includes https redirection
If you're using this as a guide, be sure to replace your.hostname.here with the hostname you specified in core.yaml under `titles->hostname`. Do *not* change naominet.jp, or allnet/billing will fail. Also remember to specify certificate paths correctly, as in the example they are simply placeholders.
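For illustration, the allnet mapping above corresponds to an nginx server block roughly like the following (a minimal sketch; the bundled `nginx_example.conf` is the authoritative reference):
```
server {
    listen 80;
    server_name naominet.jp;

    location / {
        proxy_pass http://localhost:8000/;
    }
}
```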
### Multi-service ports
It is possible to use nginx to redirect billing and title server requests to the same port that all.net uses. By setting `port` to 0 under billing and title server, you can change the nginx config to serve the following (entries not shown here should be the same)
`ib.naominet.jp:ssl:8443` -> `localhost:tcp:8000` for the billing server
`your.hostname.here:ssl:443` -> `localhost:tcp:8000` for the SSL title server
`your.hostname.here:tcp:80` -> `localhost:tcp:8000` for the non-SSL title server
`cxb.hostname.here:ssl:443` -> `localhost:tcp:8000` for crossbeats (appends /SDCA/104/ to the request)
`pokken.hostname.here:ssl:443` -> `localhost:tcp:8000` for pokken
This will allow you to only use 3 ports locally, but you will still need to forward the same internet-facing ports as before.

11
entrypoint.sh Normal file
View File

@ -0,0 +1,11 @@
#!/bin/bash
if [[ -z "${CFG_DEV}" ]]; then
echo Production mode
python3 index.py
else
echo Development mode
python3 dbutils.py create
nodemon -w aime --legacy-watch index.py
fi

Some files were not shown because too many files have changed in this diff