Added new ROM import system utilizing WAL to avoid locking the database and freezing the frontend
Also added a new logging setup to stream progress from the scrape process
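The WAL switch described in the commit message can be demonstrated with Python's stdlib `sqlite3`; a minimal sketch (the database path is illustrative). In WAL mode readers are not blocked by a writer, which is what keeps the web frontend responsive while the scraper ingests ROMs:

```python
import sqlite3

# Open (or create) a database and switch it to write-ahead logging.
conn = sqlite3.connect("roms.db")
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
# Give writers up to 5 seconds to wait on a lock instead of failing fast.
conn.execute("PRAGMA busy_timeout=5000")
print(mode)  # prints "wal" once the switch succeeds (file-backed DBs only)
conn.close()
```

Note that `journal_mode=WAL` is persistent: once set, the database stays in WAL mode for subsequent connections, so the frontend benefits even when it opens the file separately.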
CLAUDE.md (100 lines)
@@ -1,100 +0,0 @@

# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Development Commands

The project uses devenv.nix for development environment management. All commands should be run from the repository root:

- `tests` - Run pytest with coverage
- `lint` - Check code with ruff and black
- `fix` - Auto-fix code issues with ruff and black
- `typecheck` - Run pyright type checking
- `run` - Execute the main ROM scraper application
- `serve` - Start the FastAPI web server
- `create-admin` - Create initial admin user for web interface
- `migrate` - Database migration management (see Migration Commands)
- `db-init` - Initialize database schema (first-time setup)
- `db-upgrade` - Apply pending database migrations
- `db-create` - Create new migration (requires message argument)
- `build` - Build the application using build.sh (creates zipapp in release/)

## Architecture

This is a Python ROM metadata scraper and web-based ROM management system for DOS games:

### Core Components

- **Main Application** (`src/__main__.py`): Async scraper that scans ROM directories, fetches metadata from the IGDB API, and stores everything in SQLite
- **Web Application** (`src/webapp.py`): FastAPI server with user authentication, ROM browsing, downloads, and admin interface
- **Configuration** (`src/libs/config.py`): XDG-compliant config management with automatic setup prompts
- **Database Layer** (`src/libs/database.py`): SQLAlchemy models with many-to-many relationships for games, metadata, genres, tags, and users
- **Authentication** (`src/libs/auth.py`): JWT-based auth with bcrypt password hashing and role-based access control
- **Data Models** (`src/libs/objects.py`): Dataclasses for Game, Metadata, and Roms collections
- **API Integration** (`src/libs/apis.py`): IGDB API client with Twitch OAuth authentication
- **Utilities** (`src/libs/functions.py`): Title cleaning and year extraction from ROM filenames
### Data Flow

**ROM Scraping:**
1. Compares filesystem ROMs with database entries to avoid re-indexing
2. Authenticates with IGDB via Twitch OAuth using client credentials
3. Scrapes metadata for new games only with rate limiting (4 concurrent requests)
4. Stores normalized data in SQLite with proper foreign key relationships
5. Handles duplicate games and metadata updates gracefully
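Step 1 above amounts to a set difference on some stable key; a minimal sketch assuming filenames are that key (the real code may key on full paths or hashes instead):

```python
from pathlib import Path

def find_new_roms(rom_dir: Path, indexed_names: set[str]) -> list[Path]:
    """Return ROM files on disk that are not yet present in the database index."""
    on_disk = {p for p in rom_dir.rglob("*") if p.is_file()}
    return sorted(p for p in on_disk if p.name not in indexed_names)
```

Only the files this returns are sent through the (slow, rate-limited) IGDB scrape, which is what makes repeated runs cheap.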
**Web Interface:**
1. FastAPI serves a modern responsive web interface with Tailwind CSS
2. JWT-based authentication with three user roles: demo, normal, super
3. Demo users can browse but not download; normal users get full access; super users can manage everything
4. Pagination, favorites system, and file downloads for authorized users
5. Admin interface for user management and metadata editing
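The three-role scheme in steps 2-3 can be expressed as an ordered hierarchy; a sketch of the access checks (names are illustrative, the real implementation lives in `src/libs/auth.py`):

```python
from enum import IntEnum

class Role(IntEnum):
    # Ordered so comparisons express privilege: DEMO < NORMAL < SUPER.
    DEMO = 0
    NORMAL = 1
    SUPER = 2

def can_download(role: Role) -> bool:
    """Demo users may browse but not download."""
    return role >= Role.NORMAL

def can_manage_users(role: Role) -> bool:
    """Only super users reach the admin interface."""
    return role >= Role.SUPER
```

Using an ordered enum keeps the checks one-liners and means adding an intermediate role later does not require touching every call site.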
### Key Technical Details

- Uses asyncio with semaphore-based rate limiting for API requests
- SQLAlchemy with declarative base and proper naming conventions
- FastAPI with Jinja2 templates, JWT authentication, and role-based access control
- Configuration supports both environment variables and .env files
- Custom PathType for storing pathlib.Path objects in database
- Batch processing for database operations with configurable batch sizes
- Modern responsive UI with Tailwind CSS and Alpine.js for interactivity
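The "Custom PathType" bullet refers to a SQLAlchemy `TypeDecorator`; a sketch of how such a type is typically written (the project's actual class may differ in details):

```python
from pathlib import Path
from sqlalchemy.types import TypeDecorator, String

class PathType(TypeDecorator):
    """Store pathlib.Path values as plain strings in SQLite."""
    impl = String
    cache_ok = True

    def process_bind_param(self, value, dialect):
        # Python -> DB: serialize the Path to its string form.
        return str(value) if value is not None else None

    def process_result_value(self, value, dialect):
        # DB -> Python: rehydrate the stored string into a Path.
        return Path(value) if value is not None else None
```

Columns declared as `Column(PathType)` then accept and return `Path` objects transparently, so the rest of the code never juggles raw strings.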
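The semaphore-based rate limiting (4 concurrent requests) follows a standard asyncio pattern; a self-contained sketch with a stub in place of the real IGDB HTTP call:

```python
import asyncio

SEMAPHORE_LIMIT = 4  # matches the "4 concurrent requests" cap described above

async def fetch_metadata(title: str, sem: asyncio.Semaphore) -> str:
    async with sem:  # at most 4 coroutines execute this body at once
        await asyncio.sleep(0.01)  # stand-in for the real IGDB request
        return f"metadata for {title}"

async def scrape_all(titles: list[str]) -> list[str]:
    sem = asyncio.Semaphore(SEMAPHORE_LIMIT)
    # gather preserves input order even though completion order varies
    return await asyncio.gather(*(fetch_metadata(t, sem) for t in titles))

results = asyncio.run(scrape_all(["Doom", "Keen", "Jazz"]))
print(results[0])  # -> metadata for Doom
```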
## Database Migrations

The project uses Alembic for database schema versioning and migrations:

### First-Time Setup
```bash
db-init        # Initialize database with current schema
migrate stamp  # Mark database as up-to-date with migrations
```

### Migration Management
```bash
migrate create "description"  # Create new migration file
migrate upgrade               # Apply all pending migrations
migrate current               # Show current database revision
migrate history               # Show migration history
migrate check                 # Check database migration status
```

### Schema Changes
1. Modify models in `src/libs/database.py`
2. Create migration: `migrate create "description of changes"`
3. Review generated migration file in `migrations/versions/`
4. Apply migration: `migrate upgrade`

### Migration Files
- Located in `migrations/versions/`
- Named with revision ID and description
- Contain `upgrade()` and `downgrade()` functions
- Support batch operations for SQLite compatibility

## Environment Setup

Requires IGDB API credentials:
- `IGDB_CLIENT_ID` - Twitch client ID
- `IGDB_SECRET_KEY` - Twitch client secret

Can be provided via environment variables or a `.env` file in the project root.
DOCKER.md (112 lines)
@@ -1,112 +0,0 @@

# DosVault Docker Deployment

## Quick Start

1. **Copy the environment template:**
   ```bash
   cp .env.example .env
   ```

2. **Edit `.env` with your configuration:**
   - Set `IGDB_CLIENT_ID` and `IGDB_SECRET_KEY` (required)
   - Set `ROMS_PATH` to your ROM collection directory
   - Set `DOSVAULT_ADMIN_USERNAME` to your admin username
   - Set `DOSVAULT_ADMIN_EMAIL` to your admin email
   - Set `DOSVAULT_ADMIN_PASSWORD` to your admin password
   - Optionally customize host/port settings

3. **Start the application:**
   ```bash
   docker-compose up -d
   ```

4. **Create admin user:**
   ```bash
   docker-compose exec dosvault python src/create_admin.py
   ```

5. **Access the application:**
   - Web interface: http://localhost:8080
   - Admin panel: http://localhost:8080/admin

## Configuration

### Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| `IGDB_CLIENT_ID` | Yes | Twitch API Client ID |
| `IGDB_SECRET_KEY` | Yes | Twitch API Client Secret |
| `ROMS_PATH` | No | Path to ROM collection (default: ./roms) |
| `DOSFRONTEND_CONFIG_DIR` | No | Application data directory (default: /app/data) |

### Configuration Persistence

Configuration changes made through the web interface are automatically persisted to the mounted volume:

- **In Docker**: Configuration is stored in `/app/data/config.json` (mounted volume)
- **Regular install**: Configuration is stored in `~/.config/dosfrontend/config.json`
- **File structure**: All application data uses the same base directory:
  - `config.json` - Main configuration file
  - `roms.db` - SQLite database
  - `images/` - Downloaded game artwork
  - `logs/` - Application logs

### Volume Mounts

- `dosvault_data:/app/data` - Application data (database, images, logs)
- `${ROMS_PATH}:/app/data/roms:ro` - ROM collection (read-only)

## Database Management

### Initialize Database
```bash
docker-compose exec dosvault python src/migrate.py init
```

### Run Migrations
```bash
docker-compose exec dosvault python src/migrate.py upgrade
```

### Scrape ROM Metadata
```bash
docker-compose exec dosvault python -m src
```

## Maintenance

### View Logs
```bash
docker-compose logs -f dosvault
```

### Backup Database
```bash
docker-compose exec dosvault cp /app/data/roms.db /app/data/backup.db
docker cp $(docker-compose ps -q dosvault):/app/data/backup.db ./backup.db
```
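Caveat on the backup commands above: with the database now in WAL mode, a plain `cp` of `roms.db` taken while the app is writing can miss changes still sitting in the companion `-wal` file. Python's stdlib `sqlite3.Connection.backup` produces a consistent snapshot instead; a sketch (paths are illustrative):

```python
import sqlite3

def backup_database(src_path: str, dest_path: str) -> None:
    """Copy a live SQLite database safely, WAL contents included."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    src.backup(dest)  # pages are copied under SQLite's own locking
    dest.close()
    src.close()
```

Run inside the container, e.g. `backup_database("/app/data/roms.db", "/app/data/backup.db")`, then `docker cp` the snapshot out as shown above.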
### Update Application
```bash
docker-compose pull
docker-compose up -d
```

## Troubleshooting

### Check Container Health
```bash
docker-compose ps
```

### Access Container Shell
```bash
docker-compose exec dosvault bash
```

### Reset Data
```bash
docker-compose down -v
docker-compose up -d
```
Dockerfile (51 lines)
@@ -1,51 +0,0 @@

# Multi-stage Docker build for DosVault
FROM python:3.11-slim as base

# Set working directory
WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    curl \
    gcc \
    g++ \
    && rm -rf /var/lib/apt/lists/*

# Copy Python dependencies
COPY requirements.txt* pyproject.toml* ./

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt || \
    pip install --no-cache-dir fastapi uvicorn sqlalchemy alembic \
    aiohttp bcrypt python-jose python-multipart jinja2

# Copy application code
COPY src/ ./src/
COPY templates/ ./templates/
COPY migrations/ ./migrations/
COPY alembic.ini ./
COPY CLAUDE.md README.md ./
COPY entrypoint.sh ./

# Create necessary directories
RUN mkdir -p /app/data/logs /app/data/images /app/data/roms /app/data/metadata

# Set environment variables
ENV PYTHONPATH=/app/src
ENV DOSFRONTEND_CONFIG_DIR=/app/data

# Expose ports
EXPOSE 8080 8081

# Create non-root user
RUN useradd -m -u 1000 dosvault && \
    chown -R dosvault:dosvault /app
USER dosvault

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:8080/health || exit 1

# Default command
# CMD ["python", "-m", "uvicorn", "src.webapp:app", "--host", "0.0.0.0", "--port", "8080"]
ENTRYPOINT ["./entrypoint.sh"]
README.md (197 lines)
@@ -1,197 +0,0 @@

# 🎮 DosVault

**Your Personal DOS Game Collection Manager**

DosVault is a modern, web-based collection manager for DOS games that combines powerful metadata scraping with an intuitive browsing experience. Built with Python and FastAPI, it helps you organize, discover, and manage your retro gaming library with style.

## ✨ Features

### 🎯 Core Functionality
- **Automatic Metadata Scraping** - Pulls game information, cover art, and screenshots from IGDB API
- **Local Image Storage** - Downloads and caches all images locally for fast loading
- **Intelligent ROM Detection** - Scans directories and avoids re-indexing existing games
- **Advanced Search & Filtering** - Find games by title, genre, developer, or description
- **Genre & Tag Browsing** - Organized categorization with alphabetical sorting
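The search behavior described above (matching across title, genre, developer, and description) boils down to a case-insensitive substring test over several fields; a pure-Python sketch with a trimmed-down stand-in for the project's Game model (the real implementation queries SQL):

```python
from dataclasses import dataclass, field

@dataclass
class Game:
    # Minimal stand-in for the project's Game model.
    title: str
    developer: str = ""
    description: str = ""
    genres: list[str] = field(default_factory=list)

def matches(game: Game, query: str) -> bool:
    """Case-insensitive search across title, developer, description, and genres."""
    q = query.lower()
    haystacks = [game.title, game.developer, game.description, *game.genres]
    return any(q in h.lower() for h in haystacks)

library = [
    Game("Commander Keen", developer="id Software", genres=["Platformer"]),
    Game("Doom", developer="id Software", genres=["Shooter"]),
]
print([g.title for g in library if matches(g, "platform")])  # -> ['Commander Keen']
```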
### 🌐 Modern Web Interface
- **Responsive Design** - Works beautifully on desktop, tablet, and mobile
- **Multiple View Modes** - Switch between grid and list views
- **Interactive Screenshots** - Click to view full-screen image galleries
- **Smart Pagination** - Navigate large collections with ease
- **Real-time Favorites** - Heart games to build your personal collection

### 🔐 User Management
- **Role-Based Access Control** - Demo, Normal, and Super Admin roles
- **Secure Authentication** - JWT-based auth with bcrypt password hashing
- **Personal Favorites** - Each user maintains their own favorites list
- **Admin Dashboard** - User management and system overview

### 📱 Mobile-First
- **Hamburger Navigation** - Clean mobile menu system
- **Touch-Optimized** - Large buttons and smooth interactions
- **Responsive Controls** - Pagination and filters work great on mobile

## 🚀 Quick Start

### Prerequisites
- Python 3.11+
- [Devenv](https://devenv.sh/) (recommended) or manual dependency management
- IGDB API credentials (free from Twitch Developer Console)

### Installation

1. **Clone the repository:**
   ```bash
   git clone <repository-url>
   cd dosfrontend
   ```

2. **Set up environment:**
   ```bash
   # With devenv (recommended)
   devenv shell

   # Or manually install dependencies
   pip install fastapi uvicorn sqlalchemy alembic bcrypt python-jose aiohttp
   ```

3. **Configure IGDB API:**
   Create a `.env` file with your IGDB credentials:
   ```env
   IGDB_CLIENT_ID=your_twitch_client_id
   IGDB_SECRET_KEY=your_twitch_client_secret
   ```

4. **Initialize database:**
   ```bash
   db-init
   create-admin  # Create your first admin user
   ```

5. **Run the application:**
   ```bash
   serve  # Starts web server
   run    # Runs ROM scraper (optional)
   ```

6. **Access DosVault:**
   Open http://localhost:8080 in your browser

## 📁 Project Structure

```
dosfrontend/
├── src/
│   ├── __main__.py        # ROM scraper application
│   ├── webapp.py          # FastAPI web server
│   └── libs/
│       ├── config.py      # XDG-compliant configuration
│       ├── database.py    # SQLAlchemy models
│       ├── auth.py        # JWT authentication
│       ├── apis.py        # IGDB API integration
│       └── functions.py   # Utility functions
├── templates/             # Jinja2 HTML templates
├── migrations/            # Database schema versions
├── devenv.nix             # Development environment
└── CLAUDE.md              # Development guidance
```

## 🎮 Usage

### Scraping ROMs
```bash
# Scan ROM directories and fetch metadata
run
```

### Web Interface
```bash
# Start the web server
serve
```

### Database Management
```bash
# Create migrations
migrate create "description of changes"

# Apply migrations
migrate upgrade

# Check migration status
migrate current
```

### Administration
```bash
# Create admin user
create-admin

# Run tests
tests

# Code quality
lint
typecheck
```

## ⚙️ Configuration

DosVault uses XDG-compliant configuration stored in:
- **Linux/Mac:** `~/.config/dosfrontend/`
- **Windows:** `%APPDATA%/dosfrontend/`

Key configuration options:
- ROM directories to scan
- Image storage location
- Database path
- Web server host/port
- IGDB API credentials

## 🏗️ Architecture

### Backend
- **FastAPI** - Modern Python web framework
- **SQLAlchemy** - Database ORM with proper relationships
- **Alembic** - Database migration management
- **AsyncIO** - Concurrent API requests with rate limiting
- **JWT + BCrypt** - Secure authentication

### Frontend
- **Jinja2** - Server-side templating
- **Tailwind CSS** - Utility-first styling
- **Alpine.js** - Lightweight JavaScript framework
- **Responsive Design** - Mobile-first approach

### Data Flow
1. **Scraper** scans ROM directories and compares with database
2. **IGDB API** provides metadata via Twitch OAuth
3. **Images** are downloaded and cached locally
4. **Web interface** serves games with fast local assets
5. **Users** browse, search, and manage favorites

## 🤝 Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Run tests and linting (`tests`, `lint`)
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request

## 📝 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 🙏 Acknowledgments

- **IGDB** for providing comprehensive game metadata
- **Twitch** for OAuth authentication to IGDB API
- **FastAPI** for the excellent modern Python web framework
- **Tailwind CSS** for making responsive design a breeze
- **DOSBox** community for keeping retro gaming alive

---

**Built with ❤️ for retro gaming enthusiasts**
alembic.ini (94 lines)
@@ -1,94 +0,0 @@

# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migrations

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version number format to use with the --rev-id parameter
# to specify a starting revision
# version_num_format = %04d

# version_path_separator = :
# version_path_separator = os  # Use os.pathsep. Default configuration used on new projects.

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
build.sh (2 lines)
@@ -1,2 +0,0 @@

#!/usr/bin/env sh
python -m zipapp src --compress --output=release/dfe --python="/usr/bin/env python"
devenv.lock (103 lines)
@@ -1,103 +0,0 @@

{
  "nodes": {
    "devenv": {
      "locked": {
        "dir": "src/modules",
        "lastModified": 1756415044,
        "owner": "cachix",
        "repo": "devenv",
        "rev": "c570189b38b549141179647da3ddde249ac50fec",
        "type": "github"
      },
      "original": {
        "dir": "src/modules",
        "owner": "cachix",
        "repo": "devenv",
        "type": "github"
      }
    },
    "flake-compat": {
      "flake": false,
      "locked": {
        "lastModified": 1747046372,
        "owner": "edolstra",
        "repo": "flake-compat",
        "rev": "9100a0f413b0c601e0533d1d94ffd501ce2e7885",
        "type": "github"
      },
      "original": {
        "owner": "edolstra",
        "repo": "flake-compat",
        "type": "github"
      }
    },
    "git-hooks": {
      "inputs": {
        "flake-compat": "flake-compat",
        "gitignore": "gitignore",
        "nixpkgs": [
          "nixpkgs"
        ]
      },
      "locked": {
        "lastModified": 1755960406,
        "owner": "cachix",
        "repo": "git-hooks.nix",
        "rev": "e891a93b193fcaf2fc8012d890dc7f0befe86ec2",
        "type": "github"
      },
      "original": {
        "owner": "cachix",
        "repo": "git-hooks.nix",
        "type": "github"
      }
    },
    "gitignore": {
      "inputs": {
        "nixpkgs": [
          "git-hooks",
          "nixpkgs"
        ]
      },
      "locked": {
        "lastModified": 1709087332,
        "owner": "hercules-ci",
        "repo": "gitignore.nix",
        "rev": "637db329424fd7e46cf4185293b9cc8c88c95394",
        "type": "github"
      },
      "original": {
        "owner": "hercules-ci",
        "repo": "gitignore.nix",
        "type": "github"
      }
    },
    "nixpkgs": {
      "locked": {
        "lastModified": 1755783167,
        "owner": "cachix",
        "repo": "devenv-nixpkgs",
        "rev": "4a880fb247d24fbca57269af672e8f78935b0328",
        "type": "github"
      },
      "original": {
        "owner": "cachix",
        "ref": "rolling",
        "repo": "devenv-nixpkgs",
        "type": "github"
      }
    },
    "root": {
      "inputs": {
        "devenv": "devenv",
        "git-hooks": "git-hooks",
        "nixpkgs": "nixpkgs",
        "pre-commit-hooks": [
          "git-hooks"
        ]
      }
    }
  },
  "root": "root",
  "version": 7
}
devenv.nix (103 lines)
@@ -1,103 +0,0 @@

{ pkgs, lib, config, inputs, ... }:

{
  # https://devenv.sh/basics/

  # https://devenv.sh/packages/
  packages = with pkgs; [
    git
    curl
    pkg-config
    sqlite
    pyright
    pre-commit
  ];
  languages.python = {
    enable = true;
    package = pkgs.python313;
    libraries = with pkgs.python313Packages; [ ];
    venv = {
      enable = true;
      requirements = ''
        pudb
        ptpython
        ipython
        pytest
        pytest-cov
        flake8
        ptpython
        ipython
        isort
        pynvim
        ruff
        black
        sqlalchemy
        requests
        fastapi
        uvicorn
        jinja2
        python-multipart
        bcrypt
        python-jose
        passlib
        alembic
        aiohttp
      '';
    };
    # uv = {
    #   enable = false;
    #   sync.enable = true;
    # };
  };
  env = {
    PYTHONBREAKPOINT = "pudb.set_trace";
  };

  # https://devenv.sh/variables/
  # variables = {
  #   GREET = "world";
  # };

  # https://devenv.sh/scripts/
  scripts = {
    "tests".exec = "cd $REPO_ROOT && python -m pytest --rootdir=$REPO_ROOT -c $REPO_ROOT/pytest.ini";
    "lint".exec = "cd $REPO_ROOT && ${pkgs.ruff}/bin/ruff check . && black --check .";
    "fix".exec = "cd $REPO_ROOT && ${pkgs.ruff}/bin/ruff check . --fix && black .";
    "typecheck".exec = "cd $REPO_ROOT && pyright";
    "run".exec = ''cd $REPO_ROOT && ./src/__main__.py "$@"'';
    "serve".exec = "cd $REPO_ROOT && python src/webapp.py";
    "create-admin".exec = "cd $REPO_ROOT && python src/create_admin.py";
    "migrate".exec = "cd $REPO_ROOT && python src/migrate.py";
    "db-init".exec = "cd $REPO_ROOT && python src/migrate.py init";
    "db-upgrade".exec = "cd $REPO_ROOT && python src/migrate.py upgrade";
    "db-create".exec = "cd $REPO_ROOT && python src/migrate.py create";
    "build".exec = "cd $REPO_ROOT && ./build.sh";
    "backfill-images".exec = "cd $REPO_ROOT && python src/backfill_images.py";
    "export-requirements".exec = "pip freeze > requirements.txt";
    "export-reqs".exec = ''
      cd "$REPO_ROOT"
      printf "%s\n" '${config.languages.python.venv.requirements}' > requirements.txt
      echo "Wrote requirements.txt"
    '';
  };
  enterShell = ''
    export REPO_ROOT="$(git rev-parse --show-toplevel 2>/dev/null || pwd)"
  '';

  # https://devenv.sh/tasks/
  # tasks = {
  #   "myproj:setup".exec = "mytool build";
  #   "devenv:enterShell".after = [ "myproj:setup" ];
  # };

  # https://devenv.sh/tests/
  enterTest = ''
    echo "Running tests"
    pytest -q
  '';

  # https://devenv.sh/git-hooks/
  # git-hooks.hooks.shellcheck.enable = true;

  # See full reference at https://devenv.sh/reference/options/
}
@@ -1,29 +0,0 @@

services:
  dosvault:
    image: tty303/dosvault:latest
    ports:
      - "${DOSVAULT_PORT:-8080}:8080"
      - "${DOSVAULT_WEBSOCKET_PORT:-8081}:8081"
    volumes:
      - dosvault_data:/app/data
      - "${ROMS_PATH:-./roms}:/app/data/roms:ro"
    environment:
      # IGDB API Configuration
      - IGDB_CLIENT_ID=${IGDB_CLIENT_ID}
      - IGDB_SECRET_KEY=${IGDB_SECRET_KEY}
      # Application Configuration
      - DOSFRONTEND_CONFIG_DIR=/app/data
      - DOSVAULT_ADMIN_USERNAME=${DOSVAULT_ADMIN_USERNAME:-}
      - DOSVAULT_ADMIN_EMAIL=${DOSVAULT_ADMIN_EMAIL:-}
      - DOSVAULT_ADMIN_PASSWORD=${DOSVAULT_ADMIN_PASSWORD:-}
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

volumes:
  dosvault_data:
    driver: local
@@ -1,64 +0,0 @@

#!/usr/bin/env bash
set -euo pipefail

DATA_DIR="${DOSFRONTEND_CONFIG_DIR:-/app/data}"
DB_PATH="${DATA_DIR}/roms.db"

# Make sure data dir exists & is writable for uid 1000
mkdir -p "$DATA_DIR" "$DATA_DIR/images" "$DATA_DIR/logs" "$DATA_DIR/roms" "$DATA_DIR/metadata"
# Attempt to fix perms when volume mounted as root
if [ "$(id -u)" -ne 0 ]; then
    # we're not root; try a writable touch to detect perms
    if ! touch "$DATA_DIR/.permcheck" 2>/dev/null; then
        echo "WARNING: $DATA_DIR not writable by current user. Consider running as root or fixing volume ownership."
    else
        rm -f "$DATA_DIR/.permcheck"
    fi
else
    chown -R 1000:1000 "$DATA_DIR" || true
fi

# Initialize / migrate DB
if [ ! -f "$DB_PATH" ]; then
    echo "No database found. Initializing…"
    python /app/src/migrate.py init || true
fi
# Always try to move to latest
python /app/src/migrate.py upgrade || true

# Non-interactive admin creation (optional)
if [ "${DOSVAULT_ADMIN_USERNAME:-}" ] && [ "${DOSVAULT_ADMIN_EMAIL:-}" ] && [ "${DOSVAULT_ADMIN_PASSWORD:-}" ]; then
python - <<'PY'
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from os import getenv
from libs.config import Config
from libs.database import User_table, UserRole
from libs.auth import AuthManager

cfg = Config()
engine = create_engine(f"sqlite+pysqlite:///{cfg.database_path}")
Session = sessionmaker(bind=engine)
db = Session()
try:
    existing = db.query(User_table).filter(User_table.role == UserRole.SUPER.value).first()
    if not existing:
        AuthManager.create_user(
            db,
            getenv("DOSVAULT_ADMIN_USERNAME"),
            getenv("DOSVAULT_ADMIN_EMAIL"),
            getenv("DOSVAULT_ADMIN_PASSWORD"),
            UserRole.SUPER.value,
        )
        print("Admin user created.")
    else:
        print(f"Admin already exists: {existing.username}")
finally:
    db.close()
PY
fi
# Execute rom scan in the background (note: "exec" here would replace the
# subshell and its "|| true" was dead code, so plain backgrounding is used)
python ./src/__main__.py &
# Run app
exec python -m uvicorn src.webapp:app --host 0.0.0.0 --port 8080
@@ -1,3 +0,0 @@

[pytest]
addopts = --cov=src --cov-report=term-missing --ignore=src/__main__.py
testpaths = tests/
@@ -1,24 +0,0 @@

pudb
ptpython
ipython
pytest
pytest-cov
flake8
ptpython
ipython
isort
pynvim
ruff
black
sqlalchemy
requests
fastapi
uvicorn
jinja2
python-multipart
bcrypt
python-jose
passlib
alembic
aiohttp
@@ -99,15 +99,19 @@ async def inject_metadata(roms: Roms) -> Roms:
            try:
                await asyncio.sleep(0.25)  # keep your throttle
                md = await scrape_metadata(game.title, session)
                results[i] = md
                logging.info(f"Successfully scraped: {game.title} # {i+1}/{len(roms.list)}")
            except Exception as e:
                # Handle all exceptions, not just ValueError
                scrape_errors.append(f"{game.title}: {str(e)}")
                md = Metadata(title=game.title, year=extract_year_from_title(game.title))
                results[i] = md
                logging.info(f"Used fallback metadata for: {game.title} # {i+1}/{len(roms.list)}")

                # Log error details every 5 errors to avoid spam but provide visibility
                if len(scrape_errors) % 5 == 0:
                    logging.warning(f"Scraping error for {game.title}: {str(e)}")
                    logging.info(f"Total scraping errors so far: {len(scrape_errors)}")

    tasks = [asyncio.create_task(_job(i, game)) for i, game in enumerate(roms.list)]
    await asyncio.gather(*tasks)
@@ -133,27 +137,55 @@ async def filter_new_roms(romlist: Roms, session: Session) -> Roms:

async def main():
    url = f"sqlite+pysqlite:///{config.database_path}"
    # Use a connection with a shorter timeout and WAL mode for better concurrency
    engine = create_engine(
        url,
        future=True,
        connect_args={
            "timeout": 10,  # 10 second timeout instead of the default 30
            "check_same_thread": False,
        },
        pool_pre_ping=True,
    )

    # Database tables are now managed by migrations
    # Base.metadata.create_all(engine)

    try:
        with Session(engine) as s:
            # Enable WAL mode for better concurrency
            try:
                # Raw SQL must be wrapped in text() under SQLAlchemy 2.x
                s.execute(text("PRAGMA journal_mode=WAL"))
                s.execute(text("PRAGMA busy_timeout=5000"))  # 5 second busy timeout
                s.commit()
                logging.info("Enabled WAL mode for better database concurrency")
            except Exception as e:
                logging.warning(f"Could not enable WAL mode: {e}")

            romlist = await make_romlist()
            new_romlist = await filter_new_roms(romlist, s)

            if new_romlist.list:
                logging.info(f"Starting metadata scraping for {len(new_romlist.list)} new games")
                new_romlist = await inject_metadata(new_romlist)

                logging.info("Starting database ingestion with smaller batches")
                # Use smaller batches to reduce database lock time
                ingest_roms(new_romlist, s, batch=50)
            else:
                logging.info("No new ROMs to scrape!")

            logging.info("ROM scanning completed")
            if scrape_errors:
                logging.warning(f"Total scraping errors: {len(scrape_errors)}")
                for err in scrape_errors[-10:]:  # Show last 10 errors only
                    logging.warning(f"Failed to scrape: {err}")
            else:
                logging.info("ROM scanning completed with no metadata errors")
    except Exception as e:
        logging.error(f"ROM scanning failed with error: {e}")
        raise
||||
if __name__ == "__main__":
|
||||
# Initialize logging
|
||||
|
||||
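One caveat worth flagging in the hunk above: with a `future=True` engine, SQLAlchemy 1.4+/2.0 rejects plain strings passed to `Session.execute()`, so `s.execute("PRAGMA journal_mode=WAL")` would likely raise, be swallowed by the `except`, and leave WAL silently disabled; wrapping the statement in `sqlalchemy.text()` avoids that. The pragmas themselves can be sketched with the stdlib driver (the file path is a throwaway temp database):

```python
import os
import sqlite3
import tempfile

db_path = os.path.join(tempfile.mkdtemp(), "roms.db")
conn = sqlite3.connect(db_path, timeout=10)  # timeout mirrors connect_args["timeout"]

# WAL lets readers proceed while a writer holds the log;
# the pragma returns the mode actually in effect
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]

# wait up to 5s on a locked database instead of failing immediately
conn.execute("PRAGMA busy_timeout=5000")
```

On a file-backed database the pragma reports `wal` once the switch succeeds (in-memory databases stay in `memory` mode, which is why a temp file is used here).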
@@ -65,6 +65,8 @@ class Config:
     return {
         "rom_path": str(self.rom_path),
         "metadata_path": str(self.metadata_path),
         "database_path": str(self.database_path),
+        "images_path": str(self.images_path),
         "host": self.host,
         "port": self.port,
+        "websocket_port": self.websocket_port,
@@ -73,21 +75,21 @@ class Config:
     }
 
     def save(self):
         # Ensure config directory exists
         if not self.path.parent.exists():
             self.path.parent.mkdir(parents=True, exist_ok=True)
         rom_path = input(f"Enter the path to your ROMs [{self.rom_path}] enter for default: ").strip()
         metadata_path = input(f"Enter the path to your metadata [{self.metadata_path}] enter for default: ").strip()
         self.rom_path = Path(rom_path) if rom_path else self.rom_path
         self.metadata_path = Path(metadata_path) if metadata_path else self.metadata_path
 
         # Create directories if they don't exist
         if not self.rom_path.exists():
             self.rom_path.mkdir(parents=True, exist_ok=True)
         if not self.metadata_path.exists():
             self.metadata_path.mkdir(parents=True, exist_ok=True)
         if not self.images_path.exists():
             self.images_path.mkdir(parents=True, exist_ok=True)
 
         # Write configuration to file
         with open(self.path, 'w') as f:
             json.dump(self.to_dict(), f, indent=4)
             f.close()
 
     def load(self) -> "Config":
         if self.path.exists():
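`save()` above creates any missing directories before writing; note that the `f.close()` inside the `with` block is redundant, since the context manager already closes the file. A compact sketch of the same ensure-parent-then-dump flow (all paths are throwaway temp locations):

```python
import json
import tempfile
from pathlib import Path

cfg_path = Path(tempfile.mkdtemp()) / "dosvault" / "config.json"

# parents=True + exist_ok=True makes this safe to call repeatedly
cfg_path.parent.mkdir(parents=True, exist_ok=True)

with open(cfg_path, "w") as f:
    json.dump({"host": "127.0.0.1", "port": 8080}, f, indent=4)
# the context manager closes the file here; no explicit f.close() needed

loaded = json.loads(cfg_path.read_text())
```

Round-tripping through `read_text`/`json.loads` confirms the file landed where the freshly created directory says it should.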
@@ -95,19 +97,26 @@ class Config:
             data = json.load(f)
             self.rom_path = Path(data.get("rom_path", str(self.rom_path)))
             self.metadata_path = Path(data.get("metadata_path", str(self.metadata_path)))
             self.database_path = Path(data.get("database_path", str(self.database_path)))
             self.images_path = Path(data.get("images_path", str(self.images_path)))
             self.host = data.get("host", self.host)
             self.port = data.get("port", self.port)
             self.websocket_port = data.get("websocket_port", self.websocket_port)
             self.igdb_api_key = data.get("igdb_api_key", self.igdb_api_key)
             self.igdb_client_id = data.get("igdb_client_id", self.igdb_client_id)
 
             # Load environment secrets if API keys are still empty
             if self.igdb_api_key == "" or self.igdb_client_id == "":
                 secrets = self.load_env_secrets()
                 if secrets:
-                    self.igdb_api_key = secrets.get("IGDB_SECRET_KEY", "")
-                    self.igdb_client_id = secrets.get("IGDB_CLIENT_ID", "")
-            f.close()
-        self.save()
-        return self
+                    self.igdb_api_key = secrets.get("IGDB_SECRET_KEY", self.igdb_api_key)
+                    self.igdb_client_id = secrets.get("IGDB_CLIENT_ID", self.igdb_client_id)
+            f.close()
+        else:
+            # Config file doesn't exist, create it with defaults
+            # Load environment secrets for initial setup
+            secrets = self.load_env_secrets()
+            if secrets:
+                self.igdb_api_key = secrets.get("IGDB_SECRET_KEY", self.igdb_api_key)
+                self.igdb_client_id = secrets.get("IGDB_CLIENT_ID", self.igdb_client_id)
+            self.save()
+            self.load()
+        return self
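The `load()` change above swaps `secrets.get(..., "")` for `secrets.get(..., self.igdb_api_key)`, so a missed environment lookup no longer wipes out a value already read from the file. The precedence rule, sketched independently of the `Config` class (`DEMO_IGDB_KEY` is a made-up variable name for illustration):

```python
import os

def resolve_key(current: str, env_var: str) -> str:
    # keep the existing value; fall back to the environment only if it is empty
    if current:
        return current
    return os.environ.get(env_var, current)

os.environ["DEMO_IGDB_KEY"] = "from-env"
```

With that rule, a key present in the config file always wins, and an empty key is filled from the environment when possible.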
@@ -199,43 +199,62 @@ def get_existing_rom_paths(session: Session) -> set[Path]:
     return {game.path.resolve() for game in session.scalars(select(Game_table)).all()}
 
 def ingest_roms(roms: Roms, session: Session, *, batch: int = 200) -> int:
+    import logging
     n = 0
     for g in roms.list:
-        game = session.scalar(select(Game_table).where(Game_table.path == g.path))
-        if game is None:
-            game = Game_table(title=g.title, path=g.path)
-            session.add(game)
-        else:
-            game.title = g.title
-        mdto = g.metadata
-        md = game.metadata_obj
-        if md is None:
-            md = Metadata_table(game=game, title=mdto.title or g.title)
-            session.add(md)
+        try:
+            game = session.scalar(select(Game_table).where(Game_table.path == g.path))
+            if game is None:
+                game = Game_table(title=g.title, path=g.path)
+                session.add(game)
+                logging.info(f"Adding new game: {g.title}")
+            else:
+                game.title = g.title
+                logging.info(f"Updating existing game: {g.title}")
 
-        md.title = mdto.title or g.title
-        md.description = mdto.description
-        md.year = mdto.year if mdto.year is not None else extract_year_from_title(md.title)
-        md.developer = mdto.developer
-        md.publisher = mdto.publisher
-        md.players = mdto.players
-        md.cover_image = mdto.cover_image
-        md.screenshot = mdto.screenshot
-        md.cover_image_path = mdto.cover_image_path
-        md.screenshot_path = mdto.screenshot_path
+            mdto = g.metadata
+            md = game.metadata_obj
+            if md is None:
+                md = Metadata_table(game=game, title=mdto.title or g.title)
+                session.add(md)
 
-        try: genres = sorted({s.strip() for s in (mdto.genre or []) if s and s.strip()})
-        except: genres = []
-        try: tags = sorted({s.strip() for s in (mdto.tags or []) if s and s.strip()})
-        except: tags = []
+            md.title = mdto.title or g.title
+            md.description = mdto.description
+            md.year = mdto.year if mdto.year is not None else extract_year_from_title(md.title)
+            md.developer = mdto.developer
+            md.publisher = mdto.publisher
+            md.players = mdto.players
+            md.cover_image = mdto.cover_image
+            md.screenshot = mdto.screenshot
+            md.cover_image_path = mdto.cover_image_path
+            md.screenshot_path = mdto.screenshot_path
 
-        md.genre = [_get_or_create_by_name(session, Genre_table, name) for name in genres]
-        md.tags = [_get_or_create_by_name(session, Tags_table, name) for name in tags]
+            try: genres = sorted({s.strip() for s in (mdto.genre or []) if s and s.strip()})
+            except: genres = []
+            try: tags = sorted({s.strip() for s in (mdto.tags or []) if s and s.strip()})
+            except: tags = []
 
-        n += 1
-        if n % batch == 0:
-            session.flush()
+            md.genre = [_get_or_create_by_name(session, Genre_table, name) for name in genres]
+            md.tags = [_get_or_create_by_name(session, Tags_table, name) for name in tags]
 
-    session.commit()
+            n += 1
+
+            # Use more frequent flushes and commits to reduce lock time
+            if n % batch == 0:
+                session.commit()  # Commit more frequently to reduce lock duration
+                logging.info(f"Committed batch of {batch} games to database ({n} total)")
+        except Exception as e:
+            logging.error(f"Failed to ingest game {g.title}: {e}")
+            session.rollback()  # Rollback on error to prevent corruption
+            continue
+
+    # Final commit for remaining items
+    try:
+        session.commit()
+        logging.info(f"Successfully ingested {n} games to database")
+    except Exception as e:
+        logging.error(f"Failed final commit during ROM ingestion: {e}")
+        session.rollback()
+
     return n
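The rewritten `ingest_roms` swaps a single end-of-run `session.commit()` (with periodic `flush()`) for a commit every `batch` rows, so each write transaction stays short and the web process is locked out for less time. The batching arithmetic, reduced to the stdlib driver (table name and batch size are illustrative):

```python
import os
import sqlite3
import tempfile

db = os.path.join(tempfile.mkdtemp(), "ingest.db")
conn = sqlite3.connect(db)
conn.execute("CREATE TABLE games (title TEXT PRIMARY KEY)")

batch, n, commits = 50, 0, 0
for i in range(120):
    conn.execute("INSERT OR REPLACE INTO games (title) VALUES (?)", (f"game-{i}",))
    n += 1
    if n % batch == 0:
        conn.commit()  # short transactions release the write lock quickly
        commits += 1
conn.commit()  # final commit for the remainder

count = conn.execute("SELECT COUNT(*) FROM games").fetchone()[0]
```

With 120 rows and a batch of 50, the loop commits twice mid-run and once at the end, and every row is durable regardless of where a failure interrupts the loop.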
@@ -5,6 +5,8 @@ from typing import Optional, Annotated
 from datetime import timedelta, datetime, timezone
 import re
 import asyncio
+import logging
+import subprocess
 from pathlib import Path
 
 from fastapi import FastAPI, Depends, HTTPException, status, Request, Form, Query, BackgroundTasks
@@ -15,18 +17,16 @@ from fastapi.staticfiles import StaticFiles
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.exception_handlers import http_exception_handler
-from fastapi.exceptions import HTTPException as StarletteHTTPException
+from starlette.exceptions import HTTPException as StarletteBaseHTTPException
 from sqlalchemy import create_engine, select, func
 from sqlalchemy.orm import Session, sessionmaker
 from sqlalchemy.exc import OperationalError
 from alembic.config import Config as AlembicConfig
 from alembic import command
-import subprocess
 
 try:
     # Try relative imports first (when run as module)
     from .libs.config import Config
-    from .libs.database import Base, Game_table, Metadata_table, User_table, UserRole, user_favorites, Tags_table, Genre_table
+    from .libs.database import Game_table, Metadata_table, User_table, UserRole, user_favorites, Tags_table, Genre_table
     from .libs.auth import AuthManager, ACCESS_TOKEN_EXPIRE_MINUTES
     from .libs.logging import get_log_manager
 except ImportError:
@@ -36,13 +36,31 @@ except ImportError:
     from libs.auth import AuthManager, ACCESS_TOKEN_EXPIRE_MINUTES
     from libs.logging import get_log_manager
 
+# Initialize logging system first
+get_log_manager()
+
 config = Config()
-engine = create_engine(f"sqlite+pysqlite:///{config.database_path}", echo=False)
+engine = create_engine(
+    f"sqlite+pysqlite:///{config.database_path}",
+    echo=False,
+    connect_args={
+        "timeout": 10,  # 10 second timeout
+        "check_same_thread": False
+    },
+    pool_pre_ping=True
+)
 SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
 
-# Initialize logging system
-import logging
-get_log_manager()
+# Enable WAL mode for better concurrency during startup
+try:
+    with engine.connect() as conn:
+        conn.execute("PRAGMA journal_mode=WAL")
+        conn.execute("PRAGMA busy_timeout=5000")  # 5 second busy timeout
+        conn.commit()
+    logging.info("Enabled WAL mode for web application database")
+except Exception as e:
+    logging.warning(f"Could not enable WAL mode for web application: {e}")
+
+logging.info("DosVault web application starting up")
 
 app = FastAPI(title="DOS Frontend", description="ROM Management System")
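The web process opens the same SQLite file as the scraper subprocess, which is why this hunk turns on WAL at startup: under WAL a reader is served the last committed snapshot instead of a "database is locked" error while the other process writes (on SQLAlchemy 2.0 the raw-string `conn.execute` calls would likewise need `text()` wrapping to take effect). A stdlib sketch of that reader/writer behaviour, using throwaway temp paths:

```python
import os
import sqlite3
import tempfile

db = os.path.join(tempfile.mkdtemp(), "app.db")

writer = sqlite3.connect(db)
writer.execute("PRAGMA journal_mode=WAL")
writer.execute("CREATE TABLE logs (msg TEXT)")
writer.commit()

# leave a write transaction open, as a long ingest batch would
writer.execute("INSERT INTO logs VALUES ('in-flight')")

reader = sqlite3.connect(db, timeout=10)
reader.execute("PRAGMA busy_timeout=5000")
# under WAL the reader sees the last committed snapshot instead of blocking
rows = reader.execute("SELECT COUNT(*) FROM logs").fetchone()[0]
```

The reader counts zero rows: the uncommitted insert is invisible to it, and crucially the query returns rather than raising while the writer's transaction is still open.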
@@ -119,7 +137,7 @@ def ensure_super_user():
         # Create default super user
         logging.info("No super user found, creating default admin user...")
         try:
-            default_admin = AuthManager.create_user(
+            AuthManager.create_user(
                 session=db,
                 username="admin",
                 email="admin@dosvault.local",
@@ -807,7 +825,7 @@ async def create_user(
     if existing_user:
         raise HTTPException(status_code=400, detail="Username or email already exists")
 
-    new_user = AuthManager.create_user(db, username, email, password, role)
+    AuthManager.create_user(db, username, email, password, role)
     return RedirectResponse(url="/admin/users", status_code=303)
@@ -1017,6 +1035,21 @@ async def admin_rom_scan(
 
     return {"status": "started", "message": "ROM scan started"}
 
+@app.post("/api/admin/game-scan")
+async def admin_game_scan(
+    background_tasks: BackgroundTasks,
+    db: Session = Depends(get_db),
+    current_user: User_table = Depends(require_super_user)
+):
+    """Trigger game scan in the background (alias for ROM scan)"""
+    if "game_scan" in running_tasks and not running_tasks["game_scan"].done():
+        return {"status": "already_running", "message": "Game scan is already in progress"}
+
+    task = asyncio.create_task(run_rom_scan())
+    running_tasks["game_scan"] = task
+
+    return {"status": "started", "message": "Game scan started"}
+
 @app.post("/api/admin/metadata-refresh")
 async def admin_metadata_refresh(
     background_tasks: BackgroundTasks,
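The new `/api/admin/game-scan` endpoint reuses the `running_tasks` dictionary as a run-once guard: a second request is rejected while the previous task is pending, and allowed again once it finishes. The guard logic in isolation (names are simplified stand-ins for the endpoint's):

```python
import asyncio

running_tasks = {}

async def long_job():
    await asyncio.sleep(0.01)
    return "done"

def start_once(name, factory):
    # mirror of the endpoint guard: refuse to start while a task is pending
    task = running_tasks.get(name)
    if task is not None and not task.done():
        return "already_running"
    running_tasks[name] = asyncio.create_task(factory())
    return "started"

async def demo():
    first = start_once("game_scan", long_job)
    second = start_once("game_scan", long_job)  # rejected: still running
    await running_tasks["game_scan"]
    third = start_once("game_scan", long_job)   # allowed again once done
    await running_tasks["game_scan"]
    return first, second, third

statuses = asyncio.run(demo())
```

This is a per-process guard only; it works here because FastAPI and the task share one event loop, but it would not coordinate across multiple worker processes.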
@@ -1134,7 +1167,7 @@ async def admin_system_stats(
         "disk_usage": disk_usage,
         "running_tasks": {
             task_name: not running_tasks[task_name].done() if task_name in running_tasks else False
-            for task_name in ["rom_scan", "metadata_refresh", "image_sync"]
+            for task_name in ["rom_scan", "game_scan", "metadata_refresh", "image_sync"]
         }
     }
@@ -1142,31 +1175,43 @@ async def run_rom_scan():
     """Run the ROM scanner subprocess"""
     try:
         logging.info("Starting ROM scan subprocess")
+        # Use the same approach as devenv - run the script directly with proper PYTHONPATH
+        script_path = Path(__file__).parent / "__main__.py"  # Point to src/__main__.py
 
         # Create subprocess with real-time output capture
         process = await asyncio.create_subprocess_exec(
-            "python", "-m", "src",
+            "python", str(script_path),
             stdout=asyncio.subprocess.PIPE,
-            stderr=asyncio.subprocess.PIPE
+            stderr=asyncio.subprocess.STDOUT,  # Merge stderr into stdout for unified logging
+            cwd=Path(__file__).parent.parent  # Set working directory to project root
         )
-        stdout, stderr = await process.communicate()
-
-        # Log the output for visibility
-        if stdout:
-            for line in stdout.decode().strip().split('\n'):
-                if line.strip():
-                    logging.info(f"ROM Scanner: {line.strip()}")
-
-        if stderr:
-            for line in stderr.decode().strip().split('\n'):
-                if line.strip():
-                    logging.error(f"ROM Scanner Error: {line.strip()}")
+        # Capture output in real-time and log it
+        output_lines = []
+        try:
+            while True:
+                line = await process.stdout.readline()
+                if not line:
+                    break
+                decoded_line = line.decode().rstrip()
+                if decoded_line:
+                    output_lines.append(decoded_line)
+                    # Log to main application log immediately for real-time visibility
+                    logging.info(f"ROM Scanner: {decoded_line}")
+        except Exception as e:
+            logging.error(f"Error reading ROM scanner output: {e}")
+
+        # Wait for process to complete
+        await process.wait()
 
         success = process.returncode == 0
         logging.info(f"ROM scan subprocess completed with exit code: {process.returncode}")
 
         return {
             "success": success,
-            "output": stdout.decode(),
-            "error": stderr.decode(),
+            "output": '\n'.join(output_lines),
+            "error": "",
             "returncode": process.returncode
         }
     except Exception as e:
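The rewrite above replaces `communicate()` (which buffers all output until the child exits) with a `readline()` loop over merged stdout/stderr, so scraper progress reaches the log as it happens. A runnable reduction of that loop, using `sys.executable` in place of the bare `python` invocation and a trivial child script as the stand-in for the scanner:

```python
import asyncio
import sys

async def stream_subprocess(*cmd):
    # stderr merged into stdout so one readline() loop sees everything in order
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    lines = []
    while True:
        line = await proc.stdout.readline()
        if not line:  # EOF: the child closed its end of the pipe
            break
        decoded = line.decode().rstrip()
        if decoded:
            lines.append(decoded)  # a real caller would log each line here
    await proc.wait()
    return proc.returncode, lines

child = "print('scan 1'); print('scan 2')"
rc, lines = asyncio.run(stream_subprocess(sys.executable, "-c", child))
```

Reading to EOF before `wait()` also avoids the deadlock that plain `wait()` can hit when a child fills its pipe buffer.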
@@ -1305,22 +1305,22 @@ async function loadConfiguration() {
 
 function populateConfigForm(config) {
     // Populate form fields
-    document.getElementById('config-host').value = config.host || '';
-    document.getElementById('config-port').value = config.port || '';
-    document.getElementById('config-rom-path').value = config.game_path || '';
-    document.getElementById('config-images-path').value = config.images_path || '';
-    document.getElementById('config-igdb-client-id').value = config.igdb_client_id || '';
-    document.getElementById('config-igdb-secret').value = config.igdb_api_key || '';
+    document.getElementById('config-host').value = config.config?.host || config.host || '';
+    document.getElementById('config-port').value = config.config?.port || config.port || '';
+    document.getElementById('config-rom-path').value = config.config?.rom_path || config.rom_path || '';
+    document.getElementById('config-images-path').value = config.config?.images_path || config.images_path || '';
+    document.getElementById('config-igdb-client-id').value = config.config?.igdb_client_id || config.igdb_client_id || '';
+    document.getElementById('config-igdb-secret').value = config.config?.igdb_api_key || config.igdb_api_key || '';
 
     // Populate JSON editor
-    document.getElementById('config-json').value = JSON.stringify(config, null, 2);
+    document.getElementById('config-json').value = JSON.stringify(config.config || config, null, 2);
 }
 
 function collectConfigFromForm() {
     const config = {
         host: document.getElementById('config-host').value,
         port: parseInt(document.getElementById('config-port').value) || 8080,
-        game_path: document.getElementById('config-rom-path').value,
+        rom_path: document.getElementById('config-rom-path').value,
         images_path: document.getElementById('config-images-path').value,
         igdb_client_id: document.getElementById('config-igdb-client-id').value,
         igdb_api_key: document.getElementById('config-igdb-secret').value
@@ -1,74 +0,0 @@
-#!/usr/bin/env python
-"""
-Test script to download images for existing games that don't have local images yet.
-"""
-import asyncio
-import aiohttp
-from sqlalchemy import create_engine, select
-from sqlalchemy.orm import Session
-
-from src.libs.config import Config
-from src.libs.database import Game_table, Metadata_table
-from src.libs.functions import download_image, get_image_filename
-
-async def test_image_downloads():
-    config = Config()
-    url = f"sqlite+pysqlite:///{config.database_path}"
-    engine = create_engine(url, future=True)
-
-    with Session(engine) as session:
-        # Get first 3 games that have remote images but no local images
-        stmt = (
-            select(Game_table)
-            .join(Metadata_table)
-            .where(
-                (Metadata_table.cover_image.is_not(None)) &
-                (Metadata_table.cover_image_path.is_(None))
-            )
-            .limit(3)
-        )
-        games = session.scalars(stmt).all()
-
-        print(f"Found {len(games)} games to test image downloads for")
-
-        async with aiohttp.ClientSession() as http_session:
-            for game in games:
-                metadata = game.metadata_obj
-                print(f"\nTesting: {game.title}")
-
-                # Download cover image
-                if metadata.cover_image:
-                    cover_filename = get_image_filename(metadata.cover_image, game.title, 'cover')
-                    cover_path = config.images_path / cover_filename
-
-                    print(f"  Downloading cover: {metadata.cover_image}")
-                    success = await download_image(metadata.cover_image, cover_path, http_session)
-
-                    if success:
-                        print(f"  ✓ Cover saved to: {cover_path}")
-                        # Update database with local path
-                        metadata.cover_image_path = cover_path
-                    else:
-                        print("  ✗ Failed to download cover")
-
-                # Download screenshot
-                if metadata.screenshot:
-                    screenshot_filename = get_image_filename(metadata.screenshot, game.title, 'screenshot')
-                    screenshot_path = config.images_path / screenshot_filename
-
-                    print(f"  Downloading screenshot: {metadata.screenshot}")
-                    success = await download_image(metadata.screenshot, screenshot_path, http_session)
-
-                    if success:
-                        print(f"  ✓ Screenshot saved to: {screenshot_path}")
-                        # Update database with local path
-                        metadata.screenshot_path = screenshot_path
-                    else:
-                        print("  ✗ Failed to download screenshot")
-
-        # Commit the updates
-        session.commit()
-        print("\n✓ Database updated with local image paths")
-
-if __name__ == "__main__":
-    asyncio.run(test_image_downloads())