init
.gitignore (vendored, new file, 47 lines)
@@ -0,0 +1,47 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
env/
venv/
ENV/
.venv/
*.egg-info/
dist/
build/

# Node
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*
dist/
dist-ssr/
*.local

# Environment
.env
.env.local
.env.*.local

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# Nix
result/
result-*

# Database
*.db
*.sqlite

# Logs
*.log
README.md (new file, 215 lines)
@@ -0,0 +1,215 @@
# Movie Map

A web application that visualizes the origin countries of your media collection from Radarr, Sonarr, and Lidarr, and tracks which foreign movies/shows you've watched.

## Features

### View 1: Collection Map

- Visualizes all media in your *arr instances on a world map
- Shows country of origin for each movie/show/artist
- Color intensity indicates number of items per country
- Filter by media type (movies, shows, music)
- Pin markers show counts per country

### View 2: Watched Map

- Interactive map to track watched foreign movies and TV shows
- Manually add watched items with country information
- Add custom pins to mark countries
- Visualize your personal "watched foreign media" journey

## Architecture

- **Backend**: FastAPI (Python) with PostgreSQL
- **Frontend**: React + TypeScript + Vite + Leaflet
- **Database**: PostgreSQL (via Unix socket)
- **Deployment**: Nix flake with NixOS module

## Development Setup

### Prerequisites

- Nix with flakes enabled
- PostgreSQL running (accessible via socket)
- Access to Radarr, Sonarr, and Lidarr instances

### Getting Started

1. Enter the development shell:
   ```bash
   nix develop
   ```

2. Set up environment variables (create `.env` in `backend/`):
   ```bash
   POSTGRES_SOCKET_PATH=/run/postgresql
   POSTGRES_DB=jawz
   POSTGRES_USER=jawz
   SONARR_API_KEY=your_sonarr_api_key
   RADARR_API_KEY=your_radarr_api_key
   LIDARR_API_KEY=your_lidarr_api_key
   PORT=8080
   ```

3. Run database migrations:
   ```bash
   cd backend
   alembic upgrade head
   ```

4. Start the backend (in one terminal):
   ```bash
   cd backend
   python -m uvicorn main:app --reload --host 127.0.0.1 --port 8080
   ```

5. Start the frontend dev server (in another terminal):
   ```bash
   cd frontend
   npm install
   npm run dev
   ```

6. Open http://localhost:5173 in your browser
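
The backend surfaces these variables through `pydantic-settings` (see `backend/app/core/config.py`). A minimal, dependency-free sketch of how the values above map to settings; the `database_url` shape is an assumption about how the socket path is passed to the driver, not lifted from the app code:

```python
import os

# Sketch of the settings in backend/app/core/config.py
# (the real app uses pydantic-settings; this is plain stdlib).
class Settings:
    def __init__(self) -> None:
        self.port = int(os.getenv("PORT", "8080"))
        self.postgres_socket_path = os.getenv("POSTGRES_SOCKET_PATH", "/run/postgresql")
        self.postgres_db = os.getenv("POSTGRES_DB", "jawz")
        self.postgres_user = os.getenv("POSTGRES_USER", "jawz")
        self.sonarr_api_key = os.getenv("SONARR_API_KEY", "")

    @property
    def database_url(self) -> str:
        # Socket-based connection string; the exact URL shape here is
        # an assumption for illustration.
        return (
            f"postgresql://{self.postgres_user}@/{self.postgres_db}"
            f"?host={self.postgres_socket_path}"
        )
```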

## Building

Build the application using Nix:

```bash
nix build
```

This creates a combined package with both the backend and the frontend.

## NixOS Deployment

### 1. Add the flake to your NixOS configuration

In your `configuration.nix` or a separate module:

```nix
{
  imports = [
    /path/to/movie-map/nixosModules.default
  ];

  services.moviemap = {
    enable = true;
    port = 8080;
    postgresSocketPath = "/run/postgresql";
    # Secrets can be strings or file paths (for sops-nix integration)
    sonarrApiKey = "/run/secrets/sonarr-api-key"; # or "your_key_here"
    radarrApiKey = "/run/secrets/radarr-api-key"; # or "your_key_here"
    lidarrApiKey = "/run/secrets/lidarr-api-key"; # or "your_key_here"
    # Optional: admin token for the sync endpoint
    adminToken = "/run/secrets/moviemap-admin-token"; # or "your_token_here"
  };
}
```

Or reference the flake directly:

```nix
{
  imports = [
    (builtins.getFlake "/path/to/movie-map").nixosModules.default
  ];

  services.moviemap = {
    enable = true;
    # ... configuration
  };
}
```

### 2. Run database migrations

Run migrations before starting the service. You can do this in either of two ways.

**Option A: Run migrations manually before enabling the service**

```bash
# SSH into your server
ssh server

# Enter the flake shell
cd /path/to/movie-map
nix develop

# Run migrations
cd backend
alembic upgrade head
```

**Option B: Add a systemd service that runs migrations on first start**

Add a one-shot systemd unit that runs migrations before the main service starts. In your NixOS configuration:

```nix
systemd.services.moviemap-migrate = {
  description = "Movie Map Database Migrations";
  serviceConfig = {
    Type = "oneshot";
    User = "moviemap";
    WorkingDirectory = "${appPackage}/backend";
    ExecStart = "${pythonEnv}/bin/alembic upgrade head";
  };
  before = [ "moviemap-backend.service" ];
  requiredBy = [ "moviemap-backend.service" ];
};
```

Or simply run migrations once manually, then enable the service.

### 3. Rebuild and enable

```bash
sudo nixos-rebuild switch
```

The service will be available at `http://127.0.0.1:8080` (configure your reverse proxy to expose it).

## API Endpoints

### Collection

- `GET /api/collection/summary?types=movie,show,music` - Get collection summary by country

### Watched Items

- `GET /api/watched` - List all watched items
- `GET /api/watched/summary` - Get watched summary by country
- `POST /api/watched` - Create a watched item
- `PATCH /api/watched/{id}` - Update a watched item
- `DELETE /api/watched/{id}` - Delete a watched item

### Pins

- `GET /api/pins` - List all manual pins
- `POST /api/pins` - Create a pin
- `DELETE /api/pins/{id}` - Delete a pin

### Admin

- `POST /admin/sync` - Trigger a sync from all *arr instances (requires the admin token if configured)
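
These endpoints can be exercised with any HTTP client. A minimal sketch of building the requests in Python; the base URL is an assumption matching the default deployment above, and nothing is actually sent here:

```python
from urllib.parse import urlencode

BASE = "http://127.0.0.1:8080"  # assumed local deployment

def collection_summary_url(types=None):
    """URL for the collection summary, optionally filtered by media type."""
    qs = "?" + urlencode({"types": ",".join(types)}) if types else ""
    return f"{BASE}/api/collection/summary{qs}"

def sync_request(admin_token=None):
    """(url, headers) for POST /admin/sync; the Bearer header is only
    required when an admin token is configured."""
    headers = {"Authorization": f"Bearer {admin_token}"} if admin_token else {}
    return f"{BASE}/admin/sync", headers
```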

## Database Schema

The application creates a `moviemap` schema in the `jawz` database with the following tables:

- `source` - *arr instance configuration
- `media_item` - Normalized media items from *arr
- `media_country` - Country associations for media items
- `watched_item` - User-tracked watched items
- `manual_pin` - Custom pins on the map
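
The summary endpoints group rows from these tables by country and media type, then fold them into a nested mapping. A minimal sketch of that shaping step, mirroring the loop in the API handlers (the sample rows below are synthetic):

```python
def summarize(rows):
    """Fold (country_code, media_type, count) rows into the
    {country: {media_type: count}} shape the summary endpoints return."""
    result = {}
    for country_code, media_type, count in rows:
        result.setdefault(country_code, {})[media_type] = count
    return result
```

For example, `summarize([("JP", "movie", 12), ("JP", "show", 3)])` yields `{"JP": {"movie": 12, "show": 3}}`.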

## Country Extraction

The sync process extracts country information from *arr metadata:

- **Radarr**: Uses `productionCountries` from movie metadata
- **Sonarr**: Uses `originCountry` from series metadata (if available)
- **Lidarr**: Uses the `country` field from artist metadata

If country information is not available, the item is stored without a country association and excluded from map visualization.
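
A minimal sketch of the normalization this implies. The field names come from the list above; the exact payload shapes (list of strings vs. list of objects) are assumptions, not verified *arr formats:

```python
def extract_countries(source_kind, metadata):
    """Return a list of country identifiers from *arr metadata,
    or [] when none is present (such items are skipped on the map)."""
    if source_kind == "radarr":
        raw = metadata.get("productionCountries") or []
        # Entries may be strings or objects with a "name" key
        # (an assumption about Radarr's payload shape).
        return [c["name"] if isinstance(c, dict) else c for c in raw]
    if source_kind == "sonarr":
        country = metadata.get("originCountry")
        return [country] if country else []
    if source_kind == "lidarr":
        country = metadata.get("country")
        return [country] if country else []
    return []
```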

## License

MIT
backend/alembic.ini (new file, 115 lines)
@@ -0,0 +1,115 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
backend/alembic/env.py (new file, 87 lines)
@@ -0,0 +1,87 @@
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

import os
import sys

# Add parent directory to path
sys.path.insert(0, os.path.dirname(os.path.dirname(__file__)))

from app.core.config import settings

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Set the SQLAlchemy URL from our settings
config.set_main_option("sqlalchemy.url", settings.database_url.replace("postgresql://", "postgresql+psycopg://"))

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
backend/alembic/script.py.mako (new file, 25 lines)
@@ -0,0 +1,25 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
backend/alembic/versions/001_initial_schema.py (new file, 133 lines)
@@ -0,0 +1,133 @@
"""Initial schema

Revision ID: 001
Revises:
Create Date: 2024-01-01 00:00:00.000000

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '001'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Create schema
    op.execute("CREATE SCHEMA IF NOT EXISTS moviemap")

    # Create enums
    op.execute("""
        DO $$ BEGIN
            CREATE TYPE moviemap.source_kind AS ENUM ('radarr', 'sonarr', 'lidarr');
        EXCEPTION
            WHEN duplicate_object THEN null;
        END $$;
    """)

    op.execute("""
        DO $$ BEGIN
            CREATE TYPE moviemap.media_type AS ENUM ('movie', 'show', 'music');
        EXCEPTION
            WHEN duplicate_object THEN null;
        END $$;
    """)

    op.execute("""
        DO $$ BEGIN
            CREATE TYPE moviemap.watched_media_type AS ENUM ('movie', 'show');
        EXCEPTION
            WHEN duplicate_object THEN null;
        END $$;
    """)

    # Create source table
    op.execute("""
        CREATE TABLE IF NOT EXISTS moviemap.source (
            id SERIAL PRIMARY KEY,
            kind moviemap.source_kind NOT NULL UNIQUE,
            base_url TEXT NOT NULL,
            enabled BOOLEAN NOT NULL DEFAULT true,
            last_sync_at TIMESTAMPTZ
        )
    """)

    # Create media_item table
    op.execute("""
        CREATE TABLE IF NOT EXISTS moviemap.media_item (
            id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
            source_kind moviemap.source_kind NOT NULL,
            source_item_id INTEGER NOT NULL,
            title TEXT NOT NULL,
            year INTEGER,
            media_type moviemap.media_type NOT NULL,
            arr_raw JSONB,
            UNIQUE (source_kind, source_item_id)
        )
    """)

    # Create media_country table
    op.execute("""
        CREATE TABLE IF NOT EXISTS moviemap.media_country (
            media_item_id UUID NOT NULL REFERENCES moviemap.media_item(id) ON DELETE CASCADE,
            country_code CHAR(2) NOT NULL,
            weight SMALLINT NOT NULL DEFAULT 1,
            PRIMARY KEY (media_item_id, country_code)
        )
    """)

    # Create watched_item table
    op.execute("""
        CREATE TABLE IF NOT EXISTS moviemap.watched_item (
            id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
            media_type moviemap.watched_media_type NOT NULL,
            title TEXT NOT NULL,
            year INTEGER,
            country_code CHAR(2) NOT NULL,
            watched_at TIMESTAMPTZ,
            notes TEXT,
            created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
            updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
        )
    """)

    # Create manual_pin table
    op.execute("""
        CREATE TABLE IF NOT EXISTS moviemap.manual_pin (
            id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
            country_code CHAR(2) NOT NULL,
            label TEXT,
            pinned_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
        )
    """)

    # Create indexes
    op.execute("CREATE INDEX IF NOT EXISTS idx_media_item_source ON moviemap.media_item (source_kind, source_item_id)")
    op.execute("CREATE INDEX IF NOT EXISTS idx_media_country_code ON moviemap.media_country (country_code)")
    op.execute("CREATE INDEX IF NOT EXISTS idx_watched_country ON moviemap.watched_item (country_code)")
    op.execute("CREATE INDEX IF NOT EXISTS idx_watched_media_type ON moviemap.watched_item (media_type)")
    op.execute("CREATE INDEX IF NOT EXISTS idx_pin_country ON moviemap.manual_pin (country_code)")


def downgrade() -> None:
    op.execute("DROP INDEX IF EXISTS moviemap.idx_pin_country")
    op.execute("DROP INDEX IF EXISTS moviemap.idx_watched_media_type")
    op.execute("DROP INDEX IF EXISTS moviemap.idx_watched_country")
    op.execute("DROP INDEX IF EXISTS moviemap.idx_media_country_code")
    op.execute("DROP INDEX IF EXISTS moviemap.idx_media_item_source")

    op.execute("DROP TABLE IF EXISTS moviemap.manual_pin")
    op.execute("DROP TABLE IF EXISTS moviemap.watched_item")
    op.execute("DROP TABLE IF EXISTS moviemap.media_country")
    op.execute("DROP TABLE IF EXISTS moviemap.media_item")
    op.execute("DROP TABLE IF EXISTS moviemap.source")

    op.execute("DROP TYPE IF EXISTS moviemap.watched_media_type")
    op.execute("DROP TYPE IF EXISTS moviemap.media_type")
    op.execute("DROP TYPE IF EXISTS moviemap.source_kind")

    op.execute("DROP SCHEMA IF EXISTS moviemap")
backend/app/__init__.py (new file, 0 lines)
backend/app/api/__init__.py (new file, 0 lines)
backend/app/api/admin.py (new file, 35 lines)
@@ -0,0 +1,35 @@
"""Admin API endpoints"""
from fastapi import APIRouter, HTTPException, Header
from typing import Optional
from app.core.database import pool
from app.core.config import settings
from app.services.sync import sync_all_arrs

router = APIRouter()


async def verify_admin_token(authorization: Optional[str] = Header(None)):
    """Verify the admin token if one is configured"""
    if settings.admin_token:
        if not authorization or authorization != f"Bearer {settings.admin_token}":
            raise HTTPException(status_code=401, detail="Unauthorized")
    # If no admin token is configured, allow (assuming localhost-only access)


@router.post("/sync")
async def trigger_sync(authorization: Optional[str] = Header(None)):
    """
    Trigger sync from all *arr instances.
    Requires admin token if MOVIEMAP_ADMIN_TOKEN is set.
    """
    await verify_admin_token(authorization)

    try:
        result = await sync_all_arrs()
        return {
            "status": "success",
            "synced": result
        }
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Sync failed: {str(e)}")
backend/app/api/collection.py (new file, 61 lines)
@@ -0,0 +1,61 @@
"""Collection API endpoints"""
from fastapi import APIRouter, Query
from typing import List, Optional
from app.core.database import pool
import json

router = APIRouter()


@router.get("/summary")
async def get_collection_summary(
    types: Optional[str] = Query(None, description="Comma-separated list: movie,show,music")
):
    """
    Get collection summary by country and media type.
    Returns counts per country per media type.
    """
    # Pool should be initialized on startup, but check just in case
    if not pool:
        from app.core.database import init_db
        await init_db()

    # Parse types filter
    type_filter = []
    if types:
        type_filter = [t.strip() for t in types.split(",") if t.strip() in ["movie", "show", "music"]]

    async with pool.connection() as conn:
        async with conn.cursor() as cur:
            # Build query
            query = """
                SELECT
                    mc.country_code,
                    mi.media_type,
                    COUNT(*) as count
                FROM moviemap.media_country mc
                JOIN moviemap.media_item mi ON mc.media_item_id = mi.id
            """
            params = []
            if type_filter:
                query += " WHERE mi.media_type = ANY(%s)"
                params.append(type_filter)

            query += """
                GROUP BY mc.country_code, mi.media_type
                ORDER BY mc.country_code, mi.media_type
            """

            await cur.execute(query, params if params else None)
            rows = await cur.fetchall()

            # Transform to nested dict structure
            result = {}
            for row in rows:
                country_code, media_type, count = row
                if country_code not in result:
                    result[country_code] = {}
                result[country_code][media_type] = count

            return result
backend/app/api/pins.py (new file, 87 lines)
@@ -0,0 +1,87 @@
"""Manual pins API endpoints"""
from fastapi import APIRouter, HTTPException
from pydantic import BaseModel
from typing import Optional
from uuid import UUID
from app.core.database import pool

router = APIRouter()


class PinCreate(BaseModel):
    country_code: str
    label: Optional[str] = None


@router.get("")
async def list_pins():
    """List all manual pins"""
    # Pool should be initialized on startup
    if not pool:
        from app.core.database import init_db
        await init_db()

    async with pool.connection() as conn:
        async with conn.cursor() as cur:
            query = """
                SELECT id, country_code, label, pinned_at
                FROM moviemap.manual_pin
                ORDER BY pinned_at DESC
            """
            await cur.execute(query)
            rows = await cur.fetchall()

            pins = []
            for row in rows:
                pins.append({
                    "id": str(row[0]),
                    "country_code": row[1],
                    "label": row[2],
                    "pinned_at": row[3].isoformat() if row[3] else None,
                })

            return pins


@router.post("")
async def create_pin(pin: PinCreate):
    """Create a new manual pin"""
    # Pool should be initialized on startup
    if not pool:
        from app.core.database import init_db
        await init_db()

    async with pool.connection() as conn:
        async with conn.cursor() as cur:
            query = """
                INSERT INTO moviemap.manual_pin (country_code, label)
                VALUES (%s, %s)
                RETURNING id
            """
            await cur.execute(query, (pin.country_code, pin.label))
            result = await cur.fetchone()
            await conn.commit()

            return {"id": str(result[0]), "status": "created"}


@router.delete("/{pin_id}")
async def delete_pin(pin_id: UUID):
    """Delete a manual pin"""
    # Pool should be initialized on startup
    if not pool:
        from app.core.database import init_db
        await init_db()

    async with pool.connection() as conn:
        async with conn.cursor() as cur:
            query = "DELETE FROM moviemap.manual_pin WHERE id = %s RETURNING id"
            await cur.execute(query, (str(pin_id),))
            result = await cur.fetchone()
            await conn.commit()

            if not result:
                raise HTTPException(status_code=404, detail="Pin not found")

            return {"id": str(result[0]), "status": "deleted"}
backend/app/api/watched.py (new file, 208 lines)
@@ -0,0 +1,208 @@
|
||||
"""Watched items API endpoints"""
|
||||
from fastapi import APIRouter, HTTPException
|
||||
from pydantic import BaseModel
|
||||
from typing import Optional
|
||||
from datetime import datetime
|
||||
from uuid import UUID
|
||||
from app.core.database import pool
|
||||
import json
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
|
||||
class WatchedItemCreate(BaseModel):
|
||||
media_type: str # "movie" or "show"
|
||||
title: str
|
||||
year: Optional[int] = None
|
||||
country_code: str
|
||||
watched_at: Optional[datetime] = None
|
||||
notes: Optional[str] = None
|
||||
|
||||
|
||||
class WatchedItemUpdate(BaseModel):
|
||||
title: Optional[str] = None
|
||||
year: Optional[int] = None
|
||||
country_code: Optional[str] = None
|
||||
watched_at: Optional[datetime] = None
|
||||
notes: Optional[str] = None
|
||||
|
||||
|
||||
@router.get("/summary")
|
||||
async def get_watched_summary():
|
||||
"""Get watched items summary by country"""
|
||||
# Pool should be initialized on startup
|
||||
if not pool:
|
||||
from app.core.database import init_db
|
||||
await init_db()
|
||||
|
||||
async with pool.connection() as conn:
|
||||
async with conn.cursor() as cur:
|
||||
query = """
|
||||
SELECT
|
||||
country_code,
|
||||
media_type,
|
||||
COUNT(*) as count
|
||||
FROM moviemap.watched_item
|
||||
WHERE watched_at IS NOT NULL
|
||||
GROUP BY country_code, media_type
|
||||
ORDER BY country_code, media_type
|
||||
"""
|
||||
await cur.execute(query)
|
||||
rows = await cur.fetchall()
|
||||
|
||||
result = {}
|
||||
for row in rows:
|
||||
country_code, media_type, count = row
|
||||
if country_code not in result:
|
||||
result[country_code] = {}
|
||||
result[country_code][media_type] = count
|
||||
|
||||
return result
|
||||
|
||||
|
||||
@router.get("")
|
||||
async def list_watched_items():
|
||||
"""List all watched items"""
|
||||
# Pool should be initialized on startup
|
||||
if not pool:
|
||||
from app.core.database import init_db
|
||||
await init_db()
|
||||
|
||||
async with pool.connection() as conn:
|
||||
async with conn.cursor() as cur:
|
||||
query = """
|
||||
SELECT
|
||||
id, media_type, title, year, country_code,
|
||||
watched_at, notes, created_at, updated_at
|
||||
FROM moviemap.watched_item
|
||||
ORDER BY created_at DESC
|
||||
"""
|
||||
await cur.execute(query)
|
||||
rows = await cur.fetchall()
|
||||
|
||||
items = []
|
||||
for row in rows:
|
||||
items.append({
|
||||
"id": str(row[0]),
|
||||
"media_type": row[1],
|
||||
"title": row[2],
|
||||
"year": row[3],
|
||||
"country_code": row[4],
|
||||
"watched_at": row[5].isoformat() if row[5] else None,
|
||||
"notes": row[6],
|
||||
"created_at": row[7].isoformat() if row[7] else None,
|
||||
"updated_at": row[8].isoformat() if row[8] else None,
|
||||
})
|
||||
|
||||
return items
|
||||
|
||||
|
||||
@router.post("")
|
||||
async def create_watched_item(item: WatchedItemCreate):
|
||||
"""Create a new watched item"""
|
||||
# Pool should be initialized on startup
|
||||
if not pool:
|
||||
from app.core.database import init_db
|
||||
await init_db()
|
||||
|
||||
if item.media_type not in ["movie", "show"]:
|
||||
raise HTTPException(status_code=400, detail="media_type must be 'movie' or 'show'")
|
||||
|
||||
async with pool.connection() as conn:
|
||||
async with conn.cursor() as cur:
|
||||
            query = """
                INSERT INTO moviemap.watched_item
                (media_type, title, year, country_code, watched_at, notes)
                VALUES (%s, %s, %s, %s, %s, %s)
                RETURNING id
            """
            await cur.execute(
                query,
                (
                    item.media_type,
                    item.title,
                    item.year,
                    item.country_code,
                    item.watched_at,
                    item.notes,
                )
            )
            result = await cur.fetchone()
            await conn.commit()

            return {"id": str(result[0]), "status": "created"}


@router.patch("/{item_id}")
async def update_watched_item(item_id: UUID, item: WatchedItemUpdate):
    """Update a watched item"""
    # Pool should be initialized on startup
    if not pool:
        from app.core.database import init_db
        await init_db()

    async with pool.connection() as conn:
        async with conn.cursor() as cur:
            # Build dynamic update query
            updates = []
            params = []

            if item.title is not None:
                updates.append("title = %s")
                params.append(item.title)
            if item.year is not None:
                updates.append("year = %s")
                params.append(item.year)
            if item.country_code is not None:
                updates.append("country_code = %s")
                params.append(item.country_code)
            if item.watched_at is not None:
                updates.append("watched_at = %s")
                params.append(item.watched_at)
            if item.notes is not None:
                updates.append("notes = %s")
                params.append(item.notes)

            if not updates:
                raise HTTPException(status_code=400, detail="No fields to update")

            updates.append("updated_at = NOW()")
            params.append(str(item_id))

            query = f"""
                UPDATE moviemap.watched_item
                SET {', '.join(updates)}
                WHERE id = %s
                RETURNING id
            """

            await cur.execute(query, params)
            result = await cur.fetchone()
            await conn.commit()

            if not result:
                raise HTTPException(status_code=404, detail="Watched item not found")

            return {"id": str(result[0]), "status": "updated"}


@router.delete("/{item_id}")
async def delete_watched_item(item_id: UUID):
    """Delete a watched item"""
    # Pool should be initialized on startup
    if not pool:
        from app.core.database import init_db
        await init_db()

    async with pool.connection() as conn:
        async with conn.cursor() as cur:
            query = "DELETE FROM moviemap.watched_item WHERE id = %s RETURNING id"
            await cur.execute(query, (str(item_id),))
            result = await cur.fetchone()
            await conn.commit()

            if not result:
                raise HTTPException(status_code=404, detail="Watched item not found")

            return {"id": str(result[0]), "status": "deleted"}
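The PATCH handler above assembles its `SET` clause from a fixed list of optional fields, so column names never come from user input; only the values are passed as query parameters. A standalone sketch of that pattern (a hypothetical `build_update` helper over a plain dict, not the Pydantic model used in the route):

```python
def build_update(fields: dict) -> tuple[str, list]:
    """Build a parameterized SET clause from the non-None fields.

    Column names come from a fixed allowlist; only values travel as
    query parameters, so the f-string interpolation stays safe.
    """
    allowed = ["title", "year", "country_code", "watched_at", "notes"]
    updates, params = [], []
    for col in allowed:
        if fields.get(col) is not None:
            updates.append(f"{col} = %s")
            params.append(fields[col])
    if not updates:
        raise ValueError("No fields to update")
    return ", ".join(updates), params

clause, params = build_update({"title": "Oldboy", "year": 2003})
print(clause)  # title = %s, year = %s
```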
0
backend/app/core/__init__.py
Normal file
43
backend/app/core/config.py
Normal file
@@ -0,0 +1,43 @@
"""Application configuration"""
from pydantic_settings import BaseSettings
from typing import Optional
import os


class Settings(BaseSettings):
    """Application settings"""

    # Server
    port: int = int(os.getenv("PORT", "8080"))
    host: str = "127.0.0.1"

    # Database
    postgres_socket_path: str = os.getenv("POSTGRES_SOCKET_PATH", "/run/postgresql")
    postgres_db: str = os.getenv("POSTGRES_DB", "jawz")
    postgres_user: str = os.getenv("POSTGRES_USER", os.getenv("USER", "jawz"))

    # *arr API keys
    sonarr_api_key: str = os.getenv("SONARR_API_KEY", "")
    radarr_api_key: str = os.getenv("RADARR_API_KEY", "")
    lidarr_api_key: str = os.getenv("LIDARR_API_KEY", "")

    # *arr base URLs
    sonarr_url: str = "http://127.0.0.1:8989"
    radarr_url: str = "http://127.0.0.1:7878"
    lidarr_url: str = "http://127.0.0.1:8686"

    # Admin
    admin_token: Optional[str] = os.getenv("MOVIEMAP_ADMIN_TOKEN")

    @property
    def database_url(self) -> str:
        """Build PostgreSQL connection string using Unix socket"""
        return f"postgresql://{self.postgres_user}@/{self.postgres_db}?host={self.postgres_socket_path}"

    class Config:
        env_file = ".env"
        case_sensitive = False


settings = Settings()
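`database_url` builds a libpq-style URI with an empty host component; the `?host=` query parameter points libpq at a Unix-socket directory instead of TCP. A dependency-free sketch of the same construction (values are the module's own defaults):

```python
def socket_conninfo(user: str, db: str, socket_dir: str) -> str:
    # An empty host plus "?host=<dir>" makes libpq connect via the
    # Unix-domain socket found in that directory.
    return f"postgresql://{user}@/{db}?host={socket_dir}"

print(socket_conninfo("jawz", "jawz", "/run/postgresql"))
# postgresql://jawz@/jawz?host=/run/postgresql
```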
50
backend/app/core/database.py
Normal file
@@ -0,0 +1,50 @@
"""Database connection and session management"""
from psycopg import AsyncConnection
from psycopg_pool import AsyncConnectionPool
from app.core.config import settings
from typing import Optional
import logging

logger = logging.getLogger(__name__)

# Connection pool
pool: Optional[AsyncConnectionPool] = None


async def init_db():
    """Initialize database connection pool"""
    global pool
    try:
        pool = AsyncConnectionPool(
            conninfo=settings.database_url,
            min_size=1,
            max_size=10,
            open=False,
        )
        await pool.open()
        logger.info("Database connection pool initialized")
    except Exception as e:
        logger.error(f"Failed to initialize database pool: {e}")
        raise


async def close_db():
    """Close database connection pool"""
    global pool
    if pool:
        await pool.close()
        logger.info("Database connection pool closed")


async def get_db() -> AsyncConnection:
    """Get database connection from pool"""
    if not pool:
        await init_db()
    return await pool.getconn()


async def return_conn(conn: AsyncConnection):
    """Return connection to pool"""
    if pool:
        await pool.putconn(conn)
0
backend/app/services/__init__.py
Normal file
296
backend/app/services/sync.py
Normal file
@@ -0,0 +1,296 @@
"""Sync service for *arr instances"""
import httpx
import logging
from typing import Dict, List, Optional
from app.core.config import settings
from app.core.database import pool
import json

logger = logging.getLogger(__name__)


async def fetch_radarr_movies() -> List[Dict]:
    """Fetch all movies from Radarr"""
    if not settings.radarr_api_key:
        logger.warning("Radarr API key not configured")
        return []

    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(
                f"{settings.radarr_url}/api/v3/movie",
                headers={"X-Api-Key": settings.radarr_api_key},
                timeout=30.0
            )
            response.raise_for_status()
            return response.json()
        except Exception as e:
            logger.error(f"Failed to fetch Radarr movies: {e}")
            return []


async def fetch_sonarr_series() -> List[Dict]:
    """Fetch all series from Sonarr"""
    if not settings.sonarr_api_key:
        logger.warning("Sonarr API key not configured")
        return []

    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(
                f"{settings.sonarr_url}/api/v3/series",
                headers={"X-Api-Key": settings.sonarr_api_key},
                timeout=30.0
            )
            response.raise_for_status()
            return response.json()
        except Exception as e:
            logger.error(f"Failed to fetch Sonarr series: {e}")
            return []


async def fetch_lidarr_artists() -> List[Dict]:
    """Fetch all artists from Lidarr"""
    if not settings.lidarr_api_key:
        logger.warning("Lidarr API key not configured")
        return []

    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(
                f"{settings.lidarr_url}/api/v1/artist",
                headers={"X-Api-Key": settings.lidarr_api_key},
                timeout=30.0
            )
            response.raise_for_status()
            return response.json()
        except Exception as e:
            logger.error(f"Failed to fetch Lidarr artists: {e}")
            return []


def extract_country_from_radarr(movie: Dict) -> Optional[str]:
    """Extract country code from Radarr movie metadata"""
    # Try productionCountries first
    if "productionCountries" in movie and movie["productionCountries"]:
        countries = movie["productionCountries"]
        if isinstance(countries, list) and len(countries) > 0:
            country = countries[0]
            if isinstance(country, dict) and "iso_3166_1" in country:
                return country["iso_3166_1"].upper()
            elif isinstance(country, str):
                # Try to map country name to code (simplified)
                return None  # Would need a mapping table

    # Try to get from TMDB metadata if available
    if "tmdbId" in movie and movie.get("movieMetadata", {}).get("productionCountries"):
        countries = movie["movieMetadata"]["productionCountries"]
        if isinstance(countries, list) and len(countries) > 0:
            country = countries[0]
            if isinstance(country, dict) and "iso_3166_1" in country:
                return country["iso_3166_1"].upper()

    return None
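Since the extractor is a pure function over the Radarr payload, the `productionCountries` branch is easy to unit-test in isolation. A self-contained sketch of the same logic (`first_iso_country` and the sample payloads are illustrative, not part of the module):

```python
from typing import Optional

def first_iso_country(payload: dict) -> Optional[str]:
    """Return the first ISO 3166-1 alpha-2 code from productionCountries, if any."""
    countries = payload.get("productionCountries") or []
    if isinstance(countries, list) and countries:
        first = countries[0]
        if isinstance(first, dict) and "iso_3166_1" in first:
            return first["iso_3166_1"].upper()
    return None

print(first_iso_country({"productionCountries": [{"iso_3166_1": "kr"}]}))  # KR
print(first_iso_country({"title": "Unknown"}))  # None
```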
def extract_country_from_sonarr(series: Dict) -> Optional[str]:
    """Extract country code from Sonarr series metadata"""
    # Sonarr doesn't always have country info directly
    # Check network origin or other metadata
    if "network" in series and series["network"]:
        # Network name might hint at country, but not reliable
        pass

    # Check if there's any country metadata
    if "seriesMetadata" in series:
        metadata = series["seriesMetadata"]
        if "originCountry" in metadata and metadata["originCountry"]:
            # originCountry might be a list or string
            origin = metadata["originCountry"]
            if isinstance(origin, list) and len(origin) > 0:
                return origin[0].upper() if len(origin[0]) == 2 else None
            elif isinstance(origin, str) and len(origin) == 2:
                return origin.upper()

    return None


def extract_country_from_lidarr(artist: Dict) -> Optional[str]:
    """Extract country code from Lidarr artist metadata"""
    # Lidarr has a country field
    if "country" in artist and artist["country"]:
        country = artist["country"]
        if isinstance(country, str) and len(country) == 2:
            return country.upper()
        # Might be a country name, would need mapping

    return None


async def upsert_media_item(source_kind: str, source_item_id: int, title: str,
                            year: Optional[int], media_type: str, arr_raw: Dict):
    """Upsert a media item into the database"""
    # Access the pool through the module: a plain
    # `from app.core.database import pool` captures the pre-init value
    # (None), because init_db() rebinds the module-level name at startup.
    from app.core import database
    if not database.pool:
        await database.init_db()

    async with database.pool.connection() as conn:
        async with conn.cursor() as cur:
            # Upsert media item
            query = """
                INSERT INTO moviemap.media_item
                (source_kind, source_item_id, title, year, media_type, arr_raw)
                VALUES (%s, %s, %s, %s, %s, %s::jsonb)
                ON CONFLICT (source_kind, source_item_id)
                DO UPDATE SET
                    title = EXCLUDED.title,
                    year = EXCLUDED.year,
                    arr_raw = EXCLUDED.arr_raw
                RETURNING id
            """
            await cur.execute(
                query,
                (source_kind, source_item_id, title, year, media_type, json.dumps(arr_raw))
            )
            result = await cur.fetchone()
            media_item_id = result[0]

            # Extract and upsert country
            country_code = None
            if source_kind == "radarr":
                country_code = extract_country_from_radarr(arr_raw)
            elif source_kind == "sonarr":
                country_code = extract_country_from_sonarr(arr_raw)
            elif source_kind == "lidarr":
                country_code = extract_country_from_lidarr(arr_raw)

            # Delete existing country associations
            await cur.execute(
                "DELETE FROM moviemap.media_country WHERE media_item_id = %s",
                (media_item_id,)
            )

            # Insert new country association if found
            if country_code:
                await cur.execute(
                    "INSERT INTO moviemap.media_country (media_item_id, country_code) VALUES (%s, %s)",
                    (media_item_id, country_code)
                )

            await conn.commit()
            return media_item_id


async def sync_radarr():
    """Sync movies from Radarr"""
    movies = await fetch_radarr_movies()
    synced = 0

    for movie in movies:
        try:
            await upsert_media_item(
                source_kind="radarr",
                source_item_id=movie.get("id"),
                title=movie.get("title", "Unknown"),
                year=movie.get("year"),
                media_type="movie",
                arr_raw=movie
            )
            synced += 1
        except Exception as e:
            logger.error(f"Failed to sync movie {movie.get('id')}: {e}")

    return {"radarr": synced}


async def sync_sonarr():
    """Sync series from Sonarr"""
    series = await fetch_sonarr_series()
    synced = 0

    for s in series:
        try:
            await upsert_media_item(
                source_kind="sonarr",
                source_item_id=s.get("id"),
                title=s.get("title", "Unknown"),
                year=s.get("year"),
                media_type="show",
                arr_raw=s
            )
            synced += 1
        except Exception as e:
            logger.error(f"Failed to sync series {s.get('id')}: {e}")

    return {"sonarr": synced}


async def sync_lidarr():
    """Sync artists from Lidarr"""
    artists = await fetch_lidarr_artists()
    synced = 0

    for artist in artists:
        try:
            await upsert_media_item(
                source_kind="lidarr",
                source_item_id=artist.get("id"),
                title=artist.get("artistName", "Unknown"),
                year=None,  # Artists don't have a year
                media_type="music",
                arr_raw=artist
            )
            synced += 1
        except Exception as e:
            logger.error(f"Failed to sync artist {artist.get('id')}: {e}")

    return {"lidarr": synced}


async def sync_all_arrs() -> Dict:
    """Sync from all *arr instances"""
    logger.info("Starting sync from all *arr instances")

    results = {}

    # Sync each service
    try:
        results.update(await sync_radarr())
    except Exception as e:
        logger.error(f"Radarr sync failed: {e}")
        results["radarr"] = 0

    try:
        results.update(await sync_sonarr())
    except Exception as e:
        logger.error(f"Sonarr sync failed: {e}")
        results["sonarr"] = 0

    try:
        results.update(await sync_lidarr())
    except Exception as e:
        logger.error(f"Lidarr sync failed: {e}")
        results["lidarr"] = 0

    # Update last sync time (access the pool via the module, for the
    # same stale-binding reason as in upsert_media_item)
    from app.core import database
    if not database.pool:
        await database.init_db()

    async with database.pool.connection() as conn:
        async with conn.cursor() as cur:
            for source_kind in ["radarr", "sonarr", "lidarr"]:
                await cur.execute(
                    """
                    INSERT INTO moviemap.source (kind, base_url, enabled, last_sync_at)
                    VALUES (%s, %s, %s, NOW())
                    ON CONFLICT (kind) DO UPDATE SET last_sync_at = NOW()
                    """,
                    (source_kind, getattr(settings, f"{source_kind}_url"), True)
                )
            await conn.commit()

    logger.info(f"Sync completed: {results}")
    return results
74
backend/main.py
Normal file
@@ -0,0 +1,74 @@
"""
Movie Map Backend - FastAPI application
"""
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from pathlib import Path
import logging
from contextlib import asynccontextmanager

from app.api import collection, watched, pins, admin
from app.core.config import settings
from app.core.database import init_db, close_db

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


@asynccontextmanager
async def lifespan(app: FastAPI):
    """Startup and shutdown events"""
    # Startup
    logger.info("Initializing database connection...")
    await init_db()
    yield
    # Shutdown
    logger.info("Closing database connection...")
    await close_db()


app = FastAPI(title="Movie Map API", version="1.0.0", lifespan=lifespan)

# CORS middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # In production, restrict this
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# API routes
app.include_router(collection.router, prefix="/api/collection", tags=["collection"])
app.include_router(watched.router, prefix="/api/watched", tags=["watched"])
app.include_router(pins.router, prefix="/api/pins", tags=["pins"])
app.include_router(admin.router, prefix="/admin", tags=["admin"])


@app.get("/api/health")
async def health():
    """Health check endpoint"""
    return {"status": "ok"}


# Serve frontend static files. Registered last: routes are matched in
# registration order, so a mount at "/" would shadow anything added after it.
# Check multiple possible locations (dev, Nix build, etc.)
frontend_paths = [
    Path(__file__).parent.parent / "frontend" / "dist",  # Dev mode
    Path(__file__).parent / "frontend" / "dist",  # Nix build
    Path(__file__).parent / "frontend",  # Fallback
]

frontend_path = None
for path in frontend_paths:
    if path.exists():
        frontend_path = path
        break

if frontend_path:
    logger.info(f"Serving frontend from {frontend_path}")
    app.mount("/", StaticFiles(directory=str(frontend_path), html=True), name="static")
else:
    logger.warning("Frontend static files not found - API only mode")
11
backend/requirements.txt
Normal file
@@ -0,0 +1,11 @@
fastapi==0.104.1
uvicorn[standard]==0.24.0
psycopg[binary]==3.1.18
psycopg-pool==3.2.0
alembic==1.12.1
sqlalchemy==2.0.23
httpx==0.25.2
pydantic==2.5.2
pydantic-settings==2.1.0
python-multipart==0.0.6
35
backend/run.sh
Executable file
@@ -0,0 +1,35 @@
#!/usr/bin/env bash
set -e

cd "$(dirname "$0")"

# Default values
export PORT=${PORT:-8080}
export POSTGRES_SOCKET_PATH=${POSTGRES_SOCKET_PATH:-/run/postgresql}
export POSTGRES_DB=${POSTGRES_DB:-jawz}

# Read secrets from files if _FILE variables are set (for sops-nix integration)
if [ -n "$SONARR_API_KEY_FILE" ] && [ -f "$SONARR_API_KEY_FILE" ]; then
    export SONARR_API_KEY=$(cat "$SONARR_API_KEY_FILE")
fi

if [ -n "$RADARR_API_KEY_FILE" ] && [ -f "$RADARR_API_KEY_FILE" ]; then
    export RADARR_API_KEY=$(cat "$RADARR_API_KEY_FILE")
fi

if [ -n "$LIDARR_API_KEY_FILE" ] && [ -f "$LIDARR_API_KEY_FILE" ]; then
    export LIDARR_API_KEY=$(cat "$LIDARR_API_KEY_FILE")
fi

if [ -n "$MOVIEMAP_ADMIN_TOKEN_FILE" ] && [ -f "$MOVIEMAP_ADMIN_TOKEN_FILE" ]; then
    export MOVIEMAP_ADMIN_TOKEN=$(cat "$MOVIEMAP_ADMIN_TOKEN_FILE")
fi

# Run migrations if needed
if [ -d "alembic/versions" ]; then
    alembic upgrade head
fi

# Start the server
exec uvicorn main:app --host 127.0.0.1 --port "$PORT"
27
flake.lock
generated
Normal file
@@ -0,0 +1,27 @@
{
  "nodes": {
    "nixpkgs": {
      "locked": {
        "lastModified": 1766902085,
        "narHash": "sha256-coBu0ONtFzlwwVBzmjacUQwj3G+lybcZ1oeNSQkgC0M=",
        "owner": "NixOS",
        "repo": "nixpkgs",
        "rev": "c0b0e0fddf73fd517c3471e546c0df87a42d53f4",
        "type": "github"
      },
      "original": {
        "owner": "NixOS",
        "ref": "nixos-unstable",
        "repo": "nixpkgs",
        "type": "github"
      }
    },
    "root": {
      "inputs": {
        "nixpkgs": "nixpkgs"
      }
    }
  },
  "root": "root",
  "version": 7
}
192
flake.nix
Normal file
@@ -0,0 +1,192 @@
{
  description = "Movie Map - Visualize media origin countries from Radarr/Sonarr/Lidarr";

  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
  };

  outputs = { self, nixpkgs }:
    let
      system = "x86_64-linux";
      pkgs = import nixpkgs { inherit system; };

      # Python dependencies
      pythonEnv = pkgs.python3.withPackages (ps: with ps; [
        fastapi
        uvicorn
        psycopg
        psycopg-pool
        alembic
        sqlalchemy
        httpx
        pydantic
        pydantic-settings
        python-multipart
      ]);

      # Node.js for frontend
      nodejs = pkgs.nodejs_20;

      # Frontend build
      frontend = pkgs.buildNpmPackage {
        name = "moviemap-frontend";
        src = ./frontend;
        npmDepsHash = "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="; # Update after first build
        buildPhase = ''
          npm run build
        '';
        installPhase = ''
          mkdir -p $out
          cp -r dist/* $out/
        '';
      };

      # Backend package
      backend = pkgs.stdenv.mkDerivation {
        name = "moviemap-backend";
        src = ./backend;
        buildInputs = [ pythonEnv ];
        installPhase = ''
          mkdir -p $out/backend
          cp -r . $out/backend/
          chmod +x $out/backend/run.sh
          # Make Python scripts executable
          find $out/backend -name "*.py" -exec chmod +x {} \;
        '';
      };

      # Combined package
      app = pkgs.stdenv.mkDerivation {
        name = "moviemap";
        buildInputs = [ backend frontend ];
        buildPhase = "true";
        installPhase = ''
          mkdir -p $out/backend
          cp -r ${backend}/backend/* $out/backend/
          # Copy frontend dist to backend for serving
          mkdir -p $out/backend/frontend/dist
          cp -r ${frontend}/* $out/backend/frontend/dist/
        '';
      };
    in
    {
      packages.${system} = {
        default = app;
        backend = backend;
        frontend = frontend;
      };

      devShells.${system}.default = pkgs.mkShell {
        buildInputs = [
          pythonEnv
          nodejs
          pkgs.nodePackages.npm
          pkgs.postgresql
          pkgs.alejandra # Nix formatter
        ];
        shellHook = ''
          echo "Movie Map development environment"
          echo "Python: $(python --version)"
          echo "Node: $(node --version)"
        '';
      };

      nixosModules.default = { config, lib, pkgs, ... }:
        with lib;
        let
          cfg = config.services.moviemap;
          appPackage = self.packages.x86_64-linux.default;

          # Check if a value is a file path (starts with /)
          isPath = v: lib.hasPrefix "/" (toString v);

          # Build environment variables - use file paths directly for secrets
          # The run.sh script will read from files at runtime
          envVars = [
            "PORT=${toString cfg.port}"
            "POSTGRES_SOCKET_PATH=${cfg.postgresSocketPath}"
          ] ++ [
            # API keys - if path, pass as a *_FILE variable; if string, pass directly
            (if isPath cfg.sonarrApiKey
             then "SONARR_API_KEY_FILE=${toString cfg.sonarrApiKey}"
             else "SONARR_API_KEY=${toString cfg.sonarrApiKey}")
            (if isPath cfg.radarrApiKey
             then "RADARR_API_KEY_FILE=${toString cfg.radarrApiKey}"
             else "RADARR_API_KEY=${toString cfg.radarrApiKey}")
            (if isPath cfg.lidarrApiKey
             then "LIDARR_API_KEY_FILE=${toString cfg.lidarrApiKey}"
             else "LIDARR_API_KEY=${toString cfg.lidarrApiKey}")
          ] ++ lib.optional (cfg.adminToken != null)
            (if isPath cfg.adminToken
             then "MOVIEMAP_ADMIN_TOKEN_FILE=${toString cfg.adminToken}"
             else "MOVIEMAP_ADMIN_TOKEN=${toString cfg.adminToken}");
        in
        {
          options.services.moviemap = {
            enable = mkEnableOption "Movie Map service";

            port = mkOption {
              type = types.int;
              default = 8080;
              description = "Port to bind the backend server";
            };

            postgresSocketPath = mkOption {
              type = types.str;
              default = "/run/postgresql";
              description = "PostgreSQL socket directory";
            };

            sonarrApiKey = mkOption {
              type = types.either types.str types.path;
              description = "Sonarr API key (string or path to file, e.g., /run/secrets/sonarr-api-key)";
            };

            radarrApiKey = mkOption {
              type = types.either types.str types.path;
              description = "Radarr API key (string or path to file, e.g., /run/secrets/radarr-api-key)";
            };

            lidarrApiKey = mkOption {
              type = types.either types.str types.path;
              description = "Lidarr API key (string or path to file, e.g., /run/secrets/lidarr-api-key)";
            };

            adminToken = mkOption {
              type = types.nullOr (types.either types.str types.path);
              default = null;
              description = "Optional admin token for sync endpoint (string or path to file)";
            };
          };

          config = mkIf cfg.enable {
            systemd.services.moviemap-backend = {
              description = "Movie Map Backend";
              wantedBy = [ "multi-user.target" ];
              after = [ "network.target" "postgresql.service" ];

              serviceConfig = {
                Type = "simple";
                ExecStart = "${appPackage}/backend/run.sh";
                Restart = "always";
                RestartSec = "10s";

                Environment = envVars;

                User = "moviemap";
                Group = "moviemap";
              };
            };

            users.users.moviemap = {
              isSystemUser = true;
              group = "moviemap";
              description = "Movie Map service user";
            };

            users.groups.moviemap = {};
          };
        };
    };
}
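Consuming the module on a NixOS host might look like the following sketch; the `moviemap` flake input name and the `/run/secrets/...` paths are assumptions (the secret paths depend on your sops-nix setup):

```nix
# Hypothetical host configuration using the module above
{ moviemap, ... }:
{
  imports = [ moviemap.nixosModules.default ];

  services.moviemap = {
    enable = true;
    port = 8080;
    # Paths are detected by their leading "/" and passed as *_FILE variables,
    # which run.sh reads at startup; plain strings are passed directly.
    sonarrApiKey = "/run/secrets/sonarr-api-key";
    radarrApiKey = "/run/secrets/radarr-api-key";
    lidarrApiKey = "/run/secrets/lidarr-api-key";
  };
}
```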
15
frontend/index.html
Normal file
@@ -0,0 +1,15 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Movie Map</title>
    <link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css" />
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/main.tsx"></script>
  </body>
</html>
26
frontend/package.json
Normal file
@@ -0,0 +1,26 @@
{
  "name": "moviemap-frontend",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc && vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-router-dom": "^6.20.0",
    "leaflet": "^1.9.4",
    "react-leaflet": "^4.2.1"
  },
  "devDependencies": {
    "@types/react": "^18.2.43",
    "@types/react-dom": "^18.2.17",
    "@types/leaflet": "^1.9.8",
    "@vitejs/plugin-react": "^4.2.1",
    "typescript": "^5.3.3",
    "vite": "^5.0.8"
  }
}
50
frontend/src/App.css
Normal file
@@ -0,0 +1,50 @@
.app {
  width: 100%;
  height: 100vh;
  display: flex;
  flex-direction: column;
}

.navbar {
  background: #1a1a1a;
  color: white;
  padding: 1rem 2rem;
  box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}

.nav-container {
  display: flex;
  justify-content: space-between;
  align-items: center;
  max-width: 1400px;
  margin: 0 auto;
}

.nav-title {
  font-size: 1.5rem;
  font-weight: bold;
}

.nav-links {
  display: flex;
  gap: 2rem;
}

.nav-links a {
  color: #ccc;
  text-decoration: none;
  padding: 0.5rem 1rem;
  border-radius: 4px;
  transition: background 0.2s;
}

.nav-links a:hover {
  background: #333;
  color: white;
}

.nav-links a.active {
  background: #4a90e2;
  color: white;
}
48
frontend/src/App.tsx
Normal file
@@ -0,0 +1,48 @@
import { BrowserRouter as Router, Routes, Route, Link, useLocation } from 'react-router-dom'
import CollectionMap from './components/CollectionMap'
import WatchedMap from './components/WatchedMap'
import './App.css'

function App() {
  return (
    <Router>
      <div className="app">
        <NavBar />
        <Routes>
          <Route path="/" element={<CollectionMap />} />
          <Route path="/watched" element={<WatchedMap />} />
        </Routes>
      </div>
    </Router>
  )
}

function NavBar() {
  const location = useLocation()

  return (
    <nav className="navbar">
      <div className="nav-container">
        <h1 className="nav-title">Movie Map</h1>
        <div className="nav-links">
          <Link
            to="/"
            className={location.pathname === '/' ? 'active' : ''}
          >
            Collection Map
          </Link>
          <Link
            to="/watched"
            className={location.pathname === '/watched' ? 'active' : ''}
          >
            Watched Map
          </Link>
        </div>
      </div>
    </nav>
  )
}

export default App
217
frontend/src/components/CollectionMap.tsx
Normal file
@@ -0,0 +1,217 @@
import { useEffect, useState } from 'react'
import { MapContainer, TileLayer, GeoJSON, CircleMarker, Popup } from 'react-leaflet'
import { LatLngExpression } from 'leaflet'
import 'leaflet/dist/leaflet.css'
import './Map.css'

interface CountryData {
  [countryCode: string]: {
    movie?: number
    show?: number
    music?: number
  }
}

interface MediaTypeFilter {
  movie: boolean
  show: boolean
  music: boolean
}

export default function CollectionMap() {
  const [countryData, setCountryData] = useState<CountryData>({})
  const [filters, setFilters] = useState<MediaTypeFilter>({
    movie: true,
    show: true,
    music: true,
  })
  const [loading, setLoading] = useState(true)
  const [worldGeoJson, setWorldGeoJson] = useState<any>(null)

  useEffect(() => {
    // Load world countries GeoJSON
    fetch('https://raw.githubusercontent.com/holtzy/D3-graph-gallery/master/DATA/world.geojson')
      .then(res => res.json())
      .then(data => setWorldGeoJson(data))
      .catch(err => console.error('Failed to load GeoJSON:', err))
  }, [])

  useEffect(() => {
    fetchCollectionData()
  }, [filters])

  const fetchCollectionData = async () => {
    setLoading(true)
    try {
      const types = Object.entries(filters)
        .filter(([_, enabled]) => enabled)
        .map(([type, _]) => type)
        .join(',')

      const response = await fetch(`/api/collection/summary?types=${types}`)
      const data = await response.json()
      setCountryData(data)
    } catch (error) {
      console.error('Failed to fetch collection data:', error)
    } finally {
      setLoading(false)
    }
  }

  const getCountryCount = (countryCode: string): number => {
    const data = countryData[countryCode] || {}
    let total = 0
    if (filters.movie) total += data.movie || 0
    if (filters.show) total += data.show || 0
    if (filters.music) total += data.music || 0
    return total
  }

  const getMaxCount = (): number => {
    const counts = Object.keys(countryData).map(code => getCountryCount(code))
    return Math.max(...counts, 1)
  }

  const getCountryColor = (countryCode: string): string => {
    const count = getCountryCount(countryCode)
    const maxCount = getMaxCount()
    if (count === 0) return '#e0e0e0'

    const intensity = count / maxCount
    // Blue gradient: light blue to dark blue
    const r = Math.floor(74 + (180 - 74) * (1 - intensity))
    const g = Math.floor(144 + (200 - 144) * (1 - intensity))
    const b = Math.floor(226 + (255 - 226) * (1 - intensity))
    return `rgb(${r}, ${g}, ${b})`
  }
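The gradient above linearly interpolates each RGB channel between a light blue, `rgb(180, 200, 255)`, at the lowest intensity and the accent blue `#4a90e2`, i.e. `rgb(74, 144, 226)`, at the highest. The same math as a small standalone sketch (Python used here just to check the endpoints):

```python
import math

def ramp(intensity: float) -> tuple:
    """Interpolate from light blue (intensity 0) to #4a90e2 (intensity 1)."""
    light, dark = (180, 200, 255), (74, 144, 226)
    return tuple(math.floor(d + (l - d) * (1 - intensity))
                 for l, d in zip(light, dark))

print(ramp(1.0))  # (74, 144, 226)
print(ramp(0.0))  # (180, 200, 255)
```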
  const getCountryCenter = (countryCode: string): LatLngExpression | null => {
    // Simplified country centers - in production, use a proper lookup
    const centers: { [key: string]: LatLngExpression } = {
      'US': [39.8283, -98.5795],
      'GB': [55.3781, -3.4360],
      'FR': [46.2276, 2.2137],
      'DE': [51.1657, 10.4515],
      'JP': [36.2048, 138.2529],
      'CN': [35.8617, 104.1954],
      'IN': [20.5937, 78.9629],
      'KR': [35.9078, 127.7669],
      'TH': [15.8700, 100.9925],
      'MX': [23.6345, -102.5528],
      'BR': [-14.2350, -51.9253],
      'CA': [56.1304, -106.3468],
      'AU': [-25.2744, 133.7751],
      'IT': [41.8719, 12.5674],
      'ES': [40.4637, -3.7492],
      'RU': [61.5240, 105.3188],
    }
    return centers[countryCode] || null
  }

  const toggleFilter = (type: keyof MediaTypeFilter) => {
    setFilters(prev => ({ ...prev, [type]: !prev[type] }))
  }

  if (loading && !worldGeoJson) {
    return <div className="loading">Loading map...</div>
  }

  return (
    <div className="map-container">
      <div className="map-controls">
        <h2>Collection Map</h2>
        <div className="filters">
          <label>
            <input
              type="checkbox"
              checked={filters.movie}
              onChange={() => toggleFilter('movie')}
            />
            Movies
          </label>
          <label>
            <input
              type="checkbox"
              checked={filters.show}
              onChange={() => toggleFilter('show')}
            />
            Shows
          </label>
          <label>
            <input
              type="checkbox"
              checked={filters.music}
              onChange={() => toggleFilter('music')}
            />
            Music
          </label>
        </div>
        <button onClick={fetchCollectionData} className="sync-button">
          Refresh Data
        </button>
      </div>
      <MapContainer
        center={[20, 0]}
        zoom={2}
        style={{ height: '100%', width: '100%' }}
      >
        <TileLayer
          attribution='&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
          url="https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png"
        />
        {worldGeoJson && (
          <GeoJSON
            data={worldGeoJson}
            style={(feature) => {
              const code = feature?.properties?.ISO_A2 || feature?.properties?.ISO_A3?.substring(0, 2)
              return {
                fillColor: getCountryColor(code),
                fillOpacity: 0.7,
                color: '#666',
||||
weight: 1,
|
||||
}
|
||||
}}
|
||||
onEachFeature={(feature, layer) => {
|
||||
const code = feature?.properties?.ISO_A2 || feature?.properties?.ISO_A3?.substring(0, 2)
|
||||
const count = getCountryCount(code)
|
||||
const data = countryData[code] || {}
|
||||
|
||||
layer.bindPopup(`
|
||||
<strong>${feature.properties.NAME || code}</strong><br/>
|
||||
Total: ${count}<br/>
|
||||
${data.movie ? `Movies: ${data.movie}<br/>` : ''}
|
||||
${data.show ? `Shows: ${data.show}<br/>` : ''}
|
||||
${data.music ? `Music: ${data.music}` : ''}
|
||||
`)
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
{Object.keys(countryData).map(code => {
|
||||
const count = getCountryCount(code)
|
||||
if (count === 0) return null
|
||||
|
||||
const center = getCountryCenter(code)
|
||||
if (!center) return null
|
||||
|
||||
return (
|
||||
<CircleMarker
|
||||
key={code}
|
||||
center={center}
|
||||
radius={Math.max(8, Math.min(30, count / 2))}
|
||||
fillColor="#ff6b6b"
|
||||
fillOpacity={0.8}
|
||||
color="#fff"
|
||||
weight={2}
|
||||
>
|
||||
<Popup>
|
||||
<strong>{code}</strong><br/>
|
||||
Count: {count}
|
||||
</Popup>
|
||||
</CircleMarker>
|
||||
)
|
||||
})}
|
||||
</MapContainer>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
139
frontend/src/components/Map.css
Normal file
@@ -0,0 +1,139 @@
.map-container {
  display: flex;
  height: 100%;
  width: 100%;
}

.map-controls {
  width: 300px;
  background: #f5f5f5;
  padding: 1.5rem;
  overflow-y: auto;
  border-right: 1px solid #ddd;
}

.map-controls h2 {
  margin-bottom: 1rem;
  color: #333;
}

.filters {
  display: flex;
  flex-direction: column;
  gap: 0.5rem;
  margin-bottom: 1rem;
}

.filters label {
  display: flex;
  align-items: center;
  gap: 0.5rem;
  cursor: pointer;
}

.sync-button,
.add-button {
  background: #4a90e2;
  color: white;
  border: none;
  padding: 0.75rem 1.5rem;
  border-radius: 4px;
  cursor: pointer;
  font-size: 1rem;
  margin-bottom: 1rem;
  width: 100%;
}

.sync-button:hover,
.add-button:hover {
  background: #357abd;
}

.add-form {
  display: flex;
  flex-direction: column;
  gap: 0.75rem;
  margin-bottom: 1rem;
  padding: 1rem;
  background: white;
  border-radius: 4px;
  border: 1px solid #ddd;
}

.add-form input,
.add-form select,
.add-form textarea {
  padding: 0.5rem;
  border: 1px solid #ddd;
  border-radius: 4px;
  font-size: 0.9rem;
}

.add-form textarea {
  min-height: 60px;
  resize: vertical;
}

.form-actions {
  display: flex;
  gap: 0.5rem;
}

.form-actions button {
  flex: 1;
  padding: 0.5rem;
  border: none;
  border-radius: 4px;
  cursor: pointer;
  background: #4a90e2;
  color: white;
}

.form-actions button:hover {
  background: #357abd;
}

.watched-list {
  margin-top: 1rem;
}

.watched-list h3 {
  margin-bottom: 0.5rem;
  color: #333;
  font-size: 1.1rem;
}

.watched-item {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 0.5rem;
  margin-bottom: 0.5rem;
  background: white;
  border-radius: 4px;
  border: 1px solid #ddd;
}

.watched-item button {
  background: #e74c3c;
  color: white;
  border: none;
  padding: 0.25rem 0.75rem;
  border-radius: 4px;
  cursor: pointer;
  font-size: 0.8rem;
}

.watched-item button:hover {
  background: #c0392b;
}

.loading {
  display: flex;
  align-items: center;
  justify-content: center;
  height: 100%;
  font-size: 1.2rem;
  color: #666;
}
280
frontend/src/components/WatchedMap.tsx
Normal file
@@ -0,0 +1,280 @@
import React, { useEffect, useState } from 'react'
import { MapContainer, TileLayer, GeoJSON, CircleMarker, Popup } from 'react-leaflet'
import { LatLngExpression } from 'leaflet'
import 'leaflet/dist/leaflet.css'
import './Map.css'

interface WatchedItem {
  id: string
  media_type: 'movie' | 'show'
  title: string
  year?: number
  country_code: string
  watched_at?: string
  notes?: string
}

interface Pin {
  id: string
  country_code: string
  label?: string
  pinned_at: string
}

interface CountrySummary {
  [countryCode: string]: {
    movie?: number
    show?: number
  }
}

export default function WatchedMap() {
  const [watchedItems, setWatchedItems] = useState<WatchedItem[]>([])
  const [pins, setPins] = useState<Pin[]>([])
  const [summary, setSummary] = useState<CountrySummary>({})
  const [worldGeoJson, setWorldGeoJson] = useState<any>(null)
  const [showAddForm, setShowAddForm] = useState(false)
  const [newItem, setNewItem] = useState({
    media_type: 'movie' as 'movie' | 'show',
    title: '',
    year: '',
    country_code: '',
    notes: '',
  })

  useEffect(() => {
    fetch('https://raw.githubusercontent.com/holtzy/D3-graph-gallery/master/DATA/world.geojson')
      .then(res => res.json())
      .then(data => setWorldGeoJson(data))
      .catch(err => console.error('Failed to load GeoJSON:', err))

    fetchData()
  }, [])

  const fetchData = async () => {
    try {
      const [watchedRes, pinsRes, summaryRes] = await Promise.all([
        fetch('/api/watched'),
        fetch('/api/pins'),
        fetch('/api/watched/summary'),
      ])

      setWatchedItems(await watchedRes.json())
      setPins(await pinsRes.json())
      setSummary(await summaryRes.json())
    } catch (error) {
      console.error('Failed to fetch data:', error)
    }
  }

  const handleAddWatched = async (e: React.FormEvent) => {
    e.preventDefault()
    try {
      await fetch('/api/watched', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          ...newItem,
          year: newItem.year ? parseInt(newItem.year) : null,
          watched_at: new Date().toISOString(),
        }),
      })
      setShowAddForm(false)
      setNewItem({ media_type: 'movie', title: '', year: '', country_code: '', notes: '' })
      fetchData()
    } catch (error) {
      console.error('Failed to add watched item:', error)
    }
  }

  const handleAddPin = async () => {
    if (!newItem.country_code) return
    try {
      await fetch('/api/pins', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          country_code: newItem.country_code,
          label: newItem.title || undefined,
        }),
      })
      setNewItem({ media_type: 'movie', title: '', year: '', country_code: '', notes: '' })
      fetchData()
    } catch (error) {
      console.error('Failed to add pin:', error)
    }
  }

  const handleDeleteWatched = async (id: string) => {
    try {
      await fetch(`/api/watched/${id}`, { method: 'DELETE' })
      fetchData()
    } catch (error) {
      console.error('Failed to delete watched item:', error)
    }
  }

  const handleDeletePin = async (id: string) => {
    try {
      await fetch(`/api/pins/${id}`, { method: 'DELETE' })
      fetchData()
    } catch (error) {
      console.error('Failed to delete pin:', error)
    }
  }

  const getCountryCount = (countryCode: string): number => {
    const data = summary[countryCode] || {}
    return (data.movie || 0) + (data.show || 0)
  }

  const getMaxCount = (): number => {
    const counts = Object.keys(summary).map(code => getCountryCount(code))
    return Math.max(...counts, 1)
  }

  const getCountryColor = (countryCode: string): string => {
    const count = getCountryCount(countryCode)
    const maxCount = getMaxCount()
    if (count === 0) return '#e0e0e0'

    // Warm gradient: pale yellow to deep orange as watched count rises
    const intensity = count / maxCount
    const r = Math.floor(255 - (255 - 100) * intensity)
    const g = Math.floor(200 - (200 - 50) * intensity)
    const b = Math.floor(100 - (100 - 20) * intensity)
    return `rgb(${r}, ${g}, ${b})`
  }

  const getCountryCenter = (countryCode: string): LatLngExpression | null => {
    const centers: { [key: string]: LatLngExpression } = {
      'US': [39.8283, -98.5795], 'GB': [55.3781, -3.4360], 'FR': [46.2276, 2.2137],
      'DE': [51.1657, 10.4515], 'JP': [36.2048, 138.2529], 'CN': [35.8617, 104.1954],
      'IN': [20.5937, 78.9629], 'KR': [35.9078, 127.7669], 'TH': [15.8700, 100.9925],
      'MX': [23.6345, -102.5528], 'BR': [-14.2350, -51.9253], 'CA': [56.1304, -106.3468],
      'AU': [-25.2744, 133.7751], 'IT': [41.8719, 12.5674], 'ES': [40.4637, -3.7492],
      'RU': [61.5240, 105.3188],
    }
    return centers[countryCode] || null
  }

  return (
    <div className="map-container">
      <div className="map-controls">
        <h2>Watched Map</h2>
        <button onClick={() => setShowAddForm(!showAddForm)} className="add-button">
          {showAddForm ? 'Cancel' : 'Add Watched Item'}
        </button>
        {showAddForm && (
          <form onSubmit={handleAddWatched} className="add-form">
            <select
              value={newItem.media_type}
              onChange={(e) => setNewItem({ ...newItem, media_type: e.target.value as 'movie' | 'show' })}
            >
              <option value="movie">Movie</option>
              <option value="show">Show</option>
            </select>
            <input
              type="text"
              placeholder="Title"
              value={newItem.title}
              onChange={(e) => setNewItem({ ...newItem, title: e.target.value })}
              required
            />
            <input
              type="number"
              placeholder="Year (optional)"
              value={newItem.year}
              onChange={(e) => setNewItem({ ...newItem, year: e.target.value })}
            />
            <input
              type="text"
              placeholder="Country Code (e.g., TH, JP)"
              value={newItem.country_code}
              onChange={(e) => setNewItem({ ...newItem, country_code: e.target.value.toUpperCase() })}
              required
              maxLength={2}
            />
            <textarea
              placeholder="Notes (optional)"
              value={newItem.notes}
              onChange={(e) => setNewItem({ ...newItem, notes: e.target.value })}
            />
            <div className="form-actions">
              <button type="submit">Add Watched</button>
              <button type="button" onClick={handleAddPin}>Add Pin Only</button>
            </div>
          </form>
        )}
        <div className="watched-list">
          <h3>Watched Items</h3>
          {watchedItems.filter(item => item.watched_at).map(item => (
            <div key={item.id} className="watched-item">
              <strong>{item.title}</strong> ({item.country_code})
              <button onClick={() => handleDeleteWatched(item.id)}>Delete</button>
            </div>
          ))}
        </div>
      </div>
      <MapContainer
        center={[20, 0]}
        zoom={2}
        style={{ height: '100%', width: '100%' }}
      >
        <TileLayer
          attribution='&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
          url="https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png"
        />
        {worldGeoJson && (
          <GeoJSON
            data={worldGeoJson}
            style={(feature) => {
              const code = feature?.properties?.ISO_A2 || feature?.properties?.ISO_A3?.substring(0, 2)
              return {
                fillColor: getCountryColor(code),
                fillOpacity: 0.7,
                color: '#666',
                weight: 1,
              }
            }}
            onEachFeature={(feature, layer) => {
              const code = feature?.properties?.ISO_A2 || feature?.properties?.ISO_A3?.substring(0, 2)
              const count = getCountryCount(code)
              const data = summary[code] || {}

              layer.bindPopup(`
                <strong>${feature.properties.NAME || code}</strong><br/>
                Watched: ${count}<br/>
                ${data.movie ? `Movies: ${data.movie}<br/>` : ''}
                ${data.show ? `Shows: ${data.show}` : ''}
              `)
            }}
          />
        )}
        {pins.map(pin => {
          const center = getCountryCenter(pin.country_code)
          if (!center) return null

          return (
            <CircleMarker
              key={pin.id}
              center={center}
              radius={10}
              fillColor="#ffd700"
              fillOpacity={0.8}
              color="#ff8c00"
              weight={2}
            >
              <Popup>
                <strong>{pin.country_code}</strong>
                {pin.label && <><br/>{pin.label}</>}
                <br/>
                <button onClick={() => handleDeletePin(pin.id)}>Delete</button>
              </Popup>
            </CircleMarker>
          )
        })}
      </MapContainer>
    </div>
  )
}
2
frontend/src/components/__init__.ts
Normal file
@@ -0,0 +1,2 @@
// Components exports
19
frontend/src/index.css
Normal file
@@ -0,0 +1,19 @@
* {
  margin: 0;
  padding: 0;
  box-sizing: border-box;
}

body {
  font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
    'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
    sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
}

#root {
  width: 100%;
  height: 100vh;
}
11
frontend/src/main.tsx
Normal file
@@ -0,0 +1,11 @@
import React from 'react'
import ReactDOM from 'react-dom/client'
import App from './App'
import './index.css'

ReactDOM.createRoot(document.getElementById('root')!).render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
)
22
frontend/tsconfig.json
Normal file
@@ -0,0 +1,22 @@
{
  "compilerOptions": {
    "target": "ES2020",
    "useDefineForClassFields": true,
    "lib": ["ES2020", "DOM", "DOM.Iterable"],
    "module": "ESNext",
    "skipLibCheck": true,
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx",
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noFallthroughCasesInSwitch": true
  },
  "include": ["src"],
  "references": [{ "path": "./tsconfig.node.json" }]
}
11
frontend/tsconfig.node.json
Normal file
@@ -0,0 +1,11 @@
{
  "compilerOptions": {
    "composite": true,
    "skipLibCheck": true,
    "module": "ESNext",
    "moduleResolution": "bundler",
    "allowSyntheticDefaultImports": true
  },
  "include": ["vite.config.ts"]
}
22
frontend/vite.config.ts
Normal file
@@ -0,0 +1,22 @@
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'

export default defineConfig({
  plugins: [react()],
  build: {
    outDir: 'dist',
  },
  server: {
    proxy: {
      '/api': {
        target: 'http://127.0.0.1:8080',
        changeOrigin: true,
      },
      '/admin': {
        target: 'http://127.0.0.1:8080',
        changeOrigin: true,
      },
    },
  },
})