18 Commits
0.1.2 ... 0.1.8

Author SHA1 Message Date
cd91b20278 Merge pull request 'Replace python-jose with PyJWT and update its usage' (#6) from development into main
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 7s, PG15 8s, PG16 7s, PG17 7s, PG18 7s; Docker Publish (Release) / Build and Push Docker Images 1m27s
Reviewed-on: #6
2026-02-13 12:23:40 +00:00
fd9853957a Merge branch 'main' of https://git.nesterovic.cc/nessi/NexaPG into development
All checks were successful: PostgreSQL Compatibility Matrix (pull_request) PG14 9s, PG15 8s, PG16 8s, PG17 7s, PG18 7s
2026-02-13 13:20:49 +01:00
9c68f11d74 Replace python-jose with PyJWT and update its usage.
Switched the dependency from `python-jose` to `PyJWT` to handle JWT encoding and decoding. Updated related code to use `PyJWT`'s `InvalidTokenError` instead of `JWTError`. Also bumped the application version from `0.1.7` to `0.1.8`.
2026-02-13 13:20:46 +01:00
6848a66d88 Merge pull request 'Update backend requirements - security hardening' (#5) from development into main
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 8s, PG15 8s, PG16 7s, PG17 8s, PG18 8s; Docker Publish (Release) / Build and Push Docker Images 1m32s
Reviewed-on: #5
2026-02-13 12:07:48 +00:00
a9a49eba4e Merge branch 'main' of https://git.nesterovic.cc/nessi/NexaPG into development
All checks were successful: PostgreSQL Compatibility Matrix (pull_request) PG14 11s, PG15 8s, PG16 8s, PG17 8s, PG18 8s
2026-02-13 13:01:26 +01:00
9ccde7ca37 Update backend requirements - security hardening 2026-02-13 13:01:22 +01:00
88c3345647 Merge pull request 'Use lighter base images for frontend containers' (#4) from development into main
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 8s, PG15 8s, PG16 9s, PG17 8s, PG18 8s; Docker Publish (Release) / Build and Push Docker Images 1m24s
Reviewed-on: #4
2026-02-13 11:43:59 +00:00
d9f3de9468 Use lighter base images for frontend containers
All checks were successful: PostgreSQL Compatibility Matrix (pull_request) PG14 9s, PG15 8s, PG16 7s, PG17 7s, PG18 8s
Switched Node.js and Nginx images from 'bookworm' to 'alpine' variants to reduce image size. Added `apk upgrade --no-cache` so the Nginx container ships with current Alpine packages. This reduces image footprint and pull time.
2026-02-13 11:26:52 +01:00
e62aaaf5a0 Merge pull request 'Update base images in Dockerfile to use bookworm variants' (#3) from development into main
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 8s, PG15 8s, PG16 8s, PG17 8s, PG18 7s; Docker Publish (Release) / Build and Push Docker Images 2m7s
Reviewed-on: #3
2026-02-13 10:20:20 +00:00
ef84273868 Update base images in Dockerfile to use bookworm variants
All checks were successful: PostgreSQL Compatibility Matrix (pull_request) PG14 8s, PG15 8s, PG16 8s, PG17 8s, PG18 8s
Replaced alpine with bookworm-slim for Node.js and with bookworm for nginx. This ensures compatibility with the latest updates and improves consistency across images. Adjusted the health check command for nginx accordingly.
2026-02-13 11:15:17 +01:00
6c59b21088 Merge pull request 'Add first and last name fields for users' (#2) from development into main
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 8s, PG15 7s, PG16 8s, PG17 8s, PG18 8s; Docker Publish (Release) / Build and Push Docker Images 1m13s
Reviewed-on: #2
2026-02-13 10:09:02 +00:00
cd1795b9ff Add first and last name fields for users
All checks were successful: PostgreSQL Compatibility Matrix (pull_request) PG14 12s, PG15 11s, PG16 9s, PG17 10s, PG18 11s
This commit introduces optional `first_name` and `last_name` fields to the user model, including database migrations, backend, and frontend support. It enhances user profiles, updates user creation and editing flows, and refines the UI to display full names where available.
2026-02-13 10:57:10 +01:00
e0242bc823 Refactor deployment process to use prebuilt Docker images
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 8s, PG15 8s, PG16 8s, PG17 8s, PG18 8s
Replaced local builds with prebuilt backend and frontend Docker images for simplified deployment. Updated documentation and Makefile to reflect the changes and added a bootstrap script for quick setup of deployment files. Removed deprecated `VITE_API_URL` variable and references to streamline the setup.
2026-02-13 10:43:34 +01:00
75f8106ca5 Merge pull request 'Merge Fixes and Technical changes from development into main branch' (#1) from development into main
All checks were successful: PostgreSQL Compatibility Matrix (push) PG14 7s, PG15 8s, PG16 8s, PG17 8s, PG18 8s; Docker Publish (Release) / Build and Push Docker Images 4m30s
Reviewed-on: #1
2026-02-13 09:13:04 +00:00
4e4f8ad5d4 Update NEXAPG version to 0.1.3
All checks were successful: PostgreSQL Compatibility Matrix (pull_request) PG14 8s, PG15 8s, PG16 8s, PG17 8s, PG18 8s
This increments the application version from 0.1.2 to 0.1.3, covering the fixes and technical changes merged from the development branch.
2026-02-13 10:11:00 +01:00
5c5d51350f Fix stale refresh usage in QueryInsightsPage effect hooks
Stored `refresh` in a `useRef` so async operations always use the latest value. Added cleanup logic to handle component unmounts during API calls, preventing state updates on unmounted components.
2026-02-13 10:06:56 +01:00
ba1559e790 Improve agentless mode messaging for host-level metrics
Updated the messaging and UI to clarify unavailability of host-level metrics, such as CPU, RAM, and disk space, in agentless mode. Added clear formatting and new functions to handle missing metrics gracefully in the frontend.
2026-02-13 10:01:24 +01:00
ab9d03be8a Add GitHub Actions workflow for Docker image release
This workflow automates building and publishing Docker images upon a release or manual trigger. It includes steps for version resolution, Docker Hub login, and caching to optimize builds for both backend and frontend images.
2026-02-13 09:55:08 +01:00
23 changed files with 453 additions and 65 deletions

View File

@@ -58,7 +58,3 @@ INIT_ADMIN_PASSWORD=ChangeMe123!
 # ------------------------------
 # Host port mapped to frontend container port 80.
 FRONTEND_PORT=5173
-# Base API URL used at frontend build time.
-# For reverse proxy + SSL, keep this relative to avoid mixed-content issues.
-# Example direct mode: VITE_API_URL=http://localhost:8000/api/v1
-VITE_API_URL=/api/v1

.github/workflows/docker-release.yml (new file, 91 lines)
View File

@@ -0,0 +1,91 @@
name: Docker Publish (Release)

on:
  release:
    types: [published]
  workflow_dispatch:
    inputs:
      version:
        description: "Version tag to publish (e.g. 0.1.2 or v0.1.2)"
        required: false
        type: string

jobs:
  publish:
    name: Build and Push Docker Images
    runs-on: ubuntu-latest
    permissions:
      contents: read
    env:
      # Optional repo variable. If unset, DOCKERHUB_USERNAME is used.
      IMAGE_NAMESPACE: ${{ vars.DOCKERHUB_NAMESPACE }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Resolve version/tag
        id: ver
        shell: bash
        run: |
          RAW_TAG="${{ github.event.release.tag_name }}"
          if [ -z "$RAW_TAG" ]; then
            RAW_TAG="${{ inputs.version }}"
          fi
          if [ -z "$RAW_TAG" ]; then
            RAW_TAG="${GITHUB_REF_NAME}"
          fi
          CLEAN_TAG="${RAW_TAG#v}"
          echo "raw=$RAW_TAG" >> "$GITHUB_OUTPUT"
          echo "clean=$CLEAN_TAG" >> "$GITHUB_OUTPUT"

      - name: Set image namespace
        id: ns
        shell: bash
        run: |
          NS="${IMAGE_NAMESPACE}"
          if [ -z "$NS" ]; then
            NS="${{ secrets.DOCKERHUB_USERNAME }}"
          fi
          if [ -z "$NS" ]; then
            echo "Missing Docker Hub namespace. Set repo var DOCKERHUB_NAMESPACE or secret DOCKERHUB_USERNAME."
            exit 1
          fi
          echo "value=$NS" >> "$GITHUB_OUTPUT"

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push backend image
        uses: docker/build-push-action@v6
        with:
          context: ./backend
          file: ./backend/Dockerfile
          push: true
          tags: |
            ${{ steps.ns.outputs.value }}/nexapg-backend:${{ steps.ver.outputs.clean }}
            ${{ steps.ns.outputs.value }}/nexapg-backend:latest
          cache-from: type=registry,ref=${{ steps.ns.outputs.value }}/nexapg-backend:buildcache
          cache-to: type=registry,ref=${{ steps.ns.outputs.value }}/nexapg-backend:buildcache,mode=max

      - name: Build and push frontend image
        uses: docker/build-push-action@v6
        with:
          context: ./frontend
          file: ./frontend/Dockerfile
          push: true
          build-args: |
            VITE_API_URL=/api/v1
          tags: |
            ${{ steps.ns.outputs.value }}/nexapg-frontend:${{ steps.ver.outputs.clean }}
            ${{ steps.ns.outputs.value }}/nexapg-frontend:latest
          cache-from: type=registry,ref=${{ steps.ns.outputs.value }}/nexapg-frontend:buildcache
          cache-to: type=registry,ref=${{ steps.ns.outputs.value }}/nexapg-frontend:buildcache,mode=max
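For readers skimming the workflow: the `Resolve version/tag` step prefers the release tag, then the manual `version` input, then the git ref, and strips a leading `v` before tagging images. The same precedence expressed as a short Python sketch (the function name is illustrative, not part of the repo):

```python
def resolve_docker_tag(release_tag: str, manual_input: str, ref_name: str) -> str:
    """Release tag wins, then the workflow_dispatch input, then the git ref; drop a leading 'v'."""
    raw = release_tag or manual_input or ref_name
    return raw.removeprefix("v")


assert resolve_docker_tag("v0.1.8", "", "main") == "0.1.8"
assert resolve_docker_tag("", "0.1.8", "main") == "0.1.8"
assert resolve_docker_tag("", "", "v0.1.2") == "0.1.2"
```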

View File

@@ -1,7 +1,8 @@
 .PHONY: up down logs migrate

 up:
-	docker compose up -d --build
+	docker compose pull
+	docker compose up -d

 down:
 	docker compose down

View File

@@ -9,7 +9,7 @@ It combines FastAPI, React, and PostgreSQL in a Docker Compose stack with RBAC,
 ## Table of Contents
-- [Quick Start](#quick-start)
+- [Quick Deploy (Prebuilt Images)](#quick-deploy-prebuilt-images)
 - [Prerequisites](#prerequisites)
 - [Make Commands](#make-commands)
 - [Configuration Reference (`.env`)](#configuration-reference-env)
@@ -93,27 +93,50 @@ Optional:
 - `psql` for manual DB checks
-## Quick Start
+## Quick Deploy (Prebuilt Images)
-1. Copy environment template:
+If you only want to run NexaPG from published Docker Hub images, use the bootstrap script:
 ```bash
-cp .env.example .env
+mkdir -p /opt/NexaPG
+cd /opt/NexaPG
+wget -O bootstrap-compose.sh https://git.nesterovic.cc/nessi/NexaPG/raw/branch/main/ops/scripts/bootstrap-compose.sh
+chmod +x bootstrap-compose.sh
+./bootstrap-compose.sh
 ```
-2. Generate a Fernet key and set `ENCRYPTION_KEY` in `.env`:
+This downloads:
+- `docker-compose.yml`
+- `.env.example`
+- `Makefile`
+Then:
 ```bash
+# generate JWT secret
+python -c "import secrets; print(secrets.token_urlsafe(64))"
+# generate Fernet encryption key
 python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
-```
+# put both values into .env (JWT_SECRET_KEY / ENCRYPTION_KEY)
+# note: .env is auto-created by bootstrap if it does not exist
-3. Start the stack:
-```bash
 make up
 ```
-4. Open the application:
+Manual download alternative:
+```bash
+mkdir -p /opt/NexaPG
+cd /opt/NexaPG
+wget https://git.nesterovic.cc/nessi/NexaPG/raw/branch/main/docker-compose.yml
+wget https://git.nesterovic.cc/nessi/NexaPG/raw/branch/main/.env.example
+wget https://git.nesterovic.cc/nessi/NexaPG/raw/branch/main/Makefile
+cp .env.example .env
+```
+`make up` pulls `nesterovicit/nexapg-backend:latest` and `nesterovicit/nexapg-frontend:latest`, then starts the stack.
+Open the application:
 - Frontend: `http://<SERVER_IP>:<FRONTEND_PORT>`
 - API base: `http://<SERVER_IP>:<BACKEND_PORT>/api/v1`
@@ -127,7 +150,7 @@ Initial admin bootstrap user (created from `.env` if missing):
 ## Make Commands
 ```bash
-make up # build and start all services
+make up # pull latest images and start all services
 make down # stop all services
 make logs # follow compose logs
 make migrate # optional/manual: run alembic upgrade head in backend container
@@ -183,12 +206,6 @@ Note: Migrations run automatically when the backend container starts (`entrypoin
 | Variable | Description |
 |---|---|
 | `FRONTEND_PORT` | Host port mapped to frontend container port `80` |
-| `VITE_API_URL` | Frontend API base URL (build-time) |
-Recommended values for `VITE_API_URL`:
-- Reverse proxy setup: `/api/v1`
-- Direct backend access: `http://<SERVER_IP>:<BACKEND_PORT>/api/v1`
 ## Core Functional Areas
@@ -318,7 +335,7 @@ For production, serve frontend and API under the same public origin via reverse
 - Frontend URL example: `https://monitor.example.com`
 - Proxy API path `/api/` to backend service
-- Use `VITE_API_URL=/api/v1`
+- Route `/api/v1` to the backend service
 This prevents mixed-content and CORS issues.
@@ -351,8 +368,7 @@ docker compose logs --tail=200 db
 ### CORS or mixed-content issues behind SSL proxy
-- Set `VITE_API_URL=/api/v1`
-- Ensure proxy forwards `/api/` to backend
+- Ensure proxy forwards `/api/` (or `/api/v1`) to backend
 - Set correct frontend origin(s) in `CORS_ORIGINS`
 ### `rejected SSL upgrade` for a target

View File

@@ -1,4 +1,4 @@
-FROM python:3.12-slim AS base
+FROM python:3.13-slim AS base
 ENV PYTHONDONTWRITEBYTECODE=1
 ENV PYTHONUNBUFFERED=1
@@ -6,6 +6,10 @@ ENV PIP_NO_CACHE_DIR=1
 WORKDIR /app

+RUN apt-get update \
+    && apt-get upgrade -y \
+    && rm -rf /var/lib/apt/lists/*
+
 RUN addgroup --system app && adduser --system --ingroup app app
 COPY requirements.txt /app/requirements.txt

View File

@@ -0,0 +1,26 @@
"""add user first and last name fields
Revision ID: 0009_user_profile_fields
Revises: 0008_service_settings
Create Date: 2026-02-13
"""
from alembic import op
import sqlalchemy as sa
revision = "0009_user_profile_fields"
down_revision = "0008_service_settings"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("users", sa.Column("first_name", sa.String(length=120), nullable=True))
op.add_column("users", sa.Column("last_name", sa.String(length=120), nullable=True))
def downgrade() -> None:
op.drop_column("users", "last_name")
op.drop_column("users", "first_name")

View File

@@ -23,7 +23,13 @@ async def create_user(payload: UserCreate, admin: User = Depends(require_roles(
     exists = await db.scalar(select(User).where(User.email == payload.email))
     if exists:
         raise HTTPException(status_code=409, detail="Email already exists")
-    user = User(email=payload.email, password_hash=hash_password(payload.password), role=payload.role)
+    user = User(
+        email=payload.email,
+        first_name=payload.first_name,
+        last_name=payload.last_name,
+        password_hash=hash_password(payload.password),
+        role=payload.role,
+    )
     db.add(user)
     await db.commit()
     await db.refresh(user)
@@ -42,8 +48,15 @@ async def update_user(
     if not user:
         raise HTTPException(status_code=404, detail="User not found")
     update_data = payload.model_dump(exclude_unset=True)
-    if "password" in update_data and update_data["password"]:
-        user.password_hash = hash_password(update_data.pop("password"))
+    next_email = update_data.get("email")
+    if next_email and next_email != user.email:
+        existing = await db.scalar(select(User).where(User.email == next_email))
+        if existing and existing.id != user.id:
+            raise HTTPException(status_code=409, detail="Email already exists")
+    if "password" in update_data:
+        raw_password = update_data.pop("password")
+        if raw_password:
+            user.password_hash = hash_password(raw_password)
     for key, value in update_data.items():
         setattr(user, key, value)
     await db.commit()
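A note on the partial-update pattern in `update_user` above: `model_dump(exclude_unset=True)` in Pydantic v2 returns only the fields the client actually sent, so omitted attributes are never touched, while an explicit `null` can still clear one. A minimal sketch of that behaviour (field names mirror the `UserUpdate` schema in this PR; `EmailStr` is simplified to `str` here so the snippet runs without the email-validator extra):

```python
from pydantic import BaseModel


class UserUpdate(BaseModel):
    # Mirrors the schema added in this PR; the real model uses EmailStr for email.
    email: str | None = None
    first_name: str | None = None
    last_name: str | None = None
    password: str | None = None
    role: str | None = None


# Client changes only the first name: nothing else appears in the dump.
partial = UserUpdate.model_validate({"first_name": "Jane"})
print(partial.model_dump(exclude_unset=True))   # {'first_name': 'Jane'}

# An explicit null is still "set", so it can intentionally clear a field.
clearing = UserUpdate.model_validate({"last_name": None})
print(clearing.model_dump(exclude_unset=True))  # {'last_name': None}
```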

View File

@@ -1,5 +1,5 @@
 from fastapi import APIRouter, Depends, HTTPException, status
-from jose import JWTError, jwt
+import jwt
 from sqlalchemy import select
 from sqlalchemy.ext.asyncio import AsyncSession
 from app.core.config import get_settings
@@ -29,7 +29,7 @@ async def login(payload: LoginRequest, db: AsyncSession = Depends(get_db)) -> To
 async def refresh(payload: RefreshRequest, db: AsyncSession = Depends(get_db)) -> TokenResponse:
     try:
         token_payload = jwt.decode(payload.refresh_token, settings.jwt_secret_key, algorithms=[settings.jwt_algorithm])
-    except JWTError as exc:
+    except jwt.InvalidTokenError as exc:
         raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid refresh token") from exc
     if token_payload.get("type") != "refresh":
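As context for the python-jose to PyJWT switch shown in this and the next two files: PyJWT exposes module-level `jwt.encode` / `jwt.decode`, and `jwt.InvalidTokenError` is the base class for expiry, signature, and decode failures. A minimal round-trip sketch, reusing the settings names from the code above (the concrete secret, claim values, and lifetime are illustrative only):

```python
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT

SECRET = "change-me"  # stands in for settings.jwt_secret_key
ALGORITHM = "HS256"   # stands in for settings.jwt_algorithm

# Encode a refresh token with an expiry and a "type" claim, as the routes expect.
payload = {
    "sub": "42",
    "type": "refresh",
    "exp": datetime.now(timezone.utc) + timedelta(days=7),
}
token = jwt.encode(payload, SECRET, algorithm=ALGORITHM)

# Decode verifies the signature and the exp claim in one call; failures raise
# subclasses of InvalidTokenError (ExpiredSignatureError, InvalidSignatureError, ...).
try:
    decoded = jwt.decode(token, SECRET, algorithms=[ALGORITHM])
    assert decoded["type"] == "refresh"
except jwt.InvalidTokenError:
    raise
```

Because `jwt.decode` performs signature and expiry validation together, the routes only need the single `except jwt.InvalidTokenError` branch shown in the diff.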

View File

@@ -2,7 +2,7 @@ from functools import lru_cache
 from pydantic import field_validator
 from pydantic_settings import BaseSettings, SettingsConfigDict

-NEXAPG_VERSION = "0.1.2"
+NEXAPG_VERSION = "0.1.8"

 class Settings(BaseSettings):

View File

@@ -1,6 +1,6 @@
 from fastapi import Depends, HTTPException, status
 from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
-from jose import JWTError, jwt
+import jwt
 from sqlalchemy import select
 from sqlalchemy.ext.asyncio import AsyncSession
 from app.core.config import get_settings
@@ -20,7 +20,7 @@ async def get_current_user(
     token = credentials.credentials
     try:
         payload = jwt.decode(token, settings.jwt_secret_key, algorithms=[settings.jwt_algorithm])
-    except JWTError as exc:
+    except jwt.InvalidTokenError as exc:
         raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token") from exc
     if payload.get("type") != "access":

View File

@@ -1,5 +1,5 @@
 from datetime import datetime, timedelta, timezone
-from jose import jwt
+import jwt
 from passlib.context import CryptContext
 from app.core.config import get_settings

View File

@@ -9,6 +9,8 @@ class User(Base):
     id: Mapped[int] = mapped_column(Integer, primary_key=True)
     email: Mapped[str] = mapped_column(String(255), unique=True, index=True, nullable=False)
+    first_name: Mapped[str | None] = mapped_column(String(120), nullable=True)
+    last_name: Mapped[str | None] = mapped_column(String(120), nullable=True)
     password_hash: Mapped[str] = mapped_column(String(255), nullable=False)
     role: Mapped[str] = mapped_column(String(20), nullable=False, default="viewer")
     created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now(), nullable=False)

View File

@@ -5,6 +5,8 @@ from pydantic import BaseModel, EmailStr, field_validator
 class UserOut(BaseModel):
     id: int
     email: EmailStr
+    first_name: str | None = None
+    last_name: str | None = None
     role: str
     created_at: datetime
@@ -13,12 +15,16 @@ class UserOut(BaseModel):
 class UserCreate(BaseModel):
     email: EmailStr
+    first_name: str | None = None
+    last_name: str | None = None
     password: str
     role: str = "viewer"

 class UserUpdate(BaseModel):
     email: EmailStr | None = None
+    first_name: str | None = None
+    last_name: str | None = None
     password: str | None = None
     role: str | None = None

View File

@@ -17,10 +17,10 @@ class DiskSpaceProvider:
 class NullDiskSpaceProvider(DiskSpaceProvider):
     async def get_free_bytes(self, target_host: str) -> DiskSpaceProbeResult:
         return DiskSpaceProbeResult(
-            source="none",
+            source="agentless",
             status="unavailable",
             free_bytes=None,
-            message=f"No infra probe configured for host {target_host}. Add SSH/Agent provider later.",
+            message=f"Agentless mode: host-level free disk is not available for {target_host}.",
         )
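The "Improve agentless mode messaging" commit pairs this backend change with frontend formatting (see the `formatDiskSpaceAgentless` helper later in the diff). The same handling expressed as a small Python sketch; `ProbeResult` is a stand-in for `DiskSpaceProbeResult` limited to the fields used above, and the GiB formatting is illustrative:

```python
from dataclasses import dataclass


@dataclass
class ProbeResult:
    # Stand-in for DiskSpaceProbeResult with just the fields used above.
    source: str
    status: str
    free_bytes: int | None
    message: str


def format_free_disk(result: ProbeResult | None) -> str:
    """Render a probe result for display, tolerating agentless mode."""
    if result is None:
        return "N/A (agentless)"
    if result.free_bytes is not None:
        return f"{result.free_bytes / 1024 ** 3:.1f} GiB"
    if result.status == "unavailable":
        return "N/A (agentless)"
    return "-"


print(format_free_disk(ProbeResult("agentless", "unavailable", None, "no host telemetry")))  # N/A (agentless)
print(format_free_disk(ProbeResult("agent", "ok", 42 * 1024 ** 3, "")))                      # 42.0 GiB
```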

View File

@@ -1,4 +1,5 @@
-fastapi==0.116.1
+fastapi==0.129.0
+starlette==0.52.1
 uvicorn[standard]==0.35.0
 gunicorn==23.0.0
 sqlalchemy[asyncio]==2.0.44
@@ -7,7 +8,7 @@ alembic==1.16.5
 pydantic==2.11.7
 pydantic-settings==2.11.0
 email-validator==2.2.0
-python-jose[cryptography]==3.5.0
+PyJWT==2.11.0
 passlib[argon2]==1.7.4
-cryptography==45.0.7
-python-multipart==0.0.20
+cryptography==46.0.5
+python-multipart==0.0.22

View File

@@ -18,8 +18,8 @@ services:
       retries: 10

   backend:
-    build:
-      context: ./backend
+    image: nesterovicit/nexapg-backend:latest
+    pull_policy: always
     container_name: nexapg-backend
     restart: unless-stopped
     environment:
@@ -47,10 +47,8 @@ services:
       - "${BACKEND_PORT}:8000"

   frontend:
-    build:
-      context: ./frontend
-      args:
-        VITE_API_URL: ${VITE_API_URL}
+    image: nesterovicit/nexapg-frontend:latest
+    pull_policy: always
     container_name: nexapg-frontend
     restart: unless-stopped
    depends_on:

View File

@@ -7,8 +7,9 @@ ARG VITE_API_URL=/api/v1
 ENV VITE_API_URL=${VITE_API_URL}
 RUN npm run build

-FROM nginx:1.29-alpine
+FROM nginx:1.29-alpine-slim
+RUN apk upgrade --no-cache
 COPY nginx.conf /etc/nginx/conf.d/default.conf
 COPY --from=build /app/dist /usr/share/nginx/html
 EXPOSE 80
-HEALTHCHECK --interval=30s --timeout=3s --retries=5 CMD wget -qO- http://127.0.0.1/ || exit 1
+HEALTHCHECK --interval=30s --timeout=3s --retries=5 CMD nginx -t || exit 1

View File

@@ -22,6 +22,7 @@ function Layout({ children }) {
   const { me, logout, uiMode, setUiMode, alertToasts, dismissAlertToast, serviceUpdateAvailable } = useAuth();
   const navigate = useNavigate();
   const navClass = ({ isActive }) => `nav-btn${isActive ? " active" : ""}`;
+  const fullName = [me?.first_name, me?.last_name].filter(Boolean).join(" ").trim();

   return (
     <div className="shell">
@@ -101,8 +102,9 @@ function Layout({ children }) {
             </button>
             <small>{uiMode === "easy" ? "Simple health guidance" : "Advanced DBA metrics"}</small>
           </div>
-          <div>{me?.email}</div>
-          <div className="role">{me?.role}</div>
+          <div className="profile-name">{fullName || me?.email}</div>
+          {fullName && <div className="profile-email">{me?.email}</div>}
+          <div className="role profile-role">{me?.role}</div>
           <NavLink to="/user-settings" className={({ isActive }) => `profile-btn${isActive ? " active" : ""}`}>
             User Settings
           </NavLink>

View File

@@ -19,8 +19,11 @@ const TEMPLATE_VARIABLES = [
 export function AdminUsersPage() {
   const { tokens, refresh, me } = useAuth();
+  const emptyCreateForm = { email: "", first_name: "", last_name: "", password: "", role: "viewer" };
   const [users, setUsers] = useState([]);
-  const [form, setForm] = useState({ email: "", password: "", role: "viewer" });
+  const [form, setForm] = useState(emptyCreateForm);
+  const [editingUserId, setEditingUserId] = useState(null);
+  const [editForm, setEditForm] = useState({ email: "", first_name: "", last_name: "", password: "", role: "viewer" });
   const [emailSettings, setEmailSettings] = useState({
     enabled: false,
     smtp_host: "",
@@ -79,7 +82,7 @@ export function AdminUsersPage() {
     e.preventDefault();
     try {
       await apiFetch("/admin/users", { method: "POST", body: JSON.stringify(form) }, tokens, refresh);
-      setForm({ email: "", password: "", role: "viewer" });
+      setForm(emptyCreateForm);
       await load();
     } catch (e) {
       setError(String(e.message || e));
@@ -95,6 +98,39 @@ export function AdminUsersPage() {
     }
   };
+  const startEdit = (user) => {
+    setEditingUserId(user.id);
+    setEditForm({
+      email: user.email || "",
+      first_name: user.first_name || "",
+      last_name: user.last_name || "",
+      password: "",
+      role: user.role || "viewer",
+    });
+  };
+
+  const cancelEdit = () => {
+    setEditingUserId(null);
+    setEditForm({ email: "", first_name: "", last_name: "", password: "", role: "viewer" });
+  };
+
+  const saveEdit = async (userId) => {
+    try {
+      const payload = {
+        email: editForm.email,
+        first_name: editForm.first_name.trim() || null,
+        last_name: editForm.last_name.trim() || null,
+        role: editForm.role,
+      };
+      if (editForm.password.trim()) payload.password = editForm.password;
+      await apiFetch(`/admin/users/${userId}`, { method: "PUT", body: JSON.stringify(payload) }, tokens, refresh);
+      cancelEdit();
+      await load();
+    } catch (e) {
+      setError(String(e.message || e));
+    }
+  };
+
   const saveSmtp = async (e) => {
     e.preventDefault();
     setError("");
@@ -165,6 +201,22 @@ export function AdminUsersPage() {
          <p className="muted">Create accounts and manage access roles.</p>
        </div>
        <form className="grid three admin-user-form" onSubmit={create}>
+         <div className="admin-field">
+           <label>First Name</label>
+           <input
+             value={form.first_name}
+             placeholder="Jane"
+             onChange={(e) => setForm({ ...form, first_name: e.target.value })}
+           />
+         </div>
+         <div className="admin-field">
+           <label>Last Name</label>
+           <input
+             value={form.last_name}
+             placeholder="Doe"
+             onChange={(e) => setForm({ ...form, last_name: e.target.value })}
+           />
+         </div>
         <div className="admin-field">
           <label>Email</label>
           <input value={form.email} placeholder="user@example.com" onChange={(e) => setForm({ ...form, email: e.target.value })} />
@@ -197,6 +249,7 @@ export function AdminUsersPage() {
            <thead>
              <tr>
                <th>ID</th>
+               <th>Name</th>
                <th>Email</th>
                <th>Role</th>
                <th>Action</th>
@@ -206,11 +259,70 @@ export function AdminUsersPage() {
            {users.map((u) => (
              <tr key={u.id} className="admin-user-row">
                <td className="user-col-id">{u.id}</td>
-               <td className="user-col-email">{u.email}</td>
-               <td>
-                 <span className={`pill role-pill role-${u.role}`}>{u.role}</span>
+               <td className="user-col-name">
+                 {editingUserId === u.id ? (
+                   <div className="admin-inline-grid two">
+                     <input
+                       value={editForm.first_name}
+                       placeholder="First name"
+                       onChange={(e) => setEditForm({ ...editForm, first_name: e.target.value })}
+                     />
+                     <input
+                       value={editForm.last_name}
+                       placeholder="Last name"
+                       onChange={(e) => setEditForm({ ...editForm, last_name: e.target.value })}
+                     />
+                   </div>
+                 ) : (
+                   <span className="user-col-name-value">{[u.first_name, u.last_name].filter(Boolean).join(" ") || "-"}</span>
+                 )}
+               </td>
+               <td className="user-col-email">
+                 {editingUserId === u.id ? (
+                   <input
+                     value={editForm.email}
+                     placeholder="user@example.com"
+                     onChange={(e) => setEditForm({ ...editForm, email: e.target.value })}
+                   />
+                 ) : (
+                   u.email
+                 )}
               </td>
               <td>
+                 {editingUserId === u.id ? (
+                   <select value={editForm.role} onChange={(e) => setEditForm({ ...editForm, role: e.target.value })}>
+                     <option value="viewer">viewer</option>
+                     <option value="operator">operator</option>
+                     <option value="admin">admin</option>
+                   </select>
+                 ) : (
+                   <span className={`pill role-pill role-${u.role}`}>{u.role}</span>
+                 )}
+               </td>
+               <td className="admin-user-actions">
+                 {editingUserId === u.id && (
+                   <input
+                     type="password"
+                     className="admin-inline-password"
+                     value={editForm.password}
+                     placeholder="New password (optional)"
+                     onChange={(e) => setEditForm({ ...editForm, password: e.target.value })}
+                   />
+                 )}
+                 {editingUserId === u.id ? (
+                   <>
+                     <button className="table-action-btn primary small-btn" onClick={() => saveEdit(u.id)}>
+                       Save
+                     </button>
+                     <button className="table-action-btn small-btn" onClick={cancelEdit}>
+                       Cancel
+                     </button>
+                   </>
+                 ) : (
+                   <button className="table-action-btn edit small-btn" onClick={() => startEdit(u)}>
+                     Edit
+                   </button>
+                 )}
                 {u.id !== me.id && (
                   <button className="table-action-btn delete small-btn" onClick={() => remove(u.id)}>
                     <span aria-hidden="true">

View File

@@ -1,4 +1,4 @@
-import React, { useEffect, useState } from "react";
+import React, { useEffect, useRef, useState } from "react";
 import { apiFetch } from "../api";
 import { useAuth } from "../state";
@@ -62,6 +62,7 @@ function buildQueryTips(row) {
 export function QueryInsightsPage() {
   const { tokens, refresh } = useAuth();
+  const refreshRef = useRef(refresh);
   const [targets, setTargets] = useState([]);
   const [targetId, setTargetId] = useState("");
   const [rows, setRows] = useState([]);
@@ -71,6 +72,10 @@ export function QueryInsightsPage() {
   const [error, setError] = useState("");
   const [loading, setLoading] = useState(true);
+  useEffect(() => {
+    refreshRef.current = refresh;
+  }, [refresh]);
+
   useEffect(() => {
     (async () => {
       try {
@@ -89,17 +94,26 @@ export function QueryInsightsPage() {
   useEffect(() => {
     if (!targetId) return;
+    let active = true;
     (async () => {
       try {
-        const data = await apiFetch(`/targets/${targetId}/top-queries`, {}, tokens, refresh);
+        const data = await apiFetch(`/targets/${targetId}/top-queries`, {}, tokens, refreshRef.current);
+        if (!active) return;
         setRows(data);
-        setSelectedQuery(data[0] || null);
-        setPage(1);
+        setSelectedQuery((prev) => {
+          if (!prev) return data[0] || null;
+          const keep = data.find((row) => row.queryid === prev.queryid);
+          return keep || data[0] || null;
+        });
+        setPage((prev) => (prev === 1 ? prev : 1));
       } catch (e) {
-        setError(String(e.message || e));
+        if (active) setError(String(e.message || e));
       }
     })();
-  }, [targetId, tokens, refresh]);
+    return () => {
+      active = false;
+    };
+  }, [targetId, tokens?.accessToken, tokens?.refreshToken]);

   const dedupedByQueryId = [...rows].reduce((acc, row) => {
     if (!row?.queryid) return acc;

View File

@@ -41,6 +41,19 @@ function formatNumber(value, digits = 2) {
   return Number(value).toFixed(digits);
 }

+function formatHostMetricUnavailable() {
+  return "N/A (agentless)";
+}
+
+function formatDiskSpaceAgentless(diskSpace) {
+  if (!diskSpace) return formatHostMetricUnavailable();
+  if (diskSpace.free_bytes !== null && diskSpace.free_bytes !== undefined) {
+    return formatBytes(diskSpace.free_bytes);
+  }
+  if (diskSpace.status === "unavailable") return formatHostMetricUnavailable();
+  return "-";
+}
+
 function MetricsTooltip({ active, payload, label }) {
   if (!active || !payload || payload.length === 0) return null;
   const row = payload[0]?.payload || {};
@@ -346,6 +359,9 @@ export function TargetDetailPage() {
       {uiMode === "dba" && overview && (
         <div className="card">
           <h3>Database Overview</h3>
+          <p className="muted" style={{ marginTop: 2 }}>
+            Agentless mode: host-level CPU, RAM, and free-disk metrics are not available.
+          </p>
           <div className="grid three overview-kv">
             <div><span>PostgreSQL Version</span><strong>{overview.instance.server_version || "-"}</strong></div>
             <div>
@@ -366,8 +382,8 @@ export function TargetDetailPage() {
             <div title="Total WAL directory size (when available)">
               <span>WAL Size</span><strong>{formatBytes(overview.storage.wal_directory_size_bytes)}</strong>
             </div>
-            <div title="Optional metric via future Agent/SSH provider">
-              <span>Free Disk</span><strong>{formatBytes(overview.storage.disk_space.free_bytes)}</strong>
+            <div title={overview.storage.disk_space?.message || "Agentless mode: host-level free disk is unavailable."}>
+              <span>Free Disk</span><strong>{formatDiskSpaceAgentless(overview.storage.disk_space)}</strong>
             </div>
             <div title="Replication replay delay on standby">
               <span>Replay Lag</span>
@@ -378,6 +394,12 @@ export function TargetDetailPage() {
             <div><span>Replication Slots</span><strong>{overview.replication.replication_slots_count ?? "-"}</strong></div>
             <div><span>Repl Clients</span><strong>{overview.replication.active_replication_clients ?? "-"}</strong></div>
             <div><span>Autovacuum Workers</span><strong>{overview.performance.autovacuum_workers ?? "-"}</strong></div>
+            <div title="Host CPU requires OS-level telemetry">
+              <span>Host CPU</span><strong>{formatHostMetricUnavailable()}</strong>
+            </div>
+            <div title="Host RAM requires OS-level telemetry">
+              <span>Host RAM</span><strong>{formatHostMetricUnavailable()}</strong>
+            </div>
           </div>
           <div className="grid two">

View File

@@ -195,6 +195,23 @@ a {
   color: #d7e4fa;
 }

+.profile-name {
+  font-size: 15px;
+  font-weight: 700;
+  line-height: 1.25;
+}
+
+.profile-email {
+  margin-top: 2px;
+  font-size: 12px;
+  color: #a6bcda;
+  word-break: break-all;
+}
+
+.profile-role {
+  margin-top: 4px;
+}
+
 .mode-switch-block {
   margin-bottom: 12px;
   padding: 10px;
@@ -1255,6 +1272,31 @@ td {
   font-weight: 600;
 }

+.user-col-name-value {
+  font-weight: 600;
+}
+
+.admin-inline-grid {
+  display: grid;
+  gap: 8px;
+}
+
+.admin-inline-grid.two {
+  grid-template-columns: repeat(2, minmax(0, 1fr));
+}
+
+.admin-inline-password {
+  min-width: 190px;
+}
+
+.admin-user-actions {
+  display: flex;
+  gap: 8px;
+  align-items: center;
+  justify-content: flex-start;
+  flex-wrap: wrap;
+}
+
 .role-pill {
   display: inline-flex;
   align-items: center;

View File

@@ -0,0 +1,41 @@
#!/usr/bin/env bash
set -euo pipefail

# Usage:
#   bash bootstrap-compose.sh
#   BASE_URL="https://git.nesterovic.cc/nessi/NexaPG/raw/branch/main" bash bootstrap-compose.sh

BASE_URL="${BASE_URL:-https://git.nesterovic.cc/nessi/NexaPG/raw/branch/main}"

echo "[bootstrap] Using base URL: ${BASE_URL}"

fetch_file() {
  local path="$1"
  local out="$2"
  if command -v wget >/dev/null 2>&1; then
    wget -q -O "${out}" "${BASE_URL}/${path}"
  elif command -v curl >/dev/null 2>&1; then
    curl -fsSL "${BASE_URL}/${path}" -o "${out}"
  else
    echo "[bootstrap] ERROR: wget or curl is required"
    exit 1
  fi
}

fetch_file "docker-compose.yml" "docker-compose.yml"
fetch_file ".env.example" ".env.example"
fetch_file "Makefile" "Makefile"

if [[ ! -f ".env" ]]; then
  cp .env.example .env
  echo "[bootstrap] Created .env from .env.example"
else
  echo "[bootstrap] .env already exists, keeping it"
fi

echo
echo "[bootstrap] Next steps:"
echo "  1) Edit .env (set JWT_SECRET_KEY and ENCRYPTION_KEY at minimum)"
echo "  2) Run: make up"
echo